Toddler robots help reveal how human kids learn about their world

There’s a lot of focus on looking at the means by which humans learn and using these insights to make machines smarter. This is the entire basis for artificial neural networks, which try to replicate a simple model of the human brain inside a machine.

However, the opposite can be true as well: Examining robots can help reveal how we as humans absorb and make sense of new information.

That’s the basis for a new research project carried out by researchers in the United Kingdom. Looking to understand more about how young kids learn new words, they programmed a humanoid robot called iCub, equipped with a microphone and camera, to do the same.

Their conclusion? Children may well learn new words in much the same way robots do: based less on conscious thought than on an automatic ability to associate words with objects.

“We were interested in finding out whether it’s possible to learn words without a complex reasoning ability,” Katie Twomey, a psychology department researcher from the U.K.’s Lancaster University, told Digital Trends.

“To explore this we used the iCub humanoid robot, which learns by making simple links between what it sees and what it hears. Importantly, iCub can’t think explicitly about what it knows. We reasoned that if iCub can learn object names like toddlers do, it’s possible that children’s early learning is also driven by a simple but powerful association-making mechanism.”

In the study, a group of kids aged 2 1/2 were given the task of selecting a particular toy out of a lineup consisting of, alternately, three, four, or five different objects. In each case, one of the objects was something unfamiliar to them. The study aimed to get the kids to learn the name of the unknown object using a process of elimination, based on information they already knew.


“We know that toddlers can work out what a new word means, based on the words they already know,” Twomey continued. “For example, imagine a 2-year-old sees two toys: their favorite toy car, and a brown, furry toy animal that they’ve never seen before. If the toddler hears a new word ‘bear,’ they will assume that it refers to the new toy, because they already know that their toy is called ‘car’.”

In this case, it is possible that kids are able to think in detail about what they already know, and use reasoning to figure out that their favorite is called a “car,” so the new toy must be a “bear.” However, it’s also possible that children solve this puzzle automatically by simply associating new words with new objects.

The researchers then asked the iCub to carry out the same task. It was trained to recognize 12 items but, like the kids, was then shown a combination of objects it recognized and ones it did not. Intriguingly, it performed exactly the same as the kids when it came to learning new words.

“Critically, iCub learned words by making simple associations between words and objects, rather than using complex reasoning,” Twomey said. “This suggests that we don’t need to assume children reflect in detail about what they know and what words refer to. Instead, early word learning could depend on making in-the-moment links between words and objects.”
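The mechanism Twomey describes can be illustrated with a toy model. The sketch below is my own illustration, not the study’s actual software: a learner that only keeps co-occurrence strengths between words and objects, with no explicit reasoning. When it hears a novel word, every link is zero, so the tie-break falls to the least familiar object in view, and mutual exclusivity emerges from association alone. All names here (the class, the toy labels) are hypothetical.

```python
# A minimal associative word-learner: links between words and objects only,
# no explicit reasoning about what is already known.
from collections import defaultdict


class AssociativeLearner:
    def __init__(self):
        # assoc[word][obj]: co-occurrence strength between a word and an object
        self.assoc = defaultdict(lambda: defaultdict(float))

    def familiarity(self, obj):
        # Total association mass an object already has with any word.
        return sum(links[obj] for links in self.assoc.values())

    def choose(self, word, objects):
        # Pick the object most strongly linked to the heard word. For a novel
        # word all links are zero, so the tie-break favors the *least*
        # familiar object -- "new word goes with new thing" falls out of the
        # associations themselves, with no reasoning step.
        return max(objects,
                   key=lambda o: (self.assoc[word][o], -self.familiarity(o)))

    def learn(self, word, obj):
        # Hebbian-style update: strengthen the word-object link.
        self.assoc[word][obj] += 1.0


learner = AssociativeLearner()
for _ in range(3):  # familiar labels from prior exposure
    learner.learn("car", "toy_car")
    learner.learn("ball", "red_ball")

# A novel word, "bear", is heard alongside two familiar toys and one new one.
picked = learner.choose("bear", ["toy_car", "red_ball", "furry_animal"])
learner.learn("bear", picked)
print(picked)  # the unfamiliar object wins by novelty, then gains a link
```

Nothing in `choose` inspects what the learner "knows" about other words at the level of explicit inference; the disambiguation behavior the toddlers showed is reproduced purely by in-the-moment association strengths, which is the point of the researchers’ argument.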

It’s an interesting use of robotics to uncover insights about developmental psychology, and the approach can surface previously unconsidered details about how humans learn.

“In our study, the amount of time it took for the robot to move its head to look at objects affected how easily it learned words,” Twomey concluded. “This suggests that the way objects are set out in children’s visual scene could also affect their early word learning: a prediction we are planning to test in new work with toddlers.”

Luke Dormehl