Children’s social interactions with a humanoid robot
Although development of robots meant to help us in our everyday lives has been hindered by the economy, there are efforts to develop social robots for children that could monitor them and guard them from harm. A group of researchers from the University of Washington used a Robovie robot to explore how children react to robots that could be used for such duties.
“We need to talk about how to best design social robots for children, because corporations will go with the design that makes the most money, not necessarily what’s best for children”, said Peter Kahn, associate professor of psychology at the University of Washington. “In developing robot nannies, we should be concerned with how we might be dumbing down relationships and stunting the emotional and intellectual growth of children.”
The researchers conducted a study with an even mix of 90 boys and girls, aged 9, 12 or 15, in which the children interacted with the robot and participated in an interview afterwards. The robot was remotely controlled in a way that made it look as if it were acting autonomously.
The children exchanged social pleasantries with the robot, such as shaking hands, hugging and making small talk. The children also played a game of “I Spy” with Robovie, allowing the researchers to test what morality children attribute to the robot. The game started with the children guessing an object in the room chosen by Robovie, who then got a turn to guess an object chosen by the child.
As part of the study, a researcher would interrupt the robot during its turn, announcing that it was time for the interview part of the experiment and telling Robovie that it had to go into a storage closet. Controlled by the hidden experimenter, the robot protested, saying its feelings were hurt because it didn't get an equal number of guesses and that the closet was dark and scary.
When interviewed by the researchers, 88 percent of the children thought the robot was treated unfairly in not having a chance to take its turn, and 54 percent thought that it was not right to put it in the closet. Nearly 80 percent of the children believed that the robot was intelligent, and 60 percent believed it had feelings.
Although social interactions with Robovie led the children to sympathize with the robot and attribute some moral standing to it, they still perceived it as an object. While a little more than half said they would go to Robovie for emotional support or to share secrets, the children were less willing to grant Robovie civil liberties, such as being paid for work or having the right to vote, and many agreed it could be bought or sold.
While a robot could ultimately be developed enough to ensure the safety of the children and interact naturally on its own, there’s a question of how agreeable a robot should be with a child. Should a robot be programmed to give in to all the child’s desires or be stricter and able to protest as Robovie did when the “I Spy” game ended early? Will children treat them as personified entities, or like servants or tools that can be bought and sold?
A partial solution to that problem would be parental telepresence, through which a parent could ensure that children listen to the robot; however, it could also breed resentment toward the robot if it always tells on them. Kahn believes that as social robots become a part of our everyday lives, they can benefit children but also potentially stunt their emotional and social development.
While robo-nannies might not seem necessary, there are cases where they could be beneficial, such as in remote areas, caring for children with allergies, or when there are no vacant spots in daycare and no babysitter you can trust.
For more information and videos from the study, visit the Human Interaction With Nature and Technological Systems Lab page or read the paper published in the journal Developmental Psychology: “’Robovie, you’ll have to go into the closet now’: Children’s social and moral relationships with a humanoid robot”.
I think they would treat them as toys, and I agree that telepresence could partially solve the problem, but it isn't a good solution. Some children might misbehave just to make their parents contact them via the robot.
This can’t be good for their social development. Although I could say the same for cartoons served to them these days.
Maybe as a toy but I think it is better as help for the elderly rather than children.
Have you watched the footage of the study on their page? I find the part where they order the robot to go into the closet while arguing a bit disturbing.