Don’t Put My Robot Friend In The Scary Closet

May 2, 2012

A humanoid robot with large camera eyes greets a young girl and shakes her hand. The pair exchange pleasantries and play a guessing game. But before they can complete a second round, this time with the girl giving the clues and the robot guessing, a researcher enters the room and tells the robot that it is no longer needed and that it should store itself in the closet.

“But that’s not fair,” says the robot. “I wasn’t given enough chances to guess the object. I should be able to finish this round of the game.”

Image: University of Washington

The researcher takes the robot by the arm and persuades it to go into the closet, but before it goes in, the robot implores the researcher one last time: “I’m scared of being in the closet. It’s dark in there, and I’ll be all by myself. Please don’t put me in the closet.”

The exchange took place during a study, now published in Developmental Psychology, in which researchers at the University of Washington’s Human Interaction With Nature and Technological Systems (HINTS) Lab explored the social and moral relationships between children and robots.

During interviews with the children, the researchers found that while most believed the robot to be intelligent, older children were less likely to hold that belief, and were also less likely to believe the robot could have feelings. (After the interview, the children were informed that the robot had been controlled by a researcher.) Few children in any age group thought it was okay to own the robot, and the percentage who thought it okay to sell the robot dwindled to a vanishing three percent among the oldest respondents, the 15-year-olds.

Interestingly, no significant differences in relationships to the robot were found across gender.

The experiment calls to mind Isaac Asimov’s “Robbie,” in which a young girl bonds with a robotic playmate. Much as in that short story, the children overwhelmingly saw the robot as a “mental, social, and partly moral other.”

The implications of that status, the researchers believe, could profoundly affect the way we approach technological goals and designs.

“There is little doubt that humanoid robots will become part of our everyday social lives,” they wrote. “They may become academic tutors or day care assistants for our children, office receptionists, tour guides, bankers that replace ATMs, caretaking assistants for older adults, or maids in our homes.”

In a sense, robots “could be viewed as offering the same social affordances as a human and thus merit the same moral considerations.”

Of course, the robot wasn’t actually intelligent, or even autonomous. It was remotely controlled by researchers to pull at the children’s heartstrings, and made no decisions of its own. But that in itself may be an interesting finding: though human-like artificial intelligence doesn’t exist, or at least doesn’t exist yet, the legislative and civil rights battles that would spring up around such technology would likely be fought on grounds as emotional as they are scientific or philosophical.

The HINTS lab performs psychological and ethical research into technological advances and threats to the natural world, and has previously studied the relationships between robots and adults.

Image: University of Washington