
Why we like a good robot story

Jim and Kerry Kelly live in a small town in the rural Midwest. Their sons, Ben, six, and Ryan, twelve, attend the local public school. Their school district is chronically short-staffed: the closest town is 40 miles away, and the pay for teachers is abysmal.

This year, the district’s staffing has hit a critical low: class sizes will have to be huge, and there’s limited money for aides who might help with the teaching load, which will further discourage teacher applications. The school board considers accepting underqualified teachers. Parents are up in arms. Teachers and principals are stressed. The situation comes to a head at a school board meeting that drags on past midnight with shouting, frustration, threats, and anger. But, it’s too important to back off. Children’s futures are at stake.

In the next week, the superintendent finds a possible solution. The state has money available to help school boards implement technology in qualified districts. The Kellys’ town qualifies. They can receive funds to buy robots to serve in the classrooms. The robots can assume some of the teaching load, improve teaching quality, and relieve the overcrowding.

Ten years ago, this would have been unthinkable. Soulless machines educating our children? But, solutions are few, and the superintendent has found reports of success from other schools. He sells his plan well, and against all expectations, the school board agrees. The following fall, the Kelly kids, like all the kids in the district, have a robot in the classroom.

At the half-year mark, the school board reviews their decision. In six-year-old Ben’s class, the results are outstanding. Kids learn fast—pretty much as fast from the robot as from the human teacher. And, the kids like the interactions with the machines. The teacher can accomplish more and is less stressed in the process. Everyone is pleased.

But Ryan’s sixth grade class has a far different experience. The robot used in his class is identical to the one in Ben’s—very human-like. By January, the kids hate it. They call it names; they hit it; they learn little from it. By midyear, it sits in a corner, scorned by students and teacher alike.

This scenario is fiction, but it reflects the real world. Many school districts are hurting for staff, and robots are entering the classroom. In Japan, Robovie converses with elementary school children and teaches them English, as Robosem does in Korea. In the United States, RUBI teaches children Finnish. Child-like robots are helping autistic children practice social interactions through imitation-teaching games.

Using robots as teachers makes some sense. Children learn much of their knowledge from others: parents, teachers, peers. Children trust that 8 × 8 = 64, that Earth is round, and that dinosaurs are extinct, not because they have uncovered these facts themselves but because reliable sources have told them so. Research shows that children are adapted to learn general knowledge from human communication. The phenomenon is known as “trust in testimony.”

But, do children trust the testimony of robots? Does it matter if the robot behaves, responds, or even looks like a human? If children learn from a robot, do they learn in the same way they learn from a human teacher? Excellent questions.

Research shows that when children as young as preschool age learn from other people, they monitor their informants’ knowledge, expertise, and confidence. They remember whether a person has given them accurate information in the past. They also monitor an informant’s access to information: Did she see the thing she is telling me about? They attend to the person’s qualifications: Is he a knowledgeable adult or a naïve child?

Surprisingly little is known about how, and if, children learn from robots. Because robots are machines, children could see them as infallible, like calculators or electronic dictionaries. If so, they might accept any information from a robot without considering whether it is accurate. Or, children might see robots as more fallible machines: a toaster that burns the toast, a virtual assistant that gives wrong or even outlandish answers, or an alarm clock that goes off in the middle of the night. If so, they might resist a robot’s teachings.

Young children clearly can and do learn from robots, and they are appropriately choosy about the sort of robot teachers they accept. Research also shows that young children are more likely to learn from and accept robot teachers—just as in Ben Kelly’s first grade classroom—while older children are less likely, even unlikely, to do so—as in Ryan Kelly’s sixth grade classroom.

Isaac Asimov’s book, I, Robot, is primarily about morality and robotics: how robots interact with humans for good or evil. Most of the stories revolve around the efforts of Dr. Susan Calvin, chief roboticist for the fictional U.S. Robots and Mechanical Men, Incorporated (USRMM), the world’s primary producer of advanced humanoid robots. She worries about the aberrant behavior of advanced robots, and she develops a new field, robopsychology, to help figure out what is going on in their electrical (“positronic”) brains.

All robots produced by USRMM are meant to be programmed with the “Three Laws of Robotics”:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
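
In software terms, the Three Laws form a strict priority ordering of rules: a higher law always overrides a lower one. The toy Python sketch below is purely illustrative, one minimal way such a hierarchy could be encoded as checks run in priority order; the Action class and its flags are invented for the example, and nothing like this code exists in Asimov's fiction or in any real robot.

```python
from dataclasses import dataclass


@dataclass
class Action:
    """A candidate action, flagged for how it bears on each Law.
    (Invented for illustration; not from Asimov or any real robot.)"""
    name: str
    injures_human: bool = False    # relevant to the First Law
    is_ordered: bool = False       # relevant to the Second Law
    endangers_robot: bool = False  # relevant to the Third Law


def permitted(action: Action) -> bool:
    """Check the Three Laws in strict priority order."""
    # First Law outranks everything: never injure a human being.
    if action.injures_human:
        return False
    # Second Law: obey a human order (any order reaching this point cannot
    # injure a human), even if obeying endangers the robot, because the
    # Second Law outranks the Third.
    if action.is_ordered:
        return True
    # Third Law: otherwise, the robot protects its own existence.
    return not action.endangers_robot


# The ordering does real work: an order to harm a human is refused,
# while an order that merely endangers the robot is obeyed.
print(permitted(Action("push a person", injures_human=True, is_ordered=True)))    # False
print(permitted(Action("walk into fire", is_ordered=True, endangers_robot=True))) # True
print(permitted(Action("walk into fire", endangers_robot=True)))                  # False
```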

But in Asimov’s stories, flaws are found in the robots and their prototypes. These lead to robots imploding, harming people, and in one key case, killing a man.

I, Robot stories have spawned numerous spin-offs, continuations, and commentaries. A 2004 episode of The Simpsons (titled “I, D’oh-Bot”) featured a robot boxer named Smashius Clay, who, self-defeatingly, followed all three of Asimov’s laws and lost to every human he fought.

The 2004 Twentieth Century Fox film, I, Robot, starred Will Smith as detective Del Spooner of the 2035 Chicago police department. Del investigates the murder of the roboticist Dr. Alfred Lanning, which may have been committed by a robot.

Currently, robots like Robovie (or NAO or RUBI or other robots designed to interact with adults or with children) don’t have moral codes programmed in. But then, they also don’t have anything like full positronic machine intelligence. Conversely, we don’t have laws or codes about how to treat robots. For example, should a really human-like robot have rights? In October 2017, the Kingdom of Saudi Arabia granted citizenship to a very human-looking robot, Sophia. This set off an uproar over women’s rights in Saudi Arabia. Saudi women, for example, must veil their faces when they are in public; Sophia appeared in public and on TV without a veil.

Researchers, robot designers, parents, and teachers have become increasingly concerned that interactions with robots will promote antisocial behaviors. A hitchhiking robot that had traveled around Europe, taking pictures and carrying on conversations with other travelers, was vandalized and destroyed just two weeks into its trip across the United States. And, it’s easy to imagine a world where robots displace humans from jobs and are then attacked and sabotaged by those they’ve displaced.

Research suggests that antisocial behaviors toward robots can be reduced by modifying the robots themselves. Preschool children in a classroom comforted a robot with a hug and protected it from aggression when it started to cry after being damaged or played with too roughly. And in at least one study, after conversing and playing with a robot for fifteen minutes, younger children said it should be treated fairly and should not be psychologically harmed.

Every year, robots become a larger part of our lives. You can find robots in malls, hotels, assembly lines, hospitals and, of course, research labs. The National Robotics Initiative foresees a future in which “robots are as commonplace as today’s automobiles, computers, and cell phones. Robots will be found in homes, offices, hospitals, factories, farms, and mines; and in the air, on land, under water, and in space.”

And robots are becoming more and more present in the lives of children. They are already teaching children in classrooms and helping them in hospitals. Robovie is just one of many robots manufactured in the last few years and designed for use with children: created to play games, answer questions, read stories, and even watch children without an adult present.

This unbridled production of hopefully child-friendly robots certainly calls for more research. But, it’s clear that in some ways, and at some ages, children can successfully learn from robot teachers who actively interact with them.

Featured image by Andy Kelly on Unsplash.
