Archer

Patrol bot maker pulled the plug after its robot was accused of running down a toddler.

The future of robot policing is in limbo after a collision between a toddler and a 300-pound patrol bot. Now, a California shopping mall may soon decide whether the bots will march again or stay unplugged.

It began on July 7, when a family visiting the Stanford Shopping Center came face to face with one of the mall’s robocops, an automated, five-foot-tall Knightscope patroller. Mother Tiffany Teng said the bot knocked her 16-month-old son over and ran over his foot.

“The robot hit my son’s head and he fell down on the floor and the robot did not stop and it kept moving forward,” Teng told KGO-TV.

The boy ended up with a swollen foot and a scrape on his leg. His mother called the robots dangerous. The patrol bot—and its wheeled colleagues—were put under house arrest.

“The moment we heard about it, we shut the machines down,” said Stacy Stephens, vice president of marketing and sales at bot maker Knightscope in Mountain View, just a few miles from the shopping center.

“I’m a father of three, so the thought of one of our machines hurting somebody was horrific,” he told Archer News.


What went wrong?

For around $7 an hour, companies can hire the Knightscope bots to roam their open areas, using sensors such as LIDAR (light detection and ranging), thermal imaging, proximity sensors, GPS and more.

“The Knightscope K5 is fully autonomous and can navigate through an environment with moving objects,” the company site says. “The technology is similar to that being utilized by the driverless cars that actively operate on the public roadways of Silicon Valley today.”

The bots are supposed to stop when someone steps right in front of them, and then go around the person, KPIX-TV reported.

The incident

Knightscope’s initial incident report on July 13 laid out its version of the events.

“A K5 Autonomous Data Machine (Machine Identification Number 13) was patrolling at a local shopping center when, at approximately 2:39pm PDT, a child left the vicinity of his guardians and began running towards the machine,” the report said. “The machine veered to the left to avoid the child, but the child ran backwards directly into the front quarter of the machine, at which point the machine stopped and the child fell on the ground.”

“The machine’s sensors registered no vibration alert and the machine motors did not fault as they would when encountering an obstacle,” the report added.

“Once the guardians retrieved the child and the path was clear, the machine resumed patrolling. The entire incident lasted a few seconds and a scrape on the child’s leg and a bruise with minor swelling were reported,” it said.

The report did not explain why the machine’s sensors did not register a vibration alert or why the motors did not fault. It also did not say whether a human was monitoring the situation through the bot’s video cameras. “We’re in the investigative stage right now,” Stephens told Archer News on July 12, the day before the report came out.


Other police bots

The bot in this incident operates autonomously—the Knightscope robots make decisions about where and how to move on their own.

But the developers of another police robot say their patroller’s movements will be controlled by a person, not a bot.

“All decisions are made by a human from a remote location,” said Nagarajan Prabakar of Florida International University’s Discovery Lab. “The robot does not make decisions on its own.”

Prabakar and his team are creating the Telebot, a patrol machine designed to help injured or disabled officers and veterans work in law enforcement again. It should be ready for a full demo next spring, according to Prabakar.

The human officers wear a sort of “virtual reality” suit that allows them to move the Telebot from afar, with a special headset to take in a 3D view from the bot’s cameras, he said. A sonar sensor in the patroller’s nose helps inform the human officer how close a person or object is to the bot.

The human can talk to people through the Telebot’s microphone, turn the robot’s head, and mobilize the machine on its Segway-like wheels.

Why not automate?

The human officer can train the Telebot to do some motions automatically, like waving an arm to let traffic through. Prabakar compared it to cruise control, giving the operator some rest from repetitive tasks.

“Just like a car, cruise control,” he told Archer News. “All of a sudden you find something is blocking the way. You put on the brake. You take the control on it.”

Why not completely automate the bots, like the Stanford Shopping Center patrollers?

“We thought about it,” said Prabakar. “But the problem is when the robot makes decisions on its own, the decisions are only on a well-defined, highly well-defined framework. If there is one small variation, the robot cannot adapt to it correctly.”

“Someone should be fully accountable,” he added. “Someone should remotely control it.”

Track record

The family has questioned whether this kind of bot incident has happened before at the mall, saying a security guard mentioned a similar problem with a child before July 7.

Knightscope said this is the first time it has received this kind of report about its bots.

“K5 Autonomous Data Machines have driven over 25,000 miles and have been in operation for over 35,000 hours typically traveling at approximately 1 mph without any reported incidents,” the company statement said. “There have been thousands of encounters with adults, children and both large and small pets documented daily on social media that have also taken place without any reported incidents.”

The company indicated it would work to change the bots, if needed.

“We’ll figure it out,” Stephens said. “The technology is such that we are constantly making improvements to it anyway.”

Apology

Knightscope apologized to Teng and her family in its statement, and invited them to company headquarters to show them how the technology works, ask them questions about the incident, and provide a formal, in-person apology.

“Allow the entire team to stand in front of her and say we’re sorry,” Stephens said. “We don’t want anybody walking away from this thinking that machines, robots or technology are unsafe.”

At first, the family appeared to resist.

“I don’t want a tour. All I want is an answer why this robot didn’t stop. It’s not Disneyland,” Teng said in a Palo Alto Online report.

But this week, the company said it has been in contact with the family and is not conducting further interviews at Teng’s request, according to the Stanford Daily. Teng did not return calls from Archer News.

What next?

Will the Knightscope robocops patrol again at the Stanford Shopping Center? 

“We’re not responsible for the patrols there. We don’t comment on that because that’s their deal,” said Stephens.

A mall representative said the shopping center management would reply “at the appropriate time.”

The decision could steer the future of automated robot officers at public places across the country, either putting on the brakes, or giving the technology a path forward.

The Knightscope bots’ creators say the patrollers could play an important role in public safety, providing an authoritative presence, looking for anomalies, and sending surveillance video back for monitoring.

“In an increasingly volatile world, we are developing one of the most important technologies to come out of Silicon Valley that will empower the public and private sectors to proactively build stronger, safer communities, ultimately saving money and lives,” Knightscope’s statement said.