Researchers in America have programmed a computer to read body language. The work, undertaken by a team from Carnegie Mellon University’s Robotics Institute, could pave the way for autonomous cars that can predict whether a person is about to walk onto the road – by reading their body language.
By honing robots’ ability to read human cues, the researchers hope to make them better equipped to function in social situations.
Using real-time video, the computers detect and interpret movement. The platform was developed with data gathered from 500 video cameras at the Panoptic Studio, and it now makes it possible to gauge the ‘pose of a group of people using a single camera and a laptop computer’.
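To give a sense of how a pose-reading system might feed a driving decision, here is a minimal sketch. It is not the CMU platform: the keypoint names, thresholds, and the lean heuristic are all assumptions invented for illustration, and a real pedestrian-intent predictor would analyse pose sequences over time, not a single frame.

```python
from dataclasses import dataclass

@dataclass
class Keypoints:
    """Hypothetical minimal skeleton: (x, y) image coordinates in pixels.
    A real pose estimator would output many more joints."""
    head: tuple
    hip: tuple
    left_ankle: tuple
    right_ankle: tuple

def torso_length(kp: Keypoints) -> float:
    # Vertical head-to-hip distance, used to normalise for body scale.
    return abs(kp.hip[1] - kp.head[1]) or 1.0

def leaning_toward_road(kp: Keypoints, road_is_right: bool = True) -> bool:
    """Toy heuristic: flag a pedestrian whose head is displaced
    past the hips toward the road by more than 15% of torso length.
    (Threshold and logic are illustrative assumptions, not the
    published method.)"""
    lean = kp.head[0] - kp.hip[0]        # horizontal head offset from hips
    threshold = 0.15 * torso_length(kp)  # lean relative to body scale
    return lean > threshold if road_is_right else lean < -threshold

# Upright pedestrian: head roughly above the hips -> no warning.
upright = Keypoints(head=(100, 50), hip=(100, 150),
                    left_ankle=(95, 250), right_ankle=(105, 250))
# Leaning pedestrian: head well to the right of the hips -> warning.
leaning = Keypoints(head=(130, 60), hip=(100, 150),
                    left_ankle=(95, 250), right_ankle=(105, 250))
```

With these sample figures, `leaning_toward_road(upright)` is `False` and `leaning_toward_road(leaning)` is `True` – the kind of binary signal an autonomous car could combine with speed and distance before deciding to brake.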
Yaser Sheikh, associate professor of robotics, said: “We communicate almost as much with the movement of our bodies as we do with our voice. But computers are more or less blind to it.”
This breakthrough is already attracting car makers – and other commercial groups – who want to license the technology, according to Sheikh.