MIT engineers have developed a ‘socially aware’ autonomous robot that can navigate naturally in human environments.
They trained the robot using machine learning to make split-second decisions (every tenth of a second, to be exact) about the best route forward. This allows it to move at walking speed and keep up with the flow of human movement.
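To make the 10 Hz decision cycle concrete, here is a minimal sketch, assuming a simple cost-based steering scheme: every 0.1 s the robot scores a set of candidate headings, trading progress toward the goal against proximity to pedestrians, and picks the best one. The cost function and all names and numbers below are illustrative stand-ins, not the learned model from the MIT work.

```python
import math

DECISION_PERIOD = 0.1   # seconds between decisions (10 Hz)
WALKING_SPEED = 1.2     # metres per second, roughly human walking pace

def score(heading, goal_bearing, pedestrian_bearings):
    """Lower is better: favour the goal, penalise headings near pedestrians."""
    # Wrapped angular distance to the goal bearing.
    goal_cost = abs(math.atan2(math.sin(heading - goal_bearing),
                               math.cos(heading - goal_bearing)))
    # Penalty that grows as the heading points closer to a pedestrian.
    social_cost = sum(
        max(0.0, 1.0 - abs(math.atan2(math.sin(heading - p),
                                      math.cos(heading - p))))
        for p in pedestrian_bearings
    )
    return goal_cost + 2.0 * social_cost

def choose_heading(goal_bearing, pedestrian_bearings, n_candidates=36):
    """Evaluate evenly spaced candidate headings and return the cheapest."""
    candidates = [2 * math.pi * k / n_candidates for k in range(n_candidates)]
    return min(candidates,
               key=lambda h: score(h, goal_bearing, pedestrian_bearings))

# One decision step: goal straight ahead (bearing 0), pedestrian slightly left.
h = choose_heading(0.0, [0.2])
step = (WALKING_SPEED * DECISION_PERIOD * math.cos(h),
        WALKING_SPEED * DECISION_PERIOD * math.sin(h))
```

At walking speed, each 0.1 s decision commits the robot to only about 12 cm of travel, which is what makes this kind of rapid replanning blend smoothly into pedestrian flow.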
“Socially aware navigation is a central capability for mobile robots operating in environments that require frequent interactions with pedestrians,” says Yu Fan “Steven” Chen, who led the work as a former MIT graduate student and is the lead author of the study. “For instance, small robots could operate on sidewalks for package and food delivery. Similarly, personal mobility devices could transport people in large, crowded spaces, such as shopping malls, airports, and hospitals.”
Giving robots the human touch is a key priority for researchers, and there have been several developments in recent months. These include a breakthrough by a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) that improved robots’ tactile abilities, and another from the University of Minnesota, where researchers developed a 3D-printed bionic skin that could give robots the sense of touch.