You'll never drive alone if your car has the Affective Intelligent Driving Agent now under development by Audi AG and Massachusetts Institute of Technology researchers.
However, the robot may be years away from becoming available in production cars.
AIDA is designed to be a driver's companion that will monitor moods and driving preferences. The smiley-faced robot can help navigate a driver through traffic, offer reminders when the gas tank is running low and give information about the vehicle and its surroundings.
AIDA was created by a group of researchers led by Professors Cynthia Breazeal of MIT's Personal Robots Group and Carlo Ratti, director of MIT's Senseable City Lab in Cambridge, MA. They were joined by Mike Siegel of Volkswagen Group of America Inc.'s Electronics Research Lab in Palo Alto, CA.
“AIDA builds on our long experience in building sociable robots,” Breazeal says. “We are developing AIDA to read the driver's mood from facial expression and other cues and respond in a socially appropriate and informative way.”
The goal is to create an “affective” bond between the driver and robot, so both learn from one another.
AIDA is programmed with information regarding city events, commercial activities and tourist attractions, as well as environmental and, eventually, traffic conditions. The robot also analyzes the driver's patterns and keeps track of routes and destinations.
“Within a week, AIDA will have figured out your home and work location,” says Assaf Biderman, associate director of the Senseable City Lab. “Soon afterward, the system will be able to direct you to your preferred grocery store, suggesting a route that avoids a street fair-induced traffic jam.”
The robot also could provide feedback on driving to help achieve better fuel economy and safer road habits.
Many of these functions will be achieved through a system of capacitive sensors mated with electronics and computer interfaces being developed by MIT and VW, says Giusy Di Lorenzo, postdoctoral fellow at the Senseable City Lab and leader of the AIDA project.
The Apple Mac Mini computer used by AIDA has a microphone and speaker, but the robot doesn't yet speak.
Di Lorenzo says sound will be added after primary behavioral and learning software is developed. VW's Siegel says the current-generation robot is not intended to be installed in a moving vehicle but will be used in a driving simulator.
Audi AG has a version of AIDA installed in a dash that soon will be shipped to MIT, where studies will be conducted.
“A later version of the robot, modified [with] information gained from the studies, will eventually be installed in a drivable test vehicle,” Siegel says in an email to Ward's.
AIDA currently is not functionally integrated into any automobile systems. Siegel says that will happen in a redesigned version of the robot. The goal is to give AIDA the ability to provide “useful and timely suggestions to the driver in the process of developing a positive and trusting relationship.”
The Audi Clean-Air Initiative, in which the auto maker is collaborating with several universities, is dedicated to making driving more environmentally friendly, says Charles Lee, a research engineer at VW's ERL.
This work includes intelligently regulating engine parameters based on traffic-flow predictions, as well as developing environmentally conscious navigation algorithms.
“Where the MIT-AIDA collaboration fits in is helping the driver accept these suggestions by making them timely, considerate and intuitive,” Lee says, adding that AIDA can give a car more personality and even express concern when a driver doesn't buckle up.
“The ability to express an emotional state and develop awareness of the driver-passenger emotional and social context allows for an expansion of the human-car relationship,” Siegel says.