Driving companion AIDA – Affective, Intelligent Driving Agent
The AIDA project (Affective, Intelligent Driving Agent), a collaboration between Volkswagen of America and the Massachusetts Institute of Technology (the SENSEable City Lab and the Media Lab's Personal Robots Group), is a platform comprising a personal robot and an intelligent navigation system that aims to make driving more pleasant. Its developers envision it as a navigation system that mimics the friendly expertise of a driving companion who is familiar with both the driver and the city.
The Personal Robots Group's part of the project aims to deepen the relationship between the car and the driver, with the goal of making the driving experience more effective, safer, and more enjoyable. Instead of focusing solely on computing routes to a specified waypoint, the system analyzes driver behavior to identify the goals the driver would like to achieve. AIDA's understanding of the city goes beyond what can be seen through the windshield, incorporating information such as business and shopping districts, tourist and residential areas, as well as real-time event information and environmental conditions.
“With the ubiquity of sensors and mobile computers, information about our surroundings is ever abundant. AIDA embodies a new effort to make sense of these great amounts of data, harnessing our personal electronic devices as tools for behavioral support,” comments Professor Carlo Ratti, director of the SENSEable City Lab. “In developing AIDA we asked ourselves how we could design a system that would offer the same kind of guidance as an informed and friendly companion.”
This in turn allows for useful reactions from AIDA, such as proposing alternative routes when something unexpected happens along the predicted route, providing the right information at the right time (e.g. a fuel warning before passing a gas station), or even helping save energy.
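MIT has not published AIDA's decision logic, but a context-triggered alert of the fuel-warning kind could be as simple as the following Python sketch. The Station type, thresholds, and distances here are illustrative assumptions, not part of the actual system:

```python
from dataclasses import dataclass

@dataclass
class Station:
    name: str
    distance_km: float  # driving distance from the current position

def fuel_alert(fuel_fraction: float, range_km: float,
               upcoming_stations: list[Station],
               reserve_km: float = 30.0):
    """Return an alert string if fuel is low and a station is reachable.

    fuel_fraction: remaining fuel as a fraction of tank capacity.
    range_km: estimated remaining driving range.
    upcoming_stations: stations along the planned route, nearest first.
    """
    if fuel_fraction > 0.25:          # plenty of fuel: stay quiet
        return None
    reachable = [s for s in upcoming_stations
                 if s.distance_km <= range_km - reserve_km]
    if not reachable:
        return "Fuel low: no station reachable on this route."
    nearest = reachable[0]
    return f"Fuel low: {nearest.name} in {nearest.distance_km:.1f} km."

# Example: fifth of a tank, ~90 km of range left
print(fuel_alert(0.2, 90.0, [Station("Shell Main St", 4.2),
                             Station("Esso Hwy 9", 55.0)]))
```

The key design point is the quiet path: the system says nothing at all unless fuel is low, which is exactly the "right information at the right time" behavior the project describes.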
“When it merges knowledge about the city with an understanding of the driver’s priorities and needs, AIDA can make important inferences,” explains Assaf Biderman, associate director of the SENSEable City Lab. “Within a week AIDA will have figured out your home and work location. Soon afterwards the system will be able to direct you to your preferred grocery store, suggesting a route that avoids a street fair-induced traffic jam. On the way AIDA might recommend a stop to fill up your tank, upon noticing that you are getting low on gas,” says Biderman. “AIDA can also give you feedback on your driving, helping you achieve more energy efficiency and safer behavior.”
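Biderman's "within a week" claim hints at a simple pattern: places where the car repeatedly dwells overnight are probably home, and places where it dwells during working hours are probably work. Here is a minimal sketch of that inference, assuming timestamped GPS samples and a naive grid clustering; none of this is AIDA's published code:

```python
from collections import Counter

def infer_home_work(samples, grid=0.001):
    """Guess home and work cells from timestamped GPS samples.

    samples: iterable of (hour_of_day, lat, lon) tuples collected
    over roughly a week. grid: cell size in degrees (~100 m).
    """
    night, day = Counter(), Counter()
    for hour, lat, lon in samples:
        # Snap coordinates to a coarse grid cell.
        cell = (round(lat / grid) * grid, round(lon / grid) * grid)
        if hour >= 22 or hour < 6:      # overnight dwell -> home
            night[cell] += 1
        elif 9 <= hour < 17:            # working hours -> work
            day[cell] += 1
    home = night.most_common(1)[0][0] if night else None
    work = day.most_common(1)[0][0] if day else None
    return home, work
```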
The interface is a research platform that can be used to evaluate various topics in social human-automobile interaction. Ultimately, research conducted on the AIDA platform should lead to new kinds of automobile interfaces and an evolution in the relationship between car and driver.
AIDA communicates with the driver through a small robot embedded in the dashboard. “AIDA builds on our long experience in building sociable robots,” explains Professor Cynthia Breazeal, director of the Personal Robots Group at the MIT Media Lab. “We are developing AIDA to read the driver’s mood from facial expression and other cues and respond in a socially appropriate and informative way.”
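The group has not released AIDA's perception code, but the first stage of such a pipeline, locating the driver's face before classifying its expression, might look roughly like this using OpenCV's stock face detector. Here classify_mood is a hypothetical stand-in for an actual expression classifier:

```python
import cv2

# Pretrained frontal-face detector shipped with OpenCV.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_driver_face(frame):
    """Return the largest face region in a BGR camera frame, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1,
                                           minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    return gray[y:y + h, x:x + w]

def respond_to_driver(frame, classify_mood):
    """Crop the face and hand it to a mood classifier (hypothetical)."""
    face = detect_driver_face(frame)
    if face is None:
        return "neutral"            # no face visible; stay unobtrusive
    return classify_mood(face)      # e.g. "stressed", "relaxed", ...
```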
Currently, the AIDA research platform consists of a fully functional robotic prototype embedded in a stand-alone automobile dash. The robot has a video camera for face and emotion recognition, touch sensing, and a laser projector embedded in its head. A driving simulator is being developed around the AIDA research platform to explore this new field of social human-automobile interaction.
AIDA is a good example of integrating various technologies into an interactive robot that makes our interaction with machines, and the world around us, more comfortable and natural. The design is much improved over previous models, including MIT's other sociable robot, Nexi, which is somewhat unsettling. Abandoning the animated mouth makes the robot simpler to build, safer, and less spooky.
A few obvious downsides are solvable, but their solutions have yet to be developed and demonstrated. AIDA could distract the driver at the wrong moments, since it moves and blocks part of the driver's view; that should be the first problem its software developers address. Another downside is that people often dislike onboard navigators and argue with them, and the robot is effectively a backseat driver that also gives navigational advice, so we can only hope the developers won't make it too nagging. Differences in tolerance between people, and even within the same person on different days, will further complicate the algorithms needed to make the robot adapt to us; solving that will be the biggest step toward commercial use, as the sketch below suggests.
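One plausible way to handle that varying tolerance is to let the robot learn how talkative each driver wants it to be. A hypothetical sketch, not anything AIDA is confirmed to do:

```python
class AdvicePolicy:
    """Adapt how often the robot speaks up, based on driver reactions.

    Each suggestion carries an importance score in [0, 1]; only
    suggestions above a moving threshold are voiced. Dismissals raise
    the threshold (speak less), accepted advice lowers it (speak more).
    """
    def __init__(self, threshold=0.5, step=0.05):
        self.threshold = threshold
        self.step = step

    def should_speak(self, importance: float) -> bool:
        return importance >= self.threshold

    def feedback(self, accepted: bool) -> None:
        if accepted:
            self.threshold = max(0.1, self.threshold - self.step)
        else:
            self.threshold = min(0.9, self.threshold + self.step)
```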