Smarter Smart Vehicles

A new generation of “smart” cars that enable safer, stress-free, efficient and enjoyable driving will soon hit the road thanks to research in the Laboratory for Intelligent and Safe Automobiles (LISA) at UC San Diego, led by electrical engineering professor Mohan Trivedi.

LISA’s vision for intelligent vehicles is what Trivedi calls a human-centered distributed cognitive system, in which humans and robots cooperate rather than compete with each other. “This distributed cognitive system should be able to learn and execute perceptual, cognitive and motor functions in a synergistic manner, where humans and machines both understand the strengths and limits of one another,” said Trivedi. A main goal of LISA, he added, is not for the car to completely take over the driving, but to better understand the driver to help avoid accidents and navigate through chaotic situations.

Over the last 15 years, LISA researchers have developed technologies to monitor and understand what’s happening both inside and outside cars on the road. The team equips its vehicles with cameras and other sensors to observe the movement of the driver’s head, eyes, hands and feet and to monitor the surrounding traffic. They then use these data to develop machine vision and machine learning algorithms that can learn the driver’s patterns and predict the driver’s intended maneuvers a few seconds before they happen.
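To illustrate the idea, here is a minimal sketch of how driver-monitoring signals might feed a maneuver predictor. The class names, features and thresholds are hypothetical assumptions for illustration, not LISA's actual algorithms, which rely on machine vision and learned models rather than hand-set rules.

```python
# Hypothetical sketch of maneuver prediction from driver-monitoring
# features; names and thresholds are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class DriverFrame:
    head_yaw_deg: float    # head rotation toward a side mirror
    gaze_on_mirror: bool   # eye gaze fixated on a mirror
    hand_on_signal: bool   # hand near the turn-signal stalk

def predict_lane_change(window: list[DriverFrame]) -> bool:
    """Flag a likely lane change early by counting mirror checks
    and signal-stalk reaches within a short sliding window."""
    mirror_checks = sum(1 for f in window
                        if abs(f.head_yaw_deg) > 25 or f.gaze_on_mirror)
    signal_reach = any(f.hand_on_signal for f in window)
    return mirror_checks >= 2 and signal_reach
```

In practice, a learned model would replace the hand-set thresholds, but the structure is the same: fuse cues from head, eyes and hands over a time window to anticipate the driver's next maneuver.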


Members of LISA in front of an intelligent testbed vehicle
Left to right: Frankie Lu, Mohan Trivedi, Sean Lee, Ravi Satzoda

Moving forward, LISA researchers are developing intelligent driver assistance systems that assess when it’s safe to merge, brake, change lanes, accelerate and decelerate. So if drivers take their eyes off the road and begin swerving, cars could momentarily take control of steering and braking to avoid obstacles and collisions. During his talk at the Contextual Robotics Forum last October, Trivedi announced that these intelligent assistance features will be installed in the Audi A8 in 2017. Other versions of these systems are also being adopted for consumers in Asia and Europe.
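The kind of safety assessment described above can be sketched as a simple time-to-collision check. This is an illustrative toy, assuming made-up gap and timing thresholds; production systems fuse many sensors and far richer models.

```python
# Illustrative merge-safety check (time-to-collision style).
# All thresholds are assumptions for illustration, not real values.
def safe_to_merge(gap_m: float, closing_speed_mps: float,
                  min_gap_m: float = 15.0, min_ttc_s: float = 3.0) -> bool:
    """Return True when the gap to the nearest vehicle in the target
    lane is large enough and time-to-collision exceeds a threshold."""
    if gap_m < min_gap_m:
        return False                 # gap already too small
    if closing_speed_mps <= 0:
        return True                  # gap is constant or opening
    return gap_m / closing_speed_mps >= min_ttc_s
```

For example, a 40 m gap closing at 5 m/s gives 8 seconds of headway, well above the assumed 3-second floor, while a 20 m gap closing at 10 m/s gives only 2 seconds and would be rejected.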

RUBI, the social robot

Expression Detection

RUBI the robot is a preschool classroom veteran. With a screen for a head and a tablet for a body, she has helped researchers study the development of young children whose ability to speak is limited. RUBI teaches preschoolers colors, shapes and songs. She is equipped with the Computer Expression Recognition Toolbox, and some of the technology that researchers developed on RUBI led to an algorithm that can detect smiles and is currently in use in Sony digital cameras.

RUBI is the brainchild of Javier Movellan, the founder and long-time director of the Machine Perception Lab, which also developed Diego-san, a robot that helped elucidate why infants smile at their mothers, and the Einstein robot, which helped researchers learn more about how both humans and robots perceive emotions.

Movellan is also a co-founder of Emotient, a pioneer in analyzing facial expression that was recently acquired by Apple, according to news reports. Together with Emotient co-founders Marni Bartlett and Gwen Littlewort, who are also from the Machine Perception Lab, Movellan and their team developed a facial expression recognition technology with applications in many fields. The startup’s technology has helped advertisers assess how viewers are reacting to ads in real time. Physicians have used Emotient software to interpret pain levels in patients who otherwise have difficulty expressing what they’re feeling, while a retailer has employed the company’s AI technology to monitor consumers’ reactions to products on store shelves.

Movellan, Bartlett and Littlewort are now at Apple’s Cupertino headquarters, but the Machine Perception Lab will go on, said electrical engineering professor Ramesh Rao, director of the Qualcomm Institute, where the lab is housed. “We will expand the use of research around Diego-san as a testbed for developing new software and hardware for more specialized robotic systems,” he said. “We are […] also leveraging the lab for faculty and staff researchers to develop other types of robotic systems to serve a variety of purposes and environments.”

RUBI is joining the lab of cognitive science professor Andrea Chiba, where she will help researchers learn more about robot-human interactions in the classroom.
