
Communications of the ACM

ACM News

The Next Generation of AI-Enabled Cars Will Understand You


Sensors inside the cabin of the car detect points on the body, infer movement, and assess possible driver impairment.

Some automakers are now moving the camera to the rearview mirror. With this new perspective, engineers can develop systems that detect not only people's emotions and cognitive states, but also their behaviors, activities, and interactions with one another.

Credit: Affectiva

It's an old-fashioned idea that drivers manage their cars, steering them straight and keeping them out of trouble. In the emerging era of smart vehicles, it's the cars that will manage their drivers. We're not talking about the now-familiar assistance technology that helps drivers stay in their lanes or parallel park. We're talking about cars that, by recognizing the emotional and cognitive states of their drivers, can prevent drivers from doing anything dangerous.

There are already some basic driver-monitoring tools on the market. Most of these systems use a camera mounted on the steering wheel, tracking the driver's eye movements and blink rates to determine whether the person is impaired—perhaps distracted, drowsy, or drunk.

But the automotive industry has begun to realize that measuring impairment is more complicated than just making sure that the driver's eyes are on the road, and it requires a view beyond just the driver. These monitoring systems need to have insight into the state of the entire vehicle—and everyone in it—to have a full understanding of what's shaping the driver's behavior and how that behavior affects safety.

From IEEE Spectrum

