The biggest hurdle in developing the self-driving car has been making it understand the world as a human driver would. This includes giving it eyes and senses to detect obstructions, track other cars, and make smart judgments on the road to avoid accidents.
We cannot say this has been fully achieved, but progress is at an advanced stage. Companies are busy developing AI-powered car parts, with Mitsubishi working on mirrorless technology that now gives cars a vision accuracy of 81 percent within a distance of 100 meters.
Nonetheless, the scope of research is broadening, and engineers are now shifting some attention to technologies that would help autonomous cars sense and interpret the emotions of their occupants.
The AI-based Emotion Sensing Technology
When this comes to full maturity, it will be among the biggest achievements in machine intelligence, because it effectively means training machines to understand human emotions. For now, however, the main intent of Affectiva's technology is to monitor and enhance driver alertness, with the bigger goal of reducing car accidents.
As it progresses, the tech might one day fully comprehend the mood and preferences of the occupants, making it able to automatically adjust the car's interior environment, for instance by tuning the temperature or lowering the music volume, in a quest to improve the riding experience.
“For the longest time, most efforts have been about making agents do a certain task or making devices more conversational. Well, cars can now talk as well, but it’s time they start to interact with the passengers emotionally,” said Rana el Kaliouby, the CEO of Affectiva.
Application of the Tech
Based in Boston, Affectiva is becoming a sought-after startup thanks to its products and innovations focused on sensing people's emotions. To achieve this, the company has used optical sensors and webcams to capture people's facial expressions, then run the data through an algorithm for interpretation.
This technology has been applied in areas such as gaming, marketing, and advertising. More recently, the company broadened its scope and developed voice-analysis software which it says can detect emotions in speech. The software can be served from the cloud.
Now, by combining the two approaches (image and voice analysis), Affectiva has managed to develop usable software that employs AI to detect and understand people's emotional states. The algorithm analyzes the data it receives from cameras and microphones strategically placed inside the car.
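To make the idea concrete, here is a minimal sketch of how combining image and voice analysis might work in principle, using simple late fusion of per-modality emotion scores. The emotion labels, weights, and function names here are illustrative assumptions, not Affectiva's actual API or model.

```python
# Hypothetical late-fusion sketch: each modality (face, voice) produces a
# probability distribution over a few emotion labels, and the two are
# combined as a weighted average before picking the dominant emotion.

EMOTIONS = ["neutral", "happy", "angry", "drowsy"]  # illustrative labels

def fuse(face_scores, voice_scores, face_weight=0.6):
    """Weighted average of two probability distributions over EMOTIONS."""
    voice_weight = 1.0 - face_weight
    return {
        e: face_weight * face_scores[e] + voice_weight * voice_scores[e]
        for e in EMOTIONS
    }

def dominant_emotion(scores):
    """Return the emotion label with the highest fused score."""
    return max(scores, key=scores.get)

# Example readings from the camera and microphone pipelines (made up):
face = {"neutral": 0.2, "happy": 0.1, "angry": 0.1, "drowsy": 0.6}
voice = {"neutral": 0.3, "happy": 0.1, "angry": 0.2, "drowsy": 0.4}

print(dominant_emotion(fuse(face, voice)))  # prints "drowsy"
```

Late fusion is just one possible design; a production system might instead feed both signal streams into a single model. The sketch only shows why having both modalities helps: each can correct the other's noisy reading.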
The Future of The Emotion Sensing Cars
For some clarity, Affectiva doesn't make cameras or microphones; it focuses on the software, which it claims can read and interpret blink rates, laughter, yawning, anger, excitement, drowsiness, and vocal expressions that suggest a person's real-time emotional state. This effectively means cars will in the future be able to detect what passengers need and take the necessary action.
While this technology seems to target autonomous vehicles, it could also be instrumental in the cars we have today. For example, it could help improve taxi services by automatically enhancing the passenger experience: adjusting temperatures, signaling a voice assistant like Alexa to play soothing music, and so on.
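The kind of state-to-action loop described above can be sketched as a simple rule table. The state names and action names below are hypothetical; they only illustrate the shape of the logic, not any real vehicle interface.

```python
# Illustrative mapping from a detected passenger state to cabin adjustments.
# All state and action names are assumptions for the sake of the sketch.

def cabin_actions(state):
    """Return a list of cabin adjustments for a detected passenger state."""
    rules = {
        "drowsy":   ["lower_temperature", "raise_music_volume"],
        "stressed": ["play_calming_music", "soften_lighting"],
        "happy":    ["keep_current_settings"],
    }
    # Unknown states trigger no automatic changes.
    return rules.get(state, [])

print(cabin_actions("drowsy"))  # prints "['lower_temperature', 'raise_music_volume']"
```

A real system would of course weigh confidence in the detected state and let the passenger override any adjustment, but the rule-table shape captures the basic idea of acting on what the sensors report.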
This product gives Affectiva an additional edge in competing with Vokaturi, EMOSpeech, Nvidia, Eyeris, and other companies that make AI-powered car parts.