Today, Affectiva, the pioneer of Human Perception AI, announced that it has been awarded six new patents in the past three months for advanced in-cabin sensing capabilities that improve vehicle safety and the occupant experience. Affectiva now has 39 patents issued, two more allowed and more than 25 additional pending, making its IP portfolio one of the largest among startups in the Emotion AI and Human Perception AI space.
The new patents build on capabilities of Affectiva Automotive AI, the company’s in-cabin sensing solution, which uses computer vision, deep learning, massive amounts of real-world data and centrally placed in-vehicle cameras, to analyze the state of the driver, the occupants and the cabin. The AI runs in real time, on automotive embedded systems, providing insight into the human experience — from occupancy, activities and cognitive states such as drowsiness and distraction, to detecting objects, child seats, mood and emotions. These insights enable OEMs, Tier 1s, fleet management companies and ridesharing providers to deliver robust safety features and more personalized, enjoyable transportation experiences.
Affectiva’s latest issued patents include:
- Drowsiness Mental State Analysis Using Blink Rate (US Patent No. 10867197): detects and analyzes levels of drowsiness, mental fatigue, attention and cognitive load, based on a temporal analysis of an individual’s blink rate. This information can be used to manipulate vehicle operation, including recommending a break, adjusting the in-cabin environment, or activating braking and steering control. This is a crucial capability for vehicle safety and driver monitoring, as it provides an understanding of how impairment states progress over time.
- Cognitive State Evaluation For Vehicle Navigation (US Patent No. 10922566): provides multi-modal sensing of occupants’ cognitive states, such as drowsiness and distraction, to influence vehicle control. This capability is relevant for all levels of autonomous vehicles, and particularly in semi-autonomous vehicles to ensure the safe transfer of control between the car and the driver. The analysis is done using an in-cabin camera, as well as signals like speech and non-speech vocalization, such as yawns and snoring, and physiological data such as heart and respiration rate. It also considers environmental conditions like temperature, lighting, time of day, weather and infotainment status.
- Cognitive State Based Vehicle Manipulation Using Near Infrared Image Processing (US Patent No. 10922567): processes near infrared (NIR) images from in-vehicle cameras, using deep neural networks that run on embedded systems. The cameras provide a view of the entire cabin, including the driver and backseat passengers. This is relevant as many vehicles deploy NIR cameras that create readable images under darker lighting conditions, such as driving at night or through a tunnel.
- Vehicle Content Recommendation Via Affect (US Patent No. 10911829): provides video content recommendations based on an understanding of occupants’ emotional state, expressed through facial expressions, voice volume or tone, words used, heart and respiration rates, and additional responses collected by cameras, microphones, and other sensors. These personalized content recommendations improve the transportation experience and provide monetization opportunities for automakers and advertisers.
- Vehicle Content Recommendation Using Cognitive States (US Patent No. 10897650): offers recommendations for audio or video content to a vehicle’s occupants, based on an understanding of cognitive states drawn from multiple sensors inside a vehicle. Tailored content recommendations can be used to help occupants focus, relax or calm down while traveling, creating a highly personalized and more enjoyable experience.
- Image Analysis for Emotional Metric Evaluation (US Patent No. 10869626): collects an individual’s emotional responses to content and digital experiences consumed on mobile devices and in vehicles. This helps an individual track and quantify their emotions over time, all with opt-in and consent, serving as a fitness tracker for emotional wellbeing.
“Over the last decade, Affectiva has continuously pursued new patents as we have pioneered and advanced the fields of Emotion AI and Human Perception AI,” said Dr. Rana el Kaliouby, Co-Founder and CEO of Affectiva. “The breadth and depth of our patent portfolio reflects our commitment to pushing the boundaries of computer vision, machine learning, deep learning and AI at the edge, and is a testament to our leadership in defining the many creative and diverse applications of Human Perception AI that are shaping industries today and in the future.”
Affectiva’s patent portfolio has broad applicability across different verticals, use cases, technologies, platforms and sensors, and includes protections for user opt-in and consent. The portfolio builds strong defensibility across industries such as automotive, media analytics, video conferencing, robotics and gaming.
“At Affectiva, we treat patenting as a team sport,” Dr. el Kaliouby continued. “These patents are a testament to our shared innovation mindset, and reflect the incredible diversity of ideas that come from our diverse team. We’re proud of these innovations and the value they provide to the company, our shareholders and the industry at large, as we work together to shape the future of Human Perception AI in automotive and beyond.”
Affectiva is on a mission to humanize technology. An MIT Media Lab spin-off, Affectiva created and defined the Emotion AI and Human Perception AI categories. Built on deep learning, computer vision, speech science and massive amounts of real-world data, Affectiva’s technology can detect nuanced human emotions, complex cognitive states, activities, interactions and objects people use. In automotive, Affectiva’s in-cabin sensing AI is enabling leading car manufacturers, fleet managers and ridesharing companies to build next-generation mobility that adapts to complex human states. Affectiva’s technology is also used by 25 percent of the Fortune Global 500 companies to test consumer engagement with ads, videos and TV programming.