Emotional intelligence engine via the eye

Inventors

Zakariaie, David Bobbak; Thurman, Steven; Asher, Derrik; Parzivand, Jacqueline

Assignees

Senseye Inc

Publication Number

US-10575728-B2

Publication Date

2020-03-03

Expiration Date

2036-10-08

Abstract

A method of discovering relationships between eye movements and emotional responses of a user starts with engaging the user in a plurality of tasks configured to elicit a predicted specific emotional response, or with allowing the user to experience a naturalistic environment with free-form social interactions. A first camera films at least one eye of the user, recording a first time series of eye movements, while a second camera films an outward-looking view of the user, recording a second time series of outward events perceived by the user. Both time series are sent to a computing device, which compares the eye movements from the first time series with the outward events from the second. The computing device can then identify at least one relationship in which eye movements correlate with an actual specific emotional response.

Core Innovation

The invention provides a computational system that leverages video images of the eye to distinguish human mental states based on information collected in naturalistic behavioral settings. By using machine learning algorithms, specifically Bayesian deep belief networks and other neural network techniques, the system adaptively learns the mapping between various eye behaviors and dynamic changes in mental state, including cognitive and emotional states. This model interfaces with any device capable of providing a sufficient set of eye behavior data, such as pupil dilation, blink rate, blink duration, and eye movements.
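
The patent names Bayesian deep belief networks but does not publish model code; the following is a minimal sketch, assuming the four eye-behavior features listed above and a hypothetical three-label emotional-state set, with a small feedforward network standing in purely for illustration:

```python
# Minimal sketch: mapping eye-behavior features to emotional-state labels.
# The feature layout, label set, and training rows are hypothetical; a small
# feedforward network is substituted for the patent's deep belief network.
import numpy as np
from sklearn.neural_network import MLPClassifier

# Each row: [pupil_dilation_mm, blink_rate_hz, blink_duration_s, saccade_amplitude_deg]
X_train = np.array([
    [4.1, 0.20, 0.15, 2.3],   # relaxed viewing
    [6.2, 0.45, 0.10, 5.8],   # startled / high arousal
    [5.0, 0.30, 0.12, 3.9],   # engaged
])
y_train = ["calm", "fear", "engagement"]  # hypothetical label set

model = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

# Infer a mental state from a new eye-behavior sample.
sample = np.array([[6.0, 0.42, 0.11, 5.5]])
print(model.predict(sample))  # e.g., ['fear']
```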

The problem being addressed is the difficulty computers face in accurately inferring human mental states, emotions, and cognitive responses due to the limited and often deceptive nature of observable expressive behaviors like facial expressions and voice intonations. While humans can naturally perform 'mind reading' to interpret such states, artificial systems have historically struggled with extracting relevant features from behavior and decoding the underlying mental states, particularly because many traditional cues can be voluntarily controlled or masked.

The approach of this invention is to focus on eye behavior—which is less susceptible to cognitive control and more universally expressive of underlying mental states—using computer vision to extract relevant features from video feeds. The extracted data serve as inputs for adaptive machine learning models that can be trained to discriminate multiple key emotional and cognitive dimensions in real-time. The system includes a mobile hardware device, such as glasses mounting both an eye-facing and world-facing camera, that collects time-synchronized data streams for analysis and inference.
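
The patent leaves the computer-vision pipeline unspecified; below is a minimal sketch of one conventional dark-pupil approach, assuming an eye-facing video file ("eye_camera.mp4") and illustrative threshold values that are not from the patent:

```python
# Minimal sketch: per-frame pupil-diameter and blink estimation from an
# eye-facing video using a simple dark-pupil threshold. The video path and
# threshold values are assumptions for illustration only.
import cv2

cap = cv2.VideoCapture("eye_camera.mp4")  # hypothetical eye-facing recording
pupil_series = []  # (frame_index, pupil_diameter_px or None when blinking)

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dark pupil: invert-threshold, then keep the largest dark blob.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        (_, _), radius = cv2.minEnclosingCircle(largest)
        pupil_series.append((frame_idx, 2 * radius))
    else:
        pupil_series.append((frame_idx, None))  # no pupil found: likely a blink
    frame_idx += 1
cap.release()
```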

The invention also integrates user feedback to improve model accuracy by verifying predicted emotional responses against actual self-reported emotions during controlled or naturalistic tasks. Over time, this adaptive learning enhances predictive precision, offering a live feed of inferred mental states for use in a variety of software applications. The future direction includes miniaturized hardware, such as smart contact lenses with onboard molecular sensors and DNA computing, to further advance non-invasive human-computer interaction in emotion and mental state monitoring.
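
The feedback-verification loop could be sketched as an online model update, under the assumption (not stated in the patent) of an incrementally trainable classifier and a hypothetical label set:

```python
# Minimal sketch: adaptive updating from user feedback. When the user's
# self-reported emotion confirms or contradicts the prediction, the verified
# pair is folded back into the model. Classifier choice and labels are assumptions.
import numpy as np
from sklearn.linear_model import SGDClassifier

LABELS = ["calm", "fear", "engagement"]  # hypothetical label set
model = SGDClassifier(random_state=0)

def update_with_feedback(model, eye_features, self_reported_label):
    """Fold one verified (features, actual emotion) pair back into the model."""
    X = np.asarray(eye_features).reshape(1, -1)
    model.partial_fit(X, [self_reported_label], classes=LABELS)
    return model

# Example: the model predicted "calm", but the user reported "engagement".
model = update_with_feedback(model, [5.0, 0.30, 0.12, 3.9], "engagement")
```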

Claims Coverage

There are three independent claims defining the inventive features of this patent.

Method of discovering relationships between eye movements and emotional responses via synchronized dual camera recordings and user feedback

This inventive feature involves (see the alignment sketch after this list):

1. Engaging a user in a plurality of computer video game tasks, each configured to elicit a predicted specific emotional response.
2. Varying the game environment during play to invoke the targeted response.
3. Recording a synchronized first time series of the user's eye movements (using a first camera) and a second time series of events perceived by the user (using a second, outward-looking camera).
4. Sending both time series to a computing device, which compares eye movements and outward events.
5. Gathering user feedback on the actual emotional response and verifying whether it matches the predicted response.
6. Identifying relationships between eye movements and emotional responses through the computing device.

Key elements include dual cameras, synchronized data collection, user feedback, and computational identification of correlating relationships.
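
The patent does not specify the comparison algorithm, so in the sketch below the common-clock resampling, the Pearson correlation test, and all thresholds are assumptions for illustration:

```python
# Minimal sketch: aligning the eye-camera and world-camera time series and
# testing whether an eye signal (here, pupil diameter) correlates with
# outward task events. Data layout and thresholds are assumptions.
import numpy as np

# Synchronized, equally sampled series (e.g., resampled to a common clock).
timestamps = np.arange(0.0, 60.0, 0.1)            # 60 s at 10 Hz
pupil_mm = np.random.default_rng(0).normal(5.0, 0.3, timestamps.size)
event_marker = np.zeros(timestamps.size)          # 1.0 where the game
event_marker[300:320] = 1.0                       # presented a scare event

def correlates(eye_signal, events, threshold=0.3):
    """Flag a candidate eye/event relationship via Pearson correlation."""
    r = np.corrcoef(eye_signal, events)[0, 1]
    return r, abs(r) > threshold

r, related = correlates(pupil_mm, event_marker)
print(f"r = {r:.2f}, candidate relationship: {related}")
```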

Method of discovering relationships between eye movements and emotional responses during video game tasks without user feedback

This inventive feature includes:

- Engaging users in multiple computer video game tasks, each designed to elicit a predicted emotional response.
- Varying the environment during gameplay to produce these responses.
- Using a first camera to record eye movements and a second camera to record the outward view, both time-aligned.
- Sending both time series to a computing device for comparison.
- Having the computing device identify at least one relationship between eye movements and the user's actual specific emotional response during tasks designed to elicit predicted responses.

Unlike the previous claim, this method does not require feedback from the user about, or verification of, their emotional state.

Method of discovering relationships between eye movements and emotional responses in a naturalistic free-form social environment

This inventive feature involves (an interaction-detection sketch follows this list):

- Engaging the user in a naturalistic environment with free-form social interactions.
- Using a first camera to film the user's eye and a second camera to film the user's outward view, with both cameras integrated into wearable glasses (which may or may not contain lenses).
- Synchronously recording time series of eye movements and outward events.
- Transmitting the data to a computing device, which compares the two streams.
- Having the computing device identify at least one specific interaction from the social interactions and determine the relationship between eye movements and predicted specific emotional responses during these interactions.

Optional steps include gathering user feedback (e.g., via text message) and verifying the match between predicted and actual emotional responses.
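
The claim leaves open how a "specific interaction" is identified in the world-facing feed; one conventional possibility, sketched here purely as an assumption, is to flag frames containing a detected face (the Haar cascade file ships with OpenCV; the video path is hypothetical):

```python
# Minimal sketch: flagging candidate social interactions in the world-facing
# video by detecting faces per frame. Detector choice and the video path are
# assumptions; the patent does not prescribe how interactions are identified.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
cap = cv2.VideoCapture("world_camera.mp4")  # hypothetical outward-looking recording

interaction_frames = []  # frame indices where a face (a likely interaction) appears
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        interaction_frames.append(frame_idx)
    frame_idx += 1
cap.release()
```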

The inventive features claim novel methods for discovering the relationships between user eye movements and emotional responses by synchronizing dual camera recordings (eye and outward view), varying environmental or social tasks, utilizing computing devices for analysis, and optionally integrating user feedback. The invention is applicable both in controlled task-based and naturalistic, free-form environments.

Stated Advantages

The invention provides reliable, non-invasive, and adaptive inference of human cognitive and emotional states from eye behavior, which is less susceptible to voluntary control than facial expressions or vocal cues and therefore yields more robust inference than traditional methods.

By using machine learning to discover complex relationships in eye data, the system can deliver high predictive accuracy for mental state discrimination, including emotional valence, cognitive load, engagement, and fatigue, exceeding the granularity of previous methods.

The real-time feedback and adaptive learning capability improve model accuracy for individual users, resulting in precise user-specific inference of mental states.

The system offers a platform with broad application potential, integrating seamlessly into consumer electronics (e.g., glasses or smart devices) and providing actionable mental state information for other software applications.

Mobile and wearable hardware designs, such as clip-on glasses, enable easy, comfortable, and stable eye tracking in naturalistic settings, overcoming issues of noise from head movements in desktop solutions.

Documented Applications

Improving mental health diagnoses and rehabilitation in medicine through more accurate and adaptive neurocognitive monitoring.

Providing customizable teaching and learning applications in education by adapting content or strategies based on real-time monitoring of emotional and cognitive states.

Enhancing emotionally resonant and adaptive gaming experiences in entertainment by adjusting game responses to the user's inferred emotional state.

Supporting innovative data analysis methods for market research and psychological or basic research.

Serving as a platform for app development in mobile consumer applications, enabling a live stream of thoughts and feelings as users interact with computer software.

Potential use in developing emotionally intelligent robotics and fostering more symbiotic human-machine relationships.

Facilitating law enforcement or immigration interviews by detecting emotional states in persons of interest.

Assisting psychologists and therapists in understanding patient mental states and emotions for improved therapy and counseling.

Allowing users to self-diagnose their mental states and emotions using their smart device.

Using emotional state detection to dynamically alter content in video game or virtual reality environments (a game-loop sketch follows).
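
The patent documents the application above without prescribing an algorithm; in the minimal sketch below, the inference function is mocked, and the state labels and difficulty policy are invented for illustration:

```python
# Minimal sketch: adjusting game content from an inferred emotional state.
# `infer_emotion` stands in for the patent's eye-based inference engine and is
# mocked here; the state labels and adjustment policy are assumptions.
import random

def infer_emotion(eye_features):
    """Placeholder for the eye-based inference engine (mocked for the sketch)."""
    return random.choice(["calm", "fear", "boredom"])

def adjust_game(state, difficulty):
    """Toy policy: ease off when the player is fearful, ramp up when bored."""
    if state == "fear":
        return max(1, difficulty - 1)
    if state == "boredom":
        return difficulty + 1
    return difficulty

difficulty = 5
for tick in range(3):  # stand-in for the real game loop
    state = infer_emotion(eye_features=None)
    difficulty = adjust_game(state, difficulty)
    print(f"tick {tick}: state={state}, difficulty={difficulty}")
```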
