Cognitive and emotional intelligence engine via the eye

Inventors

Zakariaie, David Bobbak; Asher, Derrik; Thurman, Steven; Parzivand, Jacqueline; Bowden, Jared B.; Sommerlot, Andrew R.; Weisberg, Seth; Brown, Joseph

Assignees

Senseye Inc

Publication Number

US-11382545-B2

Publication Date

2022-07-12

Expiration Date

2036-10-08


Abstract

A method of discovering relationships between eye physiology and cognitive and/or emotional responses of a user starts with engaging the user in a plurality of tasks configured to elicit a predicted specific cognitive and/or emotional response. A first camera films at least one eye of the user recording a time series of events of eye movements of the user, the camera not being in physical contact with the user. The first time series of eye movements are sent to a computing device which compares the eye movements and the plurality of events. The computing device can then identify at least one relationship between eye movements that correlate to an actual specific cognitive and/or emotional response.

Core Innovation

The invention provides a method and system for discovering relationships between eye physiology and the cognitive and/or emotional responses of a user. The system works by engaging the user in multiple tasks, such as computer video games configured to elicit specific predicted cognitive or emotional responses. At least one camera, which is not in physical contact with the user and may be a full-color or infrared camera, films the user’s eye and records a time series of eye movements during these tasks.

The recorded time series of eye movements are sent to a computing device, which applies computer vision and machine learning algorithms—including Bayesian deep belief networks—to analyze and compare the eye movement data with the events occurring in the tasks. Through this analysis, the computing device identifies relationships and correlations between eye movements and actual specific cognitive or emotional responses, verified via feedback directly gathered from the user during the tasks.
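The comparison of eye-movement data against task events can be illustrated with a minimal sketch. Everything here is illustrative, not the patent's method (which uses computer vision and Bayesian deep belief networks): the `event_aligned_response` helper simply averages a per-frame eye signal, such as pupil diameter, in a fixed window after each task event and subtracts the pre-event baseline.

```python
import numpy as np

def event_aligned_response(signal, timestamps, event_times, window=1.0, rate=60.0):
    """Average a per-frame eye signal (e.g. pupil diameter) in a fixed
    window after each task event, relative to a pre-event baseline."""
    n = int(window * rate)
    responses = []
    for t in event_times:
        i = np.searchsorted(timestamps, t)
        if i >= n and i + n <= len(signal):
            baseline = signal[i - n:i].mean()
            responses.append(signal[i:i + n].mean() - baseline)
    return float(np.mean(responses)) if responses else 0.0

# Synthetic example: the pupil dilates by 0.5 mm after each task event.
rate = 60.0
ts = np.arange(0, 10, 1 / rate)
pupil = np.full_like(ts, 3.0)
events = [2.0, 5.0, 8.0]
for t in events:
    pupil[(ts >= t) & (ts < t + 1.0)] += 0.5  # transient dilation
print(event_aligned_response(pupil, ts, events))  # 0.5
```

A consistent post-event deviation like this is the kind of eye-movement/event relationship the computing device is described as discovering.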

This approach leverages a non-invasive and broadly accessible method to infer a user’s mental states based on features such as pupil diameter, blink rate, gaze direction, and other dynamic eye behaviors. The system employs real-time analysis, can operate with ubiquitous full-color cameras by transforming noisy color data into clear infrared datasets using neural networks, and aims to provide actionable information for applications in social intelligence, neurocognitive monitoring, and adaptive systems.

Claims Coverage

There are two independent claims, each describing a method with several inventive features for discovering relationships between eye movements and cognitive and/or emotional responses.

Non-contact camera acquisition of user eye movements during varied, elicited cognitive or emotional task responses

A method is described wherein a user engages in a plurality of tasks, each configured to elicit a specific predicted cognitive and/or emotional response. A first camera, a full-color camera not physically attached to the user, films at least one eye of the user, recording a time series of eye movements while the environment of the computer video game is varied during game play to provoke those responses.

Simultaneous recording and mapping of eye movements to task events

The method concurrently records the time series of eye movements and the corresponding task events, associating these data streams. These are then sent to a computing device for further analysis.
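Concurrent recording amounts to pairing two timestamped streams: camera frames and task events. A minimal sketch of that association (the `(time, label)` event format and the `label_frames` helper are hypothetical, not from the patent):

```python
from bisect import bisect_right

def label_frames(frame_times, events):
    """Attach to each camera frame the most recent task event, pairing the
    two concurrently recorded streams by timestamp. `events` is assumed to
    be a time-sorted list of (time, label) tuples."""
    times = [t for t, _ in events]
    labeled = []
    for ft in frame_times:
        i = bisect_right(times, ft)
        label = events[i - 1][1] if i > 0 else None
        labeled.append((ft, label))
    return labeled

events = [(0.0, "calm_scene"), (4.2, "jump_scare")]
print(label_frames([1.0, 4.5, 6.0], events))
# [(1.0, 'calm_scene'), (4.5, 'jump_scare'), (6.0, 'jump_scare')]
```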

Use of neural network transformation from noisy color image data to clear infrared image data

The recorded eye movement data—specifically noisy color image data from the full-color camera—is transformed by a neural network into a clear infrared image dataset. This enables comparison of eye movements from the first time series and the recorded tasks, facilitating analysis even when only full-color cameras (not infrared) are used for acquisition.
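Conceptually, the color-to-infrared step is a learned image-to-image mapping. The sketch below shows only the data flow, not the patented network: it substitutes a per-pixel linear layer with random (untrained) weights for the actual neural network, mapping a noisy RGB eye image to a single-channel infrared-like estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the patent's neural network: a learned per-pixel
# linear map from 3-channel RGB to a 1-channel "infrared-like" image.
W = rng.normal(size=(3, 1))  # weights would be learned; random here
b = np.zeros(1)

def rgb_to_ir(rgb):
    """rgb: (H, W, 3) float array in [0, 1] -> (H, W, 1) IR-like estimate."""
    return rgb @ W + b

noisy_rgb = rng.random((4, 4, 3))
ir = rgb_to_ir(noisy_rgb)
print(ir.shape)  # (4, 4, 1)
```

The real transformation would be trained on paired color/infrared eye imagery; this toy version only fixes the input/output shapes involved.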

Feedback gathering and verification from user to confirm actual cognitive or emotional response

The system gathers feedback from the user regarding their actual experienced response during at least one task. This self-report is used to verify whether the predicted cognitive/emotional response induced by the task matches the actual response, improving the quality and reliability of the discovered correlations.
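This verification step reduces to keeping only the trials whose self-report matches the response the task was designed to elicit. A toy sketch, with entirely illustrative field names:

```python
def verified_trials(trials):
    """Keep only trials where the user's self-reported response matches the
    response the task was designed to elicit (field names are illustrative)."""
    return [t for t in trials if t["predicted"] == t["self_report"]]

trials = [
    {"task": "maze", "predicted": "frustration", "self_report": "frustration"},
    {"task": "maze", "predicted": "frustration", "self_report": "boredom"},
]
print(len(verified_trials(trials)))  # 1
```

Filtering out mismatched trials before correlation discovery is what keeps the learned eye-movement relationships tied to responses the user actually experienced.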

Computing device identification of relationships correlating eye behavior to actual user responses

A computing device analyzes the transformed and temporally aligned eye movement and task data, identifying at least one relationship between patterns in eye movements and actual specific cognitive or emotional responses experienced by the user.
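In its simplest form, identifying such a relationship could be a correlation between a summarised eye feature and the verified response, as in this illustrative computation (the data values are invented; the patent's analysis is far richer):

```python
import numpy as np

# Illustrative only: correlate a summarised eye feature (mean pupil dilation
# per task, in mm) with the intensity of the response the user reported (1-5).
pupil_dilation = np.array([0.1, 0.4, 0.2, 0.6, 0.5])
reported_stress = np.array([1, 4, 2, 5, 4])
r = np.corrcoef(pupil_dilation, reported_stress)[0, 1]
print(round(r, 2))  # 0.98
```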

The independent claims provide coverage for a comprehensive method utilizing non-contact color cameras, neural network data transformation, user feedback-based verification, and machine learning analysis to discover real-time relationships between eye movements and cognitive or emotional states induced by structured tasks.

Stated Advantages

The system offers a non-invasive, inexpensive, and highly accessible approach for quantifying and inferring cognitive and emotional states from eye behavior.

Real-time and adaptive analysis is supported, enabling actionable and individualized insights into user mental states.

The method leverages widely available full-color cameras, eliminating the need for specialized infrared cameras through neural-network transformation of the noisy color data.

The approach increases predictive precision by adaptively learning user-specific models to improve inference for individual persons.

Documented Applications

Improving mental health diagnoses and rehabilitation in medicine.

Development of customizable teaching and learning applications in education.

Emotionally adaptive gaming and entertainment systems that respond to user emotional states.

Innovative data-analysis tools for market research and psychological research.

Psychologists and therapists using the invention to better understand patient mental states.

Self-diagnosis of mental states and emotions by users through smart devices.

Law enforcement and immigration use in interviews to detect emotional states of persons of interest.

Development of emotionally intelligent robotics.

Adaptive display or content changes in video game or virtual reality settings based on user emotional state.
