Techniques for gesture recognition using wearable device data
Publication Number
US-12340026-B2
Publication Date
2025-06-24
Expiration Date
2042-09-16
Abstract
Methods, systems, and devices for gesture recognition are described. A system may identify geographical location data associated with a user throughout a time interval and may identify a set of gesture profiles corresponding to a set of gestures associated with the geographical location data of the user. The system may additionally acquire physiological data, including motion data, associated with the user from a wearable device worn by the user and may identify a set of motion segments within the time interval based on the motion data. Additionally, the system may identify a gesture the user engaged in based on matching a motion segment of the set of motion segments to a gesture profile of the set of gesture profiles and may cause a graphical user interface (GUI) of a user device running an associated application to display an indication of the gesture.
Core Innovation
The described invention provides techniques for gesture recognition that combine data collected from wearable devices with external context, such as geographical location and time of day, to improve accuracy and efficiency. The system identifies a user's geographical location data over a time interval and, from the broader set of gestures defined within an application, selects a subset of gesture profiles corresponding to gestures typically associated with that location. The wearable device collects physiological data, including motion data, from which the system identifies motion segments during the time interval. Gesture recognition is then performed by matching motion segments against the selected gesture profiles, and the system causes a graphical user interface (GUI) on a user device to display the identified gesture to the user.
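To make that flow concrete, here is a minimal Python sketch of the pipeline, assuming a simple GestureProfile structure, a placeholder distance metric, and an arbitrary match threshold; none of these details are disclosed in the patent.

```python
from dataclasses import dataclass

THRESHOLD = 1.0  # illustrative match threshold, not from the patent

def distance(a, b):
    """Mean absolute difference over the overlapping length (placeholder metric)."""
    n = min(len(a), len(b))
    if n == 0:
        return float("inf")
    return sum(abs(x - y) for x, y in zip(a, b)) / n

@dataclass
class GestureProfile:
    name: str       # e.g., "golf_swing"
    locations: set  # semantic locations where this gesture is expected
    template: list  # reference motion signature, simplified here to a 1-D series

def recognize(location, motion_segments, all_profiles):
    # Step 1: narrow the candidate set to gestures associated with the location.
    candidates = [p for p in all_profiles if location in p.locations]
    # Step 2: match each motion segment against the reduced candidate set only.
    matches = []
    for seg in motion_segments:
        best = min(candidates, key=lambda p: distance(seg, p.template), default=None)
        if best is not None and distance(seg, best.template) < THRESHOLD:
            matches.append(best.name)
    return matches
```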
The problem being solved arises from conventional wearable devices' inability to recognize specific gestures beyond general motion-intensity detection. Such devices can differentiate broad activity types, such as walking or running, but cannot identify distinct gestures such as individual swimming strokes or eating and drinking motions. Searching hundreds of candidate gesture profiles at any given moment is computationally intensive and reduces recognition reliability.
This disclosure addresses these issues by using contextual data such as geographical location and time of day to filter and narrow the set of candidate gesture profiles, reducing processing requirements and improving recognition accuracy. For example, if a user is at a gym, the system selects exercise-related gesture profiles, whereas at a restaurant, it selects eating and drinking gestures. Time of day data further refines the gesture subset based on historical user activity, like drinking coffee in the morning regardless of location. The system may obtain location data via GPS or calendar applications and leverage historical gesture data and user input to associate gestures with locations or times. Gesture recognition may be performed by the wearable device, user device, server, or a combination thereof, and users may confirm or modify detected gestures via the GUI, allowing continuous improvement of gesture profiles.
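A sketch of this contextual filtering, assuming hard-coded location and time-of-day lookup tables; the patent instead builds these associations from historical data and user input.

```python
from datetime import datetime

# Hypothetical associations; the patent derives these rather than hard-coding them.
LOCATION_GESTURES = {
    "gym": {"weight_lifting", "running"},
    "restaurant": {"eating", "drinking"},
}
TIME_OF_DAY_GESTURES = {range(6, 10): {"drinking"}}  # e.g., morning coffee

def candidate_gestures(location, now):
    candidates = set(LOCATION_GESTURES.get(location, set()))
    for hours, gestures in TIME_OF_DAY_GESTURES.items():
        if now.hour in hours:
            candidates |= gestures  # time of day applies regardless of location
    return candidates

# Example: at a gym at 7 a.m., candidates include exercise gestures plus "drinking".
print(candidate_gestures("gym", datetime(2025, 6, 24, 7, 30)))
```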
Claims Coverage
The patent includes multiple independent claims that focus on methods, apparatuses, and computer-readable media configured to perform gesture recognition by leveraging geographical location data, physiological data from wearable devices, and gesture profiles.
Gesture recognition using geographical location and motion data
Receiving geographical location data associated with a user throughout a time interval; identifying a subset of gesture profiles corresponding to gestures linked with the geographical location of the user from a plurality defined within a wearable device application; acquiring physiological data including motion data from the wearable device; identifying motion segments within the time interval; identifying a gesture based on matching the motion segments to a gesture profile; and causing a GUI to display an indication of the identified gesture.
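The claim leaves the matching step itself open. One plausible reading is template matching over normalized motion segments; the z-normalization step and the mean-squared-error threshold below are assumptions, not the claimed method.

```python
import math

def z_normalize(xs):
    """Scale a series to zero mean and unit variance so amplitude differences
    between users do not dominate the comparison."""
    mean = sum(xs) / len(xs)
    std = math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs)) or 1.0
    return [(x - mean) / std for x in xs]

def segment_matches_profile(segment, template, max_mse=0.5):
    """Declare a match when the mean squared error between the z-normalized
    motion segment and the profile template falls below a threshold."""
    n = min(len(segment), len(template))
    if n == 0:
        return False
    s, t = z_normalize(segment[:n]), z_normalize(template[:n])
    return sum((a - b) ** 2 for a, b in zip(s, t)) / n <= max_mse
```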
Transmitting gesture profiles to wearable device and receiving gesture indication
Transmitting the identified subset of gesture profiles to the wearable device, where the device is configured to identify the gesture by matching motion segments to the profiles, and receiving a message from the device indicating the identified gesture, which forms the basis for displaying the gesture indication on the GUI.
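A minimal sketch of what this exchange could look like, assuming simple message structures; the patent does not specify a wire format.

```python
from dataclasses import dataclass

@dataclass
class ProfileSubsetMessage:
    """User device/server -> wearable: the location-filtered profiles to scan for."""
    profile_ids: list

@dataclass
class GestureDetectedMessage:
    """Wearable -> user device/server: a motion segment matched one profile."""
    profile_id: str
    start_ts: float  # segment boundaries, epoch seconds
    end_ts: float

def handle_detection(msg, display):
    # The received message is what drives the GUI indication on the user device.
    display(f"Detected gesture {msg.profile_id} "
            f"({msg.end_ts - msg.start_ts:.0f}s segment)")

handle_detection(GestureDetectedMessage("golf_swing", 1000.0, 1002.5), print)
```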
Utilizing historical gesture and geographical location data
Identifying historical gesture data associated with the user including past gestures and associated geographical locations; determining correspondence between current geographical location data and historical data; and basing the subset of gesture profiles on this association to improve gesture recognition accuracy.
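One way to implement the correspondence check is a radius test against the coordinates of past detections; the haversine helper and the 100 m radius below are assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 coordinates."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def gestures_near(current, history, radius_m=100):
    """history: iterable of (lat, lon, gesture_name) from past detections.
    Returns gesture names the user previously performed near the current spot."""
    lat, lon = current
    return {g for hlat, hlon, g in history
            if haversine_m(lat, lon, hlat, hlon) <= radius_m}
```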
Utilizing historical gesture and time of day data
Identifying historical gesture data comprising historical gestures and times of day they were performed; establishing that the current time interval is within such a time of day; and selecting the subset of gesture profiles based at least partially on this temporal association.
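A sketch of the temporal association, assuming a time-of-day window is learned from the hours of past detections and padded by an arbitrary margin.

```python
from datetime import datetime, time

def learned_window(history_hours, pad_h=1):
    """Derive a time-of-day window from the hours at which a gesture was
    historically performed, e.g. [7, 8, 7] -> (06:00, 09:00)."""
    return (time(max(min(history_hours) - pad_h, 0)),
            time(min(max(history_hours) + pad_h, 23)))

def interval_in_window(start, end, window):
    w_start, w_end = window
    return w_start <= start.time() and end.time() <= w_end

coffee_window = learned_window([7, 8, 7])
print(interval_in_window(datetime(2025, 1, 1, 7, 15),
                         datetime(2025, 1, 1, 7, 20), coffee_window))  # True
```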
User interaction for gesture confirmation and modification
Receiving user input via the GUI to confirm, modify, or otherwise interact with the displayed gesture indication to improve gesture recognition accuracy.
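The patent does not specify how confirmations feed back into recognition; one simple possibility is an online template update, sketched here with an assumed learning rate.

```python
def apply_feedback(template, segment, confirmed, lr=0.1):
    """Nudge a profile template toward a segment the user confirmed and away
    from one the user rejected. The exact learning rule is an assumption."""
    sign = 1.0 if confirmed else -1.0
    n = min(len(template), len(segment))
    return [t + sign * lr * (s - t) for t, s in zip(template[:n], segment[:n])]
```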
Identifying activity segments or taggable events based on gestures
Based on the identified gestures, determining an activity segment, taggable event, or both, and causing the GUI to display an indication of these to the user.
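A possible grouping rule, under the assumption that consecutive detections of the same gesture separated by short gaps form one activity segment.

```python
def to_activity_segments(detections, max_gap_s=120):
    """detections: time-sorted (start_ts, end_ts, gesture) tuples. Runs of the
    same gesture within max_gap_s of each other merge into one segment."""
    segments = []
    for start, end, gesture in detections:
        if (segments and segments[-1][2] == gesture
                and start - segments[-1][1] <= max_gap_s):
            prev_start, _, _ = segments[-1]
            segments[-1] = (prev_start, end, gesture)  # extend the open segment
        else:
            segments.append((start, end, gesture))
    return segments
```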
Associating semantic locations with gestures
Receiving user input indicating semantic locations and associated gestures; identifying that the current geographical location data matches the semantic location; and selecting gesture profiles accordingly.
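Assuming the coordinate-to-label resolution is handled by a radius check like the haversine sketch above, the remaining selection step reduces to a lookup over user-supplied associations.

```python
# Hypothetical user-supplied mapping of semantic locations to gestures.
SEMANTIC_GESTURES = {
    "gym": {"weight_lifting", "running"},
    "tennis center": {"racket_swing"},
}

def profiles_for_semantic_location(label, all_profiles):
    """all_profiles: dict mapping gesture name -> profile object. Returns only
    the profiles the user associated with the matched semantic location."""
    names = SEMANTIC_GESTURES.get(label, set())
    return [all_profiles[n] for n in names if n in all_profiles]
```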
Generating new gesture profiles via user input
Receiving additional physiological data including motion segments; obtaining user input associating new motion segments with gestures; and generating or updating gesture profiles for enhanced recognition.
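A minimal sketch of profile generation, assuming templates are built by averaging user-labeled segments; a production system would more likely align segments (e.g., with dynamic time warping) before averaging.

```python
def build_template(labeled_segments):
    """Average several user-labeled motion segments into one reference template.
    Segments are truncated to the shortest example for simplicity."""
    n = min(len(s) for s in labeled_segments)
    k = len(labeled_segments)
    return [sum(s[i] for s in labeled_segments) / k for i in range(n)]
```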
Identifying relationships between gestures and physiological data
Determining relationships between identified gestures and physiological data collected during or outside the gesture time intervals; and displaying related messages via the GUI.
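For example, relating a gesture to concurrently collected heart-rate data could be as simple as averaging the samples that fall inside the gesture's time bounds; the sample format below is an assumption.

```python
def heart_rate_during(start_ts, end_ts, hr_samples):
    """hr_samples: iterable of (timestamp, bpm). Returns the mean heart rate
    recorded while the gesture was performed, or None if no samples overlap."""
    inside = [bpm for ts, bpm in hr_samples if start_ts <= ts <= end_ts]
    return sum(inside) / len(inside) if inside else None
```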
Adjusting user metrics based on identified gestures
Selectively adjusting user scores, such as Readiness or Activity Scores, based on the gestures identified from the wearable device data.
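A sketch of score adjustment under the assumption of fixed per-gesture deltas and 0-100 score bounds; the actual scoring logic is not disclosed in the patent.

```python
# Illustrative per-gesture adjustments; the real weights are not disclosed.
SCORE_DELTAS = {
    "running": {"Activity": +5},
    "weight_lifting": {"Activity": +4, "Readiness": -2},
}

def adjust_scores(scores, identified_gestures):
    adjusted = dict(scores)
    for gesture in identified_gestures:
        for metric, delta in SCORE_DELTAS.get(gesture, {}).items():
            adjusted[metric] = max(0, min(100, adjusted.get(metric, 50) + delta))
    return adjusted

print(adjust_scores({"Activity": 60, "Readiness": 80}, ["running", "weight_lifting"]))
```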
The claims cover a comprehensive system for gesture recognition that integrates geographical location, time data, wearable physiological data, user feedback, and historical data to efficiently and accurately identify gestures and present related information to users, enabling adaptive and personalized gesture detection.
Stated Advantages
Reduced processing power consumption by narrowing down candidate gestures based on geographical location and time of day.
Improved reliability and accuracy of gesture recognition due to contextual filtering of gesture profiles.
Enhanced user experience by enabling users to confirm, deny, or modify detected gestures to improve recognition systems.
Ability to associate gestures with user activities and physiological metrics, providing richer insights into user behavior and health.
Efficient use of wearable device and user device resources by allocating gesture recognition tasks based on device capabilities.
Documented Applications
Recognition of specific gestures such as eating, drinking, sports-related gestures including swimming strokes, weight lifting, racket sports, golf swings, running, and walking.
Leveraging geographical location data and calendar events to predict and associate gestures relevant to user location, e.g., gestures at restaurants, gyms, tennis centers.
Using time of day data to anticipate gestures, e.g., morning coffee drinking regardless of location.
Providing graphical user interfaces to display identified gestures, allow user confirmation or correction, and present gesture-related insights such as heart rate during activity.
Adjusting user health scores such as Sleep Scores and Readiness Scores based on identified gestures and related physiological data.