System for facilitating speech-based communication for individuals unable to speak or write
Inventors
Goldberg, Miriam Anna Rimm • Hochberg, Leigh
Assignees
Brown University • US Department of Veterans Affairs • University of Massachusetts Amherst • University of Massachusetts Chan Medical School
Publication Number
US-12042303-B2
Publication Date
2024-07-23
Expiration Date
2039-03-01
Abstract
A patient communication system comprises a hand-held data input subsystem and a display subsystem. The data input subsystem may be configured to sense, and wirelessly convey, information associated with 3-D movement of the data input subsystem, and actuation of a signaling element within the data input subsystem. The display subsystem may receive the information and present a visual representation of the information in a graphical user interface (GUI). The display subsystem may adaptively interpret the information with respect to range-of-motion sensitivity and orientation of the data input subsystem, and movement of the data input subsystem with respect to its prior state. The system may map the conveyed information to the GUI, generate and display images on the GUI with which a patient may interact by manipulating the data input subsystem, and capture one or more selections executed by the patient through actuation of the signaling element.
Core Innovation
The invention discloses a patient communication system designed to assist individuals unable to communicate effectively through speech or writing, such as patients who are intubated, have orofacial injuries, or have limited lung capacity for speech. The system comprises a hand-held data input subsystem and a display subsystem, wherein the data input subsystem senses and wirelessly conveys information associated with its three-dimensional movement and actuation of a signaling element, while the display subsystem presents a visual representation of this information within a graphical user interface (GUI).
The data input subsystem translates the user's 3-D movements into positional information, which is adaptively interpreted by the display subsystem with reference to motion sensitivity, orientation, and movement with respect to prior states. This adaptive interpretation accommodates different patient physical impairments and learning styles. The display subsystem maps and generates interactive images on the GUI that the patient can control by manipulating the data input device and actuating the signaling element. Although contemplated primarily for clinical use, the system is also envisaged for other environments and users.
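As a rough illustration of this data flow, the sketch below (Python, with assumed names such as MovementSample and AdaptiveMapper that do not appear in the patent) shows how a wirelessly conveyed movement sample might be scaled by a per-user sensitivity and accumulated into a GUI cursor position, retaining the prior sample for interpretation relative to prior state.

    from dataclasses import dataclass

    @dataclass
    class MovementSample:
        """One wirelessly conveyed report from the hand-held device (assumed fields)."""
        velocity: tuple        # sensed 3-D velocity (vx, vy, vz)
        button_pressed: bool   # state of the signaling element

    class AdaptiveMapper:
        """Scales device motion into GUI coordinates; 'sensitivity' stands in for
        the per-patient range-of-motion adaptation described above."""
        def __init__(self, sensitivity=1.0):
            self.sensitivity = sensitivity
            self.cursor = [0.0, 0.0]
            self.prior = (0.0, 0.0, 0.0)   # prior state, available for relative interpretation

        def update(self, sample):
            vx, vy, _vz = sample.velocity
            self.cursor[0] += self.sensitivity * vx   # map movement to GUI coordinates
            self.cursor[1] += self.sensitivity * vy
            self.prior = sample.velocity
            return tuple(self.cursor)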
Claims Coverage
The patent includes multiple independent claims covering a system, method, and computer-readable medium, which together define eleven main inventive features.
Hand-held data input subsystem sensing and wireless conveying movement and actuation
A hand-held data input subsystem configured to sense movement in three-dimensional space and actuation of a signaling element, and to wirelessly convey this information.
Display subsystem adaptive interpretation of movement information
A display subsystem configured to receive the wirelessly conveyed information, present a visual representation in a GUI, and adaptively interpret the information based on motion sensitivity and user movement characteristics.
Interpretation of movement as nearest archetypal movement using a total difference parameter
The display subsystem interprets 3-D movement information as the nearest archetypal movement by calculating a total difference parameter comparing sensed velocity vectors to unit vectors defining a spatial reference framework.
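A minimal sketch of how such a nearest-archetype calculation might look, assuming a six-direction reference framework and a sum-of-absolute-differences metric as the total difference parameter (both assumptions; the patent summary does not fix a specific metric or direction set):

    import math

    # Unit vectors of an assumed six-direction spatial reference framework.
    ARCHETYPES = {
        "right": (1, 0, 0), "left": (-1, 0, 0),
        "up":    (0, 1, 0), "down": (0, -1, 0),
        "push":  (0, 0, 1), "pull": (0, 0, -1),
    }

    def nearest_archetype(velocity):
        """Return the archetypal movement whose unit vector gives the smallest
        total difference against the normalized sensed velocity."""
        mag = math.sqrt(sum(c * c for c in velocity)) or 1.0
        v = [c / mag for c in velocity]
        def total_difference(unit):
            return sum(abs(a - b) for a, b in zip(v, unit))
        return min(ARCHETYPES, key=lambda name: total_difference(ARCHETYPES[name]))

    # Example: a mostly rightward sweep with slight drift resolves to "right".
    print(nearest_archetype((0.9, 0.1, -0.05)))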
Adaptive interpretation with respect to orientation and prior state
The display subsystem further adapts interpretation based on orientation of the data input subsystem and movement relative to its prior state.
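The sketch below illustrates, under assumed names and a simple yaw-only rotation, how a sensed velocity might be re-expressed in the display's frame and blended with the prior state to damp unintended motion; it is one plausible reading of this adaptation, not the patent's implementation.

    import math

    def reorient(velocity, yaw_radians):
        """Rotate a sensed (x, y, z) velocity about the vertical axis so that a
        device held at an angle still maps the intended direction onto the display."""
        x, y, z = velocity
        c, s = math.cos(yaw_radians), math.sin(yaw_radians)
        return (c * x - s * y, s * x + c * y, z)

    def blend_with_prior(current, prior, alpha=0.3):
        """Exponentially smooth the current velocity against the prior state,
        damping small rapid movements (e.g., tremor) while keeping deliberate motion."""
        return tuple(alpha * c + (1 - alpha) * p for c, p in zip(current, prior))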
Computer code execution causing GUI mapping and image generation
A processor and memory with computer code instructions cause the system to map conveyed information to the GUI, generate and display images, and capture user selections via signaling element actuation.
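An illustrative event loop along these lines is sketched below; receiver and gui are assumed stand-ins for the wireless link and the display layer, and all interface names are hypothetical.

    def run_gui_loop(receiver, gui, sensitivity=1.5):
        """Map each received sample onto the GUI, redraw the selectable images,
        and capture a selection when the signaling element is actuated."""
        x = y = 0.0
        while True:
            vx, vy, _vz, pressed = receiver.next_sample()   # one wireless report
            x += sensitivity * vx                           # map movement to GUI coordinates
            y += sensitivity * vy
            gui.move_highlight(x, y)                        # display images at the new position
            if pressed:
                gui.capture_selection()                     # record the patient's choice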
Hand-held data input subsystem form factor conforming to user's hand
The hand-held subsystem is designed with a shape conforming to a user's hand, including a guide engaging at least one digit and a strap for securing the hand.
Data input subsystem surface design minimizing contamination
An outer surface of the data input subsystem that is substantially smooth, minimizes pockets or crevices that could collect foreign materials, and is fabricated from material resisting retention of foreign substances.
Sealable coupling seam forming isolated interior of data input subsystem
A rigid portion and flexible portion of the data input subsystem housing that engage along a sealable coupling seam to isolate the interior from the external environment.
Operation configured for clinical environment use
Configuration of the data input subsystem and display subsystem for operation by a patient within a clinical environment.
Display system generating control signals for external system control
The display system produces a control signal based on the movement information, allowing the user to control an external system by manipulating the hand-held data input subsystem.
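A hypothetical sketch of such a control signal, reducing a sensed velocity to its dominant axis and looking up a bound command; the command names are illustrative assumptions, and the patent cites hospital beds and room lights only as examples of controllable systems.

    # Illustrative bindings from interpreted movements to external-system commands.
    COMMANDS = {
        (1, +1): "BED_HEAD_RAISE",   # dominant upward movement
        (1, -1): "BED_HEAD_LOWER",   # dominant downward movement
        (0, +1): "LIGHTS_ON",        # dominant rightward movement
        (0, -1): "LIGHTS_OFF",       # dominant leftward movement
    }

    def control_signal(velocity):
        """Reduce a sensed (x, y, z) velocity to its dominant axis and sign, then
        look up the command bound to that movement (None if nothing is bound)."""
        axis = max(range(3), key=lambda i: abs(velocity[i]))
        direction = 1 if velocity[axis] > 0 else -1
        return COMMANDS.get((axis, direction))

    print(control_signal((0.05, 0.8, -0.1)))   # dominant +y movement -> "BED_HEAD_RAISE"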
Interactive control panel for adjusting system parameters
The display system includes an interactive control panel allowing the user to adjust motion sensitivity, twist and tilt interpretation, orientation, and archetypal movement variations.
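The adjustable parameters might be grouped as in the sketch below; the field names and defaults are assumptions rather than terms from the patent.

    from dataclasses import dataclass

    @dataclass
    class ControlPanelSettings:
        """Per-user settings such a control panel might expose."""
        motion_sensitivity: float = 1.0      # how far the cursor moves per unit of device motion
        interpret_twist: bool = True         # whether wrist twist contributes to input
        interpret_tilt: bool = True          # whether tilt contributes to input
        orientation_offset_deg: float = 0.0  # compensates for how the device is held
        archetype_set: str = "cardinal-6"    # which set of archetypal movements is active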
Together, these inventive features provide a comprehensive system that integrates sensing, adaptive interpretation, user interface mapping, ergonomic design, environmental sealing, clinical applicability, external control capability, and user-configurable sensitivity to enable communication for users unable to speak or write.
Stated Advantages
The hand-held input device does not need to be maintained in a fixed position and allows movement in any direction, degree, or speed, accommodating diverse patient learning styles and physical impairments.
Adaptive interpretation of movements based on sensitivity, orientation, and prior state enhances usability for users with varied motor control issues, such as tremors or limited movement ranges.
The form factor and surface design reduce contamination risk and facilitate cleaning, making the device suitable for clinical environments.
The system supports flexible interaction, including audible feedback that supplements the visual GUI images to aid communication.
The inclusion of control panels allows customization of sensitivity and movement interpretations to individual user needs.
Documented Applications
Communication assistance for patients unable to speak or write due to clinical conditions such as intubation, orofacial injury, or tracheotomy.
Use in clinical environments such as intensive care units to facilitate patient communication with healthcare providers and family members.
Control of external systems including hospital beds, room lights, orthotic/exoskeleton devices, and remotely controlled devices by interpreting data input subsystem movements.
General navigation applications such as video games or remote-controlled devices, using the data input subsystem's 3-D movement sensing.
Control of musical instruments via movement input analogous to a theremin.
Assistive communication for patients with aphasia or severe movement disorders such as essential tremor.
Control of presentation materials or surgical robots.