Adaptive visual assistive device

Inventors

Rizzo, III, Joseph F.; Priplata, Attila; Xu, Minnan; Wall, III, Conrad

Assignees

US Department of Veterans Affairs; Massachusetts Eye and Ear

Publication Number

US-10571715-B2

Publication Date

2020-02-25

Expiration Date

2032-11-05

Abstract

The disclosure features visual assistive devices that include a detection apparatus configured to receive information about an environment surrounding a user of the visual assistive device, a communication apparatus configured to transmit information to the user, and an electronic processor coupled to the detection apparatus and the communication apparatus, and configured to: receive information about an activity of the user; filter the information about the environment surrounding the user based on the activity of the user; assign a priority rank to each element of the filtered information based on the activity of the user; and transmit at least some of the elements of the filtered information, according to the assigned priority ranks, to the user using the communication apparatus.

Core Innovation

The invention is a visual assistive device designed to enhance the ability of visually impaired users to perceive relevant information about their environment. The device includes a detection apparatus that receives environmental information, and an electronic processor that obtains information about the user's activity. Based on the activity, the device filters the environmental information, assigns priority ranks to elements of the filtered information, and transmits prioritized information to the user through a communication apparatus. This adaptive filtering optimizes the relevance and timeliness of the information delivered to the user.
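
To make this pipeline concrete, the sketch below filters and ranks detected elements using a hypothetical per-activity weight table; the activity labels, element types, weights, and proximity tie-breaker are illustrative assumptions rather than values taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class EnvironmentElement:
    kind: str          # e.g. "obstacle", "face", "sign" (illustrative labels)
    distance_m: float  # estimated distance from the user
    description: str

# Hypothetical relevance weights per activity; the patent specifies only that
# filtering and ranking depend on the user's activity, not concrete values.
ACTIVITY_WEIGHTS = {
    "navigating": {"obstacle": 3, "sign": 2, "face": 1},
    "conversing": {"face": 3, "speech": 3, "obstacle": 1},
}

def filter_and_rank(elements, activity):
    """Keep only elements relevant to the activity, ordered by weight, then proximity."""
    weights = ACTIVITY_WEIGHTS.get(activity, {})
    relevant = [e for e in elements if e.kind in weights]
    return sorted(relevant, key=lambda e: (-weights[e.kind], e.distance_m))

if __name__ == "__main__":
    scene = [
        EnvironmentElement("face", 3.0, "person approaching"),
        EnvironmentElement("obstacle", 1.5, "low bench ahead"),
        EnvironmentElement("sign", 10.0, "exit sign to the right"),
    ]
    for element in filter_and_rank(scene, "navigating"):
        print(element.description)  # stand-in for the communication apparatus
```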

The invention addresses the difficulty visually impaired persons face in processing large volumes of environmental sensory data. Existing assistive devices either relay excessive information, causing confusion and imposing heavy computational demands, or rely on bulky hardware unsuitable for portable use. Further, users often struggle to learn to interpret augmented sensory inputs. This invention mitigates these issues by selectively processing and delivering only the most pertinent sensory elements, tailored to the user's activity, thereby reducing computational load, hardware bulk, and user confusion while enabling real-time environmental interaction.

Claims Coverage

The claims include one independent device claim and one independent method claim, both directed to adaptive environmental information processing.

Activity-based filtering and prioritization of environmental information

The device obtains information about a user's activity including commands, audio signals, gestures, and position; filters environmental information accordingly to generate multiple filtered elements; assigns priority ranks to each filtered element based on the user's activity; orders the elements per their priority ranks; and transmits the ordered information to the user.
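
As a sketch of how the activity itself might be derived from those inputs, the function below applies a simple precedence of explicit command, then gesture, then audio cue, then position; the precedence, labels, and cue values are assumptions made for illustration, since the patent lists the input types without prescribing how they are combined.

```python
def infer_activity(command=None, gesture=None, audio_keywords=(), position=None):
    """Return a coarse label for the user's current activity."""
    if command:                                  # an explicit command takes precedence
        return command
    if gesture == "point_ahead":                 # hypothetical gesture label
        return "navigating"
    if any(word in audio_keywords for word in ("hello", "hi")):
        return "conversing"                      # conversational audio cue detected
    if position is not None and position.get("venue") == "restaurant":
        return "eating"                          # position-based fallback
    return "general"

# Example: a pointing gesture with no explicit command yields "navigating".
print(infer_activity(gesture="point_ahead"))
```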

Configurable detection apparatus and communication apparatus

The detection apparatus includes image detectors, microphones, GPS sensors, range-finding sensors, acoustic wave sensors, and motion sensors. The communication apparatus includes speakers and vibrotactile transmitters capable of delivering different information types with varying frequencies and amplitudes, including usage of paired vibrotactile transmitters for navigation and additional transmitters for other information.
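
A minimal configuration sketch of such an apparatus is shown below; the sensor flags, frequencies, and amplitudes are placeholder values chosen for illustration, not parameters stated in the patent.

```python
# Illustrative configuration only; values are assumptions, not from the patent.
DETECTION_APPARATUS = {
    "image_detector":       {"enabled": True},
    "microphone":           {"enabled": True},
    "gps":                  {"enabled": True},
    "range_finder":         {"enabled": True},
    "acoustic_wave_sensor": {"enabled": False},
    "motion_sensor":        {"enabled": True},
}

COMMUNICATION_APPARATUS = {
    "speaker": {"volume": 0.6},
    "vibrotactile": {
        # Paired transmitters dedicated to navigation cues, plus an additional
        # transmitter for other information, distinguished by frequency/amplitude.
        "navigation_pair": {"frequency_hz": 250, "amplitude": 0.8},
        "other_info":      {"frequency_hz": 80,  "amplitude": 0.5},
    },
}

def transmitter_for(info_type):
    """Route navigation cues to the paired transmitters, everything else to the other."""
    key = "navigation_pair" if info_type == "navigation" else "other_info"
    return COMMUNICATION_APPARATUS["vibrotactile"][key]
```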

Integration with implanted prosthetic devices

The device features wireless communication interfaces enabling transmission of filtered and prioritized information to implanted prosthetic devices, such as ocular implants or devices positioned in proximity to visual pathways, delivering representations of environmental elements directly to the user’s eye, optic nerve, or brain regions.

Refinement and identification of environmental elements

Prior to transmission, the device further processes filtered elements by assigning identities to objects, faces, and speakers detected in visual and audio information. Additionally, refinement may be offloaded to remote processors, providing enhanced identification or characterization of detected elements.
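
One way such offloading could look is sketched below: a filtered element is sent to a remote processor for richer identification, with a fall-back to the coarse local label when the remote service is unreachable. The endpoint, payload format, and field names are placeholders, not details from the patent.

```python
import json
import urllib.request

def identify_remotely(element, endpoint="https://example.invalid/identify"):
    """Ask a remote processor to refine the identity of a filtered element."""
    payload = json.dumps({"kind": element["kind"],
                          "features": element["features"]}).encode()
    request = urllib.request.Request(
        endpoint, data=payload, headers={"Content-Type": "application/json"})
    try:
        with urllib.request.urlopen(request, timeout=2) as response:
            return json.load(response).get("identity", "unknown")
    except OSError:
        # Remote processor unreachable: keep the coarse local label so that
        # transmission to the user is not delayed.
        return element.get("local_label", "unknown")

# Example: with no reachable endpoint, the local label is returned.
print(identify_remotely({"kind": "face", "features": [0.1, 0.4], "local_label": "face"}))
```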

Adaptive re-assessment of priority ranks

The device is configured to re-assess priority ranks of filtered elements prior to transmission based on triggering events related to the user's activity, by retrieving and applying rule sets from a database. Re-assessment is performed only upon occurrence of triggering events, enabling dynamic adjustment of information delivery.
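
A minimal sketch of event-triggered re-assessment appears below; the triggering-event names and rule weights stand in for the database-backed rule sets, which the patent does not enumerate.

```python
# Hypothetical rule sets keyed by triggering event; in the device these would
# be retrieved from a database.
RULE_SETS = {
    "started_crossing_street": {"vehicle": 5, "obstacle": 4, "face": 0},
    "sat_down_at_table":       {"speech": 4, "face": 3, "obstacle": 1},
}

def reassess_priorities(elements, triggering_event=None):
    """Re-rank filtered elements only when a triggering event has occurred."""
    rules = RULE_SETS.get(triggering_event)
    if rules is None:
        return elements  # no triggering event: keep the existing priority order
    return sorted(elements, key=lambda e: -rules.get(e["kind"], 0))

# Example: crossing a street promotes vehicles ahead of faces.
scene = [{"kind": "face"}, {"kind": "vehicle"}, {"kind": "obstacle"}]
print(reassess_priorities(scene, "started_crossing_street"))
```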

Method for providing environmental information to visually impaired users

The method comprises receiving environmental information; determining user activity from commands, audio signals, gestures, or position; filtering environmental data based on that activity; assigning priority ranks to filtered elements; and ordering and transmitting the prioritized information to the subject.

The claims collectively cover an adaptive visual assistive device and corresponding method that dynamically filter, prioritize, refine, and communicate environmental information based on detected or received user activity. The inclusion of configurable sensors, multiple communication modalities, integration with implanted prosthetics, and remote processing capabilities emphasizes a comprehensive, flexible system for tailored sensory augmentation.

Stated Advantages

Selective processing reduces computational load and power consumption, enabling use in portable, wearable devices.

Adaptive filtering and prioritization reduce user confusion by delivering only the most relevant sensory information.

Improves the user's ability to adapt to and interact with their environment in real time or near-real time.

Integration with implanted prosthetics provides complementary multi-modal sensory information.

Compact, cosmetically acceptable housing promotes user acceptance and social comfort.

Documented Applications

Assistance for visually impaired persons in activities of daily living such as indoor and outdoor navigation, reading, conversing, eating, and monetary transactions.

Use with implanted visual prosthetic devices to provide complementary stimuli to the user.

Sensory augmentation devices for non-visually impaired users, including military and firefighting applications to detect information outside normal human perception ranges.

Activity monitoring for athletic training, rehabilitation, and mobility assessments.

Human interfaces for electronic devices including hands-free calling and voice-activated internet searching.

Security applications such as fall detection and emergency notifications.
