Test cell presence system and methods of visualizing a test environment

Inventors

Pavloff, Alexander George
Redd, Bryan Lafayette

Assignees

Augmntr Inc

Publication Number

US-11681834-B2

Publication Date

2023-06-20

Expiration Date

2040-01-30


Abstract

Exemplary embodiments described herein include methods and systems for visualization of test cell environments. Exemplary embodiments may include a test cell presence system and method of providing test cell visualization that displays and permits virtual interaction with complex, three-dimensional (3-D) data sets. Exemplary embodiments permit visualization through digital reality, such as Virtual Reality (VR), Augmented Reality (AR), and other display solutions.

Core Innovation

The invention provides a test cell presence system and methods for comprehensive visualization of test cell environments. It displays and permits virtual interaction with complex, three-dimensional (3-D) data sets representing physical test objects and their environment. This visualization is achieved through digital reality solutions such as Virtual Reality (VR), Augmented Reality (AR), and other display formats, including flat screen approximations. The system is hardware-agnostic and can incorporate various sensors, such as cameras in multiple spectral bands, temperature sensors, mechanical sensors, and electrical sensors, to collect data representing the physical test object and its environment.

The problem addressed is the limitation of conventional test monitoring approaches which rely on multiple cameras and sensors producing isolated data feeds. These data feeds are often presented individually, lacking comprehensive correlation to the test object or environment, limiting the user's understanding. Furthermore, increasing the number of cameras to achieve desired resolution results in overwhelming bandwidth and data management challenges. There is no existing comprehensive method or system to compile, synchronize, and display the multiple sensor inputs into an integrated, spatially and temporally aligned virtual representation of the test environment.

The invention solves this problem by generating an accurate three-dimensional virtual representation of the physical test object, often using computer aided design (CAD) models or sensor-based 3D reconstructions. The system calibrates the physical sensors and cameras relative to this virtual object, enabling overlay of live or recorded sensor data onto the virtual representation with precise alignment. It also provides methods for data aggregation, filtering, fidelity adjustment based on virtual perspective, and user interaction to create an immersive and integrated visualization experience that improves understanding and monitoring of the test object in its environment.

Claims Coverage

The claims include one independent method claim detailing the main inventive steps for providing three-dimensional test cell visualization. The inventive features involve creating and displaying a virtual representation of a physical test object with overlaid sensor information that adjusts based on user perspective.

Three-dimensional virtual test cell visualization

Providing a virtual test object corresponding to a physical test object in a test cell by generating a computer aided design model, receiving one or more digital feeds from cameras, overlaying portions of these feeds onto the virtual representation, and displaying views of the virtual object with overlaid information.
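The patent does not publish source code, but the overlay step above can be illustrated with a minimal sketch: projecting a vertex of the virtual model into a camera image, assuming a standard pinhole camera model (the intrinsic matrix `K`, rotation `R`, and translation `t` here are hypothetical calibration values, not from the patent).

```python
def project_point(K, R, t, p):
    """Project a 3-D model vertex into a camera image via a pinhole model.
    K: 3x3 intrinsics, R: 3x3 rotation, t: translation (nested lists);
    p: (x, y, z) world point. Returns pixel (u, v), or None if the
    point is behind the camera and no feed pixel can be overlaid."""
    # World -> camera coordinates: cam = R @ p + t
    cam = [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]
    if cam[2] <= 0:
        return None  # behind the camera; nothing to overlay
    ray = [c / cam[2] for c in cam]  # perspective divide
    u = K[0][0] * ray[0] + K[0][2]
    v = K[1][1] * ray[1] + K[1][2]
    return u, v
```

With this mapping, each pixel of a digital camera feed can be associated with the model surface it observes, which is the correspondence the overlay step relies on.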

Perspective-adaptive data mapping and visualization

Receiving user input to change perspective view of the virtual test object, mapping and overlaying corresponding different portions of digital feeds from cameras onto the virtual representation, and displaying updated views reflecting the changed perspective.
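One plausible way to realize this perspective-adaptive mapping is to select, for each viewing direction, the physical camera whose optical axis is best aligned with the user's virtual viewpoint. The selection rule below is an illustrative sketch, not the patent's disclosed method.

```python
import math

def best_camera(view_dir, camera_dirs):
    """Return the index of the camera whose optical axis is most aligned
    with the viewer's current direction (largest cosine similarity)."""
    def cos_sim(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.dist(a, (0, 0, 0)) * math.dist(b, (0, 0, 0)))
    return max(range(len(camera_dirs)),
               key=lambda i: cos_sim(view_dir, camera_dirs[i]))
```

As the user rotates the virtual perspective, re-running this selection swaps in the feed (or blend of feeds) that covers the newly visible portion of the test object.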

Calibration of physical sensor arrangement to virtual model

Calibrating the physical test cell, including sensor positioning and orientation, with the virtual representation of the test object to ensure precise spatial alignment of data overlays.

Data synchronization and amalgamation

Receiving information from multiple time synchronized sensors, aligning and filtering overlapping information to create amalgamated sensor data, and visualizing this amalgamated data accurately on the 3D virtual representation.
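The amalgamation step can be sketched as merging timestamped samples from several sensors onto a common timeline, averaging samples that fall within a synchronization tolerance. The tolerance value and averaging rule here are illustrative assumptions.

```python
def amalgamate(feeds, tolerance=0.01):
    """Merge per-sensor lists of (timestamp, value) samples into one
    time-ordered stream; samples within `tolerance` seconds of each
    other are treated as one observation and averaged."""
    samples = sorted(s for feed in feeds for s in feed)
    merged = []  # list of (group_start_time, [values])
    for t, v in samples:
        if merged and t - merged[-1][0] <= tolerance:
            merged[-1] = (merged[-1][0], merged[-1][1] + [v])
        else:
            merged.append((t, [v]))
    return [(t, sum(vs) / len(vs)) for t, vs in merged]
```

In a full system the averaged value at each timestamp would then be mapped onto the corresponding location of the 3-D virtual representation.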

Level-of-detail data fidelity adjustment

Manipulating data streams by reducing or increasing resolution and fidelity based on the virtual distance between the viewer's perspective and the virtual test object to manage bandwidth and rendering performance.
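A level-of-detail policy like the one described can be sketched as a simple mapping from virtual viewing distance to a fidelity tier; the distance thresholds below are hypothetical, chosen only to show the mechanism.

```python
def lod_level(distance, thresholds=(2.0, 5.0, 10.0)):
    """Map the viewer-to-object virtual distance to a level-of-detail
    index: 0 = full-fidelity feed, higher = progressively downsampled
    streams, conserving bandwidth and rendering cost for distant views."""
    for level, limit in enumerate(thresholds):
        if distance <= limit:
            return level
    return len(thresholds)  # farthest tier: lowest fidelity
```

The renderer would then request, for each feed, only the resolution tier returned for the current virtual camera position, raising fidelity again as the user moves closer.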

Data storage of sensor feeds

Saving one or more digital feeds from cameras for later use, analysis, or replay.

The independent claim covers a comprehensive method combining generation of a 3D virtual test object model, calibration and mapping of multiple sensor feeds onto this model, adaptive rendering of data based on user interaction and viewing perspective, data synchronization and filtering, and storage of sensor data. These features provide an integrated test cell visualization system allowing immersive and correlated observation of a physical test environment.

Stated Advantages

Provides comprehensive, three-dimensional visualization of the test environment integrating multiple sensor feeds for improved situational awareness.

Enables virtual interaction with complex data sets through digital reality solutions such as VR and AR, facilitating remote or enhanced monitoring.

Preserves precise temporal and spatial registration between physical sensors and virtual model, ensuring accurate data overlay.

Allows dynamic adjustment of data fidelity based on virtual viewing distance to manage bandwidth and enhance visualization performance.

Hardware-agnostic design supports various sensor types and VR/AR devices, allowing adaptability to evolving technologies.

Supports aggregation, filtering, and synthesis of multiple time-synchronized sensor feeds, providing a unified and enriched data representation.

Facilitates user inputs and annotations for real-time or replay tagging and enhanced post-test analysis.

Enables multiple users to engage independently or collaboratively through different interface modalities (digital reality or traditional displays).

Permits faster visualization and a more comprehensive understanding of operational test cell environments, helping detect minor issues early.

Documented Applications

Performance testing of specialized test articles in various environments, including strenuous environmental conditions.

Observation and remote monitoring of physical test objects in test cells through integrated multi-sensor data visualization.

Environmental and run-time testing including variable parameters such as temperature, pressure, humidity, vibration, and acceleration.

Multi-spectral sensing applications incorporating visible, infrared, ultraviolet, and other frequency band imaging.

Use in test environments requiring data aggregation from visual, thermal, mechanical, electrical, and other sensor types.

Training, collaborative analysis, and interactive visualization of test object status via VR, AR, and 2D displays.
