Using augmented reality for interacting with radiation data
Inventors
Greenwood, M. Scott • Smith, Michael B. R. • Thompson, Nick • Nelson, Noel B. • Peplow, Douglas E.
Assignees
Publication Number
US-12354501-B2
Publication Date
2025-07-08
Expiration Date
2043-04-14
Interested in licensing this patent?
MTEC can help explore whether this patent might be available for licensing for your application.
Abstract
Interactive augmented-reality technologies track simulated-radiation exposure of a user moving through a physical scene. The simulated-radiation exposure tracking is performed using radiation voxels associated with a simulated radiation field caused as if a radioactive source of a particular type were emitting radiation from a predetermined location of the scene. Each radiation voxel is indicative of a respective level of the simulated radiation field at the voxel's scene location.
Core Innovation
The invention presents interactive augmented reality technologies that track simulated radiation exposure of a user moving through a physical scene. This is achieved by using radiation voxels associated with a simulated radiation field, which is caused as if a radioactive source of a particular type were emitting radiation from a predetermined location in the scene. Each radiation voxel indicates a respective level of the simulated radiation field at its scene location.
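The radiation voxel described above can be pictured as a cubic cell of the scene carrying the simulated field level at its location. The sketch below is an illustrative model only; the class name, fields, and units are assumptions, not taken from the patent.

```python
from dataclasses import dataclass

# Illustrative stand-in for a radiation voxel: a cubic cell of the scene
# carrying the simulated dose rate (here, arbitrary units) at its center.
@dataclass(frozen=True)
class RadiationVoxel:
    x: float          # scene coordinates of the voxel center (meters)
    y: float
    z: float
    size: float       # edge length of the cubic cell (meters)
    dose_rate: float  # simulated radiation level in this cell

    def contains(self, px: float, py: float, pz: float) -> bool:
        """True if the point (px, py, pz) lies inside this voxel."""
        h = self.size / 2
        return (abs(px - self.x) <= h and
                abs(py - self.y) <= h and
                abs(pz - self.z) <= h)

v = RadiationVoxel(x=0.0, y=0.0, z=1.0, size=0.5, dose_rate=2.4)
```

A tracked user location can then be tested against each voxel to find the simulated field level at that point.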
The system integrates augmented reality devices equipped with controlling means, tracking means, and viewing means to obtain radiation voxels, monitor user movement, determine user location, and calculate the simulated radiation exposure at that location. The technologies provide visual and audio guidance through composited holograms, including isocontours that represent levels of simulated radiation, enabling users to see and interact with three-dimensional radiation environments overlaid on real-world scenes.
The invention addresses the limitations of conventional radiation monitoring tools, which are mainly physical detectors that report momentary or accumulated radiation levels without immersive, interactive visualization. Existing tools cannot intuitively and effectively present complex three-dimensional radiation information, especially for radiological operational and training teams. The invention improves training, education, and worker perception of ionizing radiation by providing efficient, physically accurate immersive environments where radiation is presented visually and aurally in real time, enhancing comprehension and recall.
Claims Coverage
The patent includes multiple independent claims covering systems and methods involving augmented reality devices for interacting with simulated and measured radiation data. Key inventive features focus on radiation voxel acquisition, user tracking, simulated radiation determination, hologram presentation, and user interface interactions.
Tracking and determining simulated radiation exposure along a user path
An augmented-reality device obtains radiation voxels indicative of simulated radiation from a radioactive source of a particular type in a physical scene, monitors a user's path, identifies user locations along the path, and determines simulated radiation exposure based on overlapping radiation voxel levels.
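The exposure determination above amounts to sampling the user's tracked locations against the voxel grid and accumulating dose over time. This is a minimal sketch under assumed data layouts (voxels as `(center, edge, dose_rate)` tuples, a fixed sampling interval); the patent's actual tracking and accumulation logic is not specified here.

```python
# Accumulate simulated exposure along a tracked user path.
# Voxels: list of ((cx, cy, cz), edge_length, dose_rate) tuples.
def voxel_at(voxels, p):
    """Return the dose rate of the voxel containing point p, else 0."""
    for (cx, cy, cz), edge, dose_rate in voxels:
        h = edge / 2
        if (abs(p[0] - cx) <= h and abs(p[1] - cy) <= h
                and abs(p[2] - cz) <= h):
            return dose_rate
    return 0.0

def path_exposure(voxels, samples, dt):
    """Sum dose_rate * dt over timestamped location samples."""
    return sum(voxel_at(voxels, p) * dt for p in samples)

voxels = [((0.0, 0.0, 0.0), 1.0, 5.0),   # hot cell near the source
          ((2.0, 0.0, 0.0), 1.0, 1.0)]   # cooler cell farther away
path = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
total = path_exposure(voxels, path, dt=1.0)
```

Here the middle sample falls in no voxel, so only the first and last samples contribute to the accumulated exposure.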
Presentation of simulated-radiation path holograms
Upon user request, the device presents simulated-radiation path holograms overlapping the user's path, color-coded or scaled in accordance with the simulated radiation experienced, including holograms overlaid on the scene floor.
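Color coding a path hologram by experienced radiation can be sketched as mapping each sampled dose rate onto a gradient. The green-to-red ramp and its normalization below are illustrative assumptions, not the patent's color scheme.

```python
# Map a simulated dose rate onto a green-to-red gradient for a path
# hologram segment. max_rate normalizes the scale; both are illustrative.
def exposure_color(dose_rate, max_rate):
    """Return an (r, g, b) tuple: green for low exposure, red for high."""
    t = min(max(dose_rate / max_rate, 0.0), 1.0)
    return (int(255 * t), int(255 * (1 - t)), 0)

path_rates = [0.0, 2.5, 5.0]
colors = [exposure_color(r, max_rate=5.0) for r in path_rates]
```

Each path segment would then be rendered in its computed color, so a user can see at a glance where along the route exposure was highest.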
Emitting audio guidance corresponding to simulated radiation
The augmented-reality device transmits instructions to speakers to emit audio sounds reflecting the simulated radiation levels experienced by the user at different identified path locations, as requested by the user.
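One natural way to make audio reflect simulated radiation levels is a Geiger-counter-style mapping from dose rate to click rate. The linear mapping and its constant below are illustrative assumptions; the patent does not specify this scheme.

```python
# Translate a simulated dose rate into an audio click rate (clicks/sec),
# Geiger-counter style. clicks_per_unit is an illustrative constant.
def click_rate(dose_rate, clicks_per_unit=10.0):
    return dose_rate * clicks_per_unit

def click_interval(dose_rate, clicks_per_unit=10.0):
    """Seconds between clicks; None when there is no radiation."""
    rate = click_rate(dose_rate, clicks_per_unit)
    return None if rate == 0 else 1.0 / rate
```

The device's speakers would then emit clicks at the computed interval as the user moves through hotter or cooler regions of the simulated field.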
Logging simulated radiation exposure data
The system logs simulated radiation exposure experienced by the user at identified path locations to a data store for record keeping.
Presentation of simulated-radiation measurement indicia at user-indicated locations
The device detects user measurement inputs (e.g., hand gestures), identifies scene locations, acquires simulated radiation measurements from radiation voxels corresponding to those locations, and presents visual indicia overlapping those locations. Presentation can be user-triggered, and data logged or accompanied by audio feedback.
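Acquiring a simulated measurement at a user-indicated location reduces to looking up the radiation voxel corresponding to that point. A minimal sketch, assuming voxels are stored as `(center, dose_rate)` pairs and using a nearest-center lookup:

```python
import math

# Given a user-indicated scene location (e.g., from a hand gesture),
# acquire the simulated measurement from the nearest radiation voxel.
def acquire_measurement(voxels, p):
    """Return (dose_rate, distance) for the voxel center nearest to p."""
    center, dose_rate = min(voxels, key=lambda v: math.dist(v[0], p))
    return dose_rate, math.dist(center, p)

voxels = [((0.0, 0.0, 0.0), 4.0), ((3.0, 0.0, 0.0), 1.0)]
reading, d = acquire_measurement(voxels, (2.0, 0.0, 0.0))
```

The returned reading would drive the visual indicia overlaid at the indicated location, and could also be logged or voiced as audio feedback.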
Presentation of source holograms using isocontours or point clouds
The device obtains one or more isocontours or point clouds corresponding to levels of radiation voxels, then presents these as source holograms overlapping the scene viewed by the user, updating the presentation based on the viewing means' line of sight orientation relative to the scene.
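A point cloud for a source hologram can be derived by selecting voxel centers whose level lies within a band around the chosen isocontour value. The tolerance and data layout below are illustrative assumptions; production systems would typically use a proper isosurface algorithm such as marching cubes.

```python
# Build a point-cloud source hologram: keep the centers of all voxels
# whose simulated level is within tol of the requested isocontour value.
def isocontour_points(voxels, level, tol=0.25):
    """Voxel centers whose dose rate is within tol of the given level."""
    return [center for center, dose in voxels if abs(dose - level) <= tol]

voxels = [((0, 0, 0), 5.0), ((1, 0, 0), 2.1),
          ((2, 0, 0), 2.0), ((3, 0, 0), 0.5)]
cloud = isocontour_points(voxels, level=2.0)
```

The resulting points would be rendered as a hologram overlapping the scene and re-projected as the viewing means' line of sight changes.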
Rescaling and aligning radiation voxels and holograms with digital scenes
The device retrieves digital scenes corresponding to physical scenes, recognizes objects to determine relative scales, and uses this scaling to properly present radiation voxels and source holograms aligned to the physical environment.
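The scale determination above can be sketched as follows: a recognized object of known physical size fixes the digital-to-physical scale factor, which is then applied to voxel geometry. The example object and numbers are illustrative assumptions.

```python
# Derive a digital-to-physical scale factor from a recognized object of
# known physical size (e.g., a 0.9 m doorway measured as 1.8 units in
# the digital scene), then rescale a voxel accordingly.
def scale_factor(known_physical_size, measured_digital_size):
    return known_physical_size / measured_digital_size

def rescale_voxel(center, edge, factor):
    return tuple(c * factor for c in center), edge * factor

f = scale_factor(0.9, 1.8)   # digital scene is twice physical scale
center, edge = rescale_voxel((2.0, 0.0, 4.0), 1.0, f)
```

With the factor applied, radiation voxels and source holograms land at the correct physical positions and sizes in the user's view.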
Inter-device communication for shared visualization
A second augmented-reality device communicatively coupled to the first can receive radiation exposure data or hologram information to present shared simulated-radiation path holograms, measurement indicia, or source holograms overlapping scenes in synchronization with the first user's experience.
Obtaining radiation voxels either from data storage or via radiation transport models
The system either retrieves precomputed radiation voxels associated with a simulated radioactive source or determines them using one or more radiation transport computational models.
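When voxels are computed rather than retrieved, a radiation transport model populates the grid. The patent's computational models are far more sophisticated; the sketch below substitutes a bare inverse-square falloff from a point source, purely to illustrate the voxel-population step.

```python
# Illustrative stand-in for a radiation transport model: populate voxel
# levels with a simple inverse-square falloff from a point source.
def point_source_voxels(source, strength, centers):
    """Map each voxel center to strength / r^2 from the source."""
    out = []
    for c in centers:
        r2 = sum((a - b) ** 2 for a, b in zip(c, source))
        out.append((c, strength / r2 if r2 > 0 else float("inf")))
    return out

field = point_source_voxels((0.0, 0.0, 0.0), strength=100.0,
                            centers=[(1.0, 0.0, 0.0), (2.0, 0.0, 0.0)])
```

A real transport model would additionally account for source type, shielding, and scattering, which is why precomputed voxels are the alternative path the claims describe.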
The inventive features collectively enable augmented-reality devices to provide immersive, interactive visualization and tracking of simulated radiation fields in physical scenes, including audio and visual guidance, user interaction inputs, real-time updating of holograms, and data sharing between users for enhanced radiation training and operational awareness.
Stated Advantages
Provides intuitive, efficient, and physically accurate training environments for complex three-dimensional ionizing radiation fields.
Enables users to visualize radiation exposure spatially and temporally overlaid on real-world scenes, enhancing comprehension and recall.
Offers real-time feedback and recording of user behaviors and exposures to support training and performance analysis.
Supports multi-user collaboration and remote monitoring during radiological operations or training.
Improves worker awareness, reduces radiation exposure levels, and mitigates liability by providing sensory guidance through audio, visual, and haptic feedback.
Documented Applications
Radiological operational and training teams' immersive radiation exposure tracking and visualization.
Radiological source search exercises where users navigate physical spaces while tracking simulated or measured radiation exposure along their path.
Radiological survey experiences allowing users to take simulated or measured radiation readings at indicated scene locations using hand gestures.
Radiological workflow training whereby users navigate scenes with simulated radioactive sources, receiving sensory feedback to optimize paths minimizing radiation exposure.
Validation of simulated radiation data against measurements from real radioactive sources using radiation sensors.
Machine learning classification and localization of radioactive sources using spatial radiation measurements to tailor augmented reality guidance for exposure reduction.