Using augmented reality in surgical navigation

Inventors

Samadani, Uzma; Zahid, Abdullah Bin; Darrow, David P.

Assignees

Hennepin Healthcare Systems Inc; US Department of Veterans Affairs

Publication Number

US-11883118-B2

Publication Date

2024-01-30

Expiration Date

2038-04-23



Abstract

A surgical navigation system may include a processor and a display. The processor may receive a patient image and sensor data captured by a sensor, receive a medical image, generate a hologram of the medical image, perform coregistration between the patient image and the hologram, superimpose the hologram on the patient image, and display the superimposed image. Coregistration may be performed manually via a user interaction, or automatically based on one or more fiducials in the medical image and sensor data related to the fiducials. The system may monitor a change in the environment and update the display correspondingly. For example, the system may monitor movement of the patient's body, a change in the size of an organ while it is being operated on, or movement of a surgical instrument. The sensor may be an augmented reality (AR) sensor in an AR device.

Core Innovation

The invention describes a surgical navigation system using augmented reality (AR) to enhance surgical procedures. The system receives a patient image and sensor data captured by AR sensors, along with a medical image from imaging modalities such as MRI, CT, ultrasound, or microscopes. It generates a hologram representing the medical image and performs coregistration between the hologram and the patient image to spatially map and superimpose the hologram onto the patient image, displaying this combined view to the surgeon in real time.

The system addresses a problem faced in traditional surgery: surgeons must look away from the patient to reference preoperative images, which can be inefficient and challenging. Existing neuronavigation systems display images on separate screens, requiring the surgeon's gaze to shift back and forth between the screen and the operative field. This invention integrates holographic representations directly onto the real-world view of the patient's body, allowing surgeons to visualize anatomical structures and surgical targets without diverting their attention.

The system also solves the problem of changes during surgery, such as tissue movement, organ size changes, or surgical instrument motion, by continuously monitoring the environment with AR sensors. It updates the holographic display accordingly, enabling intraoperative imaging and real-time feedback. The coregistration process can be manual or automatic and uses fiducials such as the patient's skin, markers, or anatomical landmarks to maintain the alignment between the hologram and the patient despite patient movement or surgical changes.

Claims Coverage

The patent includes multiple independent claims that describe a system and a method for AR-based surgical navigation. They focus on coregistration, superimposition of medical images, and updating displays based on anatomical changes during surgery. Key inventive features include patient image acquisition, representation of medical images as holograms, handling of fiducials for coregistration, and dynamic updates during surgery.

Coregistration of patient and medical images

The processor performs coregistration between the patient image and the representation (e.g., hologram) of the medical image to generate a transformation matrix allowing spatial mapping and overlay of the medical image onto the patient image.
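The patent does not disclose a specific registration algorithm, but the transformation matrix described above can be estimated from matched fiducial points with a standard least-squares rigid fit (the Kabsch algorithm). The sketch below is illustrative only; the function and variable names are not taken from the patent.

```python
import numpy as np

def coregister(hologram_pts: np.ndarray, patient_pts: np.ndarray) -> np.ndarray:
    """Return a 4x4 homogeneous transformation matrix mapping hologram-space
    fiducial points onto the corresponding patient-space points."""
    h_centroid = hologram_pts.mean(axis=0)
    p_centroid = patient_pts.mean(axis=0)
    # Cross-covariance of the centered point sets (Kabsch algorithm)
    H = (hologram_pts - h_centroid).T @ (patient_pts - p_centroid)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = p_centroid - R @ h_centroid
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Three or more non-collinear fiducials determine a unique rigid fit.
fiducials_holo = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
rotation_90z = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
fiducials_pat = fiducials_holo @ rotation_90z.T + np.array([5.0, 2.0, -1.0])
T = coregister(fiducials_holo, fiducials_pat)
```

Once computed, `T` lets the renderer place every hologram vertex in patient space, which is what makes the superimposed display spatially accurate.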

Manual and automatic registration methods

The system can perform coregistration either manually by user input to adjust overlay positions or automatically by extracting features (fiducials) and generating volumetric data for transformation matrix computation.
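The manual path can be pictured as the user composing small corrections onto the current transformation matrix, e.g. dragging the overlay until it lines up. This is a minimal sketch under that assumption; the `nudge` helper is hypothetical, not from the patent.

```python
import numpy as np

def nudge(T: np.ndarray, dx: float = 0.0, dy: float = 0.0, dz: float = 0.0) -> np.ndarray:
    """Compose a small user-requested translation onto transform T."""
    delta = np.eye(4)
    delta[:3, 3] = [dx, dy, dz]
    return delta @ T  # apply the adjustment after the existing mapping

T = np.eye(4)            # start from the current (here: identity) registration
T = nudge(T, dx=1.5)     # user drags the overlay 1.5 units along x
```

Automatic registration would instead compute `T` directly from extracted fiducials, with manual nudges available as a refinement step.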

Use of fiducials for coregistration

Fiducials can include skin or external surfaces, deep anatomical structures, markers placed on the patient, or arterial/septal divides. Sensor data includes information about these fiducials to aid registration.

Integration of AR sensors

The system employs AR sensors such as cameras, 3D scanning devices, or ultrasound devices to capture sensor data for patient images and environment monitoring.

Dynamic monitoring and updating of anatomical changes

The system detects changes in anatomical structures caused by surgical interventions like cutting or drilling. It updates the transformation matrix and the displayed holograms accordingly to reflect changes in organ shape, size, or position.
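One simple way to reflect an organ-size change in the displayed hologram is to estimate a scale factor from how far the tracked fiducials have moved relative to their reference positions, then fold that scale into the transformation matrix. This sketch assumes an isotropic size change; the names are illustrative, not from the patent.

```python
import numpy as np

def update_scale(ref_fiducials: np.ndarray, live_fiducials: np.ndarray) -> float:
    """Ratio of mean fiducial spread about the centroid, live vs. reference."""
    ref_spread = np.linalg.norm(ref_fiducials - ref_fiducials.mean(axis=0), axis=1).mean()
    live_spread = np.linalg.norm(live_fiducials - live_fiducials.mean(axis=0), axis=1).mean()
    return live_spread / ref_spread

ref = np.array([[0, 0, 0], [2, 0, 0], [0, 2, 0]], float)
live = ref * 0.5                 # organ shrank to half size after resection
s = update_scale(ref, live)      # -> 0.5
T = np.eye(4)
T[:3, :3] *= s                   # rescale the hologram before re-display
```

Non-uniform deformation (cutting, drilling) would need a richer model, but the principle is the same: sensor data drives an update to the transform, which drives an update to the display.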

Tracking and displaying surgical instruments and surgeon's hands

The system includes methods to detect and superimpose surgical instruments and hands onto medical images, monitoring changes in their position or shape during surgery.

Display capabilities

The display is part of an AR device capable of rendering holograms and providing 3D binocular vision, with options for adjusting scaling factors to enhance visualization.

The independent claims cover a surgical navigation system that integrates AR for accurate spatial mapping and dynamic visualization of medical images superimposed onto the surgeon's view of the patient. The claims provide for both manual and automated coregistration using fiducials, incorporate real-time updates to account for anatomical changes during surgery, and cover surgical instrument tracking and display through AR.

Stated Advantages

Allows surgeons to view medical images overlaid directly on the patient in real time without looking away during surgery.

Facilitates intraoperative imaging by detecting changes in anatomy and updating medical images accordingly, reducing the need for repeat MRI scans.

Increases patient comfort by allowing patient movement during surgery without loss of registration accuracy, eliminating immobilization devices.

Improves surgical precision and understanding by providing magnified and binocular 3D views of anatomical structures.

Enables tracking of hands and instruments without special probes, allowing enhanced instrument localization and mapping within the surgical field.

Documented Applications

Use in various surgical procedures including neurosurgery, tumor resection, and acoustic neuroma removal.

Intraoperative imaging to assess the amount of diseased tissue removed and monitor surgical progress.

Facilitates endoscopic, laparoscopic, and bronchoscopy surgeries by providing augmented 3D views, including when traditional camera views are obscured (e.g., by blood).

Surgical navigation involving anatomical structures such as brain, nerves, arteries, veins, hematomas, heart, lungs, and other internal organs.

Tracking and display of surgical instruments and surgeon’s hands during medical procedures.
