Navigation of tubular networks
Inventors
Mintz, David S. • Ghoreyshi, Atiyeh • Jeevan, Prasanth • Xu, Yiliang • Yang, Gehua • Leotta, Matthew Joseph • Stewart, Charles V.
Assignees
Auris Health Inc • Kitware Inc
Publication Number
US-10796432-B2
Publication Date
2020-10-06
Expiration Date
2036-09-16
Interested in licensing this patent?
MTEC can help explore whether this patent might be available for licensing for your application.
Abstract
Methods and apparatuses provide improved navigation through tubular networks, such as lung airways, by improving the estimation of location and orientation information of a medical instrument (e.g., an endoscope) within the tubular network. Various input data, such as image data, EM data, and robot data, are used by different algorithms to estimate the state of the medical instrument, and the state information is used to locate a specific site within a tubular network and/or to determine navigation information indicating the positions and orientations through which the medical instrument should travel to arrive at that site. Probability distributions, together with confidence values generated by the different algorithms, are used to determine the medical instrument's estimated state.
Core Innovation
The invention provides methods and apparatuses for improved navigation through tubular networks, such as lung airways, by enhancing the estimation of the location and orientation of a medical instrument, such as an endoscope, within the network. It combines various input data—including image data from an optical sensor, electromagnetic (EM) data, and robotic movement data—using different algorithms to estimate the current state of the instrument. This estimated state is then employed to locate a specific site within the tubular network and to determine navigation information for guiding the instrument to that site.
A significant aspect of the system is the use of probability distributions and confidence values, which are generated by multiple algorithm modules corresponding to different data types. These are used to assess the instrument's estimated state and to improve accuracy in localizing the medical instrument in real time. The approach includes generating a three-dimensional (3D) model of the tubular network from CT scan data, selecting a target area within the network, and planning a path for the medical instrument to reach the target. During the procedure, the system repeatedly analyzes input data to provide real-time updates on the instrument’s location and orientation.
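The patent does not publish source code, so the following is only a minimal sketch of how per-module probability distributions and confidence values might be combined; the module names, data layout, and the confidence-weighted mixture rule are assumptions for illustration, not the patented fusion method.

```python
# Hypothetical sketch: confidence-weighted fusion of state estimates.
# Each algorithm module (image-, EM-, and robot-data-based) is assumed to emit a
# probability distribution over candidate airway segments plus a confidence value.

def fuse_estimates(module_outputs):
    """module_outputs: list of (probabilities: dict[segment_id, float], confidence: float)."""
    fused = {}
    total_confidence = sum(conf for _, conf in module_outputs) or 1.0
    for probs, conf in module_outputs:
        weight = conf / total_confidence
        for segment_id, p in probs.items():
            fused[segment_id] = fused.get(segment_id, 0.0) + weight * p
    # Normalize so the fused values form a probability distribution.
    norm = sum(fused.values()) or 1.0
    return {seg: p / norm for seg, p in fused.items()}

# Example: the image module strongly favors segment "B2", the EM module is less certain.
image_est = ({"B1": 0.2, "B2": 0.8}, 0.9)   # (distribution, confidence)
em_est    = ({"B1": 0.5, "B2": 0.5}, 0.4)
robot_est = ({"B1": 0.3, "B2": 0.7}, 0.6)

state = fuse_estimates([image_est, em_est, robot_est])
best_segment = max(state, key=state.get)   # most likely current segment
```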
The background identifies a need for improved navigation techniques, given the difficulty of accurately estimating the motion and location of medical instruments within tubular anatomical networks using prior approaches, even when robotic bronchoscopes and 3D models are employed. Inaccurate localization can present misleading visual references to physicians during procedures, which motivates the improved methods described.
Claims Coverage
There are three independent claims, each introducing inventive features related to navigation in tubular networks using image analysis, probability-based state estimation, and data fusion from different sensors.
Surgical robotic system with optical sensor-based state estimation
A surgical robotic system comprising:
- An endoscopic tool with a flexible tip and an optical sensor coupled to it.
- A processor configured to:
  - Receive a plurality of images from the optical sensor.
  - Identify, within an image, a division in the tubular network with multiple openings to respective segments.
  - Track the detected openings over sequential image frames to determine probabilities for entering each opening.
  - Determine a first estimated state of the flexible tip based on the identified division and the probabilities, where the state comprises at least one of position or orientation within the network.
  - Display the estimated position of the flexible tip within a segment on a display, determined from the estimated state.
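The opening-tracking step in the claim above could be approximated as follows; this is a rough sketch only, in which the detection format (per-frame center and apparent area for each matched opening) and the growth-and-centrality scoring heuristic are assumptions rather than the claimed algorithm.

```python
# Minimal illustrative sketch (not the patented algorithm): estimate the
# probability of entering each detected airway opening by tracking how each
# opening's apparent size grows and how close it stays to the image center
# across sequential frames.

def entry_probabilities(frames, image_center=(0.5, 0.5)):
    """frames: list of dicts mapping opening_id -> (cx, cy, area), one per image frame."""
    scores = {}
    first, last = frames[0], frames[-1]
    for opening_id, (cx, cy, area_last) in last.items():
        area_first = first.get(opening_id, (cx, cy, area_last))[2]
        growth = max(area_last / max(area_first, 1e-6), 1e-6)   # scope advancing toward this opening?
        dist = ((cx - image_center[0]) ** 2 + (cy - image_center[1]) ** 2) ** 0.5
        centrality = 1.0 / (1.0 + dist)                          # closer to image center scores higher
        scores[opening_id] = growth * centrality
    total = sum(scores.values()) or 1.0
    return {oid: s / total for oid, s in scores.items()}

# Two openings detected at a bifurcation over three frames:
frames = [
    {"left": (0.35, 0.5, 0.010), "right": (0.70, 0.5, 0.012)},
    {"left": (0.40, 0.5, 0.018), "right": (0.75, 0.5, 0.011)},
    {"left": (0.45, 0.5, 0.030), "right": (0.80, 0.5, 0.010)},
]
print(entry_probabilities(frames))   # the "left" opening gets most of the probability mass
```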
Non-transitory computer readable storage medium with instructions for navigation using optical images and state estimation
A non-transitory computer readable storage medium containing instructions that, when executed by one or more processors, cause the processors to:
- Receive multiple images from an optical sensor on the flexible tip of an endoscopic tool within a tubular network.
- Identify, within an image, a division of the network defining multiple openings to respective segments.
- Track these detected openings over sequential image frames to determine probabilities for entering each opening.
- Determine a first estimated state of the flexible tip based on the division and probabilities, where the state is at least one of position or orientation.
- Show the estimated position of the tip within a segment, based on this estimated state, on a display.
Method for navigating a tubular network using optical sensor state estimation and probability tracking
A method including:
- Receiving multiple images from an optical sensor on a flexible tip of an endoscopic tool inserted in a tubular network.
- Identifying, within an image, a division with multiple openings leading to segments of the network.
- Tracking these detected openings over sequential images to determine entry probabilities.
- Determining a first estimated state (at least position or orientation) of the tip using the division and probabilities.
- Showing, based on the estimated state, an estimated position of the tip within a segment on a display.
The inventive features center on identifying divisions in a tubular network through image-based detection, tracking openings to estimate entry probabilities for navigation, and displaying real-time position or orientation estimates by fusing image-derived state information, optionally in combination with other sensor data.
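For the display step common to all three claims, an estimated state might be rendered by mapping it onto the CT-derived 3D model. The sketch below assumes a simplified state of (segment identifier, fractional depth along the segment) and a polyline centerline per segment; both are stand-ins for whatever representation the system actually uses.

```python
# Illustrative sketch only: turn an estimated state (segment identifier plus a
# fractional depth along that segment) into a 3D point on the model's centerline
# for display. The centerline representation is an assumed stand-in.

def position_on_segment(centerline, depth_fraction):
    """Linearly interpolate a point at depth_fraction (0..1) along a polyline centerline."""
    if len(centerline) == 1:
        return centerline[0]
    # Cumulative arc length along the polyline.
    lengths = [0.0]
    for (x0, y0, z0), (x1, y1, z1) in zip(centerline, centerline[1:]):
        seg_len = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
        lengths.append(lengths[-1] + seg_len)
    target = depth_fraction * lengths[-1]
    for i in range(1, len(lengths)):
        if target <= lengths[i]:
            t = (target - lengths[i - 1]) / max(lengths[i] - lengths[i - 1], 1e-9)
            (x0, y0, z0), (x1, y1, z1) = centerline[i - 1], centerline[i]
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0), z0 + t * (z1 - z0))
    return centerline[-1]

# Estimated state: tip is ~60% of the way along segment "B2" of the airway model.
model = {"B2": [(0.0, 0.0, 0.0), (0.0, 5.0, 1.0), (0.0, 9.0, 3.0)]}
print(position_on_segment(model["B2"], 0.6))
```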
Stated Advantages
Provides improved estimation of location and orientation information of a medical instrument within a tubular network, enhancing real-time navigation accuracy.
Allows the use of multiple algorithms, probability distributions, and confidence values to generate more accurate state and navigation data for the medical instrument.
Facilitates more convenient and accurate operations for physicians by integrating various data sources (image, EM, robotic, CT-derived 3D model) in navigation.
Enables real-time tracking and correction of instrument position errors using feedback from multiple sensor inputs.
Documented Applications
Navigation and localization of endoscopic or medical instruments within tubular networks of a patient’s body, such as lung airways (e.g., bronchoscopy).
Pre-operative planning for endoscopic surgical procedures, including generating 3D models and determining paths to target areas (such as lesions) in tubular networks; a rough path-search sketch follows this list.
Providing real-time guidance and visualization for physicians during minimally invasive surgeries involving flexible or rigid elongated medical instruments in networked anatomical structures.
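As a rough sketch of the path-determination step in pre-operative planning (the patent does not specify the search algorithm used; the airway tree representation, segment names, and the breadth-first search below are assumptions for illustration):

```python
# Hypothetical sketch: given an airway tree extracted from the CT-derived 3D model
# (segment -> child segments, names invented for illustration), find the sequence
# of segments from the trachea to the segment containing the target lesion.
from collections import deque

def plan_path(airway_tree, start, target):
    """Breadth-first search over the segment graph; returns the segment sequence or None."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for child in airway_tree.get(path[-1], []):
            if child not in visited:
                visited.add(child)
                queue.append(path + [child])
    return None

airway_tree = {
    "trachea": ["left_main", "right_main"],
    "left_main": ["LB1", "LB2"],
    "right_main": ["RB1", "RB2"],
    "RB2": ["RB2a", "RB2b"],
}
print(plan_path(airway_tree, "trachea", "RB2b"))
# ['trachea', 'right_main', 'RB2', 'RB2b']
```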