Monocular visual-inertial alignment for scaled distance estimation on mobile devices
Inventors
German, Stan • Kim, Michael • Roth, Henry
Assignees
Publication Number
US-11372017-B2
Publication Date
2022-06-28
Expiration Date
2040-08-24
Interested in licensing this patent?
MTEC can help explore whether this patent might be available for licensing for your application.
Abstract
Methods, techniques, apparatus, and algorithms are described for robustly measuring real-world distances using any mobile device equipped with an accelerometer and a monocular camera. A general software implementation processes 2D video, precisely tracking points of interest across frames to estimate the unscaled trajectory of the device, which is used to correct the device's inertially derived trajectory. The visual and inertial trajectories are then aligned in scale space to estimate the physical distance traveled by the device and the true distance between the visually tracked points.
Core Innovation
The invention presents methods, algorithms, and apparatus for robustly measuring real-world distances using any mobile device that is equipped with an accelerometer and a monocular camera. A software implementation processes 2D video to precisely track points of interest across frames, resulting in an unscaled visual trajectory of the device's motion. This visual trajectory is used to correct the inertially derived trajectory, and then both the visual and inertial trajectories are aligned in scale space to estimate the physical distance traveled by the device as well as the true distance between the visually tracked points.
The disclosed approach addresses a key challenge in developing easy-to-use, low-cost tools for automatic real-world anthropometric measurements, such as assessing respirator fit or measuring pupillary distance. Traditional 3D and monocular camera-based methods require specialized hardware or multiple simultaneous cameras, depend on large trackable feature sets, and are often sensitive to motion and object deformations. Furthermore, conventional Structure From Motion (SFM) techniques require higher-quality inertial sensors and extensive device motion, conditions not usually met by mobile devices or when measuring human facial features.
By fusing visual velocities derived from tracked image features with inertial trajectories extracted from the device's accelerometer, the invention provides a streamlined and computationally efficient solution suitable for handheld mobile devices. The technique enables real-world distance measurements between visually tracked points without needing reference objects or large feature sets and is specifically tailored for execution on smartphones, making it adaptable to various measurement contexts and objects.
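Scale aligning the velocity signals by minimizing their Euclidean distance, as described above, admits a closed-form least-squares solution. The sketch below illustrates this under stated assumptions; the function name and test signals are illustrative and not taken from the patent:

```python
import numpy as np

def align_scale(visual_vel, inertial_vel):
    """Least-squares scale s minimizing ||s * visual_vel - inertial_vel||^2.

    Setting the derivative with respect to s to zero gives
    s = <visual, inertial> / <visual, visual>.
    """
    v = np.asarray(visual_vel, dtype=float).ravel()
    u = np.asarray(inertial_vel, dtype=float).ravel()
    return float(np.dot(v, u) / np.dot(v, v))

# If the true metric scale is 2.5, it is recovered from noiseless signals:
t = np.linspace(0.0, 1.0, 50)
inertial = np.sin(2 * np.pi * t)   # metric velocity (e.g., m/s)
visual = inertial / 2.5            # unscaled visual velocity
print(round(align_scale(visual, inertial), 6))  # → 2.5
```

With noisy real-world signals the same formula yields the scale that best matches the two velocity traces in the least-squares sense.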
Claims Coverage
There are two independent claims specifying the main inventive features: one for a method and another for a processor system for measuring real-world distance using mobile devices equipped with a camera and an inertial measurement unit (IMU).
Method for measuring real-world distance via visual-inertial alignment
A method that utilizes a mobile device with a camera and an inertial measurement unit (IMU) to measure real-world distance between two or more points in an image. The main inventive steps are:
- Aligning estimated visual and inertial trajectories.
- Converting both visual and inertial trajectory signals into velocity signals.
- Scale aligning the velocity signals by minimizing their Euclidean distance.
Processor system for visual-inertial distance measurement
A processor system configured for measuring real-world distance using a mobile device equipped with a camera and an IMU, comprising a processor and memory storing instructions that, when executed, cause the processor to:
- Align estimated visual and inertial trajectories.
- Convert the visual and inertial trajectory signals into velocity signals.
- Scale align the velocity signals by minimizing their Euclidean distance.
The inventive features focus on visual-inertial trajectory alignment and scale estimation through velocity signal alignment, realized as both a method and a processor system on mobile devices.
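The claimed sequence of steps (trajectories to velocities, scale alignment, then applying the recovered scale to the tracked-point separation) can be sketched as follows. All function names, the finite-difference velocity estimate, and the test values are assumptions for illustration, not the patent's specific implementation:

```python
import numpy as np

def trajectory_to_velocity(traj, dt):
    """Finite-difference velocity from a sequence of positions sampled at dt."""
    return np.diff(np.asarray(traj, dtype=float), axis=0) / dt

def measure_distance(visual_traj, inertial_traj, dt, point_sep_unscaled):
    """Scale-align the two velocity signals, then apply the recovered
    scale to the unscaled separation between the tracked image points."""
    v_vis = trajectory_to_velocity(visual_traj, dt).ravel()
    v_imu = trajectory_to_velocity(inertial_traj, dt).ravel()
    scale = np.dot(v_vis, v_imu) / np.dot(v_vis, v_vis)  # least-squares scale
    return scale * point_sep_unscaled

# Toy check: if the inertial trajectory is 3x the unscaled visual one,
# an unscaled separation of 0.02 maps to a metric distance of 0.06.
visual = np.cumsum(np.sin(np.linspace(0.0, 3.0, 30))) * 0.01
inertial = 3.0 * visual
print(round(measure_distance(visual, inertial, 0.05, 0.02), 6))  # → 0.06
```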
Stated Advantages
Enables robust and accurate real-world distance measurement using a mobile device equipped only with a monocular camera and accelerometer, without requiring a reference object.
Is computationally efficient and lightweight enough to run on smartphones or in a mobile web browser.
Is robust to low data rates and shallow device motions, and provides accurate measurements even when only a single pair of repeatable features (such as the pupils) is available for tracking.
Mitigates the effects of IMU sensor drift through a novel filtering approach based on visual trajectory zero velocity detection.
Meets optometry standards for accuracy in pupillary distance measurement and is suitable for sizing and fitting of respirator masks.
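One way the stated zero-velocity filtering could work is a ZUPT-style correction: integrate acceleration to velocity, and reset the integrated velocity whenever the visual trajectory indicates the device is stationary, so accelerometer bias cannot accumulate. The threshold and reset logic below are illustrative assumptions, not the patent's specific filter:

```python
import numpy as np

def zupt_correct(accel, visual_speed, dt, thresh=1e-3):
    """Integrate acceleration to velocity, zeroing the estimate wherever
    the visual trajectory detects (near-)zero velocity."""
    vel = np.zeros(len(accel), dtype=float)
    v = 0.0
    for i, a in enumerate(accel):
        v += a * dt                   # naive integration accumulates bias
        if visual_speed[i] < thresh:  # visual zero-velocity detection
            v = 0.0                   # cancel accumulated drift
        vel[i] = v
    return vel

# A constant accelerometer bias produces no drift while the visual
# trajectory reports the device as stationary:
biased_accel = np.full(10, 0.01)
print(zupt_correct(biased_accel, np.zeros(10), 0.01).max())  # → 0.0
```

Without the visual zero-velocity resets, the same biased signal would drift linearly, which is the failure mode the filtering approach mitigates.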
Documented Applications
Sizing and fitting of respirator masks using a smartphone camera.
Measurement of pupillary distance for optometry purposes using a smartphone camera.
3D body measurements and real-world measurement of distances between any two fixed points.