
Integrated Visual and Inertial Sensor Information for Onboard Navigation (I-VISION)

Award Information
Agency: Department of Defense
Branch: Air Force
Contract: FA9550-08-C-0032
Agency Tracking Number: F064-032-0116
Amount: $499,912.00
Phase: Phase II
Program: STTR
Solicitation Topic Code: AF06-T032
Solicitation Number: N/A
Timeline
Solicitation Year: 2006
Award Year: 2008
Award Start Date (Proposal Award Date): 2007-11-02
Award End Date (Contract End Date): 2009-11-02
Small Business Information
9950 Wakeman Drive
Manassas, VA 20110
United States
DUNS: 604717165
HUBZone Owned: No
Woman Owned: No
Socially and Economically Disadvantaged: No
Principal Investigator
Name: James Paduano
Title: Principal Investigator
Phone: (617) 225-4378
Email: jpaduano@aurora.aero
Business Contact
 Diana Eichfeld
Title: Contracts Manager
Phone: (703) 396-6329
Email: deichfeld@aurora.aero
Research Institution
MASSACHUSETTS INST. OF TECHNOLOGY
Michael P Corcoran
77 Massachusetts Avenue Room E19-750
Cambridge, MA 02139
United States
(617) 253-3906
Nonprofit College or University
Abstract

Given the vulnerability of the GPS system to intentional or hostile denial of service, it is imperative that the U.S. military develop technologies that provide robust navigation solutions that do not rely on GPS data. To achieve this capability, Aurora and MIT have teamed to develop the Integrated Visual and Inertial Sensor Information for Onboard Navigation (I-VISION) system, which couples optic flow-based ego-motion estimation with inertial navigation systems to achieve precision navigation without GPS. Current-generation inertial measurement units (IMUs) provide very accurate angular rate and linear acceleration information, which can be integrated to estimate a platform’s position, velocity, and orientation over time. Due to sensor biases and scale factor errors, however, the accuracy of this estimate degrades with time. Coupling visual flow with IMU-based ego-motion estimation lets the IMU help resolve vision-based errors and ambiguities, while visual flow estimates in turn reduce IMU sensor biases and scale factor errors. A multiple-camera system is employed to better resolve the ambiguities inherent in optical flow-based ego-motion estimation. Both tightly coupled and ultra-tightly coupled implementations are being considered.
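The bias-compensation idea described above can be illustrated with a minimal one-dimensional Kalman filter sketch: an accelerometer with a constant bias is integrated for dead reckoning, while intermittent noisy velocity measurements (standing in for optical-flow ego-motion estimates) make the bias observable so the filter can remove it. All names, noise levels, and rates here are illustrative assumptions, not parameters of the actual I-VISION system.

```python
import numpy as np

def simulate_vio_1d(steps=500, dt=0.01, accel_bias=0.2, seed=0):
    """1-D sketch of coupling an IMU with optical-flow velocity.

    Filter state x = [position, velocity, accel_bias]. The IMU reports
    true acceleration plus a constant bias and noise; every 10 IMU steps
    an optical-flow-style velocity measurement updates the filter, which
    estimates (and thereby cancels) the bias. Returns the final filter
    state, the true velocity, and the IMU-only dead-reckoned velocity.
    """
    rng = np.random.default_rng(seed)
    a_true = 0.5                      # assumed constant true acceleration
    p, v = 0.0, 0.0                   # true position and velocity

    x = np.zeros(3)                   # [p, v, b] estimate
    P = np.diag([1e-4, 1e-4, 1.0])    # initially very uncertain about bias
    F = np.array([[1.0, dt, 0.0],
                  [0.0, 1.0, -dt],    # v += (a_meas - b) * dt
                  [0.0, 0.0, 1.0]])
    Q = np.diag([1e-6, 1e-4, 1e-8])   # small process noise
    H = np.array([[0.0, 1.0, 0.0]])   # flow observes velocity only
    R = np.array([[0.01]])            # flow measurement noise variance

    dead_reckon_v = 0.0               # IMU-only integration for comparison
    for k in range(steps):
        # Truth propagation
        v += a_true * dt
        p += v * dt
        # IMU measurement: true acceleration + bias + noise
        a_meas = a_true + accel_bias + rng.normal(0.0, 0.05)
        # Predict: integrate the bias-corrected acceleration
        x = F @ x + np.array([0.0, a_meas * dt, 0.0])
        P = F @ P @ F.T + Q
        dead_reckon_v += a_meas * dt  # drifts at ~bias * elapsed time
        # Update with an optical-flow velocity fix every 10 IMU steps
        if k % 10 == 0:
            z = v + rng.normal(0.0, 0.1)
            y = z - H @ x                       # innovation
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
            x = x + (K @ y).ravel()
            P = (np.eye(3) - K @ H) @ P
    return x, v, dead_reckon_v
```

Without the flow updates, the dead-reckoned velocity drifts by roughly bias × elapsed time; with them, the filter's velocity error stays bounded and its bias estimate converges toward the true bias, which is the aiding relationship the abstract describes.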

* Information listed above is at the time of submission. *
