The Knuth Cyberphysics Laboratory, in collaboration with Autonomous Exploration Inc., has been awarded a 2009 NASA Phase I SBIR grant for the proposal "Advanced Bayesian Methods for Lunar Surface Navigation." We will be working to develop advanced Visual Odometry (VO) navigation systems targeting the new Lunar Electric Rover and the Mark III space suits. The image to the right shows one of our prototype VO systems on a spacesuit testbed.
The key innovation of this project will be the application of advanced Bayesian methods to integrate real-time dense stereo vision and high-speed optical flow with an Inertial Measurement Unit (IMU), producing a highly accurate planetary rover navigation system. The software developed in this project will leverage current computing technology to implement advanced VO methods that accurately track much faster rover movements. Our fully Bayesian approach to VO extracts more information from the images than previous methods can. Rather than explicitly selecting features to track, it implicitly determines what can be learned from each image pixel and weights that information accordingly. This means our approach can work with images that have no distinct corners, a significant advantage for low-contrast images from permanently shadowed areas. We expect the error characteristics of the visual processing to be complementary to those of a low-cost IMU, so the combination of the two should provide highly accurate navigation.
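The idea of weighting every pixel by its information content, rather than tracking selected corners, can be illustrated with a minimal one-dimensional sketch. This is not the project's actual implementation; the function name, the Gaussian noise model, and the `sigma_n` parameter are illustrative assumptions. Under brightness constancy, a pixel's residual variance scales inversely with its squared intensity gradient, so an inverse-variance weighted least-squares fit lets strong-gradient pixels dominate while flat, featureless pixels contribute almost nothing:

```python
import numpy as np

def weighted_shift_estimate(img0, img1, sigma_n=0.01):
    """Estimate a small 1-D translation (in pixels) between two scanlines.

    Every pixel contributes, weighted by how informative it is; no
    corner detector or explicit feature selection is needed.
    """
    # Spatial gradient (central differences) and temporal difference
    Ix = np.gradient(0.5 * (img0 + img1))
    It = img1 - img0
    # Inverse-variance weight per pixel: strong-gradient pixels carry
    # more motion information; flat pixels are automatically discounted.
    # (sigma_n cancels for this single global parameter, but would set
    # the relative weight of the visual term against an IMU prior.)
    w = Ix**2 / sigma_n**2
    # Weighted least-squares solution of the brightness-constancy
    # constraint It + v * Ix = 0
    return -np.sum(w * Ix * It) / np.sum(w * Ix**2)

# A smooth sinusoidal scanline shifted by 0.3 pixels
x = np.linspace(0, 2 * np.pi, 400)
dx = x[1] - x[0]
v = weighted_shift_estimate(np.sin(x), np.sin(x - 0.3 * dx))
```

Note that nothing in the scene above has a distinct corner, yet the sub-pixel shift is recovered; in a full Bayesian system each pixel's weight would likewise enter the posterior over camera motion, to be fused with the IMU measurement.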
Potential NASA Commercial Applications
Visual Odometry (VO) has played a key role in Mars exploration with the Spirit and Opportunity Mars Exploration Rovers (MERs). However, limited onboard computing power severely restricts the speed of movement that MER VO can track, forcing an order-of-magnitude reduction in forward progress in areas where VO was required. The software developed in this project will leverage current computing technology to implement advanced VO methods that accurately track much faster rover movements, greatly increasing exploration productivity. This improvement will become even more significant when exploring more distant planetary bodies.
This project will also investigate whether combining vision with a low-cost, lightweight, low-power Micro-ElectroMechanical System (MEMS) Inertial Measurement Unit (IMU) can produce acceptable accuracy for lunar and planetary exploration. If so, this will facilitate the design of lower-cost, light-weight rovers, which will make it feasible to launch a team of rovers for wide area exploration.
Potential non-NASA Commercial Applications
There are many potential terrestrial applications for a Bayesian VO system. Although GPS-IMU systems work well in open outdoor settings, GPS is degraded or unavailable indoors and in outdoor areas with significant tree cover. A navigation system combining GPS and an IMU with Bayesian VO could provide continuous operation in all of these environments. The success of this project should lay the groundwork for low-cost, low-power, lightweight integrated navigation systems for robots and autonomous vehicles operating in a wide range of environments. One potential market for this technology is the Department of Defense (DoD): Congress has mandated that 30% of DoD ground vehicles be robotic by 2020. An accurate, low-cost VO system should allow many of these vehicles to operate semi-autonomously, requiring only supervisory control for many missions.
NASA’s technology taxonomy has been developed by the SBIR-STTR program to disseminate awareness of proposed and awarded R/R&D in the agency. It is a listing of over 100 technologies, sorted into broad categories, of interest to NASA.
Technology Taxonomy Mapping
Guidance, Navigation, and Control
Perception/Sensing