V-GPS - Image-Based Control for 3D Guidance Systems

Darius Burschka, Gregory D. Hager

Research output: Contribution to conference › Paper › peer-review

17 Scopus citations


We present our approach to pose verification with monocular cameras in 3-dimensional space, based on the image-based control paradigm. We describe the extensions to our previous control system for mobile navigation that allow us to estimate the complete set of pose parameters in space. The major contribution of this approach is a sensor-independent formulation of the image formation process, which permits flexible configuration with a variety of sensor systems, including standard cameras, omnidirectional cameras, and laser systems. Our second contribution is a method for reinitializing the tracked landmarks during multi-segment navigation in applications with significant deviations from the pre-taught trajectory, as is the case for handheld systems and flying robots. The presented system can be used as a guidance system for visitors. Localization is based on known landmarks, in our case natural landmarks in the environment. These landmarks play the role of the satellites in a GPS system; because of this conceptual similarity, we call our system V-GPS (vision-based GPS). A camera carried by a person makes it possible to navigate along pre-specified paths through environments such as galleries, hospitals, parks, and other public places.
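The abstract's two key ideas can be illustrated with a minimal sketch: a sensor-independent measurement is a unit viewing ray rather than a pixel (so any sensor that can map a reading to a ray can be plugged in), and localization from known landmarks amounts to intersecting the bearing rays toward those landmarks. The code below is an illustrative reconstruction, not the paper's actual algorithm: `pixel_to_ray` assumes a simple pinhole model with intrinsic matrix `K`, and `locate_from_bearings` assumes the camera orientation is already known so the rays can be expressed in the world frame.

```python
import numpy as np

def pixel_to_ray(u, v, K):
    """Map a pixel to a unit viewing ray (sensor-independent representation).

    For a pinhole camera this is the back-projected, normalized direction.
    An omnidirectional camera or a laser scanner would supply its own
    pixel/measurement-to-ray mapping; downstream code only sees rays.
    """
    x = np.linalg.inv(K) @ np.array([u, v, 1.0])
    return x / np.linalg.norm(x)

def locate_from_bearings(landmarks, rays):
    """Least-squares camera position from known 3D landmarks and unit
    bearing rays in the world frame (orientation assumed known).

    Each ray constrains the camera position p to the line through
    landmark L_i with direction r_i.  The orthogonal-projection residual
    (I - r r^T)(p - L_i) = 0 for all i gives a linear system A p = b.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for L, r in zip(landmarks, rays):
        P = np.eye(3) - np.outer(r, r)  # projector orthogonal to the ray
        A += P
        b += P @ np.asarray(L, dtype=float)
    return np.linalg.solve(A, b)

# Hypothetical usage: four "satellite" landmarks, recover the camera position.
p_true = np.array([1.0, 2.0, 0.5])
landmarks = [np.array([0.0, 0.0, 5.0]), np.array([4.0, 0.0, 3.0]),
             np.array([0.0, 5.0, 2.0]), np.array([-3.0, 1.0, 4.0])]
rays = [(L - p_true) / np.linalg.norm(L - p_true) for L in landmarks]
p_est = locate_from_bearings(landmarks, rays)
```

With noise-free bearings, `p_est` matches `p_true` to numerical precision; with real image measurements the same least-squares structure averages the noise across landmarks, which is the sense in which the natural landmarks act like GPS satellites.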

Original language: English (US)
Number of pages: 7
State: Published - Dec 26 2003
Event: 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems - Las Vegas, NV, United States
Duration: Oct 27 2003 - Oct 31 2003


ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Computer Vision and Pattern Recognition
  • Computer Science Applications


