Paper
Fusion of inertial and vision data for accurate tracking
13 January 2012
Jing Chen, Wei Liu, Yongtian Wang, Junwei Guo
Abstract
We present a sensor fusion framework for real-time tracking applications that combines inertial sensors with a camera. To make clear how the information in the inertial sensor can be exploited, two different fusion models, a gyroscopes-only model and an accelerometers model, are presented under an extended Kalman filter framework. The gyroscopes-only model uses the gyroscopes to support vision-based tracking without considering acceleration measurements. The accelerometers model uses the measurements from the gyroscopes, the accelerometers, and the vision data to estimate the camera pose, velocity, acceleration, and sensor biases. Experiments on synthetic data and real image sequences show dramatic improvements in tracking stability and in the robustness of the estimated motion parameters for the gyroscopes-only model when the accelerometer measurements drift.
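The gyroscopes-only idea described in the abstract can be sketched as a small Kalman filter in which gyroscope rates drive the prediction step and a vision-derived orientation corrects it, with the gyroscope bias estimated as part of the state. This is a minimal single-axis illustration under assumed noise parameters, not the authors' implementation; the state `[angle, gyro_bias]`, the noise values, and the function names are all assumptions for the sketch.

```python
import numpy as np

# Minimal single-axis sketch of gyro-aided, vision-corrected tracking.
# State x = [orientation angle (rad), gyroscope bias (rad/s)].
# All noise parameters are illustrative assumptions.

def predict(x, P, gyro_rate, dt, q_angle=1e-4, q_bias=1e-6):
    """Prediction step: propagate the angle with the bias-corrected gyro rate."""
    F = np.array([[1.0, -dt],
                  [0.0, 1.0]])                 # state-transition Jacobian
    x = np.array([x[0] + (gyro_rate - x[1]) * dt, x[1]])
    Q = np.diag([q_angle, q_bias])             # process noise
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, vision_angle, r_vision=1e-2):
    """Update step: correct the state with an angle measured by the camera."""
    H = np.array([[1.0, 0.0]])                 # vision observes the angle only
    y = vision_angle - H @ x                   # innovation
    S = H @ P @ H.T + r_vision                 # innovation covariance (1x1)
    K = P @ H.T / S                            # Kalman gain
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

Because the vision measurement observes the angle directly while the gyroscope integrates it, repeated innovations make the constant gyro bias observable, so the filter both tracks the orientation and converges on the bias, which is the role the abstract assigns to the fusion of the two sensors.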
© (2012) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Jing Chen, Wei Liu, Yongtian Wang, and Junwei Guo "Fusion of inertial and vision data for accurate tracking", Proc. SPIE 8349, Fourth International Conference on Machine Vision (ICMV 2011): Machine Vision, Image Processing, and Pattern Analysis, 83491D (13 January 2012); https://doi.org/10.1117/12.920343
CITATIONS
Cited by 1 scholarly publication and 1 patent.
KEYWORDS
Visual process modeling
Gyroscopes
Cameras
Sensors
Motion models
Data modeling
Sensor fusion
