The helmet-mounted sight is a sighting device mounted on the pilot’s helmet and is becoming increasingly important in air combat. A prerequisite for the sight to function correctly is accurate knowledge of the pilot’s head orientation. Traditional vision-based methods for obtaining the head orientation rely on cooperative targets, typically multiple LEDs embedded in the helmet. However, installing multiple LEDs increases the weight of the helmet. In addition, measurement accuracy declines due to interference from the complex cockpit environment, strong illumination, and uncertainty in the LED luminous centers. To address these problems, this paper proposes a tightly coupled visual/IMU system for head pose estimation based on non-cooperative targets. A camera and an inertial measurement unit (IMU) are installed on the helmet to track the head orientation. In the reconstruction phase, a binocular system is used to reconstruct the interior of the cockpit and build a feature-point database based on structure from motion (SFM) and scale-invariant feature transform (SIFT) descriptors. In the measurement phase, feature points are extracted from the captured images and matched against the database to obtain their 3D world coordinates. These coordinates are fused directly with the inertial data through a cubature Kalman filter (CKF) to achieve fast and accurate attitude measurement. A practical experimental platform was set up to simulate measurement of the pilot’s head attitude. The experimental results verify the feasibility of the proposed measurement system and scheme.
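The fusion step described above relies on the cubature Kalman filter. As background, the CKF approximates the Gaussian-weighted integrals of a nonlinear filter with 2n equally weighted cubature points placed at ±√n along the columns of the covariance square root. The sketch below is a minimal, generic CKF predict/update cycle, not the paper’s implementation: the state, transition function `f`, measurement function `h`, and noise covariances `Q`, `R` are all illustrative placeholders.

```python
import numpy as np

def cubature_points(x, P):
    """Generate the 2n cubature points for mean x and covariance P."""
    n = x.size
    S = np.linalg.cholesky(P)                              # covariance square root
    xi = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])   # unit directions, scaled
    return x[:, None] + S @ xi                             # shape (n, 2n)

def ckf_step(x, P, f, h, Q, R, z):
    """One predict/update cycle of a cubature Kalman filter.

    f: state transition function, h: measurement function,
    Q/R: process/measurement noise covariances, z: measurement vector.
    """
    n = x.size
    # --- predict: propagate cubature points through f ---
    pts = cubature_points(x, P)
    pts_f = np.column_stack([f(pts[:, i]) for i in range(2 * n)])
    x_pred = pts_f.mean(axis=1)
    d = pts_f - x_pred[:, None]
    P_pred = d @ d.T / (2 * n) + Q
    # --- update: propagate fresh points through h and correct ---
    pts = cubature_points(x_pred, P_pred)
    pts_h = np.column_stack([h(pts[:, i]) for i in range(2 * n)])
    z_pred = pts_h.mean(axis=1)
    dz = pts_h - z_pred[:, None]
    dx = pts - x_pred[:, None]
    Pzz = dz @ dz.T / (2 * n) + R                          # innovation covariance
    Pxz = dx @ dz.T / (2 * n)                              # cross covariance
    K = Pxz @ np.linalg.inv(Pzz)                           # cubature Kalman gain
    x_new = x_pred + K @ (z - z_pred)
    P_new = P_pred - K @ Pzz @ K.T
    return x_new, P_new

# Illustrative use on a toy 1D constant-velocity model (position measured).
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
x, P = ckf_step(
    x=np.zeros(2), P=np.eye(2),
    f=lambda s: A @ s, h=lambda s: s[:1],
    Q=0.01 * np.eye(2), R=np.array([[0.1]]),
    z=np.array([1.0]),
)
```

Because the cubature points are propagated exactly through linear models, this toy example reproduces the standard Kalman filter; the benefit of the CKF appears when `f` and `h` are nonlinear, as in attitude estimation.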