Hyperspectral object tracking aims to estimate the bounding box of a given target from hyperspectral data. Unlike traditional color videos, hyperspectral videos contain richer band information because they capture the reflectance spectrum of the target over a wider range of wavelengths, which provides new capabilities for discriminating targets in complex scenes but also presents new challenges. Limited datasets and the high dimensionality of hyperspectral data are two key obstacles to constructing hyperspectral trackers, which is why existing hyperspectral tracking methods are based mainly on correlation filters. This paper proposes a new Complementary Features-aware Attentive Multi-Adapter Network (CFA-MANet) that can be trained effectively and achieves high performance for hyperspectral object tracking using only limited data. Specifically, we add a complementary features-aware module to the multi-adapter network; it employs two different strategies to reduce the dimensionality of hyperspectral data from two complementary perspectives, and jointly applying these strategies reduces both the amount of computed data and the number of parameters in the designed network while achieving competitive results. Moreover, spatial and channel attention modules are used to learn a wider range of contexts and to improve the representation of different semantic features, respectively, and cross-attention is used to learn complementary information and thus generate more discriminative representations. Experimental results on hyperspectral datasets show that our method achieves the best results compared to several recent hyperspectral tracking methods.
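To make the attention mechanisms mentioned above concrete, the following is a minimal NumPy sketch of channel-wise and spatial gating applied to a feature map. The pooling choices and the simple sigmoid gates are illustrative assumptions, not the paper's actual CFA-MANet design, which is not specified in this abstract.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat):
    # feat: (C, H, W). Global average pooling summarizes each channel;
    # a sigmoid gate then reweights channels by their pooled response.
    # (Hypothetical simplification: real modules typically add a small MLP.)
    pooled = feat.mean(axis=(1, 2))           # (C,)
    weights = sigmoid(pooled)                 # (C,)
    return feat * weights[:, None, None]

def spatial_attention(feat):
    # Pool across channels to obtain a single spatial map, then gate
    # spatial locations so the network attends to wider context.
    avg_map = feat.mean(axis=0)               # (H, W)
    max_map = feat.max(axis=0)                # (H, W)
    weights = sigmoid(avg_map + max_map)      # (H, W)
    return feat * weights[None, :, :]

# Apply both gates in sequence; output shape matches the input.
feat = np.random.randn(8, 16, 16)
out = spatial_attention(channel_attention(feat))
print(out.shape)
```

Both gates preserve the feature-map shape, so they can be dropped into an existing network between layers without changing downstream dimensions.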