This paper discusses the depth acuity research conducted in support of the development of the Modular Multi-Spectral Stereoscopic (M2S2) night vision goggle (NVG), a customizable goggle that lets the user select one of five configurations: monocular thermal, monocular image intensifier (I2), binocular I2, binocular thermal, and binocular dual-waveband (thermal imagery to one eye and I2 imagery to the other). The motives for developing this type of customizable goggle were (1) the need for an NVG that allows the simultaneous use of two wavebands, (2) the need for an alternative sensor fusion method that avoids the potential image degradation that may accompany digitally fused images, (3) a requirement to provide the observer with stereoscopic, dual-spectrum views of a scene, and (4) the need to accommodate individual user preferences for the sensor types and ocular configurations employed in various military operations. Among the increased functionality this system offers the user is the ability to convert from a binocular I2 device (needed for detailed terrain analysis during off-road mobility) to a monocular thermal device (for increased situational awareness in the unaided eye during nights with full-moon illumination). Results of the present research revealed potential depth acuity advantages, relevant to off-road terrain hazard detection, for the binocular thermal configuration. The results also indicated that additional studies are needed to address ways of minimizing binocular incompatibility in the dual-waveband configuration.
The Advanced Displays and Interactive Displays Federated Laboratory is in its fourth year of a five-year research project. Results from diverse research areas, such as intelligent information processing techniques for the display of courses of action, augmented reality, tactile displays, bimodal speech recognition, and other cognitive engineering efforts, are being readied for transition to various Army customers. In this paper we describe some of these research results along with their potential Army applications.
KEYWORDS: Human-machine interfaces, Human-computer interaction, Visualization, Sensors, Databases, Control systems, 3D displays, Systems modeling, Telecommunications, Head
A Human-Computer Interface (HCI) research program is being conducted collaboratively between Rockwell, its Consortium Members, and the Army Research Laboratory. The research explores human-computer interfaces and displays to dramatically improve the interface between Army users and their information systems. This critical link to the human must be maximized to enable the Army to fully leverage its investments in information hardware and systems for the Digitization of the Battlefield and the Army After Next. Research is being conducted in the areas of databases, information filtering, visualization, augmented reality, and virtual reality. It also includes the integration of novel interfaces, such as gesture recognition and tactile displays, to improve overall throughput and reliability. Constructs, algorithms, and techniques are being developed to provide users with common views of the battlefield regardless of the display system utilized. Future systems must consider the human to maximize the conversion of data and information into usable knowledge.