This work reports the results of shore-based testing in a maritime environment and shipboard at-sea testing of the Shipboard Panoramic Electro-Optical Infrared (EO/IR) Cueing and Surveillance System (SPECSS), developed under an Office of Naval Research (ONR Code 31) Future Naval Capabilities program. The key technology enablers for the system are 25-megapixel mid-wave infrared (MWIR) cameras that utilize III-V strained-layer superlattice nBn detectors, and highly parallel detection processing on the latest commercially available graphics processing units (GPUs). The effects of maritime atmospheric and background conditions, as well as ship motion and line-of-sight stabilization, on the performance of the system will be discussed. Demonstration of highly accurate angular target cueing to other high-resolution sensors will be presented.
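To make the GPU detection-processing idea concrete, here is a minimal sketch, not the SPECSS algorithm itself, of highly parallel point-target detection on a large MWIR frame: local background and deviation estimates feed a contrast threshold, with PyTorch standing in for the GPU kernels. The window size and threshold are illustrative assumptions.

```python
# Illustrative sketch (not the SPECSS algorithm): GPU-parallel point-target
# detection on a large MWIR frame via local background estimation and
# thresholding. Window size (31) and threshold (k) are assumed values.
import torch
import torch.nn.functional as F

def detect_point_targets(frame: torch.Tensor, k: float = 8.0) -> torch.Tensor:
    """frame: 2-D tensor of pixel intensities (e.g., a large MWIR image)."""
    x = frame[None, None].float()                          # (1, 1, H, W)
    mean = F.avg_pool2d(x, 31, stride=1, padding=15)       # local background
    sq_mean = F.avg_pool2d(x * x, 31, stride=1, padding=15)
    std = (sq_mean - mean * mean).clamp_min(1e-6).sqrt()   # local deviation
    snr = (x - mean) / std                                 # contrast-to-clutter
    return (snr > k).squeeze()                             # boolean detections

frame = torch.rand(5120, 5120,
                   device="cuda" if torch.cuda.is_available() else "cpu")
mask = detect_point_targets(frame)
print(int(mask.sum()), "detections")
```

Every pixel's statistics are computed by the same dense pooling kernels, which is what makes this style of detection map naturally onto a GPU.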
Folded-path reflective and catadioptric optics are of growing interest, especially in the long-wave infrared (LWIR), due to continuing demands for reductions in imaging system size, weight and power (SWAP). We present the optical design and laboratory data for a 50 mm focal length, low-f/#, compact folded-path LWIR imaging system. The optical design uses 4 concentric aspheric mirrors, each described by annular aspheric functions well suited to the folded-path design space. The 4 mirrors are diamond turned onto two thin air-spaced aluminum plates, which can be manually adjusted to focus the image onto the uncooled LWIR microbolometer array detector. A stray-light analysis will be presented to show how specialized internal baffling can reduce stray-light propagation through the folded-path optical train. The system achieves near-diffraction-limited performance across the FOV with a 15 mm long optical train and a 5 mm back focal distance. The completed system is small enough to reside within a 3-inch-diameter ball gimbal.
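For readers unfamiliar with aspheric mirror descriptions, the block below gives the standard even-asphere sag equation; the paper's annular aspheric functions are presumably a variant defined over each mirror's annulus, so the zone-centered polynomial shown is an assumption, not the authors' exact form.

```latex
% Illustrative only: standard even-asphere sag with curvature c = 1/R and
% conic constant k. One common annular variant expands the polynomial about
% a zone radius r_0 rather than the vertex, over the annulus r_0 <= r <= r_1.
z(r) = \frac{c\, r^{2}}{1 + \sqrt{1 - (1 + k)\, c^{2} r^{2}}}
       + \sum_{i=2}^{N} a_{2i}\,(r - r_0)^{2i},
       \qquad r_0 \le r \le r_1
```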
One of the desired capabilities for wide-area persistent ISR systems is to reliably locate and subsequently track the movement of targets within the field of view. Current wide-area persistent ISR systems are characterized by large overall pixel counts and very large fields of view. This leads to a large ground sample distance with few pixels on target. Locating targets under these constraints is extremely difficult because the targets present very little detailed structure. In this paper we present the application of rich image feature descriptors combined with advanced statistical target detection methodologies to the airborne ISR problem. We demonstrate that these algorithms can reliably locate targets in the scene without relying on the target's motion to form a detection. This is useful in ISR applications where it is desirable to continuously track a target through stops and maneuvers.
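As a hedged illustration of the approach described, rich feature descriptors scored by a statistical classifier rather than motion cues, the sketch below uses HOG features and a linear SVM over small sliding windows. The chip size, stride, and classifier choice are our assumptions, not the paper's.

```python
# Hypothetical sketch of appearance-based (non-motion) target detection:
# HOG descriptors of small image chips scored by a linear classifier.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

WIN, STRIDE = 24, 8   # small chips: few pixels on target in wide-area ISR

def describe(chip):
    return hog(chip, orientations=9, pixels_per_cell=(6, 6),
               cells_per_block=(2, 2), feature_vector=True)

def train(target_chips, clutter_chips):
    X = np.array([describe(c) for c in target_chips + clutter_chips])
    y = np.array([1] * len(target_chips) + [0] * len(clutter_chips))
    return LinearSVC().fit(X, y)

def detect(frame, clf):
    hits = []
    for r in range(0, frame.shape[0] - WIN, STRIDE):
        for c in range(0, frame.shape[1] - WIN, STRIDE):
            score = clf.decision_function([describe(frame[r:r+WIN, c:c+WIN])])
            if score[0] > 0:
                hits.append((r, c))
    return hits
```

Because the score depends only on each chip's appearance, a detection survives target stops and maneuvers, which is the property the abstract emphasizes.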
The capabilities of tactical intelligence, surveillance, and reconnaissance (ISR) payloads continue to expand from single-sensor imagers to integrated systems-of-systems architectures. We describe here flight test results of the Sensor Management System (SMS), designed to provide a flexible central coordination component capable of managing multiple collaborative sensor systems onboard an aircraft or unmanned aerial system (UAS). The SMS architecture is designed to be sensor and data agnostic and to provide flexible networked access for both data providers and data consumers. It supports pre-planned and ad-hoc missions, with provisions for on-demand tasking and updates from users connected via data links. The SMS system is STANAG 4575 compliant as a removable memory module (RMM) and can act as a vehicle specific module (VSM) to provide STANAG 4586 compliance (level-3 interoperability) to a non-compliant sensor system. The SMS architecture will be described, and results from several flight tests that included multiple sensor combinations and live data link updates will be shown.
Unmanned aerial systems (UASs) have become a critical asset in current battlespaces and continue to play an increasing role for intelligence, surveillance and reconnaissance (ISR) missions. With the development of medium-to-low altitude, rapidly deployable aircraft platforms, the ISR community has seen an increasing push to develop ISR sensors and systems with real-time mission support capabilities. This paper describes recent flight demonstrations and test results of the RASAR (Real-time, Autonomous, Synthetic Aperture Radar) sensor system. RASAR is a modular, multi-band (L and X) synthetic aperture radar (SAR) imaging sensor designed for self-contained, autonomous, real-time operation with mission flexibility to support a wide range of ISR needs within the size, weight and power constraints of Group III UASs. The sensor command and control and real-time image formation processing are designed to allow integration of RASAR into a larger, multi-intelligence system of systems. The multi-intelligence architecture and a demonstration of real-time autonomous cross-cueing of a separate optical sensor will be presented.
The utilization of unmanned aerial systems (UASs) for intelligence, surveillance and reconnaissance (ISR) applications
continues to increase and unmanned systems have become a critical asset in current and future battlespaces. With the
development of medium-to-low altitude, rapidly deployable aircraft platforms, the ISR community has seen an
increasing push to develop ISR sensors and systems with real-time mission support capabilities. This paper describes the
design and development of the RASAR (Real-time, Autonomous, Synthetic Aperture Radar) sensor system and presents
demonstration flight test results. RASAR is a modular, multi-band (L and X) synthetic aperture radar (SAR) imaging
sensor designed for self-contained, autonomous, real-time operation with mission flexibility to support a wide range of
ISR needs within the size, weight and power constraints of Group III UASs. SAR waveforms are generated through direct digital synthesis, enabling arbitrary waveform notching for operation in cluttered RF environments. RASAR is capable of simultaneous dual-channel receive to enable polarization-based target discrimination. The sensor command and control and real-time image formation processing are designed to enable integration of RASAR into a larger multi-intelligence system of systems. The multi-intelligence architecture and a demonstration of real-time autonomous cross-cueing of a separate optical sensor will be presented.
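The waveform-notching idea lends itself to a short sketch: a digitally synthesized LFM chirp whose spectrum is notched to avoid an occupied band. All parameters below are hypothetical, and a fielded system would shape the notch more carefully than simple bin-zeroing.

```python
# Illustrative sketch (parameters hypothetical): digitally synthesized LFM
# chirp with a spectral notch, the kind of arbitrary waveform shaping that
# direct digital synthesis permits in cluttered RF environments.
import numpy as np

fs, T, B = 500e6, 20e-6, 100e6                 # sample rate, pulse width, sweep
t = np.arange(int(fs * T)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t**2)    # baseband linear FM pulse

spec = np.fft.fft(chirp)
f = np.fft.fftfreq(len(chirp), 1 / fs)
notch = (f > 20e6) & (f < 25e6)                # e.g., protect an occupied band
spec[notch] = 0.0
notched_chirp = np.fft.ifft(spec)              # transmit-ready notched waveform
```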
The capabilities of tactical intelligence, surveillance, and reconnaissance (ISR) payloads are expanding from single-sensor imagers to integrated systems-of-systems architectures. Increasingly, these systems-of-systems include multiple sensing modalities that can act as force multipliers for the intelligence analyst. Currently, the separate sensing modalities operate largely independently of one another, providing a selection of operating modes but not an integrated intelligence
product. We describe here a Sensor Management System (SMS) designed to provide a small, compact processing unit
capable of managing multiple collaborative sensor systems on-board an aircraft. Its purpose is to increase sensor
cooperation and collaboration to achieve intelligent data collection and exploitation. The SMS architecture is designed to
be largely sensor and data agnostic and provide flexible networked access for both data providers and data consumers. It
supports pre-planned and ad-hoc missions, with provisions for on-demand tasking and updates from users connected via
data links. Management of sensors and user agents takes place over standard network protocols such that any number
and combination of sensors and user agents, either on the local network or connected via data link, can register with the
SMS at any time during the mission. The SMS provides control over sensor data collection to handle logging and routing
of data products to subscribing user agents. It also supports the addition of algorithmic data processing agents for
feature/target extraction and provides for subsequent cueing from one sensor to another. The SMS architecture was
designed to scale from a small UAV carrying a limited number of payloads to an aircraft carrying a large number of
payloads. The SMS system is STANAG 4575 compliant as a removable memory module (RMM) and can act as a
vehicle specific module (VSM) to provide STANAG 4586 compliance (level-3 interoperability) to a non-compliant
sensor system. The SMS architecture will be described and results from several flight tests and simulations will be
shown.
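The registration and routing behavior described above amounts to a publish/subscribe pattern; the following minimal sketch illustrates it with invented class and method names, not the actual SMS interfaces.

```python
# Minimal sketch of the publish/subscribe pattern the SMS abstract describes:
# sensors and user agents register at any time, data products are routed to
# subscribers, and processing agents can re-publish cues. Names are invented.
from collections import defaultdict

class SensorManager:
    def __init__(self):
        self.sensors = {}                      # sensor_id -> metadata
        self.subscribers = defaultdict(list)   # product type -> callbacks

    def register_sensor(self, sensor_id, metadata):
        self.sensors[sensor_id] = metadata     # registration allowed mid-mission

    def subscribe(self, product_type, callback):
        self.subscribers[product_type].append(callback)

    def publish(self, product_type, product):
        for deliver in self.subscribers[product_type]:
            deliver(product)                   # log/route to each consumer

sms = SensorManager()
sms.register_sensor("sar-1", {"band": "L/X"})
sms.subscribe("mti_track", lambda trk: print("cue EO sensor to", trk["latlon"]))
sms.publish("mti_track", {"latlon": (34.02, -117.45)})
```

Because registration and delivery are decoupled, any number and combination of sensors and agents can join over the network at any point in the mission, which is the scaling property the abstract claims.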
The Naval Research Laboratory has developed and demonstrated an autonomous multi-sensor motion-tracking and
interrogation system that reduces the workload for analysts by automatically finding moving objects, and then
presenting high-resolution images of those objects with little-to-no human input. Intelligence, surveillance and
reconnaissance (ISR) assets in the field generate vast amounts of data that can overwhelm human operators and can
severely limit an analyst's ability to generate intelligence reports in operationally relevant timeframes. This autonomous tracking capability enables the system to manage the collection of imagery without continuous monitoring by a ground or airborne operator, thus requiring fewer personnel and freeing up operational assets. During flight tests in March 2011, multiple real-time moving-target-indicator (MTI) tracks generated by a wide-area persistent surveillance sensor (WAPSS) were autonomously cross-cued to a high-resolution, narrow field-of-view interrogation
sensor via an airborne network. Both sensors were networked by the high-speed Tactical Reachback Extended
Communications (TREC) data-link provided by the NRL Information Technology Division.
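A cross-cue of this kind ultimately reduces to geometry: converting a track's ground position into pointing angles for the narrow field-of-view sensor. The sketch below uses a flat-earth approximation with our own frame conventions, purely as an illustration, not the fielded system's pointing solution.

```python
# Hypothetical geometry sketch of an MTI-to-EO cross-cue: convert a track's
# ground position into gimbal azimuth/elevation for the narrow-FOV sensor.
# Local flat-earth (east/north) approximation; conventions are ours.
import math

def cue_angles(ac_lat, ac_lon, ac_alt_m, tgt_lat, tgt_lon):
    R = 6371000.0                                         # mean Earth radius, m
    east = math.radians(tgt_lon - ac_lon) * R * math.cos(math.radians(ac_lat))
    north = math.radians(tgt_lat - ac_lat) * R
    rng = math.hypot(east, north)                         # ground range, m
    az = math.degrees(math.atan2(east, north)) % 360.0    # from true north
    el = -math.degrees(math.atan2(ac_alt_m, rng))         # depression angle
    return az, el, rng

az, el, rng = cue_angles(34.10, -117.50, 3000.0, 34.02, -117.45)
print(f"az {az:.1f} deg, el {el:.1f} deg, ground range {rng:.0f} m")
```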
The availability of imagery simultaneously collected from sensors of disparate modalities enhances an image analyst's
situational awareness and expands the overall detection capability to a larger array of target classes. Dynamic cooperation between sensors is increasingly important for collecting coincident data from multiple UAV-deployable sensors, whether on the same platform or on different ones. Of particular interest is autonomous collaboration
between wide area survey detection, high-resolution inspection, and RF sensors that span large segments of the
electromagnetic spectrum. The Naval Research Laboratory (NRL) in conjunction with the Space Dynamics Laboratory
(SDL) is building sensors with such networked communications capability and is conducting field tests to demonstrate
the feasibility of collaborative sensor data collection and exploitation. Example survey/detection sensors include: NuSAR (NRL Unmanned SAR), a UAV-compatible synthetic aperture radar system; microHSI, an NRL-developed lightweight hyperspectral imager; RASAR (Real-time Autonomous SAR), a lightweight podded synthetic aperture radar; and N-WAPSS-16 (Nighttime Wide-Area Persistent Surveillance Sensor-16Mpix), an MWIR large-array gimbaled system. From these sensors, detected target cues are automatically sent to the NRL/SDL-developed EyePod, a high-resolution,
narrow FOV EO/IR sensor, for target inspection. In addition to this cooperative data collection, EyePod's
real-time, autonomous target tracking capabilities will be demonstrated. Preliminary results and target analysis will be
presented.
EyePod is a compact survey and inspection day/night imaging sensor suite for small unmanned aircraft systems (UAS).
EyePod generates georeferenced image products in real-time from visible near infrared (VNIR) and long wave infrared
(LWIR) imaging sensors and was developed under the ONR-funded FEATHAR (Fusion, Exploitation, Algorithms, and Targeting for High-Altitude Reconnaissance) program. FEATHAR is directed and executed by the Naval Research Laboratory (NRL) in conjunction with the Space Dynamics Laboratory (SDL); its goal is to develop and test new tactical sensor systems specifically designed for small manned and unmanned platforms (payload weight < 50 lbs). The EyePod suite consists of two VNIR/LWIR (day/night) gimbaled sensors that, combined, provide broad-area survey and focused inspection capabilities. Each EyePod sensor pairs an HD visible EO sensor with an LWIR bolometric imager, providing precision geo-referenced, fully digital EO/IR NITFS output imagery. The LWIR sensor is mounted
to a patent-pending jitter-reduction stage to correct for the high-frequency motion typically found on small aircraft and
unmanned systems. Details will be presented on both the wide-area and inspection EyePod sensor systems, their modes
of operation, and results from recent flight demonstrations.
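Generating georeferenced products requires mapping each pixel's line of sight to a ground coordinate. The following sketch shows one common flat-ground formulation; the pinhole camera model, rotation convention, and function names are illustrative assumptions, not EyePod's implementation.

```python
# Hedged sketch of single-pixel georeferencing of the kind georeferenced
# NITFS products require: intersect a pixel's line of sight with flat ground.
# Camera model and frame conventions here are illustrative only.
import numpy as np

def pixel_to_ground(px, py, width, height, fov_deg, alt_m, yaw, pitch, roll):
    f = (width / 2) / np.tan(np.radians(fov_deg) / 2)   # focal length, pixels
    los_cam = np.array([px - width / 2, py - height / 2, f])
    cy, sy = np.cos(yaw), np.sin(yaw)                   # attitude in radians
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    # camera-to-local-level rotation: yaw * pitch * roll (one common choice)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    los = Rz @ Ry @ Rx @ los_cam
    if los[2] <= 0:
        return None                                     # ray misses the ground
    s = alt_m / los[2]                                  # scale to ground plane
    return los[0] * s, los[1] * s                       # north, east offsets, m
```

The returned offsets would be added to the aircraft position; jitter reduction matters here because small high-frequency attitude errors translate directly into ground-coordinate error.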
NuSAR (Naval Research Laboratory Unmanned Synthetic Aperture Radar) is a sensor developed under the ONR-funded FEATHAR (Fusion, Exploitation, Algorithms, and Targeting for High-Altitude Reconnaissance) program.
FEATHAR is being directed and executed by the Naval Research Laboratory (NRL) in conjunction with the Space
Dynamics Laboratory (SDL). FEATHAR's goal is to develop and test new tactical sensor systems specifically designed
for small manned and unmanned platforms (payload weight < 50 lbs). NuSAR is a novel dual-band (L- and X-band)
SAR capable of a variety of tactically relevant operating modes and detection capabilities. Flight test results will be described for narrow- and wide-bandwidth and narrow- and wide-azimuth-aperture operating modes.
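For context on those four modes, the textbook SAR resolution relations below show why bandwidth and azimuth aperture are the natural axes to vary; these are standard formulas, not values from the paper.

```latex
% Standard SAR resolution relations (textbook, not from the paper):
% transmitted bandwidth B sets slant-range resolution, and the azimuth
% aperture angle \Delta\theta sets cross-range resolution at wavelength \lambda.
\delta_{r} = \frac{c}{2B},
\qquad
\delta_{cr} \approx \frac{\lambda}{2\,\Delta\theta}
```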
FEATHAR (Fusion, Exploitation, Algorithms, and Targeting for High-Altitude Reconnaissance) is an ONR-funded effort to develop and test new tactical sensor systems specifically designed for small manned and unmanned platforms (payload weight < 50 lbs). The program is directed and executed by the Naval Research Laboratory (NRL) in conjunction with the Space Dynamics Laboratory (SDL). FEATHAR has developed and integrated EyePod, a combined long-wave infrared (LWIR) and visible-to-near-infrared (VNIR) optical survey and inspection system, with NuSAR, a dual-band synthetic aperture radar (SAR) system. These sensors are being tested in conjunction with other
ground and airborne sensor systems to demonstrate intelligent real-time cross-sensor cueing and in-air data fusion.
Results from test flights of the EyePod and NuSAR sensors will be presented.
The Naval Research Laboratory (NRL) and Space Dynamics Laboratory (SDL) are executing a joint effort,
DUSTER (Deployable Unmanned System for Targeting, Exploitation, and Reconnaissance), to develop and
test a new tactical sensor system specifically designed for Tier II UAVs. The system is composed of two
coupled near-real-time sensors: EyePod (VNIR/LWIR ball gimbal) and NuSAR (L-band synthetic aperture
radar). EyePod consists of a jitter-stabilized LWIR sensor coupled with a dual focal-length optical system
and a bore-sighted high-resolution VNIR sensor. The dual focal-length design, coupled with precision pointing and step-stare capabilities, enables EyePod to conduct wide-area survey and high-resolution inspection missions from a single flight pass. NuSAR is being developed with partners Brigham Young University (BYU) and Artemis, Inc., and consists of a wideband L-band SAR capable of large-area survey and embedded
real-time image formation. Both sensors employ standard Ethernet interfaces and provide geo-registered
NITFS output imagery. In the fall of 2007, field tests were conducted with both sensors, results of which will
be presented.