The Roman Space Telescope (RST) Wide Field Instrument (WFI) will use a preliminary Science Data Processing (SDP) pipeline during Integration and Test, and to some extent during Operations, to track basic statistics and identify known features such as cosmic rays and snowballs, as well as other possible anomalies in raw detector data. In our detectors, these anomalies appear as jumps in the ramp of a readout and are classified as cosmic rays if they appear as streaks, or as snowballs if they are more circular. The WFI employs an array of 18 H4RG-10 detectors that collect image samples. The SDP pipeline packages each set of raw frames within a non-destructive exposure into an image cube per detector; each cube is a time series of 4096 × 4096 accumulating pixel frames. The preliminary analysis pipeline locates anomalies in these time-series accumulation frames and identifies their type, whether natural phenomena or detector characteristics. To compare different methods, we implemented both heuristic-based and data-driven approaches to anomaly identification. In the heuristic-based approach, we identify snowballs and cosmic rays by the size and shape of outlier pixel clusters between consecutive frames. Among data-driven methods, we evaluated a Convolutional Neural Network (CNN) model and more traditional methods such as Principal Component Analysis (PCA). Because a CNN is a supervised learning/classification method, we used a labeled dataset of anomalies to segment the images and identify anomalies. We used previously identified cosmic rays and snowballs to measure the accuracy and efficiency of these approaches. In evaluating these methods, we aim to select the best fit for the SDP pipeline's anomaly detection in terms of both performance and runtime.
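The heuristic approach described above can be illustrated with a short sketch: difference two consecutive ramp frames, flag pixels several sigma above a robust noise estimate, group the flagged pixels into connected clusters, and classify each cluster by its aspect ratio. All thresholds, the shape test, and the function name here are illustrative assumptions, not the pipeline's actual parameters.

```python
import numpy as np
from collections import deque

def classify_jumps(frame_prev, frame_next, sigma=5.0, min_pixels=4,
                   circularity_cut=0.6):
    """Flag jump pixels between consecutive ramp frames and label each
    outlier cluster 'snowball' (roughly circular) or 'cosmic_ray'
    (elongated streak). All thresholds are illustrative."""
    diff = frame_next.astype(float) - frame_prev.astype(float)
    # Robust noise estimate: median absolute deviation scaled to sigma.
    mad = np.median(np.abs(diff - np.median(diff)))
    noise = 1.4826 * mad if mad > 0 else max(diff.std(), 1e-9)
    mask = diff > sigma * noise          # positive jumps in the ramp
    seen = np.zeros_like(mask, dtype=bool)
    rows, cols = mask.shape
    events = []
    for r0, c0 in zip(*np.nonzero(mask)):
        if seen[r0, c0]:
            continue
        # Breadth-first search over 8-connected outlier pixels.
        queue, cluster = deque([(r0, c0)]), []
        seen[r0, c0] = True
        while queue:
            r, c = queue.popleft()
            cluster.append((r, c))
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if (0 <= rr < rows and 0 <= cc < cols
                            and mask[rr, cc] and not seen[rr, cc]):
                        seen[rr, cc] = True
                        queue.append((rr, cc))
        if len(cluster) < min_pixels:
            continue                     # ignore isolated noise pixels
        # Shape test: minor/major extent of the cluster's bounding box.
        rs, cs = zip(*cluster)
        extent = (max(rs) - min(rs) + 1, max(cs) - min(cs) + 1)
        aspect = min(extent) / max(extent)
        kind = "snowball" if aspect >= circularity_cut else "cosmic_ray"
        events.append({"pixels": cluster, "type": kind})
    return events
```

A compact circular cluster passes the aspect-ratio cut and is labeled a snowball, while a one-pixel-wide streak fails it and is labeled a cosmic ray; a production version would use a more refined shape statistic than a bounding box.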
PISCES (Prototype Imaging Spectrograph for Coronagraphic Exoplanet Studies) is a lenslet-array-based integral field spectrograph (IFS) designed to advance the technology readiness of the WFIRST-AFTA high-contrast Coronagraph Instrument. We present the end-to-end optical simulator and plans for the data reduction pipeline (DRP). The optical simulator was created with a combination of the IDL-based PROPER library and Zemax, while the DRP is a modified version of the Gemini Planet Imager's (GPI) IDL pipeline. The simulations of light propagation through the instrument are based on Fourier transform algorithms. The DRP transforms PISCES IFS data into calibrated spectral data cubes.
The Prototype Imaging Spectrograph for Coronagraphic Exoplanet Studies (PISCES) is a lenslet-array-based integral field spectrometer (IFS) designed for high-contrast imaging of extrasolar planets. PISCES will be used to advance the technology readiness of the high-contrast IFS baselined on the Wide-Field InfraRed Survey Telescope/Astrophysics Focused Telescope Assets (WFIRST-AFTA) coronagraph instrument. PISCES will be integrated into the high contrast imaging testbed (HCIT) at the Jet Propulsion Laboratory (JPL) and will work with both the Hybrid Lyot Coronagraph (HLC) and the Shaped Pupil Coronagraph (SPC) configurations. We discuss why the lenslet-array-based IFS was selected for PISCES. We present the PISCES optical design, including the similarities and differences between lenslet-based IFSs and conventional spectrometers, the trade-off between a refractive and a reflective design, and the specific function of the pinhole mask on the back surface of the lenslet array in reducing diffraction from the edges of the lenslets. The optical analysis, alignment plan, and mechanical design of the instrument are also discussed.
The James Webb Space Telescope (JWST) relies on several innovations to complete its five-year mission. One vital
technology is microshutters, the programmable field selectors that enable the Near Infrared Spectrograph (NIRSpec) to
perform multi-object spectroscopy. Mission success depends on acquiring spectra from large numbers of galaxies by
positioning shutter slits over faint targets. Precise selection of faint targets requires field selectors that are both high in
contrast and stable in position. We have developed test facilities to evaluate microshutter contrast and alignment stability
at their 35K operating temperature. These facilities used a novel application of image registration algorithms to obtain
non-contact, sub-micron measurements in cryogenic conditions. The cryogenic motion of the shutters was successfully
characterized. Optical results also demonstrated that shutter contrast far exceeds the NIRSpec requirements. Our test
program has concluded with the delivery of a flight-qualified field selection subsystem to the NIRSpec bench.
KEYWORDS: Interferometry, Interferometers, Space telescopes, Telescopes, James Webb Space Telescope, Infrared telescopes, Data modeling, Mirrors, Visibility, Sensors
Interferometry is an affordable way to bring the benefits of high resolution to space far-IR astrophysics. We summarize
an ongoing effort to develop and learn the practical limitations of an interferometric technique that will enable the
acquisition of high-resolution far-IR integral field spectroscopic data with a single instrument in a future space-based
interferometer. This technique was central to the Space Infrared Interferometric Telescope (SPIRIT) and Submillimeter
Probe of the Evolution of Cosmic Structure (SPECS) space mission design concepts, and it will first be used on the
Balloon Experimental Twin Telescope for Infrared Interferometry (BETTII). Our experimental approach combines data
from a laboratory optical interferometer (the Wide-field Imaging Interferometry Testbed, WIIT), computational optical
system modeling, and spatio-spectral synthesis algorithm development. We summarize recent experimental results and
future plans.
The Wide-Field Imaging Interferometry Testbed (WIIT) is a wide-field spectral imaging Michelson
interferometer developed at the NASA/Goddard Space Flight Center. WIIT is operational and effectively
demonstrates imaging and spectroscopy over fields-of-view larger than the narrow primary beam footprint
of a conventional Michelson interferometer. At the heart of this technique is the "double-Fourier" approach
whereby the apertures and a delay line are both moved to collect interferograms over a 2D wide field
detector grid simultaneously; one interferogram per detector pixel. This aggregate set of interferograms, as
a function of baseline and delay line, is algorithmically processed to construct a single spatial-spectral cube
with angular resolution approaching the ratio of the wavelength to the longest baseline. Herein we develop
the mathematical spatial-spectral imaging model and the baseline processing algorithm, and show results
using both simulated data and WIIT testbed data.
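To make the spectral half of the double-Fourier technique concrete, the sketch below simulates the interferogram one detector pixel would record for a monochromatic source as the delay line scans, then Fourier transforms it over delay to recover the source wavenumber. The delay range and wavenumber are arbitrary illustrative values, not WIIT's actual parameters, and the spatial synthesis over baselines is omitted.

```python
import numpy as np

# Scan of optical path delay (OPD) for a single detector pixel.
# Units and ranges below are illustrative only.
n_delay = 1024
delay = np.linspace(-0.5, 0.5, n_delay, endpoint=False)   # OPD in cm
sigma_src = 50.0                                          # wavenumber, cm^-1

# Idealized interferogram for a monochromatic source: constant term
# plus a fringe whose frequency in delay equals the source wavenumber.
interferogram = 1.0 + np.cos(2 * np.pi * sigma_src * delay)

# FFT over delay: after removing the DC term, the spectrum peaks at
# the source wavenumber. Spectral resolution = 1 / (total delay range).
spectrum = np.abs(np.fft.rfft(interferogram - interferogram.mean()))
wavenumbers = np.fft.rfftfreq(n_delay, d=delay[1] - delay[0])
recovered = wavenumbers[np.argmax(spectrum)]
```

Repeating this transform for every pixel of the wide-field detector grid, and combining the results across baselines, is what builds up the full spatial-spectral cube.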
A moon, or natural satellite, is a celestial body that orbits a planetary body such as a planet, dwarf planet, or asteroid.
Scientists seek to understand the origin and evolution of our solar system by studying the moons of these bodies.
Additionally, searches for satellites of planetary bodies can be important to protect the safety of a spacecraft as it
approaches or orbits a planetary body. If a satellite of a celestial body is found, the mass of that body can also be
calculated once its orbit is determined. Ensuring the Dawn spacecraft's safety on its mission to the asteroid (4) Vesta
primarily motivated the work of Dawn's Satellite Working Group (SWG) in the summer of 2011. Dawn mission scientists
and engineers utilized various computational tools and techniques for Vesta's satellite search. The objectives of this
paper are to 1) introduce the natural satellite search problem, 2) present the computational challenges, approaches, and
tools used when addressing this problem, and 3) describe applications of various image processing and computational
algorithms for performing satellite searches to the electronic imaging and computer science community. Furthermore,
we hope that this communication will enable Dawn mission scientists to improve their satellite search algorithms and
tools and to be better prepared to perform the same investigation in 2015, when the spacecraft is scheduled to approach
and orbit the dwarf planet (1) Ceres.
Image registration, or alignment of two or more images covering the same scenes or objects, is of great interest in many
disciplines such as remote sensing, medical imaging, astronomy, and computer vision. In this paper, we introduce a
new application of image registration algorithms. We demonstrate how, through a wavelet-based image registration
algorithm, engineers can evaluate the stability of Micro-Electro-Mechanical Systems (MEMS). In particular, we applied
image registration algorithms to assess alignment stability of the MicroShutters Subsystem (MSS) of the Near Infrared
Spectrograph (NIRSpec) instrument of the James Webb Space Telescope (JWST). This work introduces to engineers a
new methodology for evaluating the stability of MEMS devices, and to computer scientists a new application of image
registration algorithms.
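The wavelet-based algorithm itself is not reproduced here; as a simpler stand-in, phase correlation illustrates how a non-contact translation measurement between two images of the same device might work. This sketch recovers integer-pixel shifts only, whereas the actual stability measurements reach sub-micron precision.

```python
import numpy as np

def phase_correlation_shift(ref, moved):
    """Estimate the (row, col) translation of `moved` relative to `ref`
    via phase correlation. Integer-pixel accuracy; a simplified stand-in
    for the wavelet-based registration described in the text."""
    F1, F2 = np.fft.fft2(ref), np.fft.fft2(moved)
    cross = F2 * np.conj(F1)
    cross /= np.abs(cross) + 1e-12        # normalized cross-power spectrum
    corr = np.abs(np.fft.ifft2(cross))    # delta-like peak at the shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peaks beyond the midpoint to negative shifts (FFT wrap-around).
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))
```

Comparing shift estimates taken before and after thermal cycling (here, of the shutter-array images) is one way such an algorithm turns image registration into a stability metric.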
A large depth-of-field Particle Image Velocimeter (PIV) has been developed at NASA GSFC to characterize dynamic
dust environments on planetary surfaces. This instrument detects and senses lofted dust particles. To characterize a
dynamic planetary dust environment, the instrument would have to operate for at least several minutes during an
observation period, easily producing more than a terabyte of data per observation. Given current technology, this
amount of data would be very difficult to store onboard a spacecraft and downlink to Earth. We have been developing
an autonomous image analysis algorithm architecture for the PIV instrument to greatly reduce the amount of data that
it has to store and downlink. The algorithm analyzes PIV images and reduces the image information down to only the
particle measurement data we are interested in receiving on the ground - typically reducing the amount of data to be
handled by more than two orders of magnitude. We give a general description of the PIV algorithms and describe in
detail the algorithm for estimating the direction and velocity of traveling particles, which exploits the optical
properties of moving dust particles together with image processing techniques.
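One plausible form of such a streak-based estimate is sketched below: the principal axis of a streak's pixel cloud gives the travel direction (up to a 180-degree ambiguity from a single frame), and the streak length divided by the exposure time gives the speed. The function name, pixel scale, and exposure values are hypothetical, not the instrument's actual calibration.

```python
import numpy as np

def streak_velocity(pixels, exposure_s, pixel_scale_mm=1.0):
    """Estimate a particle's travel direction and speed from the streak
    it leaves during one exposure. `pixels` is an (N, 2) sequence of
    (row, col) streak coordinates; names and scales are illustrative."""
    pts = np.asarray(pixels, dtype=float)
    centered = pts - pts.mean(axis=0)
    # Principal axis of the streak from the covariance eigenvectors.
    cov = centered.T @ centered / len(pts)
    evals, evecs = np.linalg.eigh(cov)
    axis = evecs[:, np.argmax(evals)]         # unit vector along streak
    # Streak length = spread of projections onto the principal axis.
    proj = centered @ axis
    length_px = proj.max() - proj.min()
    speed = length_px * pixel_scale_mm / exposure_s   # mm per second
    return axis, speed
```

Reducing each streak to a direction vector and a speed, rather than storing raw frames, is the kind of onboard summarization that yields the multi-order-of-magnitude data reduction described above.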
The Wide-Field Imaging Interferometry Testbed (WIIT) at NASA's Goddard Space Flight Center and a computational
model of the testbed were developed to demonstrate and learn the practical limitations of techniques for wide-field
spatial-spectral ("double Fourier") interferometry. WIIT is an automated and remotely operated system, and it is now
producing substantial amounts of high-quality data from its state-of-the-art operating environment, Goddard's Advanced
Interferometry and Metrology Lab. In this paper, we discuss the characterization and operation of the testbed and present
recently acquired data. We also give a short description of the computational model and its applications. Finally, we
outline future research directions. A companion paper within this conference discusses the development of new
wide-field double Fourier data analysis algorithms.
Research with the Wide-Field Imaging Interferometry Testbed (WIIT) is ongoing, and in the past year we have
achieved several important milestones. We have moved WIIT into the Advanced Interferometry and Metrology
(AIM) Laboratory at Goddard, and have characterized the testbed in this well-controlled environment. The system
is now completely automated and we are in the process of acquiring large data sets for analysis. In this paper, we
discuss these new developments and outline our future research directions. The WIIT testbed, combined with new
data analysis techniques and algorithms, provides a demonstration of the technique of wide-field interferometric
imaging, a powerful tool for future space-borne interferometers. Algorithm development is discussed in a separate
paper within this conference.
The Wide-Field Imaging Interferometry Testbed (WIIT) is a wide-field spectral imaging Michelson
interferometer designed and developed at the NASA/Goddard Space Flight Center. WIIT is now
operational and is being used to demonstrate imaging and spectroscopy over fields-of-view larger than the
typically narrow primary beam footprint of a conventional Michelson interferometer. At the heart of this
technique is the "double-Fourier" approach whereby the apertures and a delay line are both moved to
collect interferograms over a 2D wide field detector grid simultaneously; one interferogram per detector
pixel. This aggregate set of interferograms, as a function of baseline and delay line, is algorithmically
processed to construct a hyperspectral image cube. Herein we develop and discuss the algorithm that
constructs the image cube, show preliminary results using observed laboratory WIIT data, and discuss our
ongoing work on image deconvolution.
KEYWORDS: Space operations, Observatories, Space telescopes, Telescopes, Data communications, Target recognition, Prototyping, Data storage, Telecommunications, Software development
In the coming decade, the drive to increase the scientific return on capital investment and to reduce costs will force automation into many of the scientific tasks that have traditionally been overseen manually. Thus, spacecraft autonomy will become an even greater part of mission operations. While recent missions have made great strides in the ability to autonomously monitor and react to the changing health and physical status of spacecraft, little progress has been made in responding quickly to science-driven events. The new generation of space-based telescopes and observatories will see deeper and with greater clarity, and they will generate data at an unprecedented rate. Yet, while onboard data processing and storage capability will increase rapidly, bandwidth for downloading data will not increase as fast and can become a significant bottleneck and cost driver for a science program.
For observations of inherently variable targets and targets of opportunity, the ability to recognize early that an observation will not meet its science goals of variability or minimum brightness, and to react accordingly, can have a major positive impact on the overall scientific return of an observatory and on its operational costs. If the observatory can reprioritize the schedule to focus on alternate targets, discard uninteresting observations prior to downloading, or download them at reduced resolution, its overall efficiency will be dramatically increased.
We are investigating and developing tools for a science goal monitoring (SGM) system. SGM will provide an interface to help capture higher-level science goals from scientists and translate them into a flexible observing strategy that SGM can execute and monitor. SGM will then monitor the incoming data stream, interfacing with data processing systems to recognize significant events. When an event occurs, the system will use the science goals given to it to reprioritize observations and react appropriately, and/or communicate with ground systems, both human and machine, for confirmation or further high-priority analyses.
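A minimal sketch of the reprioritization step described above, assuming a priority-queue scheduler and a single minimum-brightness goal; all target names, priorities, and the boost amount are hypothetical, not SGM's actual design.

```python
import heapq

def meets_goal(obs, min_brightness=10.0):
    """Science-goal check; a stand-in for SGM's event recognition."""
    return obs["brightness"] >= min_brightness

def handle_observation(queue, obs, boost=3):
    """If an observation fails its goal, discard it before downlink and
    promote its alternate targets in the schedule. Queue entries are
    [priority, target] lists; lower value = higher priority."""
    if meets_goal(obs):
        return "download_full"
    for entry in queue:
        if entry[1] in obs["alternates"]:
            entry[0] -= boost            # move alternates up the schedule
    heapq.heapify(queue)                 # restore heap order after edits
    return "discard"
```

The key property is that the decision to discard or download is made onboard, from the science goals, before any bandwidth is spent on the observation.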