Multi-scale exposure fusion is an effective way to directly fuse low dynamic range (LDR) images with different exposures into a content-rich LDR image for high dynamic range (HDR) reconstruction. Previous research has shown that edge-preserving smoothing can be used to improve multi-scale exposure fusion. However, multi-scale exposure fusion via edge-preserving smoothing pyramids suffers from loss of detail. To address this issue, we propose a side window gradient guided image filtering (SGGIF) and use it to construct an edge-preserving smoothing pyramid. First, by adding eight side-window kernels to gradient guided image filtering (GGIF), an SGGIF with effective edge preservation is developed. Furthermore, we select the weight map with the minimum mean as the guidance image, which further preserves details in the brightest and darkest regions of HDR scenes. Finally, we develop a detail-preserving multi-scale exposure fusion method based on the edge-preserving smoothing pyramid. Experimental results indicate that our method effectively preserves details and reduces halo artifacts. Both quantitative and qualitative analyses demonstrate the effectiveness of the proposed approach.
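The side-window principle underlying SGGIF can be illustrated with the following Python sketch. This is a simplified stand-in only: it applies the eight side windows to a plain mean filter rather than to the gradient guided filter, and the function and parameter names are assumptions, not the authors' implementation. Per pixel, the window whose output is closest to the input value is kept, so the averaging window never straddles an edge.

import numpy as np
from scipy.ndimage import uniform_filter

def side_window_smooth(img, radius=2):
    # Evaluate a mean filter on eight side windows (left/right/top/bottom
    # halves and four corner quadrants) and keep, per pixel, the output
    # closest to the input value; this is what preserves edges.
    img = np.asarray(img, dtype=float)
    full, half = 2 * radius + 1, radius + 1
    lo, hi = -(half // 2), (half - 1) // 2   # shift so the pixel sits on the window edge
    configs = [(full, half, 0, lo), (full, half, 0, hi),
               (half, full, lo, 0), (half, full, hi, 0),
               (half, half, lo, lo), (half, half, lo, hi),
               (half, half, hi, lo), (half, half, hi, hi)]
    outs = np.stack([uniform_filter(img, size=(sy, sx), origin=(oy, ox), mode='nearest')
                     for sy, sx, oy, ox in configs])
    best = np.argmin(np.abs(outs - img[None]), axis=0)
    return np.take_along_axis(outs, best[None], axis=0)[0]

In the SGGIF itself the same eight-window selection would be applied to the gradient guided filter output rather than to a box average.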
Phase error compensation is essential for inverse synthetic aperture radar (ISAR) high-resolution imaging. However, the high-order motion of a maneuvering target causes the echo to contain spatially variant phase errors, so traditional compensation methods cannot achieve good results. Therefore, a phase error compensation method for maneuvering targets is proposed. First, the spatially variant phase error model of a maneuvering target is established. Then, the target motion is modeled as a high-order polynomial, and a phase error compensation model based on image entropy minimization is established. Finally, the whale optimization algorithm (WOA) is used to iteratively search the target motion parameters for phase error compensation. Simulation results demonstrate the performance of the proposed method.
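The entropy-minimization search can be sketched as follows. This is an illustrative Python toy, not the paper's implementation: the synthetic single-scatterer echo, variable names, and parameter bounds are assumptions, and SciPy's differential evolution is used as a generic global-optimizer stand-in for the WOA.

import numpy as np
from scipy.optimize import differential_evolution

def compensate(echo, coeffs, tm):
    # Remove a polynomial phase error phi(t) = c2*t^2 + c3*t^3 + ... from the
    # range-compressed echo (slow time along axis 0).
    phi = sum(c * tm ** (k + 2) for k, c in enumerate(coeffs))
    return echo * np.exp(-1j * phi)[:, None]

def image_entropy(coeffs, echo, tm):
    # Entropy of the azimuth-compressed image after compensation;
    # a better-focused image has lower entropy.
    img = np.fft.fft(compensate(echo, coeffs, tm), axis=0)
    p = np.abs(img) ** 2
    p = p / p.sum()
    return -np.sum(p * np.log(p + 1e-12))

# Synthetic single-scatterer echo with quadratic and cubic phase error (assumed values).
tm = np.linspace(-0.5, 0.5, 128)
true_coeffs = (40.0, 15.0)
echo = np.exp(1j * (2 * np.pi * 5 * tm + true_coeffs[0] * tm ** 2
                    + true_coeffs[1] * tm ** 3))[:, None]
result = differential_evolution(image_entropy, bounds=[(-100, 100)] * 2,
                                args=(echo, tm), seed=0)
print(result.x)   # should land near the true coefficients (40, 15)

In the paper, the WOA plays the role of the global optimizer searching the polynomial motion parameters that minimize this entropy.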
Unlike traditional optical cameras, event cameras are a new type of neuromorphic vision sensor that generates asynchronous streams of events in response to changes in log-illumination at each pixel. These devices are therefore extremely fast, allow imaging while the device is moving, and enable low-power space imaging equally well during daytime and at night. They can compensate for the limitations of conventional optical detection equipment and meet current space object detection requirements. Against the background of space object detection and the optical observation technology of event cameras, this paper reviews and summarizes the research status and several future development trends of event-based space object detection methods. First, the basic principle of event cameras and the advantages and disadvantages of their application in the aerospace field are described. Then, the technical developments of event cameras in space object detection, recognition, and tracking are introduced. Finally, the future development directions of event cameras in space object detection are discussed.
For high dynamic range image reconstruction in dynamic scenes, this paper proposes a multi-exposure sequence image fusion method based on histogram matching and a convolutional neural network. First, image alignment with respect to the reference image is performed using histogram matching. Then, a multi-exposure fusion model based on a convolutional neural network is constructed for the aligned multi-exposure sequence images. Finally, the fusion model is trained on the training data, and multi-exposure sequence image fusion in dynamic scenes is realized. The experimental results show that the proposed method effectively avoids the influence of dynamic scenes on the fusion of multi-exposure sequence images while generating high-quality fused images.
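The basic operation behind the histogram-matching alignment step can be illustrated with the following Python sketch: a generic grayscale routine (assumed names, not the paper's code) that remaps one image's intensities so its cumulative histogram matches another's, which brings two differently exposed images to a comparable brightness before CNN fusion.

import numpy as np

def match_histogram(source, reference):
    # Remap source intensities so the cumulative histogram of the result
    # matches that of the reference image (grayscale; apply per channel
    # for color images).
    s_vals, s_counts = np.unique(source.ravel(), return_counts=True)
    r_vals, r_counts = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size
    r_cdf = np.cumsum(r_counts) / reference.size
    mapped = np.interp(s_cdf, r_cdf, r_vals)        # source CDF -> reference intensity
    idx = np.searchsorted(s_vals, source.ravel())   # index of each source value
    return mapped[idx].reshape(source.shape)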
For near-polar orbit satellite constellations with multiple orbital planes, orbital plane reconfiguration control is required, whether for multi-plane deployment after a single rocket launches multiple satellites, for replacing failed satellites in adjacent orbital planes, or for reconfiguring the constellation when mission requirements change. To reduce the cost of orbital plane reconfiguration, this paper proposes a continuous low-thrust orbital plane reconfiguration control method for high-inclination near-polar orbit satellite constellations. The orbital plane reconfiguration is achieved by changing the orbital inclination with continuous low thrust, exploiting the difference in the drift rate of the right ascension of the ascending node (RAAN) at different orbital inclinations under the J2 perturbation. Finally, the effectiveness of the proposed method is verified by simulation. The simulation results also show that, compared with directly changing the orbital plane, the proposed method significantly reduces the velocity increment required for orbital plane reconfiguration, which makes it more practical.
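The effect the method exploits can be sketched numerically. This is a minimal Python illustration; the altitude, inclination, and inclination offset are assumed values, not the paper's simulation parameters. Under J2, the secular RAAN drift rate depends on inclination, so a small temporary inclination change lets a plane drift to a new RAAN before the inclination is restored.

import numpy as np

MU = 3.986004418e14    # Earth gravitational parameter, m^3/s^2
RE = 6378137.0         # Earth equatorial radius, m
J2 = 1.08262668e-3     # Earth J2 zonal harmonic

def raan_rate(a, e, inc):
    # Secular RAAN drift under J2 (rad/s): dOmega/dt = -1.5 * J2 * n * (RE/p)^2 * cos(i)
    n = np.sqrt(MU / a ** 3)
    p = a * (1.0 - e ** 2)
    return -1.5 * J2 * n * (RE / p) ** 2 * np.cos(inc)

# Assumed example: near-polar constellation at 800 km altitude; a temporary
# 0.5 deg inclination offset makes one plane drift in RAAN relative to the others.
a = RE + 800e3
i0, di = np.radians(86.4), np.radians(0.5)
rel_drift = raan_rate(a, 0.0, i0 + di) - raan_rate(a, 0.0, i0)    # rad/s
print(np.degrees(rel_drift) * 86400.0, "deg of RAAN separation per day")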