The Pyramid Wavefront Sensor (PWFS) is one of the preferred choices for measuring wavefront aberrations in highly sensitive astronomical adaptive optics applications. Despite its inherently high sensitivity, its low linearity limits the operational range of the phase estimation. This problem has conventionally been addressed by optically modulating the PSF across the pyramid. However, modulation requires movable physical parts and additional calibration, and trades sensitivity for linearity. We created an End-to-End (E2E) trainable scheme that includes the PWFS propagation model, an optical diffractive layer at the Fourier plane, and a state-of-the-art deep neural network that performs wavefront reconstruction. The joint training of the physical and digital trainable elements is conducted under simulated atmospheric conditions of varying strength, together with their Zernike decompositions for comparison with the estimates produced by our model. We develop a variety of training schemes, varying the turbulence ranges and the balance between optical and digital layers. Simulation results show an overall improvement in wavefront estimation, even beyond the trained turbulence ranges, improving linearity while largely maintaining sensitivity at weak turbulence, and surpassing previous results that considered only one diffractive element and linear wavefront estimation. We are currently performing experimental closed-loop adaptive optics tests, and simulations are showing encouraging results.
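The E2E scheme above places a trainable diffractive phase at the Fourier plane of the pyramid. A minimal numpy sketch of the forward propagation, not the authors' actual model: the grid size, pupil radius, and the pyramid phase slope (0.1 cycles/pixel) are arbitrary illustrative assumptions, and a real trainable version would implement this in an autodiff framework.

```python
import numpy as np

def pwfs_forward(pupil_phase, diffractive_phase, pupil_mask):
    """Propagate a pupil-plane wavefront through a pyramid WFS whose
    Fourier plane carries an extra (trainable) diffractive phase.
    All arrays are n x n; phases are in radians."""
    field = pupil_mask * np.exp(1j * pupil_phase)       # pupil-plane field
    focal = np.fft.fftshift(np.fft.fft2(field))         # focal (Fourier) plane
    n = field.shape[0]
    y, x = np.mgrid[-n//2:n//2, -n//2:n//2]
    # 4-facet pyramid: phase ramp proportional to |x| + |y| (slope is an assumption)
    pyramid = np.exp(1j * 2 * np.pi * 0.1 * (np.abs(x) + np.abs(y)))
    focal = focal * pyramid * np.exp(1j * diffractive_phase)
    out = np.fft.ifft2(np.fft.ifftshift(focal))         # detector plane
    return np.abs(out) ** 2                             # measured intensity

n = 64
yy, xx = np.mgrid[-n//2:n//2, -n//2:n//2]
mask = (xx**2 + yy**2 < (n // 4) ** 2).astype(float)
img = pwfs_forward(np.zeros((n, n)), np.zeros((n, n)), mask)
```

Because every element is phase-only, the propagation conserves energy, which gives a quick sanity check on the implementation.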
In this work, we evaluate a specially crafted deep convolutional neural network that estimates wavefront aberration modes directly from pyramid wavefront sensor (PyWFS) images. Overall, the use of deep neural networks improves both the estimation performance and the operational range of the PyWFS, especially under strong turbulence or poor seeing ratios D/r0. Our preliminary results provide evidence that by using neural networks instead of the classic linear estimation methods, we can obtain the sensitivity response of a low modulation setting while extending the linearity range of the PyWFS, reducing the residual variance by a factor of 1.6 when dealing with an r0 as low as a few centimeters.
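Both the network and the classic linear methods it replaces are evaluated on wavefront aberration modes, i.e. coefficients of a modal (Zernike) expansion of the phase. A minimal sketch of such a modal decomposition by least squares, with an illustrative three-mode basis (tip, tilt, defocus) rather than a full Noll-ordered Zernike set:

```python
import numpy as np

def tip_tilt_defocus_basis(n):
    """First few Zernike-like modes on an n x n circular pupil
    (illustrative basis; a real system uses a full Zernike set)."""
    y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
    r2 = x**2 + y**2
    mask = r2 <= 1.0
    modes = np.stack([x, y, 2*r2 - 1])   # tip, tilt, defocus
    return modes * mask, mask

def decompose(phase, modes, mask):
    """Least-squares projection of a pupil phase map onto the modal basis."""
    A = modes.reshape(len(modes), -1)[:, mask.ravel()].T  # pixels x modes
    b = phase.ravel()[mask.ravel()]
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs

modes, mask = tip_tilt_defocus_basis(32)
phase = 0.5 * modes[0] + 0.2 * modes[2]   # known tip + defocus mixture
coeffs = decompose(phase, modes, mask)
```

Residual variance, the figure of merit quoted above, is then the variance of the phase left after subtracting the reconstructed modes.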
We have recently proposed the deep learning wavefront sensor, which directly estimates the Zernike coefficients of aberrated wavefronts from a single intensity image using a convolutional neural network. However, deep neural networks demand an intensive training stage, where more training examples improve the accuracy and increase the number of Zernike modes that can be estimated. Since low-order aberrations such as tip and tilt only produce a space-invariant translation of the PSF, we propose to treat tip and tilt estimation separately when training the deep learning wavefront sensor, reducing the training effort while preserving the wavefront sensor's performance. In this paper, we also introduce and test simpler architectures for deep learning wavefront sensing, and explore the impact of reducing the number of pixels used to estimate a given number of Zernike coefficients. Our preliminary results indicate that we can achieve a significant prediction speedup, aiming at real-time adaptive optics systems.
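The motivation for handling tip and tilt separately is that they only translate the PSF, so they can be read off from the image centroid without a network. A minimal sketch of that idea (not the authors' implementation; the conversion from pixel shift to Zernike coefficients, omitted here, depends on the plate scale):

```python
import numpy as np

def centroid_tip_tilt(psf):
    """Estimate tip/tilt proxies from the PSF centroid: a pure tip/tilt
    aberration only shifts the PSF, so the centroid offset from the array
    center (in pixels) tracks the two modes directly."""
    total = psf.sum()
    n_y, n_x = psf.shape
    ys, xs = np.mgrid[0:n_y, 0:n_x]
    cy = (ys * psf).sum() / total
    cx = (xs * psf).sum() / total
    return cy - (n_y - 1) / 2, cx - (n_x - 1) / 2
```

Subtracting the measured shift before inference lets the network train only on the remaining, higher-order modes, which is the source of the reduced training effort.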