In recent years there has been growing demand for stereoscopic displays that present realistic 3D views of objects with depth perception. Our Person-adaptive Autostereoscopic Display (PAM) is easy to use and cost-effective to set up, and is therefore well suited for the mass market. For successful acceptance in a wide range of applications, accurate and efficient image generation and presentation on the spatial-multiplex autostereoscopic display is essential. This includes the simulation of motion parallax as well as the presentation of two perspectives on the display with low cross-talk. This paper first describes the image generation steps, then the mapping and arrangement of the perspective image pixels on the display screen, carried out in software. For this, the observer's x, y, z eye positions are incorporated. Both steps are encapsulated in an extension to the OpenGL 3D graphics API. For generating the two perspectives, no stereo buffer on the graphics board is necessary.
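The spatial-multiplex presentation described above can be sketched as a column-wise interleaving of the two perspective images into one display frame. The following is a minimal illustrative sketch, not the paper's implementation; the column-parity scheme and the image representation (rows of pixel tuples) are assumptions for clarity.

```python
# Hypothetical sketch of spatial-multiplex interleaving: even pixel
# columns carry the left-eye perspective, odd columns the right-eye
# perspective. Images are lists of rows; each row is a list of pixels.
def interleave_columns(left, right):
    frame = []
    for l_row, r_row in zip(left, right):
        row = []
        for x in range(len(l_row)):
            # choose the source view by column parity
            row.append(l_row[x] if x % 2 == 0 else r_row[x])
        frame.append(row)
    return frame

# tiny 2x4 example with labeled pixels
left  = [[("L", x) for x in range(4)] for _ in range(2)]
right = [[("R", x) for x in range(4)] for _ in range(2)]
frame = interleave_columns(left, right)
# frame[0] == [("L", 0), ("R", 1), ("L", 2), ("R", 3)]
```

In practice the column assignment would depend on the display's lenticular or barrier geometry and on the tracked eye positions, which is what the paper's OpenGL extension handles.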
Tracking the positions of an observer's eyes in front of a 3D display is necessary to ensure a correct autostereoscopic view of position-dependent 3D images. We present a new real-time eye tracking system that uses two commercially available web cams to detect the observer's eyes in the x, y and z directions. The entire system can be installed on a standard PC together with an autostereoscopic display. In a first step, eye candidates are detected by a fast pattern recognition method; the color information of the web cams provides additional cues for finding eye candidates. In a second step, world coordinates are calculated from the eye pair nearest to the monitor. Signal transmission and processing delays are compensated by an adaptive predictor. The entire system is thus cheaper and smaller, and it can be installed on a standard PC. In addition, the tracking software can support other applications, e.g. setting up a teleconference system in conjunction with an autostereoscopic monitor.
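The two-camera depth recovery and the delay-compensating predictor can be illustrated with a minimal sketch. This is not the paper's algorithm: the pinhole-stereo disparity relation and the linear extrapolation below are standard stand-ins, and all numeric values (focal length, baseline, sample interval, delay) are illustrative assumptions.

```python
# Hypothetical sketch: (1) observer z-distance from the horizontal
# disparity between two calibrated web cams via the standard pinhole
# stereo relation z = f * b / d, and (2) a simple linear predictor that
# extrapolates the eye position forward to compensate a known delay.
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    # focal length in pixels, camera baseline in mm, disparity in pixels
    return focal_px * baseline_mm / disparity_px

def predict(positions, dt, delay):
    # linear extrapolation from the last two samples over 'delay' seconds
    x0, x1 = positions[-2], positions[-1]
    velocity = (x1 - x0) / dt
    return x1 + velocity * delay

# illustrative numbers, not measured values from the paper
z = depth_from_disparity(800.0, 100.0, 40.0)            # → 2000.0 (mm)
x_pred = predict([100.0, 104.0], dt=0.04, delay=0.08)   # → 112.0
```

An adaptive predictor as described in the abstract would additionally adjust its coefficients online to the observer's motion, rather than using a fixed linear model.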