Languages: English
Types: Doctoral thesis
Subjects: RE

Classified by OpenAIRE into: arxiv: Physics::Optics

Conventional imaging devices capture only part of the total information carried by the light. A new generation of imaging devices, plenoptic systems, uses an array of microlenses to encode the light coming from an object into a four-dimensional function called the light field. The final image is then obtained by post-processing computations on the light field.

In this work plenoptic imaging devices are analysed using a wave optics approach. A platform for simulating light propagation under the Fresnel approximation in a generic optical system was developed in MATLAB. An optical system can be modelled as the composition of two basic operators: free space propagation and the lens. The first was implemented with an original method derived from the angular spectrum of plane waves theory of propagation; the second was implemented as a phase mask. The code was developed to optimize the signal-to-noise ratio and the computational time.

Two different configurations of plenoptic imaging systems were simulated. The first is the plenoptic 1.0 configuration. The general theory of plenoptic 1.0 and the post-processing algorithms presented in the literature were verified using the simulation platform. The effects of diffraction were also evaluated, and an original refocusing method is presented. For the second configuration, plenoptic 2.0, a full study of the optical resolution was made and a detailed analysis of the effects of diffraction is given. The results of the simulations were used to design a working prototype of a plenoptic microscope.

This novel wave optics approach makes it possible to quantify, for the first time in the literature, the effects of diffraction on this class of devices. In plenoptic 1.0, diffraction is a source of noise due to crosstalk between neighbouring lenslets. In plenoptic 2.0 systems, the optical resolution is directly proportional to the magnification of the lenslet array: a small magnification gives dense directional sampling but at the cost of optical resolution. The finite size of the lenslets, together with the wave nature of light, places a physical limit on the amount of information that can be recovered by sampling optical fields with this kind of device.
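The two basic operators mentioned above are standard building blocks of scalar wave-optics simulation. As an illustration only, the sketch below shows one way to implement them in Python/NumPy (the thesis platform itself is written in MATLAB, and its original angular spectrum variant is not reproduced here): free space propagation via the angular spectrum of plane waves, and a thin lens applied as a quadratic phase mask. All function names and numerical parameters are hypothetical placeholders.

    import numpy as np

    def propagate_angular_spectrum(field, wavelength, dx, z):
        # Free-space propagation of a sampled complex field over a distance z:
        # FFT to the spatial-frequency domain, multiply by the scalar angular
        # spectrum transfer function, inverse FFT back to the spatial domain.
        n = field.shape[0]                      # square n x n grid assumed
        fx = np.fft.fftfreq(n, d=dx)            # spatial frequencies [1/m]
        FX, FY = np.meshgrid(fx, fx)
        k = 2.0 * np.pi / wavelength
        kz_sq = k**2 - (2.0 * np.pi * FX)**2 - (2.0 * np.pi * FY)**2
        kz = np.sqrt(np.maximum(kz_sq, 0.0))
        H = np.exp(1j * z * kz) * (kz_sq > 0)   # evanescent components suppressed
        return np.fft.ifft2(np.fft.fft2(field) * H)

    def thin_lens(field, wavelength, dx, f):
        # Thin lens of focal length f applied as a quadratic phase mask.
        n = field.shape[0]
        x = (np.arange(n) - n // 2) * dx
        X, Y = np.meshgrid(x, x)
        k = 2.0 * np.pi / wavelength
        return field * np.exp(-1j * k * (X**2 + Y**2) / (2.0 * f))

    # Example: diffraction-limited image of an on-axis point source in a
    # 2f-2f single-lens geometry (all numbers are placeholders).
    wavelength, dx, n, f = 633e-9, 2e-6, 1024, 50e-3
    u = np.zeros((n, n), dtype=complex)
    u[n // 2, n // 2] = 1.0                     # point source in the object plane
    u = propagate_angular_spectrum(u, wavelength, dx, 2 * f)   # object -> lens
    u = thin_lens(u, wavelength, dx, f)                        # lens phase mask
    u = propagate_angular_spectrum(u, wavelength, dx, 2 * f)   # lens -> image plane
    psf = np.abs(u)**2                          # intensity point spread function

In practice the grid pitch, grid size and propagation distances must be chosen so that the transfer function and the propagated field are not aliased; balancing that constraint against run time is related to the signal-to-noise and computation-time optimization mentioned above.

The post-processing side can be sketched in the same spirit. The thesis presents an original refocusing method, which is not reproduced here; the snippet below only illustrates the standard shift-and-sum synthetic refocus from the literature, applied to a 4D light field stored as sub-aperture intensity images, again with hypothetical names and simple integer-pixel shifts.

    def refocus_shift_and_sum(light_field, slope):
        # Standard shift-and-sum synthetic refocus of a 4D light field
        # L[u, v, s, t]: each sub-aperture image is translated in (s, t) by an
        # amount proportional to its angular coordinate (u, v), then all images
        # are averaged. `slope` selects the synthetic focal plane; slope = 0
        # reproduces the conventional photograph focused on the original plane.
        nu, nv, ns, nt = light_field.shape
        out = np.zeros((ns, nt))
        for u in range(nu):
            for v in range(nv):
                du = int(round(slope * (u - nu // 2)))
                dv = int(round(slope * (v - nv // 2)))
                out += np.roll(light_field[u, v], (du, dv), axis=(0, 1))
        return out / (nu * nv)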
Table of contents (excerpt):

  • 1 Theory and Methods of Plenoptic Imaging
      1.1 Introduction
      1.2 Plenoptic Function and the Light Field
      1.3 Plenoptic Camera
      1.4 Plenoptic Camera 1.0
          1.4.1 Phase Space in Plenoptic 1.0
          1.4.2 Light Field Parameterization
      1.5 Plenoptic 1.0 Rendering
      1.6 Plenoptic Camera 2.0
      1.7 Plenoptic Camera 2.0: a Geometrical Optics Analysis
      1.8 Plenoptic 2.0 Rendering
      1.9 Conclusions
  • 2 Fresnel Simulation Toolbox: Theoretical Prerequisites
      2.1 Introduction
      2.2 Scalar Theory of Diffraction
          2.2.1 Helmholtz Equation
  • 3 Fresnel Simulation Toolbox: Applications
      3.1 Introduction
      3.2 Performances of the Angular Spectrum Methods
      3.3 Comparison between the Free Space Propagation Operators
          3.3.1 Description of the System
          3.3.2 Image of a Point Source
      3.4 Coherence
          3.4.1 Temporal Coherence
          3.4.2 Spatial Coherence
          3.4.3 Simulation of Spatial Coherence
          3.4.4 Simulation of Temporal Coherence
      3.5 Optimization of Coherence Parameters
          3.5.1 Optimization of Spatial Coherence
          3.5.2 Optimization of Temporal Coherence
          3.5.3 Coherent Imaging vs Incoherent Imaging
      3.6 Conclusions
  • 4 Simulations of a Plenoptic 1.0 System
      4.1 Introduction
      4.2 Description of the System
          4.2.1 Diffraction Effects on the System
      4.3 Advanced Rendering Techniques
          4.3.1 Changing Aperture
          4.3.2 Changing the Point of View
      4.4 Depth Estimation
      4.5 Synthetic Refocus
          4.5.1 Synthetic Refocus Algorithm
          4.5.2 Refocusing Operator
      4.6 Results of the Simulations
      4.7 Focal Stack Depth Estimation
      4.8 Conclusions
  • 6 Light Field Microscope
      6.1 From Simulation to Reality
      6.2 Description of the System
      6.3 Image Acquisition Protocol
      6.4 Rendering of experimental data
      6.5 Future Work

List of figures (excerpt):

  • 1.8 F-number matching between the main lens and the micro lens in the array