J. M. Trujillo-Sevilla
University of La Laguna
Publication
Featured research published by J. M. Trujillo-Sevilla.
Proceedings of SPIE | 2011
J. M. Rodríguez-Ramos; J. P. Lüke; R. López; José Gil Marichal-Hernández; I. Montilla; J. M. Trujillo-Sevilla; Bruno Femenia; Marta Puga; M. López; J. J. Fernández-Valdivia; F. Rosa; C. Dominguez-Conde; J. C. Sanluis; Luis Fernando Rodríguez-Ramos
Plenoptic cameras have been developed over the last few years as a passive method for 3D scanning. Several super-resolution algorithms have been proposed to compensate for the resolution loss associated with light-field acquisition through a microlens array. A number of multiview stereo algorithms have also been applied to extract depth information from plenoptic frames. Real-time systems have been implemented using specialized hardware such as Graphics Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). In this paper, we present our own implementations of the aforementioned aspects, together with two new developments: a portable plenoptic objective that turns any conventional 2D camera into a 3D CAFADIS plenoptic camera, and the novel use of a plenoptic camera as a wavefront phase sensor for adaptive optics (AO). The terrestrial atmosphere degrades telescope images through the refractive index changes associated with turbulence. These changes require high-speed processing, which justifies the use of GPUs and FPGAs. Artificial sodium laser guide stars (Na-LGS, at 90 km altitude) must be used to obtain the reference wavefront phase and the Optical Transfer Function of the system, but they are affected by defocus because of their finite distance to the telescope. Using the telescope as a plenoptic camera allows us to correct this defocus and to recover the wavefront phase tomographically. These advances significantly increase the versatility of the plenoptic camera and provide a new contribution relating the fields of wave optics and computer vision, as many authors have advocated.
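As an illustration of the focal-stack synthesis that plenoptic cameras enable, the following is a minimal shift-and-sum refocusing sketch over a 4D light field. The array layout, function name, and slope parameter are illustrative assumptions, not the CAFADIS implementation described in the paper.

import numpy as np

def refocus(lightfield, slope):
    """Shift-and-sum refocusing of a 4D light field (illustrative sketch).

    lightfield : array of shape (U, V, Y, X), one (Y, X) sub-aperture image
                 per viewpoint coordinate (u, v).
    slope      : synthetic refocus parameter; each view is translated in
                 proportion to its (u, v) offset before averaging.
    """
    U, V, Y, X = lightfield.shape
    uc, vc = (U - 1) / 2.0, (V - 1) / 2.0
    out = np.zeros((Y, X))
    for u in range(U):
        for v in range(V):
            dy = int(round(slope * (u - uc)))
            dx = int(round(slope * (v - vc)))
            out += np.roll(lightfield[u, v], shift=(dy, dx), axis=(0, 1))
    return out / (U * V)

# A focal stack is the same light field refocused at several slopes, e.g.:
# focal_stack = [refocus(lf, s) for s in np.linspace(-2, 2, 9)]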
Proceedings of SPIE | 2012
Luis Fernando Rodríguez-Ramos; I. Montilla; J. P. Lüke; R. López; José Gil Marichal-Hernández; J. M. Trujillo-Sevilla; Bruno Femenia; M. López; J. J. Fernández-Valdivia; Marta Puga; F. Rosa; J. M. Rodríguez-Ramos
Plenoptic cameras have been developed over the last few years as a passive method for 3D scanning, allowing focal-stack capture from a single shot. The data recorded by this kind of sensor can also be used to extract the wavefront phases associated with atmospheric turbulence in an astronomical observation. The terrestrial atmosphere degrades telescope images through the refractive index changes associated with turbulence. Artificial sodium laser guide stars (Na-LGS, at 90 km altitude) must be used to obtain the reference wavefront phase and the Optical Transfer Function of the system, but they are affected by defocus because of their finite distance to the telescope. Using the telescope as a plenoptic camera allows us to correct this defocus and to recover the wavefront phase tomographically, taking advantage of the two principal capabilities of plenoptic sensors at the same time: 3D scanning and wavefront sensing. Plenoptic sensors can therefore be studied and used as an alternative wavefront sensor for adaptive optics, which is particularly relevant now that Extremely Large Telescope projects are being undertaken. In this paper, we present the first wavefront phases extracted from real astronomical observations, using both point-like and extended objects, and we show that the restored wavefronts match Kolmogorov atmospheric turbulence.
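For context on the Kolmogorov statistics that the restored wavefronts are compared against, the sketch below generates a Kolmogorov phase screen by FFT filtering of white noise. Grid size, pixel scale, Fried parameter and the overall normalization are illustrative assumptions, not the paper's simulation pipeline.

import numpy as np

def kolmogorov_screen(n=256, pixel_scale=0.01, r0=0.1, seed=0):
    """FFT-based Kolmogorov phase screen (radians); normalization approximate.

    n           : grid size in pixels
    pixel_scale : metres per pixel
    r0          : Fried parameter in metres
    """
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n, d=pixel_scale)
    fxx, fyy = np.meshgrid(fx, fx)
    f = np.sqrt(fxx**2 + fyy**2)
    f[0, 0] = 1.0                      # avoid division by zero at DC
    # Kolmogorov phase power spectrum: 0.023 r0^(-5/3) f^(-11/3)
    psd = 0.023 * r0**(-5.0 / 3.0) * f**(-11.0 / 3.0)
    psd[0, 0] = 0.0                    # remove the undefined piston term
    df = 1.0 / (n * pixel_scale)       # frequency grid spacing
    cn = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    screen = np.fft.ifft2(cn * np.sqrt(psd) * df) * n * n
    return screen.real

# A restored wavefront can be checked against Kolmogorov statistics via the
# phase structure function, D(r) ≈ 6.88 (r / r0)**(5/3).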
Optics Letters | 2014
J. M. Trujillo-Sevilla; L. F. Rodríguez-Ramos; I. Montilla; J. M. Rodríguez-Ramos
Plenoptic imaging systems are becoming more common since they provide capabilities unattainable in conventional imaging systems, but one of their main limitations is their poor two-dimensional resolution. By combining wavefront phase measurement with plenoptic image deconvolution, we propose a system capable of improving the resolution when a wavefront aberration is present and the image is blurred. In this work, a plenoptic system is simulated using Fourier optics, and the results show that an improved resolution is achieved even in the presence of strong wavefront aberrations.
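The two ingredients combined here, a Fourier-optics point spread function derived from a measured wavefront phase and a deconvolution of the blurred image, can be sketched as follows. The Wiener filter, pupil mask and SNR value are illustrative assumptions, not the simulation used in the paper.

import numpy as np

def psf_from_phase(pupil, phase):
    """Incoherent PSF from a pupil mask and a wavefront phase in radians."""
    field = pupil * np.exp(1j * phase)
    amp = np.fft.fftshift(np.fft.fft2(field))
    psf = np.abs(amp)**2
    return psf / psf.sum()

def wiener_deconvolve(image, psf, snr=100.0):
    """Restore an image blurred by a known, centered PSF (simple Wiener filter)."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    G = np.fft.fft2(image)
    W = np.conj(H) / (np.abs(H)**2 + 1.0 / snr)
    return np.real(np.fft.ifft2(W * G))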
workshop on information optics | 2014
J. M. Trujillo-Sevilla; J. J. Fernández-Valdivia; José Gil Marichal-Hernández; J. M. Rodríguez-Ramos; Luis Fernando Rodríguez-Ramos; I. Montilla
Deconvolution applied to plenoptic sensors has so far been studied only for light intensity, with no treatment of the path changes that optical rays undergo due to refractive index variations in the medium. Using the plenoptic sensor, it is possible not only to determine the wavefront phase aberration induced by atmospheric turbulence but also, with the deconvolution presented here, to restore the spatial structure of the observed source.
workshop on information optics | 2013
M. G. Thomas; José Gil Marichal-Hernández; J. J. Fernández-Valdivia; J. M. Trujillo-Sevilla; J. M. Rodríguez-Ramos; I. Montilla
The CAFADIS plenoptic camera was mounted onto a Leica M205A stereomicroscope and used as a prototype light field microscope. The ability to generate image stacks focused at different focal planes from a single plenoptic image was demonstrated. The focal stacks were used to create a composite extended focus image and depth map. The positions of the focal planes of the stack were measured experimentally and the data used to calibrate a computational model of the plane distribution. The resulting microscope images have an extended depth of field and a corresponding depth map with real distance estimates.
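The shape-from-focus step used to turn a focal stack into a depth map and an extended-focus composite can be sketched as below. The focus measure (locally averaged squared Laplacian), window size and names are assumptions, not the paper's exact processing chain; real distances additionally require the focal-plane calibration the paper describes.

import numpy as np
from scipy.ndimage import laplace, uniform_filter

def shape_from_focus(focal_stack, window=9):
    """focal_stack: array of shape (Z, Y, X), one image per focal plane.

    Returns (depth_index, extended_focus): the index of the best-focused
    slice at each pixel and the corresponding all-in-focus composite.
    """
    stack = np.asarray(focal_stack, dtype=float)
    # Per-pixel focus measure: locally averaged squared Laplacian response.
    focus = np.stack([uniform_filter(laplace(s)**2, size=window) for s in stack])
    depth_index = np.argmax(focus, axis=0)
    extended_focus = np.take_along_axis(stack, depth_index[None, ...], axis=0)[0]
    return depth_index, extended_focus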
Proceedings of SPIE | 2014
J. M. Trujillo-Sevilla; Luis Fernando Rodríguez-Ramos; Juan J. Fernández-Valdivia; José Gil Marichal-Hernández; J. M. Rodríguez-Ramos
Modern astronomical telescopes take advantage of multi-conjugate adaptive optics, in which wavefront sensors play a key role. A single sensor capable of measuring wavefront phases at any angle of observation would help improve atmospheric tomographic reconstruction. A new sensor combining both geometric and plenoptic arrangements is proposed, and a simulation demonstrating its working principle is shown. Results indicate that this sensor is feasible, and that single extended objects can be used to perform tomography of atmospheric turbulence.
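The linear forward model behind such atmospheric tomography can be sketched as follows: the pupil-plane phase seen along each observation direction is the sum of the turbulent layer phases, each sampled at a footprint laterally shifted in proportion to layer height and viewing angle. Layer heights, directions and grid parameters here are illustrative assumptions, not the configuration simulated in the paper.

import numpy as np
from scipy.ndimage import shift as nd_shift

def pupil_phase(layers, heights, direction, pixel_size):
    """Sum turbulence layers along one line of sight (cone effect ignored).

    layers    : list of 2D phase screens (radians), one per altitude
    heights   : layer altitudes in metres
    direction : (theta_x, theta_y) observation angle in radians
    pixel_size: metres per pixel in the layer grids
    """
    total = np.zeros_like(layers[0])
    for layer, h in zip(layers, heights):
        # Lateral footprint shift of this layer for the given direction.
        dx = h * direction[0] / pixel_size
        dy = h * direction[1] / pixel_size
        total += nd_shift(layer, (dy, dx), order=1, mode='nearest')
    return total

# Tomography inverts this relation: given pupil phases measured along several
# directions (e.g. one per guide star or extended object), recover the layers.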
Three-Dimensional Imaging, Visualization, and Display 2018 | 2018
David Carmona-Ballester; J. M. Trujillo-Sevilla; Lara Díaz-García; Daniel Walo; Ángela Hernández-Delgado; J. J. Fernández-Valdivia; Óscar Casanova-González; Óscar Gómez-Cárdenes; J. M. Rodríguez-Ramos
In this work we have presented a brief insight into the capability of multilayer displays to selectively show information depending on the observer. We labeled the views of a light field as blocked or non-blocked, and a predefined text was then assigned accordingly, modified to satisfy a privacy criterion in the blocked case. Two ways of defining the private views were presented. The output of both techniques was evaluated in simulation, in both the spatial and frequency domains. Results showed that privacy was achievable and that each technique had an optimal operating point when the time-multiplexing capabilities of the multilayer display were taken into account. A trade-off between the quality of the blocked and non-blocked views was also found.
Astronomy and Astrophysics | 2018
David Carmona-Ballester; J. M. Trujillo-Sevilla; Sergio Bonaque-González; Óscar Gómez-Cárdenes; J. M. Rodríguez-Ramos
Context. Increasing the area of sky over which atmospheric turbulence can be corrected is of wide interest in astrophysics, especially with a new generation of extremely large telescopes (ELTs) coming in the near future. Aims. In this study we tested whether a method used for visual representation in three-dimensional displays, weighted nonnegative tensor factorization (WNTF), can improve the quality of atmospheric tomography (AT) reconstruction compared to a more standard method, a randomized Kaczmarz algorithm. Methods. A total of 1000 different atmospheres were simulated and recovered by both methods. Reconstruction was computed for two and three layers and for four different constellations of laser guide stars (LGS). The performance of both methods was assessed by means of the radial average of the Strehl ratio across the field of view of an 8 m diameter telescope with a sky coverage of 97.8 arcsec. Results. The proposed method significantly outperformed the Kaczmarz algorithm in all tested cases (p ≤ 0.05). Within WNTF, the three-layer configuration provided better outcomes, but there was no clear relation between the different LGS constellations and the quality of the Strehl ratio maps. Conclusions. The WNTF method is a novel technique in astronomy, and its use to recover atmospheric turbulence profiles was proposed and tested here. It showed better reconstruction quality than a conventional Kaczmarz algorithm, independently of the number and height of the recovered atmospheric layers and of the LGS constellation used. The WNTF method thus appears to be a useful tool in highly ill-posed AT problems, where classical algorithms struggle to produce high-Strehl maps.
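For reference, the baseline compared against is a randomized Kaczmarz solver for the linear system A x = b arising from the tomography problem. The sketch below is a generic implementation with row sampling proportional to squared row norms, as in the standard randomized variant; the iteration count, stopping rule and names are assumptions rather than the paper's settings.

import numpy as np

def randomized_kaczmarz(A, b, iters=10000, seed=0):
    """Solve A x ≈ b by randomized Kaczmarz row projections.

    Rows are sampled with probability proportional to their squared norm,
    following the standard randomized Kaczmarz scheme.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.einsum('ij,ij->i', A, A)
    probs = row_norms / row_norms.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # Project the current estimate onto the hyperplane of row i.
        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x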
Three-Dimensional Imaging, Visualization, and Display 2017 | 2017
Óscar Gómez-Cárdenes; José Gil Marichal-Hernández; J. M. Trujillo-Sevilla; David Carmona-Ballester; J. M. Rodríguez-Ramos
The discrete Radon transform (DRT) calculates, with linearithmic complexity, the sum of pixels along a set of discrete lines covering all possible slopes and intercepts in an image. In 2006, a method was proposed to compute the inverse DRT that remains exact and fast despite being iterative. In this work the DRT pair is used to build ridgelet and curvelet transforms that perform focus measurement of an image. The shape-from-focus approach based on the DRT pair is then applied to a focal stack to create a depth map of a scene.
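To make the notion of "sums along discrete lines over all slopes and intercepts" concrete, the sketch below computes one quadrant of such sums by brute force. It is an illustrative O(N^3) version only; the fast DRT referred to in the abstract reaches the same sums in linearithmic time through recursive reuse of partial sums, which this sketch does not reproduce.

import numpy as np

def discrete_radon_quadrant(img):
    """Brute-force sums along discrete lines y = x * s / (N - 1) + t
    for non-negative slopes 0..1 (one quadrant); img is assumed square.
    out[s, t] holds the sum along slope index s and intercept t.
    """
    N = img.shape[0]
    out = np.zeros((N, N))
    xs = np.arange(N)
    for s in range(N):
        ys = (xs * s) // max(N - 1, 1)   # discrete line through the image
        for t in range(N):
            yy = ys + t
            valid = yy < N               # clip the line at the image border
            out[s, t] = img[yy[valid], xs[valid]].sum()
    return out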
workshop on information optics | 2016
J. J. Fernández-Valdivia; J. M. Trujillo-Sevilla; O. Casanova-Gonzalez; Roberto López; S. Velasco; Carlos Colodro-Conde; Marta Puga; Alejandro Oscoz; R. Rebolo; Craig D. Mackay; Antonio Pérez-Garrido; Luis Fernando Rodríguez-Ramos; David A. King; Lucas Labadie; Balaji Muthusubramanian; G. Rodriguez-Coira; J. M. Rodríguez-Ramos
This paper presents a method to recover the wavefront phase at the telescope pupil, distorted by the action of the atmosphere, and its use to drive a deformable mirror that compensates for the optical aberrations in real time (AOLI instrument). For this purpose, an evolution of the geometric sensor [1] was used to restore the wavefront from two defocused images. Furthermore, using specialized hardware, the computations can be performed in real time, within the stability time of the atmosphere.
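A standard way to recover a wavefront from two defocused intensity images is the transport-of-intensity equation (TIE), solved here with an FFT-based Poisson inversion. This is shown only as an illustrative alternative and is not the geometric sensor of [1]; the function name, the uniform-illumination approximation and all parameters are assumptions.

import numpy as np

def tie_phase(i_minus, i_plus, dz, wavelength, pixel_size):
    """Transport-of-intensity phase recovery from two defocused images.

    i_minus, i_plus : square intensity images recorded a distance dz before
                      and after focus (nearly uniform illumination assumed).
    Returns the estimated wavefront phase in radians.
    """
    k = 2.0 * np.pi / wavelength
    i0 = 0.5 * (i_minus + i_plus)
    didz = (i_plus - i_minus) / (2.0 * dz)          # axial intensity derivative
    n = i_minus.shape[0]
    fx = np.fft.fftfreq(n, d=pixel_size)
    fxx, fyy = np.meshgrid(fx, fx)
    freq2 = (2.0 * np.pi)**2 * (fxx**2 + fyy**2)
    freq2[0, 0] = np.inf                            # suppress the piston term
    # For ~uniform intensity the TIE reduces to dI/dz = -(I0 / k) * laplacian(phi),
    # so laplacian(phi) = -k * dI/dz / I0, inverted here in Fourier space.
    rhs = -k * didz / np.mean(i0)
    phi = np.real(np.fft.ifft2(np.fft.fft2(rhs) / (-freq2)))
    return phi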