Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Juan B. Hurtado-Ramos is active.

Publication


Featured research published by Juan B. Hurtado-Ramos.


International Journal of Advanced Robotic Systems | 2011

LIDAR Velodyne HDL-64E Calibration Using Pattern Planes

Gerardo Atanacio-Jiménez; José-Joel González-Barbosa; Juan B. Hurtado-Ramos; Francisco J. Ornelas-Rodríguez; Hugo Jiménez-Hernández; Teresa García-Ramirez; Ricardo Gonzalez-Barbosa

This work describes a method for calibrating the Velodyne HDL-64E scanning LIDAR system. The principal contributions are a pattern calibration signature, a mathematical model, and a numerical algorithm for computing the LIDAR's calibration parameters. The calibration pattern's main objective is to minimize systematic errors due to geometric calibration factors. An algorithm for solving the intrinsic and extrinsic parameters is described. Finally, the calibration uncertainty is calculated from the standard deviation of the calibration residual errors.
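The core of plane-based calibration is fitting a plane to scan points and using the fit residuals as an error measure. The sketch below shows that step in isolation, with an illustrative synthetic plane rather than the paper's actual pattern signature, and takes the residual standard deviation as a simple uncertainty estimate:

```python
import math

def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to 3D points.

    Returns (a, b, c) and the standard deviation of the residuals,
    which serves as a simple per-plane uncertainty estimate.
    """
    # Build the 3x3 normal equations A^T A x = A^T z.
    sxx = sxy = sx = syy = sy = n = 0.0
    sxz = syz = sz = 0.0
    for x, y, z in points:
        sxx += x * x; sxy += x * y; sx += x
        syy += y * y; sy += y; n += 1.0
        sxz += x * z; syz += y * z; sz += z
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    b = [sxz, syz, sz]
    # Gaussian elimination with partial pivoting (3x3 system).
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]; b[i], b[p] = b[p], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 3):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, 3))) / A[i][i]
    a_, b_, c_ = x
    resid = [z - (a_ * xx + b_ * yy + c_) for xx, yy, z in points]
    std = math.sqrt(sum(r * r for r in resid) / len(resid))
    return (a_, b_, c_), std

# Illustrative "pattern plane": z = 0.5x - 0.2y + 3 with small
# alternating deviations standing in for scan noise.
pts = [(x, y, 0.5 * x - 0.2 * y + 3.0 + 0.01 * ((x + y) % 2 - 0.5))
       for x in range(10) for y in range(10)]
(pa, pb, pc), sigma = fit_plane(pts)
```

The checkerboard noise pattern is orthogonal to the plane's regressors on this grid, so the fit recovers the plane exactly and the residual standard deviation equals the injected deviation.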


Measurement Science and Technology | 2014

3D reconstruction of hollow parts analyzing images acquired by a fiberscope

Octavio Icasio-Hernández; José-Joel González-Barbosa; Juan B. Hurtado-Ramos; Miguel Viliesid-Alonso

A modified fiberscope used to reconstruct difficult-to-reach inner structures is presented. By substituting the fiberscope's original illumination system, we can project a profile-revealing light line inside the object of study. The light line is obtained using a sandwiched power light-emitting diode (LED) attached to an extension arm on the tip of the fiberscope. Profile images from the interior of the object are then captured by a camera attached to the fiberscope's eyepiece. Using a series of those images at different positions, the system is capable of generating a 3D reconstruction of the object with submillimeter accuracy. Also proposed is the use of a combination of known filters to remove the honeycomb structures produced by the fiberscope, and the use of ring gages to obtain the extrinsic parameters of the camera attached to the fiberscope and the metrological traceability of the system. Several standard ring diameter measurements were compared against their certified values to improve the accuracy of the system. To exemplify an application, a 3D reconstruction of the interior of a refrigerator duct was conducted. This reconstruction includes an accuracy assessment comparing the measurements of the system to a coordinate measuring machine. The system, as described, is capable of 3D reconstruction of the interior of objects with uniform and non-uniform profiles from 10 to 60 mm in transversal dimensions and a depth of 1000 mm, provided the material of the object's walls is translucent and allows detection of the power LED light from the exterior through the wall. If this is not possible, we propose the use of a magnetic scale, which reduces the working depth to 170 mm. The assessed accuracy is around ±0.15 mm in 2D cross-section reconstructions, ±1.3 mm in 1D position using a magnetic scale, and ±0.5 mm using a CCD camera.
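The geometric principle behind light-line profiling is ray-plane triangulation: each illuminated pixel defines a viewing ray, and intersecting that ray with the known light plane yields a 3D profile point. The sketch below shows only this generic step; the plane, rays, and units are illustrative, not the paper's calibrated fiberscope geometry:

```python
def triangulate_profile(pixel_rays, light_plane_n, light_plane_d):
    """Intersect camera viewing rays with the light plane n . p = d.

    Each ray starts at the camera center (taken as the origin) with
    direction (dx, dy, dz); the profile point is t * ray, where
    n . (t * ray) = d gives t.
    """
    points = []
    nx, ny, nz = light_plane_n
    for dx, dy, dz in pixel_rays:
        denom = nx * dx + ny * dy + nz * dz
        if abs(denom) < 1e-12:      # ray parallel to the light plane
            continue
        t = light_plane_d / denom
        points.append((t * dx, t * dy, t * dz))
    return points

# Hypothetical light plane z = 100 mm in front of the camera:
# normal (0, 0, 1), offset 100.
rays = [(0.0, 0.0, 1.0), (0.1, 0.0, 1.0), (0.0, -0.05, 1.0)]
profile = triangulate_profile(rays, (0.0, 0.0, 1.0), 100.0)
```

Sweeping the fiberscope along the bore and repeating this per image is what stacks 2D cross-sections into a 3D reconstruction.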


Journal of Applied Remote Sensing | 2016

Accurate evaluation of sensitivity for calibration between a LiDAR and a panoramic camera used for remote sensing

Angel-Iván García-Moreno; José-Joel González-Barbosa; Alfonso Ramírez-Pedraza; Juan B. Hurtado-Ramos; Francisco-Javier Ornelas-Rodriguez

Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and on the use of different sensors, which implies dependency on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor in the calibration of a panoramic camera–LiDAR system. Both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests we analyze and compare the calibration parameters using the Monte Carlo and Latin hypercube sampling techniques. The sensitivity of each variable involved in the calibration was computed by the Sobol method, which is based on variance decomposition, and by the Fourier amplitude sensitivity test method, which is based on Fourier analysis. Sensitivity analysis is an essential tool in simulation modeling and for performing error propagation assessments.
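The first-order Sobol index S1 = Var(E[f | x1]) / Var(f) can be sketched with a brute-force double-loop estimator: fix x1, average the model over the other input, then take the variance of those conditional means. The toy two-input model and sample sizes below are illustrative stand-ins for the paper's calibration model:

```python
import random

def first_order_sobol(f, n_outer=300, n_inner=300, seed=7):
    """Estimate the first-order Sobol index of x1 for f(x1, x2),
    with x1, x2 ~ U(0, 1):  S1 = Var(E[f | x1]) / Var(f)."""
    rng = random.Random(seed)
    cond_means, all_vals = [], []
    for _ in range(n_outer):
        x1 = rng.random()
        # Inner loop: average over x2 with x1 held fixed.
        vals = [f(x1, rng.random()) for _ in range(n_inner)]
        cond_means.append(sum(vals) / n_inner)
        all_vals.extend(vals)
    def var(v):
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / len(v)
    return var(cond_means) / var(all_vals)

# Toy model: x2 contributes twice the amplitude of x1, so analytically
# S1 = Var(x1) / (Var(x1) + 4 * Var(x2)) = 0.2.
s1 = first_order_sobol(lambda x1, x2: x1 + 2.0 * x2)
```

In practice dedicated estimators (Saltelli sampling, as implemented in sensitivity-analysis libraries) need far fewer model evaluations than this double loop; the brute-force form is shown only because it maps directly onto the definition.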


The International Commission for Optics | 2011

Illuminance-spatial-distribution-based total luminous flux determination for white LEDs

N. Vidal; E. Rosas; Juan B. Hurtado-Ramos

Here we report on the application of the well-known photo-goniometric method, based on direct measurements of the illuminance spatial distribution, to determine the total luminous flux of high-intensity white LED sources, thus testing the CENAM primary metrology capabilities recently developed to meet the growing needs of the rapidly moving solid-state lighting industrial sector in Mexico. These first results, obtained at CENAM after implementing the gonio-photometric method, allowed us to determine with good accuracy the total luminous flux of a high-intensity white LED source, with an estimated uncertainty of 2.86% (k=1), still lower than that claimed by the tested LED manufacturer. We clearly identified the determination of the spectral mismatch correction factor and the measurement and control of the LED junction temperature as the dominant uncertainty sources; these will be addressed in order to improve the accuracy of the measurement system in future experiments.
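The goniophotometric principle is to integrate the measured illuminance over a sphere enclosing the source: Phi = ∫∫ E(theta, phi) r² sin(theta) dtheta dphi. A minimal numeric sketch, checked against an isotropic source (for which Phi = 4·pi·I); the intensity and radius are illustrative, not CENAM's setup:

```python
import math

def total_flux(illuminance, r, n_theta=180, n_phi=360):
    """Integrate illuminance E(theta, phi) [lx], measured on a sphere
    of radius r [m], into total luminous flux [lm]:

        Phi = integral of E * r^2 * sin(theta) dtheta dphi
    """
    dt = math.pi / n_theta
    dp = 2.0 * math.pi / n_phi
    flux = 0.0
    for i in range(n_theta):
        theta = (i + 0.5) * dt          # midpoint rule in theta
        for j in range(n_phi):
            phi = (j + 0.5) * dp
            flux += illuminance(theta, phi) * r * r * math.sin(theta) * dt * dp
    return flux

# Sanity check: an isotropic source of luminous intensity I = 50 cd
# gives E = I / r^2 everywhere, so Phi should be 4 * pi * I.
I, r = 50.0, 2.0
flux = total_flux(lambda th, ph: I / (r * r), r)
```

A real LED is strongly directional, so `illuminance` would be a table of goniometer readings interpolated over (theta, phi) rather than a closed-form function.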


Automatika | 2017

Performance evaluation of a portable 3D vision coordinate measuring system

Octavio Icasio-Hernández; José-Joel González-Barbosa; Yajaira-Ilse Curiel-Razo; Juan B. Hurtado-Ramos

In this work, we present a portable 3D vision coordinate measuring machine (PCMM) for short-range, real-time photogrammetry. The PCMM performs 3D measurements of points using a single camera in combination with a hand tool and a computer. The hand tool has infrared LEDs serving as photogrammetric targets. The positions of these targets were pre-calibrated with an optical coordinate measuring machine, defining a local coordinate system on the hand tool. The camera has an infrared filter to exclude all ambient light except the infrared targets. Positions of the imaged infrared targets are converted to 3D coordinates using pixel positions and the pre-calibrated positions of the targets. We also present a set of criteria for selecting the infrared LEDs and the camera filter, a camera calibration method, tracking and pose algorithms, and a 3D coordinate error correction for the PCMM. The correction is performed using the PCMM as a range meter, which involves comparing the 3D coordinate points of the PCMM with a coordinate measuring machine and then generating a look-up table (LUT) for correction. The global error of the PCMM was evaluated under ASME B89.4.22-2004. Sphere and single-point errors were around 1 mm, and the volumetric error was under 3 mm.
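The LUT correction described above amounts to pairing PCMM readings with CMM references, storing the error per range, and interpolating for new readings. A minimal sketch with hypothetical comparison data (the values below are not the paper's):

```python
import bisect

def build_lut(measured, reference):
    """Pair each PCMM reading with its CMM reference to build a
    correction table: sorted measured values and their errors."""
    table = sorted(zip(measured, reference))
    xs = [m for m, _ in table]
    errs = [ref - m for m, ref in table]
    return xs, errs

def correct(value, xs, errs):
    """Correct a new reading by linearly interpolating the LUT error;
    outside the table, the nearest error is reused (flat extrapolation)."""
    if value <= xs[0]:
        return value + errs[0]
    if value >= xs[-1]:
        return value + errs[-1]
    i = bisect.bisect_right(xs, value)
    f = (value - xs[i - 1]) / (xs[i] - xs[i - 1])
    return value + errs[i - 1] + f * (errs[i] - errs[i - 1])

# Hypothetical data: the PCMM reads slightly long, with the error
# growing with range (all values in mm).
meas = [100.0, 200.0, 300.0, 400.0]
ref  = [ 99.8, 199.5, 299.1, 398.6]
xs, errs = build_lut(meas, ref)
corrected = correct(250.0, xs, errs)
```

At 250 mm the interpolated error is -0.7 mm, so the corrected reading is 249.3 mm.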


Computer Science and Software Engineering | 2014

Mobile remote sensing platform: An uncertainty calibration analysis

Angel-Iván García-Moreno; José-Joel González-Barbosa; Juan B. Hurtado-Ramos; Francisco-Javier Ornelas-Rodriguez

This paper presents a method to estimate the uncertainty in the calibration of two sensors, a laser rangefinder and a multi-camera system. Both sensors were used in urban environment reconstruction tasks. A new calibration pattern, visible to both sensors, is proposed. By this means, the correspondence between each laser point and its position in the camera image is obtained, so that the texture and color of each LIDAR point can be known. This also allows systematic errors in individual sensors to be minimized and, when multiple sensors are used, minimizes the systematic contradiction between them to enable reliable multisensor data fusion. A practical methodology is presented to carry out the uncertainty analysis within the calibration process between both sensors. Statistics of the behavior of the camera calibration parameters, and of the relationship between the camera, the LIDAR, and the noise of both sensors, were calculated using simulations, an analytical analysis, and experiments. Results for the calibration and uncertainty analysis are presented for data collected by the platform, integrated with a LIDAR and the panoramic camera.
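Once the extrinsics between the two sensors are calibrated, coloring a LIDAR point reduces to transforming it into the camera frame and projecting it to a pixel. A minimal sketch using a pinhole model in place of the paper's panoramic camera model; the rotation, translation, and intrinsics below are hypothetical:

```python
def project_point(p_lidar, R, t, fx, fy, cx, cy):
    """Transform a LIDAR point into the camera frame (p_cam = R p + t)
    and project it with the pinhole model:
    u = fx * X / Z + cx,  v = fy * Y / Z + cy."""
    X = sum(R[0][k] * p_lidar[k] for k in range(3)) + t[0]
    Y = sum(R[1][k] * p_lidar[k] for k in range(3)) + t[1]
    Z = sum(R[2][k] * p_lidar[k] for k in range(3)) + t[2]
    if Z <= 0:
        return None                     # point behind the camera
    return (fx * X / Z + cx, fy * Y / Z + cy)

# Identity extrinsics for illustration: a point on the optical axis
# must land on the principal point (cx, cy).
R_id = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
uv = project_point((0.0, 0.0, 5.0), R_id, (0.0, 0.0, 0.0),
                   800.0, 800.0, 320.0, 240.0)
```

Uncertainty analysis then asks how (u, v) moves when R, t, and the intrinsics are perturbed within their calibrated error bounds.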


Archive | 2014

Uncertainty Analysis of LIDAR and Panoramic Camera Calibration

Angel-Iván García-Moreno; José-Joel González-Barbosa; Francisco-Javier Ornelas-Rodriguez; Juan B. Hurtado-Ramos

Terrestrial platforms for 3D reconstruction typically combine several data acquisition systems, such as lasers, cameras, and inertial systems. However, the geometric combination of different sensors requires their calibration and data fusion. These are important tasks for vision-based systems, since calibration estimates the values of sensor model parameters, such as those of cameras. The uncertainty of these parameters must be known in order to evaluate the error of the final calibration and its applications. The aim of this paper is to present a method to compute the calibration of both sensors. A new calibration pattern, visible to both sensors, is used. Once the correspondence between each laser point and its position in the image is obtained, the texture and color of each LIDAR point can be known. Experimental results are presented for data collected with the platform, integrated with a 3D laser scanner and a panoramic camera system.


Eighth Symposium Optics in Industry | 2011

Traffic infraestructure inventory system by analyses images

Aldo I. Rico Martinez; J. C. Camarillo-Paz; Francisco-Javier Ornelas-Rodriguez; José-Joel González-Barbosa; Yazmín C. Peña Cheng; Juan B. Hurtado-Ramos

This paper describes a computer vision system designed to perform an inventory of traffic signals. The system consists of five Ethernet-synchronized cameras; the acquisition strategy allows us to take one image per camera every other second. We then use those five images to generate a panoramic image each second. Signal detection and recognition are carried out offline. Detection of traffic signals is done in the panoramic image using the Hough transform and enhancement of the HSV color space. Traffic signal recognition is performed by a combination of Haar wavelets and the Viola-Jones classifier. Finally, we present experimental results using a database of one hundred images.
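The HSV enhancement step relies on the fact that sign colors occupy a narrow hue band regardless of brightness. A minimal sketch of hue-based gating for red signs; the hue band and saturation/value thresholds are illustrative choices, not the paper's values, and the Hough shape detection that follows is omitted:

```python
import colorsys

def red_mask(rgb_image, s_min=0.5, v_min=0.3):
    """Flag pixels whose hue falls in the red band (traffic-sign red).

    rgb_image: rows of (r, g, b) tuples with channels in [0, 1].
    Hue wraps around 1.0, so "red" means hue < 0.05 or hue > 0.95,
    gated by minimum saturation and value to reject gray and dark pixels.
    """
    mask = []
    for row in rgb_image:
        mrow = []
        for r, g, b in row:
            h, s, v = colorsys.rgb_to_hsv(r, g, b)
            is_red = (h < 0.05 or h > 0.95) and s >= s_min and v >= v_min
            mrow.append(is_red)
        mask.append(mrow)
    return mask

# Tiny illustrative image: a red pixel, a gray pixel, a blue pixel.
img = [[(0.9, 0.1, 0.1), (0.5, 0.5, 0.5), (0.1, 0.1, 0.9)]]
m = red_mask(img)
```

The resulting binary mask is the kind of input a circular Hough transform would then scan for sign-shaped regions.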


MAPAN | 2018

Calibration of Endoscopic Systems Coupled to a Camera and a Structured Light Source

Octavio Icasio-Hernández; Juan B. Hurtado-Ramos; José-Joel González-Barbosa


Journal of Computing and Information Science in Engineering | 2015

Complete Sensitivity Analysis in a LiDAR-Camera Calibration Model

Angel-Iván García-Moreno; José-Joel González-Barbosa; Alfonso Ramírez-Pedraza; Juan B. Hurtado-Ramos; Francisco-Javier Ornelas-Rodriguez

Collaboration


Dive into Juan B. Hurtado-Ramos's collaborations.

Top Co-Authors

Alfonso Ramírez-Pedraza

Instituto Politécnico Nacional


Gerardo Atanacio-Jiménez

Instituto Tecnológico de Querétaro


Hugo Jiménez-Hernández

Autonomous University of Queretaro


Teresa García-Ramirez

Autonomous University of Queretaro


Yajaira-Ilse Curiel-Razo

Instituto Politécnico Nacional
