
Publication


Featured research published by Francisco-Javier Ornelas-Rodriguez.


Mexican Conference on Pattern Recognition | 2013

LIDAR and Panoramic Camera Extrinsic Calibration Approach Using a Pattern Plane

Angel-Iván García-Moreno; José-Joel González-Barbosa; Francisco-Javier Ornelas-Rodriguez; Juan-Bautista Hurtado-Ramos; Marco-Neri Primo-Fuentes

Mobile platforms typically combine several data acquisition systems such as lasers, cameras, and inertial systems. However, the geometric combination of the different sensors requires their calibration, at least through the definition of the extrinsic parameters, i.e., the transformation matrices that register all sensors in the same coordinate system. Our system generates an accurate association between platform sensors and the estimated parameters, including rotation, translation, focal length, and the world and sensor reference frames. The extrinsic camera parameters are computed by Zhang's method using a pattern composed of white rhombi and rhombus-shaped holes, while the LIDAR parameters are computed using the results of previous work. Points acquired by the LIDAR are projected into images acquired by the Ladybug cameras. A new calibration pattern, visible to both sensors, is used. A correspondence is obtained between each laser point and its position in the image, so the texture and color of each LIDAR point can be known.
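The projection of LIDAR points into an image, once extrinsics and intrinsics are known, can be sketched with a simple pinhole model. The Ladybug is an omnidirectional camera, so this is only an illustrative approximation, and all numeric values below are hypothetical:

```python
import numpy as np

def project_lidar_points(points_xyz, R, t, K):
    """Project 3D LIDAR points into an image using extrinsics (R, t) and intrinsics K."""
    cam = points_xyz @ R.T + t          # world frame -> camera frame
    cam = cam[cam[:, 2] > 0]            # keep only points in front of the camera
    uv = cam @ K.T                      # apply the intrinsic matrix
    return uv[:, :2] / uv[:, 2:3]       # perspective divide -> pixel coordinates

# Identity extrinsics and a toy intrinsic matrix (hypothetical values)
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
pix = project_lidar_points(np.array([[0.0, 0.0, 2.0]]), np.eye(3), np.zeros(3), K)
print(pix)  # a point on the optical axis lands at the principal point (320, 240)
```

Once each LIDAR point has a pixel coordinate, its color and texture can be read directly from the image, which is the correspondence the paper exploits.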


Robotics and Autonomous Systems | 2014

Error propagation and uncertainty analysis between 3D laser scanner and camera

Angel-Iván García-Moreno; Denis-Eduardo Hernández-García; José-Joel González-Barbosa; Alfonso Ramírez-Pedraza; Juan-Bautista Hurtado-Ramos; Francisco-Javier Ornelas-Rodriguez

In this work we present an in-situ method to calibrate two sensors, a LIDAR (Light Detection and Ranging) and a spherical camera. Both sensors are used in urban environment reconstruction tasks. In this scenario the speed at which the various sensors acquire and merge information is very important; however, reconstruction accuracy, which depends on sensor calibration, is also of high relevance. Here, a new calibration pattern, visible to both sensors, is proposed. By this means, the correspondence between each laser point and its position in the camera image is obtained, so that the texture and color of each LIDAR point can be known. Experimental results for the calibration and uncertainty analysis are presented for data collected by the platform, integrating a LIDAR and a spherical camera. Highlights: We calibrate two different kinds of sensors, LIDAR and camera. We compute the uncertainty analysis between both sensors. The performance of the error propagation is evaluated. We develop a new algorithm to obtain the extrinsic parameters of the LIDAR sensor. A new calibration pattern is used.
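Error propagation of this kind is often evaluated by Monte Carlo simulation: perturb a calibration parameter with its assumed noise and observe the spread of the projected pixel. The following sketch uses a toy one-dimensional projection and hypothetical noise levels, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

def project_u(x, z, f, tz):
    # Toy 1-D pinhole projection u = f * x / (z + tz); tz is a depth offset
    return f * x / (z + tz)

# Nominal projection, then 10,000 samples with Gaussian noise on tz (illustrative)
nominal = project_u(1.0, 5.0, f=500.0, tz=0.0)
samples = np.array([project_u(1.0, 5.0, f=500.0, tz=rng.normal(0.0, 0.05))
                    for _ in range(10_000)])
print(f"nominal u = {nominal:.1f}, 1-sigma pixel spread = {samples.std():.3f}")
```

The empirical spread approximates |du/dtz| times the parameter sigma, which is the quantity an uncertainty analysis of the calibration reports.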


International Journal of Advanced Robotic Systems | 2014

A Panoramic 3D Reconstruction System Based on the Projection of Patterns

Diana-Margarita Córdova-Esparza; José-Joel González-Barbosa; Juan-Bautista Hurtado-Ramos; Francisco-Javier Ornelas-Rodriguez

This work presents the implementation of a 3D reconstruction system capable of reconstructing a 360-degree scene in a single acquisition using a projection of patterns. The system is formed by two modules: the first module is a CCD camera with a parabolic mirror that allows the acquisition of catadioptric images. The second module consists of a light projector and a parabolic mirror that is used to generate the pattern projections over the object that will be reconstructed. The projection system has a 360-degree field of view, and both modules were calibrated to obtain the extrinsic parameters. To validate the functionality of the system, we performed 3D reconstructions of three objects and present an analysis of the reconstruction error.
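After calibration, each catadioptric pixel and each projected pattern point define a ray in space, and a 3D point can be recovered by triangulating the two rays. The midpoint method below is a generic sketch of that step, not the authors' implementation:

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Closest-point (midpoint) triangulation of two 3D rays o + s*d."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    b = o2 - o1
    # Solve the normal equations for the ray parameters minimizing the gap
    a11, a12, a22 = d1 @ d1, d1 @ d2, d2 @ d2
    denom = a11 * a22 - a12 ** 2
    s = (a22 * (d1 @ b) - a12 * (d2 @ b)) / denom
    u = (a12 * (d1 @ b) - a11 * (d2 @ b)) / denom
    return 0.5 * ((o1 + s * d1) + (o2 + u * d2))

# Two rays that intersect exactly at (1, 1, 1)
p = triangulate_midpoint(np.array([0.0, 0.0, 0.0]), np.array([1.0, 1.0, 1.0]),
                         np.array([2.0, 0.0, 0.0]), np.array([-1.0, 1.0, 1.0]))
print(p)
```

When calibration noise makes the rays skew, the returned midpoint is the least-squares compromise, and the gap between the rays is a direct measure of reconstruction error.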


Mexican International Conference on Artificial Intelligence | 2012

Automatic 3D city reconstruction platform using a LIDAR and DGPS

Angel-Iván García-Moreno; José-Joel González-Barbosa; Francisco-Javier Ornelas-Rodriguez; Juan-Bautista Hurtado-Ramos; Alfonso Ramírez-Pedraza; Erick-Alejandro González-Barbosa

In this work, an approach for geo-referenced 3D reconstruction of outdoor scenes using LIDAR (Light Detection And Ranging) and DGPS (Differential Global Positioning System) technologies is presented. We develop a computationally efficient method for 3D reconstruction of city-sized environments using both sensors, providing an excellent base for high-detail street views. In the proposed method, the translation between consecutive local maps is obtained using DGPS data, and the rotation is obtained by extracting corresponding planes from two point clouds and matching them; after extracting these parameters, we merge many local scenes to obtain a global map. We validate the accuracy of the proposed method by comparing the reconstruction against real measurements and plans of the scanned scene. The results show that the proposed system is a useful solution for 3D reconstruction of large-scale city models.
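Recovering a rotation from matched plane normals is commonly done with a Kabsch/SVD alignment. The sketch below illustrates that idea with three synthetic normals (e.g., the ground plane plus two facades); it is a generic technique, not the paper's exact algorithm:

```python
import numpy as np

def rotation_from_normals(normals_a, normals_b):
    """Least-squares rotation R with R @ a_i ~= b_i (Kabsch/SVD method).
    Translation between local maps comes from DGPS, so only R is solved here."""
    H = normals_a.T @ normals_b
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

# Three matched unit normals, rotated by a known 90-degree turn about z
A = np.eye(3)
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
B = A @ R_true.T          # each normal of scan A rotated into scan B
R_est = rotation_from_normals(A, B)
print(np.allclose(R_est, R_true))  # True
```

At least three non-coplanar plane normals are needed for the rotation to be fully determined, which urban scenes with a ground plane and building facades readily provide.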


International Conference on Electronics, Communications, and Computers | 2011

3D city models: Mapping approach using LIDAR technology

Denis-Eduardo Hernández-García; José-Joel González-Barbosa; Juan-Bautista Hurtado-Ramos; Francisco-Javier Ornelas-Rodriguez; Eduardo Castillo Castañeda; Alfonso Ramírez; Angel Ivan García; Ricardo Gonzalez-Barbosa; Juan Gabriel Aviña-Cervantez

We propose a map-building system for the generation of 3D city models using a vehicle equipped with LIDAR (Light Detection And Ranging) technology, which is capable of acquiring more than one million points per second. The approach is based on two methods: plane segmentation and local map construction. The first introduces a fast way to extract the ground plane. The remaining points are used to extract planes with a general plane-extraction process: a random point and its neighbors are chosen to compute a plane, each candidate plane is evaluated by its number of inliers, and the best plane is the one containing the largest number of points. Finally, using the planes, we construct an environment map.
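The random plane-hypothesis step described above is essentially RANSAC plane fitting. A minimal sketch, with hypothetical trial counts and thresholds, on a synthetic scan of a flat ground grid plus scattered clutter:

```python
import numpy as np

def extract_dominant_plane(points, n_trials=200, threshold=0.05, seed=0):
    """Pick 3 random points, fit a plane, keep the hypothesis with most inliers."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_trials):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                          # degenerate (collinear) sample
            continue
        normal /= norm
        dist = np.abs((points - p0) @ normal)    # point-to-plane distances
        inliers = dist < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers

# Synthetic scan: a 10x10 ground grid at z = 0 plus 20 clutter points above it
rng = np.random.default_rng(0)
xs, ys = np.meshgrid(np.arange(10.0), np.arange(10.0))
ground = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(100)])
clutter = rng.uniform([0.0, 0.0, 1.0], [10.0, 10.0, 2.0], size=(20, 3))
cloud = np.vstack([ground, clutter])
inliers = extract_dominant_plane(cloud)
print(inliers.sum())  # the ground points are recovered as the dominant plane
```

Removing the inliers and repeating the search yields the remaining planes (facades, walls), mirroring the iterative extraction the paper describes.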


Journal of Applied Remote Sensing | 2016

Accurate evaluation of sensitivity for calibration between a LiDAR and a panoramic camera used for remote sensing

Angel-Iván García-Moreno; José-Joel González-Barbosa; Alfonso Ramírez-Pedraza; Juan B. Hurtado-Ramos; Francisco-Javier Ornelas-Rodriguez

Abstract. Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and the usage of different sensors, which implies dependency on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor in the calibration of a panoramic camera-LiDAR system. Both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests, we analyze and compare the calibration parameters using the Monte Carlo and Latin hypercube sampling techniques. Sensitivity analysis for each variable involved in the calibration was computed by the Sobol method, which is based on the analysis of the variance breakdown, and the Fourier amplitude sensitivity test method, which is based on Fourier analysis. Sensitivity analysis is an essential tool in simulation modeling and for performing error propagation assessments.
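The difference between plain Monte Carlo and Latin hypercube sampling (LHS) is easy to show in a few lines: LHS stratifies each dimension so every stratum is sampled exactly once, reducing the variance of the resulting estimates. A sketch on the unit square, unrelated to the paper's code:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """Stratified LHS on the unit hypercube: one sample per stratum per dimension."""
    # Place one point inside each of n_samples equal-width strata, per dimension
    cut = (np.arange(n_samples) + rng.random((n_dims, n_samples))) / n_samples
    for row in cut:
        rng.shuffle(row)        # decorrelate the strata across dimensions
    return cut.T                # shape (n_samples, n_dims)

rng = np.random.default_rng(1)
lhs = latin_hypercube(100, 2, rng)
mc = rng.random((100, 2))       # plain Monte Carlo for comparison
# Every LHS dimension hits each of the 100 strata exactly once
strata = np.sort((lhs[:, 0] * 100).astype(int))
print(np.array_equal(strata, np.arange(100)))  # True
```

Plain Monte Carlo gives no such guarantee: some strata are hit several times and others not at all, which is why LHS typically needs fewer samples for the same accuracy.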


International Conference on Telecommunications | 2015

Three-dimensional terrestrial reconstruction system: Calibration and error propagation approach

Angel-Iván García-Moreno; José-Joel González-Barbosa; Juan-Bautista Hurtado-Ramos; Francisco-Javier Ornelas-Rodriguez

In this paper, we compute the precision with which 3D points, camera orientation, position, and calibration are estimated for a laser rangefinder and a multi-camera system. Both sensors are used to digitize urban environments. Errors in the localization of image features introduce errors in the reconstruction. Some algorithms are numerically unstable, either intrinsically or in conjunction with particular configurations of points and/or cameras. A practical methodology is presented to predict the error propagation within the calibration process between both sensors. Performance charts of the error propagation in the intrinsic camera parameters, and of the relationship between the laser and the noise of both sensors, were calculated using simulations and an analytical analysis. Results for the calibration, the error projection between camera and laser, and the uncertainty analysis are presented for data collected by the mobile terrestrial platform.
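The analytical counterpart to simulation is first-order (delta-method) error propagation: differentiate the projection with respect to each noisy parameter and combine the sensitivities. The toy projection u = f*x/z and the noise levels below are illustrative, not the paper's:

```python
import math

def pixel_sigma(f, x, z, sigma_f, sigma_z):
    """First-order propagation of focal-length and depth noise into u = f*x/z."""
    du_df = x / z                   # sensitivity of u to the focal length f
    du_dz = -f * x / z ** 2         # sensitivity of u to the depth z
    return math.sqrt((du_df * sigma_f) ** 2 + (du_dz * sigma_z) ** 2)

# Hypothetical nominal values and 1-sigma parameter uncertainties
sigma_u = pixel_sigma(f=500.0, x=1.0, z=5.0, sigma_f=1.0, sigma_z=0.05)
print(f"predicted pixel uncertainty: {sigma_u:.3f}")
```

Comparing such closed-form predictions against Monte Carlo spreads is the usual way to validate an error-propagation model, matching the paper's combination of simulations and analytical analysis.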


Computer Science and Software Engineering | 2014

Mobile remote sensing platform: An uncertainty calibration analysis

Angel-Iván García-Moreno; José-Joel González-Barbosa; Juan B. Hurtado-Ramos; Francisco-Javier Ornelas-Rodriguez

This paper presents a method to estimate the uncertainty in the calibration of two sensors, a laser rangefinder and a multi-camera system. Both sensors were used in urban environment reconstruction tasks. A new calibration pattern, visible to both sensors, is proposed. By this means, the correspondence between each laser point and its position in the camera image is obtained, so that the texture and color of each LIDAR point can be known. Additionally, this allows systematic errors in individual sensors to be minimized and, when multiple sensors are used, minimizes the systematic contradiction between them, enabling reliable multisensor data fusion. A practical methodology is presented to predict the uncertainty within the calibration process between both sensors. Statistics of the behavior of the camera calibration parameters, and of the relationship between the camera, the LIDAR, and the noise of both sensors, were calculated using simulations, an analytical analysis, and experiments. Results for the calibration and uncertainty analysis are presented for data collected by the platform, integrating a LIDAR and a panoramic camera.


Archive | 2014

Uncertainty Analysis of LIDAR and Panoramic Camera Calibration

Angel-Iván García-Moreno; José-Joel González-Barbosa; Francisco-Javier Ornelas-Rodriguez; Juan B. Hurtado-Ramos

Terrestrial platforms for 3D reconstruction typically combine several data acquisition systems such as lasers, cameras, and inertial systems. However, the geometric combination of different sensors requires their calibration and data fusion. These are important tasks for vision-based systems, since calibration estimates the values of the sensor model parameters, such as those of the cameras. The uncertainty of these parameters must be known in order to evaluate the error of the final calibration and of its applications. The aim of this paper is to present a method to compute the calibration of both sensors. A new calibration pattern, visible to both sensors, is used. A correspondence is obtained between each laser point and its position in the image, so the texture and color of each LIDAR point can be known. Experimental results are presented for data collected with the platform, integrating a 3D laser scanner and a panoramic camera system.


Eighth Symposium Optics in Industry | 2011

Traffic infrastructure inventory system by image analysis

Aldo I. Rico Martinez; J. C. Camarillo-Paz; Francisco-Javier Ornelas-Rodriguez; José-Joel González-Barbosa; Yazmín C. Peña Cheng; Juan B. Hurtado-Ramos

This paper describes a computer vision system designed to perform an inventory of traffic signals. The system consists of five Ethernet-synchronized cameras; the acquisition strategy allows us to take one image per camera every other second. We then use those five images to generate a panoramic image. Signal detection and recognition are carried out offline. Detection of traffic signals is done in the panoramic image using the Hough transform and enhancement in the HSV color space. Traffic signal recognition is performed by a combination of Haar wavelets and the Viola-Jones classifier. Finally, we present experimental results using a database of one hundred images.
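The color-enhancement step can be illustrated with a simple saturation/value threshold that isolates strongly red pixels, a rough stand-in for the HSV-space processing; the thresholds and test image are hypothetical:

```python
import numpy as np

def red_sign_mask(rgb):
    """Flag strongly saturated red pixels in an RGB image scaled to [0, 1].
    Illustrative stand-in for HSV-space enhancement; thresholds are assumptions."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    value = rgb.max(axis=-1)                     # HSV value = max channel
    chroma = value - rgb.min(axis=-1)
    sat = np.where(value > 0, chroma / np.maximum(value, 1e-9), 0.0)
    return (r >= g) & (r >= b) & (sat > 0.5) & (value > 0.3)

img = np.zeros((2, 2, 3))
img[0, 0] = [0.9, 0.1, 0.1]   # a red sign pixel
img[1, 1] = [0.5, 0.5, 0.5]   # grey asphalt: zero saturation, rejected
mask = red_sign_mask(img)
print(mask)
```

In the full pipeline, such a mask would feed the Hough transform to locate circular sign candidates, which are then passed to the Viola-Jones classifier for recognition.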

Collaboration


Dive into Francisco-Javier Ornelas-Rodriguez's collaborations.

Top Co-Authors

Alfonso Ramírez-Pedraza, Instituto Politécnico Nacional
Juan B. Hurtado-Ramos, Instituto Politécnico Nacional
Erick-Alejandro González-Barbosa, Instituto Tecnológico Superior de Irapuato
Alfonso Ramírez, Instituto Politécnico Nacional
Angel Ivan García, Instituto Politécnico Nacional