Angel-Iván García-Moreno
Instituto Politécnico Nacional
Publication
Featured research published by Angel-Iván García-Moreno.
Mexican Conference on Pattern Recognition | 2013
Angel-Iván García-Moreno; José-Joel González-Barbosa; Francisco-Javier Ornelas-Rodriguez; Juan-Bautista Hurtado-Ramos; Marco-Neri Primo-Fuentes
Mobile platforms typically combine several data acquisition systems such as lasers, cameras, and inertial systems. However, the geometrical combination of the different sensors requires their calibration, at least through the definition of the extrinsic parameters, i.e., the transformation matrices that register all sensors in the same coordinate system. Our system generates an accurate association between the platform sensors and the estimated parameters, including rotation, translation, focal length, and the world and sensor reference frames. The extrinsic camera parameters are computed by Zhang's method using a pattern composed of white rhombi and rhombus-shaped holes, and the LIDAR parameters with the results of previous work. Points acquired by the LIDAR are projected onto images acquired by the Ladybug cameras. A new calibration pattern, visible to both sensors, is used. A correspondence is obtained between each laser point and its position in the image, so the texture and color of each LIDAR point can be known.
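The core operation described above, projecting a LIDAR point into a camera image through the extrinsic parameters (R, t) and the intrinsic matrix K, can be sketched as follows. All numeric values here are illustrative placeholders, not the paper's calibration results:

```python
import numpy as np

def project_lidar_point(p_lidar, R, t, K):
    """Project a 3D LIDAR point into image pixel coordinates.

    R, t: extrinsic rotation/translation taking the LIDAR frame to the camera frame.
    K: 3x3 intrinsic matrix (focal lengths and principal point).
    """
    p_cam = R @ p_lidar + t          # LIDAR frame -> camera frame
    uvw = K @ p_cam                  # camera frame -> homogeneous pixel coordinates
    return uvw[:2] / uvw[2]          # perspective division

# Illustrative values only (not the calibrated parameters from the paper)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                        # assume aligned frames for this sketch
t = np.array([0.0, 0.0, 0.1])        # 10 cm offset along the optical axis

uv = project_lidar_point(np.array([0.5, 0.0, 2.0]), R, t, K)
```

Once each laser point has a pixel position `uv`, its color and texture can be read from the image, which is the correspondence step the abstract describes.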
Robotics and Autonomous Systems | 2014
Angel-Iván García-Moreno; Denis-Eduardo Hernández-García; José-Joel González-Barbosa; Alfonso Ramírez-Pedraza; Juan-Bautista Hurtado-Ramos; Francisco-Javier Ornelas-Rodriguez
In this work we present an in-situ method to compute the calibration of two sensors, a LIDAR (Light Detection and Ranging) and a spherical camera. Both sensors are used in urban environment reconstruction tasks. In this scenario the speed at which the various sensors acquire and merge information is very important; however, reconstruction accuracy, which depends on sensor calibration, is also of high relevance. Here, a new calibration pattern, visible to both sensors, is proposed. By this means, the correspondence between each laser point and its position in the camera image is obtained so that the texture and color of each LIDAR point can be known. Experimental results for the calibration and uncertainty analysis are presented for data collected by the platform, which integrates a LIDAR and a spherical camera. Highlights: we calibrated two different kinds of sensors, a LIDAR and a camera; we calculated the uncertainty between both sensors; the performance of the error propagation is computed; we developed a new algorithm to obtain the extrinsic parameters of the LIDAR sensor; and a new calibration pattern is used.
Mexican International Conference on Artificial Intelligence | 2012
Angel-Iván García-Moreno; José-Joel González-Barbosa; Francisco-Javier Ornelas-Rodriguez; Juan-Bautista Hurtado-Ramos; Alfonso Ramírez-Pedraza; Erick-Alejandro González-Barbosa
In this work, an approach for geo-referenced 3D reconstruction of outdoor scenes using LIDAR (Light Detection And Ranging) and DGPS (Differential Global Positioning System) technologies is presented. We develop a computationally efficient method for 3D reconstruction of city-sized environments using both sensors, providing an excellent base for high-detail street views. In the proposed method, the translation between consecutive local maps is obtained from DGPS data, and the rotation is obtained by extracting corresponding planes from two point clouds and matching them; after extracting these parameters, we merge many local scenes to obtain a global map. We validate the accuracy of the proposed method by comparing the reconstruction against real measurements and plans of the scanned scene. The results show that the proposed system is a useful solution for 3D reconstruction of large-scale city models.
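The registration step above (translation from DGPS, rotation from matched planes) can be sketched with a standard SVD-based alignment of corresponding plane normals. This is a generic Kabsch-style stand-in for the paper's plane-matching step, and all numeric values are illustrative:

```python
import numpy as np

def rotation_from_plane_normals(normals_a, normals_b):
    """Estimate the rotation aligning corresponding plane normals
    (SVD/Kabsch solution; a generic stand-in for the paper's plane matching)."""
    H = normals_a.T @ normals_b
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    return Vt.T @ D @ U.T

# Two sets of matched plane normals; the second set is the first rotated 90° about z.
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
normals_a = np.eye(3)                # three orthogonal planes in the first scan
normals_b = normals_a @ Rz.T         # the same planes seen from the rotated scan
R_est = rotation_from_plane_normals(normals_a, normals_b)

# Merge: translation between local maps from DGPS, rotation from the planes.
t_dgps = np.array([12.0, 3.0, 0.0])  # illustrative DGPS displacement (meters)
cloud_local = np.array([[1.0, 0.0, 0.0]])
cloud_global = cloud_local @ R_est.T + t_dgps
```

Applying the recovered rotation plus the DGPS translation to each local point cloud accumulates the scans into one global map, as the abstract describes.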
Journal of Applied Remote Sensing | 2016
Angel-Iván García-Moreno; José-Joel González-Barbosa; Alfonso Ramírez-Pedraza; Juan B. Hurtado-Ramos; Francisco-Javier Ornelas-Rodriguez
Abstract. Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and the usage of different sensors, which implies dependency on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor to the calibration of a panoramic camera–LiDAR system. Both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests we analyze and compare the calibration parameters using the Monte Carlo and Latin hypercube sampling techniques. Sensitivity analysis for each variable involved in the calibration was computed by the Sobol method, which is based on the analysis of the variance breakdown, and the Fourier amplitude sensitivity test method, which is based on Fourier analysis. Sensitivity analysis is an essential tool in simulation modeling and for performing error propagation assessments.
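The difference between the two sampling techniques named above can be sketched in a few lines: plain Monte Carlo draws independent points, while Latin hypercube sampling stratifies each dimension so every stratum receives exactly one sample. The sample sizes and seed below are illustrative, not the paper's experimental setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_dims, rng):
    """Latin hypercube sample on [0, 1)^d: one point per stratum in each dimension."""
    # Jittered positions inside n_samples equal-width strata...
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    # ...independently shuffled per dimension to decorrelate the coordinates.
    for d in range(n_dims):
        rng.shuffle(u[:, d])
    return u

n = 100
mc = rng.random((n, 2))          # plain Monte Carlo: no stratification guarantee
lhs = latin_hypercube(n, 2, rng)

# Each dimension of the LHS design covers every 1/n stratum exactly once.
strata = np.sort((lhs * n).astype(int), axis=0)
```

This stratification is why Latin hypercube designs typically estimate output statistics with fewer model evaluations than plain Monte Carlo, which matters when each evaluation is a full calibration run.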
International Conference on Telecommunications | 2015
Angel-Iván García-Moreno; José-Joel González-Barbosa; Juan-Bautista Hurtado-Ramos; Francisco-Javier Ornelas-Rodriguez
In this paper, we compute the precision with which 3D points, camera orientation, position, and calibration are estimated for a laser rangefinder and a multi-camera system. Both sensors are used to digitize urban environments. Errors in the localization of image features introduce errors in the reconstruction. Some algorithms are numerically unstable, either intrinsically or in conjunction with particular configurations of points and/or cameras. A practical methodology is presented to predict the error propagation inside the calibration process between both sensors. Performance charts of the error propagation in the intrinsic camera parameters, and of the relationship between the laser and the noise of both sensors, were calculated using simulations and an analytical analysis. Results for the calibration, the camera–laser projection error, and the uncertainty analysis are presented for data collected by the mobile terrestrial platform.
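First-order error propagation of the kind discussed above pushes a covariance on the calibration parameters through the Jacobian of the projection model. The sketch below uses a simplified pinhole model with parameters [fx, fy, cx, cy] as a stand-in for the full camera–laser model, with illustrative variances:

```python
import numpy as np

def project(params, p):
    """Pinhole projection with parameters [fx, fy, cx, cy] (a simplified
    stand-in for the full camera-laser calibration model)."""
    fx, fy, cx, cy = params
    return np.array([fx * p[0] / p[2] + cx, fy * p[1] / p[2] + cy])

def propagate_covariance(params, p, cov_params, eps=1e-6):
    """First-order error propagation: Sigma_uv = J Sigma_params J^T,
    with the Jacobian J estimated by central differences."""
    n = len(params)
    J = np.zeros((2, n))
    for k in range(n):
        dp = np.zeros(n)
        dp[k] = eps
        J[:, k] = (project(params + dp, p) - project(params - dp, p)) / (2 * eps)
    return J @ cov_params @ J.T

params = np.array([800.0, 800.0, 320.0, 240.0])   # illustrative intrinsics
cov_params = np.diag([4.0, 4.0, 1.0, 1.0])        # assumed parameter variances
p = np.array([0.5, -0.2, 2.0])                    # a 3D point in the camera frame

cov_uv = propagate_covariance(params, p, cov_params)
```

The diagonal of `cov_uv` gives the predicted pixel variance of the reprojection, which is the kind of quantity the performance charts summarize across the parameter space.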
Computer Science and Software Engineering | 2014
Angel-Iván García-Moreno; José-Joel González-Barbosa; Juan B. Hurtado-Ramos; Francisco-Javier Ornelas-Rodriguez
This paper presents a method to estimate the uncertainty in the calibration of two sensors, a laser rangefinder and a multi-camera system. Both sensors were used in urban environment reconstruction tasks. A new calibration pattern, visible to both sensors, is proposed. By this means, a correspondence between each laser point and its position in the camera image is obtained so that the texture and color of each LIDAR point can be known. Moreover, this allows systematic errors in individual sensors to be minimized and, when multiple sensors are used, it minimizes the systematic contradiction between them to enable reliable multisensor data fusion. A practical methodology is presented to predict the uncertainty inside the calibration process between both sensors. Statistics of the behavior of the camera calibration parameters, and of the relationship between the camera, the LIDAR, and the noise of both sensors, were calculated using simulations, an analytical analysis, and experiments. Results for the calibration and uncertainty analysis are presented for data collected by the platform, which integrates a LIDAR and the panoramic camera.
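The simulation side of such an uncertainty study can be sketched with a one-parameter toy model: perturb the image observation with pixel noise, re-estimate a calibration parameter many times, and collect its statistics. The single-focal-length model and all values below are illustrative, not the paper's camera–LIDAR model or data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground-truth values for the simulation (illustrative only).
fx_true, cx = 800.0, 320.0
x, z = 0.5, 2.0
u_true = fx_true * x / z + cx

# Monte Carlo study: perturb the observation with pixel noise and
# collect statistics of the recovered focal length.
sigma_px = 0.5
u_noisy = u_true + rng.normal(0.0, sigma_px, size=20000)
fx_est = (u_noisy - cx) * z / x       # invert the projection for fx

fx_mean = fx_est.mean()
fx_std = fx_est.std()
# First-order prediction for comparison: sigma_fx = sigma_px * z / x = 2.0
```

Comparing the simulated spread `fx_std` against the analytical first-order prediction is the kind of cross-check between simulations and analytical analysis that the abstract describes.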
Archive | 2014
Angel-Iván García-Moreno; José-Joel González-Barbosa; Francisco-Javier Ornelas-Rodriguez; Juan B. Hurtado-Ramos
Terrestrial platforms for 3D reconstruction typically combine several data acquisition systems such as lasers, cameras, and inertial systems. However, the geometrical combination of different sensors requires their calibration and data fusion. Calibration is an important task for vision-based systems, since it estimates the values of sensor model parameters, such as those of the cameras. The uncertainty of these parameters must be known in order to evaluate the error of the final calibration and its applications. The aim of this paper is to present a method to compute the calibration of both sensors. A new calibration pattern, visible to both sensors, is used. A correspondence is obtained between each laser point and its position in the image, so the texture and color of each LIDAR point can be known. Experimental results are presented for data collected with the platform, which integrates a 3D laser scanner and a panoramic camera system.
IEEE Electronics, Robotics and Automotive Mechanics Conference | 2012
Angel-Iván García-Moreno; José-Joel González-Barbosa
3D reconstruction of large-scale areas is created by merging real-world information from different sensors; this is an important issue in multimedia research. In this paper, we propose an efficient construction of urban scenes from LIDAR data. The system is composed of a Velodyne LIDAR 64E and a differential GPS. The sensors are synchronized through their internal clocks, and the position of each LIDAR acquisition is given by the GPS data. In this work, we computed the position in plane geodetic UTM (Universal Transverse Mercator) coordinates using the Coticchia-Surace algorithm, which allows us to obtain centimeter accuracy. The translation vector between acquisitions is computed using the GPS position, and the rotation matrix is computed using planes extracted from the environment. The results show a 3D reconstruction of a large-scale city.
Revista Internacional de Métodos Numéricos para Cálculo y Diseño en Ingeniería | 2016
Angel-Iván García-Moreno; José-Joel González-Barbosa; Juan-Bautista Hurtado-Ramos; Francisco-Javier Ornelas-Rodriguez; Alfonso Ramírez-Pedraza
Revista Iberoamericana de Automática e Informática Industrial | 2015
Alfonso Ramírez-Pedraza; José-Joel González-Barbosa; Francisco-Javier Ornelas-Rodriguez; Angel-Iván García-Moreno; Adan Salazar-Garibay; Erick-Alejandro González-Barbosa
Collaboration
Erick-Alejandro González-Barbosa
Instituto Tecnológico Superior de Irapuato