R. Urraca
University of La Rioja
Publications
Featured research published by R. Urraca.
Journal of Renewable and Sustainable Energy | 2015
R. Urraca; J. Antonanzas; F.J. Martinez-de-Pison; F. Antonanzas-Torres
Solar global irradiation is barely recorded in remote areas around the world. The lack of access to an electricity grid in these areas presents an enormous opportunity for electrification through renewable energy sources and, specifically, with photovoltaic energy where great solar resources are available. Traditionally, solar resource estimation was performed using parametric-empirical models based on the relationship between solar irradiation and other atmospheric and commonly measured variables, such as temperatures, rainfall, sunshine duration, etc., achieving a relatively high level of certainty. The significant improvement in soft-computing techniques, applied extensively in many research fields, has led to improvements in solar global irradiation modeling. This study conducts a comparative assessment of four different soft-computing techniques (artificial neural networks, support vector regression, M5P regression trees, and extreme learning machines). The results were also compared with two well-kn...
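For context, a minimal sketch of this kind of comparison in Python with scikit-learn, on synthetic data; a plain decision tree stands in for M5P (scikit-learn has no M5P implementation), the feature names are hypothetical, and none of this is the authors' code:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
# Hypothetical predictors standing in for temperatures, rainfall, etc.
X = rng.normal(size=(500, 4))
y = 2.0 * X[:, 0] - X[:, 1] + 0.5 * X[:, 3] + rng.normal(scale=0.3, size=500)

models = {
    "ANN": MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
    "SVR": SVR(C=10.0, epsilon=0.1),
    "M5P stand-in (tree)": DecisionTreeRegressor(min_samples_leaf=10, random_state=0),
}
for name, model in models.items():
    pipe = make_pipeline(StandardScaler(), model)  # scale inputs for ANN/SVR
    rmse = -cross_val_score(pipe, X, y, cv=10,
                            scoring="neg_root_mean_squared_error").mean()
    print(f"{name}: 10-fold CV RMSE = {rmse:.3f}")
```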
Remote Sensing of Environment | 2017
R. Urraca; Ana M. Gracia-Amillo; Elena Koubli; Thomas Huld; Jörg Trentmann; Aku Riihelä; Anders Lindfors; Diane Palmer; Ralph Gottschalg; F. Antonanzas-Torres
This work presents a validation of three satellite-based radiation products over an extensive network of 313 pyranometers across Europe, from 2005 to 2015. The products used have been developed by the Satellite Application Facility on Climate Monitoring (CM SAF) and comprise one geostationary climate dataset (SARAH-JRC), one polar-orbiting climate dataset (CLARA-A2) and one geostationary operational product. The ERA-Interim reanalysis is also included in the comparison. The main objective is to determine the quality level of the daily means of the CM SAF datasets, identifying their limitations and analyzing the different factors that can interfere with an adequate validation of the products. The quality of the pyranometers was the most critical source of uncertainty identified. In this respect, the use of records from Second Class pyranometers and silicon-based photodiodes increased the absolute error and the bias, as well as the dispersion of both metrics, preventing an adequate validation of the daily means. The best spatial estimates for the three datasets were obtained in Central Europe, with a Mean Absolute Deviation (MAD) within 8–13 W/m2, whereas the MAD always increased at high latitudes, snow-covered surfaces, high mountain ranges and coastal areas. Overall, the accuracy of SARAH-JRC was demonstrated over a dense network of stations, making it the most consistent dataset for climate monitoring applications. The operational dataset was comparable to SARAH-JRC in Central Europe but lacked the temporal stability of climate datasets, while CLARA-A2 did not achieve the same level of accuracy, although its predictions were highly uniform with a small negative bias. The ERA-Interim reanalysis showed by far the largest deviations from the surface reference measurements.
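A minimal sketch of the two validation metrics used above, assuming paired daily means `sat` (satellite estimate) and `obs` (pyranometer record) in W/m2; the variable names are illustrative:

```python
import numpy as np

def mad(sat, obs):
    """Mean Absolute Deviation of the daily means."""
    return np.mean(np.abs(np.asarray(sat) - np.asarray(obs)))

def mbd(sat, obs):
    """Mean bias: positive values indicate satellite overestimation."""
    return np.mean(np.asarray(sat) - np.asarray(obs))

sat = np.array([190.0, 205.0, 170.0])
obs = np.array([185.0, 200.0, 180.0])
print(f"MAD = {mad(sat, obs):.1f} W/m2, bias = {mbd(sat, obs):+.1f} W/m2")
```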
hybrid artificial intelligence systems | 2015
R. Urraca; Andres Sanz-Garcia; Julio Fernández-Ceniceros; Enrique Sodupe-Ortega; F.J. Martinez-de-Pison
This paper presents a hybrid methodology in which a KDD scheme is optimized to build accurate parsimonious models. The methodology searches for the best model by using genetic algorithms to optimize a KDD scheme composed of the following stages: feature selection, transformation of the skewed input and output data, parameter tuning, and parsimonious model selection. Experiments demonstrated that optimizing these steps significantly improved model generalization on several UCI databases. Finally, the methodology was applied to create parsimonious room-demand models using booking databases from a hotel located in a region of northern Spain. The results showed that the proposed method created models with higher generalization capacity and lower complexity than those obtained with classical KDD processes.
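A minimal sketch (an assumption about the scheme, not the paper's implementation) of the fitness evaluation inside such a GA-optimized KDD pipeline: a chromosome encodes a feature mask, a skewed-output transform flag, and one hyperparameter, and is scored by cross-validation:

```python
import numpy as np
from sklearn.compose import TransformedTargetRegressor
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

def fitness(chromosome, X, y):
    """chromosome: n feature bits, 1 output-transform bit, 1 gene for log10(C)."""
    n = X.shape[1]
    mask = np.array(chromosome[:n], dtype=bool)   # feature selection
    log_y = bool(chromosome[n])                   # skewed-output transform
    C = 10.0 ** chromosome[n + 1]                 # parameter tuning (SVR C)
    if not mask.any():
        return np.inf                             # no features: invalid individual
    est = SVR(C=C)
    if log_y:
        est = TransformedTargetRegressor(regressor=est,
                                         func=np.log1p, inverse_func=np.expm1)
    rmse = -cross_val_score(est, X[:, mask], y, cv=5,
                            scoring="neg_root_mean_squared_error").mean()
    return rmse   # the GA minimizes this before the parsimony-based selection

# Example: use features 0 and 2, log-transform the skewed target, C = 10.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 4))
y = np.exp(X[:, 0] + 0.5 * X[:, 2]) + rng.normal(scale=0.05, size=200)
print(fitness([1, 0, 1, 0, 1, 1.0], X, y))
```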
Journal of the Science of Food and Agriculture | 2016
R. Urraca; Andres Sanz-Garcia; Javier Tardáguila; Maria P. Diago
BACKGROUND Recent studies have reported the potential of near-infrared (NIR) spectral analysers for monitoring the ripeness of grape berries as an alternative to wet chemistry methods. This study covers various aspects of the calibration and implementation of predictive models of total soluble solids (TSS) in grape berries using laboratory and in-field collected NIR spectra. RESULTS The performance of the calibration models obtained under laboratory conditions indicated that at least 700 berry samples are required to ensure sufficient prediction accuracy. A statistically significant error reduction (ΔRMSECV = 0.1°Brix, P < 0.001) was observed when measuring berries without epicuticular wax, although it was negligible from a practical point of view. Under field conditions, the prediction errors (RMSEP = 1.68°Brix, SEP = 1.67°Brix) were close to those obtained with the laboratory dataset (RMSEP = 1.42°Brix, SEP = 1.40°Brix). CONCLUSION This work clarifies several methodological factors for developing a protocol for the in-field assessment of TSS in grape berries using an affordable, non-invasive, portable NIR spectral analyser.
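A minimal sketch of the two error metrics quoted above: RMSEP includes the bias, while SEP is the bias-corrected standard deviation of the residuals, which is why the two values quoted are so close when bias is small:

```python
import numpy as np

def rmsep(pred, ref):
    """Root mean square error of prediction (bias included)."""
    e = np.asarray(pred) - np.asarray(ref)
    return np.sqrt(np.mean(e ** 2))

def sep(pred, ref):
    """Standard error of prediction: std. dev. of bias-corrected residuals."""
    e = np.asarray(pred) - np.asarray(ref)
    return np.std(e, ddof=1)

pred = np.array([21.3, 19.8, 23.1])   # predicted TSS, degrees Brix
ref = np.array([20.0, 19.5, 24.0])    # wet-chemistry reference values
print(f"RMSEP = {rmsep(pred, ref):.2f}, SEP = {sep(pred, ref):.2f}")
```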
Neurocomputing | 2018
R. Urraca; Enrique Sodupe-Ortega; J. Antonanzas; F. Antonanzas-Torres; F.J. Martinez-de-Pison
Most proposed metaheuristics for feature selection and model parameter optimization are based on a two-termed Loss + Penalty function. Their main drawback is the need to manually set the parameter that balances the loss and the penalty term. In this paper, a novel methodology referred to as GA-PARSIMONY, specifically designed to overcome this issue, is evaluated in detail on thirteen public databases with five regression techniques. It is a GA-based metaheuristic that splits the classic two-termed minimization function into two consecutive rankings of individuals. The first ranking is based solely on the generalization error, while the second (named ReRank) is based on the complexity of the models, giving special weight to the complexity entailed by a large number of inputs. For each database, the models with the lowest testing RMSE and no statistical difference among them were labelled winner models. Within this group, the proportion of features selected was below 50%, indicating a good balance between error minimization and parsimony. In particular, the most complex algorithms (MLP and SVR) were the ones most often selected among the winner models, while using around 40–45% of the available attributes. The more basic IBk, ridge regression (LIN) and M5P were only classified as winner models on the simpler databases, but used fewer features in those cases (20–25% of the initial inputs).
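A minimal sketch (an assumed reading of the double-ranking idea, not the published implementation) of the ReRank step: individuals are first sorted by error, then neighbours with statistically similar errors are swapped so the simpler model ranks higher:

```python
def rerank(population, tol=1e-3):
    """population: list of dicts with 'error' and 'complexity' keys.
    tol stands in for the statistical-similarity test on errors."""
    ranked = sorted(population, key=lambda m: m["error"])
    swapped = True
    while swapped:                      # bubble simpler models upwards
        swapped = False
        for i in range(len(ranked) - 1):
            a, b = ranked[i], ranked[i + 1]
            similar = abs(a["error"] - b["error"]) < tol
            if similar and b["complexity"] < a["complexity"]:
                ranked[i], ranked[i + 1] = b, a
                swapped = True
    return ranked

pop = [{"error": 0.5000, "complexity": 40},
       {"error": 0.5005, "complexity": 12}]   # similar error, far simpler
print([m["complexity"] for m in rerank(pop)])  # -> [12, 40]
```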
soco-cisis-iceute | 2016
R. Urraca; J. Antonanzas; F. Antonanzas-Torres; F.J. Martinez-de-Pison
Empirical models are widely used to estimate solar radiation at locations where other, more readily available meteorological variables are recorded. Within this group, soft computing techniques provide the most accurate results, as they are able to relate all recorded variables to solar radiation. In this work, a new implementation of Gradient Boosting Machines (GBMs) named XGBoost is used to predict daily global horizontal irradiation at locations where no pyranometer records are available. The study is conducted with data from 38 ground stations in Castilla-La Mancha from 2001 to 2013.
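A minimal sketch using the real xgboost Python API on synthetic data; the feature set and hyperparameters are illustrative, not the ones used for the Castilla-La Mancha stations:

```python
import numpy as np
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))   # stand-ins for temperatures, rainfall, etc.
y = 3.0 * X[:, 0] - X[:, 2] + rng.normal(scale=0.5, size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = XGBRegressor(n_estimators=300, learning_rate=0.05, max_depth=4)
model.fit(X_tr, y_tr)
rmse = np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2))
print(f"Test RMSE: {rmse:.3f}")
```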
Journal of Renewable and Sustainable Energy | 2016
M. Alia-Martinez; J. Antonanzas; R. Urraca; F.J. Martinez-de-Pison; F. Antonanzas-Torres
Nowadays, solar resource estimation via clear-sky models is widely accepted when correctly validated against ground records. In the past, different approaches have been proposed to identify clear-sky periods in ground-based solar radiation records: visual inspection of the records, thresholding the clear-sky index, and correlation with estimated clear-sky solar irradiation. However, because the process must be automated and generalize across sites, detecting clear-sky conditions remains challenging. This study proposes a new algorithm based on the persistence of the Linke turbidity in conjunction with a transitory filter. The determinant of the correlation matrix of estimated clear-sky solar irradiance and measured irradiance is calculated to distinguish days under clear-sky conditions from cloudy or overcast days. The method was compared against 10 other commonly used techniques at 21 sites of the Baseline Surface Radiation Network, covering diverse climates and terrains, and proved superior.
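A minimal sketch of the determinant criterion (an assumed reading, not the authors' code): for two series, the determinant of the 2x2 correlation matrix equals 1 - r², so values near zero indicate that the measured irradiance closely tracks the clear-sky estimate, i.e. a clear-sky day; the threshold below is arbitrary:

```python
import numpy as np

def is_clear_sky_day(measured, clear_sky_estimate, det_threshold=0.05):
    """Determinant of the correlation matrix: ~0 for clear-sky days."""
    det = np.linalg.det(np.corrcoef(measured, clear_sky_estimate))
    return det < det_threshold

t = np.linspace(0, np.pi, 96)                 # one day, 15-minute steps
ghi_cs = 900.0 * np.sin(t)                    # modeled clear-sky irradiance
clear = ghi_cs + np.random.default_rng(0).normal(scale=10.0, size=96)
cloudy = ghi_cs * np.random.default_rng(1).uniform(0.2, 1.0, size=96)
print(is_clear_sky_day(clear, ghi_cs), is_clear_sky_day(cloudy, ghi_cs))
```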
hybrid artificial intelligence systems | 2015
M. Alia-Martinez; J. Antonanzas; F. Antonanzas-Torres; Alpha Pernía-Espinoza; R. Urraca
General-purpose computing on graphics processing units (GPGPU) is a promising technique for coping with today's growing computational challenges, given the suitability of GPUs for parallel processing. Several libraries and functions are being released to boost the use of GPUs in real-world problems. However, many of these packages require deep knowledge of GPU architecture and low-level programming, so end users have trouble exploiting the advantages of GPGPU. In this paper, we focus on the GPU acceleration of a prediction technique specially designed to deal with big datasets: the extreme learning machine (ELM). The intent of this study is to develop a user-friendly library in the open-source R language and subsequently release the code at https://github.com/maaliam/EDMANS-elmNN-GPU.git, so that R users can freely use it, the only requirement being an NVIDIA graphics card. The most computationally demanding operations were identified by performing a sensitivity analysis. As a result, only matrix multiplications were executed on the GPU, as they take around 99% of the total execution time. A speedup of up to 15 times was obtained with this GPU-accelerated ELM in the most computationally expensive scenarios. Moreover, the applicability of the GPU-accelerated ELM was also tested on a typical case of model selection, in which genetic algorithms were used to fine-tune an ELM, requiring the training of thousands of models; even in this case, a speedup of 6 times was obtained.
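A minimal NumPy sketch of ELM training (in Python rather than the paper's R), illustrating why the matrix products dominate the cost: the GPU version offloads exactly these multiplications to the graphics card:

```python
import numpy as np

def elm_train(X, y, n_hidden=1000, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)          # hidden-layer output: the costly matmul
    beta = np.linalg.pinv(H) @ y    # output weights via Moore-Penrose inverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

X = np.random.default_rng(1).normal(size=(5000, 20))
y = X[:, 0] - 2.0 * X[:, 1]
W, b, beta = elm_train(X, y)
print(np.abs(elm_predict(X, W, b, beta) - y).mean())
```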
hybrid artificial intelligence systems | 2018
R. Urraca; J. Antonanzas; Andres Sanz-Garcia; Alvaro Aldama; F.J. Martinez-de-Pison
We present a hybrid quality control (QC) method for identifying defects in ground-based solar radiation sensors. The method combines a window function that flags potential defects in radiation time series with a visual decision-support system that eases the detection of false alarms and the identification of the causes of the defects. The core of the algorithm is the window function, which filters out groups of daily records where the errors of several radiation products, mainly satellite-based models, are greater than the typical values for that product, region and time of the year.
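A minimal sketch (an assumed reading of the window function, not the published code) flagging daily records where the rolling mean error against one reference product exceeds a multiple of its typical level; the window length and multiplier are illustrative:

```python
import numpy as np
import pandas as pd

def flag_windows(obs, sat, window=30, k=2.0):
    """obs, sat: pandas Series of daily irradiation indexed by date."""
    err = (sat - obs).abs()
    rolling = err.rolling(window, center=True, min_periods=window // 2).mean()
    typical = err.median()           # stand-in for per-region, per-season norms
    return rolling > k * typical     # True marks a potentially defective span

dates = pd.date_range("2015-01-01", periods=365, freq="D")
rng = np.random.default_rng(0)
obs = pd.Series(rng.normal(180.0, 20.0, 365), index=dates)
sat = obs + rng.normal(0.0, 8.0, 365)            # reference radiation product
obs.loc["2015-06-01":"2015-07-15"] += 60.0       # simulate a drifting sensor
print(flag_windows(obs, sat).sum(), "days flagged")
```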
Applied Soft Computing | 2018
Alpha Pernía-Espinoza; Julio Fernández-Ceniceros; J. Antonanzas; R. Urraca; F.J. Martinez-de-Pison
This study presents a new soft computing method to create an accurate and reliable model capable of determining three key points of the comprehensive force–displacement curve of bolted components in steel structures. To this end, a database with the results of a set of finite element (FE) simulations, which represent real responses of bolted components, is used to create a stacking ensemble model that combines the predictions of different parsimonious base models. The innovative proposal of this study is the use of GA-PARSIMONY, a previously published GA-based method that searches for parsimonious models by optimizing both feature selection and hyperparameter tuning. Parsimonious solutions created with a variety of machine learning methods are then combined by means of a nested cross-validation scheme into a single meta-learner, in order to increase diversity and minimize the generalization error. The results reveal that efficiently combining parsimonious models provides more accurate and reliable predictions than other methods. Thus, the resulting model is able to replace costly FE simulations without significantly compromising accuracy, and could be implemented in structural analysis software.
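A minimal sketch using scikit-learn's StackingRegressor as a stand-in for the paper's stacking of GA-PARSIMONY-tuned base models; the data, base learners and hyperparameters are illustrative:

```python
import numpy as np
from sklearn.ensemble import StackingRegressor, RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 8))   # stand-ins for geometric/material inputs
y = X[:, 0] ** 2 + X[:, 1] + rng.normal(scale=0.2, size=400)

stack = StackingRegressor(
    estimators=[("svr", SVR(C=10.0)),
                ("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
                ("ridge", Ridge(alpha=1.0))],
    final_estimator=Ridge(),    # meta-learner fit on out-of-fold predictions
    cv=5,                       # inner CV, mirroring the nested-CV idea
)
score = cross_val_score(stack, X, y, cv=5,
                        scoring="neg_root_mean_squared_error")
print(f"Outer-CV RMSE: {-score.mean():.3f}")
```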