Alpha Pernía-Espinoza
University of La Rioja
Publications
Featured research published by Alpha Pernía-Espinoza.
Neural Networks | 2005
Alpha Pernía-Espinoza; Joaquín Ordieres-Meré; F.J. Martinez-de-Pison; Ana González-Marcos
In several fields, such as industrial modelling, multilayer feedforward neural networks are often used as universal function approximators. These supervised neural networks are commonly trained with the traditional backpropagation algorithm, which minimises the mean squared error (MSE) over the training data. However, in the presence of corrupted data (outliers), this training scheme may produce wrong models. We combine the benefits of the non-linear regression tau-estimates [introduced by Tabatabai, M.A. and Argyros, I.K., Robust estimation and testing for general nonlinear regression models, Applied Mathematics and Computation 58 (1993) 85-101] with the backpropagation algorithm to produce the TAO-robust learning algorithm, which deals with the problem of modelling in the presence of outliers. The cost function of this approach has a bounded influence function given by the weighted average of two psi functions, one corresponding to a very robust estimate and the other to a highly efficient estimate. The advantages of the proposed algorithm are illustrated with an example.
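The key idea of the abstract is a cost function whose influence function is bounded, so that large residuals from outliers stop pulling on the weights. A minimal sketch of that idea, using Tukey's bisquare as an illustrative bounded rho/psi pair (the weighting and the tuning constants here are hypothetical, not the paper's exact choices):

```python
import numpy as np

def bisquare_rho(r, c):
    """Tukey bisquare loss: bounded, constant for |r| > c."""
    r = np.asarray(r, dtype=float)
    inside = np.abs(r) <= c
    return np.where(inside, (c**2 / 6.0) * (1 - (1 - (r / c) ** 2) ** 3),
                    c**2 / 6.0)

def bisquare_psi(r, c):
    """Derivative of rho: the influence function, zero beyond c."""
    r = np.asarray(r, dtype=float)
    inside = np.abs(r) <= c
    return np.where(inside, r * (1 - (r / c) ** 2) ** 2, 0.0)

def tao_style_loss(residuals, w=0.5, c_robust=1.547, c_efficient=4.685):
    """Weighted combination of a very robust (small c) and a highly
    efficient (large c) bounded loss, in the spirit of the tau-estimate
    cost used by the TAO-robust algorithm."""
    return np.mean(w * bisquare_rho(residuals, c_robust)
                   + (1 - w) * bisquare_rho(residuals, c_efficient))
```

Backpropagating this loss instead of the MSE means an outlier with a huge residual contributes a gradient of zero rather than a dominant one.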
Applied Soft Computing | 2015
Andres Sanz-Garcia; Julio Fernández-Ceniceros; F. Antonanzas-Torres; Alpha Pernía-Espinoza; F.J. Martinez-de-Pison
Highlights: GA-PARSIMONY combines feature selection and model parameter optimization. The best parsimonious models are selected according to cost and complexity separately. A lower number of features was selected in 65% of the 20 UCI and StatLib databases tested. GA-PARSIMONY proved useful in SVR control models for a hot-dip galvanizing line. This article proposes a new genetic algorithm (GA) methodology, GA-PARSIMONY, to obtain parsimonious support vector regression (SVR) models capable of predicting highly precise setpoints in a continuous annealing furnace. The proposal combines feature selection, model tuning, and parsimonious model selection in order to achieve robust SVR models. To this end, a novel GA selection procedure is introduced based on separate cost and complexity evaluations. The best individuals are initially sorted by an error fitness function; afterwards, models with similar costs are rearranged according to a model-complexity measurement so as to foster models of lesser complexity. The user-supplied penalty parameter, used to balance cost and complexity in other fitness functions, is thus rendered unnecessary. GA-PARSIMONY performed similarly to a classical GA on twenty benchmark datasets from public repositories, but used fewer features in a striking 65% of the models. Moreover, the proposal also proved useful in a real industrial process, predicting three temperature setpoints for a continuous annealing furnace. The results demonstrate that GA-PARSIMONY generates more robust SVR models with fewer input features than a classical GA.
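The distinctive step is the two-stage ranking: sort the population by error first, then let a simpler model overtake a neighbour whose error is almost identical. A minimal sketch of that rearrangement (the tuple layout, tolerance value, and example numbers are hypothetical, not taken from the paper):

```python
def parsimony_rerank(population, tol):
    """Sort models by error; then, wherever two adjacent models have
    errors closer than `tol`, promote the one with lower complexity.
    `population` is a list of (error, complexity, model_id) tuples."""
    ranked = sorted(population, key=lambda m: m[0])
    i = 0
    while i < len(ranked) - 1:
        # next model is almost as accurate but simpler -> promote it
        if (ranked[i + 1][0] - ranked[i][0] < tol
                and ranked[i + 1][1] < ranked[i][1]):
            ranked[i], ranked[i + 1] = ranked[i + 1], ranked[i]
            i = max(i - 1, 0)   # re-check the previous pair after a swap
        else:
            i += 1
    return ranked

pop = [(0.100, 5, "a"), (0.101, 3, "b"), (0.200, 2, "c")]
# "b" is nearly as accurate as "a" but simpler, so it moves ahead of "a";
# "c" stays last because its error is clearly worse.
order = [m[2] for m in parsimony_rerank(pop, tol=0.005)]
```

Because simplicity only breaks near-ties in error, no user-supplied penalty weight between cost and complexity is needed.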
Journal of Applied Logic | 2012
Andres Sanz-Garcia; Alpha Pernía-Espinoza; R. Fernández-Martínez; Francisco Javier Martinez-de-Pison-Ascacibar
Abstract In most cases, optimal control of steel industrial processes is a very complicated task because of the large number of parameters to adjust. For that reason, engineers in steel plants must estimate the best values of the operational parameters of each process and, sometimes, also obtain an appropriate model of the steel material behaviour. This article reports three successful experiences of combining genetic algorithms with the finite element method to solve engineering optimisation problems. On the one hand, a fully automated method for determining the best material behaviour laws is described; on the other, we present a common methodology for finding the most appropriate settings in two cases of improvement of steel industrial processes. The study of the three reported cases shows the reliability and effectiveness of combining both techniques.
International Journal of Information Technology and Decision Making | 2016
E. Martinez-De-Pison; Julio Fernández-Ceniceros; Alpha Pernía-Espinoza; F.J. Martinez-de-Pison; Andres Sanz-Garcia
Room demand estimation models are crucial to the performance of hotel revenue management systems. The advent of websites for online room booking has reduced the accuracy of prediction models because of complex customer behaviour patterns, a reduction that has been particularly dramatic owing to last-minute reservations. We propose the use of parsimonious models to improve room demand forecasting. The models are created with a flexible methodology based on genetic algorithms in which a wrapper-based scheme is optimized. The methodology includes not only automated model parameter optimization but also the selection of the most relevant inputs and the transformation of the skewed room demand distribution. The effectiveness of our proposal was evaluated using historical room booking data from a hotel located in the La Rioja region in northern Spain. The dataset also included sociological and meteorological information and a list of local and regional festivities. Nine types of regression models were tuned using the proposed optimization scheme, with grid search as the reference method. The comparison shows that our proposal generated more parsimonious models, which in turn led to higher overall accuracy and better generalization performance. Finally, the applicability of the methodology was demonstrated by creating a six-month calendar of estimated room demand.
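One of the preprocessing steps the abstract mentions is transforming the skewed room-demand distribution before fitting. A minimal sketch of one common way to do this, a log transform of the target with the inverse applied to predictions (the demand figures below are invented for illustration):

```python
import numpy as np

# Hypothetical daily room-demand counts: many quiet days, a few peaks
# (festivities, holidays) that stretch the right tail.
demand = np.array([2, 3, 3, 5, 8, 12, 40, 95], dtype=float)

# log1p compresses the long right tail, giving the regression model a
# more symmetric target to fit...
y = np.log1p(demand)

# ...and predictions are mapped back to room counts with the inverse.
recovered = np.expm1(y)
```

Whether a log, Box-Cox, or other transform works best would itself be a choice the genetic-algorithm wrapper can optimize.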
Hybrid Artificial Intelligence Systems | 2015
M. Alia-Martinez; J. Antonanzas; F. Antonanzas-Torres; Alpha Pernía-Espinoza; R. Urraca
General-purpose computing on graphics processing units (GPGPU) is a promising technique for coping with today's computational challenges, given the suitability of GPUs for parallel processing. Several libraries and functions are being released to boost the use of GPUs in real-world problems. However, many of these packages require deep knowledge of GPU architecture and low-level programming, so end users have trouble exploiting the advantages of GPGPU. In this paper, we focus on the GPU acceleration of a prediction technique specially designed to deal with big datasets: the extreme learning machine (ELM). The intent of this study is to develop a user-friendly library in the open-source R language and to release the code at https://github.com/maaliam/EDMANS-elmNN-GPU.git, so that R users can freely use it, the only requirement being an NVIDIA graphics card. The most computationally demanding operations were identified through a sensitivity analysis. As a result, only matrix multiplications were executed on the GPU, as they account for around 99% of total execution time. A speedup of up to 15 times was obtained with this GPU-accelerated ELM in the most computationally expensive scenarios. Moreover, the applicability of the GPU-accelerated ELM was also tested in a typical case of model selection, in which genetic algorithms were used to fine-tune an ELM and thousands of models had to be trained; in this case, a speedup of 6 times was still obtained.
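The reason matrix multiplication dominates ELM training is visible in the algorithm itself: random input weights are fixed, so fitting reduces to one activation product and one least-squares solve. A minimal CPU sketch in NumPy (the R package in the paper offloads exactly these products to the GPU; the data and network size here are illustrative):

```python
import numpy as np

def elm_train(X, y, n_hidden, rng):
    """Single-hidden-layer ELM: random, untrained input weights; the
    output weights are solved in closed form. The products X @ W and
    pinv(H) @ y are the costly operations worth moving to a GPU."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)            # hidden-layer activations
    beta = np.linalg.pinv(H) @ y      # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
y = X[:, 0] - 2 * X[:, 1]             # a simple synthetic target
W, b, beta = elm_train(X, y, n_hidden=50, rng=rng)
train_mse = np.mean((elm_predict(X, W, b, beta) - y) ** 2)
```

Because no iterative backpropagation is involved, the wall-clock time is almost entirely these dense linear-algebra calls, which is why a GPU speedup transfers directly to the whole training.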
SOCO-CISIS-ICEUTE | 2014
Ruben Urraca-Valle; Enrique Sodupe-Ortega; Alpha Pernía-Espinoza; Andres Sanz-Garcia
In this paper we propose different strategies for applying non-parametric multiple comparisons in industrial environments. These techniques have been widely used in theoretical studies and research to evaluate the performance of models, but they are still far from being implemented in real applications. We therefore develop three new automated strategies to ease the selection of soft computing models using data from industrial processes. A rubber products manufacturer was selected as the real industry in which to conduct the experiments; more specifically, we focus our study on the mixing phase. The rheology curve of rubber compounds is predicted to anticipate possible failures in the vulcanization process. More accurate predictions are needed to provide setpoints that enhance control of the process, particularly in this rapidly changing environment. Selecting among a wide range of models increases the probability of achieving the best predictions, so the main goal of our methodology is to automate the selection process when many choices are available. The soft computing models used to validate our proposal are neural networks and support vector machines, along with alternatives such as linear and rule-based models.
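The standard non-parametric multiple-comparison workflow for model selection is a Friedman test over per-dataset error ranks, followed by a post-hoc pairwise test. A minimal sketch with SciPy (the error matrix below is invented; the paper's own three strategies are not reproduced here):

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Hypothetical RMSE of three candidate models over six datasets
# (rows = datasets, columns = models).
errors = np.array([
    [0.20, 0.25, 0.40],
    [0.18, 0.22, 0.35],
    [0.30, 0.28, 0.45],
    [0.15, 0.21, 0.33],
    [0.22, 0.24, 0.38],
    [0.19, 0.26, 0.41],
])

# Friedman test: do the models' error ranks differ across datasets?
stat, p = friedmanchisquare(errors[:, 0], errors[:, 1], errors[:, 2])

# Average rank per model (1 = best); a post-hoc test such as Nemenyi
# would then compare these average ranks pairwise.
avg_ranks = np.mean(np.argsort(np.argsort(errors, axis=1), axis=1) + 1,
                    axis=0)
```

Working on ranks rather than raw errors is what makes the comparison robust to the non-normal error distributions typical of industrial data.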
Hybrid Artificial Intelligence Systems | 2018
F.J. Martinez-de-Pison; R. Gonzalez-Sendino; J. Ferreiro; E. Fraile; Alpha Pernía-Espinoza
Nowadays, there is increasing interest in automating KDD processes. Thanks to the growing power and falling cost of computing devices, the search for the best features and model parameters can be carried out with different meta-heuristics, so researchers can focus on other important tasks such as data wrangling and feature engineering. In this contribution, the GAparsimony R package is presented. This library implements the GA-PARSIMONY methodology published in previous journal articles and HAIS conferences. The objective of this paper is to show how to use GAparsimony to search for accurate parsimonious models by combining feature selection, hyperparameter optimization, and parsimonious model search. The paper therefore covers the cautions and considerations required to find a robust parsimonious model with this package, illustrated with a regression example that can easily be adapted to another problem, database, or algorithm.
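Combining feature selection and hyperparameter optimization in a single search means each individual encodes both at once. A minimal sketch of one plausible chromosome layout (this encoding, its gene ranges, and the SVR-style parameter names are illustrative assumptions, not the GAparsimony package's actual internals):

```python
import numpy as np

def decode_chromosome(chrom, n_features):
    """Hypothetical GA-PARSIMONY-style chromosome: the first genes are
    real-valued hyperparameters in [0, 1], the remaining genes a
    feature-selection mask (gene > 0.5 keeps the feature)."""
    C = 10.0 ** (chrom[0] * 6 - 3)        # cost parameter in [1e-3, 1e3]
    gamma = 10.0 ** (chrom[1] * 6 - 3)    # kernel width, same range
    mask = chrom[2:2 + n_features] > 0.5  # boolean feature mask
    return C, gamma, mask

# One individual: two hyperparameter genes, then four feature genes.
chrom = np.array([0.5, 0.75, 0.9, 0.2, 0.6, 0.1])
C, gamma, mask = decode_chromosome(chrom, n_features=4)
n_selected = int(mask.sum())  # feature count feeds the complexity measure
```

The number of selected features then contributes to the complexity measurement used when near-tied individuals are reranked.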
Materials | 2018
Enrique Sodupe-Ortega; Andres Sanz-Garcia; Alpha Pernía-Espinoza; Carmen Escobedo-Lucea
Most studies in three-dimensional (3D) bioprinting have traditionally been based on printing a single bioink. Addressing the complexity of organ and tissue engineering, however, will require combining multiple building and sacrificial biomaterials and several cell types in a single biofabrication session. This is a significant challenge, and to tackle it we must focus on the complex relationships between the printing parameters and the print resolution. In this paper, we study the influence of the main parameters driving multi-material 3D bioprinting and present a method to calibrate these systems and control the print resolution accurately. Firstly, poloxamer hydrogels were extruded using a desktop 3D printer modified to incorporate four microextrusion-based bioprinting (MEBB) printheads. The printed hydrogels provided us with the range of printing parameters (mainly printing pressure, deposition speed, and nozzle z-offset) needed to ensure the correct calibration of the multi-material 3D bioprinter. Using the printheads, we demonstrated the excellent performance of the calibrated system by extruding different fluorescent bioinks. Representative multi-material structures were printed in both poloxamer and cell-laden gelatin-alginate bioinks in a single session, corroborating the capabilities of our system and the calibration method. Cell viability was not significantly affected by any of the changes proposed. We conclude that our proposal has enormous potential to help advance the creation of complex 3D constructs and vascular networks for tissue engineering.
Applied Soft Computing | 2018
Alpha Pernía-Espinoza; Julio Fernández-Ceniceros; J. Antonanzas; R. Urraca; F.J. Martinez-de-Pison
Abstract This study presents a new soft computing method to create an accurate and reliable model capable of determining three key points of the comprehensive force-displacement curve of bolted components in steel structures. To this end, a database with the results of a set of finite element (FE) simulations, which represent real responses of bolted components, is used to create a stacking ensemble model that combines the predictions of different parsimonious base models. The innovative proposal of this study is the use of GA-PARSIMONY, a previously published GA method that searches for parsimonious models by jointly optimizing feature selection and hyperparameter tuning. Parsimonious solutions created with a variety of machine learning methods are combined, by means of a nested cross-validation scheme, in a single meta-learner in order to increase diversity and minimize the generalization error. The results reveal that efficiently combining parsimonious models provides more accurate and reliable predictions than other methods. Thus, the informational model is able to replace costly FE simulations without significantly compromising accuracy, and could be implemented in structural analysis software.
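The stacking scheme the abstract describes, base models whose out-of-fold predictions train a meta-learner, can be sketched with scikit-learn's stacking API. The base models and data below are generic stand-ins for the tuned parsimonious models and the FE-simulation database, which are not reproduced here:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import Lasso, Ridge
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

# Hypothetical stand-ins for the diverse parsimonious base models.
base_models = [
    ("ridge", Ridge(alpha=1.0)),
    ("lasso", Lasso(alpha=0.01)),
    ("tree", DecisionTreeRegressor(max_depth=4, random_state=0)),
]

# StackingRegressor fits the meta-learner on out-of-fold predictions of
# the base models (the internal cv=5 split), echoing the nested
# cross-validation idea used to limit the generalization error.
stack = StackingRegressor(estimators=base_models,
                          final_estimator=Ridge(alpha=1.0), cv=5)

# Synthetic regression data in place of the FE-simulation database.
X, y = make_regression(n_samples=300, n_features=8, noise=5.0,
                       random_state=0)
score = cross_val_score(stack, X, y, cv=3, scoring="r2").mean()
```

Diversity among base learners is what gives the meta-learner something to combine: identical base models would make the stack no better than its parts.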
Hybrid Artificial Intelligence Systems | 2017
J. Antonanzas; R. Urraca; Alpha Pernía-Espinoza; Alvaro Aldama; Luis Alfredo Fernández-Jiménez; F.J. Martinez-de-Pison
Solar power forecasts are gaining importance as the penetration of solar energy into the grid rises. The natural variability of the solar resource, together with the difficulty of modeling cloud movement, endows solar power forecasts with a certain level of uncertainty. Important efforts have been made in the field to reduce errors as much as possible. Various approaches have been followed, the predominant one nowadays being the use of statistical techniques to model production.