Leobardo Valera
University of Texas at El Paso
Publications
Featured research published by Leobardo Valera.
WCSC | 2016
Fernando Cervantes; Bryan Usevitch; Leobardo Valera; Vladik Kreinovich
In many practical applications, it has turned out to be efficient to assume that a signal or an image is sparse, i.e., that when we decompose it into appropriate basis functions (e.g., sinusoids or wavelets), most of the coefficients in this decomposition are zeros. At present, the empirical efficiency of sparsity-based techniques remains somewhat of a mystery. In this paper, we show that fuzzy-related techniques can explain this empirical efficiency. A similar explanation can be obtained by using probabilistic techniques; this fact increases our confidence that our explanation is correct.
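To make the sparsity assumption concrete, here is a minimal, self-contained Python sketch (an editorial illustration, not code from the paper): a signal made of a few sinusoids has a Fourier decomposition in which almost all coefficients are negligible, so keeping only the largest coefficients removes most of the noise.

```python
# Illustrative only: sparsity in the Fourier basis.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1024, endpoint=False)
clean = np.sin(2 * np.pi * 7 * t) + 0.5 * np.sin(2 * np.pi * 31 * t)
noisy = clean + 0.1 * rng.standard_normal(t.size)

coeffs = np.fft.rfft(noisy)                # decompose into sinusoids
k = 4                                      # keep only the k largest coefficients
small = np.argsort(np.abs(coeffs))[:-k]
coeffs[small] = 0                          # the signal is sparse: the rest is ~noise
denoised = np.fft.irfft(coeffs, n=t.size)

rel_err = np.linalg.norm(denoised - clean) / np.linalg.norm(clean)
print(f"relative error after keeping {k} of {coeffs.size} coefficients: {rel_err:.3f}")
```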
IEEE International Conference on Fuzzy Systems | 2016
Fernando Cervantes; Bryan Usevitch; Leobardo Valera; Vladik Kreinovich; Olga Kosheleva
One of the main techniques used to de-noise and de-blur signals and images is regularization, which is based on the fact that signals and images are usually smoother than noise. Traditional Tikhonov regularization assumes that signals and images are differentiable, but, as Mandelbrot has shown in his fractal theory, many signals and images are not differentiable. To de-noise and de-blur such images, researchers have designed a heuristic method of ℓp-regularization. ℓp-regularization leads to good results, but it is not used as widely as it should be, because it lacks a convincing theoretical explanation; as a result, practitioners are often reluctant to use it, especially in critical situations. In this paper, we show that fuzzy techniques provide a theoretical explanation for ℓp-regularization. Fuzzy techniques also enable us to come up with natural next approximations to be used when the accuracy of ℓp-based de-noising and de-blurring is not sufficient.
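As a hedged illustration of what ℓp-regularization does (my own toy code, not the authors'), the sketch below denoises a piecewise-constant signal by penalizing the ℓp norm of its discrete gradient; the non-smooth |·|^p term is handled with a standard iteratively reweighted least-squares loop, and p = 2 recovers classical Tikhonov regularization.

```python
# Illustrative only: l_p-regularized denoising via iteratively reweighted least squares.
import numpy as np

def lp_denoise(y, lam=0.5, p=1.0, eps=1e-6, iters=30):
    """Approximately minimize 0.5*||x - y||^2 + lam * sum_i |(Dx)_i|^p."""
    n = y.size
    D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]       # discrete gradient operator
    x = y.copy()
    for _ in range(iters):
        d = D @ x
        w = p * (d * d + eps) ** (p / 2 - 1)       # weights of the quadratic majorizer
        A = np.eye(n) + lam * D.T @ (w[:, None] * D)
        x = np.linalg.solve(A, y)                  # reweighted least-squares step
    return x

rng = np.random.default_rng(1)
clean = np.repeat([0.0, 1.0, 0.2], 60)             # piecewise-constant (non-smooth) signal
noisy = clean + 0.15 * rng.standard_normal(clean.size)
for p in (2.0, 1.0):                               # Tikhonov vs. an l_p penalty
    err = np.linalg.norm(lp_denoise(noisy, p=p) - clean)
    print(f"p = {p}: reconstruction error = {err:.3f}")
```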
Systems, Man, and Cybernetics | 2017
Leobardo Valera; Angel Garcia; Afshin Gholamy; Martine Ceberio; Horacio Florez
The ability to conduct fast and reliable simulations of dynamic systems is of special interest to many fields of operations. Such simulations can be very complex and, to be thorough, involve millions of variables, making it prohibitive in CPU time to run them repeatedly for many different configurations. Reduced-Order Modeling (ROM) provides a concrete way to handle such complex simulations with a realistic amount of resources. However, uncertainty is rarely taken into account: changes in the definition of a model, for instance, could have dramatic effects on the outcome of simulations. Therefore, neither the reduced models nor the initial conclusions can be fully relied upon. In this research, Interval Constraint Solving Techniques (ICST) are employed to handle and quantify uncertainty. The goal is to identify key features of a given dynamical phenomenon in order to propagate the characteristics of the model forward and predict its future behavior with guaranteed results. This is especially important in applications, as a reliable understanding of a developing situation allows for preventative or palliative measures before the situation worsens.
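The paper's models and interval machinery are far richer, but the following toy sketch (my own, with made-up dimensions) conveys the Reduced-Order Modeling step it builds on: a large linear system x' = Ax is projected onto the leading left singular vectors of a snapshot matrix, and the small reduced model is simulated instead of the full one.

```python
# Illustrative only: POD/Galerkin reduced-order model of a linear system x' = A x.
import numpy as np

rng = np.random.default_rng(2)
n, r = 400, 10                                     # full and reduced dimensions (made up)
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n)) / np.sqrt(n)   # stable test system
x0 = rng.standard_normal(n)

def simulate(A, x0, dt=0.01, steps=300):
    xs, x = [x0], x0
    for _ in range(steps):
        x = x + dt * (A @ x)                       # explicit Euler time stepping
        xs.append(x)
    return np.array(xs).T                          # columns are snapshots

snapshots = simulate(A, x0)
U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
V = U[:, :r]                                       # POD basis of the snapshot data
A_r = V.T @ A @ V                                  # reduced operator (r x r)

full_final = simulate(A, x0)[:, -1]
rom_final = V @ simulate(A_r, V.T @ x0)[:, -1]     # lift reduced state back to full space
print("relative ROM error:",
      np.linalg.norm(rom_final - full_final) / np.linalg.norm(full_final))
```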
North American Fuzzy Information Processing Society | 2017
Leobardo Valera; Angel F. Garcia Contreras; Martine Ceberio
Computer simulations of dynamic systems are important for better understanding processes or phenomena without having to physically execute them, and for making offline decisions, i.e., decisions that do not need immediate, “on-the-fly” answers. However, given a set of equations describing a dynamic phenomenon, wouldn’t it be nice to exploit them further? Instead of merely simulating a situation, could we steer it toward a predefined performance? This paper is concerned with the identification of parameters of dynamic systems that ensure a specific performance or behavior. We propose to carry out such computations using interval and constraint solving techniques. Realistically, however, enabling such identification and decision-making on an ongoing process or phenomenon requires the ability to conduct very fast computations on possibly very large systems of equations. We therefore propose to combine interval and constraint solving techniques with reduced-order modeling techniques to guarantee results in a practical amount of time.
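To give a flavor of the interval constraint solving side, here is a deliberately tiny branch-and-prune example of my own (not the authors' solver): it finds all parameter values k of a toy model x' = -k·x + 1 whose steady state 1/k is guaranteed to land in a prescribed target interval.

```python
# Illustrative only: branch-and-prune with interval arithmetic on a toy model.
def steady_state_range(k_lo, k_hi):
    # interval extension of f(k) = 1/k on a positive interval (f is decreasing)
    return 1.0 / k_hi, 1.0 / k_lo

def branch_and_prune(lo, hi, target=(0.2, 0.5), tol=1e-3):
    guaranteed, undecided = [], []
    stack = [(lo, hi)]
    while stack:
        a, b = stack.pop()
        f_lo, f_hi = steady_state_range(a, b)
        if f_hi < target[0] or f_lo > target[1]:
            continue                               # box cannot contain any solution
        if target[0] <= f_lo and f_hi <= target[1]:
            guaranteed.append((a, b))              # every k in [a, b] satisfies the goal
        elif b - a < tol:
            undecided.append((a, b))               # too small to split further
        else:
            m = 0.5 * (a + b)                      # bisect and keep searching
            stack += [(a, m), (m, b)]
    return guaranteed, undecided

good, border = branch_and_prune(0.5, 10.0)
lo, hi = min(a for a, _ in good), max(b for _, b in good)
print(f"steady state guaranteed in [0.2, 0.5] for k roughly in [{lo:.3f}, {hi:.3f}]")
```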
North American Fuzzy Information Processing Society | 2016
Leobardo Valera; Martine Ceberio
The ability to make observations of natural phenomena has played a fundamental role in our world. From what we observe, models are derived, and we gain an understanding of how things work by simulating these models. This has been particularly important in areas such as medicine, physics, and chemistry. However, when we do not initiate simulations but are simply observing a phenomenon, it is valuable to be able to understand it “on the fly” and to predict its future behavior. An added challenge comes from the fact that observations are never 100% accurate, so we must deal with uncertainty. In this work, we use Interval Constraint Solving Techniques (ICST) to handle uncertainty in the observations of a given phenomenon, to determine its initial conditions, and to unfold its dynamic behavior further in time.
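As a small, hedged illustration of this idea (my own toy model, not the paper's), interval-valued observations of x(t) = x0·exp(-t), each known only to within ±0.05, can be propagated backward to a guaranteed enclosure of the initial condition x0 by intersecting the constraints.

```python
# Illustrative only: contracting the interval for an initial condition from uncertain data.
import math

true_x0 = 2.0                                      # used only to generate fake observations
times = [0.5, 1.0, 1.5, 2.0]
obs = [(true_x0 * math.exp(-t) - 0.05,             # each measurement is known only
        true_x0 * math.exp(-t) + 0.05)             # within +/- 0.05
       for t in times]

x0_lo, x0_hi = 0.0, 10.0                           # initial enclosure for x0
for t, (lo, hi) in zip(times, obs):
    # constraint x(t) = x0 * exp(-t)  =>  x0 must lie in [lo, hi] * exp(t)
    x0_lo = max(x0_lo, lo * math.exp(t))
    x0_hi = min(x0_hi, hi * math.exp(t))

print(f"guaranteed enclosure for x0: [{x0_lo:.3f}, {x0_hi:.3f}]")   # contains 2.0
```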
North American Fuzzy Information Processing Society | 2015
Martine Ceberio; Leobardo Valera; Olga Kosheleva; Rodrigo Romero
In many application areas, such as meteorology and traffic control, it is desirable to employ swarms of Unmanned Aerial Vehicles (UAVs) to provide a good picture of a changing situation and thus help us make better predictions (and better decisions based on these predictions). To avoid duplication, interference, and collisions, UAVs must coordinate their trajectories. As a result, the optimal control of each UAV depends on the positions and velocities of all the others, which makes the corresponding control problem very complicated. Since, in contrast to controlling a single UAV, the resulting problem is too complicated to expect an explicit solution, a natural idea is to extract expert rules and use fuzzy control methodology to translate these rules into a precise control strategy. However, with so many possible combinations of variables, it is not feasible to elicit that many rules. In this paper, we show that, in general, it is possible to use model reduction techniques to decrease the number of questions and thus make rule elicitation possible. In addition to general results, we also show that for UAVs, optimal control indeed leads to a model reduction, and thus to a drastic potential decrease in the corresponding number of questions.
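The combinatorial point can be made concrete with made-up but representative numbers (an editorial illustration only, not figures from the paper): the number of fuzzy rules grows exponentially with the number of input variables, so shrinking that number through model reduction is what makes elicitation feasible.

```python
# Illustrative arithmetic only; all counts below are hypothetical.
uavs = 5
vars_per_uav = 4             # e.g., 2-D position and 2-D velocity
fuzzy_values = 5             # e.g., "negative large" ... "positive large"

full_inputs = uavs * vars_per_uav
reduced_inputs = 4           # hypothetical number of variables after model reduction

print("rules without reduction:", fuzzy_values ** full_inputs)     # 5**20, about 1e14
print("rules after reduction:  ", fuzzy_values ** reduced_inputs)  # 5**4 = 625
```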
North American Fuzzy Information Processing Society | 2015
Octavio Lerma; Leobardo Valera; Vladik Kreinovich
Computers are getting faster and faster, and operating systems are getting more sophisticated. Often, these improvements necessitate migrating existing software to a new platform. In an ideal world, the migrated software would run perfectly well on the new platform; in reality, however, when we try this, thousands of errors appear that need to be corrected. As a result, software migration is usually a very time-consuming process. A natural way to speed it up is to take into account that errors naturally fall into different categories, and often a common correction can be applied to all errors from a given category. To use this idea efficiently, it is desirable to estimate the number of errors of each type. In this paper, we show how imprecise expert knowledge about such errors can be used to produce realistic estimates.
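As a rough, hypothetical illustration of working with imprecise estimates (not the method or the numbers from the paper), interval arithmetic turns expert ranges for the share of each error category into guaranteed ranges for the corresponding error counts.

```python
# Hypothetical numbers; illustrates interval arithmetic on imprecise expert estimates.
total_errors = (8_000, 12_000)            # expert's range for the total number of errors

category_share = {                        # expert's range for the fraction in each category
    "API changes":      (0.30, 0.45),
    "type mismatches":  (0.20, 0.35),
    "deprecated calls": (0.10, 0.25),
}

for name, (lo, hi) in category_share.items():
    count_lo = total_errors[0] * lo       # smallest possible count in this category
    count_hi = total_errors[1] * hi       # largest possible count
    print(f"{name}: between {count_lo:.0f} and {count_hi:.0f} errors")
```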
Journal of Uncertain Systems | 2016
Leobardo Valera; Martine Ceberio
Journal of Uncertain Systems | 2015
L. Octavio Lerma; Leobardo Valera; Deana Pennington; Vladik Kreinovich
Archive | 2018
Leobardo Valera; Martine Ceberio; Vladik Kreinovich