Vicente J. Romero
Sandia National Laboratories
Publication
Featured research published by Vicente J. Romero.
IEEE Control Systems Magazine | 1997
John T. Feddema; Clark R. Dohrmann; Gordon G. Parker; Rush D. Robinett; Vicente J. Romero; Dan J. Schmitt
This article describes two methods for controlling the surface of a liquid in an open container as it is being carried by a robot arm. Both methods make use of the fundamental mode of oscillation and damping of the liquid in the container as predicted from a boundary element model of the fluid. The first method uses an infinite impulse response filter to alter the acceleration profile so that the liquid remains level except for a single wave at the beginning and end of the motion. The motion of the liquid is similar to that of a simple pendulum. The second method removes the remaining two surface oscillations by tilting the container parallel to the beginning and ending wave. A double pendulum model is used to determine the trajectory for this motion. Experimental results of a FANUC S-800 robot moving a 230 mm diameter hemispherical container of water are presented.
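The input-shaping idea behind the first method can be sketched briefly. The two-impulse zero-vibration shaper below is a standard construction for cancelling a single oscillatory mode given its frequency and damping; the paper's actual IIR filter design may differ, and the slosh-mode values here are illustrative placeholders rather than outputs of a boundary element model.

```python
import numpy as np

def zv_shaper(wn, zeta):
    """Two-impulse shaper for a mode with natural frequency wn [rad/s] and
    damping ratio zeta, e.g. the fundamental slosh mode predicted by a
    boundary element model of the fluid."""
    K = np.exp(-zeta * np.pi / np.sqrt(1.0 - zeta**2))
    amps = np.array([1.0, K]) / (1.0 + K)           # impulse amplitudes
    t2 = np.pi / (wn * np.sqrt(1.0 - zeta**2))      # half the damped period
    return amps, np.array([0.0, t2])

def shape_profile(accel, dt, wn, zeta):
    """Convolve a raw acceleration profile with the shaper so the commanded
    motion (ideally) leaves no residual oscillation of the slosh mode."""
    amps, times = zv_shaper(wn, zeta)
    kernel = np.zeros(int(round(times[-1] / dt)) + 1)
    for a, t in zip(amps, times):
        kernel[int(round(t / dt))] += a
    return np.convolve(accel, kernel)

# Example: shape a crude bang-bang acceleration command for a ~1.5 Hz slosh mode.
dt = 0.001
accel = np.concatenate([np.ones(500), -np.ones(500)])
shaped = shape_profile(accel, dt, wn=2 * np.pi * 1.5, zeta=0.02)
```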
Reliability Engineering & System Safety | 2006
Vicente J. Romero; John Burkardt; Max Gunzburger; Janet S. Peterson
A recently developed centroidal Voronoi tessellation (CVT) sampling method is investigated here to assess its suitability for use in statistical sampling applications. CVT efficiently generates a highly uniform distribution of sample points over arbitrarily shaped M-dimensional parameter spaces. On several 2-D test problems CVT has recently been found to provide exceedingly effective and efficient point distributions for response surface generation. Additionally, for statistical function integration and estimation of response statistics associated with uniformly distributed random-variable inputs (uncorrelated), CVT has been found in initial investigations to provide superior point sets when compared against Latin hypercube and simple random Monte Carlo methods and Halton and Hammersley quasi-random sequence methods. In this paper, the performance of all these sampling methods and a new variant ("Latinized" CVT) is further compared for non-uniform input distributions. Specifically, given uncorrelated normal inputs in a 2-D test problem, statistical sampling efficiencies are compared for resolving various statistics of response: mean, variance, and exceedance probabilities.
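A common way to generate an approximate CVT point set is the probabilistic Lloyd construction: run k-means on a dense uniform Monte Carlo cloud and take the resulting centroids as the sample points. The sketch below assumes that construction (not necessarily the authors' exact algorithm) and produces a highly uniform 50-point set over the unit square; for non-uniform inputs such as the normal case studied in the paper, the uniform points could then be mapped through inverse CDFs.

```python
import numpy as np
from scipy.cluster.vq import kmeans

rng = np.random.default_rng(0)
n_points = 50                          # desired number of CVT sample points
cloud = rng.random((200_000, 2))       # dense uniform cloud over [0, 1]^2

# k-means centroids of the cloud approximate CVT generators, giving a
# highly uniform spread of sample points over the parameter space.
cvt_points, _ = kmeans(cloud, n_points, iter=30, seed=0)
```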
AIAA Journal | 2008
John McFarland; Sankaran Mahadevan; Vicente J. Romero; Laura Painton Swiler
Model calibration analysis is concerned with the estimation of unobservable modeling parameters using observations of system response. When the model being calibrated is an expensive computer simulation, special techniques such as surrogate modeling and Bayesian inference are often fruitful. In this paper, we show how the flexibility of the Bayesian calibration approach can be exploited to account for a wide variety of uncertainty sources in the calibration process. We propose a straightforward approach for simultaneously handling Gaussian and non-Gaussian errors, as well as a framework for studying the effects of prescribed uncertainty distributions for model inputs that are not treated as calibration parameters. Further, we discuss how Gaussian process surrogate models can be used effectively when simulator response may be a function of time and/or space (multivariate output). The proposed methods are illustrated through the calibration of a simulation of thermally decomposing foam.
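The general workflow described here (surrogate plus Bayesian inference) can be illustrated compactly. In the sketch below, a Gaussian process surrogate trained on a handful of simulator runs stands in for the expensive simulation, and a random-walk Metropolis sampler draws from the posterior of a single calibration parameter under a Gaussian error model. The toy simulator, data, prior bounds, and step size are all illustrative assumptions, not the paper's foam-decomposition application.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def simulator(theta):                  # placeholder for the expensive code
    return np.sin(3.0 * theta) + theta

# 1. Build the surrogate from a handful of simulator runs.
X_train = np.linspace(0.0, 2.0, 12).reshape(-1, 1)
y_train = simulator(X_train).ravel()
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(X_train, y_train)

# 2. Metropolis sampling of p(theta | data) with N(0, sigma^2) measurement
#    error and a flat prior on [0, 2].
y_obs, sigma = 1.3, 0.1

def log_post(theta):
    if not 0.0 <= theta <= 2.0:
        return -np.inf
    resid = y_obs - gp.predict(np.array([[theta]]))[0]
    return -0.5 * (resid / sigma) ** 2

rng = np.random.default_rng(1)
theta, chain = 1.0, []
for _ in range(5000):
    prop = theta + 0.1 * rng.standard_normal()   # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
```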
International Conference on Robotics and Automation | 1996
John T. Feddema; Clark R. Dohrmann; Gordon G. Parker; Rush D. Robinett; Vicente J. Romero; Dan J. Schmitt
This paper describes two methods for controlling the surface of a liquid in an open container as it is being carried by a robot arm. Both methods make use of the fundamental mode of oscillation and damping of the liquid in the container as predicted from a boundary element model of the fluid. The first method uses an infinite impulse response filter to alter an acceleration profile so that the liquid remains level except for a single wave at the beginning and end of the motion. The motion of the liquid is similar to that of a simple pendulum. The second method removes the remaining two surface oscillations by tilting the container parallel to the beginning and ending wave. A double pendulum model is used to determine the trajectory for this motion. Experimental results of a FANUC S-800 robot moving a 230 mm diameter hemispherical container of water are presented.
46th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference | 2005
Vicente J. Romero; Martin Sherman; J.F. Dempsey; Jay D. Johnson; L.R. Edwards; K.C. Chen; R.V. Baron; C.F. King
We extend a methodology for the formalized development and validation of a component failure model in a system-modeling context. The component thermal-failure characterization problem we consider is a real one, exhibiting a rather full set of the aspects and elements of model development and validation that may be encountered in practice. The interplay between modeling, experiments, and statistics illustrated here provides an interesting application example of model development and validation work.
SAE International Journal of Materials and Manufacturing | 2013
Vicente J. Romero; Joshua Mullins; Laura Painton Swiler; Angel Urbina
This paper discusses the treatment of uncertainties corresponding to relatively few samples of random-variable quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse samples it is not practical to have a goal of accurately estimating the underlying variability distribution (probability density function, PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a desired percentage of the actual PDF, say 95% included probability, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative, that is, that it minimally overestimate the random-variable range corresponding to the desired percentage of the actual PDF. The presence of the two opposing objectives makes the sparse-data uncertainty representation problem an interesting and difficult one. In this paper the performance of five uncertainty representation techniques is characterized on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods exhibit significantly better overall performance than the others according to the objectives and performance measures emphasized.
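As a concrete example of the kind of representation being evaluated, the sketch below computes a two-sided normal-theory tolerance interval intended to bound 95% of the underlying PDF with 95% confidence, using Howe's standard approximation to the tolerance factor. The abstract does not name the five techniques compared, so this should be read as one plausible representative, not necessarily among them.

```python
import numpy as np
from scipy import stats

def normal_tolerance_interval(samples, coverage=0.95, confidence=0.95):
    """Howe's approximation to the two-sided (coverage, confidence)
    tolerance factor for normally distributed data."""
    x = np.asarray(samples, dtype=float)
    n = x.size
    z = stats.norm.ppf(0.5 * (1.0 + coverage))
    chi2 = stats.chi2.ppf(1.0 - confidence, n - 1)    # lower chi-square quantile
    k = np.sqrt((n - 1) * (1.0 + 1.0 / n) * z**2 / chi2)
    return x.mean() - k * x.std(ddof=1), x.mean() + k * x.std(ddof=1)

# With only 5 samples the interval is wide, reflecting deliberate conservatism.
print(normal_tolerance_interval([9.8, 10.1, 10.0, 10.4, 9.7]))
```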
52nd AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference | 2011
Vicente J. Romero; Laura Painton Swiler; Angel Urbina
This paper discusses the treatment of uncertainties corresponding to relatively few data samples in experimental characterization of random quantities. The importance of this topic extends beyond experimental uncertainty to situations where the derived experimental information is used for model validation or calibration. With very sparse data it is not practical to have a goal of accurately estimating the underlying variability distribution (probability density function, PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a desired percentage of the actual PDF, say 95% included probability, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative, that is, that it minimally overestimate the random-variable range corresponding to the desired percentage of the actual PDF. The performance of a variety of uncertainty representation techniques is tested and characterized in this paper according to these two opposing objectives. An initial set of test problems and results is presented here from a larger study currently underway.
Fourth International Symposium on Uncertainty Modeling and Analysis, 2003. ISUMA 2003. | 2003
Vicente J. Romero; John Burkardt; M.D. Gunzburger; Janet S. Peterson
A recently developed centroidal Voronoi tessellation (CVT) unstructured sampling method is investigated here to assess its suitability for use in statistical sampling and function integration. CVT efficiently generates a highly uniform distribution of sample points over arbitrarily shaped M-dimensional parameter spaces. It has recently been shown on several 2-D test problems to provide superior point distributions for generating locally conforming response surfaces. Its performance as a statistical sampling and function integration method is compared to that of Latin hypercube sampling (LHS) and simple random sampling (SRS) Monte Carlo methods, and to Halton and Hammersley quasi-Monte Carlo sequence methods. Specifically, sampling efficiencies are compared for function integration and for resolving various statistics of response in a 2-D test problem. It is found that, on balance, CVT performs best of all these sampling methods on our test problems.
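The flavor of such a comparison is easy to reproduce for the non-CVT methods using standard libraries. The sketch below estimates the mean of a smooth 2-D test function by simple random sampling, Latin hypercube sampling, and Halton sequences, and reports RMS error over repeated trials; the test function and sample sizes are assumptions, and CVT is omitted for brevity.

```python
import numpy as np
from scipy.stats import qmc

f = lambda p: np.sin(np.pi * p[:, 0]) * p[:, 1] ** 2   # smooth 2-D response
exact = (2.0 / np.pi) * (1.0 / 3.0)                    # analytic mean of f

n, trials, rng = 128, 200, np.random.default_rng(2)
for name, draw in [
    ("SRS   ", lambda: rng.random((n, 2))),
    ("LHS   ", lambda: qmc.LatinHypercube(d=2, seed=rng).random(n)),
    ("Halton", lambda: qmc.Halton(d=2, seed=rng).random(n)),
]:
    errs = [f(draw()).mean() - exact for _ in range(trials)]
    print(name, "RMS error:", np.sqrt(np.mean(np.square(errs))))
```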
53rd AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference / 20th AIAA/ASME/AHS Adaptive Structures Conference / 14th AIAA | 2012
Vicente J. Romero; J. Franklin Dempsey; Gerald W. Wellman; Bonnie R. Antoun; William Mark Scherzinger
This paper describes a practical method for representing, propagating, and aggregating aleatory and epistemic uncertainties associated with sparse samples of discrete random functions and processes. An example is material strength variability represented by multiple stress-strain curves from repeated material characterization tests. The functional relationship underlying the stress-strain curves is not known (no identifiable parametric relationship between the curves exists), so they are here treated as non-parametric or discrete glimpses of the material variability. Hence, representation and propagation of the material variability cannot be accomplished with standard parametric uncertainty approaches. Accordingly, a novel approach that also avoids underestimation of strength variability due to limited numbers of material tests (small numbers of samples of the variability) has been developed. A methodology for aggregation of non-parametric variability with parametric variability is described.
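The baseline idea of propagating a non-parametric curve ensemble can be sketched as follows: treat each measured curve as one discrete realization of material variability and draw whole curves at random for each model run. The paper's method additionally guards against underestimating variability when only a few curves are available; that refinement is not reproduced here, and the synthetic curves and placeholder "model" below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
strain = np.linspace(0.0, 0.05, 50)
# Pretend these are 6 repeated tension tests (no parametric form assumed).
curves = [200e3 * strain * (1.0 + 0.05 * rng.standard_normal()) for _ in range(6)]

def structural_model(stress_strain_curve):
    """Placeholder for an analysis run using one material realization."""
    return stress_strain_curve.max()          # e.g., a peak-load surrogate

# Propagate: each realization of the analysis gets one whole sampled curve.
outputs = [structural_model(curves[rng.integers(len(curves))]) for _ in range(1000)]
print(np.percentile(outputs, [5, 50, 95]))
```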
47th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference / 14th AIAA/ASME/AHS Adaptive Structures Conference / 7th | 2006
Vicente J. Romero; Chun-Hung Chen
A very general and robust approach to solving continuous-variable optimization problems involving uncertainty in the objective function is through the use of ordinal optimization. At each step in the optimization problem, improvement is based only on a relative ranking of the uncertainty effects on local design alternatives, rather than on precise quantification of the effects. One simply asks "Is that alternative better or worse than this one?", not "How much better or worse is that alternative than this one?" The answer to the latter question requires precise characterization of the uncertainty, with the corresponding sampling/integration expense for precise resolution. However, in this paper we demonstrate correct decision-making in a continuous-variable probabilistic optimization problem despite extreme vagueness in the statistical characterization of the design options. We present a new adaptive ordinal method for probabilistic optimization in which the trade-off between computational expense and vagueness in the uncertainty characterization can be conveniently managed in various phases of the optimization problem to make cost-effective stepping decisions in the design space. Spatial correlation of uncertainty in the continuous-variable design space is exploited to dramatically increase method efficiency. Under many circumstances the method appears to have favorable robustness and cost-scaling properties relative to other probabilistic optimization methods, and it uniquely has mechanisms for quantifying and controlling error likelihood in design-space stepping decisions. The method is asymptotically convergent to the true probabilistic optimum, so it could be useful as a reference standard against which the efficiency and robustness of other methods can be compared, analogous to the role that Monte Carlo simulation plays in uncertainty propagation.
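The core ordinal decision can be illustrated with a small sketch: decide only whether design B beats design A, adding Monte Carlo samples in small batches until the ranking is statistically clear, rather than estimating either objective precisely. The batching rule and Welch t-test threshold below are illustrative choices; the paper's adaptive method, including its exploitation of spatial correlation, is considerably more elaborate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

def noisy_objective(x):
    """Placeholder probabilistic objective: response under input noise."""
    return (x - 1.0) ** 2 + 0.5 * rng.standard_normal()

def ordinal_compare(x_a, x_b, batch=10, max_samples=500, conf=0.95):
    """Return 'B better', 'A better', or 'undecided' at the given confidence."""
    a, b = [], []
    while len(a) < max_samples:
        a += [noisy_objective(x_a) for _ in range(batch)]
        b += [noisy_objective(x_b) for _ in range(batch)]
        t = stats.ttest_ind(a, b, equal_var=False)   # Welch's t-test
        if t.pvalue < 1.0 - conf:                    # ranking is clear enough
            return "B better" if np.mean(b) < np.mean(a) else "A better"
    return "undecided"

print(ordinal_compare(0.0, 0.8))   # should step toward the optimum at x = 1
```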