Publication


Featured research published by Felipe A. C. Viana.


12th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference | 2008

Design and Analysis of Computer Experiments in Multidisciplinary Design Optimization: A Review of How Far We Have Come - Or Not

Timothy W. Simpson; Vassili Toropov; Vladimir Balabanov; Felipe A. C. Viana

The use of metamodeling techniques in the design and analysis of computer experiments has progressed remarkably in the past two decades, but how far have we really come? This is the question that we investigate in this paper, namely, the extent to which the use of metamodeling techniques in multidisciplinary design optimization has evolved in the two decades since the seminal paper on Design and Analysis of Computer Experiments by Sacks et al. As part of this review, we examine the motivation for advancements in metamodeling techniques from both a historical perspective and the research itself. Based on current thrusts in the field, this review emphasizes multi-level/multi-fidelity approximations and ensembles of metamodels, as well as the availability of metamodels within commercial software and their use for design space exploration and visualization. Our closing remarks offer insight into future research directions – nearly the same ones that have motivated us in the past.


AIAA Journal | 2014

Special Section on Multidisciplinary Design Optimization: Metamodeling in Multidisciplinary Design Optimization: How Far Have We Really Come?

Felipe A. C. Viana; Timothy W. Simpson; Vladimir Balabanov; Vassili Toropov

The use of metamodeling techniques in the design and analysis of computer experiments has progressed remarkably in the past 25 years, but how far has the field really come? This is the question addressed in this paper, namely, the extent to which the use of metamodeling techniques in multidisciplinary design optimization has evolved in the 25 years since the seminal paper on design and analysis of computer experiments by Sacks et al. ("Design and Analysis of Computer Experiments," Statistical Science, Vol. 4, No. 4, 1989, pp. 409–435). Rather than a technical review of the entire body of metamodeling literature, the focus is on the evolution and motivation for advancements in metamodeling with some discussion on the research itself; not surprisingly, much of the current research motivation is the same as it was in the past. Based on current research thrusts in the field, multifidelity approximations and ensembles (i.e., sets) of metamodels, as well as the availability of metamodels within commercial software, are emphasized. Design space exploration and visualization via metamodels are also presented as they rely heavily on metamodels for rapid design evaluations during exploration. The closing remarks offer insight into future research directions, mostly motivated by the need for new capabilities and the ability to handle more complex simulations.


Journal of Global Optimization | 2013

Efficient global optimization algorithm assisted by multiple surrogate techniques

Felipe A. C. Viana; Raphael T. Haftka; Layne T. Watson

Surrogate-based optimization proceeds in cycles. Each cycle consists of analyzing a number of designs, fitting a surrogate, performing optimization based on the surrogate, and finally analyzing a candidate solution. Algorithms that use the surrogate uncertainty estimator to guide the selection of the next sampling candidate are readily available, e.g., the efficient global optimization (EGO) algorithm. However, adding one single point at a time may not be efficient when the main concern is wall-clock time (rather than number of simulations) and simulations can run in parallel. Also, the need for uncertainty estimates limits EGO-like strategies to surrogates normally implemented with such estimates (e.g., kriging and polynomial response surface). We propose the multiple surrogate efficient global optimization (MSEGO) algorithm, which adds several points per optimization cycle with the help of multiple surrogates. We import uncertainty estimates from one surrogate to another to allow use of surrogates that do not provide them. The approach is tested on three analytic examples for nine basic surrogates including kriging, radial basis neural networks, linear Shepard, and six different instances of support vector regression. We found that MSEGO works well even with imported uncertainty estimates, delivering better results in a fraction of the optimization cycles needed by EGO.
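
To make the cycle structure concrete, here is a minimal sketch of an MSEGO-style loop on a toy 1-D problem, assuming scikit-learn's GaussianProcessRegressor (kriging) and SVR as the surrogate set; the helper `expected_improvement` and all settings are illustrative and not the authors' implementation.

```python
# Illustrative sketch of one MSEGO-style cycle (not the authors' code):
# several surrogates each propose one new sample per cycle, with the
# kriging uncertainty estimate "imported" by surrogates that lack one.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.svm import SVR

def objective(x):                      # toy "expensive simulation"
    return np.sin(3 * x) + 0.5 * x**2

def expected_improvement(y_hat, s, y_best):
    s = np.maximum(s, 1e-12)
    z = (y_best - y_hat) / s
    return s * (z * norm.cdf(z) + norm.pdf(z))

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(6, 1))    # initial design
y = objective(X).ravel()

for cycle in range(5):
    krg = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)
    svr = SVR(C=100.0, gamma=1.0).fit(X, y)

    x_cand = np.linspace(-2, 2, 401).reshape(-1, 1)
    mu_krg, s_krg = krg.predict(x_cand, return_std=True)   # kriging provides s
    mu_svr = svr.predict(x_cand)                            # SVR does not

    y_best = y.min()
    new_pts = []
    for mu in (mu_krg, mu_svr):        # same imported s_krg used for both
        ei = expected_improvement(mu, s_krg, y_best)
        new_pts.append(x_cand[np.argmax(ei)])

    X_new = np.unique(np.vstack(new_pts), axis=0)           # several points per cycle
    X = np.vstack([X, X_new])
    y = np.append(y, objective(X_new).ravel())

print("best design:", X[np.argmin(y)].ravel(), "value:", y.min())
```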


Journal of the Brazilian Society of Mechanical Sciences and Engineering | 2006

Multimodal vibration damping through piezoelectric patches and optimal resonant shunt circuits

Felipe A. C. Viana; Valder Steffen

Piezoelectric elements connected to shunt circuits and bonded to a mechanical structure form a dissipation device that can be designed to add damping to the mechanical system. Due to the piezoelectric effect, part of the vibration energy is transformed into electrical energy that can be conveniently dissipated. Therefore, by using appropriate electrical circuits, it is possible to dissipate strain energy and, as a consequence, vibration is suppressed through the added passive damping. From the electrical point of view, the piezoelectric element behaves like a capacitor in series with a controlled voltage source, and the shunt circuit, commonly formed by an RL network, is tuned to dissipate the electrical energy more efficiently in a given frequency band. Large inductances are frequently required, leading to the use of synthetic inductors (obtained from operational amplifiers). From the mechanical point of view, the vibration energy can be attenuated in a single mode or in multiple modes, according to the design of the damping device and the frequency band of interest. This work is devoted to the study of passive damping systems for single or multiple modes, based on piezoelectric patches and resonant shunt circuits. The present contribution discusses the modeling of piezoelectric patches coupled to shunt circuits, where the basics of resonant shunt circuits (series and parallel topologies) are presented. Next, the devices used in passive control (piezoelectric patches and synthetic inductors) are analyzed from the electrical and experimental viewpoints. The modeling of multi-degree-of-freedom mechanical systems, including the effects of the passive damping devices, is revisited, and a design methodology for the multimodal case is then defined. The optimization method used for design purposes, namely the LifeCycle Model, is also briefly reviewed. Finally, experimental results are reported, illustrating the success of the presented methodology in passive damping applications for mechanical and mechatronic systems.
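
As a rough numerical illustration of the series resonant shunt described above (a simplified single-mode tuning, not the paper's multimodal design methodology or its LifeCycle optimization), the shunt inductance can be sized so that the electrical resonance matches the targeted structural mode; the capacitance, frequency, and damping values below are made up.

```python
# Back-of-the-envelope tuning of a series RL shunt for one structural mode.
# The electrical resonance 1/sqrt(L*Cp) is matched to the mechanical mode;
# all numbers are hypothetical, for illustration only.
import math

f_mode = 120.0          # targeted structural mode, Hz (hypothetical)
C_p = 50e-9             # piezoelectric patch capacitance, F (hypothetical)
zeta_target = 0.05      # desired electrical damping ratio (hypothetical)

omega = 2 * math.pi * f_mode
L = 1.0 / (omega**2 * C_p)                 # match electrical and mechanical resonance
R = 2 * zeta_target * math.sqrt(L / C_p)   # series RLC: zeta = (R/2) * sqrt(C/L)

print(f"L = {L:.1f} H  (large value -> synthetic inductor needed)")
print(f"R = {R / 1e3:.1f} kOhm")
```

The resulting inductance (tens of henries for this patch capacitance) illustrates why the abstract points to synthetic inductors built from operational amplifiers.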


design automation conference | 2010

Making the Most Out of Surrogate Models: Tricks of the Trade

Felipe A. C. Viana; Christian Gogu; Raphael T. Haftka

Design analysis and optimization based on high-fidelity computer experiments is commonly expensive. Surrogate modeling is often the tool of choice for reducing the computational burden. However, even after years of intensive research, surrogate modeling still involves a struggle to achieve maximum accuracy within limited resources. This work summarizes advanced yet simple statistical tools that help. We focus on four techniques with increasing popularity in the design automation community: (i) screening and variable reduction in both the input and the output spaces, (ii) simultaneous use of multiple surrogates, (iii) sequential sampling and optimization, and (iv) conservative estimators.
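
One simple trick in the spirit of item (ii) is to fit several surrogates and rank them by cross-validation error before committing to one; the sketch below uses scikit-learn estimators as stand-ins and is not tied to the paper's test problems.

```python
# Rank several candidate surrogates by leave-one-out cross-validation RMSE
# (a PRESS-style criterion) on a toy data set.
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(20, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2          # toy response

surrogates = {
    "kriging": GaussianProcessRegressor(kernel=RBF(), normalize_y=True),
    "svr": SVR(C=100.0, gamma=1.0),
    "knn": KNeighborsRegressor(n_neighbors=3),
}
for name, model in surrogates.items():
    scores = cross_val_score(model, X, y, cv=LeaveOneOut(),
                             scoring="neg_mean_squared_error")
    print(f"{name:8s} LOO RMSE = {np.sqrt(-scores.mean()):.3f}")
```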


AIAA Journal | 2010

Using Cross Validation to Design Conservative Surrogates

Felipe A. C. Viana; Victor Picheny; Raphael T. Haftka

In this work, we use safety margins to conservatively compensate for fitting errors associated with surrogates. We propose the use of cross validation for estimating the safety margin required for a desired level of conservativeness (percentage of safe predictions). The approach was tested on three algebraic examples for two basic surrogates, namely kriging and polynomial response surface. For these examples we found that cross validation is effective for selecting the safety margin. We also applied the approach to the probabilistic design optimization of a composite laminate. This design-under-uncertainty example showed that the approach can be successfully used in engineering applications.
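
A minimal sketch of the idea, under the assumption that the safety margin is taken as a quantile of the cross-validation errors so that the shifted prediction is conservative for the desired fraction of points; the kriging model and data below are illustrative.

```python
# Conservative surrogate via a cross-validation-based safety margin:
# shift predictions by the margin that makes ~95% of the leave-one-out
# predictions conservative (here, "conservative" = not below the true value).
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(30, 2))
y = X[:, 0] ** 2 + np.sin(5 * X[:, 1])          # toy response

# Leave-one-out cross-validation errors e_XV = y_true - y_hat at left-out points.
e_xv = []
for train, test in LeaveOneOut().split(X):
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X[train], y[train])
    e_xv.append(y[test][0] - gp.predict(X[test])[0])
e_xv = np.array(e_xv)

target = 0.95                                   # desired conservativeness
margin = np.quantile(e_xv, target)              # covers ~95% of observed errors

gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)
x_new = np.array([[0.3, 0.7]])
print("plain prediction       :", gp.predict(x_new)[0])
print("conservative prediction:", gp.predict(x_new)[0] + margin)
```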


51st AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, 18th AIAA/ASME/AHS Adaptive Structures Conference | 2010

Why Not Run the Efficient Global Optimization Algorithm with Multiple Surrogates?

Felipe A. C. Viana; Raphael T. Haftka; Layne T. Watson

Surrogate-based optimization has become popular in the design of complex engineering systems. Each optimization cycle consists of analyzing a number of designs, fitting a surrogate, performing optimization based on the surrogate, and finally performing exact simulation at the design obtained by the optimization. Adaptive sampling algorithms that add one point per cycle are readily available in the literature. They use uncertainty estimators to guide the selection of the next sampling point(s). The addition of one point at a time may not be efficient when it is possible to run simulations in parallel. So we propose an algorithm for adding several points per optimization cycle based on the simultaneous use of multiple surrogates. The need for uncertainty estimates usually limits adaptive sampling algorithms to surrogates such as kriging and polynomial response surface because of the lack of uncertainty estimates in the implementation of other surrogates. We import uncertainty estimates from surrogates having such estimates to use with other surrogates such as support vector regression models. The approach was tested on two analytic examples for nine basic surrogates including kriging, radial basis neural networks, linear Shepard and support vector regression. For these examples we compare our approach with traditional sequential optimization based on kriging. We found that our approach was able to deliver better results in a fraction of the optimization cycles needed by the traditional kriging implementation.


Journal of Global Optimization | 2009

Optimization of aircraft structural components by using nature-inspired algorithms and multi-fidelity approximations

Felipe A. C. Viana; Valder Steffen; Sergio Butkewitsch; Marcus de Freitas Leal

In this work, a flat pressure bulkhead reinforced by an array of beams is designed using a suite of heuristic optimization methods (Ant Colony Optimization, Genetic Algorithms, Particle Swarm Optimization and LifeCycle Optimization), and the Nelder-Mead simplex direct search method. The compromise between numerical performance and computational cost is addressed, calling for inexpensive, yet accurate analysis procedures. At this point, variable fidelity is proposed as a tradeoff solution. The difference between the low-fidelity and high-fidelity models at several points is used to fit a surrogate that corrects the low-fidelity model at other points. This allows faster linear analyses during the optimization, whilst a reduced set of expensive non-linear analyses is run "off-line," enhancing the linear results according to the physics of the structure. Numerical results report the success of the proposed methodology when applied to aircraft structural components. The main conclusions of the work are (i) the variable fidelity approach enabled the use of computationally intensive heuristic optimization techniques; and (ii) this framework succeeded in exploring the design space, providing good initial designs for classical optimization techniques. The final design is obtained by validating the candidate solutions issued from both heuristic and classical optimization; the best design can then be chosen by direct comparison of the high-fidelity responses.
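
A minimal sketch of the additive multi-fidelity correction described above, with toy functions standing in for the cheap linear and expensive nonlinear analyses; all names and values are illustrative.

```python
# Variable-fidelity additive correction: fit a surrogate to the difference
# between high- and low-fidelity responses at a few points, then use
# low_fidelity(x) + correction(x) everywhere else.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def low_fidelity(x):                    # cheap "linear" analysis (toy stand-in)
    return 0.5 * x**2

def high_fidelity(x):                   # expensive "nonlinear" analysis (toy stand-in)
    return 0.5 * x**2 + 0.3 * np.sin(4 * x)

X_hf = np.linspace(-2, 2, 7).reshape(-1, 1)           # few expensive runs, done off-line
delta = high_fidelity(X_hf).ravel() - low_fidelity(X_hf).ravel()
correction = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X_hf, delta)

def corrected(x):                        # cheap model enhanced by the correction surrogate
    x = np.atleast_2d(x)
    return low_fidelity(x).ravel() + correction.predict(x)

x_test = np.array([[0.6]])
print("low fidelity :", low_fidelity(x_test).ravel()[0])
print("corrected    :", corrected(x_test)[0])
print("high fidelity:", high_fidelity(x_test).ravel()[0])
```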


13th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference | 2010

Surrogate-based Optimization with Parallel Simulations using the Probability of Improvement

Felipe A. C. Viana; Raphael T. Haftka

In surrogate-based optimization, each cycle consists of carrying out a number of simulations, fitting a surrogate, performing optimization based on the surrogate, and finally running exact simulations at the candidate solutions. Adaptive sampling algorithms that add one point per cycle are readily available. For example, the efficient global optimization (EGO) algorithm uses the kriging prediction and uncertainty estimate to guide the selection of the next sampling point. However, the addition of one point at a time may not be efficient when it is possible to run simulations in parallel. Additionally, existing extensions to multiple points per cycle turn out to be either limited or computationally challenging. We propose an algorithm for adding several points per optimization cycle based on an approximate computation of the probability of improvement, under the assumption that the probabilities at different points of the design space are independent of each other. The approach was tested on three analytic examples, on which we compare it with traditional sequential optimization based on kriging. We found that our approach was indeed able to deliver better results in a fraction of the optimization cycles needed by the traditional kriging implementation.
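
A minimal sketch of the independence approximation: per-point probabilities of improvement from a kriging model are combined as if independent to score a batch of candidates. The top-k batch selection below is an illustrative simplification, not necessarily the authors' exact scheme.

```python
# Probability of improvement (PI) from a kriging model, and the independence
# approximation for a batch: P(at least one point improves) ~= 1 - prod(1 - PI_i).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def objective(x):                       # toy "expensive simulation"
    return np.sin(3 * x) + 0.5 * x**2

rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(8, 1))
y = objective(X).ravel()

krg = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)
x_cand = np.linspace(-2, 2, 401).reshape(-1, 1)
mu, s = krg.predict(x_cand, return_std=True)

y_target = y.min() - 0.05                       # improve on the best observed by a margin
pi = norm.cdf((y_target - mu) / np.maximum(s, 1e-12))

# Pick the k candidates with the largest PI (illustrative; a real scheme would
# spread the batch out), then report the approximate joint probability.
k = 4
idx = np.argsort(pi)[-k:]
p_joint = 1.0 - np.prod(1.0 - pi[idx])
print("batch to simulate in parallel:", np.round(x_cand[idx, 0], 3))
print("approx. P(at least one point improves):", round(p_joint, 3))
```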


AIAA Journal | 2009

Cross Validation Can Estimate How Well Prediction Variance Correlates with Error

Felipe A. C. Viana; Raphael T. Haftka

The use of surrogates for facilitating optimization and statistical analysis of computationally expensive simulations has become commonplace [1–4]. They offer easy-to-compute predictions and in some cases (such as kriging [5,6] and polynomial response surfaces [7,8]) they also furnish the prediction variance as a measure of uncertainty [9]. Figure 1a illustrates the concepts of prediction and prediction variance. Adaptive sampling and optimization methods use the prediction variance to select the next sampling point. For example, the Efficient Global Optimization (EGO) [10] and the Enhanced Sequential Optimization [11] algorithms use the kriging prediction variance to seek the point of maximum expected improvement as the next simulation for the optimization process. For such methods it is important to assess the accuracy of the prediction variance, but presently this is not available (although there is work on how to improve the uncertainty structure [12]). Cross validation is a standard tool for estimating the mean square errors (see the Appendix), and thus the quality of the fit; it can also be used for selecting surrogates in a set [13–15]. Cross validation divides a set of p data points into k subsets. The surrogate is fit to all subsets except one, and the error is checked in the subset that was left out. This process is repeated for all subsets to produce a vector of cross-validation errors, e_XV. Figure 1b illustrates cross validation when only one point is omitted. We propose using cross validation for estimating the correlation between the prediction variance and the errors. Specifically, we propose to use the correlation between the absolute values of the cross-validation errors and the square root of the prediction variance at the points that were left out.
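
A minimal sketch of the proposed diagnostic on a toy data set: leave one point out at a time, record both the absolute cross-validation error and the kriging standard error at that point, and correlate the two; the modeling choices below are illustrative.

```python
# Estimate how well the kriging prediction variance tracks actual errors:
# correlate |e_XV| with the standard error s(x) at the left-out points.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(4)
X = rng.uniform(-2, 2, size=(25, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2          # toy response

abs_err, std_err = [], []
for train, test in LeaveOneOut().split(X):
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X[train], y[train])
    mu, s = gp.predict(X[test], return_std=True)
    abs_err.append(abs(y[test][0] - mu[0]))
    std_err.append(s[0])

r = np.corrcoef(abs_err, std_err)[0, 1]
print(f"correlation between |e_XV| and sqrt(prediction variance): {r:.2f}")
```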

Collaboration


Dive into Felipe A. C. Viana's collaboration.

Top Co-Authors

Valder Steffen
Federal University of Uberlandia

Domingos Alves Rade
Federal University of Uberlandia

Timothy W. Simpson
Pennsylvania State University