Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Gene Wiggs is active.

Publication


Featured research published by Gene Wiggs.


12th AIAA Aviation Technology, Integration, and Operations (ATIO) Conference and 14th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference | 2012

Multimodal Particle Swarm Optimization: Enhancements and Applications

Gulshan Singh; Felipe A. C. Viana; Arun K. Subramaniyan; Liping Wang; Douglas Decesare; Genghis Khan; Gene Wiggs

Optimization problems with multiple optima (local or global) are called multimodal problems. Many real-world problems are multimodal and, compared to unimodal problems, are challenging to solve because of the possibility of premature convergence to a local optimum and the potentially higher number of function evaluations required. A multimodal optimization algorithm provides multiple solutions and thus a better understanding of the design space at minimal additional computational cost. The goal of the multimodal particle swarm optimizer (MPSO) is to converge to multiple local and global optima with a reasonable number of function evaluations. The modifications to PSO include a reduction in the personal best weight and an additional step that replaces the global best with a group best in the PSO procedure. The modifications allow only a user-defined number of particles (m) to converge to a solution and relocate particles if more than m converge to the same solution. The relocation is active or inactive based on a predefined set of rules. MPSO is demonstrated on several optimization problems, including benchmark problems from the literature, spring design, and sequential sampling.
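
Below is a minimal NumPy sketch of a niching particle-swarm optimizer in the spirit of the modifications described in this abstract: a reduced personal-best weight, a per-niche "group best" replacing the single global best, and relocation of particles once more than m of them crowd one solution. It is not the authors' MPSO implementation; the weights, niche radius, and relocation rule below are illustrative assumptions.

```python
import numpy as np

def multimodal_pso(f, bounds, n_particles=40, iters=200,
                   w=0.7, c_personal=0.5, c_group=1.5,
                   niche_radius=0.5, max_per_niche=3, seed=0):
    """Minimize f over box `bounds`; returns all personal bests found."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])

    for _ in range(iters):
        # "Group best": the best personal best among particles within the
        # niche radius, used in place of a single global best.
        dist = np.linalg.norm(pbest[:, None, :] - pbest[None, :, :], axis=-1)
        group_best = np.empty_like(pbest)
        for i in range(n_particles):
            idx = np.flatnonzero(dist[i] < niche_radius)
            group_best[i] = pbest[idx[np.argmin(pbest_val[idx])]]

        # Standard velocity/position update with a reduced personal-best weight.
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c_personal * r1 * (pbest - x) + c_group * r2 * (group_best - x)
        x = np.clip(x + v, lo, hi)

        val = np.array([f(p) for p in x])
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]

        # Relocation rule: if more than `max_per_niche` particles crowd one
        # solution, reinitialize the worst of them elsewhere in the domain.
        dist = np.linalg.norm(pbest[:, None, :] - pbest[None, :, :], axis=-1)
        for i in np.argsort(pbest_val):
            idx = np.flatnonzero(dist[i] < niche_radius)
            if idx.size > max_per_niche:
                worst = idx[np.argsort(pbest_val[idx])][max_per_niche:]
                x[worst] = rng.uniform(lo, hi, size=(worst.size, dim))
                v[worst] = 0.0
                pbest[worst] = x[worst]
                pbest_val[worst] = np.array([f(p) for p in x[worst]])

    return pbest, pbest_val
```

A call such as multimodal_pso(lambda p: np.sum(np.sin(5.0 * p) ** 2), [(-1.0, 1.0), (-1.0, 1.0)]) should return personal bests spread over several of that function's distinct minima rather than a single point.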


Volume 5: Marine; Microturbines and Small Turbomachinery; Oil and Gas Applications; Structures and Dynamics, Parts A and B | 2006

DACE Based Probabilistic Optimization of Mechanical Components

Vinay Ramanath; Gene Wiggs

Application of DACE (Design and Analysis of Computer Experiments) methods for probabilistic design space exploration and optimization is demonstrated on the design of a mechanical component. The key part of the paper is focused on the problem formulation and process flow for performing a probabilistic optimization. The authors show that for computationally intensive problems, probabilistic optimization can be carried out efficiently within a DACE framework. For problems that are not costly to compute, direct probabilistic optimization can be carried out by the efficient integration of probabilistic analysis and global optimization (such as genetic algorithms). The strategy in the paper proves especially beneficial for organizations that are reluctant to move to probabilistic methods as well as for current practitioners of probabilistic design. The methodology is illustrated with examples from both simple and computationally intensive engineering problems.
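
As a rough illustration of a DACE-style workflow (space-filling design, surrogate fit, then probabilistic optimization on the surrogate), here is a short Python sketch. The toy response function, the input scatter, and the mean-plus-two-sigma criterion are assumptions for illustration, not the component model or criterion used in the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_model(x):                      # stand-in for the FE analysis
    return np.sin(3.0 * x[0]) + 0.5 * (x[1] - 0.3) ** 2

# 1) Design of computer experiments: space-filling sample of the design space.
sampler = qmc.LatinHypercube(d=2, seed=1)
X_train = qmc.scale(sampler.random(30), l_bounds=[-1, -1], u_bounds=[1, 1])
y_train = np.array([expensive_model(x) for x in X_train])

# 2) DACE surrogate (Gaussian process / Kriging).
gp = GaussianProcessRegressor(ConstantKernel() * RBF(length_scale=0.5),
                              normalize_y=True).fit(X_train, y_train)

# 3) Probabilistic criterion (mean + 2*std of the response under input
#    scatter), estimated by Monte Carlo on the cheap surrogate.
rng = np.random.default_rng(0)
base_noise = rng.standard_normal((2000, 2))   # common random numbers keep the objective smooth

def prob_objective(x_nominal, input_sigma=0.05):
    X_mc = x_nominal + input_sigma * base_noise
    y_mc = gp.predict(X_mc)
    return y_mc.mean() + 2.0 * y_mc.std()

result = minimize(prob_objective, x0=np.zeros(2), bounds=[(-1, 1), (-1, 1)])
print("robust optimum (on the surrogate):", result.x)
```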


ASME Turbo Expo 2013: Turbine Technical Conference and Exposition | 2013

Calibrating Transient Models With Multiple Responses Using Bayesian Inverse Techniques

Natarajan Chennimalai Kumar; Arun K. Subramaniyan; Liping Wang; Gene Wiggs

Several engineering applications of high interest to turbomachinery involve transient models with multiple outputs. Thus, the ability to calibrate transient models with multiple correlated outputs is critical for enabling predictive models for the design and analysis of turbomachinery. When the number of calibration parameters becomes large and knowledge about those parameters is limited (large uncertainty), traditional deterministic methods such as least squares do not yield reasonable parameter estimates. We employ the Bayesian calibration framework proposed by Kennedy and O'Hagan [1] to calibrate industrial-scale transient problems. The focus of this article is on Bayesian calibration of models with multiple transient outputs. The methodology is demonstrated on two problems with transient outputs, and the advantages of using a Bayesian framework are highlighted. Specific challenges related to Bayesian calibration of transient responses are discussed along with potential solutions.
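
To make the workflow concrete, the following is a heavily simplified sketch of Bayesian calibration for a transient model with two outputs, using a plain random-walk Metropolis sampler. It omits the emulator and model-discrepancy term of the full Kennedy and O'Hagan framework; the toy simulator, the uniform priors, and the noise level are illustrative assumptions.

```python
import numpy as np

t = np.linspace(0.0, 5.0, 50)                      # transient time grid

def simulator(theta):
    """Toy transient model with two correlated outputs (e.g. temperature, stress)."""
    k, amp = theta
    y1 = amp * (1.0 - np.exp(-k * t))              # first transient response
    y2 = 0.5 * amp * np.exp(-k * t) + 0.1 * y1     # second, correlated response
    return np.stack([y1, y2])                       # shape (2, len(t))

# Synthetic "measurements" from an assumed true parameter set plus noise.
rng = np.random.default_rng(0)
theta_true = np.array([1.3, 2.0])
sigma_obs = 0.05
y_obs = simulator(theta_true) + sigma_obs * rng.standard_normal((2, t.size))

def log_posterior(theta):
    if np.any(theta <= 0.0) or np.any(theta > 10.0):   # uniform prior on (0, 10]^2
        return -np.inf
    resid = simulator(theta) - y_obs
    return -0.5 * np.sum(resid ** 2) / sigma_obs ** 2  # Gaussian likelihood, both outputs

# Random-walk Metropolis sampling of the posterior.
n_steps, step = 20000, 0.05
samples = np.empty((n_steps, 2))
theta = np.array([0.5, 1.0])
lp = log_posterior(theta)
for i in range(n_steps):
    prop = theta + step * rng.standard_normal(2)
    lp_prop = log_posterior(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples[i] = theta

burn = n_steps // 2
print("posterior mean:", samples[burn:].mean(axis=0))
print("posterior std: ", samples[burn:].std(axis=0))
```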


ASME 2011 Turbo Expo: Turbine Technical Conference and Exposition | 2011

Challenges in Uncertainty, Calibration, Validation and Predictability of Engineering Analysis Models

Liping Wang; Xingjie Fang; Arun K. Subramaniyan; Giridhar Jothiprasad; Martha Gardner; Amit Kale; Srikanth Akkaram; Don Beeson; Gene Wiggs; John Nelson

Model calibration, validation, prediction, and uncertainty quantification have progressed remarkably in the past decade. However, many issues remain. This paper attempts to answer three key questions: 1) How far have we gone? 2) What technical challenges remain? 3) What are the future directions for this work? Based on a comprehensive literature review of academic, industrial, and government research and on experience gained at the General Electric (GE) Company, the paper summarizes the advancement of methods and their application to calibration, validation, prediction, and uncertainty quantification. The latest research and application thrusts in the field emphasize the extension of the Bayesian framework to the validation of engineering analysis models. Closing remarks offer insight into possible technical solutions to the challenges and future research directions.


ASME Turbo Expo 2007: Power for Land, Sea, and Air | 2007

Design for Six Sigma: The First 10 Years

Martha Gardner; Gene Wiggs

Six Sigma was launched at GE in 1995 by Jack Welch as a systematic way of improving the quality of delivered products and reducing cost across the entire corporation. Soon after the first wave of Master Black Belts returned from their initial training, it was obvious that GE needed a "version" of Six Sigma adapted for the Design Engineering community, one focused on achieving specific goals of improved product performance, reliability, and producibility while simultaneously reducing the design cycle time for new products. The purpose of this paper is to share our lessons learned in adapting Six Sigma to the needs of the Design Engineering community.


design automation conference | 2008

A Practical Robust and Efficient RBF Metamodel Method for Typical Engineering Problems

Xingjie Fang; Liping Wang; Don Beeson; Gene Wiggs

Radial Basis Function (RBF) metamodels have recently attracted increased interest due to their significant advantages over other types of non-parametric metamodels. However, because of the interpolating nature of the RBF mathematics, the accuracy of the model may deteriorate dramatically if the training data set contains duplicate information, noise, or outliers. Constructing the metamodel may also be time consuming when the training data sets are large or a high-dimensional model is required. In this paper, we propose a robust and efficient RBF metamodeling approach based on data pre-processing techniques that alleviate the accuracy and efficiency issues commonly encountered when RBF models are used in typical real engineering situations. These techniques include 1) the removal of duplicate training data, 2) the generation of smaller, uniformly distributed subsets of training data from large data sets, and 3) the quantification and identification of outliers by principal component analysis (PCA) and Hotelling statistics. Simulation results are used to validate the generalization accuracy and efficiency of the proposed approach.
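
The pre-processing pipeline described above might be sketched as follows: drop duplicate rows, thin very large data sets, flag outliers with PCA and Hotelling's T-squared, and then fit the RBF metamodel. The rounding tolerance, subset size, number of principal components, and T-squared limit are illustrative assumptions rather than the paper's settings.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.stats import f as f_dist

def preprocess_and_fit(X, y, max_points=500, n_components=2, alpha=0.99, seed=0):
    X, y = np.asarray(X, float), np.asarray(y, float)

    # 1) Remove duplicate training points (duplicates break RBF interpolation).
    X, keep = np.unique(np.round(X, 10), axis=0, return_index=True)
    y = y[keep]

    # 2) Thin very large data sets with a uniform random subset.
    if X.shape[0] > max_points:
        idx = np.random.default_rng(seed).choice(X.shape[0], max_points, replace=False)
        X, y = X[idx], y[idx]

    # 3) Outlier screening via PCA scores and Hotelling's T^2.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T
    eigvals = (s[:n_components] ** 2) / (X.shape[0] - 1)
    t2 = np.sum(scores ** 2 / eigvals, axis=1)
    n, k = X.shape[0], n_components
    limit = k * (n - 1) / (n - k) * f_dist.ppf(alpha, k, n - k)
    inliers = t2 <= limit
    X, y = X[inliers], y[inliers]

    # 4) Fit the RBF metamodel on the cleaned data.
    return RBFInterpolator(X, y, kernel="thin_plate_spline")
```

The returned interpolator is callable, so after model = preprocess_and_fit(X_train, y_train), new predictions come from model(X_new).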


Quality Engineering | 2007

Robust, Producible Design Process Evolution

David L. Rumpf; Gene Wiggs; Todd Williams; James Worachek

Until about 10 years ago, design engineering and manufacturing at our company were separate organizations. Design engineers produced part designs for an integrated engine system and expected manufacturing to make and assemble the parts. Tolerance decisions were influenced by the arguing ability of each discipline along with historic precedent. Most form-fit-function characteristics were about 95% producible, i.e., two-sigma designs. Nonconformance control by a Material Review Board (MRB) was used by design engineering to monitor manufacturing quality. Although this process demonstrated the ability to produce excellent engines, it depended on inspecting quality in, multiple rework loops, and resulting high cost. This article discusses the evolving process used to design engines that are producible and error-proofed. Discussion includes the organizational structure supporting the needed culture change, the Six Sigma impact of common terminology and data-driven decisions, the structured approach using manufacturing process capability data to facilitate producibility, the use of assembly defect and customer escape data to drive error-proofing early in the design process, and the focus on standardized notes and automated characteristic accountability for error prevention. Examples demonstrate the significant improvements in quality and producibility accomplished by the new process.


Volume 5: Marine; Microturbines and Small Turbomachinery; Oil and Gas Applications; Structures and Dynamics, Parts A and B | 2006

Analytical Derivatives Technology for Parametric Shape Design and Analysis in Structural Applications

Srikanth Akkaram; Jean-Daniel Beley; Bob Maffeo; Gene Wiggs

The ability to perform and evaluate the effect of shape changes on the stress, modal, and thermal response of components is an important ingredient in the design of aircraft engine components. The classical design of experiments (DOE) approach, motivated by statistics (for physical experiments), is one possible approach for evaluating the component response with respect to design parameters [1]. Since the underlying physical model of the component response is deterministic and understood through a computer simulation model, one needs to rethink the use of classical DOE techniques for this class of problems. In this paper, we explore an alternative sensitivity-analysis-based technique in which a deterministic parametric response is constructed using exact derivatives of the complex finite-element (FE) based computer models with respect to the design parameters. The method is based on a discrete sensitivity analysis formulation using semi-automatic differentiation [2,3] to compute the Taylor series, or its Padé equivalent, for finite-element-based responses. Shape design or optimization in the context of finite element modeling is challenging because evaluating the response for a different shape requires a mesh consistent with the new geometry. This paper examines the differences in the nature and performance (accuracy and efficiency) of the analytical derivatives approach against other existing approaches, with validation on several benchmark structural applications. The use of analytical derivatives for parametric analysis is demonstrated to have accuracy benefits for certain classes of shape applications.
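
As a toy illustration of building a parametric response from derivatives at a nominal design, the sketch below forms a second-order Taylor expansion of a closed-form stand-in for the finite-element response. Central finite differences replace the paper's semi-automatic differentiation, and no Padé conversion is shown; the response function and nominal design are assumptions.

```python
import numpy as np

def response(p):
    """Toy 'FE' response (e.g. peak stress) as a function of two shape parameters."""
    return np.exp(-p[0]) * (1.0 + p[1] ** 2) + 0.3 * p[0] * p[1]

def taylor_surrogate(func, p0, h=1e-4):
    """Second-order Taylor expansion of func about the nominal design p0."""
    p0 = np.asarray(p0, float)
    n = p0.size
    f0 = func(p0)
    grad = np.zeros(n)
    hess = np.zeros((n, n))
    eye = np.eye(n)
    for i in range(n):
        # Central difference for the gradient.
        grad[i] = (func(p0 + h * eye[i]) - func(p0 - h * eye[i])) / (2 * h)
        for j in range(n):
            # Central difference for the Hessian.
            hess[i, j] = (func(p0 + h * (eye[i] + eye[j]))
                          - func(p0 + h * (eye[i] - eye[j]))
                          - func(p0 - h * (eye[i] - eye[j]))
                          + func(p0 - h * (eye[i] + eye[j]))) / (4 * h * h)

    def surrogate(p):
        dp = np.asarray(p, float) - p0
        return f0 + grad @ dp + 0.5 * dp @ hess @ dp
    return surrogate

p0 = np.array([0.5, 0.2])
approx = taylor_surrogate(response, p0)
p_new = p0 + np.array([0.1, -0.05])
print("Taylor estimate:", approx(p_new), " exact:", response(p_new))
```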


ASME Turbo Expo 2013: Turbine Technical Conference and Exposition | 2013

Design for Six Sigma: The First 15 Years

Martha Gardner; Gene Wiggs

The original MAIC version of Six Sigma was launched at GE in 1995. Within a couple of years of the launch, it was recognized that the engineering design community needed methods and tools focused on achieving performance goals, reliability, and producibility while meeting cost targets and reducing cycle time. Design for Six Sigma was born out of this need and has evolved into a company-wide approach integrated into many engineering design processes. The initiative has continued to grow over the last 15 years. This paper is an update to a paper the authors published in 2007; many new lessons learned from the subsequent five years are integrated into that prior paper (Ref. 1).


ASME Turbo Expo 2007: Power for Land, Sea, and Air | 2007

Pseudo-ARMA Model for Meta-Modeling Extrapolation

Huageng Luo; Liping Wang; Don Beeson; Gene Wiggs

In spite of the exponential growth in computing power, the enormous computational cost of complex and large-scale engineering design problems makes it impractical to rely exclusively on the original high-fidelity simulation codes. There has therefore been increasing interest in the use of fast-executing meta-models to alleviate the computational cost of slow and expensive simulation models, especially for optimization and probabilistic design. However, many state-of-the-art meta-modeling techniques, such as Radial Basis Functions (RBF), Gaussian Processes (GP), and Kriging, can only make good predictions in the case of interpolation. Their extrapolation ability is limited since the models are mathematically constructed for interpolation. Although Multivariate Adaptive Regression Splines (MARS) and Artificial Neural Networks (ANN) have been tried for extrapolation problems (forecasting), the results do not always meet accuracy requirements. The autoregressive moving-average (ARMA) model is a popular time series modeling and forecasting tool, widely used in engineering applications in which all the inputs and outputs are time dependent. Many researchers have tried to extend the time series ARMA modeling technique into so-called spatial ARMA or time-space ARMA modeling. However, time-space ARMA modeling requires extensive computation for grid data generation as well as for model building, particularly for high-dimensional problems. In this paper, a pseudo-ARMA approach is proposed to strengthen the extrapolation capability of meta-models. Each input is randomly sampled at a given mean value and distribution range to form a pseudo time series. The output variables are evaluated from the input variables, which forms the output-variable pseudo time series. The pseudo-ARMA model is built from the pseudo input and output time series. Using the constructed pseudo-ARMA model and new input variables generated with extended distribution parameters, such as distribution means and distribution ranges, the output variables can be evaluated to achieve extrapolation. Several numerical examples are presented to demonstrate the proposed approach, and the results are compared with Radial Basis Function (RBF) meta-modeling results for both interpolation and extrapolation.
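
The sketch below illustrates the pseudo-time-series mechanics only, using a simple least-squares ARX model (autoregression on the output plus exogenous input terms) as a stand-in for the paper's pseudo-ARMA model. The test function, lag orders, and sampling ranges are illustrative assumptions, and a linear ARX like this can only capture a linear trend when extrapolating; it is not a reproduction of the paper's results.

```python
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.sin(2.0 * x) + 0.3 * x            # "simulation" to be meta-modeled

# 1) Pseudo time series: random input samples around a mean with a given
#    range, and the corresponding outputs.
x_train = rng.uniform(1.0 - 0.5, 1.0 + 0.5, size=400)   # mean 1.0, range +/-0.5
y_train = f(x_train)

# 2) Fit y_t = a1*y_{t-1} + a2*y_{t-2} + b0*x_t + b1*x_{t-1} + c by least squares.
p, q = 2, 2
rows = []
for t in range(p, x_train.size):
    rows.append(np.r_[y_train[t - p:t][::-1], x_train[t - q + 1:t + 1][::-1], 1.0])
A = np.array(rows)
coef, *_ = np.linalg.lstsq(A, y_train[p:], rcond=None)
a, b, c = coef[:p], coef[p:p + q], coef[-1]

# 3) Extrapolate: a new pseudo input series drawn from a *wider* range than
#    the training data, predicted recursively from the fitted pseudo-ARX model.
x_new = rng.uniform(1.0 - 1.0, 1.0 + 1.0, size=200)      # extended range
y_pred = list(y_train[-p:])                              # seed the recursion
for t in range(x_new.size):
    lag_y = np.array(y_pred[-p:][::-1])
    lag_x = np.array([x_new[max(t - j, 0)] for j in range(q)])
    y_pred.append(a @ lag_y + b @ lag_x + c)
y_pred = np.array(y_pred[p:])

print("max abs extrapolation error:", np.max(np.abs(y_pred - f(x_new))))
```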

Collaboration


Dive into Gene Wiggs's collaborations.
