
Publication


Featured research published by Timothy G. Trucano.


Progress in Aerospace Sciences | 2002

VERIFICATION AND VALIDATION IN COMPUTATIONAL FLUID DYNAMICS

William L. Oberkampf; Timothy G. Trucano

Abstract Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different from traditional experiments and testing. A description is given of a relatively new procedure for estimating experimental uncertainty that has proven more effective at estimating random and correlated bias errors in wind-tunnel experiments than traditional methods. Consistent with the authors’ contention that nondeterministic simulations are needed in many validation comparisons, a three-step statistical approach is offered for incorporating experimental uncertainties into the computational analysis. The discussion of validation assessment ends with the topic of validation metrics, where two sample problems are used to demonstrate how such metrics should be constructed. In the spirit of advancing the state of the art in V&V, the paper concludes with recommendations of topics for future research and with suggestions for needed changes in the implementation of V&V in production and commercial software.
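
As an illustrative aside (not part of the paper), the idea of measuring a computational solution's accuracy against an analytical solution is commonly exercised through a grid-convergence study; the minimal sketch below, with made-up error values, estimates the observed order of accuracy from errors on successively refined grids.

```python
# Minimal sketch (not from the paper): estimating the observed order of
# accuracy of a discretization from errors measured against an analytical
# solution on successively refined grids -- a basic code-verification check.
import math

def observed_order(err_coarse: float, err_fine: float, refinement_ratio: float) -> float:
    """Observed convergence order p from errors on two grids, assuming e ~ C*h^p."""
    return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

# Hypothetical discretization errors (e.g., L2 norm of u_numerical - u_exact)
# on grids refined by a factor of 2; the values are made up for illustration.
errors = {0.10: 4.1e-3, 0.05: 1.0e-3, 0.025: 2.6e-4}
spacings = sorted(errors, reverse=True)
for h_coarse, h_fine in zip(spacings, spacings[1:]):
    p = observed_order(errors[h_coarse], errors[h_fine], h_coarse / h_fine)
    print(f"h: {h_coarse} -> {h_fine}  observed order ~ {p:.2f}")
```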


Applied Mechanics Reviews | 2004

Verification, Validation, and Predictive Capability in Computational Engineering and Physics

William L. Oberkampf; Timothy G. Trucano; Charles Hirsch

Developers of computer codes, analysts who use the codes, and decision makers who rely on the results of the analyses face a critical question: How should confidence in modeling and simulation be critically assessed? Verification and validation (V&V) of computational simulations are the primary methods for building and quantifying this confidence. Briefly, verification is the assessment of the accuracy of the solution to a computational model. Validation is the assessment of the accuracy of a computational simulation by comparison with experimental data. In verification, the relationship of the simulation to the real world is not an issue. In validation, the relationship between computation and the real world, i.e., experimental data, is the issue.


Reliability Engineering & System Safety | 2006

Calibration, validation, and sensitivity analysis: What's what

Timothy G. Trucano; Laura Painton Swiler; Takeru Igusa; William L. Oberkampf; Martin Pilch

Abstract One very simple interpretation of calibration is to adjust a set of parameters associated with a computational science and engineering code so that the model agreement is maximized with respect to a set of experimental data. One very simple interpretation of validation is to quantify our belief in the predictive capability of a computational code through comparison with a set of experimental data. Uncertainty in both the data and the code are important and must be mathematically understood to correctly perform both calibration and validation. Sensitivity analysis, being an important methodology in uncertainty analysis, is thus important to both calibration and validation. In this paper, we intend to clarify the language just used and express some opinions on the associated issues. We will endeavor to identify some technical challenges that must be resolved for successful validation of a predictive modeling capability. One of these challenges is a formal description of a “model discrepancy” term. Another challenge revolves around the general adaptation of abstract learning theory as a formalism that potentially encompasses both calibration and validation in the face of model uncertainty.
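
As a minimal, non-authoritative sketch of the vocabulary above (the model, data, and numbers are invented for illustration): calibration as adjusting a parameter to minimize misfit with experimental data, and sensitivity as the derivative of a model output with respect to that parameter.

```python
# Minimal sketch (illustrative only, not the paper's method): "calibration" as
# adjusting a model parameter to best match experimental data, plus a crude
# finite-difference sensitivity of the model output to that parameter.
import numpy as np

def model(x: np.ndarray, k: float) -> np.ndarray:
    """Hypothetical computational model: exponential decay with rate k."""
    return np.exp(-k * x)

# Hypothetical experimental data (made up for illustration), with noise.
x_data = np.linspace(0.0, 2.0, 9)
y_data = np.exp(-1.3 * x_data) + np.random.default_rng(0).normal(0.0, 0.02, x_data.size)

# Calibration: pick the k that minimizes the sum-of-squares misfit over a grid.
k_grid = np.linspace(0.5, 2.5, 401)
misfit = [np.sum((model(x_data, k) - y_data) ** 2) for k in k_grid]
k_best = k_grid[int(np.argmin(misfit))]

# Sensitivity: central-difference derivative of a scalar output w.r.t. k.
output = lambda k: model(np.array([1.0]), k)[0]   # model prediction at x = 1
dk = 1e-4
sensitivity = (output(k_best + dk) - output(k_best - dk)) / (2 * dk)

print(f"calibrated k ~ {k_best:.3f}, d(output)/dk ~ {sensitivity:.3f}")
```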


AIAA Fluids 2000 Conference, Denver, CO (US), June 19–22, 2000 | 2000

Validation Methodology in Computational Fluid Dynamics

William L. Oberkampf; Timothy G. Trucano

Verification and validation are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in computational validation and develops a number of extensions to existing ideas. We discuss the early work in validation by the operations research, statistics, and CFD communities. The emphasis in our review is to bring together the diverse contributors to validation methodology and procedures. A hypersonic cruise missile is used as an example of how a validation hierarchy is constructed. We present six recommended characteristics of how a validation experiment is designed, executed, and analyzed. Since one of the key features of a validation experiment is a careful experimental uncertainty estimation analysis, we discuss a statistical procedure that has been developed for improving the estimation of experimental uncertainty. One facet of code verification, the estimation of computational error and uncertainty, is discussed in some detail, but we do not address many other important issues in code verification. We argue for the separation of the concepts of error and uncertainty in computational simulations. Error estimation, primarily that due to numerical solution error, is discussed with regard to its importance in validation. In the same vein, we explain the need to move toward nondeterministic simulations in CFD validation, that is, the propagation of input quantity uncertainty in CFD simulations to yield probabilistic output quantities. We discuss the relatively new concept of validation quantification, also referred to as validation metrics. The inadequacy, in our view, of hypothesis testing in computational validation is discussed. We close the paper by presenting our ideas on validation metrics and apply them to two conceptual examples.
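
The following sketch is only an illustration of the nondeterministic-simulation idea described above, with an invented stand-in model, made-up input distributions, and a simple ad hoc comparison measure rather than the validation metrics developed in the paper.

```python
# Minimal sketch (illustrative, not the paper's procedure): propagating input
# uncertainty through a simulation by Monte Carlo sampling, then comparing the
# resulting output distribution with an experimental value and its uncertainty.
import numpy as np

def simulation(drag_coeff, velocity):
    """Stand-in for a CFD code: hypothetical scalar output (e.g., drag force)."""
    rho, area = 1.2, 0.05                      # assumed air density and reference area
    return 0.5 * rho * area * drag_coeff * velocity**2

rng = np.random.default_rng(42)
n = 10_000
# Uncertain inputs, with assumed (made-up) distributions.
cd_samples = rng.normal(0.30, 0.02, n)         # drag coefficient
v_samples = rng.normal(40.0, 1.5, n)           # freestream velocity [m/s]

outputs = simulation(cd_samples, v_samples)    # vectorized Monte Carlo propagation

# Hypothetical experimental measurement and its one-sigma uncertainty.
y_exp, sigma_exp = 14.2, 0.6

# A simple (non-authoritative) comparison: normalized discrepancy between the
# simulation mean and the measurement, pooling both uncertainties.
discrepancy = abs(outputs.mean() - y_exp) / np.hypot(outputs.std(ddof=1), sigma_exp)
print(f"simulation: {outputs.mean():.2f} +/- {outputs.std(ddof=1):.2f}")
print(f"experiment: {y_exp:.2f} +/- {sigma_exp:.2f}, normalized discrepancy: {discrepancy:.2f}")
```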


International Journal of Impact Engineering | 1997

Recent progress in ALEGRA development and application to ballistic impacts

Randall M. Summers; James S. Peery; Michael W. Wong; Eugene S. Hertel; Timothy G. Trucano; Lalit C. Chhabildas

ALEGRA is a multi-material, arbitrary-Lagrangian-Eulerian (ALE) code for solid dynamics being developed by the Computational Physics Research and Development Department at Sandia National Laboratories. It combines the features of modern Eulerian shock codes, such as CTH, with modern Lagrangian structural analysis codes. With the ALE algorithm, the mesh can be stationary (Eulerian) with the material flowing through the mesh, the mesh can move with the material (Lagrangian) so there is no flow between elements, or the mesh motion can be entirely independent of the material motion (Arbitrary). All three mesh types can coexist in the same problem and any mesh may change its type during the calculation. In this paper we summarize several key capabilities that have recently been added to the code or are currently being implemented. As a demonstration of the capabilities of ALEGRA, we have applied it to the experimental data taken by Silsby.
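
A minimal one-dimensional sketch of the three mesh-motion options described above (illustrative only, not ALEGRA's implementation):

```python
# Minimal 1-D sketch (not ALEGRA's implementation) of the mesh-motion choice in
# an ALE scheme: Eulerian nodes stay fixed, Lagrangian nodes follow the material
# velocity, and "arbitrary" nodes move with a user-chosen mesh velocity.
from enum import Enum

class MeshMotion(Enum):
    EULERIAN = "eulerian"
    LAGRANGIAN = "lagrangian"
    ARBITRARY = "arbitrary"

def advance_node(x: float, material_velocity: float, dt: float,
                 motion: MeshMotion, mesh_velocity: float = 0.0) -> float:
    """Return the new node position after one step under the chosen mesh motion."""
    if motion is MeshMotion.EULERIAN:
        w = 0.0                     # mesh is stationary; material flows through it
    elif motion is MeshMotion.LAGRANGIAN:
        w = material_velocity       # mesh rides with the material; no flux between cells
    else:
        w = mesh_velocity           # mesh motion decoupled from the material
    return x + w * dt

# Example: same node, same material velocity, three different mesh treatments.
for motion in MeshMotion:
    print(motion.value, advance_node(x=1.0, material_velocity=0.5, dt=0.1,
                                     motion=motion, mesh_velocity=0.2))
```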


Geophysical Research Letters | 1994

Mass and penetration depth of Shoemaker‐Levy 9 fragments from time‐resolved photometry

Mark B. Boslough; David A. Crawford; Allen C. Robinson; Timothy G. Trucano

Computational simulations of the first 100 seconds of interaction of Shoemaker-Levy 9 fragments with the Jovian atmosphere have revealed a potential method for estimating the masses and penetration depths of the individual objects. For sufficiently large fragments, impact-generated fireballs will rise into line-of-sight over the Jovian limb (less than one minute after impact for a 3-km diameter fragment). It is possible that time-resolved radiometric measurements from Earth- and orbital-based observatories may detect two different arrivals for each impact: first the shock wave and, a few seconds later, a debris front (fireball). Measurements of one or both arrival times with time resolutions of better than one second will provide information that would place strong restrictions on the range of values of equivalent explosive yield (from which fragment mass can be extracted) and effective penetration depth. We believe that time-resolved photometry measurements of impact-induced light emission (impact-flash signatures) will provide the best means by which Shoemaker-Levy 9 fragment masses can be determined if they are greater than about 5×10¹⁵ g (corresponding to a 1-km diameter ice sphere).
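
As a back-of-envelope illustration of the yield-to-mass connection mentioned above (the entry speed is taken from the companion Shock Waves paper below; the yields are assumptions, not values from this paper):

```python
# Back-of-envelope sketch (assumed numbers, not from the paper): if the
# equivalent explosive yield E is taken as the fragment's kinetic energy at the
# entry speed v, the fragment mass follows from E = (1/2) m v^2, i.e. m = 2E/v^2.
JOULES_PER_MEGATON = 4.184e15      # 1 Mt TNT equivalent in joules
ENTRY_SPEED = 60e3                 # m/s, approximate Jovian entry speed

def mass_from_yield(yield_megatons: float, v: float = ENTRY_SPEED) -> float:
    """Fragment mass in grams implied by an equivalent explosive yield."""
    energy = yield_megatons * JOULES_PER_MEGATON
    mass_kg = 2.0 * energy / v**2
    return mass_kg * 1e3           # convert kg to g

for yield_mt in (1e4, 1e5, 1e6):   # hypothetical yields in megatons
    print(f"{yield_mt:.0e} Mt  ->  {mass_from_yield(yield_mt):.1e} g")
```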


Published 1 Mar 2002 | 2002

General Concepts for Experimental Validation of ASCI Code Applications

Timothy G. Trucano; Martin Pilch; William L. Oberkampf

This report presents general concepts in a broadly applicable methodology for validation of Accelerated Strategic Computing Initiative (ASCI) codes for Defense Programs applications at Sandia National Laboratories. The concepts are defined and analyzed within the context of their relative roles in an experimental validation process. Examples of applying the proposed methodology to three existing experimental validation activities are provided in appendices, using an appraisal technique recommended in this report.


International Journal of Impact Engineering | 1990

Debris cloud dynamics

Charles E. Anderson; Timothy G. Trucano; Scott A. Mullin

Abstract The hypervelocity impact of a projectile upon a thin metal plate and subsequent formation of back-surface debris is reviewed. At sufficiently high impact velocities, roughly greater than 3.0 km/s (depending upon the shock impedances of the materials involved), shock formation and interaction dominate and control the overall response of both the projectile and the target plate. We focus upon the importance of shock heating, melting, and vaporization in this application. Because of the complexity of the physical interactions, numerical simulation of such problems is necessary to draw quantitative conclusions. Thus, we also assess the current status of computational modeling of this kind of impact event, specifically addressing recent work bearing on the sensitivity of such modeling to the equations of state and certain numerical issues.
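
As an aside with assumed material constants (not figures from the paper), the role of shock impedance can be seen in a simple planar impedance-match estimate of impact pressure from a linear Us-up Hugoniot:

```python
# Illustrative planar-impact estimate (assumed Hugoniot constants, not values
# from the paper): for a symmetric, like-on-like impact the particle velocity
# behind the shock is half the impact speed, the shock speed follows the linear
# Hugoniot Us = c0 + s*up, and the shock pressure is P = rho0 * Us * up.
def symmetric_impact_pressure(rho0: float, c0: float, s: float, v_impact: float) -> float:
    """Shock pressure in Pa for a like-on-like planar impact at v_impact (m/s)."""
    up = 0.5 * v_impact            # particle velocity (symmetric impact)
    us = c0 + s * up               # shock velocity from the linear Hugoniot
    return rho0 * us * up

# Roughly aluminum-like constants (assumed): rho0 ~ 2700 kg/m^3, c0 ~ 5.35 km/s, s ~ 1.34.
for v in (3.0e3, 6.0e3):           # impact speeds near and above the ~3 km/s threshold
    p = symmetric_impact_pressure(2700.0, 5350.0, 1.34, v)
    print(f"v = {v/1e3:.0f} km/s  ->  P ~ {p/1e9:.0f} GPa")
```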


Published 1 Mar 2002 | 2002

Verification and Validation in Computational Fluid Dynamics

William L. Oberkampf; Timothy G. Trucano

Verification and validation (V and V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V and V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V and V, and develops a number of extensions to existing ideas. The review of the development of V and V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V and V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized.


Shock Waves | 1994

The impact of comet Shoemaker-Levy 9 on Jupiter

David A. Crawford; Mark B. Boslough; Timothy G. Trucano; Allen C. Robinson

We have performed computational shock-physics simulations of the hypervelocity (60 km/s) impact of 1–3 km water-ice spheres entering a hydrogen-helium Jovian atmosphere. These conditions simulate the best current estimates for the collision of fragments of periodic comet Shoemaker-Levy 9 with Jupiter in July 1994. We used the Eulerian shock-physics code CTH, and its parallel version PCTH, to perform 2-D analyses of penetration and breakup, and 3-D analyses of the growth of the resulting fireball during the first 100 seconds after fragment entry. We can use our simulations to make specific predictions of the time interval between fragment entry and fireball arrival into line-of-sight from Earth. For a fragment larger than about 1 km, we believe that the time of fireball arrival above Jupiter's limb will be directly observable from Earth. Measurements of this time by observers, in conjunction with our simulations, may allow the mass of cometary fragments to be determined.
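
A rough geometric sketch of the limb-visibility argument (the angle behind the limb and the plume rise speed are assumptions for illustration, not the paper's values):

```python
# Rough geometric sketch (assumptions mine, not the paper's calculation): the
# height a fireball must reach to clear Jupiter's limb for an impact an angle
# theta behind the limb, as seen by a distant observer, and the implied rise
# time at an assumed constant plume speed.
import math

R_JUPITER = 7.1492e4               # equatorial radius, km

def height_to_clear_limb(theta_deg: float, radius_km: float = R_JUPITER) -> float:
    """Minimum altitude (km) at which a point theta degrees behind the limb becomes visible."""
    return radius_km * (1.0 / math.cos(math.radians(theta_deg)) - 1.0)

theta = 6.0                        # assumed angle behind the limb, degrees
rise_speed = 10.0                  # assumed mean plume rise speed, km/s
h = height_to_clear_limb(theta)
print(f"height to clear limb: ~{h:.0f} km, rise time at {rise_speed:.0f} km/s: ~{h/rise_speed:.0f} s")
```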

Collaboration


Dive into Timothy G. Trucano's collaborations.

Top Co-Authors

William L. Oberkampf, Sandia National Laboratories
Martin Pilch, Sandia National Laboratories
David A. Crawford, Sandia National Laboratories
Mark B. Boslough, Sandia National Laboratories
Allen C. Robinson, Sandia National Laboratories
J. R. Asay, Sandia National Laboratories
Lalit C. Chhabildas, Sandia National Laboratories
Laura Painton Swiler, Sandia National Laboratories
Randall M. Summers, Sandia National Laboratories
Dennis E. Grady, Sandia National Laboratories