Publication


Featured research published by Jeffrey T. Fong.


Journal of Pressure Vessel Technology - Transactions of the ASME | 2006

Uncertainty in Finite Element Modeling and Failure Analysis: A Metrology-Based Approach

Jeffrey T. Fong; James J. Filliben; Roland deWit; Richard J. Fields; Barry Bernstein; Pedro V. Marcal

In this paper, we first review the impact of the powerful finite element method (FEM) in structural engineering, and then address the shortcomings of FEM as a tool for risk-based decision making and incomplete-data-based failure analysis. To illustrate the main shortcoming of FEM, i.e., that the computational results are point estimates based on “deterministic” models with equations containing mean values of material properties and prescribed loadings, we present the FEM solutions of two classical problems as reference benchmarks: (RB-101) the bending of a thin elastic cantilever beam due to a point load at its free end, and (RB-301) the bending of a uniformly loaded square, thin, and elastic plate resting on a grillage consisting of 44 columns of ultimate strengths estimated from 5 tests. Using known solutions of those two classical problems in the literature, we first estimate the absolute errors of the results of four commercially available FEM codes (ABAQUS, ANSYS, LS-DYNA, and MPAVE) by comparing the known solutions with the FEM results for two specific parameters, namely, (a) the maximum displacement and (b) the peak stress in a coarse-meshed geometry. We then vary the mesh size and element type for each code to obtain grid convergence and to answer two questions on FEM and failure analysis in general: (Q-1) Given the results of two or more FEM solutions, how do we express uncertainty for each solution and for their combination? (Q-2) Given a complex structure with a small number of tests on material properties, how do we simulate a failure scenario and predict time to collapse with confidence bounds? To answer the first question, we propose an easy-to-implement metrology-based approach, where each FEM simulation in a grid-convergence sequence is considered a “numerical experiment,” and a quantitative uncertainty is calculated for each sequence of grid convergence. To answer the second question, we propose a progressively weakening model based on a small number (e.g., 5) of tests on ultimate strength, such that the failure of the weakest column of the grillage causes a load redistribution and collapse occurs only when the load redistribution leads to instability. This model satisfies the requirement of a metrology-based approach, where the time to failure is given a quantitative expression of uncertainty. We conclude that in today’s computing environment and with a precomputational “design of numerical experiments,” it is feasible to “quantify” uncertainty in FEM modeling and progressive failure analysis. DOI: 10.1115/1.2150843
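
As a minimal sketch of the idea of treating a grid-convergence sequence as a numerical experiment, the following Python fragment applies Richardson extrapolation to the three finest meshes of a refinement sequence and reports an extrapolated value with a discretization uncertainty. The mesh sizes, deflection values, and the assumption of a constant refinement ratio are illustrative, not the paper's data or its exact procedure.

```python
import numpy as np

def grid_convergence_uncertainty(h, q):
    """Estimate a converged value and uncertainty from a grid-refinement
    sequence via Richardson extrapolation on the three finest grids.

    h : mesh sizes (coarse -> fine); q : corresponding FEM results.
    Returns (extrapolated value, observed order, uncertainty estimate).
    """
    h1, h2, h3 = h[-3:]           # three finest grids
    q1, q2, q3 = q[-3:]
    r = h1 / h2                   # assumes a constant refinement ratio
    p = np.log(abs((q1 - q2) / (q2 - q3))) / np.log(r)  # observed order
    q_exact = q3 + (q3 - q2) / (r**p - 1)               # extrapolated value
    u = abs(q_exact - q3)         # discretization uncertainty of finest grid
    return q_exact, p, u

# Hypothetical tip-deflection results for a cantilever on four meshes.
h = np.array([0.4, 0.2, 0.1, 0.05])         # element size
q = np.array([1.842, 1.951, 1.979, 1.986])  # computed deflection
print(grid_convergence_uncertainty(h, q))
```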


Nuclear Engineering and Design | 1978

Uncertainties in fatigue life prediction and a rational definition of safety factors

Jeffrey T. Fong

Abstract To cope with uncertainties in mechanical and structural design, engineers exercise their judgement through the use of safety factors based on service experience and laboratory data on relevant design parameters. Using the problem of fatigue life prediction as a vehicle, the relationship between the size of a safety factor and the associated risk and cost-benefit estimates of the engineering judgement based on new technical information is demonstrated. The subtle influence of the choice of a distribution function for a given set of data is exhibited by comparing the Gaussian with the three-parameter Weibull fits of a set of fatigue life data on 6061-T6 aluminum. A system of ranking the importance of different sources of uncertainty based on an analysis of service data is proposed, along with an example to “refine” the system using up-to-date laboratory and field measurements. The concept of a rational definition of safety factors as a tool for engineers who design under uncertainty is discussed.
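
To make the distribution-choice point concrete, here is a small sketch comparing a Gaussian fit with a three-parameter Weibull fit of fatigue-life data using SciPy. The data values are hypothetical stand-ins, not the paper's 6061-T6 data set; the point is that the two fitted models can imply very different lower-tail failure probabilities, which is where safety factors operate.

```python
import numpy as np
from scipy import stats

# Hypothetical fatigue-life data (cycles to failure).
life = np.array([2.1e5, 2.8e5, 3.3e5, 3.9e5, 4.6e5, 5.4e5, 6.5e5, 8.2e5])

# Gaussian fit: mean and standard deviation.
mu, sigma = stats.norm.fit(life)

# Three-parameter Weibull fit: shape c, location (threshold), and scale.
c, loc, scale = stats.weibull_min.fit(life)

# Compare the implied probabilities of failing before a design life.
n_design = 1.5e5
p_norm = stats.norm.cdf(n_design, mu, sigma)
p_weib = stats.weibull_min.cdf(n_design, c, loc, scale)
print(f"P(failure before {n_design:.0e} cycles): "
      f"Gaussian {p_norm:.3e}, 3-parameter Weibull {p_weib:.3e}")
```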


IEEE Transactions on Industrial Electronics | 2016

Reliability of High-Voltage Molding Compounds: Particle Size, Curing Time, Sample Thickness, and Voltage Impact on Polarization

Andrés Calderín García; Nathan Warner; Nandika Anne D'Souza; Enis Tuncer; Luu Nguyen; Marie Denison; Jeffrey T. Fong

Reliability of dielectric composite materials (DCMs) is a multifaceted problem. Both high voltage and dimensional miniaturization decrease the path length for charge dissipation in the DCMs, resulting in short life due to charge entrapment and uncontrolled release. The trapped charges in the DCM arise from the intrinsic interfacial polarization in the filled system. From the compound perspective, at least four interacting variables determine the relative reliability of DCMs: filler size, curing time, thickness, and polarization voltage magnitude. Experiments and data analysis covering all four variables, with uncertainty propagated for each factor, are paramount to understanding the behavior of each property and the interactions that shape DCM characteristics. Combining views from the design of experiments (Fisher, 1935), modern statistics (Hand, 2008), and analytical computer software (NIST Dataplot, 2015) yields a free graphical-user-interface tool to optimize the design of compound mixtures and minimize polarization for increased reliability.
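
A minimal sketch of the design-of-experiments idea for the four factors named above: a two-level full factorial design and main-effect estimates. The coded design matrix is standard; the response values are invented placeholders, since the paper's measurements are not reproduced here.

```python
import itertools
import numpy as np

# A 2**4 full-factorial design over the four factors named in the paper:
# filler size, curing time, sample thickness, polarization voltage.
# Coded levels: -1 (low), +1 (high).
factors = ["size", "cure", "thick", "volt"]
design = np.array(list(itertools.product([-1, 1], repeat=4)))

# Hypothetical polarization responses, one per run (16 runs).
response = np.array([3.2, 4.1, 2.9, 3.8, 3.5, 4.6, 3.1, 4.0,
                     2.7, 3.6, 2.5, 3.3, 3.0, 4.2, 2.8, 3.7])

# Main effect of each factor: mean response at high minus mean at low.
for j, name in enumerate(factors):
    effect = (response[design[:, j] == 1].mean()
              - response[design[:, j] == -1].mean())
    print(f"{name:>6}: main effect = {effect:+.3f}")
```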


Journal of Applied Physics | 1993

A nonequilibrium thermodynamic theory of viscoplastic materials

Barry Bernstein; Jeffrey T. Fong

A nonequilibrium thermodynamic theory for predicting the mechanical behavior of materials beyond the elastic range is formulated. The theory incorporates the idea of a ‘‘concealed’’ parameter α, originally due to Bridgman [Rev. Mod. Phys. 22, 56 (1950)], where the constitutive equations are governed by (a) a thermodynamic potential such as a generalized Gibbs function, G, or Helmholtz free-energy function, F, each with an explicit dependence on α, and (b) a prescription for α̇, the time rate of change of α, such that α̇ is directly proportional to the negative of G_α or F_α, the partial derivative of G or F with respect to α, respectively. The theory is found to be consistent with (1) the second law of thermodynamics regarding entropy production; (2) the concept of Lyapunov stability at equilibrium; (3) the rule of invariance with respect to a transformation of parameters; and (4) the powerful law of invariance with respect to the Legendre transformation. Significance of the new formulation is discussed by ...
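
A minimal numerical sketch of the evolution law α̇ = -k F_α: with a quadratic free energy F(ε, α) held at fixed strain, integrating the relaxation equation drives α toward equilibrium while F decreases monotonically, consistent with the entropy-production and Lyapunov-stability statements above. The quadratic form of F and all constants are illustrative assumptions, not the paper's constitutive model.

```python
# Concealed-parameter relaxation: alpha_dot = -k * dF/dalpha at fixed strain.
k, eps = 2.0, 0.01                           # rate constant, fixed strain
F  = lambda a: 0.5 * 1e3 * (eps - a)**2 + 0.5 * 4e2 * a**2   # free energy
Fa = lambda a: -1e3 * (eps - a) + 4e2 * a    # partial derivative dF/dalpha

a, dt = 0.0, 1e-4
for step in range(5):
    for _ in range(1000):
        a -= dt * k * Fa(a)                  # explicit Euler on alpha_dot
    # F should decrease monotonically toward its minimum (Lyapunov behavior).
    print(f"t={(step + 1) * 0.1:.1f}  alpha={a:.5f}  F={F(a):.6f}")
```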


Nuclear Engineering and Design | 1980

Inservice data reporting standards for engineering reliability and risk analysis

Jeffrey T. Fong

Abstract On two recent occasions, structural and mechanical engineers were challenged either to come up with a solution to the database problem of the reliability analysis methodology or to avoid using the tool as a serious mathematical model to resolve issues of safety and productivity. The two occasions were: (1) the September 1978 publication of an assessment of the 1975 Reactor Safety Study (WASH-1400) by a review panel chaired by H.W. Lewis; (2) the convening in December 1978 of an international symposium on inservice data reporting and analysis, sponsored by the American Society of Mechanical Engineers and held in San Francisco. This paper is a direct response to that challenge. The notion of an adequate database is first defined in terms of three essential elements. It is then demonstrated via a medical analogy that an ‘optimal’ plan of data reporting and some national or international standards for such reporting are desirable. A formula for estimating variabilities based on a combination of inservice and failure data is proposed.


ASME 2010 Pressure Vessels and Piping Conference: Volume 6, Parts A and B | 2010

A New Approach to Creating Composite Materials Elastic Property Database with Uncertainty Estimation Using a Combination of Mechanical and Thermal Expansion Tests

José Daniel D. Melo; Jeffrey T. Fong

In composite structural design, a fundamental requirement is to furnish the designer with a set of elastic constants. For example, to design for a given temperature a laminate consisting of transversely isotropic fiber-reinforced laminae, we need five independent elastic constants of each lamina of interest, namely, E1, E2, ν12, G12, and ν23. At present, there exist seven tests, two of the mechanical-lamina, two of the thermal-expansion-lamina, and three of the thermal-expansion-laminate type, to accomplish this task. It is known in the literature that the mechanical tests are capable of measuring E1, E2, and ν12, whereas the two thermal-expansion-lamina tests will measure α1 and α2, and the three thermal-expansion-laminate tests yield an overdetermined system of three simultaneous equations in the remaining two unknown elastic constants, G12 and ν23. In this paper, we propose a new approach to determining those five elastic constants with uncertainty bounds using the extra information obtainable from an overdetermined system. The approach takes advantage of the classical theory of error propagation, for which variance formulas were derived to estimate standard deviations of some of our five elastic constants. To illustrate this approach, we apply it to a set of experimental data on PEEK/IM7 unidirectional lamina. The experiment consists of the following tests: two tensile tests with four samples of unidirectional specimens to measure E1, E2, and ν12; two thermal-expansion-lamina tests for coefficients α1 and α2, each using four [(0)32]T unidirectional specimens; and three thermal-expansion-laminate tests on four samples of [(+30/-30)8]s laminates. The results of our new approach are compared with those of a similar but more ad hoc approach that has appeared in the literature. The potential of applying this new methodology to the creation of a composite material elastic property database with uncertainty estimation and to the reliability analysis of composite structures is discussed. (*) Contribution of the National Institute of Standards and Technology. Not subject to copyright.
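
The overdetermined three-equation system in the two unknowns (G12, ν23) can be handled by weighted least squares, with the covariance of the estimates giving the propagated uncertainties. The sketch below shows the generic calculation; the coefficient matrix, right-hand sides, and uncertainties are placeholders, not the paper's measured values or its exact variance formulas.

```python
import numpy as np

# Overdetermined linear system A x = b in the two unknowns x = (G12, nu23),
# one equation per thermal-expansion-laminate test. All numbers hypothetical.
A = np.array([[1.0, 0.8],
              [0.6, 1.2],
              [1.1, 0.5]])
b = np.array([5.2, 5.9, 4.7])           # measured right-hand sides
u_b = np.array([0.10, 0.12, 0.09])      # standard uncertainties of b

# Weighted least squares: x = (A^T W A)^-1 A^T W b, with W = diag(1/u^2).
W = np.diag(1.0 / u_b**2)
cov_x = np.linalg.inv(A.T @ W @ A)      # propagated covariance of estimates
x = cov_x @ (A.T @ W @ b)
u_x = np.sqrt(np.diag(cov_x))
print(f"G12  = {x[0]:.3f} +/- {u_x[0]:.3f}")
print(f"nu23 = {x[1]:.3f} +/- {u_x[1]:.3f}")
```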


ASME 2010 Pressure Vessels and Piping Conference: Volume 6, Parts A and B | 2010

Composite Material Property Database Using Smooth Specimens to Generate Design Allowables with Uncertainty Estimation

Pranav D. Shah; José Daniel D. Melo; Carlos Alberto Cimini; Jeffrey T. Fong

For brevity, the class of “composite materials” in this paper refers to one of its subclasses, namely, fiber-reinforced composite materials. In developing composite material property databases, three categories of data are needed. Category 1 consists of all raw test data with detailed information on specimen preparation, test machine description, specimen size and number per test, test loading history (including temperature and humidity), and test configuration such as strain gage type and location, grip description, etc. Category 2 is the design allowable derived from the information contained in Category 1 without making further experimental tests. Category 3 is the same design allowable for applications in which new experiments are prescribed by the user to obtain more reliable properties for the purpose at hand. At present, most handbook-based composite material property databases contain incomplete information in Category 1 (raw data), where a user is given only the test-average values of properties such as longitudinal, transverse, and shear moduli; major and out-of-plane Poisson’s ratios; longitudinal tensile and compressive, transverse tensile and compressive, and shear strengths; inter-laminar shear strength; ply thickness; hygrothermal expansion coefficients; specific gravity; fiber volume fraction; etc. Such a presentation of Category 1 omits the description of the test environment necessary for a user to assess the uncertainty of the raw data. Furthermore, the design allowable listed in Category 2 is deterministically obtained from Category 1, and the user is given an average design allowable without uncertainty estimation. In this paper, we present a case study in which average design-allowable failure envelopes of open-hole specimens were obtained numerically for two different quasi-isotropic carbon fiber-epoxy laminates using the appropriate Category 1 data. Using the method of statistical design of experiments, we then show how the average design allowable can be supplemented with uncertainty estimates if the Category 1 database is complete. Application of this methodology to predicting the reliability of composite structures is discussed.
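
The paper does not spell out a single formula for deriving an allowable from raw data, but one common approach is a one-sided statistical tolerance bound on measured strengths. The sketch below computes such a bound for normally distributed data using the exact noncentral-t factor; the strength values are hypothetical, and a real allowable workflow would also check the distributional assumption.

```python
import numpy as np
from scipy import stats

def normal_lower_tolerance_bound(x, coverage=0.90, confidence=0.95):
    """One-sided lower tolerance bound for normally distributed data:
    with the given confidence, at least `coverage` of the population
    exceeds the bound. Uses the exact noncentral-t tolerance factor."""
    n = len(x)
    delta = stats.norm.ppf(coverage) * np.sqrt(n)     # noncentrality
    k = stats.nct.ppf(confidence, df=n - 1, nc=delta) / np.sqrt(n)
    return np.mean(x) - k * np.std(x, ddof=1)

# Hypothetical tensile-strength data (MPa) for one laminate lay-up.
strength = np.array([512, 498, 531, 505, 520, 489, 517, 509, 526, 501])
print(f"B-basis-like allowable: "
      f"{normal_lower_tolerance_bound(strength):.1f} MPa")
```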


ASME 2009 Pressure Vessels and Piping Conference | 2009

Artificial Intelligence (AI) Tools for Data Acquisition and Probability Risk Analysis of Nuclear Piping Failure Databases

Pedro V. Marcal; Jeffrey T. Fong; Nobuki Yamagata

Over the last thirty years, much research has been done on the development and application of in-service inspection (ISI) and failure event databases for pressure vessels and piping, as reported in two recent symposia: (1) the ASME 2007 PVP Symposium (in honor of the late Dr. Spencer Bush), San Antonio, Texas, on “Engineering Safety, Applied Mechanics, and Nondestructive Evaluation (NDE),” and (2) the ASME 2008 PVP Symposium, Chicago, Illinois, on “Failure Prevention via Robust Design and Continuous NDE Monitoring.” The two symposia concluded that those databases, if properly documented and maintained on a worldwide basis, could hold the key to the continued safe and profitable operation of numerous aging nuclear power or petrochemical processing plants. During the 2008 symposium, four categories of uncertainty in fatigue life estimates were identified, namely, (1) Uncertainty-1 in failure event databases, (2) Uncertainty-2 in NDE databases, (3) Uncertainty-3 in material property databases, and (4) Uncertainty-M in crack-growth and damage modeling. In this paper, one of a series of four addressing all four uncertainty categories, we introduce an automatic natural language abstracting and processing (ANLAP) tool to address Uncertainty-1. Three examples are presented and discussed.
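
The abstract does not describe ANLAP's internals, so the following is only a generic illustration of the underlying task: extracting structured fields from a free-text failure-event record. The record text, field names, and regular expressions are all invented for the example.

```python
import re

# Generic illustration (not the paper's ANLAP tool) of pulling structured
# fields out of a free-text failure-event record with regular expressions.
record = ("On 1997-03-14, unit 2 feedwater line weld W-17 was found leaking "
          "during ISI; through-wall crack attributed to thermal fatigue.")

patterns = {
    "date":      r"\b(\d{4}-\d{2}-\d{2})\b",
    "component": r"\b(weld\s+[A-Z]-\d+)\b",
    "mechanism": r"\b(thermal fatigue|stress corrosion|erosion)\b",
}
event = {field: (m.group(1) if (m := re.search(rx, record)) else None)
         for field, rx in patterns.items()}
print(event)   # {'date': '1997-03-14', 'component': 'weld W-17', ...}
```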


ASME 2008 Pressure Vessels and Piping Conference | 2008

Robust Engineering Design for Failure Prevention

Jeffrey T. Fong; James J. Filliben; N. Alan Heckert; Roland deWit; Barry Bernstein

To advance the state of the art of engineering design, we introduce a new concept of the “robustness” of a structure by measuring its ability to sustain the sudden loss of a part without immediate collapse. The concept is based on the premise that most structures have built-in redundancy such that, when the loss of a single part leads to a load redistribution, the “crippled” structure tends to seek a new stable configuration without immediate collapse. This property of a “robust” structure, when coupled with a continuous or periodic inspection program using nondestructive evaluation (NDE) techniques, is useful in failure prevention, because such a structure is expected to display “measurable” signs of “weakening” long before the onset of catastrophic failure. To quantify this “robustness” concept, we use a large number of simulations to develop a metric named the “Robustness Index (RBI).” To illustrate its application, we present two examples: (1) the design of a simple square grillage in support of a water tank, and (2) a classroom model of a 3-span double-Pratt-truss bridge. The first example is a “toy” problem, which turned out to be a good vehicle for testing the feasibility of the RBI concept. The second example is taken from a textbook on bridge design (Tall, L., Structural Steel Bridge, 2nd ed., page 99, Fig. 4.3(b), Ronald Press, New York, NY, 1974). It is not a case study for failure analysis, but a useful classroom exercise in an engineering design course. The significance and limitations of this new approach to avoiding catastrophic failure through “robust” design are discussed.
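
The paper's precise RBI definition is not reproduced in the abstract, so the following is only a plausible sketch of a robustness-index-style simulation: remove each column of a grillage in turn, redistribute its load uniformly to the survivors, and count the single-loss configurations that remain stable. The strength model and load numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical grillage: column strengths sampled to mimic a small-sample
# test basis; total load shared uniformly by whichever columns remain.
n_cols, total_load = 16, 1200.0
strengths = rng.normal(100.0, 10.0, n_cols)   # sampled ultimate strengths

survived = 0
for lost in range(n_cols):
    remaining = np.delete(strengths, lost)
    demand = total_load / remaining.size      # uniform load redistribution
    if np.all(remaining >= demand):           # no cascading failures
        survived += 1

rbi = survived / n_cols
print(f"fraction of single-loss cases survived: {rbi:.2f}")
```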


ASME 2005 Pressure Vessels and Piping Conference | 2005

ABC of Statistics for Verification and Validation (V&V) of Simulations of High-Consequence Engineering Systems

Jeffrey T. Fong

We begin this expository essay by reviewing, with examples from the materials and fabrication testing literature, what a typical engineer already knows about statistics. We then consider a central question in engineering decision making: given a computer simulation of a high-consequence system, how do we verify and validate (V&V) it, and what are the margins of error of all the important predicted results? To answer this question, we assert that we need three basic tools that already exist in the statistical and metrological sciences: (A) error analysis, (B) experimental design, and (C) uncertainty analysis. Those three tools, to be known as the ABC of statistics, were developed through a powerful linkage between the statistical and metrological sciences. By extending the key concepts of this linkage from physical experiments to numerical simulations, we propose a new approach to answering the V&V question posed earlier. The key concepts are: (1) uncertainty as defined in the ISO Guide to the Expression of Uncertainty in Measurement (1993); (2) design of experiments prior to data collection in a randomized or orthogonal scheme to evaluate interactions among model variables; and (3) standard reference benchmarks for calibration, and inter-laboratory studies for a “weighted” consensus mean. To illustrate the need for, and to discuss the plausibility of, this metrology-based approach to V&V, two example problems are presented: (a) the verification of 12 simulations of the deformation of a cantilever beam, and (b) the calculation of a mean time to failure for a uniformly loaded 100-column single-floor steel grillage on fire.
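
A minimal sketch of the "weighted" consensus-mean idea from inter-laboratory studies: weight each laboratory's result by its inverse variance and propagate the uncertainty of the combined value. The lab values and uncertainties below are illustrative, and this simple estimator is one common choice rather than the paper's prescribed method.

```python
import numpy as np

# Each lab reports a value and a standard uncertainty (numbers invented).
values = np.array([9.81, 9.79, 9.84, 9.80])   # lab results
u      = np.array([0.02, 0.03, 0.05, 0.02])   # standard uncertainties

# Inverse-variance weighting: precise labs count for more.
w = 1.0 / u**2
mean = np.sum(w * values) / np.sum(w)
u_mean = np.sqrt(1.0 / np.sum(w))             # uncertainty of the consensus
print(f"consensus mean = {mean:.4f} +/- {u_mean:.4f}")
```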

Collaboration


Jeffrey T. Fong's top co-authors and their affiliations.

Top Co-Authors

James J. Filliben - National Institute of Standards and Technology
N. Alan Heckert - National Institute of Standards and Technology
Li Ma - National Institute of Standards and Technology
Roland deWit - National Institute of Standards and Technology
Barry Bernstein - Illinois Institute of Technology
Nathanael A. Heckert - National Institute of Standards and Technology
Robert E. Chapman - National Institute of Standards and Technology
Alan Heckert - National Institute of Standards and Technology
David L. Rudland - Battelle Memorial Institute