
Publication


Featured research published by Thomas J. Mackin.


Engineering Failure Analysis | 2002

Thermal cracking in disc brakes

Thomas J. Mackin; Steven C. Noe; K.J. Ball; B.C. Bedell; D.P. Bim-Merle; M.C. Bingaman; D.M. Bomleny; G.J. Chemlir; D.B. Clayton; H.A. Evans; R. Gau; J.L. Hart; J.S. Karney; B.P. Kiple; R.C. Kaluga; P. Kung; A.K. Law; D. Lim; R.C. Merema; B.M. Miller; T.R. Miller; T.J. Nielson; T.M. O'Shea; M.T. Olson; H.A. Padilla; B.W. Penner; C. Penny; R.P. Peterson; V.C. Polidoro; A. Raghu

Disc brakes are exposed to large thermal stresses during routine braking and extraordinary thermal stresses during hard braking. High-g decelerations typical of passenger vehicles are known to generate temperatures as high as 900°C in a fraction of a second. These large temperature excursions have two possible outcomes: thermal shock that generates surface cracks, and/or large amounts of plastic deformation in the brake rotor. In the absence of thermal shock, a relatively small number of high-g braking cycles is found to generate macroscopic cracks running through the rotor thickness and along the radius of the disc brake. The analysis herein shows that rotor failure is a consequence of low-cycle thermo-mechanical fatigue. An analysis of the vehicle dynamics was used to derive a heat flux equation from the braking forces. The heat flux equation was then used in a finite element analysis to determine the temperature profile in the brake. Once the brake temperature was obtained, a simplified shrink-fit analysis was used to estimate the stresses that arise during hard braking. This approach shows that plastic deformation occurs due to the large thermal strains associated with high-g braking. The calculated strain amplitude was then used in a Coffin–Manson law to predict the number of high-g braking cycles to failure. Good agreement was obtained between reported braking cycles to failure and the proposed theoretical approach.
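
The final step of that analysis, applying a Coffin–Manson law to the calculated strain amplitude, can be sketched as follows. The fatigue constants here are illustrative placeholders, not the values used in the paper:

```python
def cycles_to_failure(strain_amplitude, eps_f=0.3, c=-0.5):
    """Coffin-Manson low-cycle fatigue law, solved for the number of
    cycles N_f: strain_amplitude = eps_f * (2 * N_f) ** c.
    eps_f (fatigue ductility coefficient) and c (fatigue ductility
    exponent) are assumed values, not the paper's."""
    return 0.5 * (strain_amplitude / eps_f) ** (1.0 / c)

# A larger thermal strain amplitude means fewer high-g stops to failure:
n_hard = cycles_to_failure(0.04)   # severe braking
n_mild = cycles_to_failure(0.01)   # milder braking
```

Because c is negative, halving the strain amplitude quadruples the predicted life for this choice of exponent.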


Review of Scientific Instruments | 2005

Infrared grey-field polariscope: A tool for rapid stress analysis in microelectronic materials and devices

Gavin P. Horn; Jon Lesniak; Thomas J. Mackin; Brad Boyce

The infrared grey-field polariscope (IR-GFP) has been developed to provide rapid, full-field stress analysis for infrared-transparent materials. Grey-field photoelastic theory is outlined and the advantages of this implementation for microelectronic materials inspection highlighted. The capabilities of this scientific tool are proven using standard sample geometries fabricated from single crystal silicon substrates and the general applicability of the instrument demonstrated on bonded devices and silicon wafer geometries. Stress resolution in silicon wafers is better than 0.1 MPa at wafer inspection speeds of 10 s for a 100 mm wafer. Initial applications of the IR-GFP have shown that the tool provides improvements in defect detection and stress quantification when compared to conventional infrared transmission imaging while also providing several important advantages over other currently utilized inspection technologies.
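
For context, conventional transmission photoelasticity relates fringe order to the in-plane principal stress difference through the stress-optic law; a minimal sketch with purely illustrative numbers (the grey-field instrument resolves retardations far smaller than one full fringe):

```python
def principal_stress_difference(fringe_order, fringe_constant, thickness):
    """Classical stress-optic law: sigma_1 - sigma_2 = N * f_sigma / t,
    where N is the fringe order, f_sigma the material fringe constant
    (force per unit length per fringe), and t the sample thickness.
    All inputs here are illustrative, not silicon's actual values."""
    return fringe_order * fringe_constant / thickness

# Example: fractional fringe order 0.05, hypothetical fringe constant,
# 675-micron wafer thickness (in metres):
delta_sigma = principal_stress_difference(0.05, 7000.0, 0.000675)
```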


IEEE/ASME Journal of Microelectromechanical Systems | 2002

A thermomechanical model for adhesion reduction of MEMS cantilevers

James W. Rogers; Thomas J. Mackin; Leslie M. Phinney

Presents a thermomechanical model that describes adhesion reduction in MEMS structures using laser heating. A fracture mechanics model is developed in which the interface between the stiction-failed microcantilever and the substrate is treated as a crack, and the energy release rate is calculated using elastic theory. In order to include the effect of a temperature difference between the microcantilever and the substrate, an associated thermal strain energy is included in the fracture model. If the free length is longer than the critical buckling length, the beam buckles, decreasing the strain energy of the system. For surface-micromachined polycrystalline silicon cantilevers with an initial crack length of 400 μm, the model predicts that a temperature difference of 100 K repairs microcantilevers as long as 1300 μm. The peeling of adhered beams from the substrate after laser irradiation is experimentally shown, with measured crack lengths within 15% of predicted values, indicating that the proposed model establishes the mechanism of adhesion reduction by laser irradiation.
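
The fracture-mechanics bookkeeping in such models starts from an elastic-beam estimate of the interface adhesion energy. A minimal sketch using the standard s-shaped adhered-cantilever result (the textbook elastic expression, not the paper's thermally augmented model; all numbers are assumed):

```python
def adhesion_energy_s_shape(E, t, h, s):
    """Apparent work of adhesion (J/m^2) for an s-shaped stiction-failed
    cantilever, treating the unattached length s as a crack:
        Gamma = 3 * E * t**3 * h**2 / (2 * s**4)
    where E is Young's modulus, t the beam thickness, and h the gap."""
    return 3.0 * E * t**3 * h**2 / (2.0 * s**4)

# Illustrative polysilicon values (assumed, not from the paper):
E = 160e9      # Young's modulus, Pa
t = 2e-6       # beam thickness, m
h = 2e-6       # gap to substrate, m
s = 400e-6     # detachment (crack) length, m
gamma = adhesion_energy_s_shape(E, t, h, s)
```

The strong 1/s**4 dependence is why modest changes in detachment length correspond to large changes in apparent adhesion energy.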


Materials Science and Engineering A-structural Materials Properties Microstructure and Processing | 1995

Interfacial properties of SiC monofilament reinforced β′-SiAlON composites

Chao M. Huang; Dong Zhu; Youren Xu; Thomas J. Mackin; Waltraud M. Kriven

Interfacial mechanical properties of SiC monofilament-reinforced β′-SiAlON composites were characterized by a single-fiber push-out technique. Interfacial parameters were studied as a function of embedded filament length, including comparisons of linear, nonlinear shear-lag, and progressive debonding analysis models. The interfacial debonding peak load (P_p) and maximum frictional sliding load (P_max) were both measured from the apparent load-displacement curves. Linear and nonlinear shear-lag analyses were fitted to the data as a function of embedded filament length. In comparison, the progressive debonding analysis was conducted by fitting the effective load-displacement curves obtained by subtracting the machine compliance from the apparent load-displacement curves. The nonlinear shear-lag model gave better regression fits to the data than did the linear model, while the progressive debonding model provided much more interfacial information than did the shear-lag model. In addition to the coefficient of friction (μ) and radial residual stress (σ_N), the axial residual load (P_r), critical load for interfacial crack initiation or propagation (P_d), interfacial fracture toughness (G_i), as well as the interfacial roughness amplitude (A) and its contribution to the interfacial normal stress (σ_r), were extracted from the progressive debonding model using a three-parameter, nonlinear least-squares fit to the effective load-displacement curves.
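
The curve-fitting step can be illustrated with a toy example: a saturating shear-lag-style load versus embedded-length form, fitted by brute-force least squares. The tanh form and grid search are illustrative stand-ins, not the paper's actual model or its three-parameter fitting routine:

```python
import math

def shear_lag_load(L, P_inf, beta):
    """Toy shear-lag form: debond load saturates with embedded length L
    as P_inf * tanh(beta * L). Parameters are hypothetical."""
    return P_inf * math.tanh(beta * L)

def fit_grid(lengths, loads, p_range, b_range):
    """Brute-force least-squares fit over a parameter grid, a simple
    stand-in for the paper's nonlinear least-squares procedure."""
    best = None
    for P_inf in p_range:
        for beta in b_range:
            sse = sum((shear_lag_load(L, P_inf, beta) - F) ** 2
                      for L, F in zip(lengths, loads))
            if best is None or sse < best[0]:
                best = (sse, P_inf, beta)
    return best[1], best[2]

# Recover known parameters from synthetic, noise-free data:
lengths = [0.2, 0.5, 1.0, 2.0]
loads = [shear_lag_load(L, 10.0, 2.0) for L in lengths]
P_fit, b_fit = fit_grid(lengths, loads, [8.0, 9.0, 10.0, 11.0],
                        [1.0, 1.5, 2.0, 2.5])
```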


IEEE/ASME Journal of Microelectromechanical Systems | 2007

A Fracture Mechanics Description of Stress-Wave Repair in Stiction-Failed Microcantilevers: Theory and Experiments

Zayd C. Leseman; Sai B. Koppaka; Thomas J. Mackin

Microcantilever beams are frequently utilized as sensor platforms in microelectromechanical system devices. These highly compliant surface-micromachined structures generally fail by adhering to the underlying substrate during processing or subsequent operation. Such failures, commonly known as "stiction" failures, can be prevented or repaired in a number of ways, including low-adhesion coatings, rinsing with low surface energy agents, and active approaches such as laser irradiation. Gupta [J. Microelectromech. Syst., vol. 13, pp. 696-700, 2004] recently demonstrated that stress waves could be used to repair stiction-failed structures. This paper extends the work of Gupta by developing a fracture mechanics theory of the repair process and compares that theory with corresponding experiments. We show that: 1) incremental crack growth is associated with each laser pulse, the extent of which is directly related to the laser fluence; 2) repeated pulsing fully repairs all of the microcantilevers; and 3) a fracture mechanics model accurately predicts the observed experimental results.


WIT Transactions on State-of-the-art in Science and Engineering | 2012

Model-Based Risk Analysis for Critical Infrastructures

Ted G. Lewis; Rudolph P. Darken; Thomas J. Mackin; Donald Dudenhoeffer

This chapter describes a risk-informed decision-making process for analysing and protecting large-scale critical infrastructure and key resource (CI/KR) systems, and a Model-Based Risk Analysis (MBRA) tool for modelling risk, quantifying it, and optimally allocating fixed resources to reduce system vulnerability. MBRA is one of the first tools to adopt a systems approach to risk-informed decision-making. It applies the network science metrics of height, degree, betweenness, and contagiousness to a network of interdependent infrastructure assets across multiple sectors. Resource allocation is applied across entire networks to reduce risk and to determine threat, vulnerability, and consequence values using Stackelberg game theory. MBRA is compared with non-network assessment tools: CARVER, the Maritime Security Risk Analysis Model (MSRAM), and the Knowledge Display and Aggregation System (KDAS) – three leading infrastructure analysis tools currently in use by practitioners. MBRA has been used successfully to model a variety of sectors, ranging from water, power, energy and telecommunications to transportation.
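
The underlying bookkeeping can be illustrated with a toy greedy allocator over the standard risk product R = T × V × C. This is a hypothetical sketch, not MBRA's actual Stackelberg-based optimizer or its network metrics:

```python
def risk(threat, vuln, consequence):
    """Standard risk product R = T * V * C used in infrastructure risk models."""
    return threat * vuln * consequence

def allocate(assets, budget, step=1.0, reduction=0.01):
    """Greedy allocation of a fixed budget: each funding step lowers one
    asset's vulnerability by `reduction`, always funding the asset whose
    current risk is highest. A toy stand-in for MBRA's optimizer."""
    assets = [dict(a) for a in assets]   # work on copies
    spent = 0.0
    while spent + step <= budget:
        worst = max(assets, key=lambda a: risk(a["T"], a["V"], a["C"]))
        worst["V"] = max(0.0, worst["V"] - reduction)
        spent += step
    return assets

# Two hypothetical assets; the high-consequence one absorbs the budget:
portfolio = [{"T": 0.9, "V": 0.8, "C": 100.0},
             {"T": 0.5, "V": 0.5, "C": 10.0}]
funded = allocate(portfolio, budget=10.0)
```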


Journal of The Electrochemical Society | 2008

Detection and Quantification of Surface Nanotopography-Induced Residual Stress Fields in Wafer-Bonded Silicon

Gavin P. Horn; Y. S. Chu; Yuncheng Zhong; Thomas J. Mackin; Jon Lesniak; Daniel Reiniger

Semiconductor wafer bonding has been identified as an enabling technology for a wide variety of semiconductor device processing applications such as wafer-level encapsulation, three-dimensional structures and interconnects, and silicon-on-insulator substrates. In many of these applications, accurate measurement and control of local residual stresses is critical for acceptable device yields and quality control. In this paper, synchrotron X-ray topography (XRT) and the infrared gray-field polariscope (IR-GFP) are employed as full-field tools for the detection and measurement of residual stresses in wafer-bonded silicon. Both tools are used to inspect samples with varying levels of residual stress from both wafer nanotopography and patterned interfacial features, resulting in excellent qualitative correlation between the tools. While the XRT offers higher spatial resolution and greater sensitivity to strain, the IR-GFP provides dramatically faster imaging rates, simple operating procedures, and instrument affordability. Based on these comparisons, the two techniques were shown to be complementary tools for semiconductor process control, the XRT being ideally suited as a laboratory research and development tool, while the IR-GFP is applicable for rapid process or quality control.


Experimental Mechanics | 2005

Trapped particle detection in bonded semiconductors using gray-field photoelastic imaging

Gavin P. Horn; Thomas J. Mackin; J. Lesniak

In this paper we present inspection results from several bonded wafer systems using a newly developed infrared gray-field polariscope (IR-GFP). This device measures the residual stress fields associated with defects trapped at the bonded interface, enabling the detection of subwavelength defects. Results from IR-GFP imaging are contrasted with conventional infrared transmission (IRT) imaging of the same samples, showing marked improvements in defect detection as well as the ability to quantify the residual stress fields. This inspection method reveals that interfaces deemed defect-free using IRT imaging may, in fact, be teeming with defects.


Journal of Thermal Science and Engineering Applications | 2016

Design and Simulation of Passive Thermal Management System for Lithium-Ion Battery Packs on an Unmanned Ground Vehicle

Kevin K. Parsons; Thomas J. Mackin

The transient thermal response of a 15-cell, 48 V, lithium-ion battery pack for an unmanned ground vehicle (UGV) was simulated using ANSYS FLUENT. Heat generation rates and specific heat capacity of a single cell were experimentally measured and used as input to the thermal model. A heat generation load was applied to each battery, and natural convection film boundary conditions were applied to the exterior of the enclosure. The buoyancy-driven natural convection inside the enclosure was modeled along with the radiation heat transfer between internal components. The maximum temperature of the batteries reached 65.6 °C after 630 s of usage at a simulated peak power draw of 3600 W, or roughly 85 A. This exceeds the manufacturer’s maximum recommended operating temperature of 60 °C. We present a redesign of the pack that incorporates a passive thermal management system consisting of a composite expanded graphite (EG) matrix infiltrated with a phase-changing paraffin wax. The redesigned battery pack was similarly modeled, showing a decrease in the maximum temperature to 50.3 °C after 630 s at the same power draw. The proposed passive thermal management system kept the batteries within their recommended operating temperature range. [DOI: 10.1115/1.4034904]
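
A zeroth-order sanity check on such simulations is an adiabatic lumped-capacitance bound; the numbers below are assumed for illustration and are not the paper's measured heat generation rate or pack mass:

```python
def lumped_temp_rise(q_watts, mass_kg, cp, t_seconds, t0=25.0):
    """Adiabatic lumped-capacitance bound on pack temperature:
    T(t) = T0 + Q * t / (m * cp). Ignoring convection and radiation
    makes this an overestimate relative to a full CFD result."""
    return t0 + q_watts * t_seconds / (mass_kg * cp)

# Assumed: 150 W of total heat generation, a 5 kg pack,
# cp = 900 J/(kg K), evaluated at the paper's 630 s duration:
T_bound = lumped_temp_rise(150.0, 5.0, 900.0, 630.0)
```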


International Journal of Cyber Warfare and Terrorism (IJCWT) | 2011

Critical Infrastructure as Complex Emergent Systems

Ted G. Lewis; Thomas J. Mackin; Rudy Darken

The United States Department of Homeland Security (DHS) is charged with “build[ing] a safer, more secure, and more resilient America by enhancing protection of the Nation’s Critical infrastructure and key resources (CI/KR) ...” using an all-hazards approach. The effective implementation of this strategy hinges upon our understanding of catastrophes and their potential effect on the functioning of our infrastructure. Unfortunately, there has been no unifying theory of catastrophe to guide decision-making, preparedness, or response. We do not know, for example, why some catastrophes are “worse” than others, or if the rate of catastrophes is increasing or decreasing. Furthermore, DHS has adopted a risk-informed decision-making process, but has done so without defining key terms, such as “risk”, or quantifying the primary elements of risk – definitions that are badly needed before setting a course of action and allocating resources. We present a framework, based upon network science and normal accident theory, that can be used to guide policy decisions for homeland security. We show that exceedance probability, which is commonly used by the insurance industry to set hazard insurance premiums, provides a unifying policy framework for homeland security investments. Furthermore, since the exceedance probability for catastrophic consequences obeys a power law, we define resilience, explicitly, as the exponent of that power law. This allows a mathematical definition of resilience that resonates with our innate sense of resilience: the more resilient a given system, the larger its resilience exponent. Such an approach also allows one to classify hazards as ‘high’ or ‘low’ risk, according to the resilience exponent, and to guide investments towards prevention or response. This framework provides a more rigorous foundation for Federal investment decisions and a rational basis for policies to best protect the Nation’s infrastructure.
A strategy without a theory

The United States Department of Homeland Security (DHS) is charged with the responsibility of “build[ing] a safer, more secure, and more resilient America by enhancing protection of the Nation’s Critical infrastructure and key resources (CI/KR) to prevent, deter, neutralize, or mitigate the effects of deliberate efforts by terrorists to destroy, incapacitate, or exploit them; and to strengthen national preparedness, timely response, and rapid recovery in the event of an attack, natural disaster, or other emergency.” The homeland security strategy is considered all-hazards because it embraces both natural and human-made catastrophes such as Hurricane Katrina and the 9/11 terrorist attacks. The effective implementation of the all-hazards strategy hinges upon our understanding of catastrophes: earthquakes and wildfires in Southern California; hurricanes in Florida; terrorist attacks on infrastructure; and pandemic threats such as the H1N1 influenza. Unfortunately, there has been no unifying theory of catastrophe to guide decision-making, preparedness, or response. We do not know, for example, why some catastrophes are “worse” than others, or if the rate of catastrophes is increasing or decreasing. Moreover, we do not know what properties of a human or natural system contribute to fragility or resilience. This lack of understanding has led to organizational confusion (what is the goal?), duplication of effort (different agencies doing the same thing), and poor utilization of limited resources (inadequate identification of the most at-risk assets, maximal return on investment, and resourcing of adequate response capability). DHS has adopted a risk-informed decision-making process, but has done so without defining key terms such as “risk” or quantifying the primary elements of risk: “threat”, “vulnerability”, “resilience”, and “consequence” – terms used throughout DHS policy and strategy documents.
Risk-informed decisions are difficult to make without operational definitions of risk and resiliency! For example, the National Strategy for the Physical Protection of Critical Infrastructure and Key Assets recommends, “the first objective of this strategy is to identify and assure the protection of those assets, systems, and functions that we deem most ‘critical’ in terms of national-level public health and safety, governance, economic and national security, and public confidence. We must develop a comprehensive, prioritized assessment of facilities, systems, and functions of national-level criticality and monitor their preparedness across infrastructure sectors.” This is a laudable objective, but since 2003 DHS has not been able to define ‘critical’, ‘prioritization’, or ‘preparedness’ – definitions that are badly needed before setting a course of action and allocating precious resources. The authors claim this malady will persist until a suitable theory of catastrophe is developed and put into practice. We propose a theory of all-hazards catastrophe, the results of which can be used to guide policy decisions for homeland security. Our theory is based on network science and normal accident theory. In a related approach, Ramo borrows ideas from physical science to explain how political disasters happen. Ramo’s ideas were previously explored and illustrated by Buchanan in a broader context. Similarly, Taleb’s highly popular book on randomness lays the foundation for some of the ideas expressed in the authors’ theory of catastrophe – specifically addressing the claim that many catastrophes are the result of random processes, rather than deterministic cause-and-effect. While Taleb focuses on “black swans” – highly unlikely, highly consequential, unpredictable events – we argue that black swans are statistically predictable and follow a power-law exceedance probability distribution.
Lewis applied the theory of complex systems to critical infrastructure and showed the relationship between power laws, black swans, and normal accident theory in critical infrastructure systems. Thus, power laws appear to be fundamental to catastrophe theory, which raises the question: why? Our answer: catastrophic events, including black swans, are normal accidents that increase with increasing self-organization.
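
The quantitative core of this framing, fitting the exceedance-probability power law and reading off its exponent as resilience, can be sketched as follows. The Hill-style maximum-likelihood estimator is a standard choice for power-law tails, not necessarily the one the authors used:

```python
import math

def resilience_exponent(consequences, x_min):
    """Maximum-likelihood (Hill) estimate of the exponent b in the
    exceedance probability P(X > x) = (x / x_min) ** (-b). In the
    authors' framing, larger b means a more resilient system:
    large-consequence events fall off faster."""
    tail = [x for x in consequences if x >= x_min]
    return len(tail) / sum(math.log(x / x_min) for x in tail)

def exceedance(x, x_min, b):
    """Probability that a consequence exceeds x under the fitted power law."""
    return (x / x_min) ** (-b)
```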

Collaboration

Top Co-Authors

Amit Savkar
University of Connecticut

Kevin D. Murphy
University of Connecticut

Khawar Abbas
University of New Mexico

Ted G. Lewis
Naval Postgraduate School

A.G. Evans
University of California

Bert Copsey
California Polytechnic State University

C. Cady
University of California

Daniel Layton
California Polytechnic State University