
Publication


Featured research published by Yacov Y. Haimes.


Risk Analysis | 2009

On the Complex Definition of Risk: A Systems‐Based Approach

Yacov Y. Haimes

The premise of this article is that risk to a system, as well as its vulnerability and resilience, can be understood, defined, and quantified most effectively through a systems-based philosophical and methodological approach, and by recognizing the central role of the system states in this process. A universally agreed-upon definition of risk has been difficult to develop; one reason is that the concept is multidimensional and nuanced. It requires an understanding that risk to a system is inherently and fundamentally a function of the initiating event, the states of the system and of its environment, and the time frame. In defining risk, this article posits that: (a) the performance capabilities of a system are a function of its state vector; (b) a system's vulnerability and resilience vectors are each a function of the input (e.g., initiating event), its time of occurrence, and the states of the system; (c) the consequences are a function of the specificity and time of the event, the vector of the states, the vulnerability, and the resilience of the system; (d) the states of a system are time-dependent and commonly fraught with variability and knowledge uncertainties; and (e) risk is a measure of the probability and severity of consequences. The above implies that modeling must evaluate consequences for each risk scenario as functions of the threat (initiating event), the vulnerability and resilience of the system, and the time of the event. This fundamentally complex modeling and analysis process cannot be performed correctly and effectively without relying on the states of the system being studied.
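The article's five postulates can be compressed into a notational sketch (the symbols below are illustrative shorthand, not the paper's own notation):

```latex
% e = initiating event, t = its time of occurrence, s(t) = system state vector,
% V = vulnerability, R = resilience, C = consequences (illustrative symbols)
C = g\big(e,\, t,\, s(t),\, V(e, t, s(t)),\, R(e, t, s(t))\big),
\qquad
\text{Risk} = f\big(\Pr(e),\, C\big)
```

The point of the sketch is ordering: vulnerability and resilience depend on the event and the system states, consequences depend on all of these, and risk is a measure over the probability and severity of those consequences.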


IEEE Transactions on Systems, Man, and Cybernetics | 1981

Hierarchical Holographic Modeling

Yacov Y. Haimes

A first-phase development of a mathematical theory for a new modeling schema that is termed hierarchical holographic modeling is presented. This theory will provide a methodology for capturing and dealing with a fundamental, but heretofore neglected, characteristic of large-scale systems: their multifarious nature. Truly large-scale systems reflect a bewildering variety of resources and capabilities and respond to an equally wide variety of objectives in response to the action of diverse users. Many elements involved in this congeries of resources, objectives, actions, etc., are noncommensurable and, at least potentially, conflicting. Realistic attempts at modeling have necessarily represented commensurable features of limited aspects of the overall systems, leaving many difficult questions of a more comprehensive nature to be posed and solved heuristically or even subconsciously. The prospective application of hierarchical holographic modeling to energy and water resources systems is discussed.


IEEE Computer | 2000

Are we forgetting the risks of information technology?

Thomas A. Longstaff; Clyde Chittister; Rich Pethia; Yacov Y. Haimes

The complexity and interconnectedness of information systems are growing. There must be some way to systematically assess the risk to critical infrastructures. Work began two decades ago (1980s) on a comprehensive theoretical framework to model and identify risks to large-scale and complex systems. The framework, hierarchical holographic modeling (HHM) (Y.Y. Haimes, 1981; 1998), is to conventional modeling schemes what holography is to conventional photography. Holography captures images in three dimensions, as compared with conventional photography's two-dimensional, planar representation. Likewise, HHM endorses a gestalt and holistic philosophy, which allows it to capture more dimensions than modeling schemes that yield planar models. HHM promotes a systemic process that identifies most, if not all, important and critical sources of risk.


Economic Systems Research | 2007

A Risk-based Input–Output Methodology for Measuring the Effects of the August 2003 Northeast Blackout

Christopher W. Anderson; Joost R. Santos; Yacov Y. Haimes

The 2003 Northeast Blackout revealed vulnerabilities within the US electric power-grid system. With the economy so dependent on electric power for most aspects of life, a power-grid failure can have far-reaching higher-order effects and can impair the operability of other critical infrastructures. An inoperability of the power sector can result from different types of disasters (e.g., accidents, natural catastrophes, or willful attacks). This paper demonstrates the Inoperability Input–Output Model (IIM) to measure the financial and inoperability effects of the Northeast Blackout. The case study uses information from sources such as the US input–output tables and sector-specific reports to quantify losses for specific inoperability levels. The IIM estimated losses of the same magnitude as other published reports, but with a detailed accounting of all affected economic sectors. Finally, a risk management framework is proposed to extend the IIM's capability for evaluating investment options in terms of their implementation costs and loss-reduction potentials.
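The Leontief-style calculation at the core of the IIM can be sketched in a few lines. The matrix and perturbation below are illustrative numbers, not the paper's data; only the fixed-point equation q = A*q + c* is the model's actual structure.

```python
import numpy as np

# A_star: interdependency matrix. Entry (i, j) is the inoperability
# transmitted to sector i per unit of inoperability in sector j.
# Sector labels and values are hypothetical.
A_star = np.array([
    [0.0, 0.4, 0.1],   # power
    [0.3, 0.0, 0.2],   # transportation
    [0.2, 0.3, 0.0],   # finance
])

# c_star: perturbation vector -- the direct inoperability caused by the
# disruptive event (here, an assumed 50% loss of the power sector).
c_star = np.array([0.5, 0.0, 0.0])

# Equilibrium inoperability: q = A* q + c*  =>  q = (I - A*)^(-1) c*
q = np.linalg.solve(np.eye(3) - A_star, c_star)
print(q)  # each entry is a sector's fraction of lost functionality
```

Because (I - A*)^(-1) expands into I + A* + A*^2 + ..., each sector's equilibrium inoperability is at least its direct perturbation, which is how the model captures the "higher-order effects" the abstract mentions.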


Automatica | 1979

Kuhn-Tucker multipliers as trade-offs in multiobjective decision-making analysis

Yacov Y. Haimes; Vira Chankong

Useful relationships between the optimal Kuhn-Tucker multipliers and trade-offs in multiobjective decision-making problems are developed, based on the sensitivity interpretation of such multipliers. Practical and theoretical applications of these results are discussed. The results provide a convenient way of obtaining the necessary (trade-off) information for continuing into the analyst-decision-maker interactive phase of the multiobjective decision-making process. This paper further extends the theoretical basis of the Surrogate Worth Trade-off (SWT) method, a multiobjective optimization method that first appeared in the scientific literature in 1974.
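The trade-off interpretation can be sketched via the standard epsilon-constraint formulation that underlies the SWT method (standard textbook notation, assumed rather than quoted from the paper):

```latex
% epsilon-constraint problem: minimize one objective, bound the others
\min_{x}\; f_1(x) \quad \text{s.t.} \quad f_j(x) \le \varepsilon_j,
\quad j = 2, \dots, k
% At a noninferior solution where the j-th constraint is binding, the
% optimal Kuhn-Tucker multiplier equals the local trade-off rate:
\lambda_{1j} = -\,\frac{\partial f_1}{\partial f_j}
```

That is, the multiplier reports how much of objective f_1 must be given up per unit of improvement in objective f_j along the noninferior frontier, which is exactly the information the analyst presents to the decision-maker in the interactive phase.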


Risk Analysis | 1999

A Survey of Approaches for Assessing and Managing the Risk of Extremes

Vicki M. Bier; Yacov Y. Haimes; James H. Lambert; Nicholas C. Matalas; Rae Zimmerman

In this paper, we review methods for assessing and managing the risk of extreme events, where “extreme events” are defined to be rare, severe, and outside the normal range of experience of the system in question. First, we discuss several systematic approaches for identifying possible extreme events. We then discuss some issues related to risk assessment of extreme events, including what type of output is needed (e.g., a single probability vs. a probability distribution), and alternatives to the probabilistic approach. Next, we present a number of probabilistic methods. These include: guidelines for eliciting informative probability distributions from experts; maximum entropy distributions; extreme value theory; other approaches for constructing prior distributions (such as reference or noninformative priors); the use of modeling and decomposition to estimate the probability (or distribution) of interest; and bounding methods. Finally, we briefly discuss several approaches for managing the risk of extreme events, and conclude with recommendations and directions for future research.
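One of the probabilistic methods the survey covers, extreme value theory, can be sketched with a block-maxima fit. The simulated data and parameter values below are illustrative assumptions, not anything from the paper:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)

# Illustrative data: 50 years of annual maxima from a simulated
# daily process (hypothetical loc/scale values).
daily = rng.gumbel(loc=10.0, scale=2.0, size=(50, 365))
annual_max = daily.max(axis=1)

# Fit a generalized extreme value (GEV) distribution to the block maxima.
shape, loc, scale = genextreme.fit(annual_max)

# 100-year return level: the value exceeded with probability 1/100 per year.
return_level_100 = genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
print(return_level_100)
```

This is the block-maxima route; the survey also discusses alternatives (expert elicitation, maximum entropy, reference priors, decomposition, and bounding) for cases where so little data exist that even a tail fit is optimistic.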


Reliability Engineering & System Safety | 2009

Assessing uncertainty in extreme events: Applications to risk-based decision making in interdependent infrastructure sectors

Kash Barker; Yacov Y. Haimes

Risk-based decision making often relies upon expert probability assessments, particularly for the consequences of disruptive events that are extreme or catastrophic in nature. Naturally, such expert-elicited probability distributions can be fraught with errors, as they describe events that occur very infrequently and for which only sparse data exist. This paper presents a quantitative framework, the extreme event uncertainty sensitivity impact method (EE-USIM), for measuring the sensitivity of extreme event consequences to uncertainties in the parameters of the underlying probability distribution. The EE-USIM is demonstrated with the Inoperability Input–Output Model (IIM), a model with which to evaluate the propagation of inoperability throughout an interdependent set of economic and infrastructure sectors. The EE-USIM also makes use of a two-sided power distribution function generated by expert elicitation of extreme event consequences.
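A minimal sketch of the kind of sensitivity check the EE-USIM formalizes: sample a two-sided power (TSP) distribution by inverse transform, compute an upper-tail conditional expectation as the extreme-event metric, and see how it shifts when one elicited parameter (the upper bound) is perturbed. All parameter values here are illustrative, not the paper's.

```python
import numpy as np

def tsp_sample(rng, a, m, b, n, size):
    """Inverse-transform samples from a TSP(a, m, b, n) distribution
    on [a, b] with mode m and shape parameter n."""
    u = rng.random(size)
    p = (m - a) / (b - a)          # CDF value at the mode
    left = u < p
    x = np.empty(size)
    x[left] = a + (m - a) * (u[left] / p) ** (1.0 / n)
    x[~left] = b - (b - m) * ((1.0 - u[~left]) / (1.0 - p)) ** (1.0 / n)
    return x

def tail_mean(x, alpha=0.95):
    """Conditional expectation above the alpha-quantile
    (a common extreme-event consequence metric)."""
    q = np.quantile(x, alpha)
    return x[x >= q].mean()

rng = np.random.default_rng(1)
# Baseline elicitation vs. a perturbed upper bound (hypothetical values).
base = tail_mean(tsp_sample(rng, a=0.0, m=2.0, b=10.0, n=2.0, size=100_000))
wide = tail_mean(tsp_sample(rng, a=0.0, m=2.0, b=14.0, n=2.0, size=100_000))
print(base, wide)  # widening the elicited upper bound inflates the tail metric
```

The gap between the two tail metrics is the kind of sensitivity the EE-USIM quantifies: extreme-event consequences respond strongly to the elicited bound even when the bulk of the distribution barely moves.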


IEEE Transactions on Systems, Man, and Cybernetics | 1987

The envelope approach for multiobjective optimization problems

Duan Li; Yacov Y. Haimes

A multiobjective optimization problem is usually solved by finding the set of all noninferior solutions to the problem. A methodology termed the envelope approach is presented for generating the set of noninferior solutions. The relationship between the envelope approach and multiobjective optimization is explored. Investigation of the use of the envelope approach in multiobjective dynamic programming and in the parametric decomposition method shows that this approach is very suitable for solving certain classes of multiobjective optimization problems by decomposition and coordination.


Water Resources Research | 1992

Optimal maintenance‐related decision making for deteriorating water distribution systems: 1. Semi‐Markovian Model for a water main

Duan Li; Yacov Y. Haimes

An optimal maintenance-related decision-making problem is investigated in this paper for deteriorating water distribution systems. A semi-Markovian model is developed to capture the dynamic evolution of the failure mode of a deteriorating main pipe, thus facilitating the determination of the optimal replacement/repair decision at various deteriorating stages. An example problem is solved to demonstrate the proposed methodology and to show the trade-off between the system's availability and the expected maintenance cost.
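As a simplified illustration of the availability side of that trade-off, here is a small discrete-time Markov chain for a deteriorating pipe under a repair policy (states, transition probabilities, and the discrete-time simplification are all assumptions for the sketch; the paper uses a richer semi-Markovian model):

```python
import numpy as np

# States: 0 = good, 1 = deteriorated, 2 = failed (under repair).
# The repair policy returns a failed pipe to "good". Values are hypothetical.
P = np.array([
    [0.90, 0.09, 0.01],   # good -> good / deteriorated / failed
    [0.00, 0.80, 0.20],   # deteriorated -> deteriorated / failed
    [1.00, 0.00, 0.00],   # failed -> repaired back to good
])

# Stationary distribution pi solves pi P = pi with sum(pi) = 1:
# take the eigenvector of P^T for the eigenvalue closest to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

availability = pi[0] + pi[1]   # long-run fraction of time in an operating state
repair_rate = pi[2]            # long-run fraction of periods spent in repair
print(availability, repair_rate)
```

Attaching a cost to each period spent in state 2 (or to each preventive repair from state 1) turns `repair_rate` into an expected maintenance cost, making the availability-versus-cost trade-off explicit for each candidate policy.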


Automatica | 1988

Hierarchical multiobjective analysis for large-scale systems: reviews and current status

Yacov Y. Haimes; Duan Li

Large-scale systems are often characterized by hierarchical structure, and they usually have multiple objectives that are noncommensurable. Two well-known approaches to the analysis of systems—hierarchical system theory and multiobjective optimization—have been developed to deal with these two aspects of large-scale systems. The past decade has seen an increasing concern with the integration of these two approaches into a unified framework for large-scale systems, leading to the emergence of a new field known as hierarchical multiobjective analysis. This paper provides a systematic review of the literature associated with the modeling and optimization of large-scale systems, focusing on research based on the hierarchical multiobjective approach, the overlapping decomposition approach, and the multimodel approach. A large number of recently published papers that fall within these areas are reviewed.

Collaboration


Dive into Yacov Y. Haimes's collaboration network.

Top Co-Authors

Duan Li (Case Western Reserve University)
Vira Chankong (Case Western Reserve University)
Joost R. Santos (George Washington University)
Julia Pet-Edwards (Case Western Reserve University)
Eugene Z. Stakhiv (United States Army Corps of Engineers)