Publication


Featured research published by Walter L. Perry.


Archive | 2013

Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations

Walter L. Perry; Brian McInnis; Carter Price; Susan Smith; John S. Hollywood



International Symposium on Intelligent Control | 1991

Belief function divergence as a classifier

Walter L. Perry; Harry E. Stephanou

A technique is presented for classifying observations of the environment that can be expressed as aggregates of disparate belief functions. The belief functions reflect a level of precision consistent with the current operational status of the sensor suite and the occlusions present in the environment. The classification process consists of applying a divergence measure to the evidential aggregate of belief functions and a set of prototype aggregate belief functions in a knowledge base. Divergence measures the difference between the information present in the combination of the two belief functions and each of the constituents alone. The measure of divergence addresses both the similarity between the two belief functions and their levels of support, thus incorporating both aspects of precision, namely, specificity and certainty. The knowledge base consists of inference classes, each consisting of a set of aggregate prototypes typifying the class to a degree expressed by a fuzzy epitome coefficient.
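The combination-versus-constituents idea above can be sketched in Python. This is an illustrative sketch only: it implements Dempster's rule of combination for mass functions and uses a simple Shannon-entropy stand-in for the "information" in a belief function; the paper's actual divergence measure, which also weighs specificity and similarity, is not reproduced here, and the hypothesis names and numbers are hypothetical.

```python
from itertools import product
import math

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset of hypotheses -> mass)
    via Dempster's rule, renormalizing away the conflicting mass."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

def info(m):
    """Illustrative 'information' in a mass function: negative Shannon
    entropy of the masses (closer to 0 means more concentrated)."""
    return sum(w * math.log2(w) for w in m.values() if w > 0)

def divergence(m1, m2):
    """Difference between the information in the combination and the
    average information in each constituent alone."""
    m12 = dempster_combine(m1, m2)
    return info(m12) - 0.5 * (info(m1) + info(m2))

# Two hypothetical sensor reports over the hypotheses {tank, truck}.
m1 = {frozenset({"tank"}): 0.7, frozenset({"tank", "truck"}): 0.3}
m2 = {frozenset({"tank"}): 0.6, frozenset({"truck"}): 0.4}
fused = dempster_combine(m1, m2)  # mass concentrates on "tank"
```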


Battlespace Digitization and Network-Centric Systems III | 2003

Advanced metrics for network-centric naval operations

Walter L. Perry; Fred D. Bowden

Defense organizations around the world are formulating new visions, strategies, and concepts that utilize emerging information-age technologies. Central among these is network-based operations. Measures and metrics are needed that allow analysts to link the effects of alternative network structures, operating procedures and command and control arrangements to combat outcomes. This paper reports on measures and mathematical metrics that begin to address this problem. Networks are assessed in terms of their complexity, their ability to adapt, and the collaboration opportunity they afford. The metrics measure the contributions of complexity to information flow, and the deleterious effects of information overload and disconfirming reports on overall network performance. In addition, they measure the contributions of collaboration to shared situational awareness in terms of the accuracy and precision of the information produced and the costs associated with an imbalance of the two. We posit a fixed network connecting a Naval Task Force’s various platforms, and assess the ability of this network to support the range of missions required of the task force. The emphasis is not on connectivity, but rather on information flow and how well the network is able to adapt to alternative flow requirements. We assess the impact alternative network structures, operating procedures and command arrangements have on combat outcomes by applying the metrics to a cruise missile defense scenario.
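The abstract does not give the metrics themselves, so the sketch below is only a hedged illustration of the kind of structural complexity measure involved: the Shannon entropy of a network's degree distribution, computed over a hypothetical edge list. It is not one of the paper's metrics.

```python
import math
from collections import Counter

def degree_entropy(edges):
    """Shannon entropy (bits) of the degree distribution of an
    undirected network given as a list of (u, v) edges."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    n = len(deg)
    h = 0.0
    for count in Counter(deg.values()).values():
        p = count / n
        h -= p * math.log2(p)
    return h

# Hypothetical 4-node topologies: a ring (every node has degree 2,
# so the distribution is uniform and the entropy is zero) and a
# hub-and-spoke, whose mixed degrees give positive entropy.
ring = [(1, 2), (2, 3), (3, 4), (4, 1)]
star = [("hub", "a"), ("hub", "b"), ("hub", "c")]
```

A ring and a star connect the same handful of platforms, but the entropy separates their structures, which is the flavor of comparison the paper makes across alternative network structures.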


Winter Simulation Conference | 2015

Using causal models in heterogeneous information fusion to detect terrorists

Paul K. Davis; David Manheim; Walter L. Perry; John S. Hollywood

We describe basic research that uses a causal, uncertainty-sensitive computational model rooted in qualitative social science to fuse disparate pieces of threat information. It is a cognitive model going beyond rational-actor methods. Having such a model has proven useful when information is uncertain, fragmentary, indirect, soft, conflicting, and even deceptive. Inferences from fusion must then account for uncertainties about the model, the credibility of information, and the fusion methods; that is, we must consider both structural and parametric uncertainties, including uncertainties about the uncertainties. We use a novel combination of (1) probabilistic and parametric methods, (2) alternative models and model structures, and (3) alternative fusion methods that include nonlinear algebraic combination, variants of Bayesian inference, and a new entropy-maximizing approach. Initial results are encouraging and suggest that such an analytically flexible and model-based approach to fusion can simultaneously enrich thinking, enhance threat detection, and reduce harmful false alarms.
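As a rough illustration of the Bayesian strand of the fusion battery, the sketch below fuses independent reports about a binary threat hypothesis in odds form, with a hypothetical credibility exponent that discounts less-trusted reports toward a likelihood ratio of 1. The report likelihoods and the discounting scheme are assumptions for this sketch, not the paper's method.

```python
def bayes_fuse(prior, reports):
    """Fuse independent reports on a binary threat hypothesis in odds form.

    Each report is (p_obs_given_threat, p_obs_given_benign, credibility),
    where credibility in [0, 1] shrinks the report's likelihood ratio
    toward 1, so a fully discredited report leaves the prior unchanged.
    """
    odds = prior / (1.0 - prior)
    for p_t, p_b, cred in reports:
        odds *= (p_t / p_b) ** cred
    return odds / (1.0 + odds)

# Hypothetical reports: one strong and fully credible, one weak and
# only half credible.
posterior = bayes_fuse(0.1, [(0.8, 0.2, 1.0), (0.6, 0.4, 0.5)])
```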


Archive | 2015

Operation IRAQI FREEDOM: Decisive War, Elusive Peace

Walter L. Perry; Jefferson P. Marquis; Richard E. Darilek; Laurinda L. Rohn; Andrea Mejia; Jerry M. Sollinger; Vipin Narang; Bruce R. Pirnie; John Gordon; Rick Brennan; Forrest E. Morgan; Alexander C. Hou; Chad Yost; David E. Mosher; Stephen T. Hosmer; Edward O'Connell; Miranda Priebe; Lowell H. Schwartz; Nora Bensahel; Olga Oliker; Keith Crane; Heather S. Gregg; Andrew Rathmell; Eric Peltz; David Kassing; Marc Robbins; Kenneth J. Girardini; Brian Nichiporuk; Peter Schirmer; John Halliday

Soon after Operation IRAQI FREEDOM began in March 2003, RAND Arroyo Center began a project, completed in January 2006, to produce an authoritative account of the planning and execution of combat and stability operations in Iraq and to recommend changes to Army plans, operational concepts, doctrine, and Title 10 functions. This report presents a broad overview of the study findings based on unclassified source material.


Archive | 2015

Causal Models and Exploratory Analysis in Heterogeneous Information Fusion for Detecting Potential Terrorists

Paul K. Davis; David Manheim; Walter L. Perry; John S. Hollywood

We describe research fusing heterogeneous information in an effort to eventually detect terrorists, reduce false alarms, and exonerate those falsely identified. The specific research is more humble, using synthetic data and first versions of fusion methods. Both the information and the fusion methods are subject to deep uncertainty. The information may also be fragmentary, indirect, soft, conflicting, and even deceptive. We developed a research prototype of an analyst-centric fusion platform. This uses (1) causal computational models rooted in social science to relate observable information about individuals to an estimate of the threat that the individual poses and (2) a battery of different methods to fuse across information reports. We account for uncertainties about the causal model, the information, and the fusion methods. We address structural and parametric uncertainties, including uncertainties about the uncertainties, at different levels of detail. We use a combination of (1) probabilistic and parametric methods, (2) alternative models, and (3) alternative fusion methods that include nonlinear algebraic combination, Bayesian inference, and an entropy-maximizing approach. This paper focuses primarily on dealing with deep uncertainty in multiple dimensions.


Conference on Decision and Control | 1991

A quantitative treatment of multilevel specificity and certainty in variable precision reasoning

Walter L. Perry; Harry E. Stephanou

The authors develop a methodology for reasoning about the state of the environment based on evidence received from some source. It is assumed that the evidence is expressed as a probability mass function defined on a discrete set of mutually exclusive hypotheses about the state of the environment. Given that the quality of the evidence is variable, it follows that the precision of the reasoning process must also vary. That is, the level of specificity and the certainty associated with decisions made at that level depend directly on the quality of the evidence. An indistinguishability measure is used to generate a core set of aggregate focal elements, each of which may consist of logical disjunctions of the basic hypothesis set. The measure takes into account both the differences in support levels for the hypotheses and the degree to which they are similar. Partial dominance is then used to associate a basic probability assignment on the core set. This approach makes it possible to apply simple, quantitative methods to express the variations in the precision associated with decisions. The result is a set of aggregate hypotheses and their support levels which become input to the classification process. In most cases, multiple sets of aggregate hypotheses will be used in an evidential classification scheme to produce a composite characterization of the environment.


Systems, Man and Cybernetics | 1993

A quantitative treatment of multilevel specificity and uncertainty in variable precision reasoning

Walter L. Perry; Harry E. Stephanou

A methodology for reasoning about the state of the environment based on evidence received from some source is developed. It is assumed that the evidence is expressed as a probability mass function defined on a discrete set of mutually exclusive hypotheses about the state of the environment. Given that the quality of the evidence is variable, it follows that the precision of the reasoning process must also vary. The level of specificity and the certainty associated with decisions made at that level depend directly on the quality of the evidence. An indistinguishability measure is used to generate a core set of aggregate focal elements, each of which may consist of logical disjunctions of the basic hypothesis set. Partial dominance is then used to associate a basic probability assignment on the core set. This approach allows simple, quantitative methods to express the variations in the precision associated with decisions. The result is a set of aggregate hypotheses and their support levels that become input to the classification process.
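A minimal sketch of the aggregation idea, under a strong simplifying assumption: hypotheses are grouped into disjunctions only when their support levels are close, whereas the indistinguishability measure described above also weighs how similar the hypotheses themselves are. The hypothesis names and the tolerance are hypothetical.

```python
def aggregate(support, tol=0.1):
    """Group hypotheses into disjunctions when adjacent support levels
    are within tol of each other.

    support: dict hypothesis -> support level.
    Returns a dict frozenset (aggregate focal element) -> pooled support.
    """
    if not support:
        return {}
    items = sorted(support.items(), key=lambda kv: kv[1], reverse=True)
    groups = [[items[0]]]
    for h, s in items[1:]:
        if groups[-1][-1][1] - s <= tol:
            groups[-1].append((h, s))  # indistinguishable: join the group
        else:
            groups.append([(h, s)])    # distinguishable: start a new group
    return {frozenset(h for h, _ in g): sum(s for _, s in g) for g in groups}

# Hypothetical evidence: "a" and "b" have nearly equal support, so they
# aggregate into the disjunction {a, b}; "c" stands alone.
core = aggregate({"a": 0.5, "b": 0.45, "c": 0.05})
```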


Archive | 2002

Measures of Effectiveness for the Information-Age Navy

Richard E. Darilek; Brian Nichiporuk; John Gordon; Walter L. Perry; Jerome Bracken


Archive | 2002

Measures of Effectiveness for the Information-Age Navy: The Effects of Network-Centric Operations on Combat Outcomes

Walter L. Perry; Robert W. Button; Jerome Bracken; Thomas Sullivan; Jonathan Mitchell

Collaboration


Dive into Walter L. Perry's collaborations.

Top Co-Authors

Harry E. Stephanou

University of Texas at Arlington


Paul K. Davis

Frederick S. Pardee RAND Graduate School
