Sallie Keller-McNulty
Los Alamos National Laboratory
Publications
Featured research published by Sallie Keller-McNulty.
Journal of the American Statistical Association | 1998
Vicki A. Lancaster; Sallie Keller-McNulty
Abstract A composite is formed by collecting multiple sample units and combining them, in their entirety or in part, to form a new sample. The sample units that make up the composite may retain their integrity or be homogenized through physical processes such as ball milling, sieving, shaking, or centrifuging. One or more subsequent measurements are taken on the composite, and the information on the individual sample units is lost. This counterintuitive loss of information has fueled opposition to composite sampling, while the methodology's adherents find their motivation in its ability to reduce measurement costs for many classes of problems. This article reviews the scientific literature related to the development of composite sampling methods. The literature on compositing exists only as a compendium derived from disparate disciplines, in which terms such as compositing, group screening, pooling, and weighing designs are used. The goal of this review is to synthesize this body of literature. The articles reviewed ar...
Archive | 2005
Alyson G. Wilson; Nikolaos Limnios; Sallie Keller-McNulty; Yvonne Armijo
Contents include: Competing Risk Modeling in Reliability (T Bedford); Game-Theoretic and Reliability Methods in Counter-Terrorism and Security (V Bier); Regression Models for Reliability Given the Usage Accumulation History (T Duchesne); Bayesian Methods for Assessing System Reliability: Models and Computation (T Graves & M Hamada); Dynamic Modeling in Reliability and Survival Analysis (E A Pena & E Slate); End of Life Analysis (H Wynn et al.); and other papers.
Technometrics | 2008
David M. Steinberg; Søren Bisgaard; Necip Doganaksoy; N. I. Fisher; Bert Gunter; Gerald J. Hahn; Sallie Keller-McNulty; Jon R. Kettenring; William Q. Meeker; Douglas C. Montgomery; C. F. Jeff Wu
Technometrics was founded in 1959 as a forum for publishing statistical methods and applications in engineering and the physical and chemical sciences. The expanding role of statistics in industry was a major stimulus, and, throughout the years, many articles in the journal have been motivated by industrial problems. In this panel discussion we look ahead to the future of industrial statistics. Ten experts, encompassing a range of backgrounds, experience, and expertise, answered my request to share with us their thoughts on what lies ahead in industrial statistics. Short biographical sketches of the panelists are provided at the end of the discussion. The panelists wrote independent essays, which I have combined into an integrated discussion. Most of the essays were written as responses to a list of 10 questions that I provided to help the participants direct their thoughts. I have organized the discussion in that same fashion, stating the questions and then providing the related responses. Several discussants added remarks on the role of statistics journals, particularly of Technometrics, and I have added that as a final question. We see this article not as the end of the story, but rather as the takeoff point for further discussion. To that end, we are initiating an open discussion forum; to participate, go to http://www.asq.org/pub/techno/ and click on Networking and Events. The American Society for Quality will host the forum, and Bert Gunter has graciously agreed to serve as moderator.
IEEE Transactions on Reliability | 2003
Thomas R. Bennett; Jane M. Booker; Sallie Keller-McNulty; Nozer D. Singpurwalla
As science and technology become increasingly sophisticated, government and industry are relying more and more on science's advanced methods to determine reliability. Unfortunately, political, economic, time, and other constraints imposed by the real world inhibit the ability of researchers to calculate reliability efficiently and accurately. Because of such constraints, reliability must undergo an evolutionary change. The first step in this evolution is to re-interpret the concept so that it meets the new century's needs. The next step is to quantify reliability using both empirical methods and auxiliary data sources, such as expert knowledge, corporate memory, and mathematical modeling and simulation.
Journal of the American Statistical Association | 2007
Sallie Keller-McNulty
Science, engineering, technology, and people: these are the ingredients that must come together to support the growing complexity of today's global challenges, ranging from international security to space exploration. As scientists and engineers, it is essential that we develop the means to put our work into a decision context for policy makers; otherwise, our efforts will only inform the writers of textbooks and not the leaders who shape the world within which we live. Statisticians must step up to that challenge! Scientific and technical progress requires interdisciplinary teams, because it is impossible for a single individual to have enough knowledge to solve many of today's problems, for example, mapping the genome, modeling the spread of a pandemic, and developing diagnostic and treatment devices for developing countries. A principal role of the statistician is to bring the cutting edge of the statistical sciences to these problems. By the nature of our training, statisticians are well poised to assume the role of science and technology integrator. To be successful, however, statisticians must move closer to policy pressures and politics. This address will focus on the growing expectations facing the statistical sciences and on how we, as statisticians, must take responsibility for separating the scientific method from the politics of the scientific process to guarantee that scientific excellence and impact are communicated to decision makers.
Chance | 2005
Sallie Keller-McNulty; Alyson G. Wilson; Gregory D. Wilson
"Doing science" is more complex today than ever. Yet, as scientists move toward addressing more difficult problems and realize the necessity of addressing them in a multidisciplinary fashion, efforts are complicated by the "stovepiping" of disciplines and individuals' expertise and by the fact that established scientific methods do not lend themselves to many forms of multidisciplinary or team science. By stovepiping, we mean that many scientists today are able to keep current on only a narrow slice of disciplinary expertise. Due to the increase in the number of journals and the amount of research being conducted, it is getting harder for scientists to be good at what they do and maintain a general perspective on their own disciplines, let alone science as a whole. The scientific method we traditionally have relied upon was developed centuries ago so that lone scientists could convince other lone scientists that their physical experiments were conducted "objectively." As part of this ritual of objectivity, experiments were simplified to the point that only one idea was being considered and one answer produced. Today, we often must rely upon complex computer modeling and symbolic experimentation because physical experimentation is impractical or impossible; we must integrate types of information that once would have been dismissed as subjective; and we often must work in diverse teams to address complicated, multifaceted, ongoing problems in order to produce equally robust "answers." To address the demands of modern multidisciplinary science, we are eager to build upon the foundation of the scientific method, seeking enhancements to the scientific process both by noticing the changes that have occurred in scientific practices and by pushing to develop methods that better fit the task environments in which we work.
As suggested above, the traditional scientific method was developed to isolate and minimize variables, to facilitate simple description of procedures for far-flung colleagues, and to follow the principles of logic popular during the scientific revolution. One of the key features of this model is its linearity: once the process starts, it must proceed to its conclusion and produce a product in order to be seen as successful. R. A. Fisher noted that this linear approach to science (and statistics) is flawed.
Bayesian Analysis | 2006
Brian J. Williams; Dave Higdon; Jim Gattiker; Leslie M. Moore; Michael D. McKay; Sallie Keller-McNulty
Archive | 2008
Dl Brown; John B. Bell; Donald Estep; William Gropp; B Hendrickson; Sallie Keller-McNulty; David E. Keyes; J Oden; L Petzold; Margaret H. Wright
International Statistical Review | 2006
Sallie Keller-McNulty; Charles Nakhleh; Nozer D. Singpurwalla
Submitted to: Mathematical Methods in Reliability proceedings. | 2002
Sallie Keller-McNulty; Alyson G. Wilson