Network


Latest external collaborations at the country level.

Hotspot


Research topics where Deborah Nolan is active.

Publications


Featured research published by Deborah Nolan.


Statistics & Probability Letters | 1988

Canonical kernels for density estimation

J. S. Marron; Deborah Nolan

The kernel function in density estimation is uniquely determined up to a scale factor. In this paper, we advocate one particular rescaling of a kernel function, called the canonical kernel, because it is the only version which uncouples the problems of choice of kernel and choice of scale factor. This approach is useful for both pictorial comparison of kernel density estimators and for optimal kernel theory.
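
As a rough sketch of the underlying idea (standard kernel-smoothing notation; the symbols here are ours, not necessarily the paper's): for a kernel K with roughness R(K) = ∫K² and second moment μ₂(K) = ∫u²K(u)du, the rescalings of K and the asymptotic mean integrated squared error of the kernel density estimator are

$$
\hat f_h(x) = \frac{1}{nh}\sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h}\right),
\qquad
\mathrm{AMISE}(h) = \frac{R(K)}{nh} + \frac{h^4}{4}\,\mu_2(K)^2\,R(f''),
\qquad
K_\delta(u) = \frac{1}{\delta}\,K\!\left(\frac{u}{\delta}\right).
$$

The canonical rescaling takes $\delta_0 = \{R(K)/\mu_2(K)^2\}^{1/5}$, which gives $R(K_{\delta_0}) = \mu_2(K_{\delta_0})^2$, so the same kernel constant multiplies both the variance and the squared-bias term. On this scale a single bandwidth plays the same role for every kernel shape, which is what separates the choice of kernel from the choice of scale factor.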


The American Statistician | 1999

Teaching Statistics Theory through Applications

Deborah Nolan; Terence P. Speed

This article presents a model for developing case studies, or labs, for use in undergraduate mathematical statistics courses. The model proposed here is to design labs that are more in-depth than most examples in statistical texts by providing rich background material, investigations and analyses in the context of a scientific problem, and detailed theoretical development within the lab. An important goal of this approach is to encourage and develop statistical thinking. It is also advocated that the labs be made the centerpiece of the theoretical course. As a result, the curriculum, lectures, and assignments are significantly restructured. For example, the course work includes written assignments based on open-ended data analyses, and the lectures include group work and discussions of the case studies.


The American Statistician | 2010

Computing in the Statistics Curricula

Deborah Nolan; Duncan Temple Lang

The nature of statistics is changing significantly with many opportunities to broaden the discipline and its impact on science and policy. To realize this potential, our curricula and educational culture must change. While there are opportunities for significant change in many dimensions, we focus more narrowly on computing and call for computing concepts to be integrated into the statistics curricula at all levels. Computational literacy and programming are as fundamental to statistical practice and research as mathematics. We advocate that our field needs to define statistical computing more broadly to include advancements in modern computing, beyond traditional numerical algorithms. Information technologies are increasingly important and should be added to the curriculum, as should the ability to reason about computational resources, work with large datasets, and perform computationally intensive tasks. We present an approach to teaching these topics in combination with scientific problems and modern statistical methods that focuses on ideas and skills for statistical inquiry and working with data. We outline the broad set of computational topics we might want students to encounter and offer ideas on how to teach them. We also discuss efforts to share pedagogical resources to help faculty teach this modern material (including supplemental materials).


The American Statistician | 2015

Data Science in Statistics Curricula: Preparing Students to “Think with Data”

Johanna Hardin; Roger Hoerl; Nicholas J. Horton; Deborah Nolan; Benjamin Baumer; O. Hall-Holt; Paul Murrell; Roger D. Peng; P. Roback; D. Temple Lang; Mark Daniel Ward

A growing number of students are completing undergraduate degrees in statistics and entering the workforce as data analysts. In these positions, they are expected to understand how to use databases and other data warehouses, scrape data from Internet sources, program solutions to complex problems in multiple languages, and think algorithmically as well as statistically. These data science topics have not traditionally been a major component of undergraduate programs in statistics. Consequently, a curricular shift is needed to address additional learning outcomes. The goal of this article is to motivate the importance of data science proficiency and to provide examples and resources for instructors to implement data science in their own statistics curricula. We provide case studies from seven institutions. These varied approaches to teaching data science demonstrate curricular innovations to address new needs. Also included here are examples of assignments designed for courses that foster engagement of undergraduates with data and data science. [Received November 2014. Revised July 2015.]
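As a small, hypothetical illustration of the kind of task described above (not an assignment from the article; the table and values are made up), the sketch below combines a database query with a grouped statistical summary in Python:

```python
# Hypothetical mini-exercise in the spirit of the tasks described above (not
# an assignment from the article; the table and values are made up): query a
# small SQLite database and compute a grouped statistical summary.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE measurements (site TEXT, value REAL);
    INSERT INTO measurements VALUES
        ('A', 1.2), ('A', 1.9), ('A', 1.4),
        ('B', 3.1), ('B', 2.7), ('B', 3.4);
""")

# Pull the query result into a data frame and summarize by group.
df = pd.read_sql_query("SELECT site, value FROM measurements", conn)
print(df.groupby("site")["value"].agg(["count", "mean", "std"]))
```

sqlite3 and pandas are our choices here; the article itself surveys approaches from seven institutions rather than prescribing particular tools.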


The American Statistician | 2002

You Can Load a Die, But You Can't Bias a Coin

Andrew Gelman; Deborah Nolan

Dice can be loaded—that is, one can easily alter a die so that the probabilities of landing on the six sides are dramatically unequal. However, it is not possible to bias a coin flip—that is, one cannot, for example, weight a coin so that it is substantially more likely to land “heads” than “tails” when flipped and caught in the hand in the usual manner. Coin tosses can be biased only if the coin is allowed to bounce or be spun rather than simply flipped in the air. We describe a student activity with dice and coins that gives empirical evidence to support this property, and we use this activity when we teach design of experiments and hypothesis testing in our introductory statistics courses. We explain this phenomenon by summarizing a physical argument made in earlier literature.
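A minimal sketch of the hypothesis test such a flipping activity leads to (the counts below are made up, and the use of scipy is our choice, not something specified in the paper):

```python
# Hypothetical classroom counts (not data from the paper): test whether a
# "weighted" coin, flipped and caught in the hand, shows detectable bias.
from scipy.stats import binomtest

n_flips = 200        # flips recorded by a student group (made up)
n_heads = 108        # heads observed (made up)

# Two-sided exact binomial test of H0: P(heads) = 1/2.
result = binomtest(n_heads, n_flips, p=0.5, alternative="two-sided")
print(f"observed proportion of heads: {n_heads / n_flips:.3f}")
print(f"p-value: {result.pvalue:.3f}")   # a large p-value gives no evidence of bias
```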


Statistics & Probability Letters | 1999

On min-max majority and deepest points

Deborah Nolan

The asymptotic properties of a multivariate location estimator are obtained in this paper. The estimator examined is based on the notion of half-space depth, where the depth of a point is the minimum probability content of all half spaces containing the point. The location estimator of interest is the deepest point with respect to the empirical measure on half spaces. For angularly symmetric distributions, this estimator is consistent. For two dimensions, the exact limit distribution is derived, and the extension of the limit distribution results to higher dimensions is discussed.
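As a rough numerical illustration of the definitions above (a sketch in our own notation, not code from the paper), the empirical half-space depth of a point in the plane can be approximated by scanning directions, and the deepest point found by a crude grid search:

```python
# Sketch of empirical half-space depth in two dimensions: the depth of a point
# theta is approximated by minimizing, over a grid of directions u, the
# fraction of observations in the closed half-plane {x : u.(x - theta) >= 0}.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))        # sample from a symmetric distribution

def halfspace_depth(theta, X, n_dirs=180):
    angles = np.linspace(0.0, 2 * np.pi, n_dirs, endpoint=False)
    dirs = np.column_stack([np.cos(angles), np.sin(angles)])   # unit directions
    # For each direction, the fraction of points on the non-negative side of theta.
    fractions = ((X - theta) @ dirs.T >= 0).mean(axis=0)
    return fractions.min()

# Deepest point via a crude grid search over candidate locations.
grid = np.linspace(-1.0, 1.0, 21)
candidates = np.array([(a, b) for a in grid for b in grid])
depths = np.array([halfspace_depth(c, X) for c in candidates])
deepest = candidates[depths.argmax()]
print("approximate deepest point:", deepest, "with depth", depths.max())
```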


Annual Review of Statistics and Its Application | 2017

Curriculum Guidelines for Undergraduate Programs in Data Science

Richard D. De Veaux; Mahesh Agarwal; Maia Averett; Benjamin Baumer; Andrew Bray; Thomas C. Bressoud; Lance Bryant; Lei Z. Cheng; Amanda Francis; Robert G. Gould; Albert Y. Kim; Matt Kretchmar; Qin Lu; Ann Moskol; Deborah Nolan; Roberto Pelayo; Sean Raleigh; Ricky J. Sethi; Mutiara Sondjaja; Neelesh Tiruviluamala; Paul X. Uhlig; Talitha M. Washington; Curtis L. Wesley; David White; Ping Ye

The Park City Math Institute (PCMI) 2016 Summer Undergraduate Faculty Program met for the purpose of composing guidelines for undergraduate programs in Data Science. The group consisted of 25 undergraduate faculty from a variety of institutions in the U.S., primarily from the disciplines of mathematics, statistics and computer science. These guidelines are meant to provide some structure for institutions planning for or revising a major in Data Science.


Teaching Statistics | 2002

A Probability Model for Golf Putting

Andrew Gelman; Deborah Nolan

We derive a model, using trigonometry and the Normal distribution, for the probability that a golf putt is successful. We describe a class activity in which we lead the students through the steps of examining the data, considering possible models, constructing a probability model, and checking the fit. The model is, of necessity, oversimplified, a point which the class discusses at the end of the demonstration.
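In outline, the trigonometric argument can be sketched as follows (the symbols are ours): with a ball of radius r, a hole of radius R, and a putt of length x, the ball drops when the angular error θ of the shot is small enough, and θ is modeled as Normal with mean zero:

$$
|\theta| < \arcsin\!\left(\frac{R - r}{x}\right),
\qquad \theta \sim \mathrm N(0, \sigma^2)
\quad\Longrightarrow\quad
\Pr(\text{holed at distance } x) = 2\,\Phi\!\left(\frac{1}{\sigma}\arcsin\frac{R - r}{x}\right) - 1 .
$$

Estimating the single parameter σ from made-or-missed counts at each distance then gives the fitted success curve whose adequacy the class discusses.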


Probability Theory and Related Fields | 1989

Uniform consistency of automatic and location-adaptive delta-sequence estimators

Deborah Nolan; J. Stephen Marron

The class of delta-sequence estimators for a probability density includes the kernel, histogram and orthogonal series types, because each can be characterized as a collection of averages of some function that is indexed by a smoothing parameter. There are two important extensions of this class. The first allows a random smoothing parameter, for example that specified by a cross-validation method. The second allows the smoothing parameter to be a function of location, for example an estimator based on nearest-neighbor distance. In this paper a general method is presented which establishes uniform consistency for all of these estimators.
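Concretely, the common form behind these estimators can be written as (notation ours):

$$
\hat f_m(x) = \frac{1}{n}\sum_{i=1}^{n} \delta_m(x, X_i),
\qquad\text{e.g.}\quad
\delta_h(x, y) = \frac{1}{h}\,K\!\left(\frac{x - y}{h}\right)\ \text{(kernel)},
\qquad
\delta(x, y) = \frac{\mathbf 1\{y \in B(x)\}}{|B(x)|}\ \text{(histogram with bin } B(x)\text{)} .
$$

The two extensions then replace the fixed index m by a data-driven choice (for example, one selected by cross-validation) or by a location-dependent m(x) (for example, a nearest-neighbor distance), and the uniform consistency argument is set up to cover all of these cases at once.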


The American Statistician | 1998

Student Projects on Statistical Literacy and the Media

Andrew Gelman; Deborah Nolan; Anna Men; Steve Warmerdam; Michelle Bautista

An important theme in an introductory statistics course is the connection between statistics and the outside world. This article describes some assignments that have been useful in getting students to learn how to gather and process information presented in the newspaper articles and scientific reports they read. We discuss two related assignments. For the first kind of assignment, students work through prepared instructional packets. Each packet contains a newspaper article that reports on a scientific study or statistical analysis, the original report on which the article was based, a worksheet with guidelines for summarizing the reported study, and a series of questions. In the second kind of assignment, each student is required to find a newspaper article on their own, track down the original report, summarize the study using our guidelines, and write a critique of the article. Here, we describe the guidelines we developed to help the student in reading the newspaper article and original source...

Collaboration


Dive into Deborah Nolan's collaborations.

Top Co-Authors

Roger D. Peng
Johns Hopkins University

Amanda Francis
Brigham Young University

Anna Men
University of California