
Publication


Featured research published by Robert B. Noble.


Ecological Applications | 2004

Non-timber forest product extraction: Effects of harvest and browsing on an understory palm

Bryan A. Endress; David L. Gorchov; Robert B. Noble

Despite the advocacy for non-timber forest product (NTFP) extraction as a form of sustainable development, the population ecology of many NTFPs remains unstudied, making it difficult to assess the ecological impacts of extraction. We investigated the demography and population dynamics of the harvested understory palm Chamaedorea radicalis in the El Cielo Biosphere Reserve, Mexico. Our objectives were: (1) to describe patterns of C. radicalis abundance and population size structure, (2) to document C. radicalis demography, (3) to test experimentally how this demography was affected by different leaf harvest regimes and livestock browse intensities, and (4) to project their effects on transient and long-term population dynamics. Data on palm abundance and population size structure were collected from belt transects along hillsides. We also exposed 100 adult palms to each of five leaf harvest treatments (N = 500): control, harvest once per year, harvest twice per year, harvest four times per year, and a mo...
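
Objective (4) points at stage-structured population projection. A stage-based (Lefkovitch) matrix model is the standard tool for such projections, though the stage classes, rates, and matrix entries below are invented for illustration, not the paper's estimates:

```python
import numpy as np

# Hypothetical stage-based projection matrix for an understory palm
# (stages: seedling, juvenile, adult). Entries are illustrative only,
# not estimates from Endress et al.
A = np.array([
    [0.10, 0.00, 4.50],   # fecundity: adults produce new seedlings
    [0.05, 0.60, 0.00],   # seedling->juvenile growth; juvenile stasis
    [0.00, 0.15, 0.92],   # juvenile->adult growth; adult survival
])

# Long-term (asymptotic) growth rate = dominant eigenvalue lambda of A.
eigvals = np.linalg.eigvals(A)
lam = eigvals.real[np.argmax(eigvals.real)]
print(f"asymptotic growth rate lambda = {lam:.3f}")  # lambda < 1 => decline

# Transient dynamics: project an initial stage vector forward a few years.
n = np.array([200.0, 50.0, 10.0])
for year in range(5):
    n = A @ n
    print(year + 1, np.round(n, 1))
```

A harvest or browsing treatment enters such a model by lowering the relevant survival, growth, or fecundity entries and recomputing lambda.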


Risk Analysis | 2005

Model Uncertainty and Risk Estimation for Experimental Studies of Quantal Responses

A. John Bailer; Robert B. Noble; Matthew W. Wheeler

Experimental animal studies often serve as the basis for predicting risk of adverse responses in humans exposed to occupational hazards. A statistical model is applied to exposure-response data, and this fitted model may be used to obtain estimates of the exposure associated with a specified level of adverse response. Unfortunately, a number of different statistical models are candidates for fitting the data and may result in wide-ranging estimates of risk. Bayesian model averaging (BMA) offers a strategy for addressing uncertainty in the selection of statistical models when generating risk estimates. This strategy is illustrated with two examples: applying the multistage model to cancer responses, and fitting different quantal models to kidney lesion data. BMA provides excess risk estimates or benchmark dose estimates that reflect model uncertainty.
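
A minimal sketch of the averaging step, assuming BIC-based approximate posterior model probabilities (one common choice for BMA weights); the BICs and model-specific benchmark doses below are made-up numbers, not results from the paper:

```python
import numpy as np

# Hypothetical BICs and benchmark-dose estimates from candidate quantal
# models (e.g., logistic, probit, Weibull, multistage) -- illustrative only.
bic = np.array([212.4, 210.1, 215.8, 211.0])
bmd = np.array([3.2, 2.9, 4.1, 3.0])

# Approximate posterior model probabilities: w_i proportional to exp(-BIC_i/2).
w = np.exp(-0.5 * (bic - bic.min()))   # subtract min for numerical stability
w /= w.sum()

bmd_ma = np.sum(w * bmd)               # model-averaged benchmark dose
print("weights:", np.round(w, 3))
print(f"model-averaged BMD = {bmd_ma:.2f}")
```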


Environmental Toxicology and Chemistry | 2009

Comparing methods for analyzing overdispersed binary data in aquatic toxicology

Robert B. Noble; A. John Bailer; Douglas A. Noe

Historically, death is the most commonly studied effect in aquatic toxicity tests. These tests typically expose more than one organism, housed in a series of replicate chambers, to each concentration in a gradient. Although a binomial distribution is commonly employed for such effects, variability may exceed that predicted by binomial probability models. This additional variability could result from heterogeneity in the probabilities across the chambers in which the organisms are housed and exposed to concentrations of toxins. Incorrectly assuming a binomial distribution for the statistical analysis may lead to incorrect statistical inference. We consider the analysis of grouped binary data, here motivated by the study of survival. We use a computer simulation study to examine the impact of overdispersion or outliers on the analysis of binary data. We compare methods that assume a binomial distribution with generalizations that accommodate this potential overdispersion. These generalizations include adjusting the standard probit model for clustering/correlation or using alternative estimation methods such as generalized estimating equations (GEE) or generalized linear mixed models (GLMM). When data were binomial or overdispersed binomial, none of the models exhibited any significant bias when estimating regression coefficients. When the data were truly binomial, the probit model controlled type I errors, as did the Donald and Donner method and the GLMM method. When data were overdispersed, the probit model no longer controlled type I error, and its standard errors were too small. In general, the Donald and Donner and GLMM methods performed reasonably in this study, although all procedures suffered some impact in the presence of potential outliers.
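
As a hedged illustration of the kind of comparison described (not the paper's simulation design), the statsmodels sketch below simulates chamber-level heterogeneity and contrasts a naive binomial probit fit with a GEE fit that accommodates the clustering:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Simulated overdispersed toxicity test: 5 concentrations, 4 replicate
# chambers each, 10 organisms per chamber. Chamber-level heterogeneity
# in death probability induces extra-binomial variation.
conc = np.repeat([0.0, 0.5, 1.0, 1.5, 2.0], 4)
n_per = 10
p_chamber = 1 / (1 + np.exp(-(-2.0 + 1.2 * conc + rng.normal(0, 0.6, conc.size))))
deaths = rng.binomial(n_per, p_chamber)

# Naive binomial probit GLM: ignores chamber clustering, so its standard
# errors are too small when the data are overdispersed.
X = sm.add_constant(conc)
probit = sm.GLM(np.column_stack([deaths, n_per - deaths]), X,
                family=sm.families.Binomial(link=sm.families.links.Probit())).fit()

# GEE with exchangeable correlation: one cluster per chamber; robust
# (sandwich) standard errors absorb the overdispersion.
y = np.concatenate([np.r_[np.ones(d), np.zeros(n_per - d)] for d in deaths])
x_long = np.repeat(conc, n_per)
groups = np.repeat(np.arange(conc.size), n_per)
gee = sm.GEE(y, sm.add_constant(x_long), groups=groups,
             family=sm.families.Binomial(link=sm.families.links.Probit()),
             cov_struct=sm.cov_struct.Exchangeable()).fit()

print("naive probit SEs:", probit.bse.round(3))
print("GEE robust SEs:  ", gee.bse.round(3))  # typically larger here
```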


Risk Analysis | 2009

Model-Averaged Benchmark Concentration Estimates for Continuous Response Data Arising from Epidemiological Studies

Robert B. Noble; A. John Bailer; Robert Park

Worker populations often provide data on adverse responses associated with exposure to potential hazards. The relationship between hazard exposure levels and adverse response can be modeled and then inverted to estimate the exposure associated with some specified response level. One concern is that this endpoint may be sensitive to the concentration metric and other variables included in the model. Further, it may be that the models yielding different risk endpoints all provide relatively similar fits. We focus on evaluating the impact of exposure on a continuous response by constructing a model-averaged benchmark concentration (BMC) from a weighted average of model-specific benchmark concentrations. A method for combining the estimates based on different models is applied to lung function in a cohort of miners exposed to coal dust. In this analysis, we see that only a small number of the thousands of models considered survive a filtering criterion for use in averaging. Even after filtering, the models considered yield benchmark concentrations that differ by a factor of 2 to 9, depending on the concentration metric and covariates. The model-averaged BMC captures this uncertainty and provides a useful strategy for addressing model uncertainty.
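
A sketch of the filter-then-average step, under assumptions of my own: the filtering rule shown (keep models within 10 AIC units of the best fit) is one plausible criterion, not necessarily the paper's, and the AICs and BMCs are invented:

```python
import numpy as np

# Hypothetical candidate models: AICs and model-specific benchmark
# concentrations (illustrative values only).
aic = np.array([1501.2, 1499.8, 1500.5, 1512.3, 1525.0])
bmc = np.array([12.0, 9.5, 21.0, 30.0, 55.0])

# Filter: keep models fitting nearly as well as the best model.
keep = aic - aic.min() <= 10.0

# Akaike weights among the survivors, then the model-averaged BMC.
w = np.exp(-0.5 * (aic[keep] - aic[keep].min()))
w /= w.sum()
print("surviving BMCs:", bmc[keep], " weights:", np.round(w, 3))
print(f"model-averaged BMC = {np.sum(w * bmc[keep]):.1f}")
```

Even among surviving models the BMCs here span a wide range, which is exactly the spread the weighted average is meant to reflect.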


Environmental Toxicology and Chemistry | 2004

Copper tolerance in fathead minnows: I. The role of genetic and nongenetic factors

Alan S. Kolok; Elizabeth B. Peake; Laura L. Tierney; Shaun A. Roark; Robert B. Noble; Kyoungah See; Sheldon I. Guttman

Swim performances of male and female fathead minnows (Pimephales promelas) from three different suppliers were determined before and after an 8- to 9-d exposure to 175 µg/L copper (Cu). The reduction in swim performance (delta) due to the Cu exposure varied widely among individual fish, but was surprisingly consistent from one supplier to the next and between males and females. Genetic analysis of the individuals revealed significant correlations between delta and genotypic variation at the glucosephosphate isomerase-1, phosphoglucomutase-1, and lactate dehydrogenase-2 enzyme loci. Based upon delta, the most Cu-resistant fathead minnows were bred together, as were the most Cu-susceptible individuals and two groups of unselected minnows. Larvae produced by each group of adults were subjected to a survival test. The median lethal concentration (LC50) for larvae produced by Cu-resistant adults was significantly greater than the LC50s for the control groups. Surprisingly, the LC50 for the larvae produced by Cu-susceptible adults was also significantly greater than the LC50s for the control groups, but not significantly different from the larvae produced by Cu-resistant parents. While Cu tolerance has a genetic component in fathead minnows, the Cu tolerance of larval fish appears to be influenced by nongenetic as well as genetic factors.
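
For reference, an LC50 in a larval survival test like this is typically read off an inverted concentration-response fit. A minimal sketch with made-up data (not the study's), using a logistic fit on log concentration, where the linear predictor equals zero at 50% mortality:

```python
import numpy as np
import statsmodels.api as sm

# Made-up larval survival data: copper concentrations (ug/L) and deaths
# out of 20 larvae per concentration (illustrative, not the study's data).
conc = np.array([50, 100, 200, 400, 800], dtype=float)
dead = np.array([1, 4, 9, 16, 19])
n = 20

X = sm.add_constant(np.log10(conc))
fit = sm.GLM(np.column_stack([dead, n - dead]), X,
             family=sm.families.Binomial()).fit()   # logit link by default

# At the LC50, b0 + b1*log10(LC50) = 0, since logit(0.5) = 0.
b0, b1 = fit.params
lc50 = 10 ** (-b0 / b1)
print(f"estimated LC50 ~ {lc50:.0f} ug/L")
```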


Health Services and Outcomes Research Methodology | 2006

Sample size requirements for studying small populations in gerontology research

Robert B. Noble; A. John Bailer; Suzanne Kunkel; Jane Straker

Calculating the sample size required to achieve a specified level of precision when estimating population parameters is a common statistical task. As consumer surveys become increasingly common for nursing homes, home care agencies, other service providers, and state and local administrative agencies, standard methods to calculate sample size may not be adequate. Standard methods typically assume a normal approximation and require the specification of a single plausible value of the unknown population trait. This paper presents a strategy to estimate sample sizes for small finite populations when a range of possible population values, rather than a single value, is specified. This sampling strategy is hierarchical, employing first a hypergeometric sampling model, which directly addresses the finite-population concern. This level is then coupled with a beta-binomial distribution for the number of population elements possessing the characteristic of interest, which addresses the concern that the population trait may range over an interval of values. The utility of this strategy is illustrated using a study of resident satisfaction in nursing homes.
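
A simulation sketch of the hierarchical idea: a hypergeometric draw nested under a beta-binomial layer on the number of satisfied residents. The population size, beta parameters, precision target, and search grid below are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 120            # finite population (e.g., residents of one nursing home)
a, b = 4.0, 2.0    # beta parameters spanning plausible satisfaction rates
halfwidth = 0.10   # target ~95% half-width for the estimated proportion
sims = 5000

def error_sd(n):
    """Monte Carlo SD of the estimation error of the population proportion."""
    # Level 1: satisfied residents K ~ beta-binomial(N, a, b), one K per world.
    K = rng.binomial(N, rng.beta(a, b, sims))
    # Level 2: hypergeometric sample of n residents without replacement.
    x = rng.hypergeometric(K, N - K, n)
    return np.std(x / n - K / N)

# Smallest n whose simulated ~95% half-width (about 2 SDs) meets the target.
for n in range(10, N + 1, 5):
    if 2 * error_sd(n) <= halfwidth:
        print("required sample size ~", n)
        break
```

The hypergeometric layer is what lets the required n fall below the usual infinite-population answer as n approaches N.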


International Journal of Risk Assessment and Management | 2005

Incorporating uncertainty and variability in the assessment of occupational hazards

A. John Bailer; Matthew W. Wheeler; David A. Dankovic; Robert B. Noble; James F. Bena

Uncertainty reflects ignorance associated with population traits (e.g. average exposure levels to a contaminant), with models used to predict risk (e.g. which statistical model is correct), and with a host of other considerations. Variability reflects an intrinsic property of a system (e.g. body mass indices possess a distribution across a population). The incorporation of uncertainty and variability in the assessment of occupational hazards is an important objective. General issues of uncertainty and variability in occupational risk estimation are discussed. This is followed by three illustrations where: firstly, the impact of variability in an exposure assessment and sampling variability in a regression model on risk estimates is considered; secondly, the impact of uncertainty in the size of a workforce on rate modelling is considered; and thirdly, the impact of using different models to predict risk is considered.
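
One standard way to keep these two concepts distinct in an assessment is a nested ("two-dimensional") Monte Carlo. The sketch below is generic, not the paper's analysis; the distributions and the toy exposure-response function are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Outer loop: UNCERTAINTY about population-level parameters, e.g. the
# log-mean and log-SD of a lognormal exposure distribution (assumed ranges).
n_outer, n_inner = 200, 1000
risks = []
for _ in range(n_outer):
    mu = rng.normal(1.0, 0.2)       # uncertain log-mean exposure
    sigma = rng.uniform(0.4, 0.8)   # uncertain log-SD
    # Inner loop: VARIABILITY of exposure across individual workers.
    exposure = rng.lognormal(mu, sigma, n_inner)
    # Toy exposure-response relation, purely illustrative.
    risk = 1 - np.exp(-0.05 * exposure)
    risks.append(risk.mean())       # population-average risk in this "world"

# Spread across outer draws reflects uncertainty; within each draw, the
# inner distribution reflected inter-individual variability.
print("mean risk:", round(np.mean(risks), 3))
print("90% uncertainty interval:", np.percentile(risks, [5, 95]).round(3))
```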


Human and Ecological Risk Assessment | 2002

A Pooled Response Strategy for Combining Multiple Lines of Evidence to Quantitatively Estimate Impact

A. John Bailer; Michael R. Hughes; Kyoungah See; Robert B. Noble; Robert L. Schaefer

The impacts of sediment contaminants can be evaluated by different lines of evidence, including toxicity tests and ecological community studies. Responses from 10 different toxicity assays/tests were combined to arrive at a “site score.” We employed a relatively simple summary measure, pooled P-values, to quantify a potential decrement in response at a contaminated site relative to nominally clean reference sites. The response-specific P-values were defined relative to a “null” distribution of responses in reference sites and were then pooled using standard meta-analytic methods. Ecological community data were evaluated using an analogous strategy. A distribution of distances of the reference sites from the centroid of the reference sites was obtained. The distance of each test site from the centroid of the reference sites was then calculated, and the proportion of reference distances that exceeded the test-site distance was used to define an empirical P-value for that test site. A plot of the toxicity P-value versus the community P-value was used to identify sites based on both alteration in community structure and toxicity, that is, by weight of evidence. This approach provides a useful strategy for examining multiple lines of evidence that should be accessible to the broader scientific community. The use of a large collection of reference sites to empirically define P-values is appealing in that parametric distributional assumptions are avoided, although this comes at the cost of assuming that the reference sites provide an appropriate comparison group for the test sites.
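
A compact sketch of both steps. The pooling shown uses Fisher's method, one standard meta-analytic choice consistent with, but not necessarily identical to, the method the paper applied; all P-values and site data are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# --- Toxicity line of evidence: pool response-specific P-values ---
# Hypothetical empirical P-values from 10 toxicity assays at one test site.
p_vals = np.array([0.30, 0.04, 0.12, 0.50, 0.02, 0.08, 0.61, 0.15, 0.03, 0.22])

# Fisher's method: -2 * sum(log p) ~ chi-square with 2k df under the null.
chi2 = -2 * np.log(p_vals).sum()
p_tox = stats.chi2.sf(chi2, df=2 * p_vals.size)
print(f"pooled toxicity P-value = {p_tox:.4f}")

# --- Community line of evidence: empirical P-value from distances ---
# Hypothetical community metrics for 30 reference sites and 1 test site.
ref = rng.normal(0, 1, size=(30, 4))
test = np.array([1.5, -0.8, 2.0, 0.3])
centroid = ref.mean(axis=0)
ref_d = np.linalg.norm(ref - centroid, axis=1)
test_d = np.linalg.norm(test - centroid)

# Proportion of reference distances exceeding the test-site distance.
p_comm = np.mean(ref_d > test_d)
print(f"empirical community P-value = {p_comm:.3f}")
```

Plotting p_tox against p_comm across sites then flags sites that are extreme on both lines of evidence.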


Journal of Statistical Computation and Simulation | 2007

Equal-precision allocations and other constraints in stratified random sampling

S. E. Wright; Robert B. Noble; A. J. Bailer

Selecting the number of observations from each stratum is a primary decision in stratified random sampling design. Typically, allocation schemes aim to minimize or bound the variance associated with estimating some overall population parameter, subject to a limitation on sampling resources. This paper examines the impact of further constraints on allocations for stratified sampling; it was motivated by an application requiring all stratum means to be estimated with equal precision. Simple procedures for analyzing trade-offs between equal-precision allocations and those optimizing total cost or precision of overall population estimates are presented. The effect of equal-precision allocation is illustrated within the context of an anthropological study of eight strata defined by villages. In this example, the equal-precision allocation greatly improves the precision of estimating the stratum-specific means over the customary proportional allocation, with only mild degradation in the precision used to estimate the population mean.
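
The trade-off is easy to see in a toy calculation. With simple random sampling within strata and finite-population corrections ignored, Var(stratum mean) ≈ S_h²/n_h, so equalizing precision across strata requires n_h ∝ S_h². The stratum sizes and SDs below are invented, not the anthropological study's values:

```python
import numpy as np

# Invented example: 8 strata (villages) with sizes N_h and SDs S_h.
N_h = np.array([120, 80, 200, 150, 60, 90, 110, 140])
S_h = np.array([2.0, 5.0, 1.5, 3.0, 6.0, 2.5, 4.0, 1.0])
n_total = 160

# Proportional allocation: n_h proportional to N_h.
n_prop = n_total * N_h / N_h.sum()

# Equal-precision allocation for stratum means: n_h proportional to S_h^2.
n_eq = n_total * S_h**2 / (S_h**2).sum()

var_prop = S_h**2 / n_prop   # stratum-mean variances, proportional scheme
var_eq = S_h**2 / n_eq       # constant across strata by construction
print("proportional:    ", np.round(var_prop, 3))
print("equal-precision: ", np.round(var_eq, 3))
```

Under proportional allocation the high-variance strata have much worse stratum-mean precision; equal-precision allocation fixes this at some cost to the overall population estimate.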


Journal of Statistical Computation and Simulation | 2007

Comparison of relative efficiencies of sampling plans excluding certain neighboring units: a simulation study

Kyoungah See; Robert B. Noble; A. John Bailer

Ecological and environmental studies frequently involve work in settings where some physical conditions influence the effectiveness of standard sampling plans. These studies may require expensive and time-consuming sampling processes. These conditions motivate researchers to find sampling plans that provide the highest precision at the lowest cost. Several sampling plans that exclude some types of ‘neighboring’ units are discussed in this article. Sampling of this type is called sampling excluding neighboring units and is a two-dimensional adaptation of balanced sampling excluding contiguous units [Hedayat, A.S., Rao, C.R. and Stufken, J., 1988, Sampling designs excluding contiguous units. Journal of Statistical Planning and Inference, 19, 159–170]. Key features of this article are: (i) the construction of sampling plans designed to yield representative samples by avoiding the simultaneous selection of units that are, in some sense, neighbors; (ii) a simulation study which compares the relative efficiencies of these sampling plans on the basis of different correlations and sample sizes. In this study, we assume that correlations (or correlated errors) between units decrease as a function of the distance between units. The construction of a sampling plan is illustrated using actual data from a 1998 field study, which examined the insect species assemblage in a beech-maple forest.
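
A minimal rejection-sampling sketch of the two-dimensional constraint itself (no two selected grid units rook-adjacent); this illustrates the property such plans enforce, not the authors' construction method, and the grid and sample sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(11)

def sample_excluding_neighbors(rows, cols, n):
    """Draw n grid units such that no two selected units are rook-adjacent."""
    units = [(r, c) for r in range(rows) for c in range(cols)]
    while True:  # rejection sampling; fine when n is small relative to the grid
        chosen = [units[i] for i in rng.choice(len(units), n, replace=False)]
        ok = all(abs(a[0] - b[0]) + abs(a[1] - b[1]) > 1
                 for i, a in enumerate(chosen) for b in chosen[i + 1:])
        if ok:
            return chosen

print(sample_excluding_neighbors(6, 6, 5))
```

Spreading selected units apart this way pays off when nearby units are positively correlated, which is the setting the simulation study examines.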

Collaboration


Dive into Robert B. Noble's collaborations.

Top Co-Authors

Matthew W. Wheeler

National Institute for Occupational Safety and Health

Alan S. Kolok

University of Nebraska Omaha
