Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Gentry White is active.

Publication


Featured research published by Gentry White.


PLOS ONE | 2015

Holocene Demographic Changes and the Emergence of Complex Societies in Prehistoric Australia

Alan N. Williams; Sean Ulm; Chris S. M. Turney; David Rohde; Gentry White

A continental-scale model of Holocene Australian hunter-gatherer demography and mobility is generated using radiocarbon data and geospatial techniques. Results show a delayed expansion and settlement of much of Australia following the termination of the late Pleistocene until after 9,000 years ago (or 9ka). The onset of the Holocene climatic optimum (9-6ka) coincides with rapid expansion, growth and establishment of regional populations across ~75% of Australia, including much of the arid zone. This diffusion from isolated Pleistocene refugia provides a mechanism for the synchronous spread of pan-continental archaeological and linguistic attributes at this time (e.g. Pama-Nyungan language, Panaramitee art style, backed artefacts). We argue longer patch residence times were possible at the end of the optimum, resulting in a shift to more sedentary lifestyles and establishment of low-level food production in some parts of the continent. The onset of El Niño - Southern Oscillation (ENSO; 4.5-2ka) restricted low-level food production, and resulted in population fragmentation, abandonment of marginal areas, and reduction in ranging territory of ~26%. Importantly, climate amelioration brought about by more pervasive La Niña conditions (post-2ka), resulted in an intensification of the mobility strategies and technological innovations that were developed in the early- to mid-Holocene. These changes resulted in population expansion and utilization of the entire continent. We propose that it was under these demographically packed conditions that the complex social and religious societies observed at colonial contact were formed.


Computational Statistics & Data Analysis | 2011

Mechanism-based emulation of dynamic simulation models: Concept and application in hydrology

P. Reichert; Gentry White; M.J. Bayarri; E.B. Pitman

Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant technique for developing emulators has been to use priors in the form of Gaussian stochastic processes (GASP), conditioned on a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept of developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, due to the consideration of our knowledge about dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by the application to a simple hydrological model.
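As a rough illustration of the conditioning step, the following sketch (plain numpy, not the authors' code) runs a scalar Kalman filter and Rauch-Tung-Striebel smoother over a design run. The transition coefficient A, innovation variance q, and observation variance r are illustrative placeholders for the simplified linear model and the Gaussian-process innovation terms described above:

import numpy as np

def kalman_smooth(y, A, q, r, m0=0.0, p0=1.0):
    """Filter, then RTS-smooth, a scalar linear-Gaussian state-space model."""
    n = len(y)
    m_f = np.empty(n); p_f = np.empty(n)   # filtered means / variances
    m_p = np.empty(n); p_p = np.empty(n)   # one-step predictions
    m, p = m0, p0
    for t in range(n):
        m_p[t], p_p[t] = A * m, A * A * p + q     # predict
        k = p_p[t] / (p_p[t] + r)                 # Kalman gain
        m = m_p[t] + k * (y[t] - m_p[t])          # update with design output
        p = (1.0 - k) * p_p[t]
        m_f[t], p_f[t] = m, p
    m_s = m_f.copy(); p_s = p_f.copy()            # backward (RTS) pass
    for t in range(n - 2, -1, -1):
        g = p_f[t] * A / p_p[t + 1]
        m_s[t] = m_f[t] + g * (m_s[t + 1] - m_p[t + 1])
        p_s[t] = p_f[t] + g * g * (p_s[t + 1] - p_p[t + 1])
    return m_s, p_s

# The smoothed states act as the emulator's mean and uncertainty at the
# design points of a (here synthetic) simulator run.
y_design = np.sin(np.linspace(0, 3, 50)) + 0.05 * np.random.randn(50)
mean, var = kalman_smooth(y_design, A=0.98, q=0.01, r=0.05 ** 2)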


Journal of Urban Affairs | 2014

Ethnic Diversity and Its Impact on Community Social Cohesion and Neighborly Exchange

Rebecca Wickes; Renee Zahnow; Gentry White; Lorraine Mazerolle

Putnam’s “constrict theory” suggests that ethnic diversity creates challenges for developing and sustaining social capital in urban settings. He argues that diversity decreases social cohesion and reduces social interactions among community residents. While Putnam’s thesis is the subject of much debate in North America, the United Kingdom, and Europe, there is a limited focus on how ethnic diversity impacts upon social cohesion and neighborly exchange behaviors in Australia. Employing multilevel modeling and utilizing administrative and survey data from 4,000 residents living in 148 Brisbane suburbs, we assess whether ethnic diversity lowers social cohesion and increases “hunkering.” Our findings indicate that social cohesion and neighborly exchange are attenuated in ethnically diverse suburbs. However, diversity is less consequential for neighborly exchange among immigrants when compared to the general population. Our results provide at least partial support for Putnam’s thesis.
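The following sketch shows the general shape of such a multilevel analysis: a random-intercept model of resident-level cohesion scores nested within suburbs, fit with statsmodels. The variable names and simulated data are illustrative stand-ins, not the Brisbane survey items:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_suburbs, n_per = 148, 27                         # ~4,000 residents
suburb = np.repeat(np.arange(n_suburbs), n_per)
diversity = rng.uniform(0, 1, n_suburbs)[suburb]   # suburb-level index
immigrant = rng.binomial(1, 0.25, suburb.size)     # resident-level flag
u = rng.normal(0, 0.3, n_suburbs)[suburb]          # suburb random intercept
cohesion = (3.5 - 0.8 * diversity + 0.2 * diversity * immigrant
            + u + rng.normal(0, 0.5, suburb.size))
df = pd.DataFrame(dict(cohesion=cohesion, diversity=diversity,
                       immigrant=immigrant, suburb=suburb))

# Residents (level 1) nested in suburbs (level 2); the interaction term
# asks whether diversity is less consequential for immigrants.
fit = smf.mixedlm("cohesion ~ diversity * immigrant", df,
                  groups=df["suburb"]).fit()
print(fit.summary())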


Journal of Research in Crime and Delinquency | 2015

Burglar Target Selection: A Cross-national Comparison

Michael Kenneth Townsley; Daniel James Birks; Wim Bernasco; Stijn Ruiter; Shane D. Johnson; Gentry White; Scott Baum

Objectives: This study builds on research undertaken by Bernasco and Nieuwbeerta and explores the generalizability of a theoretically derived offender target selection model in three cross-national study regions. Methods: Taking a discrete spatial choice approach, we estimate the impact of both environment- and offender-level factors on residential burglary placement in the Netherlands, the United Kingdom, and Australia. Combining cleared burglary data from all study regions in a single statistical model, we make statistical comparisons between environments. Results: In all three study regions, the likelihood an offender selects an area for burglary is positively influenced by proximity to their home, the proportion of easily accessible targets, and the total number of targets available. Furthermore, in two of the three study regions, juvenile offenders under the legal driving age are significantly more influenced by target proximity than adult offenders. Post hoc tests indicate the magnitudes of these impacts vary significantly between study regions. Conclusions: While burglary target selection strategies are consistent with opportunity-based explanations of offending, the impact of environmental context is significant. As such, the approach undertaken in combining observations from multiple study regions may aid criminology scholars in assessing the generalizability of observed findings across multiple environments.
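A minimal sketch of the discrete spatial choice setup, a conditional logit over candidate areas estimated by maximum likelihood; the attributes and data here are simulated, not the study's variables:

import numpy as np
from scipy.optimize import minimize

def neg_loglik(beta, X, chosen):
    """X: (offenders, areas, attributes); chosen: index of burgled area."""
    v = X @ beta                                    # utilities, shape (n, J)
    v -= v.max(axis=1, keepdims=True)               # stabilise the exp
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(chosen)), chosen].sum()

rng = np.random.default_rng(0)
n, J, K = 200, 30, 3            # offenders, candidate areas, attributes
X = rng.normal(size=(n, J, K))  # e.g. -distance to home, access, targets
true_beta = np.array([1.0, 0.5, 0.3])
u = X @ true_beta + rng.gumbel(size=(n, J))         # random-utility draw
chosen = u.argmax(axis=1)                           # each offender's choice

fit = minimize(neg_loglik, np.zeros(K), args=(X, chosen), method="BFGS")
print(fit.x)                                        # roughly recovers true_beta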


Computational Statistics & Data Analysis | 2009

A stochastic neighborhood conditional autoregressive model for spatial data

Gentry White; Sujit K. Ghosh

A spatial process observed over a lattice or a set of irregular regions is usually modeled using a conditionally autoregressive (CAR) model. The neighborhoods within a CAR model are generally formed deterministically using the inter-distances or boundaries between the regions. An extension of the CAR model is proposed in this article where the selection of the neighborhood depends on unknown parameter(s). This extension is called a Stochastic Neighborhood CAR (SNCAR) model. The resulting model shows flexibility in accurately estimating covariance structures for data generated from a variety of spatial covariance models. Specific examples are illustrated using data generated from some common spatial covariance functions as well as real data concerning radioactive contamination of the soil in Switzerland after the Chernobyl accident.
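A rough sketch of the stochastic-neighborhood idea: the adjacency matrix is determined by a distance threshold delta treated as an unknown parameter, combined with the standard proper-CAR precision tau * (D - rho * W). The exact parameterisation in the article may differ:

import numpy as np

def car_logdensity(x, coords, delta, rho=0.9, tau=1.0):
    """Gaussian CAR log-density with neighbors defined by radius delta."""
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    W = ((d > 0) & (d <= delta)).astype(float)      # neighbors within delta
    D = np.diag(W.sum(axis=1))
    Q = tau * (D - rho * W)                         # CAR precision matrix
    Q += 1e-8 * np.eye(len(x))                      # guard: isolated regions
    sign, logdet = np.linalg.slogdet(Q)
    return 0.5 * (logdet - x @ Q @ x - len(x) * np.log(2 * np.pi))

coords = np.random.default_rng(2).uniform(0, 10, size=(50, 2))
x = np.random.default_rng(3).normal(size=50)
# Profiling the log-density over candidate radii mimics treating the
# neighborhood itself as an unknown to be estimated:
for delta in (1.0, 2.0, 3.0):
    print(delta, car_logdensity(x, coords, delta))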


Law & Policy | 2012

Violence in and Around Entertainment Districts: A Longitudinal Analysis of the Impact of Late‐Night Lockout Legislation

Lorraine Mazerolle; Gentry White; Janet Ransley; Patricia Ferguson

Violence in entertainment districts is a major problem across urban landscapes throughout the world. Research shows that licensed premises are the third most common location for homicides and serious assaults, accounting for one in ten fatal and nonfatal assaults. One class of interventions that aims to reduce violence in entertainment districts involves the use of civil remedies: a group of strategies that use civil or regulatory measures as legal “levers” to reduce problem behavior. One specific civil remedy used to reduce problematic behavior in entertainment districts involves manipulation of licensed premise trading hours. This article uses generalized linear models to analyze the impact of lockout legislation on recorded violent offences in two entertainment districts in the Australian state of Queensland. Our research shows that 3 a.m. lockout legislation led to a direct and significant reduction in the number of violent incidents inside licensed premises. Indeed, the lockouts cut the level of violent crime inside licensed premises by half. Despite these impressive results for the control of violence inside licensed premises, we found no evidence that the lockout had any impact on violence on streets and footpaths outside licensed premises that were the site for more than 80 percent of entertainment district violence. Overall, however, our analysis suggests that lockouts are an important mechanism that helps to control the level of violence inside licensed premises but that finely grained contextual responses to alcohol-related problems are needed rather than one-size-fits-all solutions.
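The sketch below illustrates the kind of generalized linear model used for such an intervention analysis: a Poisson regression of monthly offence counts on a time trend and a lockout indicator. The data are simulated stand-ins, not the Queensland series:

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
months = np.arange(72)
lockout = (months >= 36).astype(int)                 # lockout takes effect
mu = np.exp(3.0 + 0.002 * months - 0.7 * lockout)    # ~50% drop in rate
counts = rng.poisson(mu)
df = pd.DataFrame(dict(counts=counts, month=months, lockout=lockout))

fit = smf.glm("counts ~ month + lockout", df,
              family=sm.families.Poisson()).fit()
print(np.exp(fit.params["lockout"]))                 # post-lockout rate ratio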


Computational Statistics & Data Analysis | 2014

GPU accelerated MCMC for modeling terrorist activity

Gentry White; Michael D. Porter

The use of graphics processing unit (GPU) parallel processing is becoming a part of mainstream statistical practice. The reliance of Bayesian statistics on Markov chain Monte Carlo (MCMC) methods makes the applicability of parallel processing not immediately obvious. It is illustrated that substantial reductions in computational time can be achieved for MCMC and other evaluation methods by computing the likelihood using GPU parallel processing. Examples use data from the Global Terrorism Database to model terrorist activity in Colombia from 2000 through 2010, with a likelihood based on the explicit convolution of two negative-binomial processes. Results show decreases in computational time by a factor of over 200. Factors influencing these improvements and guidelines for programming parallel implementations of the likelihood are discussed.
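A minimal sketch of the likelihood structure described above, the explicit convolution of two negative-binomial pmfs, with the elementwise gammaln terms vectorised so the heavy work could be moved to a GPU (for instance by substituting cupy arrays and cupyx.scipy.special.gammaln, assuming cupy is available). This illustrates the strategy, not the authors' implementation:

import numpy as np
from scipy.special import gammaln

def nb_logpmf(k, r, p):
    """log NegBin(k; r, p) with success probability p, counting failures."""
    return (gammaln(k + r) - gammaln(r) - gammaln(k + 1)
            + r * np.log(p) + k * np.log1p(-p))

def conv_nb_pmf(y, r1, p1, r2, p2):
    """P(Y = y) where Y = X1 + X2 and Xi ~ NegBin(ri, pi)."""
    k = np.arange(y + 1)                 # all splits of y, evaluated at once
    return np.exp(nb_logpmf(k, r1, p1) + nb_logpmf(y - k, r2, p2)).sum()

def loglik(data, r1, p1, r2, p2):
    return sum(np.log(conv_nb_pmf(y, r1, p1, r2, p2)) for y in data)

print(loglik([3, 5, 2, 8], r1=2.0, p1=0.4, r2=1.5, p2=0.6))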


Evaluation Review | 2013

Nonresponse bias in randomized controlled experiments in criminology: Putting the Queensland Community Engagement Trial (QCET) under a microscope.

Emma Antrobus; Henk Elffers; Gentry White; Lorraine Mazerolle

Objectives: The goal of this article is to examine whether or not the results of the Queensland Community Engagement Trial (QCET)—a randomized controlled trial that tested the impact of procedural justice policing on citizen attitudes toward police—were affected by different types of nonresponse bias. Method: We use two methods (Cochrane and Elffers methods) to explore nonresponse bias: First, we assess the impact of the low response rate by examining the effects of nonresponse group differences between the experimental and control conditions and pooled variance under different scenarios. Second, we assess the degree to which item response rates are influenced by the control and experimental conditions. Results: Our analysis of the QCET data suggests that our substantive findings are not influenced by the low response rate in the trial. The results are robust even under extreme conditions, and statistical significance of the results would only be compromised in cases where the pooled variance was much larger for the nonresponse group and the difference between experimental and control conditions was greatly diminished. We also find that there were no biases in the item response rates across the experimental and control conditions. Conclusion: RCTs that involve field survey responses—like QCET—are potentially compromised by low response rates and how item response rates might be influenced by the control or experimental conditions. Our results show that the QCET results were not sensitive to the overall low response rate across the experimental and control conditions and the item response rates were not significantly different across the experimental and control groups. Overall, our analysis suggests that the results of QCET are robust and any biases in the survey responses do not significantly influence the main experimental findings.
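As a generic illustration of this style of sensitivity analysis (not the Cochrane or Elffers procedure itself), the sketch below recomputes a treatment-control z statistic under progressively worse assumptions about the unobserved nonresponders; all numbers are illustrative:

import numpy as np

def pooled_z(diff, var_resp, var_nonresp, n, resp_rate):
    """z for a mean difference when nonresponders get their own variance."""
    var = resp_rate * var_resp + (1 - resp_rate) * var_nonresp
    se = np.sqrt(2 * var / n)            # two equal-size arms of size n
    return diff / se

n, resp_rate = 1000, 0.15                # low survey response rate
observed_diff, var_resp = 0.30, 1.0
for var_inflate in (1, 2, 4, 8):         # nonresponder variance scenarios
    for diff_shrink in (1.0, 0.5, 0.25): # attenuated-effect scenarios
        z = pooled_z(observed_diff * diff_shrink, var_resp,
                     var_resp * var_inflate, n, resp_rate)
        print(var_inflate, diff_shrink, round(z, 2))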


Australian and New Zealand Journal of Criminology | 2015

Optimising the length of random breath tests: Results from the Queensland Community Engagement Trial

Lorraine Mazerolle; Lyndel Bates; Sarah Bennett; Gentry White; Jason Ferris; Emma Antrobus

Research suggests that the length and quality of police–citizen encounters affect policing outcomes. The Koper Curve, for example, shows that the optimal length for police presence in hot spots is between 14 and 15 minutes, with diminishing returns observed thereafter. Our study, using data from the Queensland Community Engagement Trial (QCET), examines the impact of encounter length on citizen perceptions of police performance. QCET involved a randomised field trial, where 60 random breath test (RBT) traffic stop operations were randomly allocated to an experimental condition involving a procedurally just encounter or a business-as-usual control condition. Our results show that the optimal length of time for procedurally just encounters during RBT traffic stops is just less than 2 minutes. We show, therefore, that it is important to encourage and facilitate positive police–citizen encounters during RBT at traffic stops, while ensuring that the length of these interactions does not pass a point of diminishing returns.
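One simple way to locate such a point of diminishing returns is to fit a quadratic to perception scores against encounter length and read off the vertex. The sketch below does this on simulated data, not the QCET responses:

import numpy as np

rng = np.random.default_rng(5)
minutes = rng.uniform(0.5, 5.0, 300)                # RBT encounter lengths
true_peak = 1.9                                     # just under 2 minutes
score = (4.0 - 0.4 * (minutes - true_peak) ** 2
         + rng.normal(0, 0.3, minutes.size))        # perception of police

b2, b1, b0 = np.polyfit(minutes, score, 2)          # quadratic fit
optimal = -b1 / (2 * b2)                            # vertex of the parabola
print(round(optimal, 2))                            # close to 2 minutes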


Statistics and Computing | 2016

A pseudo-marginal sequential Monte Carlo algorithm for random effects models in Bayesian sequential design

James McGree; Christopher C. Drovandi; Gentry White; Anthony N. Pettitt

Motivated by the need to sequentially design experiments for the collection of data in batches or blocks, a new pseudo-marginal sequential Monte Carlo algorithm is proposed for random effects models where the likelihood is not analytic and has to be approximated. This new algorithm is an extension of the idealised sequential Monte Carlo algorithm where we propose to unbiasedly approximate the likelihood to yield an efficient exact-approximate algorithm to perform inference and make decisions within Bayesian sequential design. We propose four approaches to unbiasedly approximate the likelihood: standard Monte Carlo integration, randomised quasi-Monte Carlo integration, Laplace importance sampling, and a combination of Laplace importance sampling and randomised quasi-Monte Carlo. These four methods are compared in terms of the estimates of likelihood weights and in the selection of the optimal sequential designs in an important pharmacological study related to the treatment of critically ill patients. As the approaches considered to approximate the likelihood can be computationally expensive, we exploit parallel computational architectures to ensure designs are derived in a timely manner.
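A minimal sketch of the first of the four approaches, a standard Monte Carlo estimate that is unbiased for the intractable random-effects likelihood, averaging the conditional likelihood over prior draws of the random effect. The Poisson model with a normal random effect is illustrative, not the pharmacological model in the paper:

import numpy as np
from scipy.stats import poisson

def likelihood_hat(y_subject, mu, sigma_b, M=1000, rng=None):
    """Unbiased Monte Carlo estimate of p(y | mu, sigma_b) for one subject."""
    if rng is None:
        rng = np.random.default_rng()
    b = rng.normal(0.0, sigma_b, M)                 # random-effect draws
    rates = np.exp(mu + b)                          # subject-specific rates
    y = np.asarray(y_subject)[:, None]              # broadcast over draws
    cond = np.prod(poisson.pmf(y, rates), axis=0)   # conditional likelihoods
    return cond.mean()                              # unbiased average

y = [3, 4, 2, 5]                                    # one subject's batch
print(likelihood_hat(y, mu=1.2, sigma_b=0.5))

The other three approaches replace the plain normal draws with lower-variance alternatives such as randomised quasi-Monte Carlo points or Laplace importance sampling.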

Collaboration


Dive into Gentry White's collaboration.

Top Co-Authors

Emma Antrobus, University of Queensland
Rebecca Wickes, University of Queensland
Wim Bernasco, VU University Amsterdam
David Rohde, University of Queensland