B. D. Ripley
University of Oxford
Publication
Featured research published by B. D. Ripley.
Journal of Applied Probability | 1976
B. D. Ripley
This paper provides a rigorous foundation for the second-order analysis of stationary point processes on general spaces. It illuminates the results of Bartlett on spatial point processes, and covers the point processes of stochastic geometry, including the line and hyperplane processes of Davidson and Krickeberg. The main tool is the decomposition of moment measures pioneered by Krickeberg and Vere-Jones. Finally some practical aspects of the analysis of point processes are discussed.
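As a loose, modern illustration of second-order analysis in practice (not code from the paper), the sketch below estimates the empirical K-function of a point pattern observed in the unit square; the function and variable names are illustrative and no edge correction is applied.

```python
import numpy as np

def k_function(points, radii, area=1.0):
    """Naive empirical K-function for a point pattern (no edge correction).

    points : (n, 2) array of locations observed in a window of the given area
    radii  : distances r at which to evaluate K(r)
    """
    n = len(points)
    lam = n / area                           # estimated intensity
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)              # exclude self-pairs
    # K(r) = (1 / (lambda * n)) * number of ordered pairs closer than r
    return np.array([(d < r).sum() / (lam * n) for r in radii])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.random((200, 2))               # roughly Poisson pattern in the unit square
    r = np.linspace(0.01, 0.2, 10)
    # under complete spatial randomness, K(r) is approximately pi * r^2
    print(np.c_[r, k_function(pts, r), np.pi * r ** 2])
```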
Analyst | 1987
B. D. Ripley; Michael Thompson
Regression techniques are commonly applied to compare two analytical methods at several concentrations and to test the biases of one method relative to another. However, regression is strictly applicable only when one method is without error, for example in comparisons with reference materials. A regression-like technique, maximum-likelihood fitting of a functional relationship (MLFR), is explained and is demonstrated to work well. Under some conditions weighted regression provides a good approximation to MLFR, and so can be used if more convenient.
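A closely related special case of MLFR, in which the ratio of the two methods' error variances is assumed known, is Deming regression, whose maximum-likelihood slope has a closed form. The sketch below is illustrative only and is not the authors' implementation; the simulated data and parameter names are assumptions.

```python
import numpy as np

def deming_fit(x, y, delta=1.0):
    """ML fit of y = a + b*x when both x and y carry measurement error.

    delta is the assumed ratio var(error in y) / var(error in x);
    delta = 1 corresponds to equal error variances in the two methods.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx, syy = x.var(), y.var()
    sxy = np.cov(x, y, bias=True)[0, 1]
    b = (syy - delta * sxx
         + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    a = y.mean() - b * x.mean()
    return a, b

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    truth = rng.uniform(1, 10, 50)                     # true concentrations
    x = truth + rng.normal(0, 0.3, 50)                 # comparison method with error
    y = 0.2 + 1.05 * truth + rng.normal(0, 0.3, 50)    # test method with error
    print(deming_fit(x, y))   # roughly (0.2, 1.05), unlike ordinary regression of y on x
```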
Journal of Ecology | 1978
B. D. Ripley
(1) Spectral analysis is a relatively untried method for the analysis of data from a line of contiguous quadrats. Conventional block-size analyses are shown to be related to square waves; in spectral analysis these square waves are replaced by sine waves. (2) These methods and Mead's test are compared with conventional methods, using artificial and field data. Spectral analysis performed reliably and gave a good indication of the type of departure from a random pattern. Mead's test proved sensitive but hard to interpret, often contradicting other methods. (3) It is argued that standardization should not be used with methods based on variances.
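As a rough illustration of the idea (not the paper's method or data), the sketch below computes a periodogram for a simulated line of contiguous quadrat counts; a peak at frequency k suggests a pattern repeating every n/k quadrats.

```python
import numpy as np

def quadrat_periodogram(counts):
    """Periodogram of counts from a line of contiguous quadrats.

    Block-size (square-wave) analyses are replaced by sine/cosine components:
    power at frequency k corresponds to pattern repeating every n/k quadrats.
    """
    counts = np.asarray(counts, float)
    n = len(counts)
    centred = counts - counts.mean()
    spec = np.abs(np.fft.rfft(centred)) ** 2 / n
    freqs = np.arange(len(spec))             # cycles over the whole transect
    return freqs[1:], spec[1:]               # drop the zero frequency

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    n = 128
    # artificial data: pattern repeating every 16 quadrats plus Poisson noise
    signal = 5 + 3 * np.sin(2 * np.pi * np.arange(n) / 16)
    counts = rng.poisson(signal)
    k, p = quadrat_periodogram(counts)
    print("dominant frequency:", k[np.argmax(p)], "(expected", n // 16, ")")
```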
Journal of Computational and Applied Mathematics | 1990
B. D. Ripley
Much of the informal discussion at the Workshop concerned the merits of different pseudorandom number generators. Here we record some comments based on comparing generators across a wide range of machines.
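As a toy illustration of one kind of empirical check (far short of the comparisons discussed in the paper), the sketch below applies a chi-square uniformity test to uniform variates drawn from several of NumPy's bit generators; the choice of test, bin count and sample size are assumptions made for illustration.

```python
import numpy as np
from scipy import stats

def chi_square_uniformity(u, bins=100):
    """Chi-square goodness-of-fit p-value for supposed uniforms u on [0, 1)."""
    observed, _ = np.histogram(u, bins=bins, range=(0.0, 1.0))
    expected = np.full(bins, len(u) / bins)
    return stats.chisquare(observed, expected).pvalue

if __name__ == "__main__":
    # One crude empirical check applied to several generators; serious
    # comparisons use whole batteries of tests (and many machines).
    for bitgen in (np.random.MT19937, np.random.PCG64, np.random.Philox):
        rng = np.random.Generator(bitgen(12345))
        u = rng.random(100_000)
        print(f"{bitgen.__name__:10s} p = {chi_square_uniformity(u):.3f}")
```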
NeuroImage | 2001
Stephen M. Smith; Peter R. Bannister; Christian F. Beckmann; Michael Brady; Stuart Clare; David Flitney; Peter C. Hansen; Mark Jenkinson; Didier G. Leibovici; B. D. Ripley; Mark W. Woolrich; Yongyue Zhang
FSL: New Tools for Functional and Structural Brain Image Analysis. FMRIB, Oxford University, UK, and the Medical Vision Laboratory, Department of Engineering Science, Oxford University, UK.
Journal of Applied Statistics | 1989
Rafael Molina; B. D. Ripley
Archive | 1995
B. D. Ripley
Optical astronomers now normally collect digital images by means of charge-coupled device detectors, which are blurred by atmospheric motion and distorted by physical noise in the detection process. We examine Bayesian procedures to clean such images using explicit models from spatial statistics for the underlying structure, and compare these methods with those based on maximum entropy. This is an updated version of Molina and Ripley (1989) containing brief details of later work. Sections 1–5, 7 and 8 follow that paper and describe the deconvolution of galaxies. Further examples have been published in Molina et al. (1992a) for an astronomical audience. Work on the deconvolution of planetary images from Molina et al. (1992b, c) is reported in Section 6 with examples included in Section 7.
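As a simplified analogue of the Bayesian restoration idea (not the galaxy models of the paper), the sketch below computes the posterior mode under a Gaussian likelihood and a Gaussian smoothness prior, which reduces to a Wiener-type filter in the Fourier domain; the point-spread function, noise level and smoothing weight are illustrative assumptions.

```python
import numpy as np

def map_deconvolve(blurred, psf, alpha=0.01):
    """MAP restoration of a blurred, noisy image under a Gaussian likelihood
    and a Gaussian (smoothness) prior penalising the discrete Laplacian.

    The posterior mode has a closed form in the Fourier domain; alpha controls
    how strongly smoothness is enforced relative to fit to the data.
    """
    shape = blurred.shape
    H = np.fft.fft2(psf, s=shape)            # transfer function of the blur
    lap = np.zeros(shape)
    lap[0, 0] = 4.0
    lap[0, 1] = lap[1, 0] = lap[0, -1] = lap[-1, 0] = -1.0
    L = np.fft.fft2(lap)                     # transfer function of the Laplacian
    Y = np.fft.fft2(blurred)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + alpha * np.abs(L) ** 2)
    return np.real(np.fft.ifft2(X))

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    true = np.zeros((64, 64)); true[24:40, 24:40] = 1.0   # toy bright object
    psf = np.zeros((64, 64)); psf[:3, :3] = 1 / 9.0       # small uniform blur
    blurred = np.real(np.fft.ifft2(np.fft.fft2(true) * np.fft.fft2(psf)))
    blurred += rng.normal(0, 0.01, blurred.shape)         # detection noise
    restored = map_deconvolve(blurred, psf, alpha=0.05)
    print("mean absolute restoration error:", np.abs(restored - true).mean())
```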
Philosophical Transactions of the Royal Society A | 1990
B. D. Ripley; A. L. Sutherland
Choosing the architecture of a neural network is one of the most important problems in making neural networks practically useful, but accounts of applications usually sweep these details under the carpet. How many hidden units are needed? Should weight decay be used, and if so how much? What type of output units should be chosen? And so on.
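One standard, if crude, answer to these questions is to treat the number of hidden units and the amount of weight decay as tuning parameters chosen by cross-validation. The sketch below does this with scikit-learn rather than the authors' software; the data set, parameter grid and settings are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

# Compare candidate architectures by 5-fold cross-validation on a toy problem.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

search = GridSearchCV(
    MLPClassifier(max_iter=2000, random_state=0),
    param_grid={
        "hidden_layer_sizes": [(2,), (4,), (8,), (16,)],  # number of hidden units
        "alpha": [1e-4, 1e-3, 1e-2, 1e-1],                # L2 penalty ("weight decay")
    },
    cv=5,
)
search.fit(X, y)
print("chosen architecture:", search.best_params_)
print("cross-validated accuracy:", round(search.best_score_, 3))
```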
Archive | 2002
W. N. Venables; B. D. Ripley
Much recent work in statistical image analysis has been concerned with ‘cleaning’ images by a Bayesian statistical analysis incorporating a prior model, which reflects the spatial structure of the image. In almost all cases this has involved a description of the image at pixel level. In this paper we take the process further, and develop a spatial stochastic process of objects present in the image. The general theory is given and applied to images of spiral galaxies, with the aims of producing better schematic reconstructions and of automatically classifying galaxies.
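As a toy illustration of an object-level prior (deliberately far simpler than the spiral-galaxy model described here), the sketch below simulates a marked Poisson process of elliptical objects in a window; the intensity and mark distributions are illustrative assumptions.

```python
import numpy as np

def simulate_object_prior(intensity, window=(1.0, 1.0), rng=None):
    """Draw one realisation of a simple object-level prior: a Poisson number
    of elliptical "objects", each with a random centre, size and orientation.

    A toy marked point process, illustrating modelling an image as a
    configuration of objects rather than pixel by pixel.
    """
    rng = rng or np.random.default_rng()
    area = window[0] * window[1]
    n = rng.poisson(intensity * area)                  # number of objects
    return [{
        "centre": rng.random(2) * np.asarray(window),
        "axes": rng.uniform(0.02, 0.08, size=2),       # semi-axes of the ellipse
        "angle": rng.uniform(0, np.pi),                # orientation
    } for _ in range(n)]

if __name__ == "__main__":
    objects = simulate_object_prior(intensity=20, rng=np.random.default_rng(4))
    print(len(objects), "objects; first:", objects[0] if objects else None)
```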
Journal of the American Statistical Association | 2007
Kristin N. Javaras; B. D. Ripley
Models with mixed effects contain both fixed and random effects. Fixed effects are what we have been considering up to now; the only source of randomness in our models arises from regarding the cases as independent random samples. Thus in regression we have an additive measurement error that we assume is independent between cases, and in a GLM we observe independent binomial, Poisson, gamma ... random variates whose mean is a deterministic function of the explanatory variables.
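A minimal sketch of the fixed-plus-random-effects idea, using simulated grouped data with a fixed slope and a random intercept per subject; the variable names and the use of statsmodels are assumptions made for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated measurements grouped by subject: a fixed effect of x plus a
# subject-level random intercept plus independent measurement error.
rng = np.random.default_rng(5)
subjects = np.repeat(np.arange(20), 10)
subject_effect = rng.normal(0, 2.0, 20)[subjects]      # random effect per subject
x = rng.normal(size=subjects.size)
y = 1.0 + 0.5 * x + subject_effect + rng.normal(0, 1.0, subjects.size)
data = pd.DataFrame({"y": y, "x": x, "subject": subjects})

model = smf.mixedlm("y ~ x", data, groups=data["subject"])   # random intercepts
fit = model.fit()
print(fit.summary())
```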
Collaboration
Dive into B. D. Ripley's collaborations.
Commonwealth Scientific and Industrial Research Organisation