Publication


Featured research published by David Warne.


IEEE International Conference on High Performance Computing, Data, and Analytics | 2011

CAD/CAM-assisted breast reconstruction

Ferry P.W. Melchels; Paul Severin Wiggenhauser; David Warne; Mark D. Barry; Fook Rhu Ong; Woon Shin Chong; Dietmar W. Hutmacher; Jan Thorsten Schantz

The application of computer-aided design and manufacturing (CAD/CAM) techniques in the clinic is growing slowly but steadily. The ability to build patient-specific models based on medical imaging data offers major potential. In this work, we report on the feasibility of employing laser scanning with CAD/CAM techniques to aid in breast reconstruction. A patient was imaged with laser scanning, an economical and facile method for creating an accurate digital representation of the breasts and surrounding tissues. The obtained model was used to fabricate a customized mould that was employed as an intra-operative aid for the surgeon performing autologous tissue reconstruction of a breast removed due to cancer. Furthermore, a solid breast model was derived from the imaged data and digitally processed for the fabrication of customized scaffolds for breast tissue engineering. To this end, a novel generic algorithm for creating porosity within a solid model was developed, using a finite element model as an intermediate.


Biophysical Journal | 2017

Optimal Quantification of Contact Inhibition in Cell Populations

David Warne; Ruth E. Baker; Matthew J. Simpson

Contact inhibition refers to a reduction in the rate of cell migration and/or cell proliferation in regions of high cell density. Under normal conditions, contact inhibition is associated with the proper functioning of tissues, whereas abnormal regulation of contact inhibition is associated with pathological conditions, such as tumor spreading. Unfortunately, standard mathematical modeling practices mask the importance of parameters that control contact inhibition through scaling arguments. Furthermore, standard experimental protocols are insufficient to quantify the effects of contact inhibition because they focus only on data describing early-time, low-density dynamics. Here, we use the logistic growth equation as a caricature model of contact inhibition to make recommendations as to how best to mitigate these issues. Taking a Bayesian approach, we quantify the trade-off between different features of experimental design and estimates of parameter uncertainty so that we can reformulate a standard cell proliferation assay to provide estimates of both the low-density intrinsic growth rate, λ, and the carrying capacity density, K, which is a measure of contact inhibition.
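The approach can be made concrete with a short sketch. The following is a minimal, hypothetical example (not the authors' code; the data values, noise level, known initial density, and grid resolution are all assumptions) of grid-based Bayesian estimation of λ and K for the logistic growth model dC/dt = λC(1 − C/K):

```python
import numpy as np

def logistic(t, c0, lam, K):
    # Closed-form solution of dC/dt = lam * C * (1 - C/K)
    return K * c0 * np.exp(lam * t) / (K + c0 * (np.exp(lam * t) - 1.0))

# Hypothetical proliferation-assay data: density observed at several times
t_obs = np.array([0.0, 12.0, 24.0, 36.0, 48.0])
c_obs = np.array([0.05, 0.12, 0.28, 0.46, 0.60])
sigma = 0.02  # assumed Gaussian measurement noise

# Uniform priors over a grid of (lambda, K) values
lams = np.linspace(0.01, 0.2, 200)
Ks = np.linspace(0.3, 1.0, 200)
L, Kg = np.meshgrid(lams, Ks, indexing="ij")

# Gaussian log-likelihood evaluated at every grid point
pred = logistic(t_obs[None, None, :], c_obs[0], L[..., None], Kg[..., None])
loglik = -0.5 * np.sum(((c_obs - pred) / sigma) ** 2, axis=-1)

# Normalized posterior on the grid; the marginals quantify the
# uncertainty in each parameter separately
post = np.exp(loglik - loglik.max())
post /= post.sum()
i, j = np.unravel_index(post.argmax(), post.shape)
print("posterior mode: lambda =", lams[i], "K =", Ks[j])
```

Because K only influences the dynamics at higher densities, late-time observations are what constrain it; in a sketch like this, the experimental-design trade-off the paper quantifies shows up as a posterior that is much wider in K than in λ when only early-time data are used.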


Environmental Modelling and Software | 2013

Image-based flow visualisation (IBFV) to enhance interpretation of complex flow patterns within a shallow tidal barrier estuary

David Warne; Genevieve R. Larsen; Joseph A. Young; Malcolm Cox

We applied a texture-based flow visualisation technique to a numerical hydrodynamic model of the Pumicestone Passage in southeast Queensland, Australia. The quality of the visualisations produced by our flow visualisation tool is compared with animations generated using more traditional drogue release plot and velocity contour and vector techniques. The texture-based method is found to be far more effective in visualising advective flow within the model domain. In some instances, it also makes it easier for the researcher to identify specific hydrodynamic features within the complex flow regimes of this shallow tidal barrier estuary compared with the direct and geometric-based methods.
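For readers unfamiliar with IBFV, the following minimal sketch illustrates the texture-advection idea behind it (a generic illustration, not the authors' tool; the vortex velocity field and blending weight are assumptions): a noise texture is repeatedly warped backward along the velocity field and blended with fresh noise, so streak patterns emerge along the flow.

```python
import numpy as np
from scipy.ndimage import map_coordinates

rng = np.random.default_rng(0)
n = 256
tex = rng.random((n, n))  # initial noise texture

# Hypothetical steady velocity field (a single vortex); a hydrodynamic
# model would supply these arrays instead
y, x = np.mgrid[0:n, 0:n]
cx, cy = n / 2, n / 2
u = -(y - cy) * 0.02
v = (x - cx) * 0.02

alpha = 0.1  # noise-injection weight per frame
for _ in range(100):
    # Backward advection: sample the texture at upstream locations
    warped = map_coordinates(tex, [y - v, x - u], order=1, mode="wrap")
    # Blend in fresh noise so the pattern keeps regenerating
    tex = (1 - alpha) * warped + alpha * rng.random((n, n))

# 'tex' now shows streaks aligned with the flow, e.g.
# import matplotlib.pyplot as plt; plt.imshow(tex, cmap="gray"); plt.show()
```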


Journal of Computing in Civil Engineering | 2018

Utility of Genetic Algorithms for Solving Large-Scale Construction Time-Cost Trade-Off Problems

Duzgun Agdas; David Warne; Jorge Osio-Norgaard; Forrest J. Masters

The Time/Cost Trade-off (TCT) problem has long been a popular optimization question for construction engineering and management researchers. The problem manifests itself as the optimization of the total costs of construction projects, which consist of indirect project costs and individual activity costs. The trade-off occurs because project duration, and as a result indirect project costs, decrease with reduced individual activity durations. This reduction in individual activity duration is achieved by increasing resource allocation to individual activities, which increases their costs to completion. Historically, metaheuristic solutions have been applied to small-scale problems due to the computational complexity and resource requirements of larger networks. In this paper, we demonstrate that the metaheuristic approach is highly effective for solving large-scale construction TCT problems. A custom Genetic Algorithm (GA) is developed and used to solve large benchmark networks of up to 630 variables with high levels of accuracy (<3% deviation) consistently, using the computational power of a personal computer in under ten minutes. The same method can also be used to solve larger networks of up to 6,300 variables with reasonable accuracy (∼7% deviation) at the expense of longer processing times. A number of simple, yet effective, techniques that improve GA performance for TCT problems are demonstrated; the most effective of these is a novel problem encoding, based on weighted graphs, that enables the critical path problem to be partially solved for all candidate solutions a priori, thus significantly accelerating fitness evaluation. Other improvements include parallel fitness evaluations, optimal algorithm parameters, and the addition of a stagnation criterion. We also present guidelines for optimal algorithm parameter selection through a comprehensive parameter sweep and a computational demand profile analysis. Moreover, the methods proposed in this article are based on open-source development projects that enable scalable solutions without significant development effort. This information will be beneficial for other researchers in improving the computational efficiency of their solutions to TCT problems.
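A minimal GA sketch for a toy TCT instance is given below (not the paper's custom weighted-graph encoding or its benchmark networks; the activity network, modes, costs, and GA parameters are all hypothetical). Each chromosome assigns an execution mode to each activity, and fitness is direct cost plus indirect cost times the critical-path duration:

```python
import random
random.seed(1)

# Toy activity network (a DAG). Each activity has several (duration, cost)
# execution modes; shorter durations cost more. Values are hypothetical.
modes = {
    "A": [(4, 100), (3, 150), (2, 240)],
    "B": [(6, 200), (4, 320)],
    "C": [(5, 150), (3, 260)],
    "D": [(3, 120), (2, 200)],
}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
order = ["A", "B", "C", "D"]   # a topological order of the DAG
INDIRECT = 60                  # indirect cost per unit of project duration

def total_cost(chrom):
    # chrom maps activity -> mode index; project duration via critical path
    finish = {}
    for a in order:
        dur = modes[a][chrom[a]][0]
        finish[a] = dur + max((finish[p] for p in preds[a]), default=0)
    duration = max(finish.values())
    direct = sum(modes[a][chrom[a]][1] for a in order)
    return direct + INDIRECT * duration

def random_chrom():
    return {a: random.randrange(len(modes[a])) for a in order}

pop = [random_chrom() for _ in range(40)]
for gen in range(100):
    pop.sort(key=total_cost)
    survivors = pop[:20]                     # truncation selection
    children = []
    while len(children) < 20:
        p1, p2 = random.sample(survivors, 2)
        child = {a: random.choice((p1[a], p2[a])) for a in order}  # uniform crossover
        if random.random() < 0.2:            # mutation: re-draw one gene
            a = random.choice(order)
            child[a] = random.randrange(len(modes[a]))
        children.append(child)
    pop = survivors + children

best = min(pop, key=total_cost)
print("best modes:", best, "total cost:", total_cost(best))
```

The paper's encoding avoids recomputing the full critical path for every candidate; in this sketch, by contrast, `total_cost` solves it from scratch each time, which is exactly the bottleneck that motivates their a priori partial solution.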


IEEE International Conference on High Performance Computing, Data, and Analytics | 2014

Comparison of high level FPGA hardware design for solving tri-diagonal linear systems

David Warne; Neil A. Kelson; Ross F. Hayward

Reconfigurable computing devices can increase the performance of compute-intensive algorithms by implementing application-specific co-processor architectures. The power cost for this performance gain is often an order of magnitude less than that of modern CPUs and GPUs. Exploiting the potential of reconfigurable devices such as Field-Programmable Gate Arrays (FPGAs) is typically a complex and tedious hardware engineering task. Recently, the major FPGA vendors (Altera and Xilinx) have released their own high-level design tools, which have great potential for rapid development of FPGA-based custom accelerators. In this paper, we evaluate Altera's OpenCL Software Development Kit and Xilinx's Vivado High-Level Synthesis tool. These tools are compared for their performance, logic utilisation, and ease of development for the test case of a tri-diagonal linear system solver.
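For reference, the kernel being accelerated is a tri-diagonal solve, classically performed with the Thomas algorithm. A plain Python sketch of that algorithm follows (the paper's implementations target OpenCL and HLS hardware; this only shows the underlying numerical method):

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tri-diagonal system via the Thomas algorithm.

    a: sub-diagonal (length n, a[0] unused)
    b: main diagonal (length n)
    c: super-diagonal (length n, c[-1] unused)
    d: right-hand side (length n)
    """
    n = len(d)
    cp = np.empty(n); dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # Forward sweep: eliminate the sub-diagonal
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    # Back substitution
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Quick check against a dense solver on a diagonally dominant system
n = 8
a = np.random.rand(n); b = 2 + np.random.rand(n); c = np.random.rand(n)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
d = np.random.rand(n)
assert np.allclose(thomas(a, b, c, d), np.linalg.solve(A, d))
```

The loop-carried dependence in the forward sweep resists straightforward parallelisation, which is part of what makes tri-diagonal solvers an interesting test case for hardware design tools.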


IEEE International Conference on High Performance Computing, Data, and Analytics | 2018

Multilevel rejection sampling for approximate Bayesian computation

David Warne; Ruth E. Baker; Matthew J. Simpson

Likelihood-free methods, such as approximate Bayesian computation, are powerful tools for practical inference problems with intractable likelihood functions. Markov chain Monte Carlo and sequential Monte Carlo variants of approximate Bayesian computation can be effective techniques for sampling posterior distributions in an approximate Bayesian computation setting. However, without careful consideration of convergence criteria and selection of proposal kernels, such methods can lead to very biased inference or computationally inefficient sampling. In contrast, rejection sampling for approximate Bayesian computation, despite being computationally intensive, results in independent, identically distributed samples from the approximated posterior. An alternative method is proposed for the acceleration of likelihood-free Bayesian inference that applies multilevel Monte Carlo variance reduction techniques directly to rejection sampling. The resulting method retains the accuracy advantages of rejection sampling while significantly improving the computational efficiency.
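As background, the baseline the paper builds on is ABC rejection sampling. A minimal sketch on a toy problem follows (the exponential model, prior, summary statistic, and threshold are all hypothetical, and this shows only plain rejection, not the multilevel variant proposed in the paper):

```python
import numpy as np
rng = np.random.default_rng(42)

# Toy inference problem: estimate the rate of an exponential model.
# The "observed" data are simulated with a known true rate.
true_rate = 2.0
y_obs = rng.exponential(1 / true_rate, size=50)

def simulate(rate, n=50):
    return rng.exponential(1 / rate, size=n)

def distance(y_sim, y_obs):
    # Discrepancy between summary statistics (here, sample means)
    return abs(y_sim.mean() - y_obs.mean())

# ABC rejection: draw from the prior, keep draws whose simulated data
# fall within epsilon of the observations. Accepted samples are i.i.d.
# draws from the approximate posterior, which is the accuracy property
# the multilevel method seeks to retain.
epsilon = 0.05
accepted = []
while len(accepted) < 1000:
    theta = rng.uniform(0.1, 10.0)          # prior draw
    if distance(simulate(theta), y_obs) < epsilon:
        accepted.append(theta)

print("approximate posterior mean:", np.mean(accepted))
```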


bioRxiv | 2018

Using experimental data and information criteria to guide model selection for reaction–diffusion problems in mathematical biology

David Warne; Ruth E. Baker; Matthew J. Simpson

Reaction–diffusion models describing the movement, reproduction and death of individuals within a population are key mathematical modelling tools with widespread applications in mathematical biology. A diverse range of such continuum models have been applied in various biological contexts by choosing different flux and source terms in the reaction–diffusion framework. For example, to describe collective spreading of cell populations, the flux term may be chosen to reflect various movement mechanisms, such as random motion (diffusion), adhesion, haptotaxis, chemokinesis and chemotaxis. The choice of flux terms in specific applications, such as wound healing, is usually made heuristically, and rarely is it tested quantitatively against detailed cell density data. More generally, in mathematical biology, the questions of model validation and model selection have not received the same attention as the questions of model development and model analysis. Many studies do not consider model validation or model selection, and those that do often base the selection of the model on residual error criteria after model calibration is performed using nonlinear regression techniques. In this work, we present a model selection case study, in the context of cell invasion, with a very detailed experimental data set. Using Bayesian analysis and information criteria, we demonstrate that model selection and model validation should account for both residual errors and model complexity. These considerations are often overlooked in the mathematical biology literature. The results we present here provide a clear methodology that can be used to guide model selection across a range of applications. Furthermore, the case study we present provides a clear example where neglecting the role of model complexity can give rise to misleading outcomes.
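The role of the complexity penalty can be illustrated with a small sketch (hypothetical data and candidate models, using AIC as one example of an information criterion rather than the specific analysis in the paper):

```python
import numpy as np
rng = np.random.default_rng(3)

# Hypothetical calibration data from a quadratic trend plus noise
x = np.linspace(0, 1, 40)
y = 1.0 + 0.5 * x + 2.0 * x**2 + rng.normal(0, 0.1, x.size)

def aic(y, y_fit, k):
    # Gaussian-likelihood AIC: n*log(RSS/n) + 2k, with k fitted coefficients
    n = y.size
    rss = np.sum((y - y_fit) ** 2)
    return n * np.log(rss / n) + 2 * k

# Candidate models of increasing complexity: polynomial degrees 1..5
for deg in range(1, 6):
    coef = np.polyfit(x, y, deg)
    y_fit = np.polyval(coef, x)
    print(f"degree {deg}: AIC = {aic(y, y_fit, deg + 1):.1f}")

# Residual error alone always favours the degree-5 model; AIC penalizes
# the extra parameters and typically selects the true degree-2 model here,
# illustrating why selection should weigh complexity as well as fit.
```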


IEEE International Conference on High Performance Computing, Data, and Analytics | 2016

A heterogeneous computing approach to maximum likelihood parameter estimation for the Heston model of stochastic volatility

Stan Hurn; Kenneth A. Lindsay; David Warne

Stochastic volatility models are of fundamental importance to the pricing of derivatives. One of the most commonly used models of stochastic volatility is the Heston model, in which the price and volatility of an asset evolve as a pair of coupled stochastic differential equations. The computation of asset prices and volatilities involves the simulation of many sample trajectories with conditioning. The problem is treated using the method of particle filtering. While the simulation of a shower of particles is computationally expensive, each particle behaves independently, making such simulations ideal for massively parallel heterogeneous computing platforms. In this paper, we present our portable OpenCL implementation of the Heston model and discuss its performance and efficiency characteristics on a range of architectures including Intel CPUs, NVIDIA GPUs, and Intel Many Integrated Core (MIC) accelerators.
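A minimal sketch of the simulation kernel follows (a plain NumPy Euler–Maruyama discretisation with full truncation; parameter values are hypothetical and this is not the paper's OpenCL code). Each path evolves independently, which is the parallelism the paper exploits:

```python
import numpy as np
rng = np.random.default_rng(7)

# Heston model: dS = mu*S dt + sqrt(V)*S dW1,
#               dV = kappa*(theta - V) dt + xi*sqrt(V) dW2,
# with corr(dW1, dW2) = rho. Parameter values below are hypothetical.
mu, kappa, theta, xi, rho = 0.05, 2.0, 0.04, 0.3, -0.7
S0, V0, T, n_steps, n_paths = 100.0, 0.04, 1.0, 250, 100_000
dt = T / n_steps

S = np.full(n_paths, S0)
V = np.full(n_paths, V0)
for _ in range(n_steps):
    z1 = rng.standard_normal(n_paths)
    # Correlated Brownian increments for price and variance
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)
    Vp = np.maximum(V, 0.0)  # full truncation keeps the variance non-negative
    S = S * np.exp((mu - 0.5 * Vp) * dt + np.sqrt(Vp * dt) * z1)
    V = V + kappa * (theta - Vp) * dt + xi * np.sqrt(Vp * dt) * z2

print("mean terminal price:", S.mean())
```

Since every path touches only its own state, this loop maps directly onto one work-item per particle in an OpenCL kernel, which is why such simulations suit CPUs, GPUs, and MIC devices alike.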


bioRxiv | 2016

Accelerating computational Bayesian inference for stochastic biochemical reaction network models using multilevel Monte Carlo sampling

David Warne; Ruth E. Baker; Matthew J. Simpson

Investigating the behavior of stochastic models of biochemical reaction networks generally relies upon numerical stochastic simulation methods to generate many realizations of the model. For many practical applications, such numerical simulation can be computationally expensive. The statistical inference of reaction rate parameters based on observed data is, however, a significantly greater computational challenge, often relying upon likelihood-free methods such as approximate Bayesian computation that require the generation of millions of individual stochastic realizations. In this study, we investigate a new approach to computational inference, based on multilevel Monte Carlo sampling: we approximate the posterior cumulative distribution function through a combination of model samples taken over a range of acceptance thresholds. We demonstrate this approach using a variety of discrete-state, continuous-time Markov models of biochemical reaction networks. Results show that a computational gain over standard rejection schemes of up to an order of magnitude is achievable without significant loss in estimator accuracy.

Author Summary: We develop a new method to infer the reaction rate parameters for stochastic models of biochemical reaction networks. Standard computational approaches, based on numerical simulations, are often used to estimate parameters. These computational approaches, however, are extremely expensive, potentially requiring millions of simulations. To alleviate this issue, we apply a different method of sampling, allowing us to find an optimal trade-off between performance and accuracy. Our approach is approximately one order of magnitude faster than standard methods, without significant loss in accuracy.
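The telescoping idea behind combining samples across acceptance thresholds can be sketched as follows (a hypothetical toy problem; unlike the paper's method, the levels here use independent samples rather than coupled ones, so this illustrates the estimator's structure rather than its full variance reduction):

```python
import numpy as np
rng = np.random.default_rng(11)

# Toy setup: infer a Poisson rate from observed counts using ABC with
# a sequence of decreasing acceptance thresholds (values hypothetical).
y_obs = rng.poisson(5.0, size=20)
s_obs = y_obs.mean()

def abc_sample(eps, n_accept):
    out = []
    while len(out) < n_accept:
        theta = rng.uniform(0.0, 15.0)                 # prior draw
        s_sim = rng.poisson(theta, size=20).mean()
        if abs(s_sim - s_obs) < eps:
            out.append(theta)
    return np.array(out)

def ecdf(samples, grid):
    # Empirical CDF of the samples evaluated on a grid
    return (samples[:, None] <= grid[None, :]).mean(axis=0)

grid = np.linspace(0.0, 15.0, 100)
eps_levels = [2.0, 1.0, 0.5]   # coarse -> fine thresholds
n_levels = [2000, 500, 200]    # fewer samples at expensive fine levels

# Telescoping sum: F_L = F_0 + sum_l (F_l - F_{l-1}); most samples are
# spent on the cheap coarse level, corrections use far fewer samples.
F = ecdf(abc_sample(eps_levels[0], n_levels[0]), grid)
for l in range(1, len(eps_levels)):
    fine = ecdf(abc_sample(eps_levels[l], n_levels[l]), grid)
    coarse = ecdf(abc_sample(eps_levels[l - 1], n_levels[l]), grid)
    F += fine - coarse

F = np.clip(F, 0.0, 1.0)  # corrections can push estimates slightly outside [0, 1]
print("approximate posterior median:", grid[np.argmax(F >= 0.5)])
```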


IEEE International Conference on High Performance Computing, Data, and Analytics | 2014

Pulse-coupled neural network performance for real-time identification of vegetation during forced landing

David Warne; Ross F. Hayward; Neil A. Kelson; Jasmine Banks; Luis Mejias

Collaboration


Dive into David Warne's collaborations.

Top Co-Authors

Neil A. Kelson, Queensland University of Technology
Matthew J. Simpson, Queensland University of Technology
Ross F. Hayward, Queensland University of Technology
Joseph A. Young, Queensland University of Technology
Andrew Fielding, Queensland University of Technology
Tanya Kairn, Royal Brisbane and Women's Hospital
Duzgun Agdas, Queensland University of Technology
J. Kenny, Australian Radiation Protection and Nuclear Safety Agency
Jamie Trapp, Queensland University of Technology