Serryn Eagleson
University of Melbourne
Publications
Featured research published by Serryn Eagleson.
BMC Medical Informatics and Decision Making | 2008
Rochelle E Watkins; Serryn Eagleson; Bert Veenendaal; Graeme Wright; Aileen J. Plant
Background: The automated monitoring of routinely collected disease surveillance data has the potential to ensure that important changes in disease incidence are promptly recognised. However, few studies have established whether the signals produced by automated monitoring methods correspond with events considered by epidemiologists to be of public health importance. This study investigates the correspondence between retrospective epidemiological evaluation of notifications of Ross River virus (RRv) disease in Western Australia and the signals produced by two cumulative sum (cusum)-based automated monitoring methods.
Methods: RRv disease case notification data between 1991 and 2004 were assessed retrospectively by two experienced epidemiologists, and the timing of identified outbreaks was compared with signals generated from two different types of cusum-based automated monitoring algorithms: the three Early Aberration Reporting System (EARS) cusum algorithms (C1, C2 and C3), and a negative binomial cusum.
Results: We found the negative binomial cusum to have a significantly greater area under the receiver operator characteristic curve when compared with the EARS algorithms, suggesting that the negative binomial cusum has a greater level of agreement with epidemiological opinion than the EARS algorithms with respect to the existence of outbreaks of RRv disease, particularly at low false alarm rates. However, the performance of individual EARS and negative binomial cusum algorithms was not significantly different when timeliness was also incorporated into the area under the curve analyses.
Conclusion: Our retrospective analysis of historical data suggests that, compared with the EARS algorithms, the negative binomial cusum provides greater sensitivity for the detection of outbreaks of RRv disease at low false alarm levels, and decreased timeliness early in the outbreak period. Prospective studies are required to investigate the potential usefulness of these algorithms in practice.
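As a rough illustration of the simpler of the two algorithm families compared above, the sketch below implements the EARS C1 statistic (standardised deviation of today's count from the mean of the preceding seven days, alarming above a threshold of 3). It is a minimal stand-alone version written from the published description of the method, not the code used in the study; the injected outbreak in the example is purely illustrative.

```python
# Minimal EARS C1 sketch: 7-day baseline immediately preceding the
# current day, alarm when the standardised residual exceeds 3.
import numpy as np

def ears_c1(counts, baseline=7, threshold=3.0):
    """Return a boolean alarm array for a 1-D array of daily counts."""
    counts = np.asarray(counts, dtype=float)
    alarms = np.zeros(len(counts), dtype=bool)
    for t in range(baseline, len(counts)):
        window = counts[t - baseline:t]          # the 7 days before day t
        mu, sigma = window.mean(), window.std(ddof=1)
        if sigma == 0:
            sigma = 1.0                          # guard against a flat baseline
        alarms[t] = (counts[t] - mu) / sigma > threshold
    return alarms

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    series = rng.poisson(2, 60)
    series[50:55] += 10                          # illustrative injected outbreak
    print(np.where(ears_c1(series))[0])          # days on which C1 alarms
```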
BMC Medical Informatics and Decision Making | 2009
Rochelle E Watkins; Serryn Eagleson; Bert Veenendaal; Graeme Wright; Aileen J. Plant
BackgroundRoutine surveillance of disease notification data can enable the early detection of localised disease outbreaks. Although hidden Markov models (HMMs) have been recognised as an appropriate method to model disease surveillance data, they have been rarely applied in public health practice. We aimed to develop and evaluate a simple flexible HMM for disease surveillance which is suitable for use with sparse small area count data and requires little baseline data.MethodsA Bayesian HMM was designed to monitor routinely collected notifiable disease data that are aggregated by residential postcode. Semi-synthetic data were used to evaluate the algorithm and compare outbreak detection performance with the established Early Aberration Reporting System (EARS) algorithms and a negative binomial cusum.ResultsAlgorithm performance varied according to the desired false alarm rate for surveillance. At false alarm rates around 0.05, the cusum-based algorithms provided the best overall outbreak detection performance, having similar sensitivity to the HMMs and a shorter average time to detection. At false alarm rates around 0.01, the HMM algorithms provided the best overall outbreak detection performance, having higher sensitivity than the cusum-based Methods and a generally shorter time to detection for larger outbreaks. Overall, the 14-day HMM had a significantly greater area under the receiver operator characteristic curve than the EARS C3 and 7-day negative binomial cusum algorithms.ConclusionOur findings suggest that the HMM provides an effective method for the surveillance of sparse small area notifiable disease data at low false alarm rates. Further investigations are required to evaluation algorithm performance across other diseases and surveillance contexts.
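For intuition about the HMM approach, the sketch below runs a forward filter for a two-state (endemic vs outbreak) model with Poisson emissions on a daily count series. It is only a conceptual stand-in for the paper's Bayesian, postcode-level implementation: the state rates, transition probabilities and prior are assumed values, not fitted ones.

```python
# Two-state HMM forward filter: P(outbreak state | counts up to day t).
import numpy as np
from scipy.stats import poisson

def hmm_outbreak_prob(counts, mu=(1.0, 6.0),
                      trans=((0.97, 0.03), (0.20, 0.80))):
    """mu: Poisson rates for (endemic, outbreak); trans: transition matrix."""
    A = np.asarray(trans)
    belief = np.array([0.99, 0.01])              # prior: almost surely endemic
    probs = []
    for y in np.asarray(counts):
        belief = belief @ A                      # predict next day's state
        belief = belief * poisson.pmf(y, np.asarray(mu))   # weight by emission
        belief /= belief.sum()                   # normalise posterior
        probs.append(belief[1])
    return np.array(probs)

# A day could be flagged when the filtered outbreak probability exceeds,
# say, 0.5; raising that cutoff trades sensitivity for fewer false alarms.
```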
BMC Public Health | 2006
Rochelle E Watkins; Serryn Eagleson; Robert Hall; Lynne Dailey; Aileen J. Plant
Background: An increasing number of methods are being developed for the early detection of infectious disease outbreaks, which could be naturally occurring or the result of bioterrorism; however, no standardised framework for examining the usefulness of various outbreak detection methods exists. To promote comparability between studies, it is essential that standardised methods are developed for the evaluation of outbreak detection methods.
Methods: This analysis aims to review approaches used to evaluate outbreak detection methods and provide a conceptual framework upon which recommendations for standardised evaluation methods can be based. We reviewed the recently published literature for reports which evaluated methods for the detection of infectious disease outbreaks in public health surveillance data. Evaluation methods identified in the recent literature were categorised according to the presence of common features to provide a conceptual basis within which to understand current approaches to evaluation.
Results: There was considerable variation in the approaches used for the evaluation of methods for the detection of outbreaks in public health surveillance data, and there appeared to be no single approach of choice. Four main approaches were used to evaluate performance, and these were labelled the Descriptive, Derived, Epidemiological and Simulation approaches. Based on the approaches identified, we propose a basic framework for evaluation and recommend the use of multiple approaches to enable a comprehensive and contextualised description of outbreak detection performance.
Conclusion: The varied nature of performance evaluation demonstrated in this review supports the need for further development of evaluation methods to improve comparability between studies. Our findings indicate that no single approach can fulfil all evaluation requirements. We propose that the cornerstone approaches to evaluation identified provide key contributions to support internal and external validity and comparability of study findings, and suggest these be incorporated into future recommendations for performance assessment.
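The sketch below illustrates the flavour of the "Derived" approach discussed above: comparing an algorithm's daily alarms against labelled outbreak days to obtain a false alarm rate and a time-to-detection. The labelling scheme and metric definitions are illustrative assumptions, not the framework proposed in the paper; sensitivity would be the proportion of simulated outbreaks for which "detected" is true across many replications.

```python
# Derived-style performance metrics for one outbreak scenario.
import numpy as np

def detection_metrics(alarms, outbreak_days):
    """alarms, outbreak_days: boolean arrays of equal length (one entry per day)."""
    alarms = np.asarray(alarms, dtype=bool)
    outbreak_days = np.asarray(outbreak_days, dtype=bool)
    hits = np.where(alarms & outbreak_days)[0]           # alarms inside the outbreak
    onset = int(np.argmax(outbreak_days)) if outbreak_days.any() else None
    return {
        "detected": hits.size > 0,                        # any alarm during the outbreak?
        "false_alarm_rate": float((alarms & ~outbreak_days).mean()),
        "days_to_detection": int(hits[0] - onset) if hits.size else None,
    }
```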
BMC Medical Informatics and Decision Making | 2007
Rochelle E Watkins; Serryn Eagleson; Sam D. Beckett; Graeme Garner; Bert Veenendaal; Graeme Wright; Aileen J. Plant
Background: The ability to detect disease outbreaks in their early stages is a key component of efficient disease control and prevention. With the increased availability of electronic health-care data and spatio-temporal analysis techniques, there is great potential to develop algorithms to enable more effective disease surveillance. However, to ensure that the algorithms are effective they need to be evaluated. The objective of this research was to develop a transparent, user-friendly method to simulate spatio-temporal disease outbreak data for outbreak detection algorithm evaluation.
Methods: A state-transition model which simulates disease outbreaks in daily time steps using specified disease-specific parameters was developed to model the spread of infectious diseases transmitted by person-to-person contact. The software was developed using the MapBasic programming language for the MapInfo Professional geographic information system environment.
Results: The simulation model developed is a generalised and flexible model which utilises the underlying distribution of the population and incorporates patterns of disease spread that can be customised to represent a range of infectious diseases and geographic locations. This model provides a means to explore the ability of outbreak detection algorithms to detect a variety of events across a large number of stochastic replications where the influence of uncertainty can be controlled. The software also allows historical data which are free from known outbreaks to be combined with simulated outbreak data to produce files for algorithm performance assessment.
Conclusion: This simulation model provides a flexible method to generate data which may be useful for the evaluation and comparison of outbreak detection algorithm performance.
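The published tool is written in MapBasic for MapInfo Professional; the sketch below is a simplified, stand-alone illustration of the same idea: a daily-time-step state-transition simulation of person-to-person spread over a set of small areas. The adjacency structure, transmission parameters and spill-over term are assumptions for illustration only.

```python
# Simplified daily state-transition outbreak simulator over small areas.
import random

def simulate_outbreak(populations, neighbours, seed_area, days=60,
                      beta=0.3, spill=0.05, recovery=0.2, seed=1):
    """populations: {area: population}; neighbours: {area: [adjacent areas]}."""
    rng = random.Random(seed)
    state = {a: {"S": populations[a], "I": 0, "R": 0} for a in populations}
    state[seed_area]["I"] = 1
    state[seed_area]["S"] -= 1
    daily_cases = []
    for _ in range(days):
        today = {}
        for a, s in state.items():
            # infection pressure from within the area plus neighbouring areas
            pressure = beta * s["I"] / populations[a]
            pressure += spill * sum(state[n]["I"] / populations[n]
                                    for n in neighbours.get(a, []))
            today[a] = sum(rng.random() < pressure for _ in range(s["S"]))
        for a, s in state.items():
            recovered = int(recovery * s["I"])
            s["S"] -= today[a]
            s["I"] += today[a] - recovered
            s["R"] += recovered
        daily_cases.append(today)
    return daily_cases          # per-area daily case counts for algorithm testing
```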
Computers, Environment and Urban Systems | 2002
Serryn Eagleson; Francisco Escobar; Ian Williamson
Throughout history, humankind has segmented and delineated the geographic environment in various ways to support administrative, political and economic activities. To date, the majority of spatial boundaries have been constructed in an uncoordinated manner, with individual organisations generating individual boundaries to meet their own specific needs. As a result of this lack of coordination, information is fragmented across a series of boundary units, which not only limits the potential uses for the data collected, but also the scope of analysis possible between boundary layers. The solution outlined in this research involves the reorganisation of the spatial environment based on Hierarchical Spatial Reasoning (HSR) and the application of a GIS-based algorithm for the automated delineation of boundaries. By using this approach, it is expected that administrative boundaries can be formed through the aggregation of smaller units. The proposed system is focused on facilitating rapid and efficient cross-analysis of data sets.
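To make the bottom-up aggregation idea concrete, the sketch below greedily merges adjacent small units until each aggregate reaches a target population. This is only an illustration of the general approach; the published HSR-based algorithm also handles shape and other agency constraints, and the unit names, adjacency structure and population target here are assumptions.

```python
# Greedy aggregation of small spatial units into larger regions
# subject to a minimum-population constraint.
def aggregate_units(populations, adjacency, target_pop):
    """populations: {unit: pop}; adjacency: {unit: set(adjacent units)}."""
    region_of = {u: u for u in populations}          # unit -> region label
    region_pop = dict(populations)                   # region label -> population
    changed = True
    while changed:
        changed = False
        for r in sorted(region_pop, key=region_pop.get):
            if region_pop[r] >= target_pop:
                continue                             # this region is already big enough
            members = [u for u, lab in region_of.items() if lab == r]
            neigh_regions = {region_of[n] for m in members
                             for n in adjacency[m] if region_of[n] != r}
            if not neigh_regions:
                continue                             # isolated; nothing to merge with
            best = min(neigh_regions, key=region_pop.get)
            for u in members:                        # absorb region r into best
                region_of[u] = best
            region_pop[best] += region_pop.pop(r)
            changed = True
            break                                    # restart with updated regions
    return region_of                                 # unit -> aggregated region label
```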
International Journal of Health Geographics | 2013
Hannah Badland; Marcus White; Gus Macaulay; Serryn Eagleson; Suzanne Mavoa; Christopher Pettit; Billie Giles-Corti
Background: Pedestrian-friendly neighborhoods with proximal destinations and services encourage walking and decrease car dependence, thereby contributing to more active and healthier communities. Proximity to key destinations and services is an important aspect of the urban design decision-making process, particularly in areas adopting a transit-oriented development (TOD) approach to urban planning, whereby densification occurs within walking distance of transit nodes. Modeling destination access within neighborhoods has been limited to circular catchment buffers or more sophisticated network buffers generated using geoprocessing routines within geographical information systems (GIS). Both circular and network-buffer catchment methods are problematic. Circular catchment models do not account for street networks, and thus do not allow exploratory ‘what-if’ scenario modeling; network-buffering functionality typically exists within proprietary GIS software, which can be costly and requires a high level of expertise to operate.
Methods: This study sought to overcome these limitations by developing an open-source, simple agent-based walkable catchment tool that can be used by researchers, urban designers, planners, and policy makers to test scenarios for improving neighborhood walkable catchments. A simplified version of an agent-based model was ported to a vector-based open-source GIS web tool using data derived from the Australian Urban Research Infrastructure Network (AURIN). The tool was developed and tested with input from an end-user stakeholder working group.
Results: The resulting model has proven to be effective and flexible, allowing stakeholders to assess and optimize the walkability of neighborhood catchments around actual or potential nodes of interest (e.g., schools, public transport stops). Users can derive a range of metrics to compare the different scenarios modeled. These include: catchment area versus circular buffer ratios; mean number of streets crossed; and modeling of different walking speeds and wait times at intersections.
Conclusions: The tool has the capacity to influence planning and public health advocacy and practice, and because it uses open-source software, it is available for use locally and internationally. There is also scope to extend this version of the tool from a simple to a complex model, which includes agents (i.e., simulated pedestrians) ‘learning’ and incorporating other environmental attributes that enhance walkability (e.g., residential density, mixed land use, traffic volume).
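The sketch below computes one of the metrics mentioned above, the catchment-area-to-circular-buffer ratio, on a generic street network graph using networkx and shapely. It is not the published web tool; the graph construction, projected node coordinates and the 800 m walking threshold are assumptions.

```python
# Ratio of the network walkable catchment to the crow-flies circular buffer.
import math
import networkx as nx
from shapely.geometry import MultiPoint, Point

def catchment_ratio(G, origin, max_dist=800.0):
    """G: graph whose nodes carry 'x'/'y' in metres and edges a 'length' weight."""
    # nodes reachable within max_dist metres along the street network
    reach = nx.single_source_dijkstra_path_length(
        G, origin, cutoff=max_dist, weight="length")
    pts = [Point(G.nodes[n]["x"], G.nodes[n]["y"]) for n in reach]
    if len(pts) < 3:
        return 0.0
    network_catchment = MultiPoint(pts).convex_hull.area   # area actually reachable
    circular_buffer = math.pi * max_dist ** 2               # idealised circular catchment
    return network_catchment / circular_buffer              # closer to 1 = more walkable
```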
International Journal of Geographical Information Science | 2003
Francisco Javier Escobar Martínez; Serryn Eagleson; Ian Williamson
This paper addresses the problems associated with the integration of data between incongruent boundary systems. Currently, the majority of spatial boundaries are designed in an uncoordinated manner with individual organizations generating individual boundaries to meet individual needs. As a result, current technologies for analysing geospatial information, such as geographical information systems (GISs), are not reaching their full potential. In response to the problem of uncoordinated boundaries, the authors present an algorithm for the hierarchical structuring of administrative boundaries. This algorithm applies hierarchical spatial reasoning (HSR) theory to the automated structuring of polygons. In turn, these structured boundary systems facilitate accurate data integration and analysis whilst meeting the spatial requirements of selected agencies. The algorithm is presented in two parts. The first part outlines previous research undertaken by the authors into the delineation of administrative boundaries in metropolitan regions. The second part outlines the distinctly different constraints required for administrative-boundary design in rural areas. The development of the algorithm has taken place in a GIS environment utilizing Avenue, an object-orientated programming language that operates under ArcView, the desktop software developed and distributed by ESRI.
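Building on the aggregation sketch above, the snippet below illustrates how the metropolitan and rural variants of such an algorithm might differ only in the constraint applied to each aggregate. Both predicates and their thresholds are illustrative assumptions, not the constraints used in the published algorithm.

```python
# Swappable constraint predicates for metropolitan vs rural boundary design.
def metro_constraint(region_pop, region_area_km2):
    return region_pop >= 20_000                 # population-driven target (assumed)

def rural_constraint(region_pop, region_area_km2):
    # sparse populations: cap spatial extent rather than chase population (assumed)
    return region_pop >= 2_000 or region_area_km2 >= 5_000.0

def all_satisfied(regions, constraint):
    """regions: {label: (population, area_km2)} -> are all aggregates acceptable?"""
    return all(constraint(pop, area) for pop, area in regions.values())
```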
Australian Geographical Studies | 2003
S.D. Jones; Serryn Eagleson; Francisco Escobar; G.J. Hunter
It is now common practice for users of geographic information to link data held at the postcode level to data obtained from the national census. This paper examines the relationship between Australia Post (AP) postcodes and Australian Bureau of Statistics (ABS) derived postal areas, which are an approximation of the former based on aggregated census collection districts (CDs). A group of adjacent ABS postal areas in northwest Melbourne was compared with the true AP postcode areas they purported to represent, and the discrepancies were investigated. Firstly, shape mismatches were studied and their potential impacts upon resource allocation decisions were assessed. Next, comparisons of areas were undertaken. It was found that, in established inner-city urban areas, the two sets of boundaries were highly correlated. However, outer suburban neighbourhoods were identified as being particularly prone to major areal discrepancies. The implications of mismatches between these two key boundary data sources may be severe, given that management decisions and the allocation of public and private resources are often based on spatial statistical analyses which use these data sets. The authors acknowledge ABS efforts in providing information at the levels of aggregation that society demands. The introduction of ABS postal areas data has undoubtedly facilitated the use of demographic data in many sectors; it has, however, also caused some problems, for instance when users assume that ABS postal areas are identical to AP postcodes. These issues could easily be avoided with the inclusion of more comprehensive metadata documentation accompanying ABS data. Research is continuing to develop a method by which agencies may derive common boundaries for their administrative units, yet still meet their own individual data and sampling requirements.
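The sketch below shows one simple way to quantify the kind of areal mismatch discussed above, comparing an ABS postal area polygon with the AP postcode polygon it purports to represent using shapely. The intersection-over-union metric and the rectangle geometries are illustrative assumptions, not the comparison method used in the paper.

```python
# Areal agreement between two boundary polygons (1.0 = identical footprints).
from shapely.geometry import Polygon

def areal_agreement(postal_area: Polygon, postcode: Polygon) -> float:
    inter = postal_area.intersection(postcode).area
    union = postal_area.union(postcode).area
    return inter / union if union else 0.0

# Example with two deliberately offset rectangles standing in for boundaries.
a = Polygon([(0, 0), (4, 0), (4, 3), (0, 3)])
b = Polygon([(1, 0), (5, 0), (5, 3), (1, 3)])
print(f"agreement: {areal_agreement(a, b):.2f}")   # ~0.60
```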
International Journal of E-Planning Research archive | 2017
Abbas Rajabifard; Ian D. Bishop; Serryn Eagleson; Christopher Pettit; Hannah Badland; Jennifer Day; John Furler; Mohsen Kalantari; Sophie Sturup; Marcus White
This paper introduces an online spatial data portal with advanced data access, analytical and visualisation capabilities which can be used for evidence-based city planning and for supporting data-driven research. Through a case study approach focused on the city of Melbourne, the authors show how the Australian Urban Research Infrastructure Network (AURIN) portal can be used to investigate a multi-faceted approach to understanding the various spatial dimensions of livability. While the tools explore separate facets of livability (employment, housing, health services and walkability), their outputs flow through to the other tools, showing the benefits of integrated systems.
Australian Planner | 2013
Chris A. Hale; Serryn Eagleson
This paper offers a station-focused snapshot of growth and movement dynamics in Melbourne's passenger rail network. A variety of data is engaged and interpreted around a number of core themes, including growth at higher-volume and selected stations, the role of transfer, and the identification of locations generating a distinct afternoon and evening (PM) home-return market. Ridership figures are analysed and discussed with reference to both urban planning policy contexts and the pure mass transit challenges and opportunities that they imply. With regard to station volumes, the analysis draws on crude agency data to develop a clustering of relatively high passenger volume stations based on four distinct volume bands. While the Central Business District (CBD) core cluster is well known, an important set of lower-order, but still high-volume, stations is identified, towards which facilities and investment should possibly be directed to a greater degree. The second analysis task surrounds the identification of stations experiencing the strongest dynamic of recent passenger growth in percentage terms. Some of these stations are new, while others are clearly experiencing the impacts of some form of locally driven growth, either population- or job-based. Others may simply be reflecting strong growth in the CBD-bound commuter market. In many instances, the full extent of growth is not readily explained by broad-based background demographics, and this is an important finding in and of itself. A third analysis exercise identifies the twelve stations in Melbourne which hold the key to the rail-to-rail transfer task of the Melbourne system. This aspect of transfer relates closely to the idea of developing a more ‘network’ style public transport system in Melbourne over time. The final piece of analysis surrounds the identification of stations that cater to a clear PM peak ‘return journey’ market. While the CBD-located stations obviously lead this role, there are a number of other locations where PM peak volumes would seem to speak to the emergence of distinct CBD-alternative job clusters. Figures and analysis are then contextualised with reference to both Melbourne's stated and emerging aims of sustainable growth and development. Discussion suggests that the conceptualisation of rail in Melbourne is somewhat outdated and that a mixture of ‘push and pull’ factors needs to be engaged with for the metropolitan region's transport paradigm to enter the twenty-first century. These issues are broadened to reference changes, pressures and transport industry cultural challenges that are currently observable in many Australian and North American cities.
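The sketch below illustrates the kind of banding and growth calculation described above: grouping stations into four volume bands and ranking percentage growth between two years with pandas. The station names, band edges and patronage figures are illustrative assumptions, not the agency data analysed in the paper.

```python
# Band stations by patronage volume and rank percentage growth.
import pandas as pd

stations = pd.DataFrame({
    "station": ["Flinders St", "Footscray", "Box Hill", "Dandenong", "Laverton"],
    "entries_2008": [28_000_000, 3_100_000, 4_000_000, 3_600_000, 900_000],
    "entries_2011": [30_500_000, 3_900_000, 4_500_000, 4_200_000, 1_400_000],
})

# Four volume bands on the latest year's patronage (edges are assumptions).
bands = [0, 1_000_000, 3_000_000, 10_000_000, float("inf")]
labels = ["low", "medium", "high", "CBD-scale"]
stations["band"] = pd.cut(stations["entries_2011"], bins=bands, labels=labels)

# Percentage growth between the two years, sorted to surface fast growers.
stations["growth_pct"] = (stations["entries_2011"] / stations["entries_2008"] - 1) * 100
print(stations.sort_values("growth_pct", ascending=False))
```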