Bert Veenendaal
Curtin University
Publications
Featured research published by Bert Veenendaal.
BMC Medical Informatics and Decision Making | 2008
Rochelle E Watkins; Serryn Eagleson; Bert Veenendaal; Graeme Wright; Aileen J. Plant
Background: The automated monitoring of routinely collected disease surveillance data has the potential to ensure that important changes in disease incidence are promptly recognised. However, few studies have established whether the signals produced by automated monitoring methods correspond with events considered by epidemiologists to be of public health importance. This study investigates the correspondence between retrospective epidemiological evaluation of notifications of Ross River virus (RRv) disease in Western Australia and the signals produced by two cumulative sum (cusum)-based automated monitoring methods. Methods: RRv disease case notification data between 1991 and 2004 were assessed retrospectively by two experienced epidemiologists, and the timing of identified outbreaks was compared with signals generated by two different types of cusum-based automated monitoring algorithms: the three Early Aberration Reporting System (EARS) cusum algorithms (C1, C2 and C3), and a negative binomial cusum. Results: We found the negative binomial cusum to have a significantly greater area under the receiver operator characteristic curve than the EARS algorithms, suggesting that the negative binomial cusum agrees more closely with epidemiological opinion than the EARS algorithms with respect to the existence of outbreaks of RRv disease, particularly at low false alarm rates. However, the performance of the individual EARS and negative binomial cusum algorithms was not significantly different when timeliness was also incorporated into the area under the curve analyses. Conclusion: Our retrospective analysis of historical data suggests that, compared with the EARS algorithms, the negative binomial cusum provides greater sensitivity for the detection of outbreaks of RRv disease at low false alarm levels, and decreased timeliness early in the outbreak period. Prospective studies are required to investigate the potential usefulness of these algorithms in practice.
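The two families of algorithms compared above are standard enough to sketch. The following is a minimal illustration of the EARS C1 statistic (a count compared against a 7-day baseline) and a negative binomial cusum of log-likelihood ratios; the baseline window, dispersion parameter r, in-control and out-of-control means, and alarm thresholds are illustrative assumptions, not the values calibrated in the study.

```python
# Hedged sketch of two alarm statistics: the EARS C1 statistic and a
# negative binomial CUSUM. Parameters and thresholds are illustrative.
import numpy as np

def ears_c1(counts, threshold=3.0):
    """Flag days where the count exceeds the 7-day baseline mean by
    `threshold` baseline standard deviations."""
    counts = np.asarray(counts, dtype=float)
    alarms = np.zeros(len(counts), dtype=bool)
    for t in range(7, len(counts)):
        baseline = counts[t - 7:t]
        sd = baseline.std(ddof=1)
        if sd > 0 and (counts[t] - baseline.mean()) / sd > threshold:
            alarms[t] = True
    return alarms

def neg_binomial_cusum(counts, mu0, mu1, r=5.0, h=4.0):
    """Accumulate the negative binomial log-likelihood ratio of an elevated
    mean mu1 against the in-control mean mu0; alarm when the CUSUM exceeds h."""
    p0, p1 = r / (r + mu0), r / (r + mu1)
    s, alarms = 0.0, []
    for y in counts:
        llr = r * np.log(p1 / p0) + y * np.log((1 - p1) / (1 - p0))
        s = max(0.0, s + llr)                 # reset at zero, never negative
        alarms.append(s >= h)
    return np.array(alarms)
```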
BMC Medical Informatics and Decision Making | 2009
Rochelle E Watkins; Serryn Eagleson; Bert Veenendaal; Graeme Wright; Aileen J. Plant
Background: Routine surveillance of disease notification data can enable the early detection of localised disease outbreaks. Although hidden Markov models (HMMs) have been recognised as an appropriate method for modelling disease surveillance data, they have rarely been applied in public health practice. We aimed to develop and evaluate a simple, flexible HMM for disease surveillance that is suitable for use with sparse small area count data and requires little baseline data. Methods: A Bayesian HMM was designed to monitor routinely collected notifiable disease data that are aggregated by residential postcode. Semi-synthetic data were used to evaluate the algorithm and compare outbreak detection performance with the established Early Aberration Reporting System (EARS) algorithms and a negative binomial cusum. Results: Algorithm performance varied according to the desired false alarm rate for surveillance. At false alarm rates around 0.05, the cusum-based algorithms provided the best overall outbreak detection performance, having similar sensitivity to the HMMs and a shorter average time to detection. At false alarm rates around 0.01, the HMM algorithms provided the best overall outbreak detection performance, having higher sensitivity than the cusum-based methods and a generally shorter time to detection for larger outbreaks. Overall, the 14-day HMM had a significantly greater area under the receiver operator characteristic curve than the EARS C3 and 7-day negative binomial cusum algorithms. Conclusion: Our findings suggest that the HMM provides an effective method for the surveillance of sparse small area notifiable disease data at low false alarm rates. Further investigations are required to evaluate algorithm performance across other diseases and surveillance contexts.
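As a rough illustration of the idea behind HMM-based surveillance, the sketch below filters daily counts with a two-state (endemic vs. outbreak) Poisson hidden Markov model and reports the posterior probability of the outbreak state. This is a deliberately simplified stand-in for the paper's Bayesian HMM for postcode-aggregated data; the emission rates, transition probabilities and alarm cut-off are assumed values.

```python
# Simplified two-state Poisson HMM forward filter (endemic vs. outbreak).
import numpy as np
from scipy.stats import poisson

def outbreak_probability(counts, lam_endemic=0.5, lam_outbreak=3.0,
                         p_stay_endemic=0.95, p_stay_outbreak=0.80):
    trans = np.array([[p_stay_endemic, 1 - p_stay_endemic],
                      [1 - p_stay_outbreak, p_stay_outbreak]])
    rates = np.array([lam_endemic, lam_outbreak])
    belief = np.array([0.99, 0.01])           # prior: almost surely endemic
    probs = []
    for y in counts:
        belief = trans.T @ belief             # predict the next hidden state
        belief *= poisson.pmf(y, rates)       # weight by the observed count
        belief /= belief.sum()                # renormalise to a distribution
        probs.append(belief[1])               # P(outbreak state | data so far)
    return np.array(probs)

# Example: flag days where the outbreak-state probability exceeds 0.5.
daily_counts = [0, 1, 0, 0, 2, 4, 5, 3, 1, 0]
print(outbreak_probability(daily_counts) > 0.5)
```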
Remote Sensing | 2009
Soo-See Chai; Jeffrey P. Walker; Oleg Makarynskyy; Michael Kuhn; Bert Veenendaal; Geoff A. W. West
Passive microwave remote sensing is one of the most promising techniques for soil moisture retrieval. However, the inversion of soil moisture from brightness temperature observations is not straightforward, as it is influenced by numerous factors such as surface roughness, vegetation cover, and soil texture. Moreover, the relationship between brightness temperature, soil moisture and the factors mentioned above is highly non-linear and ill-posed. Consequently, Artificial Neural Networks (ANNs) have been used to retrieve soil moisture from microwave data, but with limited success when dealing with data different to that from the training period. In this study, an ANN is tested for its ability to predict soil moisture at 1 km resolution on different dates following training at the same site for a specific date. A novel approach that utilizes information on the variability of soil moisture, in terms of its mean and standard deviation for a (sub) region of spatial dimension up to 40 km, is used to improve the current retrieval accuracy of the ANN method. A comparison between the ANN with and without the use of the variability information showed that this enhancement enables the ANN to achieve an average Root Mean Square Error (RMSE) of around 5.1% v/v when using the variability information, as compared to around 7.5% v/v without it. The accuracy of the soil moisture retrieval was
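A minimal sketch of the retrieval setup described above: a small neural network maps brightness temperature (plus ancillary inputs) to soil moisture, trained once without and once with the (sub)regional mean and standard deviation of soil moisture as extra inputs. The synthetic data, feature set and network size are assumptions for illustration only and do not reproduce the study's configuration or its reported RMSE values.

```python
# Hedged sketch: compare an ANN retrieval with and without subregional
# soil moisture variability (mean and standard deviation) as extra inputs.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 500
tb = rng.uniform(220, 290, n)                  # brightness temperature (K)
region_mean = rng.uniform(5, 35, n)            # subregion mean soil moisture (% v/v)
region_std = rng.uniform(1, 8, n)              # subregion std of soil moisture (% v/v)
soil_moisture = 60 - 0.18 * tb + 0.6 * region_mean + rng.normal(0, 2, n)

X_base = np.column_stack([tb])
X_var = np.column_stack([tb, region_mean, region_std])

for name, X in [("without variability", X_base), ("with variability", X_var)]:
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
    model.fit(X[:400], soil_moisture[:400])
    rmse = mean_squared_error(soil_moisture[400:], model.predict(X[400:])) ** 0.5
    print(f"{name}: RMSE = {rmse:.2f} % v/v")
```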
BMC Medical Informatics and Decision Making | 2007
Rochelle E Watkins; Serryn Eagleson; Sam D. Beckett; Graeme Garner; Bert Veenendaal; Graeme Wright; Aileen J. Plant
Background: The ability to detect disease outbreaks in their early stages is a key component of efficient disease control and prevention. With the increased availability of electronic health-care data and spatio-temporal analysis techniques, there is great potential to develop algorithms that enable more effective disease surveillance. However, to ensure that such algorithms are effective they need to be evaluated. The objective of this research was to develop a transparent, user-friendly method to simulate spatio-temporal disease outbreak data for the evaluation of outbreak detection algorithms. Methods: A state-transition model that simulates disease outbreaks in daily time steps using specified disease-specific parameters was developed to model the spread of infectious diseases transmitted by person-to-person contact. The software was developed using the MapBasic programming language for the MapInfo Professional geographic information system environment. Results: The simulation model developed is a generalised and flexible model which utilises the underlying distribution of the population and incorporates patterns of disease spread that can be customised to represent a range of infectious diseases and geographic locations. This model provides a means to explore the ability of outbreak detection algorithms to detect a variety of events across a large number of stochastic replications where the influence of uncertainty can be controlled. The software also allows historical data which are free from known outbreaks to be combined with simulated outbreak data to produce files for algorithm performance assessment. Conclusion: This simulation model provides a flexible method to generate data which may be useful for the evaluation and comparison of outbreak detection algorithm performance.
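The original simulator was written in MapBasic for the MapInfo Professional environment; the sketch below illustrates the general idea of a stochastic, daily-time-step state-transition outbreak model in Python. The single-population chain-binomial structure, parameter names and default values are assumptions, not the model specified in the paper.

```python
# Hedged sketch of a daily-time-step state-transition outbreak simulator.
import numpy as np

def simulate_outbreak(population=10_000, initial_infected=1, r0=2.0,
                      infectious_days=5, n_days=60, seed=0):
    """Return the simulated count of new cases per day."""
    rng = np.random.default_rng(seed)
    beta = r0 / infectious_days                 # daily transmission rate
    s, i, r = population - initial_infected, initial_infected, 0
    new_cases = []
    for _ in range(n_days):
        p_infect = 1 - np.exp(-beta * i / population)
        infections = rng.binomial(s, p_infect)              # S -> I today
        recoveries = rng.binomial(i, 1 / infectious_days)   # I -> R today
        s, i, r = s - infections, i + infections - recoveries, r + recoveries
        new_cases.append(infections)
    return new_cases

# Example: one stochastic replication to overlay on outbreak-free baseline data.
print(simulate_outbreak())
```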
International Journal of Digital Earth | 2014
Youhei Kawamura; Ashraf M. Dewan; Bert Veenendaal; Masahiro Hayashi; Takeshi Shibuya; Itaru Kitahara; Hajime Nobuhara; Kento Ishii
Communications network damage resulting from a large disaster makes it difficult to rapidly understand the current situation and thus make appropriate decisions towards mitigating problems, such as where to send and dispense emergency supplies. The research outlined in this paper focuses on the rapid construction of a network after a disaster occurs. This study proposes the use of ZigBee and geographic information system (GIS) technologies to resolve these problems and provide an effective communication system. The experimental results of the ZigBee network system are presented, examples are provided of the mapping and analysis undertaken using GIS for the disaster-stricken area of Tsukuba City, Japan, and the communications node arrangements are determined for this region. These results demonstrate the effectiveness of establishing such a communications system to support efforts to relieve disaster-damaged areas.
Transactions in GIS | 2011
Suzana Dragicevic; Songnian Li; Maria Antonia Brovelli; Bert Veenendaal
The Web has witnessed a dramatic increase in usage and corresponding requests for new applications and services. Given the speed and extent of developments within geography, cartography, and geographic information science and engineering, it seems a very long time since the first web map viewer was developed by Xerox Corporation, but in fact that was only in 1993 (Dragicevic 2004). Since then, web-based geographic information systems and services have evolved from web cartography (Kraak and Brown 2000) and web or Internet GIS (Peng and Tsou 2003) to the geospatial web or GeoWeb (Scharl and Tochtermann 2007). The popularity of social networking, blogging, and multimedia data streaming, as well as the rise of Citizen Science (Cooper et al. 2007) and Volunteered Geographic Information (Goodchild 2007), has required enhancements in web mapping applications and services as well as geoprocessing capabilities to keep pace with user demands and an ever increasing user base. The Web is now rapidly progressing into the de facto platform on which to deliver web mapping applications and geoprocessing services (Fu and Sun 2010, Li et al. 2011).

In this context, the First International Workshop on Pervasive Web Mapping, Geoprocessing and Services was held on the Politecnico di Milano campus in Como, Italy from August 26 to 27, 2010, with the goal of examining the existing capabilities and future potential of pervasive web mapping, geoprocessing and services. The Workshop was organized by the International Society for Photogrammetry and Remote Sensing (ISPRS) Working Group (WG) IV/5 on “Distributed, Web-based Geoinformation Services and Applications” and Politecnico di Milano, and co-organized by ISPRS WG IV/1 on “Geospatial Data Infrastructure”, WG IV/4 on “Virtual Globes and Context-Aware Visualization”, and ICWG IV/II on “GeoSensor Networking and GEOGRID”. Six of the workshop papers that integrated software development and application design were invited to develop full manuscripts. All the manuscripts were rigorously peer-reviewed and are now presented in this special journal issue.

The first article, “Exploring the Boundaries of Web Map Services: The Example of the Online Injury Atlas for Ontario” by Claus Rinner, Byron Moldofsky, Michael Cusimano, Sean Marshall and Tony Hernandez, begins with an overview of web mapping related to public health decision-making and the integration of health information into spatial data infrastructures. More particularly, a web atlas was designed to integrate injury-related data collected separately and by a variety of stakeholders for use in decision-making and injury prevention. Open Source Web mapping frameworks and
International Journal of Digital Earth | 2014
Jacob Delfos; Bert Veenendaal; Tele Tan
Web-based geographic information systems have advanced rapidly on the back of web-based technologies, increased bandwidths, and access to Digital Earth imagery and functionality. However, these advances mean that system capabilities are slowly outpacing those of end-users. Additionally, the introduction of non-desktop devices such as smartphones, tablets and netbooks is starting to undo progress made towards the standardisation of web-based technology. Large variations in screen sizes, computational power, bandwidth, and operating environments are once again introducing the need to ensure that software remains functional across different platforms, standards-compliant or not. These two issues highlight the need for a mechanism to tune content and capability to end-users and their environment, to prevent information and complexity overload in a field already troubled by poor usability, while promoting cross-platform compatibility. This paper proposes the use of adaptivity to accommodate users from different backgrounds accessing web mapping systems in different technical environments. It describes adaptive profiles aligned to the finite number of states a system can adopt, rather than the limitless range of user or environment characteristics that cannot be adapted to. Each profile consists of a combination of adaptive states comprising functionality, information detail, or technical demands to optimise for individual users or technical environments.
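A hypothetical sketch of how a finite set of adaptive profiles might be represented and selected from coarse user and environment characteristics, in the spirit of the approach described above. The profile names, adaptive states and selection rules are invented for illustration and are not those defined in the paper.

```python
# Hypothetical adaptive-profile data structure and selection rule.
from dataclasses import dataclass

@dataclass(frozen=True)
class Profile:
    functionality: str      # breadth of the tool set offered
    detail: str             # level of information detail shown on the map
    demand: str             # technical demand placed on the client device

PROFILES = {
    "mobile_novice":  Profile("basic",    "summary", "low"),
    "mobile_expert":  Profile("advanced", "summary", "low"),
    "desktop_novice": Profile("basic",    "full",    "high"),
    "desktop_expert": Profile("advanced", "full",    "high"),
}

def select_profile(screen_width_px: int, expert_user: bool) -> Profile:
    """Map environment and user characteristics onto one of the finite profiles."""
    device = "mobile" if screen_width_px < 800 else "desktop"
    skill = "expert" if expert_user else "novice"
    return PROFILES[f"{device}_{skill}"]

print(select_profile(screen_width_px=390, expert_user=False))
```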
International Journal of Digital Earth | 2014
Bert Veenendaal; Songnian Li; Suzana Dragicevic; Maria Antonia Brovelli
Journal of Location Based Services | 2008
Jacob Delfos; Tele Tan; Bert Veenendaal
Although research in location-based services (LBS) is advancing well, the problem of obtaining a position for the user remains a major obstacle. Commonly available methods suffer from problems of availability, financial cost, and lack of precision or accuracy. IP addresses tend to be spatially clustered, which makes them attractive as a means for positioning. IP-based positioning would be applicable to any immobile device or interface, such as a computer or a wireless access point. Although it is widely assumed that LBS equates to mobile computing, in reality the audience among static users in homes and offices may in fact be greater at this point in time. VRILS (varying resolution IP locating system) uses the relationship between network clusters and spatial clusters to provide positions for IP addresses. It uses different levels of spatial precision to cope with conflicting locations within subnets, which enhances the chance of being able to provide a location. VRILS has been tested on the campus of Curtin University, where the positions of 461 IP addresses were used in a network of over 20,000 computers. The outcome showed perfect results at the broadest spatial resolution of ‘campus’, and a reasonable result at the resolution of ‘building’. Randomness of IP addresses across certain buildings was shown to strongly affect the accuracy. In general, accurate positions could be obtained with a relatively small amount of data, but a lack of spatial clustering would decrease the efficiency to that of simple lookups.
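A hypothetical sketch of varying-resolution IP-based positioning in the spirit of VRILS: known IP addresses are grouped into subnets, and a query falls back from the finest resolution (building) to a coarser one (campus) when the known locations within a subnet conflict. The addresses, building names and /24 subnet granularity are invented for illustration.

```python
# Hypothetical varying-resolution IP positioning lookup.
from collections import defaultdict
from ipaddress import ip_address, ip_network

# Training data: IP addresses with known building locations on one campus.
known_locations = {
    "134.7.10.21": "Building 204",
    "134.7.10.87": "Building 204",
    "134.7.11.5":  "Building 105",
    "134.7.11.60": "Building 314",   # conflicting building within this subnet
}

subnet_buildings = defaultdict(set)
for ip, building in known_locations.items():
    subnet_buildings[ip_network(f"{ip}/24", strict=False)].add(building)

def locate(ip: str, campus: str = "Curtin Bentley campus") -> str:
    """Return a building when the subnet is spatially consistent, else the campus."""
    addr = ip_address(ip)
    for subnet, buildings in subnet_buildings.items():
        if addr in subnet:
            return next(iter(buildings)) if len(buildings) == 1 else campus
    return "unknown"

print(locate("134.7.10.200"))   # consistent subnet -> building-level answer
print(locate("134.7.11.200"))   # conflicting subnet -> campus-level fallback
```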
ISPRS International Journal of Geo-Information | 2017
Bert Veenendaal; Maria Antonia Brovelli; Songnian Li
Web mapping and the use of geospatial information online have evolved rapidly over the past few decades. Almost everyone in the world uses mapping information, whether or not one realizes it. Almost every mobile phone now has location services and every event and object on the earth has a location. The use of this geospatial location data has expanded rapidly, thanks to the development of the Internet. Huge volumes of geospatial data are available and daily being captured online, and are used in web applications and maps for viewing, analysis, modeling and simulation. This paper reviews the developments of web mapping from the first static online map images to the current highly interactive, multi-sourced web mapping services that have been increasingly moved to cloud computing platforms. The whole environment of web mapping captures the integration and interaction between three components found online, namely, geospatial information, people and functionality. In this paper, the trends and interactions among these components are identified and reviewed in relation to the technology developments. The review then concludes by exploring some of the opportunities and directions.