Benjamin J. K. Evans
Australian National University
Publication
Featured research published by Benjamin J. K. Evans.
International Journal of Digital Earth | 2016
Adam Lewis; Leo Lymburner; Matthew B. J. Purss; Brendan P. Brooke; Benjamin J. K. Evans; Alex Ip; Arnold G. Dekker; James R. Irons; Stuart Minchin; Norman Mueller; Simon Oliver; Dale Roberts; Barbara Ryan; Medhavy Thankappan; Robert Woodcock; Lesley Wyborn
The effort and cost required to convert satellite Earth Observation (EO) data into meaningful geophysical variables has prevented the systematic analysis of all available observations. To overcome these problems, we utilise an integrated High Performance Computing and Data environment to rapidly process, restructure and analyse the Australian Landsat data archive. In this approach, the EO data are assigned to a common grid framework that spans the full geospatial and temporal extent of the observations – the EO Data Cube. This approach is pixel-based and incorporates geometric and spectral calibration and quality assurance of each Earth surface reflectance measurement. We demonstrate the utility of the approach with rapid time-series mapping of surface water across the entire Australian continent using 27 years of continuous, 25 m resolution observations. Our preliminary analysis of the Landsat archive shows how the EO Data Cube can effectively liberate high-resolution EO data from their complex sensor-specific data structures and revolutionise our ability to measure environmental change.
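The per-pixel idea described above can be sketched in a few lines: once observations share a common grid, they stack into a (time, y, x) cube and each pixel carries its own time series. This is only an illustration with synthetic data and a simple NDWI threshold; the published continental water mapping uses a more sophisticated classifier.

```python
import numpy as np

# Synthetic stand-in for gridded surface-reflectance bands: observations on a
# common grid form a (time, y, x) cube, enabling per-pixel time-series work.
rng = np.random.default_rng(0)
n_times, ny, nx = 27, 4, 4            # e.g. 27 annual layers on a small tile
green = rng.uniform(0.01, 0.5, (n_times, ny, nx))
nir = rng.uniform(0.01, 0.5, (n_times, ny, nx))

# Normalised Difference Water Index: positive values suggest open water.
ndwi = (green - nir) / (green + nir)
water = ndwi > 0.0                     # boolean water mask per pixel per time

# Per-pixel summary over the full record: fraction of observations as water.
water_frequency = water.mean(axis=0)
print(water_frequency.shape)           # (4, 4)
```

The key point is that the cube layout turns "analyse 27 years of the continent" into plain array reductions along the time axis.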
international symposium on environmental software systems | 2015
Benjamin J. K. Evans; Lesley Wyborn; Tim Pugh; Chris Allen; Joseph Antony; Kashif Gohar; David Porter; Jon Smillie; Claire Trenham; Jingbo Wang; Alex Ip; Gavin Bell
The National Computational Infrastructure (NCI) at the Australian National University (ANU) has co-located a priority set of over 10 PetaBytes (PBytes) of national data collections within a HPC research facility. The facility provides an integrated high-performance computational and storage platform, or a High Performance Data (HPD) platform, to serve and analyse the massive amounts of data across the spectrum of environmental collections – in particular from the climate, environmental and geoscientific domains. The data is managed in concert with the government agencies, major academic research communities and collaborating overseas organisations. By co-locating the vast data collections with high performance computing environments and harmonising these large valuable data assets, new opportunities have arisen for Data-Intensive interdisciplinary science at scales and resolutions not hitherto possible.
General Relativity and Gravitation | 2002
Benjamin J. K. Evans; S. M. Scott; A. C. Searle
We have developed a new tool for numerical work in General Relativity: GRworkbench. We discuss how GRworkbench's implementation of a numerically amenable analogue of Differential Geometry facilitates the development of robust, chart-independent numerical algorithms. As an example, we consider geodesic tracing on two charts covering the exterior Schwarzschild space-time.
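Geodesic tracing of the kind mentioned above can be sketched numerically. This is not GRworkbench's actual interface; it is a minimal sketch of an equatorial timelike geodesic in the exterior Schwarzschild chart, integrating the effective-potential form of the radial equation with a hand-rolled Runge-Kutta step.

```python
M = 1.0  # black-hole mass in geometric units (G = c = 1)

def derivs(state, L):
    """Equatorial timelike geodesic: state = (r, dr/dtau, phi),
    with conserved angular momentum L."""
    r, rdot, _ = state
    # r'' = L^2/r^3 - M/r^2 - 3*M*L^2/r^4, from differentiating
    # (dr/dtau)^2 = E^2 - (1 - 2M/r)(1 + L^2/r^2)
    return (rdot, L**2 / r**3 - M / r**2 - 3.0 * M * L**2 / r**4, L / r**2)

def rk4(state, L, h):
    """One classic fourth-order Runge-Kutta step of proper time h."""
    def shift(s, k, c):
        return tuple(si + c * ki for si, ki in zip(s, k))
    k1 = derivs(state, L)
    k2 = derivs(shift(state, k1, h / 2), L)
    k3 = derivs(shift(state, k2, h / 2), L)
    k4 = derivs(shift(state, k3, h), L)
    return tuple(s + h / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Sanity check: a circular orbit at r0 = 8M needs L^2 = M r0^2 / (r0 - 3M),
# so the traced radius should stay at r0.
r0 = 8.0
L = (M * r0**2 / (r0 - 3.0 * M)) ** 0.5
state = (r0, 0.0, 0.0)
for _ in range(2000):
    state = rk4(state, L, 0.05)
print(round(state[0], 3))  # stays close to 8.0
```

A chart-independent design, as in the paper, would hide the coordinate form behind an abstract point type and switch charts automatically near coordinate degeneracies; the integrator itself is unchanged.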
General Relativity and Gravitation | 2000
Bronwen Whiting; B. Coldwell; S. M. Scott; Benjamin J. K. Evans; D. E. McClelland
This work incorporates a review of the status, in Australia, of data analysis for gravitational wave detection using laser interferometers, within an overview of the present state of such research in the world generally. In this context, data analysis refers not so much to signal simulation as to what might be called the thorough process of noise characterization and the subsequent, quality-controlled signal extraction. To the extent that problems identified here arise for all currently planned instruments, there is necessarily a global component to the discussion presented. In Australia, there are unique circumstances, associated with attempting to carry out work in gravitational wave detection, which demand also a local aspect to the ensuing discussion.
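The "quality-controlled signal extraction" discussed above centres on matched filtering: correlating data against a known waveform template. A minimal sketch, assuming white noise for simplicity (real detector pipelines first whiten against a measured noise spectrum), with an invented template and injection:

```python
import numpy as np

# Synthetic data: a damped sinusoid "signal" injected into white noise.
rng = np.random.default_rng(1)
n = 4096
t = np.arange(n) / 1024.0  # 4 s sampled at 1024 Hz
template = np.sin(2 * np.pi * 60 * t) * np.exp(-((t - 2.0) ** 2) / 0.01)

sigma = 0.5                             # white-noise standard deviation
data = 3.0 * template + rng.normal(0.0, sigma, n)

# Matched-filter signal-to-noise ratio for the template at zero lag:
# inner product of data with template, normalised by the noise level.
snr = np.dot(data, template) / (sigma * np.sqrt(np.dot(template, template)))
print(snr > 5.0)  # injected signal recovered well above a typical threshold
```

In practice the filter is evaluated at all time lags (via an FFT) and the noise characterisation step supplies the whitening that this white-noise sketch omits.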
Informatics | 2017
Benjamin J. K. Evans; Kelsey Druken; Jingbo Wang; Rui Yang; Clare Richards; Lesley Wyborn
To ensure seamless, programmatic access to data for High Performance Computing (HPC) and analysis across multiple research domains, it is vital to have a methodology for standardization of both data and services. At the Australian National Computational Infrastructure (NCI) we have developed a Data Quality Strategy (DQS) that currently provides processes for: (1) Consistency of data structures needed for a High Performance Data (HPD) platform; (2) Quality Control (QC) through compliance with recognized community standards; (3) Benchmarking cases of operational performance tests; and (4) Quality Assurance (QA) of data through demonstrated functionality and performance across common platforms, tools and services. By implementing the NCI DQS, we have seen progressive improvement in the quality and usefulness of the datasets across the different subject domains, and demonstrated the ease by which modern programmatic methods can be used to access the data, either in situ or via web services, and for uses ranging from traditional analysis methods through to emerging machine learning techniques. To help increase data re-usability by broader communities, particularly in high performance environments, the DQS is also used to identify the need for any extensions to the relevant international standards for interoperability and/or programmatic access.
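The "Quality Control through compliance with recognized community standards" step above amounts to automated metadata checks. This sketch is illustrative only, not NCI's actual DQS tooling: it tests a dataset's attributes against a few CF-convention-style requirements (the attribute lists and sample dataset are invented).

```python
# Attribute requirements in the style of the CF conventions for netCDF.
REQUIRED_GLOBAL = {"Conventions", "title", "institution"}
REQUIRED_VARIABLE = {"units", "long_name"}

def qc_report(global_attrs, variables):
    """Return a list of human-readable compliance problems (empty = pass)."""
    problems = []
    for attr in sorted(REQUIRED_GLOBAL - set(global_attrs)):
        problems.append(f"missing global attribute: {attr}")
    for name, attrs in variables.items():
        for attr in sorted(REQUIRED_VARIABLE - set(attrs)):
            problems.append(f"variable {name}: missing attribute {attr}")
    return problems

dataset = {
    "global": {"Conventions": "CF-1.6", "title": "Surface temperature"},
    "vars": {"tas": {"units": "K"}},
}
report = qc_report(dataset["global"], dataset["vars"])
print(report)  # flags the missing institution and long_name attributes
```

Checks like this are what make datasets uniformly readable by generic tools and web services rather than by bespoke per-collection scripts.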
D-lib Magazine | 2017
Jingbo Wang; Benjamin J. K. Evans; Lesley Wyborn; Amir Aryani; Melanie Barlow
This paper demonstrates the connectivity graphs made by the Research Data Switchboard (RD-Switchboard) using NCI's metadata database. Making research data connected, discoverable and reusable is one of the key enablers of the new data revolution in research. We show how the Research Data Switchboard identified critical missing information in our database, and what improvements have been made by this system. The connections made by the RD-Switchboard demonstrate the various uses of the datasets, and the network of researchers and cross-referenced publications.
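A connectivity graph of this kind can be sketched as typed nodes and edges, with "missing information" surfacing as nodes that lack expected links. All identifiers and relation names below are invented for illustration; they are not RD-Switchboard's schema.

```python
# Toy graph: researchers, publications and datasets linked by typed edges.
edges = [
    ("researcher:evans", "authored", "publication:doi-1"),
    ("researcher:wyborn", "authored", "publication:doi-1"),
    ("publication:doi-1", "cites", "dataset:ncdf-A"),
]
nodes = {"researcher:evans", "researcher:wyborn",
         "publication:doi-1", "dataset:ncdf-A", "dataset:ncdf-B"}

# Datasets that no publication cites are the gaps a linking system
# would flag for curation.
linked = {target for _, rel, target in edges if rel == "cites"}
unlinked_datasets = sorted(n for n in nodes
                           if n.startswith("dataset:") and n not in linked)
print(unlinked_datasets)  # the dataset with no cross-reference
```

Scaled up to millions of records, the same "walk the edges, report what is missing" pattern is what turns a metadata database into an improvable, navigable graph.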
D-lib Magazine | 2017
Jingbo Wang; Benjamin J. K. Evans; Lesley Wyborn; Nick Car; Edward King
Scientific research is published in journals so that the research community is able to share knowledge and results, verify hypotheses, contribute evidence-based opinions and promote discussion. However, it is hard to fully understand, let alone reproduce, the results if the complex data manipulation that was undertaken to obtain them is not clearly explained and/or the final data used is not available. Furthermore, the scale of research data assets has now exponentially increased to the point that, even when available, these data assets can be difficult to store and use. In this paper, we describe the solution we have implemented at the National Computational Infrastructure (NCI) whereby researchers can capture workflows using a standards-based provenance representation. This provenance information, combined with access to the original dataset and other related information systems, allows datasets to be regenerated as needed, which simultaneously addresses both result reproducibility and storage issues.
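A workflow capture of this kind can be sketched as a record in the style of the W3C PROV data model (entities, and the activities that used and generated them), with enough detail to replay a derivation instead of storing its output. The identifiers, file paths and the `regeneration_plan` helper below are invented for illustration, not NCI's implementation.

```python
# Minimal PROV-style record: which activity generated which entity, from what.
provenance = {
    "entities": {
        "raw": {"note": "archived source dataset"},
        "derived": {"derived_from": "raw"},
    },
    "activities": {
        "regrid": {"used": ["raw"], "generated": ["derived"],
                   "parameters": {"resolution_m": 25}},
    },
}

def regeneration_plan(prov, target):
    """List the (activity, parameters) pairs needed to regenerate `target`
    from stored source entities, by scanning generation links."""
    return [(name, act["parameters"])
            for name, act in prov["activities"].items()
            if target in act["generated"]]

print(regeneration_plan(provenance, "derived"))
```

Because the derived product is fully described by its inputs and parameters, it need not be kept on disk: the plan above is sufficient to recreate it on demand.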
international conference on big data | 2015
Dean N. Williams; Michael Lautenschlager; V. Balaji; Luca Cinquini; Cecelia DeLuca; Sebastien Denvil; Daniel Q. Duffy; Benjamin J. K. Evans; Robert D. Ferraro; Martin Juckes; Claire Trenham
This article describes the Earth System Grid Federation (ESGF) mission and an international integration strategy for data, database and computational architecture, and stable infrastructure highlighted by the authors (the ESGF Executive Committee). These highlights are key developments needed over the next five to seven years in response to large-scale national and international climate community projects that depend on ESGF for success. Quality assurance and baseline performance, from laptop to high performance computing, characterize available and potential data streams and strategies; these are required for interactive data collections to remedy gaps in handling enormous international federated climate data archives. Appropriate cyber security ensures protection of data according to project requirements while still allowing access and portability for different ESGF and individual groups and users. The article concludes with a timeline and plan for interoperable tools that will take ESGF from a federated database archive to a robust virtual laboratory.
international world wide web conferences | 2017
Jingbo Wang; Amir Aryani; Lesley Wyborn; Benjamin J. K. Evans
In this position paper, we describe a pilot project that provides Research Graph records to external web services using JSON-LD. The Research Graph database contains a large-scale graph that links research datasets (i.e., data used to support research) to funding records (i.e., grants), publications and researcher records such as ORCID profiles. This database was derived from the work of the Research Data Alliance Working Group on Data Description Registry Interoperability (DDRI), and curated using the Research Data Switchboard open source software. By being available in Linked Data format, the Research Graph database is more accessible to third-party web services over the Internet, which opens the opportunity to connect to the rest of the world in semantic form. The primary purpose of this pilot project is to evaluate the feasibility of converting registry objects in Research Graph to JSON-LD using the widely adopted vocabularies published at Schema.org. In this paper, we provide examples of publications, datasets and grants from international research institutions such as CERN INSPIREHEP, the National Computational Infrastructure (NCI) in Australia, and the Australian Research Council (ARC). Furthermore, we show how these Research Graph records are made semantically available as Linked Data through Schema.org. The mapping between the Research Graph schema and Schema.org is available in a GitHub repository. We also discuss the potential need for an extension to the Schema.org vocabulary for scholarly communication.
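The conversion described above can be illustrated with a small Schema.org JSON-LD document for a dataset record. The names, identifier and funder below are invented examples, not actual Research Graph records; only `@context`, `@type` and the property names follow the Schema.org vocabulary.

```python
import json

# A dataset record expressed as Schema.org JSON-LD.
record = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "Example gridded Landsat collection",
    "identifier": "https://example.org/dataset/123",
    "creator": {"@type": "Person", "name": "Benjamin J. K. Evans"},
    "funder": {"@type": "Organization",
               "name": "Australian Research Council"},
}
doc = json.dumps(record, indent=2)
print(doc)
```

Because the payload is plain JSON with a `@context`, ordinary web services can consume it as-is, while Linked Data tooling can interpret the same bytes as RDF triples.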
international symposium on environmental software systems | 2015
Mark Cheeseman; Benjamin J. K. Evans; Dale Roberts; Marshall Ward
A 3-year investigation is underway into the performance of applications used in the Australian Community Climate and Earth System Simulator on the petascale supercomputer Raijin hosted at the National Computational Infrastructure. Several applications have been identified as candidates for this investigation, including the UK Met Office's Unified Model (UM) atmospheric model and Princeton University's Modular Ocean Model (MOM). In this paper we present initial results on the performance and scalability of UM and MOM on Raijin. We also present initial results of a performance study on the data assimilation package (VAR) developed by the UK Met Office and used by the Australian Bureau of Meteorology in its operational weather forecasting suite. Further investigation and optimization are envisioned for each application and will be discussed.
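Scalability results of the kind reported above are typically summarised as strong-scaling speedup and parallel efficiency relative to a baseline core count. The timings below are made up for illustration, not measured UM or MOM numbers; only the arithmetic is the point.

```python
# Hypothetical strong-scaling timings: core count -> wall time (seconds)
# for a fixed problem size.
runs = {16: 100.0, 32: 55.0, 64: 32.0}
base_cores = min(runs)
base_time = runs[base_cores]

for cores, t in sorted(runs.items()):
    speedup = base_time / t                      # vs the 16-core baseline
    efficiency = speedup / (cores / base_cores)  # 1.0 = ideal scaling
    print(cores, round(speedup, 2), round(efficiency, 2))
```

Falling efficiency at higher core counts is what flags an application (or one of its phases, such as I/O or halo exchange) as an optimization candidate.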
Collaboration
Commonwealth Scientific and Industrial Research Organisation