Quillon Harpham
HR Wallingford
Publications
Featured research published by Quillon Harpham.
international conference on e-science | 2014
Emanuele Danovaro; Luca Roverelli; Gabriele Zereik; Antonella Galizia; Daniele D'Agostino; Giacomo Paschina; Alfonso Quarati; Andrea Clematis; Fabio Delogu; Elisabetta Fiori; Antonio Parodi; Christian Straube; Nils gentschen Felde; Quillon Harpham; Bert Jagers; Luis Garrote; Ljiljana Dekic; M. Ivković; Olivier Caumont; Evelyne Richard
Predicting weather and climate and their impacts on the environment, including hazards such as floods and landslides, is a major challenge that can be efficiently supported by a distributed and heterogeneous infrastructure exploiting several kinds of computational resources: HPC, Grids and Clouds. Such an infrastructure can help researchers to speed up experiments, improve resolution and accuracy, and simulate with different numerical models and model chains. These numerical models are complex, with heavy computational requirements, huge numbers of parameters to tune, and not fully standardized interfaces. Hence, each research group usually focuses on a limited set of tools and on hard-wired solutions to enable their interaction. The DRIHM approach is based on strong standardization, well-defined interfaces, and an easy-to-use web interface for model configuration and experiment definition. A researcher can easily compare outputs from different hydrologic models forced by the same meteorological model, or compare different meteorological models to validate or improve her research. This paper presents the benefits of a web-based interface for hydro-meteorology research through a detailed analysis of the portal (based on Liferay and gUSE) developed by the DRIHM project.
international conference on system of systems engineering | 2012
Andrea Clematis; Daniele D'Agostino; Emanuele Danovaro; Antonella Galizia; Alfonso Quarati; Antonio Parodi; Nicola Rebora; Tatiana Bedrina; Dieter Kranzlmueller; Michael Schiffers; Bert Jagers; Quillon Harpham; Pierre-Henri Cros
One of the main challenges of the 21st century is accurate weather prediction, together with estimating extreme phenomena and their impacts on the environment and on society. The key point of this challenge is to accelerate advances in hydrometeorological research and to integrate these advances into everyday forecasts, thus improving the protection of civilians and of the environment. The DRIHMS (Distributed Research Infrastructure for Hydro-Meteorology Study) project suggests that a step forward in this direction lies in the ability to easily access hydrometeorological data, to share predictive models, and to facilitate collaboration among the different experts in this area. To this end, the support of an e-infrastructure is necessary, one able to deal with the massive amount of information involved and to provide an adequate level of system interoperability. These are the goals of the DRIHM (Distributed Research Infrastructure for Hydro-Meteorology) project presented hereafter.
Bulletin of the American Meteorological Society | 2017
Antonio Parodi; Dieter Kranzlmüller; Andrea Clematis; Emanuele Danovaro; Antonella Galizia; Luis Garrote; M. C. Llasat; Olivier Caumont; Evelyne Richard; Quillon Harpham; Franco Siccardi; Luca Ferraris; Nicola Rebora; Fabio Delogu; Elisabetta Fiori; Luca Molini; Efi Foufoula-Georgiou; Daniele D’Agostino
From 1970 to 2012, about 9,000 high-impact weather events were reported globally, causing the loss of 1.94 million lives and damage of 2.4 trillion (U.S. dollars). The scientific community is called to action to improve the predictive ability of such events and to communicate forecasts and associated risks both to affected populations and to those making decisions. At the heart of this challenge lies the ability to have easy access to hydrometeorological data and models and to facilitate the necessary collaboration between meteorologists, hydrologists, and computer science experts to achieve accelerated scientific advances. Two European Union (EU)-funded projects, Distributed Research Infrastructure for Hydro-Meteorology (DRIHM) and DRIHM to United States of America (DRIHM2US), sought to help address this challenge by developing a prototype e-science environment providing advanced end-to-end services (models, datasets, and postprocessing tools), with the aim of paving the way to a step change in how...
Environmental Modelling and Software | 2015
Quillon Harpham
Structured environments for executing environmental numerical models are becoming increasingly common, typically including functions for discovering, running and integrating models. As these environments proliferate and mature, a set of topics is emerging as common ground between them. This paper abstracts common characteristics from leading integrated modelling technologies and derives a generic framework, characterised as a Model MAP: Metadata (including documentation and licence), Adaptors (to common standards) and Portability (of model components). The idea is to form a gateway concept consisting of a checklist of elements which must be in place before a numerical model is offered for interoperability in a structured environment, at a level of abstraction suitable to support environmental model interoperability in general. Following comparison to the Component-Based Water Resource Model Ontology, the Model MAP is applied to DRIHM, a hydro-meteorological research infrastructure, as the initial use case, and more generic aspects are also discussed.
- A gateway set of requirements for legacy environmental model interoperability.
- A route towards legacy environmental model standardisation.
- Comparison to the Component-Based Water Resource Model Ontology.
- Application of the Model MAP for a hydro-meteorological model eInfrastructure.
- Applying the Model MAP to WRF-ARW and an OpenMI composition.
Journal of The American Water Resources Association | 2016
Quillon Harpham; Julien Lhomme; Antonio Parodi; Elisabetta Fiori; Bert Jagers; Antonella Galizia
Extreme hydrometeorological events such as flash floods have caused considerable loss of life and damage to infrastructure over recent years. Flood events in the Mediterranean region between 1990 and 2006 caused over 4,500 fatalities and cost over €29 billion in damage, with Italy one of the worst affected countries. The Distributed Research Infrastructure for Hydro-Meteorology (DRIHM) project is a European initiative aiming to provide an open, fully integrated eScience environment for predicting, managing, and mitigating the risks related to such extreme weather phenomena. Incorporating both modeled and observational data sources, it enables seamless access to a set of computing resources with the objective of providing a collection of services for performing experiments with numerical models in meteorology, hydrology, and hydraulics. The purpose of this article is to demonstrate how this flexible modeling architecture has been constructed using a set of standards including the NetCDF and WaterML2 file formats, in-memory coupling with OpenMI, controlled vocabularies such as CF Standard Names, ISO19139 metadata, and a Model MAP (Metadata, Adaptors, Portability) gateway concept for preparing numerical models for standardized use. Hydraulic results, including the impact to buildings and hazards to people, are given for the use cases of the severe and fatal flash floods which occurred in Genoa, Italy in November 2011 and October 2014.
Earth Science Informatics | 2017
Quillon Harpham; Olalla Gimeno; Antonio Parodi; Daniele D’Agostino
The purpose of this paper is to report on and analyse an international consultation into hydro-meteorological e-Science environments, with the objective of identifying key functions and features and of exploring show-stopping issues and organisational structure. Transatlantic experiences were compared and contrasted. Including strong participation from both Europe and the USA, with high quality responses from experienced practitioners, the consultation was undertaken as part of a joint initiative and took the form of an online questionnaire supported by a set of stakeholder interviews and other discussions. Topics included functions and features such as provision of numerical models and data, usability, and ease of access; show-stopping issues such as flexibility, reliability and longevity; centralised and distributed structures; and funding models. The results demonstrated a broadly similar set of experiences and implied a future as an evolution of that which exists today. The consultation exercise ran alongside the development of the DRIHM e-Infrastructure, which had, in itself, already benefitted from the prior DRIHMS consultation. Results were fed into the development process at appropriate intervals, allowing the consultation to shape the resultant services.
mathematical methods for curves and surfaces | 2016
Vibeke Skytt; Quillon Harpham; Tor Dokken; Heidi E. I. Dahl
A set of bathymetry point clouds acquired by different measurement techniques at different times, having different accuracy and varying patterns of points, are approximated by an LR B-spline surface. The aim is to represent the sea bottom with good accuracy and at the same time reduce the data size considerably. In this process the point clouds must be cleaned by selecting the “best” points for surface generation. This cleaning process is called deconfliction, and we use a rough approximation of the combined point clouds as a reference surface to select a consistent set of points. The reference surface is updated using only the selected points to create an accurate approximation. The LR B-spline is the selected surface format due to its suitability for adaptive refinement and approximation, and its ability to represent local detail without a global increase in the data size of the surface.
Environmental Modelling and Software | 2016
Quillon Harpham; Nigel Tozer; Paul Cleverley; David Wyncoll; Doug Cresswell
New innovations are emerging which offer opportunities to improve forecasts of wave conditions. These include probabilistic modelling results, such as those based on an ensemble of multiple predictions which can provide a measure of the uncertainty, and new sources of observational data such as GNSS reflectometry and FerryBoxes, which can be combined with an increased availability of more traditional static sensors. This paper outlines an application of a Bayesian statistical methodology which combines these innovations. The method modifies the probabilities of ensemble wave forecasts based on recent past performance of individual members against a set of observations from various data source types. Each data source is harvested and mapped against a set of spatio-temporal feature types and then used to post-process ensemble model output. A prototype user interface is given with a set of experimental results testing the methodology for a use case covering the English Channel.
- Novel data sources such as GNSS reflectometry and FerryBoxes are incorporated.
- Data are characterised against a set of spatio-temporal feature types.
- Model output ensemble member weights are updated using a Bayesian data incorporation method.
- Results are portrayed using an example web interface.
- The method is evaluated using a pilot application in the English Channel.
international conference on e-science | 2016
Quillon Harpham
New data acquisition techniques are emerging and are providing a fast and efficient means for multidimensional spatial data collection. Single and multi-beam echo-sounders, airborne LIDAR, SAR satellites and mobile mapping systems are increasingly used for the digital reconstruction of the environment. All these systems provide point clouds, often enriched with other sensor data, resulting in extremely high volumes of raw data. With these acquisition approaches, a great deal of data is collected, but it often requires harmonisation and integration before reaching its maximum use potential. Use cases include supporting numerical modelling on land, such as simulations of flooding and drought, and modelling waves and flow in seas and oceans. The IQmulus high volume fusion and analysis platform offers an architecture for processing such geospatial point clouds, through a set of pre-defined workflows, on a cloud infrastructure. Workflow elements include deconfliction of spatially overlapping data, spline interpolation to create high precision surfaces, and the latest visualisation techniques for these datasets. Featured in the presentation is a workflow designed to process collections of surveys of water depth. Individual surveys vary both spatially and temporally and can overlap with many other similar surveys. Where measurements of water depth differ greatly between surveys, a strategy needs to be employed to determine how to create an optimal bathymetric surface using all of the relevant, available data. As part of its SeaZone suite of data products, HR Wallingford employs the latest deconfliction techniques to produce such a ‘best’ surface. The workflow begins with a methodology for prioritising individual surveys, followed by spline interpolation of adjacent or overlapping datasets with a potentially parallel implementation which includes tiling and stitching to create the final completed surface. An example of how these datasets can support the immersive visualisation of civil engineering applications is shown through HR Wallingford's advanced Ship Simulation Centre.
Journal of Hydroinformatics | 2014
Quillon Harpham; Paul Cleverley; David M. Kelly
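The Model MAP gateway concept described in the abstracts above is essentially a checklist — Metadata (documentation and licence), Adaptors (to common standards) and Portability (of model components) — that a numerical model must satisfy before being offered for interoperability. A minimal sketch of that checklist as code; all field names are illustrative assumptions, not taken from the papers:

```python
from dataclasses import dataclass, field

@dataclass
class ModelMAP:
    """Gateway checklist for offering a numerical model for interoperability.

    Field names are hypothetical, chosen only to illustrate the M-A-P split.
    """
    # Metadata: documentation and licence must both be present
    documentation_url: str = ""
    licence: str = ""
    # Adaptors: common standards the model's inputs/outputs map to
    adaptors: list = field(default_factory=list)   # e.g. ["NetCDF", "WaterML2"]
    # Portability: packaged components that can run outside the home institute
    portable_components: list = field(default_factory=list)

    def missing(self):
        """Return the checklist elements that are not yet in place."""
        gaps = []
        if not self.documentation_url:
            gaps.append("metadata: documentation")
        if not self.licence:
            gaps.append("metadata: licence")
        if not self.adaptors:
            gaps.append("adaptors: at least one common standard")
        if not self.portable_components:
            gaps.append("portability: at least one packaged component")
        return gaps

    def ready_for_gateway(self):
        # The gateway admits a model only when every element is in place.
        return not self.missing()
```

A model with documentation and a licence but no adaptors would still fail the gateway, which is the point of the checklist: every element must be in place, not most of them.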
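The standards-based coupling described above relies on controlled vocabularies such as CF Standard Names so that models agree on what each exchanged variable means. A sketch of a pre-coupling vocabulary check, using a tiny hand-picked excerpt of the CF table (the real table contains thousands of entries; the variable mapping is invented for illustration):

```python
# Tiny excerpt of the CF Standard Names controlled vocabulary.
CF_STANDARD_NAMES = {
    "air_temperature",
    "precipitation_flux",
    "rainfall_rate",
    "water_surface_height_above_reference_datum",
}

def validate_exchange(variables):
    """Check that every variable offered for model coupling declares a
    recognised CF standard name; return the offending local names.

    variables: dict mapping a model's local variable name -> claimed
    CF standard name.
    """
    return [local for local, std in variables.items()
            if std not in CF_STANDARD_NAMES]

# Hypothetical output mapping from a hydrological model:
exchange = {
    "rain": "rainfall_rate",
    "h": "water_surface_height_above_reference_datum",
    "temp": "temperature",   # not a CF standard name ("air_temperature" is)
}
```

Running the check on `exchange` flags only `temp`, whose claimed name is not in the vocabulary; a coupling framework would reject or remap it before connecting the models.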
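The bathymetry abstracts describe deconfliction: selecting a consistent "best" subset of overlapping survey points by comparing each point against a rough reference surface and by prioritising surveys. A simplified sketch of that idea, under assumed conventions (reference surface as a callable, lower priority number = more trusted survey, a fixed tolerance), not the papers' actual algorithm:

```python
def deconflict(points, reference, tolerance=0.5):
    """Select a consistent subset of bathymetry survey points.

    points:    iterable of (x, y, depth, priority) tuples, where priority 0
               is the most trusted survey (assumed ordering)
    reference: callable (x, y) -> depth, a rough approximation of the
               combined point clouds
    Points whose depth disagrees with the reference by more than `tolerance`
    are discarded; where surveys overlap at the same location, the
    higher-priority survey wins.
    """
    best = {}  # (x, y) -> (priority, depth)
    for x, y, depth, priority in points:
        if abs(depth - reference(x, y)) > tolerance:
            continue  # inconsistent with the reference surface: drop it
        key = (x, y)
        if key not in best or priority < best[key][0]:
            best[key] = (priority, depth)
    return [(x, y, d) for (x, y), (p, d) in sorted(best.items())]
```

With a flat reference surface at 10 m, a stray sounding of 14.2 m is rejected as conflicting, and where two surveys overlap the more trusted one supplies the retained depth. In the full method the reference surface would then be refitted from the selected points and the selection repeated.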
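The wave-forecasting abstract above modifies the probabilities of ensemble members according to their recent performance against observations. A minimal Bayesian re-weighting sketch in that spirit, assuming a Gaussian likelihood of the observation given each member's forecast; `sigma` is an assumed tuning parameter, and the details of the published method may differ:

```python
import math

def update_weights(prior_weights, forecasts, observation, sigma=0.3):
    """Bayesian re-weighting of ensemble members.

    posterior weight ∝ prior weight × likelihood, where the likelihood of
    the observation given a member's forecast is Gaussian with standard
    deviation `sigma` (an assumed observation-error parameter).
    """
    likelihoods = [math.exp(-0.5 * ((observation - f) / sigma) ** 2)
                   for f in forecasts]
    posterior = [w * l for w, l in zip(prior_weights, likelihoods)]
    total = sum(posterior)
    if total == 0:  # every member far from the observation: keep the prior
        return list(prior_weights)
    return [p / total for p in posterior]

# Three members forecast significant wave height (m); a buoy observes 1.9 m.
weights = update_weights([1/3, 1/3, 1/3], [1.2, 2.0, 3.1], 1.9)
```

The member forecasting 2.0 m ends up with the largest weight and the 3.1 m outlier is heavily discounted; the updated weights can then be carried forward as the prior for the next observation, so the ensemble's probabilities track recent member performance.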