Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Philip James is active.

Publications


Featured research published by Philip James.


Journal of Flood Risk Management | 2017

Assessing the utility of social media as a data source for flood risk management using a real‐time modelling framework

Luke S. Smith; Qiuhua Liang; Philip James; Wen Lin

The utility of social media for both collecting and disseminating information during natural disasters is increasingly recognised. The rapid nature of urban flooding from intense rainfall means accurate surveying of peak depths and flood extents is rarely achievable, hindering the validation of urban flood models. This paper presents a real-time modelling framework to identify areas likely to have flooded using data obtained only through social media. Graphics processing unit (GPU) accelerated hydrodynamic modelling is used to simulate flooding in a 48 km² area of Newcastle upon Tyne, with results automatically compared against flooding identified through social media, allowing inundation to be inferred elsewhere in the city with increased detail and accuracy. Data from Twitter during two 2012 flood events are used to test the framework, with the inundation results indicative of good agreement against crowd-sourced and anecdotal data, even though the sample of successfully geocoded Tweets was relatively small.
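The model-versus-reports comparison described in this abstract can be sketched as below. This is a minimal illustration, not the paper's implementation: the function name, the regular-grid layout, and the 0.1 m "wet" threshold are all assumptions for the sake of the example.

```python
import numpy as np

def reports_agree(depth_grid, cell_size, origin, reports, wet_threshold=0.1):
    """Fraction of geocoded flood reports that fall in cells the model predicts wet.

    depth_grid: 2-D array of simulated peak water depths (metres)
    cell_size: grid resolution (metres)
    origin: (x0, y0) coordinate of the grid's lower-left corner
    reports: list of (x, y) coordinates of geocoded social-media reports
    """
    hits = 0
    for x, y in reports:
        # Map the report coordinate onto a grid cell.
        col = int((x - origin[0]) // cell_size)
        row = int((y - origin[1]) // cell_size)
        # Count it as agreement if the cell exists and is simulated as flooded.
        if 0 <= row < depth_grid.shape[0] and 0 <= col < depth_grid.shape[1]:
            if depth_grid[row, col] >= wet_threshold:
                hits += 1
    return hits / len(reports)
```

A high agreement fraction would then justify trusting the simulated inundation in parts of the city where no reports were posted.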


ISPRS International Journal of Geo-Information | 2015

Transport Accessibility Analysis Using GIS: Assessing Sustainable Transport in London

Alistair Ford; Stuart Barr; Richard Dawson; Philip James

Transport accessibility is an important driver of urban growth and key to the sustainable development of cities. This paper presents a simple GIS-based tool developed to allow the rapid analysis of accessibility by different transport modes. Designed to be flexible and use publicly-available data, this tool (built in ArcGIS) uses generalized cost to measure transport costs across networks including monetary and distance components. The utility of the tool is demonstrated on London, UK, showing the differing patterns of accessibility across the city by different modes. It is shown that these patterns can be examined spatially, by accessibility to particular destinations (e.g., employment locations), or as a global measure across a whole city system. A number of future infrastructure scenarios are tested, examining the potential for increasing the use of low-carbon forms of transport. It is shown that private car journeys are still the least cost mode choice in London, but that infrastructure investments can play a part in reducing the cost of more sustainable transport options.


IEEE Transactions on Automation Science and Engineering | 2010

Orchestration of Grid-Enabled Geospatial Web Services in Geoscientific Workflows

Gobe Hobona; David Fairbairn; Hugo Hiden; Philip James

The need for computational resources capable of processing geospatial data has accelerated the uptake of geospatial web services. Several academic and commercial organizations now offer geospatial web services for data provision, coordinate transformation, geocoding and several other tasks. These web services adopt specifications developed by the Open Geospatial Consortium (OGC) - the leading standardization body for Geographic Information Systems. In parallel with efforts of the OGC, the Grid computing community has published specifications for developing Grid applications. The Open Grid Forum (OGF) is the main body that promotes interoperability between Grid computing systems. This study examines the integration of Grid services and geospatial web services into workflows for Geoscientific processing. An architecture is proposed that bridges web services based on the abstract geospatial architecture (ISO19119) and the Open Grid Services Architecture (OGSA). The paper presents a workflow management system, called SAW-GEO, that supports orchestration of Grid-enabled geospatial web services. An implementation of SAW-GEO is presented, based on both the Simple Conceptual Unified Flow Language (SCUFL) and the Business Process Execution Language for Web Services (WS-BPEL or BPEL for short).


Advances in Geographic Information Systems | 2007

Semantically-assisted geospatial workflow design

Gobe Hobona; David Fairbairn; Philip James

The value of service oriented architectures has been demonstrated in several studies. A key aspect of the advantage of web services is their orchestration into complex business workflows. The Organization for the Advancement of Structured Information Systems (OASIS) has recently approved an industry-wide standard for workflow specification, the Business Process Execution Language (BPEL). The Open Geospatial Consortium (OGC), a member of OASIS, has adopted BPEL for its series of interoperability experiments. This paper presents a study concerned with the use of ontology in assisting geospatial web service orchestration. A methodology for calculating the degree of suitability of various candidate workflows is proposed. The implementation of a prototype plug-in for Eclipse-based BPEL editors is discussed. The proposed system presents candidate workflows based on semantic descriptions of feature, coverage and processing services. An evaluation of the system, based on a workflow involving a variety of geospatial web services is also presented.


Environmental Modelling and Software | 2010

Software, Data and Modelling News: Graphical user interface for rapid set-up of SHETRAN physically-based river catchment model

Stephen Birkinshaw; Philip James; John Ewen

The SHETRAN physically-based distributed rainfall-runoff modelling system gives detailed simulations in time and space of water flow and sediment and solute transport in river catchments. It is therefore a powerful tool for studying hydrological and environmental impacts associated with land-use and climate change. A Graphical User Interface (GUI) has been developed that allows a catchment data set to be set up quickly using a minimum of information. The GUI has an algorithm for the automatic generation of river channel networks from a DEM and has access to libraries of soil and vegetation parameters.


Geographic Information Science | 2006

Multidimensional visualisation of degrees of relevance of geographic data

Gobe Hobona; Philip James; David Fairbairn

The ever‐increasing number of spatial data sets accessible through spatial data clearinghouses continues to make geographic information retrieval and spatial data discovery major challenges. Such challenges have been addressed in the discipline of Information Retrieval through ranking of data according to inferred degrees of relevance. Spatial data, however, present an additional challenge as they are characteristically made up of geometry, attribute and, optionally, temporal components. As these components are mutually independent of one another, this paper suggests that they be ranked independently of one another. The representation of the results of the independent ranking of these three components of spatial data suggests that representation of the results of the ranking process requires an alternative approach to currently used textual ranked lists: visualisation of relevance in a three‐dimensional visualisation environment. To illustrate the possible application of such an approach, a prototype browser is presented.
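The independent-ranking idea above can be sketched in a few lines. The function names and scoring inputs are assumed for illustration: each of the three components (geometry, attribute, temporal) is ranked on its own, and the three ranks become a dataset's coordinates in the 3-D visualisation.

```python
def component_ranks(scores):
    """scores: dataset name -> relevance value for ONE component.
    Returns dataset name -> rank position (0 = most relevant)."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {name: i for i, name in enumerate(ordered)}

def relevance_positions(geom, attr, temp):
    """Rank the three components independently and combine the ranks
    into a 3-D plot coordinate per dataset, one axis per component."""
    gr, ar, tr = component_ranks(geom), component_ranks(attr), component_ranks(temp)
    return {name: (gr[name], ar[name], tr[name]) for name in geom}
```

A dataset near the origin would then be highly relevant on all three axes, which a ranked textual list cannot convey.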


Computers & Geosciences | 2005

A model for spatio-temporal network planning

Edward John Nash; Philip James; David Parker

Temporal GIS research has tended to focus on representing a single history through a series of states. For planning future work involving alternative scenarios a branching model of time may be required, however for large systems such models soon become highly complex. In this paper we introduce the temporal topology model which allows sections of work and the spatial, temporal and logical relationships between them to be represented efficiently together with the associated costs. We then discuss how this model could be used for analysis to determine an optimal plan, illustrated with a case study involving cycle network planning, and briefly describe some practical results which have been obtained.


International Journal of Digital Earth | 2012

The challenges of developing an open source, standards-based technology stack to deliver the latest UK climate projections

A. Stephens; Philip James; David Alderson; Stephen Pascoe; Simon Abele; Alan Iwi; Peter Chiu

To improve the understanding of local and regional effects of climate change, the UK government supported the development of new climate projections. The Met Office Hadley Centre produced a sophisticated set of probabilistic projections for future climate. This paper discusses the design and implementation of an interactive website to deliver those projections to a broad user community. The interface presents complex data sets, generates on-the-fly products and schedules jobs to an offline weather generator capable of outputting gigabytes of data in response to a single request. A robust and scalable physical architecture was delivered through significant use of open source technologies and open standards.


IEEE Cloud Computing | 2017

Orchestrating BigData Analysis Workflows

Rajiv Ranjan; Saurabh Kumar Garg; Ali Reza Khoskbar; Ellis Solaiman; Philip James; Dimitrios Georgakopoulos

Data analytics has become not only an essential part of day-to-day decision making, but also reinforces long-term strategic decisions. Whether it is real-time fraud detection, resource management, tracking and prevention of disease outbreak, natural disaster management or intelligent traffic management, the extraction and exploitation of insightful information from unparalleled quantities of data (BigData) is now a fundamental part of all decision making processes. Success in making smart decisions by analyzing BigData is possible due to the availability of improved analytical capabilities, increased access to different data sources, and cheaper and improved computing power in the form of cloud computing. However, BigData analysis is far more complicated than the perception created by the recent publicity. For example, one of the myths is that BigData analysis is driven purely by the innovation of new data mining and machine learning algorithms. While innovation of new data mining and machine learning algorithms is critical, this is only one aspect of producing BigData analysis solutions. Just like many other software solutions, BigData analysis solutions are not monolithic pieces of software that are developed specifically for every application. Instead, they often combine and reuse existing trusted software components that perform necessary data analysis steps. Furthermore, in order to deal with the large variety, volume and velocity of BigData, they need to take advantage of the elasticity of cloud and edge datacenter computation and storage resources as needed to meet the requirements of their owners.


International Journal of Geographical Information Science | 2018

Volunteered geographic information quality assessment using trust and reputation modelling in land administration systems in developing countries

Kealeboga K. Moreri; David Fairbairn; Philip James

This article presents an innovative approach to establish the quality and credibility of Volunteered Geographic Information (VGI) such that it can be considered in Land Administration Systems (LAS) on a Fit for Purpose (FFP) basis. A participatory land information system can provide affordable and timely FFP information about land and its resources. However, the establishment of such a system involves more than just technical solutions and administrative procedures: many social, economic and political aspects must be considered. Innovative approaches like VGI can help address the lack of accurate, reliable and FFP land information for LAS, but integration of such sources relies on the quality and credibility of VGI. Verifying volunteer efforts can be difficult without reference to ground truth: a novel Trust and Reputation Modelling methodology is proposed as a suitable technique to effect such VGI data set validation. This method has been applied to successfully demonstrate that VGI can produce accurate and reliable data sets which can be used to conduct regular systematic updates of geographic information in official systems. It relies on a view that the public can police themselves in establishing proxy measures of VGI quality thus facilitating VGI to be used on a FFP basis in LAS.
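The trust-and-reputation idea above can be sketched as follows. This is a generic reputation-update scheme, not the article's actual methodology; the function names, the learning rate, and the agreement measure are all assumptions made for illustration:

```python
def update_reputation(reputation, agreements, learning_rate=0.2):
    """Move each volunteer's reputation (0..1) towards their observed
    agreement rate with other contributors mapping the same parcels."""
    return {
        v: (1 - learning_rate) * reputation[v] + learning_rate * agreements[v]
        for v in reputation
    }

def contribution_trust(contributors, reputation):
    """Proxy trust in a VGI feature without ground truth:
    the mean reputation of the volunteers who attested it."""
    return sum(reputation[c] for c in contributors) / len(contributors)
```

Iterating the update as new contributions arrive lets the community "police itself": volunteers who consistently agree with their peers gain reputation, and features backed by reputable volunteers earn enough trust to be used on an FFP basis.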

Collaboration


Dive into Philip James's collaborations.

Top Co-Authors


Gobe Hobona

University of Nottingham


David Parker

University of Newcastle


Adam Etches

University of Newcastle


Dimitrios Georgakopoulos

Swinburne University of Technology


Sean Ince

University of Newcastle


A. Stephens

Rutherford Appleton Laboratory


C. Harpham

University of East Anglia


Jeremy Morley

University of Nottingham
