Publication


Featured research published by Norman L. Jones.


Journal of The American Water Resources Association | 2016

A High-Resolution National-Scale Hydrologic Forecast System from a Global Ensemble Land Surface Model†

Alan D. Snow; Scott D. Christensen; Nathan Swain; E. James Nelson; Daniel P. Ames; Norman L. Jones; Deng Ding; Nawajish Sayeed Noman; Cédric H. David; Florian Pappenberger; Ervin Zsoter

Abstract Warning systems with the ability to predict floods several days in advance have the potential to benefit tens of millions of people. However, large-scale streamflow prediction systems such as the Advanced Hydrologic Prediction Service or the Global Flood Awareness System are limited to coarse resolutions. This article presents a method for routing global runoff ensemble forecasts and global historical runoff generated by the European Centre for Medium-Range Weather Forecasts model using the Routing Application for Parallel computatIon of Discharge to produce high spatial resolution 15-day stream forecasts, approximate recurrence intervals, and warning points at locations where streamflow is predicted to exceed the recurrence interval thresholds. The processing method involves distributing the computations using computer clusters to facilitate processing of large watersheds with high-density stream networks. In addition, the Streamflow Prediction Tool web application was developed for visualizing analyzed results at both the regional level and at the reach level of high-density stream networks. The application formed part of the base hydrologic forecasting service available to the National Flood Interoperability Experiment and can potentially transform the nation's forecasting ability by incorporating ensemble predictions at the nearly 2.7 million reaches of the National Hydrography Dataset Plus Version 2 into the national forecasting system.


Journal of Hydrology | 1995

Reducing elevation roundoff errors in digital elevation models

E. James Nelson; Norman L. Jones

Abstract A smoothing algorithm is presented for the removal of roundoff error, which is inherent in almost all digital elevation data. Elevation adjustments are kept within the tolerance of the roundoff error so that the resulting terrain model is not over-smoothed. After smoothing, the digital elevation model is more suitable for use with algorithms that seek to automatically delineate the stream networks and basins of a watershed. The algorithm is specifically intended for use with gridded data, and is particularly effective when used with elevations originating from the 7.5-minute quadrangles provided by the United States Geological Survey.
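The abstract does not give the algorithm itself; the following is a minimal sketch of the core idea under stated assumptions: iteratively relax each interior grid cell toward the average of its neighbors, but clamp every adjustment to the roundoff tolerance (for data rounded to the nearest meter, plus or minus 0.5 m). The function name and iteration scheme are illustrative stand-ins, not the paper's implementation.

```python
def smooth_within_tolerance(grid, tolerance=0.5, passes=10):
    """Smooth a gridded DEM while keeping every adjusted elevation within
    the roundoff tolerance of its original value, so that flat "stair-step"
    artifacts are reduced without over-smoothing.

    grid: list of lists of elevations; tolerance: half the roundoff unit.
    """
    rows, cols = len(grid), len(grid[0])
    original = [row[:] for row in grid]
    current = [row[:] for row in grid]
    for _ in range(passes):
        new = [row[:] for row in current]
        for i in range(1, rows - 1):
            for j in range(1, cols - 1):
                # Average of the four edge neighbours.
                avg = (current[i - 1][j] + current[i + 1][j] +
                       current[i][j - 1] + current[i][j + 1]) / 4.0
                # Clamp the adjustment to the roundoff tolerance.
                lo = original[i][j] - tolerance
                hi = original[i][j] + tolerance
                new[i][j] = min(max(avg, lo), hi)
        current = new
    return current
```

Because the clamp is applied every pass against the *original* elevations, repeated smoothing can never drift a cell outside the stated roundoff band.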


Computers & Geosciences | 2000

Fast algorithm for generating sorted contour strings

Norman L. Jones; Michael J. Kennard; Alan K. Zundel

Abstract Automatic generation of contours for graphical display and map plotting has been studied extensively since the early days of computing. The individual segments making up a contour line are often determined by subdividing the object of interest into small triangles and computing the contours assuming a linear variation on each triangle. However, efficient storage of contour data and the need to place labels (automatically) or to smooth the contours require that the contours be generated in continuous strings of segments. A simple approach to generate such strings is to sort the randomly generated contour segments. Since sorting can be time-consuming, the majority of previous approaches are contour-tracing algorithms that traverse the surface and generate the contour in a continuous sequence of segments. In this paper, we present a new sorting algorithm. The algorithm is relatively easy to implement, can be applied to any type of surface, and works for both 2D and 3D objects. The algorithm is significantly faster than the contour tracing approach, particularly when large numbers of segments are involved.
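As a hedged illustration of the sorting idea (not the paper's algorithm), the sketch below links unordered contour segments into continuous strings by indexing segment endpoints in a hash map, so each string can be grown by direct lookup rather than by tracing the surface. All names are illustrative.

```python
def link_segments(segments):
    """segments: list of ((x1, y1), (x2, y2)) tuples for one contour value.
    Returns a list of polylines (lists of points)."""
    # Map each endpoint to the indices of the segments that touch it.
    touching = {}
    for idx, (a, b) in enumerate(segments):
        touching.setdefault(a, []).append(idx)
        touching.setdefault(b, []).append(idx)

    used = [False] * len(segments)
    strings = []
    for start in range(len(segments)):
        if used[start]:
            continue
        used[start] = True
        a, b = segments[start]
        line = [a, b]
        # Grow the string forward from its last point.
        grown = True
        while grown:
            grown = False
            tail = line[-1]
            for idx in touching.get(tail, []):
                if used[idx]:
                    continue
                p, q = segments[idx]
                line.append(q if p == tail else p)
                used[idx] = True
                grown = True
                break
        # Grow backward from the first point as well (open contours may
        # start mid-chain).
        grown = True
        while grown:
            grown = False
            head = line[0]
            for idx in touching.get(head, []):
                if used[idx]:
                    continue
                p, q = segments[idx]
                line.insert(0, q if p == head else p)
                used[idx] = True
                grown = True
                break
        strings.append(line)
    return strings
```

With hashed endpoint lookup, each segment is visited a constant number of times, which is the kind of behavior that makes sorting competitive with contour tracing for large segment counts.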


Journal of The American Water Resources Association | 2016

From Global to Local: Providing Actionable Flood Forecast Information in a Cloud-Based Computing Environment†

J. Fidel Perez; Nathan Swain; Herman Guillermo Dolder; Scott D. Christensen; Alan D. Snow; E. James Nelson; Norman L. Jones

Global and continental scale flood forecasts provide coarse resolution predictions, but from the perspective of emergency management, flood warnings should be detailed and specific to local conditions. The desired refinement can be achieved by downscaling global scale models and by using distributed hydrologic models to produce a high-resolution flood forecast. Three major challenges associated with transforming global flood forecasting to a local scale are addressed in this work. The first is using open-source software tools to provide access to multiple data sources and to lower the barriers for users in management agencies at the local level. This can be done through the Tethys Platform, which enables web-based water resources modeling applications. The second is finding a practical solution for the computational requirements associated with running complex models and performing multiple simulations. This is done using Tethys Cluster, which manages distributed and cloud computing resources as a companion to the Tethys Platform for web app development. The third challenge is discovering ways to downscale the forecasts from the global extent to the local context. Three modeling strategies have been tested to address this, including downscaling of coarse resolution global runoff models to high-resolution stream networks and routing with the Routing Application for Parallel computatIon of Discharge (RAPID), the use of hierarchical Gridded Surface and Subsurface Hydrologic Analysis (GSSHA) distributed models, and pre-computed distributed GSSHA models.


Critical Transitions in Water and Environmental Resources Management | 2004

A Generic Format for Multi-Dimensional Models

Norman L. Jones; R. D. Jones; C. D. Butler; R. M. Wallace

The Environmental Modeling Research Laboratory (EMRL) at Brigham Young University, in partnership with the U.S. Army Engineer Research and Development Center (ERDC), is currently developing a generic data format for multi-dimensional models. The goal of this exercise is to develop, promote, and deploy a common modeling format that facilitates data storage, exchange, access, analysis, and discovery of scientific and engineering data. The project encompasses one-, two-, and three-dimensional models including river cross-sections, scatter points, unstructured (finite element) grids, and structured grids. The objective of the project is to define a standard file format for all computational models developed at ERDC. The new model format is called XMDF (eXtensible Model Data Format) and consists of a file format and an object code library with an Application Programming Interface (API). The API consists of a series of subroutines in both C/C++ and FORTRAN that can be used to read and write model geometry and data sets to the XMDF format. Model developers within ERDC will be encouraged to adopt the format for all existing and future models. Numerous benefits will be derived from the standardized model format, including highly compact and efficient file I/O. Using a common format makes it possible to easily share data between models, link models, and gain access to powerful visualization tools.


Transportation Research Record | 1996

Three-Dimensional Characterization of Contaminant Plumes

Norman L. Jones; R. Davis

Before remediation of a site with contaminated soil or groundwater, the contaminant plume must first be characterized. This involves sampling the contaminant concentration at a set of locations in and around the contaminated area. To present the measured concentrations in a meaningful form, the concentrations are typically interpolated to the nodes of a three-dimensional grid, and the plume is visualized by constructing iso-surfaces from the gridded data. The critical step in this process is the interpolation stage. Improper application of an interpolation scheme can result in grossly misleading three-dimensional plume maps. There are a number of problems that often occur when interpolating contaminant plume data, including generation of negative concentrations, oscillation of interpolated values, improper estimation of maximum concentrations, and skewing of the results due to data clustering. These and other difficulties associated with plume characterization are discussed, along with a simple set of guidelines for detecting and overcoming these problems.
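One of the pitfalls named above, negative interpolated concentrations, is easy to reproduce: any scheme that can overshoot, such as quadratic interpolation, may dip below zero between non-negative samples. The sketch below demonstrates this with a Lagrange quadratic and applies the simplest possible guard, clamping to zero; it is an illustration of the failure mode, not the paper's recommended guideline set.

```python
def lagrange_quadratic(xs, cs, x):
    """Evaluate the quadratic through three (location, concentration) samples."""
    x0, x1, x2 = xs
    c0, c1, c2 = cs
    return (c0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2)) +
            c1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2)) +
            c2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))

# Three non-negative samples along a transect (hypothetical units).
xs, cs = [0.0, 1.0, 2.0], [0.0, 0.0, 10.0]
raw = lagrange_quadratic(xs, cs, 0.5)   # overshoots below zero between samples
safe = max(raw, 0.0)                    # clamp to a physically possible value
```

The quadratic through (0, 0), (1, 0), (2, 10) evaluates to a negative concentration at x = 0.5 even though every sample is non-negative, which is exactly the kind of artifact that produces misleading iso-surfaces.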


Proceedings of the World Environmental and Water Resources Congress 2010, Providence, Rhode Island, USA, 16-20 May 2010 | 2010

Automated well permitting via GIS geoprocessing tools.

Norman L. Jones; Gil Strassberg; Alan M. Lemon

In this paper we describe a GIS-based system for automated well permitting. The system involves the integration of a calibrated MODFLOW groundwater model into an ArcGIS geodatabase using the Simulation feature dataset in the Arc Hydro Groundwater data model. The model is then used as a baseline for the analysis of candidate wells. Each candidate well is added to the model and the model is run to determine the impact of the well on streamflow, drawdown, etc. The entire process is implemented using a series of connected, low-level geoprocessing tools resulting in a simple automated process. The automation serves to reduce error and increase efficiency. The outputs include tables and GIS maps. The process, evaluation criteria, and products can be customized on an agency-by-agency basis. We illustrate the process using case studies from Virginia and Florida.
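A minimal sketch of the screening loop described above, in which every function name, data shape, and threshold is a hypothetical stand-in (nothing here is Arc Hydro Groundwater or MODFLOW API): each candidate well is run against the calibrated baseline and its computed impacts are compared to example agency criteria.

```python
DRAWDOWN_LIMIT = 1.0     # m; example agency criterion, not from the paper
DEPLETION_LIMIT = 0.05   # fraction of baseline streamflow; example criterion

def screen_candidate(baseline, run_model, well):
    """Add one candidate well, rerun the model, and compare to the baseline.

    baseline: dict with baseline 'head' and 'streamflow' at the points of
    interest; run_model: callable standing in for the geoprocessing chain
    that reruns the groundwater model with the well added.
    """
    result = run_model(well)
    drawdown = baseline["head"] - result["head"]
    depletion = (baseline["streamflow"] - result["streamflow"]) / baseline["streamflow"]
    return {
        "well": well["id"],
        "drawdown": drawdown,
        "depletion": depletion,
        "approved": drawdown <= DRAWDOWN_LIMIT and depletion <= DEPLETION_LIMIT,
    }

# Stub model runner for illustration: impacts scale with pumping rate.
baseline = {"head": 100.0, "streamflow": 2.0}

def run_model(well):
    return {"head": 100.0 - well["rate"] * 0.002,
            "streamflow": 2.0 - well["rate"] * 0.0001}

report = screen_candidate(baseline, run_model, {"id": "W-1", "rate": 400.0})
```

In the system described in the paper, the `run_model` step would be the chain of low-level geoprocessing tools driving MODFLOW, and the returned table would feed the permit maps and reports.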


Journal of The American Water Resources Association | 2017

A Comprehensive Python Toolkit for Accessing High-Throughput Computing to Support Large Hydrologic Modeling Tasks

Scott D. Christensen; Nathan Swain; Norman L. Jones; E. James Nelson; Alan D. Snow; Herman Guillermo Dolder

The National Flood Interoperability Experiment (NFIE) was an undertaking that initiated a transformation in national hydrologic forecasting by providing streamflow forecasts at high spatial resolution over the whole country. This type of large-scale, high-resolution hydrologic modeling requires flexible and scalable tools to handle the resulting computational loads. While high-throughput computing (HTC) and cloud computing provide an ideal resource for large-scale modeling because they are cost-effective and highly scalable, using these tools requires specialized training that is not always common for hydrologists and engineers. In an effort to facilitate the use of HTC resources, the National Science Foundation (NSF)-funded project CI-WATER has developed a set of Python tools that can automate the tasks of provisioning and configuring an HTC environment in the cloud, and creating and submitting jobs to that environment. These tools are packaged into two Python libraries: CondorPy and TethysCluster. Together these libraries provide a comprehensive toolkit for accessing HTC to support hydrologic modeling. Two use cases are described to demonstrate the use of the toolkit, including a web app that was used to support the NFIE national-scale modeling.
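CondorPy's actual API is not shown in the abstract, so the following is a generic sketch of the pattern such a library automates: describing a job in Python and rendering the HTCondor submit description that would be handed to condor_submit. The class and field choices are illustrative, not CondorPy's interface.

```python
class Job:
    """Describe an HTCondor job cluster in Python and render its submit
    description. Illustrative only; not CondorPy's actual API."""

    def __init__(self, name, executable, arguments="", num_jobs=1):
        self.name = name
        self.executable = executable
        self.arguments = arguments
        self.num_jobs = num_jobs

    def submit_description(self):
        # Standard HTCondor submit-description commands; $(Process) expands
        # to the job index within the cluster.
        lines = [
            f"executable = {self.executable}",
            f"arguments = {self.arguments}",
            f"log = {self.name}.log",
            f"output = {self.name}.$(Process).out",
            f"error = {self.name}.$(Process).err",
            f"queue {self.num_jobs}",
        ]
        return "\n".join(lines)

# Hypothetical example: one routing job per region, fanned out as a cluster.
job = Job("route_forecast", "run_routing.sh", "--region $(Process)", num_jobs=52)
desc = job.submit_description()
print(desc)
```

Wrapping the submit description behind a Python object is what lets a web app (as in the NFIE use case) create and submit jobs without the user ever touching HTCondor syntax.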


Ground Water | 2014

Efficient Storage of Large MODFLOW Models

Norman L. Jones; Alan M. Lemon; Michael J. Kennard

We present a methodology for storing the bulkier portions of a set of MODFLOW input and output files in a compressed binary format using the HDF5 library. This approach results in compression ratios of up to 99% with no significant time penalty. The highly compressed format is particularly beneficial when dealing with large regional models or Monte Carlo simulations. The strategy is focused on the list- and array-based portions of the input files, including the cell property and recharge arrays, and is compatible with models containing parameters, including pilot points. The utilities are based on a modified version of the MODFLOW code and are, therefore, compatible with any standard MODFLOW simulation. We present use cases and instructions on how to use the utilities.
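The HDF5-based utilities themselves are not reproduced here, but compression ratios of the quoted magnitude are plausible because MODFLOW property arrays are typically highly repetitive (large zones of constant values). The sketch below illustrates this with zlib, the same deflate codec behind HDF5's gzip filter, on a synthetic two-zone conductivity array; it is an illustration of the storage idea, not the paper's utilities.

```python
import struct
import zlib

# Synthetic zoned hydraulic conductivity field: half the model at
# 10.0 m/d, half at 0.5 m/d (values are hypothetical).
ncells = 100_000
values = [10.0] * (ncells // 2) + [0.5] * (ncells // 2)

raw = struct.pack(f"{ncells}d", *values)   # 8 bytes per cell, uncompressed
packed = zlib.compress(raw, 9)             # deflate at maximum effort
ratio = 1.0 - len(packed) / len(raw)
print(f"compressed {len(raw)} bytes to {len(packed)} ({ratio:.1%} saved)")
```

Real models are less uniform than this toy field, but zoned and interpolated arrays still carry far less entropy than their flat binary size suggests, which is what makes the compressed HDF5 storage pay off.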


World Environmental and Water Resources Congress 2013: Showcasing the Future | 2013

CI-WATER: Cyberinfrastructure to Advance High Performance Water Resource Modeling

Norman L. Jones; Jim Nelson; Gus Williams; Fred L. Ogden; David G. Tarboton; Steve Burian

We present a collaborative research project called CI-WATER which involves a consortium of Utah and Wyoming researchers. The objective of the project is to acquire and develop hardware and software cyberinfrastructure (CI) to support the development and use of large-scale, high-resolution computational water resources models to enable comprehensive examination of integrated system behavior through physically-based, data-driven simulation. The scientific problem that this project addresses is: How are the quality and availability of water resources sensitive to climate variability, watershed alterations, and management activities? The CI challenge that we are addressing is: How can we best structure data and computer models to address this scientific problem through the use of high-performance and data-intensive computing by discipline scientists coming to this problem without extensive computational and algorithmic knowledge and experience? The project thus aims to broaden the application of CI and HPC techniques into the domain of integrated water resources modeling.

Collaboration


Dive into Norman L. Jones's collaboration.

Top Co-Authors

Nathan Swain (Brigham Young University)
Alan M. Lemon (Brigham Young University)
Alan K. Zundel (Brigham Young University)
Jim Nelson (Brigham Young University)
Alan D. Snow (Brigham Young University)
Daniel P. Ames (Brigham Young University)
Gil Strassberg (University of Texas at Austin)