William L. George
National Institute of Standards and Technology
Publications
Featured research published by William L. George.
Journal of Research of the National Institute of Standards and Technology | 2002
Dale P. Bentz; Symoane Mizell; Steven G. Satterfield; Judith Ellen Devaney; William L. George; Peter M. Ketcham; James Graham; James K. Porterfield; Daniel Quenard; F. Vallee; Hébert Sallée; Elodie Boller; J. Baruchel
With advances in x-ray microtomography, it is now possible to obtain three-dimensional representations of a material’s microstructure with a voxel size of less than one micrometer. The Visible Cement Data Set represents a collection of 3-D data sets obtained using the European Synchrotron Radiation Facility in Grenoble, France, in September 2000. Most of the images obtained are for hydrating portland cement pastes, with a few data sets representing hydrating Plaster of Paris and a common building brick. All of these data sets are being made available on the Visible Cement Data Set website at http://visiblecement.nist.gov. The website includes the raw 3-D data files, a description of the material imaged for each data set, example two-dimensional images and visualizations for each data set, and a collection of C language computer programs that will be of use in processing and analyzing the 3-D microstructural images. This paper provides the details of the experiments performed at the ESRF, the analysis procedures utilized in obtaining the data set files, and a few representative example images for each of the three materials investigated.
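The raw files are dense voxel arrays (one value per voxel), so a few lines of C suffice to load and query one. The sketch below is not one of the programs distributed on the website; the assumption of 8-bit values, the file name, the cube dimensions, and the threshold are all chosen for illustration.

    /* Minimal sketch: read a raw 8-bit voxel file and count voxels above
     * a threshold.  File name, dimensions, and threshold are hypothetical. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        const size_t nx = 512, ny = 512, nz = 512;   /* assumed dimensions */
        const size_t n = nx * ny * nz;
        unsigned char *vox = malloc(n);
        FILE *fp = fopen("cement_paste.raw", "rb");  /* hypothetical file name */
        if (vox == NULL || fp == NULL || fread(vox, 1, n, fp) != n) {
            fprintf(stderr, "could not read voxel data\n");
            return 1;
        }
        fclose(fp);

        size_t count = 0;
        for (size_t i = 0; i < n; i++)
            if (vox[i] > 128)                        /* assumed threshold */
                count++;
        printf("voxels above threshold: %zu of %zu\n", count, n);
        free(vox);
        return 0;
    }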
Modelling and Simulation in Materials Science and Engineering | 2010
Jeffrey W. Bullard; Edith Enjolras; William L. George; Steven G. Satterfield; Judith E. Terrill
A recently described stochastic reaction-transport model on three-dimensional lattices is parallelized and is used to simulate the time-dependent structural and chemical evolution in multicomponent reactive systems. The model, called HydratiCA, uses probabilistic rules to simulate the kinetics of diffusion, homogeneous reactions and heterogeneous phenomena such as solid nucleation, growth and dissolution in complex three-dimensional systems. The algorithms require information only from each lattice site and its immediate neighbors, and this localization enables the parallelized model to exhibit near-linear scaling up to several hundred processors. Although applicable to a wide range of material systems, including sedimentary rock beds, reacting colloids and biochemical systems, validation is performed here on two minerals that are commonly found in Portland cement paste, calcium hydroxide and ettringite, by comparing their simulated dissolution or precipitation rates far from equilibrium to standard rate equations, and also by comparing simulated equilibrium states to thermodynamic calculations, as a function of temperature and pH. Finally, we demonstrate how HydratiCA can be used to investigate microstructure characteristics, such as spatial correlations between different condensed phases, in more complex microstructures.
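A minimal sketch of the kind of local, probabilistic update described above follows. It is not the HydratiCA code: the lattice size, hop probability, and data layout are assumptions, and only a single diffusing species on the interior of the lattice is treated. The point it illustrates is that each site update touches only its six face neighbors, which is what allows the parallel domain decomposition to scale nearly linearly.

    /* Sketch of one stochastic diffusion sweep on a 3-D lattice using only
     * nearest-neighbor information.  Not the HydratiCA implementation.
     * Assumes 6 * p_hop <= 1 so the hop probability is well defined.       */
    #include <stdlib.h>

    #define NX 64
    #define NY 64
    #define NZ 64
    #define IDX(i,j,k) (((i) * NY + (j)) * NZ + (k))

    /* conc[] holds an integer number of mobile "cells" at each lattice site. */
    void diffusion_sweep(const int *conc, int *next, double D, double dt, double dx)
    {
        double p_hop = D * dt / (dx * dx);       /* hop probability per neighbor */
        for (long s = 0; s < (long)NX * NY * NZ; s++)
            next[s] = conc[s];

        for (int i = 1; i < NX - 1; i++)
            for (int j = 1; j < NY - 1; j++)
                for (int k = 1; k < NZ - 1; k++) {
                    long here = IDX(i, j, k);
                    /* Each cell at this site attempts a hop with probability
                     * 6 * p_hop, to a face neighbor chosen uniformly at random
                     * (equivalent to p_hop per direction).                     */
                    for (int c = 0; c < conc[here]; c++) {
                        if ((double)rand() / RAND_MAX < 6.0 * p_hop) {
                            int dir = rand() % 6;
                            int ni = i + (dir == 0) - (dir == 1);
                            int nj = j + (dir == 2) - (dir == 3);
                            int nk = k + (dir == 4) - (dir == 5);
                            next[here]--;
                            next[IDX(ni, nj, nk)]++;
                        }
                    }
                }
    }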
Journal of Research of the National Institute of Standards and Technology | 2000
William L. George; John G. Hagedorn; Judith Ellen Devaney
The Message Passing Interface (MPI) is the de facto standard for writing parallel scientific applications in the message passing programming paradigm. Implementations of MPI were not designed to interoperate, thereby limiting the environments in which parallel jobs could be run. We briefly describe a set of protocols, designed by a steering committee of current implementors of MPI, that enable two or more implementations of MPI to interoperate within a single application. Specifically, we introduce the set of protocols collectively called Interoperable MPI (IMPI). These protocols make use of novel techniques to handle difficult requirements such as maintaining interoperability among all IMPI implementations while also allowing for the independent evolution of the collective communication algorithms used in IMPI. Our contribution to this effort has been as a facilitator for meetings, as editor of the IMPI Specification document, and as an early testbed for implementations of IMPI. This testbed is in the form of an IMPI conformance tester, a system that can verify the correct operation of an IMPI-enabled version of MPI.
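Because IMPI operates below the MPI application programming interface, the application source does not change; only the way the job is launched differs. The following is an ordinary MPI point-to-point example in C (not taken from the IMPI Specification or the conformance tester) whose two ranks could, under IMPI, be hosted by two different MPI implementations.

    /* Minimal MPI example: rank 0 sends one integer to rank 1. */
    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size, value;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        if (rank == 0) {
            value = 42;
            MPI_Send(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            printf("rank 1 of %d received %d\n", size, value);
        }

        MPI_Finalize();
        return 0;
    }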
Journal of Research of the National Institute of Standards and Technology | 2000
James S. Sims; William L. George; Steven G. Satterfield; Howard Hung; John G. Hagedorn; Peter M. Ketcham; Terence J. Griffin; Stanley A. Hagstrom; Julien C. Franiatte; Garnett W. Bryant; W. Jaskólski; Nicos Martys; C. E. Bouldin; Vernon Simmons; Oliver P. Nicolas; James A. Warren; Barbara A. Am Ende; John Koontz; B. James Filla; Vital G. Pourprix; Stefanie R. Copley; Robert B. Bohn; Adele P. Peskin; Yolanda M. Parker; Judith Ellen Devaney
The rate of scientific discovery can be accelerated through computation and visualization. This acceleration results from the synergy of expertise, computing tools, and hardware for enabling high-performance computation, information science, and visualization that is provided by a team of computation and visualization scientists collaborating in a peer-to-peer effort with the research scientists. In the context of this discussion, high performance refers to capabilities beyond the current state of the art in desktop computing. To be effective in this arena, a team comprising a critical mass of talent, parallel computing techniques, visualization algorithms, advanced visualization hardware, and a recurring investment is required to stay beyond desktop capabilities. This article describes, through examples, how the Scientific Applications and Visualization Group (SAVG) at NIST has utilized high performance parallel computing and visualization to accelerate (1) condensate modeling, (2) fluid flow in porous materials and in other complex geometries, (3) flows in suspensions, (4) x-ray absorption, (5) dielectric breakdown modeling, and (6) dendritic growth in alloys.
Journal of Rheology | 2014
Maxime Liard; Nicos Martys; William L. George; Didier Lootens; Pascal Hébraud
It has been observed that flow curves (viscosity vs shear rate) of spherical solid inclusions suspended in a generalized Newtonian fluid medium can be rescaled so as to collapse onto the flow curve of the fluid medium. This result is surprising given the range of values and the spatial heterogeneity of local shear rates and viscosity in such systems. We consider such scaling for the cases of shear-thinning, Newtonian, and shear-thickening fluid media. Results from experiment and computational modeling are presented that examine the microscopic origins of this scaling behavior. Over a wide range of volume fractions (5–50%), it is shown that the distribution of local shear rates can be collapsed onto a single universal curve. The parameters for rescaling the shear rate distributions can be analytically related to the macroscopic rescaling parameters for the viscosity. As a result of this rescaling capability, one may measure the properties of the fluid medium and predict the macroscopic behavior of the suspension.
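Schematically, a collapse of this kind can be written with two volume-fraction-dependent shift factors; the particular functional forms determined in the paper are not reproduced here, so the relation below should be read as an illustrative assumption rather than the published result:

    \eta_{\mathrm{susp}}(\dot{\gamma}, \phi) \;=\; A(\phi)\,\eta_{\mathrm{fluid}}\bigl(B(\phi)\,\dot{\gamma}\bigr),

with the corresponding microscopic statement being that the distribution of \dot{\gamma}_{\mathrm{local}}/\langle\dot{\gamma}\rangle falls onto a single master curve over the volume fractions studied.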
ambient intelligence | 2005
Judith Ellen Devaney; Steven G. Satterfield; John G. Hagedorn; John T. Kelso; Adele P. Peskin; William L. George; Terence J. Griffin; Howard Hung; Ronald D. Kriz
Scientific discoveries occur with iterations of theory, experiment, and analysis. But the methods that scientists use to go about their work are changing [1]. Experiment types are changing. Increasingly, experiment means computational experiment [2], as computers increase in speed, memory, and parallel processing capability. Laboratory experiments are becoming parallel as combinatorial experiments become more common. Acquired datasets are changing. Both computer and laboratory experiments can produce large quantities of data where the time to analyze data can exceed the time to generate it. Data from experiments can come in surges where the analysis of each set determines the direction of the next experiments. The data generated by experiments may also be non-intuitive. For example, nanoscience is the study of materials whose properties may change greatly as their size is reduced [3]. Thus analyses may benefit from new ways to examine and interact with data.
Book chapter in Trends in Interactive Visualization | 2009
Judith E. Terrill; William L. George; Terence J. Griffin; John G. Hagedorn; John T. Kelso; Marc Olano; Adele P. Peskin; Steven G. Satterfield; James S. Sims; Jeffrey W. Bullard; Joy P. Dunkers; Nicos Martys; Agnes O’Gallagher; Gillian Haemer
We describe three classes of tools to turn visualizations into a visual laboratory to interactively measure and analyze scientific data. We move the normal activities that scientists perform to understand their data into the visualization environment, which becomes our virtual laboratory, combining the qualitative with the quantitative. We use representation, interactive selection, quantification, and display to add quantitative measurement methods, input tools, and output tools. These allow us to obtain numerical information from each visualization. The exact form that the tools take within each of our three categories depends on features present in the data; hence each is manifested differently in different situations. We illustrate the three approaches with a variety of case studies from immersive to desktop environments that demonstrate the methods used to obtain quantitative knowledge interactively from visual objects.
European Physical Journal E | 2012
Nicos Martys; Mouhamad Khalil; William L. George; Didier Lootens; Pascal Hébraud
The stress propagation in a concentrated attractive colloidal suspension under shear is studied using numerical simulations. The spatial correlations of the intercolloidal stress field are studied and an inertia-like tensor is defined in order to characterize the anisotropic nature of the stress field. It is shown that, although the colloids remain in a liquid order, the intercolloidal stress is strongly anisotropic. A transition under flow is observed: during a transient regime at low deformation, the stress propagates along the compression direction of the shear, whereas at larger deformations, the stress is organized into layers parallel to the (flow, vorticity) plane.
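One natural construction of such an inertia-like tensor, built from the spatial correlation function C(\mathbf{r}) of the intercolloidal stress, is sketched below; this schematic form is an assumption for illustration and may differ in detail from the definition used in the paper:

    I_{\alpha\beta} \;=\; \int C(\mathbf{r})\, r_{\alpha}\, r_{\beta}\, d^{3}r ,

so that the eigenvectors of I_{\alpha\beta} give the principal directions of stress propagation and the spread of its eigenvalues quantifies the anisotropy.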
Journal of Research of the National Institute of Standards and Technology | 2008
James S. Sims; William L. George; Terence J. Griffin; John G. Hagedorn; Howard Hung; John T. Kelso; Marc Olano; Adele P. Peskin; Steven G. Satterfield; Judith Devaney Terrill; Garnett W. Bryant; Jose G. Diaz
This is the third in a series of articles that describe, through examples, how the Scientific Applications and Visualization Group (SAVG) at NIST has utilized high performance parallel computing, visualization, and machine learning to accelerate scientific discovery. In this article we focus on the use of high performance computing and visualization for simulations of nanotechnology.
2016 IEEE 9th Workshop on Software Engineering and Architectures for Realtime Interactive Systems (SEARIS) | 2016
Wesley Griffin; William L. George; Terence J. Griffin; John G. Hagedorn; Marc Olano; Steven G. Satterfield; James S. Sims; Judith E. Terrill
Content creation for realtime interactive systems is a difficult problem. In game development, content creation pipelines are a major portion of the code base and content creation is a major portion of the budget. In research environments, the choice of rendering and simulation systems is frequently driven by the need for easy-to-use content authoring tools. In visualization, this problem is compounded by the widely varying types of data that users desire to visualize. We present a visualization application creation framework incorporated into our visualization system that enables measurement and quantitative analysis tasks in both desktop and immersive environments on diverse input data sets.