Wes Bethel
Lawrence Berkeley National Laboratory
Publications
Featured research published by Wes Bethel.
Journal of Computer-aided Molecular Design | 2004
Silvia N. Crivelli; Oliver Kreylos; Bernd Hamann; Nelson L. Max; Wes Bethel
We describe ProteinShop, a new visualization tool that streamlines and simplifies the process of determining optimal protein folds. ProteinShop may be used at different stages of a protein structure prediction process. First, it can create protein configurations containing secondary structures specified by the user. Second, it can interactively manipulate protein fragments to achieve desired folds by adjusting the dihedral angles of selected coil regions using an inverse kinematics method. Last, it serves as a visual framework to monitor and steer a protein structure prediction process that may be running on a remote machine. ProteinShop was used to create initial configurations for a protein structure prediction method developed by a team that competed in CASP5. ProteinShop's use accelerated the process of generating initial configurations, reducing the time required from days to hours. This paper describes the structure of ProteinShop and discusses its main features.
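The coil-manipulation idea above can be illustrated with a toy forward-kinematics model: a chain's shape is fully determined by its internal angles, which is what allows an inverse kinematics solver to steer a fold by adjusting dihedral angles. The planar chain with unit-length links below is a simplification for illustration only, not ProteinShop's actual 3D dihedral-angle model.

```python
import math

def chain_positions(angles):
    """Joint positions of a planar linkage with unit-length links.

    Each angle is the rotation of a link relative to the previous one,
    playing the role that a dihedral angle plays in a protein backbone:
    changing one angle moves every downstream joint.
    """
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for a in angles:
        heading += a                 # accumulate relative rotations
        x += math.cos(heading)       # advance one unit link
        y += math.sin(heading)
        points.append((x, y))
    return points

# A straight chain of three links ends at (3, 0); bending the first
# angle by 90 degrees swings the whole downstream chain upward.
straight = chain_positions([0.0, 0.0, 0.0])
bent = chain_positions([math.pi / 2, 0.0, 0.0])
```

An IK solver for such a chain would search over the angle vector to bring the endpoint to a target position, which is the essence of steering a coil region toward a desired fold.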
Published 30 Jun 2004 | 2004
Ian Bowman; John Shalf; Kwan-Liu Ma; Wes Bethel
The visualization of large, remotely located data sets necessitates the development of a distributed computing pipeline in order to reduce the data, in stages, to a manageable size. The required baseline infrastructure for launching such a distributed pipeline is becoming available, but few services support even marginally optimal resource selection and partitioning of the data analysis workflow. We explore a methodology for building a model of overall application performance using a composition of the analytic models of individual components that comprise the pipeline. The analytic models are shown to be accurate on a testbed of distributed heterogeneous systems. The prediction methodology will form the foundation of a more robust resource management service for future Grid-based visualization applications.
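The composition idea in the abstract above can be sketched in a few lines: model each pipeline component analytically, then derive end-to-end behavior from the composition. The stage model (startup cost plus a throughput term) and the coefficients below are illustrative assumptions, not the paper's fitted models.

```python
def stage_time(data_mb, rate_mb_s, startup_s):
    """Analytic model of one pipeline stage: fixed startup cost plus a
    data-size-dependent throughput term."""
    return startup_s + data_mb / rate_mb_s

def pipeline_estimate(data_mb, stages):
    """Compose per-stage models into an overall application model.

    One-shot latency is the sum of stage times; in steady pipelined
    operation, sustained throughput is limited by the slowest stage
    (the bottleneck), which is what a resource selector would minimize.
    """
    times = [stage_time(data_mb, rate, startup) for rate, startup in stages]
    return sum(times), max(times)

# Hypothetical stages (throughput MB/s, startup s): read, reduce,
# wide-area transfer, render. The transfer stage dominates here.
stages = [(200.0, 0.5), (150.0, 0.2), (50.0, 1.0), (300.0, 0.1)]
latency, bottleneck = pipeline_estimate(1024.0, stages)
```

A resource management service could evaluate this composed model for each candidate placement of components on hosts and pick the placement with the smallest bottleneck term.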
Proceedings of SPIE | 2011
Daniela Ushizima; Dilworth Y. Parkinson; Peter S. Nico; Jonathan B. Ajo-Franklin; Alastair A. MacDowell; Benjamin D. Kocar; Wes Bethel; James A. Sethian
High-resolution x-ray micro-tomography is used for imaging solid materials at the micrometer scale in 3D. Our goal is to implement nondestructive techniques to quantify properties in the interior of solid objects, including information on their 3D geometries, which supports modeling of fluid dynamics in the pore space of the host object. The micro-tomography data acquisition process generates large data sets that are often difficult to handle with adequate performance using current standard computing and image processing algorithms. We propose an efficient set of algorithms to filter, segment and extract features from stacks of image slices of porous media. The first step tunes scale parameters for the filtering algorithm; it then reduces artifacts using a fast anisotropic filter applied to the image stack, which smooths homogeneous regions while preserving borders. Next, the volume is partitioned using statistical region merging, exploiting the intensity similarities within each segment. Finally, we calculate the porosity of the material based on the solid-void ratio. Our contribution is a pipeline tailored to large data files, including a scheme that lets the user supply image patches for tuning parameters to the datasets. We illustrate our methodology using more than 2,000 micro-tomography image slices from 4 different porous materials, acquired using high-resolution x-ray imaging. We also compare our results with standard, yet fast, algorithms often used for image segmentation, including median filtering and thresholding.
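The final step of the pipeline above, porosity from the solid-void ratio, reduces to counting labeled voxels once segmentation is done. A minimal sketch, assuming a 0/1 encoding (0 = pore, 1 = solid) that is an illustrative choice rather than the paper's:

```python
def porosity(volume):
    """Porosity of a segmented 3D volume: the fraction of voxels
    labeled as void (0) rather than solid (1)."""
    total = 0
    voids = 0
    for slab in volume:            # iterate over image slices
        for row in slab:
            for voxel in row:
                total += 1
                if voxel == 0:     # void voxel
                    voids += 1
    return voids / total

# A tiny 2x2x2 synthetic volume: 3 void voxels out of 8.
vol = [[[0, 1], [1, 0]],
       [[1, 1], [1, 0]]]
```

In practice this count would run over thousands of segmented slices, which is why the filtering and segmentation steps upstream must scale to large stacks.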
ieee particle accelerator conference | 2007
Andreas Adelmann; Achim Gsell; Benedikt Oswald; Thomas Schietinger; Wes Bethel; John Shalf; Cristina Siegerist; Kurt Stockinger
Significant problems facing all experimental and computational sciences arise from growing data size and complexity. Common to all these problems is the need to perform efficient data I/O on diverse computer architectures. In our scientific application, the largest parallel particle simulations generate vast quantities of six-dimensional data; a single run can produce an aggregate data size of up to several TB. Motivated by the need to address data I/O and access challenges, we have implemented H5Part, an open source data I/O API that simplifies the use of the Hierarchical Data Format v5 library (HDF5). HDF5 is an industry standard for high-performance, cross-platform data storage and retrieval that runs on all contemporary architectures, from large parallel supercomputers to laptops. H5Part, which is oriented to the needs of the particle physics and cosmology communities, provides support for parallel storage and retrieval of particles and structured meshes, and, in the future, unstructured meshes. In this paper, we describe recent work focusing on I/O support for particles and structured meshes and provide data showing performance on modern supercomputer architectures like the IBM POWER 5.
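The simplification H5Part offers is largely organizational: one group per time step, one dataset per particle attribute. The sketch below mocks that layout with nested dicts rather than calling the real HDF5 or H5Part APIs; the "Step#n" group naming follows the H5Part convention, but everything else here is an illustration, not the library's interface.

```python
def write_step(file, step, particles):
    """Store one dataset per particle attribute under a per-step group,
    mimicking the H5Part-style layout of an HDF5 file."""
    group = file.setdefault("Step#%d" % step, {})
    for name, values in particles.items():
        group[name] = list(values)   # one "dataset" per attribute

# Two time steps of a toy two-particle simulation; in 6D phase-space
# output there would be datasets for x, y, z, px, py, pz.
f = {}
write_step(f, 0, {"x": [0.0, 1.0], "y": [0.5, 1.5], "id": [1, 2]})
write_step(f, 1, {"x": [0.1, 1.1], "y": [0.6, 1.6], "id": [1, 2]})
```

With the real library, each rank of a parallel job writes its slice of each attribute array collectively, which is where the HDF5 boilerplate that H5Part hides actually lives.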
Visualization Handbook | 2005
Wes Bethel; John Shalf
Visapult is a visualization application composed of multiple software components that execute in a pipelined-parallel fashion over wide-area networks. By design, Visapult was tailored for use in a remote and distributed visualization context. Visapult is arguably the world's fastest-performing distributed application, consuming approximately 16.8 gigabits per second in sustained network bandwidth during the SC 2002 Bandwidth Challenge over transcontinental network links. Visapult's performance is a direct result of its architecture, careful use of custom network protocols, and application performance tuning. This chapter reveals the secrets used to create the world's highest-performing network application. The chapter begins by presenting an overview of Visapult's fundamental architecture. It then presents three short case studies that reflect the experiences of using Visapult to win the SC Bandwidth Challenge in 2000, 2001, and 2002. The chapter concludes with a discussion of future research and development directions in the field of remote and distributed visualization.
Journal of Synchrotron Radiation | 2017
Talita Perciano; Daniela Ushizima; Harinarayan Krishnan; Dilworth Y. Parkinson; Natalie M. Larson; Daniël M. Pelt; Wes Bethel; Frank W. Zok; James A. Sethian
Three-dimensional (3D) micro-tomography (µ-CT) has proven to be an important imaging modality in industry and scientific domains, and understanding the properties of material structure and behavior has enabled many scientific advances. An important component of the 3D µ-CT pipeline is image partitioning (or image segmentation), a step that is used to separate various phases or components in an image. Image partitioning schemes require specific rules for different scientific fields, but a common strategy consists of devising metrics to quantify performance and accuracy. The present article proposes a set of protocols to systematically analyze and compare the results of unsupervised classification methods used for segmentation of synchrotron-based data. The proposed dataflow for Materials Segmentation and Metrics (MSM) provides 3D micro-tomography image segmentation algorithms, such as statistical region merging (SRM), the k-means algorithm and parallel Markov random field (PMRF), while offering different metrics to evaluate segmentation quality, confidence and conformity with standards. Both experimental and synthetic data are assessed, illustrating quantitative results through the MSM dashboard, which can return sample information such as media porosity and permeability. The main contributions of this work are: (i) to deliver tools to improve material design and quality control; (ii) to provide datasets for benchmarking and reproducibility; (iii) to yield good practices in the absence of standards or ground truth for ceramic composite analysis.
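One common kind of metric such comparison protocols use is overlap between two segmentations, e.g. the Dice coefficient between a candidate labeling and a reference. The sketch below is a generic example of that idea, not a reproduction of the specific metrics in the MSM dashboard.

```python
def dice(a, b):
    """Dice similarity between two flat binary label sequences:
    2 * |A intersect B| / (|A| + |B|), where A and B are the sets of
    voxels labeled 1. Returns 1.0 when both segmentations are empty."""
    assert len(a) == len(b), "segmentations must cover the same voxels"
    overlap = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    total = sum(a) + sum(b)
    return 2.0 * overlap / total if total else 1.0

# Compare a candidate segmentation against a reference labeling.
reference = [0, 1, 1, 0, 1]
candidate = [0, 1, 0, 0, 1]
score = dice(reference, candidate)
```

With synthetic phantoms the reference is exact ground truth; with experimental data, as the abstract notes, such metrics stand in for absent standards and must be interpreted as relative quality measures.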
Lawrence Berkeley National Laboratory | 2002
William Kramer; Wes Bethel; James Craw; Brent Draney; William J. Fortney; Brend Gorda; William Harris; Nancy Meyer; Esmond G. Ng; Francesca Verdier; Howard Walter; Tammy Welcome
This strategic proposal presents NERSC's vision for its activities and new directions over the next five years. NERSC's continuing commitment to providing high-end systems and comprehensive scientific support for its users will be enhanced, and these activities will be augmented by two new strategic thrusts: support for Scientific Challenge Teams and deployment of a Unified Science Environment. The proposal is in two volumes, the Strategic Plan and the Implementation Plan.
Lawrence Berkeley National Laboratory | 2003
Wes Bethel; Cristina Siegerist; John Shalf; Praveenkumar Shetty; T. J. Jankun-Kelly; Oliver Kreylos; Kwan-Liu Ma
international conference on computer graphics and interactive techniques | 2000
Wes Bethel
Lawrence Berkeley National Laboratory | 2009
Cameron Geddes; E. Cormier-Michel; E. Esarey; C. B. Schroeder; Jean-Luc Vay; Wim Leemans; David L. Bruhwiler; John R. Cary; B. Cowan; Marc Durant; Paul Hamill; Peter Messmer; Paul Mullowney; Chet Nieter; Kevin Paul; Svetlana G. Shasharina; Seth A. Veitzer; Gunther H. Weber; Oliver Rübel; Daniela Ushizima; Wes Bethel; John Wu