Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where John W Cobb is active.

Publication


Featured research published by John W Cobb.


conference on high performance computing (supercomputing) | 2007

Optimizing center performance through coordinated data staging, scheduling and recovery

Zhe Zhang; Chao Wang; Sudharshan S. Vazhkudai; Xiaosong Ma; Gregory G. Pike; John W Cobb; Frank Mueller

Procurement and the optimized utilization of Petascale supercomputers and centers is a renewed national priority. Sustained performance and availability of such large centers is a key technical challenge significantly impacting their usability. Storage systems are known to be the primary fault source leading to data unavailability and job resubmissions. This results in reduced center performance, partially due to the lack of coordination between I/O activities and job scheduling. In this work, we propose the coordination of job scheduling with data staging/offloading and on-demand staged data reconstruction to address the availability of job input data and to improve center-wide performance. Fundamental to both mechanisms is the efficient management of transient data: in the way it is scheduled and recovered. Collectively, from a center's standpoint, these techniques optimize resource usage and increase its data/service availability. From a user's standpoint, they reduce the job turnaround time and optimize the allocated time usage.
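The paper describes these mechanisms only at a summary level; the following is a minimal, purely illustrative Python sketch of the coordination idea, with every name (Job, stage_from_source, the scratch and archive paths) invented here rather than taken from the paper: input data is staged into scratch space shortly before a job's scheduled start, and a lost staged copy is rebuilt on demand from the source instead of forcing a job resubmission.

```python
# Illustrative sketch only: a toy scheduler that coordinates job dispatch with
# input-data staging and on-demand recovery of the staged copy. All names
# (Job, stage_from_source, the scratch/archive paths) are hypothetical.
import os
import shutil
from dataclasses import dataclass

SCRATCH = "/tmp/scratch"            # stand-in for center scratch storage

@dataclass
class Job:
    name: str
    source_path: str                # remote/archival copy of the input data

def stage_from_source(job: Job) -> str:
    """Stage the job's input data into scratch shortly before its start time."""
    os.makedirs(SCRATCH, exist_ok=True)
    staged = os.path.join(SCRATCH, job.name + ".input")
    shutil.copy(job.source_path, staged)   # placeholder for a real wide-area transfer
    return staged

def ensure_available(job: Job, staged: str) -> str:
    """On-demand recovery: if the staged copy was lost to a storage fault,
    rebuild it from the source instead of failing and resubmitting the job."""
    if not os.path.exists(staged):
        staged = stage_from_source(job)
    return staged

def dispatch(job: Job) -> None:
    staged = stage_from_source(job)         # staging coordinated with the schedule
    staged = ensure_available(job, staged)  # transparent recovery before launch
    print(f"launching {job.name} with input {staged}")

if __name__ == "__main__":
    os.makedirs("/tmp/archive", exist_ok=True)
    with open("/tmp/archive/run42.dat", "w") as f:
        f.write("example input data\n")
    dispatch(Job(name="neutron_fit", source_path="/tmp/archive/run42.dat"))
```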


Journal of Physics: Conference Series | 2009

Concurrent, parallel, multiphysics coupling in the FACETS project

John R. Cary; Jeff Candy; John W Cobb; R.H. Cohen; Tom Epperly; Donald Estep; S. I. Krasheninnikov; Allen D. Malony; D. McCune; Lois Curfman McInnes; A.Y. Pankin; Satish Balay; Johan Carlsson; Mark R. Fahey; Richard J. Groebner; Ammar Hakim; Scott Kruger; Mahmood Miah; Alexander Pletzer; Svetlana G. Shasharina; Srinath Vadlamani; David Wade-Stein; T.D. Rognlien; Allen Morris; Sameer Shende; Greg Hammett; K. Indireshkumar; A. Yu. Pigarov; Hong Zhang

FACETS (Framework Application for Core-Edge Transport Simulations) is now in its third year. The FACETS team has developed a framework for concurrent coupling of parallel computational physics for use on Leadership Class Facilities (LCFs). In the course of the last year, FACETS has tackled many of the difficult problems of moving to parallel, integrated modeling by developing algorithms for coupled systems, extracting legacy applications as components, modifying them to run on LCFs, and improving the performance of all components. The development of FACETS abides by rigorous engineering standards, including cross-platform build and test systems, with the latter covering regression, performance, and visualization. In addition, FACETS has demonstrated the ability to incorporate full turbulence computations for the highest fidelity transport computations. Early indications are that the framework, using such computations, scales to multiple tens of thousands of processors. These accomplishments were a result of an interdisciplinary collaboration among computational physicists, computer scientists, and applied mathematicians on the team.
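As a rough, assumption-laden illustration of the coupling pattern the abstract refers to (not the FACETS implementation itself, which couples parallel physics codes across processor groups), the toy Python loop below advances two hypothetical 1-D solvers, a core and an edge, and exchanges their interface values once per coupling step.

```python
# Minimal sketch of explicit core-edge coupling: two toy 1-D profiles exchange
# boundary values each coupling step, then each side advances independently.
# Purely illustrative; the real FACETS components are full physics codes.
import numpy as np

def advance_core(core: np.ndarray, edge_boundary: float) -> np.ndarray:
    """Advance the core profile using the boundary value handed over by the edge."""
    new = core.copy()
    new[-1] = edge_boundary                                     # coupling boundary condition
    new[1:-1] += 0.1 * (new[2:] - 2 * new[1:-1] + new[:-2])     # toy diffusion step
    return new

def advance_edge(edge: np.ndarray, core_boundary: float) -> np.ndarray:
    """Advance the edge profile using the boundary value handed over by the core."""
    new = edge.copy()
    new[0] = core_boundary                                      # coupling boundary condition
    new[1:-1] += 0.1 * (new[2:] - 2 * new[1:-1] + new[:-2])
    return new

core = np.linspace(10.0, 5.0, 50)    # hypothetical core temperature profile
edge = np.linspace(5.0, 1.0, 20)     # hypothetical edge temperature profile

for step in range(100):              # coupling loop: exchange, then advance both sides
    core_to_edge = core[-1]
    edge_to_core = edge[0]
    core = advance_core(core, edge_to_core)
    edge = advance_edge(edge, core_to_edge)

print("core/edge interface values:", core[-1], edge[0])
```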


Journal of Physics: Conference Series | 2008

First results from core-edge parallel composition in the FACETS project

John R. Cary; Jeff Candy; R.H. Cohen; S. I. Krasheninnikov; D. McCune; Donald Estep; Jay Walter Larson; Allen D. Malony; A.Y. Pankin; Patrick H. Worley; Johann Carlsson; Ammar Hakim; Paul Hamill; Scott Kruger; Mahmood Miah; S Muzsala; Alexander Pletzer; Svetlana G. Shasharina; David Wade-Stein; Nanbor Wang; Satish Balay; Lois Curfman McInnes; Hong Zhang; T. A. Casper; Lori Freitag Diachin; Thomas Epperly; T.D. Rognlien; Mark R. Fahey; John W Cobb; Allen Morris

FACETS (Framework Application for Core-Edge Transport Simulations), now in its second year, has achieved its first coupled core-edge transport simulations. In the process, a number of accompanying accomplishments were achieved. These include a new parallel core component, a new wall component, improvements in edge and source components, and the framework for coupling all of this together. These accomplishments were a result of an interdisciplinary collaboration among computational physicists, computer scientists, and applied mathematicians on the team.


Archive | 2011

Challenges in Data Intensive Analysis at Scientific Experimental User Facilities

Kerstin Kleese van Dam; Dongsheng Li; Stephen D Miller; John W Cobb; Mark L. Green; Catherine L. Ruby

Today's scientific challenges, such as routes to a sustainable energy future, materials by design, or biological and chemical environmental remediation methods, are complex problems that require the integration of a wide range of complementary expertise to be addressed successfully. Experimental and computational science research methods can thereby offer fundamental insights for their solution.


Concurrency and Computation: Practice and Experience | 2007

The Neutron Science TeraGrid Gateway: a TeraGrid science gateway to support the Spallation Neutron Source

John W Cobb; Al Geist; James Arthur Kohl; Stephen D Miller; Peter F. Peterson; Gregory G. Pike; Michael A. Reuter; Tom Swain; Sudharshan S. Vazhkudai; Nithya N. Vijayakumar

Web portals are one of the possible ways to access the remote computing resources offered by Grid environments. Since the emergence of the first middleware for the Grid, work has been conducted on delivering the functionality of Grid services on the Web. Many interesting Grid portal solutions have been designed to help organize remote access to Grid resources and applications from within Web browsers. They are technically advanced and more and more widely used around the world, resulting in feedback from the community. Some of these user comments concern the flexibility and user-friendliness of the developed solutions. In this paper we present how we addressed the need for a flexible and user-friendly Grid portal environment within the PROGRESS project and how our approach facilitates the use of the Grid within Web portals.


teragrid conference | 2011

DataONE member node pilot integration with TeraGrid

Nicholas C. Dexter; John W Cobb; Dave Vieglais; Matthew Jones; Mike Lowe

The NSF DataONE [1] DataNet project and the NSF TeraGrid [2] project have initiated a pilot collaboration to deploy and operate the DataONE Member Node software stack on TeraGrid infrastructure. The appealing feature of this collaboration is that it opens up the possibility to add large-scale computing as an adjunct to DataONE data, metadata, and workflow manipulation and analysis tools. Additionally, DataONE data archive and curation services are exposed as an option for large-scale computing and storage efforts such as TeraGrid/XSEDE. With this joint effort, DataONE also brings an open, persistent, robust, and secure method for accessing Earth sciences data collected by science communities such as The National Evolutionary Synthesis Center's Dryad [3], The Ecological Society of America's Ecological Archive [4], NASA's Distributed Active Archive Center at the Oak Ridge National Laboratory [5], the USGS's National Biological Information Infrastructure [6], the Fire Research & Management Exchange System [7], the Long Term Ecological Research Network [8], and the Knowledge Network for Biocomplexity [9]. Beginning with an April 1st, 2011, allocation, the DataONE Core Cyberinfrastructure Team has been working with the IU Quarry [10] virtual hosting service, and more generally with the TeraGrid data area, on this pilot implementation. The implementation includes multiple virtual servers in order to test different reference implementations of the common DataONE Member Node RESTful web-service functions [11]; these include a Metacat server [12] as well as a Python Generic Member Node developed by DataONE [13]. The implementations will also mount TeraGrid-wide global storage services (DC-WAN [14] and Albedo [15]) and thus allow integration of input and output of large-scale computational runs with wide-area archival data and metadata services.
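The Member Node functions mentioned above are exposed as RESTful web services; as a hedged sketch (the node hostname is hypothetical, and only the read-side listObjects and get calls of the DataONE Member Node interface are shown), a client could query a deployed node roughly like this:

```python
# Hedged sketch of querying a DataONE Member Node's read API with Python.
# The base URL is a hypothetical placeholder; /v1/object (listObjects) and
# /v1/object/{pid} (get) follow the DataONE Member Node REST interface
# referenced in the text.
import requests

BASE = "https://mn.example.org/mn"        # hypothetical Member Node base URL

# listObjects: returns an XML list of object identifiers held by the node
resp = requests.get(f"{BASE}/v1/object", params={"start": 0, "count": 10}, timeout=30)
resp.raise_for_status()
print(resp.text[:500])                    # first part of the objectList XML

# get: retrieve the bytes of one object by its persistent identifier (PID)
pid = "example-pid"                       # hypothetical identifier
data = requests.get(f"{BASE}/v1/object/{pid}", timeout=30)
if data.ok:
    with open("object.bin", "wb") as f:
        f.write(data.content)
```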


Journal of Physics: Conference Series | 2010

Neutron Science TeraGrid Gateway

V. E. Lynch; Meili Chen; John W Cobb; James Arthur Kohl; Stephen D Miller; David A Speirs; Sudharshan S. Vazhkudai

The unique contributions of the Neutron Science TeraGrid Gateway (NSTG) are the connection of national user facility instrument data sources to the integrated cyberinfrastructure of the National Science Foundation TeraGrid and the development of a neutron science gateway that allows neutron scientists to use TeraGrid resources to analyze their data, including comparison of experiment with simulation. The NSTG is working in close collaboration with the Spallation Neutron Source (SNS) at Oak Ridge as their principal facility partner. The SNS is a next-generation neutron source. It has completed construction at a cost of $1.4 billion and is ramping up operations. The SNS will provide an order of magnitude greater flux than any previous facility in the world and will be available to all of the nation's scientists, independent of funding source, on a peer-reviewed merit basis. With this new capability, the neutron science community is facing orders of magnitude larger data sets and is at a critical point for data analysis and simulation. There is a recognized need for new ways to manage and analyze data to optimize both beam time and scientific output. The TeraGrid is providing new capabilities in the gateway for simulations using McStas and a fitting service on distributed TeraGrid resources to improve turnaround. NSTG staff are also exploring replicating experimental data in archival storage. As part of the SNS partnership, the NSTG provides access to gateway support, cyberinfrastructure outreach, community development, and user support for the neutron science community. This community includes not only SNS staff and users but extends to all the major worldwide neutron scattering centers.
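The gateway's simulation capability wraps batch McStas runs; purely as a sketch under stated assumptions (the instrument file, parameter name, and output directory are hypothetical placeholders, and a standard McStas installation providing the mcrun front end is assumed), such a run might be driven as follows:

```python
# Hedged sketch of launching a McStas instrument simulation the way a gateway
# simulation service might, via the standard mcrun front end. The instrument
# file, parameter, and output directory are hypothetical placeholders.
import subprocess

cmd = [
    "mcrun",                # McStas run tool (assumes McStas is installed)
    "my_instrument.instr",  # hypothetical instrument description
    "-n", "1e7",            # number of neutron rays to trace
    "-d", "run_output",     # directory for simulated detector data
    "sample_angle=45",      # hypothetical instrument parameter
]
result = subprocess.run(cmd, capture_output=True, text=True)
print(result.stdout)
if result.returncode != 0:
    print("mcrun failed:", result.stderr)
```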


ieee international conference on high performance computing data and analytics | 2008

TeraGrid: Analysis of organization, system architecture, and middleware enabling new types of applications

Charlie Catlett; William E. Allcock; Phil Andrews; Ruth A. Aydt; Ray Bair; Natasha Balac; Bryan Banister; Trish Barker; Mark Bartelt; Peter H. Beckman; Francine Berman; Gary R. Bertoline; Alan Blatecky; Jay Boisseau; Jim Bottum; Sharon Brunett; J. Bunn; Michelle Butler; David Carver; John W Cobb; Tim Cockerill; Peter Couvares; Maytal Dahan; Diana Diehl; Thom H. Dunning; Ian T. Foster; Kelly P. Gaither; Dennis Gannon; Sebastien Goasguen; Michael Grobe


Journal of Physics: Conference Series | 2010

The SNS/HFIR Web Portal System – How Can it Help Me?

Stephen D Miller; Al Geist; Kenneth W. Herwig; Peter F. Peterson; Michael A. Reuter; Shelly Ren; Jean-Christophe Bilheux; Stuart I. Campbell; James Arthur Kohl; Sudharshan S. Vazhkudai; John W Cobb; V. E. Lynch; Meili Chen; James R Trater; Bradford C Smith; Tom Swain; Jian Huang; Ruth Mikkelson; D. Mikkelson; Mark L. Green

Collaboration


Dive into John W Cobb's collaborations.

Top Co-Authors

Stephen D Miller, Oak Ridge National Laboratory
James Arthur Kohl, Oak Ridge National Laboratory
V. E. Lynch, Oak Ridge National Laboratory
Meili Chen, Oak Ridge National Laboratory
Michael A. Reuter, Oak Ridge National Laboratory
Peter F. Peterson, Oak Ridge National Laboratory
Al Geist, Oak Ridge National Laboratory
Bradford C Smith, Oak Ridge National Laboratory
Gregory G. Pike, Oak Ridge National Laboratory