David P. Schissel
General Atomics
Publications
Featured research published by David P. Schissel.
Fusion Engineering and Design | 2000
J Schachter; Q. Peng; David P. Schissel
Data analysis at the DIII-D National Fusion Facility is simplified by the use of two software packages in analysis codes. The first is GAPlotObj, an IDL-based object-oriented library used in visualization tools for dynamic plotting. GAPlotObj gives users the ability to manipulate graphs directly through mouse- and keyboard-driven commands. The second software package is MDSplus, which is used at DIII-D as a central repository for analyzed data. GAPlotObj and MDSplus reduce the effort required for a collaborator to become familiar with the DIII-D analysis environment by providing uniform interfaces for data display and retrieval. Two visualization tools at DIII-D that benefit from them are ReviewPlus and EFITviewer. ReviewPlus is capable of displaying interactive 2D and 3D graphs of raw, analyzed, and simulation code data. EFITviewer is used to display results from the EFIT analysis code together with kinetic profiles and machine geometry. Both bring new possibilities for data exploration to the user, and both are able to plot data from any fusion research site with an MDSplus data server.
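The uniform data-retrieval idea behind these tools can be pictured with a short sketch. The Python fragment below is a minimal sketch assuming the standard MDSplus thin-client interface; the server address, tree name, shot number, and signal path are hypothetical placeholders rather than actual DIII-D identifiers.

```python
# Minimal sketch of remote data retrieval through MDSplus, assuming the
# standard MDSplus Python thin-client interface. The server, tree, shot
# number, and signal path below are illustrative placeholders.
import MDSplus

SHOT = 123456  # hypothetical shot number

conn = MDSplus.Connection('mds-server.example.org')  # thin-client connection
conn.openTree('efit01', SHOT)                        # hypothetical analysis tree

# Any site with an MDSplus data server can be queried with the same TDI
# expressions, which is what gives tools like ReviewPlus their site independence.
signal = conn.get(r'\example_signal').data()
times = conn.get(r'dim_of(\example_signal)').data()

print(f'{len(signal)} samples from t={times[0]:.3f}s to t={times[-1]:.3f}s')
```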
challenges of large applications in distributed environments | 2004
Katarzyna Keahey; Michael E. Papka; Qian Peng; David P. Schissel; G. Abla; Takuya Araki; Justin Burruss; Eliot Feibush; Peter Lane; Scott Klasky; Ti Leggett; D. McCune; Lewis Randerson
The National Fusion Collaboratory focuses on enabling fusion scientists to explore grid capabilities in support of experimental science. Fusion experiments are structured as a series of plasma pulses initiated roughly every 20 minutes. In the between-pulse intervals, scientists perform data analysis and discuss results to reach decisions affecting changes to the next plasma pulse. This interaction can be made more efficient by performing more analysis and engaging more expertise from a geographically distributed team of scientists and resources. In this paper, we describe a virtual control room experiment that unites collaborative, visualization, and grid technologies to provide such an environment, and we show how their combined effect can advance experimental science. We report on FusionGrid services whose use during the fusion experimental cycle became possible for the first time thanks to this technology, and we describe the Access Grid, experimental data presentation tools, and agreement-based resource management and workflow systems enabling time-bounded end-to-end application execution. The first virtual control room experiment represented a mock-up of a remote interaction with the DIII-D control room and was presented at SC03 and later reviewed at an international ITER Grid Workshop.
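The time-bounded, between-pulse constraint the paper works within can be pictured with a small sketch. The fragment below is hypothetical and not part of the FusionGrid software; the roughly 20-minute pulse cycle comes from the abstract, while the safety margin and runtimes are invented.

```python
# Hypothetical sketch of the time-bounded scheduling constraint for
# between-pulse analysis; not actual FusionGrid code. The ~20-minute
# pulse cycle comes from the abstract; runtimes are invented examples.
from datetime import datetime, timedelta

PULSE_CYCLE = timedelta(minutes=20)   # approximate interval between pulses
SAFETY_MARGIN = timedelta(minutes=2)  # assumed buffer for reviewing results

def fits_between_pulses(last_pulse: datetime, estimated_runtime: timedelta,
                        now: datetime) -> bool:
    """Return True if an analysis job is expected to finish before the
    next pulse, leaving a margin to discuss the results."""
    next_pulse = last_pulse + PULSE_CYCLE
    return now + estimated_runtime + SAFETY_MARGIN <= next_pulse

# Example: a 6-minute analysis run requested 10 minutes into the cycle.
last_pulse = datetime(2004, 1, 1, 12, 0)
now = last_pulse + timedelta(minutes=10)
print(fits_between_pulses(last_pulse, timedelta(minutes=6), now))  # True
```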
Fusion Engineering and Design | 2002
David P. Schissel; A. Finkelstein; Ian T. Foster; Tom W. Fredian; M. Greenwald; C.D. Hansen; C.R. Johnson; Katarzyna Keahey; S.A. Klasky; K. Li; D.C. McCune; Qian Peng; R. Stevens; Mary R. Thompson
The long-term vision of the Fusion Collaboratory described in this paper is to transform fusion research and accelerate scientific understanding and innovation so as to revolutionize the design of a fusion energy source. The Collaboratory will create and deploy collaborative software tools that will enable more efficient utilization of existing experimental facilities and more effective integration of experiment, theory, and modeling. The computer science research necessary to create the Collaboratory is centered on three activities: security, remote and distributed computing, and scientific visualization. It is anticipated that the presently envisioned Fusion Collaboratory software tools will require 3 years to complete.
Review of Scientific Instruments | 2010
G. Abla; T. Fredian; David P. Schissel; J. Stillerman; M. Greenwald; D. N. Stepanov; D.J. Ciarlette
Tokamak diagnostic settings are repeatedly modified to meet the changing needs of each experiment. Enabling remote diagnostic control presents significant challenges due to security and efficiency requirements. The Operation Request Gatekeeper (ORG) is a software system that addresses the challenge of submitting modification requests remotely yet securely. The ORG provides a framework for screening all requests before they enter the secure machine zone and are executed, performing user authentication and authorization, grammar validation, and validity checks. A prototype ORG was developed for the ITER CODAC that satisfies their initial requirements for remote request submission and has been tested with remote control of the KSTAR Plasma Control System. This paper describes the software design principles and implementation of the ORG as well as worldwide test results.
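The screening layers named above (authentication, authorization, grammar validation, and validity checks) can be pictured as a short pipeline. The sketch below is a hypothetical illustration of that layering rather than the ORG implementation; the parameter names, user roles, and limits are invented.

```python
# Hypothetical illustration of ORG-style screening layers: authenticate,
# authorize, validate grammar, then check values. Not the ORG implementation;
# the roles, parameters, and limits below are invented.
from dataclasses import dataclass

AUTHORIZED_OPERATORS = {'alice', 'bob'}                          # assumed role table
PARAMETER_LIMITS = {'gain': (0.0, 10.0), 'offset': (-1.0, 1.0)}  # assumed ranges

@dataclass
class Request:
    user: str
    token_valid: bool      # stands in for real credential verification
    parameter: str
    value: float

def screen(request: Request) -> str:
    """Run a request through each screening layer; reject at the first failure."""
    if not request.token_valid:
        return 'rejected: authentication failed'
    if request.user not in AUTHORIZED_OPERATORS:
        return 'rejected: user not authorized for diagnostic changes'
    if request.parameter not in PARAMETER_LIMITS:            # grammar validation
        return f'rejected: unknown parameter {request.parameter!r}'
    low, high = PARAMETER_LIMITS[request.parameter]          # validity check
    if not low <= request.value <= high:
        return f'rejected: {request.parameter} out of range [{low}, {high}]'
    return 'forwarded to secure machine zone'

print(screen(Request('alice', True, 'gain', 3.5)))   # forwarded
print(screen(Request('carol', True, 'gain', 3.5)))   # rejected: not authorized
```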
Lawrence Berkeley National Laboratory | 2006
Paul D. Adams; Shane Canon; Steven Carter; Brent Draney; M. Greenwald; Jason Hodges; Jerome Lauret; George Michaels; Larry Rahn; David P. Schissel; Gary Strand; Howard Walter; Michael F. Wehner; Dean N. Williams
The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the US Department of Energy Office of Science, the single largest supporter of basic research in the physical sciences in the United States. In support of the Office of Science programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities and scientists that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 20 years. In August 2002 the DOE Office of Science organized a workshop to characterize the networking requirements for Office of Science programs. Networking and middleware requirements were solicited from a representative group of science programs. The workshop was summarized in two documents: the workshop final report and a set of appendixes. This document updates the networking requirements for ESnet as put forward by the science programs listed in the 2002 workshop report. In addition, three new programs have been added. The information was gathered through interviews with knowledgeable scientists in each particular program or field.
Cluster Computing | 2005
Katarzyna Keahey; Michael E. Papka; Qian Peng; David P. Schissel; G. Abla; Takuya Araki; Justin Burruss; Eliot Feibush; Peter Lane; Scott Klasky; Ti Leggett; D. McCune; Lewis Randerson
The National Fusion Collaboratory project seeks to enable fusion scientists to exploit Grid capabilities in support of experimental science. To this end we are exploring the concept of a collaborative control room that harnesses Grid and collaborative technologies to provide an environment in which remote experimental devices, codes, and expertise can interact in real time during an experiment. This concept has the potential to make fusion experiments more efficient by enabling researchers to perform more analysis and by engaging more expertise from a geographically distributed team of scientists and resources. As the realities of software development, talent distribution, and budgets increasingly encourage pooling resources and specialization, we see such environments as a necessary tool for future science. In this paper, we describe an experimental mock-up of a remote interaction with the DIII-D control room. The collaborative control room was demonstrated at SC03 and later reviewed at an international ITER Grid Workshop. We describe how collaborative, visualization, and Grid technologies can be combined effectively in experimental science. Specifically, we describe the Access Grid, experimental data presentation tools, and agreement-based resource management and workflow systems enabling time-bounded end-to-end application execution. We also report on FusionGrid services whose use during the fusion experimental cycle became possible for the first time thanks to this technology, and we discuss the potential use of this technology in future fusion experiments.
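One way to picture agreement-based resource management in this setting is as a guaranteed slice of CPUs over a between-pulse time window, with jobs admitted only if they fit inside the guarantee. The sketch below is purely illustrative and not the FusionGrid workflow system; the window, CPU counts, and job sizes are invented.

```python
# Hypothetical sketch of agreement-based resource management: an "agreement"
# guarantees a number of CPUs over a time window after a pulse, and a job is
# admitted only if it fits inside that guarantee. Illustration only.
from dataclasses import dataclass

@dataclass
class Agreement:
    start_minute: int   # minutes after the pulse when the guarantee begins
    end_minute: int     # minutes after the pulse when the guarantee ends
    cpus: int           # CPUs guaranteed over that window

@dataclass
class Job:
    cpus_needed: int
    runtime_minutes: int

def admissible(agreement: Agreement, job: Job, request_minute: int) -> bool:
    """Admit the job only if it fits within the agreed window and CPU count."""
    finishes_by = request_minute + job.runtime_minutes
    return (job.cpus_needed <= agreement.cpus
            and request_minute >= agreement.start_minute
            and finishes_by <= agreement.end_minute)

# Example: 16 CPUs guaranteed from minute 2 to minute 18 after the pulse.
slot = Agreement(start_minute=2, end_minute=18, cpus=16)
print(admissible(slot, Job(cpus_needed=8, runtime_minutes=10), request_minute=3))   # True
print(admissible(slot, Job(cpus_needed=8, runtime_minutes=10), request_minute=12))  # False
```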
symposium on fusion technology | 2001
David P. Schissel; J.R. Burruss; Q. Peng; J. Schachter; T.B. Terpstra; K.K. Keith; B.B. McHarg; J.C. Phillips
A long-term plan is being implemented to enhance the computational infrastructure of the DIII-D National Fusion Facility. One of the goals of this plan is more efficient utilization of DIII-D experimental run time by decreasing the time required to analyze, store, and distribute analyzed data during tokamak operations. A multi-processor Linux cluster is reducing data analysis time, and a Unix-based MDSplus data management system is providing rapid access to analyzed data. A second goal of the long-term plan is to reduce the time required for more detailed physics analysis after experimental operations. This goal is being pursued with an underlying philosophy of uniformity: in the look and feel of our GUI-based tools, in the access methods to analyzed datasets, and in access to existing computing power via a load-balanced UNIX cluster. Additionally, we have enhanced our remote meeting capability, resulting in improved communication within the geographically diverse DIII-D Research Team.
Grid-Based Problem Solving Environments | 2007
David P. Schissel
Fusion science seeks a new power source and is advanced by experiments on fusion devices located worldwide. Fundamental to increasing understanding of fusion is the comparison of theory and experiment; measurements from fusion devices are analyzed and compared with the output of simulations to test the validity of fusion models and to uncover new physical properties. Integrating simulations with experimental data (or with other simulations) is in many cases a labor-intensive task, as different codes use different data storage formats. Moreover, the timely comparison of simulation with observations made during an experiment requires rapid turnaround both of analysis codes and simulation runs. Many simulations require extensive input and output processing, further increasing the amount of work necessary to achieve viable scientific results. Workers with the National Fusion Collaboratory are developing, deploying, and evaluating new technologies that facilitate analysis of experimental data and comparison of the results with those of simulations. Complex physics codes are made available on the National Fusion Grid (FusionGrid) as comprehensive computational services. Using the Globus Toolkit, a service-based approach was developed and subsequently combined with the TRANSP transport code to the benefit of fusion scientists. Output from both simulation and experimental codes is stored in MDSplus, the de facto standard for secure storage of fusion data. Access control for the resources of FusionGrid is greatly simplified for both users and administrators through unified authentication and authorization using X.509, a grid-wide certificate management system, and a grid-wide authorization system. Web-based solutions such as the recently developed Elfresco reflectometry code further simplify the process by making simulations available to scientists and providing an alternative to traditional distribution of code. Future work includes the development of parallelized modules to speed up long-running codes along with the extension of MDSplus. These improvements will help accommodate the continuous data streams that will be found in future fusion devices such as ITER. This paper will present a discussion of specific solutions, examine deployment areas that present a challenge, and highlight areas where further work is required.
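The unified X.509 authentication and authorization scheme mentioned above can be sketched conceptually as validating a certificate and mapping its subject to a role. The fragment below uses the widely available Python cryptography package (version 42 or later for the *_utc accessors) and is an illustration only; the role table and certificate are placeholders, and real grid authorization additionally involves CA chain verification, revocation checks, and site policy.

```python
# Conceptual sketch of X.509-based authentication and authorization:
# verify that a certificate is currently valid, then map its subject
# distinguished name to a FusionGrid-style role. Illustration only; the
# role table is invented and real grid security involves much more.
from datetime import datetime, timezone
from cryptography import x509

# Assumed mapping from certificate subject DN to an authorization role.
ROLE_TABLE = {
    'CN=Jane Scientist,O=Example Lab,C=US': 'transp-run',
}

def authorize(pem_bytes: bytes, now: datetime) -> str:
    cert = x509.load_pem_x509_certificate(pem_bytes)
    if not (cert.not_valid_before_utc <= now <= cert.not_valid_after_utc):
        return 'rejected: certificate expired or not yet valid'
    subject = cert.subject.rfc4514_string()
    role = ROLE_TABLE.get(subject)
    return f'authorized for {role}' if role else 'rejected: unknown subject'

# Usage (with a PEM-encoded certificate read from disk):
# with open('user-cert.pem', 'rb') as f:
#     print(authorize(f.read(), datetime.now(timezone.utc)))
```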
Journal of Physics: Conference Series | 2005
David P. Schissel
The National Fusion Collaboratory Project is developing a persistent infrastructure to enable scientific collaboration for all aspects of magnetic fusion energy research by creating a robust, user-friendly collaborative environment and deploying it to the more than one thousand fusion scientists in forty institutions who perform magnetic fusion research in the US. Work specifically focused on advancing real-time interpretation of fusion experiments includes collocated collaboration in tokamak control rooms via shared display walls, remote collaboration using Internet-based audio and video, and pseudo-real-time data analysis via the National Fusion Energy Grid (FusionGrid). The technologies being developed and deployed will also scale to next-generation experimental devices such as ITER.
grid computing | 2002
Katarzyna Keahey; Tom W. Fredian; Qian Peng; David P. Schissel; Mary R. Thompson; Ian T. Foster; M. Greenwald; D. McCune