Patrick O'Leary
Kitware
Publications
Featured research published by Patrick O'Leary.
IEEE International Conference on High Performance Computing Data and Analytics | 2014
James P. Ahrens; Sébastien Jourdain; Patrick O'Leary; John Patchett; David H. Rogers; Mark R. Petersen
Extreme scale scientific simulations are leading a charge to exascale computation, and data analytics runs the risk of being a bottleneck to scientific discovery. Due to power and I/O constraints, we expect in situ visualization and analysis will be a critical component of these workflows. Options for extreme scale data analysis are often presented as a stark contrast: write large files to disk for interactive, exploratory analysis, or perform in situ analysis to save detailed data about phenomena that a scientist knows about in advance. We present a novel framework for a third option - a highly interactive, image-based approach that promotes exploration of simulation results and is easily accessed through extensions to widely used open source tools. This in situ approach supports interactive exploration of a wide range of results, while still significantly reducing data movement and storage.
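The image-based idea can be made concrete with a minimal, self-contained sketch: render a small image for each combination of output time and view angle while the data is in memory, and write a JSON index that a viewer can query interactively. This is only an illustration of the concept, not the actual framework or its file format; the directory layout, index fields, and toy field below are assumptions.

```python
import json, os
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless rendering, as an in situ pipeline would use
import matplotlib.pyplot as plt

out_dir = "image_db"
os.makedirs(out_dir, exist_ok=True)
times = [0.0, 0.5, 1.0]          # stand-in for simulation output times
angles = [0, 45, 90]             # stand-in for camera azimuths
index = {"parameters": {"time": times, "angle": angles}, "images": []}

x = np.linspace(0, 2 * np.pi, 256)
for t in times:
    field = np.sin(np.outer(x + t, x))        # toy 2-D field standing in for simulation data
    for a in angles:
        view = np.rot90(field, k=a // 45)     # fake "camera angle" by rotating the slice
        name = f"t{t:.1f}_a{a:03d}.png"
        plt.imsave(os.path.join(out_dir, name), view, cmap="viridis")
        index["images"].append({"time": t, "angle": a, "file": name})

with open(os.path.join(out_dir, "index.json"), "w") as f:
    json.dump(index, f, indent=2)
```

A browser-based viewer then only needs the small images and the index, which is what keeps exploration interactive while avoiding movement of the raw data.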
IEEE VGTC Conference on Visualization | 2016
Andrew C. Bauer; Hasan Abbasi; James P. Ahrens; Hank Childs; Berk Geveci; Scott Klasky; Kenneth Moreland; Patrick O'Leary; Venkatram Vishwanath; Brad Whitlock; E.W. Bethel
The considerable interest in the high performance computing (HPC) community in analyzing and visualizing data without first writing it to disk, i.e., in situ processing, is due to several factors. The first is I/O cost savings, where data is analyzed/visualized while being generated, without first storing it to a filesystem. The second is the potential for increased accuracy, where fine temporal sampling of transient analysis might expose complex behavior missed by coarse temporal sampling. The third is the ability to use all available resources, CPUs and accelerators, in the computation of analysis products. This STAR paper brings together researchers, developers and practitioners using in situ methods in extreme-scale HPC with the goal of presenting existing methods, infrastructures, and a range of computational science and engineering applications using in situ analysis and visualization.
Proceedings of the First Workshop on In Situ Infrastructures for Enabling Extreme-Scale Analysis and Visualization | 2015
Utkarsh Ayachit; Andrew C. Bauer; Berk Geveci; Patrick O'Leary; Kenneth Moreland; Nathan D. Fabian; Jeffrey Mauldin
Computer simulations are growing in sophistication and producing results of ever greater fidelity. This trend has been enabled by advances in numerical methods and increasing computing power. Yet these advances come with several costs, including massive increases in data size, difficulties examining output data, challenges in configuring simulation runs, and difficulty debugging running codes. Interactive visualization tools, like ParaView, have been used for post-processing of simulation results. However, increasing data sizes and limited storage and bandwidth make high-fidelity post-processing impractical. In situ analysis is recognized as one of the ways to address these challenges. In situ analysis moves some of the post-processing tasks in line with the simulation code, thus short-circuiting the need to communicate the data between the simulation and analysis via storage. ParaView Catalyst is a data processing and visualization library that enables in situ analysis and visualization. Built on and designed to interoperate with the standard visualization toolkit VTK and the ParaView application, Catalyst enables simulations to intelligently perform analysis, generate relevant output data, and visualize results concurrently with a running simulation. In this paper, we provide an overview of the Catalyst framework and some of its success stories.
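To give a sense of how a simulation drives Catalyst, the sketch below follows the general shape of a legacy Catalyst Python coprocessing script: the simulation's adaptor registers a data channel, a ParaView pipeline reduces it, and small products are written in situ. The exact module layout and callbacks differ across ParaView versions and the newer Catalyst 2 interface, and the filters and output frequency here are illustrative assumptions, so treat this as a sketch rather than a drop-in script.

```python
# Sketch of a legacy-style ParaView Catalyst coprocessing script.
# Requires ParaView's Python modules; details vary across ParaView versions.
from paraview.simple import Slice, Contour   # illustrative choice of filters
from paraview import coprocessing

def CreateCoProcessor():
    def _CreatePipeline(coprocessor, datadescription):
        class Pipeline:
            # "input" is the channel name the simulation adaptor registers its grid under.
            grid = coprocessor.CreateProducer(datadescription, "input")
            # Reduce the full grid to lightweight products while data is still in memory.
            sliced = Slice(Input=grid)
            contoured = Contour(Input=grid)
        return Pipeline()

    class CoProcessor(coprocessing.CoProcessor):
        def CreatePipeline(self, datadescription):
            self.Pipeline = _CreatePipeline(self, datadescription)

    coprocessor = CoProcessor()
    coprocessor.SetUpdateFrequencies({"input": [10]})  # run the pipeline every 10th step
    return coprocessor

coprocessor = CreateCoProcessor()

def RequestDataDescription(datadescription):
    # Called by the simulation adaptor to ask which data is needed this step.
    coprocessor.LoadRequestedData(datadescription)

def DoCoProcessing(datadescription):
    # Called with the current simulation data; writes data products and images in situ.
    coprocessor.UpdateProducers(datadescription)
    coprocessor.WriteData(datadescription)
    coprocessor.WriteImages(datadescription)
```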
Extreme Science and Engineering Discovery Environment | 2013
Homa Karimabadi; Burlen Loring; Patrick O'Leary; Amit Majumdar; Mahidhar Tatineni; Berk Geveci
Petascale simulations have become mission critical in diverse areas of science and engineering. Knowledge discovery from such simulations remains a major challenge and is becoming more urgent as the march towards ultra-scale computing with millions of cores continues. One major issue with the current paradigm of running the simulations and saving the data to disk for post-processing is that it is only feasible to save the data at a small number of time slices. This low temporal resolution of the saved data is a serious handicap in many studies where the time evolution of the system is of principal interest. One way to address this I/O issue is through in-situ visualization strategies. The idea is to minimize data storage by extracting important features of the data and saving them, rather than the raw data, at high temporal resolution. Parallel file systems of current petascale and future exascale systems are expensive shared resources that need to be used effectively, and archival storage can likewise be limited; both benefit from in-situ visualization because it leads to a more intelligent use of storage. In this paper, we present preliminary results from our in-situ visualization for global hybrid (electron fluid, kinetic ions) simulations, which are used to study the interaction of the solar wind with planetary magnetospheres such as those of Earth and Mercury. In particular, we examine the overhead and the effect on code performance of the inline computations required for in-situ visualization.
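The kind of overhead measurement described above often amounts to timing the inline analysis call inside the main loop on every rank and reducing the per-rank totals. A minimal mpi4py sketch of that pattern, with the solver step and the in situ call replaced by placeholders, is shown below; the placeholder functions and frequencies are assumptions, not the paper's code.

```python
import time
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD

def advance_simulation(state):
    # Placeholder for one hybrid-simulation time step.
    return state + 0.001 * np.random.rand(*state.shape)

def in_situ_visualization(state):
    # Placeholder for the inline analysis/rendering call.
    return float(np.mean(state)), float(np.max(state))

state = np.random.rand(100_000)
sim_time = vis_time = 0.0
for step in range(100):
    t0 = time.perf_counter()
    state = advance_simulation(state)
    t1 = time.perf_counter()
    if step % 10 == 0:                 # analyze every 10th step
        in_situ_visualization(state)
    t2 = time.perf_counter()
    sim_time += t1 - t0
    vis_time += t2 - t1

# Overhead is usually reported for the slowest rank, since it sets the pace.
max_vis = comm.allreduce(vis_time, op=MPI.MAX)
max_sim = comm.allreduce(sim_time, op=MPI.MAX)
if comm.rank == 0:
    print(f"in situ overhead: {100.0 * max_vis / (max_sim + max_vis):.1f}% of loop time")
```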
Parallel Computing | 2016
Patrick O'Leary; James P. Ahrens; Sébastien Jourdain; Scott Wittenburg; David H. Rogers; Mark R. Petersen
Highlights: We created an in situ exploration visualization of an MPAS-Ocean simulation. We leveraged compositing in Cinema to provide interactive exploration. We decreased the storage footprint of the analysis and visualization results.
Due to power and I/O constraints associated with extreme scale scientific simulations, in situ analysis and visualization will become a critical component of scientific exploration and discovery. Current analysis and visualization options at extreme scale are presented in opposition: write files to disk for interactive, exploratory analysis, or perform in situ analysis to save data products about phenomena that a scientist knows about in advance. In this paper, we demonstrate extreme scale visualization of MPAS-Ocean simulations leveraging a third option based on Cinema, which is a novel framework for highly interactive, image-based in situ analysis and visualization that promotes exploration.
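Compositing in this context combines separately rendered layers by keeping, at each pixel, the color of the layer closest to the camera, which lets a viewer reassemble scenes from per-object image layers after the fact. The numpy-only sketch below shows that per-pixel depth test with made-up layers standing in for per-object renderings; it is a conceptual illustration, not the Cinema compositing code.

```python
import numpy as np

h, w = 4, 4
# Two pre-rendered "layers": RGB colors plus a per-pixel depth (smaller = closer to camera).
color_a = np.zeros((h, w, 3)); color_a[..., 0] = 1.0   # red layer
depth_a = np.full((h, w), 0.6)
color_b = np.zeros((h, w, 3)); color_b[..., 2] = 1.0   # blue layer
depth_b = np.full((h, w), 0.4)
depth_b[:2, :] = 0.9                                   # blue is behind red in the top half

# Per-pixel depth test: keep whichever layer is nearer.
nearer_a = depth_a < depth_b
composite = np.where(nearer_a[..., None], color_a, color_b)
composite_depth = np.minimum(depth_a, depth_b)
print(composite[..., 0])   # 1.0 where the red layer won the depth test
```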
IEEE International Conference on High Performance Computing Data and Analytics | 2016
Utkarsh Ayachit; Andrew C. Bauer; Earl P. N. Duque; Greg Eisenhauer; Nicola J. Ferrier; Junmin Gu; Kenneth E. Jansen; Burlen Loring; Zarija Lukić; Suresh Menon; Dmitriy Morozov; Patrick O'Leary; Reetesh Ranjan; Michel Rasquin; Christopher P. Stone; Venkatram Vishwanath; Gunther H. Weber; Brad Whitlock; Matthew Wolf; K. John Wu; E. Wes Bethel
A key trend facing extreme-scale computational science is the widening gap between computational and I/O rates, and the challenge that follows is how to best gain insight from simulation data when it is increasingly impractical to save it to persistent storage for subsequent visual exploration and analysis. One approach to this challenge is centered around the idea of in situ processing, where visualization and analysis processing is performed while data is still resident in memory. This paper examines several key design and performance issues related to the idea of in situ processing at extreme scale on modern platforms: scalability, overhead, performance measurement and analysis, comparison and contrast with a traditional post hoc approach, and interfacing with simulation codes. We illustrate these principles in practice with studies, conducted on large-scale HPC platforms, that include a miniapplication and multiple science application codes, one of which demonstrates in situ methods in use at greater than 1M-way concurrency.
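One recurring design issue named above is the interface between simulation codes and in situ infrastructure; most libraries in this space reduce it to an initialize/execute/finalize adaptor that the simulation loop calls each step. The sketch below shows that generic pattern only; the class and method names are illustrative assumptions, not any particular library's API.

```python
import numpy as np

class InSituAdaptor:
    """Generic initialize/execute/finalize pattern used by in situ interfaces."""

    def initialize(self, analysis_frequency=10):
        self.analysis_frequency = analysis_frequency
        self.results = []

    def execute(self, step, sim_time, field):
        # Only touch the data on the steps the analyst asked for.
        if step % self.analysis_frequency != 0:
            return
        # Stand-in for real analysis/visualization on in-memory data.
        self.results.append((step, sim_time, float(field.mean()), float(field.max())))

    def finalize(self):
        # Persist only the small derived products, not the raw field.
        np.savetxt("in_situ_summary.csv", np.array(self.results),
                   header="step time mean max", comments="")

# Miniapplication-style driver loop.
adaptor = InSituAdaptor()
adaptor.initialize(analysis_frequency=5)
field = np.random.rand(64, 64)
for step in range(50):
    field = 0.99 * field + 0.01 * np.random.rand(64, 64)   # fake solver update
    adaptor.execute(step, step * 0.01, field)
adaptor.finalize()
```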
IEEE International Conference on Cloud Computing Technology and Science | 2015
Patrick O'Leary; Mark A. Christon; Sébastien Jourdain; Chris Harris; Markus Berndt; Andrew C. Bauer
Advanced modeling and simulation has enabled the design of a variety of innovative products and the analysis of numerous complex phenomena. However, significant barriers exist to widespread adoption of these tools. In particular, advanced modeling and simulation: (1) is considered complex to use, (2) needs in-house expertise, and (3) requires high capital costs. In this paper, we describe the development of an end-to-end, advanced modeling and simulation cloud platform that encapsulates best practices for scientific computing in the cloud, and demonstrate it using Hydra-TH as a prototypical application. As an alternative to traditional advanced modeling and simulation workflows, our Web-based approach simplifies the processes, decreases the need for in-house computational science and engineering experts, and lowers the capital investment. In addition to providing significantly improved, intuitive software, the environment offers reproducible workflows where the full lifecycle of data, from input to final analyzed results, can be saved, shared, and even published.
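From a user's perspective, an end-to-end workflow like this reduces to a web API that accepts an input deck, launches the solver on cloud resources, and returns results for browser-based post-processing. The sketch below shows only that shape; every endpoint, field name, and file name is a hypothetical placeholder and not the platform's actual API.

```python
import time
import requests

BASE = "https://example.invalid/api"   # hypothetical service endpoint

# 1. Upload the simulation input deck (control file, mesh, etc.).
with open("hydra_input.cntl", "rb") as f:
    upload = requests.post(f"{BASE}/files", files={"file": f}).json()

# 2. Ask the platform to run the solver on a chosen cloud machine profile.
job = requests.post(f"{BASE}/jobs", json={
    "solver": "hydra-th",
    "input_file": upload["id"],
    "cores": 64,
}).json()

# 3. Poll until the run finishes, then fetch the result listing for visualization.
while True:
    status = requests.get(f"{BASE}/jobs/{job['id']}").json()["status"]
    if status in ("finished", "failed"):
        break
    time.sleep(10)

results = requests.get(f"{BASE}/jobs/{job['id']}/results").json()
```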
IEEE Virtual Reality Conference | 2017
Patrick O'Leary; Sankhesh Jhaveri; Aashish Chaudhary; William R. Sherman; Ken Martin; David Lonie; Eric T. Whiting; James H. Money; Sandy McKenzie
Modern scientific, engineering and medical computational simulations, as well as experimental and observational data sensing/measuring devices, produce enormous amounts of data. While statistical analysis provides insight into this data, scientific visualization is tactically important for scientific discovery, product design and data analysis. These benefits are impeded, however, when scientific visualization algorithms are implemented from scratch, a time-consuming and redundant process in immersive application development. This process can greatly benefit from leveraging the state-of-the-art open-source Visualization Toolkit (VTK) and its community. Over the past two (almost three) decades, integrating VTK with a virtual reality (VR) environment has been attempted with varying degrees of success. In this paper, we demonstrate two new approaches that simplify this amalgamation of an immersive interface with visualization rendering from VTK. In addition, we cover several enhancements to VTK that provide near real-time updates and efficient interaction. Finally, we demonstrate the combination of VTK with both Vrui and OpenVR immersive environments in example applications.
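The reuse argument can be made concrete with a small example: a standard VTK pipeline stays the same, and only the render window, renderer, and interactor classes are swapped for VR-capable counterparts. A sketch is below; the OpenVR class names follow VTK's optional OpenVR module, whose availability depends on how VTK was built, and the fallback logic is an assumption for illustration.

```python
# Standard VTK pipeline; only the window/renderer/interactor change for VR.
import vtk

source = vtk.vtkSphereSource()
source.SetThetaResolution(32)
source.SetPhiResolution(32)

mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(source.GetOutputPort())

actor = vtk.vtkActor()
actor.SetMapper(mapper)

use_vr = hasattr(vtk, "vtkOpenVRRenderWindow")   # OpenVR support is a build option
if use_vr:
    renderer = vtk.vtkOpenVRRenderer()
    window = vtk.vtkOpenVRRenderWindow()
    interactor = vtk.vtkOpenVRRenderWindowInteractor()
else:
    renderer = vtk.vtkRenderer()
    window = vtk.vtkRenderWindow()
    interactor = vtk.vtkRenderWindowInteractor()

window.AddRenderer(renderer)
interactor.SetRenderWindow(window)
renderer.AddActor(actor)

interactor.Initialize()
window.Render()
interactor.Start()
```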
Proceedings of the In Situ Infrastructures on Enabling Extreme-Scale Analysis and Visualization | 2017
David S. Thompson; Sébastien Jourdain; Andrew C. Bauer; Berk Geveci; Robert Maynard; Ranga Raju Vatsavai; Patrick O'Leary
Summarization and compression at current and future scales require a framework for developing and benchmarking algorithms. We present a framework created by integrating existing, production-ready projects and provide timings of two particular algorithms that serve as exemplars for summarization: a wavelet-based data reduction filter and a generator for creating image-like databases of extracted features (isocontours in this case). Both support browser-based, post hoc, interactive visualization of the summary for decision-making. A study of their weak scaling on a distributed multi-GPU system is included.
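As a toy version of the wavelet-based data-reduction exemplar, the sketch below applies a single-level Haar transform to a 1-D signal, drops small detail coefficients, and reconstructs. The filter described in the paper operates on large simulation fields with production libraries on GPUs; this numpy-only version is meant solely to show the transform-threshold-reconstruct idea.

```python
import numpy as np

def haar_forward(x):
    """Single-level Haar transform: averages (approximation) and differences (detail)."""
    pairs = x.reshape(-1, 2)
    approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)
    detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)
    return approx, detail

def haar_inverse(approx, detail):
    x = np.empty(2 * approx.size)
    x[0::2] = (approx + detail) / np.sqrt(2.0)
    x[1::2] = (approx - detail) / np.sqrt(2.0)
    return x

signal = np.sin(np.linspace(0, 8 * np.pi, 1024)) + 0.05 * np.random.randn(1024)
approx, detail = haar_forward(signal)

# Data reduction: zero out detail coefficients below a threshold; only the rest need storing.
threshold = 0.1
kept = np.abs(detail) >= threshold
reduced_detail = np.where(kept, detail, 0.0)
reconstruction = haar_inverse(approx, reduced_detail)

print(f"kept {kept.sum()} of {detail.size} detail coefficients, "
      f"max error {np.abs(reconstruction - signal).max():.3f}")
```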
arXiv: Software Engineering | 2013
Marcus D. Hanwell; Amitha Perera; Wes Turner; Patrick O'Leary; Katie Osterdahl; Bill Hoffman; Will Schroeder