
Publications


Featured research published by Laura Monroe.


high performance graphics | 2011

Randomized selection on the GPU

Laura Monroe; Joanne Wendelberger; Sarah Michalak

We implement here a fast and memory-sparing probabilistic top-k selection algorithm on the GPU. The algorithm proceeds via an iterative probabilistic guess-and-check process on pivots for a three-way partition. When the guess is correct, the problem is reduced to selection on a much smaller set. This probabilistic algorithm always gives a correct result and always terminates. Las Vegas algorithms of this kind are a form of stochastic optimization and can be well suited to more general parallel processors with limited amounts of fast memory.
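The guess-and-check loop described above can be sketched in a few lines. This is an illustrative serial Python version under our own assumptions (the function name and pivot heuristic are ours, not the paper's GPU implementation):

```python
import random

def randomized_select(values, k):
    """Return the k-th smallest element (1-indexed) of values.

    Las Vegas strategy: guess a pivot, three-way partition, and
    narrow the search to whichever part must contain the answer.
    The result is always correct; randomness only affects runtime.
    """
    assert 1 <= k <= len(values)
    while True:
        pivot = random.choice(values)          # probabilistic guess
        less = [v for v in values if v < pivot]
        equal = [v for v in values if v == pivot]
        if k <= len(less):                     # answer lies in the low part
            values = less
        elif k <= len(less) + len(equal):      # the pivot itself is the answer
            return pivot
        else:                                  # answer lies in the high part
            k -= len(less) + len(equal)
            values = [v for v in values if v > pivot]
```

Because incorrect pivot guesses only shrink the candidate set rather than produce a wrong answer, every run terminates with the correct element, as the abstract notes.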


IEEE Transactions on Visualization and Computer Graphics | 2007

NPU-Based Image Compositing in a Distributed Visualization System

David L. Pugmire; Laura Monroe; Carolyn Connor Davenport; Andrew J. DuBois; David H. DuBois; Stephen W. Poole

This paper describes the first use of a network processing unit (NPU) to perform hardware-based image composition in a distributed rendering system. The image composition step is a notorious bottleneck in a clustered rendering system. Furthermore, image compositing algorithms do not necessarily scale as data size and number of nodes increase. Previous researchers have addressed the composition problem via software and/or custom-built hardware. We used the heterogeneous multicore computation architecture of the Intel IXP28XX NPU, a fully programmable commercial off-the-shelf (COTS) technology, to perform the image composition step. With this design, we have attained a nearly fourfold performance increase over traditional software-based compositing methods, achieving sustained compositing rates of 22-28 fps on a 1,024×1,024 image. This system is fully scalable with a negligible penalty in frame rate, is entirely COTS, and is flexible with regard to operating system, rendering software, graphics cards, and node architecture. The NPU-based compositor has the additional advantage of being a modular compositing component that is eminently suitable for integration into existing distributed software visualization packages.
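The per-pixel operation at the heart of sort-last depth compositing, whatever hardware runs it, is a z-compare that keeps the fragment closest to the viewer. A minimal NumPy sketch, with array names of our own choosing (this is a generic illustration, not the NPU implementation from the paper):

```python
import numpy as np

def depth_composite(color_a, depth_a, color_b, depth_b):
    """Merge two partial renderings pixel by pixel, keeping the
    fragment closest to the viewer (smaller depth value wins)."""
    closer = depth_a <= depth_b                    # boolean mask per pixel
    color = np.where(closer[..., None], color_a, color_b)
    depth = np.where(closer, depth_a, depth_b)
    return color, depth
```

In a cluster, each node renders its portion of the data, and pairs of nodes repeatedly merge their color/depth buffers this way until one final image remains; that merge step is what the NPU accelerates here.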


european conference on parallel processing | 2013

GPU Behavior on a Large HPC Cluster

Nathan DeBardeleben; Sean Blanchard; Laura Monroe; Philip Romero; Daryl Grunau; Craig Idler; Cornell Wright

We discuss observed characteristics of GPUs deployed as accelerators in an HPC cluster at Los Alamos National Laboratory. GPUs have a very good theoretical FLOPS rate, and are reasonably inexpensive and available, but they are relatively new to HPC, which demands both consistently high performance across nodes and a consistently low error rate.


IEEE Transactions on Nuclear Science | 2014

Modern GPUs Radiation Sensitivity Evaluation and Mitigation Through Duplication With Comparison

Daniel Oliveira; Paolo Rech; Heather Quinn; Thomas D. Fairbanks; Laura Monroe; Sarah Michalak; Christine M. Anderson-Cook; Philippe Olivier Alexandre Navaux; Luigi Carro

Graphics processing units (GPUs) are increasingly common in both safety-critical and high-performance computing (HPC) applications. Some current supercomputers are composed of thousands of GPUs, so the probability of device corruption becomes very high. Moreover, the GPUs' parallel capabilities are very attractive for the automotive and aerospace markets, where reliability is a serious concern. In this paper, the neutron sensitivity of modern GPU caches and internal resources is experimentally evaluated. Various Duplication With Comparison strategies to reduce GPU radiation sensitivity are then presented and validated through radiation experiments. Threads should be carefully duplicated to avoid undesired errors on shared resources and to avoid the exacerbation of errors in critical resources such as the scheduler.
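The core of Duplication With Comparison is simple to express: run the computation twice on the same input and flag any mismatch as a silent data corruption. A schematic host-side sketch under our own naming (the paper's actual strategies duplicate threads on the GPU, with the careful resource separation noted above):

```python
def run_with_dwc(compute, data):
    """Duplication With Comparison: execute the same computation
    twice and compare the outputs. A mismatch signals that a
    silent data corruption occurred in one of the two runs."""
    first = compute(data)
    second = compute(data)
    if first != second:
        raise RuntimeError("DWC mismatch: silent data corruption detected")
    return first
```

Note that DWC detects errors but cannot by itself tell which copy was corrupted; recovery typically requires re-execution or a third vote.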


IEEE Transactions on Visualization and Computer Graphics | 2012

Visual Data Analysis as an Integral Part of Environmental Management

Meyer J; E.W. Bethel; Horsman Jl; Hubbard Ss; Harinarayan Krishnan; Romosan A; Keating Eh; Laura Monroe; Strelitz R; Moore P; Taylor G; Torkian B; Johnson Tc; Gorton I

The U.S. Department of Energy's (DOE) Office of Environmental Management (DOE/EM) currently supports an effort to understand and predict the fate of nuclear contaminants and their transport in natural and engineered systems. Geologists, hydrologists, physicists and computer scientists are working together to create models of existing nuclear waste sites, to simulate their behavior and to extrapolate it into the future. We use visualization as an integral part of each step in this process. In the first step, visualization is used to verify model setup and to estimate critical parameters. High-performance computing simulations of contaminant transport produce massive amounts of data, which are then analyzed using visualization software specifically designed for parallel processing of large amounts of structured and unstructured data. Finally, simulation results are validated by comparing them to measured current and historical field data. We describe in this article how visual analysis is used as an integral part of the decision-making process in the planning of ongoing and future treatment options for the contaminated nuclear waste sites. Lessons learned from visually analyzing our large-scale simulation runs will also have an impact on decisions about treatment measures for other contaminated sites.


ieee virtual reality conference | 2006

La Cueva Grande: a 43-Megapixel Immersive System

Curt Canada; Tim Harrington; Robert Kares; Dave Modl; Laura Monroe; Steve Stringer

Los Alamos National Laboratory (LANL) has deployed a 43-megapixel multi-panel immersive environment, La Cueva Grande (LCG), to be used in visualizing the terabytes of data produced by simulations. This paper briefly discusses some of the technical challenges encountered and overcome during the deployment of this 43-million-pixel immersive visualization environment.


european dependable computing conference | 2017

Resilience Analysis of Top K Selection Algorithms

Ryan Slechta; Laura Monroe; Nathan DeBardeleben; Qiang Guan; Joanne Wendelberger; Sarah Michalak

As the number of components in high-performance computing (HPC) systems continues to grow, the number of vehicles for soft errors will rise in parallel. Petascale research has shown that soft errors on supercomputers can occur as frequently as multiple times per day, and this rate will only increase with the exascale era. Due to this frequency, the resilience community has taken an interest in algorithmic resilience as a means for reliable computing in faulty environments. Probabilistic algorithms in particular have generated interest, due to their imprecise nature and ability to handle incorrect guesses. In this paper, we analyze the intrinsic resilience of a probabilistic Top K selection algorithm to silent data corruption in the event of a single event upset. We introduce a new paradigm of analytically quantifying an algorithm's resilience as a function of its inputs, which permits a precise comparison of the resilience of competing algorithms. In addition, we discuss the implications of our findings on the resilience of probabilistic algorithms as a whole in comparison to their deterministic counterparts.
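One way to probe this kind of intrinsic resilience empirically is to flip a single bit in one input value and check whether the selection result changes. A toy single-event-upset experiment of our own (the paper's contribution is an analytical framework, not this brute-force check):

```python
def flip_bit(value, bit):
    """Model a single event upset: flip one bit of a non-negative integer."""
    return value ^ (1 << bit)

def upset_changes_result(values, k, index, bit):
    """Return True if corrupting values[index] with a single bit flip
    changes the k-th smallest element (1-indexed) of the input."""
    clean = sorted(values)[k - 1]
    corrupted = list(values)
    corrupted[index] = flip_bit(corrupted[index], bit)
    return sorted(corrupted)[k - 1] != clean
```

Sweeping `index` and `bit` over all inputs gives an empirical estimate of how often an upset is masked, which is exactly the quantity an analytical resilience model aims to predict as a function of the input.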


IEEE Transactions on Nuclear Science | 2017

Experimental and Analytical Analysis of Sorting Algorithms Error Criticality for HPC and Large Servers Applications

Caio Lunardi; Heather Quinn; Laura Monroe; Daniel Oliveira; Philippe Olivier Alexandre Navaux; Paolo Rech

In this paper, we investigate neutron-induced errors in three implementations of sorting algorithms (QuickSort, MergeSort, and RadixSort) executed on modern graphics processing units designed for high-performance computing and large server applications. We measure the radiation-induced error rate of the sorting algorithms using the neutron beam available at the Los Alamos Neutron Science Center facility. We also analyze output error criticality by identifying specific output error patterns. We found that radiation can cause wrong elements to appear in the sorted array and misalign values, as well as cause application crashes or system hangs. This paper presents results showing that the criticality of the radiation-induced output error pattern depends on the application. Additionally, an extensive fault-injection campaign has been performed, which allows for a better understanding of the observed phenomena. We take advantage of the SASS-assembly Instrumented Fault Injector developed by NVIDIA, which can inject faults into all of the user-accessible architectural state. Comparing fault-injection results with radiation experiment data shows that not all of the output errors observed under radiation can be replicated in fault injection. However, fault injection is useful in identifying possible root causes of the output errors observed in radiation testing. Finally, we use our experimental and analytical study to design efficient, experimentally tuned hardening strategies. We detect the error patterns that are critical to the final application and find the most efficient way to detect them. With an overhead as low as 16% of the execution time, we are able to reduce the output error rate of sort by about one order of magnitude.
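A low-overhead detector in the spirit of the hardening strategies described above is a single post-hoc scan of the output: one O(n) pass catches the "misaligned value" error pattern, though not a corrupted element that happens to land in order. This sketch is our own illustration, not the paper's tuned strategy:

```python
def sort_output_is_ordered(output):
    """Single O(n) pass over a supposedly sorted sequence.
    Flags out-of-order (misaligned) elements, but cannot catch a
    corrupted value that still preserves the sorted order."""
    return all(a <= b for a, b in zip(output, output[1:]))
```

The gap this check leaves (order-preserving wrong values) is one reason output error patterns must be classified experimentally before choosing a detector, as the paper does.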


european conference on parallel processing | 2016

On the Inherent Resilience of Integer Operations

Laura Monroe; William M. Jones; Scott R. Lavigne; Claude H. Davis; Qiang Guan; Nathan DeBardeleben

It is of great interest to correctly quantify corruption rates in computing systems. Masking effects of individual operations can complicate this effort by hiding faults. Beyond this, identification of fault-masking operations may be useful in designing resilient algorithms.
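Masking is easy to see in a concrete case: a flipped bit in one operand of a bitwise AND is invisible whenever the other operand has a zero in that position. A toy illustration of our own, not an example from the paper:

```python
def bit_flip_masked_by_and(a, b, bit):
    """Return True if flipping `bit` of `a` leaves a & b unchanged,
    i.e. the AND operation masks the fault entirely."""
    faulty_a = a ^ (1 << bit)
    return (faulty_a & b) == (a & b)
```

Operations like AND with sparse operands, multiplication by zero, or truncating shifts can silently absorb faults this way, which is why raw device fault rates overestimate observed output corruption rates.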


Proceedings of SPIE | 2013

Art, science, and immersion: data-driven experiences

Ruth West; Laura Monroe; Jacquelyn Ford Morie; Julieta C. Aguilera

This panel and dialog-paper explores the potentials at the intersection of art, science, immersion and highly dimensional, “big” data to create new forms of engagement, insight and cultural forms. We will address questions such as: “What kinds of research questions can be identified at the intersection of art + science + immersive environments that can’t be expressed otherwise?” “How is art+science+immersion distinct from state-of-the-art visualization?” “What does working with immersive environments and visualization offer that other approaches don’t or can’t?” “Where does immersion fall short?” We will also explore current trends in the application of immersion for gaming, scientific data, entertainment, simulation, social media and other new forms of big data. We ask what expressive, arts-based approaches can contribute to these forms in the broad cultural landscape of immersive technologies.

Collaboration


Dive into Laura Monroe's collaborations.

Top Co-Authors

Sarah Michalak, Los Alamos National Laboratory
Nathan DeBardeleben, Los Alamos National Laboratory
Qiang Guan, Los Alamos National Laboratory
Heather Quinn, Los Alamos National Laboratory
Joanne Wendelberger, Los Alamos National Laboratory
Sean Blanchard, Los Alamos National Laboratory
David L. Pugmire, Los Alamos National Laboratory
Panruo Wu, University of California
Daniel Oliveira, Universidade Federal do Rio Grande do Sul
Paolo Rech, Universidade Federal do Rio Grande do Sul