Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Robert Granat is active.

Publication


Featured research published by Robert Granat.


Bulletin of the Seismological Society of America | 2005

Real-time Earthquake Location Using Kirchhoff Reconstruction

Teresa Baker; Robert Granat; Robert W. Clayton

Real-time location of earthquakes can be achieved by using direct imaging of the recorded wave field based on a Kirchhoff reconstruction method similar to that used in the migration of seismic reflection data. The standard method of event location requires the wave arrival at each sensor to be picked and associated with an event. By using direct imaging, the event is identified once in the imaged wave field. The computation is independent of the level of seismic activity and can be carried out on a typical desktop computer. The procedure has been successfully demonstrated in two and three dimensions using data from the Southern California Seismic Network (TriNet). At higher resolutions, the reconstruction method can identify finite source effects. Further work considers extending the method by implementing full elastic theory and solving for moment tensors at all locations in the mesh.
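A minimal sketch of the underlying imaging idea: stack station records along predicted travel-time moveouts over a grid of candidate source locations, so that the stack peaks where the shifted traces align. The constant-velocity travel times, the grid, and all variable names here are illustrative assumptions, not the paper's Kirchhoff implementation.

```python
import numpy as np

def backproject(waveforms, station_xy, grid_xy, dt, v):
    """Delay-and-sum imaging of candidate source locations.

    waveforms : (n_stations, n_samples) recorded traces
    station_xy: (n_stations, 2) station coordinates [km]
    grid_xy   : (n_grid, 2) candidate source locations [km]
    dt        : sample interval [s]
    v         : assumed constant wave speed [km/s]
    """
    env = np.abs(waveforms)                               # crude amplitude envelope
    dist = np.linalg.norm(grid_xy[:, None, :] - station_xy[None, :, :], axis=2)
    shifts = np.round(dist / v / dt).astype(int)          # travel time in samples
    n_t0 = waveforms.shape[1] - int(shifts.max())         # candidate origin-time samples
    image = np.zeros((len(grid_xy), n_t0))
    for g in range(len(grid_xy)):
        for s in range(len(station_xy)):
            image[g] += env[s, shifts[g, s]:shifts[g, s] + n_t0]
    # the (grid node, origin time) cell with the largest stack marks the event
    return image
```

Because the stack is formed directly from the wave field, no per-station phase picking or pick-to-event association is needed, which is what makes the approach attractive for real-time use.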


IEEE Transactions on Computers | 2003

Tests and tolerances for high-performance software-implemented fault detection

Michael J. Turmon; Robert Granat; Daniel S. Katz; John Z. Lou

We describe and test a software approach to fault detection in common numerical algorithms. Such result checking or algorithm-based fault tolerance (ABFT) methods may be used, for example, to overcome single-event upsets in computational hardware or to detect errors in complex, high-efficiency implementations of the algorithms. Following earlier work, we use checksum methods to validate results returned by a numerical subroutine operating subject to unpredictable errors in data. We consider common matrix and Fourier algorithms which return results satisfying a necessary condition having a linear form; the checksum tests compliance with this condition. We discuss the theory and practice of setting numerical tolerances to separate errors caused by a fault from those inherent in finite-precision floating-point calculations. We concentrate on comprehensively defining and evaluating tests having various accuracy/computational burden tradeoffs, and we emphasize average-case algorithm behavior rather than using worst-case upper bounds on error.
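To illustrate the linear checksum idea, here is a minimal sketch for matrix multiplication; the weight vector, tolerance, and scaling are assumptions chosen for the example, not the paper's calibrated tests.

```python
import numpy as np

def abft_matmul_check(A, B, C, rel_tol=1e-8):
    """Checksum test for C = A @ B.

    The linear necessary condition C @ w == A @ (B @ w) holds for any
    weight vector w; a fault that corrupts C generally violates it.
    The tolerance is scaled by the data magnitude so that ordinary
    floating-point rounding does not trigger a false alarm.
    """
    w = np.random.default_rng(0).standard_normal(B.shape[1])
    lhs = C @ w
    rhs = A @ (B @ w)
    scale = np.abs(A) @ (np.abs(B) @ np.abs(w)) + np.finfo(C.dtype).tiny
    return bool(np.all(np.abs(lhs - rhs) <= rel_tol * scale))

A = np.random.rand(200, 300)
B = np.random.rand(300, 100)
C = A @ B
print(abft_matmul_check(A, B, C))   # True: clean result passes
C[17, 5] += 1e3                     # inject a single-element upset
print(abft_matmul_check(A, B, C))   # False: fault detected
```

Choosing the tolerance is the crux of the paper: too tight and rounding error raises false alarms, too loose and small faults slip through.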


dependable systems and networks | 2000

Software-implemented fault detection for high-performance space applications

Michael J. Turmon; Robert Granat; Daniel S. Katz

We describe and test a software approach to overcoming radiation-induced errors in spaceborne applications running on commercial off-the-shelf components. The approach uses checksum methods to validate results returned by a numerical subroutine operating subject to unpredictable errors in data. We can treat subroutines that return results satisfying a necessary condition having a linear form; the checksum tests compliance with this condition. We discuss the theory and practice of setting numerical tolerances to separate errors caused by a fault from those inherent in finite-precision numerical calculations. We test both the general effectiveness of the linear fault tolerant schemes we propose, and the correct behavior of our parallel implementation of them.
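The same linear-condition idea applies to Fourier transforms. The sketch below checks one such condition, that the zero-frequency bin of a DFT equals the sum of the input samples; this single functional is only one of the checks a full test suite would use, and the tolerance is an illustrative assumption.

```python
import numpy as np

def fft_zero_bin_check(x, X, rel_tol=1e-10):
    """One linear necessary condition for X = DFT(x): X[0] == sum(x).

    The tolerance is scaled by sum(|x|) so that rounding differences
    between the two summation orders are not mistaken for faults.
    """
    scale = np.sum(np.abs(x)) + np.finfo(float).tiny
    return abs(X[0] - np.sum(x)) <= rel_tol * scale

x = np.random.default_rng(1).standard_normal(4096)
X = np.fft.fft(x)
print(fft_zero_bin_check(x, X))   # True: clean transform passes
X[0] += 1e-3                      # simulate a radiation-induced upset
print(fft_zero_bin_check(x, X))   # False: fault detected
```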


Pure and Applied Geophysics | 2006

iSERVO: Implementing the International Solid Earth Research Virtual Observatory by Integrating Computational Grid and Geographical Information Web Services

Mehmet S. Aktas; Galip Aydin; Andrea Donnellan; Geoffrey C. Fox; Robert Granat; Lisa B. Grant; Greg Lyzenga; Dennis McLeod; Shrideep Pallickara; Jay Parker; Marlon E. Pierce; John B. Rundle; Ahmet Sayar; Terry E. Tullis

We describe the goals and initial implementation of the International Solid Earth Virtual Observatory (iSERVO). This system is built using a Web Services approach to Grid computing infrastructure and is accessed via a component-based Web portal user interface. We describe our implementations of services used by this system, including Geographical Information System (GIS)-based data grid services for accessing remote data repositories and job management services for controlling multiple execution steps. iSERVO is an example of a larger trend to build globally scalable scientific computing infrastructures using the Service Oriented Architecture approach. Adoption of this approach raises a number of research challenges in millisecond-latency message systems suitable for internet-enabled scientific applications. We review our research in these areas.


arXiv: Astrophysics | 2001

Exploration of parameter spaces in a virtual observatory

S. George Djorgovski; Ashish A. Mahabal; Robert J. Brunner; Roy Williams; Robert Granat; David W. Curkendall; Joseph C. Jacob; Paul Stolorz

Like every other field of intellectual endeavor, astronomy is being revolutionized by the advances in information technology. There is an ongoing exponential growth in the volume, quality, and complexity of astronomical data sets, mainly through large digital sky surveys and archives. The Virtual Observatory (VO) concept represents a scientific and technological framework needed to cope with this data flood. Systematic exploration of the observable parameter spaces, covered by large digital sky surveys spanning a range of wavelengths, will be one of the primary modes of research with a VO. This is where the truly new discoveries will be made, and new insights will be gained about the already known astronomical objects and phenomena. We review some of the methodological challenges posed by the analysis of large and complex data sets expected in the VO-based research. The challenges are driven both by the size and the complexity of the data sets (billions of data vectors in parameter spaces of tens or hundreds of dimensions), by the heterogeneity of the data and measurement errors, including differences in basic survey parameters for the federated data sets (e.g., in the positional accuracy and resolution, wavelength coverage, time baseline, etc.), various selection effects, as well as the intrinsic clustering properties (functional form, topology) of the data distributions in the parameter spaces of observed attributes. Answering these challenges will require substantial collaborative efforts and partnerships between astronomers, computer scientists, and statisticians.


Pure and Applied Geophysics | 2006

QuakeSim and the Solid Earth Research Virtual Observatory

Andrea Donnellan; John B. Rundle; Geoffrey C. Fox; Dennis McLeod; Lisa B. Grant; Terry E. Tullis; Marlon E. Pierce; Jay Parker; Greg Lyzenga; Robert Granat; M. T. Glasscoe

We are developing simulation and analysis tools in order to develop a solid Earth Science framework for understanding and studying active tectonic and earthquake processes. The goal of QuakeSim and its extension, the Solid Earth Research Virtual Observatory (SERVO), is to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We are developing clearly defined accessible data formats and code protocols as inputs to simulations, which are adapted to high-performance computers. The solid Earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes. We are using Web (Grid) service technology to demonstrate the assimilation of multiple distributed data sources (a typical data grid problem) into a major parallel high-performance computing earthquake forecasting code. Such a linkage of Geoinformatics with Geocomplexity demonstrates the value of the Solid Earth Research Virtual Observatory (SERVO) Grid concept, and advances Grid technology by building the first real-time large-scale data assimilation grid.


cluster computing and the grid | 2005

A scripting based architecture for management of streams and services in real-time grid applications

Harshawardhan Gadgil; Geoffrey C. Fox; Shrideep Pallickara; Marlon E. Pierce; Robert Granat

Recent specifications such as WS-Management and WS-Distributed Management have stressed the importance of managing resources and services, and they propose methods for querying Web services to gather the metadata associated with those services. Management often entails system setup, querying system metadata, manipulating system parameters at runtime, and taking actions based on those parameters to tune system performance. Real-time applications require rapid deployment of application components and demand results in real time. In this paper we present HPSearch, a system that enables dynamic management of both streams and Web services, as well as rapid deployment of applications, via a scripting interface. We illustrate the functioning of the system by modeling a data streaming application and rapidly deploying the system and application components.


computational intelligence and data mining | 2007

Analysis of streaming GPS measurements of surface displacement through a web services environment

Robert Granat; Galip Aydin; Marlon E. Pierce; Zhigang Qi; Yehuda Bock

We present a method for performing mode classification of real-time streams of GPS surface position data. Our approach has two parts: an algorithm for robust, unconstrained fitting of hidden Markov models (HMMs) to continuous-valued time series, and SensorGrid technology that manages data streams through a series of filters coupled with a publish/subscribe messaging system. The SensorGrid framework enables strong connections between data sources, the HMM time series analysis software, and users. We demonstrate our approach through a Web portal environment in which users can easily access data from the SCIGN and SOPAC GPS networks in Southern California, apply the analysis method, and view results. Ongoing real-time mode classifications of streaming GPS data are displayed in a map-based visualization interface.
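A minimal sketch of the mode-classification step, using hmmlearn's GaussianHMM in place of the authors' robust fitting code; the synthetic displacement series and the number of modes are assumptions for illustration.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM   # stand-in for the custom HMM fitter

# Synthetic east/north/up displacement stream for one station,
# shape (n_epochs, 3): a quiet period, an offset, then a new quiet level.
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(0.0, 1.0, size=(500, 3)),
    rng.normal(5.0, 1.0, size=(200, 3)),
    rng.normal(10.0, 1.0, size=(500, 3)),
])

# Fit an HMM with a few hidden "modes" and decode the per-epoch mode.
model = GaussianHMM(n_components=3, covariance_type="full", n_iter=200)
model.fit(X)
modes = model.predict(X)                       # Viterbi mode sequence

# A mode change flags a statistically distinct regime in the stream.
change_epochs = np.flatnonzero(np.diff(modes) != 0) + 1
print(change_epochs)                           # expected near epochs 500 and 700
```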


international conference on computational science | 2003

A method of hidden Markov model optimization for use with geophysical data sets

Robert Granat

Geophysics research has been faced with a growing need for automated techniques with which to process large quantities of data. A successful tool must meet a number of requirements: it should be consistent, require minimal parameter tuning, and produce scientifically meaningful results in reasonable time. We introduce a hidden Markov model (HMM)-based method for analysis of geophysical data sets that attempts to address these issues. Our method improves on standard HMM methods and is based on the systematic analysis of structural local maxima of the HMM objective function. Preliminary results of the method as applied to geodetic and seismic records are presented.
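For contrast, here is a sketch of the standard baseline the paper improves on: plain EM (Baum-Welch) run from several random initializations, keeping the best local maximum by log-likelihood. The use of hmmlearn and the restart count are assumptions; this is not the structural-local-maxima analysis described above.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

def fit_hmm_best_of_restarts(X, n_states=3, n_restarts=10):
    """Standard baseline: EM converges only to a local maximum of the
    HMM likelihood, so run it from several random seeds and keep the
    model with the highest log-likelihood on the training data."""
    best_model, best_ll = None, -np.inf
    for seed in range(n_restarts):
        model = GaussianHMM(n_components=n_states, covariance_type="full",
                            n_iter=100, random_state=seed)
        model.fit(X)
        ll = model.score(X)          # total log-likelihood under this fit
        if ll > best_ll:
            best_model, best_ll = model, ll
    return best_model, best_ll
```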


Pure and Applied Geophysics | 2006

Detecting regional events via statistical analysis of geodetic networks

Robert Granat

We present an application of hidden Markov models (HMMs) to analysis of geodetic time series in Southern California. Our model-fitting method uses a regularized version of the deterministic annealing expectation-maximization algorithm to ensure that model solutions are both robust and of high quality. Using the fitted models, we segment the daily displacement time series collected by 127 stations of the Southern California Integrated Geodetic Network (SCIGN) over a two-year period. Segmentations of the series are based on statistical changes as identified by the trained HMMs. We look for correlations in state changes across multiple stations that indicate region-wide activity. We find that, although in one case a strong seismic event was associated with a spike in station correlations, in all other cases in the study time period strong correlations were not associated with any seismic event. This indicates that the method was able to identify more subtle signals associated with aseismic events or long-range interactions between smaller events.
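A minimal sketch of the cross-station step: given per-station state sequences from the fitted HMMs, count how many stations change state around each day. The windowing and the simple count are illustrative assumptions rather than the paper's exact correlation statistic.

```python
import numpy as np

def coincident_state_changes(state_seqs, window=1):
    """Count stations changing HMM state near each day.

    state_seqs: (n_stations, n_days) integer state sequences, one row
    per station. Days where many stations change state together are
    candidates for region-wide (possibly aseismic) activity.
    """
    state_seqs = np.asarray(state_seqs)
    changes = (np.diff(state_seqs, axis=1) != 0)     # (n_stations, n_days - 1)
    counts = changes.sum(axis=0).astype(float)
    if window > 0:                                   # allow near-coincident changes
        counts = np.convolve(counts, np.ones(2 * window + 1), mode="same")
    return counts
```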

Collaboration


Dive into Robert Granat's collaborations.

Top Co-Authors

Jay Parker, California Institute of Technology
Andrea Donnellan, California Institute of Technology
John B. Rundle, University of California
Geoffrey C. Fox, Jet Propulsion Laboratory
M. T. Glasscoe, California Institute of Technology
Dennis McLeod, University of Southern California
Sharon Kedar, California Institute of Technology
Yehuda Bock, University of California