Ariel Garcia
Karlsruhe Institute of Technology
Publications
Featured research published by Ariel Garcia.
PLOS ONE | 2014
Johannes Stegmaier; Jens C. Otte; Andrei Yu. Kobitski; Andreas Bartschat; Ariel Garcia; G. Ulrich Nienhaus; Uwe Strähle; Ralf Mikut
Automated analysis of multi-dimensional microscopy images has become an integral part of modern research in life science. Most available algorithms that provide sufficient segmentation quality, however, are infeasible for a large amount of data due to their high complexity. In this contribution we present a fast parallelized segmentation method that is especially suited for the extraction of stained nuclei from microscopy images, e.g., of developing zebrafish embryos. The idea is to transform the input image based on gradient and normal directions in the proximity of detected seed points such that it can be handled by straightforward global thresholding like Otsu’s method. We evaluate the quality of the obtained segmentation results on a set of real and simulated benchmark images in 2D and 3D and show the algorithm’s superior performance compared to other state-of-the-art algorithms. We achieve an up to ten-fold decrease in processing times, allowing us to process large data sets while still providing reasonable segmentation results.
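The final step described in this abstract, global thresholding with Otsu's method, can be illustrated in isolation. The following is a minimal, self-contained sketch of Otsu's between-class-variance criterion applied to a synthetic bimodal image; it does not reproduce the paper's seed-point-based image transform or its parallelization.

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """Return the threshold that maximizes between-class variance (Otsu's method)."""
    hist, edges = np.histogram(image.ravel(), bins=bins)
    prob = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0

    best_t, best_var = centers[0], -1.0
    for k in range(1, bins):
        w0, w1 = prob[:k].sum(), prob[k:].sum()   # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (prob[:k] * centers[:k]).sum() / w0  # class means
        mu1 = (prob[k:] * centers[k:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[k]
    return best_t

# Synthetic bimodal toy image: dark background plus a bright "nucleus" patch
rng = np.random.default_rng(0)
img = rng.normal(30, 5, size=(64, 64))
img[20:40, 20:40] = rng.normal(200, 5, size=(20, 20))

t = otsu_threshold(img)
mask = img > t  # the 20x20 bright patch ends up above the threshold
```

On a clearly bimodal histogram like this one, the maximizing threshold lands in the gap between the two intensity modes, which is why the paper's transform aims to make real microscopy images amenable to exactly this kind of simple global thresholding.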
Lecture Notes in Computer Science | 2003
Jorge Gomes; M. David; João Martins; Luis Bernardo; J. Marco; R. Marco; D. Rodríguez; José Salt; S. Gonzalez; Javier Sánchez; A. Fuentes; Markus Hardt; Ariel Garcia; P. Nyczyk; A. Ozieblo; Pawel Wolniewicz; Michal Bluj; Krzysztof Nawrocki; Adam Padée; Wojciech Wislicki; Carlos Fernández; J. Fontán; A. Gómez; I. López; Yiannis Cotronis; Evangelos Floros; George Tsouloupas; Wei Xing; Marios D. Dikaiakos; Ján Astalos
The CrossGrid project is developing new grid middleware components, tools and applications with a special focus on parallel and interactive computing. In order to support the development effort and provide a test infrastructure, an international grid testbed has been deployed across 9 countries. Through the deployment of the testbed and its supporting services, CrossGrid is also contributing to another important project objective, the expansion of the grid coverage in Europe. This paper describes the status of the CrossGrid testbed.
Scientific Reports | 2015
Andrei Yu. Kobitski; Jens C. Otte; Masanari Takamiya; Benjamin Schäfer; Jonas Mertes; Johannes Stegmaier; Sepand Rastegar; Francesca Rindone; Volker Hartmann; Rainer Stotzka; Ariel Garcia; Jos van Wezel; Ralf Mikut; Uwe Strähle; G. Ulrich Nienhaus
A new era in developmental biology has been ushered in by recent advances in the quantitative imaging of all-cell morphogenesis in living organisms. Here we have developed a light-sheet fluorescence microscopy-based framework with single-cell resolution for identification and characterization of subtle phenotypical changes of millimeter-sized organisms. Such a comparative study requires analyses of entire ensembles to be able to distinguish sample-to-sample variations from definitive phenotypical changes. We present a kinetic digital model of zebrafish embryos up to 16 h of development. The model is based on the precise overlay and averaging of data taken on multiple individuals and describes the cell density and its migration direction at every point in time. Quantitative metrics for multi-sample comparative studies have been introduced to analyze developmental variations within the ensemble. The digital model may serve as a canvas on which the behavior of cellular subpopulations can be studied. As an example, we have investigated cellular rearrangements during germ layer formation at the onset of gastrulation. A comparison of the one-eyed pinhead (oep) mutant with the digital model of the wild-type embryo reveals its abnormal development at the onset of gastrulation, many hours before changes are obvious to the eye.
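The core idea of overlaying many individuals into an averaged model and scoring a mutant against it can be sketched with toy data. The code below is an illustrative assumption, not the paper's pipeline: registered cell-density maps are stood in by random arrays, and a simple mean absolute z-score serves as the deviation metric.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-ins for registered cell-density maps of 8 wild-type embryos;
# in the study these would come from light-sheet microscopy, not random data.
wild_type = [rng.poisson(5.0, size=(16, 16)).astype(float) for _ in range(8)]

# "Digital model": pointwise ensemble mean and spread across the wild types
model_mean = np.mean(wild_type, axis=0)
model_std = np.maximum(np.std(wild_type, axis=0), 1.0)  # floor avoids blow-ups

def deviation_score(sample):
    """Mean absolute z-score of a density map against the ensemble model."""
    return float(np.mean(np.abs(sample - model_mean) / model_std))

control = rng.poisson(5.0, size=(16, 16)).astype(float)  # held-out "wild type"
mutant = control.copy()
mutant[:8, :] *= 2  # locally doubled density mimics an abnormal phenotype
```

A sample whose density deviates systematically from the ensemble scores higher than a held-out control, which is the essence of a multi-sample comparative metric: it separates definitive phenotypical changes from ordinary sample-to-sample variation.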
IEEE International Symposium on Parallel & Distributed Processing, Workshops and PhD Forum | 2011
Ariel Garcia; S. Bourov; Ahmad Hammad; Jos van Wezel; Bernhard Neumair; Achim Streit; Volker Hartmann; Thomas Jejkal; Patrick Neuberger; Rainer Stotzka
The Large Scale Data Facility (LSDF) at the Karlsruhe Institute of Technology was started at the end of 2009 with the aim of supporting the growing requirements of data-intensive experiments. In close cooperation with the involved scientific communities, the LSDF provides them not only with adequate storage space but also with a directly attached analysis farm and, more importantly, with value-added services for their big scientific data-sets. Analysis workflows are supported through mixed Hadoop and OpenNebula cloud environments directly attached to the storage, enabling efficient processing of the experimental data. Metadata handling is a central part of the LSDF: a metadata repository, community-specific metadata schemes, graphical tools, and APIs were developed for accessing and efficiently organizing the stored data-sets.
Parallel, Distributed and Network-Based Processing | 2011
Rainer Stotzka; Volker Hartmann; Thomas Jejkal; Michael Sutter; Jos van Wezel; Marcus Hardt; Ariel Garcia; Rainer Kupsch; S. Bourov
To cope with the growing requirements of data-intensive scientific experiments, models, and simulations, the Large Scale Data Facility (LSDF) at KIT aims to support many scientific disciplines. The LSDF is a distributed storage facility at the Exabyte scale providing storage, archives, databases, and metadata repositories. Open interfaces and APIs support a variety of access methods to the highly available services for high-throughput data applications. Tools for easy and transparent access allow scientists to use the LSDF without bothering with the internal structures and technologies. In close cooperation with the scientific communities, the LSDF provides assistance to efficiently organize data and metadata structures, and develops and deploys community-specific software on the directly connected computing infrastructure.
Grid Computing | 2005
Jorge Gomes; M. David; João Martins; Luis Bernardo; Ariel Garcia; Markus Hardt; Harald Kornmayer; J. Marco; Rafael Marco; D. Rodríguez; Iván Díaz; D. Cano; José Salt; Soledad Moreno González; Javier Sánchez; F. Fassi; V. Lara; P. Nyczyk; Patryk Lason; Andrzej Ozieblo; Pawel Wolniewicz; Michal Bluj; Krzysztof Nawrocki; Adam Padée; Wojciech Wislicki; C Campos Fernández; Javier Fontan; Yannis Cotronis; Evangelos Floros; George Tsouloupas
The International Testbed of the CrossGrid Project has been in operation for the last three years, including 16 sites in 9 countries across Europe. The main achievements in installation and operation are described, and also the substantial experience gained on providing support to application and middleware developers in the project. Results are presented showing the availability of a realistic Grid framework to execute distributed interactive and parallel jobs.
Parallel, Distributed and Network-Based Processing | 2012
Thomas Jejkal; Volker Hartmann; Rainer Stotzka; Jens C. Otte; Ariel Garcia; Jos van Wezel; Achim Streit
To cope with the growing requirements of data-intensive scientific experiments, models, and simulations, the Large Scale Data Facility (LSDF) at KIT aims to support many scientific disciplines. The LSDF is a distributed storage facility at the Exabyte scale providing storage, archives, databases, and metadata repositories. Apart from data storage, many scientific communities need to perform data processing operations as well. For this purpose the LSDF Execution Framework for Data Intensive Applications (LAMBDA) was developed to allow asynchronous high-performance data processing next to the LSDF; it is, however, not restricted to the LSDF or to any feature only available there. The main goal of LAMBDA is to simplify large-scale data processing for scientific users by reducing complexity, responsibility, and error-proneness. The description of an execution is handled in the background, as part of LAMBDA administration, via metadata that can be obtained from arbitrary sources. Thus, the scientific user only has to select which applications to apply to the data.
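The user-facing idea, selecting an application by name while the framework resolves the execution details and runs the work asynchronously, can be sketched as follows. All names here (`APPLICATIONS`, `run_async`) are hypothetical illustrations, not LAMBDA's actual API.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical application registry; in LAMBDA the execution description is
# kept as metadata in the background rather than hard-coded by the user.
APPLICATIONS = {
    "normalize": lambda data: [x / max(data) for x in data],
    "total":     lambda data: sum(data),
}

def run_async(app_name, datasets):
    """Apply the selected application to each dataset on a worker pool."""
    app = APPLICATIONS[app_name]  # the user only selects the application name
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(app, datasets))

results = run_async("normalize", [[1.0, 2.0, 4.0], [5.0, 10.0]])
```

The point of the indirection is that the caller never touches the processing code or the execution environment; swapping the registry for a metadata store is what turns this toy into a framework-style design.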
IEEE Symposium on Large Data Analysis and Visualization | 2011
Ariel Garcia; S. Bourov; Ahmad Hammad; Volker Hartmann; Thomas Jejkal; Jens C. Otte; S. Pfeiffer; T. Schenker; C. Schmidt; P. Neuberger; Rainer Stotzka; J. van Wezel; Bernhard Neumair; Achim Streit
The Large Scale Data Facility (LSDF) was conceived and launched at the Karlsruhe Institute of Technology (KIT) at the end of 2009 to address the growing need for value-added storage services for data-intensive experiments. The LSDF's main focus is to support scientific experiments producing large data sets reaching into the petabyte range with adequate storage, support, and value-added services for data management, processing, and preservation. In this work we describe the approach taken to perform data analysis in the LSDF, as well as the data management of the scientific datasets.
International Conference on Digital Information Management | 2011
Ariel Garcia; S. Bourov; Ahmad Hammad; Thomas Jejkal; Jens C. Otte; S. Pfeiffer; T. Schenker; C. Schmidt; J. van Wezel; Bernhard Neumair; Achim Streit
The Large Scale Data Facility (LSDF) was started at the Karlsruhe Institute of Technology (KIT) at the end of 2009 to address the growing need for value-added storage services for its data-intensive experiments. The main focus of the project is to provide scientific communities producing data collections in the tera- to petabyte range with the necessary hardware infrastructure as well as with adequate value-added services and support for data management, processing, and preservation. In this work we describe the project's infrastructure and services design, as well as its metadata handling. Community-specific metadata schemes, a metadata repository, an application programming interface, and a graphical tool for accessing the resources were developed to further support the processing workflows of the partner scientific communities. The analysis workflow of high-throughput microscopy images for studying biomedical processes is described in detail.
Archive | 2011
Rüdiger Berlich; Sven Gabriel; Ariel Garcia; M. Kunze
Geneva is an Open Source library implemented in C++, targeted at large-scale parametric optimization problems. One of its strengths is the ability to run on resources ranging from multi-core machines over clusters to Grids and Clouds. This paper describes Geneva's architecture and performance, as well as intended deployment scenarios.
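For readers unfamiliar with this kind of parametric optimization, the sketch below shows the basic evolutionary loop such libraries build on: mutate a candidate's parameters, evaluate a fitness function, keep the best. It is a toy illustration under stated assumptions and deliberately does not use Geneva's C++ API.

```python
import random

def evolve(fitness, dim=2, pop_size=20, generations=100, sigma=0.5, seed=1):
    """Minimal (1+lambda)-style evolutionary optimizer, loosely in the spirit
    of population-based libraries like Geneva (this is NOT Geneva's API)."""
    rng = random.Random(seed)
    best = [rng.uniform(-5, 5) for _ in range(dim)]  # random starting candidate
    for _ in range(generations):
        # Mutate the current best to create offspring, then keep the fittest
        offspring = [[g + rng.gauss(0, sigma) for g in best]
                     for _ in range(pop_size)]
        best = min(offspring + [best], key=fitness)
    return best

# Classic test problem: minimize the sphere function sum(x_i^2)
sphere = lambda x: sum(g * g for g in x)
solution = evolve(sphere)
```

Because the fitness evaluations of one generation are independent, exactly this inner loop is what parallelizes naturally across cores, clusters, Grids, or Clouds.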