Publication


Featured research published by Jos van Wezel.


Scientific Reports | 2015

An ensemble-averaged, cell density-based digital model of zebrafish embryo development derived from light-sheet microscopy data with single-cell resolution

Andrei Yu. Kobitski; Jens C. Otte; Masanari Takamiya; Benjamin Schäfer; Jonas Mertes; Johannes Stegmaier; Sepand Rastegar; Francesca Rindone; Volker Hartmann; Rainer Stotzka; Ariel Garcia; Jos van Wezel; Ralf Mikut; Uwe Strähle; G. Ulrich Nienhaus

A new era in developmental biology has been ushered in by recent advances in the quantitative imaging of all-cell morphogenesis in living organisms. Here we have developed a light-sheet fluorescence microscopy-based framework with single-cell resolution for identification and characterization of subtle phenotypical changes of millimeter-sized organisms. Such a comparative study requires analyses of entire ensembles to be able to distinguish sample-to-sample variations from definitive phenotypical changes. We present a kinetic digital model of zebrafish embryos up to 16 h of development. The model is based on the precise overlay and averaging of data taken on multiple individuals and describes the cell density and its migration direction at every point in time. Quantitative metrics for multi-sample comparative studies have been introduced to analyze developmental variations within the ensemble. The digital model may serve as a canvas on which the behavior of cellular subpopulations can be studied. As an example, we have investigated cellular rearrangements during germ layer formation at the onset of gastrulation. A comparison of the one-eyed pinhead (oep) mutant with the digital model of the wild-type embryo reveals its abnormal development at the onset of gastrulation, many hours before changes are obvious to the eye.
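
The following sketch is only an illustration of the core idea of an ensemble-averaged, cell-density-based model: registered per-embryo cell centroids are binned into density grids and averaged across several individuals, with the per-voxel spread serving as a simple variation metric. The grid size, the coordinate normalization and all function names are assumptions, not the paper's actual pipeline.

```python
import numpy as np

def density_grid(cell_positions, grid_shape=(64, 64, 64), extent=1.0):
    """Bin 3D cell coordinates (N x 3, already registered to a common
    reference frame) into a normalized cell-density grid."""
    edges = [np.linspace(0.0, extent, n + 1) for n in grid_shape]
    hist, _ = np.histogramdd(cell_positions, bins=edges)
    return hist / max(len(cell_positions), 1)

def ensemble_average(embryos_at_t):
    """Average the density grids of several embryos at one time point and
    return the per-voxel standard deviation as a sample-to-sample variation metric."""
    grids = np.stack([density_grid(p) for p in embryos_at_t])
    return grids.mean(axis=0), grids.std(axis=0)

# Example: 5 embryos, each with ~1000 segmented cell centroids at one time point
rng = np.random.default_rng(0)
embryos = [rng.random((1000, 3)) for _ in range(5)]
mean_density, variation = ensemble_average(embryos)
```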


IEEE International Symposium on Parallel & Distributed Processing, Workshops and PhD Forum | 2011

The Large Scale Data Facility: Data Intensive Computing for Scientific Experiments

Ariel Garcia; S. Bourov; Ahmad Hammad; Jos van Wezel; Bernhard Neumair; Achim Streit; Volker Hartmann; Thomas Jejkal; Patrick Neuberger; Rainer Stotzka

The Large Scale Data Facility (LSDF) at the Karlsruhe Institute of Technology was started at the end of 2009 with the aim of supporting the growing requirements of data-intensive experiments. In close cooperation with the involved scientific communities, the LSDF provides them not only with adequate storage space but also with a directly attached analysis farm and -- more importantly -- with value-added services for their big scientific data-sets. Analysis workflows are supported through the mixed Hadoop and OpenNebula cloud environments directly attached to the storage, and enable the efficient processing of the experimental data. Metadata handling is a central part of the LSDF, where a metadata repository, community-specific metadata schemes, graphical tools, and APIs were developed for accessing and efficiently organizing the stored data-sets.
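
As a rough illustration of what registering a data-set with community-specific metadata could look like, the sketch below posts a small JSON record to a repository endpoint. The URL, the record fields and the schema name are hypothetical; the abstract does not specify the actual LSDF metadata API.

```python
import json
import urllib.request

# Hypothetical endpoint -- the real LSDF metadata repository API is not
# described in the abstract above.
REPO_URL = "https://lsdf.example.org/metadata"

def register_dataset(path, community_schema, attributes):
    """Register a stored data-set with community-specific metadata."""
    record = {
        "path": path,                 # location of the data-set in the storage
        "schema": community_schema,   # community-specific metadata scheme
        "attributes": attributes,     # schema-conforming key/value pairs
    }
    req = urllib.request.Request(
        REPO_URL,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example (illustrative identifiers): annotate a microscopy run so it can
# later be found and analysed.
# register_dataset("/lsdf/bio/run_042", "zebrafish-screen",
#                  {"instrument": "light-sheet", "embryo_count": 24})
```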


Parallel, Distributed and Network-Based Processing | 2011

Perspective of the Large Scale Data Facility (LSDF) Supporting Nuclear Fusion Applications

Rainer Stotzka; Volker Hartmann; Thomas Jejkal; Michael Sutter; Jos van Wezel; Marcus Hardt; Ariel Garcia; Rainer Kupsch; S. Bourov

To cope with the growing requirements of data-intensive scientific experiments, models and simulations, the Large Scale Data Facility (LSDF) at KIT aims to support many scientific disciplines. The LSDF is a distributed storage facility at Exabyte scale providing storage, archives, databases and metadata repositories. Open interfaces and APIs support a variety of access methods to the highly available services for high-throughput data applications. Tools for easy and transparent access allow scientists to use the LSDF without bothering with the internal structures and technologies. In close cooperation with the scientific communities, the LSDF provides assistance to efficiently organize data and metadata structures, and develops and deploys community-specific software on the directly connected computing infrastructure.


KI'10: Proceedings of the 33rd Annual German Conference on Advances in Artificial Intelligence | 2010

Methods for automated high-throughput toxicity testing using Zebrafish embryos

Rüdiger Alshut; Jessica Legradi; Urban Liebel; Lixin Yang; Jos van Wezel; Uwe Strähle; Ralf Mikut; Markus Reischl

In this paper, an automated process to extract experiment-specific parameters from microscope images of zebrafish embryos is presented and applied to experiments consisting of toxicologically treated zebrafish embryos. The treatments consist of a dilution series of several compounds. A custom-built graphical user interface allows easy labeling and browsing of the image data. Subsequently, image-specific features are extracted for each image based on image processing algorithms. By means of feature selection, the most significant features are determined, and a classification divides the images into two classes. From the classification results, dose-response curves as well as frequently used general indicators of a substance's acute toxicity can be calculated automatically; as an example, the median lethal dose is determined. The presented approach was designed for real high-throughput screening including data handling; the results are stored in long-term data storage and prepared to be processed on a cluster computing system being built up at KIT. It provides the possibility to test any number of chemical substances in high throughput and is, in combination with new screening microscopes, able to manage the tens of thousands of risk tests required e.g. in the REACH framework or for drug discovery.
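
A minimal sketch of the last step described above: turning per-dose lethal/non-lethal classification results into a dose-response curve and an estimate of the median lethal dose (LC50) by fitting a log-logistic model. The doses, lethal fractions and parameter names are illustrative, not data or code from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(dose, lc50, slope):
    """Fraction of lethal outcomes as a function of dose (log-logistic model)."""
    return 1.0 / (1.0 + np.exp(-slope * (np.log10(dose) - np.log10(lc50))))

def estimate_lc50(doses, lethal_fraction):
    """Fit the dose-response curve and return the median lethal dose and slope."""
    p0 = [np.median(doses), 1.0]
    (lc50, slope), _ = curve_fit(logistic, doses, lethal_fraction, p0=p0, maxfev=10000)
    return lc50, slope

# Example: per-dose lethal fractions as they might come out of the image classifier
doses = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0])    # mg/l, illustrative
lethal = np.array([0.0, 0.05, 0.2, 0.6, 0.9, 1.0])    # fraction of images classified lethal
lc50, slope = estimate_lc50(doses, lethal)
print(f"estimated LC50 ~ {lc50:.2f} mg/l")
```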


Parallel, Distributed and Network-Based Processing | 2012

LAMBDA -- The LSDF Execution Framework for Data Intensive Applications

Thomas Jejkal; Volker Hartmann; Rainer Stotzka; Jens C. Otte; Ariel Garcia; Jos van Wezel; Achim Streit

To cope with the growing requirements of data-intensive scientific experiments, models and simulations, the Large Scale Data Facility (LSDF) at KIT aims to support many scientific disciplines. The LSDF is a distributed storage facility at Exabyte scale providing storage, archives, databases and metadata repositories. Apart from data storage, many scientific communities need to perform data processing operations as well. For this purpose, the LSDF Execution Framework for Data Intensive Applications (LAMBDA) was developed to allow asynchronous high-performance data processing next to the LSDF. It is, however, not restricted to the LSDF or to any special feature only available at the LSDF. The main goal of LAMBDA is to simplify large-scale data processing for scientific users by reducing complexity, responsibility and error-proneness. The description of an execution is handled as part of the LAMBDA administration in the background via metadata that can be obtained from arbitrary sources. Thus, the scientific user only has to select which applications to apply to which data.
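
To make the idea of a metadata-driven execution concrete, here is a small sketch in which a job is described only by an application name, an input data-set identifier and parameters, and a background worker resolves and runs the registered application. The field names, the registry and the worker are assumptions for illustration, not the actual LAMBDA interface.

```python
from dataclasses import dataclass, field

# Hypothetical execution description -- field names are illustrative,
# not the actual LAMBDA metadata schema.
@dataclass
class ExecutionDescription:
    application: str                       # which registered application to apply
    input_dataset: str                     # data-set identifier in the LSDF
    parameters: dict = field(default_factory=dict)

REGISTERED_APPLICATIONS = {
    "segment-cells": lambda dataset, params: f"segmented {dataset} with {params}",
}

def execute(desc: ExecutionDescription):
    """Background worker: resolve the application by name and run it
    against the referenced data-set."""
    app = REGISTERED_APPLICATIONS[desc.application]
    return app(desc.input_dataset, desc.parameters)

# The user only selects the application and the data; dispatch and bookkeeping
# happen in the background.
job = ExecutionDescription("segment-cells", "lsdf://bio/run_042", {"threshold": 0.4})
print(execute(job))
```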


Automatisierungstechnik | 2016

Automation strategies for large-scale 3D image analysis

Johannes Stegmaier; Benjamin Schott; Eduard Hübner; Manuel Traub; Maryam Shahid; Masanari Takamiya; Andrei Yu. Kobitski; Volker Hartmann; Rainer Stotzka; Jos van Wezel; Achim Streit; G. Ulrich Nienhaus; Uwe Strähle; Markus Reischl; Ralf Mikut

New imaging techniques enable visualizing and analyzing a multitude of unknown phenomena in many areas of science at high spatio-temporal resolution. The rapidly growing amount of image data, however, can hardly be analyzed manually and, thus, future research has to focus on automated image analysis methods that allow one to reliably extract the desired information from large-scale multidimensional image data. Starting with infrastructural challenges, we present new software tools, validation benchmarks and processing strategies that help cope with large-scale image data. The presented methods are illustrated on typical problems observed in developmental biology that can be answered, e.g., by using time-resolved 3D microscopy images.
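
One common automation strategy for data of this size, sketched below under assumptions of my own (placeholder analysis step, synthetic volumes), is to stream a time-resolved 3D data set through a per-frame analysis function so that only one volume has to be held in memory at a time.

```python
import numpy as np

def analyze_frame(volume):
    """Placeholder analysis step: count voxels above a fixed threshold,
    standing in for segmentation or feature extraction."""
    return int((volume > 0.5).sum())

def process_time_lapse(frames):
    """Process a time-resolved 3D data set frame by frame so that only a
    single volume resides in memory at any time."""
    return [analyze_frame(volume) for volume in frames]

# Example: a generator yielding 3D volumes, e.g. read lazily from disk
def synthetic_frames(n_timepoints=10, shape=(32, 64, 64)):
    rng = np.random.default_rng(1)
    for _ in range(n_timepoints):
        yield rng.random(shape)

results = process_time_lapse(synthetic_frames())
print(results)
```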


Journal of Physics: Conference Series | 2014

Archival Services and Technologies for Scientific Data

Jörg Meyer; Marcus Hardt; Achim Streit; Jos van Wezel

After analysis and publication, there is no need to keep experimental data online on spinning disks. For reasons of reliability and cost, inactive data is moved to tape and put into a data archive. The data archive must provide reliable access for at least ten years, following a recommendation of the German Science Foundation (DFG), but many scientific communities wish to keep data available much longer. Data archival is, on the one hand, purely a bit-preservation activity that ensures the bits read are the same as those written years before. On the other hand, enough information must be archived to be able to use and interpret the content of the data. The latter depends on many, partly community-specific, factors and remains an area of much debate among archival specialists. The paper describes the current practice of archival and bit preservation in use for different science communities at KIT, for which a combination of organizational services and technical tools is required. The special monitoring to detect tape-related errors, the software infrastructure in use, as well as the service certification are discussed. Plans and developments at KIT, also in the context of the Large Scale Data Management and Analysis (LSDMA) project, are presented. The technical advantages of the T10 SCSI Stream Commands (SSC-4) and the Linear Tape File System (LTFS) will have a profound impact on future long-term archival of large data sets.
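
The bit-preservation side of archival can be illustrated with a simple fixity check: a checksum is recorded when the data is written and verified when it is read back years later. This is a generic sketch, not the monitoring infrastructure described in the paper; the file paths are illustrative.

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so arbitrarily large archive members
    can be checked without loading them into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_fixity(path, recorded_checksum):
    """Return True if the bits read back equal the bits written before."""
    return sha256_of(path) == recorded_checksum

# Example (illustrative paths): record the checksum at ingest time, verify on recall.
# checksum_at_ingest = sha256_of("/archive/experiment_2013/raw.tar")
# assert verify_fixity("/archive/experiment_2013/raw.tar", checksum_at_ingest)
```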


arXiv: Digital Libraries | 2012

Data Life Cycle Labs, A New Concept to Support Data-Intensive Science

Jos van Wezel; Achim Streit; Christopher Jung; Rainer Stotzka; Silke Halstenberg; Fabian Rigoll; Ariel Garcia; Andreas Heiss; Kilian Schwarz; Martin Gasthuber; André Giesler


Helmholtz Portfolio Theme Large-Scale Data Management and Analysis (LSDMA). Ed.: C. Jung | 2017

Long-term Access to Data, Communities, Developments and Infrastructure

Jos van Wezel; Felix Bach; Peter Krauss; Tobias Kurze; Jörg Meyer; Ralph Müller-Pfefferkorn; Jan Potthoff


10th European Zebrafish Meeting, Budapest, Hungary, July 3-7, 2017 | 2017

High-throughput screening and analysis of startle response behavior in zebrafish

Ravindra Peravali; Daniel Marcato; Johannes Stegmaier; Robert Geisler; Christian Pylatiuk; Markus Reischl; Jos van Wezel; Ralf Mikut; Julia E. Dallman; Uwe Strähle

Collaboration


Dive into Jos van Wezel's collaborations.

Top Co-Authors

Rainer Stotzka (Karlsruhe Institute of Technology)
Achim Streit (Karlsruhe Institute of Technology)
Ariel Garcia (Karlsruhe Institute of Technology)
Volker Hartmann (Karlsruhe Institute of Technology)
Ralf Mikut (Karlsruhe Institute of Technology)
Uwe Strähle (Karlsruhe Institute of Technology)
Johannes Stegmaier (Karlsruhe Institute of Technology)
Markus Reischl (Karlsruhe Institute of Technology)
Thomas Jejkal (Karlsruhe Institute of Technology)
Andrei Yu. Kobitski (Karlsruhe Institute of Technology)