
Publication


Featured research published by Antoine Vandecreme.


BMC Bioinformatics | 2015

Survey statistics of automated segmentations applied to optical imaging of mammalian cells

Peter Bajcsy; Antonio Cardone; Joe Chalfoun; Michael Halter; Derek Juba; Marcin Kociolek; Michael P. Majurski; Adele P. Peskin; Carl G. Simon; Mylene Simon; Antoine Vandecreme; Mary Brady

Background: The goal of this survey paper is to overview cellular measurements using optical microscopy imaging followed by automated image segmentation. The cellular measurements of primary interest are taken from mammalian cells and their components. They are denoted as two- or three-dimensional (2D or 3D) image objects of biological interest. In our applications, such cellular measurements are important for understanding cell phenomena, such as cell counts, cell-scaffold interactions, cell colony growth rates, or cell pluripotency stability, as well as for establishing quality metrics for stem cell therapies. In this context, this survey paper is focused on automated segmentation as a software-based measurement leading to quantitative cellular measurements.

Methods: We define the scope of this survey and a classification schema first. Next, all found and manually filtered publications are classified according to the main categories: (1) objects of interest (or objects to be segmented), (2) imaging modalities, (3) digital data axes, (4) segmentation algorithms, (5) segmentation evaluations, (6) computational hardware platforms used for segmentation acceleration, and (7) object (cellular) measurements. Finally, all classified papers are converted programmatically into a set of hyperlinked web pages with occurrence and co-occurrence statistics of assigned categories.

Results: The survey paper presents to a reader: (a) the state-of-the-art overview of published papers about automated segmentation applied to optical microscopy imaging of mammalian cells, (b) a classification of segmentation aspects in the context of cell optical imaging, (c) histogram and co-occurrence summary statistics about cellular measurements, segmentations, segmented objects, segmentation evaluations, and the use of computational platforms for accelerating segmentation execution, and (d) open research problems to pursue.

Conclusions: The novel contributions of this survey paper are: (1) a new type of classification of cellular measurements and automated segmentation, (2) statistics about the published literature, and (3) a web hyperlinked interface to classification statistics of the surveyed papers at https://isg.nist.gov/deepzoomweb/resources/survey/index.html.
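The occurrence and co-occurrence statistics described above reduce to simple counting over the category assignments of each classified paper. A minimal sketch of that counting (the paper identifiers and category labels below are hypothetical, not taken from the survey's actual classification data):

```python
from itertools import combinations
from collections import Counter

# Hypothetical classification records: paper id -> set of assigned categories.
papers = {
    "paper_001": {"fluorescence", "2D", "watershed"},
    "paper_002": {"phase contrast", "2D", "thresholding"},
    "paper_003": {"fluorescence", "3D", "watershed"},
}

occurrence = Counter()     # how often each category is assigned
co_occurrence = Counter()  # how often two categories are assigned together

for categories in papers.values():
    occurrence.update(categories)
    # Count each unordered pair of categories assigned to the same paper.
    co_occurrence.update(combinations(sorted(categories), 2))

print(occurrence.most_common(3))
print(co_occurrence.most_common(3))
```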


international conference on big data | 2013

Terabyte-sized image computations on Hadoop cluster platforms

Peter Bajcsy; Antoine Vandecreme; Julien M. Amelot; Phuong T. Nguyen; Joe Chalfoun; Mary Brady

We present a characterization of four basic terabyte-sized image computations on a Hadoop cluster in terms of their relative efficiency according to the modified Amdahl's law. The work is motivated by the lack of standard benchmarks and stress tests for big image processing operations on a Hadoop computer cluster platform. Our benchmark design and evaluations were performed on one of three microscopy image sets, each consisting of over one half terabyte of data. All image processing benchmarks executed on the NIST Raritan cluster with Hadoop were compared against baseline measurements: the TeraSort/TeraGen suite previously designed for Hadoop testing, and image processing executions on a multiprocessor desktop and on the NIST Raritan cluster using Java Remote Method Invocation (RMI) with multiple configurations. By applying our methodology for assessing the efficiencies of computations on computer cluster configurations, we could rank computation configurations and aid scientists in measuring the benefits of running image processing on a Hadoop cluster.
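For context, the classic form of Amdahl's law bounds the speed-up of a computation whose parallelizable fraction is p when run on n workers. The paper's "modified" variant is not reproduced here, so the sketch below shows only the textbook formula:

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Classic Amdahl's law: speed-up with parallel fraction p on n workers."""
    return 1.0 / ((1.0 - p) + p / n)

def efficiency(p: float, n: int) -> float:
    """Parallel efficiency: achieved speed-up divided by worker count."""
    return amdahl_speedup(p, n) / n

# Example: a computation that is 95% parallelizable on a 64-node cluster.
print(f"speed-up:   {amdahl_speedup(0.95, 64):.2f}x")
print(f"efficiency: {efficiency(0.95, 64):.2%}")
```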


Stem Cell Research | 2016

Large-scale time-lapse microscopy of Oct4 expression in human embryonic stem cell colonies.

Kiran Bhadriraju; Michael Halter; Julien M. Amelot; Peter Bajcsy; Joe Chalfoun; Antoine Vandecreme; Barbara S. Mallon; Kyeyoon Park; Subhash Sista; John T. Elliott; Anne L. Plant

Identification and quantification of the characteristics of stem cell preparations is critical for understanding stem cell biology and for the development and manufacturing of stem cell-based therapies. We have developed image analysis and visualization software that allows effective use of time-lapse microscopy to provide spatial and dynamic information from large numbers of human embryonic stem cell colonies. To achieve statistically relevant sampling, we examined >680 colonies from 3 different preparations of cells over 5 days each, generating a total experimental dataset of 0.9 terabyte (TB). The 0.5-gigapixel images at each time point were represented by multi-resolution pyramids and visualized using the Deep Zoom JavaScript library, extended to support viewing gigapixel images over time and extracting data on individual colonies. We present a methodology that enables quantification of variations in nominally identical preparations and between colonies, correlation of colony characteristics with Oct4 expression, and identification of rare events.
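A Deep Zoom pyramid halves the image dimensions at each level down to 1×1 pixel, so the pyramid depth for the 0.5-gigapixel frames mentioned above follows directly from the image size. A minimal sketch of that arithmetic (the 23,000 × 23,000 pixel dimensions and 256-pixel tile size are illustrative assumptions, not values from the paper):

```python
import math

def deep_zoom_levels(width: int, height: int) -> int:
    """Number of pyramid levels: dimensions halve until reaching 1x1."""
    return int(math.ceil(math.log2(max(width, height)))) + 1

def tiles_at_full_resolution(width: int, height: int, tile: int = 256) -> int:
    """Tile count of the highest-resolution pyramid level only."""
    return math.ceil(width / tile) * math.ceil(height / tile)

# Roughly 0.5 gigapixel per time point, e.g. 23,000 x 23,000 pixels.
w = h = 23_000
print(deep_zoom_levels(w, h))          # pyramid depth
print(tiles_at_full_resolution(w, h))  # tiles in the base level
```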


IEEE Computer | 2016

Enabling Stem Cell Characterization from Large Microscopy Images

Peter Bajcsy; Antoine Vandecreme; Julien M. Amelot; Joe Chalfoun; Michael P. Majurski; Mary Brady

Microscopy could be an important tool for characterizing stem cell products if quantitative measurements could be collected over multiple spatial and temporal scales. Although cells change states over time and are several orders of magnitude smaller than cell products, modern microscopes are already capable of imaging large spatial areas, repeating imaging over time, and acquiring images over several spectra. However, characterizing stem cell products from such large image collections is challenging because of the data size, the required computations, and the lack of interactive quantitative measurements needed to determine release criteria. We present a measurement web system consisting of available algorithms, extensions to a client-server framework using Deep Zoom, and the configuration know-how to provide the information needed for inspecting the quality of a cell product. The cell and other data sets are accessible via the prototype web-based system at http://isg.nist.gov/deepzoomweb.

Microscopes can now cover large spatial areas and capture stem cell behavior over time. However, without discovering statistically reliable quantitative stem cell quality measures, products cannot be released to market. A web-based measurement system overcomes desktop limitations by leveraging cloud and cluster computing for offline computations and by using Deep Zoom extensions for interactive viewing and measurement.
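The system builds on Deep Zoom's tiled, multi-resolution layout, in which a client requests only the tiles covering the current viewport. As a rough illustration of the standard Deep Zoom tile-addressing convention (the server framework and the measurement extensions themselves are not shown; the base name below is hypothetical):

```python
def tile_path(base: str, level: int, col: int, row: int, fmt: str = "jpg") -> str:
    """Deep Zoom convention: tiles live under <name>_files/<level>/<col>_<row>.<fmt>."""
    return f"{base}_files/{level}/{col}_{row}.{fmt}"

print(tile_path("colony_image", 12, 3, 7))  # colony_image_files/12/3_7.jpg
```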


international conference on big data | 2014

Spatial computations over terabyte-sized images on hadoop platforms

Peter Bajcsy; Phuong T. Nguyen; Antoine Vandecreme; Mary Brady

Our objective is to lower the barrier to executing spatial image computations in a computer cluster/cloud environment instead of a desktop/laptop computing environment. We research two related problems encountered during the execution of spatial computations over terabyte-sized images using Apache Hadoop running on distributed computing resources. The two problems address (a) detection of spatial computations and their parameter estimation from a library of image processing functions, and (b) partitioning of image data for spatial image computations on Hadoop cluster/cloud computing platforms in order to minimize network data transfer. The first problem is solved by designing an iterative estimation methodology. The second problem is formulated as an optimization over three partitioning schemas (physical, logical without overlap, and logical with overlap) and evaluated over several system configuration parameters. Our experimental results for the two problems demonstrate 100% accuracy in detecting spatial computations in the Java Advanced Imaging and ImageJ libraries, a 5.36× speed-up of the developed logical image partitioning with overlap over the default Hadoop physical partitioning, and 3.14× faster execution of logical partitioning with overlap than without it. The novelty of our work is in designing an extension to Apache Hadoop to run a class of spatial image processing operations efficiently on a distributed computing resource.
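The "logical with overlap" schema corresponds to tiling the image with halo borders wide enough that a neighborhood operation on each partition needs no pixel data from other nodes. A minimal sketch of how such overlapping partitions could be enumerated (the image, partition, and halo sizes are illustrative, not the paper's parameters):

```python
def partitions_with_overlap(width, height, part, halo):
    """Yield (x0, y0, x1, y1) crop windows: 'part'-sized tiles grown by 'halo'
    pixels on every side and clamped to the image bounds."""
    for y in range(0, height, part):
        for x in range(0, width, part):
            yield (max(x - halo, 0),
                   max(y - halo, 0),
                   min(x + part + halo, width),
                   min(y + part + halo, height))

# Example: 4-gigapixel image, 8192-pixel partitions, 16-pixel halo
# (enough to support, say, a 33x33 convolution kernel locally).
windows = list(partitions_with_overlap(65_536, 65_536, 8_192, 16))
print(len(windows))  # 64 partitions
print(windows[0])    # (0, 0, 8208, 8208)
```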


Microscopy and Microanalysis | 2014

Interactive Analysis of Terabyte-sized SEM-EDS Hyperspectral Images

Antoine Vandecreme; Peter Bajcsy; Nicholas W. M. Ritchie; John Henry J. Scott

Scanning electron microscopy with energy-dispersive X-ray spectrometry (SEM-EDS) was used to characterize a macroscopic artifact from Peru (a Moche cast figure) [1]. A 3D hyperspectral datacube was acquired over 51.2 hours, measuring 11,520 × 9,984 pixels with 2,048 energy channels per pixel; each pixel corresponds to 900 nm in the x and y dimensions. With current technology, interacting with this hyperspectral dataset to perform exploratory analysis, display gigapixel elemental maps, and survey terabyte-scale spectral data is problematic.
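The raw size of such a datacube follows from the acquisition geometry. A back-of-the-envelope check (assuming, hypothetically, one byte per energy channel; the actual encoding is not stated in the abstract, and wider encodings scale the total accordingly):

```python
# Back-of-the-envelope size of the SEM-EDS hyperspectral datacube.
width, height = 11_520, 9_984  # spatial pixels
channels = 2_048               # energy channels per pixel
bytes_per_channel = 1          # hypothetical encoding; not given in the abstract

pixels = width * height
raw_bytes = pixels * channels * bytes_per_channel

print(f"{pixels / 1e6:.0f} megapixels per elemental map")
print(f"{raw_bytes / 1e12:.2f} TB raw at {bytes_per_channel} byte/channel")
```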


Microscopy and Microanalysis | 2016

Interactive Web-based Spatio-Statistical Image Modeling from Gigapixel Images to Improve Discovery and Traceability of Published Statistical Models

Peter Bajcsy; Antoine Vandecreme; Mary Brady

One of the main challenges for many scientific communities is reproducibility of published discoveries [1]. In the current publication process, manuscripts contain only data summaries and models. The trend in many state-of-the-art publication venues is to include references in the manuscripts that point to online accessible software and raw data [2]. However, this trend does not address the problems of reproducibility and traceability of published summaries and models back to the individual data points contributing to analyses of microscopy images reaching gigapixel and terabyte sizes. The reasons lie in the extra expertise and infrastructure needed to share gigapixel images and intermediate image features for visual validation. In addition, the summarization and modeling steps during a discovery include user-driven selections of data points and statistical models. There is a need to design discovery-assisting and summary-validation software systems over very large images that would advance discoveries and improve reproducibility of published scientific results.


international conference on big data | 2013

Re-projection of terabyte-sized images

Peter Bajcsy; Antoine Vandecreme; Mary Brady

This work addresses the problem of re-projecting a terabyte-sized 3D data set represented as a set of 2D Deep Zoom pyramids. In general, a re-projection for small 3D data sets is executed directly in RAM. However, RAM becomes a limiting factor for terabyte-sized 3D volumes formed by a stack of hundreds of megapixel to gigapixel 2D frames. We have benchmarked three methods to perform the re-projection computation in order to overcome the RAM limitation.
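Re-projection here means resampling a stack of 2D frames along a different axis, for example producing XZ slices from a stack of XY frames, which is what exhausts RAM when the frames are gigapixel-sized. A minimal out-of-core sketch under the assumption that each frame can be memory-mapped (NumPy and .npy frames are used for illustration; the paper's three benchmarked methods are not reproduced here):

```python
import numpy as np

def xz_slice(frame_paths, y):
    """Build one XZ slice at row y by reading a single row from each
    memory-mapped XY frame, so no full frame is ever loaded into RAM."""
    rows = []
    for path in frame_paths:
        frame = np.load(path, mmap_mode="r")  # memory-mapped 2D frame
        rows.append(np.array(frame[y, :]))    # copy only row y into memory
    return np.stack(rows)                     # shape: (num_frames, width)
```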


Spie Newsroom | 2013

Deep Zoom tool for advanced interactivity with high-resolution images

Paul Khouri-Saba; Antoine Vandecreme; Mary Brady; Kiran Bhadriraju; Peter Bajcsy

Deep Zoom technology enables efficient transmission and viewing of images with large pixel counts [1]. Originally developed for 2D images by Seadragon Software, it was expanded by Microsoft Live Labs, and by Google to support Google Maps. Later, it was extended for 3D and other visualizations with open-source projects such as OpenSeadragon [2-4]. Here we report on the extension of Deep Zoom to 2D+temporal data sets, retrieving image features, and recording fly-through image sequences (recorded simultaneously and viewed sequentially) from terabytes of image data. Our work is motivated by analysis of live cell microscopy images that use about 241,920 image tiles (~0.677 TB) per investigation. Each experiment is represented by 18 × 14 spatial image tiles of two color channels (phase contrast and green fluorescent protein) acquired over five days every 15 minutes. With hundreds of thousands of image tiles, it is extremely difficult to inspect the current 2D+time images in a contiguous spatial and temporal context without preprocessing (calibration and stitching of multiple images), and without browser-based visualization using Deep Zoom. Other challenges include Deep Zoom pyramid building (the process by which an image is created as a pyramid of tiles at different resolutions) [5] and storage issues (for every experiment there are about 6,091,378 pyramid files in 16,225 folders). Analyzing cell images requires the comparison of image intensity values across various channels, as well as additional layers of extracted information (intensity statistics over cell colonies, for example). A further challenge is extracting parts of a Deep Zoom rendering to examine interesting subsets for documenting, sharing, and performing further scientific analyses.

Figure 1. The main control panel for Deep Zoom image interactions, shown in a regular browser view.
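The 241,920-tile figure quoted above follows directly from the acquisition layout described in the abstract; a quick check of the arithmetic:

```python
# Arithmetic behind the tile count quoted in the abstract.
spatial_tiles = 18 * 14   # stitched mosaic per channel
channels = 2              # phase contrast + green fluorescent protein
time_points = 5 * 24 * 4  # five days, one acquisition every 15 minutes

acquired_tiles = spatial_tiles * channels * time_points
print(acquired_tiles)     # 241920 image tiles per investigation
```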


Microscopy Today | 2017

From Image Tiles to Web-Based Interactive Measurements in One Stop

Antoine Vandecreme; Michael P. Majurski; Joe Chalfoun; Keana C. Scott; John Henry J. Scott; Mary Brady; Peter Bajcsy

Collaboration


Dive into Antoine Vandecreme's collaborations.

Top Co-Authors

Mary Brady (National Institute of Standards and Technology)
Joe Chalfoun (National Institute of Standards and Technology)
Peter Bajcsy (University of Illinois at Urbana–Champaign)
Julien M. Amelot (National Institute of Standards and Technology)
Michael P. Majurski (National Institute of Standards and Technology)
Michael Halter (National Institute of Standards and Technology)
Anne L. Plant (National Institute of Standards and Technology)
Barbara S. Mallon (National Institutes of Health)
John Henry J. Scott (National Institute of Standards and Technology)
John T. Elliott (National Institute of Standards and Technology)