
Publication


Featured research published by Michael D. Young.


Proceedings of SPIE | 2014

ODI - Portal, Pipeline, and Archive (ODI-PPA): a web-based astronomical compute archive, visualization, and analysis service

Arvind Gopu; Soichi Hayashi; Michael D. Young; Daniel R. Harbeck; Todd A. Boroson; Wilson M. Liu; Ralf Kotulla; Richard A. Shaw; Robert Henschel; Jayadev Rajagopal; Elizabeth B. Stobie; Patricia Marie Knezek; R. Pierre Martin; Kevin Archbold

The One Degree Imager-Portal, Pipeline, and Archive (ODI-PPA) is a web science gateway that provides astronomers with a modern web interface acting as a single point of access to their data, along with rich computational and visualization capabilities. Its goal is to support scientists in handling complex data sets and to enhance the WIYN Observatory's scientific productivity beyond data acquisition on its 3.5m telescope. ODI-PPA is designed, with periodic user feedback, to be a compute archive with built-in frameworks including: (1) Collections, which allow an astronomer to create logical collations of data products intended for publication, further research, instructional purposes, or data processing tasks; (2) Image Explorer and Source Explorer, which together enable real-time interactive visual analysis of massive astronomical data products within an HTML5-capable web browser, with overlaid standard-catalog and Source Extractor-generated source markers; (3) a Workflow framework, which enables rapid integration of data processing pipelines on an associated compute cluster and lets users request that such pipelines be executed on their data via custom user interfaces. ODI-PPA is made up of several light-weight services connected by a message bus; the web portal is built using the Twitter Bootstrap, AngularJS, and jQuery JavaScript libraries, while backend services are written in PHP (using the Zend framework) and Python; the system leverages supercomputing and storage resources at Indiana University. ODI-PPA is designed to be reconfigurable for use in other science domains with large and complex datasets, including an ongoing offshoot project for electron microscopy data.


Proceedings of SPIE | 2014

Integrating the ODI-PPA scientific gateway with the QuickReduce pipeline for on-demand processing

Michael D. Young; Ralf Kotulla; Arvind Gopu; Wilson M. Liu

As imaging systems improve, the size of astronomical datasets has continued to grow, making the transfer and processing of data a significant burden. To solve this problem for the WIYN Observatory One Degree Imager (ODI), we developed the ODI-Portal, Pipeline, and Archive (ODI-PPA) science gateway, integrating the data archive, data reduction pipelines, and a user portal. In this paper, we discuss the integration of the QuickReduce (QR) pipeline into PPA's Tier 2 processing framework. QR is a set of parallelized, stand-alone Python routines accessible to all users; operators can use it to create master calibration products and produce standardized calibrated data with a short turn-around time. Upon completion, the data are ingested into the archive and portal and made available to authorized users. Quality metrics and diagnostic plots are generated and presented via the portal for operator approval and user perusal. Additionally, users can tailor the calibration process to their specific science objective(s) by selecting custom datasets, applying preferred master calibrations or generating their own, and selecting pipeline options. Submission of a QuickReduce job initiates data staging, pipeline execution, and ingestion of output data products, all while allowing the user to monitor the process status and to download or further process/analyze the output within the portal. User-generated data products are placed into a private user-space within the portal. ODI-PPA leverages cyberinfrastructure at Indiana University, including the Big Red II supercomputer, the Scholarly Data Archive tape system, and the Data Capacitor shared file system.
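The staging, execution, and ingestion lifecycle described in this abstract can be sketched as a small state machine that a portal job monitor might poll. All class, stage, and dataset names below are illustrative assumptions, not the actual ODI-PPA API.

```python
# Hypothetical sketch of a QuickReduce-style job lifecycle: a submitted
# job moves through staging, pipeline execution, and archive ingestion
# while the portal polls its status. Names are invented for illustration.

STAGES = ["queued", "staging", "executing", "ingesting", "complete"]

class QuickReduceJob:
    def __init__(self, dataset, options=None):
        self.dataset = dataset
        self.options = options or {}  # e.g. user-selected master calibrations
        self.stage = "queued"

    def advance(self):
        """Move the job to the next lifecycle stage."""
        i = STAGES.index(self.stage)
        if i < len(STAGES) - 1:
            self.stage = STAGES[i + 1]
        return self.stage

    def status(self):
        """What a portal monitor view would poll."""
        return {"dataset": self.dataset, "stage": self.stage}

job = QuickReduceJob("odi_20140101_night1", {"flat": "master_flat_v2"})
while job.stage != "complete":
    job.advance()
```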


Software and Cyberinfrastructure for Astronomy V | 2018

ImageX 3.0: a full stack imaging archive solution

Michael D. Young; Arvind Gopu; Raymond Perigo

Over the past several years we have faced the need to develop a number of solutions to address the challenge of archiving large-format scientific imaging data and seamlessly visualizing that data, irrespective of the image format, in a web browser. ImageX is a ground-up rewrite and synthesis of our solutions to this issue, with the goal of reducing the workload required to transition from simply storing vast amounts of scientific imaging data on disk to securely archiving and sharing that data with the world. The components that make up the ImageX service stack include a secure and scalable back-end data service optimized for providing imaging data, a pre-processor to harvest metadata and intelligently scale and store the imaging data, and a flexible and embeddable front-end visualization web application. Our latest version of the software suite, ImageX 3.0, has been designed to meet the needs of a single user running locally on a personal computer, or to scale up to support the image storage and visualization needs of a modern observatory, with the intention of providing a 'push button' path to a fully deployed system. Each ImageX 3.0 component is provided as a Docker container, and can be rapidly and seamlessly deployed to meet demand. In this paper, we describe the ImageX architecture while demonstrating many of its features, including intelligent image scaling with adaptive histograms, load balancing, and administrative tools. On the user-facing side, we demonstrate how the ImageX 3.0 viewer can be embedded into the content of any web application, and explore the astronomy-specific features and plugins we have written into it. The ImageX service stack is fully open-sourced, and is built upon widely supported industry standards (Node.js, Angular, etc.).
Apart from being deployed as a standalone service stack, ImageX components are currently in use or expected to be deployed on: (1) the ODI-PPA portal serving astronomical images taken at the WIYN Observatory in near real-time; (2) the web portal serving microscopy images taken at the IU Electron Microscopy Center; (3) the RADY-SCA portal supporting radiology and medical imaging as well as neuroscience researchers at IU.


Software and Cyberinfrastructure for Astronomy V | 2018

Toward sustainable deployment of distributed services on the cloud: dockerized ODI-PPA on Jetstream

Yuanzhi Bao; Arvind Gopu; Raymond Perigo; Michael D. Young

The One Degree Imager - Portal, Pipeline and Archive (ODI-PPA), a mature and fully developed product, has been a workhorse for astronomers observing with the WIYN ODI. It not only provides access to data stored in a secure archive, it also offers a rich search and visualization interface, as well as integrated pipeline capabilities connected to supercomputers at Indiana University in a manner transparent to the user. As part of our ongoing sustainability review process, and given the increasing age of the ODI-PPA codebase, we have considered various approaches to modernization. While industry currently trends toward Node.js-based architectures, we concluded that porting an entire legacy PHP- and Python-based system like ODI-PPA, with its complex and distributed service stack, would require too many hours of human development, testing, and deployment. Aging deployment hardware under tight budgets is another issue we identified, a common one especially when deploying complex distributed service stacks. In this paper, we present DockStream (https://jsportal.odi.iu.edu), an elegant solution that addresses both of the aforementioned issues. Using ODI-PPA as a case study, we present a proof-of-concept solution combining a suite of Docker containers built for each PPA service with a mechanism to acquire cost-free computational and storage resources. The dockerized ODI-PPA services can be deployed on one Docker-enabled host or several, depending on the availability of hardware resources and the expected levels of use. We describe the process of designing, creating, and deploying such custom containers. The NSF-funded Jetstream cloud, led by the Indiana University Pervasive Technology Institute (PTI), provides cloud-based, on-demand computing and data analysis resources, and a pathway to tackle the issue of insufficient hardware refreshment funds.
We briefly describe the process of acquiring computational and storage resources on Jetstream, and the use of the Atmosphere web interface to create and maintain virtual machines on Jetstream. Finally, we present a summary of security refinements to a dockerized service stack on the cloud using nginx, custom Docker networks, and Linux firewalls that significantly decrease the risk of security vulnerabilities and incidents while improving scalability.


Publications of the Astronomical Society of the Pacific | 2017

An Archive of Spectra from the Mayall Fourier Transform Spectrometer at Kitt Peak

Catherine A. Pilachowski; K. Hinkle; Michael D. Young; H. B. Dennis; Arvind Gopu; Robert Henschel; Soichi Hayashi

We describe the SpArc science gateway for spectral data obtained during the period from 1975 through 1995 at the Kitt Peak National Observatory using the Fourier Transform Spectrometer (FTS) in operation at the Mayall 4-m telescope. SpArc is hosted by Indiana University Bloomington and is available for public access. The archive includes nearly 10,000 individual spectra of more than 800 different astronomical sources including stars, nebulae, galaxies, and Solar System objects. We briefly describe the FTS instrument itself, and summarize the conversion of the original interferograms into spectral data and the process for recovering the data into FITS files. The architecture of the archive is discussed, and the process for retrieving data from the archive is introduced. Sample use cases showing typical FTS spectra are presented.
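The conversion of interferograms into spectra mentioned in this abstract is, at its core, a Fourier transform: an FTS records an interferogram, and the source spectrum is recovered from its transform. A minimal synthetic illustration, with an invented sample count and line position rather than real Mayall FTS parameters:

```python
import numpy as np

# Toy interferogram-to-spectrum conversion: a monochromatic source
# produces a pure cosine fringe pattern, and the Fourier transform of
# that interferogram recovers a single spectral line at the fringe
# frequency. Sample count and line position are illustrative only.

n_samples = 1024
k_line = 50                      # line position, in frequency bins
x = np.arange(n_samples)
interferogram = np.cos(2 * np.pi * k_line * x / n_samples)

spectrum = np.abs(np.fft.rfft(interferogram))
peak_bin = int(np.argmax(spectrum))   # recovered line position → 50
```

Real FTS reduction additionally involves apodization, phase correction, and wavenumber calibration; this sketch shows only the central transform step.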


Proceedings of SPIE | 2016

ImageX: New and improved Image Explorer for astronomical images and beyond

Soichi Hayashi; Arvind Gopu; Ralf Kotulla; Michael D. Young

The One Degree Imager - Portal, Pipeline, and Archive (ODI-PPA) has included the Image Explorer interactive image visualization tool since it went operational. Portal users were able to quickly open up several ODI images within any HTML5-capable web browser, adjust the scaling, apply color maps, and perform other basic image visualization steps typically done in a desktop client like DS9. However, the original design of the Image Explorer required lossless PNG tiles to be generated and stored for all raw and reduced ODI images, taking up tens of TB of spinning disk space even though only a small fraction of those images were being accessed by portal users at any given time. It also caused significant overhead on the portal web application and the Apache webserver used by ODI-PPA, and we found it hard to merge in improvements made to a similar deployment in another project's portal. To address these concerns, we re-architected Image Explorer from scratch and came up with ImageX, a set of microservices that are part of the IU Trident project software suite, with rapid interactive visualization capabilities useful for ODI data and beyond. We generate a full-resolution JPEG image for each raw and reduced ODI FITS image before producing a JPEG tileset, one that can be rendered using the ImageX frontend code at various locations as appropriate within a web portal (for example, on tabular image listings, or in views allowing quick perusal of a set of thumbnails or other image-sifting activities). The new design has decreased spinning disk requirements, uses AngularJS for the client-side Model/View code (instead of depending on the backend PHP Model/View/Controller code previously used), OpenSeaDragon to render the tile images, and nginx plus a lightweight NodeJS application to serve tile images, thereby decreasing the Time To First Byte latency by a few orders of magnitude.
We plan to extend ImageX to non-FITS images, including electron microscopy and radiology scan images, and to expand its feature set with basic functions like image overlays and colormaps. Users needing more advanced visualization and analysis capabilities can use a desktop tool like DS9+IRAF on another IU Trident project called StarDock, without having to download gigabytes of FITS image data.
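The tileset approach described above follows the usual pyramid layout used by viewers such as OpenSeaDragon: the full-resolution image is repeatedly halved until it fits in a single tile, and each level is cut into fixed-size tiles. A sketch of just the tile geometry, with the function name and the 256 px tile size as illustrative assumptions rather than ImageX internals:

```python
import math

def tile_pyramid(width, height, tile=256):
    """Return a list of pyramid levels, coarsest first; each level is
    (level_width, level_height, tiles_across, tiles_down)."""
    levels = []
    w, h = width, height
    while True:
        cols = math.ceil(w / tile)
        rows = math.ceil(h / tile)
        levels.append((w, h, cols, rows))
        if cols == 1 and rows == 1:
            break                       # coarsest level fits in one tile
        w, h = max(1, w // 2), max(1, h // 2)
    levels.reverse()                    # level 0 = single-tile overview
    return levels
```

For a 1024x512 image this yields three levels with 1x1, 2x1, and 4x2 tiles; the viewer only ever fetches the tiles visible in the current viewport, which is what keeps interaction fast on massive images.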


Proceedings of SPIE | 2016

Trident: scalable compute archives: workflows, visualization, and analysis

Arvind Gopu; Soichi Hayashi; Michael D. Young; Ralf Kotulla; Robert Henschel; Daniel R. Harbeck

The astronomy community has embraced Big Data processing challenges, e.g. those associated with time-domain astronomy, and has come up with a variety of novel and efficient data processing solutions. However, data processing is only a small part of the Big Data challenge. Efficient knowledge discovery and scientific advancement in the Big Data era require new and equally efficient tools: modern user interfaces for searching, identifying, and viewing data online without direct access to the data; tracking of data provenance; searching, plotting, and analyzing metadata; interactive visual analysis, especially of (time-dependent) image data; and the ability to execute pipelines on supercomputing and cloud resources with minimal overhead or expertise required, even for novice computing users. The Trident project at Indiana University offers a comprehensive web- and cloud-based microservice software suite that enables the straightforward deployment of highly customized Scalable Compute Archive (SCA) systems, including extensive visualization and analysis capabilities, with a minimal amount of additional coding. Trident seamlessly scales up or down in terms of data volumes and computational needs, and allows feature sets within a web user interface to be quickly adapted to meet individual project requirements. Domain experts only have to provide code or business logic for handling/visualizing their domain's data products and for executing their pipelines and application workflows. Trident's microservice architecture is made up of light-weight services connected by a REST API and/or a message bus; web interface elements are built using the NodeJS, AngularJS, and HighCharts JavaScript libraries among others, while backend services are written in NodeJS, PHP/Zend, and Python.
The software suite currently consists of (1) a simple workflow execution framework to integrate, deploy, and execute pipelines and applications; (2) a progress service to monitor workflows and sub-workflows; (3) ImageX, an interactive image visualization service; (4) an authentication and authorization service; (5) a data service that handles archival, staging, and serving of data products; and (6) a notification service that serves the statistical collation and reporting needs of various projects. Several other components are under development. Trident is an umbrella project that evolved from the One Degree Imager, Portal, Pipeline, and Archive (ODI-PPA) project, which we had initially refactored toward (1) a powerful analysis/visualization portal for Globular Cluster System (GCS) survey data collected by IU researchers, (2) a data search and download portal for the IU Electron Microscopy Center's data (EMC-SCA), and (3) a prototype archive for the Ludwig Maximilian University's Wide Field Imager. The new Trident software has been used to deploy (1) a metadata quality control and analytics portal (RADY-SCA) for DICOM-formatted medical imaging data produced by the IU Radiology Center, (2) several prototype workflows for different domains, (3) a snapshot tool within IU's Karst Desktop environment, and (4) a limited component set to serve GIS data within the IU GIS web portal. Trident SCA systems leverage supercomputing and storage resources at Indiana University but can be configured to make use of any cloud/grid resource, from local workstations/servers to (inter)national supercomputing facilities such as XSEDE.
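The "light-weight services connected by a message bus" pattern described above can be illustrated with a toy in-process publish/subscribe bus: publishers emit events on a topic and subscribers react without knowing who sent them. Topic names and handlers here are invented for illustration, not Trident's actual bus protocol.

```python
from collections import defaultdict

# Minimal publish/subscribe bus showing how decoupled services
# (notification, progress, data, ...) can communicate by topic
# rather than by direct calls to one another.

class MessageBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callable to receive messages on a topic."""
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        """Deliver a message to every subscriber of the topic."""
        for handler in self.subscribers[topic]:
            handler(message)

bus = MessageBus()
ingested = []

# e.g. a notification service listening for archive events
bus.subscribe("data.ingested", ingested.append)
bus.publish("data.ingested", {"product": "calibrated_image_001"})
```

In a real deployment the bus would be an external broker (RabbitMQ, Redis, etc.) so that services in separate processes or containers can participate, but the decoupling idea is the same.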


Proceedings of SPIE | 2016

StarDock: Shipping Customized Computing Environments to the Data

Michael D. Young; Soichi Hayashi; Arvind Gopu

Surging data volumes make it increasingly infeasible to transfer astronomical datasets to the local systems of individual scientists. Centralized pipelines offer some relief, but lack the flexibility to fulfill the needs of all users. We have developed a system that leverages the Docker container application virtualization software. Along with a suite of commonly used astronomy applications, users can configure a container with their own custom software and analysis tools. Our StarDock system will move the user's container to the data and expose the requested dataset, allowing our users to safely and securely process their data without needlessly transferring hundreds of gigabytes.
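One natural way to realize the "expose the dataset safely" step is a read-only Docker bind mount plus a writable scratch area. A sketch of how such an invocation could be assembled; the function name, image name, and paths are illustrative assumptions, and the command is built but not executed:

```python
# Hypothetical StarDock-style "ship the container to the data" command:
# the requested dataset is exposed read-only inside the user's container
# via a bind mount, and a separate writable scratch area holds results.

def stardock_command(image, dataset_dir, scratch_dir):
    """Build (without running) a docker invocation that mounts the
    dataset read-only and a scratch directory read-write."""
    return [
        "docker", "run", "--rm",
        "--mount", f"type=bind,src={dataset_dir},dst=/data,readonly",
        "--mount", f"type=bind,src={scratch_dir},dst=/scratch",
        image,
    ]

cmd = stardock_command("astro/user-env:latest",
                       "/archive/odi/night_20160101",
                       "/scratch/jobs/42")
```

The read-only mount is what lets the archive safely expose shared data to arbitrary user-supplied software: the container can read the dataset but cannot modify the archive's copy.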


GeoENV 2012 | 2012

Investigating Local Relationships between Bioaccessibility of Trace Elements in Soils and Cancer Data

Jennifer McKinley; Ulrich Ofterdinger; Michael D. Young; Anna Gavin; Mark Cave; Joanna Wragg; A. Barsby


2012 Sino-European Symposium on Environment and Health (SESEH 2012) | 2012

Identifying geogenic and anthropogenic influences on near surface deposits across urban and rural areas in Northern Ireland

Rory Doherty; Siobhan Cox; Ulrich Ofterdinger; R. McIllwaine; Michael D. Young

Collaboration


Dive into Michael D. Young's collaborations.

Top Co-Authors


Ralf Kotulla

University of Wisconsin-Madison


Daniel R. Harbeck

University of Wisconsin-Madison


Jennifer McKinley

Queen's University Belfast
