Publication


Featured research published by Joshua Doubleday.


IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing | 2010

Optimized Autonomous Space In-Situ Sensor Web for Volcano Monitoring

Wen-Zhan Song; Behrooz A. Shirazi; Renjie Huang; Mingsen Xu; Nina Peterson; Rick LaHusen; John S. Pallister; Dan Dzurisin; Seth C. Moran; M. Lisowski; Sharon Kedar; Steve Chien; Frank H. Webb; Aaron Kiely; Joshua Doubleday; Ashley Gerard Davies; David C. Pieri

In response to NASA's announced requirement for Earth hazard monitoring sensor-web technology, a multidisciplinary team involving sensor-network experts (Washington State University), space scientists (JPL), and Earth scientists (USGS Cascade Volcano Observatory (CVO)) has developed a prototype of a dynamic and scalable hazard-monitoring sensor-web and applied it to volcano monitoring. The combined Optimized Autonomous Space In-situ Sensor-web (OASIS) has two-way communication capability between ground and space assets, uses both space and ground data for optimal allocation of limited bandwidth resources on the ground, and uses smart management of competing demands for limited space assets. It also enables scalability and seamless infusion of future space and in-situ assets into the sensor-web. The space and in-situ control components of the system are integrated such that each element is capable of autonomously tasking the other. The ground in-situ network was deployed into the craters and around the flanks of Mount St. Helens in July 2009, and linked to the command and control of the Earth Observing One (EO-1) satellite.


International Geoscience and Remote Sensing Symposium | 2011

Combining space-based and in-situ measurements to track flooding in Thailand

Steve Chien; Joshua Doubleday; David Mclaren; Daniel Tran; Veerachai Tanpipat; Watis Leelapatra; Vichian Plermkamon; Cauligi S. Raghavendra; Daniel Mandl

We describe efforts to integrate in-situ sensing, space-borne sensing, hydrological modeling, active control of sensing, and automatic data product generation to enhance monitoring and management of flooding. In our approach, broad-coverage sensors and missions such as MODIS, TRMM, and weather satellites, along with in-situ weather and river gauging information, are all inputs to track flooding via river basin and sub-basin hydrological models. While these inputs can provide significant information about major flooding, targetable space measurements can provide finer spatial resolution of flooding extent. In order to leverage such assets, we automatically task observations in response to automated analysis indications of major flooding. These new measurements are automatically processed and assimilated with the other flooding data. We describe our ongoing efforts to deploy this system to track major flooding events in Thailand.
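The trigger-and-task loop described above can be sketched in a few lines; the basin names, flood fractions, and alert threshold below are purely illustrative, not values from the deployed system.

```python
# Hypothetical sketch of the alert-to-tasking step: coarse MODIS-derived
# flood fractions per sub-basin trigger requests for higher-resolution
# targeted observations. All names and numbers are illustrative.
ALERT_THRESHOLD = 0.15  # fraction of sub-basin pixels flagged as water

def tasking_requests(flood_fractions, threshold=ALERT_THRESHOLD):
    """Return sub-basins whose coarse flood extent warrants re-tasking."""
    return [basin for basin, frac in flood_fractions.items() if frac >= threshold]

observations = tasking_requests({"Chao Phraya upper": 0.22,
                                 "Chao Phraya lower": 0.05,
                                 "Mun": 0.18})
print(observations)  # ['Chao Phraya upper', 'Mun']
```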


International Geoscience and Remote Sensing Symposium | 2011

Space-based Sensorweb monitoring of wildfires in Thailand

Steve Chien; Joshua Doubleday; David Mclaren; Ashley Gerard Davies; Daniel Tran; Veerachai Tanpipat; Siri Akaakara; Anuchit Ratanasuwan; Daniel Mandl

We describe efforts to apply sensorweb technologies to the monitoring of forest fires in Thailand. In this approach, satellite data and ground reports are assimilated to assess the current state of the forest system in terms of forest fire risk, active fires, and likely progression of fires and smoke plumes. This current and projected assessment can then be used to actively direct sensors and assets to best acquire further information. This process operates continually with new data updating models of fire activity leading to further sensing and updating of models. As the fire activity is tracked, products such as active fire maps, burn scar severity maps, and alerts are automatically delivered to relevant parties. We describe the current state of the Thailand Fire Sensorweb which utilizes the MODIS-based FIRMS system to track active fires and trigger Earth Observing One / Advanced Land Imager to acquire imagery and produce active fire maps, burn scar severity maps, and alerts. We describe ongoing work to integrate additional sensor sources and generate additional products.


Geological Society, London, Special Publications | 2016

The NASA Volcano Sensor Web, advanced autonomy and the remote sensing of volcanic eruptions: a review

Ashley Gerard Davies; Steve Chien; Daniel Tran; Joshua Doubleday

The Volcano Sensor Web (VSW) is a globe-spanning net of sensors and applications for detecting volcanic activity. Alerts from the VSW are used to trigger observations from space using the Earth Observing-1 (EO-1) spacecraft. Onboard EO-1 is the Autonomous Sciencecraft Experiment (ASE) advanced autonomy software. Using ASE has streamlined spacecraft operations and has enabled the rapid delivery of high-level products to end-users. The entire process, from initial alert to product delivery, is autonomous. This facility is of great value as a rapid response is vital during a volcanic crisis. ASE consists of three parts: (1) Science Data Classifiers, which process EO-1 Hyperion data to identify anomalous thermal signals; (2) a Spacecraft Command Language; and (3) the Continuous Activity Scheduling Planning Execution and Replanning (CASPER) software that plans and replans activities, including downlinks, based on available resources and operational constraints. For each eruption detected, thermal emission maps and estimates of eruption parameters are posted to a website at the Jet Propulsion Laboratory, California Institute of Technology, in Pasadena, CA. Selected products are emailed to end-users. The VSW uses software agents to detect volcanic activity alerts generated from a wide variety of sources on the ground and in space, and can also be easily triggered manually.
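As a rough illustration of part (1), a science data classifier for thermal anomalies might flag pixels whose short-wave-infrared radiance stands far above the scene background. This is a simplified sketch, not the flight classifier; the function name, sigma threshold, and radiance values are all synthetic.

```python
import numpy as np

def thermal_anomalies(swir_radiance, n_sigma=5.0):
    """Boolean mask of pixels well above the scene background level."""
    background = np.median(swir_radiance)
    spread = swir_radiance.std()
    return swir_radiance > background + n_sigma * spread

rng = np.random.default_rng(42)
scene = rng.normal(10.0, 1.0, size=(100, 100))  # quiescent background
scene[50, 50] = 100.0                           # simulated hot pixel
mask = thermal_anomalies(scene)
print(mask.sum())  # 1 anomalous pixel detected
```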


IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing | 2013

Onboard Product Generation on Earth Observing One: A Pathfinder for the Proposed HyspIRI Mission Intelligent Payload Module

Steve Chien; David Mclaren; Daniel Tran; Ashley Gerard Davies; Joshua Doubleday; Daniel Mandl

The proposed HyspIRI mission is evaluating an X-band Direct Broadcast capability that would enable data to be delivered to ground stations virtually as it is acquired. However, the HyspIRI VSWIR and TIR instruments are expected to produce over 800 × 10⁶ bits per second of data, while the Direct Broadcast capability is approximately 10 × 10⁶ bits per second, an ~80× oversubscription. In order to address this data throughput mismatch, a Direct Broadcast concept called the Intelligent Payload Module (IPM) has been developed to determine which data to downlink based on both the type of surface the spacecraft is overflying and onboard processing of the data to detect events. For example, when the spacecraft is overflying polar regions it might downlink a snow/ice product. Additionally, the onboard software would search for thermal signatures indicative of a volcanic event or wildfire and downlink summary information (extent, spectra) when detected. Earth Observing One (EO-1) has served as a testbed and pathfinder for this type of onboard product generation. As part of the Autonomous Sciencecraft (ASE), EO-1 implemented in flight software the ability to analyze and develop products for a limited swath of the Hyperion hyperspectral instrument onboard the spacecraft. In a series of technology demonstrations that became part of the operational EO-1 system, over 5000 science products have been generated onboard EO-1 and downlinked via engineering S-band contacts, a routine automated process that continues to this day. We describe the onboard products demonstrated in EO-1 operations and show how they have paved the way for the HyspIRI Intelligent Payload Module concept.
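The quoted data rates imply the ~80× oversubscription figure directly:

```python
# Back-of-the-envelope check of the oversubscription quoted above:
# instrument output ~800e6 bit/s vs. Direct Broadcast ~10e6 bit/s.
instrument_rate_bps = 800e6   # combined VSWIR + TIR data rate
downlink_rate_bps = 10e6      # approximate Direct Broadcast capacity

oversubscription = instrument_rate_bps / downlink_rate_bps
print(f"Oversubscription factor: ~{oversubscription:.0f}x")  # ~80x
```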


Journal of Aerospace Information Systems | 2017

Onboard Autonomy on the Intelligent Payload EXperiment CubeSat Mission

Steve Chien; Joshua Doubleday; David R. Thompson; Kiri L. Wagstaff; John Bellardo; Craig Francis; Eric Baumgarten; Austin Williams; Edmund Yee; Eric Stanton; Jordi Puig-Suari

The Intelligent Payload Experiment (IPEX) is a CubeSat that flew from December 2013 through January 2015 and validated autonomous operations for onboard instrument processing and product generation for the Intelligent Payload Module of the Hyperspectral Infrared Imager (HyspIRI) mission concept. IPEX used several artificial intelligence technologies. First, IPEX used machine learning and computer vision in its onboard processing: machine-learned random decision forests to classify images onboard (to downlink classification maps) and computer-vision visual salience software to extract interesting regions for downlink in acquired imagery. Second, IPEX flew the Continuous Activity Scheduling Planning Execution and Replanning (CASPER) AI planner/scheduler onboard to enable IPEX operations to replan to best use spacecraft resources such as file storage, CPU, power, and downlink bandwidth. First, the ground and flight operations concept for proposed HyspIRI IPM operations is described, followed by a description ...
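A toy sketch of the kind of resource-aware downlink replanning CASPER performs, greatly simplified to a greedy pick of the highest-priority products that fit a per-pass bandwidth budget. The product names, sizes, and priorities below are invented, and real CASPER plans over many more resources and constraints.

```python
# Greedy, priority-ordered selection of products under a downlink budget.
def plan_downlink(products, budget_bits):
    """products: list of (name, size_bits, priority); returns chosen names."""
    chosen, used = [], 0
    for name, size, _priority in sorted(products, key=lambda p: -p[2]):
        if used + size <= budget_bits:  # take the product only if it fits
            chosen.append(name)
            used += size
    return chosen

products = [("thumbnail", 2e6, 3), ("full_scene", 50e6, 1), ("cloud_map", 5e6, 2)]
print(plan_downlink(products, budget_bits=10e6))  # ['thumbnail', 'cloud_map']
```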


IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing | 2013

Monitoring Flooding in Thailand Using Earth Observing One in a Sensorweb

Steve Chien; Joshua Doubleday; David Mclaren; Daniel Tran; Veerachai Tanpipat; Royol Chitradon; Surajate Boonya-aroonnet; Porranee Thanapakpawin; Daniel Mandl

The Earth Observing One (EO-1) mission has been a pathfinder in demonstrating autonomous operations paradigms. In 2010-2012 (and continuing), EO-1 has supported sensorweb operations to enable autonomous tracking of flooding in Thailand. In this approach, the Moderate Resolution Imaging Spectroradiometer (MODIS) is used to perform broad-scale monitoring to track flooding at the regional level (500 m/pixel), and EO-1 is autonomously tasked in response to alerts to acquire higher resolution (30 m/pixel) Advanced Land Imager (ALI) data. These data are then automatically processed to derive products such as surface water extent and volumetric water estimates, which are automatically pushed to relevant authorities in Thailand for use in damage estimation, relief efforts, and damage mitigation. EO-1 has served as a testbed and pathfinder for this type of sensorweb operations. Beginning with EO-1, these monitoring techniques are being extended to other space sensors (such as Radarsat-2, Landsat, Worldview-2, and TRMM) and integrated with hydrological models and in-situ sensors.
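Assuming a water mask has already been classified from the 30 m ALI pixels, the two products mentioned (surface water extent and a volumetric estimate) reduce to simple per-pixel sums. The helper name and sample arrays below are illustrative, not from the operational pipeline.

```python
import numpy as np

PIXEL_AREA_M2 = 30 * 30  # ALI resolution quoted above: 30 m/pixel

def flood_products(water_mask, depth_m):
    """Surface water extent (km^2) and a crude volumetric estimate (m^3)."""
    extent_km2 = water_mask.sum() * PIXEL_AREA_M2 / 1e6
    volume_m3 = float(depth_m[water_mask].sum() * PIXEL_AREA_M2)
    return extent_km2, volume_m3

# Tiny 2x2 example: three flooded pixels with assumed water depths.
mask = np.array([[True, True], [False, True]])
depth = np.array([[0.5, 1.0], [0.0, 2.0]])
extent, volume = flood_products(mask, depth)
print(extent, volume)  # 0.0027 km^2, 3150.0 m^3
```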


AIAA Infotech@Aerospace Conference | 2009

Towards an Autonomous Space In-situ Marine Sensorweb

Steve Chien; Joshua Doubleday; Daniel Tran; David R. Thompson; Grace Mahoney; Yi Chao; Ramon Abel Castano; James M. Ryan; Raphael M. Kudela; Sherry L. Palacios; David G. Foley; Arjuna Balasuriya; H Schmidt; Oscar Schofield; Matthew Arrott; Michael Meisinger; Daniel Mandl; Stuart Frye; Lawrence Ong; Patrice Cappelaere

We describe ongoing efforts to integrate and coordinate space and marine assets to enable autonomous response to dynamic ocean phenomena such as algal blooms, eddies, and currents. Thus far we have focused on the use of remote sensing assets (e.g. satellites) but future plans include expansions to use a range of in-situ sensors such as gliders, autonomous underwater vehicles, and buoys/moorings.


Journal of Field Robotics | 2016

Real-Time Orbital Image Analysis Using Decision Forests, with a Deployment Onboard the IPEX Spacecraft

Alphan Altinok; David R. Thompson; Benjamin J. Bornstein; Steve Chien; Joshua Doubleday; John Bellardo

Automatic cloud recognition promises significant improvements in Earth science remote sensing. At any time, more than half of Earth's surface is covered by clouds, obscuring images and atmospheric measurements. This is particularly problematic for CubeSats, a new generation of small, low-orbiting spacecraft with very limited communications bandwidth. Such spacecraft can use image analysis to autonomously select clear scenes for prioritized downlink. More agile spacecraft can also benefit from cloud screening by retargeting observations to cloud-free areas. This could significantly improve the science yield of instruments such as the Orbiting Carbon Observatory 3 mission. However, most existing cloud detection algorithms are not suitable for these applications because they require calibrated and georectified spectral data, which is not typically available onboard. Here, we describe a statistical machine-learning method for real-time autonomous scene interpretation using a visible camera with no radiometric calibration. A random forest classifies cloud and clear pixels based on local patterns of image texture. We report on an experimental evaluation of images from the International Space Station (ISS) and present results from a deployment onboard the IPEX spacecraft. This demonstrates actual execution in flight and provides some preliminary lessons learned about operational use. It is a rare example of a machine-learning system deployed to an autonomous spacecraft. To our knowledge, it is also the first instance of significant artificial intelligence deployed onboard a CubeSat and the first ever deployment of visible image-based cloud screening onboard any operational spacecraft.
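A minimal sketch of the approach, with scikit-learn's RandomForestClassifier standing in for the flight implementation. The patches are synthetic (bright, smooth "cloud" vs. darker, textured "clear") and the two toy texture features are only a stand-in for the texture descriptors the paper uses.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def patch_features(patches):
    # Per-patch texture features: mean brightness and local contrast.
    return np.stack([patches.mean(axis=(1, 2)), patches.std(axis=(1, 2))], axis=1)

# Synthetic 8x8 grayscale patches: clouds are bright and smooth,
# clear ground is darker with more texture.
cloud = rng.normal(0.8, 0.02, size=(200, 8, 8))
ground = rng.normal(0.3, 0.15, size=(200, 8, 8))

X = patch_features(np.concatenate([cloud, ground]))
y = np.array([1] * 200 + [0] * 200)  # 1 = cloud, 0 = clear

clf = RandomForestClassifier(n_estimators=25, random_state=0).fit(X, y)

# Screen a new patch: bright and smooth, so it is flagged as cloud.
test_patch = rng.normal(0.8, 0.02, size=(1, 8, 8))
print(clf.predict(patch_features(test_patch)))  # [1]
```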


International Geoscience and Remote Sensing Symposium | 2015

Autonomy for remote sensing — Experiences from the IPEX CubeSat

Joshua Doubleday; Steve Chien; Charles D. Norton; Kiri L. Wagstaff; David R. Thompson; John Bellardo; Craig Francis; Eric Baumgarten

The Intelligent Payload Experiment (IPEX) is a CubeSat mission to flight-validate technologies for onboard instrument processing and autonomous operations for NASA's Earth Science Technology Office (ESTO). Specifically, IPEX demonstrates onboard instrument processing and product generation technologies for the Intelligent Payload Module (IPM) of the proposed Hyperspectral Infrared Imager (HyspIRI) mission concept. Many proposed future missions, including HyspIRI, are slated to produce enormous volumes of data requiring either significant communication advancements or data reduction techniques. IPEX demonstrates several technologies for onboard data reduction, such as computer vision, image analysis, and image processing, and demonstrates general operations autonomy. We conclude this paper with a number of lessons learned through operations of this technology demonstration mission on a novel platform for NASA.

Collaboration


Dive into Joshua Doubleday's collaboration.

Top Co-Authors

Steve Chien, California Institute of Technology
Ashley Gerard Davies, California Institute of Technology
Daniel Tran, Jet Propulsion Laboratory
Daniel Mandl, Goddard Space Flight Center
David Mclaren, California Institute of Technology
David R. Thompson, California Institute of Technology
John Bellardo, California Polytechnic State University
Rebecca Castano, California Institute of Technology
Daniel Q. Tran, California Institute of Technology