Daniel Davis
Monterey Bay Aquarium Research Institute
Publications
Featured research published by Daniel Davis.
OCEANS Conference | 2006
Duane R. Edgington; Danelle E. Cline; Daniel Davis; Ishbel Kerkez; Jerome Mariette
For oceanographic research, remotely operated underwater vehicles (ROVs) and underwater observatories routinely record several hours of video material every day. Manual processing of such large amounts of video has become a major bottleneck for scientific research based on this data. We have developed an automated system that detects, tracks, and classifies objects that are of potential interest for human video annotators. By pre-selecting salient targets for track initiation using a selective attention algorithm, we reduce the complexity of multi-target tracking. Then, if an object is tracked for several frames, a visual event is created and passed to a Bayesian classifier utilizing a Gaussian mixture model to determine the object class of the detected event.
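The classification step described above can be sketched in miniature. This is a hedged illustration only: the paper uses a Gaussian mixture model per object class, while this sketch models each class with a single diagonal Gaussian (a one-component mixture) to stay self-contained, and the feature names, class names, and training values are invented.

```python
import math

def fit_gaussian(samples):
    """Per-dimension mean and variance for one object class."""
    n = len(samples)
    dims = len(samples[0])
    mean = [sum(s[d] for s in samples) / n for d in range(dims)]
    var = [sum((s[d] - mean[d]) ** 2 for s in samples) / n + 1e-6
           for d in range(dims)]
    return mean, var

def log_likelihood(x, mean, var):
    """Log density of a diagonal Gaussian at feature vector x."""
    return sum(-0.5 * (math.log(2 * math.pi * v) + (xi - m) ** 2 / v)
               for xi, m, v in zip(x, mean, var))

def classify(x, models, priors):
    """Bayes rule: pick the class maximizing log prior + log likelihood."""
    return max(models, key=lambda c: math.log(priors[c]) +
               log_likelihood(x, *models[c]))

# Invented training data: (area, elongation) features for two event classes.
train = {
    "rockfish": [(0.9, 1.2), (1.1, 1.0), (1.0, 1.1)],
    "jelly":    [(3.0, 2.5), (2.8, 2.7), (3.2, 2.4)],
}
models = {c: fit_gaussian(s) for c, s in train.items()}
priors = {c: 1 / len(train) for c in train}

print(classify((1.05, 1.1), models, priors))  # → rockfish
```

In the full system each class density would be a multi-component mixture fit by EM, but the decision rule — maximum posterior over class-conditional densities — is the same.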
Deep-Sea Research Part I: Oceanographic Research Papers | 1997
Catherine Goyet; Daniel Davis
We use total CO2 (TCO2) data sets measured on the recent Joint Global Ocean Flux Study (JGOFS) and World Ocean Circulation Experiment (WOCE) cruises to investigate potential parameterizations of TCO2 data sets. The observed temporal variations of TCO2 in surface seawater were large (4% of the signal within a month) compared to the relatively small spatial variations (< 2% over 10 degrees of latitude). Yet the result of our study suggests that a single sigmoid function of the form TCO2(x) = a1 − b1/[1 + e^(−c1(x−d1))]^(e1) + b1 can be used to fit, parameterize, and interpolate closely (within the accuracy of the measurement) each upper-ocean (< 250 m) TCO2-depth profile of the temporal and spatial data sets from the North Atlantic and Equatorial Pacific Oceans that we examined. This sigmoid function is not specific to TCO2 data sets, and it could probably be used over most (if not all) of the upper ocean to fit, parameterize, and interpolate physical and chemical oceanic data sets. Below the upper layer (depth > 250 m), TCO2 can be parameterized by the relationship TCO2 = a + bΘ + c·AOU + dS, where Θ, AOU, and S represent potential temperature, apparent oxygen utilization, and salinity, respectively. The coefficients a, b, c, and d are experimentally determined by multiple linear regression using discrete bottle data from depths below the wintertime mixed-layer depth. Densely sampled CTD data of temperature, salinity, and oxygen can then be used to interpolate TCO2 at non-sampled depths. The two parameterized functions describing TCO2 in and below the upper ocean can further be blended at the wintertime mixed-layer depth to yield a continuous estimate of TCO2 concentration throughout the water column.
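The two parameterizations above can be sketched as follows. This is a hedged illustration, not the paper's actual fit: the sigmoid below is one plausible reading of the abstract's functional form, and the deep-water "bottle data" and coefficients are synthetic and invented; in the real study the coefficients come from regressing actual JGOFS/WOCE measurements.

```python
import numpy as np

def tco2_sigmoid(x, a1, b1, c1, d1, e1):
    """Upper-ocean (< 250 m) TCO2-depth profile, generalized sigmoid form."""
    return a1 - b1 / (1.0 + np.exp(-c1 * (x - d1))) ** e1 + b1

# Deep-water (> 250 m) linear model: TCO2 = a + b*theta + c*AOU + d*S,
# with coefficients found by multiple linear regression on bottle data.
rng = np.random.default_rng(0)
n = 50
theta = rng.uniform(2.0, 6.0, n)    # potential temperature (deg C)
aou = rng.uniform(50.0, 200.0, n)   # apparent oxygen utilization (umol/kg)
sal = rng.uniform(34.3, 34.8, n)    # salinity

# Invented "true" coefficients a, b, c, d used to generate synthetic data:
true_coef = np.array([400.0, -8.0, 0.9, 45.0])
tco2 = true_coef[0] + true_coef[1] * theta + true_coef[2] * aou + true_coef[3] * sal

# Multiple linear regression via least squares:
X = np.column_stack([np.ones(n), theta, aou, sal])
coef, *_ = np.linalg.lstsq(X, tco2, rcond=None)
print(np.round(coef, 3))  # recovers a, b, c, d on this noise-free data
```

With real bottle data the regression residuals, rather than an exact recovery, would indicate how well the linear relationship captures deep-water TCO2.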
OCEANS Conference | 2006
Thomas C. O'Reilly; K. Headley; John Graybeal; Kevin Gomes; Duane R. Edgington; Karen A. Salamy; Daniel Davis; Andrew Chase
The ocean science and engineering communities have identified some key requirements for large-scale ocean observatories at a recent ORION-sponsored workshop, and these requirements are being refined by the ORION project and others. MBARI has developed and deployed hardware and software technologies that address many of these requirements. In particular, we describe how these technologies address several key issues: (1) scalable integration, configuration, and management of large numbers of diverse instruments and data streams, (2) reliable association of instrument data and contextual metadata, and (3) development of observatory infrastructure and components that are interoperable among a variety of observatory architectures, including at-sea systems with relatively limited power and bandwidth availability. We focus on three technologies developed at MBARI. These technologies work together to enable MBARI's self-configuring, self-describing MOOS mooring-based observatory. Yet these technologies have been designed to be largely independent of an observatory's physical implementation, and will be deployed for testing on the MARS cable-to-shore observatory test-bed. Moreover, each of the technologies provides components that could be selectively used by other observatories. For example, PUCKs could be widely useful and are not dependent in any way on SIAM middleware or SSDS metadata structures. We also describe lessons learned during development and deployment of these technologies, and how policies and human procedures interact with the new technologies. Finally, we discuss how these technologies are being refined through community efforts such as the emerging Marine Plug and Work Consortium and the Marine Metadata Initiative.
Geophysical Research Letters | 1995
Catherine Goyet; Daniel Davis; Edward T. Peltzer; Peter G. Brewer
Large-scale ocean observing programs such as the Joint Global Ocean Flux Study (JGOFS) and the World Ocean Circulation Experiment (WOCE) today must face the problem of designing an adequate sampling strategy. For ocean chemical variables, the goals and observing technologies are quite different from those for ocean physical variables (temperature, salinity, pressure). The authors have recently acquired data on ocean CO2 properties on WOCE cruises P16c and P17c that are sufficiently dense to test for sampling redundancy. They use linear and quadratic interpolation methods on the sampled field to investigate the minimum number of samples required to define the deep-ocean total inorganic carbon (TCO2) field within the limits of experimental accuracy (±4 µmol/kg). Within the limits of current measurements, these lines were oversampled in the deep ocean. Should the precision of the measurement be improved, then a denser sampling pattern may be desirable in the future. This approach rationalizes the efficient use of resources for field work and for estimating gridded TCO2 fields needed to constrain geochemical models.
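The redundancy test described above can be sketched numerically: subsample a densely sampled deep-ocean profile, rebuild it by linear interpolation, and check whether the reconstruction stays within the stated measurement accuracy of ±4 µmol/kg. This is a hedged illustration; the smooth synthetic TCO2 profile below is invented, not cruise data.

```python
import numpy as np

ACCURACY = 4.0  # measurement accuracy, umol/kg

def max_interp_error(depth, tco2, step):
    """Largest reconstruction error when keeping every `step`-th sample."""
    kept = slice(None, None, step)
    rebuilt = np.interp(depth, depth[kept], tco2[kept])
    return float(np.max(np.abs(rebuilt - tco2)))

# Invented, slowly varying deep-water TCO2 profile (250-4000 m):
depth = np.linspace(250.0, 4000.0, 200)
tco2 = 2150.0 + 150.0 * (1.0 - np.exp(-(depth - 250.0) / 800.0))

# If even aggressive thinning reconstructs within ACCURACY, the dense
# sampling was redundant at the current measurement precision.
for step in (2, 4, 8):
    err = max_interp_error(depth, tco2, step)
    print(step, round(err, 3), err <= ACCURACY)
```

The same comparison run on real P16c/P17c profiles, with quadratic as well as linear interpolation, is the basis of the oversampling conclusion above.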
OCEANS Conference | 2006
Duane R. Edgington; Daniel Davis; Thomas C. O'Reilly
We summarize results of a Workshop on Instrument Software Infrastructure held at MBARI, Moss Landing, California, USA, from September 13-15, 2004, jointly sponsored by the National Science Foundation (NSF) and the Ocean Research Interactive Observatory Networks (ORION) program. The Workshop included over fifty participants, among them international participants from Germany, Canada, and Japan. This was one of the first technical workshops in the development of a series of ocean observatories under the US Ocean Observatory Initiative (OOI) being managed under the ORION program. The specific focus of this workshop was to define the standard requirements to be met by software infrastructure for sensors, instruments, and platforms for observing systems in the ORION program. These requirements include the issues of configuring, interfacing, and managing devices, including sensors and actuators, in a network-based observing system, as well as managing the resources necessary to support such devices. The topics include the capability of supporting plug-and-work instrumentation using a service-oriented network architecture. A major issue addressed is the observatory infrastructure required for managing data and metadata coming from sensors and instruments of the observatory in support of an integrated data management system.
OCEANS Conference | 2000
Daniel Davis
MBARI, a nonprofit, privately funded research institute devoted to the development of technology to support research in ocean sciences, has been developing systems for long-term environmental monitoring in Monterey Bay since 1987. The institute has initiated a project for expanding its ocean observing system capabilities through an expansion of existing moored data acquisition systems as well as the additional use of a benthic network and ROV- and AUV-based data acquisition systems. The goal of this project is to provide semi-continuous observations of important physical, biological, and chemical variables extended in space and time to support event detection, such as the onset of an El Niño, as well as support for focused intermediate-term process studies. In addition to the problems associated with managing the large variety and quantity of data produced by systems of this nature, there is the additional problem of how to optimize the data sampling topology. That is, what spacing and frequency of the system measurement resources will best meet the specific scientific goals of researchers using the system? Given the enormous cost of developing and deploying high-technology instrumentation and systems, a modest effort to understand, and to develop, a sampling methodology for such systems is clearly warranted. In this paper, an approach to the multi-dimensional sampling problem based on data compression is developed. The method is empirically based and does not depend on, or require, assumptions about the underlying data field or processes. The approach can also be used by a system to analyze its own sampling efficiency, and adjust sampling rates and spacing (assuming the system has this capability) for improved efficiency and accuracy. The methodology is illustrated with practical applications to one-dimensional bio-chemical data from the WOCE program, as well as prototypical multi-dimensional problems for the MBARI Ocean Observing System (MOOS).
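The self-tuning idea in the last sentences above — a system analyzing its own sampling efficiency and adjusting its rates — can be sketched in one dimension. This is a hedged illustration, not the paper's compression-based method: it uses empirical reconstruction error as the adequacy criterion, and the field, tolerance, and doubling strategy are invented.

```python
import math

TOL = 0.05  # invented target reconstruction tolerance

def field(x):
    """Synthetic one-dimensional bio-chemical field (invented)."""
    return math.sin(2 * math.pi * x) + 0.3 * x

def reconstruction_error(n, probes=1000):
    """Max error of linear interpolation from n equispaced samples on [0, 1]."""
    xs = [i / (n - 1) for i in range(n)]
    ys = [field(x) for x in xs]
    worst = 0.0
    for j in range(probes + 1):
        x = j / probes
        k = min(int(x * (n - 1)), n - 2)          # segment containing x
        t = (x - xs[k]) / (xs[k + 1] - xs[k])
        approx = ys[k] + t * (ys[k + 1] - ys[k])  # linear reconstruction
        worst = max(worst, abs(approx - field(x)))
    return worst

def choose_sampling(n=4):
    """Double the sample count until the field is resolved to TOL."""
    while reconstruction_error(n) > TOL:
        n *= 2
    return n

n = choose_sampling()
print(n, round(reconstruction_error(n), 4))
```

In an operational system the true field is unknown, so the error estimate would come from held-out or densely sampled reference data; the empirical, assumption-free character of that check is what the compression-based approach above shares.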
International Conference on Physics and Control | 2003
K. Headley; Daniel Davis; Duane R. Edgington; L. McBride; Tom O'Reilly; M. Risi
OCEANS Conference | 2001
Tom O'Reilly; Duane R. Edgington; Daniel Davis; R. Henthorn; M. P. McCann; T. Meese; W. Radochonski; M. Risi; Brent Roman; R. Schramm
OCEANS Conference | 2004
Tom O'Reilly; K. Headley; Robert Herlien; M. Risi; Daniel Davis; Duane R. Edgington; Kevin Gomes; T. Meese; John Graybeal; M. Chaffey
Archive | 2002
Daniel Davis; Duane R. Edgington; Karen A. Salamy