Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Sam Siewert is active.

Publication


Featured research published by Sam Siewert.


IEEE Transactions on Knowledge and Data Engineering | 2000

Dynamically negotiated resource management for data intensive application suites

Gary J. Nutt; Scott A. Brandt; Adam J. Griff; Sam Siewert; Marty Humphrey; Toby Berk

In contemporary computers and networks of computers, various application domains are making increasing demands on the system to move data from one place to another, particularly under some form of soft real-time constraint. A brute force technique for implementing applications in this type of domain demands excessive system resources, even though the actual requirements by different parts of the application vary according to the way it is being used at the moment. A more sophisticated approach is to provide applications with the ability to dynamically adjust resource requirements according to their precise needs, as well as the availability of system resources. This paper describes a set of principles for designing systems to provide support for soft real-time applications using dynamic negotiation. Next, the execution level abstraction is introduced as a specific mechanism for implementing the principles. The utility of the principles and the execution level abstraction is then shown in the design of three resource managers that facilitate dynamic application adaptation: Gryphon, EPA/RT-PCIP, and the DQM architectures.
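The execution-level idea described above can be sketched as follows: an application exposes discrete levels, each trading resource demand for output quality, and a manager picks the best level that fits currently available resources, renegotiating downward under pressure. All names here (`Level`, `negotiate`) and the numbers are illustrative, not the paper's API.

```python
# Hypothetical sketch of dynamically negotiated execution levels.
# Each level trades CPU demand for application quality; the manager
# grants the highest-quality level that fits the available budget.

from dataclasses import dataclass

@dataclass
class Level:
    name: str
    cpu_share: float   # fraction of CPU this level needs
    quality: int       # relative benefit of running at this level

def negotiate(levels, available_cpu):
    """Return the highest-quality level whose demand fits the budget."""
    feasible = [lv for lv in levels if lv.cpu_share <= available_cpu]
    if not feasible:
        return None                      # application must block or degrade
    return max(feasible, key=lambda lv: lv.quality)

levels = [
    Level("full-rate video", 0.60, 3),
    Level("half-rate video", 0.35, 2),
    Level("stills only",     0.10, 1),
]

print(negotiate(levels, 0.70).name)   # ample CPU: full-rate video
print(negotiate(levels, 0.40).name)   # under pressure: half-rate video
```

When availability changes, calling `negotiate` again is the renegotiation step; a real manager would also arbitrate among competing applications.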


Proceedings of SPIE | 2004

On orbit performance of the MIPS instrument

G. H. Rieke; Erick T. Young; James Cadien; C. W. Engelbracht; Karl D. Gordon; Douglas M. Kelly; Frank J. Low; Karl Anthony Misselt; J. E. Morrison; James Muzerolle; G. Rivlis; J. A. Stansberry; Jeffrey W. Beeman; E. E. Haller; David T. Frayer; William B. Latter; Alberto Noriega-Crespo; Deborah Lynne Padgett; Dean C. Hines; J. Douglas Bean; William Burmester; Gerald B. Heim; Thomas Glenn; R. Ordonez; John P. Schwenker; Sam Siewert; Donald W. Strecker; S. Tennant; John R. Troeltzsch; Bryce Unruh

The Multiband Imaging Photometer for Spitzer (MIPS) provides long wavelength capability for the mission, in imaging bands at 24, 70, and 160 microns and measurements of spectral energy distributions between 52 and 100 microns at a spectral resolution of about 7%. By using true detector arrays in each band, it provides both critical sampling of the Spitzer point spread function and relatively large imaging fields of view, allowing for substantial advances in sensitivity, angular resolution, and efficiency of areal coverage compared with previous space far-infrared capabilities. The Si:As BIB 24 micron array has excellent photometric properties, and measurements with rms relative errors of 1% or better can be obtained. The two longer wavelength arrays use Ge:Ga detectors with poor photometric stability. However, the use of (1) a scan mirror to modulate the signals rapidly on these arrays, (2) a system of on-board stimulators used for a relative calibration approximately every two minutes, and (3) specialized reduction software results in good photometry with these arrays as well, with rms relative errors of less than 10%.


IEEE Aerospace Conference | 1997

Interactive, repair-based planning and scheduling for Shuttle payload operations

Gregg Rabideau; Steve Chien; Tobias Mann; C. Eggemeyer; J. Willis; Sam Siewert; Peter Stone

This paper describes the DATA-CHASER Automated Planner/Scheduler (DCAPS) system for automatically generating low-level command sequences from high-level user goals. DCAPS uses Artificial Intelligence (AI)-based search techniques and an iterative repair framework in which the system selectively resolves conflicts with the resource and temporal constraints of the DATA-CHASER Shuttle payload activities.
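The iterative repair framework mentioned above can be illustrated with a toy scheduler: start from a conflicted schedule, find a resource oversubscription, and apply a repair move (here, delaying one activity) until no conflicts remain. The activity tuples and the single repair move are illustrative, not the DCAPS implementation.

```python
# Toy iterative-repair scheduling in the spirit of DCAPS.
# An activity is (name, start, duration, resource_demand).

def conflicts(schedule, capacity):
    """Return timesteps where summed demand exceeds the resource capacity."""
    usage = {}
    for name, start, duration, demand in schedule:
        for t in range(start, start + duration):
            usage[t] = usage.get(t, 0) + demand
    return sorted(t for t, u in usage.items() if u > capacity)

def repair(schedule, capacity, max_iters=100):
    """Selectively resolve conflicts by shifting one activity at a time."""
    schedule = list(schedule)
    for _ in range(max_iters):
        bad = conflicts(schedule, capacity)
        if not bad:
            return schedule
        t = bad[0]
        # pick the latest-starting activity overlapping the first conflict
        idx = max((i for i, (_, s, d, _dm) in enumerate(schedule)
                   if s <= t < s + d), key=lambda i: schedule[i][1])
        name, s, d, dm = schedule[idx]
        schedule[idx] = (name, s + 1, d, dm)   # repair move: delay one step
    return schedule

acts = [("downlink", 0, 3, 2), ("imaging", 1, 2, 2)]  # capacity 3: conflict
fixed = repair(acts, capacity=3)
print(conflicts(fixed, capacity=3))   # [] once repaired
```

A real repair-based planner searches over many repair moves (shifting, reordering, deleting goals) and scores them heuristically rather than always delaying.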


Real-Time Technology and Applications Symposium | 1997

A real-time execution performance agent interface to parametrically controlled in-kernel pipelines

Sam Siewert; Gary J. Nutt; Marty Humphrey

This paper presents work-in-progress to build a confidence-based in-kernel pipeline execution performance interface to a fixed-priority deadline-monotonic scheduler. The interface provides performance-controlled pipeline execution, allowing applications to specify expected execution times, to negotiate desired deadline confidence, and to configure and control pipelines. The confidence-based scheduling interface and in-kernel pipeline are being evaluated on an unoccupied air vehicle incorporating digital control, continuous media, and event-driven pipelines.
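One simplified way to picture confidence-based admission: each pipeline supplies execution-time samples and a desired deadline confidence; a quantile of the samples becomes the budgeted execution time, and a classic Liu-Layland utilization bound stands in here for full deadline-monotonic response-time analysis. The names, numbers, and the admission test itself are illustrative, not the paper's interface.

```python
# Sketch: admit pipelines using confidence-quantile execution budgets.
import math

def budget(samples, confidence):
    """Execution-time budget: the `confidence` quantile of observed times."""
    s = sorted(samples)
    idx = max(0, math.ceil(confidence * len(s)) - 1)
    return s[idx]

def admit(tasks):
    """tasks: list of (samples, confidence, period). Utilization test."""
    n = len(tasks)
    bound = n * (2 ** (1 / n) - 1)          # Liu-Layland bound for n tasks
    util = sum(budget(sm, c) / T for sm, c, T in tasks)
    return util <= bound

video = ([8, 9, 10, 12, 30], 0.80, 33)      # ms samples, 80% confidence
ctrl  = ([1, 1, 2, 2, 2],    0.99, 10)      # control loop wants 99%
print(admit([video, ctrl]))                 # -> True
```

Note how the 80%-confidence budget for the video pipeline ignores the rare 30 ms outlier, which is the point of negotiating confidence rather than reserving worst case.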


2005 IEEE Region 5 and IEEE Denver Section Technical, Professional and Student Development Workshop | 2005

IO latency hiding in pipelined architectures

Sam Siewert

This paper reports on the development of a novel mathematical formalism for analyzing data pipelines. The method accounts for IO and CPU latencies in the stages of the data pipeline. An experimental pipeline was constructed using a video encoder, frame processing, and transport of the frames over an IP (Internet Protocol) network. The pipelined architecture provides a method to overlap processing with DMA, encoding, and network transport latency so that streams can be processed with optimal scalability. The model expectations were compared with experimental test results and found to be consistent. The model is therefore expected to provide a good estimate for the scalability of streaming video-on-demand systems. Video-on-demand is a rapidly growing service segment for entertainment, advertising, on-line education, and a myriad of emergent applications.
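The core of the latency-hiding argument can be sketched numerically: without overlap, every frame pays the full sum of stage latencies; with capture DMA, encoding, and transport overlapped, steady-state time per frame approaches the slowest stage alone. The stage times below are made-up examples, not measurements from the paper.

```python
# Back-of-the-envelope model of IO latency hiding in a data pipeline.

def unpipelined_time(stage_ms, frames):
    """Every frame pays the full sum of stage latencies."""
    return frames * sum(stage_ms)

def pipelined_time(stage_ms, frames):
    """Fill the pipeline once, then one frame per slowest-stage period."""
    return sum(stage_ms) + (frames - 1) * max(stage_ms)

stages = [10, 25, 15]          # ms: DMA capture, encode, IP transport
n = 100
serial = unpipelined_time(stages, n)    # 100 * 50 = 5000 ms
overlap = pipelined_time(stages, n)     # 50 + 99 * 25 = 2525 ms
print(round(serial / overlap, 2))       # speedup from overlapping stages
```

As the frame count grows, the speedup approaches sum(stages) / max(stages), which is why balancing stage latencies matters for video-on-demand scalability.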


Proceedings of SPIE | 2014

Low-cost, high-performance and efficiency computational photometer design

Sam Siewert; Jeries Shihadeh; Randall Myers; Jay Khandhar; Vitaly Ivanov

Researchers at the University of Alaska Anchorage and University of Colorado Boulder have built a low-cost, high-performance and efficient drop-in-place Computational Photometer (CP) to test in field applications ranging from port security and safety monitoring to environmental compliance monitoring and surveying. The CP integrates off-the-shelf visible-spectrum cameras with near- to long-wavelength infrared detectors and high-resolution digital snapshots in a single device. The proof of concept combines three or more detectors into a single multichannel imaging system that can time-correlate read-out, capture, and image-process all of the channels concurrently with high performance and energy efficiency. The dual-channel continuous read-out is combined with a third high-definition digital snapshot capability and has been designed using an FPGA (Field Programmable Gate Array) to capture, decimate, down-convert, re-encode, and transform images from two standard-definition CCD (Charge Coupled Device) cameras at 30 Hz. The continuous stereo vision can be time-correlated to megapixel high-definition snapshots. This proof of concept has been fabricated as a four-layer PCB (Printed Circuit Board) suitable for use in education and research for low-cost, high-efficiency field monitoring applications that need multispectral and three-dimensional imaging capabilities. Initial testing is in progress and includes field testing in ports, potential test flights in unmanned aerial systems, and future planned missions to image harsh environments in the Arctic including volcanic plumes, ice formation, and Arctic marine life.


2005 IEEE Region 5 and IEEE Denver Section Technical, Professional and Student Development Workshop | 2005

An embedded real-time autonomic architecture

Sam Siewert; Z. Pfeffer

Autonomic computing is a set of new architectural goals envisioned by IBM and inspired by the human autonomic system. Autonomic architecture is intended to avoid a management crisis that looms based upon the success of Moore's law. If we continue to increase storage, memory, processing and I/O resources at present rates and manage them the way we have, IBM projects a system administration crisis. The proposed autonomic architecture has four goals for systems: self-configuring, self-healing, self-optimizing, and self-protecting. In this paper, we examine how autonomic architecture goals apply to real-time embedded systems rather than the enterprise systems that IBM has focused upon.


Proceedings of SPIE | 2016

Software defined multi-spectral imaging for Arctic sensor networks

Sam Siewert; Vivek Angoth; Ramnarayan Krishnamurthy; Karthikeyan Mani; Kenrick J. Mock; Surjith B. Singh; Saurav Srivistava; Chris Wagner; Ryan Claus; Matthew Demi Vis

Availability of off-the-shelf infrared sensors combined with high definition visible cameras has made possible the construction of a Software Defined Multi-Spectral Imager (SDMSI) combining long-wave, near-infrared and visible imaging. The SDMSI requires a real-time embedded processor to fuse images and to create real-time depth maps for opportunistic uplink in sensor networks. Researchers at Embry-Riddle Aeronautical University working with the University of Alaska Anchorage at the Arctic Domain Awareness Center and the University of Colorado Boulder have built several versions of a low-cost drop-in-place SDMSI to test alternatives for power-efficient image fusion. The SDMSI is intended for use in field applications including marine security, search and rescue operations and environmental surveys in the Arctic region. Based on Arctic marine sensor network mission goals, the team has designed the SDMSI to include features to rank images based on saliency and to provide on-camera fusion and depth mapping. A major challenge has been the design of the camera computing system to operate within a 10 to 20 Watt power budget. This paper presents a power analysis of three options: 1) multi-core, 2) field programmable gate array with multi-core, and 3) graphics processing units with multi-core. For each test, power consumed for common fusion workloads has been measured at a range of frame rates and resolutions. Detailed analyses from our power efficiency comparison for workloads specific to stereo depth mapping and sensor fusion are summarized. Preliminary mission feasibility results from testing with off-the-shelf long-wave infrared and visible cameras in Alaska and Arizona are also summarized to demonstrate the value of the SDMSI for applications such as ice tracking, ocean color, soil moisture, animal and marine vessel detection and tracking.
The goal is to select the most power efficient solution for the SDMSI for use on UAVs (Unoccupied Aerial Vehicles) and other drop-in-place installations in the Arctic. The prototype selected will be field tested in Alaska in the summer of 2016.
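One common footing for comparing the three computing options under a fixed power budget is frames processed per joule at a given fusion workload. The power and frame-rate numbers below are placeholders for illustration only, not the paper's measured data.

```python
# Illustrative energy-efficiency comparison for a fusion workload.

def frames_per_joule(fps, watts):
    """Efficiency of a workload: (frames/s) / (J/s) = frames per joule."""
    return fps / watts

options = {
    "multi-core":        frames_per_joule(fps=24, watts=15.0),
    "FPGA + multi-core": frames_per_joule(fps=30, watts=12.0),
    "GPU + multi-core":  frames_per_joule(fps=60, watts=20.0),
}
best = max(options, key=options.get)
print(best, round(options[best], 2))
```

A raw watts comparison can mislead when the options run at different frame rates, which is why a per-frame energy metric is useful within a 10 to 20 Watt budget.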


AIAA Science and Technology Conference | 2014

Verification of Video Frame Latency Telemetry for UAV Systems Using a Secondary Optical Method

Sam Siewert; Muhammad Ahmad; Trellis-Logic LLC; Kevin Yao

This paper presents preliminary work and a prototype computer vision optical method for latency measurement for a UAS (Uninhabited Aerial System) digital video capture, encode, transport, decode, and presentation subsystem. Challenges in this type of latency measurement include a no-touch policy for the camera and encoder as well as the decoder and player, because the methods developed must not interfere with the system under test. The goal is to measure the true latency of displayed frames compared to observed scenes (and targets in those scenes) and to provide an indication of latency to operators that can be verified and compared to true optical latency from scene to display. Latency measurement using this optical computer vision method was prototyped using both flight-side cameras and H.264 encoding using off-the-shelf equivalent equipment to the actual UAS, and off-the-shelf ground systems running the Linux operating system and employing a Graphics Processor Unit to accelerate video decode. The key transport latency indicator to be verified on the real UAS is the KLV (Key Length Value) time-stamp, which is an air-to-ground transport latency that measures transmission time between the UAS encoder elementary video stream encapsulation and transmission interface and the ground receiver and ground network analyzer interface. The KLV time-stamp is GPS (Global Positioning System) synchronized and employs serial or UDP (User Datagram Protocol) injection of that GPS clock time into the H.264 transport stream at the encoder, prior to transport over an RF (Radio Frequency) or laboratory RF-emulated transmission path on coaxial cable. The hypothesis of this testing is that the majority of capture-to-display latency comes from transport, due to satellite relay as well as lower-latency line-of-sight transmission.
The encoder likewise must set PTS/DTS (Presentation Time Stamp / Decode Time Stamp) to estimate bandwidth-delay in transmission, and in some cases may either over- or underestimate this time, resulting in undue added display latency or, in the latter case, frame drop-out. Preliminary analysis using a typical off-the-shelf encoder showed that a majority of observed frame latency is not due to path latency, but rather due to encoder PTS/DTS settings that are overly pessimistic. The method and preliminary results will be presented along with concepts for future work to better tune PTS/DTS in UAS H.264 video transport streams.
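The GPS-synchronized KLV timestamp check described above reduces to a simple computation: the encoder injects its GPS clock time into the stream, and the ground receiver subtracts that from its own GPS-locked clock on arrival. The field layout below is a toy illustration; real KLV metadata uses SMPTE/MISB-defined keys and encodings.

```python
# Toy sketch of KLV-style air-to-ground transport latency measurement.
import struct

def encode_klv_ts(gps_time_us):
    """Pack an 8-byte big-endian microsecond timestamp (toy KLV value)."""
    return struct.pack(">Q", gps_time_us)

def transport_latency_ms(klv_value, receive_time_us):
    """Latency = ground receive time minus encoder injection time."""
    sent_us = struct.unpack(">Q", klv_value)[0]
    return (receive_time_us - sent_us) / 1000.0

sent = 1_700_000_000_000_000            # encoder GPS clock, microseconds
pkt = encode_klv_ts(sent)
latency = transport_latency_ms(pkt, sent + 250_000)   # arrives 250 ms later
print(latency)   # 250.0
```

Because both clocks are GPS-disciplined, this difference isolates transport latency; capture, encode, decode, and display delays must be measured separately, which is the role of the optical method.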


IEEE Aerospace Conference | 2001

Experiments with a real-time multi-pipeline architecture for shared control

Sam Siewert

This paper summarizes results from both the hard real-time RACE optical navigation experiment and the soft real-time DATA-CHASER Shuttle demonstration project and presents an integrated architecture for both hard and soft real-time shared control. The results show significant performance advantages of the shared-control architecture and greatly simplified implementation using the derived framework. Lessons learned from both experiments and the implementation of this evolving architecture are presented along with plans for future work to make the framework a standardized kernel module available for VxWorks, Solaris, and Linux.

Collaboration


Dive into Sam Siewert's collaboration.

Top Co-Authors


Gary J. Nutt

University of Colorado Boulder


G. Rivlis

University of Arizona


Jeffrey W. Beeman

Lawrence Berkeley National Laboratory


Karl D. Gordon

Space Telescope Science Institute


Kenrick J. Mock

University of Alaska Anchorage
