Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Tom Wypych is active.

Publication


Featured research published by Tom Wypych.


IEEE Computer | 2011

Dealing with Archaeology's Data Avalanche

Vid Petrovic; Aaron Gidding; Tom Wypych; Falko Kuester; Thomas A. DeFanti; Thomas E. Levy

The increasing availability and relatively low cost of digital data collection technologies have created a data avalanche for archaeologists. In this paper, we discuss a system that integrates geographic information system (GIS)-based artifact and material sample data sets with massive point clouds within an interactive visual analysis environment. Our system lets researchers revisit archaeological sites virtually, with the entirety of the captured record accessible for exploration.
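
As a rough illustration of the GIS-to-point-cloud linkage the abstract describes, the sketch below indexes scanned points in a uniform grid and looks up the cloud region around an artifact record. The record schema, class names, and cell size are assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class ArtifactRecord:
    """A GIS-style artifact/sample record (hypothetical schema)."""
    artifact_id: str
    easting: float    # site-local coordinates, metres
    northing: float
    elevation: float
    material: str

class GridIndex:
    """Uniform 2D grid over site coordinates, mapping cells to point indices.

    Stands in for the spatial association between GIS records and the
    massive point cloud described in the paper.
    """
    def __init__(self, cell_size=0.5):
        self.cell_size = cell_size
        self.cells = defaultdict(list)

    def _key(self, x, y):
        return (int(x // self.cell_size), int(y // self.cell_size))

    def insert_point(self, point_idx, x, y):
        self.cells[self._key(x, y)].append(point_idx)

    def points_near(self, record, radius=1.0):
        """Return point indices in cells within `radius` of an artifact record."""
        r = int(radius // self.cell_size) + 1
        cx, cy = self._key(record.easting, record.northing)
        hits = []
        for dx in range(-r, r + 1):
            for dy in range(-r, r + 1):
                hits.extend(self.cells.get((cx + dx, cy + dy), []))
        return hits

# Usage: index a few scanned points, then look up the cloud region around a find.
index = GridIndex(cell_size=0.5)
for i, (x, y) in enumerate([(10.1, 4.2), (10.4, 4.4), (55.0, 17.3)]):
    index.insert_point(i, x, y)
find = ArtifactRecord("EDM-0042", 10.2, 4.3, 312.5, "ceramic")
print(index.points_near(find, radius=1.0))   # -> [0, 1]
```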


Future Generation Computer Systems | 2011

CGLXTouch: A multi-user multi-touch approach for ultra-high-resolution collaborative workspaces

Kevin Ponto; Kai Doerr; Tom Wypych; John Kooker; Falko Kuester

This paper presents an approach for empowering collaborative workspaces through ultra-high-resolution tiled display environments concurrently interfaced with multiple multi-touch devices. Multi-touch table devices are supported along with portable multi-touch tablet and phone devices, which can be added to and removed from the system on the fly. Events from these devices are tagged with a device identifier and are synchronized with the distributed display environment, enabling multi-user support. As many portable devices are not equipped to render content directly, a remotely rendered scene is streamed to them instead. The presented approach scales to large numbers of devices, providing access to a multitude of hands-on techniques for collaborative data analysis.
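
A minimal sketch of the device-tagged event handling the abstract describes, assuming a hypothetical fixed-size wire format; the field layout and wall dimensions are illustrative, not the CGLX protocol.

```python
import struct
import time
from dataclasses import dataclass

# Hypothetical fixed-size wire format: device id, touch id, phase,
# normalized x/y, timestamp. Not the actual CGLX protocol.
EVENT_FMT = "!HHBffd"   # network byte order

@dataclass
class TouchEvent:
    device_id: int   # tags which table/tablet/phone generated the event
    touch_id: int    # finger/contact identifier on that device
    phase: int       # 0=down, 1=move, 2=up
    x: float         # normalized [0,1] within the shared workspace
    y: float
    timestamp: float

    def pack(self) -> bytes:
        return struct.pack(EVENT_FMT, self.device_id, self.touch_id,
                           self.phase, self.x, self.y, self.timestamp)

    @classmethod
    def unpack(cls, data: bytes) -> "TouchEvent":
        return cls(*struct.unpack(EVENT_FMT, data))

def to_wall_pixels(ev: TouchEvent, wall_w=15360, wall_h=4320):
    """Map a normalized event into the (assumed) wall's pixel space
    on a receiving display node."""
    return int(ev.x * wall_w), int(ev.y * wall_h)

ev = TouchEvent(device_id=3, touch_id=0, phase=0, x=0.25, y=0.5,
                timestamp=time.time())
wire = ev.pack()                                 # broadcast to display nodes
print(to_wall_pixels(TouchEvent.unpack(wire)))   # -> (3840, 2160)
```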


IEEE Aerospace Conference | 2012

AirGSM: An unmanned, flying GSM cellular base station for flexible field communications

Tom Wypych; Radley Angelo; Falko Kuester

We present a functional implementation of a lightweight GSM (Global System for Mobile Communications) cellular base station and core network based on open-source software, coupled with a simple and rapidly deployable autonomous aerial vehicle for establishing field communications in the absence of commercial service. We advocate the utility of mobile GSM cellular devices for communication and data acquisition in many types of fieldwork, as they offer advantages in functionality over conventional long-range push-to-talk radios and advantages in size over laptop-type data terminals. We argue that alternative radio communications technologies inevitably fail to simultaneously optimize cost, power management, range, integration, and spectral efficiency compared to the GSM radio interface.


Future Generation Computer Systems | 2016

Low bandwidth desktop and video streaming for collaborative tiled display environments

Jason Kimball; Tom Wypych; Falko Kuester

High-resolution display environments built on networked, multi-tile displays have emerged as an enabling tool for collaborative, distributed visualization work. They provide a means to present, compare, and correlate data in a broad range of formats and from a multitude of different sources. Visualization of these distributed data resources may be achieved with clustered processing and display resources for local rendering, or content may be rendered remotely and streamed on demand in real time. The latter is particularly important when multiple users want to concurrently share content from their personal devices to further augment the shared workspace. This paper presents a high-quality video streaming technique that allows remotely generated content to be acquired and streamed to multi-tile display environments from a range of sources and over a heterogeneous wide area network. The presented technique uses video compression to reduce the entropy, and therefore the required bandwidth, of the video stream. Compressed video delivery poses a series of challenges for display on tiled video walls, which are addressed in this paper: delivery to the display wall from a variety of devices and localities with synchronized playback, seamless mobility as users move and resize the video streams across the tiled display wall, and the low-latency video encoding, decoding, and display necessary for interactive applications. The presented technique is able to deliver 1080p-resolution, multimedia-rich content with bandwidth requirements below 10 Mbps and latency low enough for constant interactivity. A case study is provided, comparing uncompressed and compressed streaming techniques, with performance evaluations for bandwidth use, total latency, maximum frame rate, and visual quality.

Highlights:
- H.264 video streaming of desktop content to high-resolution tiled display systems.
- Low-bandwidth, low-latency streaming outperforms existing raw RGB techniques.
- Enables streaming of full-HD-resolution desktop content from wireless laptops.
- Removes the dependence on 10 Gbps networks in collaborative tiled display systems.
- Demonstration system delivers 1080p30 desktop content under 10 Mbps with 100 ms latency.
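
A back-of-the-envelope comparison of the figures reported above, assuming 24-bit RGB for the uncompressed baseline and ignoring protocol overhead:

```python
# Rough bandwidth comparison for a single 1080p30 desktop stream,
# assuming 24-bit RGB for the uncompressed case (no protocol overhead).
width, height, fps = 1920, 1080, 30
bits_per_pixel = 24

raw_bps = width * height * bits_per_pixel * fps
print(f"uncompressed RGB: {raw_bps / 1e6:.0f} Mbps")     # ~1493 Mbps

h264_bps = 10e6   # upper bound reported in the paper for 1080p30 content
print(f"H.264 stream:     {h264_bps / 1e6:.0f} Mbps")    # 10 Mbps
print(f"reduction:        ~{raw_bps / h264_bps:.0f}x")   # ~149x
```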


International Symposium on Multimedia | 2009

VideoBlaster: A Distributed, Low-Network Bandwidth Method for Multimedia Playback on Tiled Display Systems

Kevin Ponto; Tom Wypych; Kai Doerr; So Yamaoka; Jason Kimball; Falko Kuester

Tiled display environments present opportune workspaces for displaying multimedia content. Displaying this kind of audio/visual information often requires a substantial amount of network bandwidth. In this paper we present a method for distributing the decoding over the entire display environment, allowing for substantially less network overhead. This method also allows for interactive configuration of the visual workspace. The method described scales independently of video frame size and produces lower latency compared to streaming approaches.
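
A sketch of the per-tile cropping such a distribution scheme implies: every node receives the same compressed stream, decodes it, and displays only the region overlapping its tile, so network load is independent of frame size and tile count. The geometry helper below is illustrative, not the VideoBlaster code.

```python
from dataclasses import dataclass

@dataclass
class Tile:
    """One display tile's placement in the wall's pixel space (illustrative)."""
    x: int
    y: int
    w: int
    h: int

def crop_region(tile: Tile, video_rect, frame_w, frame_h):
    """Given where the video is placed on the wall (video_rect = x, y, w, h),
    return the sub-rectangle of the decoded frame this tile must display,
    or None if the video does not overlap this tile."""
    vx, vy, vw, vh = video_rect
    # Overlap between the tile and the video placement, in wall pixels.
    left   = max(tile.x, vx)
    top    = max(tile.y, vy)
    right  = min(tile.x + tile.w, vx + vw)
    bottom = min(tile.y + tile.h, vy + vh)
    if right <= left or bottom <= top:
        return None
    # Map the overlap back into source-frame coordinates.
    sx = (left - vx) * frame_w // vw
    sy = (top - vy) * frame_h // vh
    sw = (right - left) * frame_w // vw
    sh = (bottom - top) * frame_h // vh
    return (sx, sy, sw, sh)

# A 2x1 wall of 1920x1080 tiles showing a 1080p video scaled across both tiles.
tiles = [Tile(0, 0, 1920, 1080), Tile(1920, 0, 1920, 1080)]
video_rect = (0, 0, 3840, 1080)
for t in tiles:
    print(crop_region(t, video_rect, 1920, 1080))
# -> (0, 0, 960, 1080) and (960, 0, 960, 1080): each node decodes the full
#    stream but samples only its half, regardless of how many tiles exist.
```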


IEEE Aerospace Conference | 2011

System for inspection of large high-resolution radiography datasets

Tom Wypych; So Yamaoka; Kevin Ponto; Falko Kuester

High-resolution image collections pose unique challenges to analysts tasked with managing the associated data assets and deriving new information from them. While significant progress has been made towards rapid automated filtering, alignment, segmentation, characterization, and feature identification from image collections, the extraction of new insights still strongly depends on human intervention. In real-time capture and immediate-mode analysis environments, where image data has to be continuously and interactively processed, a broad set of challenges in the image-driven verification and analysis cycle has to be addressed. A framework for interactive and intuitive inspection of large, high-resolution image data sets is presented, leveraging the strength of the human visual system for large-scale image processing. A case study is provided for an X-ray radiography system, covering the scanner-to-screen data management and representation pipeline, resulting in a visual analytics environment that enables analytical reasoning by means of interactive and intuitive visualization.
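
A minimal sketch of the multi-resolution tile access a scanner-to-screen pipeline like this typically relies on: choose a pyramid level for the current zoom, then enumerate the tiles intersecting the viewport. The 512-pixel tile size and level scheme are assumptions, not the system's actual parameters.

```python
import math

TILE = 512  # tile edge length in pixels at every pyramid level (assumption)

def choose_level(image_w, screen_w, view_w):
    """Pick the coarsest pyramid level that still fills the screen at the
    current zoom. Level 0 is full resolution; each level halves both axes."""
    scale = view_w / screen_w                 # image pixels per screen pixel
    level = max(0, int(math.floor(math.log2(max(scale, 1.0)))))
    return min(level, int(math.log2(max(image_w // TILE, 1))))

def visible_tiles(view_x, view_y, view_w, view_h, level):
    """Tile indices (at `level`) intersecting a viewport given in
    full-resolution image coordinates."""
    step = TILE << level                      # full-res pixels covered per tile
    x0, y0 = view_x // step, view_y // step
    x1 = (view_x + view_w - 1) // step
    y1 = (view_y + view_h - 1) // step
    return [(level, tx, ty) for ty in range(y0, y1 + 1)
                            for tx in range(x0, x1 + 1)]

# Inspecting a 65536 x 65536 radiograph on a 3840-pixel-wide display,
# zoomed out to a 30720-pixel-wide region of the scan:
lvl = choose_level(65536, 3840, 30720)
print(lvl, len(visible_tiles(0, 0, 30720, 30720, lvl)))  # -> 3 64
```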


IEEE Aerospace Conference | 2012

Media-rich streaming for remote simulation and training

Jason Kimball; Stefanie Hoepner; Tom Wypych; Falko Kuester

We present a new approach for the real-time control and operation of media-rich workstations and simulations over remote networked computers. This approach allows a user to interact with a remote computer using a standard keyboard, mouse, and joystick at a high level of interactivity while using significantly less bandwidth than existing remote desktop applications. Existing remote desktop solutions that use no compression or single-frame image compression are not able to update full-screen animation and video at an interactive rate over bandwidth-constrained networks. The use of video compression allows high frame update rates with low bandwidth usage, enabling a wide range of new media-rich telecommuting applications such as 3D graphical simulation and training.

Simulation and training operations are often performed at dedicated facilities. There can be many reasons restricting these operations to a specific site, such as the use of expensive computers and the need for security to protect sensitive data used in the simulations. These geographical restrictions can have significant ramifications. Requiring training to be performed on site can both restrict the availability of such training and lead to high transportation and time costs for users. By developing tools which allow remote operation of simulations, a broader deployment of training can be achieved, allowing more users to train at a lower cost. Using video compression to stream live desktop visuals can leverage the computational facilities at simulation sites while requiring significantly less powerful computers to receive and display the video streams to end users. Furthermore, simulation data and other sensitive information can be stored on secure computers while only the visual output of the simulation is sent to the remote user.

In this paper we present an implementation intended for the training of control console operators which allows remote visualization of and interaction with a simulation at HD resolution over a 1.5 Mbps T-1 data line. This allows users to train while on active deployment by using a dummy console which is able to receive and display the video stream while sending user input events back to the simulation computer. The dummy console costs significantly less than an actual simulation console and can be deployed virtually anywhere in the world with a satisfactory internet connection. Furthermore, we demonstrate the option of using virtual and augmented reality mixed with the streamed simulation content. This allows for training when console workspaces are either unavailable or cannot be taken offline from their active use. We evaluate the performance of this system on a variety of mock-simulation applications and measure that the latency of the video streaming is less than 150 ms, which includes video encoding on the host computer and decoding and display on the user's computer. The network transportation time for long-distance network communication adds additional latency. Including the time for user input to be sent, we demonstrate that the total end-to-end latency for a cross-country simulation is 250 ms. With this performance, interactive remote training and simulation becomes possible.
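
A worked latency budget using the figures reported above; the 150 ms encode/decode/display figure and the 250 ms cross-country total come from the abstract, while the split between stages is an illustrative assumption.

```python
# End-to-end latency budget for the cross-country scenario described above.
# The 150 ms capture/encode/decode/display figure and the 250 ms total are
# taken from the abstract; the split between stages is an assumption.
budget_ms = {
    "capture + H.264 encode (host)": 90,      # assumed share of the 150 ms
    "decode + display (dummy console)": 60,   # assumed share of the 150 ms
    "network transport + input events (cross-country round trip)": 100,
}
total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:<62s}{ms:>5d} ms")
print(f"{'total end-to-end':<62s}{total:>5d} ms")   # -> 250 ms
```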


IEEE Aerospace Conference | 2014

Airborne imaging for cultural heritage

Tom Wypych; James Strawson; Vid Petrovic; Radley Angelo; Aliya Hoff; Matt Howland; Maurizio Seracini; Thomas E. Levy; Falko Kuester

We present our work in designing and deploying airborne sensor vehicles specifically for cultural heritage applications. Numerous practical cultural heritage missions in survey, assessment, and conservation work can benefit from specializing commodity and customizable airborne platforms to collect visual and non-visual data. These systems, and the customizations therein, have undergone several generations of development, both in our own designs and in the research community at large. We discuss the historical application of airborne imaging to cultural heritage conservation and surveying, as well as the design evolution towards multi-rotor systems from conventional rotary-wing and fixed-wing systems. This discussion addresses the fundamental principles of operation, as well as the capabilities, contemporary methods, and commodity components available for the implementation of such a system. We present our current system and its features, in concert with example payloads useful for conducting these practical reconnaissance missions, along with useful post-processing techniques and future work in applied visualization.


IEEE Aerospace Conference | 2014

Exploration with live stereoscopic 3D video in mixed reality environments

Jason Kimball; Tom Wypych; Falko Kuester

This paper describes an integrated system for the real-time acquisition, streaming, and display of stereoscopic 3D video in mixed reality environments. These mixed reality environments combine real-time stereoscopic video with rendered 3D environments composed of sampled and modeled 3D data from actual landscapes or structures. While live video itself is an important part of exploration, reconnaissance, and documentation, the ability to overlay this video on existing models can significantly add to the context for analysis and decision making. We describe the components of a high-resolution stereoscopic video streaming system integrated with a virtual reality visualization system viewed on high-resolution stereoscopic 3D display walls. The 3D display wall provides a field of view for visualization which can exceed the field of view of the human visual system, allowing for a more immersive and natural experience and greatly extending the visual canvas provided by a stereoscopic video stream. We demonstrate a live stereoscopic video feed streamed to a visualization wall and mixed in real time with a virtual model of that location, and present several usage scenarios exploring how this new visualization technique would be beneficial.
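
A sketch of the per-eye compositing step such a system needs before the mixed frames reach the 3D wall, written with NumPy; the frame shapes, fixed screen-space placement, and simple alpha blend are simplifying assumptions rather than the authors' rendering path.

```python
import numpy as np

def composite_stereo(bg_left, bg_right, video_left, video_right, origin, alpha=1.0):
    """Overlay a live stereo video pair onto rendered per-eye backgrounds.

    bg_* and video_* are HxWx3 uint8 frames; `origin` is the (row, col) of the
    video quad within each eye's background. A real system would place the
    quad in 3D per eye; a fixed screen-space origin is a simplification.
    """
    out_l, out_r = bg_left.copy(), bg_right.copy()
    r, c = origin
    h, w = video_left.shape[:2]
    for out, vid in ((out_l, video_left), (out_r, video_right)):
        region = out[r:r + h, c:c + w].astype(np.float32)
        out[r:r + h, c:c + w] = (alpha * vid + (1 - alpha) * region).astype(np.uint8)
    return out_l, out_r

# Tiny example: 1080p-per-eye backgrounds with a 480x270 video inset per eye.
bg_l = np.zeros((1080, 1920, 3), np.uint8)
bg_r = np.zeros((1080, 1920, 3), np.uint8)
vid_l = np.full((270, 480, 3), 200, np.uint8)
vid_r = np.full((270, 480, 3), 180, np.uint8)
left, right = composite_stereo(bg_l, bg_r, vid_l, vid_r, origin=(405, 720), alpha=0.8)
print(left[500, 900], right[500, 900])   # inset region now carries the video
```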


IEEE Aerospace Conference | 2013

System for interactive management of aerial imaging campaigns

Tom Wypych; Falko Kuester

We present a system to enable real-time management of interchangeable imaging platforms aboard commodity unmanned aerial vehicles (UAVs) to improve interactivity during aerial imaging campaigns. We argue that this improvement in interactivity enables powerful immediate-mode inspection by the ground operator and provides a more intuitive, flexible, and ultimately more useful control interface to aerial imaging systems.
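
As an illustration of the ground-to-payload retasking such a control interface implies, the sketch below defines a hypothetical command message and round-trips it through a wire encoding; the schema and field names are invented for illustration, not the system's actual protocol.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class PayloadCommand:
    """Ground-operator command to an interchangeable imaging payload
    (illustrative schema, not the system's actual protocol)."""
    seq: int        # sequence number for acknowledgement/ordering
    payload: str    # which imaging payload the command targets
    action: str     # e.g. "capture", "set_interval", "point"
    params: dict    # action-specific parameters
    issued_at: float

    def encode(self) -> bytes:
        return json.dumps(asdict(self)).encode("utf-8")

    @staticmethod
    def decode(data: bytes) -> "PayloadCommand":
        return PayloadCommand(**json.loads(data.decode("utf-8")))

# The ground station retasks the camera mid-flight without landing the UAV:
cmd = PayloadCommand(seq=17, payload="rgb_camera", action="set_interval",
                     params={"seconds": 2.0}, issued_at=time.time())
wire = cmd.encode()          # would be sent over the telemetry radio link
print(PayloadCommand.decode(wire).params)   # -> {'seconds': 2.0}
```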

Collaboration


Dive into Tom Wypych's collaborations.

Top Co-Authors

Falko Kuester, University of California
Vid Petrovic, University of California
Jason Kimball, University of California
Thomas E. Levy, University of California
James Strawson, University of California
Kevin Ponto, University of Wisconsin-Madison
Aliya Hoff, University of California
Kai Doerr, University of California
Radley Angelo, University of California