Publication


Featured research published by Ulrich von Zadow.


interactive tabletops and surfaces | 2014

SleeD: Using a Sleeve Display to Interact with Touch-sensitive Display Walls

Ulrich von Zadow; Wolfgang Büschel; Ricardo Langner; Raimund Dachselt

We present SleeD, a touch-sensitive Sleeve Display that facilitates interaction with multi-touch display walls. Large vertical displays allow multiple users to interact effectively with complex data but are inherently public. Also, they generally cannot present an interface adapted to the individual user. The combination with an arm-mounted, interactive display allows complex personalized interactions. In contrast to hand-held devices, both hands remain free for interacting with the wall. We discuss different levels of coupling between wearable and wall and propose novel user interface techniques that support user-specific interfaces, data transfer, and arbitrary personal views. In an iterative development process, we built a mock-up using a bendable e-Ink display and a fully functional prototype based on an arm-mounted smartphone. In addition, we developed several applications that showcase the techniques presented. An observational study we conducted demonstrates the high potential of our concepts.


human factors in computing systems | 2013

SimMed: combining simulation and interactive tabletops for medical education

Ulrich von Zadow; Sandra Buron; Tina Harms; Florian Behringer; Kai Sostmann; Raimund Dachselt

A large body of work asserts that interactive tabletops are well suited for group work, and numerous studies have examined these devices in educational contexts. However, few of the described systems support simulations for collaborative learning, and none of them explicitly address immersion. We present SimMed, a system allowing medical students to collaboratively diagnose and treat a virtual patient using an interactive tabletop. The hybrid user interface combines elements of virtual reality with multitouch input. The paper delineates the development process of the system and the rationale behind a range of interface design decisions. In this context, we discuss the role of realism in gaining procedural knowledge, in particular the interplay between realism, immersion, and training goals. We implemented several medical test cases and evaluated our approach with a user study that suggests the system's great potential. Results show a high level of immersion, cooperation, and engagement among the students.


interactive tabletops and surfaces | 2010

GlobalData: multi-user interaction with geographic information systems on interactive surfaces

Ulrich von Zadow; Florian Daiber; Johannes Schöning; Antonio Krüger

The geographical domain has often served as a showcase for the possibilities of multi-touch interaction. Nonetheless, researchers have rarely investigated multi-user interaction with GIS; in fact, most geographical tabletop applications are not suited to multi-user interaction. Our multitouch application, GlobalData, allows multiple people to interact and collaborate in examining global, geolocated data. In idle mode, the device simply shows a stylized map of the earth. Users can open circular GeoLenses: these circles show the same map segment as the underlying base map and superimpose different data layers on it.
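The abstract does not describe how GeoLenses are implemented; purely as an illustration, the core idea of circular lenses overlaying data layers on a shared base map could be sketched as follows (all names and parameters are hypothetical, not taken from GlobalData):

```python
from dataclasses import dataclass
import math

@dataclass
class GeoLens:
    # A circular lens on the tabletop: center and radius in screen
    # coordinates, plus the name of the data layer it superimposes.
    cx: float
    cy: float
    radius: float
    layer: str

def layer_at(x, y, lenses, base_layer="base_map"):
    """Return the data layer to render at screen position (x, y).
    The topmost lens containing the point wins; positions outside
    every lens fall back to the shared base map."""
    for lens in reversed(lenses):  # last-opened lens is drawn on top
        if math.hypot(x - lens.cx, y - lens.cy) <= lens.radius:
            return lens.layer
    return base_layer
```

Because each lens shows the same map segment as the base map underneath, hit-testing like this is all that is needed to decide which layer a given pixel or touch belongs to.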


interactive tabletops and surfaces | 2010

Medical education on an interactive surface

Maria Kaschny; Sandra Buron; Ulrich von Zadow; Kai Sostmann

We present initial results from SimMed, an ongoing interdisciplinary project on the use of interactive tables in medical education. The project is motivated by the need to combine theoretical knowledge with practice in medical education and by the time-consuming task of finding appropriate patients for teaching. Medical students can interact realistically with a virtual patient displayed on the interactive table to diagnose and cure illnesses. The project is still under development.


advanced visual interfaces | 2016

YouTouch! Low-Cost User Identification at an Interactive Display Wall

Ulrich von Zadow; Patrick Reipschläger; Daniel Bösel; Anita Sellent; Raimund Dachselt

We present YouTouch!, a system that tracks users in front of an interactive display wall and associates touches with users. With their large size, display walls are inherently suitable for multi-user interaction. However, current touch recognition technology does not distinguish between users, making it hard to provide personalized user interfaces or access to private data. In our system we place a commodity RGB + depth camera in front of the wall, allowing us to track users and correlate them with touch events. While the camera's driver is able to track people, it loses the user's ID whenever she is occluded or leaves the scene. In these cases, we re-identify the person by means of a descriptor comprised of color histograms of body parts and skeleton-based biometric measurements. Additional processing reliably handles short-term occlusion as well as assignment of touches to occluded users. YouTouch! requires neither user instrumentation nor custom hardware, and there is no registration or learning phase. Our system was thoroughly tested with data sets comprising 81 people, demonstrating its ability to re-identify users and correlate them to touches even under adverse conditions.
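The abstract names the descriptor (color histograms of body parts plus skeleton measurements) but not the matching procedure. As an illustration only, a minimal sketch of color-based re-identification using normalized histograms and histogram intersection; the function names, the bin count, and the acceptance threshold are all assumptions, not YouTouch!'s actual implementation:

```python
import numpy as np

def color_histogram(pixels, bins=8):
    # pixels: (N, 3) array of RGB values for one body-part region.
    # Returns a flattened 3-D histogram normalized to sum to 1.
    hist, _ = np.histogramdd(pixels, bins=(bins,) * 3, range=((0, 256),) * 3)
    hist = hist.ravel()
    return hist / hist.sum()

def hist_intersection(h1, h2):
    # Similarity of two normalized histograms, in [0, 1]:
    # 1.0 for identical color distributions, 0.0 for disjoint ones.
    return float(np.minimum(h1, h2).sum())

def reidentify(query, gallery, threshold=0.6):
    # query: descriptor of the currently unidentified person.
    # gallery: {user_id: descriptor} of users seen before they were
    # occluded or left the scene. Returns the best match above the
    # threshold, or None if the person appears to be new.
    best_id, best_sim = None, threshold
    for user_id, h in gallery.items():
        sim = hist_intersection(query, h)
        if sim > best_sim:
            best_id, best_sim = user_id, sim
    return best_id
```

A real system would concatenate per-body-part histograms and weight in the skeleton-based measurements; this sketch only shows the color component.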


Collaboration Meets Interactive Spaces | 2016

Content Sharing Between Spatially-Aware Mobile Phones and Large Vertical Displays Supporting Collaborative Work

Ricardo Langner; Ulrich von Zadow; Tom Horak; Annett Mitschick; Raimund Dachselt

Large vertical displays are increasingly widespread, and content sharing between them and personal mobile devices is central to many collaborative usage scenarios. In this chapter we present FlowTransfer, bidirectional transfer techniques which make use of the mobile phone’s position and orientation. We focus on three main aspects: multi-item transfer and layout, the dichotomy of casual versus precise interaction, and support for physical navigation. Our five techniques explore these aspects in addition to being contributions in their own right. They leverage physical navigation, allowing seamless transitions between different distances to the display, while also supporting arranging content and copying entire layouts within the transfer process. This is enabled by a novel distance-dependent pointing cursor that supports coarse pointing from distance as well as precise positioning at close range. We fully implemented all techniques and conducted a qualitative study documenting their benefits. Finally, based on a literature review and our holistic approach in designing the techniques, we also contribute an analysis of the underlying design space.
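The chapter's distance-dependent pointing cursor is coarse from afar and precise up close; the abstract does not specify the mapping, but one plausible sketch is a clamped linear interpolation of the cursor's effective radius with the user's distance to the wall (all parameter names and values below are hypothetical):

```python
def cursor_radius(distance_m, near=0.5, far=3.0, r_precise=5.0, r_coarse=80.0):
    """Interpolate the pointing cursor's radius (in pixels) with the
    user's distance to the display wall, in meters: small and precise
    at close range, large and forgiving from a distance. Distances
    outside [near, far] are clamped, so the transition is seamless as
    the user physically navigates toward or away from the wall."""
    t = (min(max(distance_m, near), far) - near) / (far - near)
    return r_precise + t * (r_coarse - r_precise)
```

Coupling cursor size to physical navigation like this lets one technique serve both casual pointing from a distance and precise positioning at arm's length, matching the dichotomy the chapter describes.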


interactive tabletops and surfaces | 2014

X-O Arch Menu: Combining Precise Positioning with Efficient Menu Selection on Touch Devices

Felix Thalmann; Ulrich von Zadow; Marcel Heckel; Raimund Dachselt

Support for precise positioning is crucial for many touch applications, and an efficient way to select the action at that point is very desirable in many cases as well. We draw upon existing work in the area of touch accuracy and touch menus to contribute the X-O Arch Menu. Our menu seamlessly combines precise positioning with fast, hierarchical menu selection. Furthermore, we introduce a novel optimization to pie menus that allows usage in limited screen space. The menu is fully implemented; we have created a touch-enabled version of a commercially available application using it.


interactive tabletops and surfaces | 2013

The SimMed experience: medical education on interactive tabletops

Ulrich von Zadow; Sandra Buron; Kai Sostmann; Raimund Dachselt

We present SimMed, a novel tool for medical education that allows medical students to diagnose and treat a simulated patient in real-time. The students assume the roles of doctors, collaborating as they interact with the patient. To achieve immersion and support complex interactions for gaining procedural knowledge, the hybrid user interface combines elements of real-time Virtual Reality (VR) with multitouch input. On the one hand, SimMed features a simulated, life-sized patient that is rendered and reacts in real-time. On the other hand, a more conventional touch input interface allows access to a large variety of medical procedures and tools.


human factors in computing systems | 2017

GIAnT: Visualizing Group Interaction at Large Wall Displays

Ulrich von Zadow; Raimund Dachselt

Large interactive displays are increasingly important and a relevant research topic, and several studies have focused on wall interaction. However, in many cases, thorough user studies currently require time-consuming video analysis and coding. We present the Group Interaction Analysis Toolkit GIAnT, which provides a rich set of visualizations supporting investigation of multi-user interaction at large display walls. GIAnT focuses on visualizing time periods, making it possible to gain overview-level insights quickly. The toolkit is designed to be extensible and features several carefully crafted visualizations: A novel timeline visualization shows movement in front of the wall over time, a wall visualization shows interactions on the wall and gaze data, and a floor visualization displays user positions. In addition, GIAnT shows the captured video stream along with basic statistics. We validate our tool by analyzing how it supports investigating major research topics and by practical use in evaluating a cooperative game.


Mensch & Computer | 2017

Challenges in Personalized Multi-user Interaction at Large Display Walls

Ulrich von Zadow

Computing devices such as desktop PCs and mobile phones assume a single user at the interface and present an interface personalized to this individual. In the case of large interactive display walls, simultaneous interaction by multiple users becomes possible and consequently, the system needs to adapt to each user at a more fine-grained level. This paper considers the challenges that follow and examines proposed solutions. We first look at requirements for a system that technically identifies the different users and compare existing systems to these requirements. Second, we discuss how to adapt the interface to multiple, possibly collaborating, users: How do we display personal or private data? How do we adapt to users that move around? In a collaborative environment, there is often a need for personal views of data in addition to a shared view. We consider both lens-based user interfaces and the use of additional personal mobile devices as mechanisms for enabling personalized interfaces.

Collaboration


Dive into Ulrich von Zadow's collaborations.

Top Co-Authors

Raimund Dachselt
Dresden University of Technology

Peter Brandl
Simon Fraser University

Daniel Bösel
Dresden University of Technology

Patrick Reipschläger
Dresden University of Technology