Publications

Featured research published by Jan O. Borchers.


International Conference on Pervasive Computing | 2009

Overcoming Assumptions and Uncovering Practices: When Does the Public Really Look at Public Displays?

Elaine M. Huang; Anna Koster; Jan O. Borchers

This work reports on the findings of a field study examining the current use practices of large ambient information displays in public settings. Such displays are often assumed to be inherently eye-catching and appealing to people nearby, but our research shows that glancing and attention at large displays are complex and dependent on many factors. By understanding how such displays are being used in current, public, non-research settings and the factors that impact usage, we offer concrete, ecologically valid knowledge and design implications about these technologies to researchers and designers who are employing large ambient displays in their work.


Human Factors in Computing Systems | 2003

iStuff: a physical user interface toolkit for ubiquitous computing environments

Rafael Ballagas; Meredith Ringel; Maureen C. Stone; Jan O. Borchers

The iStuff toolkit of physical devices, and the flexible software infrastructure to support it, were designed to simplify the exploration of novel interaction techniques in the post-desktop era of multiple users, devices, systems and applications collaborating in an interactive environment. The toolkit leverages an existing interactive workspace infrastructure, making it lightweight and platform independent. The supporting software framework includes a dynamically configurable intermediary to simplify the mapping of devices to applications. We describe the iStuff architecture and provide several examples of iStuff, organized into a design space of ubiquitous computing interaction components. The main contribution is a physical toolkit for distributed, heterogeneous environments with run-time retargetable device data flow. We conclude with some insights and experiences derived from using this toolkit and framework to prototype experimental interaction techniques for ubiquitous computing environments.
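
The "dynamically configurable intermediary" is the architectural heart of this design: device events are routed to application actions through a mapping that can be rewired at run time. The following Python sketch illustrates the idea under invented names; it is not the actual iStuff or iROS API.

```python
# Minimal sketch of a dynamically retargetable event intermediary, in the
# spirit of iStuff's device-to-application mapping (hypothetical API).

class Intermediary:
    def __init__(self):
        self._bindings = {}  # device event name -> application handler

    def bind(self, event_name, handler):
        """(Re)map a device event to an application action at run time."""
        self._bindings[event_name] = handler

    def dispatch(self, event_name, payload):
        handler = self._bindings.get(event_name)
        if handler:
            handler(payload)

# Two hypothetical applications listening for input.
def adjust_lights(value):
    print(f"lights set to {value}%")

def adjust_volume(value):
    print(f"volume set to {value}%")

router = Intermediary()
router.bind("slider.moved", adjust_lights)
router.dispatch("slider.moved", 80)   # lights set to 80%

# Retarget the same physical slider to a different application
# without touching the device code.
router.bind("slider.moved", adjust_volume)
router.dispatch("slider.moved", 35)   # volume set to 35%
```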


Human Factors in Computing Systems | 2009

SLAP widgets: bridging the gap between virtual and physical controls on tabletops

Malte Weiss; Yvonne Jansen; Roger Jennings; Ramsin Khoshabeh; James D. Hollan; Jan O. Borchers

We present Silicone iLluminated Active Peripherals (SLAP), a system of tangible, translucent widgets for use on multi-touch tabletops. SLAP Widgets are cast from silicone or made of acrylic, and include sliders, knobs, keyboards, and buttons. They add tactile feedback to multi-touch tables, improving input accuracy. Using rear projection, SLAP Widgets can be relabeled dynamically, providing inexpensive, battery-free, and untethered augmentations. Furthermore, SLAP combines the flexibility of virtual objects with physical affordances. We evaluate how SLAP Widgets influence the user experience on tabletops compared to virtual controls. Empirical studies show that SLAP Widgets are easy to use and significantly outperform virtual controls in terms of accuracy and overall interaction time.
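
The dynamic relabeling can be pictured as a run-time binding between a tracked widget footprint and a virtual control whose label the rear projector renders beneath it. A minimal Python sketch of that binding follows; the class and method names are invented for illustration and are not SLAP's implementation.

```python
# Sketch: pairing a tracked tangible widget with a virtual control that the
# rear projection relabels on the fly (names are illustrative, not SLAP's API).

from dataclasses import dataclass

@dataclass
class TrackedWidget:
    widget_id: str
    x: float          # table coordinates of the widget footprint
    y: float

class VirtualControl:
    def __init__(self, label, on_change):
        self.label = label
        self.on_change = on_change

class Tabletop:
    def __init__(self):
        self.bindings = {}  # widget_id -> VirtualControl

    def bind(self, widget, control):
        self.bindings[widget.widget_id] = control
        self.project_label(widget, control.label)

    def relabel(self, widget, new_label):
        """Rear projection lets us change the label without touching hardware."""
        self.bindings[widget.widget_id].label = new_label
        self.project_label(widget, new_label)

    def project_label(self, widget, label):
        print(f"project '{label}' under {widget.widget_id} at ({widget.x}, {widget.y})")

knob = TrackedWidget("knob-1", x=0.4, y=0.7)
table = Tabletop()
table.bind(knob, VirtualControl("Brightness", on_change=print))
table.relabel(knob, "Contrast")  # same physical knob, new virtual role
```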


Human Factors in Computing Systems | 2009

Tactile motion instructions for physical activities

Daniel Spelmezan; Mareike Jacobs; Anke Hilgers; Jan O. Borchers

While learning new motor skills, we often rely on feedback from a trainer. Auditory feedback and demonstrations are used most frequently, but in many domains they are inappropriate or impractical. We introduce tactile instructions as an alternative to assist in correcting wrong posture during physical activities, and present a set of full-body vibrotactile patterns. An initial study informed the design of our tactile patterns and determined appropriate locations for feedback on the body. A second experiment showed that users perceived and correctly classified our tactile instruction patterns in a relaxed setting and during a cognitively and physically demanding task. In a final experiment, snowboarders on the slope compared their perception of tactile instructions with audio instructions under real-world conditions. Tactile instructions achieved high overall recognition accuracy, similar to audio instructions. Moreover, participants responded more quickly to instructions delivered over the tactile channel than to instructions presented over the audio channel. Our findings suggest that these full-body tactile feedback patterns can replace audio instructions during physical activities.
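
One simple way to represent such a pattern in software is as a timed schedule of actuator activations played back in sequence. The sketch below shows this representation in Python; the actuator locations and the example "lean left" pattern are hypothetical, not taken from the paper.

```python
# Sketch: a vibrotactile instruction pattern as a timed actuator schedule.
# Actuator placements and the example pattern are hypothetical.

import time

# (actuator location, intensity 0..1, duration in seconds)
LEAN_LEFT = [
    ("left_thigh", 1.0, 0.2),
    ("left_hip",   1.0, 0.2),
    ("left_torso", 1.0, 0.2),
]

def play_pattern(pattern, drive_motor):
    """Play actuators one after another, e.g. to suggest a movement direction."""
    for location, intensity, duration in pattern:
        drive_motor(location, intensity)
        time.sleep(duration)
        drive_motor(location, 0.0)

# Stand-in for a real motor driver.
def fake_driver(location, intensity):
    state = "on" if intensity > 0 else "off"
    print(f"{location}: {state}")

play_pattern(LEAN_LEFT, fake_driver)
```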


User Interface Software and Technology | 2010

Madgets: actuating widgets on interactive tabletops

Malte Weiss; Florian Schwarz; Simon Jakubowski; Jan O. Borchers

We present a system for the actuation of tangible magnetic widgets (Madgets) on interactive tabletops. Our system combines electromagnetic actuation with fiber optic tracking to move and operate physical controls. The presented mechanism supports actuating complex tangibles that consist of multiple parts. A grid of optical fibers transmits marker positions past our actuation hardware to cameras below the table. We introduce a visual tracking algorithm that is able to detect objects and touches from the strongly sub-sampled video input of that grid. Six sample Madgets illustrate the capabilities of our approach, ranging from tangential movement and height actuation to inductive power transfer. Madgets combine the benefits of passive, untethered, and translucent tangibles with the ability to actuate them with multiple degrees of freedom.
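
Tracking from the fiber-optic grid boils down to finding intensity peaks in a very low-resolution image and refining them to sub-fiber precision. The Python sketch below illustrates one plausible approach (brightest fiber plus a local center of mass); it is a simplified stand-in, not the paper's algorithm.

```python
# Sketch: locating a marker in a strongly sub-sampled intensity grid.
# Find the brightest fiber, then refine to sub-fiber precision with a
# local center of mass. A simplified stand-in for the paper's algorithm.

import numpy as np

# Pretend 8x8 fiber grid: one marker between fibers, light spread over two.
grid = np.zeros((8, 8))
grid[2, 5] = 1.0
grid[2, 4] = 0.4   # light bleeding into the neighboring fiber

def find_marker(grid, threshold=0.5):
    """Return the marker position in grid coordinates, or None."""
    if grid.max() < threshold:
        return None
    y, x = np.unravel_index(np.argmax(grid), grid.shape)
    # Refine with a center of mass over the (clamped) 3x3 neighborhood.
    y0, y1 = max(y - 1, 0), min(y + 2, grid.shape[0])
    x0, x1 = max(x - 1, 0), min(x + 2, grid.shape[1])
    ys, xs = np.mgrid[y0:y1, x0:x1]
    w = grid[y0:y1, x0:x1]
    return (float((ys * w).sum() / w.sum()), float((xs * w).sum() / w.sum()))

print(find_marker(grid))  # roughly (2.0, 4.71): between the two lit fibers
```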


User Interface Software and Technology | 2011

FingerFlux: near-surface haptic feedback on tabletops

Malte Weiss; Chat Wacharamanotham; Simon Voelker; Jan O. Borchers

We introduce FingerFlux, an output technique to generate near-surface haptic feedback on interactive tabletops. Our system combines electromagnetic actuation with permanent magnets attached to the user's hand. FingerFlux lets users feel the interface before touching, and can create both attracting and repelling forces. This enables applications such as reducing drifting, adding physical constraints to virtual controls, and guiding the user without visual output. We show that users can feel vibration patterns up to 35 mm above our table, and that FingerFlux can significantly reduce drifting when operating on-screen buttons without looking.
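
Conceptually, creating a force under the finger means energizing the electromagnets nearest the finger's position with the appropriate polarity. The Python sketch below captures that idea with an invented coil layout and a simple distance falloff; it is an assumption-laden illustration, not FingerFlux's controller.

```python
# Sketch: pick the electromagnets closest to a finger position and set their
# polarity to attract or repel the magnet on the user's hand.
# Coil pitch and counts are illustrative, not FingerFlux's actual layout.

import math

COIL_PITCH_MM = 20  # assumed distance between neighboring coils

def coil_positions(rows, cols):
    return [(r * COIL_PITCH_MM, c * COIL_PITCH_MM)
            for r in range(rows) for c in range(cols)]

def activation(finger_xy, coils, radius_mm=30, repel=False):
    """Return per-coil currents: nonzero for coils within radius of the finger."""
    fx, fy = finger_xy
    currents = []
    for (cx, cy) in coils:
        d = math.hypot(cx - fx, cy - fy)
        if d <= radius_mm:
            sign = -1.0 if repel else 1.0
            currents.append(sign * (1.0 - d / radius_mm))  # falls off with distance
        else:
            currents.append(0.0)
    return currents

coils = coil_positions(4, 4)
for pos, i in zip(coils, activation((30, 30), coils, repel=True)):
    if i:
        print(f"coil at {pos}: current {i:+.2f}")
```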


Interactive Tabletops and Surfaces | 2010

MudPad: tactile feedback and haptic texture overlay for touch surfaces

Yvonne Jansen; Thorsten Karrer; Jan O. Borchers

We introduce MudPad, a system capable of localized active haptic feedback on multitouch screens. We use an array of electromagnets combined with an overlay containing magnetorheological (MR) fluid to actuate a tablet-sized area. As MudPad has a very low reaction time, it is able to produce instant multi-point feedback for multitouch input, ranging from static levels of surface softness to a broad set of dynamically changeable textures. Our system not only conveys global confirmative feedback on user input but also allows the UI designer to enrich the entire interface with a tactile layer conveying local semantic information. This also allows users to explore the interface haptically.
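
The "tactile layer" can be modeled as a mapping from UI regions to haptic parameters that drive the electromagnet array under each touch. A minimal Python sketch follows, with invented regions and values; it is not MudPad's controller.

```python
# Sketch: a tactile layer that assigns each UI region a haptic value, which
# would drive the electromagnets under a touch. Regions and values invented.

HAPTIC_LAYER = {
    "ok_button":   {"stiffness": 0.9, "texture": "click"},
    "scroll_area": {"stiffness": 0.3, "texture": "ridges"},
    "background":  {"stiffness": 0.1, "texture": "smooth"},
}

def region_at(x, y):
    """Hit-test stand-in: map a touch position to a UI region."""
    if y < 100:
        return "ok_button"
    if x < 200:
        return "scroll_area"
    return "background"

def haptics_for_touch(x, y):
    region = region_at(x, y)
    return region, HAPTIC_LAYER[region]

print(haptics_for_touch(50, 50))    # ('ok_button', {...})
print(haptics_for_touch(50, 300))   # ('scroll_area', {...})
```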


IEEE Wireless Communications | 2002

Stanford interactive workspaces: a framework for physical and graphical user interface prototyping

Jan O. Borchers; Meredith Ringel; Joshua Tyler; Armando Fox

Most smart homes are created evolutionarily by adding more and more technologies to an existing home, rather than being developed on a single occasion by building a new home from scratch. This incremental addition of technology requires a highly flexible infrastructure to accommodate both future extensions and legacy systems without requiring extensive rewiring of hardware or extensive reconfiguration on the software level. Stanford's iStuff (Interactive Stuff) provides an example of a hardware interface abstraction technique that enables quick customization and reconfiguration of smart home solutions. iStuff gains its power from its combination with the Stanford Interactive Room Operating System (iROS), which creates a flexible and robust software framework that allows custom and legacy applications to communicate with each other and with user interface devices in a dynamically configurable way.

The Stanford Interactive Room (iRoom), while not a residential environment, has many characteristics of a smart home: a wide array of advanced user interface technologies, abundant computation power, and infrastructure with which to coordinate the use of these resources (for more information on the iRoom or the Interactive Workspaces project, please visit http://iwork.stanford.edu). As a result, many aspects of the iRoom environment have strong implications for, and can be intuitively translated to, smart homes. In particular, the rapid and fluid development of physical user interfaces using iStuff and the iROS, which has been demonstrated in the iRoom, is an equally powerful concept for designing and living in smart homes. Before focusing on the details of iStuff, we describe the software infrastructure on which it is based and the considerations that went into designing this infrastructure.
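
iROS communication centers on the Event Heap, a tuple-space-inspired repository: producers post events and consumers pull events matching a template, so neither side needs to know about the other. The Python sketch below mimics that decoupled style; the event fields are illustrative, not the iROS API.

```python
# Sketch of tuple-space-style communication in the spirit of the iROS
# Event Heap: producers post events, consumers pull events matching a
# template, and neither needs to know the other. Field names are invented.

class EventHeap:
    def __init__(self):
        self._events = []

    def post(self, event):
        self._events.append(dict(event))

    def consume(self, template):
        """Remove and return the first event whose fields match the template."""
        for i, event in enumerate(self._events):
            if all(event.get(k) == v for k, v in template.items()):
                return self._events.pop(i)
        return None

heap = EventHeap()

# A legacy light switch posts an event; it knows nothing about consumers.
heap.post({"type": "ButtonPress", "source": "wall_switch", "room": "kitchen"})

# A lighting controller pulls any ButtonPress from its room.
event = heap.consume({"type": "ButtonPress", "room": "kitchen"})
if event:
    print(f"toggling kitchen lights (triggered by {event['source']})")
```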


AI & Society | 2001

A pattern approach to interaction design

Jan O. Borchers

To create successful interactive systems, user interface designers need to cooperate with developers and application domain experts in an interdisciplinary team. These groups, however, usually lack a common terminology to exchange ideas, opinions and values. This paper presents an approach that uses pattern languages to capture this knowledge in software development, human-computer interaction (HCI) and the application domain. A formal, domain-independent definition of design patterns allows for computer support without sacrificing readability, and pattern use is integrated into the usability engineering life cycle. As an example, experience from building an award-winning interactive music exhibit was turned into a pattern language, which was then used to inform follow-up projects and support HCI education.
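
A formal, domain-independent pattern definition maps naturally onto a simple data structure: name, context, problem, solution, and references that link the pattern to larger and smaller patterns in the language. A minimal Python sketch follows; the example patterns are invented for illustration, not taken from the paper.

```python
# Sketch: a domain-independent design pattern as a data structure, linked
# into a pattern language via references. Example patterns are invented.

from dataclasses import dataclass, field

@dataclass
class Pattern:
    name: str
    context: str
    problem: str
    solution: str
    larger: list = field(default_factory=list)   # patterns this one refines
    smaller: list = field(default_factory=list)  # patterns that complete it

attract_visitor = Pattern(
    name="ATTRACT VISITOR",
    context="An interactive exhibit in a public space.",
    problem="Passers-by do not realize the exhibit is interactive.",
    solution="Show self-running activity that invites a first touch, "
             "then reward it immediately with a visible response.",
)

simple_first_step = Pattern(
    name="SIMPLE FIRST STEP",
    context="A visitor has just started interacting.",
    problem="A complex first action scares casual users away.",
    solution="Make the first interaction a single, obvious gesture.",
    larger=[attract_visitor],
)
attract_visitor.smaller.append(simple_first_step)

for p in (attract_visitor, simple_first_step):
    print(f"{p.name}: {p.problem}")
```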


Human Factors in Computing Systems | 2008

DRAGON: a direct manipulation interface for frame-accurate in-scene video navigation

Thorsten Karrer; Malte Weiss; Eric Lee; Jan O. Borchers

We present DRAGON, a direct manipulation interaction technique for frame-accurate navigation in video scenes. This technique benefits tasks such as professional and amateur video editing, review of sports footage, and forensic analysis of video scenes. By directly dragging objects in the scene along their movement trajectory, DRAGON enables users to quickly and precisely navigate to a specific point in the video timeline where an object of interest is in a desired location. Examples include the specific frame where a sprinter crosses the finish line, or where a car passes a traffic light. Through a user study, we show that DRAGON significantly reduces task completion time for in-scene navigation tasks by an average of 19-42% compared to a standard timeline slider. Qualitative feedback from users is also positive, with multiple users indicating that the DRAGON interaction felt more natural than the traditional timeline slider for in-scene navigation.
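
The core mapping behind this kind of trajectory-based navigation is compact: given the object's position in every frame, a drag point resolves to the frame whose object position is nearest. A minimal Python sketch with invented trajectory data:

```python
# Sketch: frame-accurate navigation by dragging an object along its
# trajectory. Given per-frame object positions, the target frame is the
# one whose position is closest to the drag point. Data is invented.

import math

# frame index -> (x, y) position of the tracked object in that frame
trajectory = {f: (10 + 4 * f, 50 + 0.5 * f) for f in range(100)}

def frame_for_drag(point, trajectory):
    """Return the frame whose object position is nearest the drag point."""
    px, py = point
    return min(trajectory,
               key=lambda f: math.hypot(trajectory[f][0] - px,
                                        trajectory[f][1] - py))

print(frame_for_drag((130, 65), trajectory))  # jumps to the matching frame
```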

Collaboration

Top co-authors of Jan O. Borchers include:

Eric Lee (RWTH Aachen University)
Malte Weiss (RWTH Aachen University)
Max Mühlhäuser (Technische Universität Darmstadt)