Christian Weichel
Lancaster University
Publications
Featured research published by Christian Weichel.
Human Factors in Computing Systems | 2014
Christian Weichel; Manfred Lau; David Kim; Nicolas Villar; Hans-Werner Gellersen
Personal fabrication machines, such as 3D printers and laser cutters, are becoming increasingly ubiquitous. However, designing objects for fabrication still requires 3D modeling skills, rendering such technologies inaccessible to a wide user group. In this paper, we introduce MixFab, a mixed-reality environment that lowers the barrier to engaging in personal fabrication. Users design objects in an immersive augmented-reality environment, interact with virtual objects in a direct gestural manner and can introduce existing physical objects effortlessly into their designs. We describe the design and implementation of MixFab, a user-defined gesture study that informed this design, show artifacts designed with the system and describe a user study evaluating the system's prototype.
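As a rough illustration of the gesture-driven modelling loop described above, the sketch below dispatches recognised mid-air gestures to modelling operations on a design; the gesture names, the Shape type, and the operations are assumptions made for illustration, not MixFab's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Shape:
    name: str
    operations: list = field(default_factory=list)  # edit history

def extrude(shape: Shape, height_mm: float) -> None:
    shape.operations.append(("extrude", height_mm))

def subtract_scanned_object(shape: Shape, scan_id: str) -> None:
    # MixFab lets an existing physical object be scanned and used in the
    # design; here we only record that intent against the model.
    shape.operations.append(("subtract_scan", scan_id))

# Hypothetical gesture vocabulary mapped to modelling operations.
GESTURE_HANDLERS = {
    "pull_up": lambda s: extrude(s, height_mm=20.0),
    "press_object": lambda s: subtract_scanned_object(s, scan_id="scan_001"),
}

def on_gesture(shape: Shape, gesture: str) -> None:
    handler = GESTURE_HANDLERS.get(gesture)
    if handler is not None:
        handler(shape)

design = Shape("phone_stand")
for g in ("pull_up", "press_object"):
    on_gesture(design, g)
print(design.operations)
```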
Human Factors in Computing Systems | 2015
Faisal Taher; John G. Hardy; Abhijit Karnik; Christian Weichel; Yvonne Jansen; Kasper Hornbæk; Jason Alexander
Visualizations such as bar charts help users reason about data, but are mostly screen-based, rarely physical, and almost never physical and dynamic. This paper investigates the role of physically dynamic bar charts and evaluates new interactions for exploring and working with datasets rendered in dynamic physical form. To facilitate our exploration we constructed a 10x10 interactive bar chart and designed interactions that supported fundamental visualization tasks, specifically annotation, filtering, organization, and navigation. The interactions were evaluated in a user study with 17 participants. Our findings identify the preferred methods of working with the data for each task (e.g., directly tapping rows to hide bars), highlight the strengths and limitations of working with physical data, and discuss the challenges of integrating the proposed interactions into a larger data-exploration system. In general, physical interactions were intuitive, informative, and enjoyable, paving the way for new explorations in physical data visualizations.
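A minimal sketch of how such a 10x10 shape display might be driven appears below: data values are scaled to rod heights, and a "tap a row to hide its bars" filter zeroes that row. The set_heights() placeholder stands in for the actual actuation hardware, which is assumed away here.

```python
import random

ROWS, COLS = 10, 10
data = [[random.random() for _ in range(COLS)] for _ in range(ROWS)]
hidden_rows = set()

def to_heights(values, max_height_mm=100.0):
    """Scale data values to physical rod heights; hidden rows drop to zero."""
    peak = max(max(row) for row in values)
    return [
        [0.0 if r in hidden_rows else v / peak * max_height_mm for v in row]
        for r, row in enumerate(values)
    ]

def set_heights(heights):
    # Placeholder for sending target heights to the bar-chart actuators.
    for r, row in enumerate(heights):
        print(f"row {r}: " + " ".join(f"{h:5.1f}" for h in row))

hidden_rows.add(3)            # a tap on row 3 toggles it hidden (filtering)
set_heights(to_heights(data))
```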
Tangible and Embedded Interaction | 2013
Christian Weichel; Manfred Lau; Hans Gellersen
This paper explores the problem of designing enclosures (or physical cases) that are needed for prototyping electronic devices. We present a novel interface that uses electronic components as handles for designing the 3D shape of the enclosure. We use the .NET Gadgeteer platform as a case study of this problem and have implemented a proof-of-concept system for designing enclosures for Gadgeteer components. We show examples of enclosures designed and fabricated with our system.
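One plausible, purely illustrative reading of "components as handles" is that the placed components' footprints drive the enclosure geometry. The sketch below derives outer enclosure dimensions from assumed component sizes, wall thickness, and clearance; it is not the system's actual geometry pipeline.

```python
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    x: float        # placement on the base plate (mm)
    y: float
    width: float    # footprint (mm)
    depth: float
    height: float   # tallest point above the plate (mm)

def enclosure_size(components, wall=2.0, clearance=1.0):
    """Outer (width, depth, height) of a box enclosing all components."""
    min_x = min(c.x for c in components) - clearance - wall
    min_y = min(c.y for c in components) - clearance - wall
    max_x = max(c.x + c.width for c in components) + clearance + wall
    max_y = max(c.y + c.depth for c in components) + clearance + wall
    height = max(c.height for c in components) + clearance + wall
    return (max_x - min_x, max_y - min_y, height)

# Illustrative component sizes, not real Gadgeteer module dimensions.
parts = [
    Component("mainboard", 0, 0, 50, 40, 8),
    Component("display", 55, 0, 35, 30, 6),
]
print("outer enclosure (w, d, h) in mm:", enclosure_size(parts))
```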
Human Factors in Computing Systems | 2013
Andreas Bulling; Christian Weichel; Hans Gellersen
In this work we present EyeContext, a system to infer high-level contextual cues from human visual behaviour. We conducted a user study to record eye movements of four participants over a full day of their daily life, totalling 42.5 hours of eye movement data. Participants were asked to self-annotate four non-mutually-exclusive cues: social (interacting with somebody vs. no interaction), cognitive (concentrated work vs. leisure), physical (physically active vs. not active), and spatial (inside vs. outside a building). We evaluate a proof-of-concept EyeContext system that combines an encoding of eye movements into strings with a spectrum string kernel support vector machine (SVM) classifier. Our results demonstrate the large information content available in long-term human visual behaviour and open up new avenues for research on eye-based behavioural monitoring and life logging.
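The classification approach named in the abstract, encoding eye movements as strings and classifying them with a spectrum string kernel SVM, can be sketched as follows; the toy gaze strings, labels, and the substring length p are made-up illustration values.

```python
from collections import Counter
import numpy as np
from sklearn.svm import SVC

def spectrum_features(s: str, p: int = 3) -> Counter:
    """Counts of all length-p substrings of s."""
    return Counter(s[i:i + p] for i in range(len(s) - p + 1))

def spectrum_kernel(strings_a, strings_b, p: int = 3) -> np.ndarray:
    """p-spectrum kernel: dot product of shared-substring counts."""
    feats_a = [spectrum_features(s, p) for s in strings_a]
    feats_b = [spectrum_features(s, p) for s in strings_b]
    K = np.zeros((len(feats_a), len(feats_b)))
    for i, fa in enumerate(feats_a):
        for j, fb in enumerate(feats_b):
            K[i, j] = sum(fa[k] * fb[k] for k in fa if k in fb)
    return K

# Toy data: each string encodes a sequence of saccade/fixation symbols,
# labelled with a binary context cue (e.g. social vs. not social).
train = ["FFSLFFSR", "FSFSFSFS", "LLRRFFSS", "FFFFSSLL"]
labels = [1, 1, 0, 0]
clf = SVC(kernel="precomputed").fit(spectrum_kernel(train, train), labels)

test = ["FFSLFSFS"]
print(clf.predict(spectrum_kernel(test, train)))
```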
Designing Interactive Systems | 2016
Eduardo Velloso; Markus Wirth; Christian Weichel; Augusto Esteves; Hans-Werner Gellersen
Eye tracking offers many opportunities for direct device control in smart environments, but issues such as the need for calibration and the Midas touch problem make it impractical. In this paper, we propose AmbiGaze, a smart environment that employs the animation of targets to give users direct control of devices by gaze alone, through smooth pursuit tracking. We propose a design space of ways to expose functionality through movement and illustrate the concept through four prototypes. We evaluated the system in a user study and found that AmbiGaze enables robust gaze-only interaction with many devices, from multiple positions in the environment, in a spontaneous and comfortable manner.
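Smooth-pursuit selection is commonly implemented by correlating the recent gaze trajectory with each animated target's trajectory, as in the Pursuits technique; the sketch below uses that correlation-based matching as an assumption, since the abstract does not spell out AmbiGaze's exact matcher, and the target trajectories are toy data.

```python
import numpy as np

def pursuit_score(gaze_xy: np.ndarray, target_xy: np.ndarray) -> float:
    """Mean Pearson correlation of x and y coordinates over a time window."""
    rx = np.corrcoef(gaze_xy[:, 0], target_xy[:, 0])[0, 1]
    ry = np.corrcoef(gaze_xy[:, 1], target_xy[:, 1])[0, 1]
    return (rx + ry) / 2.0

def select_target(gaze_xy, targets, threshold=0.8):
    """Pick the target whose motion best matches the gaze, if good enough."""
    scores = {name: pursuit_score(gaze_xy, traj) for name, traj in targets.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

# Toy trajectories: the gaze loosely follows the "lamp" target's circular motion.
t = np.linspace(0, 2 * np.pi, 60)
lamp = np.column_stack((np.cos(t), np.sin(t)))
fan = np.column_stack((np.cos(t + np.pi), np.sin(t + np.pi)))
gaze = lamp + np.random.normal(scale=0.05, size=lamp.shape)
print(select_target(gaze, {"lamp": lamp, "fan": fan}))
```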
Human Factors in Computing Systems | 2013
Steven Houben; Christian Weichel
In recent years there has been a widespread installation of large interactive public displays. Longitudinal studies, however, show that these interactive displays suffer from interaction blindness: the inability of the public to recognize the interactive capabilities of those surfaces. In this paper, we explore the use of curiosity-provoking artifacts (curiosity objects) to overcome interaction blindness. Our study confirmed the interaction-blindness problem and shows that introducing a curiosity object results in a significant increase in interactivity with the display as well as changes in movement in the spaces surrounding the interactive display.
User Interface Software and Technology | 2015
Christian Weichel; John G. Hardy; Jason Alexander; Hans Gellersen
Digital fabrication machines such as 3D printers and laser-cutters allow users to produce physical objects based on virtual models. The creation process is currently unidirectional: once an object is fabricated it is separated from its originating virtual model. Consequently, users are tied into digital modeling tools, the virtual design must be completed before fabrication, and once fabricated, re-shaping the physical object no longer influences the digital model. To provide a more flexible design process that allows objects to iteratively evolve through both digital and physical input, we introduce bidirectional fabrication. To demonstrate the concept, we built ReForm, a system that integrates digital modeling with shape input, shape output, annotation for machine commands, and visual output. By continually synchronizing the physical object and digital model it supports object versioning to allow physical changes to be undone. Through application examples, we demonstrate the benefits of ReForm to the digital fabrication process.
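The object-versioning idea can be sketched as a simple version history that records each physical-to-digital synchronisation so an earlier state can be re-fabricated; the VersionedModel class and mesh placeholders below are illustrative assumptions rather than ReForm's actual pipeline.

```python
from copy import deepcopy

class VersionedModel:
    """Keeps a snapshot of the digital model at every synchronisation."""

    def __init__(self, initial_mesh):
        self.versions = [deepcopy(initial_mesh)]  # version 0

    def synchronise(self, scanned_mesh):
        """Record the state captured from the physical object; return its version id."""
        self.versions.append(deepcopy(scanned_mesh))
        return len(self.versions) - 1

    def undo_to(self, version: int):
        """Return the mesh from which the physical object can be re-fabricated."""
        return deepcopy(self.versions[version])

model = VersionedModel(initial_mesh={"vertices": 1200, "label": "clay blank"})
v1 = model.synchronise({"vertices": 1450, "label": "carved handle"})
rollback = model.undo_to(0)   # a physical change can be undone
print(v1, rollback["label"])
```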
IEEE Pervasive Computing | 2013
Thomas Kubitza; Norman Pohl; Tilman Dingler; Stefan Schneegass; Christian Weichel; Albrecht Schmidt
The emergence of many new embedded computing platforms has lowered the hurdle for creating ubiquitous computing devices. Here, the authors highlight some of the newer platforms, communication technologies, sensors, actuators, and cloud-based development tools, which are creating new opportunities for ubiquitous computing.
IEEE Pervasive Computing | 2015
Christian Weichel; Jason Alexander; Abhijit Karnik; Hans Gellersen
As digital fabrication and digital design become more pervasive, the physical tools we use alongside them will have to catch up. With the Internet of Things, cyber-physical systems, and Industry 4.0 in our midst, connecting and integrating measurement tools into design processes is a logical step. Here, the authors describe the first steps in that direction, coming from a variety of communities: academics, makers, and industry alike. In particular, they present their spatio-tangible (SPATA) tools for fabrication-aware design.
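A hypothetical example of the measure-then-design loop such connected tools enable: a measurement device reports a value that directly updates a parameter of the digital model. The read_caliper_mm() stub and the parameter names are assumptions for illustration, not the SPATA tools' actual interface.

```python
def read_caliper_mm() -> float:
    # Placeholder for reading a connected measurement tool; a real
    # implementation might poll a serial port instead of a constant.
    return 42.3

class ParametricModel:
    """Toy stand-in for a parametric CAD model."""

    def __init__(self):
        self.parameters = {"slot_width_mm": 40.0}

    def set_parameter(self, name: str, value: float) -> None:
        self.parameters[name] = value
        # A real system would regenerate the geometry here.

model = ParametricModel()
model.set_parameter("slot_width_mm", read_caliper_mm())
print(model.parameters)
```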
Human Factors in Computing Systems | 2015
John G. Hardy; Christian Weichel; Faisal Taher; John Vidler; Jason Alexander