Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Niloofar Dezfuli is active.

Publication


Featured research published by Niloofar Dezfuli.


European Conference on Interactive TV | 2012

PalmRC: imaginary palm-based remote control for eyes-free television interaction

Niloofar Dezfuli; Mohammadreza Khalilbeigi; Jochen Huber; Florian Müller; Max Mühlhäuser

User input on television (TV) typically requires a mediator device, such as a handheld remote control. While this is a well-established interaction paradigm, a handheld device has serious drawbacks: it can be easily misplaced due to its mobility, and in the case of a touch screen interface, it also requires additional visual attention. Emerging interaction paradigms like 3D mid-air gestures using novel depth sensors, such as Microsoft's Kinect, aim at overcoming these limitations, but are known, for example, to be tiring. In this paper, we propose to leverage the palm as an interactive surface for TV remote control. Our contribution is three-fold: (1) we explore the conceptual design space in an exploratory study. (2) Based upon these results, we investigate the effectiveness and accuracy of such an interface in a controlled experiment. And (3) we contribute PalmRC: an eyes-free, palm-surface-based TV remote control, which in turn is evaluated in an early user feedback session. Our results show that the palm has the potential to be leveraged for device-less and eyes-free TV remote interaction without any third-party mediator device.


Interactive Tabletops and Surfaces | 2013

ObjecTop: occlusion awareness of physical objects on interactive tabletops

Mohammadreza Khalilbeigi; Jürgen Steimle; Jan Riemann; Niloofar Dezfuli; Max Mühlhäuser; James D. Hollan

In this paper, we address the challenges of occlusion created by physical objects on interactive tabletops. We contribute an integrated set of interaction techniques designed to cope with the physical occlusion problem as well as facilitate organizing objects in hybrid settings. These techniques are implemented in ObjecTop, a system to support tabletop display applications involving both physical and virtual objects. We compile design requirements for occlusion-aware tabletop systems and conduct the first in-depth user study comparing ObjecTop with conventional tabletop interfaces in search and layout tasks. The empirical results show that occlusion-aware techniques outperform the conventional tabletop interface. Furthermore, our findings indicate that physical properties of occluders dramatically influence which strategy users employ to cope with occlusion. We conclude with a set of design implications derived from the study.


European Conference on Interactive TV | 2011

A study on interpersonal relationships for social interactive television

Niloofar Dezfuli; Mohammadreza Khalilbeigi; Max Mühlhäuser; David Geerts

This paper presents an explorative study investigating the social video watching experience. We particularly investigate the role of interpersonal relationships in social interaction while watching and its link to video genres. The results reveal that the desired relationship for social interactions around video content does not solely depend on a strong relationship between viewers. Moreover, program genre plays an important role in social structure preferences for watching television as a shared activity. These results can have considerable impact on designing social interactive television systems to enhance social interactions between remote viewers.


Human Factors in Computing Systems | 2012

CoStream: in-situ co-construction of shared experiences through mobile video sharing during live events

Niloofar Dezfuli; Jochen Huber; Simon Olberding; Max Mühlhäuser

Mobile live video broadcasting has become increasingly popular as a means for novel social media interactions. Recent research has mainly focused on bridging larger physical distances in large-scale events such as car racing, where participants are unable to spectate from a certain location in the event. In this paper, we advocate using live video streams not only over larger distances, but also in-situ in closed events such as soccer matches or concerts. We present CoStream, a mobile live video sharing system, and describe its iterative design process. We used CoStream as an instrument in a field study to investigate the in-situ co-construction of shared experiences during live events. We contribute our findings and outline future work.


Behaviour & Information Technology | 2014

PalmRC: leveraging the palm surface as an imaginary eyes-free television remote control

Niloofar Dezfuli; Mohammadreza Khalilbeigi; Jochen Huber; Murat Özkorkmaz; Max Mühlhäuser

User input on television (TV) typically requires a mediator device such as a handheld remote control. While this is a well-established interaction paradigm, a handheld device has serious drawbacks: it can be easily misplaced due to its mobility, and in the case of a touch screen interface, it also requires additional visual attention. Emerging interaction paradigms such as 3D mid-air gestures using novel depth sensors (e.g. the Microsoft Kinect) aim at overcoming these limitations, but are known to be tiring. In this article, we propose to leverage the palm as an interactive surface for TV remote control. We present three user studies which set the base for our four contributions: We (1) qualitatively explore the conceptual design space of the proposed imaginary palm-based remote control in an explorative study, (2) quantitatively investigate the effectiveness and accuracy of such an interface in a controlled experiment, (3) assess user acceptance in a controlled laboratory evaluation comparing the PalmRC concept against two typical existing input modalities, a conventional remote control and a touch-based remote control interface on a smartphone, with respect to user experience, task load, and overall preference, and (4) contribute PalmRC, an eyes-free, palm-surface-based TV remote control. Our results show that the palm has the potential to be leveraged for device-less, eyes-free TV remote interaction without any third-party mediator device.


Human Factors in Computing Systems | 2016

Liquido: Embedding Liquids into 3D Printed Objects to Sense Tilting and Motion

Martin Schmitz; Andreas Leister; Niloofar Dezfuli; Jan Riemann; Florian Müller; Max Mühlhäuser

Tilting and motion are widely used as interaction modalities in smart objects such as wearables and smartphones (e.g., to detect posture or shaking). They are often sensed with accelerometers. In this paper, we propose to embed liquids into 3D printed objects while printing, to sense various tilting and motion interactions via capacitive sensing. This method reduces the assembly effort after printing and is a low-cost and easy-to-apply way of extending the input capabilities of 3D printed objects. We contribute two liquid sensing patterns and a practical printing process using a standard dual-extrusion 3D printer and commercially available materials. We validate the method by a series of evaluations and provide a set of interactive example applications.
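The core idea above can be pictured as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes a hypothetical layout of four capacitive electrodes around an embedded liquid reservoir, where the liquid pools toward the tilt direction and raises the capacitance of the nearest electrode.

```python
# Hypothetical sketch of liquid-based tilt inference via capacitive sensing.
# The electrode names, units, and threshold are illustrative assumptions.

ELECTRODES = ["north", "east", "south", "west"]  # assumed electrode layout

def infer_tilt(readings, baseline, threshold=5.0):
    """Return the tilt direction, or None if the object is level.

    readings/baseline: dict mapping electrode name -> capacitance reading.
    threshold: minimum rise over baseline that counts as a tilt.
    """
    # Rise of each electrode over its untilted baseline.
    deltas = {e: readings[e] - baseline[e] for e in ELECTRODES}
    # The liquid pools toward the tilt direction, so the electrode with
    # the largest rise indicates where the object is tilted.
    direction = max(deltas, key=deltas.get)
    return direction if deltas[direction] > threshold else None

baseline = {"north": 10.0, "east": 10.0, "south": 10.0, "west": 10.0}
tilted = {"north": 10.5, "east": 22.0, "south": 10.2, "west": 9.8}
print(infer_tilt(tilted, baseline))  # east
```

The thresholding step keeps sensor noise from registering as a tilt when the object is level.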


Human Factors in Computing Systems | 2017

Flexibles: Deformation-Aware 3D-Printed Tangibles for Capacitive Touchscreens

Martin Schmitz; Jürgen Steimle; Jochen Huber; Niloofar Dezfuli; Max Mühlhäuser

We introduce Flexibles: 3D-printed flexible tangibles that are deformation-aware and operate on capacitive touchscreens. Flexibles add expressive deformation input to interaction with on-screen tangibles. Based on different types of deformation mapping, we contribute a set of 3D-printable mechanisms that capture pressing, squeezing, and bending input with multiple levels of intensities. They can be integrated into 3D printed objects with custom geometries and on different locations. A Flexible is printed in a single pass on a consumer-level 3D printer without requiring further assembly. Through a series of interactive prototypes, example applications and a technical evaluation, we show the technical feasibility and the wide applicability of Flexibles.


Symposium on Spatial User Interaction | 2015

A Study on Proximity-based Hand Input for One-handed Mobile Interaction

Florian Müller; Mohammadreza Khalilbeigi; Niloofar Dezfuli; Alireza Sahami Shirazi; Sebastian Günther; Max Mühlhäuser

On-body user interfaces utilize the human skin for both sensing input and displaying graphical output. In this paper, we present how the degree of freedom offered by the elbow joint, i.e., flexion and extension, can be leveraged to extend the input space of projective user interfaces. Users can move their hand towards or away from their body to browse through a multi-layer information space. We conducted a controlled experiment to investigate how accurately and efficiently users can interact in the space. The results revealed that the accuracy and efficiency of proximity-based interactions mainly depend on the traveling distance to the target layer, while neither the hand side nor the direction of interaction has a significant influence. Based on our findings, we propose guidelines for designing on-body user interfaces.
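The browsing concept above can be sketched as a simple quantization of the sensed hand distance into discrete layers. This is an illustrative sketch, not the study's implementation; the distance range and layer count are hypothetical assumptions.

```python
# Hypothetical sketch: map a sensed hand-to-body distance to a discrete
# layer of a multi-layer information space, as in proximity-based
# browsing via elbow flexion and extension.

def distance_to_layer(distance_cm, min_cm=10.0, max_cm=50.0, layers=4):
    """Quantize a hand distance (in cm) into one of `layers` layer indices."""
    # Clamp to the assumed usable range of flexion/extension.
    d = min(max(distance_cm, min_cm), max_cm)
    # Even quantization across the range; the fully extended arm falls
    # into the last layer.
    index = int((d - min_cm) / (max_cm - min_cm) * layers)
    return min(index, layers - 1)

for d in (10, 25, 40, 50):
    print(d, distance_to_layer(d))
```

Even quantization is the simplest choice; the study's finding that accuracy depends mainly on traveling distance suggests layer boundaries could also be tuned per layer.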


Human Factors in Computing Systems | 2015

StackTop: Hybrid Physical-Digital Stacking on Interactive Tabletops

Jan Riemann; Mohammadreza Khalilbeigi; Niloofar Dezfuli; Max Mühlhäuser

The concurrent use of digital and physical documents on interactive surfaces is becoming more and more common. However, the integration of both document types is limited, one example being the ability to stack documents. In this paper we propose StackTop, an integrated system supporting ordered hybrid digital/physical piling (hybrid stacking) on interactive surfaces. This allows for a tighter physical/digital integration in hybrid workspaces and provides a more consistent approach when working with hybrid document sets.


Human Computer Interaction with Mobile Devices and Services | 2015

Palm-based Interaction with Head-mounted Displays

Florian Müller; Niloofar Dezfuli; Max Mühlhäuser; Martin Schmitz; Mohammadreza Khalilbeigi

Head-mounted displays (HMDs) are an emerging class of wearable devices that allow users to access and alter information right in front of their eyes. However, due to their size and shape, traditional input modalities (e.g., multi-touch sensing on the device) are not practical. In this position paper, we argue that palm-based interactions have a great potential to ease the interaction with HMDs. We outline two interaction concepts and present directions for future research that can lead to more enjoyable and usable interfaces for HMDs.

Collaboration


Dive into Niloofar Dezfuli's collaboration.

Top Co-Authors

Max Mühlhäuser, Technische Universität Darmstadt
Florian Müller, Technische Universität Darmstadt
Mohammadreza Khalilbeigi, Technische Universität Darmstadt
Sebastian Günther, Technische Universität Darmstadt
Martin Schmitz, Technische Universität Darmstadt
Jan Riemann, Technische Universität Darmstadt
Markus Funk, Technische Universität Darmstadt
Andreas Leister, Technische Universität Darmstadt
Azita Hosseini Nejad, Technische Universität Darmstadt