Publications


Featured research published by Dimitar Valkov.


PLOS ONE | 2013

FIM, a Novel FTIR-Based Imaging Method for High Throughput Locomotion Analysis

Benjamin Risse; Silke Thomas; Nils Otto; Tim Löpmeier; Dimitar Valkov; Xiaoyi Jiang; Christian Klämbt

We designed a novel imaging technique based on frustrated total internal reflection (FTIR) to obtain high-resolution, high-contrast movies. This FTIR-based Imaging Method (FIM) is suitable for a wide range of biological applications and a wide range of organisms. It operates at all wavelengths, permitting the in vivo detection of fluorescent proteins. To demonstrate the benefits of FIM, we analyzed large groups of crawling Drosophila larvae. The number of analyzable locomotion tracks was increased by implementing a new software module capable of preserving larval identity during most collision events. This module is integrated into our new tracking program, FIMTrack, which subsequently extracts a number of features required for the analysis of complex locomotion phenotypes. FIM enables high-throughput screening for even subtle behavioral phenotypes. We tested this newly developed setup by analyzing locomotion deficits caused by the glial knockdown of several genes. Suppression of kinesin heavy chain (khc) or rab30 function led to contraction pattern or head-sweeping defects, which had escaped detection in previous analyses. Thus, FIM permits forward genetic screens aimed at unraveling the neural basis of behavior.
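As a toy illustration of the kind of feature extraction a tracker such as FIMTrack performs on high-contrast FTIR frames, the sketch below thresholds a grayscale frame and returns one centroid per bright blob. The threshold value, the flood-fill labeling, and the function name are assumptions for illustration, not the published implementation.

```python
def larva_centroids(frame, threshold=128):
    """Return (row, col) centroids of connected bright blobs in a 2D
    grayscale frame (list of lists). Flood fill with 4-connectivity;
    a toy stand-in for FIMTrack-style feature extraction."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] > threshold and not seen[r][c]:
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx))
    return centroids
```

Preserving identity through collisions, as the paper describes, would additionally require matching centroids across frames, which this sketch omits.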


eurographics | 2010

Touching floating objects in projection-based virtual reality environments

Dimitar Valkov; Frank Steinicke; Gerd Bruder; Klaus H. Hinrichs; Johannes Schöning; Florian Daiber; Antonio Krüger

Touch-sensitive screens enable natural interaction without any instrumentation and support tangible feedback on the touch surface. In particular, multi-touch interaction has proven its usability for 2D tasks, but the challenges of exploiting these technologies in virtual reality (VR) setups have rarely been studied. In this paper we address the challenge of allowing users to interact with stereoscopically displayed virtual environments when the input is constrained to a 2D touch surface. During interaction with a large-scale touch display a user changes between three different states: (1) beyond arm-reach distance from the surface, (2) at arm-reach distance, and (3) interaction. We have analyzed the user's ability to discriminate stereoscopic display parallaxes while moving through these states, i.e., whether objects can be imperceptibly shifted onto the interactive surface and thus become accessible for natural touch interaction. Our results show that the detection thresholds for such manipulations are related to both user motion and stereoscopic parallax, and that users have difficulty discriminating whether or not they touched an object when tangible feedback is expected.


international conference on human computer interaction | 2009

Bimanual Interaction with Interscopic Multi-Touch Surfaces

Johannes Schöning; Frank Steinicke; Antonio Krüger; Klaus H. Hinrichs; Dimitar Valkov

Multi-touch interaction has received considerable attention in the last few years, in particular for natural two-dimensional (2D) interaction. However, many application areas deal with three-dimensional (3D) data and therefore require intuitive 3D interaction techniques. Virtual reality (VR) systems provide sophisticated 3D user interfaces but lack efficient 2D interaction, and are therefore rarely adopted by ordinary users or even by experts. Since multi-touch interfaces represent a good trade-off between intuitive, constrained interaction on a touch surface providing tangible feedback, and unrestricted natural interaction without any instrumentation, they have the potential to form the foundation of the next generation of user interfaces for 2D as well as 3D interaction. In particular, stereoscopic display of 3D data provides an additional depth cue, but until now the challenges and limitations of multi-touch interaction in this context have not been considered. In this paper we present new multi-touch paradigms and interactions that combine traditional 2D interaction and novel 3D interaction on a touch surface to form a new class of multi-touch systems, which we refer to as interscopic multi-touch surfaces (iMUTS). We discuss iMUTS-based user interfaces that support interaction with 2D content displayed in monoscopic mode and 3D content usually displayed stereoscopically. To underline the potential of the proposed iMUTS setup, we have developed and evaluated two example interaction metaphors for different domains. First, we present intuitive navigation techniques for virtual 3D city models, and then we describe a natural metaphor for deforming volumetric datasets in a medical context.


interactive tabletops and surfaces | 2011

Triangle cursor: interactions with objects above the tabletop

Sven Strothoff; Dimitar Valkov; Klaus H. Hinrichs

Extending the tabletop display to the third dimension using a stereoscopic projection offers the possibility to improve applications by using the volume above the table surface. The combination of multi-touch input and stereoscopic projection usually requires an indirect technique to interact with objects above the tabletop, as touches can only be detected on the surface. Triangle Cursor is a 3D interaction technique that allows specification of a 3D position and yaw rotation above the interactive tabletop. It was designed to avoid occlusions that disturb the stereoscopic perception. While Triangle Cursor uses an indirect approach, the position, the height above the surface and the yaw rotation can be controlled simultaneously, resulting in a 4 DOF manipulation technique. We have evaluated Triangle Cursor in an initial user study and compared it to a related existing technique in a formal user study. Our experiments show that users were able to perform all tasks significantly faster with our technique without losing any precision. Most of the subjects considered the technique easy to use and satisfying.
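A two-touch indirect technique of this kind can be sketched as a mapping from two surface touch points to a 4 DOF cursor pose. The specific mapping below (midpoint for position, finger spread for height, angle of the connecting line for yaw) and the gain parameter are assumptions for illustration, not the authors' exact design.

```python
import math

def triangle_cursor_pose(t1, t2, height_gain=1.0):
    """Map two touch points (x, y) on the tabletop to an assumed
    4 DOF cursor pose (x, y, height, yaw):
    - midpoint of the touches       -> cursor x, y on the surface
    - distance between the touches  -> height above the surface
    - angle of the connecting line  -> yaw rotation (radians)
    """
    x1, y1 = t1
    x2, y2 = t2
    x = (x1 + x2) / 2.0
    y = (y1 + y2) / 2.0
    height = height_gain * math.hypot(x2 - x1, y2 - y1)
    yaw = math.atan2(y2 - y1, x2 - x1)
    return x, y, height, yaw
```

Because all four values derive from the same two contact points, moving, spreading, or rotating the fingers adjusts position, height, and yaw simultaneously, matching the simultaneous-control property the abstract describes.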


virtual reality software and technology | 2014

DigiTap: an eyes-free VR/AR symbolic input device

Manuel Prätorius; Dimitar Valkov; Ulrich Burgbacher; Klaus H. Hinrichs

In this paper we present DigiTap, a wrist-worn device specially designed for symbolic input in virtual and augmented reality (VR/AR) environments. DigiTap robustly senses thumb-to-finger taps on the four fingertips and the eight minor knuckles. Taps are detected by an accelerometer, which triggers the capture of an image sequence with a small wrist-mounted camera. The tap position is then extracted from the images with low computational effort by an image processing pipeline. Thus, the device is very energy efficient and could potentially be integrated into a smartwatch-like device, allowing unobtrusive, always-available, eyes-free input. To demonstrate the feasibility of our approach we conducted an initial user study with our prototype device. In this study the suitability of the twelve tapping locations was evaluated, and the most prominent sources of error were identified. Our prototype system correctly classified 92% of the input locations.
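The accelerometer trigger stage of such a pipeline can be sketched as a simple magnitude threshold over incoming samples; only when a sample exceeds the threshold would the energy-hungry camera capture be started. The threshold value and function name below are assumed placeholders, not the device's actual tuning.

```python
def detect_tap(samples, threshold=2.5):
    """Return the index of the first accelerometer sample whose
    magnitude (in g) exceeds the threshold, i.e. a candidate tap
    that would trigger image capture, or None if no tap occurred.
    The 2.5 g default is an illustrative assumption."""
    for i, (ax, ay, az) in enumerate(samples):
        if (ax * ax + ay * ay + az * az) ** 0.5 > threshold:
            return i
    return None
```

Gating the camera on this cheap check is what would make the design energy efficient: the image processing pipeline only runs on the short sequence captured after a trigger.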


human factors in computing systems | 2014

Imperceptible depth shifts for touch interaction with stereoscopic objects

Dimitar Valkov; Alexander Giesler; Klaus H. Hinrichs

While touch technology has proven its usability for 2D interaction and has already become a standard input modality for many devices, the challenges of exploiting it with stereoscopically rendered content have barely been studied. In this paper we exploit properties of visual perception to allow users to touch stereoscopically displayed objects when the input is constrained to a 2D surface. To this end, we have extended and generalized recent evaluations of the user's ability to discriminate small induced object shifts while reaching out to touch a virtual object, and we propose a practical interaction technique, the attracting shift technique, suitable for numerous application scenarios where shallow-depth interaction is sufficient. In addition, our results indicate that slight object shifts during touch interaction make the virtual scene appear perceptually more stable compared to a static scene. As a consequence, applications have to manipulate the virtual objects to make them appear static to the user.
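The core idea of an attracting shift can be sketched as a depth interpolation that imperceptibly moves an object toward the zero-parallax touch surface as the finger approaches it, so that the finger and the perceived object meet on the surface. The linear ramp and all parameter names below are illustrative assumptions, not the evaluated technique.

```python
def attracting_shift(object_depth, finger_distance, start_distance):
    """Return the displayed depth for a stereoscopic object (0.0 is
    the touch surface) given the finger's current distance to it.
    The shift begins once the finger is closer than start_distance
    and brings the object to depth 0 at contact. Linear ramp is an
    assumed choice; threshold-aware curves are possible too."""
    if finger_distance >= start_distance:
        return object_depth  # too far away: no manipulation yet
    t = max(finger_distance, 0.0) / start_distance
    return object_depth * t
```

Keeping the per-frame shift below the detection thresholds measured in the study is what would make the manipulation imperceptible; this sketch only shows the geometric interpolation.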


2012 5th Workshop on Software Engineering and Architectures for Realtime Interactive Systems (SEARIS) | 2012

Viargo - A generic virtual reality interaction library

Dimitar Valkov; Benjamin Bolte; Gerd Bruder; Frank Steinicke

Traditionally, interaction techniques for virtual reality applications are implemented in a proprietary way on specific target platforms, e.g., requiring specific hardware, physics or rendering libraries, which hinders reusability and portability. Though hardware abstraction layers for numerous devices are provided by multiple virtual reality libraries, they are usually tightly bound to a particular rendering environment. In this paper we introduce Viargo, a generic virtual reality interaction library, which serves as an additional software layer independent of the application and its linked libraries, i.e., an interaction technique developed once, such as walking with a head-mounted display or multi-touch interaction, can be ported to different hardware or software environments with minimal code adaptation. We describe the underlying concepts and present examples of how to integrate Viargo into different graphics engines, thus turning proprietary graphics libraries into easy-to-use virtual reality engines with a few lines of code.
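In the spirit of such an engine-independent interaction layer (though not Viargo's actual API, and in Python rather than the library's own language), a minimal sketch might route abstract device events to interaction techniques that only ever talk to a thin scene adapter. All class and method names below are assumptions for illustration.

```python
class InteractionTechnique:
    """Base class for techniques that consume abstract device events
    and write back scene updates through an engine adapter, so the
    technique itself never depends on a concrete rendering library."""
    def handle_event(self, event):
        raise NotImplementedError

class ScaleOnPinch(InteractionTechnique):
    """Hypothetical technique: map a 'pinch' event to a scene scale."""
    def __init__(self, scene):
        self.scene = scene  # any object exposing set_scale()
    def handle_event(self, event):
        if event.get("type") == "pinch":
            self.scene.set_scale(event["factor"])

class EngineAdapter:
    """Stand-in for one concrete rendering engine's adapter; porting
    the technique means writing another adapter, not new logic."""
    def __init__(self):
        self.scale = 1.0
    def set_scale(self, s):
        self.scale = s
```

Swapping `EngineAdapter` for an adapter over a different graphics engine would leave `ScaleOnPinch` untouched, which is the portability property the abstract claims for the layered design.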


symposium on 3d user interfaces | 2010

A multi-touch enabled human-transporter metaphor for virtual 3D traveling

Dimitar Valkov; Frank Steinicke; Gerd Bruder; Klaus H. Hinrichs

In this tech-note we demonstrate how multi-touch hand gestures in combination with foot gestures can be used to perform navigation tasks in interactive 3D environments. Geographic Information Systems (GIS) are well suited as a complex testbed for the evaluation of user interfaces based on multi-modal input. Recent developments in the area of interactive surfaces enable the construction of low-cost multi-touch displays, and relatively inexpensive sensor technology can detect foot gestures, which makes it possible to explore these input modalities for virtual reality environments. In this tech-note we describe an intuitive 3D user interface metaphor and the corresponding hardware, which combine multi-touch hand and foot gestures for interaction with spatial data.


virtual reality software and technology | 2012

VINS: shared memory space for definition of interactive techniques

Dimitar Valkov; Alexander Giesler; Klaus H. Hinrichs

Traditionally, interaction techniques for virtual reality applications are implemented in a proprietary way on specific target platforms, e.g., requiring specific hardware, physics or rendering libraries, which hinders reusability and portability. Even though abstraction layers for hardware devices are provided by numerous virtual reality libraries, they are usually tightly bound to a particular rendering environment and hardware configuration. In this paper we introduce VINS (Virtual Interactive Namespace), a seamless distributed memory space, which provides a hierarchical structure to support the reusable design of interaction techniques. With VINS an interaction metaphor, whether it is implemented as a function or class in the main application thread, runs in its own thread, or runs as a separate process on another computer, can be transferred from one application to another without modifications. We describe the underlying concepts and present examples of how to integrate VINS with different frameworks or with already implemented interaction techniques.
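A minimal sketch of a hierarchical shared key space in this spirit is given below; slash-separated paths map to values, so an interaction technique reads and writes named slots instead of calling the host application directly. The path syntax and method names are assumptions, not the VINS API, and the real system is distributed across threads and machines while this sketch is a single in-process dictionary.

```python
class Namespace:
    """Toy hierarchical key space: slash-separated paths to values.
    Decoupling techniques from applications through such named slots
    is the structural idea; distribution and synchronization, which
    VINS provides, are omitted here."""
    def __init__(self):
        self._data = {}

    @staticmethod
    def _key(path):
        return tuple(path.strip("/").split("/"))

    def set(self, path, value):
        self._data[self._key(path)] = value

    def get(self, path, default=None):
        return self._data.get(self._key(path), default)

    def children(self, prefix):
        """Names of the immediate children under a prefix path."""
        p = self._key(prefix)
        return sorted({k[len(p)] for k in self._data
                       if k[:len(p)] == p and len(k) > len(p)})
```

A technique that reads `/tracker/head/pos` and writes `/camera/pos` would then work unchanged in any application that populates those slots, which mirrors the transferability the abstract describes.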


symposium on 3d user interfaces | 2010

Immersive virtual studio for architectural exploration

Gerd Bruder; Frank Steinicke; Dimitar Valkov; Klaus H. Hinrichs

Architects use a variety of analog and digital tools and media to plan and design constructions. Immersive virtual reality (VR) technologies have shown great potential for architectural design, especially for exploration and review of design proposals. In this work we propose a virtual studio system, which allows architects and clients to use arbitrary real-world tools such as maps or rulers during immersive exploration of virtual 3D models. The user interface allows architects and clients to review designs and compose 3D architectural scenes, combining benefits of mixed-reality environments with immersive head-mounted display setups.

Collaboration


Dive into Dimitar Valkov's collaborations.

Top Co-Authors

Frank Steinicke (German Research Centre for Artificial Intelligence)

Gerd Bruder (University of Central Florida)