Janki Dodiya
German Aerospace Center
Publications
Featured research published by Janki Dodiya.
Symposium on 3D User Interfaces | 2013
Johannes Hummel; Janki Dodiya; Robin Wolff; Andreas Gerndt; Torsten W. Kuhlen
Weight perception in virtual environments can generally be achieved with haptic devices. However, most of these are hard to integrate into an immersive virtual environment (IVE) due to their technical complexity and the restriction they place on a user's movement within the IVE. We describe two simple methods that use only a wireless, light-weight finger-tracking device in combination with a physics-simulated hand model to create a feeling of heaviness when interacting with virtual objects in an IVE. The first method maps the varying distance between the tracked fingers and the thumb to the grasping force required for lifting a virtual object of a given weight. The second method maps the detected intensity of the finger pinch during grasping gestures to the lifting force. In an experiment described in this paper, we investigated the potential of the proposed methods for discriminating the heaviness of virtual objects by finding the just noticeable difference (JND) and calculating the Weber fraction. Furthermore, the workload that users experienced with these methods was measured to gain more insight into their usefulness as interaction techniques. At a hit ratio of 0.75, the determined Weber fraction was 16.25% for the finger-distance-based method and 15.48% for the pinch-based method, which corresponds to values found in related work. There was no significant effect of method on either the measured difference threshold or the experienced workload; however, user preference was higher for the pinch-based method. The results demonstrate the capability of the proposed methods for conveying heaviness in IVEs and thus represent a simple alternative to haptics-based methods.
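The two mappings and the Weber-fraction analysis described in this abstract can be sketched roughly as follows. This is a minimal illustration with assumed constants and function names, not the authors' implementation:

```python
# Illustrative sketch of the two force mappings and the Weber fraction
# analysis. All constants, names and the linear spring model are
# assumptions for demonstration, not the authors' implementation.

def force_from_finger_distance(finger_thumb_dist, contact_dist, stiffness=200.0):
    """Method 1: map how far the fingers close past the object surface
    to a grasping force in newtons (assumed linear spring)."""
    penetration = max(0.0, contact_dist - finger_thumb_dist)
    return stiffness * penetration

def force_from_pinch(pinch_intensity, max_force=10.0):
    """Method 2: map a normalized pinch intensity (0..1) from the
    enhanced tracking device directly to a lifting force in newtons."""
    return max(0.0, min(1.0, pinch_intensity)) * max_force

def weber_fraction(jnd, reference):
    """Weber fraction = just noticeable difference / reference intensity."""
    return jnd / reference

# A JND of 0.1625 on a reference weight of 1.0 reproduces the reported
# 16.25 % fraction for the finger-distance-based method.
print(weber_fraction(0.1625, 1.0))
```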
IEEE Virtual Reality Conference | 2016
Johannes Hummel; Janki Dodiya; Laura Eckardt; Robin Wolff; Andreas Gerndt; Torsten W. Kuhlen
An immersive virtual environment is an ideal platform for the planning and training of on-orbit servicing missions, as it provides a flexible and safe environment. In such virtual assembly simulations, grasping virtual objects is one of the most common and natural interactions. However, unlike grasping objects in the real world, it is a non-trivial task in virtual environments, where the primary feedback is visual only. A great deal of research has investigated ways to provide haptic feedback, such as force, vibrational and electrotactile feedback. Such devices, however, are usually uncomfortable and hard to integrate into projection-based immersive VR systems. In this paper, we present a novel, small and lightweight electrotactile feedback device specifically designed for immersive virtual environments. It consists of a small tactor with eight electrodes for each finger and a signal generator attached to the user's hand or arm. Our device can be easily integrated with an existing optical finger-tracking system. The study presented in this paper assesses the feasibility and usability of the interaction device. An experiment was conducted in a repeated-measures design with the electrotactile feedback modality as the independent variable. As a benchmark, we chose three typical assembly tasks of a VR simulation for satellite on-orbit servicing missions: pressing a button, switching a lever switch, and pulling a module from its slot. Results show that electrotactile feedback improved the user's grasping in our virtual on-orbit servicing scenario. The task completion time was significantly lower for all three tasks and the precision of the user's interaction was higher. The workload reported by the participants was significantly lower when using electrotactile feedback. Additionally, users were more confident in their performance while completing the tasks with electrotactile feedback. We describe the device, outline the user study and report the results.
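One way such an eight-electrode fingertip tactor could be driven is sketched below: select the electrode facing the contact normal and scale the stimulation with the simulated contact force. The scheme and all constants are assumptions for illustration, not the device's actual protocol:

```python
import math

# Hypothetical driving scheme for an eight-electrode fingertip tactor.
# Electrode selection by contact direction and linear force-to-intensity
# scaling are assumptions, not the actual device protocol.

def electrode_for_contact(contact_angle_rad, n_electrodes=8):
    """Select which electrode around the fingertip to pulse, given the
    angle of the contact normal in the fingertip plane."""
    sector = 2.0 * math.pi / n_electrodes
    return int((contact_angle_rad % (2.0 * math.pi)) // sector)

def stimulation_level(contact_force_n, max_force_n=5.0):
    """Normalize a simulated contact force to a 0..1 stimulation level."""
    return min(1.0, max(0.0, contact_force_n) / max_force_n)
```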
IEEE Virtual Reality Conference | 2015
David J. Roberts; Arturo S. García; Janki Dodiya; Robin Wolff; Allen J. Fairchild; Terrence Fernando
We introduce the collaborative telepresence workspaces for space operations and science that are under development in the European research project CROSS DRIVE. The vision is to give space mission controllers and scientists the impression of “beaming” to the surface of Mars, along with simulations of the environment and equipment, so that they can step out together where a robot has moved or may move. We briefly overview the design and describe the state of the demonstrator. The contribution of this publication is to give an example of how collaborative virtual reality research is being taken up in space science.
IEEE Virtual Reality Conference | 2013
Mikel Sagardia; Katharina Hertkorn; Thomas Hulin; Robin Wolff; Johannes Hummel; Janki Dodiya; Andreas Gerndt
The growth of space debris is becoming a serious problem. There is an urgent need for mitigation measures based on maintenance, repair and de-orbiting technologies. Our video presents a virtual reality framework in which robotic maintenance tasks on satellites can be simulated interactively. The two key components of this framework are a realistic virtual reality simulation and an immersive interaction device. A peculiarity of the virtual reality simulation is the combination of a physics engine based on Bullet with an extremely efficient haptic rendering algorithm inspired by an enhanced version of the Voxmap-Pointshell Algorithm. A central logic module controls all states and objects in the virtual world. To give the human operator optimal immersion into the virtual environment, the DLR bimanual haptic device is used as the interaction device. Equipped with two light-weight robot arms, this device is able to provide realistic haptic feedback at both human hands while covering the major part of the human operator's workspace. The applicability of this system is enhanced by additional force sensors, active hand interfaces with an additional degree of freedom, smart safety technologies and intuitive robot data augmentation. Our platform can be used for verification or training purposes of robotic systems interacting in space environments.
IEEE Aerospace Conference | 2015
Mikel Sagardia; Katharina Hertkorn; Thomas Hulin; Simon Schätzle; Robin Wolff; Johannes Hummel; Janki Dodiya; Andreas Gerndt
The growth of space debris is becoming a severe issue that urgently requires mitigation measures based on maintenance, repair, and de-orbiting technologies. Such on-orbit servicing (OOS) missions, however, are delicate and expensive. Virtual Reality (VR) enables simulation and training in a flexible and safe environment, and hence has the potential to drastically reduce costs and time while increasing the success rate of future OOS missions. This paper presents a highly immersive VR system with which satellite maintenance procedures can be simulated interactively using visual and haptic feedback. The system can be used for verification and training purposes for human and robot systems interacting in space. Our framework combines unique realistic virtual reality simulation engines with advanced immersive interaction devices. The DLR bimanual haptic device HUG is used as the main user interface. The HUG is equipped with two light-weight robot arms and is able to provide realistic haptic feedback on both human arms. Additional devices provide vibrotactile and electrotactile feedback at the elbow and the fingertips. A particularity of the real-time simulation is the fusion of the Bullet physics engine with our haptic rendering algorithm, which is an enhanced version of the Voxmap-Pointshell Algorithm. Our haptic rendering engine supports multiple objects in the scene and is able to compute collisions for each of them within 1 ms, enabling realistic virtual manipulation tasks even for stiff collision configurations. The visualization engine ViSTA is used during the simulation to achieve photo-realistic effects, increasing the immersion. In order to provide a realistic experience at interactive frame rates, we developed a distributed system architecture, where the load of computing the physics simulation, haptic feedback and visualization of a complex scene is transferred to dedicated machines.
The implementations are presented in detail and the performance of the overall system is validated. Additionally, a preliminary user study in which the virtual system is compared to a physical test bed shows the suitability of the VR-OOS framework.
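The core of a voxmap-pointshell style penalty-force step, as referenced in the two abstracts above, can be sketched as follows. This is a didactic simplification under assumed data structures, not DLR's optimized implementation:

```python
# Didactic sketch of a voxmap-pointshell penalty-force step: one object is
# voxelized into a penetration-depth grid (the voxmap), the other sampled
# as surface points with inward normals (the pointshell). Data layout,
# names and the stiffness constant are assumptions for illustration.

def contact_force(voxmap, voxel_size, origin, pointshell, stiffness=1000.0):
    """Sum penalty forces over all pointshell points inside the voxmap.
    voxmap[i][j][k] holds penetration depth in metres (> 0 means inside);
    pointshell is a list of ((px, py, pz), (nx, ny, nz)) pairs."""
    fx = fy = fz = 0.0
    for (px, py, pz), (nx, ny, nz) in pointshell:
        # Locate the voxel containing this surface point.
        i = int((px - origin[0]) // voxel_size)
        j = int((py - origin[1]) // voxel_size)
        k = int((pz - origin[2]) // voxel_size)
        if 0 <= i < len(voxmap) and 0 <= j < len(voxmap[0]) and 0 <= k < len(voxmap[0][0]):
            depth = voxmap[i][j][k]
            if depth > 0.0:
                # Penalty force pushes along the point's inward normal,
                # proportional to penetration depth.
                fx += stiffness * depth * nx
                fy += stiffness * depth * ny
                fz += stiffness * depth * nz
    return (fx, fy, fz)
```

Precomputing the distance field is what makes per-point queries constant-time, which is how such algorithms reach the millisecond-scale update rates haptic rendering requires.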
IEEE Aerospace Conference | 2015
Arturo S. García; David J. Roberts; Terrence Fernando; Christian Bar; Robin Wolff; Janki Dodiya; Wito Engelke; Andreas Gerndt
Space exploration missions have produced large datasets of immense value to both research and the planning and operation of future missions. However, current datasets and simulation tools fragment teamwork, especially across disciplines and geographical locations. The aerospace community already exploits virtual reality for purposes including space tele-robotics, interactive 3D visualization, simulation and training. However, collaborative virtual environments are yet to be widely deployed or routinely used in space projects. Advanced immersive and collaborative visualization systems have the potential to enhance the efficiency and efficacy of data analysis, simplifying visual benchmarking, presentations and discussions. We present preliminary results of the EU-funded international project CROSS DRIVE, which develops an infrastructure for collaborative workspaces for space science and missions. The aim is to allow remote scientific and engineering experts to collectively analyze and interpret combined datasets using shared simulation tools. The approach is to combine advanced 3D visualization techniques and interactive tools with immersive virtual telepresence. This will give scientists and engineers the impression of teleporting from their respective buildings across Europe to stand together on a planetary surface, surrounded by the information and tools that they need. The conceptual architecture and proposed realization of the collaborative workspace are described. ESA's planned ExoMars mission provides the use case for deriving user requirements and evaluating our implementation.
Archive | 2013
Ralf Dörner; Geert Matthys; Manfred Bogen; Stefan Rilling; Andreas Gerndt; Janki Dodiya; Katharina Hertkorn; Thomas Hulin; Johannes Hummel; Mikel Sagardia; Robin Wolff; Tom Kühnert; Guido Brunnett; Hagen Buchholz; Lisa Blum; Christoffer Menk; Christian Bade; Werner Schreiber; Matthias Greiner; Thomas Alexander; Michael Kleiber; Gerd Bruder; Frank Steinicke
This chapter contains a collection of selected successful case studies of VR/AR from research and practice.
ICAT/EGVE/EuroVR | 2012
Johannes Hummel; Robin Wolff; Janki Dodiya; Andreas Gerndt; Torsten W. Kuhlen
The selection of the right input devices for 3D interaction methods is important for a successful VR system. While natural, direct interaction is often preferred, research has shown that indirect interaction can be beneficial. This paper focuses on an immersive simulation and training environment in which one sub-task is to carefully grasp and move a force-sensitive, thin, deformable foil without damaging it. In order to ensure transfer of training, it was necessary to make the user aware that the foil must be grasped and moved gently. We explore the potential of three simple and light-weight interaction methods that each map interaction to a virtual hand in a distinct way: a standard tracked joystick with an indirect mapping, a standard finger-tracking device with a direct mapping based on finger position, and a novel enhanced finger-tracking device that additionally allows pinch-force input. The results of our summative user study show no significant difference in task performance among the three interaction methods. The simple position-based mapping using finger tracking was most preferred, although the enhanced finger-tracking device with direct force input offered the most natural interaction mapping. Our findings show that both direct and indirect input methods have potential for interacting with force-sensitive, thin, deformable objects, with the direct method being preferred.
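The "gentle grasping" constraint on the force-sensitive foil could be modelled along these lines. Threshold values and names are hypothetical, chosen only to illustrate the idea of a valid-force window:

```python
# Hypothetical model of the force-sensitive foil constraint: a grasp is
# only valid within a force window; pressing harder damages the foil,
# pressing softer drops it. Both thresholds are illustrative assumptions.

FOIL_HOLD_FORCE_N = 0.5    # minimum pinch force to hold the foil (assumed)
FOIL_DAMAGE_FORCE_N = 2.0  # force above which the foil is damaged (assumed)

def grasp_state(pinch_force_n):
    """Classify the user's grasp on the deformable foil."""
    if pinch_force_n > FOIL_DAMAGE_FORCE_N:
        return "damaged"
    if pinch_force_n >= FOIL_HOLD_FORCE_N:
        return "held"
    return "released"
```

With the joystick and position-only tracking methods, the applied force would have to be inferred from the mapping rather than measured, which is why the pinch-force device offers the most direct control over staying inside this window.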
Archive | 2012
Johannes Hummel; Robin Wolff; Janki Dodiya; Andreas Gerndt; Torsten W. Kuhlen
Archive | 2014
Andreas Gerndt; Janki Dodiya; Katharina Hertkorn; Thomas Hulin; Johannes Hummel; Mikel Sagardia; Robin Wolff