Publication


Featured research published by Carlos Garre.


International Symposium on Mixed and Augmented Reality | 2009

Augmented touch without visual obtrusion

Francesco Cosco; Carlos Garre; Fabio Bruno; Maurizio Muzzupappa; Miguel A. Otaduy

Visuo-haptic mixed reality consists of adding to a real scene the ability to see and touch virtual objects. It requires the use of see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. However, haptic devices tend to be bulky items that appear in the field of view of the user. In this work, we propose a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, but without visual obtrusion produced by the haptic device. This mixed reality paradigm relies on the following three technical steps: tracking of the haptic device, visual deletion of the device from the real scene, and background completion using image-based models. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects in the context of a real scene.
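As a hedged illustration of the third step above (background completion), the sketch below replaces the pixels covered by the haptic device with pixels from a pre-captured, image-based background. The function name, array layout, and example values are assumptions for illustration, not the paper's implementation.

```python
# Minimal sketch of background completion, assuming a pre-captured background
# image and a per-frame boolean mask of device pixels (plain numpy arrays).
import numpy as np

def erase_device(frame, device_mask, background):
    """frame, background: (H, W, 3) uint8 images; device_mask: (H, W) bool array."""
    out = frame.copy()
    out[device_mask] = background[device_mask]   # repaint device pixels from the model
    return out

# Tiny synthetic example: a 2x2 frame where the top-left pixel is masked out.
frame = np.zeros((2, 2, 3), np.uint8)
background = np.full((2, 2, 3), 255, np.uint8)
mask = np.array([[True, False], [False, False]])
print(erase_device(frame, mask, background)[0, 0])   # -> [255 255 255]
```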


Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems | 2009

Haptic rendering of complex deformations through handle-space force linearization

Carlos Garre; Miguel A. Otaduy

The force-update-rate requirements of transparent rendering of virtual environments are in conflict with the computational cost required for computing complex interactions between deforming objects. In this paper we introduce a novel method for satisfying high force update rates with deformable objects, yet retaining the visual quality of complex deformations and interactions. The objects that are haptically manipulated may have many degrees of freedom, but haptic interaction is often implemented in practice through low-dimensional force-feedback devices. We exploit the low-dimensional domain of the interaction for devising a novel linear approximation of interaction forces that can be efficiently evaluated at force-update rates. Moreover, our linearized force model is time-implicit, which implies that it accounts for contact constraints and the internal dynamics of deforming objects. In this paper we show examples of haptic interaction in complex situations such as large deformations, collision between deformable objects (with friction), or even self-collision.
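A minimal sketch of the idea under simplifying assumptions (a plain first-order, handle-space linearization; class and variable names are invented for illustration, and the paper's actual model is time-implicit): the full deformable simulation periodically supplies a force and its Jacobian at the current handle configuration, and the high-rate haptic loop evaluates only the cheap linear model.

```python
# Handle-space force linearization, illustrative only: the low-rate simulation
# provides F0 and K at handle configuration q0; the 1 kHz haptic thread then
# renders a first-order force estimate from the current device position.
import numpy as np

class LinearizedForceModel:
    def __init__(self, q0, F0, K):
        self.q0 = np.asarray(q0, dtype=float)   # handle configuration at linearization time
        self.F0 = np.asarray(F0, dtype=float)   # force at q0 (from the full simulation)
        self.K = np.asarray(K, dtype=float)     # force Jacobian dF/dq at q0

    def force(self, q):
        """First-order force estimate, cheap enough for a 1 kHz haptic loop."""
        return self.F0 + self.K @ (np.asarray(q, dtype=float) - self.q0)

# Example: a 3-DoF handle with an isotropic contact stiffness of 500 N/m.
model = LinearizedForceModel(q0=[0.0, 0.0, 0.0],
                             F0=[0.0, 0.0, 1.0],
                             K=-500.0 * np.eye(3))
print(model.force([0.001, 0.0, -0.002]))   # force sent to the device this tick
```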


World Haptics Conference | 2011

Interactive simulation of a deformable hand for haptic rendering

Carlos Garre; Fernando Hernandez; Antonio Gracia; Miguel A. Otaduy

Operations such as object manipulation and palpation rely on the fine perception of contact forces, both in time and space. Haptic simulation of grasping, with the rendering of contact forces resulting from the manipulation of virtual objects, requires realistic yet interactive models of hand mechanics. This paper presents a model for interactive simulation of the skeletal and elastic properties of a human hand, allowing haptic grasping of virtual objects with soft finger contact. The novel aspects of the model consist of a simple technique to couple skeletal and elastic elements, an efficient dynamics solver in the presence of joints and contact constraints, and an algorithm that connects the simulation to a haptic device.
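As a rough, hedged illustration of coupling skeletal and elastic elements (not the paper's exact scheme), one common approach attaches flesh nodes to bone-fixed anchor points with stiff zero-rest-length springs; the stiffness value and node positions below are made up.

```python
# Illustrative skeleton-flesh coupling via stiff zero-rest-length springs.
import numpy as np

def coupling_forces(anchor_pos, flesh_pos, k=2.0e4):
    """Per-node coupling forces on the flesh (equal and opposite on the skeleton).

    anchor_pos, flesh_pos: (n, 3) arrays of bone-attached anchors and flesh nodes.
    k: coupling stiffness; in practice this stiff term is handled implicitly.
    """
    return k * (np.asarray(anchor_pos) - np.asarray(flesh_pos))

anchors = np.array([[0.00, 0.000, 0.0], [0.030, 0.000, 0.0]])
flesh   = np.array([[0.00, 0.002, 0.0], [0.031, -0.001, 0.0]])
print(coupling_forces(anchors, flesh))
```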


World Haptics Conference | 2013

Strain limiting for soft finger contact simulation

Alvaro G. Perez; Gabriel Cirio; Fernando Hernandez; Carlos Garre; Miguel A. Otaduy

The command of haptic devices for rendering direct interaction with the hand requires thorough knowledge of the forces and deformations caused by contact interactions on the fingers. In this paper, we propose an algorithm to simulate nonlinear elasticity under frictional contact, with the goal of establishing a model-based strategy to command haptic devices and to render direct hand interaction. The key novelty in our algorithm is an approach to model the extremely nonlinear elasticity of finger skin and flesh using strain-limiting constraints, which are seamlessly combined with frictional contact constraints in a standard constrained dynamics solver. We show that our approach enables haptic rendering of rich and compelling deformations of the fingertip.
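A minimal sketch of strain limiting in isolation, assuming per-edge constraints and a Gauss-Seidel style projection; the paper instead enforces the limits as constraints inside a constrained dynamics solver together with frictional contact, so this only illustrates the limiting step itself.

```python
# Per-edge strain limiting: clamp stretch to at most s_max by projecting node
# positions back toward the admissible edge length.
import numpy as np

def limit_edge_strain(x, edges, rest_len, s_max=0.1, iters=10):
    x = np.array(x, dtype=float)
    for _ in range(iters):
        for (i, j), l0 in zip(edges, rest_len):
            d = x[j] - x[i]
            l = np.linalg.norm(d)
            if l < 1e-12:
                continue
            strain = l / l0 - 1.0
            if abs(strain) > s_max:                 # limit violated: project back
                target = l0 * (1.0 + np.clip(strain, -s_max, s_max))
                corr = 0.5 * (l - target) * d / l   # split correction between both nodes
                x[i] += corr
                x[j] -= corr
    return x

x = [[0.0, 0.0, 0.0], [0.015, 0.0, 0.0]]            # an edge stretched 50% past rest
print(limit_edge_strain(x, edges=[(0, 1)], rest_len=[0.01]))   # clamped to 10% stretch
```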


IEEE Transactions on Visualization and Computer Graphics | 2013

Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration

Francesco Cosco; Carlos Garre; Fabio Bruno; Maurizio Muzzupappa; Miguel A. Otaduy

Visuo-haptic mixed reality consists of adding to a real scene the ability to see and touch virtual objects. It requires the use of see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. Unfortunately, the use of commodity haptic devices poses obstruction and misalignment issues that complicate the correct integration of a virtual tool and the user's real hand in the mixed reality scene. In this work, we propose a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, using commodity haptic devices, and with a visually consistent integration of the user's hand and the virtual tool. We discuss the visual obstruction and misalignment issues introduced by commodity haptic devices, and then propose a solution that relies on four simple technical steps: color-based segmentation of the hand, tracking-based segmentation of the haptic device, background repainting using image-based models, and misalignment-free compositing of the user's hand. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects and interact with them in the context of a real scene, and we have evaluated the impact on user performance of obstruction and misalignment correction.
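Of the four steps, the first (color-based segmentation of the hand) can be illustrated with a short, hedged sketch; the HSV thresholds below are placeholder skin-tone values, not parameters from the paper.

```python
# Color-based hand segmentation sketch using OpenCV: threshold skin tones in
# HSV space and clean the mask with a morphological opening.
import cv2
import numpy as np

def segment_hand(frame_bgr, lower=(0, 30, 60), upper=(20, 150, 255)):
    """Return a binary mask of skin-colored pixels in a BGR camera frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower, np.uint8), np.array(upper, np.uint8))
    # Remove speckle before compositing the hand over the virtual scene.
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

print(segment_hand(np.zeros((4, 4, 3), np.uint8)).shape)   # -> (4, 4)
```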


Proceedings of the IEEE | 2013

Representations and Algorithms for Force-Feedback Display

Miguel A. Otaduy; Carlos Garre; Ming C. Lin

“Haptic rendering” or “haptic display” can be broadly defined as conveying information about virtual objects or data to a user through the sense of touch. Among all applications of haptic rendering, force-feedback display of contact interactions with rigid and deformable virtual models through the sense of touch has matured considerably over the last decade. In this paper, we present a general framework for force-feedback display of rigid and deformable virtual environments, and we outline its major building blocks. We focus on computational aspects, and we classify algorithms and representations successfully used in the three major subproblems of force-feedback display: collision detection, dynamics simulation, and constrained optimization. In addition, force-feedback display is an integral part of a multimodal experience, often involving both visual and auditory display; therefore, we also discuss the choice of algorithms and representations for force feedback as a part of multimodal display.
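One widely used building block in this setting, shown here only as a hedged illustration (the gains and names are invented, and this is not presented as the paper's framework), is a virtual-coupling spring-damper that converts the mismatch between the device and a simulated proxy into the displayed force.

```python
# Virtual coupling sketch: a spring-damper between device and proxy.
import numpy as np

def coupling_force(x_device, x_proxy, v_device, v_proxy, k=800.0, b=2.0):
    """Force applied to the device (the negative is applied to the proxy)."""
    return k * (np.asarray(x_proxy) - np.asarray(x_device)) \
         + b * (np.asarray(v_proxy) - np.asarray(v_device))

print(coupling_force([0.01, 0, 0], [0.0, 0, 0], [0, 0, 0], [0, 0, 0]))
# -> roughly [-8, 0, 0] N, pushing the device back toward the proxy
```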


Computers & Graphics | 2010

Computer Graphics in Spain: a Selection of Papers from CEIG 2009: Haptic rendering of objects with rigid and deformable parts

Carlos Garre; Miguel A. Otaduy

In many haptic applications, the user interacts with the virtual environment through a rigid tool. Tool-based interaction is suitable in many applications, but the constraint of using rigid tools is not applicable to some situations, such as the use of catheters in virtual surgery, or of a rubber part in an assembly simulation. Rigid-tool-based interaction is also unable to provide force feedback regarding interaction through the human hand, due to the soft nature of human flesh. In this paper, we address some of the computational challenges of haptic interaction through deformable tools, which forms the basis for direct-hand haptic interaction. We describe a haptic rendering algorithm that enables interactive contact between deformable objects, including self-collisions and friction. This algorithm relies on a deformable tool model that combines rigid and deformable components, and we present the efficient simulation of such a model under robust implicit integration.
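A hedged sketch of the implicit-integration ingredient, assuming a plain linear elastic model with stiffness matrix K; the paper's solver additionally couples the rigid component and contact constraints, which this sketch omits.

```python
# One linearized backward-Euler step: large stiff time steps stay stable.
import numpy as np

def implicit_euler_step(M, K, x, v, f_ext, h):
    """Solve (M + h^2 K) v_new = M v + h (f_ext - K x), then x_new = x + h v_new."""
    A = M + h * h * K
    b = M @ v + h * (f_ext - K @ x)
    v_new = np.linalg.solve(A, b)
    return x + h * v_new, v_new

# Tiny example: unit masses, stiffness 1000 N/m, 10 ms step.
M, K = np.eye(3), 1000.0 * np.eye(3)
x, v = np.zeros(3), np.array([1.0, 0.0, 0.0])
print(implicit_euler_step(M, K, x, v, f_ext=np.zeros(3), h=0.01))
```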


CEIG | 2009

Toward Haptic Rendering of Full-Hand Touch

Carlos Garre; Miguel A. Otaduy

Most current haptic rendering techniques model either force interaction through a pen-like tool or vibration interaction on the fingertip. Such techniques are not yet able to provide force feedback for interaction through the human hand. In this paper, we address some of the computational challenges in computing haptic feedback forces for hand-based interaction. We describe a haptic rendering algorithm that enables interactive contact between deformable surfaces, even with self-collisions and friction. This algorithm relies on a virtual hand model that combines rigid and deformable components, and we present the efficient simulation of such a model under robust implicit integration.
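As a hedged illustration of the frictional-contact ingredient only (not the paper's solver), a Coulomb cone projection clamps the tangential contact force against the normal force; the friction coefficient and force values below are made up.

```python
# Coulomb friction cone projection sketch.
import numpy as np

def project_to_friction_cone(f, n, mu=0.5):
    """f: contact force (3,), n: unit contact normal (3,)."""
    f = np.asarray(f, dtype=float)
    n = np.asarray(n, dtype=float)
    fn = max(np.dot(f, n), 0.0)          # non-adhesive normal component
    ft = f - np.dot(f, n) * n            # tangential component
    ft_mag = np.linalg.norm(ft)
    if ft_mag > mu * fn and ft_mag > 1e-12:
        ft *= (mu * fn) / ft_mag         # sliding: limit tangential force to the cone
    return fn * n + ft

print(project_to_friction_cone([1.0, 0.0, 2.0], [0.0, 0.0, 1.0], mu=0.3))  # -> [0.6, 0, 2]
```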


SAE 2014 World Congress & Exhibition | 2014

Performance Comparison of Real-Time and General-Purpose Operating Systems in Parallel Physical Simulation with High Computational Cost

Carlos Garre; Domenico Mundo; Marco Gubitosa; Alessandro Toso

Real-time simulation is a valuable tool in the design and testing of vehicles and vehicle parts, mainly when interfacing with hardware modules working at a given rate, as in hardware-in-the-loop testing. Real-time operating systems (RTOS) are designed to minimize the latency of critical operations such as interrupt dispatch, task switching, or inter-process communication (IPC). General-purpose operating systems (GPOS), instead, are designed to maximize throughput in heavily loaded systems. In complex simulations where the amount of work per step is high, achieving real-time performance depends not only on the latency of the event starting the step, but also on the capacity of the system to compute one step in the available time. While it is well established that RTOS exhibit lower latencies than GPOS, the choice is not clear-cut when maximizing throughput is also critical.
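An illustrative, non-authoritative sketch of the kind of measurement involved (the paper's actual benchmarks and workloads differ): count how often a fixed-period simulation step overruns its budget on a given operating system.

```python
# Deadline-overrun counter for a fixed-rate step loop; step_fn stands in for
# one physics step of the simulation.
import time

def measure_overruns(step_fn, period_s=0.001, n_steps=1000):
    overruns, worst = 0, 0.0
    next_deadline = time.perf_counter() + period_s
    for _ in range(n_steps):
        step_fn()
        now = time.perf_counter()
        late = now - next_deadline
        worst = max(worst, late)
        if late > 0.0:
            overruns += 1                         # missed this tick's deadline
        else:
            time.sleep(next_deadline - now)       # idle until the next tick
        next_deadline += period_s
    return overruns, worst

print(measure_overruns(lambda: sum(i * i for i in range(2000))))
```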


Archive | 2008

A simple Mass-Spring system for Character Animation

Carlos Garre; Alvaro G. Perez

Collaboration


Dive into Carlos Garre's collaborations.

Top Co-Authors

Miguel A. Otaduy, King Juan Carlos University
Alvaro G. Perez, King Juan Carlos University
Fernando Hernandez, King Juan Carlos University
Fabio Bruno, University of Calabria
Francesco Cosco, Katholieke Universiteit Leuven
Alberto Sánchez, King Juan Carlos University
Gabriel Cirio, King Juan Carlos University
Ming C. Lin, University of North Carolina at Chapel Hill