
Publications


Featured research published by Giulio Rognini.


PLOS ONE | 2012

Extending the Body to Virtual Tools Using a Robotic Surgical Interface: Evidence from the Crossmodal Congruency Task

Ali Sengül; Michiel van Elk; Giulio Rognini; Jane E. Aspell; Hannes Bleuler; Olaf Blanke

The effects of real-world tool use on body or space representations are relatively well established in cognitive neuroscience. Several studies have shown, for example, that active tool use results in a facilitated integration of multisensory information in peripersonal space, i.e. the space directly surrounding the body. However, it remains unknown to what extent similar mechanisms apply to the use of virtual-robotic tools, such as those used in the field of surgical robotics, in which a surgeon may use bimanual haptic interfaces to control a surgery robot at a remote location. This paper presents two experiments in which participants used a haptic handle, originally designed for a commercial surgery robot, to control a virtual tool. The integration of multisensory information related to the virtual-robotic tool was assessed by means of the crossmodal congruency task, in which subjects responded to tactile vibrations applied to their fingers while ignoring visual distractors superimposed on the tip of the virtual-robotic tool. Our results show that active virtual-robotic tool use changes the spatial modulation of the crossmodal congruency effects, comparable to changes in the representation of peripersonal space observed during real-world tool use. Moreover, when the virtual-robotic tools were held in a crossed position, the visual distractors interfered strongly with tactile stimuli delivered to the hand that was connected to them via the tool, reflecting a remapping of peripersonal space. Such remapping was observed not only when the virtual-robotic tools were actively used (Experiment 1), but also when the tools were passively held (Experiment 2). The present study extends earlier findings on the extension of peripersonal space from physical and pointing tools to virtual-robotic tools using techniques from haptics and virtual reality. We discuss our data with respect to learning and human factors in the field of surgical robotics, and the use of new technologies in the field of cognitive neuroscience.
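The crossmodal congruency effect (CCE) used in this and several of the studies below is conventionally computed as the difference in reaction time (and/or error rate) between incongruent and congruent visuo-tactile trials, with a larger CCE taken as an index of stronger visuo-tactile integration. A minimal sketch of that computation, using hypothetical trial data (not from any of the papers listed here):

```python
# Standard crossmodal congruency effect (CCE) computation:
# CCE = mean RT on incongruent trials - mean RT on congruent trials.
# The trial data below are hypothetical, for illustration only.
from statistics import mean

# Each trial: (congruency condition, reaction time in ms)
trials = [
    ("congruent", 520), ("congruent", 505), ("congruent", 498),
    ("incongruent", 575), ("incongruent", 560), ("incongruent", 590),
]

def cce(trials):
    """Crossmodal congruency effect in ms (larger = stronger interference)."""
    rts = {"congruent": [], "incongruent": []}
    for condition, rt in trials:
        rts[condition].append(rt)
    return mean(rts["incongruent"]) - mean(rts["congruent"])

print(round(cce(trials), 2))  # positive: incongruent distractors slow responses
```

A positive CCE means visual distractors at incongruent locations slowed tactile responses, which is the signature these studies track across tool-use and posture conditions.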


European Journal of Neuroscience | 2013

Visuo-tactile integration and body ownership during self-generated action

Giulio Rognini; Ali Sengül; Jane E. Aspell; Roy Salomon; Hannes Bleuler; Olaf Blanke

Although there is increasing knowledge about how visual and tactile cues from the hands are integrated, little is known about how self‐generated hand movements affect such multisensory integration. Visuo‐tactile integration often occurs under highly dynamic conditions requiring sensorimotor updating. Here, we quantified visuo‐tactile integration by measuring cross‐modal congruency effects (CCEs) in different bimanual hand movement conditions with the use of a robotic platform. We found that classical CCEs also occurred during bimanual self‐generated hand movements, and that such movements lowered the magnitude of visuo‐tactile CCEs as compared to static conditions. Visuo‐tactile integration, body ownership and the sense of agency were decreased by adding a temporal visuo‐motor delay between hand movements and visual feedback. These data show that visual stimuli interfere less with the perception of tactile stimuli during movement than during static conditions, especially when decoupled from predictive motor information. The results suggest that current models of visuo‐tactile integration need to be extended to account for multisensory integration in dynamic conditions.


International Workshop on Advanced Motion Control | 2012

Towards multimodal haptics for teleoperation: Design of a tactile thermal display

Simon Gallo; Laura Santos-Carreras; Giulio Rognini; Masayuki Hara; Akio Yamamoto; Toshiro Higuchi

Surgical robotics is among the most challenging applications of motion control. Present and future systems are essentially master-slave systems. Our work focuses on force feedback and haptic interfaces. In this context, we study multimodal haptic interfaces, i.e. the fusion of force feedback with other tactile information such as temperature or pressure. First results support the proposition that such multimodal haptic devices can help improve surgeons' dexterity and motion control. To strengthen this point, we investigate the psychophysics of thermal perception. This paper presents a device for temperature feedback that can be integrated into a multimodal haptic console. A finger-sized tactile temperature display able to generate temperature gradients under the fingertip is presented along with first measurement results.
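Driving a thermal display toward a commanded fingertip temperature is, at its simplest, a closed-loop control problem. A minimal sketch of a proportional controller stepping a simulated thermal element toward a warm-stimulus target; the gains, plant model, and temperatures are hypothetical and not taken from the paper's hardware:

```python
# Hypothetical closed-loop sketch for a thermal fingertip display:
# a proportional controller drives a simulated thermal element toward
# a target temperature. All names and numbers are illustrative.
def p_controller_step(current_c, target_c, gain=0.8, max_drive=1.0):
    """One proportional-control step; returns a drive in [-max_drive, max_drive]."""
    error = target_c - current_c
    drive = gain * error
    return max(-max_drive, min(max_drive, drive))

# Simulate a crude first-order thermal plant converging to a +4 °C step.
temp = 32.0      # skin-contact temperature, °C
target = 36.0    # warm-stimulus target, °C
for _ in range(50):
    drive = p_controller_step(temp, target)
    temp += 0.5 * drive  # simplistic plant response per control step
print(round(temp, 2))  # converges to the target
```

A real display (e.g. Peltier-based) would add sensor calibration, bipolar drive for cooling, and safety limits, but the control structure is the same.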


Experimental Brain Research | 2013

Force feedback facilitates multisensory integration during robotic tool use

Ali Sengül; Giulio Rognini; Michiel van Elk; Jane E. Aspell; Hannes Bleuler; Olaf Blanke

The present study investigated the effects of force feedback in relation to tool use on the multisensory integration of visuo-tactile information. Participants learned to control a robotic tool through a surgical robotic interface. Following tool-use training, participants performed a crossmodal congruency task, by responding to tactile vibrations applied to their hands, while ignoring visual distractors superimposed on the robotic tools. In the first experiment it was found that tool-use training with force feedback facilitates multisensory integration of signals from the tool, as reflected in a stronger crossmodal congruency effect with the force feedback training compared to training without force feedback and to no training. The second experiment extends these findings by showing that training with realistic online force feedback resulted in a stronger crossmodal congruency effect compared to training in which force feedback was delayed. The present study highlights the importance of haptic information for multisensory integration and extends findings from classical tool-use studies to the domain of robotic tools. We argue that such crossmodal congruency effects are an objective measure of robotic tool integration and propose some potential applications in surgical robotics, robotic tools, and human–tool interaction.


PLOS ONE | 2014

Crossing the Hands Increases Illusory Self-Touch

Polona Pozeg; Giulio Rognini; Roy Salomon; Olaf Blanke

Manipulation of hand posture, such as crossing the hands, has been frequently used to study how the body and its immediately surrounding space are represented in the brain. Abundant data show that a crossed-arms posture impairs the remapping of tactile stimuli from a somatotopic to an external-space reference frame and deteriorates performance on several tactile processing tasks. Here we investigated how impaired tactile remapping affects illusory self-touch, induced by the non-visual variant of the rubber hand illusion (RHI) paradigm. In this paradigm, blindfolded participants (Experiment 1) had their hands either uncrossed or crossed over the body midline. The strength of illusory self-touch was measured with questionnaire ratings and proprioceptive drift. Our results showed that, during synchronous tactile stimulation, the strength of illusory self-touch increased when the hands were crossed compared to the uncrossed posture. Follow-up experiments showed that the increase in illusion strength was not related to unfamiliar hand position (Experiment 2) and that the illusion was equally strengthened regardless of where in peripersonal space the hands were crossed (Experiment 3). However, while the boosting effect of crossing the hands was evident from subjective ratings, proprioceptive drift was not modulated by the crossed posture. Finally, in contrast to the illusion increase in the non-visual RHI, crossed hand postures did not alter illusory ownership or proprioceptive drift in the classical, visuo-tactile version of the RHI (Experiment 4). We argue that the increase in illusory self-touch is related to misalignment of somatotopic and external reference frames and consequently inadequate tactile-proprioceptive integration, leading to re-weighting of the tactile and proprioceptive signals. The present study not only shows that illusory self-touch can be induced by crossing the hands but, importantly, that this posture is associated with a stronger illusion.


Applied Bionics and Biomechanics | 2011

Dionis: A novel remote-center-of-motion parallel manipulator for Minimally Invasive Surgery

Ricardo Beira; Laura Santos-Carreras; Giulio Rognini; Hannes Bleuler; Reymond Clavel

The large volume and reduced dexterity of current surgical robotic systems are factors that restrict their effective performance. To improve the usefulness of surgical robots in minimally invasive surgery (MIS), a compact and accurate positioning mechanism, named Dionis, is proposed in this paper. This spatial hybrid mechanism, based on a novel parallel kinematics, is able to provide three rotations and one translation for single-port procedures. The corresponding axes intersect at a remote center of rotation (RCM) that coincides with the MIS entry port. Another important feature of the proposed positioning manipulator is that it can be placed below the operating-table plane, allowing quick and direct access to the patient without removing the robotic system. This, besides saving precious space in the operating room, may improve safety over existing solutions. The conceptual design of Dionis is presented in this paper. Solutions for the inverse and direct kinematics are developed, as well as the analytical workspace and singularity analysis. Due to its unique design and kinematics, the proposed mechanism is highly compact and stiff, and its dexterity fulfils the workspace specifications for MIS procedures.
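The remote-center-of-motion constraint described above means that, whatever orientation and insertion depth the mechanism commands, the tool shaft must keep passing through the fixed entry port. A minimal geometric sketch of that constraint (not Dionis's actual kinematics; angles, names, and the spherical parameterization are illustrative):

```python
# Illustrative sketch of the remote-center-of-motion (RCM) constraint:
# rotations are taken about a fixed entry point, and the single
# translation is the insertion depth along the tool axis, so the shaft
# always passes through the port. Numbers are hypothetical.
import math

RCM = (0.0, 0.0, 0.0)  # entry-port position (the fixed remote center)

def tool_axis_dir(pitch, yaw):
    """Unit tool-axis direction for pitch/yaw rotations about the RCM.
    (A roll about the shaft axis leaves points on the axis unmoved.)"""
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

def tool_axis_point(pitch, yaw, insertion):
    """Point on the shaft: RCM plus insertion depth along the tool axis."""
    d = tool_axis_dir(pitch, yaw)
    return tuple(RCM[i] + insertion * d[i] for i in range(3))

def distance_to_rcm_axis(pitch, yaw, point):
    """Distance from `point` to the tool axis through the RCM (0 if on-axis)."""
    d = tool_axis_dir(pitch, yaw)
    v = tuple(point[i] - RCM[i] for i in range(3))
    proj = sum(v[i] * d[i] for i in range(3))
    off = tuple(v[i] - proj * d[i] for i in range(3))
    return math.sqrt(sum(c * c for c in off))

# Any commanded (pitch, yaw, insertion) keeps the tool tip on an axis
# through the entry port:
p = tool_axis_point(0.3, -0.5, 80.0)
print(distance_to_rcm_axis(0.3, -0.5, p))  # ~0: tip stays on the RCM axis
```

Mechanisms like Dionis enforce this constraint mechanically through their linkage geometry rather than in software, which is what makes them safe at the incision.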


Frontiers in Psychology | 2015

Voluntary self-touch increases body ownership

Masayuki Hara; Polona Pozeg; Giulio Rognini; Takahiro Higuchi; Kazunobu Fukuhara; Akio Yamamoto; Toshiro Higuchi; Olaf Blanke; Roy Salomon

Experimental manipulations of body ownership have indicated that multisensory integration is central to forming bodily self-representation. Voluntary self-touch is a unique multisensory situation involving corresponding motor, tactile and proprioceptive signals. Yet, even though self-touch is frequent in everyday life, its contribution to the formation of body ownership is not well understood. Here we investigated the role of voluntary self-touch in body ownership using a novel adaptation of the rubber hand illusion (RHI), in which a robotic system and virtual reality allowed participants to administer self-touch to real and virtual hands. In the first experiment, active and passive self-touch were applied in the absence of visual feedback. In the second experiment, we tested the role of visual feedback in this bodily illusion. Finally, in the third experiment, we compared active and passive self-touch to the classical RHI, in which the touch is administered by the experimenter. We hypothesized that active self-touch would increase ownership over the virtual hand through the addition of motor signals strengthening the bodily illusion. The results indicated that active self-touch elicited stronger illusory ownership than passive self-touch and sensory-only stimulation, showing an important role for active self-touch in the formation of the bodily self.


International Conference on Intelligent Robots and Systems | 2011

A novel approach to the manipulation of body-parts ownership using a bilateral master-slave system

Masayuki Hara; Giulio Rognini; Nathan Evans; Olaf Blanke; Akio Yamamoto; Hannes Bleuler; Toshiro Higuchi

This paper introduces a novel approach to the manipulation of body-parts ownership, using the tactile rubber hand illusion (RHI) paradigm. In conventional studies on the RHI, the participant's invisible hand and the visible rubber hand are manually tapped or stroked by an experimenter. In contrast, in our approach a bilateral master-slave system is applied to provide tactile stimulation in a novel interactive manner: active self-touch. Here, we present a 3-DOF master-slave system based on human self-touch characteristics and a validation experiment using an arranged version of the conventional RHI paradigm. In this new version, participants can contact the rubber hand by manipulating the master device with their right hand and actively touch their own left hand through the slave device. The results demonstrate that the master-slave system can be successfully used to manipulate body-parts ownership, opening the way to new studies concerning self-touch and body representation.


Journal of Neuroscience Methods | 2014

A novel manipulation method of human body ownership using an fMRI-compatible master-slave system

Masayuki Hara; Roy Salomon; Wietske van der Zwaag; Tobias Kober; Giulio Rognini; Hiroyuki Nabae; Akio Yamamoto; Olaf Blanke; Toshiro Higuchi

Bodily self-consciousness has become an important topic in cognitive neuroscience, which aims to understand how the brain creates a unified sensation of the self in a body. Specifically, the full body illusion (FBI), in which changes in bodily self-consciousness are experimentally induced using visuo-tactile stimulation, has improved understanding of these mechanisms. This paper introduces a novel approach to the classic FBI paradigm using a robotic master-slave system that allows us to examine interactions between action and the sense of body ownership in behavioral and MRI experiments. In the proposed approach, the robotic master-slave system enables a unique form of stimulation in which experimental participants administer tactile cues to their own back through active self-touch. Active self-touch has never before been employed in FBI experiments, and it allows testing of the role of sensorimotor integration and agency (the feeling of control over our actions) in FBI paradigms. The objective of this study is to propose a robotic-haptic platform enabling a new FBI paradigm that includes active self-touch in MRI environments. This paper first describes the design concept and the performance of the prototype device in the fMRI environment (for 3T and 7T MRI scanners). In addition, the prototype device is applied to a classic FBI experiment, and we verify that it succeeds in inducing the FBI. These results indicate that the proposed approach has the potential to drive advances in our understanding of human body ownership and agency by allowing novel manipulations and paradigms.


Trends in Cognitive Sciences | 2016

Cognetics: Robotic Interfaces for the Conscious Mind

Giulio Rognini; Olaf Blanke

Cognetics joins the cognitive neuroscience of bodily awareness with robotics to study, control, and enhance perception, cognition, and consciousness. We highlight robot-controlled bodily perception, conscious states, and social interactions and sketch how future cognetic interfaces will impact cognitive neuroscience and human enhancement.

Collaboration


Giulio Rognini's top co-authors and their affiliations.

Top Co-Authors

Olaf Blanke (École Polytechnique Fédérale de Lausanne)

Hannes Bleuler (École Polytechnique Fédérale de Lausanne)

Roy Salomon (École Polytechnique Fédérale de Lausanne)

Ali Sengül (École Polytechnique Fédérale de Lausanne)

Laura Santos-Carreras (École Polytechnique Fédérale de Lausanne)

Polona Pozeg (École Polytechnique Fédérale de Lausanne)

Simon Gallo (École Polytechnique Fédérale de Lausanne)