
Publication


Featured research published by Daniele De Leonardis.


Systems, Man and Cybernetics | 2012

A New Gaze-BCI-Driven Control of an Upper Limb Exoskeleton for Rehabilitation in Real-World Tasks

Antonio Frisoli; Claudio Loconsole; Daniele De Leonardis; Filippo Bannò; Michele Barsotti; Carmelo Chisari; Massimo Bergamasco

This paper proposes a new multimodal architecture for gaze-independent brain-computer interface (BCI)-driven control of a robotic upper limb exoskeleton for stroke rehabilitation, providing active assistance in the execution of reaching tasks in a real-world setting. At the level of action planning, the patient's intention is decoded by an active vision system that combines a Kinect-based vision system, which can robustly identify and track 3-D objects online, with an eye-tracking system for object selection. At the level of action generation, a BCI decodes the patient's intention to move his/her own arm from brain activity analyzed during motor imagery. The main kinematic parameters of the robot-assisted reaching movement (i.e., speed, acceleration, and jerk) are modulated by the output of the BCI classifier, so that the movement is performed under continuous control of the patient's brain activity. The system was experimentally evaluated in a group of three healthy volunteers and four chronic stroke patients. Experimental results show that all subjects were able to operate the exoskeleton by BCI, with a classification accuracy of 89.4±5.0% in the robot-assisted condition and no difference in performance between stroke patients and healthy subjects. This indicates the high potential of the proposed gaze-BCI-driven robotic assistance for neurorehabilitation of patients with motor impairments after stroke, starting from the earliest phase of recovery.
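The modulation scheme described in the abstract, in which the BCI classifier output gates the kinematics of the robot-assisted reach, can be sketched as follows. This is a minimal illustration, not the paper's controller; the minimum-jerk profile and all function names and parameters are illustrative assumptions.

```python
def min_jerk_position(t, T, d):
    """Minimum-jerk position along a reach of length d (m) over duration T (s)."""
    s = min(t, T) / T
    return d * (10 * s**3 - 15 * s**4 + 6 * s**5)

def assisted_step(pos, t, T, d, bci_confidence):
    """Advance the robot-assisted movement one control step.

    Progress toward the planned trajectory is gated by the normalized
    BCI classifier output in [0, 1]: with confidence 0 the robot holds
    position, with confidence 1 it tracks the plan directly.
    """
    target = min_jerk_position(t, T, d)
    return pos + bci_confidence * (target - pos)
```

With a per-step confidence stream from the classifier, calling `assisted_step` in the control loop slows or halts the assisted reach whenever motor-imagery evidence drops, which mirrors the "continuous control of brain activity" idea.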


IEEE Transactions on Haptics | 2015

An EMG-Controlled Robotic Hand Exoskeleton for Bilateral Rehabilitation

Daniele De Leonardis; Michele Barsotti; Claudio Loconsole; Massimiliano Solazzi; Marco Troncossi; Claudio Mazzotti; Vincenzo Parenti Castelli; Caterina Procopio; Giuseppe Lamola; Carmelo Chisari; Massimo Bergamasco; Antonio Frisoli

This paper presents a novel electromyography (EMG)-driven hand exoskeleton for bilateral rehabilitation of grasping in stroke. The developed hand exoskeleton was designed with two distinctive features: (a) kinematics with intrinsic adaptability to the patient's hand size, and (b) a free-palm and free-fingertip design, preserving the residual sensory capability of touch during assisted grasping of real objects. In the envisaged bilateral training strategy, the patient's non-paretic hand acts as a guide for the paretic hand in grasping tasks. The grasping force exerted by the non-paretic hand is estimated in real time from EMG signals and then replicated as robotic assistance for the paretic hand by means of the hand exoskeleton. Estimating the grasping force through EMG allows rehabilitation exercises to be performed with any non-sensorized graspable object. This paper presents the system design, development, and experimental evaluation. Experiments were performed with a group of six healthy subjects and two chronic stroke patients executing robot-assisted grasping tasks. Results on the estimation and modulation of the robotic assistance, and on the outcomes of the pilot rehabilitation sessions with stroke patients, support the validity of the proposed approach for stroke rehabilitation.
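The EMG-to-force mapping at the core of the bilateral strategy can be sketched in a few lines. This is a hedged simplification of the general technique (rectify, smooth, scale), not the paper's estimator; the filter constant, gain, and force cap are invented for illustration.

```python
def emg_envelope(samples, alpha=0.05):
    """Rectify raw EMG and low-pass filter it with a simple
    exponential moving average, yielding the activation envelope."""
    env, out = 0.0, []
    for s in samples:
        env += alpha * (abs(s) - env)
        out.append(env)
    return out

def estimated_force(envelope_value, gain=20.0, max_force=15.0):
    """Map the envelope of the non-paretic hand's EMG to an assistance
    force command (N) for the paretic-hand exoskeleton, saturated at
    the device's force limit. All constants are illustrative."""
    return min(gain * envelope_value, max_force)
```

In a control loop, the latest envelope sample of the guiding hand would be converted to a force setpoint each cycle and sent to the exoskeleton actuators.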


International Conference on Human Haptic Sensing and Touch Enabled Computer Applications | 2014

A High-Fidelity Surface-Haptic Device for Texture Rendering on Bare Finger

Michael Wiertlewski; Daniele De Leonardis; David J. Meyer; Michael A. Peshkin; J. Edward Colgate

We present the design and evaluation of a high-fidelity surface-haptic device. The user slides a finger along a glass plate while friction is controlled via amplitude modulation of the plate's ultrasonic vibrations. A non-contact finger position sensor and a low-latency rendering scheme allow the reproduction of fine textures directly on the bare finger. The device can reproduce features as small as 25 µm while maintaining an update rate of 5 kHz. Signal attenuation, inherent to resonant devices, is compensated with a feedforward filter, enabling artifact-free rendering of virtual textures on a glass plate.
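The rendering pipeline the abstract outlines (position-dependent friction command, plus a feedforward boost to counter the resonant plate's attenuation) can be sketched as follows. Every function, grating period, and cutoff frequency here is a hypothetical stand-in, not a value from the paper.

```python
import math

def texture_amplitude(x_mm, period_mm=0.5, depth=0.3):
    """Desired friction modulation at finger position x:
    a sinusoidal grating mapped into the [0, 1] command range."""
    return 0.5 + depth * math.sin(2 * math.pi * x_mm / period_mm)

def feedforward_gain(f_hz, f_cut=120.0):
    """Hypothetical inverse first-order filter: amplify high-frequency
    commands to compensate the plate's roll-off (illustrative cutoff)."""
    return math.sqrt(1.0 + (f_hz / f_cut) ** 2)

def drive_command(x_mm, finger_speed_mm_s, period_mm=0.5):
    # The temporal frequency felt by the finger is its sliding speed
    # divided by the grating's spatial period.
    f = finger_speed_mm_s / period_mm
    return feedforward_gain(f) * texture_amplitude(x_mm, period_mm)
```

The key design point is that the compensation gain depends on finger speed: the same spatial texture produces higher temporal frequencies, and thus needs more pre-emphasis, when the finger moves faster.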


World Haptics Conference | 2015

A wearable fingertip haptic device with 3 DoF asymmetric 3-RSR kinematics

Daniele De Leonardis; Massimiliano Solazzi; Ilaria Bortone; Antonio Frisoli

A novel wearable haptic device for modulating skin stretch at the fingertip is presented. Rendering of skin stretch in three degrees of freedom (DoF), with contact/no-contact capability, was implemented through rigid parallel kinematics. The novel asymmetrical three revolute-spherical-revolute (3-RSR) configuration allowed compact dimensions with minimal encumbrance of the hand workspace and minimal inter-finger interference. A differential method for solving the non-trivial inverse kinematics is proposed and implemented in real time to control the position of the skin tactor. Experiments involving the grasping of a virtual object were conducted using two devices (on the thumb and index fingers) in a group of four subjects: results showed that participants performed the grasping task more precisely, and with grasping forces closer to the expected natural behavior, when the proposed device provided haptic feedback.
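A differential inverse-kinematics method of the kind mentioned above iterates small Jacobian-based corrections instead of solving the closure equations analytically. The sketch below applies the idea to a toy two-link planar chain rather than the actual 3-RSR mechanism; link lengths, step counts, and names are illustrative.

```python
import math

def fk(q):
    """Toy 2-link planar forward kinematics (a stand-in for the
    3-RSR chain; link lengths are arbitrary, in metres)."""
    l1, l2 = 0.04, 0.03
    x = l1 * math.cos(q[0]) + l2 * math.cos(q[0] + q[1])
    y = l1 * math.sin(q[0]) + l2 * math.sin(q[0] + q[1])
    return [x, y]

def ik_step(q, target, h=1e-6, gain=1.0):
    """One differential IK update: q += J^-1 * (target - fk(q))."""
    p = fk(q)
    e = [target[0] - p[0], target[1] - p[1]]
    # Numerical Jacobian by finite differences, one column per joint.
    J = [[0.0, 0.0], [0.0, 0.0]]
    for j in range(2):
        dq = list(q)
        dq[j] += h
        pj = fk(dq)
        J[0][j] = (pj[0] - p[0]) / h
        J[1][j] = (pj[1] - p[1]) / h
    # Solve the 2x2 system J * dq = e directly.
    det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
    dq0 = ( J[1][1] * e[0] - J[0][1] * e[1]) / det
    dq1 = (-J[1][0] * e[0] + J[0][0] * e[1]) / det
    return [q[0] + gain * dq0, q[1] + gain * dq1]
```

Run in a real-time loop seeded with the previous solution, one or two such steps per control cycle typically suffice, since the tactor target moves only slightly between cycles.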


International Symposium on Neural Networks | 2014

A novel BCI-SSVEP based approach for control of walking in Virtual Environment using a Convolutional Neural Network

Vitoantonio Bevilacqua; Giacomo Tattoli; Domenico Buongiorno; Claudio Loconsole; Daniele De Leonardis; Michele Barsotti; Antonio Frisoli; Massimo Bergamasco

A non-invasive brain-computer interface (BCI) based on a convolutional neural network (CNN) is presented as a novel approach for navigation in a virtual environment (VE). The developed navigation control interface relies on steady-state visually evoked potentials (SSVEP), whose features are discriminated in real time in the electroencephalographic (EEG) data by means of the CNN. The proposed approach was evaluated through navigation by walking in an immersive and plausible VE, enhancing the involvement of participants and their perception of the VE. Results show that a CNN-based BCI can be profitably applied for decoding SSVEP features in navigation scenarios, where a reduced number of commands needs to be selected reliably and rapidly. The participant was able to accomplish a waypoint walking task within the VE, controlling navigation through brain activity alone.
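The paper's classifier is a CNN; as a deliberately simpler, classical baseline for the same SSVEP decoding problem (not the authors' method), one can compare EEG power at each candidate stimulus frequency and pick the largest, e.g. with the Goertzel algorithm:

```python
import math

def goertzel_power(samples, fs, f_target):
    """Signal power at f_target (Hz) via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * f_target / fs)          # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2**2 + s_prev**2 - coeff * s_prev * s_prev2

def decode_command(eeg, fs, stimulus_freqs):
    """Pick the flickering-stimulus frequency with the largest
    power in the EEG window, i.e. the attended navigation command."""
    return max(stimulus_freqs, key=lambda f: goertzel_power(eeg, fs, f))
```

Such frequency-power baselines are what CNN-based decoders like the one in this paper are typically compared against, since the CNN can additionally learn harmonic structure and spatial patterns across channels.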


World Haptics Conference | 2013

An EMG-based robotic hand exoskeleton for bilateral training of grasp

Claudio Loconsole; Daniele De Leonardis; Michele Barsotti; Massimiliano Solazzi; Antonio Frisoli; Massimo Bergamasco; Marco Troncossi; Mohammad Mozaffari Foumashi; Claudio Mazzotti; Vincenzo Parenti Castelli

This work presents the development and preliminary experimental assessment of a novel EMG-driven robotic hand exoskeleton for bilateral active training of grasp motion in stroke. The system allows the grasping force required to lift a real object with the impaired hand to be controlled through the active guidance of a hand exoskeleton, whose force is modulated by EMG readings acquired from the opposite, unimpaired arm. To estimate the grasping force, the system uses surface EMG recorded on the unimpaired arm during grasping and a neural network to classify the information. The design, integration, and experimental characterization of the system during the grasp of two cylindrical objects are presented. The experimental results show that accurate tracking of the interaction force with the object can be achieved.


IEEE Haptics Symposium | 2012

Illusory perception of arm movement induced by visuo-proprioceptive sensory stimulation and controlled by motor imagery

Daniele De Leonardis; Antonio Frisoli; Massimiliano Solazzi; Massimo Bergamasco

This paper investigates the illusory perception of movement induced through visuo-proprioceptive signals and the possibility of controlling the illusory movement itself through motor imagery. In particular, we evaluated the cross-modal integration of the proprioceptive illusion, induced through vibro-mechanical stimuli, with visual feedback provided by means of a virtual avatar body. The visuo-proprioceptive signals were then integrated as enhanced feedback for a motor-imagery brain-computer interface, with the final aim of providing the user with both the perception and the mental control of the illusory movement. For this purpose, a system was specifically developed that applies vibration stimuli in proximity to the biceps and triceps brachii muscle tendons, eliciting the illusory perception of a movement in which the vibrated muscle elongates. We report that the multimodal integration of the proprioceptive illusion with the visual feedback can be successfully induced, though it depends on the subjective perception of the speed and onset time of the illusory movement. Moreover, we show that it is possible both to perceive and to control the avatar body by mental motor imagery through a brain-computer interface.


IEEE Transactions on Haptics | 2017

A 3-RSR Haptic Wearable Device for Rendering Fingertip Contact Forces

Daniele De Leonardis; Massimiliano Solazzi; Ilaria Bortone; Antonio Frisoli

A novel wearable haptic device for modulating contact forces at the fingertip is presented. Rendering of forces by skin deformation in three degrees of freedom (DoF), with contact/no-contact capability, was implemented through rigid parallel kinematics. The novel asymmetrical three revolute-spherical-revolute (3-RSR) configuration allowed compact dimensions with minimal encumbrance of the hand workspace. The device was designed to render constant-to-low-frequency deformation of the fingerpad in three DoF, combining light weight with relatively high output forces. A differential method for solving the non-trivial inverse kinematics is proposed and implemented in real time for controlling the device. A first experiment evaluated the discrimination of different fingerpad stretch directions in a group of five subjects. A second experiment, enrolling 19 subjects, evaluated the cutaneous feedback in a virtual pick-and-place manipulation task; the stiffness of the fingerpad plus device was measured and used to calibrate the physics of the virtual environment. A third experiment, with 10 subjects, evaluated interaction forces in a virtual lift-and-hold task. Although performance differed between the two manipulation experiments, the overall results show that participants controlled interaction forces better when the cutaneous feedback was active, with significant differences between the visual and visuo-haptic experimental conditions.
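The stiffness calibration mentioned in the abstract suggests a Hooke's-law coupling between the virtual contact and the skin indentation command. The sketch below shows that idea only in outline; the stiffness value, limits, and function names are invented, not the paper's measurements.

```python
def tactor_displacement(contact_force_n, k_n_per_mm=0.6, max_mm=3.0):
    """Convert a desired contact force into a skin-indentation command
    using the measured fingerpad-plus-device stiffness (Hooke's law).
    k and the travel limit are hypothetical values for illustration."""
    d = contact_force_n / k_n_per_mm
    return max(0.0, min(d, max_mm))          # clamp to device workspace

def virtual_contact_force(penetration_mm, k_env=0.6):
    """Penalty-based virtual contact: force grows linearly with the
    finger avatar's penetration into the object, zero when not touching."""
    return k_env * max(0.0, penetration_mm)
```

Matching `k_env` to the physically measured stiffness is what makes the rendered indentation consistent with the forces computed by the virtual-environment physics.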


IEEE Haptics Symposium | 2016

A new wearable fingertip haptic interface for the rendering of virtual shapes and surface features

Massimiliano Gabardi; Massimiliano Solazzi; Daniele De Leonardis; Antonio Frisoli

This work presents the Haptic Thimble, a novel wearable haptic device for surface exploration. The Haptic Thimble combines rendering of surface orientation with fast-transient, wide-bandwidth tactile cues. These features allow surface exploration with rich tactile feedback, including reactive contact/no-contact transitions and the rendering of collisions, surface asperities, and textures. These capabilities were obtained through a novel serial kinematic chain wrapped around the finger, actuated by a compact servo motor for orienting the last link and by a custom voice coil for actuating the plate in contact with the fingerpad. The performance of the voice coil was measured on the bench in static and dynamic conditions, assessing its capability to reproduce generic, wide-bandwidth (0-300 Hz) tactile cues. The overall usability of the Haptic Thimble was explored within a virtual environment involving the exploration of virtual surfaces. Finally, a perceptual experiment executed in a teleoperated environment with kinesthetic feedback showed that the addition of tactile feedback provided through the Haptic Thimble significantly improved performance in an exploratory task.
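A typical wide-bandwidth cue of the kind a voice coil like this renders at contact onset is a decaying sinusoid, a common model for collision transients in the haptics literature. The sketch below is illustrative only; frequency, decay rate, and sample rate are not taken from the paper.

```python
import math

def contact_transient(t, f_hz=250.0, decay=40.0, amp=1.0):
    """Decaying sinusoid played on the voice coil at contact onset,
    a common collision-transient model (parameters illustrative)."""
    if t < 0:
        return 0.0
    return amp * math.exp(-decay * t) * math.sin(2 * math.pi * f_hz * t)

def render_waveform(duration_s=0.05, fs=2000):
    """Sample the transient at the actuator's update rate."""
    return [contact_transient(i / fs) for i in range(int(duration_s * fs))]
```

Superimposing such transients on the slowly varying orientation cue is one way the "reactive contact" and "collision" feedback described above can be produced within the device's 0-300 Hz band.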


IEEE International Conference on Rehabilitation Robotics | 2015

A full upper limb robotic exoskeleton for reaching and grasping rehabilitation triggered by MI-BCI

Michele Barsotti; Daniele De Leonardis; Claudio Loconsole; Massimiliano Solazzi; Edoardo Sotgiu; Caterina Procopio; Carmelo Chisari; Massimo Bergamasco; Antonio Frisoli

In this paper we propose a full upper limb exoskeleton for motor rehabilitation of reaching, grasping, and releasing in post-stroke patients. The presented system takes into account hand pre-shaping for object affordance and is driven by the patient's intentional control through a self-paced, asynchronous motor-imagery-based brain-computer interface (MI-BCI). The developed anthropomorphic eight-DoF exoskeleton (two DoFs for the hand, two for the wrist, and four for the arm) allows full support of the manipulation activity at the level of each upper-limb joint. In this study, we show the feasibility of the proposed system through experimental rehabilitation sessions conducted with three chronic post-stroke patients. Results show the potential of the proposed system to be introduced into a rehabilitation protocol.
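A self-paced, asynchronous MI-BCI trigger of the kind described above is commonly built on event-related desynchronization: the exoskeleton fires when sensorimotor band power stays below a baseline-relative threshold for long enough. The sketch below illustrates that general idea, not the paper's detector; the threshold and dwell time are invented.

```python
def mi_trigger(mu_power, baseline, threshold=0.7, dwell=5):
    """Self-paced motor-imagery trigger (illustrative parameters).

    Fires when mu-band power stays below threshold * baseline
    (event-related desynchronization) for `dwell` consecutive
    analysis windows; returns the triggering window index, or
    None if imagery was never sustained long enough.
    """
    count = 0
    for i, p in enumerate(mu_power):
        count = count + 1 if p < threshold * baseline else 0
        if count >= dwell:
            return i
    return None
```

The dwell requirement is what makes the interface self-paced yet robust: brief, spontaneous power dips do not start a movement, while sustained imagery does.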

Collaboration


Dive into Daniele De Leonardis's collaborations.

Top Co-Authors

- Antonio Frisoli (Sant'Anna School of Advanced Studies)
- Massimiliano Solazzi (Sant'Anna School of Advanced Studies)
- Massimo Bergamasco (Sant'Anna School of Advanced Studies)
- Michele Barsotti (Sant'Anna School of Advanced Studies)
- Claudio Loconsole (Sant'Anna School of Advanced Studies)
- Caterina Procopio (Sant'Anna School of Advanced Studies)
- Edoardo Sotgiu (Sant'Anna School of Advanced Studies)
- Ilaria Bortone (Sant'Anna School of Advanced Studies)
- Massimiliano Gabardi (Sant'Anna School of Advanced Studies)