Francesco Nori
Istituto Italiano di Tecnologia
Publications
Featured research published by Francesco Nori.
Performance Metrics for Intelligent Systems | 2008
Giorgio Metta; Giulio Sandini; David Vernon; Lorenzo Natale; Francesco Nori
We report on the iCub, a humanoid robot for research in embodied cognition. At 104 cm tall, the iCub is the size of a three-and-a-half-year-old child. It will be able to crawl on all fours and sit up to manipulate objects. Its hands have been designed to support sophisticated manipulation skills. The iCub is distributed as Open Source under the GPL/FDL licenses. The entire design is available for download from the project homepage and repository (http://www.robotcub.org). In the following, we concentrate on the description of the hardware and software systems. The scientific objectives of the project and its philosophical underpinning are described extensively elsewhere [1].
Intelligent Robots and Systems | 2010
Ugo Pattacini; Francesco Nori; Lorenzo Natale; Giorgio Metta; Giulio Sandini
In this paper we describe the design of a Cartesian controller for a generic robot manipulator. We address some of the challenges that are typically encountered in the field of humanoid robotics. The solution we propose deals with a large number of degrees of freedom, produces smooth, human-like motion, and is able to compute the trajectory on-line. We further support the idea that, to produce significant advancements in the field of robotics, it is important to compare different approaches not only at the theoretical level but also at the implementation level. For this reason we test our software on the iCub platform and compare its performance against other available solutions.
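As an illustration of the closed-loop inverse-kinematics machinery that Cartesian controllers of this kind are typically built around, the sketch below runs damped least-squares steps on a toy planar arm. It is not the controller described in the paper; the planar kinematics, gain, damping value, and target are illustrative assumptions.

```python
import numpy as np

def damped_ls_step(J, x_err, damping=0.05):
    """One damped least-squares step: joint velocity from a Cartesian error.
    Robust near singular configurations, unlike the plain pseudo-inverse."""
    JJt = J @ J.T + (damping ** 2) * np.eye(J.shape[0])
    return J.T @ np.linalg.solve(JJt, x_err)

def planar_fk(q, lengths):
    """Forward kinematics of a planar serial chain (toy stand-in for a real arm)."""
    angles = np.cumsum(q)
    return np.array([np.sum(lengths * np.cos(angles)),
                     np.sum(lengths * np.sin(angles))])

def planar_jacobian(q, lengths):
    """Analytic Jacobian of the planar chain."""
    angles = np.cumsum(q)
    J = np.zeros((2, len(q)))
    for i in range(len(q)):
        J[0, i] = -np.sum(lengths[i:] * np.sin(angles[i:]))
        J[1, i] = np.sum(lengths[i:] * np.cos(angles[i:]))
    return J

# Drive a redundant 3-link planar arm toward a Cartesian target.
q = np.array([0.3, 0.2, 0.1])
lengths = np.array([0.3, 0.25, 0.2])
target = np.array([0.4, 0.3])
dt = 0.01
for _ in range(2000):
    x = planar_fk(q, lengths)
    dq = damped_ls_step(planar_jacobian(q, lengths), 2.0 * (target - x))
    q += dt * dq
print("final Cartesian error:", np.linalg.norm(planar_fk(q, lengths) - target))
```

The damping term keeps joint velocities bounded near singular configurations, one of the practical issues any Cartesian controller for a redundant humanoid arm has to handle.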
IEEE Transactions on Autonomous Mental Development | 2010
Angelo Cangelosi; Giorgio Metta; Gerhard Sagerer; Stefano Nolfi; Chrystopher L. Nehaniv; Kerstin Fischer; Jun Tani; Tony Belpaeme; Giulio Sandini; Francesco Nori; Luciano Fadiga; Britta Wrede; Katharina J. Rohlfing; Elio Tuci; Kerstin Dautenhahn; Joe Saunders; Arne Zeschel
This position paper proposes that the study of embodied cognitive agents, such as humanoid robots, can advance our understanding of the cognitive development of complex sensorimotor, linguistic, and social learning skills. This in turn will benefit the design of cognitive robots capable of learning to handle and manipulate objects and tools autonomously, to cooperate and communicate with other robots and humans, and to adapt their abilities to changing internal, environmental, and social conditions. Four key areas of research challenges are discussed, specifically for the issues related to the understanding of: 1) how agents learn and represent compositional actions; 2) how agents learn and represent compositional lexica; 3) the dynamics of social interaction and learning; and 4) how compositional action and language representations are integrated to bootstrap the cognitive system. The review of specific issues and progress in these areas is then translated into a practical roadmap based on a series of milestones. These milestones provide a possible set of cognitive robotics goals and test scenarios, thus acting as a research roadmap for future work on cognitive developmental robotics.
Frontiers in Computational Neuroscience | 2013
Cristiano Alessandro; Ioannis Delis; Francesco Nori; Stefano Panzeri; Bastien Berret
In this paper we review the works related to muscle synergies that have been carried-out in neuroscience and control engineering. In particular, we refer to the hypothesis that the central nervous system (CNS) generates desired muscle contractions by combining a small number of predefined modules, called muscle synergies. We provide an overview of the methods that have been employed to test the validity of this scheme, and we show how the concept of muscle synergy has been generalized for the control of artificial agents. The comparison between these two lines of research, in particular their different goals and approaches, is instrumental to explain the computational implications of the hypothesized modular organization. Moreover, it clarifies the importance of assessing the functional role of muscle synergies: although these basic modules are defined at the level of muscle activations (input-space), they should result in the effective accomplishment of the desired task. This requirement is not always explicitly considered in experimental neuroscience, as muscle synergies are often estimated solely by analyzing recorded muscle activities. We suggest that synergy extraction methods should explicitly take into account task execution variables, thus moving from a perspective purely based on input-space to one grounded on task-space as well.
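To make the input-space extraction procedure mentioned above concrete, the sketch below factors synthetic EMG envelopes with non-negative matrix factorization, a widely used synergy-extraction technique. The data, the number of synergies, and the variance-accounted-for (VAF) criterion shown here are illustrative assumptions, not taken from the review.

```python
import numpy as np
from sklearn.decomposition import NMF

# Synthetic example: EMG envelopes from 8 muscles over 200 time samples,
# generated as non-negative combinations of 3 underlying "synergies".
rng = np.random.default_rng(0)
true_synergies = rng.random((3, 8))           # each row: one module's muscle weights
activations = rng.random((200, 3)) ** 2       # time-varying activation coefficients
emg = activations @ true_synergies + 0.01 * rng.random((200, 8))

# Factor the muscle-space data (input-space) into a small set of modules.
model = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
C = model.fit_transform(emg)                  # (time x synergies) activation profiles
W = model.components_                         # (synergies x muscles) synergy vectors

# Variance accounted for (VAF), a common criterion for choosing the number of synergies.
reconstruction = C @ W
vaf = 1.0 - np.sum((emg - reconstruction) ** 2) / np.sum(emg ** 2)
print(f"VAF with 3 synergies: {vaf:.3f}")
```

Note that this pipeline uses only muscle activations; the review's point is precisely that task-space variables should also enter the extraction.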
IEEE-RAS International Conference on Humanoid Robots | 2006
Lorenzo Jamone; Giorgio Metta; Francesco Nori; Giulio Sandini
The recent trend of humanoid robotics research has been deeply influenced by concepts such as embodiment, embodied interaction and emergence. In our view, these concepts, besides shaping the controlling intelligence, should guide the very design process of modern humanoid robotic platforms. In this paper, we discuss how these principles have been applied to the design of a humanoid robot called James. James has been designed by considering an object manipulation scenario and by explicitly taking into account embodiment, interaction, and the exploitation of smart design solutions. The robot is equipped with moving eyes, neck, arm and hand, and a rich set of sensors, enabling proprioceptive, kinesthetic, tactile and visual sensing. A great deal of effort has been devoted to the design of the hand and touch sensors. Experiments, e.g., tactile object classification, have been performed to validate the quality of the robot's perceptual capabilities.
IEEE International Conference on Biomedical Robotics and Biomechatronics | 2008
Sarah Degallier; Ludovic Righetti; Lorenzo Natale; Francesco Nori; Giorgio Metta; Auke Jan Ijspeert
Movement generation in humans appears to be processed through a three-layered architecture, where each layer corresponds to a different level of abstraction in the representation of the movement. In this article, we will present an architecture reflecting this organization and based on a modular approach to human movement generation. We will show that our architecture is well suited for the online generation and modulation of motor behaviors, but also for switching between motor behaviors. This will be illustrated respectively through an interactive drumming task and through switching between reaching and crawling.
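A common way to realize such a modular architecture is with coupled dynamical systems, where a point attractor produces discrete movements and a limit-cycle oscillator produces rhythmic ones, and both can be modulated or switched on-line. The sketch below is a generic illustration of that idea (a point attractor superimposed with a Hopf oscillator); it is not the exact formulation used in the paper, and all parameter values are invented.

```python
import numpy as np

def simulate_primitive(goal, amplitude, omega=2 * np.pi, T=6.0, dt=0.001):
    """Integrate a single movement primitive: a point attractor (discrete part)
    superimposed with a Hopf oscillator (rhythmic part).

    amplitude = 0 gives a discrete, reaching-like trajectory toward `goal`;
    amplitude > 0 gives a stable oscillation of that amplitude around the goal.
    Both parameters can be changed on-line to switch behaviors mid-movement.
    """
    y = 0.0                     # discrete state (converges to goal)
    x, z = 0.01, 0.0            # rhythmic states (Hopf oscillator)
    a, mu = 5.0, amplitude ** 2
    traj = []
    for _ in range(int(T / dt)):
        # Point attractor dynamics
        y += dt * a * (goal - y)
        # Hopf oscillator: converges to a limit cycle of radius `amplitude`
        r2 = x * x + z * z
        dx = a * (mu - r2) * x - omega * z
        dz = a * (mu - r2) * z + omega * x
        x += dt * dx
        z += dt * dz
        traj.append(y + x)      # output = discrete offset + rhythmic component
    return np.array(traj)

reach = simulate_primitive(goal=1.0, amplitude=0.0)    # discrete (reaching-like) movement
drum = simulate_primitive(goal=1.0, amplitude=0.3)     # rhythmic (drumming-like) movement
print(reach[-1], drum.max(), drum.min())
```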
International Journal of Social Robotics | 2012
Alessandra Sciutti; Ambra Bisio; Francesco Nori; Giorgio Metta; Luciano Fadiga; Thierry Pozzo; Giulio Sandini
In recent decades, the introduction of robotic devices in fields such as industry, dangerous environments, and medicine has notably improved working practices. The availability of a new generation of humanoid robots for everyday activities in human-populated environments could entail an even wider revolution. Indeed, not only domestic activities but also social behaviors will adapt to continuous interaction with a completely new kind of social agent. In light of this scenario, it becomes crucial to design robots suited to natural cooperation with humans, and at the same time to develop quantitative methods to measure human-robot interaction (HRI). Motor resonance, i.e., the activation of the observer's motor control system during action perception, has been suggested to be a key component of human social behavior, and as such is thought to play a central role in HRI. In the literature there are reports of robots that have been used as tools to understand the human brain. The aim of this review is to offer a different perspective, suggesting that human responses can become a tool to measure and improve robot interactional attitudes. In the first part of the paper the notion of motor resonance and its neurophysiological correlates are introduced. Subsequently, we describe motor resonance studies on the perception of robotic agents' behavior. Finally, we introduce proactive gaze and automatic imitation, two techniques adopted in human motor resonance studies, and we present the advantages that would follow from their application to HRI.
International Journal of Humanoid Robotics | 2012
Lorenzo Jamone; Lorenzo Natale; Francesco Nori; Giorgio Metta; Giulio Sandini
In this paper we describe an autonomous strategy which enables a humanoid robot to learn how to reach for a visually identified object in the 3D space. The robot is a 22-DOF upper-body humanoid wit...
International Journal of Humanoid Robotics | 2012
Alberto Parmiggiani; Marco Maggiali; Lorenzo Natale; Francesco Nori; Alexander Schmitz; Nikos G. Tsagarakis; José Santos Victor; Francesco Becchi; Giulio Sandini; Giorgio Metta
This article describes the hardware design of the iCub humanoid robot. The iCub is an open-source humanoid robotic platform designed explicitly to support research in embodied cognition. This paper covers the mechanical and electronic design of the first release of the robot. A series of upgrades developed for the second version of the robot (iCub2), aimed at improving its mechanical and sensing performance, is also described.
Intelligent Robots and Systems | 2011
Andrea Del Prete; Simone Denei; Lorenzo Natale; Fulvio Mastrogiovanni; Francesco Nori; Giorgio Cannata; Giorgio Metta
This paper deals with the problem of estimating the position of tactile elements (i.e. taxels) that are mounted on a robot body part. This problem arises with the adoption of tactile systems with a large number of sensors, and it is particularly critical in those cases in which the system is made of flexible material that is deployed on a curved surface. In this scenario the location of each taxel is partially unknown and difficult to determine manually. Placing the device is in fact an inaccurate procedure that is affected by displacements in both position and orientation. Our approach is based on the idea that it is possible to automatically infer the position of the taxels by measuring the interaction forces exchanged between the sensorized part and the environment. The location of the contact is estimated through force/torque (F/T) measures gathered by a sensor mounted on the kinematic chain of the robot. Our method requires few hypotheses and can be effectively implemented on a real platform, as demonstrated by the experiments with the iCub humanoid robot.
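The geometric core of localizing a contact from a wrench measurement is the relation τ = r × f: for a single point contact exerting a pure force, the measured torque constrains the contact point to the force's line of action. The sketch below recovers that line from a synthetic wrench; the full calibration described in the paper additionally intersects it with the body geometry and associates contacts with taxel activations, and the numerical values here are invented for illustration.

```python
import numpy as np

def contact_line_from_wrench(force, torque):
    """For a single point contact exerting a pure force, the measured torque
    satisfies torque = r x force. This determines the contact location r only
    up to the line of action of the force: return the point of that line
    closest to the sensor origin, plus the line direction.
    """
    f2 = float(force @ force)
    r0 = np.cross(force, torque) / f2       # minimum-norm solution of torque = r x f
    direction = force / np.sqrt(f2)         # free component along the force axis
    return r0, direction

# Toy check: synthesize a wrench from a known contact point and recover its line.
r_true = np.array([0.10, 0.02, -0.03])      # contact point in the sensor frame [m]
f = np.array([1.5, -0.5, 2.0])              # contact force [N]
tau = np.cross(r_true, f)                   # torque seen by the F/T sensor [Nm]
r0, d = contact_line_from_wrench(f, tau)
# r_true must lie on the recovered line: r_true = r0 + alpha * d for some alpha.
alpha = (r_true - r0) @ d
print(np.allclose(r0 + alpha * d, r_true))  # True
```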