Publication


Featured research published by Thomas Feix.


IEEE Transactions on Human-Machine Systems | 2016

The GRASP Taxonomy of Human Grasp Types

Thomas Feix; Javier Romero; Heinz-Bodo Schmiedmayer; Aaron M. Dollar; Danica Kragic

In this paper, we analyze and compare existing human grasp taxonomies and synthesize them into a single new taxonomy (dubbed “The GRASP Taxonomy” after the GRASP project funded by the European Commission). We consider only static and stable grasps performed by one hand. The goal is to extract the largest set of different grasps that were referenced in the literature and arrange them in a systematic way. The taxonomy provides a common terminology to define human hand configurations and is important in many domains such as human-computer interaction and tangible user interfaces, where an understanding of the human is the basis for a proper interface. Overall, 33 different grasp types are found and arranged into the GRASP taxonomy. Within the taxonomy, grasps are arranged according to 1) opposition type, 2) the virtual finger assignments, 3) type in terms of power, precision, or intermediate grasp, and 4) the position of the thumb. The resulting taxonomy incorporates all grasps found in the reviewed taxonomies that complied with the grasp definition. We also show that due to the nature of the classification, the 33 grasp types might be reduced to a set of 17 more general grasps if only the hand configuration is considered without the object shape/size.
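As an illustration of the four classification dimensions, the sketch below shows one way taxonomy entries could be encoded in Python. The grasp names are real taxonomy entries, but the attribute values are plausible placeholders rather than a transcription of the paper's table.

```python
from dataclasses import dataclass

# Minimal sketch: one record per grasp type, following the four dimensions
# used to organize the GRASP taxonomy (opposition type, virtual finger
# assignment, power/intermediate/precision class, thumb position).
# The concrete values are illustrative; consult the paper's taxonomy table
# for the authoritative assignments of all 33 grasps.

@dataclass
class GraspType:
    name: str
    opposition: str        # "palm", "pad", or "side"
    virtual_fingers: str   # fingers acting as a unit against the thumb/palm
    grasp_class: str       # "power", "intermediate", or "precision"
    thumb: str             # "abducted" or "adducted"

taxonomy = [
    GraspType("medium wrap", "palm", "2-5", "power", "adducted"),
    GraspType("lateral pinch", "side", "2", "intermediate", "adducted"),
    GraspType("tripod", "pad", "2-3", "precision", "abducted"),
]

# Example query: group grasp names by opposition type.
by_opposition = {}
for g in taxonomy:
    by_opposition.setdefault(g.opposition, []).append(g.name)
print(by_opposition)
```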


IEEE Transactions on Robotics | 2013

A Metric for Comparing the Anthropomorphic Motion Capability of Artificial Hands

Thomas Feix; Javier Romero; Carl Henrik Ek; Heinz-Bodo Schmiedmayer; Danica Kragic

We propose a metric for comparing the anthropomorphic motion capability of robotic and prosthetic hands. The metric is based on the evaluation of how many different postures or configurations a hand can perform by studying the reachable set of fingertip poses. To define a benchmark for comparison, we first generate data with human subjects based on an extensive grasp taxonomy. We then develop a methodology for comparison using generative, nonlinear dimensionality reduction techniques. We assess the performance of different hands with respect to the human hand and with respect to each other. The method can be used to compare other types of kinematic structures.
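The sketch below illustrates the underlying idea in a deliberately simplified, planar form: sample the fingertip positions a finger can reach and measure how much of a reference (human) set a candidate hand covers. The paper's actual metric operates on full fingertip poses and uses generative nonlinear dimensionality reduction; the link lengths and joint limits below are invented for illustration.

```python
import numpy as np

# Toy, planar illustration of the idea behind the metric: compare the set
# of fingertip positions a candidate hand can reach against a reference
# (human) set. Link lengths and joint limits are invented values.

def fingertip_positions(l1, l2, q1_range, q2_range, n=40):
    """Sample fingertip (x, y) positions of a planar 2-link finger."""
    q1 = np.linspace(*q1_range, n)
    q2 = np.linspace(*q2_range, n)
    Q1, Q2 = np.meshgrid(q1, q2)
    x = l1 * np.cos(Q1) + l2 * np.cos(Q1 + Q2)
    y = l1 * np.sin(Q1) + l2 * np.sin(Q1 + Q2)
    return np.column_stack([x.ravel(), y.ravel()])

def coverage(reference, candidate, tol=0.05):
    """Fraction of reference points within tol of the candidate set."""
    d = np.linalg.norm(reference[:, None, :] - candidate[None, :, :], axis=2)
    return float(np.mean(d.min(axis=1) < tol))

# "Human" finger with generous joint ranges vs. a hand with restricted flexion.
human = fingertip_positions(0.5, 0.4, (0.0, 1.6), (0.0, 1.8))
robot = fingertip_positions(0.5, 0.4, (0.0, 1.6), (0.0, 0.9))
print(f"candidate hand covers {coverage(human, robot):.0%} of the human set")
```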


IEEE Transactions on Haptics | 2014

Analysis of Human Grasping Behavior: Object Characteristics and Grasp Type

Thomas Feix; Ian M. Bullock; Aaron M. Dollar

This paper is the first of a two-part series analyzing human grasping behavior during a wide range of unstructured tasks. The results help clarify overall characteristics of the human hand to inform many domains, such as the design of robotic manipulators, targeting rehabilitation toward important hand functionality, and designing haptic devices for use by the hand. It investigates the properties of objects grasped by two housekeepers and two machinists during the course of almost 10,000 grasp instances and correlates the grasp types used to the properties of the object. We establish an object classification that assigns each object properties from a set of seven classes, including mass, shape and size of the grasp location, grasped dimension, rigidity, and roundness. The results showed that 55 percent of grasped objects had at least one dimension larger than 15 cm, suggesting that more than half of objects cannot physically be grasped using their largest axis. Ninety-two percent of objects had a mass of 500 g or less, implying that a high payload capacity may be unnecessary to accomplish a large subset of human grasping behavior. In terms of grasps, 96 percent of grasp locations were 7 cm or less in width, which can help to define requirements for hand rehabilitation and defines a reasonable grasp aperture size for a robotic hand. Subjects grasped the smallest overall major dimension of the object in 94 percent of the instances. This suggests that grasping the smallest axis of an object could be a reliable default behavior to implement in grasp planners.
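The percentages above are aggregate statistics over tagged grasp instances. A minimal sketch of how such figures could be computed from an annotation table is shown below; the column names and values are hypothetical placeholders, not the study's actual schema or data.

```python
import pandas as pd

# Sketch of the kind of aggregate statistics reported above, computed from
# a table of tagged grasp instances. Column names and values are made up.

df = pd.DataFrame({
    "largest_dim_cm": [30.0, 8.0, 22.0, 5.0, 40.0],
    "mass_g":         [400, 120, 900, 30, 250],
    "grasp_width_cm": [6.0, 3.5, 8.0, 2.0, 5.5],
})

print("objects with a dimension > 15 cm:",
      f"{(df['largest_dim_cm'] > 15).mean():.0%}")
print("objects with mass <= 500 g:",
      f"{(df['mass_g'] <= 500).mean():.0%}")
print("grasp locations <= 7 cm wide:",
      f"{(df['grasp_width_cm'] <= 7).mean():.0%}")
```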


The International Journal of Robotics Research | 2015

The Yale human grasping dataset: Grasp, object, and task data in household and machine shop environments

Ian M. Bullock; Thomas Feix; Aaron M. Dollar

This paper presents a dataset of human grasping behavior in unstructured environments. Wide-angle head-mounted camera video was recorded from two housekeepers and two machinists during their regular work activities, and the grasp types, objects, and tasks were analyzed and coded by study staff. The full dataset contains 27.7 hours of tagged video and represents a wide range of manipulative behaviors spanning much of the typical human hand usage. We provide the original videos, a spreadsheet including the tagged grasp type, object, and task parameters, time information for each successive grasp, and video screenshots for each instance. Example code is provided for MATLAB and R, demonstrating how to load in the dataset and produce simple plots.
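The released example code is in MATLAB and R. A comparable Python sketch is shown below, assuming the annotation spreadsheet has been exported to CSV; the file name and the grasp-type column name are assumptions to be adapted to the actual layout.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Illustrative Python analogue of the MATLAB/R examples shipped with the
# dataset: load the annotation table and plot how often each grasp type
# occurs. File name and "grasp" column name are assumptions.

annotations = pd.read_csv("yale_grasp_annotations.csv")

grasp_counts = annotations["grasp"].value_counts()
grasp_counts.plot(kind="bar")
plt.ylabel("number of grasp instances")
plt.title("Grasp type frequency in the Yale human grasping dataset")
plt.tight_layout()
plt.show()
```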


IEEE International Conference on Robotics and Automation | 2013

Finding small, versatile sets of human grasps to span common objects

Ian M. Bullock; Thomas Feix; Aaron M. Dollar

Robotic and prosthetic hand designers are challenged to replicate as much functionality of the human hand as possible, while minimizing cost and any unnecessary complexity. Selecting which aspects of human hand function to emulate can be difficult, especially when little data is available on unstructured human manipulation behavior. The present work analyzes 19 hours of video with over 9000 grasp instances from two housekeepers and two machinists to find small sets of versatile human grasps. A novel grasp span metric is used to evaluate sets of grasps and pick an optimal grasp set which can effectively handle as many different objects as possible. The results show medium wrap and lateral pinch are both important, versatile grasps for basic object handling. The results suggest that three-fingertip precision grasps such as thumb-2 finger, tripod, or lateral tripod can be used to handle dexterous manipulation of a wide range of objects. The recommended grasp sets can help guide difficult design decisions for robotic and prosthetic hands, as well as suggesting important human hand functionality to restore during hand surgery or rehabilitate in an impaired hand.
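A minimal sketch of the selection idea is shown below, simplified to a greedy set-cover over a binary can/cannot-handle relation; the paper's grasp span metric is more nuanced, and the grasp-object table here is invented for illustration.

```python
# Greedy sketch of choosing a small, versatile grasp set: repeatedly add
# the grasp that covers the most not-yet-covered objects. The grasp-object
# table is made up for illustration.

can_handle = {
    "medium wrap":   {"mug", "bottle", "broom", "box"},
    "lateral pinch": {"key", "card", "plate"},
    "tripod":        {"pen", "key", "screw"},
    "power sphere":  {"ball", "apple", "mug"},
}

def greedy_grasp_set(can_handle, n_grasps):
    chosen, covered = [], set()
    for _ in range(n_grasps):
        best = max(can_handle, key=lambda g: len(can_handle[g] - covered))
        chosen.append(best)
        covered |= can_handle[best]
    return chosen, covered

grasps, objects = greedy_grasp_set(can_handle, 2)
print("chosen grasps:", grasps)
print("objects covered:", sorted(objects))
```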


IEEE/RSJ International Conference on Intelligent Robots and Systems | 2010

Spatio-temporal modeling of grasping actions

Javier Romero; Thomas Feix; Hedvig Kjellström; Danica Kragic

Understanding the spatial dimensionality and temporal context of human hand actions can provide representations for programming grasping actions in robots and inspire design of new robotic and prosthetic hands. The natural representation of human hand motion has high dimensionality. For specific activities such as handling and grasping of objects, the commonly observed hand motions lie on a lower-dimensional non-linear manifold in hand posture space. Although full body human motion is well studied within Computer Vision and Biomechanics, there is very little work on the analysis of hand motion with nonlinear dimensionality reduction techniques. In this paper we use Gaussian Process Latent Variable Models (GPLVMs) to model the lower dimensional manifold of human hand motions during object grasping. We show how the technique can be used to embed high-dimensional grasping actions in a lower-dimensional space suitable for modeling, recognition and mapping.
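As a rough, runnable stand-in for the GPLVM used in the paper, the sketch below embeds synthetic joint-angle data with kernel PCA; the data are generated from a single latent "closing" parameter to mimic the low-dimensional structure of grasping motions. This is an illustration of the embedding step, not the paper's model.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

# Embed synthetic 20-D joint-angle vectors into a 2-D latent space.
# Kernel PCA is used here only as a simple nonlinear stand-in for the
# GPLVM described in the paper; the data are synthetic.

rng = np.random.default_rng(0)
closing = rng.uniform(0.0, 1.0, size=200)    # latent grasp "closing" phase
weights = rng.uniform(0.5, 1.5, size=20)     # per-joint scaling
joint_angles = np.outer(closing, weights) + 0.02 * rng.standard_normal((200, 20))

embedding = KernelPCA(n_components=2, kernel="rbf", gamma=0.5)
latent = embedding.fit_transform(joint_angles)
print("latent space shape:", latent.shape)   # (200, 2)
```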


IEEE Transactions on Haptics | 2014

Analysis of Human Grasping Behavior: Correlating Tasks, Objects and Grasps

Thomas Feix; Ian M. Bullock; Aaron M. Dollar

This paper is the second in a two-part series analyzing human grasping behavior during a wide range of unstructured tasks. It investigates the tasks performed during the daily work of two housekeepers and two machinists and correlates grasp type and object properties with the attributes of the tasks being performed. The task or activity is classified according to the force required, the degrees of freedom, and the functional task type. We found that 46 percent of tasks are constrained, where the manipulated object is not allowed to move in a full six degrees of freedom. Analyzing the interrelationships between the grasp, object, and task data shows that the best predictors of the grasp type are object size, task constraints, and object mass. Using these attributes, the grasp type can be predicted with 47 percent accuracy. Those parameters likely make useful heuristics for grasp planning systems. The results further suggest that the common sub-categorization of grasps into power, intermediate, and precision categories may not be appropriate, indicating that grasps are generally more multi-functional than previously thought. We find large and heavy objects are grasped with a power grasp, but small and lightweight objects are not necessarily grasped with precision grasps: even with grasped object size less than 2 cm and mass less than 20 g, precision grasps are only used 61 percent of the time. These results have important implications for robotic hand design and grasp planners, since it appears that, while power grasps are frequently used for heavy objects, they can still be quite practical for small, lightweight objects.
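The sketch below illustrates the kind of prediction described above: a simple decision tree mapping object size, object mass, and a task-constraint flag to a grasp class. The training data and labelling rule are synthetic; the 47 percent figure comes from the paper's real annotations, not from this toy.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Predict a grasp class from object size, object mass and a binary task
# constraint. Both the features and the labelling rule are synthetic
# stand-ins used only to make the example runnable.

rng = np.random.default_rng(1)
n = 500
size_cm = rng.uniform(0.5, 30, n)
mass_g = rng.uniform(5, 2000, n)
constrained = rng.integers(0, 2, n)

# Crude synthetic labelling rule just to give the classifier structure.
labels = np.where(mass_g > 500, "power",
         np.where(size_cm < 2, "precision", "intermediate"))

X = np.column_stack([size_cm, mass_g, constrained])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

clf = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```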


IEEE Transactions on Robotics | 2013

Extracting Postural Synergies for Robotic Grasping

Javier Romero; Thomas Feix; Carl Henrik Ek; Hedvig Kjellström; Danica Kragic

We address the problem of representing and encoding human hand motion data using nonlinear dimensionality reduction methods. We build our work on the notion of postural synergies, which are typically based on a linear embedding of the data. In addition to addressing the encoding of postural synergies using nonlinear methods, we relate our work to control strategies of combined reaching and grasping movements. We show the drawbacks of the (commonly made) causality assumption and propose methods that model the data as being generated from an inferred latent manifold to cope with the problem. Another important contribution is a thorough analysis of the parameters used in the employed dimensionality reduction techniques. Finally, we provide an experimental evaluation that shows how the proposed methods outperform the standard techniques, both in terms of recognition and generation of motion patterns.
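For reference, the linear baseline the paper builds on can be sketched directly: postural synergies as the principal components of recorded hand postures. The joint-angle data below are synthetic and purely illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA

# Linear postural synergies: principal components of hand posture data.
# The synthetic "postures" are generated from two latent factors so the
# two-component PCA captures most of the variance by construction.

rng = np.random.default_rng(2)
phase = rng.uniform(0, 1, size=(300, 2))        # two latent factors
mixing = rng.normal(size=(2, 20))               # 20 joint angles
postures = phase @ mixing + 0.05 * rng.standard_normal((300, 20))

pca = PCA(n_components=2).fit(postures)
print("variance explained by two synergies:",
      f"{pca.explained_variance_ratio_.sum():.0%}")

# Any posture can then be approximated as a combination of the synergies.
reconstructed = pca.inverse_transform(pca.transform(postures[:1]))
```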


Journal of the Royal Society Interface | 2015

Estimating thumb–index finger precision grip and manipulation potential in extant and fossil primates

Thomas Feix; Tracy L. Kivell; Emmanuelle Pouydebat; Aaron M. Dollar

Primates, and particularly humans, are characterized by superior manual dexterity compared with other mammals. However, drawing the biomechanical link between hand morphology/behaviour and functional capabilities in non-human primates and fossil taxa has been challenging. We present a kinematic model of thumb–index precision grip and manipulative movement based on bony hand morphology in a broad sample of extant primates and fossil hominins. The model reveals that both joint mobility and digit proportions (scaled to hand size) are critical for determining precision grip and manipulation potential, but that having either a long thumb or great joint mobility alone does not necessarily yield high precision manipulation. The results suggest even the oldest available fossil hominins may have shared comparable precision grip manipulation with modern humans. In particular, the predicted human-like precision manipulation of Australopithecus afarensis, approximately one million years before the first stone tools, supports controversial archaeological evidence of tool-use in this taxon.
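A very loose, planar caricature of the modelling idea is sketched below: precision-grip potential depends on whether, and over how large a region, the thumb and index fingertip workspaces overlap, which in turn depends on digit proportions and joint mobility. All lengths, joint limits, and base positions are invented; the paper's model is built from 3D bony hand morphology.

```python
import numpy as np

# Planar toy model: precision-grip potential approximated by how much of
# the index fingertip workspace is also reachable by the thumb tip.
# Link lengths, joint limits and base positions are invented values.

def digit_tip_positions(base, l1, l2, q1_range, q2_range, n=40):
    q1 = np.linspace(*q1_range, n)
    q2 = np.linspace(*q2_range, n)
    Q1, Q2 = np.meshgrid(q1, q2)
    x = base[0] + l1 * np.cos(Q1) + l2 * np.cos(Q1 + Q2)
    y = base[1] + l1 * np.sin(Q1) + l2 * np.sin(Q1 + Q2)
    return np.column_stack([x.ravel(), y.ravel()])

def overlap_fraction(a, b, tol=0.05):
    """Fraction of points in a that lie within tol of some point in b."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return float(np.mean(d.min(axis=1) < tol))

index_tip = digit_tip_positions(base=(0.0, 0.0), l1=0.45, l2=0.30,
                                q1_range=(-1.4, 0.2), q2_range=(-1.6, 0.0))
thumb_tip = digit_tip_positions(base=(0.3, -0.5), l1=0.40, l2=0.35,
                                q1_range=(0.3, 1.8), q2_range=(0.0, 1.2))

print(f"index workspace also reachable by the thumb: "
      f"{overlap_fraction(index_tip, thumb_tip):.0%}")
```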


IEEE Haptics Symposium | 2014

Dexterous workspace of human two- and three-fingered precision manipulation

Ian M. Bullock; Thomas Feix; Aaron M. Dollar

Precision manipulation, in which an object held between the fingertips is translated and/or rotated with respect to the hand without sliding, is used frequently in everyday tasks such as writing, yet few studies have examined the experimental precision manipulation workspace of the human hand. This study evaluates the range of positions over which 19 participants manipulated a moderately sized (3.3-4.1 cm diameter) object using either the thumb and index finger (2 finger condition) or the thumb, index and middle fingers (3 finger condition). The results show that the 2-fingered workspace is on average 40% larger than the 3-fingered workspace (p < 0.001), likely due to added kinematic constraints from an additional finger. Representative precision manipulation workspaces for a median 17.5 cm length hand are shown from multiple views to clearly illustrate the overall workspace shape, while the general relationship between hand length and workspace volume is evaluated. This view of the human precision manipulation workspace has various applications, ranging from motivating the design of effective, comfortable haptic interfaces to benchmarking the performance of robotic and prosthetic hands.
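One simple way to turn recorded manipulation data into a workspace volume, roughly in the spirit of the study, is to take the convex hull of the recorded object positions; the sketch below does this on a synthetic point cloud (the study's workspaces come from its actual participant data).

```python
import numpy as np
from scipy.spatial import ConvexHull

# Quantify a precision-manipulation workspace as the convex hull volume of
# the manipulated object's recorded positions. The point cloud below is
# synthetic, loosely shaped like a small region in front of the fingertips.

rng = np.random.default_rng(3)
positions = rng.normal(size=(500, 3)) * np.array([0.02, 0.015, 0.01])  # metres

hull = ConvexHull(positions)
print(f"workspace volume: {hull.volume * 1e6:.1f} cm^3")
```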

Collaboration


Dive into Thomas Feix's collaborations.

Top Co-Authors

Danica Kragic, Royal Institute of Technology

Carl Henrik Ek, Royal Institute of Technology

Hedvig Kjellström, Royal Institute of Technology

Heinz-Bodo Schmiedmayer, Vienna University of Technology