Publication


Featured research published by Ian M. Bullock.


IEEE Transactions on Haptics | 2013

Grasp Frequency and Usage in Daily Household and Machine Shop Tasks

Ian M. Bullock; Joshua Z. Zheng; Sara De La Rosa; Charlotte Guertler; Aaron M. Dollar

In this paper, we present results from a study of prehensile human hand use during the daily work activities of four subjects: two housekeepers and two machinists. Subjects wore a head-mounted camera that recorded their hand usage during their daily work activities in their typical place of work. For each subject, 7.45 hours of video were analyzed, recording the type of grasp being used and its duration. From this data, we extracted overall grasp frequency, duration distributions for each grasp, and common transitions between grasps. The results show that for 80 percent of the study duration the housekeepers used just five grasps and the machinists used ten. The grasping patterns of the different subjects were compared, and the overall top ten grasps are discussed in detail. The results of this study not only lend insight into how people use their hands during daily tasks, but can also inform the design of effective robotic and prosthetic hands.
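
The frequency, duration, and transition statistics described here can all be derived from a list of timestamped grasp annotations. A minimal sketch, assuming a hypothetical annotation format of (grasp label, start time, end time) tuples rather than the study's actual coding scheme:

```python
from collections import Counter

# Hypothetical tagged-video annotations: (grasp label, start s, end s)
annotations = [
    ("medium wrap", 0.0, 4.2),
    ("lateral pinch", 4.2, 5.1),
    ("medium wrap", 5.1, 9.8),
    ("precision tripod", 9.8, 10.5),
]

# Total time spent in each grasp type
durations = Counter()
for grasp, start, end in annotations:
    durations[grasp] += end - start

# Fraction of total study duration per grasp, most-used first
total = sum(durations.values())
for grasp, d in durations.most_common():
    print(f"{grasp:>18}: {d / total:5.1%} of study duration")

# Transitions between successive grasps
transitions = Counter(
    (a[0], b[0]) for a, b in zip(annotations, annotations[1:])
)
print(transitions.most_common(3))
```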


IEEE Transactions on Haptics | 2013

A Hand-Centric Classification of Human and Robot Dexterous Manipulation

Ian M. Bullock; Raymond R. Ma; Aaron M. Dollar

This work contributes to the development of a common framework for the discussion and analysis of dexterous manipulation across the human and robotic domains. An overview of previous work is first provided, along with an analysis of the tradeoffs between arm and hand dexterity. A hand-centric and motion-centric manipulation classification is then presented and applied in four different ways. It is first discussed how the taxonomy can be used to identify a manipulation strategy. Then, applications for robot hand analysis and engineering design are explained. Finally, the classification is applied to three activities of daily living (ADLs) to distinguish the patterns of dexterous manipulation involved in each task. The same analysis method could be used to predict problem ADLs for various impairments or to produce a representative benchmark set of ADL tasks. Overall, the proposed classification scheme creates a descriptive framework that can be used to effectively describe hand movements during manipulation in a variety of contexts, and it might be combined with existing object-centric or other taxonomies to provide a complete description of a specific manipulation task.
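
As a rough illustration of the hand-centric, motion-centric criteria the classification builds on (contact, prehension, and object motion relative to the hand frame), the sketch below encodes a simplified decision cascade. The category names are illustrative placeholders, not the paper's exact taxonomy:

```python
from dataclasses import dataclass

@dataclass
class ManipulationState:
    contact: bool               # is the hand touching the object?
    prehensile: bool            # is the object stably grasped?
    motion_in_hand_frame: bool  # does the object move relative to the hand?

def classify(state: ManipulationState) -> str:
    """Illustrative decision cascade over hand-centric criteria."""
    if not state.contact:
        return "no contact"
    if not state.prehensile:
        return "non-prehensile contact (e.g. pushing)"
    if not state.motion_in_hand_frame:
        return "prehensile, no within-hand motion (e.g. carrying)"
    return "within-hand (dexterous) manipulation"

print(classify(ManipulationState(True, True, True)))
```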


IEEE International Conference on Rehabilitation Robotics | 2011

Classifying human manipulation behavior

Ian M. Bullock; Aaron M. Dollar

This paper presents a taxonomy for the detailed classification of human and anthropomorphic manipulation behavior. This hand-centric, motion-centric taxonomy differentiates tasks based on criteria such as object contact, prehension, and the nature of object motion relative to a hand frame. A sub-classification of the most dexterous category, within-hand manipulation, is also presented, based on the principal axis of object rotation or translation in the hand frame. Principles for categorizing complex, multi-faceted tasks are also presented, along with illustrative examples. We hope that the proposed taxonomy will both establish a standard language around human and anthropomorphic manipulation and enable improved understanding of the differences in hand use across a wide variety of behavior. Although designed for human and anthropomorphic hands, the taxonomy might easily be extended to a wide range of robot manipulators and end-effectors.
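
The within-hand sub-classification keys on the principal axis of object rotation or translation in the hand frame. A hedged sketch of how such an axis might be extracted from tracked object positions, assuming the positions are already expressed in a hand-fixed frame; the axis names are placeholders:

```python
import numpy as np

# Hypothetical object positions in a hand-fixed frame (N x 3, metres)
rng = np.random.default_rng(0)
positions = rng.normal(scale=[0.02, 0.005, 0.002], size=(200, 3))

# Principal axis of translation: leading right singular vector of the
# mean-centred positions (i.e. the first PCA component)
centered = positions - positions.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
principal_axis = vt[0]

# Map the dominant direction onto a named hand-frame axis
axis_names = ["x (radial-ulnar)", "y (proximal-distal)", "z (palmar-dorsal)"]
print("principal translation axis:",
      axis_names[int(np.abs(principal_axis).argmax())])
```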


IEEE Transactions on Haptics | 2014

Analysis of Human Grasping Behavior: Object Characteristics and Grasp Type

Thomas Feix; Ian M. Bullock; Aaron M. Dollar

This paper is the first of a two-part series analyzing human grasping behavior during a wide range of unstructured tasks. The results help clarify overall characteristics of the human hand and can inform many domains, such as the design of robotic manipulators, targeting rehabilitation toward important hand functionality, and designing haptic devices for use by the hand. It investigates the properties of objects grasped by two housekeepers and two machinists over the course of almost 10,000 grasp instances and correlates the grasp types used with the properties of the objects. We establish an object classification that assigns each object properties from a set of seven classes, including mass, shape and size of the grasp location, grasped dimension, rigidity, and roundness. The results showed that 55 percent of grasped objects had at least one dimension larger than 15 cm, suggesting that more than half of objects cannot physically be grasped along their largest axis. Ninety-two percent of objects had a mass of 500 g or less, implying that a high payload capacity may be unnecessary to accomplish a large subset of human grasping behavior. In terms of grasps, 96 percent of grasp locations were 7 cm or less in width, which can help define requirements for hand rehabilitation and defines a reasonable grasp aperture size for a robotic hand. Subjects grasped the smallest overall major dimension of the object in 94 percent of the instances. This suggests that grasping the smallest axis of an object could be a reliable default behavior to implement in grasp planners.
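
The 94 percent smallest-dimension finding suggests a simple default rule for grasp planners. A minimal sketch combining it with the 7 cm aperture observation; the function and its threshold handling are illustrative, not a published planner:

```python
def default_grasp_axis(dims_cm):
    """Pick the grasp axis as the smallest of the object's three major
    dimensions, mirroring the dominant observed human behavior; reject
    objects whose smallest dimension exceeds a 7 cm aperture (the width
    that covered 96 percent of observed grasp locations)."""
    axis = min(range(3), key=lambda i: dims_cm[i])
    if dims_cm[axis] > 7.0:
        raise ValueError("no dimension fits a 7 cm grasp aperture")
    return axis

# Hypothetical box-like object: 20 x 12 x 4 cm
print(default_grasp_axis((20.0, 12.0, 4.0)))  # -> 2 (the 4 cm axis)
```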


The International Journal of Robotics Research | 2015

The Yale human grasping dataset: Grasp, object, and task data in household and machine shop environments

Ian M. Bullock; Thomas Feix; Aaron M. Dollar

This paper presents a dataset of human grasping behavior in unstructured environments. Wide-angle head-mounted camera video was recorded from two housekeepers and two machinists during their regular work activities, and the grasp types, objects, and tasks were analyzed and coded by study staff. The full dataset contains 27.7 hours of tagged video and represents a wide range of manipulative behaviors spanning much of typical human hand usage. We provide the original videos, a spreadsheet including the tagged grasp type, object, and task parameters, time information for each successive grasp, and video screenshots for each instance. Example code is provided for MATLAB and R, demonstrating how to load the dataset and produce simple plots.
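
A short Python sketch of the kind of loading and summarizing the bundled MATLAB and R examples perform. The column names below are assumptions about the tagged spreadsheet, not its documented schema; a real session would start with pd.read_csv() on the exported sheet:

```python
import pandas as pd

# Hypothetical rows standing in for the tagged spreadsheet
df = pd.DataFrame({
    "subject":      ["housekeeper1", "housekeeper1", "machinist1"],
    "grasp_type":   ["medium wrap", "lateral pinch", "precision tripod"],
    "object":       ["spray bottle", "cloth", "caliper"],
    "start_time_s": [12.0, 16.2, 8.4],
    "end_time_s":   [16.2, 17.0, 11.9],
})

# Frequency of each tagged grasp type
print(df["grasp_type"].value_counts())

# Total tagged grasp time per subject
df["duration_s"] = df["end_time_s"] - df["start_time_s"]
print(df.groupby("subject")["duration_s"].sum())
```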


IEEE International Conference on Robotics and Automation | 2013

Finding small, versatile sets of human grasps to span common objects

Ian M. Bullock; Thomas Feix; Aaron M. Dollar

Robotic and prosthetic hand designers are challenged to replicate as much of the functionality of the human hand as possible while minimizing cost and unnecessary complexity. Selecting which aspects of human hand function to emulate can be difficult, especially when little data is available on unstructured human manipulation behavior. The present work analyzes 19 hours of video with over 9,000 grasp instances from two housekeepers and two machinists to find small sets of versatile human grasps. A novel grasp span metric is used to evaluate sets of grasps and pick an optimal grasp set that can effectively handle as many different objects as possible. The results show that medium wrap and lateral pinch are both important, versatile grasps for basic object handling. The results also suggest that three-fingertip precision grasps such as thumb-2 finger, tripod, or lateral tripod can handle dexterous manipulation of a wide range of objects. The recommended grasp sets can help guide difficult design decisions for robotic and prosthetic hands, and they suggest important human hand functionality to restore during hand surgery or to rehabilitate in an impaired hand.
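
The paper's grasp span metric is its own contribution; as a loose stand-in, the sketch below greedily picks a small grasp set that covers as many distinct objects as possible from hypothetical (grasp, object) observations:

```python
from collections import defaultdict

# Hypothetical (grasp, object) pairs observed in tagged video
observations = [
    ("medium wrap", "mug"), ("medium wrap", "bottle"),
    ("lateral pinch", "key"), ("lateral pinch", "card"),
    ("tripod", "pen"), ("tripod", "key"), ("thumb-2 finger", "coin"),
]

objects_by_grasp = defaultdict(set)
for grasp, obj in observations:
    objects_by_grasp[grasp].add(obj)

# Greedily pick grasps that cover the most not-yet-handled objects
all_objects = set.union(*objects_by_grasp.values())
covered, chosen = set(), []
while len(chosen) < 3 and covered != all_objects:
    best = max(objects_by_grasp,
               key=lambda g: len(objects_by_grasp[g] - covered))
    chosen.append(best)
    covered |= objects_by_grasp[best]

print(chosen, covered)
```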


IEEE Transactions on Haptics | 2014

Analysis of Human Grasping Behavior: Correlating Tasks, Objects and Grasps

Thomas Feix; Ian M. Bullock; Aaron M. Dollar

This paper is the second in a two-part series analyzing human grasping behavior during a wide range of unstructured tasks. It investigates the tasks performed during the daily work of two housekeepers and two machinists and correlates grasp type and object properties with the attributes of the tasks being performed. Each task or activity is classified according to the force required, the degrees of freedom, and the functional task type. We found that 46 percent of tasks are constrained, meaning the manipulated object is not free to move in all six degrees of freedom. Analyzing the interrelationships between the grasp, object, and task data shows that the best predictors of grasp type are object size, task constraints, and object mass. Using these attributes, the grasp type can be predicted with 47 percent accuracy; these parameters likely make useful heuristics for grasp planning systems. The results further suggest that the common sub-categorization of grasps into power, intermediate, and precision categories may not be appropriate, indicating that grasps are generally more multi-functional than previously thought. We find that large and heavy objects are grasped with a power grasp, but small and lightweight objects are not necessarily grasped with precision grasps: even for grasped objects smaller than 2 cm with mass less than 20 g, precision grasps are used only 61 percent of the time. These results have important implications for robotic hand design and grasp planners, since it appears that while power grasps are frequently used for heavy objects, they can still be quite practical for small, lightweight objects.
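
To show the shape of such a predictor, the toy sketch below fits a decision tree on the three attributes named above (object size, object mass, task constraint). The rows are invented and the model is not the paper's, which reached 47 percent accuracy over many grasp types:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical feature rows: [object size (cm), object mass (g),
# task constrained (1) or free (0)]; labels are coarse grasp categories.
X = np.array([
    [12.0, 450, 0], [1.5, 15, 0], [8.0, 300, 1],
    [2.0, 20, 1], [15.0, 600, 0], [1.0, 10, 0],
])
y = ["power", "precision", "power", "power", "power", "precision"]

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.predict([[3.0, 50, 1]]))  # predicted grasp category
```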


IEEE Haptics Symposium | 2014

Dexterous workspace of human two- and three-fingered precision manipulation

Ian M. Bullock; Thomas Feix; Aaron M. Dollar

Precision manipulation, in which an object held between the fingertips is translated and/or rotated with respect to the hand without sliding, is used frequently in everyday tasks such as writing, yet few studies have examined the experimental precision manipulation workspace of the human hand. This study evaluates the range of positions over which 19 participants manipulated a moderately sized (3.3-4.1 cm diameter) object using either the thumb and index finger (two-finger condition) or the thumb, index, and middle fingers (three-finger condition). The results show that the two-fingered workspace is on average 40% larger than the three-fingered workspace (p < 0.001), likely due to the added kinematic constraints of an additional finger. Representative precision manipulation workspaces for a median 17.5 cm hand length are shown from multiple views to clearly illustrate the overall workspace shape, and the general relationship between hand length and workspace volume is evaluated. This view of the human precision manipulation workspace has various applications, ranging from motivating the design of effective, comfortable haptic interfaces to benchmarking the performance of robotic and prosthetic hands.
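
Workspace volume comparisons of this kind can be approximated by the convex hull of the visited positions; whether the study used a hull-based estimate is an assumption here, and the tracked positions below are synthetic:

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(1)

# Synthetic stand-ins for tracked object positions (metres) per condition
two_finger = rng.normal(scale=0.015, size=(500, 3))
three_finger = rng.normal(scale=0.013, size=(500, 3))

# Workspace size as the volume of the convex hull of visited positions
v2 = ConvexHull(two_finger).volume
v3 = ConvexHull(three_finger).volume
print(f"2-finger / 3-finger workspace volume ratio: {v2 / v3:.2f}")
```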


IEEE Transactions on Biomedical Engineering | 2015

Workspace Shape and Characteristics for Human Two- and Three-Fingered Precision Manipulation

Ian M. Bullock; Thomas Feix; Aaron M. Dollar

Goal: To study precision manipulation, which involves repositioning an object in the fingertips and is used in everyday tasks such as writing and key insertion, as well as in domain-specific tasks such as making small scalpel cuts, using tweezers, and hand soldering. Methods: In this study, the range of positions (workspace) through which 19 participants manipulated a 3.3-4.1 cm-diameter object is measured with a magnetic tracker. Each participant performed two conditions: a two-finger thumb-index condition and a three-finger thumb-index-middle condition. Results: The observed workspaces, normalized to a 17.5 cm hand length, are small compared to free-finger trajectories; for the two-finger trials, 68% of points are within 1.05 cm of the centroid and 95% are within 2.31 cm, while the three-finger case shows a narrower distribution, with 68% of points within 0.94 cm of the centroid and 95% within 2.19 cm. The longest axis is a long thin arc in the proximal-palmar plane. Analysis of fingertip workspaces shows that index fingertip workspace volume is the best linear predictor of object workspace (R² = 0.98). Conclusion: Precision manipulation workspace size and shape are shown, along with how the fingers are used during the manipulation. Significance: The results have many applications, including normative data for rehabilitation, guidelines for ergonomic device design, and benchmarking of prosthetic and robotic hands.
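
A small sketch of the normalization and centroid-distance statistics reported above, using synthetic tracker data and a hypothetical participant hand length:

```python
import numpy as np

rng = np.random.default_rng(2)
positions = rng.normal(scale=0.01, size=(1000, 3))  # synthetic tracker data (m)
hand_length_cm = 18.6                               # hypothetical participant

# Scale positions to the reference 17.5 cm hand length
positions *= 17.5 / hand_length_cm

# Fraction of points within a given radius of the workspace centroid
d = np.linalg.norm(positions - positions.mean(axis=0), axis=1)
for q in (68, 95):
    print(f"{q}% of points within {np.percentile(d, q) * 100:.2f} cm of centroid")
```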


IEEE/RSJ International Conference on Intelligent Robots and Systems | 2014

Analyzing human fingertip usage in dexterous precision manipulation: Implications for robotic finger design

Ian M. Bullock; Thomas Feix; Aaron M. Dollar

Designing robot hands for dexterous precision manipulation involves many complex tradeoffs in order to optimize hand performance. While many studies focus on overall hand kinematics, far fewer consider tradeoffs in the design of the robotic finger surfaces themselves. Our present work uses 3.8 total hours of precision manipulation data from 19 participants to examine the fingertip surfaces used while moving a sphere through as much of the feasible position workspace as possible. Fingertip surface use is estimated by measuring the relative orientation changes between high-resolution 6-DOF sensors mounted on the fingernails and in the object being manipulated, indicating to what extent the object has been “rolled” onto the sides of the fingers. The results show significant lateral use of the index and middle fingers, and also show that the side surface of the index finger is used much more in two-finger manipulation than in three-finger manipulation. The lateral fingertip usage suggests that robot finger designs could benefit from enabling lateral surface use. The lateral middle finger use also suggests that fingers can be effectively used as passive supports, supplying forces in directions that may not be actively controlled. We anticipate these results will be useful especially for robotic and prosthetic hand design, but also in other fields such as rehabilitation and haptic interface design.
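
The rolling estimate rests on the relative orientation between a fingernail-mounted sensor and the in-object sensor. A minimal sketch with two made-up orientation samples, using scipy's rotation utilities:

```python
from scipy.spatial.transform import Rotation as R

# Hypothetical orientation samples (quaternions, x-y-z-w) from a
# fingernail-mounted sensor and the in-object sensor
q_nail = R.from_quat([0.0, 0.0, 0.0, 1.0])
q_obj = R.from_quat([0.0, 0.259, 0.0, 0.966])  # roughly 30 deg about y

# Relative orientation of the object in the fingernail frame; its
# rotation magnitude estimates how far the object has rolled across
# the fingertip surface between the two samples
rel = q_nail.inv() * q_obj
print(f"roll onto finger surface: {rel.magnitude():.3f} rad")
```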

Collaboration


Dive into Ian M. Bullock's collaboration.

Top Co-Authors

Charlotte Guertler

Washington University in St. Louis
