Brian T. Gleeson
University of Utah
Publications
Featured research published by Brian T. Gleeson.
IEEE Transactions on Haptics | 2010
Brian T. Gleeson; Scott K. Horschel; William R. Provancher
Application of tangential skin displacement at the fingertip has been shown to be effective in communicating direction and has potential for several applications. We have developed a portable, fingertip-mounted tactile display capable of displacing and stretching the skin of the fingerpad, using a 7 mm hemispherical tactor. In vivo tests of fingerpad skin stiffness were performed to determine the forces required to effectively render stimuli. Other design parameters, such as stimulus speed and displacement, were derived from our earlier work. The tactile display is capable of rendering ±1 mm of displacement at arbitrary orientations within a plane and at rates of approximately 5 mm/s. Compliance and backlash in the device's drive train were characterized using external measurements and were compensated for in software to reduce device hysteresis.
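The abstract does not give the compensation algorithm; below is a minimal sketch of how drive-train backlash and compliance might be corrected in software. All constants and function names are hypothetical, chosen only to illustrate the idea of overdriving the motor command on direction reversals and under load.

```python
# Sketch of software compensation for drive-train backlash and compliance.
# The constants below are hypothetical; the paper characterized the real
# values with external measurements.

BACKLASH_MM = 0.05          # lost motion on direction reversal (hypothetical)
COMPLIANCE_MM_PER_N = 0.02  # drive-train deflection per newton (hypothetical)

def compensated_command(target_mm, prev_target_mm, prev_dir, load_n):
    """Return (motor command, direction) intended to place the tactor at
    `target_mm` despite backlash and elastic deflection."""
    step = target_mm - prev_target_mm
    direction = prev_dir if step == 0 else (1 if step > 0 else -1)
    cmd = target_mm
    # Take up the backlash deadband whenever motion reverses direction.
    if direction != prev_dir:
        cmd += direction * BACKLASH_MM
    # Overdrive to cancel elastic deflection under the fingertip load.
    cmd += direction * COMPLIANCE_MM_PER_N * load_n
    return cmd, direction
```

For example, reversing from 1.0 mm back to 0.5 mm under a 2 N load would command slightly past the target to absorb the deadband and deflection.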
IEEE Transactions on Haptics | 2010
Brian T. Gleeson; Scott K. Horschel; William R. Provancher
A variety of tasks could benefit from the availability of direction cues that do not rely on vision or sound. The application of tangential skin displacement at the fingertip has been found to be a reliable means of communicating direction and has potential to be rendered by a compact device. Our lab has conducted experiments exploring the use of this type of tactile stimulus to communicate direction. Each subject pressed his/her right index fingertip against a 7 mm rounded rubber cylinder that moved at constant speed, applying shear force to deform the skin of the fingerpad. A range of displacements (0.05-1 mm) and speeds (0.5-4 mm/s) were tested. Subjects were asked to respond with the direction of the skin stretch, choosing from four directions, each separated by 90 degrees. Direction detection accuracy was found to depend upon both the speed and total displacement of the stimulus, with higher speeds and larger displacements resulting in greater accuracy. Accuracy rates greater than 95 percent were observed with as little as 0.2 mm of tangential displacement and at speeds as slow as 1 mm/s. Results were analyzed for direction dependence and temporal trends. Subjects responded most accurately to stimuli in the proximal and distal directions, and least accurately to stimuli in the ulnar direction. Subject performance decreased slightly with prolonged testing but there was no statistically significant learning trend. A second experiment was conducted to evaluate priming effects and the benefit of repeated stimuli. It was found that repeated stimuli do not improve direction communication, but subject responses were found to have a priming effect on future performance. This preliminary information will inform the design and use of a tactile display suitable for use in hand-held electronics.
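The stimuli described above are constant-speed tangential tactor motions at one of four directions separated by 90 degrees. As an illustration only (the paper does not publish its rendering code), a trajectory for one such cue could be generated like this:

```python
import math

def skin_stretch_trajectory(direction_deg, displacement_mm, speed_mm_s, dt=0.001):
    """Constant-speed tangential tactor trajectory for one direction cue.

    Returns a list of (x_mm, y_mm) tactor positions sampled every `dt`
    seconds. The experiments used displacements of 0.05-1 mm, speeds of
    0.5-4 mm/s, and directions of 0, 90, 180, and 270 degrees.
    """
    theta = math.radians(direction_deg)
    ux, uy = math.cos(theta), math.sin(theta)
    n_steps = int(round(displacement_mm / (speed_mm_s * dt)))
    return [(ux * speed_mm_s * dt * i, uy * speed_mm_s * dt * i)
            for i in range(n_steps + 1)]
```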
IEEE Transactions on Haptics | 2012
Rebecca L. Koslover; Brian T. Gleeson; J. T. de Bever; William R. Provancher
This paper reports on a series of user experiments evaluating the design of a multimodal test platform capable of rendering visual, audio, vibrotactile, and directional skin-stretch stimuli. The test platform is a handheld, wirelessly controlled device that will facilitate experiments with mobile users in realistic environments. Stimuli rendered by the device are fully characterized, and have little variance in stimulus onset timing. A series of user experiments utilizing navigational cues validates the function of the device and investigates the user response to all stimulus modes. Results show users are capable of interpreting all stimuli with high accuracy and can use the direction cues for mobile navigation. Tests included both stationary (seated) and mobile (walking a simple obstacle course) tasks. Accuracy and response time patterns are similar in both seated and mobile conditions. This device provides a means of designing and evaluating multimodal communication methods for handheld devices and will facilitate experiments investigating the effects of stimulus mode on device usability and situation awareness.
Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems | 2009
Brian T. Gleeson; Scott K. Horschel; William R. Provancher
A variety of tasks could benefit from the availability of direction cues that do not rely on vision or sound. Skin stretch has been found to be a reliable means of communicating direction and has potential to be rendered by a compact device. We have conducted experiments exploring the use of lateral skin stretch at the fingertip to communicate direction. A small rubber cylinder was pressed against a subject's fingertip and moved at constant speed to stretch the skin of the fingerpad. The skin was stretched over a range of displacements (0.05–1 mm) and speeds (0.5–4 mm/s). Subjects were asked to respond with the direction of the skin stretch, choosing from four directions, each separated by 90 degrees. It was found that subjects could perceive skin stretch direction with as little as 0.05 mm of stretch. Direction detection accuracy was found to depend on both the speed and total displacement of the skin stretch: higher speeds and larger displacements resulted in greater accuracy. High accuracy rates, greater than 95%, were observed with as little as 0.2 mm of skin stretch and at speeds as slow as 2 mm/s. Accuracy was also found to vary with the direction of the stimulus. This preliminary information will be used to inform the design of a miniature tactile display suitable for use in hand-held electronics.
IEEE Haptics Symposium | 2010
Mark A. Fehlberg; Brian T. Gleeson; Levi C. Leishman; William R. Provancher
People use handrests every day to complete dexterous activities as routine as providing a signature. However, the dexterous workspace of the hand is somewhat limited. To address this limit, we have developed an Active Handrest to aid in precision manipulation tasks by extending a user's dexterous workspace while providing ergonomic support for reduced fatigue - ideally while maintaining or even improving upon the precision obtained from a fixed handrest. Such a device could be useful for performing precision tasks over large workspaces, such as surgery, machining, or pick-and-place tasks. Our current prototype Active Handrest is a planar, computer-controlled support for the user's wrist and arm that allows the user complete control over a grasped tool or manipulation device. The device uses force input from the user's hand, position input from a grasped manipulandum, or a combination of both force and position inputs. The control algorithm of the device then interprets and converts the input(s) into handrest motions. Pilot studies were conducted to optimize the control strategy by investigating the effects of control mode and of velocity limits. Task precision and completion time were used as performance metrics. Pilot testing showed that the device provided the greatest task precision when its velocity was limited to 5 mm/s, while using force input for its control strategy. An experiment was then conducted to compare the Active Handrest to various fixed wrist and arm support conditions, as well as the unsupported condition. Use of the Active Handrest was found to reduce task error by 36.6% compared to performing the tasks with an unsupported arm, and by 26.0% compared to task completion with a static wrist support. These results are statistically significant (p < 0.0001). While users generally completed experiments more slowly using the Active Handrest, performance with the Active Handrest shows lower sensitivity of task error relative to task completion time. Added experience with our drawing task leads to an increase in accuracy; however, the Active Handrest continues to outperform other hand support conditions (p < 0.0001).
The International Journal of Robotics Research | 2012
Mark A. Fehlberg; Brian T. Gleeson; William R. Provancher
To address the limited dexterous workspace of the human hand, we have developed the Active Handrest. This device assists in precision manipulation tasks by extending a user’s dexterous workspace while providing ergonomic support for reduced fatigue. People use handrests to complete dexterous activities as routine as providing a signature. However, the dexterous workspace of the statically supported hand is somewhat limited. By providing consistent support over large workspaces the Active Handrest could be useful for performing precision tasks, such as surgery, upper limb rehabilitation, and machining. Our prototype Active Handrest is a planar, human–machine interface that provides support for the user’s wrist and arm while allowing the user to retain complete control over a grasped tool or manipulated device. The Active Handrest uses force input from the user’s hand, position input from a grasped tool, or a combination of these inputs. The device’s controller then converts the input(s) into handrest motions. In this paper we describe our novel device prototype and establish a baseline for its performance. Preliminary experiments were conducted to investigate the effects of control input, velocity limits, and user experience. Subsequent experiments compared the Active Handrest to various other support conditions. Use of the Active Handrest was found to significantly reduce task error and provided better speed-accuracy performance than the other tested support methods.
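The controller described above converts force and/or tool-position inputs into handrest motions. The papers do not publish the control law, but a minimal admittance-style sketch, with hypothetical gains and the 5 mm/s velocity limit reported in the conference paper's pilot testing, might look like this:

```python
def handrest_velocity(force_n, tool_offset_mm, blend=0.5,
                      force_gain=2.0, pos_gain=1.0, v_limit=5.0):
    """Blend force and tool-position inputs into a handrest velocity command.

    `blend` = 1 uses only force (admittance) input; 0 uses only tool
    position; intermediate values mix the two. Gains are hypothetical;
    pilot testing reported the best precision with force input and a
    5 mm/s velocity limit.
    """
    v = blend * force_gain * force_n + (1 - blend) * pos_gain * tool_offset_mm
    # Saturate at the velocity limit found to give the greatest precision.
    return max(-v_limit, min(v_limit, v))
```

A single scalar axis is shown for clarity; the real device is planar, so the same law would apply per axis with the limit applied to the velocity magnitude.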
IEEE Transactions on Haptics | 2011
Brian T. Gleeson; William R. Provancher
Experiments were conducted using a novel tactile contact rendering device to explore important factors of the tactile contact event. The effects of contact velocity and event-based transient vibrations were explored. Our research was motivated by a need to better understand the perception of the tactile contact event and to develop a means of rendering stiff surfaces with a nonspecialized haptic device. A passive tactile display, suitable for mounting on a Phantom robot, was developed and is capable of rendering the tactile sensation of contact on a fingertip over a range of velocities commonly experienced during everyday manipulation and tactile exploration. Experiments were conducted with this device to explore how tactile contact dynamics affect the perceived stiffness of a virtual surface. It was found that contact velocity does not have a significant effect on perceived stiffness. These results can be explained by prior research that defines perceived hardness (akin to stiffness) in terms of rate-hardness. However, in agreement with prior literature with stylus-based studies, the addition of transient vibrations to the contact event can, in some cases, increase the perceived stiffness.
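The transient vibrations mentioned above are commonly rendered in event-based haptics as a brief, decaying oscillation added open-loop at the moment of contact. The sketch below shows one such waveform, a velocity-scaled exponentially decaying sinusoid; the specific parameters are hypothetical, not taken from the paper.

```python
import math

def contact_transient(t_s, impact_velocity_mm_s, freq_hz=150.0,
                      decay_rate=60.0, gain=0.01):
    """Open-loop transient force (N) added at contact onset.

    A velocity-scaled, exponentially decaying sinusoid: larger impact
    velocities produce stronger transients, which can increase the
    perceived stiffness of the surface. All parameters hypothetical.
    """
    return (gain * impact_velocity_mm_s * math.exp(-decay_rate * t_s)
            * math.sin(2 * math.pi * freq_hz * t_s))
```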
IEEE Haptics Symposium | 2012
Brian T. Gleeson; William R. Provancher
Several researchers have developed haptic devices capable of rendering directional stimuli. When these devices are integrated into mobile or handheld devices, it becomes possible for a user to hold the haptic device in any orientation and thereby receive directional stimuli that may be out of alignment with the rest of the world. In such cases, it becomes necessary for the user to perform a mental transformation of the directional stimuli, so that the stimuli may be understood in a fixed or global reference frame. This paper addresses two questions: 1. Can users perform such transformations and successfully interpret stimuli, and 2. What cognitive processes are involved in these transformations? In our experiments, users performed timed identification of directional tactile stimuli with their hand in a variety of orientations around a single axis. The results show that: 1. Users can successfully identify directional stimuli both quickly and accurately, even when the stimuli are rendered in a rotated reference frame, and 2. These tasks involve the mental rotation of a spatial mental representation of the stimulus, and also show evidence of embodiment effects. Furthermore, small angles of rotation (up to ~40°) incur very little cognitive cost, suggesting that tactile direction stimuli delivered through a handheld device would be robust to variations in user hand orientation.
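The paper studies the user's mental rotation between device and world frames. The complementary device-side operation, pre-rotating a world-frame cue into the device frame so the rendered stimulus stays world-aligned as the hand turns, is a one-line frame transform; this sketch assumes rotation about a single axis, as in the experiments, and hypothetical function names:

```python
def device_frame_direction(world_dir_deg, hand_yaw_deg):
    """Direction to render in the device frame so that the cue points
    toward `world_dir_deg` in the world frame, given the hand's rotation
    about the single axis studied in the paper."""
    return (world_dir_deg - hand_yaw_deg) % 360.0
```

For instance, with the hand rotated 40° (within the range the paper found to incur very little cognitive cost), a world-frame "east" (90°) cue would be rendered at 50° in the device frame.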
IEEE Haptics Symposium | 2010
Brian T. Gleeson; David E. Johnson
Non-photorealistic rendering (NPR) rejects a rigid adherence to physically accurate creation of visual imagery in favor of expressive styles that can enhance information transfer or create an artistic feeling. This paper considers those goals in the context of haptic rendering. Expressive haptic rendering techniques are developed for cartoon-inspired haptic rendering effects and demonstrated in three classic cartoon scenarios: super-slippery surfaces, exaggerated recoil and vibration upon hitting an object, and falling from a height based on a character's awareness of danger. Subjectively, these effects create increased interest in a scene and can facilitate transfer of artistic goals to a user. The value of expressive haptic rendering derives from this enhanced interaction experience.
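The "super-slippery surface" effect amounts to deliberately breaking physical plausibility in a standard rendering model. As an illustration (not the authors' implementation), a Coulomb friction model with an exaggeratedly low, hypothetical friction coefficient:

```python
def cartoon_friction_force(normal_force_n, tangential_velocity_mm_s,
                           mu_cartoon=0.01):
    """Kinetic Coulomb friction opposing sliding, with a 'cartoon'
    friction coefficient set far below any physically plausible value
    to make the surface feel super-slippery. mu_cartoon is hypothetical."""
    if tangential_velocity_mm_s == 0:
        return 0.0
    # Friction opposes the direction of sliding.
    direction = -1.0 if tangential_velocity_mm_s > 0 else 1.0
    return direction * mu_cartoon * normal_force_n
```

The other two effects (exaggerated recoil and danger-triggered falling) would similarly exaggerate or delay terms in an otherwise conventional force model.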
International Conference on Human Haptic Sensing and Touch Enabled Computer Applications | 2010
Mark A. Fehlberg; Brian T. Gleeson; William R. Provancher
People use fixed handrests to complete routine dexterous activities such as providing a signature or making a sketch. Because the hand's workspace for very fine motions is limited, we have developed an Active Handrest that extends a user's dexterous workspace while providing ergonomic support. Our current Active Handrest prototype is a planar, computer-controlled support for the user's hand and wrist that allows complete control over a grasped tool. The device determines handrest motions by interpreting isometric (force) input from the user's wrist, isotonic (position) input from a grasped manipulandum, or a blend of both inputs. Circle tracing experiments measuring task precision and completion time were conducted to investigate each control mode under various velocity limits for both experienced and novice users.