Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jeremy A. Fishel is active.

Publication


Featured research published by Jeremy A. Fishel.


Frontiers in Neurorobotics | 2012

Bayesian exploration for intelligent identification of textures.

Jeremy A. Fishel; Gerald E. Loeb

In order to endow robots with human-like abilities to characterize and identify objects, they must be provided with tactile sensors and intelligent algorithms to select, control, and interpret data from useful exploratory movements. Humans make informed decisions about the sequence of exploratory movements that will yield the most information for the task, depending on what the object may be and on prior knowledge of what to expect from possible exploratory movements. This study focuses on texture discrimination, a subset of a much larger group of exploratory movements and percepts that humans use to discriminate, characterize, and identify objects. Using a testbed equipped with a biologically inspired tactile sensor (the BioTac), we produced sliding movements similar to those that humans make when exploring textures. Measurements of tactile vibrations and reaction forces during texture exploration were used to extract measures of textural properties inspired by the psychophysical literature (traction, roughness, and fineness). Different combinations of normal force and velocity were identified to be useful for each of these three properties. A total of 117 textures were explored with these three movements to create a database of prior experience to use for identifying these same textures in future encounters. When exploring a texture, the discrimination algorithm adaptively selects the optimal movement to make and the property to measure, based on previous experience, to differentiate the texture from a set of plausible candidates, a process we call Bayesian exploration. Performance of 99.6% in correctly discriminating pairs of similar textures was found to exceed human capabilities. Absolute classification from the entire set of 117 textures generally required a small number of well-chosen exploratory movements (median = 5) and yielded a 95.4% success rate. The method of Bayesian exploration developed and tested in this paper may generalize well to other cognitive problems.
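The core of the Bayesian exploration loop described above can be sketched as a simple posterior update over candidate textures. This is a toy illustration, not the paper's implementation: the texture names, the Gaussian models, and the single "roughness" channel below are all invented, whereas the actual system maintains prior experience for traction, roughness, and fineness across 117 textures and also chooses which movement to make next.

```python
import math

# Hypothetical prior experience: (mean, std) of a "roughness" measurement
# for each candidate texture. Values are invented for illustration.
priors = {
    "denim": (0.80, 0.05),
    "silk":  (0.20, 0.05),
    "paper": (0.45, 0.05),
}

def gaussian_likelihood(x, mean, std):
    """Probability density of observing x under a normal model."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def update_posterior(posterior, measurement):
    """Bayes' rule: weight each candidate by how well it predicts the measurement."""
    weighted = {t: p * gaussian_likelihood(measurement, *priors[t])
                for t, p in posterior.items()}
    total = sum(weighted.values())
    return {t: w / total for t, w in weighted.items()}

# Start from a uniform belief, then incorporate two exploratory "movements".
posterior = {t: 1 / len(priors) for t in priors}
for measurement in (0.78, 0.82):      # simulated roughness readings
    posterior = update_posterior(posterior, measurement)

best = max(posterior, key=posterior.get)   # most likely texture
```

After two consistent measurements the belief concentrates on one candidate; the full algorithm additionally picks the movement whose expected measurement best separates the remaining plausible candidates.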


Robotics and Biomimetics | 2009

Signal processing and fabrication of a biomimetic tactile sensor array with thermal, force and microvibration modalities

Chia Hsien Lin; Todd W. Erickson; Jeremy A. Fishel; Nicholas Wettels; Gerald E. Loeb

We have developed a finger-shaped sensor array that provides simultaneous information about the contact forces, microvibrations and thermal fluxes induced by contact with external objects. In this paper, we describe a microprocessor-based signal conditioning and digitizing system for these sensing modalities and its embodiment on a flex-circuit that facilitates efficient assembly of the entire system via injection molding. Thermal energy from the embedded electronics is used to heat the finger above ambient temperature, similar to the biological finger. This enables the material properties of contacted objects to be inferred from thermal transients measured by a thermistor in the sensor array. Combining sensor modalities provides synergistic benefits. For example, the contact forces for exploratory movements can be calibrated so that thermal and microvibration data can be interpreted more definitively.
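The thermal-transient idea above can be illustrated with a toy decay-rate estimate. Assuming the heated finger's excess temperature decays roughly exponentially after contact, faster against thermally conductive materials, the decay constant serves as a crude material signature. The sample interval, temperatures, and decay constants below are invented for illustration.

```python
import math

def decay_rate(temps, dt):
    """Fit T(t) = T0 * exp(-k t) by a log-linear least-squares slope; return k."""
    logs = [math.log(x) for x in temps]
    n = len(logs)
    t = [i * dt for i in range(n)]
    t_mean = sum(t) / n
    l_mean = sum(logs) / n
    num = sum((ti - t_mean) * (li - l_mean) for ti, li in zip(t, logs))
    den = sum((ti - t_mean) ** 2 for ti in t)
    return -num / den

dt = 0.1   # assumed sample interval, s
# Simulated excess-temperature transients after touching two materials:
metal = [5.0 * math.exp(-2.0 * i * dt) for i in range(20)]   # fast heat loss
foam  = [5.0 * math.exp(-0.3 * i * dt) for i in range(20)]   # slow heat loss
faster = decay_rate(metal, dt) > decay_rate(foam, dt)
```

As the abstract notes, such an inference is only meaningful if contact location and force are well controlled, which is why combining modalities is synergistic.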


International Conference on Robotics and Automation | 2013

Tactile identification of objects using Bayesian exploration

Danfei Xu; Gerald E. Loeb; Jeremy A. Fishel

In order to endow robots with human-like tactile sensory abilities, they must be provided with tactile sensors and intelligent algorithms to select and control useful exploratory movements and interpret data from all available sensors. Current robotic systems do not possess such sensors or algorithms. In this study we integrate multimodal tactile sensing (force, vibration and temperature) from the BioTac® with a Shadow Dexterous Hand and program the robot to make exploratory movements similar to those humans make when identifying objects by their compliance, texture, and thermal properties. Signal processing strategies were developed to provide measures of these perceptual properties. When identifying an object, exploratory movements are intelligently selected using a process we have previously developed called Bayesian exploration [1], whereby exploratory movements that provide the most disambiguation between likely candidates of objects are automatically selected. The exploration algorithm was augmented with reinforcement learning whereby its internal representations of objects evolved according to its cumulative experience with them. This allowed the algorithm to compensate for drift in the performance of the anthropomorphic robot hand and the ambient conditions of testing, improving accuracy while reducing the number of exploratory movements required to identify an object. The robot correctly identified 10 different objects on 99 out of 100 presentations.
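One common way to let an internal representation evolve with cumulative experience, and thereby absorb slow drift in the hand or ambient conditions, is an exponential moving average. This is a minimal sketch of that general idea, not the paper's actual learning rule, and the property values below are invented.

```python
def ema_update(estimate, measurement, alpha=0.2):
    """Blend the stored estimate toward the newest measurement.

    Larger alpha forgets old experience faster, tracking drift more quickly
    at the cost of noisier estimates.
    """
    return (1 - alpha) * estimate + alpha * measurement

# One object property drifts from 1.0 toward 2.0 over repeated encounters;
# the stored representation follows it.
estimate = 1.0
for measurement in [1.2, 1.5, 1.8, 2.0, 2.0, 2.0, 2.0, 2.0]:
    estimate = ema_update(estimate, measurement)
```

Keeping the representation close to current conditions shrinks the overlap between object models, which is what reduces the number of exploratory movements needed for identification.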


Frontiers in Neurorobotics | 2012

Use of tactile feedback to control exploratory movements to characterize object compliance

Zhe Su; Jeremy A. Fishel; Tomonori Yamamoto; Gerald E. Loeb

Humans have been shown to be good at using active touch to perceive subtle differences in compliance. They tend to use highly stereotypical exploratory strategies, such as applying normal force to a surface. We developed similar exploratory and perceptual algorithms for a mechatronic robotic system (Barrett arm/hand system) equipped with liquid-filled, biomimetic tactile sensors (BioTac® from SynTouch LLC). The distribution of force on the fingertip was measured by the electrical resistance of the conductive liquid trapped between the elastomeric skin and a cluster of four electrodes on the flat fingertip surface of the rigid core of the BioTac. These signals provided closed-loop control of exploratory movements, while the distribution of skin deformations, measured by more lateral electrodes and by the hydraulic pressure, were used to estimate material properties of objects. With this control algorithm, the robot plus tactile sensor was able to discriminate the relative compliance of various rubber samples.
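A minimal sketch of compliance discrimination from press data: treat stiffness as the slope of force versus fingertip displacement, so a more compliant sample yields a lower slope. The displacement and force values below are invented for illustration; the actual system estimates material properties from the BioTac's electrode and hydraulic-pressure signals under closed-loop force control.

```python
def stiffness(displacements, forces):
    """Least-squares slope through the origin: k such that F ≈ k * d (N/mm)."""
    num = sum(d * f for d, f in zip(displacements, forces))
    den = sum(d * d for d in displacements)
    return num / den

# Simulated press data: displacement (mm) vs. normal force (N) for two samples.
soft = stiffness([0.5, 1.0, 1.5], [0.2, 0.4, 0.6])   # compliant rubber
firm = stiffness([0.5, 1.0, 1.5], [1.0, 2.0, 3.0])   # stiffer rubber
```

Ranking samples by this slope is enough for relative compliance discrimination, which is the task the abstract describes.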


IEEE International Conference on Biomedical Robotics and Biomechatronics | 2008

A robust micro-vibration sensor for biomimetic fingertips

Jeremy A. Fishel; Veronica J. Santos; Gerald E. Loeb

Controlling grip force in a prosthetic or robotic hand requires detailed sensory feedback about microslips between the artificial fingertips and the object. In the biological hand this is accomplished with neural transducers capable of measuring micro-vibrations in the skin due to sliding friction. For prosthetic tactile sensors, emulating these biological transducers is a difficult challenge due to the fragility associated with highly sensitive devices. Incorporating a pressure sensor into a fluid-filled fingertip provides a novel solution to this problem by effectively creating a device similar to a hydrophone, capable of recording vibrations from lateral movements. The fluid conducts these acoustic signals well and with little attenuation, permitting the pressure-sensing elements to be located in a protected region inside the core of the sensor, removing them from harm's way. Preliminary studies demonstrate that high-frequency vibrations (50-400 Hz) can be readily detected when such a fingertip slides across a ridged surface.
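Detecting slip vibrations in the 50-400 Hz band can be sketched with a naive band-energy check on the pressure signal. The sample rate, signals, and threshold comparison below are simulated; a real implementation would use a proper bandpass filter rather than a brute-force DFT.

```python
import math

def band_energy(signal, fs, f_lo, f_hi):
    """Energy in [f_lo, f_hi] Hz via a naive DFT (fine for short sketches)."""
    n = len(signal)
    energy = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            energy += re * re + im * im
    return energy

fs = 2000                                  # assumed sample rate, Hz
t = [i / fs for i in range(400)]
sliding = [math.sin(2 * math.pi * 200 * x) for x in t]   # 200 Hz "slip" vibration
static = [0.0] * 400                                      # no contact vibration
slip_detected = band_energy(sliding, fs, 50, 400) > band_energy(static, fs, 50, 400)
```

Because the fluid transmits these acoustic signals with little attenuation, the same computation works whether the transducer sits at the skin or protected inside the core.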


IEEE Haptics Symposium | 2014

Evaluation of force, vibration and thermal tactile feedback in prosthetic limbs

Meghan C. Jimenez; Jeremy A. Fishel

While substantial research efforts have been put forth to advance the mechatronics and control of prosthetic hands, little attention has been paid to restoring the sensory functions of tactile feedback to amputees. It is known that the human hand, when unable to feel through either disease or induced anesthesia, becomes incapable of performing a number of essential dexterous tasks. It is therefore proposed that prosthetic hands without these capabilities will fare no better. Tactile sensing in the human hand can be used for both autonomous reflexes and conscious perception. In a previous study we explored using tactile sensing for autonomous reflexes to enable fragile-object grasping [1]; in this study we evaluate the benefits and performance of conveying force, vibration, and thermal tactile information to conscious perception. A prosthetic hand equipped with a BioTac sensor (capable of sensing force, vibration, and temperature) and different tactors developed to play back these feedback modalities on a subject's forearm were used to evaluate perception in tactile discrimination experiments. Results showed that this system is able to effectively convey information to the prosthesis user to identify and differentiate objects of different weight, temperature, thermal properties, or surface texture when they were placed between the subject's prosthetic fingertips. While this system was effective at providing useful perceptual feedback, the subject indicated that the majority of the tactors were distracting and would be undesirable for day-to-day use.


The Human Hand as an Inspiration for Robot Hand Development | 2014

Multimodal Tactile Sensor

Nicholas Wettels; Jeremy A. Fishel; Gerald E. Loeb

We have developed a finger-shaped sensor array (BioTac®) that provides simultaneous information about contact forces, microvibrations and thermal fluxes, mimicking the full cutaneous sensory capabilities of the human finger. For many tasks, such as identifying objects or maintaining stable grasp, these sensory modalities are synergistic. For example, information about the material composition of an object can be inferred from the rate of heat transfer from a heated finger to the object, but only if the location and force of contact are well controlled. In this chapter we introduce the three sensing modalities of our sensor and consider how they can be used synergistically. Tactile sensing and signal processing is necessary for human dexterity and is likely to be required in mechatronic systems such as robotic and prosthetic limbs if they are to achieve similar dexterity.


Frontiers in Neuroscience | 2014

Bayesian Action&Perception: Representing the World in the Brain

Gerald E. Loeb; Jeremy A. Fishel

Theories of perception seek to explain how sensory data are processed to identify previously experienced objects, but they usually do not consider the decisions and effort that go into acquiring the sensory data. Identification of objects according to their tactile properties requires active exploratory movements. The sensory data thereby obtained depend on the details of those movements, which human subjects change rapidly and seemingly capriciously. Bayesian Exploration is an algorithm that uses prior experience to decide which next exploratory movement should provide the most useful data to disambiguate the most likely possibilities. In previous studies, a simple robot equipped with a biomimetic tactile sensor and operated according to Bayesian Exploration performed similarly to, and in fact better than, humans on a texture identification task. Expanding on this, “Bayesian Action&Perception” refers to the construction and querying of an associative memory of previously experienced entities containing both sensory data and the motor programs that elicited them. We hypothesize that this memory can be queried (i) to identify useful next exploratory movements during identification of an unknown entity (“action for perception”), (ii) to characterize whether an unknown entity is fit for purpose (“perception for action”), or (iii) to recall what actions might be feasible for a known entity (Gibsonian affordance). The biomimetic design of this mechatronic system may provide insights into the neuronal basis of biological action and perception.


IEEE Haptics Symposium | 2014

Using the BioTac as a tumor localization tool

Morelle S. Arian; C. Alexander Blaine; Gerald E. Loeb; Jeremy A. Fishel

Robotically-Assisted Minimally Invasive Surgery (RMIS) offers many benefits to patients, yet introduces new challenges for surgeons due to the loss of the tactile feedback that would be available in open surgery. This makes many intraoperative procedures, such as tumor localization and other technically intricate and delicate tasks, increasingly difficult. Reestablishing surgeons' ability to feel during RMIS would improve the quality and safety of these surgeries and facilitate the conversion of many touch-dependent procedures that are traditionally performed as open surgery. In this research a biomimetic tactile sensor (BioTac, SynTouch LLC) was evaluated for localization of artificial tumors. Various signal-processing techniques based on spatial and temporal derivatives were integrated into a graphical user interface to aid in the localization of tumors when explored by a human operator. The ability to localize tumors using the BioTac sensor was compared to the performance of the human finger. The BioTac sensor was found to be particularly effective for superficial tumors (3 mm deep), achieving a detection rate of 94.1%. The BioTac was also able to detect small tumors 3 mm in diameter at a detection rate of 61.5%, and tumors at a depth of 12 mm with a detection rate of 60.0%. While human subjects were more effective at localizing most tumors, the BioTac was often able to do so at lighter forces.
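The spatial-derivative idea can be sketched in one dimension: a stiff inclusion shows up as a local bump in a stiffness profile sampled as the sensor slides across tissue, and the first spatial difference flips sign sharply at the bump. The profile values and threshold below are invented; the actual work processed BioTac electrode data within a GUI for a human operator.

```python
# Hypothetical 1-D stiffness profile sampled across tissue; the bump at the
# center stands in for a stiff inclusion (tumor).
profile = [1.0, 1.0, 1.1, 1.8, 2.5, 1.9, 1.1, 1.0, 1.0]

def detect_peak(profile, threshold=0.3):
    """Return indices where the spatial derivative flips from rising to falling."""
    d = [b - a for a, b in zip(profile, profile[1:])]   # first spatial difference
    return [i for i in range(1, len(d))
            if d[i - 1] > threshold and d[i] < -threshold]

peaks = detect_peak(profile)   # index of the suspected inclusion
```

Derivative-based detection highlights local contrast rather than absolute stiffness, which is why it can flag an inclusion even at the lighter contact forces the abstract mentions.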


Progress in Brain Research | 2011

Understanding haptics by evolving mechatronic systems.

Gerald E. Loeb; George A. Tsianos; Jeremy A. Fishel; Nicholas Wettels; Stefan Schaal

Haptics can be defined as the characterization and identification of objects by voluntary exploration and somatosensory feedback. It requires multimodal sensing, motor dexterity, and high levels of cognitive integration with prior experience and fundamental concepts of self versus external world. Humans have unique haptic capabilities that enable tool use. Experimental animals have much poorer capabilities that are difficult to train and even more difficult to study because they involve rapid, subtle, and variable movements. Robots can now be constructed with biomimetic sensing and dexterity, so they may provide a suitable platform on which to test theories of haptics. Robots will need to embody such theories if they are ever going to realize the long-standing dream of working alongside humans using the same tools and objects.

Collaboration


Dive into Jeremy A. Fishel's collaborations.

Top Co-Authors (all University of Southern California)

Gerald E. Loeb
Nicholas Wettels
Raymond A. Peck
Chia Hsien Gary Lin
George A. Tsianos
Morelle S. Arian
Rahman Davoodi