
Publication


Featured research published by Mandayam A. Srinivasan.


Nature | 2000

Real-time prediction of hand trajectory by ensembles of cortical neurons in primates.

Johan Wessberg; Christopher R. Stambaugh; Jerald D. Kralik; Pamela D. Beck; Mark Laubach; John K. Chapin; Jung Kim; S. James Biggs; Mandayam A. Srinivasan; Miguel A. L. Nicolelis

Signals derived from the rat motor cortex can be used for controlling one-dimensional movements of a robot arm. It remains unknown, however, whether real-time processing of cortical signals can be employed to reproduce, in a robotic device, the kind of complex arm movements used by primates to reach objects in space. Here we recorded the simultaneous activity of large populations of neurons, distributed in the premotor, primary motor and posterior parietal cortical areas, as non-human primates performed two distinct motor tasks. Accurate real-time predictions of one- and three-dimensional arm movement trajectories were obtained by applying both linear and nonlinear algorithms to cortical neuronal ensemble activity recorded from each animal. In addition, cortically derived signals were successfully used for real-time control of robotic devices, both locally and through the Internet. These results suggest that long-term control of complex prosthetic robot arm movements can be achieved by simple real-time transformations of neuronal population signals derived from multiple cortical areas in primates.
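The linear decoding described above can be illustrated with a toy sketch: hand position is modeled as a weighted sum of binned firing rates across the neuronal ensemble, and the weights are fit by least squares. This is an illustrative stand-in, using synthetic data and ordinary least squares rather than the paper's actual recordings or fitting procedure; all names and numbers here are made up.

```python
import numpy as np

# Toy linear trajectory decoder: predict 3-D hand position from binned
# spike counts of a simulated neuronal ensemble. Synthetic data only.

rng = np.random.default_rng(0)
T, n_neurons, n_dims = 500, 40, 3            # time bins, cells, (x, y, z)

rates = rng.poisson(5.0, size=(T, n_neurons)).astype(float)  # binned spike counts
true_w = rng.normal(size=(n_neurons, n_dims))                # hidden linear mapping
hand_pos = rates @ true_w + rng.normal(scale=2.0, size=(T, n_dims))

# Fit decoder weights on the first half of the session, predict the second half.
train, test = slice(0, T // 2), slice(T // 2, T)
w, *_ = np.linalg.lstsq(rates[train], hand_pos[train], rcond=None)
pred = rates[test] @ w

# Per-dimension correlation between predicted and actual trajectory.
r = [np.corrcoef(pred[:, d], hand_pos[test][:, d])[0, 1] for d in range(n_dims)]
print([round(v, 2) for v in r])
```

Because the synthetic data are genuinely linear in the firing rates, the fit recovers the trajectory almost perfectly; with real cortical data the same pipeline yields the more modest but still usable correlations the paper reports.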


Computers & Graphics | 1997

Haptics in virtual environments: Taxonomy, research status, and challenges

Mandayam A. Srinivasan; Cagatay Basdogan

Haptic displays are emerging as effective interaction aids for improving the realism of virtual worlds. Being able to touch, feel, and manipulate objects in virtual environments has a large number of exciting applications. The underlying technology, both in terms of electromechanical hardware and computer software, is becoming mature and has opened up novel and interesting research areas. In this paper, we clarify the terminology of human and machine haptics and provide a brief overview of the progress recently achieved in these fields, based on our investigations as well as other studies. We describe the major advances in a new discipline, Computer Haptics (analogous to computer graphics), that is concerned with the techniques and processes associated with generating and displaying haptic stimuli to the human user. We also summarize the issues and some of our results in integrating haptics into multimodal and distributed virtual environments, and speculate on the challenges for the future.


Collaborative Virtual Environments | 2000

An experimental study on the role of touch in shared virtual environments

Cagatay Basdogan; Chih-Hao Ho; Mandayam A. Srinivasan; Mel Slater

Investigating virtual environments has become an increasingly interesting research topic for engineers, computer and cognitive scientists, and psychologists. Although several recent studies have focused on the development of multimodal virtual environments (VEs) to study human-machine interactions, less attention has been paid to human-human and human-machine interactions in shared virtual environments (SVEs), and, to our knowledge, none at all to the extent to which the addition of haptic communication between people would contribute to the shared experience. We have developed a multimodal shared virtual environment and performed a set of experiments with human subjects to study the role of haptic feedback in collaborative tasks and whether haptic communication through force feedback can facilitate a sense of being together with, and collaborating with, a remote partner. The study concerns a scenario where two participants at remote sites must cooperate to perform a joint task in an SVE. The goals of the study are (1) to assess the impact of force feedback on task performance, (2) to better understand the role of haptic communication in human-human interactions, (3) to study the impact of touch on the subjective sense of collaborating with a human, as reported by the participants based on what they could see and feel, and (4) to investigate whether the gender, personality, or emotional experiences of users affect haptic communication in SVEs. The outcomes of this research can have a powerful impact on the development of next-generation human-computer interfaces and network protocols that integrate touch and force feedback technology into the Internet, and on protocols and techniques for collaborative teleoperation tasks such as hazardous material removal and space station repair.


IEEE Computer Graphics and Applications | 2004

Haptics in minimally invasive surgical simulation and training

Cagatay Basdogan; Suvranu De; Jung Kim; Manivannan Muniyandi; Hyun K. Kim; Mandayam A. Srinivasan

Haptics is a valuable tool in minimally invasive surgical simulation and training. We discuss important aspects of haptics in MISST, such as haptic rendering and haptic recording and playback. Minimally invasive surgery has revolutionized many surgical procedures over the last few decades. MIS is performed using a small video camera, a video display, and a few customized surgical tools. In procedures such as gall bladder removal (laparoscopic cholecystectomy), surgeons insert a camera and long slender tools into the abdomen through small skin incisions to explore the internal cavity and manipulate organs from outside the body as they view their actions on a video display. Because the development of minimally invasive techniques has reduced the sense of touch compared to open surgery, surgeons must rely more on the feeling of net forces resulting from tool-tissue interactions and need more training to successfully operate on patients.


IEEE/ASME Transactions on Mechatronics | 2001

Virtual environments for medical training: graphical and haptic simulation of laparoscopic common bile duct exploration

Cagatay Basdogan; Chih-Hao Ho; Mandayam A. Srinivasan

We develop a computer-based training system to simulate laparoscopic procedures in virtual environments for medical training. The major hardware components of our system include a computer monitor to display visual interactions between 3D virtual models of organs and instruments together with a pair of force feedback devices interfaced with laparoscopic instruments to simulate haptic interactions. We simulate a surgical procedure that involves inserting a catheter into the cystic duct using a pair of laparoscopic forceps. This procedure is performed during laparoscopic cholecystectomy to search for gallstones in the common bile duct. Using the proposed system, the user can be trained to grasp and insert a flexible and freely moving catheter into the deformable cystic duct in virtual environments. The associated deformations are displayed on the computer screen and the reaction forces are fed back to the user through the force feedback devices. A hybrid modeling approach was developed to simulate the real-time visual and haptic interactions that take place between the forceps and the catheter, as well as the duct; and between the catheter and the duct.


Journal of Biomechanical Engineering-Transactions of the ASME | 2003

3-D Finite-Element Models of Human and Monkey Fingertips to Investigate the Mechanics of Tactile Sense

Kiran Dandekar; Balasundar I. Raju; Mandayam A. Srinivasan

The biomechanics of skin and underlying tissues plays a fundamental role in the human sense of touch. It governs the mechanics of contact between the skin and an object, the transmission of the mechanical signals through the skin, and their transduction into neural signals by the mechanoreceptors. To better understand the mechanics of touch, it is necessary to establish quantitative relationships between the loads imposed on the skin by an object, the state of stresses/strains at mechanoreceptor locations, and the resulting neural response. Towards this goal, 3-D finite-element models of human and monkey fingertips with realistic external geometries were developed. By computing fingertip model deformations under line loads, it was shown that a multi-layered model was necessary to match previously obtained in vivo data on skin surface displacements. An optimal ratio of elastic moduli of the layers was determined through numerical experiments whose results were matched with empirical data. Numerical values of the elastic moduli of the skin layers were obtained by matching computed results with empirically determined force-displacement relationships for a variety of indentors. Finally, as an example of the relevance of the model to the study of tactile neural response, the multilayered 3-D finite-element model was shown to be able to predict the responses of the slowly adapting type I (SA-I) mechanoreceptors to indentations by complex object shapes.
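The importance of layering can be illustrated with a one-dimensional analogue: under uniform normal loading, elastic layers stack like springs in series, so the effective modulus is the thickness-weighted harmonic mean of the layer moduli and the compliant layer dominates. The moduli and thicknesses below are made up for illustration; they are not the values fitted in the paper.

```python
# 1-D analogue of a layered fingertip model: layers in series combine
# like springs in series. Illustrative numbers only, not the paper's
# fitted skin-layer moduli.

def effective_modulus(layers):
    """layers: list of (thickness_m, youngs_modulus_Pa) loaded in series."""
    total_t = sum(t for t, _ in layers)
    compliance = sum(t / E for t, E in layers)   # series compliances add
    return total_t / compliance

skin = [(0.001, 2.0e5),    # stiffer outer layer, 1 mm
        (0.004, 2.0e4)]    # softer underlying tissue, 4 mm

E_eff = effective_modulus(skin)
print(round(E_eff))   # close to the soft layer's modulus: 24390
```

The result sits near the soft layer's 20 kPa despite the stiff skin on top, which is one reason a single-layer homogeneous model cannot match both surface displacement and indentation force data simultaneously.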


Presence: Teleoperators & Virtual Environments | 1999

Efficient Point-Based Rendering Techniques for Haptic Display of Virtual Objects

Chih-Hao Ho; Cagatay Basdogan; Mandayam A. Srinivasan

Computer haptics, an emerging field of research that is analogous to computer graphics, is concerned with the generation and rendering of haptic virtual objects. In this paper, we propose an efficient haptic rendering method for displaying the feel of 3-D polyhedral objects in virtual environments (VEs). Using this method and a haptic interface device, the users can manually explore and feel the shape and surface details of virtual objects. The main component of our rendering method is the neighborhood watch algorithm that takes advantage of precomputed connectivity information for detecting collisions between the end effector of a force-reflecting robot and polyhedral objects in VEs. We use a hierarchical database, multithreading techniques, and efficient search procedures to reduce the computational time such that the haptic servo rate after the first contact is essentially independent of the number of polygons that represent the object. We also propose efficient methods for displaying surface properties of objects such as haptic texture and friction. Our haptic-texturing techniques and friction model can add surface details onto convex or concave 3-D polygonal surfaces. These haptic-rendering techniques can be extended to display dynamics of rigid and deformable objects.
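Once collision detection (e.g., the neighborhood-watch search over the polyhedral mesh) has located the contacted surface, a common force response is spring-like: push the probe back along the surface normal in proportion to penetration depth. The sketch below uses a single plane as the surface; the function name, stiffness value, and geometry are illustrative, not taken from the paper.

```python
import numpy as np

# Minimal penalty-based force response for point-based haptic rendering:
# reaction force proportional to penetration depth along the surface
# normal. One plane stands in for the contacted polygon; illustrative only.

def contact_force(probe, plane_point, plane_normal, stiffness=1000.0):
    """Return the spring-like reaction force (N) on the haptic probe."""
    n = plane_normal / np.linalg.norm(plane_normal)
    depth = np.dot(plane_point - probe, n)    # > 0 means probe is below surface
    if depth <= 0.0:
        return np.zeros(3)                     # no contact, no force
    return stiffness * depth * n               # push back along the normal

force = contact_force(np.array([0.0, -0.002, 0.0]),   # probe 2 mm below surface
                      np.array([0.0, 0.0, 0.0]),
                      np.array([0.0, 1.0, 0.0]))
print(force)   # 2 N upward at k = 1000 N/m
```

The stiffness must be chosen with the device's servo rate in mind: force laws like this are evaluated at kilohertz rates, which is why the paper's emphasis on keeping the per-cycle collision query independent of polygon count matters.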


Journal of Biomechanical Engineering-Transactions of the ASME | 1996

An investigation of the mechanics of tactile sense using two-dimensional models of the primate fingertip

Mandayam A. Srinivasan; K. Dandekar

Tactile information about an object in contact with the skin surface is contained in the spatio-temporal load distribution on the skin, the corresponding stresses and strains at mechanosensitive receptor locations within the skin, and the associated pattern of electrical impulses produced by the receptor population. At present, although the responses of the receptors to known stimuli can be recorded, no experimental techniques exist to observe either the load distribution on the skin or the corresponding stress state at the receptor locations. In this paper, the role of mechanics in the neural coding of tactile information is investigated using simple models of the primate fingertip. Four models that range in geometry from a semi-infinite medium to a cylindrical finger with a rigid bone, and composed of linear elastic media, are analyzed under plane strain conditions using the finite element method. The results show that the model geometry has a significant influence on the surface load distribution as well as the subsurface stress and strain fields for a given mechanical stimulus. The elastic medium acts like a spatial low-pass filter, with the property that the deeper the receptor location, the more blurred the tactile information. None of the models predicted the experimentally observed surface deflection profiles under line loads as closely as a simple heterogeneous waterbed model that treated the fingerpad as a membrane enclosing an incompressible fluid (Srinivasan, 1989). This waterbed model, however, predicted a uniform state of stress inside the fingertip and thus failed to explain the spatial variations observed in the neural response. For the cylindrical model indented by rectangular gratings, the maximum compressive strain and strain energy density at typical receptor locations emerged as the two strain measures that were directly related to the electrophysiologically recorded response rate of slowly adapting type I (SAI) mechanoreceptors. Strain energy density is a better candidate to be the relevant stimulus for SAIs, since it is a scalar that is invariant with respect to receptor orientations and is a direct measure of the distortion of the receptor caused by the loads imposed on the skin.
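The orientation-invariance argument for strain energy density can be made concrete. For a linear elastic solid, U = (1/2) sigma : epsilon, and for an isotropic material this scalar is unchanged when the strain tensor is expressed in a rotated receptor frame. The material constants below are illustrative soft-tissue-like values, not the moduli fitted in the paper.

```python
import numpy as np

# Strain energy density U = 0.5 * sigma : epsilon for isotropic linear
# elasticity, and a check that U is invariant under rotation of the
# receptor frame. Illustrative constants, not the paper's fitted values.

E, nu = 50e3, 0.48                           # Young's modulus (Pa), Poisson's ratio
lam = E * nu / ((1 + nu) * (1 - 2 * nu))     # Lame constants
mu = E / (2 * (1 + nu))

def strain_energy_density(strain):
    """U for a symmetric 3x3 small-strain tensor, isotropic Hooke's law."""
    sigma = lam * np.trace(strain) * np.eye(3) + 2 * mu * strain
    return 0.5 * np.tensordot(sigma, strain)  # full double contraction

eps = np.array([[0.01,  0.0,   0.0],
                [0.0,  -0.005, 0.0],
                [0.0,   0.0,   0.0]])

u1 = strain_energy_density(eps)

# Rotate the strain tensor into a different receptor orientation:
th = 0.7
R = np.array([[np.cos(th), -np.sin(th), 0.0],
              [np.sin(th),  np.cos(th), 0.0],
              [0.0,         0.0,        1.0]])
u2 = strain_energy_density(R @ eps @ R.T)
print(np.isclose(u1, u2))   # True: U does not depend on receptor orientation
```

A tensor component such as a single compressive strain, by contrast, does change under the same rotation, which is the substance of the paper's argument for preferring the scalar measure.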


Perception & Psychophysics | 1995

Manual discrimination of compliance using active pinch grasp: The roles of force and work cues

Hong Z. Tan; Nathaniel I. Durlach; G. Lee Beauregard; Mandayam A. Srinivasan

In these experiments, two plates were grasped between the thumb and the index finger and squeezed together along a linear track. The force resisting the squeeze, produced by an electromechanical system under computer control, was programmed to be either constant (in the case of the force discrimination experiments) or linearly increasing (in the case of the compliance discrimination experiments) over the squeezing displacement. After completing a set of basic psychophysical experiments on compliance resolution (Experiment 1), we performed further experiments to investigate whether work and/or terminal-force cues played a role in compliance discrimination. In Experiment 2, compliance and force discrimination experiments were conducted with a roving-displacement paradigm to dissociate work cues (and terminal-force cues for the compliance experiments) from compliance and force cues, respectively. The effect of trial-by-trial feedback on response strategy was also investigated. In Experiment 3, compliance discrimination experiments were conducted with work cues totally eliminated and terminal-force cues greatly reduced. Our results suggest that people tend to use mechanical work and force cues for compliance discrimination. When work and terminal-force cues were dissociated from compliance cues, compliance resolution was poor (22%) relative to force and length resolution. When work cues were totally eliminated, performance could be predicted from terminal-force cues. A parsimonious description of all data from the compliance experiments is that subjects discriminated compliance on the basis of terminal force.
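The force and work cues at issue can be made concrete. For a squeezing displacement d, the constant-force condition does work W = F0 * d, while the spring-like (compliance) condition with force F(x) = x / C does work W = d^2 / (2C) and ends at terminal force F = d / C. The numeric values below are illustrative, not the study's actual stimulus parameters.

```python
# Work and terminal-force cues for the two resisting-force profiles used
# in compliance/force discrimination. Illustrative numbers only.

def work_constant(F0, d):
    """Mechanical work against a constant resisting force: W = F0 * d."""
    return F0 * d

def work_spring(C, d):
    """Work against a linear spring of compliance C: W = d**2 / (2 * C)."""
    return d * d / (2.0 * C)

def terminal_force_spring(C, d):
    """Force felt at the end of the squeeze: F = d / C."""
    return d / C

C, d = 0.002, 0.01             # 2 mm/N compliance, 10 mm squeeze
print(terminal_force_spring(C, d))   # force at the end of travel (N)
print(work_spring(C, d))             # work done over the squeeze (J)
```

Because both W and the terminal force grow with d for a fixed compliance, roving the displacement from trial to trial (as in Experiment 2) is what dissociates these cues from compliance itself.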


Collaborative Virtual Environments | 2004

Transatlantic touch: a study of haptic collaboration over long distance

Jung Kim; Hyun K. Kim; Boon K. Tay; Manivannan Muniyandi; Mandayam A. Srinivasan; Joel Jordan; Jesper Mortensen; Manuel Oliveira; Mel Slater

The extent to which the addition of haptic communication between human users in a shared virtual environment (SVE) contributes to the shared experience of the users has not received much attention in the literature. In this paper we describe a demonstration of, and an experimental study on, haptic interaction between two users over a network of significant physical distance and a number of network hops. A number of techniques to mitigate instability of the haptic interactions induced by network latency are presented. An experiment to evaluate the use of haptics in a collaborative situation mediated by a networked virtual environment is examined. The experimental subjects were to cooperate in lifting a virtual box together under one of four conditions in a between-groups design. Questionnaires were used to report the ease with which they could perform the task and the subjective levels of presence and copresence experienced. This extends earlier work by the authors to consider the possibility of haptic collaboration under real network conditions with a number of improvements. Using the technology described in this paper, transatlantic touch was successfully demonstrated in 2002 between the Touch Lab at the Massachusetts Institute of Technology, USA, and the Virtual Environments and Computer Graphics (VECG) lab at University College London (UCL), UK. It was also presented at the Internet2 demonstration meeting in 2002 between the University of Southern California and the Massachusetts Institute of Technology.
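One generic example of the kind of latency-mitigation technique referred to above is to low-pass filter remotely received forces, so that jumps caused by delayed or bursty packets are not fed straight to the haptic device. This is an illustrative sketch of that general idea, not the specific stabilization scheme used in the study.

```python
# First-order exponential smoothing of forces arriving over a laggy
# network link: out[i] = alpha * in[i] + (1 - alpha) * out[i - 1].
# Attenuates latency-induced jumps before they reach the haptic device,
# trading a little responsiveness for stability. Generic illustration only.

def smooth_forces(samples, alpha=0.2):
    """Low-pass filter a sequence of force samples (N)."""
    out, prev = [], 0.0
    for f in samples:
        prev = alpha * f + (1.0 - alpha) * prev
        out.append(prev)
    return out

# A burst of delayed packets makes the raw force jump from 0 to 5 N;
# the filtered force ramps up instead of stepping.
raw = [0.0, 0.0, 5.0, 5.0, 5.0, 5.0]
print([round(f, 2) for f in smooth_forces(raw)])
```

The cost of this smoothing is added phase lag, which is why real systems tune the filter constant against the haptic servo rate rather than picking it arbitrarily.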

Collaboration


Dive into Mandayam A. Srinivasan's collaborations.

Top Co-Authors

Chih-Hao Ho (Massachusetts Institute of Technology)
Suvranu De (Rensselaer Polytechnic Institute)
David W. Schloerb (Massachusetts Institute of Technology)
Ki-Uk Kyung (Electronics and Telecommunications Research Institute)
Anuradha M. Annaswamy (Massachusetts Institute of Technology)