Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Jeremy R. Cooperstock is active.

Publications


Featured research published by Jeremy R. Cooperstock.


IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2011

A Game Platform for Treatment of Amblyopia

Long To; Benjamin Thompson; Jeffrey R. Blum; Goro Maehara; Robert F. Hess; Jeremy R. Cooperstock

We have developed a prototype device for take-home use that can be used in the treatment of amblyopia. The therapeutic scenario we envision involves patients first visiting a clinic, where their vision parameters are assessed and suitable parameters are determined for therapy. Patients then proceed with the actual therapeutic treatment on their own, using our device, which consists of an Apple iPod Touch running a specially modified game application. Our rationale for choosing to develop the prototype around a game stems from multiple requirements that such an application satisfies. First, system operation must be sufficiently straightforward that ease-of-use is not an obstacle. Second, the application itself should be compelling and motivate use more so than a traditional therapeutic task if it is to be used regularly outside of the clinic. This is particularly relevant for children, as compliance is a major issue for current treatments of childhood amblyopia. However, despite the traditional opinion that treatment of amblyopia is only effective in children, our initial results add to the growing body of evidence that improvements in visual function can be achieved in adults with amblyopia.


Communications of the ACM | 1997

Reactive environments

Jeremy R. Cooperstock; Sidney S. Fels; William Buxton; Kenneth C. Smith

As electronic devices become increasingly widespread, we are confronted with the burden of controlling a myriad of complex devices in our day-to-day activities. While many people today could hardly imagine living in electronics-free homes or working in offices without computers, few of us have truly mastered full control of our VCRs, microwave ovens, or office photocopiers. Rather than making our lives easier, as technology was intended to do, it has complicated our activities with lengthy instruction manuals and confusing user interfaces. Designers have been trying to make the computer more “user-friendly” ever since its inception. The last two decades have brought us the notable advances of keyboard terminals, graphics displays, and pointing devices, as well as the graphical user interface, introduced in 1981 by the Xerox Star and popularized by the Apple Macintosh. Most recently, we have seen the emergence of pen-based and portable computers. However, despite this progress of interface improvements, very little has changed in terms of how we work with these machines. The basic rules of interaction are the same as they were in the days of the ENIAC: users must engage in an explicit, machine-oriented dialogue with the computer rather than interact with the computer as they do with other people. In the last few years, computer scientists have begun talking about a new approach to human-computer interaction in which computing would not necessitate sitting in front of a screen and isolating ourselves from the world around us. Instead, in a computer-augmented environment, electronic systems could be merged into the physical world to provide computer functionality to everyday objects. This idea is exemplified by Ubiquitous Computing (UbiComp).


IEEE Transactions on Haptics | 2009

Touch Is Everywhere: Floor Surfaces as Ambient Haptic Interfaces

Yon Visell; Alvin Law; Jeremy R. Cooperstock

Floor surfaces are notable for the diverse roles that they play in our negotiation of everyday environments. Haptic communication via floor surfaces could enhance or enable many computer-supported activities that involve movement on foot. In this paper, we discuss potential applications of such interfaces in everyday environments and present a haptically augmented floor component through which several interaction methods are being evaluated. We describe two approaches to the design of structured vibrotactile signals for this device. The first is centered on a musical phrase metaphor, as employed in prior work on tactile display. The second is based upon the synthesis of rhythmic patterns of virtual physical impact transients. We report on an experiment in which participants were able to identify communication units that were constructed from these signals and displayed via a floor interface at well above chance levels. The results support the feasibility of tactile information display via such interfaces and provide further indications as to how to effectively design vibrotactile signals for them.
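The second design approach above, rhythmic patterns of virtual physical impact transients, can be illustrated with a minimal synthesis sketch. The carrier frequency, decay time, sample rate, and onset patterns below are illustrative assumptions, not the parameters used in the paper:

```python
import numpy as np

FS = 2000  # sample rate in Hz; illustrative, chosen to cover the vibrotactile band

def impact_transient(freq_hz=250.0, decay_s=0.02, dur_s=0.1):
    """A virtual physical impact rendered as an exponentially decaying sinusoid."""
    t = np.arange(int(dur_s * FS)) / FS
    return np.exp(-t / decay_s) * np.sin(2 * np.pi * freq_hz * t)

def rhythmic_pattern(onsets_s, total_s=1.0, **transient_kw):
    """Superimpose impact transients at the given onset times (seconds)."""
    sig = np.zeros(int(total_s * FS))
    tr = impact_transient(**transient_kw)
    for t0 in onsets_s:
        i = int(t0 * FS)
        n = min(len(tr), len(sig) - i)
        sig[i:i + n] += tr[:n]
    return sig

# Two hypothetical "communication units": a double tap vs. an evenly spaced triple
unit_a = rhythmic_pattern([0.0, 0.15])
unit_b = rhythmic_pattern([0.0, 0.25, 0.5])
```

Distinct onset patterns like these two units are the kind of signals participants in the experiment would be asked to tell apart when displayed through the floor interface.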


Human Factors in Computing Systems | 2012

TeleHuman: effects of 3D perspective on gaze and pose estimation with a life-size cylindrical telepresence pod

Kibum Kim; John Bolton; Audrey Girouard; Jeremy R. Cooperstock; Roel Vertegaal

In this paper, we present TeleHuman, a cylindrical 3D display portal for life-size human telepresence. The TeleHuman 3D videoconferencing system supports 360 degree motion parallax as the viewer moves around the cylinder and optionally, stereoscopic 3D display of the remote person. We evaluated the effect of perspective cues on the conveyance of nonverbal cues in two experiments using a one-way telecommunication version of the system. The first experiment focused on how well the system preserves gaze and hand pointing cues. The second experiment evaluated how well the system conveys 3D body postural information. We compared 3 perspective conditions: a conventional 2D view, a 2D view with 360 degree motion parallax, and a stereoscopic view with 360 degree motion parallax. Results suggest the combined presence of motion parallax and stereoscopic cues significantly improved the accuracy with which participants were able to assess gaze and hand pointing cues, and to instruct others on 3D body poses. The inclusion of motion parallax and stereoscopic cues also led to significant increases in the sense of social presence and telepresence reported by participants.
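As a toy illustration of the motion-parallax idea (not the authors' implementation), the tracked viewer's position around the pod can be reduced to an azimuth that selects which remote viewpoint to render. The function name and coordinate convention here are assumptions:

```python
import math

def viewer_azimuth(viewer_xy, pod_center=(0.0, 0.0)):
    """Angle of a tracked viewer around the cylindrical pod, in degrees [0, 360).

    A renderer could use this azimuth to pick the matching remote viewpoint,
    which is what yields 360-degree motion parallax as the viewer walks
    around the display.
    """
    dx = viewer_xy[0] - pod_center[0]
    dy = viewer_xy[1] - pod_center[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0
```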


International Conference on Human-Computer Interaction | 2009

Did Minority Report Get It Wrong? Superiority of the Mouse over 3D Input Devices in a 3D Placement Task

François Bérard; Jessica Ip; Mitchel Benovoy; Dalia El-Shimy; Jeffrey R. Blum; Jeremy R. Cooperstock

Numerous devices have been invented with three or more degrees of freedom (DoF) to compensate for the assumed limitations of the 2 DoF mouse in the execution of 3D tasks. Nevertheless, the mouse remains the dominant input device in desktop 3D applications, which leads us to pose the following question: is the dominance of the mouse due simply to its widespread availability and long-term user habituation, or is the mouse, in fact, more suitable than dedicated 3D input devices to an important subset of 3D tasks? In the two studies reported in this paper, we measured performance efficiency of a group of subjects in accomplishing a 3D placement task and also observed physiological indicators through biosignal measurements. Subjects used both a standard 2D mouse and three other 3 DoF input devices. Much to our surprise, the standard 2D mouse outperformed the 3D input devices in both studies.


Journal of the Acoustical Society of America | 2012

Identification of walked-upon materials in auditory, kinesthetic, haptic, and audio-haptic conditions.

Bruno L. Giordano; Yon Visell; Hsin-Yun Yao; Vincent Hayward; Jeremy R. Cooperstock; Stephen McAdams

Locomotion generates multisensory information about walked-upon objects. How perceptual systems use such information to get to know the environment remains unexplored. The ability to identify solid (e.g., marble) and aggregate (e.g., gravel) walked-upon materials was investigated in auditory, haptic or audio-haptic conditions, and in a kinesthetic condition where tactile information was perturbed with a vibromechanical noise. Overall, identification performance was better than chance in all experimental conditions and for both solids and the better identified aggregates. Despite large mechanical differences between the response of solids and aggregates to locomotion, for both material categories discrimination was at its worst in the auditory and kinesthetic conditions and at its best in the haptic and audio-haptic conditions. An analysis of the dominance of sensory information in the audio-haptic context supported a focus on the most accurate modality, haptics, but only for the identification of solid materials. When identifying aggregates, response biases appeared to produce a focus on the least accurate modality: kinesthesia. When walking on loose materials such as gravel, individuals do not perceive surfaces by focusing on the most accurate modality, but by focusing on the modality that would most promptly signal postural instabilities.


International Conference on Haptics: Perception, Devices and Scenarios | 2008

A Vibrotactile Device for Display of Virtual Ground Materials in Walking

Yon Visell; Jeremy R. Cooperstock; Bruno L. Giordano; Karmen Franinovic; Alvin Law; Stephen McAdams; Kunal Jathal; Federico Fontana

We present a floor tile designed to provide the impression of walking on different ground materials, such as gravel, carpet, or stone. The device uses affordable and commercially available vibrotactile actuators and force sensors, and as such might one day be cost-effectively used in everyday environments. The control software is based on a lumped model of physical interactions between the foot and the ground surface. We have prototyped a measurement scheme for calibrating the device to match real-world ground materials.
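The lumped foot-ground interaction model can be caricatured in a few lines. The grain-fracture threshold and the noise-burst rendering below are illustrative assumptions, not the control software's actual equations:

```python
import numpy as np

rng = np.random.default_rng(0)

def gravel_feedback(force_trace, grain_threshold=5.0):
    """Map a sampled foot-force trace (newtons) to actuator drive samples.

    Toy lumped model: increases in ground reaction force are accumulated, and
    each time the accumulation crosses a per-grain threshold, one grain
    "fractures" and emits a short noise burst scaled by the current force.
    """
    out = np.zeros_like(force_trace)
    accumulated = 0.0
    for i in range(1, len(force_trace)):
        df = force_trace[i] - force_trace[i - 1]
        if df > 0:
            accumulated += df
        while accumulated >= grain_threshold:
            accumulated -= grain_threshold
            out[i] += force_trace[i] * rng.standard_normal()
    return out
```

In this sketch a steadily increasing foot force produces a train of discrete bursts, evoking grains crushing underfoot; a different material (e.g., carpet) would use a different event model and burst timbre.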


IEEE Transactions on Image Processing | 2012

Toward Dynamic Image Mosaic Generation With Robustness to Parallax

Qi Zhi; Jeremy R. Cooperstock

Mosaicing is largely dependent on the quality of registration among the constituent input images. Parallax and object motion present challenges to image registration, leading to artifacts in the result. To reduce the impact of these artifacts, traditional image mosaicing approaches often impose planar scene constraints or rely on purely rotational camera motion or dense sampling. However, these requirements are often impractical or fail to address the needs of all applications. Instead, taking advantage of depth cues and a smooth transition criterion, we achieve significantly improved mosaicing results for static scenes, coping effectively with nontrivial parallax in the input. We extend this approach to the synthesis of dynamic video mosaics, incorporating foreground/background segmentation and a consistent motion perception criterion. Although further additions are required to cope with unconstrained object motion, our algorithm can synthesize a perceptually convincing output, conveying the same appearance of motion as seen in the input sequences.
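The abstract does not spell out the smooth transition criterion; one common way to realize such a criterion, used here purely as an illustrative stand-in, is to route a minimum-error seam through the overlap region by dynamic programming:

```python
import numpy as np

def min_error_seam(err):
    """Find a vertical seam (one column per row) of minimal accumulated
    registration error through an overlap region, via dynamic programming.
    Blending the two images on either side of this seam hides misalignment."""
    h, w = err.shape
    cost = err.copy()
    for y in range(1, h):
        for x in range(w):
            lo, hi = max(0, x - 1), min(w, x + 2)
            cost[y, x] += cost[y - 1, lo:hi].min()
    # Backtrack from the cheapest bottom-row cell
    seam = [int(np.argmin(cost[-1]))]
    for y in range(h - 2, -1, -1):
        x = seam[-1]
        lo, hi = max(0, x - 1), min(w, x + 2)
        seam.append(lo + int(np.argmin(cost[y, lo:hi])))
    return seam[::-1]
```

The error surface `err` would typically be the per-pixel color difference between the two registered images in their overlap, so the seam avoids regions where parallax causes disagreement.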


PLOS ONE | 2011

Vibration influences haptic perception of surface compliance during walking.

Yon Visell; Bruno L. Giordano; Guillaume Millet; Jeremy R. Cooperstock

Background: The haptic perception of ground compliance is used for stable regulation of dynamic posture and the control of locomotion in diverse natural environments. Although rarely investigated in relation to walking, vibrotactile sensory channels are known to be active in the discrimination of material properties of objects and surfaces through touch. This study investigated how the perception of ground surface compliance is altered by plantar vibration feedback.

Methodology/Principal Findings: Subjects walked in shoes over a rigid floor plate that provided plantar vibration feedback, and responded indicating how compliant it felt, either in subjective magnitude or via pairwise comparisons. In one experiment, the compliance of the floor plate was also varied. Results showed that perceived compliance of the plate increased monotonically with vibration feedback intensity, and depended to a lesser extent on the temporal or frequency distribution of the feedback. When both plate stiffness (inverse compliance) and vibration amplitude were manipulated, the effect persisted, with both factors contributing to compliance perception. A significant influence of vibration was observed even for amplitudes close to psychophysical detection thresholds.

Conclusions/Significance: These findings reveal that vibrotactile sensory channels are highly salient to the perception of surface compliance, and suggest that correlations between vibrotactile sensory information and motor activity may be of broader significance for the control of human locomotion than has been previously acknowledged.


Workshop on Applications of Computer Vision | 2005

Requirements for Camera Calibration: Must Accuracy Come with a High Price?

Wei Sun; Jeremy R. Cooperstock

Since a large number of vision applications rely on the mapping between 3D scenes and their corresponding 2D camera images, an important practical consideration for researchers is what the major determinants of camera calibration accuracy are, and what accuracy can be achieved within the practical limits of their environments. In response, we present a thorough study investigating the effects of training data quantity, measurement error, pixel coordinate noise, and the choice of camera model on camera calibration results. Through this effort, we seek to determine whether expensive, elaborate setups are necessary, or indeed beneficial, to camera calibration, and whether a high-complexity camera model leads to improved accuracy. The results are first provided for a simulated camera system and then verified through carefully controlled experiments using real-world measurements.
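The 3D-to-2D mapping at the heart of calibration can be estimated, for example, with the classic Direct Linear Transform. This is a generic sketch of that standard technique, not the paper's method, and the camera parameters are made up:

```python
import numpy as np

def dlt_projection(X, x):
    """Estimate a 3x4 pinhole projection matrix P from n >= 6 correspondences
    between 3D points X (n, 3) and 2D image points x (n, 2): stack two linear
    constraints per point and take the right singular vector with the
    smallest singular value as the (scale-ambiguous) solution."""
    A = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        Pt = [Xw, Yw, Zw, 1.0]
        A.append([*Pt, 0, 0, 0, 0, *(-u * c for c in Pt)])
        A.append([0, 0, 0, 0, *Pt, *(-v * c for c in Pt)])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)

def project(P, Xw):
    """Apply a projection matrix to a 3D point, returning pixel coordinates."""
    ph = P @ np.append(Xw, 1.0)
    return ph[:2] / ph[2]
```

With exact, noise-free correspondences the estimate reproduces the true projection; the study's questions concern how measurement error and pixel noise in `X` and `x` degrade this estimate, and whether richer camera models (e.g., with lens distortion terms) help.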

Collaboration


Dive into Jeremy R. Cooperstock's collaborations.

Top Co-Authors


Yon Visell

University of California
