Yon Visell
McGill University
Publications
Featured research published by Yon Visell.
Interacting with Computers | 2009
Yon Visell
Applying enactive principles within human-computer interaction poses interesting challenges to the way that we design and evaluate interfaces, particularly those that possess a strong sensorimotor character. This article surveys the field of tactile sensory substitution, an area of science and engineering that lies at the intersection of such research domains as neuroscience, haptics, and sensory prosthetics. It is argued that this area of research is of high relevance to the design and understanding of enactive interfaces that make use of touch, and is also a fertile arena for revealing fundamental issues at stake in the design and implementation of enactive interfaces, ranging from engineering to human sensory physiology and the function and plasticity of perception. A survey of these questions is provided, alongside a range of current and historical examples.
IEEE Transactions on Haptics | 2009
Yon Visell; Alvin Law; Jeremy R. Cooperstock
Floor surfaces are notable for the diverse roles that they play in our negotiation of everyday environments. Haptic communication via floor surfaces could enhance or enable many computer-supported activities that involve movement on foot. In this paper, we discuss potential applications of such interfaces in everyday environments and present a haptically augmented floor component through which several interaction methods are being evaluated. We describe two approaches to the design of structured vibrotactile signals for this device. The first is centered on a musical phrase metaphor, as employed in prior work on tactile display. The second is based upon the synthesis of rhythmic patterns of virtual physical impact transients. We report on an experiment in which participants were able to identify communication units that were constructed from these signals and displayed via a floor interface at well above chance levels. The results support the feasibility of tactile information display via such interfaces and provide further indications as to how to effectively design vibrotactile signals for them.
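The second signal-design approach, rhythmic patterns of virtual physical impact transients, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the decaying-sinusoid impact model, the 2 kHz sample rate, and all function names are assumptions.

```python
import numpy as np

FS = 2000  # sample rate in Hz (assumed); vibrotactile sensitivity spans roughly 50-1000 Hz


def impact_transient(freq_hz=250.0, decay_s=0.02, dur_s=0.1):
    """One exponentially decaying sinusoid, a common simple model of an impact transient."""
    t = np.arange(int(dur_s * FS)) / FS
    return np.exp(-t / decay_s) * np.sin(2 * np.pi * freq_hz * t)


def rhythmic_pattern(onsets_s, total_s=1.0, **impact_kw):
    """Sum identical impact transients at the given onset times to form a tactile rhythm."""
    sig = np.zeros(int(total_s * FS))
    tick = impact_transient(**impact_kw)
    for onset in onsets_s:
        i = int(onset * FS)
        n = min(len(tick), len(sig) - i)
        sig[i:i + n] += tick[:n]
    return sig


# A one-second pattern with five impacts, denser toward the middle.
pattern = rhythmic_pattern([0.0, 0.25, 0.5, 0.625, 0.75])
```

In practice such a signal would be bandpass-limited to the actuator's usable range and scaled to the display's output calibration before playback.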
Journal of the Acoustical Society of America | 2012
Bruno L. Giordano; Yon Visell; Hsin-Yun Yao; Vincent Hayward; Jeremy R. Cooperstock; Stephen McAdams
Locomotion generates multisensory information about walked-upon objects. How perceptual systems use such information to get to know the environment remains unexplored. The ability to identify solid (e.g., marble) and aggregate (e.g., gravel) walked-upon materials was investigated in auditory, haptic, or audio-haptic conditions, and in a kinesthetic condition where tactile information was perturbed with a vibromechanical noise. Overall, identification performance was better than chance in all experimental conditions and for both solids and the better identified aggregates. Despite large mechanical differences between the response of solids and aggregates to locomotion, for both material categories discrimination was at its worst in the auditory and kinesthetic conditions and at its best in the haptic and audio-haptic conditions. An analysis of the dominance of sensory information in the audio-haptic context supported a focus on the most accurate modality, haptics, but only for the identification of solid materials. When identifying aggregates, response biases appeared to produce a focus on the least accurate modality, kinesthesia. When walking on loose materials such as gravel, individuals do not perceive surfaces by focusing on the most accurate modality, but by focusing on the modality that would most promptly signal postural instabilities.
Human Factors in Computing Systems | 2008
Davide Rocchesso; Stefania Serafin; Frauke Behrendt; Nicola Bernardini; Roberto Bresin; Gerhard Eckel; Karmen Franinovic; Thomas Hermann; Sandra Pauletto; Patrick Susini; Yon Visell
Sonic Interaction Design (SID) is an emerging field that is positioned at the intersection of auditory display, ubiquitous computing, interaction design, and interactive arts. SID can be used to describe practice and inquiry into any of various roles that sound may play in the interaction loop between users and artifacts, services, or environments, in applications that range from the critical functionality of an alarm, to the artistic significance of a musical creation. This field is devoted to the privileged role the auditory channel can assume in exploiting the convergence of computing, communication, and interactive technologies. An over-emphasis on visual displays has constrained the development of interactive systems that are capable of making more appropriate use of the auditory modality. Today the ubiquity of computing and communication resources allows us to think about sounds in a proactive way. This workshop puts a spotlight on such issues in the context of the emerging domain of SID.
PLOS ONE | 2011
Yon Visell; Bruno L. Giordano; Guillaume Millet; Jeremy R. Cooperstock
Background: The haptic perception of ground compliance is used for stable regulation of dynamic posture and the control of locomotion in diverse natural environments. Although rarely investigated in relation to walking, vibrotactile sensory channels are known to be active in the discrimination of material properties of objects and surfaces through touch. This study investigated how the perception of ground surface compliance is altered by plantar vibration feedback.

Methodology/Principal Findings: Subjects walked in shoes over a rigid floor plate that provided plantar vibration feedback, and responded indicating how compliant it felt, either in subjective magnitude or via pairwise comparisons. In one experiment, the compliance of the floor plate was also varied. Results showed that perceived compliance of the plate increased monotonically with vibration feedback intensity, and depended to a lesser extent on the temporal or frequency distribution of the feedback. When both plate stiffness (inverse compliance) and vibration amplitude were manipulated, the effect persisted, with both factors contributing to compliance perception. A significant influence of vibration was observed even for amplitudes close to psychophysical detection thresholds.

Conclusions/Significance: These findings reveal that vibrotactile sensory channels are highly salient to the perception of surface compliance, and suggest that correlations between vibrotactile sensory information and motor activity may be of broader significance for the control of human locomotion than has been previously acknowledged.
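A monotonic relation between subjective magnitude and stimulus intensity, as reported here, is conventionally summarized in psychophysics with Stevens' power law, psi = k * I**b, fit in log-log coordinates. The sketch below is a generic illustration of that analysis with synthetic data; it is not the paper's own statistical treatment, and the function name and numbers are assumptions.

```python
import numpy as np


def fit_power_law(intensity, magnitude):
    """Least-squares fit of Stevens' power law, psi = k * I**b.

    A linear fit of log(magnitude) against log(intensity) yields slope b
    (the power-law exponent) and intercept log(k). b > 0 corresponds to
    perceived magnitude growing monotonically with intensity.
    """
    b, log_k = np.polyfit(np.log(intensity), np.log(magnitude), 1)
    return np.exp(log_k), b


# Synthetic magnitude-estimation data following psi = 3 * I**0.5 exactly.
I = np.array([1.0, 2.0, 4.0, 8.0])
psi = 3.0 * I ** 0.5
k, b = fit_power_law(I, psi)
```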
Journal of the Acoustical Society of America | 2008
Bruno L. Giordano; Stephen McAdams; Yon Visell; Jeremy R. Cooperstock; Hsin-Yun Yao; Vincent Hayward
We investigated the role of haptic, proprioceptive and auditory information in the non-visual identification of walking grounds. We selected four solid materials (e.g., marble) and four aggregate materials (e.g., fine gravel). Five observers identified the materials in each of four experimental conditions: multisensory, haptic, proprioceptive, and auditory. In the auditory condition, they were presented with walking sounds they produced. In the other conditions, observers walked blindfolded on the materials. In the haptic and proprioceptive conditions auditory information was masked. In the proprioceptive condition haptic information was masked. No masking took place in the multisensory condition. In all conditions, solids and aggregates were seldom confused, and aggregates were better identified than solids. Chance identification performance was observed only for solids in the presence of simultaneous haptic and auditory masking, suggesting a secondary role of proprioceptive information. In the proprioce...
Symposium on 3D User Interfaces | 2010
Yon Visell; Severin Smith; Alvin Law; Rishi Rajalingham; Jeremy R. Cooperstock
This paper presents a novel interface and set of techniques enabling users to interact via the feet with augmented floor surfaces. The interface consists of an array of instrumented floor tiles distributed over an area of several square meters. Intrinsic force sensing is used to capture foot-floor contact at resolutions as fine as 1 cm, for use with floor-based multimodal touch surface interfaces. We present the results of a preliminary evaluation of the usability of such a display.
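Intrinsic force sensing infers contact location from forces measured beneath the tile rather than from sensors at the touched surface. One standard construction, shown here only as a sketch and not necessarily the authors' method, computes the center of pressure of a square tile from the vertical forces at its four corner load cells; the tile size and function name are assumptions.

```python
def contact_centroid(f_corners, tile_size_m=0.3):
    """Estimate the foot-floor contact point on a square tile from the
    vertical forces at its four corner load cells (intrinsic force sensing).

    f_corners = (f00, f10, f01, f11): forces (N) at corners located at
    (x, y) = (0, 0), (L, 0), (0, L), (L, L), with L = tile_size_m.
    Returns (x, y, total_force), or None when the tile is unloaded.
    """
    f00, f10, f01, f11 = f_corners
    total = f00 + f10 + f01 + f11
    if total <= 0.0:
        return None
    # Center of pressure: force-weighted average of corner positions.
    x = tile_size_m * (f10 + f11) / total
    y = tile_size_m * (f01 + f11) / total
    return x, y, total


# Equal corner loading places the contact at the tile center.
center = contact_centroid((1.0, 1.0, 1.0, 1.0))
```

The achievable spatial resolution then depends on sensor noise and calibration rather than on sensor spacing, which is how a sparse array of corner sensors can localize contact far more finely than its physical layout suggests.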
IEEE Haptics Symposium | 2010
Yon Visell; Jeremy R. Cooperstock
This paper describes the analysis, optimized redesign and evaluation of a high fidelity vibrotactile interface integrated in a rigid surface. The main application of the embodiment described here is vibrotactile display of virtual ground surface material properties for immersive environments, although the design principles are general. The device consists of a light, composite plate mounted on an elastic suspension, with integrated force sensors. It is actuated by a single voice coil motor. The structural dynamics of the device were optimized, within constraints imposed by the requirements of user interaction, and corrected via digital inverse filtering, in order to enable accurate simulation of virtual ground materials. Measurements of the resulting display demonstrate that it is capable of accurately reproducing forces of more than 40 N across a usable frequency band from 50 Hz to 750 Hz.
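Correcting a device's structural dynamics "via digital inverse filtering" can be illustrated with a regularized frequency-domain inverse of the measured impulse response; regularization keeps the filter from boosting noise at response notches. This is one standard construction offered as a sketch under stated assumptions, not the paper's specific design; the function name and parameter values are illustrative.

```python
import numpy as np


def inverse_filter(measured_ir, n_fft=1024, reg=1e-3):
    """Regularized frequency-domain inverse of a measured impulse response.

    Returns an FIR filter whose (circular) convolution with measured_ir
    approximates a unit impulse. The Tikhonov-style term `reg` bounds the
    gain where |H| is small, trading perfect equalization for robustness.
    """
    H = np.fft.rfft(measured_ir, n_fft)
    H_inv = np.conj(H) / (np.abs(H) ** 2 + reg)
    return np.fft.irfft(H_inv, n_fft)


# Equalize a toy two-tap response and verify the result is nearly a delta.
ir = np.array([1.0, 0.5])
inv = inverse_filter(ir)
eq = np.fft.irfft(np.fft.rfft(ir, 1024) * np.fft.rfft(inv, 1024), 1024)
```

In a real display the equalizer would be derived from a calibrated measurement of the plate-plus-actuator response over the usable band (here, 50-750 Hz) and applied to the synthesis output in real time.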
IEEE International Workshop on Haptic Audio Visual Environments and Games | 2008
Alvin Law; Benjamin V. Peck; Yon Visell; Paul G. Kry; Jeremy R. Cooperstock
We present a floor-space design that provides the impression of walking on various terrains by rendering graphical, audio and haptic stimuli synchronously with low latency. Currently, interactive floors tend to focus on visual and auditory feedback but have neglected to explore the role of haptics. Our design recreates these three modalities in a direct manner where the sensors and reactive cues are located in the same area. The goal of this project is the creation of a realistic and dynamic area of ground that allows multiple, untethered users to engage in intuitive interaction via locomotion in a virtual or augmented reality environment.
IEEE Virtual Reality Conference | 2010
Yon Visell; Alvin Law; Jessica Ip; Severin Smith; Jeremy R. Cooperstock
We present techniques to enable users to interact on foot with simulated natural ground surfaces, such as soil or ice, in immersive virtual environments. Position and force estimates from in-floor force sensors are used to synthesize plausible auditory and vibrotactile feedback in response. Relevant rendering techniques are discussed in the context of walking on a virtual frozen pond.
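Driving auditory and vibrotactile feedback from in-floor force sensors typically starts with detecting foot-strike (or, for a virtual frozen pond, crack-triggering) events in the force signal; a simple detector thresholds the force's rate of change. The sketch below is a hypothetical illustration, not the paper's rendering algorithm; the sampling rate, threshold, and function name are assumptions.

```python
import numpy as np


def detect_impacts(force, fs=1000.0, threshold_n_per_s=500.0):
    """Return sample indices where foot-floor impact events begin.

    force: sampled vertical force signal (N); fs: sample rate (Hz).
    An event is a run of samples whose force rate of change exceeds
    threshold_n_per_s; only the first sample of each run is returned,
    so each strike triggers feedback synthesis once.
    """
    dfdt = np.diff(force) * fs  # finite-difference force rate (N/s)
    above = np.flatnonzero(dfdt > threshold_n_per_s)
    if above.size == 0:
        return above
    # Keep only the onset of each contiguous run above threshold.
    return above[np.r_[True, np.diff(above) > 1]]


# Two separate strikes yield two onsets; each would trigger one transient.
events = detect_impacts(np.array([0.0, 100.0, 100.0, 0.0, 100.0, 100.0]))
```

Each detected onset would then trigger a synthesized transient (audio and vibrotactile) scaled by the measured force, which is how untethered walkers receive plausible ground-material feedback without wearable hardware.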