
Publications


Featured research published by Xavier Benavides.


Human Factors in Computing Systems | 2015

ShowMe: A Remote Collaboration System that Supports Immersive Gestural Communication

Judith Amores; Xavier Benavides; Pattie Maes

ShowMe is an immersive mobile collaboration system that allows a remote user to communicate with a peer using video, audio, and hand gestures. We explore the use of a Head Mounted Display and depth camera to create a system that (1) enables a remote user to be immersed in another user's first-person point of view and (2) offers a new way for the remote expert to provide guidance through three-dimensional hand gestures and voice. Using ShowMe, both users are present in the same physical environment and can perceive real-time communication from one another in the form of two-handed gestures and voice.
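As a rough illustration (not taken from the paper, which does not describe its wire format), the sketch below shows the kind of per-frame data a ShowMe-like system might stream from the depth camera to the peer's display: 3D hand-joint positions plus a timestamp, rendered remotely as overlaid hands. The HandFrame type, field names, and packet encoding are assumptions for illustration only.

# Hypothetical sketch: serializing depth-camera hand frames for a
# ShowMe-style remote-gesture stream. Names and format are assumptions.
import json
import time
from dataclasses import dataclass, asdict
from typing import List, Tuple

@dataclass
class HandFrame:
    timestamp: float                           # capture time, seconds
    hand: str                                  # "left" or "right"
    joints: List[Tuple[float, float, float]]   # 3D joint positions in metres

def encode_frame(frame: HandFrame) -> bytes:
    """Serialize one gesture frame for transmission to the peer."""
    return json.dumps(asdict(frame)).encode("utf-8")

def decode_frame(payload: bytes) -> HandFrame:
    """Reconstruct a gesture frame on the receiving side for rendering."""
    data = json.loads(payload.decode("utf-8"))
    data["joints"] = [tuple(j) for j in data["joints"]]
    return HandFrame(**data)

if __name__ == "__main__":
    frame = HandFrame(time.time(), "right", [(0.0, 0.1, 0.45)] * 21)
    assert decode_frame(encode_frame(frame)).hand == "right"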


Designing Interactive Systems | 2014

AutoEmotive: bringing empathy to the driving experience to manage stress

Javier Hernandez; Daniel McDuff; Xavier Benavides; Judith Amores; Pattie Maes; Rosalind W. Picard

With recent developments in sensing technologies, it is becoming feasible to comfortably measure several aspects of emotion during challenging daily-life situations. This work describes how the stress of drivers can be measured through different types of interactions, and how that information can enable several in-car interactions aimed at helping drivers manage stress. These new interactions could help not only to bring empathy to the driving experience but also to improve driver safety and increase social awareness.


Human Factors in Computing Systems | 2016

PsychicVR: Increasing mindfulness by using Virtual Reality and Brain Computer Interfaces

Judith Amores; Xavier Benavides; Pattie Maes

We present PsychicVR, a proof-of-concept system that integrates a brain-computer interface device and a Virtual Reality headset to improve mindfulness while providing a playful immersive experience. The fantasy that any of us could have superhero powers has always inspired people around the world. By using Virtual Reality and real-time brain-activity sensing, we are moving one step closer to making this dream real. We non-invasively monitor and record the electrical activity of the brain and incorporate this data into the VR experience using an Oculus Rift and the MUSE headband. By sensing brain waves with a series of EEG sensors, the level of activity is fed back to the user via 3D content in the virtual environment. When users are focused, they are able to make changes in the 3D environment and control their powers. Our system increases mindfulness and helps achieve higher levels of concentration while entertaining the user.
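As a hedged sketch of the feedback loop described above (not the authors' implementation), the snippet below smooths an EEG-derived focus estimate in [0, 1], as a MUSE-style SDK might expose, and maps it to the strength of a visual effect in the VR scene. The smoothing factor, threshold, and class name are assumptions for illustration.

# Hypothetical sketch: mapping a smoothed EEG focus estimate to a VR
# effect strength. Parameter values are illustrative assumptions.
class FocusFeedback:
    def __init__(self, alpha: float = 0.1, threshold: float = 0.6):
        self.alpha = alpha          # smoothing factor for the raw EEG signal
        self.threshold = threshold  # focus level at which "powers" activate
        self.level = 0.0

    def update(self, raw_focus: float) -> float:
        """Exponentially smooth the raw focus reading; return effect strength in [0, 1]."""
        self.level = (1 - self.alpha) * self.level + self.alpha * raw_focus
        # Below the threshold the effect stays off; above it, scale linearly.
        if self.level < self.threshold:
            return 0.0
        return (self.level - self.threshold) / (1 - self.threshold)

if __name__ == "__main__":
    fb = FocusFeedback()
    for reading in [0.2, 0.5, 0.8, 0.9, 0.9, 0.95]:
        print(round(fb.update(reading), 3))  # strength fed to the 3D scene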


Tangible and Embedded Interaction | 2015

clayodor: Retrieving Scents through the Manipulation of Malleable Material

Cindy Hsin-Liu Kao; Ermal Dreshaj; Judith Amores; Sang-won Leigh; Xavier Benavides; Pattie Maes; Ken Perlin; Hiroshi Ishii

clayodor (\klei-o-dor\) is a clay-like malleable material that changes smell based on user manipulation of its shape. This work explores the tangibility of shape-changing materials to capture smell, an ephemeral and intangible sensory input. We present the design of a proof-of-concept prototype and discuss the challenges of navigating smell through form.


Human Factors in Computing Systems | 2015

Exploring the Design of a Wearable Device to Turn Everyday Objects into Playful Experiences

Judith Amores; Xavier Benavides; Roger Boldú; Pattie Maes

In this paper we present a wearable device in the form of a bracelet that turns everyday objects into interactive physical gameplay. We combine physical exploration and interactive entertainment by providing real-time audio and light feedback without the need to be in front of a screen. In contrast with today's computer, video, and smartphone games, our system has the potential to enhance children's physical, social, and outdoor play. We designed a set of playful applications that seamlessly integrate technology with outdoor game play, music, sports, and social interactions.


User Interface Software and Technology | 2015

Remot-IO: a System for Reaching into the Environment of a Remote Collaborator

Xavier Benavides; Judith Amores; Pattie Maes

In this paper we present Remot-IO, a system for mobile collaboration and remote assistance around Internet-connected devices. The system uses two Head Mounted Displays, cameras, and depth sensors to enable a remote expert to be immersed in a local user's point of view and to control devices in that user's environment. The remote expert can provide guidance through hand gestures that appear in real time in the local user's field of view as superimposed 3D hands. In addition, the remote expert is able to operate devices in the novice's environment and bring about physical changes by using the same hand gestures the novice would use. We describe a smart radio whose knobs can be controlled by the local and remote user alike. Moreover, the user can visualize, interact with, and modify properties of sound waves in real time by using intuitive hand gestures.
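As a rough sketch of the final step in a Remot-IO-style pipeline (the paper does not specify its protocol), the snippet below translates an already-recognized rotation gesture on a virtual knob into a command for the Internet-connected radio. The command schema, knob identifiers, and step size are assumptions for illustration.

# Hypothetical sketch: turning a recognized knob-rotation gesture into a
# discrete command for a connected device. Schema and values are assumed.
import json

def knob_command(knob_id: str, rotation_degrees: float,
                 degrees_per_step: float = 15.0) -> bytes:
    """Convert a recognized rotation gesture into a knob adjustment message."""
    steps = round(rotation_degrees / degrees_per_step)
    return json.dumps({"device": "smart_radio",
                       "knob": knob_id,
                       "delta_steps": steps}).encode("utf-8")

if __name__ == "__main__":
    # A 45-degree clockwise twist by either the local or the remote user
    # maps to the same three-step volume increase on the physical radio.
    print(knob_command("volume", 45.0))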


User Interface Software and Technology | 2015

KickSoul: A Wearable System for Feet Interactions with Digital Devices

Xavier Benavides; Chang Long Zhu Jin; Pattie Maes; Joseph A. Paradiso

In this paper we present a wearable device that maps natural foot movements into inputs for digital devices. KickSoul consists of an insole with embedded sensors that tracks foot movements and triggers actions in the devices that surround us. We present a novel approach to using our feet as input devices in mobile situations when our hands are busy. We analyze natural foot movements and their meaning before triggering an action. This paper discusses different applications for this technology as well as the implementation of our prototype.
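As a hedged illustration of the kind of analysis an insole-based system like KickSoul might perform before triggering an action (not the authors' algorithm), the sketch below thresholds the peak acceleration magnitude in a short sample window to separate a deliberate kick or tap from ordinary walking. The window contents, axes, and threshold values are assumptions.

# Hypothetical sketch: coarse foot-gesture classification from insole
# accelerometer samples. Thresholds and labels are illustrative assumptions.
import math
from typing import List, Tuple

def peak_magnitude(window: List[Tuple[float, float, float]]) -> float:
    """Largest acceleration magnitude (in g) seen in the sample window."""
    return max(math.sqrt(x * x + y * y + z * z) for x, y, z in window)

def classify(window: List[Tuple[float, float, float]],
             walk_limit: float = 1.8, kick_limit: float = 3.0) -> str:
    m = peak_magnitude(window)
    if m < walk_limit:
        return "idle_or_walking"   # ignored: no action triggered
    if m < kick_limit:
        return "tap"               # e.g. toggle a nearby device
    return "kick"                  # e.g. dismiss a notification

if __name__ == "__main__":
    gentle = [(0.1, 0.0, 1.0)] * 20
    sharp = gentle + [(2.5, 0.5, 2.0)]
    print(classify(gentle), classify(sharp))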


International Symposium on Mixed and Augmented Reality | 2016

TactileVR: Integrating Physical Toys into Learn and Play Virtual Reality Experiences

Lior Shapira; Judith Amores; Xavier Benavides

We present TactileVR, a proof-of-concept virtual reality system in which a user is free to move around and interact with physical objects and toys, which are represented in the virtual world. By integrating tracking information from the head, hands, and feet of the user, as well as the objects, we infer complex gestures and interactions such as shaking a toy, rotating a steering wheel, or clapping your hands. We create educational and recreational experiences for kids that promote exploration and discovery while feeling intuitive and safe. In each experience objects have a unique appearance and behavior; for example, in an electric-circuits lab, toy blocks serve as switches, batteries, and light bulbs. We conducted a user study with children ages 5–11, who experienced TactileVR and interacted with virtual proxies of physical objects. Children took instantly to the TactileVR environment, intuitively discovered a variety of interactions, and completed tasks faster than with non-tactile virtual objects. Moreover, the presence of physical toys created the opportunity for collaborative play, even when only some of the kids were using a VR headset.
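As a rough sketch of one of the inferred interactions mentioned above, "shaking a toy" can be detected from the tracked object's position stream as repeated velocity reversals within a short window. This is an illustration under assumed sampling rate, window size, and thresholds, not the paper's method.

# Hypothetical sketch: detecting a "shake" gesture from 1D tracked object
# positions sampled every dt seconds. Parameters are illustrative assumptions.
from typing import List

def is_shaking(positions: List[float], dt: float = 0.01,
               min_reversals: int = 4, min_speed: float = 0.3) -> bool:
    """Return True if the tracked object reverses direction fast and often enough."""
    velocities = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    reversals = sum(1 for v0, v1 in zip(velocities, velocities[1:])
                    if v0 * v1 < 0 and abs(v1) > min_speed)
    return reversals >= min_reversals

if __name__ == "__main__":
    still = [0.0] * 30
    shaken = [0.0, 0.05, -0.05] * 10
    print(is_shaking(still), is_shaking(shaken))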


International Symposium on Wearable Computers | 2015

Invisibilia: revealing invisible data using augmented reality and internet connected devices

Xavier Benavides; Judith Amores; Pattie Maes

Invisibilia seeks to explore the use of Augmented Reality (AR), Head Mounted Displays (HMD), and depth cameras to create a system that makes invisible data from our environment visible, combining widely accessible hardware to visualize layers of information on top of the physical world. Using our implemented prototype, the user can visualize, interact with, and modify properties of sound waves in real time by using intuitive hand gestures. As such, the system supports experiential learning about certain physics phenomena through observation and hands-on experimentation.
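As a hedged illustration of the kind of data an Invisibilia-style AR layer might render (not the paper's implementation), the sketch below samples the instantaneous pressure of a traveling sine wave along a line away from its source; frequency and amplitude stand in for the properties a user would modify with hand gestures. The function name and sampling choices are assumptions; the wave equation itself is standard.

# Hypothetical sketch: sampling a traveling sound wave for an AR overlay.
import math

SPEED_OF_SOUND = 343.0  # metres per second, in air at room temperature

def wave_samples(frequency_hz: float, amplitude: float, t: float,
                 max_distance_m: float = 2.0, points: int = 50):
    """Yield (distance, pressure) pairs of a traveling sine wave at time t."""
    k = 2 * math.pi * frequency_hz / SPEED_OF_SOUND   # wavenumber
    omega = 2 * math.pi * frequency_hz                # angular frequency
    for i in range(points):
        d = i * max_distance_m / (points - 1)
        yield d, amplitude * math.sin(omega * t - k * d)

if __name__ == "__main__":
    # A pinch gesture could, for example, halve the frequency; the overlay
    # would then redraw these samples as a longer wave.
    samples = list(wave_samples(440.0, 1.0, t=0.0))
    print(samples[:3])  # each pair becomes a point drawn over the real world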


International Symposium on Mixed and Augmented Reality | 2016

The RealityMashers: Augmented Reality Wide Field-of-View Optical See-Through Head Mounted Displays

Jaron Lanier; Victor Mateevitsi; Kishore Rathinavel; Lior Shapira; Joseph Menke; Patrick Therien; Joshua Hudman; Gheric Speiginer; Andrea Stevenson Won; Andrzej Banburski; Xavier Benavides; Judith Amores; Javier Porras Lurashi; Wayne Chang

Optical see-through (OST) displays can overlay computer-generated graphics on top of the physical world, effectively fusing the two worlds together. However, current OST displays have a limited field-of-view (FOV) compared to the human visual system and are powered by laptops, which hinders their mobility. Furthermore, these systems are designed for single-user experiences and therefore cannot be used for collocated multi-user applications. In this paper we contribute the design of the RealityMashers, two wide-FOV OST displays that can be manufactured using rapid-prototyping techniques. We also contribute preliminary user feedback providing insights into enhancing future RealityMasher experiences. By providing the RealityMasher schematics we hope to make Augmented Reality more accessible and, as a result, accelerate research in the field.

Collaboration


Dive into Xavier Benavides's collaborations.

Top Co-Authors

Judith Amores, Massachusetts Institute of Technology
Pattie Maes, Massachusetts Institute of Technology
Chang Long Zhu Jin, Massachusetts Institute of Technology
Cindy Hsin-Liu Kao, Massachusetts Institute of Technology
Ermal Dreshaj, Massachusetts Institute of Technology
Hiroshi Ishii, Massachusetts Institute of Technology