Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Misha Sra is active.

Publication


Featured research published by Misha Sra.


Virtual Reality Software and Technology | 2016

Procedurally generated virtual reality from 3D reconstructed physical space

Misha Sra; Sergio Garrido-Jurado; Chris Schmandt; Pattie Maes

We present a novel system for automatically generating immersive and interactive virtual reality (VR) environments using the real world as a template. The system captures indoor scenes in 3D, detects obstacles like furniture and walls, and maps walkable areas (WA) to enable real-walking in the generated virtual environment (VE). Depth data is additionally used for recognizing and tracking objects during the VR experience. The detected objects are paired with virtual counterparts to leverage the physicality of the real world for a tactile experience. Our approach is new in that it allows a casual user to easily create virtual reality worlds in any indoor space of arbitrary size and shape without requiring specialized equipment or training. We demonstrate our approach through a fully working system implemented on the Google Project Tango tablet device.
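
The paper describes a pipeline (3D capture, obstacle detection, walkable-area mapping) rather than code. As a rough, hypothetical illustration of the walkable-area step only, the sketch below assumes the scan has already been projected onto the floor as a 2D occupancy grid and keeps the free cells reachable from the user's starting position; the function name and grid representation are my own, not the authors'.

    # Hypothetical sketch: walkable-area mask from a floor-projected occupancy grid.
    from collections import deque
    import numpy as np

    def walkable_area(occupancy: np.ndarray, start: tuple) -> np.ndarray:
        """occupancy: 2D bool grid, True where an obstacle (wall, furniture) was detected.
        start: a grid cell known to be free, e.g. the user's current position.
        Returns a bool mask of free cells reachable from start."""
        assert not occupancy[start], "start must be a free cell"
        h, w = occupancy.shape
        mask = np.zeros_like(occupancy, dtype=bool)
        queue = deque([start])
        mask[start] = True
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < h and 0 <= nc < w and not occupancy[nr, nc] and not mask[nr, nc]:
                    mask[nr, nc] = True
                    queue.append((nr, nc))
        return mask

In a full system, the resulting mask would then drive the layout of virtual terrain so that every walkable real cell maps to walkable virtual ground.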


User Interface Software and Technology | 2015

MetaSpace: Full-body Tracking for Immersive Multiperson Virtual Reality

Misha Sra; Chris Schmandt

Most current virtual reality (VR) interactions are mediated by hand-held input devices or hand gestures, and they usually display only a partial representation of the user in the synthetic environment. We believe that representing the user as a full avatar controlled by the person's natural movements in the real world will lead to a greater sense of presence in VR. Possible applications exist in various domains such as entertainment, therapy, travel, real estate, education, social interaction, and professional assistance. In this demo, we present MetaSpace, a virtual reality system that allows co-located users to explore a VR world together by walking around in physical space. Each user's body is represented by an avatar that is dynamically controlled by their body movements. We achieve this by tracking each user's body with a Kinect device such that their physical movements are mirrored in the virtual world. Users can see their own avatar and the other person's avatar, allowing them to perceive and act intuitively in the virtual environment.
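
The abstract describes Kinect-based mirroring at a high level; the following minimal sketch is my own illustration of the core idea, assuming each user's tracked joint positions arrive in sensor space and a per-user rigid calibration (rotation plus offset) places their tracking volume in the shared virtual world. Joint names and function signatures are illustrative, not from MetaSpace.

    # Illustrative sketch: drive an avatar from Kinect-style joint positions.
    import numpy as np

    JOINTS = ["head", "spine", "hand_left", "hand_right", "foot_left", "foot_right"]

    def sensor_to_world(p_sensor: np.ndarray, rotation: np.ndarray, offset: np.ndarray) -> np.ndarray:
        """Rigid transform from Kinect space into the shared virtual world."""
        return rotation @ p_sensor + offset

    def update_avatar(avatar_pose: dict, kinect_frame: dict,
                      rotation: np.ndarray, offset: np.ndarray) -> None:
        """Copy every joint tracked this frame onto the user's avatar."""
        for joint in JOINTS:
            if joint in kinect_frame:
                avatar_pose[joint] = sensor_to_world(np.asarray(kinect_frame[joint]), rotation, offset)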


Human Factors in Computing Systems | 2016

Immersive Terrestrial Scuba Diving Using Virtual Reality

Dhruv Jain; Misha Sra; Jingru Guo; Rodrigo Marques; Raymond Wu; Chris Schmandt

SCUBA diving as a sport has enabled people to explore the magnificent ocean diversity of beautiful corals, striking fish, and mysterious wrecks. However, only a small number of people are able to experience these wonders because diving is expensive, mentally and physically challenging, time consuming, and requires access to large bodies of water. Most existing SCUBA diving simulations in VR are limited to visual and aural displays. We propose Amphibian, a virtual reality system that provides an immersive SCUBA diving experience through a convenient terrestrial simulator. Users lie on their torso on a motion platform with their outstretched arms and legs placed in a suspended harness. They receive visual and aural feedback through the Oculus Rift head-mounted display and a pair of headphones. Additionally, we simulate buoyancy, drag, and temperature changes through various sensors. Preliminary deployment shows that the system has the potential to offer a high degree of presence in VR.
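
The abstract mentions simulating buoyancy and drag but gives no model. As a hedged illustration only, the sketch below shows one simple way such forces could be computed for the diver's vertical motion (weight down, buoyancy up, quadratic drag opposing movement); the constants and default parameter values are made up for the example, not taken from Amphibian.

    # Hypothetical buoyancy/drag model for a diver's vertical motion.
    RHO_WATER = 1000.0   # kg/m^3
    G = 9.81             # m/s^2

    def net_vertical_force(mass_kg, displaced_volume_m3, velocity_m_s,
                           drag_coeff=0.8, frontal_area_m2=0.5):
        """Velocity is positive upward; returns the net upward force in newtons."""
        weight = -mass_kg * G
        buoyancy = RHO_WATER * displaced_volume_m3 * G
        drag = -0.5 * RHO_WATER * drag_coeff * frontal_area_m2 * velocity_m_s * abs(velocity_m_s)
        return weight + buoyancy + drag

    def step(depth_m, velocity_m_s, mass_kg, volume_m3, dt=0.02):
        """One Euler integration step; depth is measured downward from the surface."""
        accel = net_vertical_force(mass_kg, volume_m3, velocity_m_s) / mass_kg
        velocity_m_s += accel * dt
        depth_m -= velocity_m_s * dt    # moving up (positive velocity) reduces depth
        return depth_m, velocity_m_s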


Ubiquitous Computing | 2015

Expanding social mobile games beyond the device screen

Misha Sra; Chris Schmandt

Emerging pervasive games use sensors, graphics, and networking technologies to provide immersive game experiences integrated with the real world. Existing pervasive games commonly rely on a device screen for game-related information, overlooking opportunities to use new types of contextual interaction, such as jumping, a punching gesture, or even voice, as game inputs. We present the design of Spellbound, an outdoor, pervasive, team-based physical mobile game, to help contribute to our understanding of how we can design pervasive games that aim to nurture a spirit of togetherness. We also briefly touch upon, in the user evaluation section, how togetherness and playfulness can transform physical movement into a desirable activity. Spellbound takes advantage of the above-mentioned opportunities and integrates real-world actions like jumping and spinning with a virtual world. It also replaces touch-based input with voice interaction and provides glanceable and haptic feedback using custom hardware, in the true spirit of social play characteristic of traditional children's games. We believe Spellbound is a form of digital outdoor gaming that anchors enjoyment in physical action, social interaction, and tangible feedback. Spellbound was well received in user evaluation playtests, which confirmed that the main design objective of enhancing a sense of togetherness was largely met.
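
The abstract names jumping and spinning as game inputs without saying how they are sensed. Purely as an illustration of how such a contextual input might be detected, the sketch below turns a phone accelerometer stream into discrete jump events using a landing-impact threshold and a refractory period; the thresholds and class name are assumptions, not Spellbound's implementation.

    # Hypothetical jump detector over accelerometer samples (SI units).
    import math

    class JumpDetector:
        def __init__(self, threshold_g=2.2, refractory_s=0.6):
            self.threshold = threshold_g * 9.81   # landing-impact magnitude in m/s^2
            self.refractory_s = refractory_s
            self._last_jump_t = float("-inf")

        def update(self, t: float, ax: float, ay: float, az: float) -> bool:
            """Feed one sample (time in seconds, acceleration in m/s^2);
            returns True when a new jump is registered."""
            magnitude = math.sqrt(ax * ax + ay * ay + az * az)
            if magnitude > self.threshold and (t - self._last_jump_t) > self.refractory_s:
                self._last_jump_t = t
                return True
            return False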


Human Factors in Computing Systems | 2018

Project Zanzibar: A Portable and Flexible Tangible Interaction Platform

Nicolas Villar; Daniel Cletheroe; Greg Saul; Christian Holz; Tim Regan; Oscar Salandin; Misha Sra; Hui Shyong Yeo; William Field; Haiyan Zhang

We present Project Zanzibar: a flexible mat that can locate, uniquely identify, and communicate with tangible objects placed on its surface, as well as sense a user's touch and hover hand gestures. We describe the underlying technical contributions: efficient and localised Near Field Communication (NFC) over a large surface area; object tracking combining NFC signal strength and capacitive footprint detection; and manufacturing techniques for a rollable device form factor that enables portability while providing a sizable interaction area when unrolled. In addition, we detail design patterns for tangibles of varying complexity and interactive capabilities, including the ability to sense orientation on the mat, harvest power, provide additional input and output, stack, or extend sensing outside the bounds of the mat. Capabilities and interaction modalities are illustrated with self-generated applications. Finally, we report on the experience of professional game developers building novel physical/digital experiences using the platform.
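
The abstract says object tracking combines NFC signal strength with capacitive footprint detection but does not describe the algorithm. The sketch below is a guess at the general shape of such a fusion, not Zanzibar's method: per-antenna signal strengths give a coarse position for an identified tag, and the tag is then associated with the nearest capacitive blob centroid. All names and data formats are assumptions; signal strengths are assumed to be positive amplitudes rather than dBm.

    # Hypothetical NFC + capacitive fusion: pair a tag ID with a touch blob.
    import math

    def coarse_position(strength_by_antenna: dict, antenna_xy: dict) -> tuple:
        """Signal-strength-weighted average of antenna coordinates (mat space)."""
        total = sum(strength_by_antenna.values())
        x = sum(s * antenna_xy[a][0] for a, s in strength_by_antenna.items()) / total
        y = sum(s * antenna_xy[a][1] for a, s in strength_by_antenna.items()) / total
        return (x, y)

    def associate(tag_id, strength_by_antenna, antenna_xy, blob_centroids):
        """Return (tag_id, blob) for the capacitive blob closest to the NFC estimate."""
        cx, cy = coarse_position(strength_by_antenna, antenna_xy)
        best = min(blob_centroids, key=lambda b: math.hypot(b[0] - cx, b[1] - cy))
        return tag_id, best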


User Interface Software and Technology | 2016

Resolving Spatial Variation And Allowing Spectator Participation In Multiplayer VR

Misha Sra; Dhruv Jain; Arthur Pitzer Caetano; Andrés A. Calvo; Erwin Hilton; Chris Schmandt

Multiplayer virtual reality (VR) games introduce the problem of variation in the physical size and shape of each user's space when mapping into a shared virtual space. We propose an asymmetric approach to solve this spatial variation problem by allowing people to choose roles based on the size of their space. We demonstrate this concept through the implementation of a virtual snowball fight where players can choose from multiple roles, namely the shooter, the target, or an onlooker, depending on whether the game is played remotely or together in one large space. In the co-located version, the target stands behind an actuated cardboard fort that responds to events in VR, providing non-VR spectators a way to participate in the experience. During preliminary deployment, users showed extremely positive reactions and the spectators were thrilled.
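
The role-choosing idea is described only in prose. As a loose, made-up illustration of how a client might suggest a role from a player's tracked play area, consider the sketch below; the thresholds, and any role semantics beyond those named in the abstract, are arbitrary.

    # Hypothetical role suggestion from the size of a player's tracked area.
    def suggest_role(width_m: float, depth_m: float, co_located: bool) -> str:
        area = width_m * depth_m
        if co_located and area < 2.0:
            return "onlooker"   # little room to move: spectate, e.g. beside the actuated fort
        if area < 6.0:
            return "target"     # small room: dodge in place behind cover
        return "shooter"        # large space: roam freely and throw snowballs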


Collaborative Virtual Environments | 2016

Bringing real objects, spaces, actions, and interactions into social VR

Misha Sra; Chris Schmandt

We present a novel multiuser virtual reality (VR) system where the physical world is used as a template for the placement of walls, furniture, and objects in the virtual world so as to create a correspondence in scale, spatial layout, and object placement between the two spaces. Through this association between the real and virtual worlds, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with each other and with objects just as they would in real life, bringing us closer to the realization of realistic collaborative virtual environments. Preliminary deployment during our lab's semi-annual open house shows that the system has the potential to offer a high degree of presence in VR.


International Conference on Persuasive Technology | 2013

Spotz: a location-based approach to self-awareness

Misha Sra; Chris Schmandt

This paper introduces Spotz, a location-based mobile application that explores the persuasive qualities of sharing location information visually to promote behavior change. Spotz encourages users to become self-aware of the kinds of places they visit, which can have motivational properties deriving from social feedback. The app displays a continually evolving graphic of relatively sized circles depicting the number and type of places at which the user checks in, and includes the option to upload this visual to social media.
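
As a small, hypothetical illustration of the kind of visualization the abstract describes, the snippet below sizes one circle per place category so that circle area is proportional to the user's check-in count; the scaling choice and names are mine, not the app's.

    # Hypothetical circle sizing: area proportional to check-in count.
    import math

    def circle_radii(checkins_by_category: dict, max_radius_px: float = 80.0) -> dict:
        """checkins_by_category: e.g. {"cafe": 12, "gym": 3, "library": 7}.
        Returns a radius in pixels per category; the largest count gets max_radius_px."""
        peak = max(checkins_by_category.values())
        return {category: max_radius_px * math.sqrt(count / peak)
                for category, count in checkins_by_category.items()}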


IEEE Transactions on Visualization and Computer Graphics | 2017

Oasis: Procedurally Generated Social Virtual Spaces from 3D Scanned Real Spaces

Misha Sra; Sergio Garrido-Jurado; Pattie Maes

We present Oasis, a novel system for automatically generating immersive and interactive virtual reality environments for single-user and multiuser experiences. Oasis enables real-walking in the generated virtual environment by capturing indoor scenes in 3D and mapping walkable areas. It makes use of available depth information to recognize objects in the real environment, which are paired with virtual counterparts to leverage the physicality of the real world for a more immersive virtual experience. Oasis allows co-located and remotely located users to interact seamlessly and walk naturally in a shared virtual environment. Experiencing virtual reality with currently available devices can be cumbersome due to the presence of objects and furniture, which need to be removed every time the user wishes to use VR. Our approach is new in that it allows casual users to easily create virtual reality environments in any indoor space without rearranging furniture or requiring specialized equipment, skill, or training. We demonstrate our approach of overlaying a virtual environment on an existing physical space through fully working single-user and multiuser systems implemented on a Tango tablet device.
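
The object-pairing step (real objects matched with touchable virtual counterparts) is described only at a high level. The sketch below is one imaginable way to pick a virtual proxy from a recognized object's category and footprint; the category-to-proxy table and heuristic are invented for illustration and are not from the paper.

    # Hypothetical pairing of recognized real objects with virtual proxies.
    PROXIES = {
        "chair": ["tree_stump", "rock_seat"],
        "table": ["stone_slab", "market_stall"],
        "couch": ["fallen_log"],
    }

    def pick_proxy(category: str, footprint_m2: float) -> str:
        """Return a virtual proxy of the same category and roughly similar size."""
        candidates = PROXIES.get(category, ["generic_boulder"])
        index = min(len(candidates) - 1, int(footprint_m2))   # crude: bigger object, later entry
        return candidates[index]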


Human Factors in Computing Systems | 2018

BreathVR: Leveraging Breathing as a Directly Controlled Interface for Virtual Reality Games

Misha Sra; Xuhai Xu; Pattie Maes

With virtual reality head-mounted displays rapidly becoming accessible to mass audiences, there is growing interest in new forms of natural input techniques to enhance immersion and engagement for players. Research has explored physiological input for enhancing immersion in single-player games through indirectly controlled signals like heart rate or galvanic skin response. In this paper, we propose breathing as a directly controlled physiological signal that can facilitate unique and engaging play experiences through natural interaction in single and multiplayer virtual reality games. Our study (N = 16) shows that participants report a higher sense of presence and find the gameplay more fun and challenging when using our breathing actions. From study observations and analysis, we present five design strategies that can aid virtual reality game designers interested in using directly controlled forms of physiological input.
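
Since breathing here is a directly controlled signal, game actions have to be derived from the raw sensor stream in real time. The sketch below is an assumed, simplified illustration (not the BreathVR implementation): it smooths the breathing signal and thresholds its rate of change to emit discrete inhale/exhale events that a game could bind to actions. The sensor type, threshold, and smoothing factor are all placeholders.

    # Hypothetical inhale/exhale classifier over a sampled breathing signal.
    class BreathClassifier:
        def __init__(self, rate_threshold: float = 0.15, alpha: float = 0.2):
            self.rate_threshold = rate_threshold   # signal units per second
            self.alpha = alpha                     # exponential smoothing factor
            self._smoothed = None

        def update(self, sample: float, dt: float) -> str:
            """Feed one sensor sample taken dt seconds after the previous one;
            returns "inhale", "exhale", or "idle"."""
            if self._smoothed is None:
                self._smoothed = sample
                return "idle"
            previous = self._smoothed
            self._smoothed = self.alpha * sample + (1 - self.alpha) * previous
            rate = (self._smoothed - previous) / dt
            if rate > self.rate_threshold:
                return "inhale"    # signal rising: chest expanding
            if rate < -self.rate_threshold:
                return "exhale"    # signal falling: chest contracting
            return "idle"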

Collaboration


Dive into Misha Sra's collaborations.

Top Co-Authors

Chris Schmandt, Massachusetts Institute of Technology
Pattie Maes, Massachusetts Institute of Technology
Dhruv Jain, Massachusetts Institute of Technology
Deb Roy, Massachusetts Institute of Technology
Jingru Guo, Rhode Island School of Design