Publication


Featured research published by Thai Phan.


Intelligent Robots and Systems | 2015

Mixed reality for robotics

Wolfgang Hönig; Christina Milanes; Lisa Scaria; Thai Phan; Mark T. Bolas; Nora Ayanian

Mixed Reality can be a valuable tool for research and development in robotics. In this work, we refine the definition of Mixed Reality to accommodate seamless interaction between physical and virtual objects in any number of physical or virtual environments. In particular, we show that Mixed Reality can reduce the gap between simulation and implementation by enabling the prototyping of algorithms on a combination of physical and virtual objects, including robots, sensors, and humans. Robots can be enhanced with additional virtual capabilities, or can interact with humans without sharing physical space. We demonstrate Mixed Reality with three representative experiments, each of which highlights the advantages of our approach. We also provide a testbed for Mixed Reality with three different virtual robotics environments in combination with the Crazyflie 2.0 quadcopter.
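The gap-reduction idea is concrete: physical and virtual robots share one world state, so each side can sense and react to the other. Below is a minimal sketch of such a state bridge in Python; read_mocap_pose and VirtualWorld are hypothetical stand-ins for a motion-capture query and a simulator, an illustration of the concept rather than the authors' testbed.

```python
# Minimal sketch of a mixed-reality state bridge (not the authors' code).
# Idea: physical and virtual robots share one world state, so a physical
# Crazyflie can "see" and avoid a purely virtual obstacle or teammate.
import time

class VirtualWorld:
    """Toy stand-in for a simulator that holds virtual objects."""
    def __init__(self):
        self.virtual_robots = {"v1": (1.0, 0.0, 0.5)}  # name -> (x, y, z)

    def update_physical(self, name, pose):
        # Mirror the physical robot inside the virtual scene.
        print("render", name, "at", pose)

def read_mocap_pose(name):
    # Hypothetical motion-capture query for a physical robot's pose.
    return (0.0, 0.0, 0.5)

def nearest_virtual_object(world, pose):
    # The physical controller treats virtual robots as real obstacles.
    return min(world.virtual_robots.values(),
               key=lambda v: sum((a - b) ** 2 for a, b in zip(v, pose)))

world = VirtualWorld()
for _ in range(3):                        # one bridge cycle per tick
    pose = read_mocap_pose("crazyflie1")  # physical -> virtual
    world.update_physical("crazyflie1", pose)
    print("avoid virtual object at", nearest_virtual_object(world, pose))
    time.sleep(0.02)                      # ~50 Hz update rate
```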


International Conference on Human-Computer Interaction | 2011

Leveraging unencumbered full body control of animated virtual characters for game-based rehabilitation

Belinda Lange; Evan A. Suma; Brad Newman; Thai Phan; Chien-Yen Chang; Albert A. Rizzo; Mark T. Bolas

The use of commercial video games as rehabilitation tools, such as the Nintendo® Wii Fit™, has recently gained much interest in the physical therapy arena. However, physical rehabilitation requires accurate and appropriate tracking and feedback of performance, often not provided by existing commercial console devices or games. This paper describes the development of an application that leverages recent advances in commercial video game technology to provide full-body control of animated virtual characters with low-cost markerless tracking. The aim of this research is to develop and evaluate an interactive game-based rehabilitation tool for balance training of adults with neurological injury. This paper outlines the development and evaluation of a game-based rehabilitation tool using the PrimeSense depth sensing technology, designed to elicit specific therapeutic motions when controlling a virtual avatar in pursuit of in-game goals. A sample of nine adults participated in the initial user testing, providing feedback on the hardware and software prototype.
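As an illustration of the pipeline such a tool needs, the sketch below maps tracked skeleton joints onto an avatar and scores a lateral-reach motion; read_joints and the scoring rule are hypothetical stand-ins, not the paper's software.

```python
# Minimal sketch of driving an avatar from markerless skeleton tracking.
# Assumption: a PrimeSense/OpenNI-style tracker exposes joint positions;
# `read_joints` is a hypothetical stand-in.

def read_joints():
    # name -> (x, y, z) in meters, as a depth-sensing SDK might report
    return {"torso": (0.10, 1.0, 2.0),
            "left_hand": (-0.3, 1.2, 1.9),
            "right_hand": (0.5, 1.4, 1.9)}

def retarget(joints, avatar_scale=1.0):
    # Map sensor-space joints onto the avatar rig (trivial scaling here).
    return {name: tuple(c * avatar_scale for c in pos)
            for name, pos in joints.items()}

def lateral_reach_score(joints, target_x=0.45):
    # Toy therapy metric: how far the right hand reaches toward a
    # lateral in-game target, eliciting a weight-shift motion.
    reach = joints["right_hand"][0] - joints["torso"][0]
    return min(max(reach / target_x, 0.0), 1.0)

joints = read_joints()
print("avatar pose:", retarget(joints))
print("reach score:", round(lateral_reach_score(joints), 2))
```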


IEEE Virtual Reality Conference | 2013

Open virtual reality

Mark T. Bolas; Perry Hoberman; Thai Phan; Palmer Luckey; James Iliff; Nate Burba; Ian E. McDowall; David M. Krum

The ICT Mixed Reality Lab is leveraging an open source philosophy to influence and disrupt industry. Projects spun out of the lab's efforts include the VR2GO smartphone-based viewer, the inVerse tablet-based viewer, the Socket HMD reference design, the Oculus Rift and the Project Holodeck gaming platforms, a repurposed FOV2GO design with Nokia Lumia phones for a 3D user interface course at Columbia University, and the EventLab's Socket-based HMD at the University of Barcelona. A subset of these will be demonstrated. This open approach is providing low-cost yet surprisingly compelling immersive experiences.


IEEE Virtual Reality Conference | 2017

NIVR: Neuro imaging in virtual reality

Tyler Ard; David M. Krum; Thai Phan; Dominique Duncan; Ryan Essex; Mark T. Bolas; Arthur W. Toga

Visualization is a critical component of neuroimaging, and how best to view data that is naturally three-dimensional is a long-standing question in neuroscience. Many approaches, programs, and techniques have been developed specifically for neuroimaging. However, exploration of 3D information through a 2D screen is inherently limited. Many neuroscientific researchers hope that, with the recent commercialization and popularization of VR, it can offer the next step in data visualization and exploration. Neuro Imaging in Virtual Reality (NIVR) is a visualization suite that employs various immersive visualizations to represent neuroimaging information in VR. Some established techniques, such as raymarching volume visualization, are paired with newer techniques, such as near-field rendering, to provide a broad basis for how VR can be leveraged to improve visualization and navigation of neuroimaging data. Several of the neuroscientific visualization approaches presented are, to our knowledge, the first of their kind. NIVR offers not only an exploration of neuroscientific data visualization, but also a tool to expose and educate the public regarding recent advancements in the field of neuroimaging. By providing an engaging experience through which to explore new techniques and discoveries in neuroimaging, we hope to spark scientific interest in a broad audience. Furthermore, neuroimaging offers deep and expansive datasets; a single scan can involve several gigabytes of information. Visualization and exploration of this type of information can be challenging, and real-time exploration of it in VR even more so. NIVR explores pathways that make this possible, and offers preliminary stereo visualizations of these types of massive data.
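Raymarching, the established technique named above, integrates color and opacity along each view ray through the scalar volume. A toy CPU version follows, assuming nothing about NIVR's actual implementation; the transfer function and volume are made up.

```python
# Minimal sketch of raymarching volume visualization (toy CPU version,
# not NIVR itself). Marches front-to-back, compositing emission/absorption.
import numpy as np

def raymarch(volume, origin, direction, step=0.5, max_steps=64):
    """Accumulate color/opacity along one ray through a 3D scalar volume."""
    color, alpha = 0.0, 0.0
    pos = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    for _ in range(max_steps):
        idx = tuple(np.round(pos).astype(int))
        if any(i < 0 or i >= n for i, n in zip(idx, volume.shape)):
            break                              # left the volume
        s = volume[idx]                        # sample the scalar field
        a = min(s * 0.1, 1.0)                  # toy transfer function
        color += (1.0 - alpha) * a * s         # front-to-back compositing
        alpha += (1.0 - alpha) * a
        if alpha > 0.99:                       # early ray termination
            break
        pos += d * step
    return color

# Toy "scan": a bright blob in the center of a 32^3 volume.
g = np.mgrid[0:32, 0:32, 0:32]
vol = np.exp(-((g - 16) ** 2).sum(axis=0) / 40.0)
print(raymarch(vol, origin=(16, 16, 0), direction=(0, 0, 1)))
```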


IEEE Virtual Reality Conference | 2014

Tablet-based interaction panels for immersive environments

David M. Krum; Thai Phan; Lauren Cairco Dukes; Peter Wang; Mark T. Bolas

With the current widespread interest in head-mounted displays, we perceived a need for devices that support expressive and adaptive interaction in a low-cost, eyes-free manner. Leveraging rapid prototyping techniques for fabrication, we have designed and manufactured a variety of panels that can be overlaid on multi-touch tablets and smartphones. The panels are coupled with an app running on the multi-touch device that exchanges commands and state information over a wireless network with the virtual reality application. Sculpted features of the panels provide tactile disambiguation of control widgets, and an onscreen heads-up display provides interaction state information. A variety of interaction mappings can be provided through software to support several classes of interaction techniques in virtual environments. We foresee additional uses for applications where eyes-free use and adaptable interaction interfaces can be beneficial.
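The command/state exchange described above can be pictured as a simple message loop between the panel app and the VR application. The sketch below uses UDP and JSON; the message fields, port handling, and transport are assumptions for illustration, not the authors' protocol.

```python
# Minimal sketch of the panel app <-> VR application exchange: the tablet
# sends a widget command, the VR app replies with state for the tablet's
# heads-up display. Both endpoints run in-process here for demonstration.
import json, socket

vr = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
vr.bind(("127.0.0.1", 0))             # hypothetical VR application endpoint
vr_addr = vr.getsockname()

tablet = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

# Tablet side: a sculpted widget was touched; send the command.
event = {"type": "command", "widget": "grab", "value": 1.0}
tablet.sendto(json.dumps(event).encode(), vr_addr)

# VR-app side: apply the command, then reply with interaction state
# for the on-screen heads-up display.
data, addr = vr.recvfrom(1024)
cmd = json.loads(data)
state = {"type": "state", "mode": "manipulate", "active": cmd["widget"]}
vr.sendto(json.dumps(state).encode(), addr)

print("tablet HUD shows:", json.loads(tablet.recvfrom(1024)[0]))
```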


Computer Animation and Virtual Worlds | 2017

Social influence of humor in virtual human counselor's self-disclosure

Sin-Hwa Kang; David M. Krum; Peter Khooshabeh; Thai Phan; Chien-Yen Chang; Ori Amir; Rebecca Lin

We explored the social influence of humor in a virtual human counselor's self-disclosure while also varying the ethnicity of the virtual counselor. In a 2 × 3 experiment (humor and ethnicity of the virtual human counselor), participants experienced counseling interview interactions via Skype on a smartphone. We measured user responses to and perceptions of the virtual human counselor. The results demonstrate that humor positively affects user responses to and perceptions of a virtual counselor. The results further suggest that matching styles of humor with a virtual counselor's ethnicity influences user responses and perceptions. The results offer insight into the effective design and development of realistic and believable virtual human counselors. Furthermore, they illuminate the potential use of humor to enhance self-disclosure in human-agent interactions.


Workshop on Immersive Analytics (IA) | 2016

ShodanVR: Immersive visualization of text records from the Shodan database

Thai Phan; David M. Krum; Mark T. Bolas

ShodanVR is an immersive visualization for querying and displaying text records from the Shodan database of Internet-connected devices. Shodan provides port connection data retrieved from servers, routers, and other networked devices [2]. Cybersecurity professionals can mine this data for device populations, software versions, and potential security vulnerabilities [1].
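For context, the kind of text records ShodanVR visualizes can be retrieved with the official shodan Python library (pip install shodan). The query and API key below are placeholders, and the VR layout logic is left out; this is our illustration, not the authors' pipeline.

```python
# Minimal sketch of pulling Shodan text records for visualization.
import shodan

api = shodan.Shodan("YOUR_API_KEY")          # placeholder key

try:
    results = api.search("port:22")          # example query
    for match in results["matches"][:5]:
        # Each match carries the raw banner text plus metadata that a
        # VR layout could position and color (IP, port, organization).
        print(match["ip_str"], match["port"], match.get("org"))
        print(match["data"][:80])            # banner excerpt
except shodan.APIError as exc:
    print("Shodan query failed:", exc)
```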


IEEE Virtual Reality Conference | 2014

A demonstration of tablet-based interaction panels for immersive environments

David M. Krum; Thai Phan; Lauren Cairco Dukes; Peter Wang; Mark T. Bolas

Our demo addresses the need in immersive virtual reality for devices that support expressive and adaptive interaction in a low-cost, eyes-free manner. Leveraging rapid prototyping techniques for fabrication, we have developed a variety of panels that can be overlaid on multi-touch tablets and smartphones. The panels are coupled with an app running on the multi-touch device that exchanges commands and state information over a wireless network with the virtual reality application. Sculpted features of the panels provide tactile disambiguation of control widgets, and an onscreen heads-up display provides interaction state information. A variety of interaction mappings can be provided through software to support several classes of interaction techniques in virtual environments. We foresee additional uses for applications where eyes-free use and adaptable interaction interfaces can be beneficial.


Computer Animation and Social Agents | 2018

Socio-Cultural Effects of Virtual Counseling Interviewers as Mediated by Smartphone Video Conferencing

Sin-Hwa Kang; David M. Krum; Peter Khooshabeh; Thai Phan; Chien-Yen Chang

We explored how users perceive virtual characters that performed the role of a counseling interviewer, while presenting different levels of social class, as well as single- or multi-tasking behavior. To investigate this subject, we designed a 2 × 2 experiment (tasking type and social class of the virtual counseling interviewer). In the experiment, participants experienced the counseling interview interactions over video conferencing on a smartphone. We measured user responses to and perceptions of the virtual human interviewer. The results demonstrate that the tasking type and social class of the virtual counselor affected user responses to and perceptions of the virtual counselor. The results offer insight into the design and development of effective, realistic, and believable virtual human counselors. Furthermore, the results also address current social questions about how smartphones might mediate social interactions, including human-agent interactions.


International Conference on Distributed, Ambient, and Pervasive Interactions | 2017

Social Impact of Enhanced Gaze Presentation Using Head Mounted Projection

David M. Krum; Sin-Hwa Kang; Thai Phan; Lauren Cairco Dukes; Mark T. Bolas

Projected displays can present life-sized imagery of a virtual human character that can be seen by multiple observers. However, typical projected displays can only render that virtual human from a single viewpoint, regardless of whether head tracking is employed. This results in the virtual human being rendered from an incorrect perspective for most individuals in a group of observers. This could result in perceptual miscues, such as the "Mona Lisa" effect, causing the virtual human to appear as if it is simultaneously gazing and pointing at all observers in the room, regardless of their location. This may be detrimental to training scenarios in which all trainees must accurately assess where the virtual human is looking or pointing a weapon. In this paper, we discuss our investigations into the presentation of eye gaze using REFLCT, a previously introduced head-mounted projective display. REFLCT uses head-tracked, head-mounted projectors and retroreflective screens to present personalized, perspective-correct imagery to multiple users without the occlusion of a traditional head-mounted display. We examined how head-mounted projection for enhanced presentation of eye gaze might facilitate or otherwise affect social interactions during a multi-person guessing game of "Twenty Questions."
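The underlying geometry is simple: a fixed projector renders the scene from one viewpoint for every observer, while REFLCT-style rendering derives a view matrix per tracked head. A minimal sketch of the per-observer view computation follows; the poses are made up and this is not the REFLCT codebase.

```python
# Minimal sketch of why per-user rendering avoids the "Mona Lisa" effect:
# each head-mounted projector renders from its own tracked viewpoint, so
# gaze direction reads correctly for every observer.
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Right-handed view matrix from a tracked head position."""
    eye, target, up = map(np.asarray, (eye, target, up))
    f = target - eye; f = f / np.linalg.norm(f)        # forward
    s = np.cross(f, up); s = s / np.linalg.norm(s)     # right
    u = np.cross(s, f)                                 # true up
    m = np.eye(4)
    m[0, :3], m[1, :3], m[2, :3] = s, u, -f
    m[:3, 3] = -m[:3, :3] @ eye                        # translate to eye
    return m

virtual_human_eyes = (0.0, 1.6, 0.0)
observers = {"trainee_A": (1.0, 1.7, 2.0), "trainee_B": (-1.5, 1.6, 2.5)}

for name, head in observers.items():
    # A single fixed projector would reuse one view matrix for everyone;
    # here each tracked observer gets their own.
    view = look_at(head, virtual_human_eyes)
    print(name, "view matrix:\n", np.round(view, 2))
```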

Collaboration


Dive into Thai Phan's collaborations.

Top Co-Authors

Mark T. Bolas, University of Southern California
Sin-Hwa Kang, University of Southern California
Chien-Yen Chang, University of Southern California
Evan A. Suma, University of Southern California
Perry Hoberman, University of Southern California
Nora Ayanian, University of Southern California
Wolfgang Hönig, University of Southern California