
Publication


Featured research published by Alexandra Kitson.


Symposium on Spatial User Interaction | 2015

NaviChair: Evaluating an Embodied Interface Using a Pointing Task to Navigate Virtual Reality

Alexandra Kitson; Bernhard E. Riecke; Abraham M. Hashemian; Carman Neustaedter

This research aims to investigate if using a more embodied interface that includes motion cueing can facilitate spatial updating compared to a more traditional non-embodied interface. The ultimate goal is to create a simple, elegant, and effective self-motion control interface. Using a pointing task, we quantify spatial updating in terms of mean pointing error to determine how two modes of locomotion compare: user powered motion cueing (use your body to swivel and tilt a joystick-like interface) and no-motion cueing (traditional joystick). Because the user-powered chair is a more embodied interface providing some minimal motion cueing, we hypothesized it should more effectively support spatial updating and, thus, increase task performance. Results showed, however, the user-powered chair did not significantly improve mean pointing performance in a virtual spatial orientation task (i.e., knowing where users are looking in the VE). Exit interviews revealed the control mechanism for the user-powered chair was not as accurate or easy to use as the joystick, although many felt more immersed. We discuss how user feedback can guide the design of more effective user-powered motion cueing to overcome usability issues and realize benefits of motion cueing.
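The pointing-error metric used in this study can be made concrete. Below is a minimal Python sketch (function names and trial data are hypothetical, not taken from the paper) of computing mean absolute pointing error from pointed versus true origin bearings, using the smallest signed angular difference between two directions:

```python
def angular_error(pointed_deg, true_deg):
    """Smallest signed difference between two bearings, in degrees, in [-180, 180)."""
    return (pointed_deg - true_deg + 180.0) % 360.0 - 180.0

def mean_absolute_pointing_error(trials):
    """Mean absolute pointing error over (pointed, true) bearing pairs."""
    return sum(abs(angular_error(p, t)) for p, t in trials) / len(trials)

# Hypothetical trial data: (direction pointed, true direction to origin), in degrees.
trials = [(10.0, 0.0), (-20.0, 0.0), (170.0, -170.0)]
print(mean_absolute_pointing_error(trials))  # errors 10, 20, 20 -> mean ~16.67
```

The wrap-around handling matters: pointing at 170° when the origin is at -170° is only a 20° error, not 340°, so a naive subtraction would inflate the metric.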


Symposium on Spatial User Interaction | 2015

Upper Body Leaning Can Affect Forward Self-Motion Perception in Virtual Environments

Ernst Kruijff; Bernhard E. Riecke; Christina Trekowski; Alexandra Kitson

The study of locomotion in virtual environments is a diverse and rewarding research area. Yet, creating effective and intuitive locomotion techniques is challenging, especially when users cannot move around freely. While using handheld input devices for navigation may often be good enough, it does not match our natural experience of motion in the real world. Frequently, there are strong arguments for supporting body-centered self-motion cues as they may improve orientation and spatial judgments, and reduce motion sickness. Yet, how these cues can be introduced while the user is not moving around physically is not well understood. Actuated solutions such as motion platforms can be an option, but they are expensive and difficult to maintain. Alternatively, within this article we focus on the effect of upper-body tilt while users are seated, as previous work has indicated positive effects on self-motion perception. We report on two studies that investigated the effects of static and dynamic upper body leaning on perceived distances traveled and self-motion perception (vection). Static leaning (i.e., keeping a constant forward torso inclination) had a positive effect on self-motion, while dynamic torso leaning showed mixed results. We discuss these results and identify further steps necessary to design improved embodied locomotion control techniques that do not require actuated motion platforms.


Symposium on 3D User Interfaces | 2017

Comparing leaning-based motion cueing interfaces for virtual reality locomotion

Alexandra Kitson; Abraham M. Hashemian; Ekaterina R. Stepanova; Ernst Kruijff; Bernhard E. Riecke

In this paper, we describe a user study comparing five different locomotion interfaces for virtual reality locomotion. We compared a standard non-motion cueing interface, Joystick (Xbox), with four motion cueing interfaces, NaviChair (stool with springs), MuvMan (sit/stand active stool), Head-Directed (Oculus Rift DK2), and Swivel Chair (everyday office chair with leaning capability). Each interface had two degrees of freedom to move forward/backward and rotate using velocity (rate) control. The aim of this mixed methods study was to better understand relevant user experience factors and guide the design of future locomotion interfaces. This study employed methods from HCI to provide an understanding of why users behave a certain way while using the interface and to unearth any new issues with the design. Participants were tasked to search for objects in a virtual city while they provided talk-aloud feedback and we logged their behaviour. Subsequently, they completed a post-experimental questionnaire on their experience. We found that the qualitative themes of control, usability, and experience echoed the results of the questionnaire, providing internal validity. The quantitative measures revealed the Joystick to be significantly more comfortable and precise than the motion cueing interfaces. However, the qualitative feedback and interviews showed this was due to the reduced perceived controllability and safety of the motion cueing interfaces. Designers of these interfaces should consider using a backrest if users need to lean backwards and avoid using velocity-control for rotations when using HMDs.
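The velocity (rate) control described here maps a body deflection to a movement rate rather than a position. A minimal sketch of one common transfer function, assuming a deadzone plus linear scaling with saturation (the parameter values and function name are illustrative, not the study's actual implementation):

```python
import math

def rate_control(lean_deg, deadzone_deg=2.0, max_lean_deg=15.0, max_speed_mps=3.0):
    """Map a torso-lean angle to a movement rate (velocity control).

    Leans inside the deadzone produce no motion; beyond it, speed scales
    linearly with lean and saturates at max_speed_mps once the lean reaches
    max_lean_deg. The sign of the lean sets the movement direction.
    """
    magnitude = abs(lean_deg)
    if magnitude <= deadzone_deg:
        return 0.0
    fraction = min((magnitude - deadzone_deg) / (max_lean_deg - deadzone_deg), 1.0)
    return math.copysign(fraction * max_speed_mps, lean_deg)

# A slight lean does nothing; a full lean gives maximum forward speed.
print(rate_control(1.0), rate_control(15.0), rate_control(-20.0))  # 0.0 3.0 -3.0
```

The deadzone is what keeps small postural sway from producing unwanted motion, one of the controllability issues the study's participants reported with the leaning-based chairs.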


International Conference on Human-Computer Interaction | 2017

Gathering and Applying Guidelines for Mobile Robot Design for Urban Search and Rescue Application

Ekaterina R. Stepanova; Markus Heyde; Alexandra Kitson; Thecla Schiphorst; Bernhard E. Riecke

Robotics technology can assist Urban Search and Rescue (USAR) by enabling the exploration of environments that are inaccessible or unsafe for a human team [1]. This creates a need for a better understanding of USAR procedures and their specific requirements in order to guide the design of robotics technology that will be accepted by USAR professionals. The current paper explores the specific requirements for such assistive technology and extracts design guidelines for the development of robotic technology to be used during USAR operations. The design guidelines are derived both from a literature review and from a qualitative study performed with the Vancouver Heavy Urban Search and Rescue Task Force (HUSAR), focusing on usage scenarios and specific requirements for communication, control, and user experience. The study revealed that the most crucial factors for the design of the robot are speed, robustness, reliability, weight, affordability, and adaptability to different environments and tasks, as well as the ability to provide two-way audio/video communication. For the interface, the most important characteristics are its learnability, immersiveness, and ability to afford a high sense of spatial presence. We further discuss how the above requirements were implemented through a case study of the development of the “TeleSpider” (a tele-operated hexapod walking robot), and assess its effectiveness during field testing at the Vancouver HUSAR warehouse. Failing to meet a number of the discussed requirements will likely result in the technology being rejected by the USAR team and never being used during actual deployments, as has happened with a number of existing technologies.


IEEE Virtual Reality Conference | 2017

Lean into it: Exploring leaning-based motion cueing interfaces for virtual reality movement

Alexandra Kitson; Abraham M. Hashemian; Ekaterina R. Stepanova; Ernst Kruijff; Bernhard E. Riecke

We describe here a pilot user study comparing five different locomotion interfaces for virtual reality (VR) locomotion. We compared a standard non-motion cueing interface, Joystick, with four leaning-based seated motion-cueing interfaces: NaviChair, MuvMan, Head-Directed and Swivel Chair. The aim of this mixed methods study was to investigate the usability and user experience of each interface, in order to better understand relevant factors and guide the design of future ground-based VR locomotion interfaces. We asked participants to give talk-aloud feedback and simultaneously recorded their responses while they were performing a search task in VR. Afterwards, participants completed an online questionnaire. Although the Joystick was rated as more comfortable and precise than the other interfaces, the leaning-based interfaces showed a trend to provide more enjoyment and a greater sense of self-motion. There were also potential issues of using velocity-control for rotations in leaning-based interfaces when using HMDs instead of stationary displays. Developers need to focus on improving the controllability and perceived safety of these seated motion cueing interfaces.


IEEE Virtual Reality Conference | 2017

Development and evaluation of a hands-free motion cueing interface for ground-based navigation

Jacob Freiberg; Alexandra Kitson; Bernhard E. Riecke

With affordable, high-performance VR displays becoming commonplace, users are increasingly aware of the need for well-designed locomotion interfaces that support these displays. After considering the needs of users, we quantitatively evaluated an embodied locomotion interface called the NaviChair according to usability needs and fulfillment of system requirements. Specifically, we investigated the influences of locomotion interface (joystick vs. an embodied motion cueing chair) and display type (HMD vs. projection screen) on a spatial updating pointing task. Our findings indicate that our embodied VR locomotion interface provided users with an immersive experience of a space without requiring a significant investment of setup time.


Frontiers in Behavioral Neuroscience | 2016

Influence of Ethnicity, Gender and Answering Mode on a Virtual Point-to-Origin Task

Alexandra Kitson; Daniel Sproll; Bernhard E. Riecke

In a virtual point-to-origin task, participants show different response patterns and underlying orientation strategies, such as “turner” and “non-turner” response patterns. Turners respond as if succeeding in updating simulated heading changes, and non-turners respond as if failing to update their heading, resulting in left-right hemisphere errors. We present two further response patterns, “non-movers” and “spinners,” which also appear to result from failures to update heading. We had three specific goals: (1) extend previous findings of higher turner rates with a spatial-language response mode, using a point-to-origin task instead of a triangle completion task; (2) replicate the gender effect of males being more likely to respond as turners; (3) examine the influence of ethnicity. Designed as a classroom study, we presented participants (N = 498) with four passages through a virtual star field. Participants selected the direction pointing to the origin from four multiple-choice items. The response mode was either pictograms or written language, chosen to compare with similar studies and to test whether response mode affects virtual orientation behavior. Results show a plurality of participants (48.35%) classified as non-turners, 32.93% as turners, 15.57% as non-movers, and 3.14% as spinners. A multinomial regression model reached 49% classification performance. Written spatial language, compared to pictograms, made turner response patterns more likely; this effect was more pronounced for Chinese participants and among females, but not for male Caucasians. Moreover, the higher turner numbers for written spatial language extend Avraamides' findings of higher turner numbers when participants turned their bodies toward the origin, but not when they responded verbally. Using a pictorial response mode (i.e., a top-down picture of a head) may have increased cognitive load because it could be considered more embodied; it remains to be seen how to reduce the reference-frame conflict that may have caused this increased load. Second, our results are inconsistent with previous research in that males overall did not show more turner behavior than females; future research may look at possible underlying factors, such as cultural norms. Third, individualistic cultures (Caucasians; Greif, 1994) lean toward turner response patterns, whereas collectivist cultures (Asian) lean toward non-turner response patterns.
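One simple way to assign a participant to a single response pattern, assuming each of the four trials has already been matched to the strategy its selected direction is consistent with (a hypothetical preprocessing step, not the paper's multinomial regression model), is a per-participant majority vote:

```python
from collections import Counter

def classify_pattern(trial_labels):
    """Assign the response pattern ('turner', 'non_turner', 'non_mover',
    or 'spinner') that accounts for the most of a participant's trials."""
    return Counter(trial_labels).most_common(1)[0][0]

# Hypothetical participant: three of four passages answered as a turner.
print(classify_pattern(['turner', 'turner', 'non_turner', 'turner']))  # turner
```

A regression model, as used in the study, additionally lets predictors such as response mode, gender, and ethnicity explain the pattern; this sketch only illustrates the labeling step.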


2016 IEEE International Workshop on Mixed Reality Art (MRA) | 2016

Exploring embodied experience of flying in a virtual reality game with kinect

Xin Tong; Alexandra Kitson; Mahsoo Salimi; Dave Fracchia; Diane Gromala; Bernhard E. Riecke

Immersive Virtual Reality (VR) as a research tool provides numerous opportunities to do and see things in a virtual world that are not possible in the real world. Being able to fly is an experience humans have long dreamed of achieving. In this paper, we introduce a VR game in which participants use their body gestures as a Natural User Interface (NUI) to control flying movements via a Microsoft Kinect. The goal of this research is to explore the navigational experience of flying via body gestures: what people like to do, what they want to be, and, most importantly, how easily they map their gestures to navigation control in a VR environment.


Proceedings of the 2nd International Workshop on Movement and Computing | 2015

Influence of movement expertise on a virtual point-to-origin task

Alexandra Kitson; Bernhard E. Riecke; Ekaterina R. Stepanova


IEEE Virtual Reality Conference | 2018

Investigating a Sparse Peripheral Display in a Head-Mounted Display for VR Locomotion

Abraham M. Hashemian; Alexandra Kitson; Thinh Nguyen-Vo; Hrvoje Benko; Wolfgang Stuerzlinger; Bernhard E. Riecke

Collaboration


Dive into Alexandra Kitson's collaborations.

Top Co-Authors

Ernst Kruijff

Bonn-Rhein-Sieg University of Applied Sciences
