Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Ahmad Byagowi is active.

Publication


Featured research published by Ahmad Byagowi.


Symposium on Spatial User Interaction | 2016

Combining Ring Input with Hand Tracking for Precise, Natural Interaction with Spatial Analytic Interfaces

Barrett Ens; Ahmad Byagowi; Teng Han; Juan David Hincapié-Ramos; Pourang Irani

Current wearable interfaces are designed to support short-duration tasks known as micro-interactions. To support productive interfaces for everyday analytic tasks, designers can leverage natural input methods such as direct manipulation and pointing. Such natural methods are now available in virtual, mobile environments thanks to miniature depth cameras mounted on head-worn displays (HWDs). However, these techniques have drawbacks, such as fatigue and limited precision. To overcome these limitations, we explore combined input: hand tracking data from a head-mounted depth camera, and input from a small ring device. We demonstrate how a variety of input techniques can be implemented using this novel combination of devices. We harness these techniques for use with Spatial Analytic Interfaces: multi-application, spatial UIs for in-situ, analytic taskwork on wearable devices. This research demonstrates that combined input from multiple wearable devices holds promise for high-precision, low-fatigue interaction techniques supporting Spatial Analytic Interfaces on HWDs.


Journal of Experimental Neuroscience | 2014

Design and Application of a Novel Virtual Reality Navigational Technology (VRNChair)

Ahmad Byagowi; Danyal Mohaddes; Zahra Moussavi

This paper presents a novel virtual reality navigation (VRN) input device, called the VRNChair, offering an intuitive and natural way to interact with virtual reality (VR) environments. Traditionally, VR navigation tests are performed using stationary input devices such as keyboards or joysticks. However, in immersive VR experiments, such as our recent VRN assessment, the user may experience kinetosis (motion sickness) as a result of the disagreement between the vestibular response and the optical flow. In addition, prior experience with a joystick or other computer input devices may bias the accuracy of participant performance in VR experiments. Therefore, we have designed a VR navigational environment that is operated using a manual wheelchair (the VRNChair). The VRNChair translates the movement of a manual wheelchair into input for any VR environment. We evaluated the VRNChair with 34 young individuals in two groups performing the same navigational task with either the VRNChair or a joystick; one older individual (55 years) also performed the same experiment with both devices. The results indicate that the VRNChair does not change the accuracy of performance, thus removing the plausible bias of prior joystick experience. More importantly, it significantly reduces kinetosis. While we developed the VRNChair for our spatial cognition study, it can also be applied in many other areas, including neuroscience, neurorehabilitation, physiotherapy, and the gaming industry.


Canadian Conference on Electrical and Computer Engineering | 2013

Implementation of a nanosatellite attitude determination and control system for the T-Sat1 mission

Brady Russell; Lee Clement; Joshua Hernandez; Ahmad Byagowi; Dario Schor; Witold Kinsner

The design of attitude determination and control systems for nanosatellites requires innovative solutions that are low-cost, small, and power-efficient. The University of Manitoba T-Sat1 project has created a simple complement of sensors and actuators that can provide stable attitude determination and orientation. The linear region of photodiode readings was combined with magnetometer readings in a sensor-fusion attitude determination algorithm. The actuators were custom-built torque discs on printed circuit boards that proved easy to mount and generate fields of 25 nT, which are sufficient to orient the satellite while in orbit. The torque disc manufacturing process is described with details on the imperfections and lessons learned from the prototypes. This paper describes the hardware and software design and implementation for the attitude determination and control of the T-Sat1 nanosatellite.
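The abstract does not give the torque model behind the torque-disc actuators, but the standard physics is that a current loop acts as a magnetic dipole and feels a torque tau = m x B from the ambient field. A minimal illustrative sketch; the dipole moment and field values below are hypothetical, not taken from the paper:

```python
# Illustrative only: torque on a magnetic dipole, tau = m x B.
# Values below are hypothetical, not from the T-Sat1 paper.

def cross(a, b):
    """Cross product of two 3-vectors given as (x, y, z) tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def magnetorquer_torque(m, B):
    """Torque (N*m) on a dipole moment m (A*m^2) in a magnetic field B (T)."""
    return cross(m, B)

# A 1 A*m^2 dipole perpendicular to a 30 uT field (roughly Earth's field in LEO):
tau = magnetorquer_torque((1.0, 0.0, 0.0), (0.0, 3e-5, 0.0))
```

Because the torque is a cross product, the actuator can only torque the satellite about axes perpendicular to the local field, which is why such designs pair coils on multiple faces.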


12th FIRA RoboWorld Congress on Progress in Robotics | 2009

Teen sized humanoid robot: Archie

Jacky Baltes; Ahmad Byagowi; John Anderson; Peter Kopacek

This paper describes our first teen sized humanoid robot, Archie. This robot has been developed in conjunction with Prof. Kopacek’s lab at the Technical University of Vienna. Archie uses brushless motors and harmonic gears with a novel approach to position encoding. Based on our previous experience with small humanoid robots, we developed software to create, store, and play back motions, as well as control methods which automatically balance the robot using feedback from an inertial measurement unit (IMU).


International IEEE/EMBS Conference on Neural Engineering | 2013

The perceived orientation in people with and without Alzheimer's

D. Zen; Ahmad Byagowi; M. Garcia; Debbie M. Kelly; Brian Lithgow; Zahra Moussavi

Accurate spatial perception is important for the successful performance of any motor task. Research suggests that this capability declines significantly in patients with Alzheimer's disease. Using a virtual reality navigational test, this study investigates orientation ability. In particular, we hypothesize that the ability to orient using egocentric information declines significantly with Alzheimer's even at early stages, whereas general orientation abilities may be preserved through the use of allocentric orientation strategies in mild Alzheimer's patients. The study subjects were 11 cognitively healthy individuals and 8 patients with mild to moderate Alzheimer's. The results are congruent with our hypothesis and encourage further investigation in a larger population.


International Symposium on Safety, Security, and Rescue Robotics | 2012

Bluetooth as a victim detection sensor

Ahmad Byagowi; Siavash Malektaji; Robert D. McLeod

This work investigates the promise of the Bluetooth wireless communication protocol as a victim detection sensor. Because cellphones are carried by people, rescue robots can track Bluetooth devices alongside other victim detection sensors (e.g., heat, CO2, voice, visual motion detectors). The Bluetooth protocol provides the received signal strength (RSS) as an indication of the distance between the two devices in communication. Therefore, the RSS can be used to estimate the distance between the robot and any Bluetooth device carried by a victim. A rescue robot can then check other victim sensory data to confirm the presence of a victim in the vicinity of the detected Bluetooth device.
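The abstract does not say how RSS is mapped to distance; a minimal sketch assuming the standard log-distance path-loss model, where the reference RSS at 1 m and the path-loss exponent below are illustrative placeholders, not values from the paper:

```python
# Illustrative only: log-distance path-loss model for RSS -> distance.
# rss_at_1m and path_loss_exponent are placeholder values, not from the paper.

def estimate_distance(rss_dbm, rss_at_1m=-59.0, path_loss_exponent=2.0):
    """Estimate distance in meters from a received signal strength in dBm."""
    return 10 ** ((rss_at_1m - rss_dbm) / (10 * path_loss_exponent))

# Every 10*n dB of extra loss corresponds to a 10x increase in distance:
d_near = estimate_distance(-59.0)  # ~1 m
d_far = estimate_distance(-79.0)   # ~10 m
```

In practice both parameters must be calibrated per device and environment, and multipath effects in a collapsed structure make any such estimate coarse, which is why the paper treats RSS only as a cue for triggering other victim sensors.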


International Conference of the IEEE Engineering in Medicine and Biology Society | 2017

Virtual reality body motion induced navigational controllers and their effects on simulator sickness and pathfinding

Cassandra N. Aldaba; Paul White; Ahmad Byagowi; Zahra Moussavi

Virtual reality (VR) navigation is usually constrained by plausible simulator sickness (SS) and by the intuitiveness of user interaction. This paper reports on the use of four navigational VR controllers with different degrees of induced body motion, a TiltChair, an omni-directional treadmill, a manual wheelchair joystick (VRNChair), and a joystick, in relation to a participant's SS occurrence and each controller's intuitive utilization. Twenty young adult participants used all controllers to navigate the same VR task environment in separate sessions. Throughout the sessions, SS occurrence was measured by the severity score of a standard SS questionnaire and by body sway, quantified as the center-of-pressure path length with eyes open and closed. SS occurrence did not significantly differ among the controllers. However, time spent in VR significantly contributed to SS occurrence; hence, a few breaks should be interjected throughout a VR task to minimize SS. For all task trials, we recorded the participants' travel trajectories to investigate each controller's intuitive utilization from the computed traversed distance. Shorter traversed distances indicated that participants intuitively utilized the TiltChair with a slower speed, while longer traversed distances indicated that participants struggled to utilize the omni-directional treadmill with its unnatural stimulation of gait. Therefore, VR navigation should use technologies best suited to the intended age group that minimize SS and produce intuitive interactions for the participants.
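The body-sway measure used above, center-of-pressure (COP) path length, is simply the total distance traveled by the COP across consecutive samples. A minimal sketch; the sample trace is hypothetical:

```python
import math

def cop_path_length(samples):
    """Total center-of-pressure path length: summed Euclidean distance
    between consecutive (x, y) force-plate samples."""
    return sum(math.dist(a, b) for a, b in zip(samples, samples[1:]))

# Hypothetical 3-sample COP trace; the repeated last point adds no length:
length = cop_path_length([(0.0, 0.0), (3.0, 4.0), (3.0, 4.0)])  # 5.0
```

A longer path length over a fixed recording window indicates more postural sway, which is why it serves as an objective complement to the questionnaire score.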


Nordic Conference on Human-Computer Interaction | 2016

Exploring Design Factors for Transforming Passive Vibration Signals into Smartwear Interactions

Teng Han; David Ahlström; Xing-Dong Yang; Ahmad Byagowi; Pourang Irani

Vibrational signals generated when a finger is swept over an uneven surface can be reliably detected via low-cost sensors in proximity to the interaction surface. Such interactions provide an alternative to touchscreens by enabling always-available input. In this paper we demonstrate that inertial measurement units (IMUs) embedded in much off-the-shelf smartwear are well suited for capturing the vibrational signals generated by a user's finger swipes, even when the IMU sits in a smartring or smartwatch. In comparison to acoustic-based approaches for capturing vibrational signals, IMUs are sensitive to many factors related to both surface and swipe properties when the interaction is carried out. We contribute by examining the impact of these surface and swipe properties, including surface or bump height and density, surface stability, sensor location, swipe style, and swipe direction. Based on our results, we present a number of usage scenarios to demonstrate how this approach can provide always-available input for digital interactions.
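The abstract does not describe the paper's detection pipeline. As a loose illustrative stand-in, a swipe over a bumpy surface appears as short bursts of high-frequency energy in the accelerometer stream, which even a simple windowed-variance threshold can flag; the window size, threshold, and signal below are arbitrary placeholders:

```python
# Illustrative only: flag accelerometer windows whose variance exceeds a
# threshold, a toy stand-in for vibration-based swipe detection.
# Window size and threshold are arbitrary placeholders.

def detect_swipe_windows(accel, window=8, threshold=0.5):
    """Return start indices of non-overlapping windows with high vibration energy."""
    hits = []
    for i in range(0, len(accel) - window + 1, window):
        w = accel[i:i + window]
        mean = sum(w) / window
        variance = sum((x - mean) ** 2 for x in w) / window
        if variance > threshold:
            hits.append(i)
    return hits

# Flat signal followed by a vibration burst:
events = detect_swipe_windows([0.0] * 16 + [2.0, -2.0] * 4)  # -> [16]
```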


International Conference on Computer Graphics and Interactive Techniques | 2016

Haptic wheelchair

Mike Lambeta; Matt Dridger; Paul White; Jesslyn Janssen; Ahmad Byagowi

Virtual reality aims to provide an immersive experience to a user with the help of a virtual environment. This immersive experience requires two key components: one for capturing inputs from the real world, and one for synthesizing real-world outputs based on interactions with the virtual environment. However, a user in a real-world environment experiences a richer set of feedback, relating directly to auditory, visual, and force feedback. As such, in a virtual environment, a dissociation is introduced between the user's inputs and the feedback from the virtual environment. This dissociation contributes to the discomfort the user experiences compared with real-world interaction. Our team has introduced a novel way of receiving synthesized feedback from the virtual environment through the use of a haptic wheelchair.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2012

Design of a Virtual Reality Navigational (VRN) experiment for assessment of egocentric spatial cognition

Ahmad Byagowi; Zahra Moussavi

Collaboration


Dive into Ahmad Byagowi's collaboration.

Top Co-Authors

Paul White

University of Manitoba

Dario Schor

University of Manitoba
