Teddy Seyed
University of Calgary
Publication
Featured research published by Teddy Seyed.
interactive tabletops and surfaces | 2012
Teddy Seyed; Chris Burns; Mario Costa Sousa; Frank Maurer; Anthony Tang
Multi-display environments (MDEs) have advanced rapidly in recent years, incorporating multi-touch tabletops, tablets, wall displays and even position tracking systems. Designers have proposed a variety of interesting gestures for use in an MDE, some of which involve a user moving their hands, arms, body or even a device itself. These gestures are often used as part of interactions to move data between the various components of an MDE, which is a longstanding research problem. But designers, not users, have created most of these gestures, and concerns over implementation issues such as recognition may have influenced their design. We performed a user study to elicit these gestures directly from users, but found a low level of convergence among the gestures produced. This lack of agreement is important, and we discuss its possible causes and the implications it has for designers. To assist designers, we present the most prevalent gestures and some of the underlying conceptual themes behind them. We also provide an analysis of how certain factors such as distance and device type impact the choice of gestures and discuss how to apply them to real-world systems.
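The low convergence reported above is the kind of result elicitation studies usually quantify with an agreement score computed per referent. The abstract does not state the exact measure used; a minimal Python sketch of the classic squared-proportions agreement score (the gesture labels in the example are hypothetical) might look like this:

from collections import Counter

def agreement_score(proposals):
    # Agreement for one referent: sum of squared proportions of identical
    # gesture proposals. `proposals` holds one gesture label per participant.
    n = len(proposals)
    if n == 0:
        return 0.0
    counts = Counter(proposals)  # group identical proposals
    return sum((c / n) ** 2 for c in counts.values())

# Hypothetical data for the referent "move photo to wall display"
elicited = ["flick_toward_display", "flick_toward_display", "carry_device",
            "pinch_and_release", "flick_toward_display"]
print(agreement_score(elicited))  # values near 1/n indicate weak consensus

Low scores across many referents would correspond to the lack of agreement discussed in the paper.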
human factors in computing systems | 2016
Edwin Chan; Teddy Seyed; Wolfgang Stuerzlinger; Xing-Dong Yang; Frank Maurer
Gestural interaction has become increasingly popular, as enabling technologies continue to transition from research to retail. The mobility of miniaturized (and invisible) technologies introduces new uses for gesture recognition. This paper investigates single-hand microgestures (SHMGs), detailed gestures in a small interaction space. SHMGs are suitable for the mobile and discreet nature of interactions in ubiquitous computing. However, there has been a lack of end-user input in the design of such gestures. We performed a user-elicitation study with 16 participants to determine their preferred gestures for a set of referents. We contribute an analysis of 1,632 gestures, the resulting gesture set, and prevalent conceptual themes amongst the elicited gestures. These themes provide a set of guidelines for gesture designers, while informing the designs of future studies. With the increase in hand-tracking and electronic devices in our surroundings, we see this as a starting point for designing gestures suitable to portable ubiquitous computing.
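A gesture set like the one mentioned above is typically derived by keeping, for each referent, the gesture proposed most often. A small illustrative sketch, assuming elicited gestures have already been transcribed into comparable labels (the referents and labels below are hypothetical, not taken from the paper):

from collections import Counter

# Hypothetical elicited data: referent -> gesture labels proposed by participants
elicited = {
    "answer_call":   ["thumb_to_index_tap", "thumb_to_index_tap", "index_swipe"],
    "dismiss_alert": ["index_swipe", "index_swipe", "thumb_flick"],
    "zoom_in":       ["thumb_index_spread", "thumb_index_spread", "thumb_slide"],
}

def consensus_set(data):
    # For each referent, keep the most frequently proposed gesture.
    return {referent: Counter(gestures).most_common(1)[0][0]
            for referent, gestures in data.items()}

print(consensus_set(elicited))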
interactive tabletops and surfaces | 2013
Teddy Seyed; Mario Costa Sousa; Frank Maurer; Anthony Tang
The process of oil and gas exploration and its result, the decision to drill for oil in a specific location, relies on a number of distinct but related domains. These domains require effective collaboration to come to a decision that is cost-effective and maintains the integrity of the environment. As we show in this paper, many of the existing technologies and practices that support the oil and gas exploration process overlook fundamental user issues such as collaboration, interaction and visualization. The work presented in this paper is based upon a design process that involved expert users from an oil and gas exploration firm in Calgary, Alberta, Canada. We briefly present knowledge of the domain and how it informed the design of SkyHunter, a prototype multi-surface environment to support oil and gas exploration. This paper highlights our current prototype, and we conclude with a reflection on multi-surface interactions and environments in this domain.
interactive tabletops and surfaces | 2014
Apoorve Chokshi; Teddy Seyed; Francisco Marinho Rodrigues; Frank Maurer
Emergency response planning is a process that involves many different stakeholders who may communicate concurrently through different channels and exchange different information artefacts. The planning typically occurs in an emergency operations centre (EOC) and involves personnel both in the room and in the field. The EOC provides an interesting context for examining the use of tablets, tabletops and high-resolution wall displays, and their role in facilitating information and communication exchange in an emergency response planning scenario. In collaboration with a military and emergency response simulation software company in Calgary, Alberta, Canada, we developed ePlan Multi-Surface, a multi-surface environment for communication and collaboration in emergency response planning exercises. In this paper, we describe the domain, how it informed our prototype, and insights on collaboration, interactions and information dissemination in multi-surface environments for EOCs.
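The abstract does not detail how artefacts move between devices; purely as an illustration, a multi-surface environment of this kind is often built around a small shared message schema that any surface can publish and consume. A hypothetical sketch (the field names and device identifiers are invented for the example):

import json, time, uuid
from dataclasses import dataclass, asdict

@dataclass
class ArtefactMessage:
    # Hypothetical envelope for sending an artefact (e.g. a map annotation)
    # from one surface to another in the EOC environment.
    artefact_id: str
    artefact_type: str   # e.g. "map_annotation", "incident_report"
    source_device: str   # e.g. "tablet_07"
    target_device: str   # e.g. "wall_display"
    payload: dict
    timestamp: float

msg = ArtefactMessage(
    artefact_id=str(uuid.uuid4()),
    artefact_type="map_annotation",
    source_device="tablet_07",
    target_device="wall_display",
    payload={"lat": 51.05, "lon": -114.07, "note": "staging area"},
    timestamp=time.time(),
)
print(json.dumps(asdict(msg)))  # broadcast over whatever transport the environment uses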
engineering interactive computing systems | 2011
Ali Hosseini-Khayat; Teddy Seyed; Chris Burns; Frank Maurer
Touch-based devices are becoming increasingly common in the consumer electronics space. Support for prototyping touch-based interfaces is currently limited. In this paper, we present a tool we developed in order to bridge the gap between user interface prototyping and touch-based interfaces.
human factors in computing systems | 2016
Teddy Seyed; Xing-Dong Yang; Daniel Vogel
Doppio is a reconfigurable smartwatch with two touch-sensitive display faces. The orientation of the top relative to the base, and how the top is attached to the base, create a very large interaction space. We define and enumerate possible configurations, transitions, and manipulations in this space. Using a passive prototype, we conduct an exploratory study to probe how people might use this style of smartwatch interaction. With an instrumented prototype, we conduct a controlled experiment to evaluate the transition times between configurations and subjective preferences. We use the combined results of these two studies to generate a set of characteristics and design considerations for applying this interaction space to smartwatch applications. These considerations are illustrated with a proof-of-concept hardware prototype demonstrating how Doppio interactions can be used for notifications, private viewing, task switching, temporary information access, application launching, application modes, input, and sharing the top.
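As a loose illustration of how a configuration space like Doppio's can be enumerated, the sketch below crosses a set of attachment positions with face orientations; the specific dimensions are assumptions for the example, not the paper's actual taxonomy:

from itertools import product

# Hypothetical dimensions of the two-faced smartwatch configuration space
attachments = ["stacked_on_base", "hinged_left", "hinged_right",
               "hinged_top", "hinged_bottom", "detached"]
orientations = ["face_up", "face_down"]

configurations = list(product(attachments, orientations))
transitions = [(a, b) for a, b in product(configurations, repeat=2) if a != b]

print(len(configurations), "configurations,", len(transitions), "possible transitions")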
interactive tabletops and surfaces | 2014
Francisco Marinho Rodrigues; Teddy Seyed; Frank Maurer; Sheelagh Carpendale
Nowadays, looking up the path between two points on a city map is a simple task with any modern tablet, smartphone or laptop. However, when exploring maps with different information across multiple layers and scales, users experience information discontinuity. Bancada is a multi-display system developed to investigate the exploration of geospatial information using multiple mobile devices in a multi-display environment. In Bancada, tablets are Zoomable Magic Lenses that augment, through specific geospatial layers, an overview map displayed on a tabletop or on a wall display. Users interact with lenses using touch gestures to pan and zoom, and multi-layer maps can be built by overlapping different lenses. Currently, Bancada is being used to research user interfaces separated across multiple devices and interactions with high-resolution mobile devices. Future work with Bancada includes (i) evaluating user performance when using one tablet or multiple tablets to control all lenses; (ii) exploring what interactions can be performed on an overview map and how; and (iii) exploring how lenses can be changed.
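A minimal sketch of the overlapping-lens idea, in which every lens whose region contains a point contributes its geospatial layer at that point (the lens geometry and layer names here are hypothetical):

from dataclasses import dataclass

@dataclass
class Lens:
    # A zoomable magic lens: a rectangular region on the overview map
    # plus the geospatial layer it reveals.
    x: float
    y: float
    w: float
    h: float
    layer: str  # e.g. "population", "transit", "terrain"

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def layers_at(lenses, px, py):
    # Layers composed at a point: every overlapping lens contributes one layer.
    return [lens.layer for lens in lenses if lens.contains(px, py)]

lenses = [Lens(0, 0, 40, 40, "transit"), Lens(20, 20, 40, 40, "population")]
print(layers_at(lenses, 30, 30))  # the overlap region shows both layers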
intelligent user interfaces | 2013
Teddy Seyed; Chris Burns; Mario Costa Sousa; Frank Maurer
Devices such as tablets, mobile phones, tabletops and wall displays all incorporate different sizes of screens and are now commonplace in a variety of situations and environments. Environments that incorporate these devices, multi-display environments (MDEs), are highly interactive and innovative, but interaction in these environments is not well understood. The research presented here investigates and explores interaction and users in MDEs. This exploration seeks to understand users' conceptual models of MDEs and then to examine and validate interaction approaches that can make MDEs more usable. In addition to a brief literature review, the methodology, research goals and current research status are presented.
human factors in computing systems | 2014
Teddy Seyed; Francisco Marinho Rodrigues; Frank Maurer; Anthony Tang
3D volumetric medical images, such as MRIs, are commonly explored and interacted with by medical imaging experts using systems that require keyboard- and mouse-based techniques. These techniques present challenges for medical imaging specialists: 3D spatial navigation is difficult, and detailed selection and analysis of 3D medical images is hampered by depth perception and occlusion issues. In this work, we explore a potential solution to these challenges by using tangible interaction techniques with a mobile device to simplify 3D interactions for medical imaging specialists. We discuss preliminary observations from our design sessions with medical imaging specialists, argue that tangible 3D interactions using mobile devices are a viable solution for the medical imaging domain, and highlight that the domain plays an important role in the design of 3D interaction techniques.
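The abstract does not specify the exact mapping from device motion to the 3D scene; one plausible minimal version, shown below with NumPy, rotates the rendered volume by the device's orientation change relative to a "clutch" reference pose (function and variable names are illustrative):

import numpy as np

def quat_to_matrix(q):
    # Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix.
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def volume_rotation(device_quat, reference_quat):
    # Rotate the medical volume by the device's orientation change relative
    # to the pose captured when the user clutched (tangible rotation mapping).
    return quat_to_matrix(device_quat) @ quat_to_matrix(reference_quat).T

# Hypothetical sensor readings (unit quaternions, w-x-y-z order)
reference = np.array([1.0, 0.0, 0.0, 0.0])      # pose at clutch
current   = np.array([0.924, 0.0, 0.383, 0.0])  # roughly 45 degrees about y
print(volume_rotation(current, reference))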
symposium on 3d user interfaces | 2014
Teddy Seyed; Francisco Marinho Rodrigues; Frank Maurer; Anthony Tang
Medical imaging specialists have traditionally used keyboard- and mouse-based techniques and interfaces for examining both 2D and 3D medical images, but with newer imaging technologies producing significantly larger volumes of 3D medical images, these techniques have become increasingly cumbersome for imaging specialists. Mobile devices present an effective means of navigating and exploring complex 3D medical data sets, as they provide increased fluidity and flexibility and leverage people's existing skills with tangible objects. 3D interactions using mobile devices may benefit imaging specialists, but little is known about using these interactions in the medical imaging domain. In this paper, we explore the design of 3D interaction techniques using mobile devices; preliminary feedback from imaging specialists suggests that these interactions may be a viable solution for the medical imaging domain.
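Beyond rotation, a tangible device pose can also define a cutting plane through the volume, which is one way to address the occlusion issue raised above. A small sketch, assuming the plane's normal and anchor point are derived from the tablet's pose (the volume and plane here are synthetic):

import numpy as np

def clip_volume(volume, normal, point):
    # Zero out voxels on the positive side of a cutting plane so interior
    # structure behind the plane becomes visible.
    normal = np.asarray(normal, dtype=float)
    normal /= np.linalg.norm(normal)
    zz, yy, xx = np.indices(volume.shape)
    coords = np.stack([xx, yy, zz], axis=-1).astype(float)
    signed_dist = (coords - np.asarray(point, dtype=float)) @ normal
    clipped = volume.copy()
    clipped[signed_dist > 0] = 0.0  # voxels in front of the plane are removed
    return clipped

# Synthetic 64^3 scan and a plane positioned at the volume's centre
scan = np.random.rand(64, 64, 64)
visible = clip_volume(scan, normal=[0.0, 0.0, 1.0], point=[32.0, 32.0, 32.0])
print(visible.shape, int((visible == 0).sum()), "voxels clipped")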