Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Ramin Tadayon is active.

Publication


Featured research published by Ramin Tadayon.


International Conference on Universal Access in Human-Computer Interaction | 2014

Affective Haptics for Enhancing Access to Social Interactions for Individuals Who are Blind

Troy L. McDaniel; Shantanu Bala; Jacob Rosenthal; Ramin Tadayon; Arash Tadayon; Sethuraman Panchanathan

Non-verbal cues used during social interactions, such as facial expressions, are largely inaccessible to individuals who are blind. This work explores the use of affective haptics for communicating emotions displayed during social interactions. We introduce a novel haptic device, called the Haptic Face Display (HFD), consisting of a two-dimensional array of vibration motors capable of displaying rich spatiotemporal vibrotactile patterns presented through passive or active interaction styles. This work investigates users’ emotional responses to vibrotactile patterns using a passive interaction style in which the display is embedded in the back of an ergonomic chair. Such a technology could enhance social interactions for individuals who are blind: emotions of interaction partners, once recognized by a front-end system such as a computer vision algorithm, are conveyed through the HFD. We present the results of an experiment exploring the relationship between vibrotactile pattern design and elicited emotional response. Results indicate that pattern shape and duration, among other dimensions, influence emotional response, an important consideration when designing affective haptic technologies.
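The abstract does not describe the HFD's control software. Purely as an illustrative sketch (not taken from the paper), one way to represent a spatiotemporal vibrotactile pattern for a small motor grid is as a sequence of timed frames; the grid size, intensities, and names below are hypothetical.

import time

# Hypothetical 4x4 grid of vibration motors; each frame lists which motors
# fire, at what intensity (0.0-1.0), and for how long (seconds).
Pattern = list  # list of (duration, {(row, col): intensity}) frames

# A simple upward "sweep": one row of motors at a time, bottom to top.
sweep_up: Pattern = [
    (0.15, {(3, c): 0.8 for c in range(4)}),
    (0.15, {(2, c): 0.8 for c in range(4)}),
    (0.15, {(1, c): 0.8 for c in range(4)}),
    (0.15, {(0, c): 0.8 for c in range(4)}),
]

def play(pattern: Pattern) -> None:
    """Step through a pattern, printing the motor commands for each frame.

    A real driver would send these commands to the motor array instead.
    """
    for duration, frame in pattern:
        for (row, col), intensity in sorted(frame.items()):
            print(f"motor[{row}][{col}] -> intensity {intensity}")
        time.sleep(duration)
        print("--- frame end ---")

play(sweep_up)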


International Conference on Human-Computer Interaction | 2015

Interactive Motor Learning with the Autonomous Training Assistant: A Case Study

Ramin Tadayon; Troy L. McDaniel; Morris Goldberg; Pamela M. Robles-Franco; Jonathan Zia; Miles Laff; Mengjiao Geng; Sethuraman Panchanathan

At-home exercise programs have met with limited success in rehabilitation and training. A primary cause is the lack of a trainer’s presence in the home to provide feedback and guidance. To bring this guidance into the home, we have developed a model for representing motor learning tasks and training protocols. We designed a toolkit based on this model, the Autonomous Training Assistant, which uses avatar interaction and real-time multi-modal feedback to guide at-home exercise. As an initial case study, we evaluate a component of our system with a child with cerebral palsy and his martial arts trainer through three simple motion activities, demonstrating the effectiveness of the model in representing the trainer’s exercise program.


2014 IEEE International Symposium on Haptic, Audio and Visual Environments and Games (HAVE) Proceedings | 2014

A toolkit for motion authoring and motor skill learning in serious games

Ramin Tadayon; Sethuraman Panchanathan; Troy L. McDaniel; Bijan Fakhri; Miles Laff

Trainers and therapists provide critical support and guidance as individuals undergo long-term exercise programs. Unfortunately, without this guidance, individuals who pursue home therapy or training are unable to determine whether they are making meaningful progress, and, as a result, many give up before completing their programs. To address these issues, the authors propose a toolkit for the design of games that support motor skill learning and relearning, consisting of three main components: (1) an “intelligent stick” hardware interface, (2) motion authoring software for the design of new motion patterns, and (3) a framework for mapping elements of game design to skill learning. Prototypes for the first two components have been developed, and results from an initial usability study involving nine participants are presented and discussed.


International Journal of Semantic Computing | 2017

Enriching the fan experience in a smart stadium using Internet of Things technologies

Sethuraman Panchanathan; Shayok Chakraborty; Troy L. McDaniel; Ramin Tadayon; Bijan Fakhri; Noel E. O'Connor; Mark Marsden; Suzanne Little; Kevin McGuinness; David S. Monaghan

Rapid urbanization has brought about an influx of people to cities, tipping the scale between urban and rural living. Population predictions estimate that 64% of the global population will reside in cities by 2050. To meet the growing resource needs, improve management, reduce complexities, and eliminate unnecessary costs while enhancing the quality of life of citizens, cities are increasingly exploring open innovation frameworks and smart city initiatives that target priority areas including transportation, sustainability, and security. The size and heterogeneity of urban centers impede the progress of technological innovations for smart cities. We propose a Smart Stadium as a living laboratory to balance both size and heterogeneity, so that smart city solutions and Internet of Things (IoT) technologies may be deployed and tested within an environment small enough to trial practically but large and diverse enough to evaluate scalability and efficacy. The Smart Stadium for Smart Living initiative brings together multiple institutions and partners, including Arizona State University (ASU), Dublin City University (DCU), Intel Corporation, and the Gaelic Athletic Association (GAA), to turn ASU's Sun Devil Stadium and Ireland's Croke Park Stadium into twinned smart stadia for investigating IoT and smart city technologies and applications.


Conference on Computers and Accessibility | 2016

Autonomous Training Assistant: A System and Framework for Guided At-Home Motor Learning

Ramin Tadayon; Troy L. McDaniel; Sethuraman Panchanathan

We present a novel framework and system for at-home rehabilitative exercise in the absence of a physical therapist. The framework includes metrics for assessing motor performance on a wide variety of exercises. We present our system, the Autonomous Training Assistant, which utilizes this framework and a low-cost, accessible exercise device called the Intelligent Stick to deliver feedback as a user trains at home. We evaluated the system's multimodal feedback mechanism in a case study whose results indicate that individual preference may have a significant effect on modality assignment for optimal learning. We conclude with ideas for future work.


ACM Multimedia | 2016

A Multimodal Gamified Platform for Real-Time User Feedback in Sports Performance

David S. Monaghan; Freddie Honohan; Amin Ahmadi; Troy L. McDaniel; Ramin Tadayon; Ajay Karpur; Kieran Morran; Noel E. O'Connor; Sethuraman Panchanathan

In this paper we introduce a novel platform that utilises multi-modal, low-cost motion capture technology to deliver real-time visual feedback on sports performance. The platform can be extended to multi-modal interfaces that utilise haptic and audio feedback, scaling effectively with motor task complexity. We demonstrate an implementation of our platform within the field of sports performance. The platform includes low-cost motion capture through a fusion technique, combining a Microsoft Kinect V2 with two wrist inertial sensors (making use of their accelerometer and gyroscope sensors), alongside a game-based Graphical User Interface (GUI) for instruction, visual feedback and gamified score tracking.
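The fusion algorithm itself is not given in this abstract. As an illustrative sketch only, a complementary filter is one common way to blend a wrist sensor's gyroscope and accelerometer readings into a stable tilt estimate; all names and values below are hypothetical and not taken from the paper.

import math

def complementary_filter(gyro_rate, accel_x, accel_z, prev_angle, dt, alpha=0.98):
    """Fuse a gyroscope rate (deg/s) with an accelerometer tilt estimate (deg).

    Gyro integration is smooth but drifts over time; the accelerometer tilt is
    noisy but drift-free. Blending the two gives a stable angle estimate.
    """
    # Angle implied by integrating the gyroscope rate over this time step.
    gyro_angle = prev_angle + gyro_rate * dt
    # Tilt angle implied by the gravity vector seen by the accelerometer.
    accel_angle = math.degrees(math.atan2(accel_x, accel_z))
    # Weighted blend: trust the gyro short-term, the accelerometer long-term.
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle

# Example: 100 Hz samples from one wrist sensor (values are made up).
angle = 0.0
for gyro_rate, ax, az in [(12.0, 0.10, 0.99), (11.5, 0.12, 0.98), (10.8, 0.15, 0.97)]:
    angle = complementary_filter(gyro_rate, ax, az, angle, dt=0.01)
    print(f"estimated wrist tilt: {angle:.2f} deg")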


IEEE MultiMedia | 2016

Person-Centered Multimedia Computing: A New Paradigm Inspired by Assistive and Rehabilitative Applications

Sethuraman Panchanathan; Shayok Chakraborty; Troy L. McDaniel; Ramin Tadayon


ACM Multimedia | 2011

Socially relevant simulation games: a design study

Ramin Tadayon; Ashish Amresh; Winslow Burleson


FECS | 2008

Recruiting Students to Computer Science Program via Online Games.

Nasser Tadayon; Ramin Tadayon; Manghui Tu


IEEE International Conference on Serious Games and Applications for Health | 2018

Real-time stealth intervention for motor learning using player flow-state

Ramin Tadayon; Ashish Amresh; Troy L. McDaniel; Sethuraman Panchanathan

Collaboration


Dive into Ramin Tadayon's collaborations.

Top Co-Authors

Ashish Amresh, Arizona State University
Arash Tadayon, Arizona State University
Bijan Fakhri, Arizona State University
Miles Laff, Arizona State University
Nasser Tadayon, Southern Utah University