Publication


Featured research published by Thiago Chaves.


Symposium on 3D User Interfaces | 2012

Poster: Improving motor rehabilitation process through a natural interaction based system using Kinect sensor

A. Da Gama; Thiago Chaves; Lucas Silva Figueiredo; Veronica Teichrieb

In general, the motor rehabilitation process can take advantage of natural interaction based systems, which include measurements of patient performance to track evolution over time and guide therapy. Thus, the aim of this research is to analyze the use of the Kinect sensor as an interaction support tool for rehabilitation systems. The Kinect sensor gives three-dimensional information about the user's body, recognizing the skeleton and joint positions; however, it does not detect specific body movements. Therefore, the correct description of a rehabilitation movement (shoulder abduction, for instance) was implemented in a system prototype. A scoring mechanism was also developed to measure patient performance, as well as to encourage improvement by displaying positive feedback.
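The abstract above describes measuring a rehabilitation movement such as shoulder abduction from Kinect joint positions and scoring patient performance. A minimal sketch of that idea (not the paper's implementation; the joint positions, target angle, and scoring rule are illustrative assumptions):

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by segments b->a and b->c,
    using 3D joint positions such as those from a Kinect skeleton."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def abduction_score(angle_deg, target_deg=90.0):
    """Toy scoring rule: fraction of the target abduction range achieved."""
    return max(0.0, min(1.0, angle_deg / target_deg))

# Arm raised sideways to horizontal: about 90 degrees of shoulder abduction.
hip, shoulder, elbow = (0, 0, 0), (0, 1, 0), (1, 1, 0)
angle = joint_angle(hip, shoulder, elbow)
```

The angle is measured between the trunk segment (shoulder to hip) and the upper arm (shoulder to elbow); a real system would smooth the noisy joint stream before scoring.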


International Conference on Human-Computer Interaction | 2009

For Your Eyes Only: Controlling 3D Online Games by Eye-Gaze

Howell O. Istance; Aulikki Hyrskykari; Stephen Vickers; Thiago Chaves

Massively multiplayer online role-playing games, such as World of Warcraft, have become the most widespread 3D graphical environments, with millions of active subscribers worldwide. People with severe motor impairments should be able to take part in these games without the extent of their disability being apparent to others online. Eye gaze is a high-bandwidth modality that can support this. We have developed a software device that uses gaze input in different modes to emulate the mouse and keyboard events appropriate for interacting with online games. We report an evaluation study that investigated gaze-based interaction with World of Warcraft using the device. We have found that it is feasible to carry out tasks representative of game play at a beginner's skill level using gaze alone. The results from the locomotion task part of the study show similar performance for gaze-based interaction compared with a keyboard and mouse. We discuss the usability issues that arose when completing three types of tasks in the game and their implications for playing this type of game using gaze as the only input modality.
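One common way such gaze devices emulate a mouse click is dwell-time selection: a click fires when the gaze stays within a small region for long enough. A minimal sketch of that mechanism (the class, thresholds, and re-arm behaviour are illustrative assumptions, not the authors' actual device):

```python
import math

class DwellClicker:
    """Emit a 'click' when gaze stays within a small radius for a dwell
    threshold; one typical way gaze input emulates mouse events."""

    def __init__(self, dwell_ms=500, radius_px=30):
        self.dwell_ms = dwell_ms
        self.radius_px = radius_px
        self.anchor = None  # (t_ms, x, y) where the current fixation began

    def update(self, t_ms, x, y):
        """Feed one gaze sample; returns True when a dwell-click fires."""
        if self.anchor is None or \
           math.hypot(x - self.anchor[1], y - self.anchor[2]) > self.radius_px:
            self.anchor = (t_ms, x, y)  # gaze moved: restart the dwell timer
            return False
        if t_ms - self.anchor[0] >= self.dwell_ms:
            self.anchor = (t_ms, x, y)  # fire once, then re-arm
            return True
        return False
```

Different interaction modes (locomotion, menu selection, camera control) would map the same gaze stream to different emulated events; only the click case is sketched here.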


2012 14th Symposium on Virtual and Augmented Reality | 2012

Human Body Motion and Gestures Recognition Based on Checkpoints

Thiago Chaves; Lucas Silva Figueiredo; Alana Elza Fontes Da Gama; Cristiano de Araújo; Veronica Teichrieb

The computational implementation of human body gesture recognition has been a challenge for several years. Nowadays, thanks to the development of RGB-D cameras, it is possible to acquire a set of data that represents a human position over time. Despite that, these cameras provide raw data, so identifying a specific pre-defined user movement in real time without relying on offline training remains a problem. However, in several cases the real-time requirement is critical, especially when it is necessary to detect and analyze a movement continuously, as in the tracking of physiotherapeutic movements or exercises. This paper presents a simple and fast technique to recognize human movements using the data provided by an RGB-D camera. Moreover, it describes a way to identify not only whether the performed motion is valid, i.e. belongs to a set of pre-defined gestures, but also at which point the motion is (beginning, end, or somewhere in the middle). The precision of the proposed technique can be set to suit the needs of the application, and gesture registration is simple and fast, making it easy to add new motions if necessary. The proposed technique has been validated through a set of tests focused on analyzing its robustness under a series of variations during the interaction, such as fast and complex gestures.
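The checkpoint idea above can be sketched as a gesture registered as an ordered list of 3D checkpoints with a tunable tolerance: as tracked joint positions pass through the checkpoints in order, the progress index tells where in the motion the user is. A minimal sketch under those assumptions (not the paper's actual implementation; the class and tolerance value are illustrative):

```python
import math

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

class CheckpointGesture:
    """A gesture registered as an ordered list of 3D checkpoints.
    Progress reports where in the motion the user is (beginning,
    middle, end), with no offline training required."""

    def __init__(self, checkpoints, tolerance=0.1):
        self.checkpoints = checkpoints
        self.tolerance = tolerance  # precision, tunable per application
        self.index = 0              # next checkpoint to reach

    def update(self, joint_pos):
        """Feed one frame of tracked joint data; returns progress in [0, 1]."""
        if self.index < len(self.checkpoints) and \
           dist(joint_pos, self.checkpoints[self.index]) <= self.tolerance:
            self.index += 1
        return self.index / len(self.checkpoints)

# Register a simple left-to-right hand sweep as three checkpoints.
sweep = CheckpointGesture([(0, 1, 0), (0.5, 1, 0), (1, 1, 0)], tolerance=0.15)
```

Registering a new gesture is just recording a new checkpoint list, which matches the paper's claim that new motions are easy to set up.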


Expert Systems With Applications | 2019

Rehabilitation motion recognition based on the international biomechanical standards

Alana Elza Fontes da Gama; Thiago Chaves; Pascal Fallavollita; Lucas Silva Figueiredo; Veronica Teichrieb

Since 1990, the International Society of Biomechanics (ISB) has made efforts to provide uniformity to motion description. However, the majority of user studies and technological systems that use motion recognition for rehabilitation, such as virtual and augmented reality (VR and AR), do not follow this standard. Typically, VR and AR techniques recognize motion by comparing human joint positions to a pre-recorded movement, direction, or position relative to an interactive object. Aiming to fill the lack of biomechanical reference in joint motion recognition and analysis for rehabilitation interactive systems, this paper proposes a movement description and recognition method following the ISB standard. The developed method aims to provide computational solutions that are clinically relevant and that use clinical jargon known to rehabilitation therapists. The proposed method also allows motion analysis independent of the user's height or position relative to the sensor. We conclude this paper by presenting a comparative analysis of the effectiveness of the Kinect sensors (versions 1 and 2) regarding the recognition of patient exercise motions related to the biomechanical standard. Our solution consists of biomechanical motion classification by identifying in which 3D plane the motion occurs. The motion recognition solution also provides patient posture, motion, and information about incorrect exercise performance. The method was developed by first implementing a body representation and joint coordinate system based on the ISB standard; motion angle and biomechanical classifications were then performed. Specific methods to measure axial rotations were developed, inspired by the traditional goniometry strategy. Experiments to evaluate whether the method is able to classify the biomechanical motions, and thus detect whether or not they are being performed correctly, were carried out using both Kinect v1 and Kinect v2. Motions were performed both correctly and incorrectly, also to check for false positives. Each classified motion was performed 100 times: 70 correct repetitions (35 at normal speed and 35 fast ones) and 30 wrong ones (out of their respective plane). The motion recognition method showed good efficacy in classifying the biomechanical motions correctly. Most motions presented a high recognition success rate, above 80%, mainly using a 20-degree tolerance.
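The plane-based classification described above can be sketched as testing whether a joint's displacement is near-perpendicular to a plane's normal, within an angular tolerance. A hedged sketch, assuming sensor-aligned axes (x: left-right, y: up-down, z: front-back); the axis convention and first-match rule are illustrative, not the paper's method:

```python
import math

# Unit normals of the three anatomical planes in an assumed
# sensor-aligned frame (x: left-right, y: up-down, z: front-back).
PLANES = {
    "sagittal": (1, 0, 0),    # flexion/extension motions
    "frontal": (0, 0, 1),     # abduction/adduction motions
    "transverse": (0, 1, 0),  # rotation motions
}

def classify_plane(displacement, tolerance_deg=20.0):
    """Classify a joint displacement into the anatomical plane it lies in,
    accepting deviations up to the angular tolerance (20 degrees being the
    best-performing setting reported above)."""
    norm = math.sqrt(sum(c * c for c in displacement))
    for name, normal in PLANES.items():
        dot = sum(d * c for d, c in zip(displacement, normal)) / norm
        # A motion lies in a plane when it is near-perpendicular
        # to that plane's normal vector.
        angle_from_plane = abs(90.0 - math.degrees(math.acos(max(-1.0, min(1.0, dot)))))
        if angle_from_plane <= tolerance_deg:
            return name
    return "unclassified"
```

A displacement can lie in more than one plane (a purely vertical motion sits in both the sagittal and frontal planes), so this first-match rule is a simplification; the full method also uses posture and joint context.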


2017 19th Symposium on Virtual and Augmented Reality (SVR) | 2017

ARkanoidAR: An Augmented Reality System to Guide Biomechanical Movements at Sagittal Plane

Ricardo R. Barioni; Thiago Chaves; Lucas Silva Figueiredo; Veronica Teichrieb; Edvar Vilar Neto; Alana Elza Fontes Da Gama

In motor rehabilitation, the search for better methodologies is intense. In parallel, natural interaction technologies are considered a good approach for rehabilitation guidance, as they can provide content for visual feedback in augmented reality. This paper describes an augmented reality physiotherapy rehabilitation system called ARkanoidAR, which was implemented using the Microsoft Kinect and whose main focus is to provide feedback for biomechanical movements, more specifically those which occur on the sagittal plane. The main objective of this paper is to propose ARkanoidAR and evaluate its usability in the rehabilitation context. We conceived and applied a usability questionnaire, and its results show that the application is efficient in guiding and engaging users to do motor rehabilitation exercises, and that it is easy to set up and handle.


International Conference on Human-Computer Interaction | 2015

Sci-Fi Gestures Catalog

Lucas Silva Figueiredo; Mariana Pinheiro; Edvar Vilar Neto; Thiago Chaves; Veronica Teichrieb

In Science Fiction (Sci-Fi) movies, filmmakers try to anticipate trends and new forms of interaction. Metaphors are created that allow their characters to interact with futuristic devices and environments. These devices and metaphors should be targets of research, considering they have proven useful before. Moreover, the impact of the new interfaces on the audience may indicate their expectations regarding future gesture interactions. Thus, the first goal of this work is to collect and expose a compilation of gestural interactions in Sci-Fi movies, providing a catalog to researchers as a resource for future discussions. The second goal is to classify the collected data according to a series of criteria. The catalog is also open to new contributions, and fellow researchers are invited to provide additional entries of hand gesture scenes from any Sci-Fi title, as well as suggestions for new classification criteria and amendments to the already provided content.


Archive | 2012

Development and Evaluation of a Kinect Based Motor Rehabilitation Game

Daniel Queiroz de Freitas; Alana Elza Fontes Da Gama; Lucas Silva Figueiredo; Thiago Chaves; Déborah Marques-Oliveira; Veronica Teichrieb; Cristiano Araújo


Computer Methods and Programs in Biomedicine | 2016

MirrARbilitation: A clinically-related gesture recognition interactive tool for an AR rehabilitation system

Alana Elza Fontes Da Gama; Thiago Chaves; Lucas Silva Figueiredo; Adriana Baltar; Ma Meng; Nassir Navab; Veronica Teichrieb; Pascal Fallavollita


Archive | 2014

MARKERLESS GESTURE RECOGNITION ACCORDING TO BIOMECHANICAL CONVENTION

A. E. F. Da Gama; Thiago Chaves; Lucas Silva Figueiredo; Veronica Teichrieb


Motriz-revista De Educacao Fisica | 2013

Desenvolvimento e aprimoramento de um sistema computacional - Ikapp - de suporte à reabilitação motora [Development and improvement of a computational system, Ikapp, to support motor rehabilitation]

Déborah Marques de Oliveira; Adriana Baltar do Rêgo Maciel; Maíra Izzadora Souza Carneiro; Ana Cláudia de Andrade Cardoso; Alana Elza Fontes da Gama; Thiago Chaves; Veronica Teichrieb; Cristiano C. de Araujo; Kátia Monte-Silva

Collaboration


Dive into Thiago Chaves's collaboration.

Top Co-Authors

Lucas Silva Figueiredo (Federal University of Pernambuco)
Veronica Teichrieb (Federal University of Pernambuco)
Alana Elza Fontes da Gama (Federal University of Pernambuco)
Edvar Vilar Neto (Federal University of Pernambuco)
Mariana Pinheiro (Federal University of Pernambuco)
Adriana Baltar (Federal University of Pernambuco)