Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Héctor A. Caltenco is active.

Publication


Featured research published by Héctor A. Caltenco.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2010

Inductive tongue control of powered wheelchairs

Morten Enemark Lund; Henrik Vie Christensen; Héctor A. Caltenco; Eugen R. Lontis; Bo Bentsen; Lotte N. S. Andreasen Struijk

Alternative and effective methods for controlling powered wheelchairs are important to individuals with tetraplegia and similar impairments who are unable to use the standard joystick. This paper describes a system where tongue movements are used to control a powered wheelchair, thus providing users with high-level spinal cord injuries full control of their wheelchair. The system is based on an inductive tongue control system developed at the Center for Sensory-Motor Interaction (SMI), Aalborg University. The system emulates a standard analog joystick in order to interface with the wheelchair, ensuring that it works with almost any wheelchair. The total embedment of the tongue interface in the mouth makes the control practically invisible. A fuzzy system combining 8 sensors for directional control allows for multidirectional control of the wheelchair. Preliminary test results show navigation abilities that are highly competitive compared to other tongue control systems.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2009

Fully integrated wireless inductive tongue computer interface for disabled people

Lotte N. S. Andreasen Struijk; Eugen R. Lontis; Bo Bentsen; Henrik Vie Christensen; Héctor A. Caltenco; Morten Enemark Lund

This work describes a novel fully integrated inductive tongue computer interface for disabled people. The interface consists of an oral unit placed in the mouth, including inductive sensors, related electronics, a system for wireless transmission, and a rechargeable battery. The system is activated using an activation unit placed on the tongue, and incorporates 18 inductive sensors, arranged in both a key area and a mouse-pad area. The system's functionality was demonstrated in a pilot experiment, where a typing rate of up to 70 characters/minute was obtained with an error rate of 3%. Future work will include tests with disabled subjects.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2010

Clinical evaluation of wireless inductive tongue computer interface for control of computers and assistive devices

Eugen R. Lontis; Morten Enemark Lund; Henrik Vie Christensen; Bo Bentsen; Michael Gaihede; Héctor A. Caltenco; Lotte N. S. Andreasen Struijk

Typing performance of a full alphabet keyboard and a joystick-type mouse (with on-screen keyboard) provided by a wireless integrated tongue control system (TCS) has been investigated. Speed and accuracy were measured as a throughput defined as true correct words per minute [cwpm]. Training character sequences were typed in a dedicated interface that provided visual feedback of the activated sensors, a map of the associated alphabet, and the task character. Test sentences were typed in Word, with limited visual feedback, using non-predictive typing (a map of characters in alphabetic order associated with sensors) and predictive typing (LetterWise) for the TCS keyboard, and non-predictive typing for the TCS mouse. Two subjects participated for four and three consecutive days, respectively, with two sessions per day. Maximal throughputs of 2.94 and 2.46 [cwpm] (subject 1) and 2.06 and 1.68 [cwpm] (subject 2) were obtained with the TCS keyboard for predictive and non-predictive typing, respectively. Maximal throughputs of 2.09 and 1.71 [cwpm] were obtained with the TCS mouse by subjects 1 and 2, respectively. The same experimental protocol has been planned for a larger number of subjects.
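The "true correct words per minute" throughput can be illustrated with a minimal sketch. The paper's exact scoring rules are not given in the abstract, so the word-by-word matching criterion below is an assumption:

```python
def correct_words_per_minute(typed_words, target_words, minutes):
    """Throughput as true correct words per minute [cwpm]:
    only typed words matching the target sentence count toward the score.
    (Assumed scoring; the paper may penalize errors differently.)"""
    correct = sum(1 for typed, target in zip(typed_words, target_words)
                  if typed == target)
    return correct / minutes

# Hypothetical session: 5 of 6 target words typed correctly in 2.5 minutes
target = "the quick brown fox jumps over".split()
typed = "the quick brown fax jumps over".split()
print(correct_words_per_minute(typed, target, 2.5))  # 2.0 cwpm
```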


International Conference of the IEEE Engineering in Medicine and Biology Society | 2010

TongueWise: Tongue-computer interface software for people with tetraplegia

Héctor A. Caltenco; Lotte N. S. Andreasen Struijk; Björn Breidegard

Many computer interfaces and assistive devices for people with motor disabilities limit the input dimensionality from user to system, in many cases leading to single-switch interfaces where the user can only press one button. This can either limit the level of direct access to the functionalities of the operating system or slow down the speed of interaction. In this paper we present TongueWise: software developed for a tongue-computer interface that can be activated with the tip of the tongue and that provides direct input covering most standard keyboard and mouse commands.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2009

Inductive pointing device for tongue control system for computers and assistive devices

Eugen R. Lontis; Héctor A. Caltenco; Bo Bentsen; Henrik Vie Christensen; Morten Enemark Lund; Lotte N. S. Andreasen Struijk

Experimental results for pointing tasks using a tongue control system are reported in this paper. Ten untrained subjects participated in the experiment. Both typing and pointing tasks were performed by each subject in three short-term training sessions on consecutive days. The system provided a key pad (14 sensors) and a mouse pad (10 sensors with joystick functionality) whose placements (front, back) were interchanged in half of the subjects. The pointing tasks consisted of selecting and tracking a target circle (of 50, 75, and 100 pixels diameter) that occurred randomly in each of 16 positions uniformly distributed along the perimeter of a layout circle of 250 pixels diameter. The throughput was 0.808 bits per second, and the time on target was 0.164 of the total tracking time. The pad layout, the subjects, the sessions, the target diameters, and the angle of the tracking direction had a statistically significant effect on the two performance measures. Long-term training is required to assess the improvement of the user's capability.
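The bits-per-second pointing throughput is conventionally derived from Fitts's law. A minimal sketch under the standard Shannon formulation follows; the paper's exact computation may differ, and the trial values below are hypothetical:

```python
import math

def fitts_throughput(distance_px, width_px, movement_time_s):
    """Pointing throughput in bits/s via the Shannon formulation of
    Fitts's law: ID = log2(D/W + 1), TP = ID / MT."""
    index_of_difficulty = math.log2(distance_px / width_px + 1)
    return index_of_difficulty / movement_time_s

# Hypothetical trial: a 250 px movement to a 75 px target completed in 2.7 s
print(round(fitts_throughput(250, 75, 2.7), 3))
```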


IEEE Transactions on Biomedical Engineering | 2012

Tip of the Tongue Selectivity and Motor Learning in the Palatal Area

Héctor A. Caltenco; Eugen R. Lontis; Shellie Boudreau; Bo Bentsen; Johannes J. Struijk; L. N. S. Andreasen Struijk

This study assessed the ability of the tongue tip to accurately select intraoral targets embedded in an upper palatal tongue-computer interface, using 18 able-bodied volunteers. Four performance measures, based on modifications to Fitts's law, were determined for three different tongue-computer interface layouts. The layouts differed with respect to the number and location of the targets in the palatal interface. Assessment of intraoral target selection speed and accuracy revealed that performance was indeed dependent on the location of and distance between the targets. Performance was faster and more accurate for targets located farther away from the base of the tongue in comparison to posterior and medial targets. A regression model was built, which predicted intraoral target selection time based on target location and movement amplitude better than the prediction of a standard Fitts's law model. A 30% improvement in speed and accuracy over three daily practice sessions of 30 min emphasizes the remarkable motor learning abilities of the tongue musculature and provides further evidence that the tongue is useful for operating computer-interface technologies.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2009

A framework for mouse and keyboard emulation in a tongue control system

Morten Enemark Lund; Héctor A. Caltenco; Eugen R. Lontis; Henrik Vie Christensen; Bo Bentsen; Lotte N. S. Andreasen Struijk

Effective human input devices for computer control are very important to quadriplegics and others with severe disabilities. This paper describes a framework for computer control without the need for special PC software or drivers. The framework is based on a tongue control system recently developed at the Center for Sensory-Motor Interaction (SMI), Aalborg University. The framework provides emulation of a standard USB keyboard and mouse, and allows tongue control of any computer using the standard USB drivers available in all modern operating systems.
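Emulating a standard USB mouse means sending the host ordinary HID reports, so no custom drivers are needed. For illustration, here is a minimal sketch of the 3-byte boot-protocol mouse report such a device would send; the SMI system's actual firmware and report descriptors are not described in the abstract, so this is only an assumed example of the general mechanism:

```python
import struct

def hid_mouse_report(buttons: int, dx: int, dy: int) -> bytes:
    """Pack a 3-byte USB HID boot-protocol mouse report:
    byte 0 = button bitmask (bits 0-2), bytes 1-2 = signed relative x/y
    motion. A device emulating a standard mouse sends such reports over
    its interrupt endpoint; the host's stock HID driver handles the rest."""
    return struct.pack("<Bbb", buttons & 0x07, dx, dy)

# Tongue nudges the virtual joystick right and up, no buttons pressed
report = hid_mouse_report(0, 5, -3)
print(report.hex())  # 0005fd
```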


Disability and Rehabilitation: Assistive Technology | 2014

On the tip of the tongue: learning typing and pointing with an intra-oral computer interface

Héctor A. Caltenco; Björn Breidegard; Lotte N. S. Andreasen Struijk

Purpose: To evaluate typing and pointing performance, and improvement over time, of four able-bodied participants using an intra-oral tongue-computer interface for computer control. Background: A physically disabled individual may lack the ability to efficiently control standard computer input devices. There have been several efforts to produce and evaluate interfaces that provide individuals with physical disabilities the possibility to control personal computers. Method: Training with the intra-oral tongue-computer interface was performed by playing games over 18 sessions. Skill improvement was measured through typing and pointing exercises at the end of each training session. Results: Typing throughput improved from an average of 2.36 to 5.43 correct words per minute. Pointing throughput improved from an average of 0.47 to 0.85 bits/s. Target tracking performance, measured as relative time on target, improved from an average of 36% to 47%. Path following throughput improved from an average of 0.31 to 0.83 bits/s, and decreased to 0.53 bits/s with more difficult tasks. Conclusions: Learning curves support the notion that the tongue can rapidly learn novel motor tasks. Typing and pointing performance of the tongue-computer interface is comparable to that of other proficient assistive devices, which makes the tongue a feasible input organ for computer control. Implications for Rehabilitation: Intra-oral computer interfaces could provide individuals with severe upper-limb mobility impairments the opportunity to control computers and automatic equipment. Typing and pointing performance of the tongue-computer interface is comparable to that of other proficient assistive devices, but it does not cause fatigue easily and might be invisible to other people, which is highly prioritized by assistive device users. A combination of visual and auditory feedback is vital for good performance of an intra-oral computer interface and helps reduce involuntary or erroneous activations.


International Federation for Medical and Biological Engineering (IFMBE) Proceedings, vol. 34, pp. 191–194 | 2011

Fuzzy Inference System for Analog Joystick Emulation with an Inductive Tongue-Computer Interface

Héctor A. Caltenco; Eugen R. Lontis; Lotte N. S. Andreasen Struijk

This paper describes the development of a fuzzy inference system (FIS) for emulating an analog joystick using an inductive tongue-computer interface. The principle of operation of the interface and the inductive sensor signals are described. The FIS receives the sensor signals and outputs the Cartesian position of the virtual joystick, which can be used to control the mouse pointer of a personal computer, wheelchairs, or other joystick-enabled applications, with magnitude and direction proportional to the tongue's position over the palatal plate. This provides a significant advantage to individuals with tetraplegia using this computer interface.
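The core idea of mapping discrete sensor activations to a continuous Cartesian joystick position can be sketched as follows. This is a deliberately simplified stand-in for the paper's FIS (no fuzzy membership functions or rule base), and the ten-sensor circular layout is an assumption for illustration:

```python
import math

# Hypothetical layout: 10 inductive sensors evenly spaced on a circle
SENSOR_ANGLES = [2 * math.pi * i / 10 for i in range(10)]

def joystick_position(activations):
    """Map inductive sensor activations (0..1) to a Cartesian joystick
    position. Each sensor 'votes' for its direction, weighted by its
    activation; the normalized vector sum gives a position whose magnitude
    and direction track the tongue's location over the plate."""
    total = sum(activations)
    if total == 0:
        return (0.0, 0.0)  # tongue off the pad: joystick centered
    x = sum(a * math.cos(t) for a, t in zip(activations, SENSOR_ANGLES)) / total
    y = sum(a * math.sin(t) for a, t in zip(activations, SENSOR_ANGLES)) / total
    return (x, y)

# Tongue pressing mostly on sensor 0 (angle 0): full deflection along +x
acts = [0.9] + [0.0] * 9
print(joystick_position(acts))  # (1.0, 0.0)
```

A real FIS would replace the weighted average with fuzzy rules, which lets the designer shape dead zones and sensitivity per region rather than relying on a single linear blend.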


Human-Computer Interaction with Mobile Devices and Services | 2015

What Do You Like? Early Design Explorations of Sound and Haptic Preferences

Charlotte Magnusson; Héctor A. Caltenco; Sara Finocchietti; Giulia Cappagli; Graham A. Wilson; Monica Gori

This study is done within the framework of a project aimed at developing a wearable device (a bracelet) intended to support sensory-motor rehabilitation of children with visual impairments. We present an exploratory study of aesthetic/hedonistic preferences for sounds and touch experiences among visually impaired children. The work is done in a participatory setting, and we have used mixed methods (questionnaires, a workshop, and a field trial using a mobile location-based app for story creation) in order to get a more complete initial picture of how enjoyable training devices should be designed for our target users.

Collaboration


Dive into Héctor A. Caltenco's collaboration.

Top Co-Authors

Gabriel Baud-Bovy (Istituto Italiano di Tecnologia)
Sara Finocchietti (Istituto Italiano di Tecnologia)
Monica Gori (Istituto Italiano di Tecnologia)