Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Hugo Nicolau is active.

Publication


Featured research published by Hugo Nicolau.


IEEE MultiMedia | 2008

From Tapping to Touching: Making Touch Screens Accessible to Blind Users

Tiago João Vieira Guerreiro; Paulo Lagoá; Hugo Nicolau; Daniel Gonçalves; Joaquim A. Jorge

The NavTouch navigational method enables blind users to input text in a touch-screen device by performing directional gestures to navigate a vowel-indexed alphabet.
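The vowel-indexed navigation can be made concrete with a small sketch. The snippet below is an illustration of the idea only, not the authors' implementation: horizontal flicks step one letter at a time, vertical flicks jump between vowels, and a separate action (omitted here) selects the current letter.

```python
# A minimal sketch (not the NavTouch source) of a vowel-indexed alphabet
# cursor: horizontal flicks step one letter, vertical flicks jump between
# vowels, and a separate action would select the current letter.
import string

VOWELS = [string.ascii_lowercase.index(v) for v in "aeiou"]

class VowelIndexedCursor:
    def __init__(self):
        self.pos = 0  # start at 'a'

    def flick(self, direction: str) -> str:
        """Move the cursor and return the letter now under it."""
        if direction == "right":           # next letter
            self.pos = (self.pos + 1) % 26
        elif direction == "left":          # previous letter
            self.pos = (self.pos - 1) % 26
        elif direction == "down":          # jump to the next vowel
            self.pos = next((v for v in VOWELS if v > self.pos), VOWELS[0])
        elif direction == "up":            # jump to the previous vowel
            self.pos = next((v for v in reversed(VOWELS) if v < self.pos), VOWELS[-1])
        return string.ascii_lowercase[self.pos]

# e.g. reaching 'g': flick down to 'e', then right twice ('f', 'g'), then select.
cursor = VowelIndexedCursor()
print(cursor.flick("down"), cursor.flick("right"), cursor.flick("right"))  # e f g
```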


International Conference on Human-Computer Interaction | 2011

BrailleType: unleashing braille over touch screen mobile phones

João Oliveira; Tiago João Vieira Guerreiro; Hugo Nicolau; Joaquim A. Jorge; Daniel Gonçalves

The emergence of touch screen devices poses a new set of challenges regarding text-entry. These are more obvious when considering blind people, as touch screens lack the tactile feedback they are used to when interacting with devices. The available solutions to enable non-visual text-entry resort to a wide set of targets, complex interaction techniques, or unfamiliar layouts. We propose BrailleType, a text-entry method based on the Braille alphabet. BrailleType avoids multi-touch gestures in favor of a simpler single-finger interaction, featuring few and large targets. We performed a user study with fifteen blind subjects to assess this method's performance against Apple's VoiceOver approach. BrailleType, although slower, was significantly easier and less error prone. Results suggest that the target users would have a smoother adaptation to BrailleType than to other more complex methods.
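To illustrate the single-finger, few-large-targets idea, here is a minimal sketch (not the published implementation) of the decoding step: the user marks dots of the six-dot Braille cell one at a time, then confirms, and the marked set is looked up as a Braille pattern. The dot patterns shown are standard Grade-1 Braille.

```python
# A minimal sketch, not BrailleType itself: the user marks the six Braille
# cell dots one at a time with a single finger; once confirmed, the marked
# set is decoded into a character. Patterns below are standard Grade-1
# Braille for the first few letters.
BRAILLE = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
}

def decode(marked_dots: set[int]) -> str | None:
    """Return the letter for a confirmed dot set, or None if unrecognized."""
    return BRAILLE.get(frozenset(marked_dots))

marked: set[int] = set()
for dot in (1, 2):          # user taps target 1, then target 2, then confirms
    marked.add(dot)
print(decode(marked))       # -> 'b'
```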


Conference on Computers and Accessibility | 2010

Towards accessible touch interfaces

Tiago João Vieira Guerreiro; Hugo Nicolau; Joaquim A. Jorge; Daniel Gonçalves

Touch screen mobile devices bear the promise of endless leisure, communication, and productivity opportunities for motor-impaired people. Indeed, users with residual capacities in their upper extremities could benefit immensely from a device with no demands regarding strength. However, the precision required to effectively select a target without physical cues creates problems for people with limited motor abilities. Our goal is to thoroughly study mobile touch screen interfaces, their characteristics and parameterizations, thus providing the tools for informed interface design for motor-impaired users. We present an evaluation performed with 15 tetraplegic people that allowed us to understand the factors limiting user performance within a comprehensive set of interaction techniques (Tapping, Crossing, Exiting and Directional Gesturing) and parameterizations (Position, Size and Direction). Our results show that for each technique, accuracy and precision vary across different areas of the screen and directions, in a way that is directly dependent on target size. Overall, Tapping was both the preferred technique and among the most effective. This shows that it is possible to design inclusive unified interfaces for motor-impaired and able-bodied users once the correct parameterization or adaptability is assured.


Human-Computer Interaction with Mobile Devices and Services | 2010

Assessing mobile touch interfaces for tetraplegics

Tiago João Vieira Guerreiro; Hugo Nicolau; Joaquim A. Jorge; Daniel Gonçalves

Mobile touch-screen interfaces and tetraplegic people have a controversial connection. While users with residual capacities in their upper extremities could benefit immensely from a device which does not require strength to operate, the precision needed to effectively select a target bars these people from countless communication, leisure, and productivity opportunities. Insightful projects have attempted to bridge this gap via either special hardware or particular interface tweaks. Still, we need further insight into the challenges, and into the frontiers separating failure from success, for such applications to take hold. This paper discusses an evaluation conducted with 15 tetraplegic people to learn the limits of their performance within a comprehensive set of interaction methods. We then present the results concerning a particular interaction technique: Tapping. Results show that performance varies across different areas of the screen, with a distribution that changes with target size.


Conference on Computers and Accessibility | 2014

Motor-impaired touchscreen interactions in the wild

Kyle Montague; Hugo Nicolau; Vicki L. Hanson

Touchscreens are pervasive in mainstream technologies; they offer novel user interfaces and exciting gestural interactions. However, to interpret and distinguish between the vast range of gestural inputs, the devices require users to consistently perform interactions in line with the predefined location, movement, and timing parameters of the gesture recognizers. For people with variable motor abilities, particularly hand tremors, performing these input gestures can be extremely challenging, imposing limitations on the possible interactions the user can make with the device. In this paper, we examine touchscreen performance and interaction behaviors of motor-impaired users on mobile devices. The primary goal of this work is to measure and understand the variance of touchscreen interaction performance by people with motor impairments. We conducted a four-week in-the-wild user study with nine participants using a mobile touchscreen device. A Sudoku stimulus application measured their interaction performance abilities during this time. Our results show that not only does interaction performance vary significantly between users, but also that an individual's interaction abilities are significantly different between device sessions. Finally, we propose and evaluate the effect of novel tap gesture recognizers that accommodate individual variances in touchscreen interactions.
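The paper's recognizers are not reproduced here, but the underlying idea of accommodating individual variance can be sketched as a tap recognizer whose thresholds are calibrated from a user's own touch history rather than fixed device-wide. The thresholding rule (mean plus two standard deviations) and the fallback limits below are illustrative assumptions, not the published design.

```python
# An illustrative sketch only: a tap recognizer whose duration and movement
# thresholds are calibrated per user from their own touch samples, rather
# than fixed device-wide constants, to absorb individual motor variance
# such as tremor.
import statistics

class PersonalizedTapRecognizer:
    def __init__(self):
        self.durations: list[float] = []   # seconds each touch lasted
        self.travels: list[float] = []     # px moved between down and up

    def observe(self, duration: float, travel: float) -> None:
        """Record a touch the user confirmed was an intended tap."""
        self.durations.append(duration)
        self.travels.append(travel)

    def is_tap(self, duration: float, travel: float) -> bool:
        """Accept touches within mean + 2*stdev of this user's own history."""
        if len(self.durations) < 5:        # too little data: generic limits
            return duration < 0.4 and travel < 20
        d_max = statistics.mean(self.durations) + 2 * statistics.stdev(self.durations)
        t_max = statistics.mean(self.travels) + 2 * statistics.stdev(self.travels)
        return duration <= d_max and travel <= t_max
```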


Conference on Computers and Accessibility | 2015

Typing Performance of Blind Users: An Analysis of Touch Behaviors, Learning Effect, and In-Situ Usage

Hugo Nicolau; Kyle Montague; Tiago João Vieira Guerreiro; André Rodrigues; Vicki L. Hanson

Non-visual text-entry for people with visual impairments has focused mostly on the comparison of input techniques, reporting on performance measures such as accuracy and speed. While researchers have been able to establish that non-visual input is slow and error prone, there is little understanding of how to improve it. To develop a richer characterization of typing performance, we conducted a longitudinal study with five novice blind users. For eight weeks, we collected in-situ usage data and conducted weekly laboratory assessment sessions. This paper presents a thorough analysis of typing performance that goes beyond traditional aggregated measures of text-entry and reports on character-level errors and touch measures. Our findings show that users improve over time, albeit at a slow rate (0.3 WPM per week). Substitutions are the most common type of error and have a significant impact on entry rates. In addition to text input data, we analyzed touch behaviors, looking at touch contact points, exploration movements, and lift positions. We provide insights on why and how performance improvements and errors occur. Finally, we derive implications that should inform the design of future virtual keyboards for non-visual input.
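The measures behind this kind of analysis are standard text-entry metrics. The sketch below (not the study's own scripts) computes entry rate in words per minute and classifies character-level errors into substitutions, insertions, and omissions via a minimum-string-distance alignment.

```python
# A sketch of conventional text-entry measures: WPM and character-level
# error classification with an edit-distance alignment.
def wpm(transcribed: str, seconds: float) -> float:
    """Words per minute, using the conventional 5-characters-per-word unit."""
    return (len(transcribed) / 5) / (seconds / 60)

def char_errors(presented: str, transcribed: str) -> dict[str, int]:
    """Count substitutions, insertions, and omissions via edit-distance DP."""
    m, n = len(presented), len(transcribed)
    # dp[i][j] = min edits to turn presented[:i] into transcribed[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if presented[i - 1] == transcribed[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # omission
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # match/substitution
    # trace back through the DP table to classify each edit
    errors = {"substitutions": 0, "insertions": 0, "omissions": 0}
    i, j = m, n
    while i > 0 or j > 0:
        if i > 0 and j > 0 and dp[i][j] == dp[i - 1][j - 1] + (presented[i - 1] != transcribed[j - 1]):
            if presented[i - 1] != transcribed[j - 1]:
                errors["substitutions"] += 1
            i, j = i - 1, j - 1
        elif j > 0 and dp[i][j] == dp[i][j - 1] + 1:
            errors["insertions"] += 1
            j -= 1
        else:
            errors["omissions"] += 1
            i -= 1
    return errors

print(wpm("hello world", 30.0))        # 4.4 WPM
print(char_errors("hello", "hallo"))   # 1 substitution
```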


European Conference on Cognitive Ergonomics | 2008

Mobile text-entry models for people with disabilities

Tiago João Vieira Guerreiro; Paulo Lagoá; Hugo Nicolau; Pedro F. Santana; Joaquim A. Jorge

Motivation -- To provide suitable mobile text-entry interfaces for disabled users, designed with their capabilities and needs in mind. Research approach -- We analyzed 20 blind users and the difficulties they face with traditional text-entry approaches. We designed a new text-entry method, modelled according to the design guidelines derived from the user studies, and evaluated it against the traditional approach through user evaluation. The navigation model presented proves effective on both keypad- and touch-screen-based devices. Findings/Design -- Results show that if the users' limitations and capacities are taken into account, the first contact with the mobile device is gentle and the learning curve is steep. In contrast to traditional approaches, the theoretical performance values are likely to be achieved. Research limitations/Implications -- As the available set of target users is limited, the user studies were conducted with five users per group (3 groups/15 users). Originality/Value -- The research presents an innovative text-entry method and its comparison with commonly used methods. We also present a solution to provide text input on touch screen mobile devices for blind users. Take away message -- If the interaction is designed with the end users in mind, the best theoretical values are likely to be achieved.


Human Factors in Computing Systems | 2014

B#: chord-based correction for multitouch braille input

Hugo Nicolau; Kyle Montague; Tiago João Vieira Guerreiro; João Guerreiro; Vicki L. Hanson

Braille has paved its way into mobile touchscreen devices, providing faster text input for blind people. This advantage comes at the cost of accuracy, as chord typing over a flat surface has proven to be highly error prone. A misplaced finger on the screen translates into a different or unrecognized character. However, the chord itself carries information that can be leveraged to improve input performance. We present B#, a novel correction system for multitouch Braille input that uses chords, rather than characters, as the atomic unit of information. Experimental results on data collected from 11 blind people revealed that B# is effective in correcting errors at the character level, providing opportunities for instant correction of unrecognized chords; and at the word level, where it outperforms a popular spellchecker by providing correct suggestions for 72% of incorrect words (against 38%). We finish with implications for designing chord-based correction systems and avenues for future work.
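The chord-as-atomic-unit idea can be sketched in a few lines. The snippet below is an illustration under simplifying assumptions, not the authors' implementation: an unrecognized chord is snapped to the valid Braille chord at the smallest dot-set distance instead of being discarded; the real system also ranks word-level candidates.

```python
# A minimal sketch of chord-level correction: an unrecognized chord is
# corrected to the valid Braille chord with the fewest differing dots,
# instead of being dropped. Only a handful of letters are modeled here.
BRAILLE = {
    frozenset({1}): "a", frozenset({1, 2}): "b", frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d", frozenset({1, 5}): "e",
}

def chord_distance(a: frozenset, b: frozenset) -> int:
    """Number of dots that differ between two chords (symmetric difference)."""
    return len(a ^ b)

def correct_chord(touched: set[int]) -> str:
    """Decode a chord directly if valid, else snap to the nearest valid chord."""
    chord = frozenset(touched)
    if chord in BRAILLE:
        return BRAILLE[chord]
    nearest = min(BRAILLE, key=lambda c: chord_distance(chord, c))
    return BRAILLE[nearest]

print(correct_chord({1, 2}))   # 'b' (exact match)
print(correct_chord({2}))      # 'b' (nearest valid chord is {1, 2})
```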


Interactive Tabletops and Surfaces | 2015

Blind People Interacting with Large Touch Surfaces: Strategies for One-handed and Two-handed Exploration

Tiago João Vieira Guerreiro; Kyle Montague; João Guerreiro; Rafael Nunes; Hugo Nicolau; Daniel Gonçalves

Interaction with large touch surfaces is still a relatively young domain, particularly regarding the accessibility solutions offered to blind users. Their smaller mobile counterparts ship with built-in accessibility features that enable non-visual exploration of linearized screen content. However, it is unknown how well these solutions perform on large interactive surfaces that use more complex spatial content layouts. We report on a user study with 14 blind participants performing common touchscreen interactions using one-handed and two-handed exploration. We investigate the exploration strategies applied by blind users when interacting with a tabletop. We identified six basic strategies that were commonly adopted and should be considered in future designs. We finish with implications for the design of accessible large touch interfaces.


Human Factors in Computing Systems | 2015

TabLETS Get Physical: Non-Visual Text Entry on Tablet Devices

João Guerreiro; André Rodrigues; Kyle Montague; Tiago João Vieira Guerreiro; Hugo Nicolau; Daniel Gonçalves

Tablet devices can display full-size QWERTY keyboards similar to physical ones. Yet, the lack of tactile feedback and the inability to rest the fingers on the home keys make exploration a highly demanding and slow task for blind users. We present SpatialTouch, an input system that leverages previous experience with physical QWERTY keyboards by supporting two-handed interaction through multitouch exploration and spatial, simultaneous audio feedback. We conducted a user study with 30 novice touchscreen participants entering text under one of two conditions: (1) SpatialTouch or (2) the mainstream accessibility method Explore by Touch. We show that SpatialTouch enables blind users to leverage previous experience, as they make better use of the home keys and perform more efficient exploration paths. Results suggest that although SpatialTouch did not result in faster input rates overall, it was indeed able to leverage previous QWERTY experience, in contrast to Explore by Touch.
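As a rough illustration of spatial, simultaneous audio feedback (an assumption-laden sketch, not the SpatialTouch source), each touch on the virtual QWERTY row can be resolved to a key and announced with a stereo pan proportional to the key's horizontal position, so that concurrent feedback for each hand carries a spatial cue.

```python
# Illustrative only: each finger resting on the virtual QWERTY top row is
# resolved to a key, and the announcement is panned left/right according
# to the key's horizontal position, mimicking the spatial cue a physical
# keyboard gives for free.
ROW = "qwertyuiop"

def key_and_pan(x: float, screen_width: float) -> tuple[str, float]:
    """Map a touch x-coordinate to (key, stereo pan in [-1.0, 1.0])."""
    slot = min(int(x / screen_width * len(ROW)), len(ROW) - 1)
    pan = (slot / (len(ROW) - 1)) * 2 - 1   # leftmost key = -1, rightmost = +1
    return ROW[slot], pan

# Two simultaneous touches, one per hand, each with its own spatialized cue.
for x in (120.0, 950.0):
    key, pan = key_and_pan(x, screen_width=1024.0)
    print(f"touch at x={x}: announce '{key}' panned {pan:+.2f}")
```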

Collaboration


Dive into Hugo Nicolau's collaborations.

Top Co-Authors

Daniel Gonçalves
Instituto Superior Técnico

Vicki L. Hanson
Rochester Institute of Technology

João Guerreiro
Instituto Superior Técnico

João Oliveira
Technical University of Lisbon

Alessandra Brandão
Rochester Institute of Technology