
Publication


Featured research published by Cristina Manresa-Yee.


Journal of Network and Computer Applications | 2008

Hands-free vision-based interface for computer accessibility

Javier Varona; Cristina Manresa-Yee; Francisco J. Perales

People with physical disabilities and mental impairments are an important part of our society that has not yet received the same opportunities as others for inclusion in the Information Society. It is therefore necessary to develop easily accessible computer systems to achieve their inclusion in the new technologies. This paper presents a project whose objective is to bring people with disabilities closer to new technologies. It presents a vision-based user interface designed to achieve computer accessibility for users with motor impairments. The interface automatically finds the user's face and tracks it over time to recognize gestures within the face region in real time. A new information fusion procedure is then proposed to combine the data produced by the computer vision algorithms, and its output is used to carry out a robust recognition process. Finally, we show how the system is used to replace a conventional mouse for computer interaction and as a communication system for non-verbal children.
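
As a rough illustration of this kind of pipeline (a sketch, not the authors' implementation), the snippet below detects the face with OpenCV's Haar cascade, tracks feature points inside it with Lucas-Kanade optical flow, and maps their mean motion to relative cursor motion via PyAutoGUI; the cascade file, gain value and library choices are assumptions made purely for illustration.

    # Minimal sketch of a hands-free pointer (illustrative, not the paper's system).
    # Assumptions: opencv-python and pyautogui installed, webcam is device 0.
    import cv2
    import numpy as np
    import pyautogui

    GAIN = 8.0  # cursor pixels per pixel of head motion (illustrative value)

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)
    prev_gray, prev_pts = None, None

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_pts is None:
            # (Re)detect the face and pick trackable points inside it.
            faces = face_cascade.detectMultiScale(gray, 1.3, 5)
            if len(faces) > 0:
                x, y, w, h = faces[0]
                mask = np.zeros_like(gray)
                mask[y:y + h, x:x + w] = 255
                prev_pts = cv2.goodFeaturesToTrack(gray, 30, 0.01, 7, mask=mask)
                prev_gray = gray
            continue
        # Track the facial points from the previous frame to the current one.
        next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, prev_pts, None)
        good_new = next_pts[status.flatten() == 1]
        good_old = prev_pts[status.flatten() == 1]
        if len(good_new) == 0:
            prev_pts = None  # tracking lost: re-detect on the next frame
            continue
        dx, dy = (good_new - good_old).mean(axis=0).ravel()
        pyautogui.moveRel(GAIN * dx, GAIN * dy)  # relative head motion -> cursor motion
        prev_gray, prev_pts = gray, good_new.reshape(-1, 1, 2)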


Conference on Computers and Accessibility | 2008

Experiences using a hands-free interface

Cristina Manresa-Yee; Javier Varona; Francisco J. Perales; Francesca Negre; Joan Jordi Muntaner

Hands-free interfaces can be the best choice for Human-Computer Interaction (HCI) for people with physical disabilities who are not able to use traditional input devices. Once a first prototype has been developed in the laboratory, taking design and usability requirements into account, it is real users who ultimately determine whether an interface is useful or not. Therefore, an evaluation of our interface with users with cerebral palsy and multiple sclerosis was carried out over a nine-month project. This paper presents a vision-based user interface designed to achieve computer accessibility, together with the validation and evaluation of its human-computer interaction aspects, such as usability and accessibility.


Interacting with Computers | 2010

User experience to improve the usability of a vision-based interface

Cristina Manresa-Yee; Pere Ponsa; Javier Varona; Francisco J. Perales

When we develop an input device for users to communicate with computers, we have to take into account that end-users must find the device effective, efficient and satisfactory to use. Users whose expectations are unmet by the interface will tend to abandon it. In this paper we present a vision-based interface for motor-impaired users, developed by a multidisciplinary group. The users' preferences are a critical issue when selecting an access device; therefore, user requirements should be included in the design, and usability evaluation should be integrated into the relevant phases of software development. In order to evaluate the design, we present a process with multiple user studies at different development stages. We describe the combination of a development project and its implementation, with user experience considerations embedded in the process. Finally, we studied the performance of the interface through several tests, paying special attention to satisfaction and fatigue. From our results we observed that although several users found the interface tiring, their satisfaction level was encouraging, suggesting the interface is usable.


Universal Access in the Information Society | 2014

Design recommendations for camera-based head-controlled interfaces that replace the mouse for motion-impaired users

Cristina Manresa-Yee; Javier Varona; Francisco J. Perales; Iosune Salinas

This work focuses on camera-based systems that are designed as mouse replacements. These interfaces are usually based on computer vision techniques that capture the user's face or head movements and are specifically designed for users with disabilities. The work identifies and reviews the key factors of these interfaces, based on the lessons learnt from the authors' experience and on a comprehensive analysis of the literature, to describe the specific points to consider in their design. These factors are: user features to track, initial user detection (calibration), position mapping, feedback, error recovery, event execution, profiles, and ergonomics of the system. The work compiles the solutions offered by different systems to help new designers avoid problems already discussed by others.
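
To make the "position mapping" factor concrete, here is a brief hedged sketch (function names and gain values are illustrative, not taken from the paper) contrasting absolute mapping, where the head position in the camera frame is scaled directly to screen coordinates, with relative mapping, where the head displacement is amplified by a gain and added to the current pointer position.

    # Two common position-mapping strategies for a head-controlled pointer (illustrative).

    def absolute_mapping(head_x, head_y, cam_w, cam_h, screen_w, screen_h):
        # Scale the head position in the camera frame directly to screen coordinates.
        return head_x * screen_w / cam_w, head_y * screen_h / cam_h

    def relative_mapping(dx, dy, cursor_x, cursor_y, gain=6.0,
                         screen_w=1920, screen_h=1080):
        # Add the head displacement, amplified by a gain, to the current cursor
        # position, clamping the result to the screen bounds.
        new_x = min(max(cursor_x + gain * dx, 0), screen_w - 1)
        new_y = min(max(cursor_y + gain * dy, 0), screen_h - 1)
        return new_x, new_y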


Conference on Human Factors in Computing Systems | 2015

Face Me! Head-Tracker Interface Evaluation on Mobile Devices

Maria Francesca Roig-Maimó; Javier Varona Gómez; Cristina Manresa-Yee

The integration of front cameras in mobile devices and the increase in processing capacity have opened the door to head-tracker interfaces on mobile devices. However, research has mostly focused on the development of new interfaces and their integration into prototypes without analyzing human performance. In this work, we present a head-tracker interface for mobile devices and its evaluation from the point of view of Human-Computer Interaction. Nineteen participants performed position-select tasks using their nose movements. User performance was measured with different device orientations and with combinations of different gains and target widths. Based on the results obtained, two design recommendations are made for designers using the developed interface. In addition, we confirmed that device orientation, a feature particular to mobile devices that does not affect desktop computers, has no effect on the users' performance.
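
Position-select tasks of this kind are commonly analysed with Fitts'-law style metrics; the paper does not publish its analysis code, so the following is only a minimal sketch of the standard Shannon-formulation index of difficulty and throughput computation for a single trial.

    # Fitts'-law metrics often used for position-select tasks (illustrative sketch).
    import math

    def index_of_difficulty(distance, width):
        # Shannon formulation: ID = log2(D / W + 1), measured in bits.
        return math.log2(distance / width + 1)

    def throughput(distance, width, movement_time):
        # Throughput in bits per second for a single trial.
        return index_of_difficulty(distance, width) / movement_time

    # Example: a 400 px reach to an 80 px target selected in 1.2 s gives
    # ID = log2(6), about 2.58 bits, and a throughput of about 2.15 bits/s.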


Sensors | 2016

A Robust Camera-Based Interface for Mobile Entertainment

Maria Francesca Roig-Maimó; Cristina Manresa-Yee; Javier Varona

Camera-based interfaces in mobile devices are starting to be used in games and apps, but few works have evaluated them in terms of usability or user perception. Due to the changing nature of mobile contexts, this evaluation requires extensive studies that consider the full spectrum of potential users and contexts. However, previous works usually evaluate these interfaces in controlled environments such as laboratory conditions; therefore, the findings cannot be generalized to real users and real contexts. In this work, we present a robust camera-based interface for mobile entertainment. The interface detects and tracks the user's head by processing the frames provided by the mobile device's front camera, and the head position is then used to interact with mobile apps. First, we evaluate the interface as a pointing device to study its accuracy and configurable factors such as the gain or the device's orientation, as well as the optimal target size for the interface. Second, we present an in-the-wild study to evaluate usage and user perception when playing a game controlled by head motion. Finally, the game was published in an application store to make it available to a large number of potential users and contexts, and usage data were registered. The results show the feasibility of using this robust camera-based interface for mobile entertainment in different contexts and by different people.


International Conference of Design, User Experience, and Usability | 2015

Designing a Vibrotactile Language for a Wearable Vest

Ann Morrison; Hendrik Knoche; Cristina Manresa-Yee

We designed a wearable vest that houses a set of actuators placed at specific points on the body. We developed vibrotactile patterns to induce five sensation types: (1) Calming, two patterns, up and down the back; (2) Feel Good, four patterns in different directions around the waist; (3) Activating, two patterns, Tarzan and Shiver, on the top front of the body and, for Shiver, also down the back; (4) Navigation, two patterns, Turn Left and Turn Right, prompting on the back and then on the opposite-side front waist for full-body turning; and (5) Warning, one pattern on the solar plexus to slow down or stop the wearer. We overlapped the pulses, which were of longer duration than the short-burst saltation pulses designed to induce muscle movement. Our participants responded well to the Calming and Feel Good patterns, but reported mixed responses to the Activating, Navigation and Warning patterns.
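
One simple way to picture such patterns (a sketch under assumed timings, not the authors' firmware) is as a list of (actuator, start time, duration) pulses in which each pulse starts before the previous one ends, so the wearer perceives a continuous stroke rather than discrete taps.

    # Illustrative encoding of an overlapping vibrotactile pattern; timings are assumed.
    # Each pulse is (actuator_id, start_ms, duration_ms).
    CALMING_DOWN_BACK = [
        (0, 0, 600),     # upper-back actuator
        (1, 450, 600),   # mid-back actuator, starts before the first pulse ends
        (2, 900, 600),   # lower-back actuator
    ]

    def is_overlapping(pattern):
        # True if every pulse starts before the previous one finishes.
        for (_, s_prev, d_prev), (_, s_next, _) in zip(pattern, pattern[1:]):
            if s_next >= s_prev + d_prev:
                return False
        return True

    assert is_overlapping(CALMING_DOWN_BACK)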


Behaviour & Information Technology | 2014

Observing the use of an input device for rehabilitation purposes

Cristina Manresa-Yee; Pere Ponsa; Iosune Salinas; Francisco J. Perales; Francisca Negre; Javier Varona

We designed and developed a vision-based computer interface that works with head movements. The system was implemented in a centre for users with cerebral palsy, who used it in contexts related to recreation and education. During this process, it was observed that continued use of the interface with a set of training tasks may act as a physical and cognitive rehabilitation tool and complement users' rehabilitation therapy. We report on five case studies of users who worked with the interface for five months and whose qualitative outcomes, observed by the therapists who accompanied them, were positive; specifically, there were improvements in work posture and head control, increased endurance, decreased involuntary movements, and improved spatial orientation. The case studies also showed the need to supervise the users' work in order to achieve these aims, along with the importance of motivation and the active, voluntary participation of users in the rehabilitation process.


Conference on Human System Interactions | 2009

Assessment of the use of a human-computer vision interaction framework

Pere Ponsa; Cristina Manresa-Yee; David Batlle; Javier Varona

The main goal of this work is to present an integrated framework combining vision-based interfaces and human-centred design approaches. The final aim is to study the interaction between humans and devices in order to improve the performance of the designed human-computer system, and to apply this taxonomy to develop future applications for people with special needs in smart home environments.


Archive | 2017

Interactive Furniture: Bi-directional Interaction with a Vibrotactile Wearable Vest in an Urban Space

Ann Morrison; Jack Leegaard; Cristina Manresa-Yee; Walther Jensen; Hendrik Knoche

In this study we investigate the experience of participants wearing a vibrotactile vest that interacts with a vibroacoustic architecture, The Humming Wall, set in an urban space. This public large-scale artefact is built to exchange vibrotactile and physiological interactions with the vibrotactile wearable vest. The heartbeats and breath rates of the vest wearers are vibroacoustically displayed at The Humming Wall. In addition, participants can swipe and knock on The Humming Wall, and the vest wearer is effectively swiped and knocked upon. We work with overlapping vibrotactile outputs so that wearers experience a flow of sensations similar to a touch gesture. The communication privileged vibroacoustic and vibrotactile channels as the primary interaction modalities, both for vest wearers and for the passing public. Participants found the experience favourable, and analysis reveals that some patterns on the vest and zones at the wall promote relaxation in the form of calming and feel-good (even therapeutic) sensations, as well as activation and warning on the vest. We contribute to this research field by adding a large-scale public object, a visibly responsive interactive wall, that was positively received as the partner responder for wearers of a vibrotactile vest in an urban environment. Participants reported calming, therapeutic and feel-good sensations in response to the patterns.

Collaboration


Dive into Cristina Manresa-Yee's collaborations.

Top Co-Authors

Javier Varona, University of the Balearic Islands

Francisco J. Perales, University of the Balearic Islands

Maria Francesca Roig-Maimó, University of the Balearic Islands

Pere Ponsa, Polytechnic University of Catalonia