Publication


Featured research published by Aidan Kehoe.


International Conference on Design of Communication | 2006

Designing help topics for use with text-to-speech

Aidan Kehoe; Ian Pitt

Speech technology can be used to provide online help to users in situations where visual display of online help is not possible or has display-related limitations. Presenting help material in this manner can also complement traditional online help systems. To date, most online help material has been developed with the assumption that the material will be read. This paper proposes a number of guidelines to assist in the creation and testing of help material that may be presented to users via speech synthesis engines. The paper also provides a brief overview of an ongoing project that provides online help using speech technology.


International Conference on Multimedia and Expo | 2011

Using eye tracking technology to identify visual and verbal learners

Tracey J. Mehigan; Mary Barry; Aidan Kehoe; Ian Pitt

Learner style data is increasingly being incorporated into adaptive eLearning (electronic learning) systems for the development of personalized user models. This practice currently relies heavily on the prior completion of questionnaires by system users. While this practice can potentially improve learning outcomes, the completion of questionnaires can be time-consuming for users. Recent research indicates that it is possible to detect a user's preference on the Global/Sequential dimension of the FSLSM (Felder-Silverman Learner Style Model) through the user's mouse-movement patterns, as well as through other biometric technologies including eye tracking and accelerometers. In this paper we discuss the potential of eye-tracking technology for inferring Visual/Verbal learner preferences. The paper also presents the results of a study conducted to detect individual user style data based on the Visual/Verbal dimension of the FSLSM.


Evaluating User Experience in Games | 2010

Beyond the Gamepad: HCI and Game Controller Design and Evaluation

Michael A. Brown; Aidan Kehoe; Jurek Kirakowski; Ian Pitt

In recent years there has been an increasing amount of computer-game-focused HCI research, but the impact of controller-related issues on user experience remains relatively unexplored. In this chapter we highlight the limitations of current practices with respect to designing support for both standard and innovative controllers in games. We proceed to explore the use of McNamara and Kirakowski’s (Interactions 13(6):26–28, 2006) theoretical framework of interaction in order to better design and evaluate controller usage in games. Finally, we present the findings of a case study applying this model to the evaluation and comparison of three different game control techniques: gamepad, keyboard and force-feedback steering wheel. This study highlights not only the need for greater understanding of user experience with game controllers, but also the need for parallel research of both functionality and usability in order to understand the interaction as a whole.


Text, Speech and Dialogue | 2007

User modeling to support the development of an auditory help system

Flaithrí Neff; Aidan Kehoe; Ian Pitt

The implementations of online help in most commercial computing applications deployed today have a number of well-documented limitations. Speech technology can be used to complement traditional online help systems and mitigate some of these problems. This paper describes a model used to guide the design and implementation of an experimental auditory help system, and presents results from a pilot test of that system.


International Conference on Design of Communication | 2007

Improvements to a speech-enabled user assistance system based on pilot study results

Aidan Kehoe; Flaithrí Neff; Ian Pitt; Gavin Russell

User assistance, as implemented in most commercial computing applications deployed today, has a number of well-documented limitations. Speech technology can be used to complement traditional user assistance techniques and mitigate some of these problems. This paper reports the results from a pilot study conducted on a speech-enabled user assistance system, and describes improvements to the system made as a result of that initial study.


Universal Access in the Information Society | 2009

Use of voice input to enhance cursor control in mainstream gaming applications

Aidan Kehoe; Flaithrí Neff; Ian Pitt

There are opportunities to use voice input to enhance the effectiveness of continuous cursor control in mainstream gaming. This paper describes a program that uses voice input to manipulate the cursor gain parameter within the context of a game. For some user groups the ability to dynamically manipulate this parameter can be important in making games more accessible. The program makes use of readily available speech technology, and can be used in conjunction with existing games.
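
The abstract describes adjusting a cursor gain parameter from voice commands but gives no implementation details, so the sketch below is only a minimal illustration of the idea, not the authors' program. The command vocabulary, the gain presets, and the CursorGainController class are all assumptions.

```python
# Illustrative sketch only (not the paper's implementation): map recognized
# voice commands to a cursor-gain multiplier applied to raw pointer deltas.

# Assumed command vocabulary and the gain each command selects.
GAIN_PRESETS = {
    "precise": 0.25,  # slow, fine-grained cursor movement
    "normal": 1.0,    # default gain
    "fast": 2.5,      # coarse, rapid traversal of the screen
}


class CursorGainController:
    def __init__(self, initial_gain=1.0):
        self.gain = initial_gain

    def on_voice_command(self, recognized_text):
        """Called with the text produced by any speech recognizer."""
        command = recognized_text.strip().lower()
        if command in GAIN_PRESETS:
            self.gain = GAIN_PRESETS[command]

    def apply(self, dx, dy):
        """Scale raw pointing-device deltas by the current gain."""
        return dx * self.gain, dy * self.gain


if __name__ == "__main__":
    controller = CursorGainController()
    controller.on_voice_command("precise")  # e.g. output of a recognizer
    print(controller.apply(10.0, -4.0))     # -> (2.5, -1.0)
```

Because the controller only scales deltas, a sketch like this could sit between an existing game's input layer and its cursor logic without changing the game itself, which is in the spirit of the approach the abstract describes.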


Adaptive Hypermedia and Adaptive Web-Based Systems | 2008

Implementing a Multimodal Interface to a DITA User Assistance Repository

Aidan Kehoe; Ian Pitt

User assistance systems can be extended to enable multimodal access to user assistance material. Implementing multimodal user assistance introduces new considerations with respect to authoring and storage of assistance material, transformation of assistance material for effective presentation on a range of devices, and user interaction issues. We describe an implementation of a multimodal interface to enable access to a DITA user assistance repository.


Human-Computer Interaction with Mobile Devices and Services | 2008

Evaluation of pause intervals between haptic/audio cues and subsequent speech information

Aidan Kehoe; Flaithrí Neff; Ian Pitt

Non-speech sounds and haptics have an important role in enabling access to user assistance material in ubiquitous computing scenarios. Non-speech sounds and haptics can be used to cue assistance material that is to be presented to users via speech. In this paper, we report on a study that examines user perception of the duration of a pause between a cue (which may be non-speech sound, haptic, or combined non-speech sound plus haptic) and the subsequent delivery of assistance material using speech.
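
As a purely illustrative sketch (not taken from the study), a cue-pause-speech sequence of the kind examined here could be encoded in SSML, with a break element controlling the pause interval. The helper function, the cue file name, and the 750 ms pause are assumptions.

```python
# Illustrative sketch: build an SSML fragment in which a non-speech audio cue
# is followed by a configurable pause and then the spoken assistance text.
import xml.etree.ElementTree as ET


def cue_then_speech(cue_uri, pause_ms, text):
    speak = ET.Element("speak", version="1.0")
    ET.SubElement(speak, "audio", src=cue_uri)           # non-speech cue
    ET.SubElement(speak, "break", time=f"{pause_ms}ms")  # pause under study
    sentence = ET.SubElement(speak, "s")
    sentence.text = text                                 # assistance material
    return ET.tostring(speak, encoding="unicode")


print(cue_then_speech("help-cue.wav", 750,
                      "Press and hold the button to pair the device."))
```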


ACM Conference on Hypertext | 2007

Transforming DITA topics for speech synthesis output

Aidan Kehoe; Ian Pitt

Speech technology can be used to enhance user assistance systems by making help information more accessible. In this paper we describe our implementation of a transform that converts DITA topics to SSML/SAPI documents, thus allowing the assistance topics to be presented to users via speech synthesis.
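
The paper's transform is not reproduced here; the sketch below is a rough illustration of the kind of conversion described, handling only a small assumed subset of a DITA topic (title, shortdesc, body paragraphs) and emitting SSML with Python's standard XML library. The example topic and the pause length are assumptions.

```python
# Illustrative sketch (not the paper's transform): convert a minimal DITA
# topic into an SSML document suitable for a speech synthesis engine.
import xml.etree.ElementTree as ET


def dita_topic_to_ssml(dita_xml):
    topic = ET.fromstring(dita_xml)
    speak = ET.Element("speak", version="1.0")

    title = topic.find("title")
    if title is not None and title.text:
        emphasis = ET.SubElement(ET.SubElement(speak, "s"), "emphasis")
        emphasis.text = title.text                   # speak the title with emphasis
        ET.SubElement(speak, "break", time="500ms")  # assumed pause after the title

    # Short description and body paragraphs become SSML paragraphs.
    for src in [topic.find("shortdesc")] + topic.findall(".//body/p"):
        if src is not None and src.text:
            p = ET.SubElement(speak, "p")
            p.text = src.text

    return ET.tostring(speak, encoding="unicode")


example_topic = """<topic id="pairing"><title>Pairing the headset</title>
<shortdesc>How to pair the headset with a phone.</shortdesc>
<body><p>Hold the power button for five seconds.</p>
<p>Select the headset from the Bluetooth menu.</p></body></topic>"""

print(dita_topic_to_ssml(example_topic))
```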


Conference on Artificial Intelligence for Applications | 2007

Extending traditional user assistance systems to support an auditory interface

Aidan Kehoe; Flaithrí Neff; Ian Pitt

Collaboration


Dive into Aidan Kehoe's collaborations.

Top Co-Authors

Ian Pitt

University College Cork

Mary Barry

Waterford Institute of Technology
