Publication


Featured research published by Flaithrí Neff.


Text, Speech and Dialogue | 2007

User modeling to support the development of an auditory help system

Flaithrí Neff; Aidan Kehoe; Ian Pitt

The implementations of online help in most commercial computing applications deployed today have a number of well-documented limitations. Speech technology can be used to complement traditional online help systems and mitigate some of these problems. This paper describes a model used to guide the design and implementation of an experimental auditory help system, and presents results from a pilot test of that system.


International Conference on Design of Communication | 2007

Improvements to a speech-enabled user assistance system based on pilot study results

Aidan Kehoe; Flaithrí Neff; Ian Pitt; Gavin Russell

User assistance, as implemented in most commercial computing applications deployed today, has a number of well-documented limitations. Speech technology can be used to complement traditional user assistance techniques and mitigate some of these problems. This paper reports the results from a pilot study conducted on a speech-enabled user assistance system, and describes improvements made to the system as a result of that initial study.


International Conference on Computers Helping People with Special Needs | 2010

Accelerometer & spatial audio technology: making touch-screen mobile devices accessible

Flaithrí Neff; Tracey J. Mehigan; Ian Pitt

As mobile-phone design moves toward a touch-screen form factor, the visually disabled are faced with new accessibility challenges. The mainstream interaction model for touch-screen devices relies on the user having the ability to see spatially arranged visual icons, and to interface with these icons via a smooth glass screen. An inherent challenge for blind users with this type of interface is its lack of tactile feedback. In this paper we explore the concept of using a combination of spatial audio and accelerometer technology to enable blind users to effectively operate a touch-screen device. We discuss the challenges involved in representing icons using sound and we introduce a design framework that is helping us tease out some of these issues. We also outline a set of proposed user-studies that will test the effectiveness of our design using a Nokia N97. The results of these studies will be presented at ICCHP 2010.


Universal Access in the Information Society | 2009

Use of voice input to enhance cursor control in mainstream gaming applications

Aidan Kehoe; Flaithrí Neff; Ian Pitt

There are opportunities to use voice input to enhance the effectiveness of continuous cursor control in mainstream gaming. This paper describes a program that uses voice input to manipulate the cursor gain parameter within the context of a game. For some user groups, the ability to dynamically manipulate this parameter can be important in making games more accessible. The program makes use of readily available speech technology, and can be used in conjunction with existing games.


Audio Mostly Conference | 2017

The Data-Driven Algorithmic Composer

J. Fitzpatrick; Flaithrí Neff

The Data-Driven Algorithmic Composer (D-DAC) is an application designed to output data-driven, algorithmically composed music via MIDI. The application requires input data in tab-separated format. Each dataset results in a unique piece of music that remains consistent across iterations of the application. The only varying elements between iterations on the same dataset are factors defined by the user: tempo, scale, and intervals between rows. Each measure of the melody, harmony, and bassline is derived from a row of the dataset. By utilizing this non-random algorithmic application, users can create a unique and predefined musical iteration of their dataset. The overall aim of the D-DAC is to inspire musical creativity from scientific data and encourage the sharing of datasets between various research communities.


Human-Computer Interaction with Mobile Devices and Services | 2008

Evaluation of pause intervals between haptic/audio cues and subsequent speech information

Aidan Kehoe; Flaithrí Neff; Ian Pitt

Non-speech sounds and haptics have an important role in enabling access to user assistance material in ubiquitous computing scenarios. Non-speech sounds and haptics can be used to cue assistance material that is to be presented to users via speech. In this paper, we report on a study that examines user perception of the duration of a pause between a cue (which may be non-speech sound, haptic, or combined non-speech sound plus haptic) and the subsequent delivery of assistance material using speech.


Journal of the Acoustical Society of America | 2008

Investigating the potential of human echolocation in virtual sonic trigonometry

Flaithrí Neff; Ian Pitt

Describing a mathematical problem often involves visual diagrams. For blind students this accentuates the challenges they face. Projects such as LAMBDA have used linear speech and Braille to convey algebraic equations. However, spatial features, for example in trigonometry, are difficult to map to a linear‐based system. Traditional tactile methods (e.g. German film) convey simple shapes but need Braille support and speech‐tactile interfaces (e.g. NOMAD) require unconventional equipment. Cognitive issues regarding tactile interpretation of 3D shapes also persist. Blind students interact regularly with speech technology and audio games. This exposure means that the auditory system is potentially becoming accustomed to sonic interpretation of computer‐based information. Some of our research has looked at expanding the sonic environment to include spatial information aimed at trigonometry. The next stage is to provide interactive user control. Our system is based on a user interface model in order to consider...


Archive | 2010

Spatial sound for computer games and virtual reality

David Murphy; Flaithrí Neff


Conference on Artificial Intelligence for Applications | 2007

Extending traditional user assistance systems to support an auditory interface

Aidan Kehoe; Flaithrí Neff; Ian Pitt


Audio Mostly Conference | 2016

Evaluating Gesture Characteristics When Using a Bluetooth Handheld Music Controller

Richard Pinsenschaum; Flaithrí Neff

Collaboration


Dive into Flaithrí Neff's collaboration.

Top Co-Authors

Ian Pitt (University College Cork)

Aidan Kehoe (University College Cork)

David Murphy (University College Cork)

J. Fitzpatrick (Limerick Institute of Technology)

Richard Pinsenschaum (Limerick Institute of Technology)