Publication


Featured research published by Narada D. Warakagoda.


Archive | 2010

Multimodal Interfaces to Mobile Terminals - A Design-For-All Approach

Knut Kvale; Narada D. Warakagoda

Multimodal human-computer interfaces can combine different input signals, extract the combined meaning from them, find the requested information and present the response in the most appropriate format. A multimodal interface therefore offers users the opportunity to choose the most natural interaction pattern for the application and context of use. If the preferred mode fails in a certain context or task, users may switch to a more appropriate mode or combine modalities.

Around thirty years ago, Bolt presented the “Put That There” concept demonstrator, which processed speech in parallel with manual pointing during object manipulation (Bolt, 1980). Since then, major advances have been made in automatic speech recognition (ASR) algorithms and natural language processing (NLP), in handwriting and gesture recognition, and in the speed, processing power and memory capacity of computers. Today’s multimodal systems are capable of recognizing and combining a wide variety of signals, such as speech, touch, manual gestures, gaze tracking, facial expressions, and head and body movements. The response can be presented by, for example, facial animation in the form of human-like presentation agents on the screen of a multimedia system. These advanced systems need various sensors and substantial processing power and memory, and are therefore best suited for interaction with computers and kiosk applications, as demonstrated in, e.g., (Oviatt, 2000), (Gustafson et al., 2000), (Wahlster, 2001), (Beskow et al., 2002), (Karpov, 2006) and (Smartkom, 2007).

Modern mobile terminals are now portable computers in which the traditional audio user interfaces, microphones and speakers, are accompanied by touch screens, cameras, accelerometers, gyroscopes, etc. These enriched user interfaces, combined with the ever-increasing capacity of processors, access to mobile networks with increasing bandwidth, and functionality such as the Global Positioning System (GPS) and near-field communication (NFC), will make mobile terminals well suited for developing user-friendly multimodal interfaces in the years to come. However, multimodal functionality on mobile terminals is still restricted to two input modes, speech (audio) and touch, and two output modes, audio and vision. This type of multimodality, sometimes called tap & talk (or point & speak), is essentially …
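To make the tap & talk idea concrete, the following is a minimal Python sketch of how a two-mode interface might fuse a speech hypothesis with a touch event, pairing a deictic utterance (“this”, “here”) with a tap that occurred close in time, in the spirit of Bolt’s demonstrator. The data structures, the fuse function and the time-window heuristic are hypothetical illustrations for this page, not the implementation described in the paper.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class SpeechInput:
        text: str         # ASR hypothesis, e.g. "what is this building"
        timestamp: float  # seconds since session start

    @dataclass
    class TouchInput:
        x: float          # screen/map coordinates of the tap
        y: float
        timestamp: float

    def fuse(speech: SpeechInput, touch: Optional[TouchInput],
             max_gap: float = 2.0) -> dict:
        """Pair a deictic utterance with a tap that occurred close in time,
        so the spoken query is resolved against the tapped location."""
        deictic = any(w in speech.text.lower().split()
                      for w in ("this", "that", "here", "there"))
        if deictic and touch and abs(speech.timestamp - touch.timestamp) <= max_gap:
            # Combined meaning: the query applies to the tapped location.
            return {"query": speech.text, "target": (touch.x, touch.y)}
        # Speech-only fallback when no usable tap accompanies the utterance.
        return {"query": speech.text, "target": None}

    # Example: the user taps a spot on a map while asking about it.
    print(fuse(SpeechInput("what is this building", timestamp=10.4),
               TouchInput(x=120.0, y=88.5, timestamp=10.1)))
    # -> {'query': 'what is this building', 'target': (120.0, 88.5)}

A real system would fuse full semantic frames from the recognizers rather than raw strings and coordinates, but the time-window pairing above captures the basic idea of extracting a combined meaning from two input modes.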


Conference of the International Speech Communication Association | 2000

A noise robust multilingual reference recogniser based on SpeechDat(II)

Børge Lindberg; Finn Tore Johansen; Narada D. Warakagoda; Gunnar Lehtinen; Zdravko Kacic; Andrej Zgank; Kjell Elenius; Giampiero Salvi


Language Resources and Evaluation | 2000

The COST 249 SpeechDat multilingual reference recogniser

Finn Tore Johansen; Narada D. Warakagoda; Børge Lindberg; Gunnar Lehtinen; Zdravko Kacic; Andrej Zgank; Kjell Elenius; Giampiero Salvi


Computer Assisted Language Learning | 2002

The MUST guide to Paris: Implementation and expert evaluation of a multimodal tourist guide to Paris

L.S. Almeida; Ingunn Amdal; Nuno Beires; M. Boualem; L.W.J. Boves; E.A. den Os; P. Filoche; Rui Gomes; J.E. Knudsen; Knut Kvale; J. Rugelbak; C. Tallec; Narada D. Warakagoda


Journal of the Acoustical Society of America | 2002

Implementing and evaluating a multimodal and multilingual tourist guide

L.S. Almeida; I. Amdal; Nuno Beires; M. Boualem; L.W.J. Boves; E.A. den Os; P. Filoche; Rui Gomes; J.E. Knudsen; Knut Kvale; J. Rugelbak; C. Tallec; Narada D. Warakagoda


Conference of the International Speech Communication Association | 2005

A speech centric mobile multimodal service useful for dyslectics and aphasics

Knut Kvale; Narada D. Warakagoda


Nordic Conference on Multimodal Communication | 2006

Evaluation of a mobile multimodal service for disabled users

Knut Kvale; Narada D. Warakagoda; Marthin Kristiansen


International CLASS Workshop on Natural, Intelligent and Effective Interaction in Multimodal Dialogue Systems | 2002

Implementing and evaluating a multimodal tourist guide

Luís B. Almeida; Ingunn Amdal; Nuno Beires; M. Boualem; L.W.J. Boves; Els den Os; P. Filoche; Rui Gomes; J.E. Knudsen; Knut Kvale; J. Rugelbak; C. Tallec; Narada D. Warakagoda


Conference of the International Speech Communication Association | 1999

Neural network based optimal feature extraction for ASR

Narada D. Warakagoda; Magne Hallstein Johnsen


Technology and Disability | 2008

Speech centric multimodal interfaces for disabled users

Knut Kvale; Narada D. Warakagoda

Collaboration


Dive into Narada D. Warakagoda's collaborations.

Top Co-Authors

L.W.J. Boves, Radboud University Nijmegen
Andrej Zgank, École Polytechnique Fédérale de Lausanne
Finn Tore Johansen, École Polytechnique Fédérale de Lausanne
Gunnar Lehtinen, École Polytechnique Fédérale de Lausanne
Magne Hallstein Johnsen, Norwegian University of Science and Technology
L.S. Almeida, Radboud University Nijmegen