
Publication


Featured research published by Grégory Leplâtre.


Human Factors in Computing Systems | 2007

Pictures at the ATM: exploring the usability of multiple graphical passwords

Wendy Moncur; Grégory Leplâtre

Users gain access to cash, confidential information and services at Automated Teller Machines (ATMs) via an authentication process involving a Personal Identification Number (PIN). These users frequently have many different PINs, and fail to remember them without recourse to insecure behaviours. This is not a failing of users. It is a usability failing in the ATM authentication mechanism. This paper describes research conducted to evaluate whether users find multiple graphical passwords more memorable than multiple PINs. The research also investigates the success of two memory augmentation strategies in increasing the memorability of graphical passwords. The results demonstrate that multiple graphical passwords are substantially more memorable than multiple PINs, and that memorability is further improved by the use of mnemonics to aid recall. This study will be of interest to HCI practitioners and information security researchers exploring approaches to usable security.


European Conference on Cognitive Ergonomics | 2012

A situated cognition aware approach to the design of information retrieval systems for geospatial data

Paul Craig; Néna Roa-Seïler; Grégory Leplâtre

Motivation -- To improve the process of information retrieval (IR), specifically for geospatial data, by accounting for the natural processes of situated cognition, where knowledge is a product of both action and context. Research approach -- To focus on a specific topic (Mexican history), evaluate the limitations of existing approaches, and design and implement a new system that overcomes these limitations. Findings/Design -- As the theory of situated cognition stipulates, all knowledge is situated in activity bound to social, cultural and physical contexts. It was found that the knowledge produced by information retrieval can be situated in the activity of exploring search results and bound to the context of geographic location (specifically, place names). In the design of our new application, this made it important to keep place names for towns and cities visible throughout the search process. Research limitations/Implications -- Tests were only undertaken with Mexicans living in the Mixteca region of Oaxaca, using data about Mexican events, so the results may be culturally specific or specific to users from countries with a particular geography. Originality/Value -- The results of this research should be of interest to designers of interactive maps and to those attempting to apply the theory of situated cognition to application design. Take-away message -- Taking account of the context in which users want to view search results can improve the usability of IR applications. Specifically, this is demonstrated for geographic data, where maintaining the visibility of place names makes results generally more valuable.


Designing Interactive Systems | 2017

CymaSense: A Real-Time 3D Cymatics-Based Sound Visualisation Tool

John McGowan; Grégory Leplâtre; Iain McGregor

What does music look like? Representation of music has taken many forms over time, from musical notation [16] through to random algorithm-based visualisations driven by the amplitude of an audio signal [19]. One aspect of music visualisation that has not been widely explored is that of Cymatics. Cymatics are physical impressions of music created in media such as water. Current cymatic visualisations are restricted to 2D imaging, whilst 3D visualisations of music are generally based on arbitrary mappings of audio-visual attributes. This paper looks at the design of CymaSense, an interactive real-time 3D visualisation tool based on Cymatics.


Innovation in Language Learning and Teaching | 2018

Viewing speech in action: speech articulation videos in the public domain that demonstrate the sounds of the International Phonetic Alphabet (IPA)

Satsuki Nakai; David Beavan; Eleanor Lawson; Grégory Leplâtre; James M. Scobbie; Jane Stuart-Smith

In this article, we introduce recently released, publicly available resources that allow users to watch videos of hidden articulators (e.g. the tongue) during the production of various types of sounds found in the world’s languages. The articulation videos on these resources are linked to a clickable International Phonetic Alphabet chart (International Phonetic Association. 1999. Handbook of the International Phonetic Association: A Guide to the Use of the International Phonetic Alphabet. Cambridge: Cambridge University Press), so that the user can study the articulations of different types of speech sounds systematically. We discuss the utility of these resources for teaching the pronunciation of contrastive foreign-language sounds that are absent from the learner’s native language.


International Conference on Multimodal Interfaces | 2017

Utilising natural cross-modal mappings for visual control of feature-based sound synthesis

Augoustinos Tsiros; Grégory Leplâtre

This paper presents the results of an investigation into audio-visual (AV) correspondences conducted as part of the development of Morpheme, a painting interface to control a corpus-based concatenative sound synthesis algorithm. Previous research has identified strong AV correspondences between dimensions such as pitch and vertical position or loudness and size. However, these correspondences are usually established empirically by varying only a single audio or visual parameter. Although it is recognised that the perception of AV correspondences is affected by the interaction between the parameters of auditory or visual stimuli when these are complex multidimensional objects, there has been little research into perceived AV correspondences when complex dynamic sounds are involved. We conducted an experiment in which two AV mapping strategies and three audio corpora were empirically evaluated. A total of 110 participants were asked to rate the perceived similarity of six AV associations. The results confirmed that size/loudness, vertical position/pitch and colour brightness/spectral brightness are strongly associated. A weaker but significant association was found between texture granularity and sound dissonance, and between colour complexity and sound dissonance. Harmonicity was found to have a moderating effect on the perceived strengths of these associations: the higher the harmonicity of the sounds, the stronger the perceived AV associations.
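
As a purely illustrative sketch of the kind of cross-modal mapping described above, the code below expresses the confirmed correspondences (size to loudness, vertical position to pitch, colour brightness to spectral brightness) as a function from visual stroke attributes to synthesis targets. This is not the Morpheme implementation; the names, units and ranges are hypothetical.

# Hypothetical sketch of the cross-modal mappings reported above
# (size -> loudness, vertical position -> pitch, colour brightness -> spectral brightness).
# Not the Morpheme implementation; names and ranges are illustrative only.

from dataclasses import dataclass


@dataclass
class Stroke:
    size: float        # normalised brush size, 0.0 - 1.0
    y: float           # normalised vertical position, 0.0 (bottom) - 1.0 (top)
    brightness: float  # normalised colour brightness, 0.0 - 1.0


def stroke_to_synthesis_params(stroke: Stroke) -> dict:
    """Map visual attributes of a brush stroke to sound-synthesis targets."""
    return {
        # larger strokes -> louder sound (gain from -40 dB up to 0 dB)
        "gain_db": -40.0 + 40.0 * stroke.size,
        # higher on the canvas -> higher pitch (MIDI note 36 to 96)
        "midi_pitch": 36 + round(60 * stroke.y),
        # brighter colour -> brighter spectrum (spectral centroid target in Hz)
        "spectral_centroid_hz": 200.0 + 7800.0 * stroke.brightness,
    }


if __name__ == "__main__":
    # Example: a large, mid-height, bright stroke yields a loud, mid-pitched, bright sound.
    print(stroke_to_synthesis_params(Stroke(size=0.8, y=0.5, brightness=0.9)))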


Conference on Computers and Accessibility | 2017

CymaSense: A Novel Audio-Visual Therapeutic Tool for People on the Autism Spectrum

John McGowan; Grégory Leplâtre; Iain McGregor

Music therapy has been shown to be an effective intervention for clients with Autism Spectrum Condition (ASC), a lifelong neurodevelopmental condition that can affect people in a number of ways. This paper presents a study evaluating the use of a multimodal 3D interactive tool, CymaSense, within a series of music therapy sessions. Eight adults with ASC participated over an 8-week period in a study that used a single-case experimental design. The study used qualitative and quantitative methodological tools for analysis within and beyond the therapy sessions. The results indicate an increase in communicative behaviours for both verbal and non-verbal participants.


Human-Computer Interaction with Mobile Devices and Services | 2004

Mobile HCI and Sound

Simon Holland; Robert Day; Grégory Leplâtre; Alistair D. N. Edwards

Sound plays an increasingly varied and vital role in mobile and ubiquitous user interaction. One reason is the limited screen real estate available on typical mobile devices. Another reason is that many mobile devices are used in minimal-attention situations. These are situations in which the user has only limited attention available for the interface: the user’s eyes may be busy elsewhere, and the user may be occupied with avoiding the normal hazards of moving around and with engaging in real-world tasks. In many circumstances, such interactions will involve non-speech audio and gesture to afford natural means of access to information, to other people, and to services and situations in the environment.


Archive | 2000

Using non-speech sounds in mobile computing devices

Stephen A. Brewster; Grégory Leplâtre; Murray Crease


International Conference on Auditory Display | 1998

An investigation of using music to provide navigation cues

Grégory Leplâtre; Stephen A. Brewster


International Conference on Auditory Display | 2004

How to tackle auditory interface aesthetics? Discussion and case study

Grégory Leplâtre; Iain McGregor

Collaboration


Dive into Grégory Leplâtre's collaborations.

Top Co-Authors

Iain McGregor
Edinburgh Napier University

Alison Crerar
Edinburgh Napier University

David Benyon
Edinburgh Napier University

Augoustinos Tsiros
Edinburgh Napier University

John McGowan
Edinburgh Napier University

David Beavan
University College London

Eleanor Lawson
Queen Margaret University