Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Gavin Sim is active.

Publication


Featured research published by Gavin Sim.


Computers & Education | 2006

All work and no play: measuring fun, usability, and learning in software for children

Gavin Sim; Stuart MacFarlane; Janet C. Read

This paper describes an empirical study of fun, usability, and learning in educational software. Twenty-five children aged 7 and 8 from an English primary school participated. The study involved three software products designed to prepare children for government-initiated science tests. Pre- and post-tests were used to measure the learning effect, and observations and survey methods were used to assess usability and fun. The findings demonstrate that in this instance learning was not correlated with fun or usability, that observed fun and observed usability were correlated, and that children of this age appeared able to differentiate between the constructs used to describe software quality. The Fun Sorter appears to be an effective tool for evaluating products with children. The authors discuss the implications of the results, offer some thoughts on designing experiments with children, and propose some ideas for future work.
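The analysis pattern described here (pre/post learning gains set against fun and usability ratings) can be pictured in a few lines. The sketch below is illustrative only: all scores are invented, and Spearman rank correlation is assumed as the statistic, since the abstract does not name the paper's exact tests.

```python
# Minimal sketch of the analysis style described above: compute a learning
# effect from pre/post test scores, then check rank correlations against
# fun and usability ratings. All data below are made up for illustration.
from scipy.stats import spearmanr

# Hypothetical per-child scores (one entry per child)
pre_test  = [4, 6, 5, 7, 3, 5, 6, 4]
post_test = [6, 7, 5, 9, 5, 6, 8, 5]
fun       = [5, 3, 4, 2, 5, 4, 3, 4]   # e.g. a 1-5 rating
usability = [4, 3, 4, 3, 5, 4, 3, 4]

learning_effect = [post - pre for pre, post in zip(pre_test, post_test)]

for name, ratings in [("fun", fun), ("usability", usability)]:
    rho, p = spearmanr(learning_effect, ratings)
    print(f"learning vs {name}: rho={rho:.2f}, p={p:.3f}")

rho, p = spearmanr(fun, usability)
print(f"fun vs usability: rho={rho:.2f}, p={p:.3f}")
```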


Interaction Design and Children | 2005

Assessing usability and fun in educational software

Stuart MacFarlane; Gavin Sim; Matthew Horton

We describe an investigation into the relationship between usability and fun in educational software designed for children. Twenty-five children aged between 7 and 8 participated in the study. Several evaluation methods were used; some collected data from observers, and others collected reports from the users. Analysis showed that in both the observational data and the user reports, ratings for fun and usability were correlated, but that there was no significant correlation between the observed data and the reported data. We discuss possible reasons for these findings and describe a method that was successful in eliciting opinions from young children about fun and usability.


Interaction Design and Children | 2009

Experience it, draw it, rate it: capture children's experiences with their drawings

Diana Xu; Janet C. Read; Gavin Sim; Barbara McManus

This paper investigates the use of drawings as a tool for the evaluation of children's interfaces. In the study, children's experiences with a variety of computer interfaces were captured in drawings. A group of four researchers participated in coding the drawings, before the results were aggregated and statistically analysed. The evaluation of the approach is positive: the chosen drawing method was easy to use and effective in conveying user experience; a number of the drawings conveyed information pertaining to user experience: fun (F), goal fit (GF) and tangible magic (TM); and the method was found generally reliable at capturing all three elements and particularly reliable at capturing fun.


Human Factors in Computing Systems | 2013

CHECk: a tool to inform and encourage ethical practice in participatory design with children

Janet C. Read; Matthew Horton; Gavin Sim; Peggy Gregory; Daniel Fitton; Brendan Cassidy

When working with children in participatory design activities, ethical questions arise that are not always considered in a standard ethics review. This paper highlights five challenges around the ethics of the value of design and the ethics of children's participation. It presents a new tool, CHECk, that deals with three of these challenges by means of two checklists designed to challenge researchers in CCI and HCI to critically consider the reasons for involving children in design projects and to examine how best to describe design activities so that children can better consent to participate.


Interaction Design and Children | 2012

Investigating children's opinions of games: Fun Toolkit vs. This or That

Gavin Sim; Matthew Horton

Over the past decade many new methods have emerged for evaluating user experience with children, but the results of these studies have tended to be reported in isolation from other techniques. This paper reports a comparative analysis of two user experience evaluation methods with children. A within-subjects design was adopted with 20 children aged between 7 and 8. The children played two different games on tablet PCs, and their experiences of each were captured using two evaluation methods that have been validated with children: the Fun Toolkit and This or That. The results showed that the Fun Toolkit and the This or That method yielded similar results and were able to establish a preference for one game over the other. However, there were some inconsistencies between the results of individual tools within the Fun Toolkit and some of the constructs measured in the This or That method. Further research will try to identify any ordering effects within each method and redundancies within the questions.
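A minimal sketch of the within-subject comparison described above, assuming each child rates both games on a 1-5 Smileyometer-style scale (the Smileyometer is one of the Fun Toolkit's tools) and the paired ratings are compared with a Wilcoxon signed-rank test. All ratings are invented, and the choice of test is an assumption, not the paper's reported analysis.

```python
# Sketch of a within-subject comparison: each child rates both games, and
# the paired ratings are tested for a difference. Data are invented.
from scipy.stats import wilcoxon

game_a = [5, 4, 5, 3, 4, 5, 4, 5, 3, 4]  # hypothetical Smileyometer scores
game_b = [3, 4, 4, 2, 3, 4, 3, 4, 2, 3]  # same children, second game

stat, p = wilcoxon(game_a, game_b)
preferred = "A" if sum(game_a) > sum(game_b) else "B"
print(f"Wilcoxon W={stat:.1f}, p={p:.3f}; higher-rated game: {preferred}")
```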


Human Factors in Computing Systems | 2012

School friendly participatory research activities with children

Matthew Horton; Janet C. Read; Emanuela Mazzone; Gavin Sim; Daniel Fitton

Participatory Design is a common practice in HCI, and user-based evaluations are also highly recommended. This paper looks at the practice of carrying out design and evaluation sessions with school-aged children, first by describing a general method for arranging and running whole-class activities that are school friendly, and then by analysing the academic value of these activities. An analysis of 6 MESS days comprising 21 activities yielded 9 publications, a research output of 43%.


International Conference on Human-Computer Interaction | 2009

Evidence Based Design of Heuristics for Computer Assisted Assessment

Gavin Sim; Janet C. Read; Gilbert Cockton

The use of heuristics for the evaluation of interfaces is a well-studied area. Currently there appear to be two main research areas in relation to heuristics: the analysis of methods to improve the effectiveness of heuristic evaluations, and the development of new heuristic sets for novel and specialised domains. This paper proposes an evidence-based design approach to the development of domain-specific heuristics and shows how this method was applied within the context of computer assisted assessment (CAA). A corpus of usability problems was created through a series of student surveys, heuristic evaluations, and a review of the literature. This corpus was then used to synthesise a set of domain-specific heuristics for evaluating CAA applications. The paper describes the process and presents the resulting set of heuristics.
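The corpus-to-heuristics step can be pictured with a small sketch: problems gathered from surveys, evaluations, and the literature are tagged with a theme, and recurring themes become candidate heuristics. The problem records and theme labels below are hypothetical, not taken from the paper's corpus.

```python
# Illustrative sketch of synthesising candidate heuristics from a corpus of
# usability problems. Records and theme labels are invented.
from collections import Counter

problem_corpus = [
    ("Cannot review flagged questions before submitting", "navigation"),
    ("Timer placement obscures the question text", "layout"),
    ("No feedback after an answer is saved", "feedback"),
    ("Unclear how to change a previous answer", "navigation"),
    ("Submit button looks identical to next-question button", "layout"),
    ("No warning before the time limit expires", "feedback"),
]

theme_counts = Counter(theme for _, theme in problem_corpus)

# Themes backed by multiple observed problems become candidate heuristics.
for theme, count in theme_counts.most_common():
    if count >= 2:
        print(f"Candidate heuristic for '{theme}' ({count} problems in corpus)")
```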


Interaction Design and Children | 2013

Understanding the fidelity effect when evaluating games with children

Gavin Sim; Brendan Cassidy; Janet C. Read

A number of studies have compared evaluation results from prototypes of different fidelities, but very few of these involve children. This paper reports a comparative study of three prototypes ranging from low to high fidelity within the context of mobile games, using a between-subjects design with 37 participants aged 7 to 9. The children played a matching game on either an iPad, a paper prototype using screenshots of the actual game, or a sketched version. Observational data were captured to establish the usability problems, and two tools from the Fun Toolkit were used to measure user experience. The results showed little difference in user experience between the three prototypes, and very few usability problems were unique to a specific prototype. The contribution of this paper is to show that children can effectively evaluate games of this genre and style using low-fidelity prototypes.


Asia-Pacific Computer and Human Interaction | 2013

Can children perform a heuristic evaluation?

Kishan Salian; Gavin Sim; Janet C. Read

Inspection-based methods are not widely researched in the area of Child Computer Interaction. This paper reports the findings of a study analysing the effectiveness of the heuristic evaluation method when using children as expert evaluators. In total, 14 children participated in the study, evaluating a music-making game on a laptop. The results showed that children could perform a heuristic evaluation, but they encountered problems in understanding severity ratings, allocating the problems they found to the heuristic set, and aggregating the problems. Further research will modify the process in an attempt to eliminate these issues and improve the method for children.
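As a rough illustration of the bookkeeping a heuristic evaluation produces, the sketch below records each problem against a heuristic with a severity rating. A 0-4 severity scale (as popularised by Nielsen) and heuristic names from Nielsen's set are assumed here; the paper's actual scale and heuristic set are not reproduced, and all records are invented.

```python
# Sketch of heuristic-evaluation bookkeeping: each problem is allocated to a
# heuristic and given a severity rating. Scale and records are assumptions.
from dataclasses import dataclass

SEVERITY = {0: "not a problem", 1: "cosmetic", 2: "minor", 3: "major", 4: "catastrophe"}

@dataclass
class Finding:
    description: str
    heuristic: str    # which heuristic the problem is allocated to
    severity: int     # 0-4, see SEVERITY

findings = [
    Finding("No way to undo a recorded note", "User control and freedom", 3),
    Finding("Icons do not explain what they do", "Recognition rather than recall", 2),
]

# Allocating, rating, and aggregating findings like these are exactly the
# steps the study found children struggled with.
for f in findings:
    print(f"[{SEVERITY[f.severity]}] {f.heuristic}: {f.description}")
```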


Human-Computer Interaction | 2015

From England to Uganda: Children Designing and Evaluating Serious Games

Gavin Sim; Janet C. Read; Peggy Gregory; Diane Xu

The participation of end-users in the design and evaluation of technologies has long been an important principle in human–computer interaction. This article reports a study to ascertain to what extent children using participatory methods could effectively design for a surrogate population. Fifty children from a UK primary school participated in a design activity to specify a serious game for children in Uganda. The children's designs were analysed and shown to have effectively incorporated learning and gaming aspects. Based on these designs, a serious game was developed. This new serious game and the commercial game Angry Birds were both evaluated for fun with 25 children in Uganda, using the Fun Toolkit and the This or That method. The results suggested that the children found both games fun, confirming that the children in the United Kingdom could effectively design a fun game for a surrogate population. Despite the positive results, the reliability of the evaluation methods is questioned: inconsistencies were noted within the individual evaluation tools, and the comparative results for some constructs yielded a low reliability score. We conclude that further research is required to establish suitable methods for evaluating fun with children in developing countries.
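The abstract does not name the reliability statistic used; one common choice for the internal consistency of a set of rating items is Cronbach's alpha, sketched below on invented ratings as an assumed example only.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
# An assumed reliability measure, illustrated on invented data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = rating items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical ratings: 6 children x 3 fun-related items
ratings = np.array([[5, 4, 5], [3, 3, 2], [4, 5, 4], [2, 2, 3], [5, 5, 4], [3, 4, 3]])
print(f"alpha = {cronbach_alpha(ratings):.2f}")  # values near 1 indicate consistency
```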

Collaboration


Dive into Gavin Sim's collaborations.

Top Co-Authors

Janet C. Read
University of Central Lancashire

Matthew Horton
University of Central Lancashire

Phil Holifield
University of Central Lancashire

Daniel Fitton
University of Central Lancashire

Brendan Cassidy
University of Central Lancashire

Barbara McManus
University of Central Lancashire

Peggy Gregory
University of Central Lancashire

Chinedu Okwudili Obikwelu
University of Central Lancashire

Nicky Danino
University of Central Lancashire

Stuart MacFarlane
University of Central Lancashire