Janet C. Read
University of Central Lancashire
Publications
Featured research published by Janet C. Read.
Interaction Design and Children | 2006
Janet C. Read; Stuart MacFarlane
The paper begins with a review of some of the current literature on the use of survey methods with children. It then presents four known concerns with using survey methods for opinion gathering and reflects on how these concerns may impact studies in Child Computer Interaction. The paper then examines the use of survey methods in Child Computer Interaction, focusing on the Fun Toolkit. Three new research studies into the efficacy and usefulness of the tools are presented, culminating in guidelines for the future use of the Fun Toolkit. The authors then offer more general guidelines for HCI researchers and developers intending to use survey methods in their studies with children. The paper closes with some thoughts about the use of survey methods in this interesting but complex area.
Cognition, Technology & Work | 2008
Janet C. Read
The paper presents the Fun Toolkit (v3), a survey instrument that has been devised to assist researchers and developers to gather opinions about technology from children. In presenting the toolkit, the paper provides a reflective look at several studies where the toolkit has been validated and considers how the Fun Toolkit should be used as well as discussing how, and in what way, the instruments contained within it should be employed. This consideration of use is one of the novel contributions of the paper. The second major contribution is the discussion based around software appeal; in which the fit between the Fun Toolkit and usability and engagement is explored. The paper concludes that the Fun Toolkit is useful, that it can be used with some confidence to gather opinions from children and that it has the potential for use for other user experiences.
Computers in Education | 2006
Gavin Sim; Stuart MacFarlane; Janet C. Read
This paper describes an empirical study of fun, usability, and learning in educational software. Twenty five children aged 7 and 8 from an English primary school participated. The study involved three software products that were designed to prepare children for government initiated science tests. Pre and post tests were used to measure the learning effect, and observations and survey methods were used to assess usability and fun. The findings from the study demonstrate that in this instance learning was not correlated with fun or usability, that observed fun and observed usability were correlated, and that children of this age appeared to be able to differentiate between the constructs used to describe software quality. The Fun Sorter appears to be an effective tool for evaluating products with children. The authors discuss the implications of the results, offer some thoughts on designing experiments with children, and propose some ideas for future work.
Informatics for Health & Social Care | 2009
Faisal Hanif; Janet C. Read; John Goodacre; Afzal N. Chaudhry; Paul Gibbs
The Internet has made it possible for patients and their families to access vast quantities of information that previously would have been difficult for anyone but a physician or librarian to obtain. Health information websites, however, are recognised to differ widely in the quality and reliability of their content. This has led to the development of various codes of conduct and quality rating tools to assess the quality of health websites. However, the validity and reliability of these quality tools, and their applicability to different health websites, also vary. In principle, rating tools should be available to consumers, require a limited number of elements to be assessed, be assessable in all elements, be readable, and be able to gauge the readability and consistency of the information provided from a patient's viewpoint. This article reviews the literature on trends in Internet use for health and analyses various codes of conduct/ethics, or 'quality tools', available to monitor the quality of health websites from a patient perspective.
Nordic Conference on Human-Computer Interaction | 2006
S. Rebecca Kelly; Emanuela Mazzone; Matthew Horton; Janet C. Read
This paper presents Bluebells, a design method that balances child-centred design with expert design in a progressive approach that marries the best of both disciplines. The method is described in the context of a museum technologies project. Bluebells comprises several new design techniques; these are evaluated and discussed in the paper. The authors conclude with guidelines for future use of the Bluebells method including the importance of providing a context for design partners and allowing them to express their ideas in ways they are comfortable with.
International Conference on Human-Computer Interaction | 2001
Janet C. Read; Stuart MacFarlane; Chris Casey
This paper describes an experiment in which children aged between 6 and 10 entered text into a word processor using four different input methods: mouse, keyboard, speech recognition, and handwriting recognition. Several different measures of usability were made in an attempt to assess the suitability of the input methods in this situation. The paper describes and discusses the measures and their use with very young children.
Interaction Design and Children | 2014
Janet C. Read; Daniel Fitton; Matthew Horton
Participatory Design (PD) in various guises is a popular approach within the Interaction Design and Children (IDC) community. Despite its popularity, very little work has considered the fundamentals of participation, namely how children choose to participate and how their ideas are included and represented. This paper highlights ethical concerns about PD with children within the context of the information needed to consent. A central aspect of helping children understand participation in PD is helping them understand how their design ideas are used, which in turn challenges researchers to seek a fair and equitable process that is describable and defensible. The TRAck (tracking, representing and acknowledging) Method is described as an initial process that could meet this need. It is evaluated, in two forms, in a PD study with 84 children. The TRAck Method encouraged careful scrutiny of designs and allowed the researchers to distil useful design ideas, though these were not necessarily the most imaginative. There is a trade-off between the limitations of applying such a process to PD and the benefits of ensuring the fully informed involvement of children.
Interaction Design and Children | 2004
Janet C. Read; Stuart MacFarlane; Peggy Gregory
This paper describes how the design of a novel writing interface for children was informed by requirements gathering. The derivation of a set of system requirements from observations of children using early prototypes of the interface and from modelling the system is described, and then two methods of gathering further requirements by surveying children are outlined. The relative advantages and disadvantages of each method are discussed. The children were not able to contribute to the full range of requirements necessary for a complete system, but they contributed fun requirements that the observational work failed to identify. A model of the child's relationship to interactive systems is used to discuss why this is the case.
Interaction Design and Children | 2009
Diana Xu; Janet C. Read; Gavin Sim; Barbara McManus
This paper investigates the use of drawings as a tool for the evaluation of children's interfaces. In the study, children's experiences with a variety of computer interfaces were captured in drawings. A group of four researchers participated in the coding of the drawings, before the results were aggregated and statistically analysed. The evaluation of the approach is positive: the chosen drawing method could be used easily and was effective in conveying the user experience; a number of the drawings conveyed information pertaining to user experiences, namely fun (F), goal fit (GF) and tangible magic (TM); and the method was found generally reliable at capturing all three elements and particularly reliable at capturing fun.
Human Factors in Computing Systems | 2013
Janet C. Read; Matthew Horton; Gavin Sim; Peggy Gregory; Daniel Fitton; Brendan Cassidy
When working with children in participatory design activities, ethical questions arise that are not always considered in a standard ethics review. This paper highlights five challenges around the ethics of the value of design and the ethics of children's participation. It presents a new tool, CHECk, that deals with three of these challenges by means of two checklists designed to challenge researchers in CCI and HCI to critically consider the reasons for involving children in design projects and to examine how best to describe design activities so that children can better consent to participate.