Sarah Tazzyman
University of Sunderland
Publication
Featured research published by Sarah Tazzyman.
interaction design and children | 2016
Lynne Hall; Colette Hume; Sarah Tazzyman
This paper focuses on achieving optimal responses through supporting children's judgements, using Smiley Face Likert scales as a rating scale for quantitative questions in evaluations. It highlights the need to provide appropriate methods for children to communicate judgements, noting that the traditional Smiley Face Likert scale does not provide such a method. The paper outlines a range of studies, identifying that, to achieve differentiated data and full use of rating scales by children, faces with positive emotions should be used within Smiley Face Likert scales. The proposed rating method, the Five Degrees of Happiness Smiley Face Likert scale, was used in a large-scale summative evaluation of a Serious Game, resulting in variance within and between children, with all points of the scale used.
Interactive Technology and Smart Education | 2013
Lynne Hall; Susan Jones; Ruth Aylett; Marc Hall; Sarah Tazzyman; Ana Paiva; Lynne P. Humphries
Purpose – This paper aims to briefly outline the seamless evaluation approach and its application during an evaluation of ORIENT, a serious game aimed at young adults. Design/methodology/approach – In this paper, the authors detail an unobtrusive, embedded evaluation approach that occurs within the game context, adding value and entertainment to the player experience whilst accumulating useful data for the development team. Findings – The key result from this study was that during the "seamless evaluation" approach, users were unaware that they had been participating in an evaluation, with instruments enhancing rather than detracting from the in-role game experience. Practical implications – This approach, seamless evaluation, was devised in response to player expectations, perspectives and requirements, recognising that in the evaluation of games the whole process of interaction, including its evaluation, must be enjoyable and fun for the user. Originality/value – Through using seamless evaluation, the authors ...
international conference on human-computer interaction | 2014
Birgit Endrass; Lynne Hall; Colette Hume; Sarah Tazzyman; Elisabeth André
In this paper, we outline the creation of an engaging and intuitive pictorial language as an interaction modality to be used by school children aged 9 to 11 years to interact with virtual characters in a cultural learning environment. Interaction takes place on a touch screen tablet computer linked to a desktop computer on which the characters are displayed. To investigate the benefit of such an interaction style, we conducted an evaluation study to compare the pictorial interaction language with a menu-driven version for the same system. Results indicate that children found the pictorial interaction language more fun and more exciting than the menus, with users expressing a desire to interact for longer using the pictorial interaction language. Thus, we think the pictorial interaction language can help support the children’s experiential learning, allowing them to concentrate on the content of the cultural learning scenario.
human factors in computing systems | 2014
Birgit Endrass; Lynne Hall; Colette Hume; Sarah Tazzyman; Elisabeth André; Ruth Aylett
Providing fun, engaging child-centric approaches to interaction is challenging. The Pictorial Interaction Language was developed for children to communicate and interact with virtual characters in a serious game, MIXER. The design and development of the Pictorial Interaction Language is briefly outlined. Results highlight that children found interacting fun and were highly positive about the Pictorial Interaction Language.
Industry and higher education | 2016
Derek Watson; Lynne Hall; Sarah Tazzyman
This paper reports in part on a major study, carried out in 2013, in which data were collected from university senior executives and academics in the five university business schools in the North East of England: it focuses on the quantitative findings produced. Whilst prima facie evidence would suggest that universities are strategically embedding and integrating third stream strategies alongside first and second stream activities, a critical analysis of the research data revealed that this was clearly not the case. Empirical evidence from the research indicates that there are strategic failings in universities. This research contributes to the existing literature which highlights the academic pressures in embedding the third stream in higher education institutions.
Enfance | 2015
Lynne Hall; Colette Hume; Sarah Tazzyman
Interactive applications designed specifically for children offer great potential for education and play. However, to ascertain that the aims of applications are achieved, child-centred evaluations must be conducted. The design of any evaluation with children requires significant consideration of potential problems with comprehension, cognitive ability, response biases and study attrition. Multidisciplinary R&D project evaluation requirements are often extensive, requiring an all-encompassing and prolonged evaluation design. Discontinuity between the highly engaging interaction experience and the multitude of measures that form the evaluation poses a major issue for the evaluation of interactive applications. In response, we have developed Transmedia Evaluation, a method that aims to maintain engagement throughout the evaluation process. In this paper, the Transmedia Evaluation process is explained and applied to evaluate a learning application for children, MIXER (Moderating Interactions for Cross Cultural Empathic Relationships). Children aged 9-11 (N = 117) used the MIXER application and completed an evaluation battery including pre- and post-test questionnaires, immediate learning assessment and qualitative evaluation. Using Transmedia Evaluation to develop the MIXER evaluation resulted in complete data-sets (100%) for quantitative data (by self-regulated completion) along with rich, high-quality qualitative responses. Transmedia Evaluation transformed the evaluation, with children fully engaging in and enjoying their experience.
adaptive agents and multi agents systems | 2014
Ruth Aylett; Lynne Hall; Sarah Tazzyman; Birgit Endrass; Elisabeth André; Christopher Ritter; Asad Nazir; Ana Paiva; Gert Jan Hofstede; Arvid Kappas
British Journal of Psychology | 2016
John Maltby; Liza Day; Ruth M. Hatcher; Sarah Tazzyman; Heather D. Flowe; Emma J. Palmer; Caren A. Frosch; Michelle O'Reilly; Ceri Jones; Chloe Buckley; Melanie Knieps; Katie Cutts
artificial intelligence in education | 2015
Lynne Hall; Sarah Tazzyman; Colette Hume; Birgit Endrass; Mei Yii Lim; Gert Jan Hofstede; Ana Paiva; Elisabeth André; Arvid Kappas; Ruth Aylett
Archive | 2012
Derek Watson; Lynne Hall; Sarah Tazzyman