Publication


Featured research published by Jim S Tognolini.


Australian Journal of Education | 2007

Standards-Based Assessment: A Tool and Means to the Development of Human Capital and Capacity Building in Education

Jim S Tognolini; Gordon Stanley

This paper outlines a model for giving meaning to student achievement by referencing assessment to standards of student learning. This effectively shifts the focus in assessment from rank ordering students (comparing their performance purely with each other) to monitoring growth or progress and measurement. More specifically, it introduces standards-based assessment: the concept and the theory. It considers how such systems operate and provides some possible strategies for implementation. Finally, it shows how such systems can significantly affect human capital and capacity building in education.
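To make the contrast with rank ordering concrete, a minimal sketch of standards-referenced reporting is given below. The band labels and cut-offs are invented for illustration and are not drawn from the paper; the point is only that a score is interpreted against a fixed standard rather than against other students' positions in a rank order.

```python
# Minimal sketch of standards-referenced reporting (hypothetical bands).
# The cut-offs and labels are illustrative assumptions, not values from
# the paper: a score is mapped to a described standard, not to a rank.
BANDS = [
    (90, "Band 6: outstanding achievement of the standard"),
    (75, "Band 5: high achievement of the standard"),
    (60, "Band 4: sound achievement of the standard"),
    (45, "Band 3: basic achievement of the standard"),
    (0,  "Band 2/1: below the expected standard"),
]

def band_for(score: float) -> str:
    """Map an achievement score (0-100) to its standards band."""
    for cut_off, label in BANDS:
        if score >= cut_off:
            return label
    return BANDS[-1][1]

if __name__ == "__main__":
    print(band_for(68))   # meaning comes from the standard, not the rank
    print(band_for(92))
```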


Journal of Applied Research in Higher Education | 2015

Measuring attitudes toward plagiarism: issues and psychometric solutions

John F Ehrich; Steven J Howard; Jim S Tognolini; Sahar Bokosmaty

Purpose – The purpose of this paper is to address the failure to psychometrically test questionnaire instruments used to measure university students' attitudes towards plagiarism. These issues are highlighted by a psychometric evaluation of a commonly used (but previously untested) plagiarism attitudinal scale.

Design/methodology/approach – The importance of psychometric testing is shown through an analysis of a commonly used scale using modern techniques (e.g. Rasch analysis) with 131 undergraduate education students at an Australian university.

Findings – Psychometric analysis revealed the scale to be unreliable in its present form. However, when reduced to an eight-item subscale it became marginally reliable.

Research limitations/implications – The main implication of this paper is that questionnaire instruments cannot be assumed to function as intended without thorough psychometric testing.

Practical implications – The paper offers valuable insight into the psychometric properties of a ...
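For readers unfamiliar with what a Rasch analysis estimates, a minimal sketch of the dichotomous Rasch model is given below. An attitudinal scale such as the one evaluated here would typically be analysed with a polytomous (rating-scale or partial-credit) formulation, which is not reproduced; the item difficulties and person measure in the sketch are assumed values, not results from the paper.

```python
import numpy as np

def rasch_prob(theta: float, b: np.ndarray) -> np.ndarray:
    """Dichotomous Rasch model: P(X = 1) = exp(theta - b) / (1 + exp(theta - b)),
    where theta is the person measure and b the item difficulty, both in logits."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Assumed item difficulties (logits) for five hypothetical scale items
# and an assumed person measure of 0.5 logits.
item_difficulties = np.array([-1.5, -0.5, 0.0, 0.8, 1.6])
theta = 0.5

probs = rasch_prob(theta, item_difficulties)
print("Endorsement probabilities:", np.round(probs, 3))
print("Expected raw score:", round(float(probs.sum()), 2))
```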


Assessment in Education: Principles, Policy & Practice | 2012

Establishing and Applying Performance Standards for Curriculum-Based Examinations.

John Bennett; Jim S Tognolini; Samantha Pickering

This paper describes how a state education system in Australia introduced standards-referenced assessments into its large-scale, high-stakes, curriculum-based examinations in a way that enables performance to be compared across time even though the examinations differ from year to year. It describes the multi-stage modified Angoff standard-setting procedure used to establish cut-off scores on subject examinations, and how the results of this exercise were then used to develop standards packages that illustrate the performances of students at the borders between the various bands. The paper also explains that a Rasch measurement model was originally intended to generate the statistical feedback used in the standard-setting procedure, and describes the modifications to the feedback that were necessary to meet the real-time constraints of this large-scale examination programme. It argues that consideration should now be given to using the Rasch model to provide this feedback in place of the current approach.
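The arithmetic at the heart of a single Angoff round is simple, and a minimal sketch is shown below: each judge estimates the probability that a borderline student would answer each item correctly, and the recommended cut-off is the sum of the per-item means. The ratings are invented for illustration; the paper's multi-stage procedure adds statistical feedback between rounds, which is not modelled here.

```python
import numpy as np

# Hypothetical Angoff ratings: rows are judges, columns are items.
# Each entry is a judge's estimate of the probability that a student on
# the border of the performance band answers the item correctly.
ratings = np.array([
    [0.60, 0.45, 0.80, 0.30, 0.70],
    [0.55, 0.50, 0.75, 0.35, 0.65],
    [0.65, 0.40, 0.85, 0.25, 0.75],
])

item_means = ratings.mean(axis=0)   # expected borderline performance per item
cut_off = item_means.sum()          # recommended cut-off on the raw-score scale

print("Per-item borderline expectations:", np.round(item_means, 2))
print("Recommended cut-off score:", round(float(cut_off), 2))
```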


Archive | 2012

Assessment, Standards-Referencing and Standard Setting

Jim S Tognolini; Michelle Davidson

This chapter describes a model for giving meaning to student performance by referencing it to standards. This effectively shifts the focus in assessment from notions of rank ordering students (comparing their performance purely to each other) to those of monitoring growth or progress and measurement. More specifically, it introduces standards-referenced assessment: the concept and theory.


Australian Journal of Education | 2017

Evaluating the validity of the online multiliteracy assessment tool

Kellie Buckley-Walker; Jim S Tognolini; Lori Lockyer; Ian M Brown; Peter Caputi

This study aims to assess the validity of the Online Multiliteracy Assessment for students in Years 5 and 6. The Online Multiliteracy Assessment measures students' ability to make and create meaning using a variety of modes of communication, such as text, audio and video. The study involved two groups of students: the first group (n = 19) was used in two pilot studies of the items, and the second (n = 299) was used in a field trial that validated the functioning of the items and assessed the quality of the scale. The results indicated that the Online Multiliteracy Assessment has acceptable test–retest reliability; however, the fit to the Rasch model was less than ideal. Further investigation identified two important areas for improvement. First, the items assessing the higher-order skills of synthesising, communicating and creating need to be more cognitively demanding. Second, some items need to be modified to improve their functionality.
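As a point of reference for the reliability claim, test–retest reliability is conventionally reported as the correlation between two administrations of the same instrument to the same students. A minimal sketch with invented scores follows; the study's own data are not reproduced here.

```python
import numpy as np

# Invented scores from two administrations of the same assessment to the
# same students, purely to illustrate the test-retest calculation.
time_1 = np.array([12, 18, 15, 22, 9, 17, 20, 14])
time_2 = np.array([13, 17, 16, 21, 10, 18, 19, 15])

# Test-retest reliability: Pearson correlation between the two occasions.
r = np.corrcoef(time_1, time_2)[0, 1]
print(f"Test-retest reliability (Pearson r): {r:.2f}")
```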


Research Developments | 2006

Australian Certificate of Education: exploring a way forward

Geoff N Masters; Margaret Forster; Gabrielle Matters; Jim S Tognolini


Archive | 2006

Meeting the challenge of assessing in a standards based education system

Jim S Tognolini


Archive | 2001

Generic versus content-driven assessment

Jim S Tognolini


Asia Pacific Journal of Educational Development (APJED) | 2014

Personal Best Goal and Self-Regulation as Predictors of Mathematics Achievement: A Multilevel Structural Equation Model

Magdalena Mo Ching 莫慕貞 Mok; Michael Ying Wah Wong; Michael Ronald Su; Jim S Tognolini; Gordon Stanley


Archive | 2010

Multimodality, Multiliteracy and Visual Literacy: Where does assessment fit?

Ian M Brown; Lori Lockyer; Peter Caputi; Jim S Tognolini

Collaboration


Dive into Jim S Tognolini's collaborations.

Top Co-Authors

Ian M Brown
University of Wollongong

Peter Caputi
University of Wollongong

Michael Ying Wah Wong
Hong Kong Institute of Education

John Bennett
University of New South Wales