Publication


Featured research published by Vanessa Scherman.


Sustainability Science | 2015

Practicing the science of sustainability: the challenges of transdisciplinarity in a developing world context

Toddi A. Steelman; Elizabeth Guthrie Nichols; April L. James; Lori Bradford; Liesel Ebersöhn; Vanessa Scherman; Funke Omidire; David Bunn; Wayne Twine; Melissa R. McHale

Questions related to how we practice sustainability science remain salient in the face of the failure to achieve broad-scale sustainability objectives. Transdisciplinarity is an essential part of sustainability science. Transdisciplinary conceptual scholarship has been more prevalent than empirical scholarship or applications, especially in developing world contexts. In a single case study of a multiyear project addressing water security issues in HaMakuya, South Africa, we used a framework for assessing transdisciplinary objectives to facilitate more systematic learning for those who practice sustainability science. We found that defining the problem and assembling our team were easier than the co-creation of solution-oriented knowledge and the reintegration and application of this new knowledge. Our single case study speaks to the potential challenges related to building relationships and co-creating knowledge in an epistemologically diverse setting. Other case studies appear to have negotiated these issues in developing country contexts, which leaves room for further investigation of how to practice transdisciplinarity under these conditions.


Archive | 2013

Approaches to Effective Data Use: Does One Size Fit All?

Elizabeth Archer; Vanessa Scherman; Sarah J. Howie

This chapter describes the experiences from the most recent phase of a 7-year research project in South Africa on school-based monitoring of pupil performance in some 22 primary schools. The project aimed to generate knowledge as well as to design and develop a well-functioning feedback system to provide data to schools on learner performance. The feedback system that was developed is known as the South African Monitoring system for Primary schools (SAMP). A key objective of this phase of the project was to evaluate the use of the performance data at school and classroom level and to design an intervention for effective use of the data within the primary school environment. It is hoped that a deeper understanding of how data travel in schools (data paths) and how schools can appropriately use data may assist policymakers in developing monitoring policies and provide guidance to school leaders and teachers. This chapter focuses on the data generated through observations, journals, and interviews in the evaluation of one of these design cycles. The sample consists of three schools participating in SAMP that were purposefully selected. The evaluation data collected during this cycle of development focused particularly on how data were used by schools and how data moved within the schools. Three distinct approaches to data use that appeared to be appropriate for their specific contexts (schools) were identified: Team, Cascade, and Top-down. The data suggest that the most appropriate and effective approach of use may depend on the culture of the school, school leadership approach, level of teacher development, and context and level of functioning of the school. There are, however, certain commonalities in the approaches to effective data use. An effective feedback system should thus try to establish or encourage these conditions for data use. 
The data in this chapter seem to suggest that policy on data use should be flexible and provide exemplars of various possible approaches, which are appropriate for different contexts. It is important that there are layers of sophistication (different levels of detail, complexity of presentation, and disaggregation) within the data, which the school can access as needed for its particular milieu.


African Journal of Research in Mathematics, Science and Technology Education | 2012

Development of a model of effectiveness in science education to explore differential science performance: a case of South Africa

Mee-Ok Cho; Vanessa Scherman; Estelle Gaigher

This paper reports on a secondary analysis of TIMSS 2003 data, based on a sound conceptual model, aiming to explain differential science achievement in South Africa from the perspective of educational effectiveness research. The conceptual framework was developed by refining existing school effectiveness models and including factors related to science achievement. The refined model integrated psychological and sociological aspects and reflected the multilevel structure of schools. The model added resources and climate to the quality factors at class/school level. It was applied to the South African results of TIMSS 2003. Data from the student (n = 8,952), teacher (n = 255) and school questionnaires (n = 255) were analysed in conjunction with achievement data by means of factor, reliability, correlation and multilevel analysis. The multilevel analysis revealed that at student level the strongest predictor of science achievement is attitude towards science. At classroom/school level, the strongest predictors are resource- and climate-related factors such as safety in school, physical resources and class size. Factors at class/school level influenced performance more than student-level factors, with 59% of the total variance in science achievement occurring at class/school level. Such results indicate that the model developed is well suited to science education in developing countries.
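The variance partition reported in this abstract (59% of the variance at class/school level) is the intraclass correlation from the multilevel analysis. A minimal sketch of that quantity (not the authors' code; the class data below are invented for illustration):

```python
# Intraclass correlation (ICC): the share of score variance that sits
# between classes rather than between pupils within a class.
from statistics import mean, pvariance

# Hypothetical science scores grouped by class
classes = {
    "A": [35, 40, 38, 42],
    "B": [55, 60, 58, 57],
    "C": [45, 48, 50, 47],
}

class_means = {c: mean(s) for c, s in classes.items()}
grand = mean(class_means.values())

# Between-class variance: spread of class means around the grand mean
var_between = pvariance(class_means.values(), grand)
# Within-class variance: average spread of pupils around their class mean
var_within = mean(pvariance(s, class_means[c]) for c, s in classes.items())

icc = var_between / (var_between + var_within)
print(f"ICC = {icc:.2f}")  # proportion of variance at class level
```

A high ICC, as in the TIMSS result above, signals that which class/school a pupil attends matters more than pupil-level differences.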


Educational Research and Evaluation | 2011

Constructing benchmarks for monitoring purposes: evidence from South Africa

Vanessa Scherman; Sarah J. Howie; Roel Bosker

In information-rich environments, schools are often presented with a myriad of data from which decisions need to be made. The use of the information at classroom level may be facilitated if performance could be described in terms of levels of proficiency or benchmarks. The aim of this article is to explore benchmarks using data from a monitoring system at secondary school level. Seventeen secondary schools, purposively sampled for maximum variation, participated in this project. Pupils from a random sample of two Grade 8 classes per school completed the assessments (n = 1,706). Using a dichotomous Rasch model, person-item distribution maps were generated for mathematics, and different difficulty levels were constructed from the items that corresponded to the ability levels of pupils as well as the reporting protocol of the National Department of Education. Implications for benchmarking and standard setting are discussed based on the results provided in the article.
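The person-item distribution maps in this study rest on the dichotomous Rasch model, in which the probability of a correct response depends only on the gap between pupil ability and item difficulty on a shared logit scale. A minimal sketch (not the authors' code):

```python
import math

def rasch_probability(theta: float, b: float) -> float:
    """P(correct) for a pupil of ability theta on an item of difficulty b,
    under the dichotomous Rasch model; both are on the same logit scale."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty the success probability is exactly 0.5;
# this shared scale is what lets a person-item map place pupils and
# benchmark items side by side.
print(rasch_probability(0.0, 0.0))               # 0.5
print(round(rasch_probability(1.0, -0.5), 3))    # 0.818
```

Benchmarks can then be read off the map as the difficulty levels at which pupils of a given ability have a 50% chance of success.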


SAGE Open | 2018

Multiple Imputation for Dichotomous MNAR Items Using Recursive Structural Equation Modeling With Rasch Measures as Predictors

Celeste Combrinck; Vanessa Scherman; David J.F. Maree; Sarah J. Howie

Missing Not at Random (MNAR) data present challenges for the social sciences, especially when combined with Missing Completely at Random (MCAR) data for dichotomous test items. Missing data on a Grade 8 Science test for one school out of seven could not be excluded as the MNAR data were required for tracking learning progression onto the next grade. Multiple imputation (MI) was identified as a solution, and the missingness patterns were modeled with IBM Amos applying recursive structural equation modeling (SEM) for 358 cases. Rasch person measures were utilized as predictors. The final imputations were done in SPSS with logistic regression MI. Diagnostic checks of the imputations showed that the structure of the data had been maintained, and that differences between MNAR and non-MNAR missing data had been accounted for in the imputation process.
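The imputation step described above draws plausible 0/1 item responses from a logistic model with the Rasch person measure as a predictor. A toy sketch of that idea (the coefficients and the single-predictor form are invented for illustration; the study estimated its models in IBM Amos and ran the imputations in SPSS):

```python
import math
import random

def impute_item(person_measure, coef=1.2, intercept=-0.3, rng=None):
    """Draw one imputed 0/1 response from a logistic model whose predictor
    is the pupil's Rasch person measure. The coefficients here are made up;
    in practice they would be estimated from the observed responses."""
    p = 1.0 / (1.0 + math.exp(-(intercept + coef * person_measure)))
    rng = rng or random
    return 1 if rng.random() < p else 0

# Multiple imputation keeps several random draws per missing value so that
# later analyses reflect the uncertainty of the imputation itself.
rng = random.Random(0)
draws = [impute_item(0.8, rng=rng) for _ in range(5)]
print(draws)
```

Diagnostic checks, as in the study, would then compare the distribution of imputed responses against the observed ones to confirm the data structure was preserved.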


International Journal of Multiple Research Approaches | 2018

Editors’ Introduction to the Mixed Methods Manifesto Inaugural Special Issue

Anthony J. Onwuegbuzie; John H. Hitchcock; R. Burke Johnson; Brigitte Smit; Vanessa Scherman; Donggil Song

Author affiliations: (a) Department of Educational Leadership, Sam Houston State University, TX, USA and Department of Educational Leadership and Management/Department of Educational Psychology, University of Johannesburg, Johannesburg, South Africa; (b) Department of Instructional Systems Technology, Center for Evaluation and Education Policy, Indiana University Bloomington, IN, USA; (c) Department of Counseling and Instructional Sciences, College of Education and Professional Studies, University of South Alabama, Mobile, AL, USA; (d) Department of Educational Leadership and Management, College of Education, University of South Africa, Pretoria, South Africa; (e) Department of Psychology of Education, University of South Africa, Pretoria, South Africa; (f) Department of Computer Science, Sam Houston State University, TX, USA


South African Journal of Psychology | 2017

Evaluating anchor items and reframing assessment results through a practical application of the Rasch Measurement Model

Celeste Combrinck; Vanessa Scherman; David J.F. Maree

The monitoring of learning over time is critical for determining progression within and across cohorts of learners. This research investigated the use of the Rasch Measurement Model to determine the functioning of anchor items as well as an application of the model to convert the results to the same metric. A group of 321 Grade 8 learners and the same in the following school year wrote English Additional Language Comprehension Tests aimed at monitoring learning progression over years. The two tests were linked with 15 anchor items. This study examined the results of the anchor items from Years 1 and 2, applying non-parametric statistical tests as well as the Rasch Partial Credit Model to identify items which did not contribute to monitoring learning progression; these items were removed or refined based on the results and reviews by subject specialists. Learner results from Grades 8 and 9 were placed in the same frame of reference by applying the Rasch Partial Credit Model in order to establish a more accurate representation of the magnitude of learning progression. The first finding illustrated that applying non-parametric statistics and Rasch Measurement Theory identifies potentially problematic anchor items, and that when items are improved or removed, the overall results tend to be more stable and precise. Second, it was found that when applying Rasch item and threshold calibrations to assessment results, a more accurate indication of learning progression is obtained which can be used to communicate results to stakeholders and more importantly, inform teaching and learning.
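Placing the two years' results in the same frame of reference via anchor items can be sketched with a simple mean-shift linking of the anchors' difficulty estimates, a common Rasch equating step (the values below are invented; the study itself applied the Partial Credit Model with item and threshold calibrations):

```python
from statistics import mean

# Anchor-item difficulties (logits) estimated separately in each year.
# The same 15 anchors appeared in both tests; five invented ones shown here.
anchors_y1 = [-1.2, -0.4, 0.1, 0.8, 1.5]
anchors_y2 = [-0.9, -0.1, 0.4, 1.1, 1.8]

# The same items should have the same difficulty, so the average
# difference between calibrations estimates the offset between scales.
shift = mean(b1 - b2 for b1, b2 in zip(anchors_y1, anchors_y2))

def to_year1_metric(theta_y2: float) -> float:
    """Place a Year-2 person measure on the Year-1 metric."""
    return theta_y2 + shift

print(round(shift, 2))                 # -0.3
print(round(to_year1_metric(0.5), 2))  # 0.2
```

Once both cohorts sit on one metric, the difference between a pupil's Year-1 and Year-2 measures is a direct estimate of learning progression rather than an artefact of separately scaled tests.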


Archive | 2017

The Role of Monitoring in Enhancing the Quality of Education

Vanessa Scherman; Roel Bosker

“Nobody is against quality, so of course everyone is in favour of assuring quality”: a remark made in the opening chapter of Carol Taylor Fitz-Gibbon's book Monitoring the Quality of Education (1996, p. 3).


Archive | 2017

Monitoring and School Self-Evaluation

Vanessa Scherman; William Fraser

School success has often been thought of in terms of achievement. Emphasis has also been placed on the tools used to monitor the progress of pupils in order to ensure achievement (Safer & Fleischman, 2005).


Archive | 2017

Monitoring Systems for the Future

Sarah J. Howie; Vanessa Scherman

Increasingly the purpose of monitoring education systems is to evaluate achievement progress across subjects in schooling in response to global calls for improving quality of education for all (Howie, 2013; UNESCO, 2012).

Collaboration

Vanessa Scherman's top co-authors.

Brigitte Smit
University of South Africa

Elizabeth Archer
University of South Africa

Roel Bosker
University of Groningen

Lisa Zimmerman
University of South Africa

John H. Hitchcock
American Institutes for Research

R. Burke Johnson
University of South Alabama