Jenny Lenkeit
University of Oxford
Publication
Featured research published by Jenny Lenkeit.
International Journal of Research & Method in Education | 2012
Daniel H. Caro; Jenny Lenkeit
The paper draws on the work of Willms [2006. Learning divides: Ten policy questions about the performance and equity of schools and schooling systems. Montreal: UNESCO Institute for Statistics] to present an analytical approach to the study of academic achievement disparities related to family socioeconomic status. The approach is illustrated by evaluating 10 hypotheses with two-level and three-level hierarchical linear models using data from the Progress in International Reading Literacy Study 2006. For each hypothesis, the underlying theory, statistical model, and critical model test are presented and the results are discussed. The analytical approach can be generalized to other studies and data sets. The results help explain how inequalities are configured at the within-country and between-country levels.
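To make the modelling approach concrete, the following is a minimal sketch of a two-level hierarchical linear model of the kind described above, with students nested in schools and reading achievement regressed on family socioeconomic status. It is not the authors' code: the data are simulated, the variable names are invented, and Python's statsmodels is assumed as the estimation library purely for illustration.

# Minimal sketch (illustrative only, not the authors' code): students (level 1)
# nested in schools (level 2), reading achievement regressed on family SES,
# with a random intercept and a random SES slope across schools.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, n_students = 50, 30
school_id = np.repeat(np.arange(n_schools), n_students)
ses = rng.normal(size=n_schools * n_students)            # standardised student SES (simulated)
school_effect = rng.normal(scale=20, size=n_schools)     # simulated school intercepts
reading = 500 + 25 * ses + school_effect[school_id] + rng.normal(scale=60, size=ses.size)
df = pd.DataFrame({"reading": reading, "ses": ses, "school_id": school_id})

# Two-level model: fixed SES gradient plus school-level variation in intercept and slope.
model = smf.mixedlm("reading ~ ses", data=df, groups=df["school_id"], re_formula="~ses")
result = model.fit()
print(result.summary())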
School Effectiveness and School Improvement | 2013
Jenny Lenkeit
Educational effectiveness research often appeals to value-added models (VAM) to gauge the impact of schooling on student learning net of the effect of student background variables. Many cross-sectional studies do not, however, meet the VAM requirement for longitudinal data. Contextualised attainment models (CAM) measure the influence of schools on student outcomes in cross-sectional studies by controlling for family background characteristics, which are argued to be adequate substitutes for students' prior attainment. Drawing on data from a 3-point longitudinal study in the city of Berlin, Germany (n = 3,074), the reading and mathematics achievement of primary students is investigated to assess school effectiveness measures. Estimates are compared for a 3-level growth curve analysis (VAM), a hierarchical linear model controlling for background characteristics (CAM), and one additionally controlling for prior achievement scores (prior attainment model). The article contributes to the enhancement of a feedback culture for cross-sectional study results.
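In simplified two-level notation (illustrative, not the paper's exact specification), the contrast between two of the model families compared here can be sketched as follows, for student i in school j, with the estimated school residual read as the school effectiveness measure:

\text{CAM:}\qquad y_{ij} = \beta_0 + \beta_1\,\mathrm{SES}_{ij} + u_j + e_{ij}
\text{Prior attainment model:}\qquad y_{ij,t} = \beta_0 + \beta_1\,\mathrm{SES}_{ij} + \beta_2\,y_{ij,t-1} + u_j + e_{ij}

The VAM, by contrast, models students' growth across the three measurement points directly, with school-level variation in the growth terms.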
Scandinavian Journal of Educational Research | 2018
Therese N. Hopfenbeck; Jenny Lenkeit; Yasmine El Masri; Kate Cantrell; Jeanne Ryan; Jo-Anne Baird
International large-scale assessments are on the rise, with the Programme for International Student Assessment (PISA) seen by many as having strategic prominence in education policy debates. The present article reviews PISA-related English-language peer-reviewed articles from the programme's first cycle in 2000 to its most recent in 2015. Five literature bases were searched, and results were analysed with SPSS. Results map the frequency of publications according to journal, country, and scientific discipline. They also summarise major themes within three identified categories: secondary analysis, policy impact, and critiques. Findings indicated that studies based on the PISA dataset have led to progress in educational research while simultaneously pointing to the need for caution when using this research to inform educational policy.
Educational Research and Evaluation | 2014
Jenny Lenkeit; Daniel H. Caro
Reports of international large-scale assessments tend to evaluate and compare education system performance based on absolute scores, and policymakers refer to high-performing and economically prosperous education systems when seeking to enhance their own systemic features. Socioeconomic differences between systems, however, compromise the plausibility of those comparisons and references. The paper applies conceptual and methodological approaches from educational effectiveness research to investigate how effectively education systems perform and how effectively they change their performance over time by accounting for socioeconomic differences between systems and cohorts (assessment cycles). Data from 4 cycles of the Programme for International Student Assessment (PISA) are analysed. Results indicate that the quality of systems is evaluated differently if assessed by absolute performance scores or effectiveness measures. The study contributes to methodological developments of effectiveness research in international large-scale assessments and provides relevant information for policymakers to look further into policies, structures, and reform measures that have favoured effectiveness.
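One plausible way to formalise an effectiveness measure of this kind, sketched here for illustration rather than as the paper's exact specification, is to regress mean performance on system-level socioeconomic composition and read the residual as the system's effectiveness:

\bar{y}_k = \gamma_0 + \gamma_1\,\overline{\mathrm{SES}}_k + \delta_k, \qquad \widehat{\mathrm{effectiveness}}_k = \hat{\delta}_k, \qquad k = \text{education system}.

A system then counts as effective when it performs better than expected given its socioeconomic context, which is how a ranking by effectiveness can diverge from one based on absolute scores.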
Educational Research and Evaluation | 2015
Jenny Lenkeit; Daniel H. Caro; Steve Strand
In England, students with an immigrant background exhibit lower educational attainment than those without. Family socioeconomic status (SES) helps explain differences in educational attainment, but a gap remains that differs in size for students with different immigrant backgrounds. While the explanatory repertoire for the remaining gap is broad, whether family SES constructs are measured equivalently across students with different immigrant backgrounds has not been comprehensively investigated. Using data from the first wave of the Children of Immigrants Longitudinal Survey in Four European Countries (CILS4EU) for England (n = 4,315), the paper applies exploratory structural equation modelling (ESEM) to evaluate measurement invariance of family background constructs across students without and with immigrant background, specifically Pakistani/Bangladeshi immigrant background. Results suggest differences in the structure of family SES indicators across groups and in their association with educational attainment. Complementary variables are suggested to enhance family SES indicators. Findings are relevant to researchers investigating educational inequalities related to immigrant background.
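As a generic illustration of what measurement invariance testing involves here (the notation is not taken from the paper), a multi-group measurement model for a vector of SES indicators x observed for student i in group g can be written as

x_{ig} = \nu_g + \Lambda_g\,\xi_{ig} + \varepsilon_{ig},

and invariance is assessed by progressively constraining parameters across groups: configural invariance requires only the same loading pattern, metric invariance sets \Lambda_g = \Lambda, and scalar invariance additionally sets \nu_g = \nu. Only when such constraints hold can group differences in the latent SES factor and its association with attainment be compared meaningfully.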
Assessment in Education: Principles, Policy & Practice | 2018
Jenny Lenkeit; Knut Schwippert
International large-scale assessments (ILSAs) are an influential instrument for assessing and evaluating quality and equity across education systems and informing educational policies. The data collected in studies such as the Programme for International Student Assessment (PISA), the Progress in International Reading Literacy Study (PIRLS) and the Trends in International Mathematics and Science Study (TIMSS) provide ample opportunity to investigate a broad range of research questions that are most often related to student attainment in specific subjects. In addition, the data enable researchers to examine topics related, for instance, to the impact of teacher and instructional characteristics on learning and attitudes towards learning, educational inequalities and the effectiveness of school and systemic characteristics (Hopfenbeck et al., 2017). The internationally comparative context in which these questions can be raised and addressed allows us to advance our understanding of the differences and similarities in the learning and teaching environments across national, cultural and regional settings around the world. ILSAs can also contribute to a better understanding of the structures and mechanisms of education systems within national borders (Schwippert & Lenkeit, 2012). This is particularly true in countries with little or no experience of national assessments.

The assessment of reading has been at the forefront of ILSAs (e.g. Reading Literacy Study 1990–1991; PIRLS 2001-ongoing, PISA 2000-ongoing), not least because it is seen as one of the most important cultural techniques in our modern world. Reading skills are not only used to acquire knowledge in schools but, more importantly, are an ability that guides individuals throughout their lives (Motiejunaite, Noorani, & Monseur, 2014). Linguistic patterns, structures and the processes of their acquisition differ considerably across languages, however, and the assessment of reading in an international context has therefore also triggered critical examinations of the underlying cognitive-psychological concepts associated with the learning processes (Asil & Brown, 2016; Evans & Levinson, 2009; Grisay, Gonzalez, & Monseur, 2009). Language influences on item difficulties and test results have been documented for subjects such as science as well (e.g. El Masri, Baird, & Graesser, 2016), but the assessment of abstract mathematical and scientific concepts may overall be less affected by their translation into different languages, as their acquisition follows similar cognitive processes.

Recent studies have also shown that ILSA data are used more frequently in mathematics and science education research than in research on the teaching of reading and language (Hopfenbeck et al., 2017; Lenkeit, Chan, Hopfenbeck, & Baird, 2015). With few exceptions, reading research does not seem to utilise ILSA data or to publish such analyses in subject-specific scientific journals. This is despite the fact that studies such as PIRLS offer a vast amount of information about students' performance in different reading purposes and processes, all of which can be clearly assigned to individual items. As such, research aiming to enhance our understanding of the effects of the teaching environment and instructional approaches on learning and skill development has so far been more influential for the science and mathematics domains than for reading in ILSAs.
Assessment in Education: Principles, Policy & Practice | 2017
Jenny Lenkeit; Knut Schwippert; Michel Knigge
Research provides evidence that gender, immigrant background and socio-economic characteristics constitute multiple disadvantaging factors whose relative importance and configurations change over time. When evaluating inequalities, researchers tend to focus on one particular aspect and often use composite measures of socio-economic characteristics. Neither approach can fully represent the complexity of students' various disadvantaging characteristics, which have autonomous associations with attainment and with each other. This paper investigates how the relative importance and configurations of different disadvantaging factors have changed over time to form educational inequalities and how these changes differ across countries. Data from five PISA cycles (2000–2012) for France, Germany, Sweden and the United Kingdom are used, and configurations of gender, immigrant background, parents' occupational and educational levels, and the number of books at home are evaluated. Results enable us to relate changes (or lack thereof) in configurations of disadvantaging factors to recent reforms targeted at reducing educational inequality after the first PISA results.
Studies in Educational Evaluation | 2009
Daniel H. Caro; Jenny Lenkeit; Rainer Lehmann; Knut Schwippert
Studies in Educational Evaluation | 2016
Daniel H. Caro; Jenny Lenkeit; Leonidas Kyriakides
Educational Research Review | 2015
Jenny Lenkeit; Jessica Chan; Therese N. Hopfenbeck; Jo-Anne Baird