Publications


Featured research published by Eunice Eunhee Jang.


Journal of Mixed Methods Research | 2008

Integrative Mixed Methods Data Analytic Strategies in Research on School Success in Challenging Circumstances.

Eunice Eunhee Jang; Douglas McDougall; Dawn Pollon; Monique Herbert; Pia Russell

Dealing with data from mixed methods research studies poses both conceptual and practical challenges, and there is a need for discussion of various integrative strategies for mixed methods data analysis. This article illustrates integrative analytic strategies for a mixed methods study focused on improving urban schools facing challenging circumstances. The research was conducted using a concurrent mixed methods approach. The qualitative and quantitative strands of data were first analyzed independently, through thematic analysis of the qualitative data and factor analysis of the survey data, followed by integrative data analytic procedures. These included parallel integration for member checking, data transformation for comparison, data consolidation for emergent themes, and case analysis for fine-grained descriptions of school profiles. The integrative analysis process highlighted the iterative nature of mixing data sources at various points and allowed the researchers to attend to emergent insights made available through mixed methods research.
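
As a concrete illustration of the "data transformation for comparison" strategy described above, the sketch below quantitizes coded qualitative themes so they can be correlated with survey-derived factor scores. The school IDs, theme labels, and factor scores are hypothetical placeholders, not data from the study.

```python
# A minimal sketch of quantitizing qualitative codes for comparison with a
# quantitative strand. All data below are hypothetical illustrations.
import pandas as pd

# Hypothetical thematic codes assigned to interview excerpts, one row each.
codes = pd.DataFrame({
    "school": ["A", "A", "B", "B", "B", "C", "C"],
    "theme":  ["shared_leadership", "data_use", "shared_leadership",
               "shared_leadership", "data_use", "data_use", "data_use"],
})

# Quantitize: count theme occurrences per school (qualitative -> numeric).
theme_counts = pd.crosstab(codes["school"], codes["theme"])

# Hypothetical factor scores from the survey strand's factor analysis.
factor_scores = pd.Series({"A": 0.8, "B": 1.3, "C": -0.4},
                          name="collaboration_factor")

# Integrative step: correlate the quantitized themes with the factor scores.
print(theme_counts.corrwith(factor_scores))
```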


Educational Assessment | 2016

Advances in the Science of Assessment

Valerie J. Shute; Jacqueline P. Leighton; Eunice Eunhee Jang; Man-Wai Chu

Designing, developing, and administering assessments have remained largely unchanged over the past century. However, recent developments in instructional technology and learning science theory, along with advances in assessment design, necessitate a fresh perspective on assessment. The objective of the present article is to review the topic of assessment in depth: past, present, and future. Specifically, we focus on the use of technologically rich learning environments that have spurred advances in student assessment, the new methods and procedures emerging from these advances, and consequently the need to implement comprehensive assessment systems that provide rigorous and ubiquitous measurement of the whole student learning experience.


Language Assessment Quarterly | 2009

Demystifying a Q-Matrix for Making Diagnostic Inferences About L2 Reading Skills

Eunice Eunhee Jang

In researching the potential of cognitive diagnostic assessment, researchers concur that the quality of diagnostic inferences is subject to the extent to which the construct representations based on cognitive skills are theoretically compelling, empirically sound, and relevant to test use. In this paper, I argue that the construction of a Q-matrix requires multiple sources of evidence supporting the representation of the construct with well-defined cognitive skills and their explicit links to item characteristics. I illustrate the process of constructing and refining a Q-matrix using the results from a large-scale study that examined the validity of applying cognitive diagnostic assessment approaches to LanguEdge reading comprehension tests. I focus on the characteristics of reading skills identified from verbal protocols along with the analyses of text and items, and discuss issues related to identifying reading skills and determining the granularity for diagnostic inferences. By demonstrating the process of refining a Q-matrix, I discuss fundamental issues arising from the application of cognitive diagnostic assessment through retrofitting. I believe the paper provides useful guidelines for those interested in designing a systematic cognitive diagnostic assessment for L2 reading comprehension abilities.
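
To make the notion of a Q-matrix concrete, here is a minimal sketch of a binary item-by-skill matrix and the conjunctive (DINA-style) ideal responses it implies. The skill labels, item-skill assignments, and mastery vector are hypothetical, not the LanguEdge specifications developed in the study.

```python
# A minimal, hypothetical Q-matrix: rows are items, columns are skills, and
# Q[i, k] = 1 means item i requires skill k.
import numpy as np

skills = ["vocabulary", "inference", "summarizing"]  # hypothetical skill set
Q = np.array([
    [1, 0, 0],   # item 1 taps vocabulary only
    [1, 1, 0],   # item 2 taps vocabulary and inference
    [0, 0, 1],   # item 3 taps summarizing
])

# Under a conjunctive (DINA-style) model, the ideal response to an item is 1
# only if every skill the Q-matrix assigns to that item is mastered.
alpha = np.array([1, 0, 1])  # hypothetical mastery vector
ideal = np.all(Q <= alpha, axis=1).astype(int)
print(ideal)  # -> [1 0 1]: item 2 fails because inference is not mastered
```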


Language Testing | 2015

How do young students with different profiles of reading skill mastery, perceived ability, and goal orientation respond to holistic diagnostic feedback?

Eunice Eunhee Jang; Maggie Dunlop; Gina Park; Edith Van der Boom

One critical issue with cognitive diagnostic assessment (CDA) lies in the lack of research evidence showing how diagnostic feedback from CDA is interpreted and used by young students. This mixed methods study examined how holistic diagnostic feedback (HDF) is processed by young learners with different profiles of reading skills, goal orientations, and perceived ability. HDF provides three learner profiles: current skill mastery levels, self-assessed skill proficiency, and goal orientations; it also includes a section for planning future learning. A total of 44 Grade 5 and 6 students (aged 11–12) from two classrooms, together with their parents and teachers, received individually customized HDF reports. Students' reading skill mastery profiles were determined by applying cognitive diagnostic modeling to their performance on a provincial reading achievement measure, while their perceived ability and goal orientation profiles were created using self-assessment and goal-orientation questionnaires. Students and parents provided written responses to their HDF reports. The findings show the dynamic influence of young students' profiles on the ways in which they perceive, interpret, and use HDF. Students' responses to diagnostic feedback did not differ substantially across reading mastery levels; rather, psychological factors most strongly shaped how effectively learners processed feedback. Furthermore, it was not students' actual goal orientations but their perceived parent goal orientations that showed significant relationships with their skill mastery levels, which strongly suggests that young students' responses to HDF are influenced by broader learning environments, and that such influences are filtered through students' own perceptions. Understanding students' interactions with diagnostic feedback is critical for maximizing its effect, because their perceptions of ability and orientations to learning strongly influence how they process it.
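
As a sketch of what applying a cognitive diagnostic model to item responses can look like, the code below classifies a student's skill mastery profile under a DINA model by enumerating candidate profiles and picking the maximum-likelihood one. The Q-matrix, slip and guess parameters, and responses are hypothetical placeholders, not the study's estimates.

```python
# A minimal DINA-style classification sketch with hypothetical parameters.
import itertools
import numpy as np

Q = np.array([[1, 0], [1, 1], [0, 1]])   # 3 items x 2 skills (hypothetical)
slip  = np.array([0.10, 0.20, 0.15])     # P(incorrect | all required skills mastered)
guess = np.array([0.20, 0.10, 0.25])     # P(correct | a required skill is missing)
x = np.array([1, 0, 1])                  # observed responses (hypothetical)

best_profile, best_like = None, -np.inf
for alpha in itertools.product([0, 1], repeat=Q.shape[1]):
    eta = np.all(Q <= np.array(alpha), axis=1)   # ideal response per item
    p = np.where(eta, 1 - slip, guess)           # P(correct) per item
    like = np.prod(np.where(x == 1, p, 1 - p))   # likelihood of the responses
    if like > best_like:
        best_profile, best_like = alpha, like

print(best_profile)  # maximum-likelihood mastery profile, here (1, 0)
```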


International Journal of Testing | 2009

Integrative Analytic Approach to Detecting and Interpreting L2 Vocabulary DIF.

Eunice Eunhee Jang; Louis Roussos

In this article we present the results of a differential item functioning (DIF) study using Shealy and Stout's (1993) multidimensionality-based DIF analysis framework. In this framework, differences in test score distributions across groups of examinees may result from multidimensionality if secondary dimensions (not the primary dimension measured by the test) differentially affect examinee performance. The framework therefore requires both statistical and substantive judgments for identifying potential DIF items and substantiating the causes of DIF, which in turn strengthens a comprehensive construct validity argument. We illustrate step-by-step procedures for multidimensionality-based DIF analyses using LanguEdge reading comprehension test data. Qualitative data from think-aloud verbal protocols were used to generate DIF hypotheses about the differential functioning of vocabulary items between two groups of Indo-European and non-Indo-European L2 learners. The Simultaneous Item Bias Test (SIBTEST; Shealy & Stout, 1993) was used to test the DIF hypotheses. The results supported the hypotheses by flagging four uniform DIF items and one crossing DIF item. Post-hoc analyses of the DIF-flagged items were performed by visually inspecting group differences with TestGraf (Ramsay, 2001), revisiting the qualitative verbal data, and analyzing cognate types. The results showed that DIF items with large effect sizes were associated with (a) translation-equivalent cognates; (b) word meanings determined independently of context; and (c) distracter words that were less frequent and more difficult than those in the stems. In light of this empirical evidence, the article discusses implications for test development and validation processes.
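
The core logic of SIBTEST-style DIF detection is to compare groups on the studied item after matching examinees on the remaining (valid subtest) score. The sketch below implements only that matching logic on simulated data and omits SIBTEST's regression correction; it is an illustration, not the SIBTEST procedure itself.

```python
# A simplified matched rest-score DIF index on simulated data; this omits
# SIBTEST's regression correction and is purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 500
ability = rng.normal(size=n)
group = rng.integers(0, 2, size=n)    # 0 = reference, 1 = focal
# Simulated responses to 10 items driven by ability alone (no true DIF).
p_correct = (1 / (1 + np.exp(-ability)))[:, None]
items = (rng.random((n, 10)) < p_correct).astype(int)

studied = items[:, 0]                 # the studied item
rest = items[:, 1:].sum(axis=1)       # matching (valid subtest) score

# Weighted difference in studied-item means across matched rest-score strata.
beta_hat, weight = 0.0, 0
for k in np.unique(rest):
    ref = studied[(rest == k) & (group == 0)]
    foc = studied[(rest == k) & (group == 1)]
    if len(ref) and len(foc):
        w = len(ref) + len(foc)
        beta_hat += w * (ref.mean() - foc.mean())
        weight += w

print(beta_hat / weight)  # near zero here, since no DIF was simulated
```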


Language Assessment Quarterly | 2015

Investigating the Homogeneity and Distinguishability of STEP Proficiency Descriptors in Assessing English Language Learners in Ontario Schools.

Eunice Eunhee Jang; Jim Cummins; Maryam Wagner; Saskia Stille; Maggie Dunlop

Research on the assessment of school-age English language learners (ELLs) in curriculum-learning contexts has been relatively less productive than research on the assessment of adult language learners. A growing demand for assessing school-age ELLs has led to the development of assessment frameworks, which in turn provide the opportunity to examine the extent to which such frameworks represent the developmental continuum of these learners' language acquisition and literacy learning. The Steps to English Proficiency (STEP) assessment framework was developed in Ontario, Canada, for teachers to assess, track, and support the language proficiency development of English language learners. STEP comprises three sets of descriptor-based developmental continua, covering oral communication, reading, and writing, for each of four grade clusters spanning Kindergarten to Grade 12. In this article we focus on the characteristics of the STEP language proficiency descriptors, specifically: (a) the extent to which the descriptors are distinguishable across the skills of oral communication, reading, and writing; (b) the extent to which the descriptors are distinguishable across the six proficiency levels; and (c) the extent to which the descriptors are homogeneous within each proficiency level. Our findings reveal that, overall, the STEP scales are stable, with six distinguishable STEPs, and that there are strong correlational relationships among the three represented skills. Our research highlights the urgent need for work geared to understanding the relationship between oral and literacy skills for school-age language learners.
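
One common way to quantify descriptor homogeneity within a proficiency level is internal consistency over scores on that level's descriptors; whether this matches the study's actual analysis is an assumption on our part. The sketch below computes Cronbach's alpha on simulated ratings.

```python
# Cronbach's alpha over simulated ratings of one level's descriptors; the
# ratings matrix is a hypothetical placeholder, not STEP data.
import numpy as np

def cronbach_alpha(ratings):
    """ratings: (n_students, n_descriptors) matrix of scores."""
    k = ratings.shape[1]
    item_var_sum = ratings.var(axis=0, ddof=1).sum()
    total_var = ratings.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var_sum / total_var)

rng = np.random.default_rng(1)
common = rng.normal(size=(200, 1))                  # shared proficiency signal
ratings = common + 0.5 * rng.normal(size=(200, 6))  # 6 descriptors, one level
print(round(cronbach_alpha(ratings), 2))            # high alpha -> homogeneous
```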


Annual Review of Applied Linguistics | 2014

Mixed Methods Research in Language Testing and Assessment

Eunice Eunhee Jang; Maryam Wagner; Gina Park

As an alternative paradigm, mixed methods research (MMR) endorses pluralism in order to understand the complex nature of the social world from multiple perspectives and through multiple methodological lenses, each of which offers partial yet valuable insights. This methodological mixing is not limited to the mixing of methods but extends to the entire inquiry process. Researchers in language testing and assessment (LTA) are increasingly turning to MMR to understand the complexities of language acquisition and interaction among various language users, and to expand opportunities for investigating validity claims beyond the three traditional facets of construct, content, and criterion validity. We use current conceptualizations of validity as a guiding framework to review 32 empirical MMR studies published in LTA since 2007. Our systematic review covered multiple areas of focus, including the rationale for using MMR, evidence of collaboration, and synergistic effects. The analyses revealed several key trends: (a) triangulation and complementarity were the prevalent uses of MMR in LTA; (b) the majority of studies took place in higher education learning contexts with adult immigrant or university populations; (c) aspects of writing assessment were the most frequent focus of the studies (compared to other language modalities); (d) many studies explicitly addressed facets of validity, and others had significant implications for expanding notions of validity in LTA; (e) the majority of studies avoided mixing at the data analysis stage by distinguishing data types and reporting results separately; and (f) integration occurred primarily at the discussion stage. We contend that LTA should embrace MMR through creative designs and integrative analytic strategies to gain new insights into the complexities and contexts of language testing and assessment.


Journal of Educational Computing Research | 2017

Person-Oriented Approaches to Profiling Learners in Technology-Rich Learning Environments for Ecological Learner Modeling

Eunice Eunhee Jang; Susanne P. Lajoie; Maryam Wagner; Zhenhua Xu; Eric Poitras; Laura Naismith

Technology-rich learning environments (TREs) provide opportunities for learners to engage in complex interactions involving a multitude of cognitive, metacognitive, and affective states. Understanding learners' distinct learning progressions in TREs demands inquiry approaches that employ well-conceived theoretical accounts of these multiple facets. The present study investigated learners' interactions with BioWorld, a TRE developed to guide students' clinical reasoning through diagnoses of simulated patients. We applied person-oriented analytic methods to multimodal data, including verbal protocols, questionnaires, and computer logs, from 78 task solutions. Latent class analysis, clustering methods, and latent profile analysis, followed by logistic regression analyses, revealed that students' clinical diagnosis ability was positively correlated with advanced self-regulated learning behaviors, high confidence and cognitive strategy use, critical attention to experts' feedback, and positive emotional responses to feedback. The results have the potential to contribute to a theory-guided approach to designing TREs with data-driven assessment of multidimensional growth. Building on these results, we introduce and discuss an ecological learner model for assessing multidimensional learner traits, which can be used to design a TRE for adaptive scaffolding.
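
As a sketch of the person-oriented pipeline described above, the code below uses Gaussian mixture modeling as a stand-in for latent profile analysis and then regresses task success on profile membership. The features, sample, and outcome are simulated placeholders, not BioWorld variables.

```python
# A person-oriented analysis sketch: latent profiles via a Gaussian mixture,
# then logistic regression from profile membership to task success. All data
# are simulated placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
n = 78  # matches the number of task solutions analyzed in the study
# Hypothetical learner features: confidence, strategy use, attention to feedback.
X = rng.normal(size=(n, 3))
success = (X.sum(axis=1) + rng.normal(size=n) > 0).astype(int)

# Person-oriented step: group learners into latent profiles.
profiles = GaussianMixture(n_components=3, random_state=0).fit_predict(X)

# Variable-oriented follow-up: does profile membership predict success?
onehot = np.eye(3)[profiles]
model = LogisticRegression().fit(onehot, success)
print(model.score(onehot, success))  # in-sample accuracy of the profile model
```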


Assessment in Education: Principles, Policy & Practice | 2017

Ontario’s educational assessment policy and practice: a double-edged sword?

Eunice Eunhee Jang; Jeanne Sinclair

This paper examines assessment policies for K-12 education in Ontario, Canada. We begin with a discussion of Ontario's education system, and then turn our focus to Ontario's K-12 assessment policies serving multiple purposes for accountability and classroom learning. We first examine the Education Quality and Accountability Office (EQAO), which is responsible for Ontario's province-wide literacy and numeracy assessments, focusing on the nature of its educational stakes, its implications for Ontario's diverse student populations, and its alignment with Ontario's curriculum. We then turn to Ontario's policies geared towards classroom-based, teacher-led assessments, such as the Growing Success initiative and the Steps to English Proficiency assessment framework. Together, these policies coexist and function like a double-edged sword for teachers because of conflicting expectations and roles. We conclude with some suggestions for further enhancing Ontario's assessment policies through the integration of technology and the building of teachers' assessment competencies.


Language Assessment Quarterly | 2010

Demystifying a Q-matrix for Making Diagnostic Inferences about L2 Reading Skills: The Author Responds

Eunice Eunhee Jang

I welcome the opportunity to respond to the commentary essays by Charles Alderson and Fred Davidson on Demystifying a Q-matrix for making diagnostic inferences about L2 reading skills (Jang, 2009)....

Collaboration


Dive into Eunice Eunhee Jang's collaborations.

Top Co-Authors

Michelle Taub, North Carolina State University

Roger Azevedo, North Carolina State University

Gina Park, University of Toronto