Publications


Featured research published by Dorry M. Kenyon.


System | 1992

Research on the Comparability of the Oral Proficiency Interview and the Simulated Oral Proficiency Interview.

Charles W. Stansfield; Dorry M. Kenyon

The simulated oral proficiency interview (SOPI) is a tape-mediated test designed to be a surrogate for the oral proficiency interview (OPI) in situations where a face-to-face interview is not possible or desirable. This article reviews research that sheds light on the comparability of the two tests. It begins with a brief description of the SOPI and then discusses the results of research on the reliability of the two tests, the agreement of scores obtained on the two tests, the comparability of the approach to testing, and the qualitative content of speech samples obtained via the two approaches. The article concludes with suggestions for further research.


Language Testing | 2008

Development of a cognate awareness measure for Spanish-speaking English language learners

Valerie Malabonga; Dorry M. Kenyon; María S. Carlo; Diane August; Mohammed Louguit

This paper describes the development and validation of the Cognate Awareness Test (CAT), which measures cognate awareness in Spanish-speaking English Language Learners (ELLs) in fourth and fifth grade. An investigation of differential performance on the two subtests of the CAT (cognates and noncognates) provides evidence that the instrument is sensitive to English-Spanish cognate awareness among elementary school-age Spanish-speaking ELLs. Cognate scores were highly correlated with the children's Spanish WLPB-R Picture Vocabulary scores, whereas noncognate scores were highly correlated with the children's English WLPB-R Picture Vocabulary scores.


Language Testing | 2005

Self-Assessment, Preparation and Response Time on a Computerized Oral Proficiency Test.

Valerie Malabonga; Dorry M. Kenyon; Helen Carpenter

Two studies investigated technical aspects of a computer-mediated test, the Computerized Oral Proficiency Instrument (COPI), particularly in contrast to a similar tape-mediated test, the Simulated Oral Proficiency Interview (SOPI). The first study investigated how examinees used self-assessment to choose an appropriate starting level on the COPI. The second study looked at examinees' planning and response time on the COPI and the factors that affected their use of time. Fifty-five university students took the COPI and SOPI in one of three languages: Arabic, Chinese, or Spanish. Results show that the majority of examinees (92%) were able to use the self-assessment instrument to select test tasks at appropriate difficulty levels. However, the COPI starting level (which was based on examinees' self-assessment) might have been problematic for a small proportion (8%) of the examinees, who appeared to choose tasks that were too difficult for them. As for planning and response time on the COPI, different amounts of time were used across the four main levels of proficiency. Examinees at the highest proficiency levels tended to use less planning time but gave longer responses.


The Modern Language Journal | 2000

The Rating of Direct and Semi-Direct Oral Proficiency Interviews: Comparing Performance at Lower Proficiency Levels.

Dorry M. Kenyon; Erwin Tschirner

As states and universities institute oral proficiency requirements with vast numbers of students to be tested, there is a need to investigate effective alternatives to the ACTFL Oral Proficiency Interview (OPI) that allow group testing. This article reports on a study comparing student performances and test reliabilities for the German Speaking Test (GST), a semi-direct tape-mediated oral proficiency test developed by the Center for Applied Linguistics, and the ACTFL OPI. Both the GST and a German OPI were administered as final oral exams to a randomly selected group of 20 students (out of a total of 59 students) enrolled in a fourth-semester German course at a large Midwestern university. The OPI levels of the students tested ranged from Novice High (n = 5) and Intermediate Low (n = 9) to Intermediate Mid (n = 6). At these three levels, final ratings on the GST and the OPI agreed perfectly in 90% of the cases. There were only two one-step disagreements, both involving students who were rated Novice High on the ACTFL OPI but who received other ratings on the GST. Although the results indicated a high score equivalency between ACTFL proficiency ratings obtained on both tests, this study underscores the pressing need for double ratings and arbitration procedures in high-stakes testing situations.
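For readers curious how such agreement figures are tallied, the short sketch below computes exact and one-step (adjacent) agreement between two sets of ordinal ratings. The level ordering and the example ratings are invented for illustration; they are not the study's data.

```python
# Illustrative only: hypothetical ratings on an ordered subset of the ACTFL scale.
# Exact agreement = same level on both tests; adjacent = one step apart.
LEVELS = ["Novice High", "Intermediate Low", "Intermediate Mid"]

def agreement(opi_ratings, gst_ratings):
    """Return (exact, adjacent) agreement proportions for paired ratings."""
    assert len(opi_ratings) == len(gst_ratings)
    exact = adjacent = 0
    for opi, gst in zip(opi_ratings, gst_ratings):
        step = abs(LEVELS.index(opi) - LEVELS.index(gst))
        if step == 0:
            exact += 1
        elif step == 1:
            adjacent += 1
    n = len(opi_ratings)
    return exact / n, adjacent / n

# Hypothetical example: 10 examinees, 9 exact matches, 1 one-step disagreement.
opi = ["Novice High"] * 3 + ["Intermediate Low"] * 4 + ["Intermediate Mid"] * 3
gst = ["Novice High"] * 2 + ["Intermediate Low"] * 5 + ["Intermediate Mid"] * 3
print(agreement(opi, gst))  # -> (0.9, 0.1)
```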


Bilingual Research Journal | 2014

The Importance of SES, Home and School Language and Literacy Practices, and Oral Vocabulary in Bilingual Children's English Reading Development.

Elizabeth R. Howard; Mariela Páez; Diane August; Christopher D. Barr; Dorry M. Kenyon; Valerie Malabonga

This study explores the role that socioeconomic status (SES), home and school language and literacy practices, and oral vocabulary play in the development of English reading skills in Latino English language learners (ELLs) and how these factors contribute differentially to English reading outcomes for children of different ages and in different settings: 292 Spanish-speaking kindergarteners in mostly English instruction, 85 Spanish-speaking third graders in bilingual instruction, and 70 Spanish-speaking fifth graders in both English and bilingual settings. Data were analyzed using hierarchical regression. Findings indicate that for each sample, English oral vocabulary is a significant predictor of English reading accuracy and comprehension once SES and home and school language and literacy factors have been considered. Beyond oral vocabulary, however, there is considerable variability across samples in the home and school language and literacy variables that are predictive of English reading outcomes. The study points to the importance of looking closely at the texture of children’s lives in coming to an understanding of second-language literacy development.
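As a rough illustration of the hierarchical (blockwise) regression strategy described in this abstract, the sketch below enters hypothetical predictor blocks in sequence and reports the R² and change in R² at each step. The variable names, simulated data, and block ordering are assumptions made for the example, not the study's variables or results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200  # hypothetical sample size
df = pd.DataFrame({
    "ses": rng.normal(size=n),
    "home_literacy": rng.normal(size=n),
    "school_language": rng.normal(size=n),
    "oral_vocab": rng.normal(size=n),
})
# Simulated outcome in which oral vocabulary carries unique variance.
df["reading"] = (0.3 * df["ses"] + 0.2 * df["home_literacy"]
                 + 0.5 * df["oral_vocab"] + rng.normal(size=n))

# Predictor blocks entered in order, as in a hierarchical regression.
blocks = [["ses"], ["home_literacy", "school_language"], ["oral_vocab"]]
predictors, prev_r2 = [], 0.0
for block in blocks:
    predictors += block
    model = sm.OLS(df["reading"], sm.add_constant(df[predictors])).fit()
    print(f"after adding {block}: R2 = {model.rsquared:.3f}, "
          f"delta R2 = {model.rsquared - prev_r2:.3f}")
    prev_r2 = model.rsquared
```

The quantity of interest is the change in R² when the final block is added, i.e. the variance in the reading outcome explained by oral vocabulary over and above the earlier blocks.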


Language Testing | 2012

Development and Validation of Extract the Base: An English Derivational Morphology Test for Third through Fifth Grade Monolingual Students and Spanish-Speaking English Language Learners.

Amanda P. Goodwin; A. Corinne Huggins; María S. Carlo; Valerie Malabonga; Dorry M. Kenyon; Mohammed Louguit; Diane August

This study describes the development and validation of the Extract the Base test (ETB), which assesses derivational morphological awareness. Scores on this test were validated for 580 monolingual students and 373 Spanish-speaking English language learners (ELLs) in third through fifth grade. As part of the validation of the internal structure, which involved using the Generalized Partial Credit Model for tests with polytomous items, items on this test were shown to provide information about students of different abilities and to discriminate among such heterogeneous students. As part of the validation of the test's relationship to a criterion, scores were shown to correlate with measures of word identification, reading comprehension, and vocabulary. Differences in performance between fluent English speakers and ELLs, among students from varied home language environments, and across grade levels were noted. Additionally, the task was validated using a dichotomous scoring system to provide reliability and validity information under this alternate scoring method.
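The Generalized Partial Credit Model named above gives, for a polytomous item, the probability of a response in each score category as a function of examinee ability θ, item discrimination a, and step difficulties b_v. A minimal numerical sketch of that formula is shown below; the item parameters are made up for illustration and are not estimates from the ETB.

```python
import numpy as np

def gpcm_probs(theta, a, b):
    """Category probabilities under the Generalized Partial Credit Model.

    theta : examinee ability
    a     : item discrimination
    b     : step difficulties b_1..b_m (category 0 has no step)
    Returns probabilities for categories 0..m.
    """
    # Cumulative sums of a*(theta - b_v); category 0 contributes 0.
    steps = np.concatenate(([0.0], np.cumsum(a * (theta - np.asarray(b)))))
    expd = np.exp(steps - steps.max())  # subtract max for numerical stability
    return expd / expd.sum()

# Hypothetical 3-category item (two steps): higher ability shifts mass upward.
for theta in (-1.0, 0.0, 1.5):
    print(theta, np.round(gpcm_probs(theta, a=1.2, b=[-0.5, 0.8]), 3))
```

Items that spread these category probabilities across a wide ability range are the ones said to "provide information about students of different abilities."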


Assessment for Effective Intervention | 2009

Measures for Determining English Language Proficiency and the Resulting Implications for Instructional Provision and Intervention.

Craig A. Albers; Dorry M. Kenyon; Timothy J. Boals

Although numerous English language proficiency (ELP) measures currently exist, many were developed prior to the No Child Left Behind Act of 2001 (NCLB). These pre-NCLB measures typically focused on social language proficiency, whereas post-NCLB measures are linked to ELP standards and focus on academic language proficiency (ALP). ELP measures are typically used for accountability purposes and to determine eligibility for services; less attention has been given to their utility in enhancing classroom instruction and intervention provision. Inconsistency in scores between pre- and post-NCLB measures frequently leaves educators wondering whether English language learners (ELLs) have the necessary ALP to benefit from classroom instruction. This study investigates the intervention validity of ELP assessment by examining the concurrent validity of various pre-NCLB measures with a recently developed post-NCLB measure. As hypothesized, results indicate moderate correlations between pre- and post-NCLB measures, suggesting that ALP-focused post-NCLB measures are likely to provide more utility for ELL classroom instruction and intervention provision.
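Concurrent validity in a design like this comes down to correlating students' scores on a pre-NCLB measure with their scores on the post-NCLB measure. The fragment below shows that computation on simulated scores; the variable names and the strength of the simulated relationship are placeholders, not findings from the study.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n = 120  # hypothetical number of ELL students assessed on both instruments
post_nclb = rng.normal(size=n)                   # e.g., an ALP-focused composite
pre_nclb = 0.5 * post_nclb + rng.normal(size=n)  # simulated moderate relationship

r, p = pearsonr(pre_nclb, post_nclb)
print(f"concurrent validity coefficient r = {r:.2f} (p = {p:.3f})")
```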


Hispania | 1991

The Validity of the Portuguese Speaking Test for Use in a Summer Study Abroad Program.

Charles W. Stansfield; Dorry M. Kenyon; Margo Milleret

[...] summer study program. The study has three main foci: a) to determine the validity of the ACTFL-based Portuguese Speaking Test as a placement tool for a short study abroad program; b) to assess gains in oral proficiency among participants in the University of Tennessee's summer study abroad program in Brazil; and c) to assess, where possible, other issues related to the initial study abroad experience.

Overview of Research

John Carroll (1967), in his study of the language proficiency of foreign language majors in their senior year, noted that most of the majors in his sample had at least one foreign study experience. Even with this experience, most did not surpass the level of 2+ on the FSI scale. This rating is equivalent to a score of Advanced Plus on the ACTFL scale (Liskin-Gasparro, 1987). Graman (1987) noted the same phenomenon [...] expressed a significantly greater degree of comfort when using the language in a listening or speaking situation.

The findings of Carroll and Graman affirm the important role foreign study plays in developing language skills that are needed for advanced language courses. While efforts have been made to make the foreign language classroom more communicative and to improve students' proficiency so as to better prepare students for advanced language classes, foreign study serves as an irreplaceable linguistic and cultural experience.

The traditional foreign study program designed for language learners includes study at a foreign institution in regular or specially designed classes, placement with families, and often additional scheduled travel. The cost of these programs, coupled with the time students must commit to foreign study, has led to a shorter, more intensive version of foreign study: the summer study program. Unlike academic year programs, summer programs typically arrange for special classes for students at foreign institutions that would normally be on vacation. Summer study is attractive to faculty and students alike because it requires less time and less money. However, summer study programs have been criticized for their brevity, intensity, and lack of [...]


Language Testing | 2011

Issues in vertical scaling of a K-12 English language proficiency test

Dorry M. Kenyon; David MacGregor; Dongyang Li; H. Gary Cook

One of the mandates of the No Child Left Behind Act is that states show adequate yearly progress in their English language learners' (ELLs) acquisition of English language proficiency. States are required to assess ELLs' English language proficiency annually in four language domains (listening, reading, writing, and speaking) to measure their progress; they are also required to report on a composite comprehension measure. Often the clearest way to effectively monitor students' progress is to measure assessment results across grades on the same scale. In measurement terms, scores from tests across all grade levels can be put on the same scale using vertical scaling. In addition, to help stakeholders understand and interpret the results, these scale scores are often interpreted in terms of proficiency levels. In this article, we use the vertical scaling of WIDA ACCESS for ELLs®, a large-scale K-12 Academic English Language Proficiency assessment, to illustrate measurement and practical issues involved in this technique. We first give background on the need for vertical scaling. We then review the literature on vertical scaling and describe the procedures used for WIDA ACCESS for ELLs® to vertically scale test scores and interpret the results in terms of the WIDA ACCESS for ELLs® Proficiency Scale. Next we review several studies that have been conducted to gauge the effectiveness of that scaling. We end the paper with a discussion of the broad issues that arise from vertical scaling.
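One common ingredient of vertical scaling is linking IRT item parameter estimates from adjacent grade-level forms through items the forms share. The sketch below applies the standard mean-sigma linking transformation to hypothetical common-item difficulties; it illustrates the general technique only and is not the procedure used for ACCESS for ELLs®.

```python
import numpy as np

def mean_sigma_link(b_lower, b_upper):
    """Mean-sigma linking constants that place the lower-grade scale
    onto the upper-grade (base) scale, using common-item difficulties."""
    b_lower, b_upper = np.asarray(b_lower), np.asarray(b_upper)
    A = b_upper.std(ddof=1) / b_lower.std(ddof=1)  # slope
    B = b_upper.mean() - A * b_lower.mean()        # intercept
    return A, B

# Hypothetical difficulties of four common items, estimated separately
# in a grade 3 calibration and a grade 4 calibration.
b_grade3 = [-1.2, -0.3, 0.4, 1.1]
b_grade4 = [-0.7, 0.1, 0.9, 1.6]

A, B = mean_sigma_link(b_grade3, b_grade4)
theta_grade3 = np.array([-1.0, 0.0, 1.0])        # abilities on the grade 3 scale
print("linked abilities:", A * theta_grade3 + B)  # now on the grade 4 scale
```

Chaining such links across grade pairs is one way scores from all grade-level forms end up on a single vertical scale; operational programs typically use more elaborate concurrent or characteristic-curve methods.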


Language Assessment Quarterly | 2011

Exploring Domain-General and Domain-Specific Linguistic Knowledge in the Assessment of Academic English Language Proficiency

Anja Römhild; Dorry M. Kenyon; David MacGregor

This study examined the role of domain-general and domain-specific linguistic knowledge in the assessment of academic English language proficiency using a latent variable modeling approach. The goal of the study was to examine whether modeling domain-specific variance results in improved model fit and well-defined latent factors. Analyses were carried out on data from the ACCESS for ELLs® test battery, which comprises multiple test forms targeting different grade and proficiency levels. The results of the study provide empirical evidence in support of the conceptual distinction between domain-specific and domain-general linguistic knowledge. Domain-specific factors tended to become more salient with increasing language proficiency, whereas the salience of domain-general factors tended to decrease. Overall, however, domain-general factors remained stronger than the domain-specific factors. In one test form targeting high levels of proficiency, this factor pattern was reversed, suggesting some degree of fluidity in the relationship between domain-general and domain-specific linguistic knowledge.
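The domain-general versus domain-specific distinction can be pictured as a bifactor-style measurement model in which every item loads on a general academic-language factor and items from a given content domain additionally load on that domain's factor. The sketch below uses entirely hypothetical loadings to show how each factor's salience can be summarized as its share of explained common variance; it is not the model specification or the estimates reported in the study.

```python
import numpy as np

# Hypothetical standardized loadings for six items: a general academic-English
# factor plus two domain-specific factors (e.g., math language, science language).
general = np.array([0.7, 0.6, 0.7, 0.6, 0.7, 0.6])
specific = [
    np.array([0.3, 0.4, 0.3, 0.0, 0.0, 0.0]),  # math-language factor loadings
    np.array([0.0, 0.0, 0.0, 0.4, 0.3, 0.4]),  # science-language factor loadings
]

var_general = np.sum(general ** 2)
var_specific = sum(np.sum(s ** 2) for s in specific)
total = var_general + var_specific

print(f"share of common variance, general factor:   {var_general / total:.2f}")
print(f"share of common variance, specific factors: {var_specific / total:.2f}")
```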

Collaboration


Dorry M. Kenyon's most frequent co-authors and their affiliations.

Top Co-Authors

Charles W. Stansfield (Center for Applied Linguistics)
Valerie Malabonga (Center for Applied Linguistics)
Diane August (Center for Applied Linguistics)
David MacGregor (Center for Applied Linguistics)
Elizabeth R. Howard (Center for Applied Linguistics)
Mohammed Louguit (Center for Applied Linguistics)
Anja Römhild (University of Nebraska–Lincoln)
Igone Arteagoitia (Center for Applied Linguistics)
Mary Lee Scott (Brigham Young University)