
Publication


Featured research publications by Ou Lydia Liu.


Educational Assessment | 2008

Assessing Knowledge Integration in Science: Construct, Measures, and Evidence

Ou Lydia Liu; Hee-Sun Lee; Carolyn Huie Hofstetter; Marcia C. Linn

In response to the demand for sound science assessments, this article presents the development of a latent construct called knowledge integration as an effective measure of science inquiry. Knowledge integration assessments ask students to link, distinguish, evaluate, and organize their ideas about complex scientific topics. The article focuses on assessment topics commonly taught in 6th- through 12th-grade classes. Items from both published standardized tests and previous knowledge integration research were examined in 6 subject-area tests. Results from Rasch partial credit analyses revealed that the tests exhibited satisfactory psychometric properties with respect to internal consistency, item fit, weighted likelihood estimates, discrimination, and differential item functioning. Compared with items coded using dichotomous scoring rubrics, those coded with the knowledge integration rubrics yielded significantly higher discrimination indexes. The knowledge integration assessment tasks, analyzed using knowledge integration scoring rubrics, demonstrate strong promise as effective measures of complex science reasoning in varied science domains.
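The Rasch partial credit analyses mentioned above score each item in ordered categories rather than right/wrong. As a rough illustration only (not the authors' code, and with hypothetical step difficulties), the model's category probabilities at a given ability level can be computed directly from the item's step difficulties:

```python
import math

def pcm_probs(theta, step_difficulties):
    """Partial credit model: probability of each score category for one item,
    at ability theta. Category k's log-numerator is the cumulative sum of
    (theta - delta_j) over the first k step difficulties; category 0 is 0."""
    cum = [0.0]  # log-numerator for category 0, by convention
    for delta in step_difficulties:
        cum.append(cum[-1] + (theta - delta))
    exps = [math.exp(c) for c in cum]
    z = sum(exps)  # normalizing constant over all categories
    return [e / z for e in exps]

# A 4-category item (scores 0-3) with invented step difficulties:
probs = pcm_probs(theta=0.5, step_difficulties=[-1.0, 0.0, 1.5])
```

For an examinee of middling ability, most probability mass falls on the middle categories; fitting the model to data means estimating the step difficulties and abilities that make observed scores most likely.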


Applied Measurement in Education | 2011

Validating Measurement of Knowledge Integration in Science Using Multiple-Choice and Explanation Items

Hee-Sun Lee; Ou Lydia Liu; Marcia C. Linn

This study explores measurement of a construct called knowledge integration in science using multiple-choice and explanation items. We use construct and instructional validity evidence to examine the role multiple-choice and explanation items play in measuring students' knowledge integration ability. For construct validity, we analyze item properties such as alignment, discrimination, and target range on the knowledge integration scale using a Rasch Partial Credit Model analysis. For instructional validity, we test the sensitivity of multiple-choice and explanation items to knowledge integration instruction using a cohort comparison design. Results show that (1) one third of correct multiple-choice responses are aligned with higher levels of knowledge integration while three quarters of incorrect multiple-choice responses are aligned with lower levels of knowledge integration, (2) explanation items discriminate between high and low knowledge integration ability students much more effectively than multiple-choice items, (3) explanation items measure a wider range of knowledge integration levels than multiple-choice items, and (4) explanation items are more sensitive to knowledge integration instruction than multiple-choice items.


Science | 2014

Computer-Guided Inquiry to Improve Science Learning

Marcia C. Linn; Libby Gerard; Kihyun Ryoo; Kevin W. McElhaney; Ou Lydia Liu; Anna N. Rafferty

Automated guidance on essays and drawings can improve learning in precollege and college courses. Engaging students in inquiry practices is known to motivate them to persist in science, technology, engineering, and mathematics (STEM) fields and to create lifelong learners (1, 2). In inquiry, students initiate investigations, gather data, critique evidence, and make sophisticated drawings or write coherent essays to explain complex phenomena. Yet, most instruction relies on lectures that transmit information and multiple-choice tests that determine which details students recall. Massive Open Online Courses (MOOCs) mostly offer more of the same. But new cyber-learning tools may change all this, by taking advantage of new algorithms to automatically score student essays and drawings and offer personalized guidance.


Educational Assessment | 2010

Multifaceted Assessment of Inquiry-Based Science Learning

Ou Lydia Liu; Hee-Sun Lee; Marcia C. Linn

To improve student science achievement in the United States, we need inquiry-based instruction that promotes coherent understanding and assessments that are aligned with the instruction. Instead, current textbooks often offer fragmented ideas and most assessments only tap recall of details. In this study we implemented 10 inquiry-based science units that promote knowledge integration and developed assessments that measure student knowledge integration abilities. To measure student learning outcomes, we designed a science assessment consisting of both proximal items that are related to the units and distal items drawn from standardized tests (e.g., Trends in International Mathematics and Science Study). We compared the psychometric properties and instructional sensitivity of the proximal and distal items. To unveil the context of learning, we examined how student, class, and teacher characteristics affect student inquiry science learning. Several teacher-level characteristics, including professional development, showed a positive impact on science performance.


Educational Assessment | 2011

An Investigation of Explanation Multiple-Choice Items in Science Assessment

Ou Lydia Liu; Hee-Sun Lee; Marcia C. Linn

Both multiple-choice and constructed-response items have known advantages and disadvantages in measuring scientific inquiry. In this article we explore the function of explanation multiple-choice (EMC) items and examine how EMC items differ from traditional multiple-choice and constructed-response items in measuring scientific reasoning. A group of 794 middle school students was randomly assigned to answer either constructed-response or EMC items following regular multiple-choice items. By applying a Rasch partial-credit analysis, we found that there is a consistent alignment between the EMC and multiple-choice items. Also, the EMC items are easier than the constructed-response items but are harder than most of the multiple-choice items. We discuss the potential value of the EMC items as a learning and diagnostic tool.


International Journal of Science Education | 2012

Differential Performance by English Language Learners on an Inquiry-Based Science Assessment

Sultan Turkan; Ou Lydia Liu

The performance of English language learners (ELLs) has been a concern given the rapidly changing demographics in US K-12 education. This study aimed to examine whether students' English language status has an impact on their inquiry science performance. Differential item functioning (DIF) analysis was conducted with regard to ELL status on an inquiry-based science assessment, using a multifaceted Rasch DIF model. A total of 1,396 seventh- and eighth-grade students took the science test, including 313 ELL students. The results showed that, overall, non-ELLs significantly outperformed ELLs. Of the four items that showed DIF, three favored non-ELLs while one favored ELLs. The item that favored ELLs provided a graphic representation of a science concept within a family context. There is some evidence that constructed-response items may help ELLs articulate scientific reasoning using their own words. Assessment developers and teachers should pay attention to the possible interaction between linguistic challenges and science content when designing assessment for and providing instruction to ELLs.
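The study itself uses a multifaceted Rasch DIF model. As a simpler, widely used point of comparison (not the method in the paper), the Mantel-Haenszel procedure flags DIF on a dichotomous item by comparing the odds of a correct response for the two groups within matched total-score strata; the counts below are invented:

```python
import math

def mantel_haenszel_dif(strata):
    """Mantel-Haenszel common odds ratio across score strata.
    Each stratum is (a, b, c, d): reference-group correct/incorrect,
    focal-group correct/incorrect. alpha > 1 favors the reference group."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    alpha = num / den
    # ETS delta scale: large negative delta flags DIF against the focal group
    delta = -2.35 * math.log(alpha)
    return alpha, delta

# Balanced (no-DIF) strata: both groups have identical odds in each stratum,
# so alpha = 1.0 and delta = 0.
alpha, delta = mantel_haenszel_dif([(30, 10, 30, 10), (20, 20, 20, 20)])
```

Matching on total score before comparing groups is what separates genuine item-level DIF from an overall ability difference between the groups.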


International Journal of Science Education | 2015

Measuring Knowledge Integration Learning of Energy Topics: A two-year longitudinal study

Ou Lydia Liu; Kihyun Ryoo; Marcia C. Linn; Elissa Sato; Vanessa Svihla

Although researchers call for inquiry learning in science, science assessments rarely capture the impact of inquiry instruction. This paper reports on the development and validation of assessments designed to measure middle-school students' progress in gaining integrated understanding of energy while studying an inquiry-oriented curriculum. The assessment development was guided by the knowledge integration framework. Over 2 years of implementation, more than 4,000 students from 4 schools participated in the study, including a cross-sectional and a longitudinal cohort. Results from item response modeling analyses revealed that: (a) the assessments demonstrated satisfactory psychometric properties in terms of reliability and validity; (b) both the cross-sectional and longitudinal cohorts made progress on integrating their understanding of energy concepts; and (c) among many factors (e.g. gender, grade, school, and home language) associated with students' science performance, unit implementation was the strongest predictor.


Archive | 2013

Professional Development Programs for Teaching with Visualizations

Libby Gerard; Ou Lydia Liu; Stephanie B. Corliss; Keisha Varma; Michele W. Spitulnik; Marcia C. Linn

Previous research suggests the value of technology-enhanced materials that guide learners to use dynamic, interactive visualizations of science phenomena. The power of these visualizations to improve student understanding depends on the teacher. In this chapter we provide two exemplars of professional development programs that focus on teaching with visualizations. The programs differ in intensity but follow the same basic philosophy. We show that the more intense professional development approach results in more effective teacher implementation of visualizations and greater student learning gains. We identify specific strategies that other educators can use to improve students’ knowledge integration with interactive visualizations.


Journal of Research in Science Teaching | 2010

How Do Technology-Enhanced Inquiry Science Units Impact Classroom Learning?

Hee-Sun Lee; Marcia C. Linn; Keisha Varma; Ou Lydia Liu


Journal of Research in Science Teaching | 2010

An investigation of teacher impact on student inquiry science performance using a hierarchical linear model

Ou Lydia Liu; Hee-Sun Lee; Marcia C. Linn
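A hierarchical linear model of teacher impact typically starts from the null model's intraclass correlation: the share of score variance sitting at the teacher level. A minimal one-way ANOVA sketch of that quantity (illustrative only, not the authors' analysis, with made-up scores) looks like:

```python
def icc_oneway(groups):
    """Intraclass correlation from a one-way random-effects ANOVA.
    groups: one list of student scores per teacher. The ICC estimates the
    proportion of variance attributable to the teacher level, the first
    quantity an HLM null (unconditional) model reports."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    # n0: effective group size, correcting for unequal group sizes
    n0 = (n - sum(len(g) ** 2 for g in groups) / n) / (k - 1)
    return (ms_between - ms_within) / (ms_between + (n0 - 1) * ms_within)

# Three hypothetical classrooms whose means differ far more than their
# within-class spread, so most variance is at the teacher level:
icc = icc_oneway([[72, 75, 71], [60, 58, 63], [80, 84, 79]])
```

A large ICC justifies adding teacher-level predictors (such as professional development) at level two of the model rather than treating students as independent observations.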

Collaboration


Dive into Ou Lydia Liu's collaborations.

Top Co-Authors

Marcia C. Linn
University of California

Libby Gerard
University of California

Kihyun Ryoo
University of North Carolina at Chapel Hill

Keisha Varma
University of Minnesota