Publications


Featured research published by Saif Rayyan.


2010 PHYSICS EDUCATION RESEARCH CONFERENCE | 2010

Toward an Integrated Online Learning Environment

Raluca Teodorescu; Andrew Pawl; Saif Rayyan; Analia Barrantes; David E. Pritchard

We are building an integrated learning environment in LON‐CAPA that will enable the development, dissemination, and evaluation of PER‐based material. This environment features a collection of multi‐level research‐based homework sets organized by topic and cognitive complexity. These sets are associated with learning modules that contain a very short exposition of the content, supplemented by integrated open‐access videos, worked examples, simulations, and tutorials (some from ANDES). To assess students’ performance accurately with respect to a system‐wide standard, we plan to implement Item Response Theory. Together with other PER assessments and purposeful solicitation of student feedback, this will allow us to measure and improve the efficacy of various research‐based materials, while gaining insights into teaching and learning.


American Journal of Physics | 2014

Analyzing the impact of course structure on electronic textbook use in blended introductory physics courses

Daniel T. Seaton; Gerd Kortemeyer; Yoav Bergner; Saif Rayyan; David E. Pritchard

We investigate how elements of course structure (i.e., the frequency of assessments as well as the sequencing and weight of course resources) influence the usage patterns of electronic textbooks (e-texts) in introductory physics courses. Specifically, we analyze the access logs of courses at Michigan State University and the Massachusetts Institute of Technology, each of which deploys e-texts as primary or secondary texts in combination with different formative assessments (e.g., embedded reading questions) and different summative assessment (exam) schedules. As such studies are frequently marred by arguments over what constitutes a “meaningful” interaction with a particular page (usually judged by how long the page remains on the screen), we consider a set of different definitions of “meaningful” interactions. We find that course structure has a strong influence on how much of the e-texts students actually read, and when they do so. In particular, courses that deviate strongly from traditional structures,...


2010 PHYSICS EDUCATION RESEARCH CONFERENCE | 2010

Improved Student Performance In Electricity And Magnetism Following Prior MAPS Instruction In Mechanics

Saif Rayyan; Andrew Pawl; Analia Barrantes; Raluca Teodorescu; David E. Pritchard

We examine the performance of a group of students in Introductory Electricity and Magnetism following a ReView course in Introductory Mechanics that focuses on problem solving using the Modeling Applied to Problem Solving (MAPS) pedagogy [1]. The group consists of students who received a D in the fall Mechanics course (8.01) and were given the chance to attend the ReView course and take a final retest. Improving to a passing grade qualified them for the Electricity and Magnetism course (8.02) in the spring. The ReView course was conducted twice, in January 2009 and January 2010. As a control, we took a group of students with similar z‐scores in 8.01 in Fall 2007 who were not offered the ReView course. We show that the ReView students perform ∼0.7 standard deviations better than the control group (p ∼ 0.002) and ∼0.5 standard deviations better than what is expected based on their performance in 8.01 (p ∼ 0.001).
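The “∼0.7 standard deviations” comparison above is an effect size expressed in pooled-standard-deviation units (Cohen's d). A minimal sketch of that computation, using hypothetical exam scores rather than the study's actual data:

```python
import math

def pooled_sd(xs, ys):
    """Pooled standard deviation of two independent samples."""
    def var(v):
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / (len(v) - 1)
    nx, ny = len(xs), len(ys)
    return math.sqrt(((nx - 1) * var(xs) + (ny - 1) * var(ys)) / (nx + ny - 2))

def effect_size(treatment, control):
    """Difference of group means in units of the pooled SD (Cohen's d)."""
    mt = sum(treatment) / len(treatment)
    mc = sum(control) / len(control)
    return (mt - mc) / pooled_sd(treatment, control)

# Hypothetical scores, NOT the study's data: a ReView-style group vs. a control.
review = [72, 78, 65, 80, 74, 70]
control = [70, 73, 62, 76, 71, 66]
print(round(effect_size(review, control), 2))
```

A significance level like the quoted p ∼ 0.002 would then come from a separate test (e.g., a two-sample t-test) on the same group means and variances.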


2011 Physics Education Research Conference Proceedings | 2012

Item response theory analysis of the mechanics baseline test

Caroline N. Cardamone; Jonathan E. Abbott; Saif Rayyan; Daniel T. Seaton; Andrew Pawl; David E. Pritchard

Item response theory is useful in both the development and evaluation of assessments and in computing standardized measures of student performance. In item response theory, individual parameters (difficulty, discrimination) for each item or question are fit by item response models. These parameters provide a means for evaluating a test and offer a better measure of student skill than a raw test score, because each skill calculation considers not only the number of questions answered correctly, but also the individual properties of all questions answered. Here, we present the results from an analysis of the Mechanics Baseline Test given at MIT during 2005-2010. Using the item parameters, we identify questions on the Mechanics Baseline Test that are not effective in discriminating between MIT students of different abilities. We show that a limited subset of the highest quality questions on the Mechanics Baseline Test returns accurate measures of student skill. We compare student skills as determined by item resp...
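The two-parameter logistic (2PL) model underlying this kind of analysis can be sketched in a few lines. This is a generic illustration, not the paper's fitting code: the item parameters and responses below are hypothetical, and real analyses estimate item and ability parameters jointly (e.g., via marginal maximum likelihood) rather than by grid search.

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: probability that a student of ability
    theta answers an item with discrimination a and difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def estimate_ability(responses, items):
    """Maximum-likelihood ability estimate over a coarse theta grid.
    responses: list of 0/1 scores; items: list of (a, b) parameter pairs."""
    grid = [g / 100.0 for g in range(-400, 401)]  # theta in [-4, 4]
    def log_lik(theta):
        ll = 0.0
        for r, (a, b) in zip(responses, items):
            p = p_correct(theta, a, b)
            ll += math.log(p) if r else math.log(1.0 - p)
        return ll
    return max(grid, key=log_lik)

# Hypothetical item parameters: the third item's low discrimination (a = 0.3)
# means it contributes little information, so the skill estimate leans on
# the sharper items -- the sense in which weak items can be dropped.
items = [(1.5, -1.0), (1.2, 0.0), (0.3, 0.5), (1.8, 1.0)]
print(estimate_ability([1, 1, 0, 0], items))
```

The per-item weighting is what makes the 2PL skill estimate richer than a raw score: two students with the same number correct can receive different ability estimates depending on which items they answered correctly.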


2011 PHYSICS EDUCATION RESEARCH CONFERENCE | 2012

Development of a mechanics reasoning inventory

Andrew Pawl; Analia Barrantes; Carolin Cardamone; Saif Rayyan; David E. Pritchard

Strategic knowledge is required to appropriately organize procedures and concepts to solve problems. We are developing a standardized instrument assessing strategic knowledge in the domain of introductory mechanics. This instrument is inspired in part by Lawson's Classroom Test of Scientific Reasoning and Van Domelen's Problem Decomposition Diagnostic. The predictive validity of the instrument has been suggested by preliminary studies showing significant correlation with performance on final exams administered in introductory mechanics courses at the Massachusetts Institute of Technology and the Georgia Institute of Technology. In order to study the validity of the content from the students' perspective, we have administered the instrument in free-response format to 40 students enrolled in calculus-based introductory mechanics at the University of Wisconsin-Platteville. This procedure has the additional advantage of improving the construct validity of the inventory, since student responses suggest effecti...


Learning at Scale | 2016

Instructor Dashboards in edX

Colin Fredericks; Glenn Lopez; Victor Shnayder; Saif Rayyan; Daniel T. Seaton

Staff from edX, MIT, and Harvard will present two instructor dashboards for edX MOOCs. Current workflows will be described, from parsing and displaying data to using dashboards for course revision. A major focus will be lessons learned in the first two years of deployment.


2012 Physics Education Research Conference Proceedings | 2013

Multidimensional student skills with collaborative filtering

Yoav Bergner; Saif Rayyan; Daniel T. Seaton; David E. Pritchard

Despite the fact that a physics course typically culminates in one final grade for the student, many instructors and researchers believe that there are multiple skills that students acquire to achieve mastery. Assessment validation and data analysis in general may thus benefit from extension to multidimensional ability. This paper introduces an approach for model determination and dimensionality analysis using collaborative filtering (CF), which is related to factor analysis and item response theory (IRT). Model selection is guided by machine learning perspectives, seeking to maximize the accuracy in predicting which students will answer which items correctly. We apply the CF to response data for the Mechanics Baseline Test and combine the results with prior analysis using unidimensional IRT.
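As a rough illustration of the approach, logistic matrix factorization is one common way to apply collaborative filtering to right/wrong response data. This sketch is not the authors' model: the response matrix is hypothetical, and the latent dimension k, learning rate, and regularization below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 0/1 response matrix: rows = students, columns = items.
R = np.array([[1, 1, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0],
              [1, 1, 1, 1]], dtype=float)

def fit_cf(R, k=2, steps=2000, lr=0.05, reg=0.01):
    """Logistic matrix factorization: each student and each item gets a
    k-dimensional latent vector, and sigmoid(S @ Q.T) predicts correctness."""
    n_students, n_items = R.shape
    S = 0.1 * rng.standard_normal((n_students, k))
    Q = 0.1 * rng.standard_normal((n_items, k))
    for _ in range(steps):
        P = 1.0 / (1.0 + np.exp(-S @ Q.T))   # predicted P(correct)
        err = P - R                           # gradient of the log loss
        S, Q = S - lr * (err @ Q + reg * S), Q - lr * (err.T @ S + reg * Q)
    return S, Q

S, Q = fit_cf(R)
P = 1.0 / (1.0 + np.exp(-S @ Q.T))
# Model selection in the spirit of the paper: choose the dimensionality k
# by predictive accuracy on held-out responses, not by fit to training data.
print(np.round(P, 2))
```

With k = 1 this reduces to something close to a unidimensional IRT ability/difficulty model, which is the connection the paper exploits when comparing CF results to the earlier unidimensional IRT analysis.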


Educational Data Mining | 2012

Model-Based Collaborative Filtering Analysis of Student Response Data: Machine-Learning Item Response Theory

Yoav Bergner; Stefan Dröschler; Gerd Kortemeyer; Saif Rayyan; Daniel T. Seaton; David E. Pritchard


arXiv: Physics Education | 2014

Participation and Performance in 8.02x Electricity and Magnetism: The First Physics MOOC from MITx

Saif Rayyan; Daniel T. Seaton; John Belcher; David E. Pritchard; Isaac L. Chuang


Journal of Computer Assisted Learning | 2016

A MOOC based on blended pedagogy

Saif Rayyan; Colin Fredericks; Kimberly F. Colvin; Alwina Liu; Raluca E. Teodorescu; Analia Barrantes; Andrew Pawl; Daniel T. Seaton; David E. Pritchard

Collaboration


Dive into Saif Rayyan's collaboration.

Top Co-Authors

Daniel T. Seaton (Massachusetts Institute of Technology)

David E. Pritchard (Massachusetts Institute of Technology)

Andrew Pawl (University of Wisconsin–Platteville)

Analia Barrantes (Massachusetts Institute of Technology)

Gerd Kortemeyer (Michigan State University)

Isaac L. Chuang (Massachusetts Institute of Technology)

Raluca E. Teodorescu (George Washington University)