Manher Jariwala
Boston University
Publications
Featured research published by Manher Jariwala.
arXiv: Physics Education | 2016
Manher Jariwala; Jada-Simone S. White; Ben Van Dusen; Eleanor W. Close
This study investigates differences in student responses to in-class and online administrations of the Force Concept Inventory (FCI), the Conceptual Survey of Electricity and Magnetism (CSEM), and the Colorado Learning Attitudes about Science Survey (CLASS). Close to 700 physics students from 12 sections of three different courses were instructed to complete the concept inventory relevant to their course (either the FCI or the CSEM) as well as the CLASS. Each student was randomly assigned to take one of the surveys in class and the other survey online using the LA Supported Student Outcomes (LASSO) system hosted by the Learning Assistant Alliance (LAA). We examine how testing environments and instructor practices affect participation rates and identify best practices for future use.
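As a simple illustration of the crossover design described above, the sketch below randomly assigns each student to take one instrument in class and the other online. It is a hypothetical example, not the LASSO system's implementation, and the student identifiers are invented.

```python
# Hypothetical sketch of counterbalanced random assignment: each student takes
# one instrument in class and the other online (not the actual LASSO system).
import random

random.seed(42)  # reproducible assignment
students = [f"student_{i:03d}" for i in range(700)]

assignments = {}
for student in students:
    in_class_first = random.random() < 0.5
    assignments[student] = {
        "in_class": "concept inventory (FCI or CSEM)" if in_class_first else "CLASS",
        "online": "CLASS" if in_class_first else "concept inventory (FCI or CSEM)",
    }

print(assignments["student_000"])
```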
International Journal of STEM Education | 2018
Jayson M. Nissen; Manher Jariwala; Eleanor W. Close; Ben Van Dusen
Background: High-stakes assessments, such as the Graduate Record Examination, have transitioned from paper to computer administration. Low-stakes research-based assessments (RBAs), such as the Force Concept Inventory, have only recently begun this transition to computer administration through online services. These online services can simplify administering, scoring, and interpreting assessments, thereby reducing barriers to instructors' use of RBAs. By supporting instructors' objective assessment of the efficacy of their courses, these services can encourage instructors to transform their courses to improve student outcomes. We investigate the extent to which RBAs administered outside of class with the online Learning About STEM Student Outcomes (LASSO) platform provide data equivalent to tests administered on paper in class, in terms of both student participation and performance. We use an experimental design to investigate the differences between these two assessment conditions with 1,310 students in 25 sections of 3 college physics courses spanning 2 semesters.
Results: Analysis conducted using hierarchical linear models indicates that student performance on low-stakes RBAs is equivalent for online (out-of-class) and paper-and-pencil (in-class) administrations. The models also show differences in participation rates across assessment conditions and student grades, but indicate that instructors can achieve participation rates with online assessments equivalent to those of paper assessments by offering students credit for participating and by providing multiple reminders to complete the assessment.
Conclusions: We conclude that online, out-of-class administration of RBAs can save class and instructor time while providing participation rates and performance results equivalent to those of in-class paper-and-pencil tests.
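To illustrate the kind of hierarchical linear model described in the Results (student scores nested within course sections, with assessment condition as a fixed effect), here is a minimal sketch in Python using statsmodels. The data, column names, and effect sizes are synthetic assumptions for illustration only; this is not the authors' analysis code.

```python
# Minimal sketch of a random-intercept hierarchical linear model:
# post-test score ~ assessment condition, with sections as grouping units.
# All data and column names below are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_sections, n_per_section = 25, 52
section = np.repeat(np.arange(n_sections), n_per_section)
online = rng.integers(0, 2, section.size)                 # 0 = paper in class, 1 = online
section_effect = rng.normal(0, 5, n_sections)[section]    # section-level variation
score = 60 + 0 * online + section_effect + rng.normal(0, 10, section.size)

df = pd.DataFrame({"score": score, "online": online, "section": section})

# Random intercept for each section; 'online' is the fixed effect of interest.
model = smf.mixedlm("score ~ online", data=df, groups=df["section"])
result = model.fit()
print(result.summary())  # a near-zero 'online' coefficient suggests equivalent performance
```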
2017 Physics Education Research Conference Proceedings | 2018
Manher Jariwala; Jayson M. Nissen; Xochith Herrera; Eleanor W. Close; Ben Van Dusen
This study investigates differences in student participation rates between in-class and online administrations of research-based assessments. A sample of 1,310 students from 25 sections of 3 different introductory physics courses over two semesters was instructed to complete the CLASS attitudinal survey and the concept inventory relevant to their course, either the FCI or the CSEM. Each student was randomly assigned to take one of the surveys in class and the other survey online at home using the Learning About STEM Student Outcomes (LASSO) platform. Results indicate large variations in participation rates across both test conditions (online and in class). A hierarchical generalized linear model (HGLM) of the student data using logistic regression indicates that student grades in the course and faculty assessment administration practices were both significant predictors of student participation. When the recommended online assessment administration practices were implemented, participation rates were similar across test conditions. Implications for student and course assessment methodologies are discussed.
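As a rough illustration of the hierarchical generalized linear model mentioned above (logistic regression on participation, with students nested in sections), the sketch below uses statsmodels' Bayesian mixed-effects GLM as one way to fit such a model. All variable names, data, and coefficients are hypothetical; this is not the authors' analysis.

```python
# Sketch of a hierarchical logistic regression for participation, with a
# random intercept per section. Data and variable names are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(1)
n_sections, n_per_section = 25, 52
section = np.repeat(np.arange(n_sections), n_per_section)
grade = rng.normal(3.0, 0.7, section.size)       # course grade on a GPA-like scale
reminders = rng.integers(0, 2, section.size)     # instructor offered credit/reminders
section_effect = rng.normal(0, 0.5, n_sections)[section]
logit_p = -1.0 + 0.8 * grade + 1.2 * reminders + section_effect
participated = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

df = pd.DataFrame({"participated": participated, "grade": grade,
                   "reminders": reminders, "section": section})

# Fixed effects for grade and administration practice, random intercept per
# section, fit by variational Bayes.
model = BinomialBayesMixedGLM.from_formula(
    "participated ~ grade + reminders",
    {"section": "0 + C(section)"},
    df,
)
result = model.fit_vb()
print(result.summary())
```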
2016 Physics Education Research Conference Proceedings | 2016
Alexander P. Becker; Bennett Goldberg; Manher Jariwala
We study how graduate student teaching fellows (TFs) and undergraduate learning assistants (LAs) view their roles and responsibilities as educators in undergraduate classrooms. We present results from a survey of 35 physics TFs and LAs across a range of physics classes, measuring their expectations of their teaching mission with regard to factors such as classroom authority, student interaction time, responsibility for student learning, and helpfulness to students. We further analyze their answers by the classroom format in which they have taught. We find that the perceptions TFs and LAs express in the survey regarding their roles in the classroom are similar; however, differences emerge on questions concerning teacher-student interactions.
Science Education | 2015
Peter Garik; Luciana Garbayo; Yann Benétreau-Dupin; Charles Winrich; Andrew Duffy; Nicholas Gross; Manher Jariwala
2017 Physics Education Research Conference Proceedings | 2018
Jayson M. Nissen; Manher Jariwala; Xochith Herrera; Eleanor W. Close; Ben Van Dusen
arXiv: Physics Education | 2017
Jayson M. Nissen; Manher Jariwala; Xochith Herrera; Eleanor W. Close; Ben Van Dusen
Bulletin of the American Physical Society | 2017
Manher Jariwala
Bulletin of the American Physical Society | 2016
Manher Jariwala; Hunter Close; David G. Haase
Bulletin of the American Physical Society | 2011
Andrew Duffy; Manher Jariwala; Peter Garik