Jayson M. Nissen
California State University, Chico
Publications
Featured research published by Jayson M. Nissen.
2017 Physics Education Research Conference Proceedings | 2018
Ben Van Dusen; Jayson M. Nissen
Creating equitable performance outcomes among students is a focus of many instructors and researchers. One strand of this effort examines disparities in physics student performance across genders, which is a well-established problem. A less common strand examines disparities across racial and ethnic groups, which may have received less attention because low representation rates make it difficult to identify gaps in performance. In this investigation we examined associations between Learning Assistant (LA) supported courses and improved equity in student performance. We built Hierarchical Linear Models of student performance to investigate how performance differed by gender and by race/ethnicity and how LAs may have moderated those differences. Data for the analysis came from pre-post concept inventories in introductory mechanics courses collected through the Learning About STEM Student Outcomes (LASSO) platform. Our models show that gaps in performance across genders and races/ethnicities were similar in size and increased from pre- to post-instruction. LA support is meaningfully and reliably associated with improvement in overall student performance, but not with shifts in within-course performance gaps.
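For readers unfamiliar with this modeling approach, below is a minimal sketch of a two-level hierarchical linear model in Python with statsmodels, with students nested in courses. The column names (post, pre, gender, urm, la_supported, course) and the file name are hypothetical placeholders for illustration, not the authors' actual variables or specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# df: hypothetical student-level table; one row per student.
df = pd.read_csv("lasso_mechanics.csv")  # assumed file name

# Random-intercept model: students (level 1) nested in courses (level 2).
# Interactions between demographic indicators and LA support test whether
# LAs moderate gender and race/ethnicity performance gaps.
model = smf.mixedlm(
    "post ~ pre + gender * la_supported + urm * la_supported",
    data=df,
    groups=df["course"],  # course-level random intercept
)
result = model.fit()
print(result.summary())
```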
International Journal of STEM Education | 2018
Jayson M. Nissen; Manher Jariwala; Eleanor W. Close; Ben Van Dusen
Background: High-stakes assessments, such as the Graduate Record Examination, have transitioned from paper to computer administration. Low-stakes research-based assessments (RBAs), such as the Force Concept Inventory, have only recently begun this transition to computer administration with online services. These online services can simplify administering, scoring, and interpreting assessments, thereby reducing barriers to instructors' use of RBAs. By supporting instructors' objective assessment of the efficacy of their courses, these services can stimulate instructors to transform their courses to improve student outcomes. We investigate the extent to which RBAs administered outside of class with the online Learning About STEM Student Outcomes (LASSO) platform provide data equivalent to tests administered on paper in class, in terms of both student participation and performance. We use an experimental design to investigate the differences between these two assessment conditions with 1,310 students in 25 sections of 3 college physics courses spanning 2 semesters.
Results: Analysis conducted using hierarchical linear models indicates that student performance on low-stakes RBAs is equivalent for online (out-of-class) and paper-and-pencil (in-class) administrations. The models also show differences in participation rates across assessment conditions and student grades; however, instructors can achieve participation rates with online assessments equivalent to paper assessments by offering students credit for participating and by providing multiple reminders to complete the assessment.
Conclusions: We conclude that online, out-of-class administration of RBAs can save class and instructor time while providing participation rates and performance results equivalent to in-class paper-and-pencil tests.
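As a rough illustration of the equivalence comparison (not the paper's exact specification), one could encode administration condition as a fixed effect in a course-level random-intercept model; a condition coefficient near zero would be consistent with equivalent performance across modes. All variable names below are assumed.

```python
import statsmodels.formula.api as smf

# df: hypothetical data, one row per student; 'condition' is 1 for the
# online (out-of-class) LASSO administration, 0 for in-class paper.
model = smf.mixedlm(
    "post ~ pre + condition",
    data=df,
    groups=df["course"],  # random intercept per course section
)
result = model.fit()

# A condition coefficient indistinguishable from zero is consistent with
# equivalent performance across administration modes.
print(result.params["condition"], result.pvalues["condition"])
```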
2017 Physics Education Research Conference Proceedings | 2018
Manher Jariwala; Jayson M. Nissen; Xochith Herrera; Eleanor W. Close; Ben Van Dusen
This study investigates differences in student participation rates between in-class and online administrations of research-based assessments. A sample of 1,310 students from 25 sections of 3 different introductory physics courses over two semesters was instructed to complete the CLASS attitudinal survey and the concept inventory relevant to their course, either the FCI or the CSEM. Each student was randomly assigned to take one of the surveys in class and the other survey online at home using the Learning About STEM Student Outcomes (LASSO) platform. Results indicate large variations in participation rates across both test conditions (online and in class). A hierarchical generalized linear model (HGLM) of the student data utilizing logistic regression indicates that student grades in the course and faculty assessment administration practices were both significant predictors of student participation. When the recommended online assessment administration practices were implemented, participation rates were similar across test conditions. Implications for student and course assessment methodologies will be discussed.
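A hierarchical logistic model of participation can be sketched with statsmodels' Bayesian mixed GLM, as shown below. The formula and variable names ('participated', 'grade', 'online', 'course') are assumptions for illustration, not the study's actual specification.

```python
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# df: hypothetical table with one row per student per assigned survey.
# 'participated' is 0/1; 'grade' is the course grade; 'online' flags the
# out-of-class condition; 'course' identifies the section.
model = BinomialBayesMixedGLM.from_formula(
    "participated ~ grade + online",
    vc_formulas={"course": "0 + C(course)"},  # section-level random intercept
    data=df,
)
result = model.fit_vb()  # variational Bayes fit
print(result.summary())
```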
2017 Physics Education Research Conference Proceedings | 2018
David Donnelly; Jean-Michel Mailloux-Huberdeau; Jayson M. Nissen; Eleanor W. Close
At Texas State University, we have been using the Force Concept Inventory (FCI) to assess our introductory mechanics course since the Spring 2011 semester. This provides us with a large data set (N = 1,626) on which to perform detailed statistical analysis of student learning. Recent research has found conflicting results in the relationships between normalized gain 〈g〉, Cohen's d, and pretest mean, which might lead to different interpretations of student learning. Specifically, in one study 〈g〉 was found to positively correlate with both pretest mean and pretest standard deviation, whereas Cohen's d did not; in another study, ANOVA showed no connection between 〈g〉 and pretest mean. We will present a comparison of 〈g〉 and Cohen's d for our data set, and will specifically use these measures to look at performance gaps related to gender and race/ethnicity.
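For reference, the two effect-size measures under comparison are conventionally defined as follows (standard definitions from the literature; the paper's exact notation may differ):

```latex
% Normalized gain (Hake), with scores expressed as percentages of maximum:
\langle g \rangle = \frac{\langle \text{post} \rangle - \langle \text{pre} \rangle}{100\% - \langle \text{pre} \rangle}

% Cohen's d, standardized by the pooled standard deviation:
d = \frac{M_{\text{post}} - M_{\text{pre}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{s_{\text{pre}}^{2} + s_{\text{post}}^{2}}{2}}
```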
2017 Physics Education Research Conference Proceedings | 2018
Daniel Caravez; Angelica De La Torre; Jayson M. Nissen; Ben Van Dusen
A central goal of the Learning Assistant (LA) model is to improve students' learning of science through the transformation of instructor practices. There is minimal existing research on the impact of college physics instructors' experience on their effectiveness. To investigate the association between college introductory physics instructors' experiences with and without LAs and student learning, we drew on data from the Learning About STEM Student Outcomes (LASSO) database. The LASSO database provided us with student-level data (concept inventory scores and demographic data) for 4,365 students and course-level data (instructor experience and course features) for the students' 93 mechanics courses. We performed Hierarchical Multiple Imputation to impute missing data and Hierarchical Linear Modeling to nest students within courses when modeling the associations between instructor experience and student learning. Our models predict that instructors' effectiveness decreases as they gain experience teaching without LAs. However, LA-supported environments appear to remediate this decline: instructor effectiveness is maintained as instructors gain experience teaching with LAs.
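A compressed sketch of the imputation-plus-modeling pipeline is given below, using statsmodels' MICE machinery. Note this is a flat multiple imputation rather than the hierarchical variant the paper describes, and the variable names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.imputation.mice import MICEData

# df: hypothetical student-level table with missing concept-inventory
# scores and demographic fields (categorical fields assumed numerically
# encoded); 'course' nests students in sections.
imp = MICEData(df)

params = []
for _ in range(10):           # 10 completed data sets
    imp.update_all()          # one round of chained-equation imputation
    completed = imp.data.copy()
    fit = smf.mixedlm(
        "post ~ pre + yrs_exp_with_la + yrs_exp_without_la",
        data=completed,
        groups=completed["course"],  # students nested in courses
    ).fit()
    params.append(fit.params)

# Rubin's rule for point estimates: average coefficients across imputations.
pooled = pd.concat(params, axis=1).mean(axis=1)
print(pooled)
```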
arXiv: Physics Education | 2018
Ben Van Dusen; Jayson M. Nissen
arXiv: Physics Education | 2018
Xochith Herrera; Jayson M. Nissen; Ben Van Dusen
arXiv: Physics Education | 2018
Jayson M. Nissen; Robin Donatello; Ben Van Dusen
arXiv: Physics Education | 2018
Ben Van Dusen; Jayson M. Nissen