Robin Anderson
James Madison University
Publication
Featured research published by Robin Anderson.
Interdisciplinary Journal of Problem-based Learning | 2010
Olga Pierrakos; Anna Zilberberg; Robin Anderson
There has been much criticism that science, technology, engineering, and mathematics (STEM) education does not focus enough on problem solving, especially in authentic real-world contexts, which are most often associated with ill-structured domains. To improve education, it is essential that curricula promote high levels of cognitive development by exposing students to authentic problems. Problem-based learning (PBL) is a student-centered pedagogy that offers a strong framework upon which to build a curriculum to teach students essential problem-solving skills. An authentic problem-solving experience, which is highly valued and promoted outside of the classroom yet almost nonexistent in the classroom, is undergraduate research (UR). Herein, the goal was to understand the nature of UR problems as a means of developing recommendations for translating UR problems and experiences into the classroom using PBL methodologies. Using a survey design, data were collected from sixty students participating in summer undergraduate research experiences. Our findings revealed that moderately structured and fairly complex UR problems are well suited for PBL implementation in the classroom because they trigger the use of multiple cognitive operations in the context of a continuously changing, dynamic, and interdisciplinary team environment.
Journal of Computer Information Systems | 2015
Nancy L. Harris; S. E. Kruck; Polly Cushman; Robin Anderson
IT jobs continue to increase; however, the number of women pursuing careers in technology has declined. We conducted a study of incoming freshmen to examine previous course-taking patterns, access to computers, interest in technology majors, and reasons students opted in or out of a technology major. Males and females exhibited differences in course-taking patterns: we found a significant difference in physics course-taking, but no difference in computer access. We also report on perceptions of technology majors.
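The abstract reports a significant gender difference in physics course-taking. As a purely illustrative sketch (hypothetical counts, not the study's data or the authors' analysis), a chi-square test of independence is one common way such a comparison could be run:

```python
# Illustrative only: hypothetical counts, not the study's data.
# Compares high-school physics course-taking across two groups with a
# chi-square test of independence.
from scipy.stats import chi2_contingency

# Rows: group 1, group 2; columns: took physics, did not take physics (made-up numbers)
observed = [[45, 55],
            [30, 70]]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```

A small p-value would indicate that course-taking and group membership are not independent in the sampled population.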
Frontiers in Education Conference | 2010
Olga Pierrakos; Heather Watson; Ron Kander; Javarro Russell; Robin Anderson
Most of us acknowledge that not all problems are created equal and that different types of problems lead to different learning outcomes for students. For example, it is well known that undergraduate engineering courses mainly focus on problems that are well-structured with known, correct solutions, yet real-world practice is dominated by complex and ill-structured problems. It is therefore imperative that engineering students begin practicing real-world problem solving within the undergraduate curriculum. Problem-based learning (PBL) is a powerful student-centered pedagogy that offers a strong framework upon which to build a curriculum that will allow all students to learn essential, real-world, and globally competitive problem-solving skills. This special session is thus designed not only to provide participants with background on PBL theory and the nature of problems, but also to provide them with materials and resources for developing, classifying, and assessing a variety of PBL activities in their courses. It is hoped that this session will enable the facilitators and attendees to generate a collection of peer-developed ideas and feedback on understanding the nature of problems and problem solving. This session could have transformative implications for engineering education and student learning.
International Journal of Testing | 2014
Anna Zilberberg; Sara J. Finney; Kimberly R. Marsh; Robin Anderson
Given the worldwide prevalence of low-stakes testing for monitoring educational quality and students’ progress through school (e.g., the Trends in International Mathematics and Science Study and the Program for International Student Assessment), the interpretability of the resulting test scores is of global concern. The nonconsequential nature of low-stakes tests can undermine students’ test-taking motivation, artificially deflating performance and thus jeopardizing the validity of test-based inferences, whether they pertain to programs, institutions, or nations (Eklöf, 2007, 2010; Stanat & Lüdtke, 2013; Wise & DeMars, 2005). Moreover, students in countries such as the United States, where academic progress over the course of K–12 (kindergarten through Grade 12) is systematically assessed, are likely to develop antagonistic attitudes toward low-stakes testing by the time they enter college. The relationship between such attitudes, test-taking motivation, and performance on a low-stakes university accountability test was modeled via path analysis. Results indicated that the effects of attitudes were indirect (via test-taking motivation) and minimal, suggesting the influence of attitudes on test performance is negligible, further supporting the validity of inferences made from such low-stakes tests. Implications for international assessment are discussed.
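The path analysis described above decomposes the effect of testing attitudes on performance into a direct path and an indirect path through test-taking motivation. Below is a minimal sketch of estimating such a single-mediator indirect effect with two ordinary least-squares regressions on simulated data; the variable names, coefficients, and data are hypothetical, and this is not the authors' model specification:

```python
# Minimal sketch of a single-mediator path model (attitudes -> motivation -> performance)
# on simulated data; not the authors' model or data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
attitudes = rng.normal(size=n)                        # X: attitudes toward low-stakes testing
motivation = 0.4 * attitudes + rng.normal(size=n)     # M: test-taking motivation
performance = 0.5 * motivation + 0.05 * attitudes + rng.normal(size=n)  # Y: test score

# Path a: X -> M
a = sm.OLS(motivation, sm.add_constant(attitudes)).fit().params[1]

# Paths b (M -> Y) and c' (direct X -> Y), estimated jointly
X = sm.add_constant(np.column_stack([motivation, attitudes]))
b, c_prime = sm.OLS(performance, X).fit().params[1:3]

print(f"indirect effect (a*b) = {a * b:.3f}, direct effect (c') = {c_prime:.3f}")
```

With data like the abstract describes, an indirect effect that dwarfs the direct effect would correspond to the reported finding that attitudes influence performance mainly through motivation, and only minimally.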
Frontiers in Education Conference | 2016
Olga Pierrakos; Robin Anderson; Elise Barrella
Problem-solving is generally regarded as the most important cognitive activity in everyday and professional practice. Problems in real-world practice have been described as messy, complex, and ill-structured, whereas many engineering classroom problems have been described as well-structured with single correct solutions. How do we prepare our students for real-world problem solving? In this collaborative and participant-centered workshop, faculty will be introduced to a novel and adaptive Problem-Based Learning (PBL) model developed and implemented in JMU's Engineering program over the past eight years and supported by NSF awards. Participants will be provided with PBL theory, PBL examples, a PBL classification framework, assessment tools, and a PBL template for use across courses and curricula.
Frontiers in Education Conference | 2017
Nicholas A. Curtis; Olga Pierrakos; Robin Anderson
Engaging correctly in the instrument development process takes time. According to Benson's model of construct validation (1), there are at least three lengthy stages (substantive, structural, external) to construct validation when developing a new measure. This study examines how instrument structure impacts the validation process and what it can tell us about instrument revision. We take a closer look at the development of the Engineering Student Identity Scale (ESIS). After engaging in the in-depth substantive stage of instrument development, the instrument developers began the process of examining whether the anticipated factor structure of the measure fit the observed data. Results from a series of confirmatory factor analyses (CFAs) indicated that none of the hypothesized models fit the observed data. While instrument revision at any stage of development is important and likely to lead to a better measure, it is important to gather as much information as possible from each stage to guide revisions. In an effort to better understand how the items on the ESIS function together, a cross-disciplinary team of psychology and engineering experts conducted a series of CFAs and an exploratory factor analysis (EFA) to examine the factor structure present in the observed data. Initial results support a six-factor structure. Implications for the next round of instrument development are discussed.
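As an illustration of the kind of exploratory follow-up described above, the sketch below runs an exploratory factor analysis on simulated item responses with a built-in six-factor structure. It assumes the third-party factor_analyzer package and does not use the ESIS items, data, or the authors' analysis code:

```python
# Minimal EFA sketch on simulated item responses; not the ESIS data.
# Assumes the third-party `factor_analyzer` package is installed.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(1)
latent = rng.normal(size=(300, 6))                          # six hypothetical latent traits
noise = rng.normal(scale=0.7, size=(300, 24))
items = pd.DataFrame(np.repeat(latent, 4, axis=1) + noise,  # four items per trait
                     columns=[f"item{i + 1}" for i in range(24)])

# Extract six factors with an oblique rotation, mirroring the reported structure.
efa = FactorAnalyzer(n_factors=6, rotation="oblimin")
efa.fit(items)

loadings = pd.DataFrame(efa.loadings_, index=items.columns)
print(loadings.round(2))
print("Proportion of variance per factor:", efa.get_factor_variance()[1].round(2))
```

In practice, the pattern of salient loadings (not just the number of factors retained) would guide which items to revise or drop in the next round of instrument development.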
Frontiers in Education Conference | 2009
Olga Pierrakos; Tk Beam; Jamie Constantz; Aditya Johri; Robin Anderson
Educational Assessment | 2013
Anna Zilberberg; Robin Anderson; Sara J. Finney; Kimberly R. Marsh
2009 Annual Conference & Exposition | 2009
Tk Beam; Olga Pierrakos; Jamie Constantz; Aditya Johri; Robin Anderson
Research & Practice in Assessment | 2012
Anna Zilberberg; Robin Anderson; Peter J. Swerdzewski; Sara J. Finney; Kimberly R. Marsh