David F. Feldon
Utah State University
Publications
Featured research published by David F. Feldon.
Games and Culture | 2010
Yasmin B. Kafai; Maria Quintero; David F. Feldon
Learning scientists have created and used virtual worlds to support players’ historical, scientific, and ecological inquiries. Much less explored has been the impact of community events on players’ investigations in virtual worlds. The authors present here the case of a community event, Whypox, a virtual epidemic whose annual outbreak in Whyville affects players’ communication and appearance. The authors analyze the different levels of participation, ranging from casual to systematic, in which players sought out more information about Whypox, participated in online discussions about its causes, and investigated different scenarios with simulations. The discussion examines ethical concerns and the contributions of the findings to the design of such community events and educational resources in virtual worlds to support informal learning.
Journal of Cognitive Engineering and Decision Making | 2013
Colby Tofel-Grehl; David F. Feldon
Cognitive task analysis (CTA) is enjoying growing popularity in both research and practice as a foundational element of instructional design. However, there exists relatively little research exploring its value as a foundation for training through controlled studies. Furthermore, highly individualized approaches to conducting CTA do not permit broadly generalizable conclusions to be drawn from the findings of individual studies. Thus, examining the magnitude of observed effects across studies from various domains and CTA practitioners is essential for assessing replicable effects. This study reports the findings from a meta-analysis that examines the overall effectiveness of CTA across practitioners and settings in relation to other means for identifying and representing instructional content. Overall, the effect of CTA-based instruction is large (Hedges’s g = 0.871). However, effect sizes vary substantially by both CTA method used and training context. Though limited by a relatively small number of studies, the notable effect size indicates that the information elicited through CTA provides a strong basis for highly effective instruction.
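For readers unfamiliar with the effect-size metric reported above, the sketch below shows how Hedges’s g is conventionally computed for two independent groups (here, a CTA-based condition versus a comparison condition). The function name, variable names, and sample scores are illustrative assumptions, not values or code from the meta-analysis.

```python
import numpy as np

def hedges_g(treatment, control):
    """Hedges's g: standardized mean difference with a small-sample bias correction."""
    t, c = np.asarray(treatment, dtype=float), np.asarray(control, dtype=float)
    n1, n2 = len(t), len(c)
    # Pooled standard deviation (ddof=1 gives the unbiased sample variance)
    pooled_sd = np.sqrt(((n1 - 1) * t.var(ddof=1) + (n2 - 1) * c.var(ddof=1)) / (n1 + n2 - 2))
    d = (t.mean() - c.mean()) / pooled_sd          # Cohen's d
    correction = 1 - 3 / (4 * (n1 + n2) - 9)       # small-sample correction factor J
    return d * correction

# Illustrative (made-up) post-test scores for a CTA-based group vs. a comparison group
cta_scores     = [82, 78, 90, 85, 88, 79, 84]
control_scores = [70, 74, 68, 77, 72, 69, 75]
print(round(hedges_g(cta_scores, control_scores), 3))
```

In a meta-analysis, per-study g values such as these would then typically be pooled with inverse-variance weights to produce an overall estimate like the 0.871 reported above.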
American Educational Research Journal | 2015
David F. Feldon; Michelle Maher; M. Hurst; Briana E. Timmerman
Faculty mentorship is thought to be a linchpin of graduate education in STEM disciplines. This mixed-method study investigates agreement between student mentees’ and their faculty mentors’ perceptions of the students’ developing research knowledge and skills in STEM. We also compare both assessments against independent ratings of the students’ written research proposals. In most cases, students and their mentors identified divergent strengths and weaknesses. However, when mentor-mentee pairs did identify the same characteristics, mentors and mentees disagreed about the mentee’s abilities in 44% of cases in the Fall semester and 75% of cases in the Spring semester. When compared against performance-based assessments of mentees’ work, neither faculty mentors’ nor their mentees’ perceptions aligned with rubric scores at rates greater than chance in most categories.
Change: The Magazine of Higher Learning | 2010
David F. Feldon
We always tell our students that there are no shortcuts, that important ideas are nuanced, and that recognizing subtle distinctions is an essential critical-thinking skill. Mastery of a discipline,...
The Journal of Higher Education | 2015
Joanna Gilmore; Michelle Vieyra; Briana E. Timmerman; David F. Feldon; Michelle Maher
Undergraduate research experiences have been adopted across higher education institutions. However, most studies examining benefits derived from undergraduate research rely on self-report of skill development. This study used an empirical assessment of research skills to investigate associations between undergraduate research experiences and research skill performance in graduate school. Research experience characteristics, including duration, autonomy, collaboration, and motivation, were also examined. Undergraduate research experience was linked to heightened graduate school performance in all research skills assessed. While autonomy and collaboration were highlighted in student interviews, duration was most strongly correlated with significant increases in research skill performance. Based on these findings, we advocate for the inclusion of research experiences in the undergraduate science curriculum, coupled with the creation of centralized offices of undergraduate research and faculty incentives for involving undergraduates in their research.
Science | 2010
David F. Feldon; Michelle Maher; Briana E. Timmerman
Performance-based assessments of student skill development can help inform decisions about improving graduate education. Understanding the scholarly development of Ph.D. students in science, technology, engineering, and mathematics (STEM) is vital to the preparation of the scientific workforce. During doctoral study, students learn to be professional scientists and acquire the competencies to succeed in those roles. However, this complex process is not well studied. Research to date suffers from overreliance on a narrow range of methods that cannot provide data appropriate for addressing questions of causality or effectiveness of specific practices in doctoral education. We advocate a shift in focus from student and instructor self-report toward the use of actual performance data as a remedy that can ultimately contribute to improved student outcomes.
Proceedings of the National Academy of Sciences of the United States of America | 2017
David F. Feldon; Soojeong Jeong; James Peugh; Josipa Roksa; Cathy Maahs-Fladung; Alok Shenoy; Michael Oliva
Significance: To increase the effectiveness of graduate research training, many universities have introduced boot camps and bridge programs lasting several days to several weeks. National Science Foundation and National Institutes of Health currently support such interventions with nearly $28 million in active awards. Previous evidence for the efficacy of this format exists primarily in the form of anecdotes and end-of-course surveys. Here we show that participation in such short-format interventions is not associated with observable benefits related to skill development, scholarly productivity, or socialization into the academic community. Analyzing data from 294 PhD students in life sciences from 53 US institutions, we found no evidence of effectiveness across 115 variables. We conclude that boot camps and other short formats may not durably impact student outcomes.
Many PhD programs incorporate boot camps and summer bridge programs to accelerate the development of doctoral students’ research skills and acculturation into their respective disciplines. These brief, high-intensity experiences span no more than several weeks and are typically designed to expose graduate students to data analysis techniques, to develop scientific writing skills, and to better embed incoming students into the scholarly community. However, there is no previous study that directly measures the outcomes of PhD students who participate in such programs and compares them to the outcomes of students who did not participate. Likewise, no previous study has used a longitudinal design to assess these outcomes over time. Here we show that participation in such programs is not associated with detectable benefits related to skill development, socialization into the academic community, or scholarly productivity for students in our sample. Analyzing data from 294 PhD students in the life sciences from 53 US institutions, we found no statistically significant differences in outcomes between participants and nonparticipants across 115 variables. These results stand in contrast to prior studies presenting boot camps as effective interventions based on participant satisfaction and perceived value. Many universities and government agencies (e.g., National Institutes of Health and National Science Foundation) invest substantial resources in boot camp and summer bridge activities in the hopes of better supporting scientific workforce development. Our findings do not reveal any measurable benefits to students, indicating that an allocation of limited resources to alternative strategies with stronger empirical foundations warrants consideration.
CBE-Life Sciences Education | 2017
David F. Feldon; James Peugh; M. Maher; Josipa Roksa; Colby Tofel-Grehl
American Educational Research Journal | 2016
David F. Feldon; M. Maher; Josipa Roksa; James Peugh
A national sample of female PhD students logged significantly more hours conducting research than their male counterparts. However, males were 15% more likely to be listed as authors on journal articles per 100 hours of research time, reflecting inequality on an essential metric of scholarly productivity that directly impacts competitiveness for academic positions.
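The authorship gap described above is a rate comparison rather than a raw count. The toy calculation below, using entirely hypothetical numbers (not data from the study), illustrates how normalizing authorship by logged research hours can reveal inequality even when women log more total hours.

```python
# Hypothetical illustration of an authorship gap per 100 research hours.
# All figures are invented for illustration; they are not from the study.

def authorships_per_100_hours(authorships, hours):
    """Authorship rate normalized to 100 hours of logged research time."""
    return 100.0 * authorships / hours

female_rate = authorships_per_100_hours(authorships=11, hours=1100)  # 1.00 per 100 h
male_rate   = authorships_per_100_hours(authorships=10, hours=870)   # ~1.15 per 100 h

print(f"female rate: {female_rate:.2f} per 100 h")
print(f"male rate:   {male_rate:.2f} per 100 h")
print(f"male/female ratio: {male_rate / female_rate:.2f}")  # ~1.15, i.e., ~15% higher
```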