Daniel T. Hickey
Indiana University
Publications
Featured research published by Daniel T. Hickey.
International Journal of Science Education | 2008
Dionne I. Cross; Gita Taasoobshirazi; Sean Hendricks; Daniel T. Hickey
In this paper we explore the relationship between learning gains, measured through pre‐assessment and post‐assessment, and engagement in scientific argumentation. To do so, we examine group discourse and individual learning during the implementation of NASA Classroom of the Future’s BioBLAST!® (BB) software program in a high school biology classroom. We found that the argumentative structures, the quality of those structures, and the identities that students take on during collaborative group work are critical in influencing student learning and achievement in science. We offer recommendations for instructors implementing argumentation in their science classrooms and suggest directions for future research in this area.
Educational Psychologist | 2001
Mary McCaslin; Daniel T. Hickey
Psychology has long been a field beset with identity crises of one sort or another. At midcentury, psychology openly struggled with self-definition (what is psychology?) and with the role it was to play (whom or what does it serve?) in individual and societal issues. Educational psychology has suffered similar identity issues. This article briefly examines the history and futility of educational psychology's in-house fights over mission and contests for theoretical dominance, allegedly in the name of unity. It suggests instead the desirability of collaboration among diverse participants and theoretical integration for the improvement of educational practices, and illustrates this goal with a discussion of current work within a social constructivist framework.
International Journal of Science Education | 2007
Kate T. Anderson; Steven J. Zuiker; Gita Taasoobshirazi; Daniel T. Hickey
This study details an innovative approach to coordinating and enhancing multiple levels of assessment and discursive feedback around an existing multi‐media curricular environment called Astronomy Village®. As part of a broader design‐based research programme, the study analysed small group interactions in feedback activities across two design cycles. The goal of this analysis is to develop an understanding of the ways that a situative approach to assessment and practice supports learning. Findings demonstrate ways that student and teacher engagement in collaborative activities supports and constrains meaningful understanding, which we consider in terms of a trajectory of participation in and across conversations and written assessments, as well as individual learning gains on formal classroom examinations and standards‐oriented external tests. Analyses of complementary formulations of domain concepts (discourse practices and assessment performance) suggest that participation in social forms of scientific engagement supports both learning and subsequent performance in more formal contexts. We suggest design principles for integrating the formative functions of discursive feedback with the summative functions of traditional assessment, through participation in different forms of science discourse(s).
learning analytics and knowledge | 2014
Philip J. Piety; Daniel T. Hickey; M. J. Bishop
In this paper, we develop a conceptual framework for organizing emerging analytic activities involving educational data that can fall under broad and often loosely defined categories, including Academic/Institutional Analytics, Learning Analytics/Educational Data Mining, Learner Analytics/Personalization, and Systemic Instructional Improvement. While our approach is substantially informed by both higher education and K-12 settings, this framework is developed to apply across all educational contexts where digital data are used to inform learners and the management of learning. Although we can identify movements that are relatively independent of each other today, we believe they will in all cases expand from their current margins to encompass larger domains and increasingly overlap. The growth in these analytic activities leads to the need to find ways to synthesize understandings, find common language, and develop frames of reference to help these movements develop into a field.
Learning and Individual Differences | 1994
Susan C. Fischer; Daniel T. Hickey; James W. Pellegrino; David J. Law
Abstract Four experiments investigated performance differences in dynamic spatial reasoning that reflect differences in cognitive strategies. In Experiment 1, verbal protocols obtained during execution of the arrival-time task revealed a systematic relationship between performance and strategy use. High-performance subjects were more likely than low-performance subjects to use and integrate key information about object velocity and travel distance. Experiments 2 and 3 further showed that verbal and visual feedback, respectively, improve judgment accuracy when stimulus conditions necessitate the integration of object velocity and travel distance information, but not when distance information alone is sufficient to make the judgment. Experiment 4 established the stability of the feedback effects and demonstrated that individual differences in velocity-judgment ability predict the capacity to profit from feedback. The results suggest a tendency to neglect information regarding relative object velocity when making judgments of arrival time.
The Journal of the Learning Sciences | 2012
Daniel T. Hickey; Steven J. Zuiker
Evaluating the impact of instructional innovations and coordinating instruction, assessment, and testing present complex tensions. Many evaluation and coordination efforts aim to address these tensions by using the coherence provided by modern cognitive science perspectives on domain-specific learning. This paper introduces an alternative framework that uses emerging situative assessment perspectives to align learning across increasingly formal levels of educational practice. This framework emerged from 2 design studies of a 20-hr high school genetics curriculum that used the GenScope computer-based modeling software. The 1st study aligned learning across (a) the contextualized enactment of inquiry-oriented activities in GenScope, (b) “feedback conversations” around informal embedded assessments, and (c) a formal performance assessment; the 2nd study extended this alignment to a conventional achievement test. Design-based refinements ultimately delivered gains of nearly 2 SD on the performance assessment and more than 1 SD in achievement. These compared to gains of 0.25 and 0.50 SD, respectively, in well-matched comparison classrooms. General and specific assessment design principles for aligning instruction, assessment, and testing and for evaluating instructional innovations are presented.
Journal of Educational Computing Research | 2011
Daniel T. Hickey; Jenna McWilliams; Michelle A. Honeyford
Traditional literacy instruction is perhaps still necessary but is certainly no longer sufficient to prepare learners for participation in the range of literacy practices that characterize an increasingly participatory culture. This article identifies discrepancies between traditional instructional practices that emphasize individual mastery of abstract concepts and skills and new media literacy practices that rely upon collaborative, social, and context-specific activity. In particular, mainstream assessment practices become problematic for teachers who are interested in integrating these so-called “participatory practices” into their classrooms. Through a description of a year-long collaboration around a secondary language arts curriculum, we present an assessment framework designed to support a social model of learning and to help prepare learners for engagement with and participation in a range of knowledge-building and problem-solving activities and communities, while supporting gains in more traditional curricular and standards-based assessments. This framework, which we call “participatory assessment,” builds on previous work in science and math instruction, as well as in immersive video games, and extends that work into the secondary English language arts classroom. This article describes the curriculum, the approach, and some of the assessment design principles that emerged.
Archive | 1994
Susan R. Goldman; Anthony J. Petrosino; Robert D. Sherwood; Steve Garrison; Daniel T. Hickey; John D. Bransford; James W. Pellegrino
The Scientists-in-Action series is a multimedia environment for anchoring science instruction in meaningful contexts. Video anchors are designed according to research-based design principles and used in classroom contexts with adolescents. In two experiments, students who worked with an episode about a chemical spill showed gains in content knowledge and more positive attitudes toward science and scientists than did students who did not see the episode.
American Journal of Distance Education | 2013
Daniel T. Hickey; Rebecca C. Itow; Andi Rehak; Katerina Schenke; Cathy Tran
Erin Knight is the senior learning director at the Mozilla Foundation and co-creator of Mozilla’s Open Badges Infrastructure. Erin is leading the effort at the Mozilla Foundation to support the dynamic community that has emerged around digital badges. Daniel Hickey directs the Badge Design Principles Documentation Project at Indiana University. This two-year project was launched in 2012 to capture and share the new knowledge being generated across thirty diverse educational projects that were funded by the MacArthur and Gates Foundations to incorporate digital badges. He and his colleagues are currently identifying the general design principles for using badges to recognize, assess, motivate, and evaluate learning. In this interview, team members Rebecca Itow, Andi Rehak, Katerina Schenke, and Cathy Tran also pose questions to Erin Knight.
The Information Society | 2016
Carla Casilli; Daniel T. Hickey
Open digital badges are Web-enabled tokens of learning and accomplishment. They operate in an environment of explicit (rather than tacit) trust; open badges provide issuers the ability to include specific claims and to associate those claims with detailed supporting evidence. Earners are encouraged to share their badges over social networks, e-mail, and websites, and the information they contain is expected to circulate readily in these spaces. Building upon current concepts and theories from the information sciences and learning sciences, this article shows how the informational affordances of digital badges are transforming education and learning generally, in particular by transcending conventional paradigms of academic credentialing and educational assessment.