Stephen M. Alessi
University of Iowa
Publications
Featured research published by Stephen M. Alessi.
Discourse Processes | 1979
Stephen M. Alessi; Thomas H. Anderson; Ernest T. Goetz
Looking back at relevant sections of previously read text is proposed as a useful fix-up strategy when comprehension fails while studying text. Subjects read 24 pages of text and answered frequent inserted questions that assessed their comprehension of the text. About half of the subjects were branched back to reread prerequisite information when it was later needed but not fully understood. Subjects receiving lookbacks showed better comprehension of later information dependent on the prerequisite information. In light of these results, the training of natural lookbacks during study holds promise as a means of improving students’ study behaviors.
Simulation & Gaming | 2000
Stephen M. Alessi
Although learning in the system dynamics approach is generally accomplished through student model creation, there are many cases where learning may be better facilitated through incorporation of system dynamics models into more guided simulations. A model of simulation design is described and illustrated wherein designers create models in a system dynamics package and then transfer those models into a general instructional authoring system for the addition of instructional support features.
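As context for the system dynamics approach described in this abstract, a system dynamics model represents a system as stocks changed over time by inflows and outflows. A minimal sketch of such a model, numerically integrated with Euler's method, might look as follows; the chosen stock (population), rate constants, and function names are illustrative assumptions, not taken from the paper or from any particular modeling package:

```python
# Minimal system dynamics sketch: one stock (population) with one
# inflow (births) and one outflow (deaths), integrated by Euler's method.
# All names and rate values are illustrative assumptions.

def simulate(stock=1000.0, birth_rate=0.03, death_rate=0.01,
             dt=0.25, years=10.0):
    """Return the stock's trajectory as a list of (time, value) pairs."""
    steps = int(years / dt)
    trajectory = [(0.0, stock)]
    for step in range(1, steps + 1):
        inflow = birth_rate * stock     # births per year
        outflow = death_rate * stock    # deaths per year
        stock += (inflow - outflow) * dt
        trajectory.append((step * dt, stock))
    return trajectory

result = simulate()
print(f"final population after 10 years: {result[-1][1]:.1f}")
```

Dedicated system dynamics packages express the same stock-and-flow structure graphically and handle the integration internally; this sketch only shows the underlying computation such a model performs.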
Archive | 2000
Stephen M. Alessi
The educational use of simulation encompasses two very different approaches: learners using simulations created by others, and learners building simulations themselves. At a general level it is tempting to say that the former (using simulations) applies more to procedural learning while the latter (building simulations) applies more to declarative learning. But what of the specific cases, the exceptions, and most importantly the common situation where both procedural and declarative learning must occur? This paper addresses two questions. First, what are the conditions which suggest choosing one approach (using versus building) over the other, or a combination of both? Second, what should be the characteristics of the learning environment to support each approach? Answering the first question requires analyzing the characteristics of the knowledge, the learners, and the learning processes that must occur. Answering the second question requires analyzing the characteristics of modeling, of simulations, and of learning environments needed to foster the approach and the desired learning processes.
Journal of research on computing in education | 1993
Yu-Fen Shih; Stephen M. Alessi
Abstract Learning and transfer of procedural skills was measured as a function of conceptual understanding (subjects’ mental models) induced by conceptual models in the form of computer graphics and animation during computer-based instruction. Three groups of nonprogrammers learned and practiced either code evaluation, code evaluation with the aid of conceptual models, or code generation. Practicing code evaluation with conceptual models was found to facilitate conceptual understanding, learning of code evaluation, and transfer to code generation. A positive relationship was found between the quality of subjects’ mental models and transfer ability, regardless of the experimental condition. The findings suggest that both number of shared productions and level of declarative knowledge are developed during practice and that transfer is a function of both. In practical terms, conceptual methods of instruction fostering appropriate mental models are suggested for cognitive skill learning.
Educational Psychologist | 1975
Thomas H. Anderson; Richard C. Anderson; Bruce R. Dalgaard; Donald W. Paden; W. Barry Biddle; John R. Surber; Stephen M. Alessi
Abstract A computer assisted study management system (CAISMS) was experimentally investigated in the context of an introductory college economics course. The 169 students in the CAISMS and control classes attended similar lecture‐discussion classes and received an identical battery of achievement tests and questionnaires during the semester. Results from a multivariate analysis of covariance indicated that the CAISMS group scored significantly higher (p <.05) on achievement tests than the control group. In addition, analysis of variance showed that the attitudes of CAISMS students were more positive (p <.01) than were those of control students. Attrition rates were approximately equal in the two groups.
Educational Psychologist | 1974
Thomas H. Anderson; Richard C. Anderson; Bruce R. Dalgaard; Edward J. Wietecha; W. Barry Biddle; Donald W. Paden; H. Richard Smock; Stephen M. Alessi; John R. Surber; Laura L. Klemt
Abstract The Computer Aided Instruction Study Management System (CAISMS) was designed to maintain attentive study of instructional materials. Students often fail to learn from books and other instructional sources because they do not study them carefully enough. CAISMS was designed to intermittently question the student about what he is studying so as to maintain deep processing. In practice, the student signs in at a computer terminal and receives a brief study assignment. Upon completing the assignment in a nearby work space, the student again signs in. This time he receives a short quiz over the assignment just finished. The cycle starts again with the next assignment. CAISMS was used in an introductory level college economics course with approximately 70 students. Achievement, procedural, and attitudinal data indicate that the study management technique is feasible to administer and potentially effective in producing achievement gains over traditional ongoing instruction.
Journal of research on computing in education | 1988
Stanley R. Trollip; Stephen M. Alessi
Abstract Computer-based instruction should be regarded as a tool for enhancing the effectiveness of teachers. This can happen in two ways: by improving the general learning environment for the students, and by freeing the teacher from unproductive, time-consuming tasks. In this paper, we highlight important criteria for deciding whether to integrate computers into classroom instruction.
Simulation & Gaming | 2010
Birgit Kopainsky; Matteo Pedercini; Pål I. Davidsen; Stephen M. Alessi
Simulation models provide decision support to long-term planning processes. The Bergen Learning Environment for National Development (BLEND) is a game based on a simplified version of the Millennium Institute’s Threshold 21 model (T21) that sensitizes policy makers in sub-Saharan African nations to the need for simulation-based decision support. The simplification eliminates or aggregates details about individual policy sectors and maintains cross-sector relationships. Validation indicates that the full and the simplified T21 models generate very similar behavior patterns for a wide range of policy scenarios. Pilot tests demonstrate that the simplified T21 model contributes to the learning goals of BLEND. The debriefing employs causal loop diagrams and simulation for structural explanations of the behavior observed during the game. BLEND workshops with repeated runs of the game, full debriefing sessions, and different formats of instructional support will contribute further to research on dynamic decision making and learning about tasks with great complexity.
Journal of Instructional Development | 1988
Stephen M. Alessi
This article describes the development of an interactive videodisc in a graduate course. Design of interactive video and instruction about interactive video are discussed.
Computers in Human Behavior | 2013
Elisabeth M.C. Taminiau; Liesbeth Kester; Gemma Corbalan; Stephen M. Alessi; Erling Moxnes; Wim H. Gijselaers; Paul A. Kirschner; Jeroen J. G. van Merriënboer
In on-demand education, learners are required to plan their own learning trajectory by selecting suitable learning tasks. A positive effect on learning is expected when learners select tasks that help them fulfil their individual learning needs. However, the selection of suitable tasks is a difficult process for learners with little domain knowledge and suboptimal task-selection skills. A common solution for helping learners deal with on-demand education and develop domain-specific skills is to give them advice on task selection. In a randomized experiment, learners (N = 30) worked on learning tasks in the domain of system dynamics and received either advice or no advice on the selection of new learning tasks. Surprisingly, the no-advice group outperformed the advice group on a post-test measuring domain-specific skills. It is concluded that giving advice on task selection prevents learners from thinking about how the process of task selection works. The advice seems to supplant rather than support their consideration of why they should perform the advised task, which results in negative effects on learning. Implications for future research on giving advice in on-demand education are discussed.