
Publications


Featured research published by William J. Gibbs.


Journal of Computing in Higher Education | 2003

Indicators of constructivist principles in internet-based courses

Karen M. Partlow; William J. Gibbs

This study set out to identify indicators of constructivist principles applied to the development of Internet-based courses. A peer-nominated panel of nationally recognized experts in constructivist learning theory and instructional technology participated in a 3-round Delphi Web survey, during which they proposed and rated categories and indicators of constructivist-compatible principles. The panel identified a total of 10 categories and 110 indicators of constructivist principles. They rated the importance of 59 of the identified indicators as M = 3.51 or higher (on a 5-point Likert scale). The categories and associated indicators provide an initial framework that may help educators apply constructivist principles to the development of Internet-based courses. They also offer instructors and, in some cases, students a means to evaluate courses.
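
As a concrete illustration of the rating cutoff described above, the following minimal Python sketch averages panel ratings on a 5-point Likert scale and retains indicators at or above the M = 3.51 threshold the study reports. The indicator names and scores are invented for illustration and are not taken from the study.

    # Hypothetical illustration of the Delphi rating cutoff described above.
    # Indicator names and ratings are invented; only the M >= 3.51 threshold
    # comes from the abstract.
    from statistics import mean

    # Each indicator maps to the ratings it received from the expert panel
    # on a 5-point Likert scale (1 = unimportant, 5 = very important).
    ratings = {
        "learners build on prior knowledge": [4, 5, 4, 3, 5],
        "instructor acts as facilitator": [4, 4, 3, 4, 4],
        "drill-and-practice emphasis": [2, 3, 2, 3, 2],
    }

    THRESHOLD = 3.51  # importance cutoff reported in the study

    retained = {name: mean(scores)
                for name, scores in ratings.items()
                if mean(scores) >= THRESHOLD}

    for name, m in sorted(retained.items(), key=lambda kv: -kv[1]):
        print(f"M = {m:.2f}  {name}")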


Journal of Computing in Higher Education | 1998

Implementing On-Line Learning Environments

William J. Gibbs

This paper reviews the implementation of an on-line course and discusses student perceptions of the virtual classroom experience. Students cited numerous reasons for enrolling in an on-line course, including its flexibility and convenience. Course work was completed at a time and place that better accommodated students, and the on-line environment provided learners more control over the pace of instruction. It was also conducive to thoughtful analysis of class questions and commentary. Nevertheless, students felt isolated because they lacked face-to-face interaction with their peers and the instructor. The inability to see facial expressions and non-verbal reactions was perceived to hinder communication. Students and instructors indicated that they spent more time in the virtual classroom than in courses taught with more traditional approaches. The additional time invested in the virtual classroom was related to technical issues and to changes in instructional methods engendered by teaching in an on-line environment.


Journal of Computing in Higher Education | 1995

An Approach to Designing Computer-Based Evaluation of Student Constructed Responses: Effects on Achievement and Instructional Time

William J. Gibbs; Kyle L. Peck

This inquiry examined the effectiveness of self-evaluation strategies to supplement computerized evaluation of constructed-response answers. Additionally, self-evaluation was examined as a means to improve learner recall of factual and comprehensive knowledge. The study compared the effects of five constructed-response answer evaluation strategies on achievement and instructional time during computer-based learning. The five strategies were: (1) computerized evaluation only, (2) student evaluation only, (3) computerized evaluation and student evaluation, (4) student evaluation with required elaboration, and (5) computer and student evaluation with elaboration following conflicting evaluations. Analysis of the collected data revealed that achievement, as measured in this study, was unaffected by evaluation strategy. Treatments likewise did not affect student evaluation of responses. Across all self-evaluation groups, student evaluation did not differ substantially from expert evaluation, which may indicate that students can accurately evaluate their own work. The treatment strategies did differentially affect instructional time, with instructional time increasing as the level of interaction with the instructional software increased. Implications for the design of instructional software are discussed.
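
As a rough illustration of the student-versus-expert comparison reported above, this sketch computes simple percent agreement between two hypothetical sets of correct/incorrect judgments. The data and the agreement measure are assumptions for illustration, not the study's actual analysis.

    # Hypothetical sketch: how closely do student self-evaluations track
    # expert evaluations of the same constructed responses? Data invented.

    def percent_agreement(a, b):
        """Fraction of items on which two raters give the same judgment."""
        if len(a) != len(b):
            raise ValueError("rating lists must be the same length")
        return sum(x == y for x, y in zip(a, b)) / len(a)

    # 1 = response judged correct, 0 = judged incorrect, one entry per answer.
    student_eval = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
    expert_eval = [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]

    print(f"agreement: {percent_agreement(student_eval, expert_eval):.0%}")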


Journal of Computing in Higher Education | 2000

Identifying Important Criteria for Multimedia Instructional Courseware Evaluation

William J. Gibbs; Pat R. Graves; Ronan S. Bernas

Software selection decisions, especially when they involve sophisticated multimedia instructional courseware, can be difficult for educators. Many confounding factors, such as teachers' inexperience with using instructional courseware and the emerging capability and sophistication of technology, often make the evaluation and subsequent selection of courseware a challenging process. In 1999, a study was conducted in which a panel of instructional technology experts rated the importance of 110 criteria statements for multimedia instructional courseware evaluation. The criteria could serve as a basis for constructing evaluative instruments useful for software screening. Participants completed an on-line World Wide Web (WWW) survey on three separate occasions, or rounds, over a two-month period. All communication between participants and the researchers was asynchronous, through the Internet. This paper describes the methods and materials used in the study. It discusses the criteria judged to be most important by the panel of experts and presents an analysis of their commentary, which was used to modify and refine the list of the highest-rated multimedia instructional courseware evaluation criteria.


Journal of Computing in Higher Education | 2001

Group instruction and web-based instructional approaches for training student employees

William J. Gibbs; Carrie Chen; Ronan S. Bernas

This research observed, in their natural setting, the outcomes of two training modalities implemented by a medium-sized academic library for training student employees. The researchers compared the learning gained from the group-instruction (instructor-led) approach that the library had used for a number of years with learning from a recently developed and implemented Web-based self-instruction method. Achievement and attitudinal data were collected from 190 graduate and undergraduate student employees over a period of three years. Because the research was conducted as the library shifted from group instruction to Web-based self-instruction, naturally occurring factors that may have influenced the data were not controlled, but they are accounted for in this report. Trainees performed equally well in both modes and generally responded favorably to training regardless of modality. Findings suggest that Web-based self-instruction is a plausible means of augmenting group instruction. While the instructional content was identical for the two approaches, students' perceptions of their learning appeared to differ. Group-instruction students perceived their learning as relating to services, library functions, and information searching and literacy. The Web-based group emphasized that their learning focused on the building's physical layout, the locations of collections and service units, and general services.
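
To make the "performed equally well in both modes" comparison concrete, here is a minimal sketch of a two-group comparison using Welch's t-test. The scores are invented and the test choice is an assumption, not necessarily the analysis the authors used.

    # Hypothetical sketch of comparing achievement between the two training
    # modalities. Scores are invented; the study's actual analysis may differ.
    from scipy import stats

    group_instruction = [82, 78, 90, 85, 74, 88, 81, 79, 86, 84]
    web_based = [80, 83, 87, 79, 76, 89, 82, 85, 78, 81]

    # Welch's t-test: does mean achievement differ between modalities?
    t, p = stats.ttest_ind(group_instruction, web_based, equal_var=False)
    print(f"t = {t:.2f}, p = {p:.3f}")
    # A large p-value is consistent with the reported finding that trainees
    # performed equally well in both modes.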


Journal of Computing in Higher Education | 1997

The Effects of Reward Structure on Student Evaluation in a CAI Lesson

William J. Gibbs

This study examined the effects of reward structures on the ability of students to evaluate the correctness of their responses to open-ended questions in a computer-based instruction (CBI) lesson. Students' ability to evaluate responses was influenced, to some extent, by their incentive to achieve. In most instances, the evaluations of expert evaluators and students were found to correspond. However, differences between evaluations occurred when students perceived their responses as correct and evaluators judged them incorrect. It was also found that the order in which the assessment items were presented affected evaluation performance.


Journal of Computing in Higher Education | 1994

Video Split-Screen Technology: A Data Collection Instrument

William J. Gibbs; Arnold F. Shapiro

Penn State's Smeal College of Business Administration developed a multimedia-based independent-study prototype for teaching Mathematics of Finance courses. The prototype was pilot tested, and all testing sessions were videotaped so that they could be reviewed by the developers. Video recording was accomplished by mixing the signals from two cameras to create a split-screen effect, with the subject positioned in the left half of the screen and the computer screen in the right half. This technique made it easy to collect students' physical reactions (facial expressions and body movements) and verbal reports and to associate them with what was occurring on the computer. The process ultimately helped to improve the program and its interface and provided a view of the students' information processing.
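
For readers curious what this split-screen mixing would look like in software today, here is a minimal OpenCV sketch that composites frames from two capture devices side by side. This modern analogue is offered as an assumption: the original study mixed analog camera signals, and the device indices and frame sizes below are illustrative.

    # Modern software analogue of the split-screen technique described above.
    # Assumes OpenCV and two capture devices; the original study used analog
    # video mixing, not this code.
    import cv2
    import numpy as np

    subject_cam = cv2.VideoCapture(0)  # camera aimed at the student
    screen_cam = cv2.VideoCapture(1)   # camera aimed at the computer screen

    HALF = (640, 480)  # each source fills one half of the composite frame

    while True:
        ok1, left = subject_cam.read()
        ok2, right = screen_cam.read()
        if not (ok1 and ok2):
            break
        composite = np.hstack([cv2.resize(left, HALF), cv2.resize(right, HALF)])
        cv2.imshow("split-screen", composite)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop recording
            break

    subject_cam.release()
    screen_cam.release()
    cv2.destroyAllWindows()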


Educational Technology & Society | 2006

A Visualization Tool for Managing and Studying Online Communications

William J. Gibbs; Vladimir Olexa; Ronan S. Bernas


International Journal of Instructional Media | 2008

An Analysis of Temporal Norms in Online Discussions

William J. Gibbs; Linda D. Simpson; Ronan S. Bernas


Archive | 1995

Multimedia and Computer-Based Instructional Software: Evaluation Methods

William J. Gibbs

Collaboration


Top co-authors of William J. Gibbs.

Ronan S. Bernas (Eastern Illinois University)

Pat R. Graves (Eastern Illinois University)

Kyle L. Peck (Pennsylvania State University)

Arnold F. Shapiro (Smeal College of Business Administration, Penn State)