Publication


Featured research published by James R. Sanders.


American Journal of Evaluation | 2002

Presidential Address: On Mainstreaming Evaluation

James R. Sanders

The practice of evaluation in organizations continues to be limited by perceptions that evaluation is a marginal activity. Arguments demonstrating the importance of evaluation have been ineffective in moving most organizations toward integrating evaluation into their daily routines. A multifaceted approach to making evaluation part of organizational culture is proposed, one that draws on allies, examples, models, research, process guides, and trainers/developers. This article is based on the presidential address given by James R. Sanders at the 2001 Annual Meeting of the American Evaluation Association (AEA) in St. Louis, MO. The theme for the conference was “Mainstreaming Evaluation.”


Theory Into Practice | 1991

The changing face of educational evaluation

Blaine R. Worthen; James R. Sanders

In 1965, educational evaluation was often viewed as merely a nuisance imposed on schools receiving federal funds for special projects. During the ensuing quarter century, it has permeated the field of education, becoming an instrument of public policy.


Archive | 2003

A Model for School Evaluation

James R. Sanders; E. Jane Davidson

School evaluation can be defined as the systematic investigation of the quality of a school and how well it is serving the needs of its community, and it is one of the most important investments we can make in K-12 (kindergarten through high school) education. It is the way we learn of strengths and weaknesses, the way we get direction, and the way critical issues get identified and resolved. It addresses the needs of many parents who want to know how to choose a good school and the needs of teachers, school administrators, and school board members who want to know how to improve their schools. It also provides important information to local, state/province, and national leaders by informing their decisions. As an integral part of good management practice, it contributes to (i) identifying needs; (ii) establishing goals; (iii) clarifying goals; (iv) selecting strategies to achieve goals; (v) monitoring progress; and (vi) assessing outcomes and impact. It can be used for public relations and communications, as school report cards to the public (Jaeger, Gorney, Johnson, Putnam, & Williamson, 1994), as well as to give direction to planning or to select materials or programs for adoption. In light of contemporary proposals for site-based management, school choice, and school restructuring, evaluation at the school level is needed more today than ever before.


Journal of Experimental Education | 1994

The Process of Developing National Standards That Meet ANSI Guidelines.

James R. Sanders

American national standards for evaluations of educational programs have been developed by the Joint Committee on Standards for Educational Evaluation and approved by the American National Standards Institute (ANSI). This article describes the standard-setting process that the Joint Committee developed and ANSI approved. The process involves the following steps: (a) initiation of projects, (b) development of the first draft, (c) national and international reviews, (d) field tests, (e) national public hearings, (f) finalization of standards, (g) consideration of views and objections, (h) appeals, and (i) validation. The process is public, participatory, open, and consensual.


Journal of School Psychology | 1978

School professionals and the evaluation function

James R. Sanders

Evaluation is assumed to be an integral part of the professional delivery of school services. As such, professionals employed in school systems are called upon to define the alternative roles they might play in evaluation, to consider alternative ways to organize for evaluation, and to focus on various objects of evaluation. The alternatives listed were drawn from the emerging literature on school evaluation. Suggested standards for judging school evaluation include those addressing the accuracy, utility, propriety, and feasibility of the evaluation.


American Journal of Evaluation | 2007

Book Review: The Sage Handbook of Evaluation, edited by Ian F. Shaw, Jennifer C. Greene, and Melvin M. Mark. Thousand Oaks, CA: Sage, 2006

James R. Sanders

In this section, recent books applicable to the broad field of program evaluation are reviewed. In most cases, a single book will be considered in a review, but in some instances, multiple books may be jointly reviewed to illuminate similarities and differences in intent, philosophy, and usefulness. In most cases, a single review will be commissioned for each book. Two or more reviews may be commissioned for books judged by the Editor and/or Book Review Editor to be especially noteworthy works in evaluation. Persons with suggestions of books to be reviewed, or those who wish to submit a review, should contact Lori A. Wingate at [email protected].


Archive | 2000

Aspekte der Entwicklung und Verbreitung der Evaluationsstandards [Aspects of the Development and Dissemination of the Evaluation Standards]

James R. Sanders

The first edition of the Program Evaluation Standards (1994) was published in 1981 under the title “Standards for Evaluation of Educational Programs, Projects, and Materials.” The standards were developed by the Joint Committee on Standards for Educational Evaluation, which twelve professional organizations in the United States founded to establish criteria for judging the quality of program evaluations. The Joint Committee was then, as it is now, composed mainly of education professionals, and the standards were intended for use in education.


American Journal of Evaluation | 2018

In Memory of Daniel L. Stufflebeam (1936–2017)

James R. Sanders

I live near Kalamazoo, MI. I have lived here for over 40 years. Why would a young man who was born and bred in central Pennsylvania settle in southwest Michigan? My answer is Dan Stufflebeam.

When I was working on my master’s degree in educational research in 1967–1968, I learned that our federal government had placed a new emphasis on requiring evaluations of federally funded innovations in public school programs. I was taught that experimental and quasi-experimental designs, as described by Campbell and Stanley (1963), were the best ways to approach these evaluations. And so, I diligently took on an assignment of evaluating several federally funded projects in local schools as part of my graduate assistantship. I enjoyed this applied learning experience, as we turned out one research report after another. As a student, this work gave me a great sense of accomplishment and pride. It wasn’t until near the end of my degree program that a professor and mentor suggested to me that there might be other ways to approach evaluating educational programs beginning to emerge in the literature, and he assigned me to do a literature review of these new approaches. What an eye-opener that was. Someone named Daniel Stufflebeam had proposed using a context, input, process, product (CIPP) model for evaluation, and there were other evaluation approaches being proposed too. These creative authors were labeled as evaluation pioneers.

Dan Stufflebeam was a pioneer whose path crossed with mine over the many years that have gone by since my early revelations as a graduate student. I listened to his lectures at the American Educational Research Association annual meetings. He came to the University of Colorado, where I was working on my PhD in research and evaluation methodology, to lecture. One of his graduate students came to Colorado as an assistant professor while I was there and brought with him a connection to Dan that has lasted throughout my professional life. Dan invited me to the Ohio State University Evaluation Center on several occasions while I was on the faculty at Indiana University. In 1975, I joined him as associate director of the Evaluation Center at Western Michigan University (WMU), which he founded in 1973. I then worked with and learned from Dan through my retirement from WMU in 2001 and beyond, until his untimely passing in July 2017. Beginning in 1975, we worked closely on many projects together and talked about the projects that we had taken on individually. The WMU Evaluation Center was a laboratory where faculty and students worked side by side on complex, often perplexing, real-world evaluation issues. Dan made a commitment to give our doctoral students increasing levels of responsibility for planning and


رسالة التربية و علم النفس [Message of Education and Psychology] | 2013

تقويم البرنامج: طرق بديلة وإرشادات عملية = Program Evaluation: Alternative Approaches and Practical Guidelines

Jody L. Fitzpatrick; James R. Sanders; Blaine R. Worthen

I. INTRODUCTION TO EVALUATION
1. Evaluation's Basic Purpose, Uses, and Conceptual Distinctions
2. Origins of Modern Program Evaluation
3. Recent Developments and Trends in Evaluation
II. ALTERNATIVE APPROACHES TO PROGRAM EVALUATION
4. Alternative Views of Evaluation
5. Objectives-Oriented Evaluation Approaches
6. Management-Oriented Evaluation Approaches
7. Consumer-Oriented Evaluation Approaches
8. Expertise-Oriented Evaluation Approaches
9. Adversary-Oriented Evaluation Approaches
10. Participant-Oriented Evaluation Approaches
11. Alternative Evaluation Approaches: A Summary and Comparative Analysis
III. PRACTICAL GUIDELINES FOR PLANNING EVALUATION
12. Clarifying the Evaluation Request and Responsibilities
13. Setting Boundaries and Analyzing the Evaluation Context
14. Identifying and Selecting the Evaluative Questions and Criteria
15. Planning How to Conduct the Evaluation
IV. PRACTICAL GUIDELINES FOR CONDUCTING AND USING EVALUATIONS
16. Dealing with Political, Ethical, and Interpersonal Aspects of Evaluation
17. Collecting, Analyzing, and Interpreting Quantitative Information
18. Collecting, Analyzing, and Interpreting Qualitative Information
19. Reporting and Using Evaluation Information
20. Evaluating Evaluations
V. EMERGING AND FUTURE SETTINGS FOR PROGRAM EVALUATION
21. Conducting Multiple-Site Evaluation Studies
22. Conducting Evaluations of Organizations' Renewal and Training in Corporate and Nonprofit Settings
23. The Future of Evaluation
Appendix: Some General Areas of Competence Important in Education Evaluation


Evaluation Practice | 1995

Book Reviews: Evaluation for Foundations: Concepts, Cases, Guidelines and Resources, by the Council on Foundations. Jossey-Bass, San Francisco, CA, 1993. 320 pages.

James R. Sanders

James R. Sanders, Associate Director, The Evaluation Center, Western Michigan University, Kalamazoo, Michigan 49008. In contrast to those who as recently as ten years ago might have observed that foundations do not do much in the way of evaluating their investments in programs, the case studies in this book provide a rich set of contemporary examples to the contrary. The Council’s stated purpose is to offer grantmakers a “readable, practical and reasonably comprehensive book that would orient grantmakers, and perhaps grantees, to different kinds of grant evaluation work and to the strengths and weaknesses of each kind” (p. xv). The preface contains a disclaimer that this is not a textbook on evaluation designs, statistics, observation methods, experimental research, or interview methods. It seeks to provide a framework for thinking about the practicality and desirability of evaluation. The intended audience is corporate and foundation program officers in both large and small foundations, foundation officers, trustees, grantees, and evaluation consultants. The book introduces issues and advice related to evaluation conducted by or for foundations, and then describes nine cases of evaluation used by foundations. It concludes by listing thirty-five keys to effective evaluation drawn from the case studies. Two resource appendices are included: (A) a summary of evaluation approaches and methods, and (B)

Collaboration



Top Co-Authors


E. Jane Davidson

Western Michigan University
