Christina A. Christie
Claremont Graduate University
Publications
Featured research published by Christina A. Christie.
American Journal of Evaluation | 2009
Dreolin N. Fleischer; Christina A. Christie
This paper presents the results of a cross-sectional survey on evaluation use completed by 1,140 U.S. American Evaluation Association members. The study had three foci: to describe evaluators’ current attitudes, perceptions, and experiences related to evaluation use theory and practice; to compare these data with those reported in a previous study by Preskill and Caracelli (1997); and to identify characteristics that distinguish high endorsers of use items from others in the sample. Findings suggest a fair level of agreement on several dimensions of use, including stakeholder involvement, factors that influence use, and the varied roles of the evaluator. Logistic regression results indicated that external evaluators were less likely to be high item endorsers, while those who reported being members of the Evaluation Use Topical Interest Group of the American Evaluation Association were more likely to be high item endorsers.
American Journal of Evaluation | 2007
Christina A. Christie
Using a set of scenarios derived from actual evaluation studies, this simulation study examines the reported influence of evaluation information on decision makers’ potential actions. Each scenario described a context in which one of three types of evaluation information (large-scale study data, case study data, or anecdotal accounts) was presented and a specific decision needed to be made. Participants were asked to indicate which type of data would influence their decision making. Results from 131 participants indicate that all types of information were influential, although large-scale and case study data were more influential than anecdotal accounts; that certain types of evaluation data were more influential among certain groups of decision makers; and that choosing to use one type of evaluation data over the other two depends on the independent influence of the other types of evaluation data on the decision maker, as well as on prior beliefs about program efficacy.
American Journal of Evaluation | 2003
Christina A. Christie; Marvin C. Alkin
Program theory plays a prominent role in many evaluations, not only in theory-driven evaluations. This paper presents a case study of the process of developing and refining a program’s theory within a user-oriented evaluation. In user-oriented (or utilization-focused) evaluations, primary users can play a role in defining their own program theory. This differs, however, from the typical process by which a program theory is developed within a theory-driven evaluation framework. This case study concerns a university’s academic outreach program with three local school districts. The program’s objective is to increase the number of University of California-eligible and, specifically, University of California, Los Angeles (UCLA)-admissible students from targeted local public schools. The authors helped develop and refine the outreach staff’s program theory. The evaluation procedures are described and the results of the theory-building process presented.
American Journal of Evaluation | 2003
Christina A. Christie; Mike Rose
Dialogue and discussion are fundamental to advancing the field of evaluation. Small informal discussion groups that focus on particular issues serve to promote productive dialogue about evaluation. This paper describes the dialogue of such a group. Because it is interactive, dynamic, and applied, we assert that the group is a personification of the nature of evaluation knowledge. We further argue that this group, and the way that it has been formulated, is particularly suitable for learning about evaluation, for socializing new people into the field, and for promoting continuing education.
American Journal of Evaluation | 2006
Christina A. Christie
An interview on the application of appreciative inquiry (AI) to evaluation. The interviewee describes encountering AI several years earlier and becoming excited by its participatory, collaborative character. AI has been used in organizational development since the mid-1980s; in a nutshell, it is an effort to design and implement approaches to organizational change that look for what works in an organization rather than for its problems, and proponents have found that participants often come away excited and energized. The conversation then turns to an AI evaluation of the CEDT department, an internal training function at a national laboratory in Albuquerque, New Mexico, whose services and offerings had included instructor-led classroom training and consulting.
New Directions for Evaluation | 2003
Christina A. Christie
Studies in Educational Evaluation | 2008
Christina A. Christie; Marvin C. Alkin
Archive | 2004
Stewart I. Donaldson; Christina A. Christie
Studies in Educational Evaluation | 2004
Christina A. Christie; Rachel M. Ross; Brock M. Klein
American Journal of Evaluation | 2004
Christina A. Christie; Tarek Azzam