
Publication


Featured research published by Jonathan A. Morell.


Evaluation and Program Planning | 1978

The Development of Evaluation as a Profession: Current Status and Some Predictions.

Jonathan A. Morell; Eugenie Walsh Flaherty

Abstract: Sociological models of professionalization are applied to recent events in program evaluation in order to understand the development of the field and to predict future events. The focus is on problems that might arise and on the necessity of ameliorating them. The possibility of eliminating such problems is discounted since they appear to be integral elements of the process of professionalization. Major elements of the analysis include discussions of the consequences of the development of a specialized body of evaluation knowledge, the definition of evaluation tasks, exclusivity in the performance of those tasks, the development of professional associations, and problems in the training of evaluators. The reasons for problems in these areas are presented and the importance of being aware of the origin of the problems is discussed.


Evaluation and Program Planning | 1978

Evaluation: Manifestations of a new field

Eugenie Walsh Flaherty; Jonathan A. Morell

Abstract: An examination of three characteristics of evaluation reveals significant divisions of opinion, suggesting that it is premature to seek a defining conceptual framework for this still evolving field. Therefore an alternative approach toward understanding evaluation is followed: empirical manifestations of the history and current state of evaluation are examined for evidence of growth and integration. A review of the history of evaluation suggests four causes for its recent growth: new accountability requirements, greater interest among social scientists in social relevance, a scarcity of resources for the traditional social sciences, and an expansion of methods useful for research in applied settings. Three empirical signs of the new field are described: the concepts and strategies employed in evaluation efforts, the discipline of practicing evaluators, and dissemination of evaluation information. The conclusions suggest that there are signs of both cohesiveness and immaturity in the current state of evaluation.


Journal of Experimental Child Psychology | 1976

Age, sex, training, and the measurement of field dependence

Jonathan A. Morell

Abstract: This study was designed to: (a) examine the effects of age and sex on a person's susceptibility to field dependence training; (b) determine whether the field dependence phenomenon is a function of "cognitive style" or of a person's general inability to make correct judgments in the face of too much confusing and inaccurate information. Traditional Rod and Frame scoring is based on the latter assumption. Interpretation of results, however, has traditionally been based on the "cognitive style" assumption. Results indicate that Rod and Frame results are not a function of cognitive style. This seems particularly true of two aspects of the field dependence phenomenon: (a) the sex difference effect and (b) the correlation between Rod and Frame and Embedded Figures results. Age, more than sex, may be a function of both cognitive style and general ability to perceive the upright. A training effect was not demonstrated. Hypotheses were put forward to explain the nature of field dependence, the magnitude of field dependence errors, and the lack of a training effect.


Evaluation and Program Planning | 2016

The historical path of evaluation as reflected in the content of Evaluation and Program Planning

Abu H. Ayob; Jonathan A. Morell

This paper examines the intellectual structure of evaluation by means of citation analysis. By using various article attributes and citation counts in Google Scholar and (Social) Science Citation Index Web of Science, we analyze all articles published in Evaluation and Program Planning from 2000 until 2012. We identify and discuss the characteristics and development of the field as reflected in the history of those citations.


Evaluation and Program Planning | 1991

Modernizing information systems: A comparison of scientific and technological research perspectives

Jonathan A. Morell

Abstract: As part of a series which compares different methodological paradigms, this article highlights technological and scientific themes in an information systems research study. While scientific endeavor is primarily oriented toward overcoming difficulties that stand in the way of seeking truth, technological efforts are devoted to finding practical solutions to real-world problems. The implications of this difference are applied to a telephone survey of high-ranking information system executives in a variety of organizations. Topics in the survey included: coordination of information resources; improving data quality; evaluation of information system change; software productivity; information technology's role in pursuing strategic goals; information technology's impact on white collar work; and information resource management. Although both scientific and technological objectives are attained, a variety of factors led to a much heavier technological emphasis.


Evaluation and Program Planning | 2017

Systematic iteration between model and methodology: A proposed approach to evaluating unintended consequences

Jonathan A. Morell

This article argues that evaluators could better deal with unintended consequences if they improved their methods of systematically and methodically combining empirical data collection and model building over the life cycle of an evaluation. This process would be helpful because it can increase the timespan from when the need for a change in methodology is first suspected to the time when the new element of the methodology is operational. The article begins with an explanation of why logic models are so important in evaluation, and why the utility of models is limited if they are not continually revised based on empirical evaluation data. It sets the argument within the larger context of the value and limitations of models in the scientific enterprise. This is followed by a discussion of issues relevant to model development and revision. What is the relevance of complex system behavior for understanding predictable and unpredictable unintended consequences, and the methods needed to deal with them? How might understanding of unintended consequences be improved with an appreciation of generic patterns of change that are independent of any particular program or change effort? What are the social and organizational dynamics that make it rational and adaptive to design programs around single-outcome solutions to multi-dimensional problems? How does cognitive bias affect our ability to identify likely program outcomes? Why is it hard to discern change as a result of programs being embedded in multi-component, continually fluctuating settings? The last part of the paper outlines a process for actualizing systematic iteration between model and methodology, and concludes with a set of research questions that speak to how the model/data process can be made efficient and effective.


Prevention in Human Services | 1982

Evaluation in prevention: implications from a general model.

Jonathan A. Morell


American Journal of Community Psychology | 1984

Making evaluation viable: The response of graduate programs in evaluation

Jonathan A. Morell


Evaluation and Program Planning | 1984

To the reader

Jonathan A. Morell; Eugenie Walsh Flaherty


Program Evaluation in Social Research | 1979

5 – Evaluation as Social Technology

Jonathan A. Morell

Collaboration


Dive into Jonathan A. Morell's collaborations.

Top Co-Authors


Abu H. Ayob

National University of Malaysia
