Publication


Featured research published by Richard J. Light.


Journal of the American Statistical Association | 1971

An Analysis of Variance for Categorical Data

Richard J. Light; Barry H. Margolin

Abstract: A measure of variation for categorical data is discussed. We develop an analysis of variance for a one-way table, where the response variable is categorical. The data can be viewed alternatively as falling in a two-dimensional contingency table with one margin fixed. Components of variation are derived, and their properties are investigated under a common multinomial model. Using these components, we propose a measure of the variation in the response variable explained by the grouping variable. A test statistic is constructed on the basis of these properties, and its asymptotic behavior under the null hypothesis of independence is studied. Empirical sampling results confirming the asymptotic behavior and investigating power are included.
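The decomposition described in this abstract can be sketched in code. The formulas below follow the standard CATANOVA definitions usually attributed to this paper (total, within-group, and between-group sums of squares for a categorical response); they are an assumption on my part, and the paper itself should be consulted for the exact derivation.

```python
def catanova(table):
    """Sketch of the CATANOVA decomposition for a one-way layout.

    table[i][j] = count of response category j in group i.
    Returns (TSS, WSS, BSS, R-squared, C statistic).
    """
    n_i = [sum(row) for row in table]          # group sizes
    n_j = [sum(col) for col in zip(*table)]    # response-category totals
    n = sum(n_i)                               # grand total
    J = len(n_j)                               # number of response categories

    # Total variation in the categorical response
    tss = n / 2 - sum(c * c for c in n_j) / (2 * n)
    # Variation within groups
    wss = n / 2 - sum(
        sum(x * x for x in row) / (2 * ni) for row, ni in zip(table, n_i)
    )
    bss = tss - wss                            # variation between groups
    r2 = bss / tss                             # share explained by the grouping
    C = (n - 1) * (J - 1) * r2                 # test statistic; approximately
                                               # chi-square under independence
    return tss, wss, bss, r2, C
```

For a table whose rows have identical category proportions, BSS is zero and C is zero; when each group concentrates in its own category, R² reaches one.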


Journal of the American Statistical Association | 1974

An Analysis of Variance for Categorical Data, II: Small Sample Comparisons with Chi Square and other Competitors

Barry H. Margolin; Richard J. Light

Abstract: Exact small-sample behavior in two-way contingency tables is investigated for Pearson's chi-square statistic (X²), Light and Margolin's C statistic and its related R² measure of association, Kullback's minimum discrimination information statistic (2I), and Goodman and Kruskal's lambda. R² is shown to be identical to Goodman and Kruskal's τb, leading to a test for independence based on τb. In small samples from a product-of-multinomials model, the null distribution of C is better approximated by a χ² distribution than is the null distribution of X²; both are considerably better approximated by a χ² distribution than is the null distribution of 2I. It is proved for tables with two columns and any number of rows that if the column totals are equal, then X² ≤ 2I; thus, X² is more conservative than 2I. Hence, use of 2I should be avoided in testing independence in tables with small samples.
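The identity claimed in this abstract, that the CATANOVA R² coincides with Goodman and Kruskal's tau (with the response as the column variable), can be checked numerically. The two formulas below are the standard textbook definitions, which is an assumption; computing each independently and comparing illustrates the equivalence.

```python
def r2_catanova(table):
    """R-squared from the CATANOVA sums of squares (standard definition assumed)."""
    n_i = [sum(row) for row in table]
    n_j = [sum(col) for col in zip(*table)]
    n = sum(n_i)
    tss = n / 2 - sum(c * c for c in n_j) / (2 * n)
    wss = n / 2 - sum(sum(x * x for x in row) / (2 * ni)
                      for row, ni in zip(table, n_i))
    return (tss - wss) / tss

def goodman_kruskal_tau(table):
    """Goodman-Kruskal tau, treating the column variable as the response."""
    n_i = [sum(row) for row in table]
    n_j = [sum(col) for col in zip(*table)]
    n = sum(n_i)
    # Proportional reduction in prediction error for the response
    num = (sum(x * x / ni for row, ni in zip(table, n_i) for x in row)
           - sum(c * c for c in n_j) / n)
    den = n - sum(c * c for c in n_j) / n
    return num / den

# The two measures agree on an arbitrary two-way table:
table = [[12, 3, 5], [4, 9, 7]]
assert abs(r2_catanova(table) - goodman_kruskal_tau(table)) < 1e-12
```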


Applied Psychological Measurement | 1977

Book Review : Discrete Multivariate Analysis: Theory and Practice

Yvonne Bishop; Stephen E. Fienberg; Paul W. Holland; Richard J. Light; Frederick Mosteller

comes stale or unproductive. The expressive range of methodological language also shapes the generation of theory and, in much the same manner that practical media and formal structural constraints influence art and literature, the limitations of theoretical and methodological constructs may prove stimulating or stifling to a science at a particular stage of maturity. In any science, a period of primarily methodological rather than substantive development may sometimes be necessary to unblock the logjam created by theories and measurements which cannot effectively interact through existing tools. Discrete Multivariate Analysis arrives at a time when the various psychosocial disciplines are all suffering, to varying degrees, from attempts to swallow whole those chunks of statistical methodology for continuous data that have been most successful in the natural science disciplines. The movement to quantification of psychological and social research has been motivated, in large measure, by a desire to legitimatize behavioral science through application of the "hard science" criteria of objectivity and reproducibility to statements of and data analyses relating to behavioral paradigms. Passionate advocacy of multiple regression and other multivariate analytic tools has been matched by claims that such tools have forced their proponents, through compromises necessary in measurement, data preparation and formal hypothesis construction, to distort and ultimately trivialize their science in order to accommodate the prerequisites of statistical analysis. The intensity of this debate between "traditionalists" and "methodologists" has shown no sign of abating in the last 10 years; indeed, parallel discussions in the area of medical clinical research display the same basic concern as that which troubles academic social scientists.
The question underlying debate in both areas is whether the refinement and continuous scaling of information derived from primitive conceptualizations or measurement strategies is likely to prove stimulating or stifling to further scientific growth. Much of the urgency and stridency of discourse on this issue is certainly due to the perceived disarray of statistical methodology appropriate to categorical information, consisting of observations which fall into nominal, ordinal or scaled classes. The analysis of such discrete data has long been limited in scope and convenience by the basic dependence of available methods on the dimensionality


Evaluation Practice | 1994

The Future for Evaluation

Richard J. Light

It is challenging enough to predict what will happen five months from now, much less five or ten years into the future. Yet the invitation to speculate about the future of our field comes at an exciting time, and so I am pleased to make some guesses. A good way to begin is to reflect briefly on the past thirty years. Both research and practice have gone through three distinct phases. The first phase, from the mid-1960s through the mid-1970s, was ignited by a group of professionals, drawn from human services disciplines, who were excited by an interdisciplinary challenge. The challenge was to create a broad field called program evaluation. The result was vigorous activity with special emphasis on research design, data analysis, and efforts to stress the importance of having a solid theoretical framework to underpin both. The second phase for our field, which began roughly during the mid-1970s, gave us another ten-year push. Two professional societies were created: the Evaluation Research Society and the Evaluation Network. Each group attracted about a thousand members. My colleagues occasionally differentiated between them by asserting that ERS was more 'academic,' focusing on theory and design, while ENET's members emphasized the practical challenges of field-based interventions. This distinction was convenient. Yet the two groups clearly shared much overlap, both in substance and in their membership. Extensive discussions over several years culminated in a merger in 1986. The two groups consolidated to form one overarching professional association. The American Evaluation Association was born, and I was privileged to be its first elected president. I remember the first Board of Directors worrying early and often about what level of membership this new organization could sustain. As it turns out, AEA has grown and is prospering as we approach its 10th anniversary.
The third phase, initiated by organizational leadership from AEA, began in the mid-1980s and is continuing now. During this time, the field of evaluation has matured in several directions. First, the focus on the substantive disciplines of education, psychology, and health, which played such an important role in the field's early development, has been


International Journal of Aging & Human Development | 1984

Investigating Health and Subjective Well-Being: Methodological Challenges

Richard J. Light

When three different forms of research reviews all reach the same conclusion, that health and subjective well-being have a modest positive correlation, the finding is convincing. Despite the consistent findings, different methods of reviewing research have different strengths. A meta-analysis emphasizes measures of central tendency. A narrative review, in contrast, can focus upon details in deviant findings. Using these methods together provides good information when aggregating findings from many studies.


Archive | 1984

Summing Up: The Science of Reviewing Research

Richard J. Light; David B. Pillemer


Archive | 2001

Making the Most of College: Students Speak Their Minds

Richard J. Light


Harvard Educational Review | 1971

Accumulating Evidence: Procedures for Resolving Contradictions among Different Research Studies

Richard J. Light; Paul V. Smith


Archive | 1992

Meta-Analysis for Explanation: A Casebook

Thomas D. Cook; Harris Cooper; David S. Cordray; Heidi Hartmann; Larry V. Hedges; Richard J. Light; Thomas A. Louis; Frederick Mosteller


Archive | 1990

By Design: Planning Research on Higher Education

Richard J. Light; Judith D. Singer; John B. Willett

Collaboration


Dive into Richard J. Light's collaboration.

Top Co-Authors

David B. Pillemer

University of New Hampshire
