Lowell L. Hargens
University of Illinois at Urbana–Champaign
Publications
Featured research published by Lowell L. Hargens.
American Sociological Review | 1988
Lowell L. Hargens
Rejection rates for scholarly journals show substantial variation between disciplines. Explanations of this variation have focused on two possible sources: variation in consensus and variation in space shortages. Longitudinal data on journal rejection rates show that they have been very stable over time and are largely unaffected by changes in submissions, impugning the argument that space shortages explain disciplinary variation in rejection rates. In contrast, a model of the manuscript-evaluation process can account for the observed variation in rejection rates and also casts light on additional characteristics of manuscript-evaluation processes in different disciplines. Possible links between consensus and each of the elements of the model are discussed.
American Sociological Review | 1977
Nicholas C. Mullins; Lowell L. Hargens; Pamela K. Hecht; Edward L. Kick
Using block modeling of data from a sociometric questionnaire, we analyze the patterns of social structure shown by authors of two highly co-cited clusters of biological-science papers. Analyses of anecdotal data, background information, and data on citations support the findings from the block models. The density of contacts and the patterns of sociometric and citation data show that the authors of papers in the clusters form social groups. Each group has a center-periphery pattern, and the two groups show structural differences that appear to reflect differences in the timing and diffusion of their respective major research findings.
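The positional analysis described above can be sketched in a few lines of code. The snippet below is a hypothetical illustration, not the authors' data or procedure: it partitions actors in a small binary sociometric matrix by structural equivalence and reports within- and between-block tie densities, the kind of summary in which a center-periphery pattern appears as a dense core block facing a sparse peripheral one.

```python
# Minimal blockmodeling sketch on hypothetical sociometric data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical directed contact matrix: entry [i, j] = 1 if author i reports contact with j.
A = np.array([
    [0, 1, 1, 1, 1, 0],
    [1, 0, 1, 1, 0, 0],
    [1, 1, 0, 1, 0, 1],
    [1, 1, 1, 0, 1, 0],
    [0, 0, 1, 0, 0, 0],
    [0, 0, 0, 1, 0, 0],
])

# Structural equivalence: actors with similar sending *and* receiving profiles
# occupy similar positions, so cluster on the concatenated row/column profiles.
profiles = np.hstack([A, A.T])
blocks = fcluster(linkage(pdist(profiles), method="average"), t=2, criterion="maxclust")

# Block densities: proportion of possible ties realized within and between positions.
for b1 in np.unique(blocks):
    for b2 in np.unique(blocks):
        rows = np.where(blocks == b1)[0]
        cols = np.where(blocks == b2)[0]
        sub = A[np.ix_(rows, cols)]
        possible = sub.size - (len(rows) if b1 == b2 else 0)  # exclude self-ties
        density = sub.sum() / possible if possible else 0.0
        print(f"block {b1} -> block {b2}: density = {density:.2f}")
```

A dense core-to-core block together with sparse periphery blocks is the pattern the abstract refers to as center-periphery structure.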
Social Science Research | 1990
Lowell L. Hargens; Howard Schuman
Data from samples of biochemists and sociologists show that nearly all are familiar with citation indexes and that the two groups are equally likely to have used a citation index for bibliographic purposes. We develop three hypotheses from social comparison theory to account for variation in use and evaluation of citation counts as indicators of scientific achievement: (1) more highly cited scientists will more often use and more highly evaluate citation counts as indicators of scientific achievement than will less cited scientists, (2) these relationships will be stronger for sociologists than for biochemists, and (3) sociologists as a whole will more often use and more highly evaluate citation counts than biochemists. Finally, among sociologists, we hypothesize that those primarily interested in quantitative research areas will use and favor citation counts more than those with primarily qualitative or theoretical interests. Our data support all but one of these hypotheses. We also report unexpected differences in use and evaluation of citation counts by sex and departmental prestige.
The American Sociologist | 1991
David M. Bott; Lowell L. Hargens
Critics argue that few sociological publications are cited in the subsequent literature and that this implies many are superfluous. Data on the number of citations to three kinds of sociological documents—journal articles, chapters in edited books, and books—show that a substantial majority of each type is cited in the subsequent literature. Furthermore, the high proportions of ever-cited items do not result from authors’ citation of their own work. Average citation levels of journals are highly correlated with other measures of journal stature. The average book is cited about as often as an average article in a highly-cited journal, while an average chapter in an edited book is cited about as often as an average article in an infrequently-cited journal. Within-journal variation in article citation rates far exceeds between-journal variation.
Social Science Research | 1988
Diane Felmlee; Lowell L. Hargens
Sociologists frequently use ordinary least squares (OLS) to estimate a series of regression equations from data on the same observational entities. Such "seemingly unrelated regressions" are linked by correlations among the disturbances. In this paper we review three techniques for estimating seemingly unrelated regressions (OLS, Zellner's generalized least-squares method, and maximum likelihood estimation) and present an illustrative sociological example employing each technique. We discuss the conditions under which the non-OLS estimation procedures offer advantages for efficiency of estimation and hypothesis testing.
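A compact numerical sketch may make the comparison concrete. The code below uses simulated data (a hypothetical example, not the paper's illustrative application) to run equation-by-equation OLS and then Zellner's feasible generalized least squares on a two-equation system with correlated disturbances; the maximum-likelihood estimator the paper also reviews is omitted for brevity.

```python
# OLS vs. Zellner's feasible GLS for seemingly unrelated regressions (simulated data).
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Two equations with different regressors and correlated disturbances.
x1 = np.column_stack([np.ones(n), rng.normal(size=n)])
x2 = np.column_stack([np.ones(n), rng.normal(size=n)])
u = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=n)
y1 = x1 @ np.array([1.0, 2.0]) + u[:, 0]
y2 = x2 @ np.array([-1.0, 0.5]) + u[:, 1]

def ols(x, y):
    return np.linalg.solve(x.T @ x, x.T @ y)

# Step 1: equation-by-equation OLS and residuals.
b1, b2 = ols(x1, y1), ols(x2, y2)
resid = np.column_stack([y1 - x1 @ b1, y2 - x2 @ b2])

# Step 2: estimate the cross-equation disturbance covariance from the residuals.
sigma = resid.T @ resid / n

# Step 3: feasible GLS on the stacked system y = X b + u with Cov(u) = Sigma (x) I.
y = np.concatenate([y1, y2])
X = np.block([[x1, np.zeros_like(x2)], [np.zeros_like(x1), x2]])
omega_inv = np.kron(np.linalg.inv(sigma), np.eye(n))
b_gls = np.linalg.solve(X.T @ omega_inv @ X, X.T @ omega_inv @ y)

print("OLS:", b1, b2)
print("SUR (feasible GLS):", b_gls)
```

The simulation deliberately uses different regressors in the two equations: with identical regressor matrices the GLS estimator collapses to OLS, and the efficiency gain grows with the cross-equation disturbance correlation.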
Social Studies of Science | 1980
Lowell L. Hargens; Nicholas C. Mullins; Pamela K. Hecht
We present data for two research areas, delimited by co-citation analysis, to assess several hypotheses about the articulation of research-area structure and stratification processes among scientists. Two main issues are examined. First, we contrast the social status of the members of the two areas with the general status levels shown by members of the disciplines that encompass the two areas. This contrast is relevant to hypotheses about the social characteristics of central intellectual figures in research areas. Second, we examine the effect of a scientist's participation in the development of a research area on subsequent scientific performance, in order to determine the extent to which such participation should be included in analyses of scientific productivity and status.
Scientometrics | 1986
Lowell L. Hargens
Patterns of migration among disciplines and specialties are examined using data from a large survey of U.S. Ph.D.s in a broad range of fields. Mappings of scholarly fields are derived from the migration patterns, and these mappings are largely consistent with results from previous studies using citation flows and other measures of field similarities. Migration patterns suggest that there are two boundaries dividing the fields in this analysis, and that hierarchical relations among disciplines are weak or absent. In contrast, specialties within a discipline are more likely to exhibit structural hierarchies.
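One simple way to derive such a mapping, offered here only as a hedged sketch (the field names, counts, and scaling procedure below are hypothetical, not the paper's), is to symmetrize the field-to-field migration counts, treat low exchange as large dissimilarity, and embed the fields in two dimensions with classical multidimensional scaling.

```python
# Classical MDS on a hypothetical matrix of Ph.D. migration counts between fields.
import numpy as np

fields = ["physics", "chemistry", "biology", "sociology", "economics"]
# Hypothetical counts: row field -> column field migrations.
M = np.array([
    [0, 40, 10, 1, 2],
    [35, 0, 25, 1, 1],
    [8, 30, 0, 4, 2],
    [1, 1, 5, 0, 15],
    [2, 1, 2, 20, 0],
], dtype=float)

# Symmetrize and normalize flows; low exchange becomes large distance.
S = (M + M.T)
S = S / S.max()
D = 1.0 - S
np.fill_diagonal(D, 0.0)

# Classical (Torgerson) MDS: double-center the squared distances and take the
# top two eigenvectors as two-dimensional coordinates.
n = len(fields)
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
vals, vecs = np.linalg.eigh(B)
order = np.argsort(vals)[::-1][:2]
coords = vecs[:, order] * np.sqrt(np.clip(vals[order], 0, None))

for name, (x, y) in zip(fields, coords):
    print(f"{name:10s}  x={x:+.2f}  y={y:+.2f}")
```

Fields that exchange many Ph.D.s land close together in the resulting map, so disciplinary boundaries of the kind the abstract describes show up as gaps between clusters of points.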
The American Sociologist | 1990
Lowell L. Hargens
Articles in both the popular press and sociology journals have argued that between the mid-1970s and mid-1980s sociologists became more pessimistic about the intellectual vitality of their field. Data from the 1969, 1975, and 1984 Carnegie surveys of faculty at U.S. universities suggest that sociologists’ assessments of their field changed little over this period. In several respects the “sociology in the doldrums” thesis of the 1980s resembles the 1970s “blue collar blues” thesis; in both cases sociologists gave structural explanations for apparently nonexistent trends.
American Sociological Review | 1988
Lowell L. Hargens
In my paper I identify three proximate determinants of journal rejection rates and suggest how each may be related to disciplinary variation in consensus. CSC argue that I either ignore or improperly discount variables other than consensus that may be responsible for the observed interdisciplinary variation in journal rejection rates. For example, CSC argue that my analysis of the impact of journal space shortages on rejection rates is unconvincing because it does not measure journal-space availability directly. I believe they greatly underestimate the difficulties involved in obtaining such a measure, difficulties that frequently occur in attempts to measure potentials (rather than actualities).[1] But even if an adequate measure were available, simply comparing its average value across disciplines would be uninformative, because the issue is one of causal priority.

Specifically, the space-shortage argument holds that both submissions to a journal and its available space jointly produce its rejection rate. If this argument is correct, a journal's rejection rate should display considerable instability, because annual fluctuations in submissions are unlikely to be consistent with the amount of space its annual budget makes available. In contrast, I argue that, for most scientific journals, the volume of submissions and the rejection rate jointly produce the number of pages that are eventually made available for accepted papers. Under this argument one should observe instability primarily in the number of papers a journal publishes, because fluctuations in submissions and a fairly constant rejection rate will produce such variation. The data summarized in Table 1 of my paper show that journal rejection rates are extremely stable, and I therefore conclude that the space-shortage argument cannot account for much of the variation in rejection rates.

Readers should bear in mind that at issue here is the interdisciplinary variation in average journal rejection rates. Until now, Coles et al. have discounted the argument that this variation suggests interdisciplinary variation in consensus, arguing instead that it may be due to variation in space shortages (cf. Zuckerman and Merton 1971, note 35; Cole, Cole, and Dietrich 1978; Cole 1983). However, if space shortages contributed appreciably to the interdisciplinary variation Zuckerman and Merton reported, it is difficult to believe that scholarly associations that publish journals with high rejection rates would not have moved in the intervening years to ameliorate the situation.

CSC identify three additional variables that may affect rejection rates (field-specific publication norms,[2] diffuseness of journal structures, and differences in training practices) and argue that I inappropriately view them as intervening variables that mediate the causal impact of consensus on rejection rates. Specifically, they argue that, in the absence of a direct measure of …

[1] I discussed some of these issues at greater length in Hargens (1975, pp. 20-22) and concluded there that the existing measures are all indirect and allow only the conclusion that space shortages are not the sole source of rejection rates. Although CSC now agree with my earlier conclusion, I believe that the evidence I present in the current paper justifies the much stronger conclusion that space shortages have little to do with the observed variation in rejection rates.
[2] Although CSC argue for the existence of field-specific publication norms, their examples, Social Forces and Studia Sociologiczne, imply that journal-specific norms are involved. As far as the latter journal is concerned, I doubt whether it is appropriate to compare it with U.S. sociology journals and suspect that a more appropriate comparison would be with Polish physics journals.
Scientometrics | 1990
Lowell L. Hargens; Jerald R. Herting