Anton J. Nederhof
Leiden University
Publications
Featured research published by Anton J. Nederhof.
Scientometrics | 2006
Anton J. Nederhof
This paper addresses research performance monitoring of the social sciences and the humanities using citation analysis. Main differences in publication and citation behavior between the (basic) sciences and the social sciences and humanities are outlined. Limitations of the (S)SCI and A&HCI for monitoring research performance are considered. For research performance monitoring in many social sciences and humanities, the methods used in science need to be extended. A broader range of both publications (including non-ISI journals and monographs) and citation indicators (including non-ISI reference citation values) is needed. Three options for bibliometric monitoring are discussed.
Scientometrics | 1989
Anton J. Nederhof; R. A. Zwaan; R. E. De Bruin; P. J. Dekker
An evaluation was made of the use of bibliometric indicators for five disciplines in the humanities (social history, general linguistics, general literature, Dutch literature, and Dutch language) and three disciplines in the social and behavioural sciences (experimental psychology, anthropology, and public administration) in the Netherlands. Articles in journals were the predominant outlet in all disciplines. Monographs and popularizing articles were more important outlets in ‘softer’ fields than in ‘harder’ ones. The enlightenment function of scholarship was especially evident in Dutch literature and language, and in public administration. Only some of the humanities disciplines are locally oriented. Although many publications were written in English, only experimental psychology, general linguistics, anthropology, and general literature were internationally oriented regarding output media. The impact of departments differed greatly both within and between disciplines. For all disciplines, bibliometric indicators are potentially useful for monitoring international impact, as expert interviews confirmed. In Dutch language, Dutch literature, and public administration especially, ISI citation data are not very useful for monitoring national impact.
Research Policy | 1993
Anton J. Nederhof; A. F. J. Van Raan
The research performance of research units in economics has been evaluated by simultaneous efforts of peers and bibliometricians, with extensive interactive comparison of results afterwards. We studied trends in productivity and impact of six economics research groups in the period 1980–1988. These groups participate in a large (above one million pounds) research programme of a national Research Council. Research performance of the groups was compared to the world average by means of the Journal Citation Score method. In order to investigate the influence of one key scientist (the “star effect”), we applied a sensitivity analysis to the performance of the research groups by eliminating the papers (and subsequent citations) of one key member. Furthermore, to provide insight into the fields to which a group directs its work, and the fields in which a group has its most important contributions, comparisons were made of publishing and citing journal packets. Similarly, citations to the work of the research groups were analysed for country of origin. We compared the results of the bibliometric part of this study with those of a simultaneous peer review study. The bibliometric study yielded clear and meaningful results, notwithstanding the increasingly applied nature of the research groups. Results from peer review and bibliometric studies appear to be complementary and mutually supportive. The participants of the bibliometrics–peer review “confrontation” meeting regarded the exercise as most valuable, with lessons for the Research Council both for the future of research programmes and for the form of evaluation used for large awards.
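The star-effect sensitivity analysis described above can be sketched in a few lines. This is an illustrative sketch only, not the study's actual code: the data shapes, names, and the expected-citations baseline are assumptions standing in for the Journal Citation Score normalisation.

```python
# "Star effect" sensitivity analysis (illustrative sketch): remove one key
# member's papers and recompute the group's citation impact relative to a
# field baseline (expected citations per paper).

def impact_ratio(citations, expected_per_paper):
    """Mean citations per paper divided by the expected (field) rate."""
    if not citations:
        return 0.0
    return (sum(citations) / len(citations)) / expected_per_paper

def star_effect(papers, star_author, expected_per_paper):
    """Return the group's impact ratio with and without the key member.

    `papers` is a list of (author_set, citation_count) pairs; this
    structure is assumed here for illustration.
    """
    all_cites = [c for _, c in papers]
    rest_cites = [c for authors, c in papers if star_author not in authors]
    return (impact_ratio(all_cites, expected_per_paper),
            impact_ratio(rest_cites, expected_per_paper))

papers = [
    ({"A", "B"}, 40), ({"A"}, 25), ({"B", "C"}, 5),
    ({"C"}, 3), ({"B"}, 7),
]
with_star, without_star = star_effect(papers, "A", expected_per_paper=8.0)
# With author "A" the group sits above the baseline; without, below it.
```

A gap between the two ratios signals that the group's impact rests largely on one member, which is exactly the case the paper's sensitivity analysis is designed to detect.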
Journal of the Association for Information Science and Technology | 1991
Anton J. Nederhof; Rolf A. Zwaan
This study had two main goals. First, an attempt was made to construct and validate an indicator of research performance through collecting peer judgments on the quality of journals by means of a world‐wide mail survey among 385 scholars. Second, to study the validity of indicators based on citations, these judgments were used to probe the quality of the coverage by the SSCI and the A & HCI of both core and noncore journals. Four disciplines in the humanities (General Linguistics, General Literature, Dutch Literature, and Dutch Language) and two disciplines in the social and behavioral sciences (Experimental Psychology and Public Administration) were studied. Coverage in both SSCI and A & HCI was generally increased somewhat when journals judged to be nonscholarly were eliminated. For non‐locally oriented disciplines, coverage of core journals was good: 85–100%. However, for locally oriented disciplines this varied between 20% and 40%. Despite limitations, the Journal Packet Quality indicator seems useful as a first, but crude approximation of the level of research performance when the number of articles is not too small. On an aggregate level, results showed convergence with those based on journal impact factors.
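The core-journal coverage figures above (85–100% versus 20–40%) come down to a simple proportion: of a discipline's core journal packet, minus titles the surveyed scholars judged nonscholarly, how many are indexed? A minimal sketch, with assumed journal names and data shapes:

```python
# Coverage of a core journal packet by a citation index (illustrative
# sketch): journals judged nonscholarly are dropped before computing the
# indexed share, which is why coverage "increased somewhat" after
# elimination.

def coverage(core_journals, indexed, nonscholarly=frozenset()):
    """Percentage of scholarly core journals covered by the index."""
    scholarly = [j for j in core_journals if j not in nonscholarly]
    if not scholarly:
        return 0.0
    return 100.0 * sum(j in indexed for j in scholarly) / len(scholarly)

core = ["J1", "J2", "J3", "J4", "J5"]
cov = coverage(core, indexed={"J1", "J2", "J3", "J4"}, nonscholarly={"J5"})
```

For a locally oriented discipline, the same computation with only one or two core titles in `indexed` yields the 20–40% range the study reports.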
Journal of the Association for Information Science and Technology | 1992
Anton J. Nederhof; Ed C. M. Noyons
Methods are developed to compare the research performance of departments in two humanities disciplines, general linguistics and general literature. Departments from an Anglo‐Saxon country were compared with several departments from European, non‐Anglo‐Saxon countries. A method was developed to reconstruct publication lists of departments, based on searches in various databases. We were able to retrieve 98% of the citations given to the work of one particular department. In both disciplines, it was found that the impact of some departments was largely dependent on their books and chapters, while other departments received most citations from their journal articles. The origins of citations were traced. Some departments had a largely local impact, whereas others showed a more cosmopolitan impact. Although there was some evidence of continental “self‐citations,” citations were also given across continents. The results indicate that bibliometric assessment of research performance is potentially useful in these humanities disciplines.
Scientometrics | 1993
Anton J. Nederhof; R. F. Meijer; Henk F. Moed; A. F. J. Van Raan
The present bibliometric study extends previous work by focusing on the research performance of departments in the natural and life sciences, the social and behavioral sciences, and the humanities. The study covers all 70 departments of one agricultural university, and several veterinary departments of a second university. The impact analysis was extended by including document types other than journal articles. For about a third of the departments, publications not covered in citation indexes accounted for at least 30% of the citations to their total oeuvre. To deal with different citation and publication habits in the various fields, both short-term and medium-term impact assessments were made. The commonly used three-year window is not universally applicable, as our results show. The inclusion of self-citations forms an important source of error in the ratio of actual to expected impact. To cope with this, the trend and level of self-citations were compared at university level with those in a matched sample of publications. Moreover, at departmental level, self-citation rates were used to detect departments with divergent levels of self-citation. The expected impact of journals accounted for only 18% of the variance in actual impact. Comparison of bibliometric indicators with two peer evaluations showed that the bibliometric impact analyses provided important additional information.
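The actual/expected impact ratio with self-citation correction can be illustrated as follows. This is a hedged sketch with assumed data shapes, not the study's code: each paper carries its total citations, its self-citations, and the expected impact of its publishing journal.

```python
# Actual/expected impact ratio (illustrative sketch): self-citations are
# excluded from the actual count before comparing with the expected
# impact of the journals the department published in, since including
# them is "an important source of error" in the ratio.

def actual_expected(papers):
    """Each paper: (total_citations, self_citations, journal_expected)."""
    actual = sum(total - self_c for total, self_c, _ in papers)
    expected = sum(exp for _, _, exp in papers)
    return actual / expected if expected else 0.0

dept = [(12, 3, 6.0), (4, 1, 5.0), (20, 5, 9.0)]
ratio = actual_expected(dept)  # > 1.0 means above journal-based expectation
```

The paper's finding that journal expected impact explains only 18% of the variance in actual impact is a reminder that this ratio often deviates far from 1.0, in both directions.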
Scientometrics | 1991
Henk F. Moed; R. E. De Bruin; Anton J. Nederhof; Robert J. W. Tijssen
International scientific co-operation (ISC) and awareness are topics of increasing interest for both scientists and science policy makers. In this paper, we adopt primarily the science policy point of view. After a concise overview of the literature, we summarize the main results of the research we conducted. The main outcome with respect to ISC is that it increases. However, large differences exist between countries and between scientific disciplines. ISC and awareness constitute a complex phenomenon, affected by several factors, science-internal as well as external. Several techniques are described, among them techniques that visualize ISC relations through analytical maps. An important aspect of our research methodology is the combination of various quantitative, bibliometric analyses with qualitative research on the structure of science and the relations between science and society. Finally, we sketch perspectives for future research.
Scientometrics | 1987
Anton J. Nederhof; A. F. J. Van Raan
Quality judgments of predominantly local senior scientists regarding the scientific performance of candidates for a doctorate degree in physics were compared to the non-local short-term and long-term impact of the work published by these candidates before and after graduation. It was hypothesized that publications of cum laude degree-holders (‘cum laudes’), both shortly before and shortly after the award of the degree, would be more highly cited in both the short and the long run than publications of ‘ordinary’ degree-holders. Before graduation, cum laudes were significantly more productive, as well as authors of more highly cited publications, than ordinary doctorates. Publications authored by cum laudes some years before their graduation received on average more than twice as many citations as publications authored by non-cum laudes. However, in particular for cum laudes, productivity and impact decreased sharply in the years after graduation. After graduation, cum laudes continued to be more productive than non-cum laudes, but the impact of their publications equalled that of those produced by non-cum laudes. The results offer little evidence for the Matthew effect and the Ortega hypothesis, but support the validity of both peer review outcomes and bibliometric impact assessments of scientific performance.
Research Policy | 1986
Arie Rip; Anton J. Nederhof
A Program Committee Biotechnology was established in the Netherlands for the period 1981–1985, to stimulate biotechnological research and its contribution to innovation. Effects of its activities on researchers and on the research system have been measured in terms of recognition of, and commitment to, the priorities and approaches of the Program Committee. Results from the questionnaire survey and the interviews are presented, and are used to assess the implementation strategy of the committee and the dynamics of orientation of researchers to new priorities. A consciously orchestrated strategy, with some accommodation to the interests of the field, appears to be productive. This may be a generally useful middle course between dirigisme and laissez-faire, because it exploits the leverage that resides in the strategies of scientists to mobilize resources for their work.
Scientometrics | 2008
Anton J. Nederhof
This paper examines policy-relevant effects of a yearly public ranking of individual researchers and their institutes in economics by means of their publication output in international top journals. In 1980, a grassroots ranking (‘Top 40’) of researchers in the Netherlands by means of their publications in international top journals started a competition among economists. The objective was to improve economics research in the Netherlands to an internationally competitive level. The ranking lists did stimulate output in prestigious international journals. Netherlands universities tended to perform well compared to universities elsewhere in the EU concerning volume of output in ISI source journals, but their citation impact was average. Limitations of ranking studies and of bibliometric monitoring in the field of economics are discussed.