Thed N. van Leeuwen
Leiden University
Publication
Featured research published by Thed N. van Leeuwen.
Scientometrics | 1995
Henk F. Moed; R. E. De Bruin; Thed N. van Leeuwen
This paper gives an outline of a new bibliometric database based upon all articles published by authors from the Netherlands and processed during the period 1980–1993 by the Institute for Scientific Information (ISI) for the Science Citation Index (SCI), Social Science Citation Index (SSCI) and Arts & Humanities Citation Index (A&HCI). The paper describes various types of information added to the database: data on articles citing the Dutch publications; detailed citation data on ISI journals and subfields; and a classification system of the main publishing organizations appearing in the addresses. Moreover, an overview is given of the types of bibliometric indicators that were constructed, and their relationship to indicators developed by other researchers in the field is discussed. Finally, two applications are given in order to illustrate the potential of the database and of the bibliometric indicators derived from it. The first represents a synthesis of ‘classical’ macro-indicator studies on the one hand and bibliometric analyses of research groups or institutes on the other. The second application gives, for the first time, a detailed analysis of a country’s publication output per institutional sector.
Scientometrics | 2001
Thed N. van Leeuwen; Henk F. Moed; Robert J. W. Tijssen; Martijn S. Visser; Anthony F. J. van Raan
Empirical evidence presented in this paper shows that the utmost care must be taken in interpreting bibliometric data in a comparative evaluation of national research systems. From the results of recent studies, the authors conclude that the value of impact indicators of research activities at the level of an institution or a country strongly depends upon whether one includes or excludes research publications in SCI-covered journals written in languages other than English. Additional material was gathered to show the distribution of SCI papers among publication languages. Finally, the authors make suggestions for further research on how to deal with this type of problem in future national research performance studies.
Journal of the Association for Information Science and Technology | 2012
Ludo Waltman; Clara Calero-Medina; Joost Kosten; Ed C. M. Noyons; Robert J. W. Tijssen; Nees Jan van Eck; Thed N. van Leeuwen; Anthony F. J. van Raan; Martijn S. Visser; Paul Wouters
The Leiden Ranking 2011/2012 is a ranking of universities based on bibliometric indicators of publication output, citation impact, and scientific collaboration. The ranking includes 500 major universities from 41 different countries. This paper provides an extensive discussion of the Leiden Ranking 2011/2012. The ranking is compared with other global university rankings, in particular the Academic Ranking of World Universities (commonly known as the Shanghai Ranking) and the Times Higher Education World University Rankings. The comparison focuses on the methodological choices underlying the different rankings. Also, a detailed description is offered of the data collection methodology of the Leiden Ranking 2011/2012 and of the indicators used in the ranking. Various innovations in the Leiden Ranking 2011/2012 are presented. These innovations include (1) an indicator based on counting a university’s highly cited publications, (2) indicators based on fractional rather than full counting of collaborative publications, (3) the possibility of excluding non-English language publications, and (4) the use of stability intervals. Finally, some comments are made on the interpretation of the ranking and a number of limitations of the ranking are pointed out.
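The contrast between full and fractional counting of collaborative publications, innovation (2) above, can be sketched in a few lines. The university names and publication list below are invented for illustration; this is not the ranking's actual data pipeline.

```python
# Full vs. fractional counting of collaborative publications.
# Full counting credits each affiliated university 1 per paper;
# fractional counting splits a single credit equally among them.
from collections import defaultdict

def count_publications(pubs, fractional=False):
    """pubs: list of lists of universities affiliated with each publication."""
    counts = defaultdict(float)
    for universities in pubs:
        credit = 1.0 / len(universities) if fractional else 1.0
        for u in set(universities):
            counts[u] += credit
    return dict(counts)

# Hypothetical publication set.
pubs = [
    ["Leiden", "Delft"],      # two-university collaboration
    ["Leiden"],               # single-university paper
    ["Delft", "Utrecht"],
]
print(count_publications(pubs))                   # full counting
print(count_publications(pubs, fractional=True))  # fractional counting
```

Under full counting, collaborative papers are counted once for every participant, so the column totals exceed the number of papers; fractional counting keeps the total credit equal to the number of publications.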
Journal of the Association for Information Science and Technology | 1995
Henk F. Moed; Thed N. van Leeuwen
The Institute for Scientific Information (ISI) publishes annually listings of impact factors of scientific journals, based upon data extracted from the Science Citation Index (SCI). The impact factor of a journal is defined as the average number of citations given in a specific year to documents published in that journal in the two preceding years, divided by the number of “citable” documents published in that journal in those two years. This article presents evidence that for a considerable number of journals the values of the impact factors published in ISI’s Journal Citation Reports (JCR) are inaccurate, particularly for several journals having a high impact factor. The inaccuracies are due to an inappropriate definition of citable documents. Document types not defined by ISI as citable (particularly letters and editorials) are actually cited and do contribute to the citation counts of a journal. We present empirical data in order to assess the degree of inaccuracy due to this phenomenon. For several journals the results are striking. We propose calculating impact factors per document type for a journal, rather than the single impact factor currently given in the JCR.
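The definition above, and the per-document-type variant the authors propose, can be sketched directly. The document-type counts below are invented for illustration, not real JCR figures; they simply show how citations to "non-citable" letters and editorials inflate the conventional ratio.

```python
# Two-year journal impact factor as defined in the abstract, plus a
# per-document-type variant. All numbers here are illustrative.

def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Citations received in year Y to items published in Y-1 and Y-2,
    divided by the number of 'citable' items published in those years."""
    if citable_items_prev_two_years == 0:
        raise ValueError("no citable items")
    return citations_to_prev_two_years / citable_items_prev_two_years

def impact_factor_per_type(citations_by_type, items_by_type):
    """One impact factor per document type (article, letter, editorial),
    instead of a single pooled value."""
    return {doc_type: citations_by_type.get(doc_type, 0) / n
            for doc_type, n in items_by_type.items() if n > 0}

# Letters and editorials are cited but not counted as 'citable' items,
# so the numerator includes them while the denominator does not.
citations = {"article": 900, "letter": 150, "editorial": 50}
items = {"article": 300, "letter": 100, "editorial": 20}

conventional = impact_factor(sum(citations.values()), items["article"])
per_type = impact_factor_per_type(citations, items)
print(conventional)  # 1100 / 300, higher than the article-only ratio of 3.0
print(per_type)
```

In this invented example the pooled impact factor exceeds the article-only value because 200 citations to letters and editorials land in the numerator while those documents are excluded from the denominator.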
Scientometrics | 2011
Ludo Waltman; Nees Jan van Eck; Thed N. van Leeuwen; Martijn S. Visser; Anthony F. J. van Raan
We present an empirical comparison between two normalization mechanisms for citation-based indicators of research performance. These mechanisms aim to normalize citation counts for the field and the year in which a publication was published. One mechanism is applied in the current so-called crown indicator of our institute. The other mechanism is applied in the new crown indicator that our institute is currently exploring. We find that at high aggregation levels, such as at the level of large research institutions or at the level of countries, the differences between the two mechanisms are very small. At lower aggregation levels, such as at the level of research groups or at the level of journals, the differences between the two mechanisms are somewhat larger. We pay special attention to the way in which recent publications are handled. These publications typically have very low citation counts and should therefore be handled with special care.
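The two normalization mechanisms can be sketched as follows. Identifying the old crown indicator with a ratio of averages and the new one with an average of per-publication ratios is an assumption on my part, consistent with the abstract but not spelled out in it; the citation figures are invented.

```python
# Two ways to normalize citation counts against field/year expectations.
# Each publication is (actual citations, expected citations for its
# field and publication year).

def ratio_of_averages(pubs):
    """Total actual citations divided by total expected citations."""
    return sum(c for c, _ in pubs) / sum(e for _, e in pubs)

def average_of_ratios(pubs):
    """Mean of per-publication actual/expected ratios."""
    return sum(c / e for c, e in pubs) / len(pubs)

# Recent publications have very small expected values, so their ratios
# can dominate the average-of-ratios mechanism -- the 'special care'
# the abstract calls for.
group = [(10, 5.0), (2, 4.0), (1, 0.2)]
print(ratio_of_averages(group))   # 13 / 9.2
print(average_of_ratios(group))   # (2.0 + 0.5 + 5.0) / 3
```

The last publication, with an expected value of only 0.2, barely moves the ratio of averages but contributes a ratio of 5.0 to the second mechanism, illustrating why the two indicators diverge at low aggregation levels.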
Scientometrics | 2002
Ed J. Rinia; Thed N. van Leeuwen; Eppo E. W. Bruins; Hendrik G. van Vuren; Anthony F. J. van Raan
In this paper we report on the results of an exploratory study of knowledge exchange between disciplines and subfields of science, based on bibliometric methods. The goal of this analysis is twofold. Firstly, we consider knowledge exchange between disciplines at a global level, by analysing cross-disciplinary citations in journal articles, based on the world publication output in 1999. Among others a central position of the Basic Life Sciences within the Life Sciences and of Physics within the Exact Sciences is shown. Limitations of analyses of interdisciplinary impact at the journal level are discussed. A second topic is a discussion of measures which may be used to quantify the rate of knowledge transfer between fields and the importance of work in a given field or for other disciplines. Two measures are applied, which appear to be proper indicators of impact of research on other fields. These indicators of interdisciplinary impact may be applied at other institutional levels as well.
Scientometrics | 2006
Thed N. van Leeuwen
The paper discusses an application of bibliometric techniques in the social sciences. While the interest of policy makers is growing, the topic is getting more and more attention from bibliometricians. However, many efforts are put into developing tools to measure scientific output and impact outside the world of the Social Sciences Citation Index, while the use of the SSCI for bibliometric applications is covered with obscurity and myths. This study attempts to clarify some of the topics mentioned against the application of the SSCI for evaluation purposes. The study will cover topics like the existing publication and citation culture within the social sciences, the effect of variable citation windows, and the (geographical) origin of citation flows.
Journal of Documentation | 1998
Henk F. Moed; Thed N. van Leeuwen; Jan Reedijk
During the past decades, journal impact data obtained from the Journal Citation Reports (JCR) have gained relevance in library management, research management and research evaluation. Hence, both information scientists and bibliometricians share the responsibility towards the users of the JCR to analyse the reliability and validity of its measures thoroughly, to indicate pitfalls and to suggest possible improvements. In this article, ageing patterns are examined in ‘formal’ use or impact of all scientific journals processed for the Science Citation Index (SCI) during 1981–1995. A new classification system of journals in terms of their ageing characteristics is introduced. This system has been applied to as many as 3,098 journals covered by the Science Citation Index. Following an earlier suggestion by Glänzel and Schoepflin, a maturing and a decline phase are distinguished. From an analysis across all subfields it has been concluded that ageing characteristics are primarily specific to the individual journal rather than to the subfield, while the distribution of journals in terms of slowly or rapidly maturing or declining types is specific to the subfield. It is shown that the cited half life (CHL), printed in the JCR, is an inappropriate measure of decline of journal impact. Following earlier work by Line and others, a more adequate parameter of decline is calculated taking into account the size of annual volumes during a range of fifteen years. For 76 per cent of SCI journals the relative difference between this new parameter and the ISI CHL exceeds 5 per cent. The current JCR journal impact factor is proven to be biased towards journals revealing a rapid maturing and decline in impact. Therefore, a longer-term impact factor is proposed, as well as a normalised impact statistic, taking into account citation characteristics of the research subfield covered by a journal and the type of documents published in it. When these new measures are combined with the proposed ageing classification system, they provide a significantly improved picture of a journal’s impact compared with that obtained from the JCR.
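The cited half-life and the volume-corrected decline parameter discussed above can be sketched as follows. The correction used here, computing the half-life on citations per published item rather than raw citations, is my own illustration in the spirit of the paper, not the authors' exact parameter; the citation and volume figures are invented.

```python
# Cited half-life (CHL) and a size-corrected variant. Correcting for
# annual volume size is illustrative of, not identical to, the paper's
# proposed decline parameter.

def cited_half_life(values_by_age):
    """Smallest age (in years) by which half of the total is reached,
    working back from the most recent year (age 1)."""
    total = sum(values_by_age)
    cumulative = 0.0
    for age, v in enumerate(values_by_age, start=1):
        cumulative += v
        if cumulative >= total / 2:
            return age
    return len(values_by_age)

def corrected_half_life(citations_by_age, items_by_age):
    """Same computation on citations-per-published-item, so growth or
    shrinkage of annual volumes cannot masquerade as impact decline."""
    per_item = [c / n for c, n in zip(citations_by_age, items_by_age)]
    return cited_half_life(per_item)

citations = [40, 60, 50, 30, 20]    # citations to items aged 1..5 years
items     = [200, 120, 60, 30, 20]  # annual volume shrank over time
print(cited_half_life(citations))
print(corrected_half_life(citations, items))
```

Because the journal's recent volumes are much larger, raw citation counts make its literature look younger than it is; dividing by volume size shifts the half-life outward, which is the kind of distortion in the JCR's CHL that the paper documents.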
Scientometrics | 2000
Robert J. W. Tijssen; R. K. Buter; Thed N. van Leeuwen
Patent citations to the research literature offer a way of identifying and comparing contributions of scientific and technical knowledge to technological development. This case study applies this approach through a series of analyses of citations to Dutch research papers listed on Dutch-invented and foreign patents granted in the US during the years 1987–1996. First, we examined the general validity and utility of these data as input for quantitative analyses of science–technology interactions. The findings provide new empirical evidence in support of the general view that these citations reflect genuine links between science and technology. The results of the various analyses reveal several important features of industrially relevant Dutch science: (1) the international scientific impact of research papers that are also highly cited by patents; (2) the marked rise in citations to Dutch papers on foreign-invented patents; (3) the large share of author-inventor self-citations in Dutch-invented patents; (4) the growing relevance of the life sciences; (5) an increase in the importance of scientific co-operation. We also find significant differences between industrial sectors, as well as major contributions of large science-based multinational enterprises, such as Philips, to domestic science–technology linkages. The paper concludes by discussing general benefits and limitations of this bibliometric approach for macro-level analysis of science bases in advanced industrialised countries like the Netherlands.
Scientometrics | 2002
Thed N. van Leeuwen; Henk F. Moed
This paper discusses the development and application of journal impact indicators in a number of bibliometric studies commissioned by Dutch organizations and institutions and conducted at our institute during the past five years. An outline is given of the research questions addressed in these studies and their policy context. For each study the appropriateness of the journal impact indicators produced by the Institute for Scientific Information (ISI) is evaluated. Alternative journal impact measures were developed, which are shown to be more appropriate in the particular research and policy contexts than the ISI measures; users considered these measures highly useful. The studies have revealed methodological flaws in the ISI journal impact factors.