
Publication


Featured research published by Henk F. Moed.


Scientometrics | 1995

New bibliometric tools for the assessment of national research performance: Database description, overview of indicators and first applications

Henk F. Moed; R. E. De Bruin; Thed N. van Leeuwen

This paper gives an outline of a new bibliometric database based upon all articles published by authors from the Netherlands and processed during the period 1980–1993 by the Institute for Scientific Information (ISI) for the Science Citation Index (SCI), Social Science Citation Index (SSCI) and Arts & Humanities Citation Index (A&HCI). The paper describes the various types of information added to the database: data on articles citing the Dutch publications; detailed citation data on ISI journals and subfields; and a classification system of the main publishing organizations appearing in the addresses. Moreover, an overview is given of the types of bibliometric indicators that were constructed. Their relationship to indicators developed by other researchers in the field is discussed. Finally, two applications are given in order to illustrate the potential of the database and of the bibliometric indicators derived from it. The first represents a synthesis of ‘classical’ macro-indicator studies on the one hand and bibliometric analyses of research groups or institutes on the other. The second application gives, for the first time, a detailed analysis of a country's publication output per institutional sector.


Journal of Informetrics | 2010

Measuring contextual citation impact of scientific journals

Henk F. Moed

This paper explores a new indicator of journal citation impact, denoted as source normalized impact per paper (SNIP). It measures a journal's contextual citation impact, taking into account characteristics of its properly defined subject field, especially the frequency at which authors cite other papers in their reference lists, the rapidity of maturing of citation impact, and the extent to which a database used for the assessment covers the field's literature. It further develops Eugene Garfield's notions of a field's ‘citation potential’, defined as the average length of reference lists in a field and determining the probability of being cited, and of the need in fair performance assessments to correct for differences between subject fields. A journal's subject field is defined as the set of papers citing that journal. SNIP is defined as the ratio of the journal's citation count per paper and the citation potential in its subject field. It aims to allow direct comparison of sources in different subject fields. Citation potential is shown to vary not only between journal subject categories – groupings of journals sharing a research field – or disciplines (e.g., journals in mathematics, engineering and social sciences tend to have lower values than titles in life sciences), but also between journals within the same subject category. For instance, basic journals tend to show higher citation potentials than applied or clinical journals, and journals covering emerging topics higher than periodicals in classical subjects or more general journals. SNIP corrects for such differences. Its strengths and limitations are critically discussed, and suggestions are made for further research. All empirical results are derived from Elsevier's Scopus.
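The core ratio the abstract defines can be sketched as follows; a minimal illustration, with function names and numbers that are ours, not taken from the paper:

```python
def citation_potential(citing_refs: list[int]) -> float:
    """Average reference-list length among the papers citing the journal
    (the abstract's proxy for the field's probability of being cited)."""
    return sum(citing_refs) / len(citing_refs)

def snip(citations: int, papers: int, citing_refs: list[int]) -> float:
    """SNIP as the abstract defines it: the journal's raw citations per
    paper, divided by the citation potential of its subject field."""
    raw_impact_per_paper = citations / papers
    return raw_impact_per_paper / citation_potential(citing_refs)

# A journal with 300 citations to 100 papers, cited by papers whose
# reference lists average 30 items:
print(snip(300, 100, [25, 30, 35]))  # → 0.1
```

A mathematics journal cited by papers with short reference lists thus scores higher than a life-sciences journal with the same raw impact, which is the correction the indicator is designed to make.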


Research Policy | 1985

The use of bibliometric data for the measurement of university research performance

Henk F. Moed; W. J. M. Burger; J. G. Frankfort; A. F. J. Van Raan

In this paper we present the results of a study on the potential of “bibliometric” (publication and citation) data as tools for university research policy. In this study, bibliometric indicators were calculated for all research groups in the Faculty of Medicine and the Faculty of Mathematics and Natural Sciences at the University of Leiden. Bibliometric results were discussed with a number of researchers from the two faculties involved. Our main conclusion is that the use of bibliometric data for evaluation purposes carries a number of problems, both with respect to data collection and handling and with respect to the interpretation of bibliometric results. However, most of these problems can be overcome. When used properly, bibliometric indicators can provide a “monitoring device” for university research management and science policy. They enable research policy-makers to ask relevant questions of researchers about their scientific performance, in order to find explanations of the bibliometric results in terms of factors relevant to policy.


Scientometrics | 2001

Language biases in the coverage of the Science Citation Index and its consequences for international comparisons of national research performance

Thed N. van Leeuwen; Henk F. Moed; Robert J. W. Tijssen; Martijn S. Visser; Anthony F. J. van Raan

Empirical evidence presented in this paper shows that the utmost care must be taken in interpreting bibliometric data in a comparative evaluation of national research systems. From the results of recent studies, the authors conclude that the value of impact indicators of research activities at the level of an institution or a country strongly depends upon whether one includes or excludes research publications in SCI-covered journals written in languages other than English. Additional material was gathered to show the distribution of SCI papers among publication languages. Finally, the authors make suggestions for further research on how to deal with this type of problem in future national research performance studies.


Journal of the Association for Information Science and Technology | 1991

Mapping of Science by Combined Co-Citation and Word Analysis. I. Structural Aspects

Robert R. Braam; Henk F. Moed; Anthony F. J. van Raan

The claim that co-citation analysis is a useful tool to map subject-matter specialties of scientific research in a given period is examined. A method has been developed using quantitative analysis of content words related to publications in order to: (1) study the coherence of research topics within sets of publications citing clusters, i.e., (part of) the “current work” of a specialty; (2) study differences in research topics between sets of publications citing different clusters; and (3) evaluate the recall of “current work” publications concerning the specialties identified by co-citation analysis. Empirical support is found for the claim that co-citation analysis indeed identifies subject-matter specialties. However, different clusters may identify the same specialty, and results are far from complete concerning the identified “current work.” These results are in accordance with the opinion of some experts in the fields. The low recall of co-citation analysis concerning the “current work” of specialties is shown to be related to the way in which researchers build their work on earlier publications: the “missed” publications equally build on very recent earlier work, but are less “consensual” and/or less “attentive” in their referencing practice. Evaluation of national research performance using co-citation analysis appears to be biased by this “incompleteness.”
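The recall measure the abstract evaluates can be sketched as a simple set ratio; the paper identifiers below are hypothetical placeholders:

```python
def recall(current_work: set[str], retrieved: set[str]) -> float:
    """Fraction of a specialty's 'current work' publications that the
    co-citation clusters actually retrieve."""
    return len(current_work & retrieved) / len(current_work)

# Four current-work papers, of which the clusters retrieve two
# (the third retrieved paper belongs to another specialty):
print(recall({"p1", "p2", "p3", "p4"}, {"p2", "p4", "p9"}))  # → 0.5
```

A low value, as the abstract reports, means many genuinely active papers in the specialty are missed by the clustering.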


Journal of the Association for Information Science and Technology | 1995

Improving the accuracy of Institute for Scientific Information's journal impact factors

Henk F. Moed; Thed N. van Leeuwen

The Institute for Scientific Information (ISI) publishes annual listings of impact factors of scientific journals, based upon data extracted from the Science Citation Index (SCI). The impact factor of a journal is defined as the average number of citations given in a specific year to documents published in that journal in the two preceding years, divided by the number of “citable” documents published in that journal in those two years. This article presents evidence that for a considerable number of journals the values of the impact factors published in ISI's Journal Citation Reports (JCR) are inaccurate, particularly for several journals having a high impact factor. The inaccuracies are due to an inappropriate definition of citable documents. Document types not defined by ISI as citable (particularly letters and editorials) are actually cited and do contribute to the citation counts of a journal. We present empirical data in order to assess the degree of inaccuracy due to this phenomenon. For several journals the results are striking. We propose to calculate impact factors per type of document for a journal, rather than the single impact factor currently given in the JCR.
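The asymmetry the paper criticizes can be illustrated with a toy calculation; the numbers are ours, not from the article:

```python
def impact_factor(citations_in_year: int, citable_docs: int) -> float:
    """Journal impact factor as defined above: citations given in year Y
    to a journal's items from years Y-1 and Y-2, divided by the number
    of 'citable' documents published in those two years."""
    return citations_in_year / citable_docs

# Suppose 500 citations in 1995 to 1993-1994 items, where the journal
# published 200 counted articles plus 50 letters/editorials that ISI
# does not count as citable but that nevertheless attract citations:
print(impact_factor(500, 200))       # letters excluded from denominator → 2.5
print(impact_factor(500, 200 + 50))  # letters also counted as citable → 2.0
```

Because cited-but-uncounted items inflate the numerator without enlarging the denominator, journals publishing many letters and editorials receive inflated impact factors.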


Archivum Immunologiae Et Therapiae Experimentalis | 2009

New developments in the use of citation analysis in research evaluation

Henk F. Moed

This paper presents an overview of research assessment methodologies developed in the field of evaluative bibliometrics, a subfield of quantitative science and technology studies, aimed to construct indicators of research performance from a quantitative statistical analysis of scientific-scholarly documents. Citation analysis is one of its key methodologies. The paper illustrates the potentialities and limitations of the use of bibliometric indicators in research assessment. It discusses the relationship between metrics and peer review; databases used as sources of bibliometric analysis; the pros and cons of indicators often applied, including journal impact factors, Hirsch indices, and normalized indicators of citation impact; and approaches to the bibliometric measurement of institutional research performance.


Journal of the Association for Information Science and Technology | 1999

Combining mapping and citation analysis for evaluative bibliometric purposes: a bibliometric study

Ed C. M. Noyons; Henk F. Moed; Marc Luwel

The general aim of the article is to demonstrate how the results both of a structural analysis, and of a research performance assessment of a research field, can be enriched by combining elements of both into one integrated analysis. In addition, a procedure is discussed to select and analyse candidate benchmark institutes to assess the position of a particular research institute, in terms of both its cognitive orientation and its scientific production and impact at the international research front. The combined method is applied in an evaluation of the research scope and performance of the Inter‐university Centre for Micro‐Electronics (IMEC) in Leuven, Belgium. On the basis of the comments of an international panel of experts in micro‐electronics, the method was discussed in detail. We concluded that the method provides a detailed and useful picture of the position of the institute from an international perspective. Moreover, we found that the results of each of the two parts are an added value to the other.


Journal of the Association for Information Science and Technology | 1991

Mapping of Science by Combined Co-Citation and Word Analysis. II: Dynamical Aspects.

Robert R. Braam; Henk F. Moed; Anthony F. J. van Raan

Combined analysis of co-citation relations and words is explored to study time-dependent (“dynamical”) aspects of scientific activities, as expressed in research publications. This approach, using words originating from publications citing documents in co-citation clusters, offers an additional and complementary possibility to identify and link specialty literature through time, compared to the exclusive use of citations. Analysis of co-citation relations is used to locate and link groups of publications that share a consensus concerning intellectual base literature. Analysis of word-profile similarity is used to identify and link publication groups that belong to the same subject-matter research specialty. Different types of “content words” are analyzed, including indexing terms, classification codes, and words from the title and abstract of publications. The developed methods and techniques are illustrated using data of a specialty in atomic and molecular physics. For this specialty, it is shown that, over a period of 10 years, continuity in the intellectual base was at a lower level than continuity in topics of current research. This finding indicates that a series of interesting new contributions are made in the course of time, without vast alteration in the general topics of research. However, within this framework, a more detailed analysis based on time plots of individual cited key articles and of content words reveals a change from a more rapid succession of new empirical studies to more retrospective and theoretically oriented studies in later years.


Scientometrics | 2002

Measuring China's research performance using the Science Citation Index

Henk F. Moed

This contribution focuses on the application of bibliometric techniques to research activities in China, based on data extracted from the Science Citation Index (SCI) and related Citation Indexes, produced by the Institute for Scientific Information (ISI). The main conclusion is that bibliometric analyses based on the ISI databases in principle provide useful and valid indicators of the international position of Chinese research activities, provided that these analyses deal properly with the relatively large number of national Chinese journals covered by the ISI indexes. It is argued that it is important to distinguish between a national and an international point of view. In order to assess the Chinese research activities from a national perspective, it is appropriate to use the scientific literature databases with a good coverage of Chinese periodicals, such as the Chinese Science Citation Database (CSCD), produced at the Chinese Academy of Sciences. Assessment of the position of Chinese research from an international perspective should be based on the ISI databases, but it is suggested to exclude national Chinese journals from this analysis. In addition it is proposed to compute an indicator of international publication activity, defined as the percentage of articles in journals processed for the ISI indexes, with the national Chinese journals being removed, relative to the total number of articles published either in national Chinese or in other journals, regardless of whether these journals are processed for the ISI indexes or not. This indicator can only be calculated by properly combining CSCD and ISI indexes.
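The indicator of international publication activity proposed above can be sketched as a simple percentage; the figures below are illustrative only:

```python
def international_activity(isi_excl_national: int, total_articles: int) -> float:
    """The proposed indicator: the percentage of a country's articles
    appearing in ISI-indexed journals, with national journals removed
    from the numerator. The denominator counts all articles, whether
    or not their journals are processed for the ISI indexes."""
    return 100 * isi_excl_national / total_articles

# 4,000 articles in ISI journals after removing national Chinese
# journals, out of 20,000 articles published in total (combining
# CSCD and ISI coverage, as the abstract suggests):
print(international_activity(4000, 20000))  # → 20.0
```

As the abstract notes, the denominator requires combining the CSCD and ISI indexes, since neither database alone covers all of a country's output.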

Collaboration


Henk F. Moed's collaborations.

Top Co-Authors

Gali Halevi
Icahn School of Medicine at Mount Sinai

Carmen López-Illescas
Spanish National Research Council

Félix de Moya-Anegón
Spanish National Research Council

Wolfgang Glänzel
Hungarian Academy of Sciences