
Publication


Featured research published by Martijn S. Visser.


Scientometrics | 2001

Language biases in the coverage of the Science Citation Index and its consequences for international comparisons of national research performance

Thed N. van Leeuwen; Henk F. Moed; Robert J. W. Tijssen; Martijn S. Visser; Anthony F. J. van Raan

Empirical evidence presented in this paper shows that the utmost care must be taken in interpreting bibliometric data in a comparative evaluation of national research systems. From the results of recent studies, the authors conclude that the value of impact indicators of research activities at the level of an institution or a country strongly depends upon whether one includes or excludes research publications in SCI-covered journals written in languages other than English. Additional material was gathered to show the distribution of SCI papers among publication languages. Finally, the authors make suggestions for further research on how to deal with this type of problem in future national research performance studies.
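The size of this effect is easy to see with a small hypothetical calculation: because non-English papers in SCI-covered journals typically attract far fewer citations, a citations-per-publication (CPP) value drops as soon as they are included. A minimal sketch (all citation counts below are invented for illustration):

```python
# Hypothetical illustration of the language-inclusion effect described
# above. The citation counts are invented; only the arithmetic matters.

english_papers = [10, 8, 12, 6]   # citations to English-language SCI papers
non_english_papers = [1, 0, 2]    # citations to non-English SCI papers

cpp_english_only = sum(english_papers) / len(english_papers)
cpp_all = (sum(english_papers) + sum(non_english_papers)) / (
    len(english_papers) + len(non_english_papers))

print(f"CPP, English only:  {cpp_english_only:.2f}")  # 9.00
print(f"CPP, all languages: {cpp_all:.2f}")           # 5.57
```

The same unit thus looks markedly weaker or stronger depending on a coverage decision that has nothing to do with the quality of its research.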


Journal of the American Society for Information Science and Technology | 2012

The Leiden ranking 2011/2012: Data collection, indicators, and interpretation

Ludo Waltman; Clara Calero-Medina; Joost Kosten; Ed C. M. Noyons; Robert J. W. Tijssen; Nees Jan van Eck; Thed N. van Leeuwen; Anthony F. J. van Raan; Martijn S. Visser; Paul Wouters

The Leiden Ranking 2011/2012 is a ranking of universities based on bibliometric indicators of publication output, citation impact, and scientific collaboration. The ranking includes 500 major universities from 41 different countries. This paper provides an extensive discussion of the Leiden Ranking 2011/2012. The ranking is compared with other global university rankings, in particular the Academic Ranking of World Universities (commonly known as the Shanghai Ranking) and the Times Higher Education World University Rankings. The comparison focuses on the methodological choices underlying the different rankings. Also, a detailed description is offered of the data collection methodology of the Leiden Ranking 2011/2012 and of the indicators used in the ranking. Various innovations in the Leiden Ranking 2011/2012 are presented. These innovations include (1) an indicator based on counting a university's highly cited publications, (2) indicators based on fractional rather than full counting of collaborative publications, (3) the possibility of excluding non-English language publications, and (4) the use of stability intervals. Finally, some comments are made on the interpretation of the ranking and a number of limitations of the ranking are pointed out.
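Innovation (2), fractional counting, can be illustrated with a minimal sketch. It assumes each publication is divided equally over the collaborating universities; the actual Leiden Ranking fractionalisation operates at the level of author addresses, and the papers below are hypothetical.

```python
# Full vs. fractional counting of collaborative publications.
# Papers and affiliations are invented for illustration.

papers = [
    {"title": "Paper A", "universities": ["Leiden", "Delft"]},
    {"title": "Paper B", "universities": ["Leiden"]},
    {"title": "Paper C", "universities": ["Leiden", "Delft", "Utrecht"]},
]

def full_count(papers, university):
    # Every paper with at least one author from the university counts as 1.
    return sum(1 for p in papers if university in p["universities"])

def fractional_count(papers, university):
    # Each paper is divided equally over the collaborating universities.
    return sum(1 / len(p["universities"])
               for p in papers if university in p["universities"])

print(full_count(papers, "Leiden"))                   # 3
print(f"{fractional_count(papers, 'Leiden'):.2f}")    # 1/2 + 1 + 1/3 = 1.83
```

Fractional counting prevents heavily collaborative universities from being credited with a full publication for every co-authored paper.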


Scientometrics | 2011

Towards a new crown indicator: an empirical analysis

Ludo Waltman; Nees Jan van Eck; Thed N. van Leeuwen; Martijn S. Visser; Anthony F. J. van Raan

We present an empirical comparison between two normalization mechanisms for citation-based indicators of research performance. These mechanisms aim to normalize citation counts for the field and the year in which a publication was published. One mechanism is applied in the current so-called crown indicator of our institute. The other mechanism is applied in the new crown indicator that our institute is currently exploring. We find that at high aggregation levels, such as at the level of large research institutions or at the level of countries, the differences between the two mechanisms are very small. At lower aggregation levels, such as at the level of research groups or at the level of journals, the differences between the two mechanisms are somewhat larger. We pay special attention to the way in which recent publications are handled. These publications typically have very low citation counts and should therefore be handled with special care.
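The two mechanisms differ in where the averaging happens: the old crown indicator (CPP/FCSm) is a ratio of averages, while the new crown indicator (the mean normalized citation score, MNCS) is an average of ratios. A minimal sketch, with invented citation counts and field baselines:

```python
# Old crown indicator (CPP/FCSm) vs. new crown indicator (MNCS).
# Each tuple: (citations received, expected citations for the paper's
# field and publication year). All numbers are invented.

papers = [(10, 4.0), (2, 5.0), (0, 0.5), (7, 3.5)]

def cpp_fcsm(papers):
    # Ratio of averages: total citations over total expected citations.
    total_cites = sum(c for c, _ in papers)
    total_expected = sum(e for _, e in papers)
    return total_cites / total_expected

def mncs(papers):
    # Average of ratios: mean of each paper's normalized citation score.
    return sum(c / e for c, e in papers) / len(papers)

print(f"CPP/FCSm: {cpp_fcsm(papers):.2f}")  # 19 / 13 ≈ 1.46
print(f"MNCS:     {mncs(papers):.2f}")      # (2.5 + 0.4 + 0.0 + 2.0) / 4 ≈ 1.23
```

At high aggregation levels the two averages are computed over so many papers that they nearly coincide, which matches the paper's empirical finding; at the level of small research groups, individual papers with unusual citation-to-expectation ratios can pull the two indicators apart.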


Scientometrics | 2011

Severe language effect in university rankings: particularly Germany and France are wronged in citation-based rankings

Anthony F. J. van Raan; Thed N. van Leeuwen; Martijn S. Visser

We applied a set of standard bibliometric indicators to monitor the scientific state of the art of 500 universities worldwide and constructed a ranking on the basis of these indicators (Leiden Ranking 2010). We find a dramatic and hitherto largely underestimated language effect in bibliometric, citation-based measurements of research performance when comparing a ranking based on all Web of Science (WoS) covered publications with one based on only English-language WoS-covered publications, particularly for Germany and France.


Scientometrics | 2011

On the correlation between bibliometric indicators and peer review: reply to Opthof and Leydesdorff

Ludo Waltman; Nees Jan van Eck; Thed N. van Leeuwen; Martijn S. Visser; Anthony F. J. van Raan

Opthof and Leydesdorff (Scientometrics, 2011) reanalyze data reported by Van Raan (Scientometrics 67(3):491–502, 2006) and conclude that there is no significant correlation between average citation scores measured using the CPP/FCSm indicator on the one hand and the quality judgment of peers on the other. We point out that Opthof and Leydesdorff draw their conclusions based on a very limited amount of data. We also criticize the statistical methodology used by Opthof and Leydesdorff. Using a larger amount of data and a more appropriate statistical methodology, we do find a significant correlation between the CPP/FCSm indicator and peer judgment.


Psychotherapy Research | 2003

Bibliometric Analysis of Psychotherapy Research: Performance Assessment and Position in the Journal Landscape

Anthony F. J. van Raan; Martijn S. Visser; Thed N. van Leeuwen; Erik van Wijk

The authors provide an overview of advanced bibliometric methods for (a) an objective and transparent assessment of journal performance and (b) positioning of a journal in relation to other journals. These methods are applied to Psychotherapy Research, an international journal within the field of clinical psychology. In the first analysis, the authors focus on journal performance in an international comparative perspective (i.e., the performance of the journal in relation to all other journals in the same field of science) and introduce a novel type of journal impact factor. In the second analysis, the authors position the journal on the basis of total citation relations among all relevant journals, including those outside the specific field of science to which the journal belongs. A multitude of interdisciplinary relations between the journal under investigation and many other journals is revealed. The investigators discuss briefly the potential of such a “journal citation mapping” for unraveling interdisciplinary developments and “interfaces” between different fields of science.
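The "journal citation mapping" mentioned at the end can be sketched roughly as follows: aggregate citation traffic between journals and inspect a journal's strongest citation partners, which may lie outside its own field. All journal names and counts below are invented.

```python
# Rough sketch of a journal citation map: find a journal's strongest
# citation partners from aggregated journal-to-journal citation counts.
from collections import Counter

# citation_traffic[a][b] = number of citations from journal a to journal b
citation_traffic = {
    "Psychother Res": Counter({"J Consult Clin Psychol": 55,
                               "J Clin Psychol": 40,
                               "Psychosom Med": 12}),
    "J Clin Psychol": Counter({"Psychother Res": 25}),
}

def strongest_partners(journal, traffic, top=3):
    # Combine outgoing and incoming citation relations for the journal.
    total = Counter(traffic.get(journal, {}))
    for other, links in traffic.items():
        if journal in links:
            total[other] += links[journal]
    return total.most_common(top)

print(strongest_partners("Psychother Res", citation_traffic))
# [('J Clin Psychol', 65), ('J Consult Clin Psychol', 55), ('Psychosom Med', 12)]
```

Partners from other fields surfacing high in such a list are exactly the interdisciplinary "interfaces" the abstract refers to.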


Scientometrics | 2013

The role of editorial material in bibliometric research performance assessments

Thed N. van Leeuwen; Rodrigo Costas; Clara Calero-Medina; Martijn S. Visser

In this study, the possibilities of extending the basis for research performance exercises with editorial material are explored. While this document type has traditionally not been considered an important type of scientific communication in research performance assessment procedures, researchers perceive editorial material as a relevant document type and an important source for the dissemination of scientific knowledge. In a number of cases, some of these editorial materials are actually 'highly cited'. This led to a thorough scrutiny of editorial material over the period 1992–2001, for all citation indexes of Thomson Scientific. The relevance of editorial material is analyzed through three quantitative bibliometric characteristics of scientific publications: page length, number of references, and number of received citations.


Research Evaluation | 2008

Important factors when interpreting bibliometric rankings of world universities: an example from oncology

Clara Calero-Medina; Carmen López-Illescas; Martijn S. Visser; Henk F. Moed

This paper presents bibliometric characteristics of the 386 most frequently publishing world universities and of a (partly overlapping) set of 529 European universities. Rather than presenting a ranking itself, it presents a statistical analysis of ranking data, focusing on more general patterns. It compares US universities with European institutions; countries with a strong concentration of academic research activities among universities with nations showing a more even distribution; a ranking of universities based on indicators calculated for all research fields combined with one compiled for a single field (oncology); general with specialised universities; and rankings based on a single indicator with maps combining social network analysis and a series of indicators. It highlights important factors that should be taken into account in the interpretation of rankings of research universities based on bibliometric indicators. Moreover, it illustrates policy-relevant research questions that may be addressed in secondary analyses of ranking data. In this way, this paper aims at contributing to a public information system on research universities.


Journal of Documentation | 2004

Quantitative deconstruction of citation impact indicators: Waxing field impact but waning journal impact

Anton J. Nederhof; Martijn S. Visser

In two case studies of research units, reference values used to benchmark research performance appeared to show contradictory results: the average citation level in the subfields (FCSm) increased worldwide, while the citation level of the journals (JCSm) decreased, where concomitant changes were expected. Explanations were sought in a shift in preference of document types, a change in publication preference for subfields, and changes in journal coverage. Publishing in newly covered journals with a low impact had a negative effect on impact ratios. However, the main factor behind the increase in FCSm was the distribution of articles across the five-year block periods that were studied. Publication in lower-impact journals produced a lagging JCSm. Actual values of JCSm, FCSm, and citations per publication (CPP) are not very informative about research performance, or about the development of impact over time in a certain subfield with block indicators. Normalized citation impact indicators are free from such effects and should be consulted primarily in research performance assessments.
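The abstract's point that raw CPP, JCSm, and FCSm values are less informative than normalized ratios can be illustrated with a rough sketch, assuming simple unweighted means (the operational CWTS indicators involve weighting and citation-window details omitted here); all numbers are invented.

```python
# Reference values for a research unit: CPP is the unit's citations per
# publication, JCSm the mean citation score of its journals, FCSm the
# mean citation score of the subfields of those journals.

papers = [
    # (citations, journal mean citation score, field mean citation score)
    (6, 4.0, 3.0),
    (3, 5.0, 3.5),
    (9, 6.0, 4.0),
]

cpp  = sum(c for c, _, _ in papers) / len(papers)  # 6.0
jcsm = sum(j for _, j, _ in papers) / len(papers)  # 5.0
fcsm = sum(f for _, _, f in papers) / len(papers)  # 3.5

# Normalized indicators: values above 1.0 mean the unit's papers are cited
# above the average of their journals (CPP/JCSm) or subfields (CPP/FCSm).
print(f"CPP/JCSm = {cpp / jcsm:.2f}")  # 6.0 / 5.0 = 1.20
print(f"CPP/FCSm = {cpp / fcsm:.2f}")  # 6.0 / 3.5 ≈ 1.71
```

A worldwide rise in FCSm or a drop in JCSm cancels out of these ratios, which is why the normalized indicators are robust against the drift the two case studies uncovered.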


Scientometrics | 2009

Expansion of scientific journal categories using reference analysis: How can it be done and does it make a difference?

Carmen López-Illescas; Ed C. M. Noyons; Martijn S. Visser; Félix de Moya-Anegón; Henk F. Moed

This paper explores a methodology for delimiting scientific subfields by combining the use of (specialist) journal categories from Thomson Scientific's Web of Science (WoS) and reference analysis. In a first step it selects all articles in journals included in a particular WoS journal category covering a subfield. These journals are labelled as the subfield's specialist journals. In a second step, this set of papers is expanded with papers published in other, additional journals that cite the subfield's specialist journals with a frequency exceeding a certain citation threshold. Data are presented for two medical subfields: Oncology and Cardiac & Cardiovascular Systems. A validation based on findings from earlier studies, on an analysis of MeSH descriptors from MEDLINE, and on expert opinion provides evidence that the proposed methodology has a high precision, and that expansion substantially enhanced the recall, not merely in terms of the number of retrieved papers but also in terms of the number of research topics covered. The paper also examines how a bibliometric ranking of countries and universities based on the citation impact of their papers published in a subfield's specialist journals compares to a ranking based on the impact of their articles in additional journals. The rather weak correlations obtained, especially at the level of universities, underline the conclusion from earlier studies that an assessment of research groups or universities in a scientific subfield that takes into account solely papers published in a subfield's specialist journals is unsatisfactory.
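The two-step expansion lends itself to a short sketch: take all articles in a category's specialist journals, then add articles from other journals whose reference lists cite the specialist journals at least a threshold number of times. The journals, papers, and threshold below are hypothetical.

```python
# Sketch of subfield expansion via reference analysis, under the
# assumptions stated above. All data are invented for illustration.

specialist_journals = {"J Oncol A", "J Oncol B"}
THRESHOLD = 2  # minimum references to specialist journals

# Candidate papers from non-specialist journals, with their reference lists.
candidates = [
    {"id": "p1", "journal": "Gen Med J",
     "refs": ["J Oncol A", "J Oncol B", "J Oncol A", "Stats J"]},
    {"id": "p2", "journal": "Physics J",
     "refs": ["J Oncol A", "Optics J"]},
]

def expand(candidates, specialists, threshold):
    # Keep a candidate if enough of its references point to specialist journals.
    return [p for p in candidates
            if sum(r in specialists for r in p["refs"]) >= threshold]

expanded = expand(candidates, specialist_journals, THRESHOLD)
print([p["id"] for p in expanded])  # ['p1']: 3 specialist refs; p2 has only 1
```

The threshold controls the precision/recall trade-off the abstract discusses: raising it keeps expansion conservative, lowering it pulls in more topically related papers from outside the specialist journals.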

Collaboration


Dive into Martijn S. Visser's collaborations.

Top Co-Authors

Henk F. Moed

Sapienza University of Rome

Carmen López-Illescas

Spanish National Research Council
