
Publication


Featured research published by Elizabeth S. Vieira.


Scientometrics | 2009

A comparison of Scopus and Web of Science for a typical university

Elizabeth S. Vieira; J.A.N.F. Gomes

For many years, the ISI Web of Knowledge from Thomson Reuters was the sole publication and citation database covering all areas of science, thus becoming an invaluable tool in bibliometric analysis. In 2004, Elsevier introduced Scopus, which is rapidly becoming a good alternative. Several attempts have been made at comparing these two instruments from the point of view of journal coverage, for research or for bibliometric assessment of research output. This paper attempts to answer the question that all researchers ask: what is to be gained by searching both databases? Or, if you are forced to opt for one of them, which should you prefer? To answer this question, a detailed paper-by-paper study is presented of the coverage achieved by ISI Web of Science and by Scopus of the output of a typical university. After considering the set of Portuguese universities, the detailed analysis is made for two of them for 2006, chosen for a comprehensiveness typical of most European universities. The general conclusion is that about 2/3 of the documents referenced in either database may be found in both, while a fringe of about 1/3 is referenced in only one of them. The citation impact of the documents in the core present in both databases is higher, but the impact of the fringe present in only one of the databases should not be disregarded, as some high-impact documents may be found among them.
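The core/fringe breakdown described above amounts to simple set arithmetic on document identifiers. The following minimal sketch (Python, with hypothetical DOI sets as input; nothing here is taken from the paper's data) computes the share of documents covered by both databases and by only one of them.

def coverage_overlap(wos_dois: set, scopus_dois: set) -> dict:
    # Core: documents indexed in both databases; fringes: indexed in only one.
    union = wos_dois | scopus_dois
    core = wos_dois & scopus_dois
    return {
        "core_share": len(core) / len(union),
        "wos_only_share": len(wos_dois - scopus_dois) / len(union),
        "scopus_only_share": len(scopus_dois - wos_dois) / len(union),
    }

# Toy example with invented identifiers; the paper reports a core of roughly 2/3.
print(coverage_overlap({"doi1", "doi2", "doi3", "doi4"}, {"doi3", "doi4", "doi5"}))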


Journal of Informetrics | 2010

Citations to scientific articles: Its distribution and dependence on the article features

Elizabeth S. Vieira; J.A.N.F. Gomes

Citation counts are increasingly used to assess the impact on the scientific community of publications produced by a researcher, an institution or a country, and many institutions use bibliometric indicators to steer research policy and to inform hiring or promotion decisions. Given the importance that counting citations has today, the aim of the work presented here is to show how citations are distributed within a scientific area and to determine how the citation count depends on article features. All articles referenced in the Web of Science in 2004 for Biology & Biochemistry, Chemistry, Mathematics and Physics were considered.
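As an illustration of how skewed within-field citation distributions tend to be, the sketch below (Python; toy citation counts, not the Web of Science data used in the paper) computes the share of all citations collected by the most cited fraction of articles.

def top_share(citations, top_frac=0.1):
    # Share of all citations held by the top `top_frac` most cited articles.
    ranked = sorted(citations, reverse=True)
    k = max(1, round(top_frac * len(ranked)))
    total = sum(ranked)
    return sum(ranked[:k]) / total if total else 0.0

# Toy example with a heavy-tailed set of citation counts.
print(top_share([120, 45, 30, 12, 8, 5, 3, 2, 1, 0, 0, 0]))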


Journal of Informetrics | 2010

A research impact indicator for institutions

Elizabeth S. Vieira; J.A.N.F. Gomes

This paper introduces a new impact indicator for the research effort of a university, nh3. The number of documents or the number of citations obtained by an institution is frequently used in international rankings of institutions. However, both measures depend strongly on size, and this is inducing mergers whose apparent sole goal is to improve the research ranking. The alternative is the ratio of the two measures, the mean citation rate, which is size independent but has been shown to fluctuate over time because it depends on a very small number of documents with extremely good citation performance. In the last few years, the popularity of the Hirsch index as an indicator of the research performance of individual researchers has led to its application to journals and institutions. However, the original aim of the h index, to give a mixed measure of the number of documents published and of their impact as measured by the citations collected over time, is undesirable for institutions, as overall size may be considered irrelevant for the evaluation of research impact. Furthermore, when applied to institutions the h index tends to retain a very small number of documents, making all other research production irrelevant for this indicator. The nh3 index proposed here is designed to measure solely the impact of research, in a way that is independent of the size of the institution, and is made relatively stable by using a 20-year estimate of the citations of the documents produced in a single year.
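For reference, the Hirsch index that the abstract contrasts with nh3 can be computed as below (Python). The exact nh3 formula is not reproduced here; this sketch only shows the standard h index, whose size dependence motivates the size-independent nh3 proposal.

def h_index(citations):
    # Largest h such that at least h documents have at least h citations each.
    ranked = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(ranked, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3, 0]))  # -> 4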


Scientometrics | 2011

An impact indicator for researchers

Elizabeth S. Vieira; J.A.N.F. Gomes

The assessment of individual researchers using bibliometric indicators is more complex than that of a region, country or university. For large scientific bodies, averages over a large number of researchers and their outputs are generally believed to give a good indication of the quality of the research work. For an individual, detailed peer evaluation of the research outputs is required, and even this may fail in the short term to give a final, long-term assessment of the relevance and originality of the work. Scientometric assessment at the individual level is not an easy task, not only because of the smaller number of publications being evaluated but also because other factors can significantly influence the bibliometric indicators applied. Citation practices vary widely among disciplines and subdisciplines, and this may explain the lack of good bibliometric indicators at the individual level. The main goal of this study was to develop an indicator that takes into account some of the aspects that must be considered in the assessment of scientific performance at the individual level. The indicator developed, the hnf index, considers the different citation cultures of each field and the number of authors per publication. The results show that the hnf index can be used to assess the scientific performance of individual researchers and to follow the performance of a researcher.
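The two corrections mentioned in the abstract, field citation culture and number of authors, can be illustrated with the toy normalization below (Python). This is only a hedged sketch of the general idea, not the published hnf formula.

def normalized_citation_score(citations, field_mean_citations, n_authors):
    # Normalize by the field's average citation rate, then share among co-authors.
    if field_mean_citations <= 0 or n_authors <= 0:
        raise ValueError("field mean and author count must be positive")
    return (citations / field_mean_citations) / n_authors

# Toy example: 12 citations in a field averaging 6 citations per paper, 3 authors.
print(normalized_citation_score(12, 6.0, 3))  # -> 0.666...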


Journal of Informetrics | 2014

How good is a model based on bibliometric indicators in predicting the final decisions made by peers

Elizabeth S. Vieira; José Sarsfield Cabral; J.A.N.F. Gomes

This paper shows how bibliometric models can be used to assist peers in selecting candidates for academic openings.


Scientometrics | 2011

The journal relative impact: an indicator for journal assessment

Elizabeth S. Vieira; J.A.N.F. Gomes

This paper presents the journal relative impact (JRI), an indicator for the scientific evaluation of journals. The JRI takes into account the different citation cultures of the Web of Science subject categories. It is calculated using a variable citation window, defined according to the time each subject category requires for citations to mature. The document types considered in each subject category depend on their output in relation to the citations received. The scientific performance of each journal is assessed relative to each subject category it belongs to, allowing the comparison of journals from different fields. The results show that the JRI can be used to assess the scientific performance of a given journal and that the SJR and SNIP should be used to complement the information provided by the JRI. The JRI presents good properties, such as stability over time and predictability.
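A field-normalized journal rate of the general kind described above can be sketched as follows (Python). The inputs and the normalization are illustrative assumptions, not the published JRI formula: citations are counted inside a subject-category-specific window and divided by the category's own rate, so that journals from different fields become comparable.

def relative_impact(journal_citations, journal_docs, category_citations, category_docs):
    # Citations per document for the journal, relative to its subject category,
    # both counted inside the category's own citation window.
    journal_rate = journal_citations / journal_docs
    category_rate = category_citations / category_docs
    return journal_rate / category_rate

# Toy example: 300 citations to 100 documents in a category averaging 2 citations/doc.
print(relative_impact(300, 100, 20000, 10000))  # -> 1.5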


Journal of the Association for Information Science and Technology | 2014

Definition of a model based on bibliometric indicators for assessing applicants to academic positions

Elizabeth S. Vieira; José Sarsfield Cabral; J.A.N.F. Gomes

A model based on a set of bibliometric indicators is proposed for predicting the ranking of applicants to an academic position as produced by a committee of peers. The results show that a very small number of indicators may lead to a robust prediction in about 75% of the cases.
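A model of this general kind can be sketched with a standard classifier (Python with scikit-learn). The indicators, data and model choice below are invented for illustration and are not the model or indicator set used in the paper.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical indicator columns: number of documents, citations, h index.
X = np.array([[12, 150, 6], [30, 600, 12], [5, 20, 2], [22, 300, 9],
              [8, 40, 3], [40, 900, 15], [15, 100, 5], [28, 450, 11]])
y = np.array([0, 1, 0, 1, 0, 1, 0, 1])  # 1 = ranked highly by the peer committee

model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.predict([[20, 250, 8]]))  # predicted class for a new applicant
print(model.score(X, y))              # in-sample accuracy of the toy fit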


Journal of Informetrics | 2016

The growth process of higher education institutions and public policies

Elizabeth S. Vieira; Benedetto Lepori

This paper investigates the growth over time of the size of higher education institutions (HEIs), as measured by the number of academic staff, and its association with HEI and country attributes. We analyze a sample of 837 HEIs from 18 countries derived from the European Tertiary Education Register (ETER) and from the European Micro Data dataset (EUMIDA) for the years 2008 and 2012. Our analysis shows that (1) HEI growth is largely proportional to size, leading to a nearly log-normal distribution of size (Gibrat's law), even if small institutions tend to grow faster; (2) growth in the number of students and the HEI's level of reputation positively influence HEI growth; consequently, (3) small HEIs need a lower level of reputation and less student growth to continue growing over time, while only highly reputed HEIs are able to maintain a large size over time. Our results are relevant for understanding the extent to which cumulative effects lead to a lasting concentration of resources in the HE system and whether public policies are able to redistribute resources on the basis of merit.
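The proportional-growth (Gibrat's law) check can be illustrated by regressing log growth on log initial size (Python with NumPy; the staff numbers below are invented, not the ETER/EUMIDA data). A slope near zero indicates proportional growth; a negative slope indicates that smaller institutions grow faster.

import numpy as np

# Hypothetical academic staff counts for a few HEIs in two years.
size_2008 = np.array([120.0, 300.0, 800.0, 1500.0, 50.0, 2500.0])
size_2012 = np.array([150.0, 340.0, 850.0, 1550.0, 70.0, 2550.0])

log_growth = np.log(size_2012 / size_2008)
log_size = np.log(size_2008)

slope, intercept = np.polyfit(log_size, log_growth, 1)
print(f"slope = {slope:.3f}")  # negative -> smaller HEIs grow faster in this toy data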


Research Evaluation | 2016

The bibliometric indicators as predictors of the final decision of the peer review

Elizabeth S. Vieira; J.A.N.F. Gomes


Research Evaluation | 2018

The peer-review process: The most valued dimensions according to the researcher’s scientific career

Elizabeth S. Vieira; J.A.N.F. Gomes
