In December, the well-regarded science, health and technology publisher Elsevier released a new metric to assess the influence that publications (and the research published in them) have in the marketplace. It is an alternative to what is considered the industry's gold-standard assessment tool, the so-called Impact Factor (IF). Every researcher knows the IF, and its influence can sway the direction of research publications, for better or worse.
What’s the Impact Factor?
The IF was created in 1955 by Eugene Garfield to help librarians identify the most influential journals. It is the ratio of the number of citations a journal's articles receive in a given year to the number of citable items the journal published in the two preceding years (a 5-year IF can be computed as well). It is a quantitative measure designed to assess the influence that a journal has in its field. Published annually in the Journal Citation Reports (Thomson Reuters), it covers about 11,000 publications around the world.
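The two-year IF described above can be sketched in a few lines of Python. This is a minimal illustration of the ratio, not Thomson Reuters' actual pipeline, and the citation and article counts are invented for the example.

```python
def impact_factor(citations_this_year: int, citable_items_prev_two_years: int) -> float:
    """2-year Impact Factor: citations received in the current year to items
    published in the previous two years, divided by the number of citable
    items (articles and reviews) published in that two-year window."""
    return citations_this_year / citable_items_prev_two_years

# Illustrative numbers: 600 citations in 2016 to articles published in
# 2014-2015, with 250 citable items published in that window.
print(round(impact_factor(600, 250), 3))  # 2.4
```

Note that only "citable items" (articles and reviews) enter the denominator, a detail that matters when comparing the IF with CiteScore below.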
The idea behind the metric is commendable. However, the score is now often treated as a measure of a researcher’s productivity (think of the phrase used in academic circles, “publish or perish”). That in turn has made it an easy target for manipulation, something clearly not intended in the original design.
The real problem is that the IF changes the way research is conducted and published. It encourages researchers to submit only “pretty” results on “fashionable” topics, and to inflate self-citations and/or citations from the target journal, in order to improve scores.
What changes with CiteScore?
CiteScore is also a ratio of citations to documents. Proposed by Elsevier, it is based on the wide Scopus database, which covers more than 22,000 titles, twice the number used by the IF. The observation window is enlarged to three years (one more than the IF). The denominator of the formula counts not only articles and reviews (as the IF does) but also other types of editorial content, such as letters to the editor, corrections and news items.
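The effect of that broader denominator is easy to see with a small sketch. Again the numbers are illustrative, not real journal data; the point is that adding non-research items to the denominator mechanically lowers the score.

```python
def cite_score(citations_three_years: int, articles: int, reviews: int, other_items: int) -> float:
    """CiteScore-style ratio: citations in the current year to documents
    published in the previous three years, divided by ALL documents
    (including letters, corrections, news items) from that window."""
    return citations_three_years / (articles + reviews + other_items)

# Same journal, same citations: first counting only articles and reviews,
# then including 100 letters/corrections in the denominator.
print(round(cite_score(900, 300, 50, 0), 2))    # 2.57
print(round(cite_score(900, 300, 50, 100), 2))  # 2.0 -- the score drops
```

This mechanical drop is why journals that publish many news items or letters, such as Nature, fare worse under CiteScore than under the IF.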
These changes negatively impact the scores of some publications, like Nature, whose ranking nosedived (by 173 places) under the CiteScore methodology. By comparison, Emerald Publishing and Elsevier improve by 1,317 and 127 ranks respectively (note that these two publishers are involved in the development of CiteScore).
Comparative graphics and tables are available on the website eigenfactor.org (http://eigenfactor.org/projects/posts/citescore.php).
The table below compares Impact Factor to CiteScore for some journals in the field of gait and balance. All CiteScore metrics are available here: https://journalmetrics.scopus.com/
| Journal | Publisher | Impact Factor | CiteScore |
|---|---|---|---|
| Archives of Physical Medicine and Rehabilitation | Elsevier | 3.045 | 3.09 |
| Gait & Posture | Elsevier | 2.286 | 2.98 |
| Journal of Biomechanics | Elsevier | 2.431 | 2.76 |
| Journal of Motor Behavior | Taylor & Francis | 1.573 | 1.71 |
| PLoS One | Public Library of Science | 3.057 | 3.32 |
| Research in Developmental Disabilities | Elsevier | 1.877 | 2.27 |
Does CiteScore really change the rules?
As CiteScore is based only on citations, it is not certain that it will change the way researchers obtain funding and/or visibility. Other effects might appear, such as journals refusing non-research documents (errata, letters to the editor), since these items increase the total number of documents in the denominator and therefore decrease the CiteScore.
It remains to be seen whether CiteScore becomes as popular as the Impact Factor. It is not the first attempt to offer an alternative to the IF: the SCImago Journal Rank (since 1996) and the Eigenfactor (2008), in which citations are weighted according to the rank of the citing journal, are two earlier attempts, but both have achieved little recognition within the scientific community.
If the final goal of these scores is to assess the productivity and quality of a researcher’s work, a score that includes more features showing a researcher is active in her/his field might be more powerful.
Zijlstra H, McCullough R. CiteScore: a new metric to help you track journal performance and make decisions. Available online on December 8, 2016. https://www.elsevier.com/editors-update/story/journal-metrics/citescore-a-new-metric-to-help-you-choose-the-right-journal
Garfield E. The history and meaning of the journal impact factor. JAMA 2006;295(1):90-3. https://dx.doi.org/10.1001/jama.295.1.90
Casadevall A, Fang FC. Causes for the persistence of Impact Factor mania. mBio 2014;5(2):e00064-14. https://dx.doi.org/10.1128/mBio.00064-14