Article level metrics: a look beyond the journal impact factor


The journal Impact Factor (IF), developed by Eugene Garfield at the Institute for Scientific Information (ISI), reflects the average number of times articles a journal published in the previous two years were cited in the Journal Citation Reports (JCR) year. The IF is calculated by dividing the number of citations received in the JCR year by the total number of articles published in the two previous years. For example, if a journal published 200 papers in 2013 and 2014, and those papers received 400 citations in 2015, then the journal's 2015 IF would be 2. The IF is based on Thomson Reuters (ISI Web of Knowledge) citation data. That data was first derived from the Science Citation Index, a citation index created by Garfield and produced by ISI. ISI was later acquired by Thomson Reuters along with the Science Citation Index, which Thomson Reuters expanded into the Science Citation Index Expanded. That index is now housed in the Web of Science, a subscription-based scientific citation indexing service encompassing six other online databases. Today, Thomson Reuters calculates IFs using data from all journals indexed in the Web of Science, and releases an IF listing annually in its Journal Citation Reports, which is available with paid Web of Science subscriptions.
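In code, the calculation is straightforward. The following Python sketch simply restates the definition above using the figures from the worked example (the function name is ours, introduced for illustration):

```python
# Minimal sketch of the Impact Factor calculation described above,
# using the worked example: 200 citable items published in 2013-2014
# that received 400 citations in the 2015 JCR year.

def impact_factor(citations_in_jcr_year: int, items_prior_two_years: int) -> float:
    """Citations in the JCR year to items from the two preceding
    years, divided by the number of those items."""
    return citations_in_jcr_year / items_prior_two_years

print(impact_factor(400, 200))  # -> 2.0, the journal's 2015 IF
```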

The journal Impact Factor has long been the leading indicator of research impact. In recent years, however, fundamental concerns about the IF have emerged that challenge this position. Chief among them is that the IF is a journal-level metric: it describes the journal as a whole and says little about the impact of any individual article or author published in it. The IF's reliance on citations also creates practical problems. Because it measures only citation impact, and only within a select list of journals, it ignores the other ways scholarship exerts influence, such as in public policy documents or on social media. Because it relies on citation counts, there is often a significant lag between the time a paper is published and when it begins to contribute to the IF of the journal that published it. Nor do IFs account for the fact that journals in different fields cannot expect to garner citations at the same rate; citation patterns vary greatly between disciplines and are heavily influenced by external factors. Finally, journals can manipulate their publishing practices to inflate their IF, for example by publishing a higher proportion of review articles, which tend to be cited more often than regular articles, or by encouraging authors to cite other articles in the same journal, thereby gaining IF through self-citation, as illustrated below.
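To make the self-citation effect concrete, consider a hypothetical extension of the earlier example in which 50 of the journal's 400 citations come from its own articles (all figures invented for illustration):

```python
# Hypothetical illustration of self-citation inflation; all numbers
# are invented. Continuing the earlier example: 200 citable items,
# 400 total citations, 50 of which the journal's articles give to
# other articles in the same journal.

total_citations = 400
self_citations = 50
citable_items = 200

if_with_self = total_citations / citable_items                      # 2.0
if_without_self = (total_citations - self_citations) / citable_items  # 1.75
print(f"IF with self-citations: {if_with_self}, without: {if_without_self}")
```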

Over time, supplemental indicators of impact and influence have arisen to address these issues with the IF. Many of these new indicators fall under the umbrella of author- or article-level metrics, which in addition to counting citations of papers can also measure how often authors publish, article page views and downloads, and the number of online comments articles generate. Perhaps the most popular alternative to the IF is the h-index. Created by Jorge E. Hirsch in 2005, the h-index measures the impact of an individual author by combining productivity with citations: an author has an h-index of h if h of his or her papers have each been cited at least h times.
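Hirsch's definition lends itself to a short computation. The following Python sketch implements it directly; the citation counts are invented for illustration:

```python
# Compute the h-index as defined above: the largest h such that
# h of the author's papers have at least h citations each.
# (The citation list below is invented for illustration.)

def h_index(citations: list[int]) -> int:
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank  # the rank-th paper has at least `rank` citations
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4: four papers cited at least 4 times each
```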

Given the limitations of bibliometrics, many academics and editors are looking to new, non-citation-based, article-level indicators of impact as an alternative. Altmetrics, a type of article-level metric (ALM), are gathered from mentions of research in nontraditional online outlets and can be used to analyze how scholarship is being found, shared, cited, and discussed. Depending on the information source, altmetrics can encompass a range of insights, including the number of views and downloads a research output receives and how often that research is referenced online in public policy documents, databases, social media (Facebook, Twitter, or YouTube), academic social networks (Mendeley, CiteULike, Delicious), news media, post-publication peer review forums, blogs, Wikipedia, and more. In recent years, companies and services have emerged with tools to track article-level metrics, including the Public Library of Science's Article-Level Metrics, Altmetric, Impactstory, and Plum Analytics [4]. Journals can use these tools to gather altmetrics data for their publications at the journal and article level, and individual scholars can use them to track the online activity surrounding their published works. Altmetrics are typically available soon after publication and can provide a rapid, early view of the social impact of research.
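Several of these services expose public APIs from which such data can be retrieved programmatically. As a rough sketch, the following queries Altmetric's free public details endpoint for a single DOI; the DOI is a placeholder and the response field names are assumptions based on typical responses, so consult Altmetric's documentation before relying on them:

```python
# Hedged sketch: fetch altmetrics for one DOI from Altmetric's free
# public details endpoint. The endpoint URL is real; the response
# field names used below are assumptions and may differ in practice.
import requests

doi = "10.1038/nature12373"  # placeholder DOI, for illustration only
resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)

if resp.status_code == 200:
    data = resp.json()
    print("Altmetric score:", data.get("score"))
    print("Tweets:", data.get("cited_by_tweeters_count"))
else:
    # Altmetric returns a non-200 status when it has no record of the DOI.
    print("No altmetrics found for this DOI (HTTP", resp.status_code, ")")
```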