Determining the impact of a new manuscript on its intended audience is an elusive, difficult-to-quantify pursuit. Traditional methods consider the impact factor of the journal in which the work is published, or the number of times the findings are cited by others. The former implies value by association, while the latter accrues slowly, over time.
New metrics are emerging, however, as rapid and telling indicators of impact at the level of the manuscript itself. Modern criteria such as downloads and shares are becoming increasingly relevant in today’s digital environment.
Thus far, the generally accepted moniker for these emerging measures is altmetrics. Often misinterpreted as "alternative metrics," the term actually refers to "article-level metrics" that capture the activity surrounding a single manuscript, across many different platforms, in real time.
Publishers today can track how often an article is downloaded or bookmarked as particularly worthy. Discussions of a manuscript on Facebook, Twitter, blogs, and Wikipedia can be similarly tracked. Indicators that might have held only passing interest a few years ago are now gaining significance. For example, high "tweetations" for a manuscript may serve to increase an author's "twimpact factor."1
Suffice it to say that specific nomenclature within the field of altmetrics is a work in progress. Nevertheless, a study of more than 1.3 million scientific papers found that 22% of all publications received at least one tweet. A fairly intuitive secondary finding was that shorter titles, and shorter documents in general, attained a higher degree of visibility.2
It also makes sense that this study found social science and biomedical papers were far more likely to be shared than papers concerning, say, mathematics. This finding, however, also points to an important limitation: altmetrics cannot be used to compare impact across different fields of science.2,3 A mediocre paper in a popular field may receive far more attention than a first-rate paper in some more arcane branch of study.
Findings like these underscore that altmetrics are an emerging measure of audience engagement, and do not necessarily reflect the true impact, or even the quality, of the science itself. In some instances, quite the opposite: seminal literature from bygone days will receive scant recognition in this arena, while exceptionally high marks will be awarded to the bustling conversations (and schadenfreude) that inevitably swirl around a retracted manuscript.
Many are also quick to point out that the system can be gamed with relative ease. Artificially inflated likes and tweets are readily available to those who might wish to accumulate them by any means possible.
The continued growth of altmetrics seems likely; its applications are less clear. Funding agencies are starting to take note, however, and some academics are starting to incorporate altmetrics scores into their performance reviews.3 As noted by altmetrics.org, scholars are moving their work onto the web in growing numbers, essentially self-publishing by way of scholarly blogs or other forms of social sharing. This conjures up a strange new world in which peer review is essentially crowdsourced, and impact may be assessed in real time through hundreds or even thousands of trackable conversations.4
For a company like W2O, steeped in communications and hard data, there is certainly value in capturing and quantifying the buzz around a given piece of media beyond the halls of the research community. Determining what that buzz actually means, and how best to extract its value, is the next step as we follow the evolution of this burgeoning measure of impact.
1. Eysenbach G. J Med Internet Res. 2011;13:e123.
2. Haustein S, et al. PLoS One. 2015;10:e0120495.
3. Kwok R. Nature. 2013;500:491-3.
4. Priem J, et al. Altmetrics: a manifesto. 2010. http://altmetrics.org/manifesto.