What is Altmetric?


Altmetric is a system that tracks the attention that research outputs such as scholarly articles and datasets receive online. It does this by pulling in data from three main sources:

  • Social media like Twitter, Facebook, Google+, Pinterest and blogs
  • Traditional media - both mainstream (The Guardian, New York Times) and science-specific (New Scientist, Scientific American). Many non-English language titles are covered.
  • Online reference managers like Mendeley and CiteULike

We track too many sources to list individually, but a more detailed breakdown is available here.


Altmetric cleans up and normalizes the data from these sources, then makes it available for analysis. A key difference between Altmetric and other social media monitoring services is that Altmetric disambiguates links to outputs: it knows that even though some tweets might link to a PubMed abstract, newspapers to the publisher's site and blog posts to a dx.doi.org link, they're all talking about the same paper.
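
As a rough illustration of what that disambiguation involves, the sketch below maps a few common link styles to a single canonical DOI. The URL patterns, the lookup table and the identifiers are invented for the example; this is not Altmetric's actual implementation.

    import re

    # Hypothetical lookup: in practice this mapping would come from publisher
    # metadata, PubMed records and similar services. The IDs below are made up.
    PMID_TO_DOI = {"12345678": "10.1000/example.doi"}

    def canonical_doi(url):
        """Resolve different link styles for the same paper to one DOI."""
        # Direct DOI links, e.g. https://dx.doi.org/10.1000/example.doi
        m = re.search(r"(?:dx\.)?doi\.org/(10\.\S+)", url)
        if m:
            return m.group(1)
        # PubMed abstract links, e.g. https://pubmed.ncbi.nlm.nih.gov/12345678/
        m = re.search(r"pubmed\.ncbi\.nlm\.nih\.gov/(\d+)", url)
        if m:
            return PMID_TO_DOI.get(m.group(1))
        # Publisher landing pages would need per-publisher rules or a metadata lookup
        return None

    links = [
        "https://dx.doi.org/10.1000/example.doi",     # from a blog post
        "https://pubmed.ncbi.nlm.nih.gov/12345678/",  # from a tweet
    ]
    assert len({canonical_doi(u) for u in links}) == 1  # both point at the same paper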


What does it provide?


After Altmetric aggregates all of the information it can find about a research output (we call each piece of information a post), it looks at both the quantity and the quality of the attention being paid to that output and visualises it:


The number inside the coloured circle is the Altmetric score of attention for the output being viewed. This is a quantitative measure of the quality and quantity of attention that the output has received - you can read more about the scoring algorithm here.


The colours themselves reflect where the posts mentioning the output came from. For example, red means that the output has been mentioned by mainstream news outlets, and blue means it has been tweeted about.
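
In data terms, the donut is simply a palette keyed by source type. Only the two colours mentioned above are taken from this article; the rest of the palette is omitted here.

    # Only the two mappings stated above are shown; other source types
    # (blogs, reference managers, etc.) have their own colours in the full palette.
    SOURCE_COLOURS = {
        "news": "red",      # mainstream news outlets
        "twitter": "blue",  # tweets
    }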


Clicking on the circle typically gives you access to all of the posts that Altmetric has collected for that output, via the Altmetric Details Page:



Here you can view all of the conversations about that output, including links back to the original mentions, alongside demographics and the score in context. 

How are outputs scored?

The Altmetric score is influenced by two factors:


  1. The quantity of posts mentioning an output
  2. The quality of each post


The first is relatively straightforward: the more posts mentioning an output, the higher its score. We measure quality in a few different ways; a rough sketch of the idea follows the list below. In general:


  • Higher-profile posts are worth more than lower-profile ones. An article in the Washington Post contributes more, in score terms, than a blog post. A blog post contributes more than a tweet.
  • Who authored each post is important. For posts on social media sites we typically fetch an author's list of followers, a list of their past posts and information about how often those posts were liked, retweeted or reshared. A tweet from a doctor followed by other doctors will contribute more than an automated tweet from a journal's press office.
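
To make the two factors concrete, here is a rough sketch of the idea. The weights and the author-reach modifier below are invented for the example; they are not Altmetric's actual algorithm, which is described in the link that follows.

    # Illustrative weights only - not Altmetric's real values.
    SOURCE_WEIGHTS = {"news": 8.0, "blog": 5.0, "tweet": 1.0}

    def post_contribution(source_type, author_reach=1.0):
        """One post's contribution: a base weight for where it appeared,
        adjusted by who posted it (audience, past engagement, etc.)."""
        return SOURCE_WEIGHTS.get(source_type, 0.5) * author_reach

    def attention_score(posts):
        """Both quantity (number of posts) and quality (weight of each post)
        push the score up."""
        return round(sum(post_contribution(s, r) for s, r in posts))

    posts = [
        ("news", 1.0),   # article in the Washington Post
        ("blog", 1.0),   # blog post
        ("tweet", 1.5),  # tweet from a doctor followed by other doctors
        ("tweet", 0.1),  # automated tweet from a journal's press office
    ]
    print(attention_score(posts))  # 15 (8 + 5 + 1.5 + 0.1, rounded)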


A more detailed explanation of how the scoring algorithm works can be found here.

Important things to remember

  • Altmetric measures attention, not quality. People pay attention to papers for all sorts of reasons, not all of them positive.
  • Altmetric only tracks public attention. Papers are discussed in private forums, offline in journal clubs and by email, but we cannot track this.
  • Altmetric tracks direct attention, that is to say, attention focused on a specific research paper or dataset. More specifically, for a newspaper article, blog post, etc. to be counted by Altmetric, it must contain a hyperlink to, or a formal citation of, a scholarly work (a simplified check is sketched after this list).
  • Altmetric provides you with a single metric per output so that you can quickly compare relative levels of attention but it only makes sense to use this when comparing apples with apples (e.g. within a single discipline). The norms for attention are very different for different scientific disciplines, just as the norms for citations are.
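
As a simplified illustration of the "direct attention" rule above, the check below only looks for a DOI link; Altmetric's real matching covers many more identifier types and formal citation formats.

    import re

    # Simplified: only DOI links are checked here. Real matching also covers
    # PubMed IDs, publisher links, formal citations and so on.
    DOI_LINK = re.compile(r"https?://(?:dx\.)?doi\.org/10\.\d{4,9}/\S+")

    def counts_as_direct_mention(text):
        """A post only counts if it links to (or formally cites) a specific output."""
        return bool(DOI_LINK.search(text))

    counts_as_direct_mention("Great study: https://doi.org/10.1000/xyz123")  # True
    counts_as_direct_mention("Scientists say exercise is good for you")      # False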