What is Altmetric?

Altmetric is a system that tracks the attention that research outputs, such as scholarly articles and datasets, receive online. It pulls data from:

  • Social media like Twitter, Facebook, and Google+.
  • Traditional media - both mainstream (The Guardian, New York Times) and field-specific (New Scientist, Bird Watching). Many non-English-language titles are covered.
  • Blogs - both those run by major organisations (Cancer Research UK) and those run by individual researchers.
  • Online reference managers like Mendeley and CiteULike.

We track too many sources to list individually, but a more detailed breakdown is available here.

Altmetric cleans up and normalizes the data from these sources, then makes it available for analysis. A key difference between Altmetric and other social media monitoring services is that Altmetric disambiguates links to outputs: it knows that even though some tweets might link to a PubMed abstract, newspapers to the publisher's site, and blog posts to a dx.doi.org link, they are all talking about the same paper.
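
To illustrate what that disambiguation involves, here is a minimal sketch in Python. It is not Altmetric's actual implementation: the URL patterns, the lookup table, and the output identifiers are all invented for the example.

    import re

    # Hypothetical lookup table mapping known identifiers to one canonical
    # record. In practice this would be a database keyed on DOI, PMID, etc.
    KNOWN_OUTPUTS = {
        ("doi", "10.1234/example.5678"): "output-42",
        ("pmid", "24150000"): "output-42",  # the same paper, found via PubMed
    }

    def canonical_output(url):
        """Resolve a mentioned URL to a canonical research output, if known."""
        # doi.org / dx.doi.org links carry the DOI directly in the path.
        m = re.match(r"https?://(?:dx\.)?doi\.org/(10\.\S+)", url)
        if m:
            return KNOWN_OUTPUTS.get(("doi", m.group(1)))
        # PubMed abstract pages carry the PMID.
        m = re.match(r"https?://(?:www\.)?ncbi\.nlm\.nih\.gov/pubmed/(\d+)", url)
        if m:
            return KNOWN_OUTPUTS.get(("pmid", m.group(1)))
        # Publisher landing pages would need per-publisher rules; omitted here.
        return None

    # A tweet linking via dx.doi.org and a blog post linking to the PubMed
    # abstract both resolve to the same output.
    assert canonical_output("https://dx.doi.org/10.1234/example.5678") == "output-42"
    assert canonical_output("https://www.ncbi.nlm.nih.gov/pubmed/24150000") == "output-42"

In practice this kind of resolution also has to cope with redirects, shortened URLs, and publisher-specific page structures, which is what makes disambiguation genuinely hard.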

What does it provide?

After Altmetric aggregates all of the information it can find about a research output (we call each piece of information a post), it looks at both the quantity and the quality of the attention being paid to that output and visualises the result as a coloured badge:

The number inside the coloured circle is the Altmetric Attention Score for the output being viewed. This is a quantitative measure of the quality and quantity of attention that the output has received - you can read more about the scoring algorithm here.

The colours themselves reflect where the posts mentioning the output came from. For example, red means that the output has been mentioned by mainstream news outlets, while blue means it has been tweeted about.

To view the data associated with the article, you'll need to click on each source-specific tab. For example, to see all the news mentions, click on the News tab.

How are outputs scored?

The Altmetric Attention Score is influenced by two factors:

  1. The quantity of posts mentioning an output
  2. The quality of each post's source

The first is relatively straightforward: the more posts mentioning an output, the higher its attention score. We measure quality in a few different ways. In general:

  • Higher-profile posts are worth more than lower-profile ones. An article in the Washington Post contributes more, in score terms, than a blog post, and a blog post contributes more than a tweet.
  • Who authored each post is also important. For posts on social media sites, we typically fetch an author's list of followers, a list of their past posts, and information about how often those posts were liked, retweeted or reshared. A tweet from a doctor followed by other doctors will contribute more than an automated tweet from a journal's press office.
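
To make the weighting concrete, here is a hypothetical sketch of how quantity and quality might combine. The per-source weights and author modifiers below are invented for illustration; they are not Altmetric's actual values, which are described in the scoring documentation linked below.

    # Illustrative per-source weights: higher-profile sources count for more.
    # These numbers are invented for the example, not Altmetric's own values.
    SOURCE_WEIGHTS = {
        "news": 8.0,
        "blog": 5.0,
        "tweet": 1.0,
    }

    def attention_score(posts):
        """posts: list of (source_type, author_modifier) pairs.

        author_modifier scales a post based on who wrote it, e.g. a
        well-followed subject expert counts for more than an automated feed.
        """
        return round(sum(SOURCE_WEIGHTS.get(source, 1.0) * modifier
                         for source, modifier in posts))

    # One news story, one blog post, a doctor's tweet (slightly boosted) and
    # an automated journal tweet (heavily discounted):
    posts = [("news", 1.0), ("blog", 1.0), ("tweet", 1.1), ("tweet", 0.25)]
    print(attention_score(posts))  # -> 14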

A more detailed explanation of how the scoring algorithm works can be found here.

Important things to remember

  • Altmetric measures attention, not quality. People pay attention to papers for all sorts of reasons, not all of them positive.
  • Altmetric only tracks public attention. Papers are discussed in private forums, offline in journal clubs, and by email, but we cannot track this.
  • Altmetric tracks direct attention, that is, attention focused on a specific research paper or dataset. More specifically, for a newspaper article or blog post to be counted by Altmetric, it must either contain a link to the publication (a journal article page, DOI, PMID, or institutional repository link) or meet our text-mining criteria (see the sketch after this list). We have more information here about how we do English-language text mining for news stories and policy documents.
  • Altmetric provides you with a single metric per output so that you can quickly compare relative levels of attention, but it only makes sense to use this when comparing apples with apples (e.g. within a single discipline). The norms for attention vary widely between scientific disciplines, just as the norms for citations do.
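
As a rough illustration of the link-matching half of the direct-attention rule, here is a hypothetical check for whether an article body contains a trackable identifier. The patterns are simplified assumptions, not Altmetric's actual matching rules, and real text mining goes well beyond pattern matching like this.

    import re

    # Simplified, hypothetical patterns for identifiers that tie a story to a
    # specific output; the real matching rules are far more extensive.
    IDENTIFIER_PATTERNS = [
        re.compile(r"https?://(?:dx\.)?doi\.org/10\.\S+"),  # DOI link
        re.compile(r"\bdoi:\s*10\.\d{4,9}/\S+", re.I),      # inline DOI
        re.compile(r"ncbi\.nlm\.nih\.gov/pubmed/\d+"),      # PubMed (PMID) link
    ]

    def mentions_output(article_text):
        """True if the text contains a direct link or identifier for an output."""
        return any(p.search(article_text) for p in IDENTIFIER_PATTERNS)

    print(mentions_output(
        "The study (https://doi.org/10.1234/example.5678) found that..."))  # True
    print(mentions_output("Scientists say coffee is good for you."))        # False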