Writing papers is fun, but rather pointless if no one reads them, uses them, and cites them. How do we find out if anyone reads our work? Gross citation counts are nice, and easily provided by ISI Web of Science (easily, that is, if your institution coughs up the money to pay for the database) or Google Scholar (free, but not always as comprehensive as ISI's counts). These services also provide links to the citing papers. This is useful, but everyone knows that more people read a paper than actually cite it. The problem, of course, is that there is no way to know how many are reading the darned thing.
Until now.
Public Library of Science (the publishers of PLoS ONE, PLoS Biology, and other open access journals; in the interests of full disclosure, I'm an academic editor for PLoS ONE) has just completely shifted the playing field. Free, article-level metrics are now available. Easily. With one click, you can find out how many page views a research article has had and how many people have downloaded the PDF. Better yet, you can track trends through time and download the data into an Excel spreadsheet for further analysis.
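For anyone who wants to go beyond eyeballing the charts, that downloaded spreadsheet can be crunched with a few lines of code. Here's a minimal sketch in Python, assuming the data have been saved as a CSV; the file name and column headers ("month", "html_views", "pdf_downloads") are placeholders, so adjust them to match whatever the actual export contains.

```python
import csv

# Tally cumulative page views and PDF downloads, month by month,
# from an article-level metrics export saved as CSV.
# NOTE: the file name and column names below are assumptions about
# the export format, not the actual PLoS schema.
def cumulative_usage(path):
    total_views = 0
    total_downloads = 0
    trend = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total_views += int(row["html_views"])
            total_downloads += int(row["pdf_downloads"])
            trend.append((row["month"], total_views, total_downloads))
    return trend

if __name__ == "__main__":
    for month, views, downloads in cumulative_usage("article_metrics.csv"):
        print(f"{month}: {views} page views, {downloads} PDF downloads")
```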
Just for fun, I checked out the stats for my co-authored paper on Triceratops horn use, which was published in January of this year. To date, the publication has had over 7,000 page views, 851 downloads of the PDF file, and 1 citation. The paper on Darwinius, which came out shortly after the Triceratops paper, has had over 66,000 page views and over 5,800 downloads of the PDF file.
PLoS ONE also provides summary tables for selected disciplines - a paper on evolutionary biology (which includes paleontology, for most purposes) published in 2008 could expect to have racked up at least 2,200 hits by now.
So what's to like here? Well, an author gets an immediate sense of whether anyone is paying attention to a publication. Page views and PDF downloads are a valuable tool for gauging community interest. In concert with citation data, they're probably a far better gauge of a paper's worth than the impact factor of the journal that the publication happens to show up in. The data are also freely available, transparent, and frequently updated. That last point is particularly important, because it may be years before a paper's full impact is known. An open-access metric for an open-access world.
And are there any problems? As with any metric, the unfortunate answer is yes. Page view counts probably include a lot of casual readers, who read the abstract and promptly forget the existence of the article in question. These counts could also be gamed by "click contests" - one need only smell the stench emanating from the hordes of Pharyngula's zombie fanboyz as they lurch towards the next on-line poll to realize just how malleable page view data potentially are (although, to PLoS's credit, they have attempted to filter out robot and web crawler traffic). The metric will also be abused by administrators, who will still make career-ending decisions based on a number (although at least it's hopefully a more relevant number this time). Once again to PLoS's credit, they provide explanatory and cautionary pages candidly outlining the pros and cons of the metric.
I suspect that other journals will follow suit - it may not happen tomorrow, but it will happen. We may be seeing the death of the traditional, sometimes tyrannical, "impact factor." Let's hope we don't replace it with a new despot!