Google h5 vs Thomson Impact Factor

I really like bibliometrics. Imagine my delight then, when I learnt that Google have recently introduced a new journal metric as an alternative to the impact factor. It’s called the h5 index, and you can read more about it here.

Basically, it’s equivalent to the Hirsch index, but calculated for a journal rather than an author, over a 5 year period. So, an h5 of 10 means that during the past five years a journal has published 10 articles which were each cited at least ten times (and many more articles which were cited fewer than 10 times).
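The Hirsch-style calculation behind the h5 index can be sketched in a few lines. The citation counts below are invented purely for illustration, not real data for any journal:

```python
def h_index(citations):
    """Return the largest h such that h items each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank  # the top `rank` articles all have >= rank citations
        else:
            break
    return h

# Citation counts for each article a journal published in the 5-year window
# (made-up numbers):
cites = [25, 14, 12, 10, 10, 9, 3, 2, 1, 0]
print(h_index(cites))  # prints 6: six articles have at least 6 citations each
```

The same function computes an author's h-index or a journal's h5 — only the set of articles fed in changes.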

I thought it would be interesting to see how this compares to the impact factor for various journals of relevance to perception researchers. So, here is a graph showing the relationship between the two metrics:

As you can see, there’s a strong correlation over the lower end of the range (approximately h5 = 12*IF), but some divergence at the top end. In particular, some journals with very different impact factors have a similar h5 index (e.g. PLoS ONE and Nature Neuroscience). I suspect this is to do with volume of papers published and scope of the journals involved (e.g. PLoS ONE publishes thousands of articles in many disciplines, so it’s not surprising it has a high h5).

Despite the many problems people have with impact factors, the hard reality is that journal metrics are useful for a range of things. I’ll be interested to see how widely the h5 index gets used over the next few years, especially given the strong correlation for the low-to-medium impact journals.

In particular, if I were Thomson I’d be very worried indeed. In the past six months or so, Google have, seemingly out of nowhere, produced viable competitors to both the Journal Citation Reports database and the ResearcherID tool for calculating an individual’s h-index. In fact, for a variety of reasons I prefer Google’s versions (they’re free, not behind an annoying paywall, much faster, more transparent, and my h-index is higher according to Google!). Although Thomson may have pioneered bibliometrics, remember that Google didn’t invent internet search – they just did it better than everyone else…


8 Responses to Google h5 vs Thomson Impact Factor

  1. Virtually everybody’s h-index is higher on Google, mainly thanks to its rather indiscriminate nature. Not mine, I hasten to add. However, I agree wholeheartedly with your analysis. Thomson should be quaking in their boots right now.

  2. bakerdh says:

    Interestingly, mine is 1 (H-unit?) higher on Google Scholar, because it attributes cites to JoV papers more reliably than Web of Knowledge. This comes down to JoV’s system of starting each article on page 1 while also assigning article numbers (which get lost in many cases); for one of my papers Thomson counts 3 cites, but Google counts 12! I wonder if this will also hurt JoV’s impact factor – I don’t know enough about the fine details of how it is calculated to tell.

    • Have you tried Scopus? I found it so riddled with missing citations and duplicated authors as to be thoroughly useless for computing any statistic. Probably because it is run by Elsevier, and they probably only care about promoting their own publications!

  3. […] Very similar to ResearcherID, but it works better. Setting the whole thing up took literally under a minute – it found all my papers with only one false positive, which was easily discarded. It automatically adds new papers within a few days of being published. It’s more inclusive for citations, as it indexes the whole of the internet, rather than Thomson’s more limited database. This means my H-index is slightly higher than in ResearcherID. You can add your co-authors if they’re signed up too, and because it’s Google it’s free, and doesn’t require endless sign ins. The only thing I can think of to improve it would be a facility to ‘follow’ other people’s citations and H-index. This sort of exists, but it’s for email notifications – I’d prefer a summary on a web page instead. Otherwise, it’s great – Google have really outdone themselves with this and their journal citation tool (see my previous post). […]

  4. P Hallinger says:

    Everything about Thomson reeks of a lack of transparency! Who appointed them lords of the scholarly productivity domain? What you didn’t mention is that their metrics are restricted to an arbitrary set of journals. Google trawls the real world. More valid in my opinion.

  5. […] by Jorge Hirsch, the h5-index has been made popular by Google Scholar’s Metrics. The score on the h-5 index reflects that during the past five years, a journal has published [score no.] […]

  6. […] to change the current ranking of top journals (at least in plant science and chemistry, but see this other analysis). The strategy of H5 is advantageous because it doesn’t put pressure on editors to reject papers […]
