They're a classic case of Goodhart's law — the metrics are easily gamed
The QS World University Rankings were released this week; there was much excitement about the UK having four universities in the top 10. And no doubt all my fellow alumni of King’s College London were thrilled to find that their alma mater had been ranked the 35th best seat of higher education in the world.
They may, however, have been somewhat surprised to realise that (according to the Guardian’s university rankings) they were also the 42nd best in the UK.
Sharp-eyed observers will have noticed that 42 is a bigger number than 35, and that it is logically impossible for there to be 41 better universities in the UK but only 34 better in the world, because the world contains the UK. So: what’s going on?
Rankings like this sound awfully important and objective. But (as my cousin David and I explain in our recent book How to Read Numbers) they conceal an awful lot of information. For instance: every so often, India overtakes the UK to become the fifth largest economy in the world, and people get very upset about it. According to the World Bank’s economic rankings, India is currently ahead of us. Meanwhile, China’s economy is the second biggest in the world, and Japan’s is the third.
That ranking makes it sound as if the difference in size between India and the UK is the same as that between China and Japan; but India’s economy is 2% larger than the UK’s, while China’s is 280% larger than Japan’s — almost four times the size. A footling change in either the UK’s or India’s economy, or indeed a measurement error, could make them switch places, and regularly does; but if Japan were to overtake China, that would be a huge, and shocking, story.
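To see how an ordinal ranking flattens those gaps, here is a toy Python sketch. The figures are illustrative only — chosen so the percentage gaps match the ones quoted above, not taken from official statistics:

```python
# Illustrative GDP figures in trillions of dollars -- invented so that the
# percentage gaps match the article's, NOT official World Bank data.
gdp = {"China": 16.0, "Japan": 4.2, "India": 3.37, "UK": 3.30}

# The ranking alone: each country is simply "one place" above the next.
ranking = sorted(gdp, key=gdp.get, reverse=True)

# The gaps the ranking hides: percentage by which each economy
# exceeds the one ranked immediately below it.
for above, below in zip(ranking, ranking[1:]):
    gap = (gdp[above] - gdp[below]) / gdp[below] * 100
    print(f"{above} is {gap:.0f}% larger than {below}")
```

The ranking reports both pairs as “one place apart”; the percentages show one gap is trivial and the other is vast.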
But at least the size of an economy is an objective fact that could, in theory, be measured. What makes a “best” university? Teaching? Student experience? Research output? Pretty quadrangles?
For instance: the QS World University Rankings give heavy weight (40% of a university’s total score) to “academic reputation”: a survey goes out to 100,000 academics worldwide and they are asked to give their opinion of the “teaching and research quality” at the world’s universities. Since the large majority of those academics will never have visited, for instance, the University of Durham, their input will be based heavily on guesswork.
And there’s nothing set in stone about that 40%: they could have given more weight to student satisfaction, or student/faculty ratio, or research impact. And then the list would look entirely different. The Guardian’s list, indeed, is entirely different, because it uses different metrics: “spend per student”, how many students are in a career 15 months after leaving, and so on. Which is right? Neither; both. It’s an arbitrary choice.
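The arbitrariness is easy to demonstrate with a toy Python sketch: three invented universities, made-up scores, and two equally defensible weightings. None of these numbers or weightings come from QS or the Guardian; they are purely illustrative:

```python
# Three invented universities with made-up scores out of 100
# on three plausible-sounding metrics.
unis = {
    "Uni A": {"reputation": 95, "satisfaction": 60, "spend": 55},
    "Uni B": {"reputation": 70, "satisfaction": 90, "spend": 85},
    "Uni C": {"reputation": 80, "satisfaction": 75, "spend": 80},
}

def rank(weights):
    """Order the universities by a weighted sum of their metric scores."""
    total = lambda name: sum(w * unis[name][m] for m, w in weights.items())
    return sorted(unis, key=total, reverse=True)

# A QS-style table, weighting reputation heavily...
print(rank({"reputation": 0.6, "satisfaction": 0.2, "spend": 0.2}))
# ...versus a Guardian-style table, weighting satisfaction and spend instead.
print(rank({"reputation": 0.2, "satisfaction": 0.4, "spend": 0.4}))
```

Same underlying data, two reasonable-sounding weightings, and the table turns upside down.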
You may reasonably say “who cares?”: every year Oxford, Cambridge, and a few American big hitters (MIT, Harvard, Stanford, etc) make up the top dozen or so, along with Imperial and/or UCL. Does it really matter if the University of Liverpool is 158th or 189th?
But Goodhart’s Law — “when a measure becomes a target, it ceases to be a good measure” — is a thing. In the UK, when schools are given targets of how many A*-C GCSE results they need, they start focusing on the children who are on the cusp between a D and a C, and neglecting the rest.
Similarly, if you rank universities, and then would-be students choose their universities on the basis of those rankings, then universities will try to game those rankings. In fact the formulation of Goodhart’s Law I gave above comes from a 1997 paper on how UK universities game the auditing system. It is a racing certainty that universities will focus on getting their rankings as high as possible, whether or not that actually translates to being better universities.
It’s not that these rankings are entirely meaningless. It is, for instance, probably true that UC Berkeley (32nd on the QS list) is, in some fuzzy but real sense, a “better” university than the University of Nottingham (103rd). But nor are they very informative. And if KCL can be the 35th best in the world on one list and the 42nd best in the UK on another, it’s probably wise not to take any of them all that seriously.