University Rankings

The ranking of universities is a thorny topic in the academic world. You can read endless articles and blogs about it. The release of a new ranking usually makes the news whether the local universities go up or go down – there's a story either way. What does bother me is what these rankings are based on. Sometimes we might go up or down simply because the compilers changed how the ranking is calculated!

Do world university rankings matter? Does anyone make a decision based on these rankings? Do students decide where to study based on rankings? Here's a Nature article on it. The usual problems with rankings appear – what started out as an entertaining newspaper story ends up being used for things it was never intended for. On March 15, 2012, the following appeared in the Irish Times.

The Department of Education has said that ranking systems and league tables rating educational institutions should be interpreted “with caution”.

Responding to the publication of the Times Higher Education World Reputation Rankings today, in which Ireland was not represented in the top 100 universities in the world, the spokesman said differences in criteria between ranking systems can affect the outcomes of such league tables.

“Notwithstanding these reservations, it is recognised that league tables are referenced by international investors, employers and students as a marker of quality across systems and as such they cannot be ignored,” the spokesman said.

Good to see that someone else is aware that different criteria will produce different rankings. I can believe that investors and employers might use rankings; I would really like to know whether students actually use them. I see rankings as a crude measure of something, and one should be aware of exactly what is being measured. Certainly rankings are a big business and a lot of money is at stake. I do have my doubts that the enormous amount of data on which rankings are based is correctly collected, tabulated and normalized. Data collection errors have been made before. Where does the data come from anyway? Is some of it provided by the universities themselves?

An expert group has even published a paper on best practice in compiling rankings. The European Union launched a project to design a new ranking system and put it out for tender. I'm not sure what is happening with it, but the website is here, and it says the first results of U-Multirank will be out in 2013.

Anyway, what criteria are rankings based on? It depends on which ranking you look at. There are lots of different rankings, and more are appearing all the time. Currently there are probably two major world rankings: the QS ranking, which comes from Quacquarelli Symonds, and the Times Higher Education (THES) ranking, which comes from Times Higher Education and Thomson Reuters. These two rankings used to be one and the same, the THES-QS ranking, until they split in 2009. THES now use Thomson Reuters for their citation numbers, and QS use Scopus.

The THES ranking makes it clear what their rankings are based on – which is a good thing.  The same cannot be said for the QS rankings. I decided to delve a little deeper and have a look at my university (University College Dublin) under the most recent (2011) THES ranking.

In the 2011 THES rankings UCD was ranked 159 in the world, with a score of 45.9 (tied with two Dutch universities). The number 1 ranked university (California Institute of Technology) had a score of 94.8. The first question is: what do these numbers even mean? The score of 45.9 is a weighted average of scores in five categories. Not all of the five categories count equally; some have a higher weighting than others. Here are the scores of UCD in the five categories.

Teaching: 25.2 (30% of total)
International Outlook: 83.2 (7.5% of total)
Industry Income: 32.3 (2.5% of total)
Research: 23.7 (30% of total)
Citations: 80.5 (30% of total)

Are you any the wiser? Second question: what do these numbers mean? Before going into that, let's check the UCD overall score. According to the weightings given, we should compute

(0.3)(25.2) + (0.075)(83.2) + (0.025)(32.3) + (0.3)(23.7) + (0.3)(80.5)

and this turns out to be equal to 45.8675. I checked this myself on my calculator. THES gave UCD a score of 45.9, so it looks like they rounded up to one decimal place.
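
As a quick sanity check, here is a short Python snippet that reproduces this weighted average from the published 2011 category scores and weightings quoted above (this is just the arithmetic spelled out, not THES's actual code):

# Reproduce UCD's 2011 THES overall score from the published
# category scores and weightings: a simple weighted average.
weights = {
    "Teaching": 0.30,
    "International Outlook": 0.075,
    "Industry Income": 0.025,
    "Research": 0.30,
    "Citations": 0.30,
}
ucd_scores = {
    "Teaching": 25.2,
    "International Outlook": 83.2,
    "Industry Income": 32.3,
    "Research": 23.7,
    "Citations": 80.5,
}

overall = sum(weights[c] * ucd_scores[c] for c in weights)
print(overall)            # approximately 45.8675
print(round(overall, 1))  # 45.9, matching the published score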

The THES website gives the scores within each category but doesn’t give the rankings within each category, by the way.

The next question is: how are the scores for each category calculated?  THES provides the following diagram on their website:

Well, ok. This gives a breakdown of each category into subcategories. So we must ask: how do they calculate the score for each subcategory? There is an article on the website that outlines this breakdown but doesn't give the full details. I suppose the full details are not made public.

The three main categories are teaching, research and citations, each with a 30% weighting. Let's look at citations, which, by the way, was worth 32.5% last year. UCD did well here, with a score of 80.5. The outline doesn't say how the score is calculated, but it does say the following:

The data are drawn from the 12,000 academic journals indexed by Thomson Reuters’ Web of Science database and include all indexed journals published in the five years between 2005 and 2009. Citations to these papers made in the six years from 2005 to 2010 are collected – increasing the range by an additional year compared with 2010-11, thus improving the stability of the results and decreasing the impact of exceptionally highly cited papers on institutional scores.

Again, this does not say how the number is arrived at. Anyway, it is clear that citations are important when it comes to this ranking. I'll write another post about citations.

I also note that 18 of the 30 percentage points for research come from a "reputational survey". What does that mean? And 15 of the 30 points for teaching come from the same source. This refers to the Academic Reputation Survey that THES conducts. As far as I know this involves sending an email to ALL university professors around the world and asking them to fill in a survey giving their opinions of other universities. I think this is highly unscientific. This seems to me to be essentially the same as the Academic Peer Review used in the QS rankings, which I will now discuss.

Let's move on to the QS rankings. UCD was ranked 134 in the world in the 2011 QS rankings, with a score of 56.9. Again we should ask: what does this number mean and how was it calculated? I have not been able to find this information directly on their website; however, I found it using a search engine. Here are the categories and their weightings:

Academic Peer Review 40%
Global Employer Review 10%
Citations Per Faculty 20%
International Student Ratio 5%
International Faculty Ratio 5%
Faculty Student Ratio 20%

I could not find the breakdown of scores in each category for UCD (or any university).

Clearly the "academic peer review" is a huge factor, counting for 40%. What does this mean? As far as I know, this score comes from the same kind of survey described above: an email is sent to ALL university professors around the world asking them to give their opinions of other universities. Again, I think this is highly unscientific. Philip Altbach of Boston College wrote, "Whether the QS rankings should be taken seriously by the higher education community is questionable." There is an interview in today's newspaper with the TCD provost in which he is asked why TCD dropped to number 65 in the world rankings. This refers to the QS rankings. TCD is ranked 117 in the 2011 THES ranking, with a score of 51.1.

Both THES and QS say that they normalize scores using z-scores (how many sigmas you are from mu), although they have only been doing this since 2007. As noted above, THES use Thomson Reuters for their citation numbers, and QS use Scopus.
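
For what it's worth, z-score normalization is simple in principle: each raw indicator is rescaled by how many standard deviations it lies from the mean across all ranked institutions. Here is a minimal Python sketch using made-up numbers (purely illustrative, not the rankers' actual pipeline):

import statistics

# Hypothetical raw values on one indicator for a handful of universities.
raw = {"Univ A": 3.1, "Univ B": 5.4, "Univ C": 2.2, "Univ D": 4.8, "Univ E": 3.9}

mu = statistics.mean(raw.values())      # the mean across all institutions
sigma = statistics.stdev(raw.values())  # the standard deviation (sigma)

# z-score: how many sigmas each university sits from the mean.
z_scores = {name: (value - mu) / sigma for name, value in raw.items()}
print(z_scores)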

A third widely used ranking system is the Academic Ranking of World Universities (ARWU), also known as the Shanghai rankings. They don't appear to have made any announcement in 2011. In 2010 they placed UCD in the 301-400 band. Clearly not worth further comment! As far as I know they use only research criteria, such as citations, funding, the number of Nobel prizewinners on the staff, and publications in Nature and Science.

In the USA there is an annual ranking published by US News and World Report, which has been running since 1983. It is not without controversy; see the Wikipedia article, for example.

The danger of rankings is that universities start to make decisions in order to move up the rankings, instead of making decisions based on what they actually want to do. One might think that these two aims go hand in hand, but they don't. Reed College pulled out of the US News and World Report rankings in 1995, and its president said, "by far the most important consequence of sitting out the rankings game, however, is the freedom to pursue our own educational philosophy, not that of some newsmagazine."

Another danger is that governments will make funding decisions for universities based on rankings.

However, rankings are not going to go away; we have to live with them. What we can do is try to ensure they are done properly, and that people know what they measure, because they don't measure the most important things.
