There are various international rankings of universities, each one focusing on a different perspective, as the underlying indicators and their weighting make clear. Some focus mainly on research, while others concentrate on performance in multiple areas, such as teaching and internationalisation. We review the most popular ones below. Some are based only on hard data, in some cases provided by the universities themselves; others also make use of ‘soft’ data in the form of survey results.
ARWU (‘Shanghai’ ranking)
The Academic Ranking of World Universities (ARWU) was the first worldwide ranking of universities. Initially meant to determine the global standing of top Chinese universities, it quickly came to international prominence. It has been published annually since 2003 by Shanghai Jiao Tong University (hence the epithet ‘Shanghai’ ranking). A university’s standing in the general ARWU ranking depends mainly on research-related indicators:
- number of alumni winning Nobel Prizes and Fields Medals (10%)
- number of staff winning Nobel Prizes and Fields Medals (20%)
- number of highly cited researchers in 21 broad categories (20%)
- number of articles published in Nature and Science (20%)
- number of articles indexed in the Science Citation Index-Expanded and the Social Sciences Citation Index (20%)
- per capita performance of a university (10%)
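The weights above amount to a weighted sum of normalised indicator scores. A minimal sketch of that arithmetic is given below; the indicator names and example values are hypothetical, though the weights are those listed above (ARWU normalises each indicator so that the best-scoring institution receives 100).

```python
# Hedged sketch of a weighted composite ranking score.
# The weights match the ARWU list above; the example indicator
# scores are invented values on a 0-100 scale.

ARWU_WEIGHTS = {
    "alumni_prizes": 0.10,
    "staff_prizes": 0.20,
    "highly_cited_researchers": 0.20,
    "nature_science_articles": 0.20,
    "indexed_articles": 0.20,
    "per_capita_performance": 0.10,
}

def composite_score(indicator_scores, weights):
    """Weighted sum of per-indicator scores (each on a 0-100 scale)."""
    return sum(weights[name] * indicator_scores[name] for name in weights)

# Hypothetical example university:
example = {
    "alumni_prizes": 20.0,
    "staff_prizes": 10.0,
    "highly_cited_researchers": 35.0,
    "nature_science_articles": 30.0,
    "indexed_articles": 70.0,
    "per_capita_performance": 40.0,
}

print(round(composite_score(example, ARWU_WEIGHTS), 1))  # 35.0
```

Because the weights sum to 1, the composite stays on the same 0-100 scale as the individual indicators, which is what makes scores comparable across universities.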
A few years ago, rankings in fields (science, engineering, life sciences, medical sciences, social sciences) and subjects (mathematics, physics, chemistry, computer science, economics/business) were added.
Twelve Dutch universities are placed in the top 400 of the 2017 ranking; the sole exception is Tilburg University, which is explicable given its disciplinary profile (note that in the subject ranking for economics and business, Tilburg is in the top 100). The ARWU covers an estimated 1300 universities worldwide. Universities ranked 101st and lower are placed in larger groups of fifty or a hundred, within which they are listed alphabetically.
QS World University Ranking
The QS World University Ranking compares more than 900 top universities on the following six indicators:
- Academic reputation, measured using a global survey of academics (40%)
- Employer reputation, measured using a global survey of employers (10%)
- Citations per faculty score, based on Elsevier’s Scopus database (20%)
- Student-to-faculty ratio (20%)
- International student ratio (5%)
- International faculty ratio (5%)
The QS ranking, which casts its net more broadly than the ARWU, also ranks universities in more than forty subjects. The QS and ARWU rankings order universities differently because each rests on a different set of indicators: the QS ranking is driven mainly by reputation measures, whereas the ARWU is based on measured scientific output.
The Leiden Ranking
The Leiden Ranking is compiled by the Centre for Science and Technology Studies (CWTS) at Leiden University. It goes even further than the ARWU in focusing specifically on research, as it is based solely on bibliometric data (publications and citations) from the Web of Science scientific citation indexing service. That also explains why it focuses mainly on the natural and medical sciences, which traditionally publish in international science journals. The Leiden Ranking is an interactive index of over 900 research universities worldwide. Users can set their own parameters:
- Time period: the first available time period is 2006-2009, with each successive period shifting forward by one year
- Field: one of the five fields of science or all sciences
- Minimum publication output
- Type of indicator: impact or collaboration (based on co-authored publications)
- Impact indicators (top 1%, top 10% and top 50%)
- Collaboration indicators (total no. of publications co-authored with other organisations, no. of publications co-authored with other countries, and no. of co-authored publications based on distance, either <100 km or >5000 km).
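An impact indicator such as the top 10% measure comes in two variants: an absolute count, P(top 10%), and a proportion, PP(top 10%). The sketch below illustrates the arithmetic under simplified assumptions; the citation counts and per-publication thresholds are invented, whereas CWTS derives the real top-10% thresholds per field and year from Web of Science data.

```python
# Hedged sketch of a Leiden-style 'top 10%' impact indicator:
#   P(top 10%)  = number of a university's publications that are among
#                 the 10% most cited in their field and year
#   PP(top 10%) = that number as a proportion of total output
# The data below are invented example values, not real CWTS figures.

def top10_indicators(publications):
    """publications: list of (citations, field_top10_threshold) pairs."""
    in_top10 = sum(1 for cites, threshold in publications if cites >= threshold)
    total = len(publications)
    return in_top10, in_top10 / total  # P(top 10%), PP(top 10%)

# Five hypothetical publications for one university:
pubs = [(120, 50), (8, 40), (60, 55), (3, 30), (45, 45)]
p_top10, pp_top10 = top10_indicators(pubs)
print(p_top10, pp_top10)  # 3 0.6
```

The distinction matters for interpretation: the count favours large universities, while the proportion is size-independent, which is one reason a university's standing can shift considerably between indicators.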
A university’s standing may vary considerably depending on the indicator used, as shown by how the two selected indicators in the table below affect the position of Dutch universities in a number of rankings.
The ‘University-Industry Research Connections’ (UIRC) ranking
In this ranking, the CWTS reports the percentage of publications co-authored by universities and industry (UIC intensity) for a set of 750 research universities. Besides UIC intensity, the UIRC ranking also indicates the percentage of publications co-authored with businesses in the relevant country (% domestic UICs). It does this for all fields of science collectively at the relevant university and for seven individual broad fields of science, i.e. mathematics, computer science and engineering; natural sciences; life sciences; earth and environmental sciences; medical sciences; social sciences; and cognitive sciences.
The most recent UIRC ranking dates from 2014. The CWTS is working on a new methodology and a more user-friendly design.
Times Higher Education Ranking
The Times Higher Education (THE) has published the THE World University Rankings since 2004. Like the QS World University Rankings, it covers a broad spectrum of university activities: teaching (30%), research (30%), citations (30%), industry income (2.5%) and international outlook (7.5%).
The ranking has subject tables in eight fields. A third of the scores is based on the Academic Reputation Survey. The THE methodology has been widely criticised, and because of changes to it the makers strongly advise against direct comparisons with previous years’ World University Rankings.
U-Multirank
A new type of university performance table was launched in 2014 with the financial support of the European Commission. It was developed by a consortium led by the Centre for Higher Education (CHE) in Germany and the Center for Higher Education Policy Studies (CHEPS) and CWTS in the Netherlands. U-Multirank is not so much a ranking as a tool for comparing university performance on five dimensions and thirty aspects, based on information about more than 1300 institutions of higher education. Performance is rated on a scale ranging from very good (A) to weak (E). The dimensions are:
- teaching and learning
- research
- regional engagement
- knowledge transfer
- international orientation
One novelty is that U-Multirank offers a number of ‘readymade’ rankings on its website. One of these focuses on research and research linkages, another on teaching and learning (subdivided into 16 subjects). The results for the Dutch universities in the ‘readymade’ research and research collaboration ranking are presented in a separate file. They show that almost all Dutch universities receive the highest score (A: very good) on the indicators citation rate, research publications (size-normalised) and top cited publications (the proportion of a university’s publications that, compared with other publications in the same field and year, belong to the 10% most frequently cited). TU Delft and TU Eindhoven score an A (very good) on the indicator co-publications with industrial partners. The indicators international joint publications and regional joint publications show a more varied pattern.
Results of various rankings
If we consider the general picture that emerges from the rankings, we see that US and British universities dominate the top of the performance tables. For example, the ARWU has 31 US and seven British universities in the top 50 spots. On the other hand, there are very few British and US universities in the second tier (100-500), even though both countries are home to a vast number of universities. Instead, that tier contains a relative preponderance of universities from the Netherlands and a few other countries (including Switzerland, Sweden and Germany). The rankings show that the Dutch science system generally performs well. If we compare Utrecht University with European universities in the ARWU top 50 by their scores on the underlying indicators, we see wide gaps on some indicators and very narrow ones on others. In fact, Utrecht even has higher scores in some cases.
Besides their standing in a ranking, we can also compare universities by looking at size (based on total budget) and educational function (based on numbers of students enrolled). If we compare Utrecht University with the top 50 European universities in the ARWU, we see that most of the latter have much more money to spend and a much smaller student population than Utrecht (source: ETER database on European higher education).
The table below shows how Dutch universities perform in some of the rankings.