Global rankings: the age of deference is coming to an end

  • Feb 2024
  • Modified 10 Jul
  • Stefan Penders
  • Claartje Chajes

Source: University World News
By Richard Holmes

Recently, criticism of global university rankings has intensified. The causes and the possible consequences of this discontent are diverse and the future is hard to see, but it is a good bet that, in a few years, the international ranking scene will look very different.

As is often noted, global rankings began in 2003 with the first edition of the Academic Ranking of World Universities (ARWU) by Shanghai Jiao Tong University. This was two decades after the US News & World Report (US News) rankings of American colleges and universities became a powerful force in American higher education.

Since then, rankings have flourished. In addition to conventional rankings of research and those that combine research metrics with indicators that might be related to teaching and learning, we now have rankings of web activity, sustainability, employability, third missions and social impact.

For those with jaded palates, Times Higher Education (THE) is planning interdisciplinary and online learning rankings, which no doubt will be very exciting and awesome.

Rankings seem to be everywhere. They are used to attract students and talented researchers, assess research proficiency and review institutional strategy. They have become an element in migration policy in the United Kingdom, the Netherlands, Denmark, mainland China and Hong Kong and have even been used to police universities’ publicity campaigns.

Recently Ulster University was prevented from calling itself “world leading” because of its lowly position in the THE and other rankings.

They have helped inform students and others about how universities are regarded by society and peers and the quality and quantity of research they produce or support. They can, if they are transparent and reasonably accurate, alert the public to the deficiencies or achievements of institutions or systems, especially in research and innovation.

Some negative aspects

But there are negative aspects, sometimes very negative. Rankings have led universities to over-invest in metrics that might boost their scores, such as participation in a few multi-institution projects that, until this year, contributed disproportionately to success in the THE World University Rankings.

The well-known rankings are often subject to methodological changes that can harm or benefit universities, heaping undeserved blame or praise on university administrators.

Even if there is no publicly declared methodological change, puzzling fluctuations can be produced by recalibration, re-centering, procedural tweaking and the influx of new institutions.

Quite a few careers have come to grief after an inexplicable collapse in ranking scores, while some universities have risen to undeserved and sometimes temporary prominence with the help of performance in one or more of the big-league tables.

That said, we should remember that a world without some sort of comparative external assessment of universities is difficult to imagine. Before the US News rankings in the early 1980s, there was an unspoken ranking that proclaimed the supremacy of Harvard and the rest of the Ivy League and its imitators.

Similarly, there was a consensus in England about the relative status of different institutions, led, of course, by Oxford and Cambridge universities, that did not always reflect reality.

From the beginning, global rankings have been subject to sustained criticism in academic journals, conferences, social media and blogs.

Much of that is related to methodological issues such as combining disparate metrics into a single score or rank or identifying precise proxies for fluid and culturally dependent concepts like teaching quality.

There is also substantial literature that has examined the cultural biases of the rankings and their links to the hegemony of the Global North. A few observers have also detailed the errors and inadequacies of the rankings, particularly those published by THE.

Until recently, this criticism did not seem to have a significant impact on higher education administrators, the media or government agencies. Universities welcomed the latest results when they showed an ascent and berated hapless researchers and analysts when they showed a decline.

Rankings were increasingly politicised as governments took credit when their flagships did well and opposition parties and factions gloated when they did badly. A classic example was Universiti Malaya scoring brilliantly on the first edition of the THES-QS world rankings, as they were then known, in 2004 and then crashing down in 2005, all because of a basic error by the rankers and the correction of that error.

The rankings, especially those from the THE stable, have been very popular with the higher education establishment. It is surely a sign of the decline of Western universities that they have been all too happy to celebrate their achievements in rankings that are opaque and, in some respects, technically deficient.

A more critical phase

It seems, however, that the days of uncritical acceptance of the big three global rankings – THE, Quacquarelli Symonds (QS) and Shanghai, or maybe four if we count US News’ Best Global Universities – are coming to an end. There is unease everywhere, but it seems more intense in Asia. Meanwhile, there is now a widespread movement in the Global North to rethink the whole business of rankings, perhaps even to do away with them altogether.

Underlying these trends is the growing intellectual capacity of China and perhaps other parts of Asia. This has led to the big commercial rankers seeking new metrics that will attract international students and revamping their methods to compensate for the declining or stagnating research output and quality of many institutions of the Global North.

Meanwhile, many universities all over the world, but especially in Asia, are thinking about opting out of the ranking system or looking at alternatives or reforms to conventional league tables.

Russia

Russia jumped aboard the rankings train in 2013 when it announced its 5-100 Project to get five universities into the top 100 of the three well-known rankings. In the end, the project was not very successful, although it did lead to some improvements, and eventually fizzled out.

Russia has developed rankings of its own. The Moscow International University Ranking (MosIUR) attempts to assess the third mission of universities and their impact on society. Russian universities do noticeably better here, with three in the top 100.

Unfortunately, MosIUR does not provide any breakdown of the indicator scores so the sources of the achievements of Russian universities, or any others, cannot be identified.

Round University Ranking (RUR) has moved from Moscow to Tbilisi, Georgia, although it still seems to be perceived as a Russian ranking and no longer receives data from Clarivate. RUR was originally modelled on the THE rankings, but it has added some interesting modifications, including replacing Clarivate's survey data with statistics from social media and using Lens instead of Web of Science to measure publications and citations.

The Russo-Ukrainian War has deepened the gulf between Russia and the West with THE, QS, Webometrics and U-Multirank imposing various sanctions, such as withholding indicator ranks or ceasing consulting and benchmarking projects.

India

In 2020 the leading Indian Institutes of Technology (IITs) announced that they would no longer submit data to the THE world rankings, citing a lack of transparency, year-to-year fluctuations and high and questionable scores for small and obscure Indian institutions.

THE appears to be trying to win them back. In March of last year it delivered a presentation to the leading IITs about forthcoming changes, although these institutions were still absent from the recently published rankings.

Meanwhile, India has focused on developing its national ranking, the National Institutional Ranking Framework (NIRF), which includes metrics for outreach and inclusivity and graduation outcomes in addition to the standard measures of research and resources.

China

China was once an eager partner with global rankings. A few years ago, a Chinese official proclaimed that Phil Baty, head of THE rankings, was “education secretary of the world”.

Things have soured since then. Notably, China has almost completely ignored THE's Impact Rankings. In 2022 there were just 13 mainland Chinese universities in those rankings, led by Fudan University, far fewer than from Iran or Iraq. By 2023 that meagre number had shrunk to seven and Fudan had withdrawn.

Recently QS began publishing a sustainability ranking, which is now embedded in their World University Rankings. It is QS policy to assess universities whether they like it or not so China does not have the option of withdrawal. Chinese universities have not performed well here, perhaps because they have not submitted sufficient data.

South Korea

South Korea’s relationship with global rankings has also been mixed. Back in 2004 the president of Yonsei University, Jung Chang-young, proclaimed that national rankings did not mean anything and that he aimed to turn the university into a “world-renowned university”.

But despite this enthusiasm, Yonsei did badly in the THES-QS rankings, as they were then called. In 2006 the university ranked 486th compared to 150th for its arch-rival Korea University.

The president was held responsible and was, like other leaders of failing universities, attacked by the faculty for this poor performance. He resigned in 2007 after an unrelated admissions scandal.

Yonsei languished in the THE world rankings. In 2014-15 it was only in the 201-225 band along with Korea University, although Seoul National University (SNU), the Korea Advanced Institute of Science and Technology (KAIST) and Pohang University of Science and Technology (POSTECH) did very much better.

Then, in 2015-16 there was a catastrophic fall for Korean universities as a result of THE’s methodological changes, including the exclusion of citations of mega-papers with thousands of authors and institutions, although there was a recovery by 2023-24.

By contrast, South Korean universities continued to excel in the QS rankings. In 2022-23 there were six in the top 100, with SNU in 29th place. Disaster came in 2023 when SNU fell to 41st place and all the Korean universities fell except for Sejong University.

This followed another set of methodological changes, including reducing the weighting for the faculty-student ratio and the academic survey and increasing that for the employer survey. In addition, three new metrics were introduced: employment outcomes, sustainability and international research network.

The sustainability indicator represents a big departure for QS since it requires a substantial investment of time and effort from institutions, with over 50 bits of data to be submitted. The single-digit scores of many reputable Korean institutions suggest that they have not been submitting data or at least not submitting very much.

The international research network indicator measures the countries, rather than the institutions, that universities collaborate with, and is considered to favour universities in the Global North. Last year, QS published scores for this metric without standardisation, although they did not count towards the overall ranking. This year they were counted and standardised, leading to a drop in scores for many universities.

Fifty-two universities have now formed a University Rankings Forum of Korea and have threatened an unprecedented ‘boycott’ of the QS rankings by not submitting data, although QS’s procedures will allow them to obtain such data from public sources.

Southeast Asia

Further signs of discontent come from Southeast Asia. Malaysian universities and government agencies have watched the rankings carefully and Malaysian academics and recruiters have been represented disproportionately in the QS academic and employer surveys.

Malaysian universities have done dramatically better in the QS rankings than in the Shanghai or University Ranking by Academic Performance (URAP) rankings (the latter produced by the Middle East Technical University in Ankara), largely as a result of high scores in the surveys.

The University of Malaya, for example, was 65th in the most recent QS world rankings, but in the 301-350 band in the THE world rankings, 401-500 in the Shanghai Rankings, and 282nd in URAP.

In this year’s QS world rankings, some Malaysian universities have continued to do well, but others have received low or very low scores for sustainability, suggesting that they too are less than enthusiastic about the new metric.

Indonesian academics have noted that their universities are sparsely represented in the conventional rankings, although some have performed well in the GreenMetric rankings published by Universitas Indonesia.

Dissatisfaction with the current global rankings has led AppliedHE, a Singapore-based company, to start rankings of public and private universities in the ASEAN (Association of Southeast Asian Nations) region.

These include metrics relating to teaching and learning; employability; research, which uses data from Google Scholar; community engagement; internationalisation; and institutional reputation.

The latest rankings show strong performances from institutions that have been overlooked by the global rankings. These include Infrastructure University Kuala Lumpur, Malaysia, and Paragon International University, Cambodia, in the Private University Ranking, and IPB University, Indonesia, and Universiti Teknologi Brunei in the Public University Ranking. AppliedHE is planning to expand into the rest of Asia soon.

The Arab region

The Arab region has been concerned for some time with some anomalous results in the THE World and Arab University Rankings. There were cases of places such as Alfaisal University, Aswan University and An-Najah National University leaping ahead of recognised world-class contenders.

Concern was increased by the publication of THE’s regional rankings in late 2023 at a meeting in Abu Dhabi, where universities in the United Arab Emirates advanced while leading Egyptian and Saudi institutions fell back. This was reminiscent of some previous summits where methodological tweaking sometimes favoured the host country.

Arab countries have also developed a new regional ranking, the Arab Ranking of Universities, supported by the General Secretariat of the Arab League, the Arab League of Educational, Cultural and Scientific Organisation and the Association of Arab Universities. There are five indicator groups: teaching and learning; scientific research; creativity and innovation; international and local cooperation; and community service.

This ranking appears to be well regarded in Egypt, which contributes five universities to the top 10. By contrast, some leading Saudi universities are absent, probably because they are for the moment satisfied with their performance in the global rankings and see no need for additional assessment.

Western Europe

Meanwhile, there are also signs of growing disaffection in Europe. Last October, Utrecht University announced that it would no longer cooperate with the THE world rankings, claiming they emphasised competition at the expense of high-quality research and collaboration. It is reported that other Dutch universities may follow suit.

This follows a paper by ranking experts for the board of the Universities of the Netherlands that recommended that universities should support alternatives to standard league tables such as U-Multirank, not use rankings for evaluations or distribution of resources and make public any data submitted for ranking purposes.

This was part of a critical stance towards rankings that included signing the Agreement on Reforming Research Assessment and the San Francisco Declaration on Research Assessment, and supporting the More than Our Rank initiative. Utrecht is apparently not boycotting other rankings that use institutional data, such as QS, the Round University Ranking and the US News global rankings.

Meanwhile, students at Trinity College Dublin and the Union of Students in Ireland have called for an investigation into the college’s relationship with the rankings, and a withdrawal if possible. Grievances include unnecessary competition, social exclusion and the entrenchment of colonial legacies. It is likely that they will be followed by students and faculty at other universities in Western Europe.

United States

Across the Atlantic, discontent has been mainly directed at national league tables, in particular law and medical school rankings, and has so far bypassed the global rankings.

Recently Yale, the University of California, Berkeley, and other universities declared their intention to shun, ignore or withdraw from the US News law school rankings. Yale and the rest of the Ivy League have been well served by the US News rankings, so this would seem, at first sight, a remarkable and unusual act of self-denial.

That, however, is not really the case. The elite schools are trying to create rankings that measure institutions’ resources from which to provide fellowships or maintain debt forgiveness programmes and admit students on the basis of identity and ideological conformity rather than academic merit. So far American universities have not had much to say about global rankings, but that may change.

An interesting future

The decline of the globalised economy and the globalised system of research and higher education is reflected in controversies about rankings. It looks like the THE-QS-Shanghai triopoly is fading. As baseball player Yogi Berra reportedly said, making predictions, especially about the future, is difficult, but it does seem that we are going to see more regional and national rankings and perhaps more innovative methodologies.

Countries and universities will continue to try to influence the rankings and shop around for the rankings that best suit their interests. The age of deference to the global rankings is coming to an end and the future will be, to make one tentative prediction, rather interesting.

Richard Holmes is an independent writer and consultant and the producer of the blog University Ranking Watch.