Past Forward

The idiocy behind university rankings

08:14 AM June 13, 2013

“Might a financially constrained university employ an expensive Nobel laureate at the expense of several postdoctorates in a field with more pressing needs in order to improve its ranking position? Might an Asian university force staff to publish in high-profile English-language journals when local needs would be better served through local research outlets?” So asks Phil Baty, editor of the Times Higher Education (THE) World University Rankings.

The question should have introduced the news item published in the national dailies yesterday, which swallowed hook, line and sinker the latest formal-sounding Quacquarelli Symonds (QS) world ranking survey of Asian universities. Instead, these newspapers even pointed out that universities that figured in a previous year did not figure this time, as if they had failed.

That’s what happens when journalists pretend to write about educational rankings without even calling these universities to find out whether they participated in the QS survey in the first place. I know for a fact that the University of San Carlos (USC) consistently refuses to participate in such surveys, politely turning down any requests for data and information. Why? Because such surveys are highly suspect. (The QS survey of the Philippines by subject, for example, identified the University of San Carlos as a top university in linguistics and in some other courses that USC does not even offer!)


The Quacquarelli name alone almost sounds like “quack,” doesn’t it? QS used to be the data supplier of THE until 2009, when THE decided to go its own way and develop a better methodology using data from Thomson Reuters. That alone should have warned the newspapers that immediately report QS surveys as if they represented the god of higher education. QS uses six indicators to determine its own version of university rankings: academic reputation (40 per cent), an employer survey (10 per cent), citations (20 per cent), staff-student ratio (20 per cent) and the proportion of international staff and students on campus (5 per cent each). By these criteria alone, Mapua Institute of Technology, the only school whose engineering programs are internationally recognized under the Washington Accord, will not qualify unless it submits data that only its administration can supply. The same is true of USC, which never gave away information on its staff-student ratio or its international staff and students, nor submitted to an employer survey.


And how does one measure academic reputation if the institution does not supply the surveyor with its list of accomplishments, bar and board topnotchers, and so on? Do you expect QS to comb newspapers all over Asia for news about this or that university? It is the universities themselves that have to supply the data. That is why, other than the Manila-based universities that land in the press, the provincial universities that did not bother with these highly questionable surveys never made it to the list.

In fact, the USC administration was quite surprised to find itself on a previous QS list when it had never participated in the survey!

Worse, these university rankings end up putting all higher education institutions into one uniform box, devoid of diversity. They push university administrators to tailor their institutions to four or five extremely narrow ranking criteria. These surveys idiotically see the world as one level playing field, as if the tuition fees that finance the big universities in Manila were the same as those charged in the provinces. The surveyors should be surprised that provincial universities charging half or even a quarter of the tuition that De La Salle or Ateneo charges can still produce topnotchers and the best students in their fields.

Writ large, these surveys stupidly assume that Philippine universities can eventually compete with, say, the National University of Singapore (NUS), probably the most heavily government-subsidized school in the world, whose teaching force is pirated from all over the world, mostly from Cambridge and Oxford, because it can afford to do so. The damage these surveys do is to pit one university against another, telling parents and students to choose on the basis of opportunistic surveys full of missing data. And newspapers print them whole without even asking why.

The world of higher education, unfortunately for QS and all these pretentious surveys, is as varied as the life of a tropical rainforest. A world of uniformity, where all universities fall in line on the basis of a survey prepared by people from a highly developed country, will not augur well for the uniqueness that each institution needs to be effective in its niche and environment.

Indeed, Harvard, Oxford and all the top universities, including NUS, where I spent a few months on fellowship, are way up there on top. But at the end of the day, what matters most is how universities are seen by the local communities they serve, where they are located and how they have contributed to improving the lives around them, not how newspapers deride them on the basis of one survey.
