Friday, April 27, 2007

How Rankings Lead to a Decline in Quality

Geoffrey Alderman, currently a visiting research fellow at the Institute of Historical Research, University of London, has an article in the Guardian about the decline of standards in British universities. He refers to a case at Bournemouth University where officials overrode a decision by a professor and examination board to fail thirteen students. Apparently, the officials thought it unreasonable that students were required to do any reading to pass the course. He also comments on the remarkable increase in the number of first-class degrees at the University of Liverpool. Professor Alderman is clear that part of the problem is with the current obsession with rankings:


"Part of the answer lies in the league-table culture that now permeates the sector. The more firsts and upper seconds a university awards, the higher its ranking is likely to be. So each university looks closely at the grading criteria used by its league-table near rivals, and if they are found to be using more lenient grading schemes, the argument is put about that "peer" institutions must do the same. The upholding of academic standards is thus replaced by a grotesque "bidding" game, in which standards are inevitably sacrificed on the alter of public image - as reflected in newspaper rankings."

Similarly, it seems that in the US large numbers of students are being pushed through universities for no other reason than to improve graduation rates and therefore scores on the US News and World Report rankings.

Tuesday, April 24, 2007

Comparison of the THES-QS "Peer Review" and Citations per Faculty Scores

QS Quacquarelli Symonds, the consultants responsible for the THES-QS World University Rankings, have now placed data for 540 universities, complete with scores for the various components, on their topuniversities website (registration required). This reveals more dramatically than before the disparity between some universities' scores on the "peer review" and their scores for citations per faculty, a measure of research quality. Below are the top 20 universities in the world according to the THES-QS "peer review" by research-active academics, who were asked to select the universities that are best for research. In round brackets to the right is each university's position in the 2006 rankings according to citations per faculty.

Notice that some universities, including Sydney, Melbourne, the Australian National University and the National University of Singapore, perform dramatically better on the peer review than on citations per faculty. Melbourne, rated the tenth best university in the world for research by the THES-QS peer reviewers, is 189th for citations per faculty, while the National University of Singapore, in twelfth place on the peer review, comes in at 170th. The most devastating disparity is for Peking University, 11th on the "peer review" and 352nd for citations per faculty, behind, among others, Catania, Brunel, Sao Paulo, Strathclyde and Jyväskylä. Once again, this raises the question of how universities whose research is regarded so lightly by other researchers could be voted among the best for research. Oxford, Cambridge and Imperial College London are substantially overrated by the peer review. Kyoto is somewhat overrated, while the American universities, with the exception of Chicago, occupy roughly the places that their citations per faculty positions would indicate.

Of course, part of the problem could be with the citations per faculty measure itself. I am fairly confident that the data for citations, which is collected by Evidence Ltd, is accurate, but I am less certain about the number of faculty. I have noted already that if a university increases its score for student-faculty ratio by increasing the reported number of faculty, it would suffer a drop in the citations per faculty score. For most universities the trade-off would be worth it, since the gap between the very good and the good is much greater for citations than for student-faculty ratio. So, if there has been some inflating of the number of faculty, however and by whomever it was done, then this would depress the figures for citations per faculty.
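To make the trade-off concrete, here is a minimal sketch in Python using invented figures; the citation, student and faculty counts below are hypothetical, not actual ranking data:

```python
# Hypothetical illustration of the trade-off described above.
# All figures are invented for illustration; they are not real ranking data.

citations = 50_000   # total citations credited to the university
students = 20_000    # enrolled students

for faculty in (1_000, 1_500):   # reported faculty, before and after inflation
    student_faculty_ratio = students / faculty
    citations_per_faculty = citations / faculty
    print(f"faculty = {faculty}: "
          f"{student_faculty_ratio:.1f} students per faculty member, "
          f"{citations_per_faculty:.1f} citations per faculty member")

# Reporting 500 extra faculty improves the student-faculty ratio
# (20.0 -> 13.3) but cuts citations per faculty (50.0 -> 33.3):
# a gain on one indicator is paid for on the other.
```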

I have therefore also included the positions of these universities according to their scores for articles in the Science Citation Index-Expanded and Social Science Citation Index in 2005, as used in the Shanghai Jiao Tong rankings. This is not the same as the THES measure: it covers one year only and is based on the number of papers, not the number of citations, so it measures overall research output of a certain minimum quality rather than the impact of that research on other researchers. The position according to this index is indicated in square brackets.

We can see that Cambridge and Oxford do not do as badly here as they did on citations per faculty. Perhaps they produced research characterised by quantity more than quality, or perhaps the difference is a result of inflated faculty numbers. Similarly, the performance of Peking, the National University of Singapore, Melbourne and Sydney is not as mediocre on this measure as it is on THES's citations per faculty.

Nonetheless, the disparity persists: Oxford, Cambridge, Imperial College and several universities in Asia and Australia are still overrated by the THES-QS peer review.

1. Cambridge (46) [15]
2. Oxford (63) [17]
3. Harvard (2) [1]
4. Berkeley (7) [9]
5. Stanford (3) [10]
6. MIT (4) [29]
7. Yale (20) [27]
8. Australian National University (83) [125]
9. Tokyo (15) [2]
10. Melbourne (189) [52]
11. Peking (352) [50]
12. National University of Singapore (170) [111]
13. Princeton (10) [96]
14. Imperial College London (95) [23]
15. Sydney (171) [46]
16. Toronto (18) [3]
17. Kyoto (42) [8]
18. Cornell (16)
19. UCLA (19) [21]
20. Chicago (47) [55]
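For readers who want to quantify the disparity, here is a short Python sketch using the ranks exactly as listed above (Cornell's papers rank is marked as missing, since none is given). A large positive gap means the peer review places a university far higher than its citations per faculty do:

```python
# Rank comparison for the THES-QS peer-review top 20, transcribed from the
# list above: (peer_review_rank, citations_per_faculty_rank, papers_rank).
ranks = {
    "Cambridge": (1, 46, 15),
    "Oxford": (2, 63, 17),
    "Harvard": (3, 2, 1),
    "Berkeley": (4, 7, 9),
    "Stanford": (5, 3, 10),
    "MIT": (6, 4, 29),
    "Yale": (7, 20, 27),
    "Australian National University": (8, 83, 125),
    "Tokyo": (9, 15, 2),
    "Melbourne": (10, 189, 52),
    "Peking": (11, 352, 50),
    "National University of Singapore": (12, 170, 111),
    "Princeton": (13, 10, 96),
    "Imperial College London": (14, 95, 23),
    "Sydney": (15, 171, 46),
    "Toronto": (16, 18, 3),
    "Kyoto": (17, 42, 8),
    "Cornell": (18, 16, None),   # no papers rank given above
    "UCLA": (19, 19, 21),
    "Chicago": (20, 47, 55),
}

# Gap between citations-per-faculty rank and peer-review rank; the larger
# the gap, the more the peer review flatters the university.
gaps = {name: cites - peer for name, (peer, cites, _) in ranks.items()}
for name, gap in sorted(gaps.items(), key=lambda kv: -kv[1])[:5]:
    print(f"{name}: citations rank exceeds peer-review rank by {gap}")
```

Run on these figures, the five largest gaps belong to Peking, Melbourne, the National University of Singapore, Sydney and Imperial College London, which matches the pattern described above.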

Monday, April 23, 2007

Book Review in the THES

A new book on university rankings, The World-Class University and Ranking: Aiming Beyond Status, has appeared and has been reviewed by Martin Ince in the Times Higher Education Supplement (THES). It is edited by Liu Nian Cai of Shanghai Jiao Tong University and Jan Sadlak. You can read the review here (subscription required). I hope eventually to review the book myself.

I must admit to being rather amused by one comment by Ince. He says:

"Although one of its editors is the director of the Shanghai rankings, The World-Class University and Ranking largely reflects university concerns at being ranked. Many contributors regard ranking as an unwelcome new pressure on academics and institutions. Much is made of the "Berlin principles" for ranking, a verbose and pompous 16-point compilation that includes such tips as "choose indicators according to their relevance and validity". The Shanghai rankings themselves fall at the third principle, the need to recognise diversity, because they rank the world's universities almost exclusively on science research. But the principles are silent on the most important point they should have contained - the need for rankings to be independent and not be produced by universities or education ministries."

I would not argue about the desirability of rankings being independent of university or government bureaucracies but there is far greater danger in rankings that are dominated by the commercial interests of newspapers.

Friday, April 20, 2007

Rankings to be Investigated

The Guardian has announced that the Higher Education Funding Council for England (Hefce) will investigate university league tables. Its chief executive, David Eastwood, said that the council will examine the rankings produced by the Guardian, the Times and the Sunday Times, and whether university policies are influenced by attempts to improve their scores.

The report continues:

'World tables compiled by Shanghai Jiao Tong University and the Times Higher Education Supplement will also be surveyed. The University of Manchester, for example, has made it clear that its strategy is to climb the international rankings, which include factors like the number of Nobel prizewinners. The university has pledged to recruit five Nobel laureates in the next few years.

Prof Eastwood said league tables were now part of the higher education landscape "as one of a number of sources of information available to potential students".

He added: "Hefce has an interest in the availability of high quality useful information for students and the sector's other stakeholders. The league table methodologies are already the subject of debate and academic comment. We plan to commission up-to-date research to explore higher education league table and ranking methodologies and the underlying data, with the intention of stimulating informed discussion.'


Thursday, April 19, 2007

More about Internationalisation

There is a very interesting piece by Ahmad Ismail at highnetworth -- acknowledgement to Education in Malaysia -- that argues that the economic value of an overseas university education for a Malaysian student is minimal. There are, no doubt, going to be questions about the assumptions behind the financial calculations and there are of course other reasons for studying abroad.


Even so, if students themselves typically gain little or nothing economically from studying in another country, if their parents suffer a great deal, and if students or taxpayers in the host country in one way or another have to pick up the tab (at Cornell it takes USD 1,700 to recruit an international student), then one wonders what internationalisation has to do with university quality, and why THES and QS consider it so important.

How Rankings Produce Distortions

A letter to the Cornell Daily Sun from Mao Ye, a student-elected trustee, suggests increasing the recruitment of international students in order to boost the university's position in the US News and World Report rankings.

The question arises of whether the international students would add anything to the quality of the institution. If they would, then surely they would be recruited anyway. But if students are admitted for no other reason than to boost scores in the rankings, they may well contribute to a decline in the overall quality of the student body.


"Two critical ways to improve Cornell’s ranking are to increase the number of applications and to increase the yield rate of admitted students. To achieve this goal, no one can overlook the fact that international applications to all U.S. institutions have recently increased at a very fast pace. For Cornell, the applications from China increased by 42.9 percent in 2005 and 47.5 percent in 2006. We also saw a 40 percent increase in applications from India last year. By my estimation, if international applications continue to grow at the current rate, in 10 years there will be more than 10,000 foreign applications received by the Cornell admissions office. Therefore, good performance in the international market will have a significant positive impact on our ranking in U.S. News and World Report.

How might we get more international students to apply? It’s actually very easy. We can have different versions of application materials, each in various students’ native languages, highlighting Cornell’s achievements in that country and addressing the specific concerns of students from that country. I checked the price and realized we do not need more than $500 to translate the whole application package into Chinese. If we focus translation on the crucial information for Chinese applicants, the cost is as low as $50. Comparatively, this is lower than the cost of recruiting one undergraduate student to a university, which costs an average of $1,700 per student, based on the calculations of Prof. Ronald Ehrenburg, industrial and labor relations. Staff, students, parents and Cornell as a whole will all benefit greatly from this plan."
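The letter's projection is simple compound growth. As a rough check, here is a sketch with an assumed base of about 300 applications a year (the letter quotes growth rates but no starting figure, so the base is purely an assumption for illustration):

```python
# Compound-growth check of the projection in the letter. The base of 300
# applications per year is an assumption; the letter does not give one.
base_applications = 300
growth_rate = 0.42           # roughly the 42.9% / 47.5% rises quoted
years = 10

projected = base_applications * (1 + growth_rate) ** years
print(f"after {years} years: about {projected:,.0f} applications")

# Sustained ~40% annual growth multiplies the base roughly 33-fold over a
# decade, which is how a figure like "more than 10,000 foreign
# applications" can emerge from a base of only a few hundred.
```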