Sunday, August 17, 2008

The Shanghai Rankings 2008



Shanghai Jiao Tong University (SJTU) has just released its rankings for 2008. Compared to the THE-QS rankings, the public response, especially in Asia and Australia, has been slight. This is largely because movement up and down the Shanghai index is minimal, a tribute to its reliability. In contrast, the THE-QS rankings, with their changes in methodology and frequent errors, arouse almost as much interest as a country's performance in the Olympics.



Still, it is instructive to check how well various universities do on the different components of the Shanghai rankings.



The current top ten are as follows:

1. Harvard
2. Stanford
3. Berkeley
4. Cambridge
5. MIT
6. Caltech
7. Columbia
8. Princeton
9. Chicago
10. Oxford

The Shanghai index includes two categories based on Nobel prizes and Fields medals. These measure the quality of research that might have been produced decades ago. Looking at the other criteria gives a rather different picture of current research.


It is interesting to see what happens to these ten if we rank them according to SJTU's PUB category: the total number of articles indexed in the Science Citation Index-Expanded (SCIE) and the Social Science Citation Index (SSCI) in 2007, with SSCI articles given a double weighting.
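In other words, the PUB indicator is essentially a weighted article count. Here is a minimal sketch of that count in Python, assuming we already have per-university article totals for 2007; the figures used are invented purely for illustration, and SJTU's subsequent scaling of scores against the top performer is not reproduced.

    # Minimal sketch of a PUB-style raw count: SCIE articles plus SSCI
    # articles counted twice. SJTU's normalisation of scores is not
    # reproduced here.

    def pub_count(scie_articles: int, ssci_articles: int) -> int:
        """Raw article count with SSCI publications double-weighted."""
        return scie_articles + 2 * ssci_articles

    # Purely hypothetical figures, for illustration only.
    print(pub_count(scie_articles=8000, ssci_articles=1500))  # prints 11000
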

Harvard remains at number 1

Stanford goes down to number 8

Berkeley goes down to 11

Cambridge goes down to 23

MIT is down at 34

Caltech tumbles to 86

Columbia is down just a bit at 10

Princeton crashes to 120

Chicago falls to 72

Oxford goes down to 18

If this category represents current research output, then it looks as though some American universities and Oxbridge have entered a period of decline. Of course, Caltech and MIT may suffer from the PUB category's inclusion of social science research, but would that explain why Princeton and Chicago are now apparently producing relatively little research?


The top ten for PUB are as follows:

1. Harvard
2. Tokyo
3. Toronto
4. University of Michigan
5. UCLA
6. University of Washington
7. Stanford
8. Kyoto
9. Columbia
10. Berkeley

Sunday, August 10, 2008

Are Things Improving at QS?

I think it is worth quoting from a recent comment.




Just wanted to add a note about submitting data to THES based on my experience
here at a large Australian university. We forwarded our submission earlier this
year and now we have received a query on some of our numbers - a check just to
confirm if they are correct. And I can understand why they would need to be
checked. The numbers we submitted for staff (in the thousands) seem to have
changed to a number in the very low hundreds. Also, comments (made by us) that
were attached to specific sections have, in the past come back with typos... I
think this indicates that there is greater scope for human error in the
compilation of the data, even at such an early and relatively uncomplicated
phase of the data gathering process...


There are signs that QS is making a commendable effort to avoid the errors that were so prevalent in previous rankings. Still, it is rather disconcerting that thousands of faculty have turned into hundreds, especially since it is not altogether impossible that some universities might conveniently forget to correct an error that works to their advantage.

So, I was wondering how common simple errors are in the QS rankings. I have been looking at QS's topuniversities site and checking the number of students listed in the description of each university, comparing the combined number of undergraduates and postgraduates with the total number of students. Here are the results just for universities beginning with A.

For these twelve universities, no problems were noticed: Aarhus, Aberdeen, Aberystwyth, Antwerp, Arizona State University, Athens, Aston, Amsterdam, Adelaide, Australian National University, Austral, Vrije Universiteit Amsterdam. In some cases there were minor discrepancies, but not enough to cause concern.

About two weeks ago there were, in three cases, discrepancies between the total number of students and the combined numbers of undergraduates and postgraduates: the University of Arizona (more combined undergraduates and postgraduates than total students), the University of Auckland (number of postgraduates and total students the same), and the Athens University of Economics and Business (more undergraduate international students than total international students).

In the above three cases, the errors have, at the time of writing, been corrected with new entries.
There were, however, three further cases where, at the time of writing, the errors had not been corrected. These are:

University of Arkansas
Full-time equivalent (FTE) undergraduates: 20,416.
FTE graduate/postgraduate students: 4,163.
Total FTE students: 15,182.
Over 9,000 students "missing" from the total.

(correction: not the University of Arizona as was indicated in an earlier version of this post)


University of Alabama
FTE undergraduates: 33,986.
FTE graduate/postgraduate students: 8,291.
Total FTE students: 19,651.
Over 22,000 students "missing" from the total.

University of Alberta
FTE undergraduates: 29,178.
FTE graduate/postgraduate students: 5,419.
Total FTE students: 32,341.
About 2,250 students "missing" from the total.

I suspect that the problem with these three schools is that the totals of students were compiled and entered separately, and that the data for undergraduates and postgraduates included students at branch campuses, professional schools and/or research institutes, while the data for total students did not. It will be interesting to see whether these errors will be corrected and whether new ones will emerge.
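
A check of this kind is simple to automate. Below is a minimal sketch, in Python, of the consistency check described above, using the three sets of figures quoted in this post; the profile pages themselves would of course still have to be read by hand or collected separately.

    # Minimal sketch of the consistency check described above: the combined
    # undergraduate and postgraduate FTE counts should match the reported
    # total. Figures are the ones quoted in this post.

    profiles = {
        "University of Arkansas": {"undergrad": 20_416, "postgrad": 4_163, "total": 15_182},
        "University of Alabama":  {"undergrad": 33_986, "postgrad": 8_291, "total": 19_651},
        "University of Alberta":  {"undergrad": 29_178, "postgrad": 5_419, "total": 32_341},
    }

    for name, p in profiles.items():
        combined = p["undergrad"] + p["postgrad"]
        discrepancy = combined - p["total"]
        if discrepancy != 0:
            print(f"{name}: undergrad + postgrad = {combined:,}, "
                  f"total = {p['total']:,}, discrepancy = {discrepancy:,}")

Run against the figures above, it reports discrepancies of roughly 9,400, 22,600 and 2,250 students respectively.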