Wednesday, September 19, 2012

New Canadian Research Rankings

Higher Education Strategy Associates recently published their Canadian Research Rankings, which are based on grant awards and H-indexes. I am not sure about counting grants, since the skills needed to lobby for grants are not always the same as those needed to actually do research.

The rankings do not include medical research.

The top five for science and engineering are:

1.  University of British Columbia
2.  Montreal
3.  Toronto -- St. George
4.  Ottawa
5.  McGill

The top five for social sciences and humanities are:

1.  University of British Columbia
2.  McGill
3.  Toronto -- St. George
4.  Alberta
5.  Guelph


These rankings, like the Times Higher Education (THE) World University Rankings, are based on field normalisation. In other words, they do not simply count the number of grants and H-index scores but compare them with the average for the field. The rationale is that there are enormous differences between disciplines, so it would, for example, be unfair to compare a physicist who has won a grant of 10,000 dollars, which is below average for physics, with an education researcher who has won a similar award, which is above average for education. Equally, does it make sense to rank a physicist with an average H-index for physics well above a linguist with an average one for linguistics?

Here are the average grants (in dollars) for various fields:

biological engineering: 84,327
physics: 42,913
linguistics: 13,147
history: 6,417
education: 5,733

Here are the average H-indexes for discipline clusters:

science: 10.6
social sciences: 5.2
humanities: 2.3
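
To make the mechanics concrete, here is a minimal sketch of field normalisation using the grant averages above. It assumes a simple ratio-to-field-average formula, which is an illustration rather than HESA's actual method; the function name is mine.

# A minimal sketch of field normalisation, assuming a ratio-to-field-average formula.
FIELD_AVERAGE_GRANT = {          # average grants quoted above (dollars)
    "biological engineering": 84_327,
    "physics": 42_913,
    "linguistics": 13_147,
    "history": 6_417,
    "education": 5_733,
}

def normalised_grant_score(amount, field):
    # Express a grant as a multiple of the average grant in its field.
    return amount / FIELD_AVERAGE_GRANT[field]

# The example from the text: identical 10,000-dollar grants look very different
# once each is compared with its own field's average.
print(normalised_grant_score(10_000, "physics"))    # about 0.23, below average for physics
print(normalised_grant_score(10_000, "education"))  # about 1.74, well above average for education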


HESA (and THE) do have a point. But there are problems. One is that as we drill down to smaller units of analysis there is a greater risk of outliers: a single large grant or a single much-cited author in a field with few grants or citations could have a disproportionate impact.

The other is that field normalisation implies that all disciplines are equal. But is that in fact the case?




Thursday, September 13, 2012


A Bit More about Les Ebdon

We have noted that Les Ebdon of the UK government's Office for Fair Access has been lecturing leading universities about the need to admit more students from state schools. If they do not, they might lose their world class status and could also be fined and forced to charge much lower fees.


English universities will be expected to enrol thousands more undergraduates from working-class families and poor-performing state schools in return for the right to charge up to £9,000 in tuition fees, it emerged.

Prof Ebdon, newly-appointed director of the Office for Fair Access, suggested that one poor student should eventually be admitted for each candidate enlisted from the wealthiest 20 per cent of households.


Currently, the ratio stands at around one-to-seven, he said.
Speaking as he took up his role this week, Prof Ebdon said the country’s best universities were “not going to stay world class in a very competitive world unless they have access to the full pool of talent”.


So if 42% of Oxford's UK intake comes from independent schools and all of them are from wealthy households, and if some of the state school entrants are from wealthy households too, so that altogether about half the intake is from privileged backgrounds, then Oxford would presumably have to stop admitting anyone from the middle 60%, since matching each wealthy entrant with one from the poorest group would use up all the remaining places.

If Ebdon gets his way then it is quite easy to see what will happen. Independent schools will simply arrange for their pupils to do a year at a carefully selected state school after A levels or the schools will open campuses in Ireland (or post-referendum Scotland) so that they can go into the international category.

Ebdon was the Vice-Chancellor of the University of Bedfordshire, formerly the University of Luton, which some think is among the worst universities in England. This may be a little unfair: it is almost certainly a better university than Luton Town is a football team.
I wonder whether the next move will be for Ebdon to be appointed sports fairness tsar so that he can start telling Manchester United and Liverpool to recruit more players from posh postcodes or with more than one GCSE. Otherwise they will, unlike Luton Town, lose their world class status.

Wednesday, September 12, 2012

US News Rankings

The latest US News rankings are out. The top six are:

1.  Harvard
2.  Princeton
3.  Yale
4.  Columbia
5.  Chicago
6.  MIT


Disappointing

Here is a comment from MIT News. There is nothing about whether MIT did in fact recruit a load of new international faculty between 2011 and 2012.


For the first time, MIT has been ranked as the world’s top university in the QS World University Rankings. The No. 1 ranking moves the Institute up two spots from its third-place ranking last year; two years ago, MIT was ranked fifth.

The full 2012-13 rankings — published by Quacquarelli Symonds, an organization specializing in education and study abroad — can be found at
http://www.topuniversities.com/. QS rankings are based on research quality, graduate employment, teaching quality, and an assessment of the global diversity of faculty and students.

MIT was also ranked the world’s top university in 11 of 28 disciplines ranked by QS, including all five in the “engineering and technology” category: computer science, chemical engineering, civil engineering, electrical engineering and mechanical engineering.

QS also ranked the Institute as the world’s best university in chemistry, economics and econometrics, linguistics, materials science, mathematics, and physics and astronomy.
The Institute ranked among the top five institutions worldwide in another five QS disciplines: accounting and finance (2), biological sciences (2), statistics and operational research (3), environmental sciences (4) and communication and media studies (5).

Rounding out the top five universities in the QS ranking were the University of Cambridge, Harvard University, University College London and the University of Oxford.

What happened to MIT and Cambridge?

QS now has a new world number one university. Massachusetts Institute of Technology (MIT) has replaced Cambridge and overtaken Harvard.

Unfortunately, this change probably means very little.

Overall the change was very slight. MIT rose from 99.21 to 100 while Cambridge fell from 100 to 99.8.

There was no change in the two surveys that account for half of the weighting. MIT and Cambridge both scored 100 for the academic and the employer surveys in 2011 and in 2012.

On the citations per faculty indicator Cambridge did quite a bit better this year, rising from 92.7 to 97, while MIT fell slightly from 99.6 to 99.3. This could mean that, compared with front-runner Caltech, Cambridge has produced more articles, had its articles cited more often, reduced its faculty numbers, or some combination of the three.

For faculty student ratio, Cambridge fell slightly while MIT's score remained the same. For international students both fell slightly.

What made the difference was the international faculty indicator. Cambridge's score went from 98.4 to 98.2 while MIT's rose from 50 to 86.4, which means 1.82 more points in the total ranking, more than enough to overcome Cambridge's improvement in citations and pull slightly ahead.
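
The arithmetic behind that 1.82-point figure is easy to check. The sketch below assumes the international faculty indicator carries the five per cent weighting QS publishes for it; that weighting is an assumption here, but it is the one consistent with the numbers quoted above.

INTL_FACULTY_WEIGHT = 0.05  # assumed QS weighting for the international faculty indicator

mit_gain = (86.4 - 50.0) * INTL_FACULTY_WEIGHT          # rise of 36.4 indicator points
cambridge_change = (98.2 - 98.4) * INTL_FACULTY_WEIGHT  # slight fall of 0.2 indicator points

print(round(mit_gain, 2))          # 1.82 extra points in the overall score
print(round(cambridge_change, 2))  # -0.01 points in the overall score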

Having done some rapid switching between the ranking scores and university statistics, I would estimate that a score of 50 represents about 15% international faculty and a score of 86 about 30%.

It is most unlikely that MIT has in one year recruited about 150 international faculty while getting rid of a similar number of American faculty. We would surely have heard about it. After all, even the allocation of office space at MIT makes national headlines. Even more so if they had boosted the total number of faculty.

International faculty is a notoriously difficult statistic for data collectors. "International" could mean anything from getting a degree abroad to being a temporary visiting scholar. QS are quite clear that they mean current national status but this may not always reach the branch campuses, institutes, departments and programs where data is born before starting the long painful journey to the world rankings.

I suspect that what happened in the case of MIT is that somebody somewhere told somebody somewhere that permanent residents should be counted as international, or that faculty who forgot to fill out a form were moved into the international category, or something like that.

All this draws attention to what may have been a major mistake by QS, that is, configuring the surveys so that a large number of universities are squashed together at the top. For the academic survey, there are 11 universities with a score of 100 and another 17 with a score of 99 to 99.9. Consequently, differentiating between universities at the top depends largely on data about students and faculty submitted by institutions themselves. Even if they are totally scrupulous about finding and disseminating data, there are all sorts of things that can cause problems at each stage of the process.

I have not heard any official reaction yet from MIT. I believe that there are some people there who are quite good at counting things, so maybe there will be a comment or an explanation soon.






Monday, September 10, 2012

MIT is Number One

The new QS Rankings are out.

MIT has replaced Cambridge in first place.
University of Adelaide Falls out of the Top 100

News about the latest QS WUR is beginning to trickle out. This is from the Australian.

The rest of the world is catching up with Australian universities, dragging down the sector's relative performance in the latest QS global rankings.
The league table, released this morning, shows Australia has one fewer institution in the QS top 100 for 2012-13, with the University of Adelaide sliding 10 places to 102.

Sunday, September 09, 2012

Will There be a New Number One?

One reason why QS and Times Higher Education get more publicity than the Shanghai ARWU, HEEACT and other rankings is that they periodically produce interesting surprises. Last year Caltech replaced Harvard as number one in the THE rankings and Tokyo overtook Hong Kong as the best Asian university. Two years ago Cambridge pushed Harvard aside at the top of the QS rankings.

Will there be another change this year?

There is an intriguing interview with Les Ebdon, the UK government's "university access tsar", in the Daily Telegraph. Ebdon claims that leading British universities are in danger of losing their world class status unless they start admitting more students from state schools who may be somewhat less academically qualified. Perhaps he knows something.

So if Cambridge slips and is replaced by Harvard, MIT or Yale as QS number one (if it is Oxford or Imperial, QS will lose all credibility), we can expect comments that Cambridge should start listening to him before it's too late.

I suspect that if there is a new number one it might have something to do with the QS employer review. Since this is a sign-up survey and since the numbers are quite small, it would not take many additional responses to push Harvard or MIT into first place.

With regard to THE, the problem there is that normalising everything by country, year and/or field is a potential source of instability. If there is a vigorous debate, with lots of citations, about an obscure article by a Harvard researcher in a little-cited field, it could dramatically boost the score on the citations indicator.
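
A toy example of how this can happen, assuming the usual approach of dividing a paper's citations by the world average for papers of the same field and year (the real Thomson Reuters calculation also rescales by country, which only amplifies the effect); the numbers below are invented.

def normalised_impact(citations, field_year_average):
    # Citations expressed as a multiple of the world average for that field and year.
    return citations / field_year_average

# A routine paper in a heavily cited field barely moves the needle...
print(normalised_impact(citations=30, field_year_average=25.0))  # 1.2

# ...but the same 30 citations to an obscure article in a little-cited field explode.
print(normalised_impact(citations=30, field_year_average=0.5))   # 60.0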

Getting a good score in the THE rankings also depends on what a school is being compared to. Last year, Hong Kong universities slumped because they were taken out of China (with low average scores) and classified as a separate country (with high average scores), so that their relative scores were lower. If they are put back in China they will go up this year and there will be a new number one Asian university.

So anybody want to bet on Harvard making a comeback this year? Or Hong Kong regaining the top Asian spot from Tokyo in the THE rankings?

Friday, September 07, 2012

Rating Rankings

A caustic comment from the University of Adelaide:
"University rankings would have to be the worst consumer ratings in the retail market. In no other area are customers so badly served by consumer ratings as in the global student market," said Professor Bebbington. "The international rankings must change, or student consumers worldwide will eventually stop using them.

"Next to buying a house, choosing a university education is for most students, the largest financial commitment they will ever make. A degree costs more than a car, but if consumer advice for cars was this poor, there would be uproar.

"Students the world over use rankings for advice on which particular teaching program, at what campus to enrol in. Most don't realise that many of the rankings scarcely measure teaching or the campus experience at all. They mostly measure research outcomes." For this reason, said Professor Bebbington, half the universities in the US, including some very outstanding institutions, remain unranked.

He went on to discuss the inconsistency of university ranking results against the quality of the learning experience. According to Professor Bebbington, such a contradiction should come as no surprise: "Anyone who knows exactly what the rankings actually measure knows they have little bearing on the quality of the education."

Another problem was an increasing number of ranking systems, each producing different results. "With some, we have seen universities shift 40 places in a year, simply because the methodology has changed, when the universities have barely changed at all," he said. "It leaves students and parents hopelessly confused."

The Jiao Tong rankings in particular favour natural sciences and scarcely reflect other fields according to Professor Bebbington. "Moreover, they assume all universities have research missions. Those dedicated to serving their State or region through teaching, rather than competing in the international research arena, may be outstanding places to study, but at present are unranked.

"What is needed is a ranking that offers different lists for undergraduate teaching programs, international research-focussed programs, and regionally focussed programs. We need a ranking that measures all disciplines and is not so focussed on hard science."

The University of Adelaide has been steadily improving in the Shanghai ARWU rankings, although not in the two indicators that measure current research of the highest quality, publications in Nature and Science and highly cited researchers. It also improved quite a bit in the QS rankings from 107th in 2006 to 92nd in 2011. One wonders what Professor Bebbington is complaining about. Has he seen next week's results?

Still, he has a point. The ARWU rankings have a very indirect measure of teaching, alumni who have won Nobel and Fields awards, while QS uses a very blunt instrument, faculty student ratio. It is possible to do well on this indicator by recruiting large numbers of research staff who never do any teaching at all.

Times Higher Education rankings director Phil Baty has a reply in the Australian.
As yet unpublished research by international student recruitment agency IDP will show university rankings remain one of the very top information sources for international students when choosing where to study.

If they are making such a major investment in their future, the overall reputation of the institution is paramount. The name on the degree certificate is a global passport to a lifelong career.

Broad composite rankings - even those that say nothing about teaching and learning - will always matter to students.

But that does not mean the rankers do not have to improve.

The Times Higher Education league table, due for publication on October 4, is the only ranking to take teaching seriously. We employ five indicators (worth 30 per cent) dedicated to offering real insight into the teaching environment, based on things such as a university's resources, staff-student ratio and undergraduate-postgraduate mix.

Times Higher Education has made genuine efforts to capture some factors that may have relevance to teaching. Their academic survey has a question about teaching, although it is only about postgraduate teaching. The others are necessarily somewhat indirect: income per academic, doctorates awarded and the ratio of doctoral students to undergraduates.

If THE are to improve their learning environment criterion, one option might be a properly organised, vetted and verified survey of undergraduate students, perhaps based on university email records.

Saturday, September 01, 2012

From Preschool to PISA

We still don't have a ranking of kindergartens, although no doubt that will happen one day. But there is now a country ranking of early education prepared by the Economist Intelligence Unit for the Lien Foundation, "a Singapore philanthropic house noted for its model of radical philanthropy" (Economist print edition, 30/06/12). Details can be found here.

The ranking combines four indicators: Availability, Affordability, Quality of the Preschool Environment, and Social Context, "which examines how healthy and ready for school children are".

The top five are:
1.  Finland
2.  Sweden
3.  Norway
4.  UK
5.  Belgium

The bottom five are:

41. Vietnam
42.  China
43.  Philippines
44.  Indonesia
45.   India

It is interesting to compare this with the ability of fifteen-year-old students as measured by the 2009 PISA report. Finland, which is top of the preschool ranking, also scores well on all three sections of the PISA rankings: Reading, Science and Maths.

However, the United Kingdom, which is ranked 4th in the preschool rankings, does no better than the OECD average in the PISA rankings for Reading and Maths, although it does not do too badly in Science.

At the bottom of the preschool rankings we find China and India. In the PISA plus ranking, India was represented by just two states and they performed miserably.

On the other hand, China whose preschool system is ranked 42nd out of 45, does very well, ranking first in all three sections, or rather Shanghai does very well.

A warning is needed here. Mainland China is represented in the PISA rankings only by Shanghai. The results would almost certainly be very different if they included the whole of China, including its huge rural hinterland. It is likely that if all of China had been assessed its scores would have been something like Taiwan's (Chinese Taipei): 495 instead of 556 for Reading, 543 instead of 600 for Maths and 520 instead of 575 for Science.

Even so, it is striking that the UK could have such highly ranked preschools and such modest secondary school achievement, while China has lowly ranked preschools but such high scores at secondary level, if we just consider Shanghai, or scores similar to Taiwan's if we estimate what the nationwide level might be.

There are other apparent anomalies. For example, Singapore is only slightly better than Mexico on the preschool rankings but forges well ahead on the PISA rankings.

It could be that preschool education does not have much to do with long term achievement. Also, some of the criteria, such as how healthy children are, may not have much to do with anything that a preschool does or can do. Nor should we forget that the preschool ranking deals with inputs while the PISA rankings are about performance.

Furthermore, it is likely that the culture, social structure and demography of contemporary Asia, Europe and the Americas explain some of these differences in the effect of preschool education.




Are International Students an Indicator of Quality?

From BBC News

Some 2,600 foreign students affected by the London Metropolitan University (LMU) visa ban have been given until at least 1 December to find a new course.
The UK Border Agency says it will write to students after 1 October and "will ensure you have 60 days" to make a new student application or leave the UK.
On Thursday, the UKBA revoked LMU's licence to authorise non-EU visas. Ministers said it was failing to monitor student attendance. 

Apparently a substantial number of students did not have valid visas or had not been properly tested for spoken English, and in many cases it was not even possible to tell whether they were attending class.

From LMU's home page

At London Metropolitan University we believe that everyone has the right to an affordable quality education. Our fees for 2012/13 have been set at levels significantly lower than other Universities, and our courses recently received top marks from the UK's Quality Assurance Agency. We are committed to delivering affordable quality education, and are proud of the diversity & achievements of our students, alumni and staff.

Here at London Met we put our students at the centre of all we do.

London Met is a great place to study, located in the heart of one of the world's most exciting cities.
We stand out because we offer courses of quality, in a vibrant, socially diverse environment, which will help launch your career.

We are committed to transforming lives, meeting needs and building careers

Notice the bit about everyone.

Never trust a university that talks about transforming lives.



Friday, August 31, 2012

The Shanghai Rankings 4

The publications indicator in the Shanghai ARWU simply measures all science and social science publications in the ISI Science and Social Science Indexes over the previous year. The Arts and Humanities Index is excluded.

It is safe to assume that developing universities will start producing large numbers of publications before they produce work of sufficient quality to justify a Nobel or Fields award or publication in Science or Nature. This indicator should therefore tell us something about the universities that are likely to forge ahead in the coming decade.

The top five for this indicator are:

1.  Harvard
2.  Toronto
3.  Michigan at Ann Arbor
4.  Tokyo
5.  Sao Paulo

Among the rising stars in the top fifty for this indicator are Seoul National University (15th), Peking (27th), National Taiwan University (36th) and Tsinghua (44th).

The Productivity per Capita indicator is rather incoherent: it combines the scores for the other indicators, which may represent achievements from years or decades ago or from last year, and divides them by the current number of senior faculty, including those in the humanities whose papers are not counted. This is one indicator where spending cuts could actually have a positive effect.
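
For reference, the published ARWU method divides the weighted scores of the other indicators by full-time equivalent academic staff. The sketch below follows that description, using ARWU's published weightings; the indicator scores and staff numbers are invented for illustration.

# ARWU weightings for the five indicators that feed Productivity per Capita.
WEIGHTS = {"Alumni": 0.10, "Award": 0.20, "HiCi": 0.20, "N&S": 0.20, "PUB": 0.20}

def productivity_per_capita(indicator_scores, fte_academic_staff):
    # Weighted sum of the other indicator scores divided by current staff numbers.
    weighted_sum = sum(WEIGHTS[name] * score for name, score in indicator_scores.items())
    return weighted_sum / fte_academic_staff

# Invented figures: old prizes and recent papers share the numerator, while this
# year's staff count sits in the denominator, so cutting staff raises the score.
scores = {"Alumni": 20.0, "Award": 15.0, "HiCi": 30.0, "N&S": 25.0, "PUB": 40.0}
print(productivity_per_capita(scores, fte_academic_staff=2000))
print(productivity_per_capita(scores, fte_academic_staff=1500))  # fewer staff, higher "productivity"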

For once, Harvard is second and first place is taken by Caltech. The top fifty contains an assortment of specialised institutions such as Scuola Normale Superiore Pisa, Ecole Normale Superieure Paris, Stockholm School of Economics and the Technical University of Munich. 

Wednesday, August 29, 2012

Moving Up

Every so  often political and academic leaders announce plans for getting into the top 50  or 100 or 200 of the global rankings.  The problem is that it is not always clear which rankings they are talking about. Not only do the various league tables have different indicators but doing well in one can actually have negative effects in another. Racking up a large number of publications in ISI-indexed journals would be great for the Shanghai rankings but could be detrimental for the THE World University Rankings until those publications start getting citations that are above average for year, field and country.

A paper by Angela Yung Chi Hou and Chung-Lin Chiang of Fu Jen Catholic University, Taiwan, and Robert Morse of US News & World Report has been published in Higher Education Research and Development. Here is the abstract:

Since the start of the twenty-first century, university rankings have become internationalized. Global rankings have a variety of uses, levels of popularity and rationales and they are here to stay. An examination of the results of the current global ranking reveals that well-reputed world-class universities are amongst the top ranked ones. A major concern for university administrators in many parts of the world is how to use the global rankings wisely in their mid-term and long-term strategic planning for building their institutions into world-class universities. Four major global rankings have been developed: the Academic Ranking of World Universities, the World University Rankings, the Webometrics Ranking of World Universities and the Performance Ranking of Scientific Papers for World Universities. The main purpose of this paper is to explore the most influential indicators in these global university rankings that will affect the rank mobility of an institution. Based on an analysis of correlation coefficients and K-means clustering, a model of strategic institutional planning for building a world-class university is proposed.

The paper shows that for universities wishing to stay in the top 30 in various rankings, the most influential indicators are Nobel and Fields Awards in the Shanghai ARWU, Citations per Faculty and Faculty Student Ratio in the QS World University Rankings, Visibility in Webometrics and Citations in the Last Two Years in HEEACT.

For ambitious universities intent on moving up the rankings, the indicators to watch are Papers in Nature and Science and Productivity per Capita in ARWU, the Academic Survey in QS, Visibility in Webometrics and the H-index in HEEACT.
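
For readers curious what the paper's method looks like in practice, here is a minimal sketch of clustering universities on their indicator scores with K-means, using scikit-learn. The scores are invented placeholders, not the paper's data, and the three-cluster choice is arbitrary.

import numpy as np
from sklearn.cluster import KMeans

# Rows are universities; columns are indicator scores (e.g. awards, citations
# per faculty, faculty-student ratio), all invented for illustration.
scores = np.array([
    [95, 90, 88],
    [90, 85, 92],
    [55, 60, 50],
    [25, 30, 45],
    [20, 35, 40],
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scores)
print(kmeans.labels_)  # the "tier" each university falls into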

Tuesday, August 28, 2012

An International Student Bubble?

Something of a mania for international students seems to be developing. In a single issue of University World News there are stories from Canada, China and Poland about plans to recruit students from abroad.


A new report urging Canadian universities to nearly double international student enrolment by 2022 signals a fundamental policy change in Canada.

The report, released last week, recommends that Canada increase the number of foreign students from 240,000 in 2011 to 450,000 by 2022.

The government-appointed panel led by Amit Chakma, president and vice-chancellor of the University of Western Ontario, also laid out a blueprint for how the federal government ought to support universities in their recruitment efforts.

From China

China has been wooing foreign universities and foreign students in a bid to internationalise its universities and as part of a ‘soft power’ policy to project itself internationally.

“China wants to be seen as a major player internationally in terms of education,” said Anthony Welch, a professor of international education at the University of Sydney.

“There is a clear national policy in China of ‘soft power’ using education. I would argue that is a good thing for all partners,” said Yang Rui, an assistant professor in Hong Kong University’s faculty of education.

The article by Yojana Sharma also refers to efforts by universities and governments in Malaysia and Singapore to recruit more students from abroad.

From Poland

Polish universities have introduced a free iPhone and iPad app to spread information internationally about opportunities in Polish higher education, and an Android version is promised soon.

The use of the latest technology will move the promotion of Polish higher education to a completely new level, according to a Polish Press Agency report quoting Dr Wojciech Marchwica of the Perspektywy Educational Foundation (Fundacja Edukacyjna Perspektywy), coordinator of the Study in Poland programme.
The universities are hoping to attract high-quality students from Ukraine, Russia, Belarus and Kazakhstan.

A few years ago, Ukraine was declared by the Study in Poland coordinating committee to be a priority source country, as it is tied to Poland by history, culture and geographical proximity.

The effort has already brought measurable results: the number of students from Ukraine grew from 1,989 in 2005 to 6,321 in 2012, an increase of more than 300%. In 2009 Study in Poland opened its first foreign office in Kyiv, at the Kyiv Polytechnic Institute.

China, Canada, Poland, Singapore and Malaysia are not the only places competing for more international students.

So why is there such a craze for moving students back and forth across international borders?

One reason for adding more international students is that it is probably the easiest way to rise in the rankings (excluding ARWU) and the one with the quickest returns for universities outside the top 200 or 300. Getting faculty to do research and write papers is not always popular and may produce a backlash, especially if senior staff have political connections. Writing papers that are readable and citable is even more difficult. Recruiting faculty to boost faculty student ratios can be expensive and may have an adverse impact on other indicators. The QS surveys are rather opaque and the THE citations indicator painfully complicated. But finding students who can cross a frontier to get a degree is comparatively easy and may even pay for itself. For University College Cork, just one international student would cover the cost of joining QS Stars.

There are other reasons. Canada appears to be hunting for students from abroad as a proxy for a meritocratic immigration policy. The problem here is that those talented engineers and computer scientists may be followed by not so talented spouses, siblings and cousins. China appears to be using universities to further diplomatic objectives, and Poland seems to be trying to challenge Russian cultural hegemony.


Comment on the QS Subject Rankings

An enjoyable, although perhaps a little intemperate, comment on the QS philosophy rankings from the Leiter Reports.

Several readers sent this item, the latest worthless misinformation from the "world universities" ranking industry, in which "QS" (which, contrary to rumor, does not actually stand for 'Quirky Silliness") is a main player. As a commenter at The Guardian site notes, five of the universities ranked tops in Geography do not even have geography departments! And which are the "top five" US universities in philosophy?
1. Harvard University
2. University of California, Berkeley
3. Princeton University
4. Stanford University
5. Yale University
That corresponds decently to the top five American research universities, to be sure, but it has nothing to do with the top five U.S. philosophy departments, at least not in the 21st-century. But it should hardly be surprising that if you ask academics teaching in philosophy departments in Japan or Italy to rank the best philosophy departments, many of them will use general university reputation as a proxy. Indeed, every department that is pretty obviously "overrated" in philosophy in this list is at a top research university, and every department obviously underrated is not: so, e.g., Rutgers comes in at a mere 13th, Pittsburgh at 18th (behind Brown and Penn), and North Carolina at 20th.
One may hope that no student thinking about post-graduate work will base any decisions on this nonsense.
Important Dates

September 11th. From QS Intelligence Unit

QS Intelligence Unit is pleased to invite you to attend this afternoon event featuring the global exclusive release of the full QS World University Rankings® 2012-2013 Tuesday 11th September 2012 in Trinity College Dublin. Be among 300 university delegates present for a focused ninety minute session and networking reception on the eve of the EAIE conference.


September 12th. From Morse Code

The 2013 edition of U.S. News's Best Colleges rankings will go live on usnews.com on Wednesday, September 12. National Universities, National Liberal Arts Colleges, Regional Universities, and Regional Colleges are included in these rankings.

Our website will have the most complete version of the rankings, tables, and lists. It will have extensive statistical profiles for each school as well as wide-ranging interactivity and a college search to enable students and parents to find the school that best fits their needs. These exclusive rankings will also be published in our Best Colleges 2013 edition guidebook, which will go on sale September 18 on newsstands and at usnews.com.


October 3rd.  Times Higher Education
 
The annual THE rankings, which UK universities and science minister David Willetts said are "fast becoming something of a fixture in the academic calendar", will be published live online at 21.00 BST on 3 October.
A special rankings print supplement will also be published with the 4 October edition of THE, and the results will be available on a free interactive iPhone application.

Monday, August 27, 2012

The Shanghai Rankings 3

Two of the indicators in the Shanghai rankings measure research achievement at the highest level. The highly cited researchers indicator is based on a list of those scientists who have been cited most frequently by other researchers. Since ARWU counts current but not past affiliations of researchers, it is possible for a university to boost its score by recruiting researchers. This indicator might then be seen as signalling a willingness to invest in and to retain international talent, and hence a sign of future excellence.

The top five for this indicator are:

1.  Harvard
2.  Stanford
3.  UC Berkeley
4.  MIT
5.  Princeton

A notable feature of this indicator is the number of US state universities and non-Ivy League schools doing well: the University of Michigan (6th), the University of Washington (13th), the University of Minnesota (19th), Penn State (23rd) and Rutgers (42nd).

Before this year, the methodology for this indicator was simple. If a highly cited researcher had two affiliations then there was a straightforward fifty-fifty division. Things were complicated when King Abdulaziz University (KAU) in Jeddah signed up scores of researchers on part-time contracts, a story recounted in Science. ARWU has responded deftly by asking researchers to indicate how their time was divided if they had joint affiliations, and this seems to have deflated KAU's score considerably while having little or no effect on anyone else.

The top five universities for papers in Nature and Science are:

1.  Harvard
2.  Stanford
3.  MIT
4.  UC Berkeley
5.  Cambridge

High fliers on this indicator include several specialised science and medical institutions such as Imperial College London, Rockefeller University, the Karolinska Institutet and the University of Texas Southwestern Medical Center.
Self Citation

In 2010 Mohamed El Naschie, former editor of the journal Chaos, Solitons and Fractals, embarrassed a lot of people by launching the University of Alexandria into the world's top five universities for research impact in the new Times Higher Education (THE) World University Rankings. He did this partly by diligent self citation and partly by a lot of mutual citation with a few friends and another journal. He was also helped by a ranking indicator that gave the university disproportionate credit for citations in a little-cited field, for citations in a short period of time and for being in a country where there are few citations.

Clearly self citation was only part of the story of Alexandria's brief and undeserved success, but it was not an insignificant one.

It now seems that Thomson Reuters (TR), who collect and process the data for THE, are beginning to get a bit worried about "anomalous citation patterns". According to an article by Paul Jump in THE:

When Thomson Reuters announced at the end of June that a record 26 journals had been consigned to its naughty corner this year for "anomalous citation patterns", defenders of research ethics were quick to raise an eyebrow.

"Anomalous citation patterns" is a euphemism for excessive citation of other articles published in the same journal. It is generally assumed to be a ruse to boost a journal's impact factor, which is a measure of the average number of citations garnered by articles in the journal over the previous two years.

Impact factors are often used, controversially, as a proxy for journal quality and, even more contentiously, for the quality of individual papers published in the journal and even of the people who write them.

When Thomson Reuters discovers that anomalous citation has had a significant effect on a journal's impact factor, it bans the journal for two years from its annual Journal Citation Reports (JCR), which publishes up-to-date impact factors.

"Impact factor is hugely important for academics in choosing where to publish because [it is] often used to measure [their] research productivity," according to Liz Wager, former chair of the Committee on Publication Ethics.

"So a journal with a falsely inflated impact factor will get more submissions, which could lead to the true impact factor rising, so it's a positive spiral."

One trick employed by editors is to require submitting authors to include superfluous references to other papers in the same journal.

A large-scale survey by researchers at the University of Alabama in Huntsville's College of Business Administration published in the 3 February edition of Science found that such demands had been made of one in five authors in various social science and business fields.

That TR are beginning to crack down on self citation is good news. But will they follow their rivals QS and stop counting self citations in the citations indicator in their rankings? When I spoke to Simon Pratt of TR at the World Class Universities conference in Shanghai at the end of last year, he seemed adamant that they would go on counting self citations.

Even if TR and THE start excluding self citations, it would probably not be enough. It may soon become necessary to exclude intra-journal citations as well.

Friday, August 24, 2012

Universiti Malaya Again

In many countries performance in international university rankings has become as much a symbol of national accomplishment as winning Olympic medals or qualifying for the World Cup. When a local university rises in the rankings it is cause for congratulations for everyone, especially for administrators. When they fall it is an occasion for soul-searching and a little bit of schadenfreude for opposition groups.

Malaysia has been particularly prone to this syndrome. There was a magical moment in 2004 when the first THES-QS ranking put Universiti Malaya (UM), the country's first university, in the world's top 100. Then it went crashing down. Since then it has moved erratically up and down around the 200th position.

Lim Kit Siang, leader emeritus of the Malaysian opposition has this to say in his blog:

At the University of Malaya’s centennial celebrations in June 2005, the then Deputy Prime Minister Datuk Seri Najib Razak threw the challenge to University of Malaya to raise its 89th position among the world’s top 100 universities in THES-QS (Times Higher Education Supplement-Quacquarelli Symonds) ranking in 2004 to 50 by the year 2020.

Instead of accepting Najib’s challenge with incremental improvement of its THES ranking, the premier university went into a free fall when in 2005 and 2006 it fell to 169th and 192nd ranking respectively, and in the following two years in 2007 and 2008, fell out of the 200 Top Universities ranking altogether.

In 2009, University of Malaya made a comeback to the 200 Top Universities Ranking when it was placed No. 180, but in 2010 it again fell out of the 200 Top Universities list when it dropped to 207th placing.

For the 2011 QS Top 200 Universities Ranking, University of Malaya returned to the Top 200 Universities Ranking, being placed at No. 167.

In the THES-QS World University Rankings 2009, University of Malaya leapfrogged 50 places from No. 230 placing in 2008 to No. 180 in 2009; while in the 2011 QS World University Ranking, University of Malaya leapt 40 places from No. 207 in 2010 to No. 167 in 2011.

The QS World University Rankings 2012 will be released in 20 days’ time. Can University of Malaya make another leapfrog as in 2009 and 2011 to seriously restore her place as one of the world’s top 100 universities by before 2015?


The government has announced that in addition to Najib’s challenge to University of Malaya in 2005 to be among the world’s Top 50 universities by 2020, the National Higher Education Strategic Plan called for at least three Malaysian universities to be ranked among the world’s top 100 universities.

Recently, the U.S. News World’s Best Universities Rankings included five local universities in its Top 100 Asian Universities, but this is not really something to celebrate about.

The U.S. News World’s Best Universities Ranking is actually based on the QS 2012 Top 300 Asian University Rankings released on May 30 this year, which commented that overall, although University of Malaya improved its ranking as compared to 2011 ranking, the majority of Malaysian universities dropped in their rankings this year as compared to 2011.
There is a lot of detail missing here. UM's fluctuating scores had nothing to do with failed or successful policies but resulted from errors, corrections of errors or "clarification of data", changes in methodology, and variations in the collecting and reporting of data.

UM was only in the top 100 of the THES-QS rankings because of a mistake by QS, the data collectors, who thought that ethnic minority students and faculty were actually foreigners and therefore handed out a massive and undeserved boost on the international faculty and international student indicators.

Its fall in 2005 was the result of QS's belated realisation of its mistake.

The continued decline in 2007 may have been because QS changed its procedures to prevent respondents to the academic survey voting for their own institutions, or because of the introduction of Z scores, which had the effect of substantially boosting the scores for citations per faculty for mediocre universities like Peking but only slightly for laggards like UM.

The rise in 2009 from 230th to 180th position was largely the result of a big improvement in the score for faculty student ratio, comprising both a reported fall in the number of students and a reported rise in the number of faculty. It is unlikely that the university administration had thrown 6,000 students into the Klang River: more probably somebody told somebody that diploma and certificate students need not be included in the data reported to QS.

Whether UM rises again in the QS rankings is less interesting than its performance in the Shanghai Academic Ranking of World Universities. In 2011 it moved into the top 500 with scores of 3.4 for highly cited researchers, 34.6 for ISI-indexed publications (compared to 100 for the front-runner Harvard) and 16 for per capita productivity (in this case the top scorer was Caltech).

In 2012 UM had the same score for highly cited researchers and registered a score of 38.6 for publications and a slight improvement to 16.7 for productivity. This meant that UM was now ranked in 439th place and that reaching the 300-400 band in a few years' time would not be impossible.

UM has managed to make it into the Shanghai rankings by actively encouraging research among its faculty and by recruiting international researchers, policies that are unpopular and in marked contrast to those of other Malaysian universities.

What will happen in the QS rankings when they come out next month? Something to watch out for is the employer survey, which has a weighting of ten per cent. In 2011 something odd was going on. Apparently there had been an enthusiastic response to the rankings in Latin America, especially to the employer survey, so that QS resorted to capping the scores for many universities. They reported that:


"QS received a dramatic level of response from Latin America in 2011, these counts and all subsequent analysis have been adjusted by applying a weighting to responses from countries with a distinctly disproportionate level of response."
It seems that one effect of the inflated number of responses was to raise the mean score, so that universities with below-average scores saw a dramatic fall in their adjusted scores. If there is a further increase in responses this year, universities like UM may see a further reduction for this indicator.
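
A toy illustration of that effect, assuming the employer review counts are standardised against the mean before being rescaled; the response counts below are invented.

import statistics

def z_scores(counts):
    # Standardise each university's response count against the mean of all counts.
    mean = statistics.mean(counts.values())
    sd = statistics.pstdev(counts.values())
    return {u: (c - mean) / sd for u, c in counts.items()}

before = {"UM": 40, "A": 60, "B": 80, "C": 100}
# A surge of sign-up responses for the other universities; UM's own count is unchanged.
after = {"UM": 40, "A": 180, "B": 240, "C": 300}

print(round(z_scores(before)["UM"], 2))  # -1.34
print(round(z_scores(after)["UM"], 2))   # -1.56: further below the (now higher) mean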

Sunday, August 19, 2012

The Shanghai Rankings 2

The Shanghai Rankings get more interesting when we look at the individual indicators. Here are the 2012 top five for alumni who have won Nobel and Fields awards.

1. Harvard
2. Cambridge
3. MIT
4. Berkeley
5. Columbia

In the top fifty for this indicator there are the Ecole Normale Superieure, Moscow State University, the Technical University of Munich, Goettingen, Strasbourg and the City College of the City University of New York.

Essentially, this indicator allows universities that have seen better decades to gain a few points from an academic excellence that has long been in decline. City College of New York is an especially obvious victim of politics and bureaucracy.

The top five in the Awards indicator, faculty who have won Nobel prizes and Fields medals, are:

1.  Harvard
2.  Cambridge
3.  Princeton
4.  Chicago
5.  MIT

The top fifty includes the Universities of Buenos Aires, Heidelberg, Paris Dauphine, Bonn, Munich and Freiburg. Again, this indicator may be a pale reflection of past glory rather than a sign of future accomplishments.





Saturday, August 18, 2012

The Shanghai Rankings 1

The 2012 edition of Shanghai Jiao Tong University's Academic Ranking of World Universities has been published. Here are the top ten, which are the same as last year's top ten.

1.  Harvard
2.  Stanford
3.  MIT
4.  UC Berkeley
5.  Cambridge
6.  Caltech
7.  Princeton
8.  Columbia
9.  Chicago
10. Oxford

It is necessary to go down to the 19th and 20th places to find any changes. Tokyo is now 19th and University College London 20th, reversing last year's order and restoring that of 2003.

Saturday, August 11, 2012


What’s up at Wits?
 
The University of Witwatersrand is in turmoil. Faculty are going on strike for higher salaries, claiming that there has been a drastic decline in quality in recent years. Evidence for this decline is the university's fall by more than a hundred places in the QS world rankings. The administration has argued that these rankings are not valid.

The University of the Witwatersrand is one of SA's largest and oldest academic institutions. According to its strategic planning division, at the end of last year there were about 1300 academic staff, 2000 administrative staff and nearly 30000 students, with 9000 of these being postgraduates.

There is no doubt that Wits has pockets of excellence, and many talented academics who are players on the global stage. However, this excellence is being overwhelmed and dragged down by inefficient bureaucracy in its administrative processes.

There are more administrative staff than academic staff, and as one academic said: "It is impossible to get anything done."

David Dickinson, president of the Academic Staff Association of Wits University - which has more than 700 members and is threatening to strike - said: "Between 2007 and last year, we fell more than 100 places in the QS World University Rankings. A significant problem is that the most important part of the university has been forgotten: its employees."

The university is ranked second in the country, after the University of Cape Town, but scraped into the top 400 in the world at 399th on the QS World University rankings for last year.
The faculty are correct about the QS rankings. Between 2007 and 2011 the university fell from 283rd place to 399th. The decline was especially apparent in the employer review, from 191st to below 301st, and international faculty, from 69th to 176th.

But there is a problem. From 2007 to 2011 Wits steadily improved on some indicators in the Shanghai rankings: from 10.9 to 11.2 for publications in Nature and Science, from 26.2 to 29.9 for publications, and from 14.8 to 16.3 for faculty productivity. The score for alumni winning Nobel prizes has declined from 23.5 to 21.2, but this was because the same two alumni were being measured against a rising score for front-runner Harvard.

So which ranking is correct? Probably both, because they refer to two different periods. The alumni who contributed to the Alumni indicator in ARWU graduated in 1982 and 2002. Publications and papers in Nature and Science could reflect the fruits of research projects that began up to a decade earlier.

The QS rankings (formerly the THE-QS rankings) are heavily weighted towards two surveys of debatable validity. The declining score for Wits in the employer review, from 59 points (well above the mean of 50) to 11, is remarkable and almost certainly has nothing to do with the university; it is more likely the result of a flooding of the survey by supporters of other institutions, leading to a massive increase in the average number of responses.

The decline in other scores, such as international faculty and faculty student ratio, could be the result of short term policy changes. However, if it is correct that research and teaching are being strangled by bureaucracy and mistaken policies, then sooner or later we should start seeing indications in the Shanghai rankings.


Sunday, August 05, 2012

Philippine Universities and the QS English Rankings

The QS subject rankings have produced quite a few surprises. Among them is the high position of several Philippine universities in the 2012 English Literature and Language ranking. In the top 100 we find Ateneo de Manila University, the University of the Philippines and De La Salle University. Ateneo de Manila in 24th place is ahead of Birmingham, Melbourne, Lancaster and University College Dublin.

How did the Philippine universities do so well? First, the subject rankings are based on different combinations of criteria. Those for English Literature and Language have a 90 per cent weighting for the academic survey conducted in 2011 and 10 per cent for the employer survey. Unlike the natural sciences, there is nothing for citations. Essentially, then, the English ranking is a measure of reputation in that subject, and these universities were picked by a large number of survey respondents.

One feature of the QS academic survey is that respondents can choose to nominate universities globally or by region. Ateneo de Manila's performing better than Birmingham or Melbourne in this subject most probably means that it was being compared with others in Asia while the latter were assessed internationally.

Also, the category English Literature and Language is an extremely diverse one, covering scholars toiling away at a critical edition of Chaucer, post-modern cultural theorists and researchers in language education. I suspect that the high scores for Ateneo de Manila and the other universities came from dozens of postgraduate TESOL students in the US and Australia. It would be a good idea for QS to have separate rankings for English literature and English language education.

As usual, university administrators seem to be somewhat confused about the rankings. The Dean of the Faculty of Arts and Letters at the University of Santo Tomas is reported as saying:

The University, he pointed out, did not get any request for data from QS, the London consultancy that comes out with annual university rankings:
“With due respect to the QS, I think we should also know how the data is being collected, because as far as we are concerned, we are the academic unit taking care of arts and humanities and philosophy and literature,” he told the Varsitarian.
The QS survey may have been perception-based, and data gathering could have relied on what’s available on the Internet, Vasco added. “The question is, how do they source the data? Do they simply get it from the general information known about the University? Do they simply get it from the website? What if the website is not updated? What information will you get there?” he asked.
Vasco also said it would be difficult to compete in other clusters of the Arts and Humanities category of the QS subject rankings, namely Philosophy, Modern Languages, Geography, History, and Linguistics.
“[We] do not offer the same breadth of programs being surveyed under the arts and humanities cluster in the QS survey,” Vasco said.
The growing number of participants in the QS survey has contributed to the general decline of Philippine schools in various QS rankings, the Artlets dean noted. “More and more international universities from highly industrialized countries are participating, like universities from Europe, North America, and even Asia-Pacific,” he said. “Chances are, Philippine schools will slide down to lower rankings.”

For once, QS is being unfairly treated. The methodology of the subject rankings is explained quite clearly here.



Friday, August 03, 2012


QS Stars

University World News (UWN) has published an article by David Jobbins about QS Stars, which are awarded to universities that pay (most of them, anyway) for an audit and a three-year licence to use the stars, and which are shown alongside the listings in the QS World University Rankings. Participation is not spread evenly around the world: according to a QS brochure, it is mainly mediocre universities, or worse, that have signed up. Nearly half of the universities that have opted for the stars are from Indonesia.

Jobbins refers to a report in Private Eye which in turn refers to the Irish Examiner. He writes:

The stars appear seamlessly alongside the listing for each university on the World University Rankings, despite protestations from QS that the two are totally separate operations.

The UK magazine Private Eye reported in its current issue that two Irish universities – the University of Limerick and University College Cork, UCC – had paid “tens of thousands” of euro for their stars.

The magazine recorded that UCC had told the Irish Examiner that the €22,000 (US$26,600) cost of obtaining the stars was worthwhile, as it could be recouped through additional international student recruitment.

The total cost for the audit and a three-year licence is US$30,400, according to the scheme prospectus.


 The Irish Examiner article by Neil Murray is quite revealing about the motivation for signing up for an audit:

UCC paid almost €22,000 for its evaluation, which includes a €7,035 audit fee and three annual licence fees of €4,893. It was awarded five-star status, which it can use for marketing purposes for the next three years.

The audit involved a visit to the college by QS researchers but is mostly based on analysis of data provided by UCC on eight criteria. The university’s five-star rating is largely down to top marks for research, infrastructure, internationalisation, innovation, and life science, but it got just three stars for teaching and engagement.
About 3,000 international students from more than 100 countries earn UCC approximately €19 million a year.

UCC vice-president for external affairs Trevor Holmes said there are plans to raise the proportion of international students from 13% — one of the highest of any Irish college — to 20%.

"Should UCC’s participation in QS Stars result in attracting a single additional, full-time international student to study at UCC then the costs of participation are covered," he said.

"In recent times, unlike many other Irish universities, UCC has not been in a position to spend significant sums on marketing and advertising domestically or internationally. QS Stars represents a very cost-effective approach of increasing our profile in international media and online."
So now we know how much a single international student adds to the revenue of an Irish university.

So far, there is nothing really new here. The QS Stars system has been well publicised and it probably was a factor in Times Higher Education dropping QS as its data collecting partner and replacing them with Thomson Reuters.

What is interesting about the UWN article is that a number of British and American universities have been given the stars without paying anything. These include Oxford and Cambridge and 12 leading American institutions that are described by QS as "independently audited based on publicly available information". It would be interesting to know whether the universities gave permission to QS to award them stars in the rankings. Also, why are there differences between the latest rankings and the QS brochure? Oxford does not have any stars in last year's rankings but is on the list in the brochure. Boston University has stars but is not on the list. It may be just a matter of updating.

It would probably be a good idea for QS to remove the stars from the rankings and keep them in the university profiles.