QS World University Rankings

Editor: Danny Byrne
Categories: Higher education
Frequency: Annual
Publisher: Quacquarelli Symonds Limited
First issue: 2004 (in partnership with THE); 2010 (independently)
Country: United Kingdom
Language: English
Website: www.topuniversities.com

QS World University Rankings is an annual publication of university rankings by Quacquarelli Symonds (QS). The rankings were previously known as the THE-QS World University Rankings: QS collaborated with Times Higher Education (THE) magazine to publish the international league tables from 2004 to 2009, before the two organizations began to announce separate versions. QS retained the existing methodology, while THE adopted a new one for its Times Higher Education World University Rankings.[1] The QS World University Rankings now comprise global overall and subject rankings, alongside three independent regional tables (Asia, Latin America, and BRICS) that use different methodologies. Although they are among the most widely read university rankings,[2][3][4][5] they have been criticized for giving undue weight to subjective indicators and for commercialization.[6]

History

The need for an international ranking of universities was highlighted in December 2003 in Richard Lambert's review of university-industry collaboration in Britain[7] for HM Treasury, the finance ministry of the United Kingdom. Amongst its recommendations were world university rankings, which Lambert said would help the UK to gauge the global standing of its universities.

The idea for the rankings was credited in Ben Wildavsky's book, The Great Brain Race: How Global Universities are Reshaping the World,[8] to then-editor of Times Higher Education (THE), John O'Leary. THE chose to partner with educational and careers advice company Quacquarelli Symonds (QS) to supply the data, appointing Martin Ince,[9] formerly deputy editor and later a contractor to THE, to manage the project.

Between 2004 and 2009, QS produced the rankings in partnership with THE. In 2009, THE announced that it would produce its own rankings, the Times Higher Education World University Rankings, in partnership with Thomson Reuters. THE cited weaknesses in the methodology of the original rankings,[10] as well as a perceived favoritism toward science over the humanities,[11] as key reasons for the decision to split with QS.

QS retained the intellectual property in the Rankings and the methodology used to compile them[citation needed] and continues to produce the rankings, now called the QS World University Rankings.[12] THE created a new methodology with Thomson Reuters, published as the Times Higher Education World University Rankings in September 2010.

Global rankings

Overall

Methodology

Methodology of QS World University Rankings[13]

  • Academic peer review (40%): based on an internal global academic survey
  • Faculty/student ratio (20%): a measure of teaching commitment
  • Citations per faculty (20%): a measure of research impact
  • Employer reputation (10%): based on a survey of graduate employers
  • International student ratio (5%): a measure of the diversity of the student community
  • International staff ratio (5%): a measure of the diversity of the academic staff

QS publishes the results through media partners around the world, including Chosun Ilbo in Korea. The first rankings produced by QS independently of THE, retaining QS's original methodology, were released on September 8, 2010; the second edition appeared on September 6, 2011.

QS tried to design its rankings to look at a broad range of university activity.[14]
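The overall score is, in essence, a weighted sum of the six indicator scores listed above. The following sketch illustrates the arithmetic with hypothetical, pre-normalized (0–100) indicator values; only the weights come from the methodology table, everything else is invented for illustration:

```python
# Indicator weights from the QS methodology table; the example scores are hypothetical.
WEIGHTS = {
    "academic_peer_review": 0.40,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "employer_reputation": 0.10,
    "international_students": 0.05,
    "international_staff": 0.05,
}

def overall_score(indicator_scores: dict) -> float:
    """Weighted sum of pre-normalized (0-100) indicator scores."""
    return sum(WEIGHTS[k] * indicator_scores[k] for k in WEIGHTS)

# A hypothetical university with a perfect peer-review score:
example = {
    "academic_peer_review": 100.0,
    "faculty_student_ratio": 80.0,
    "citations_per_faculty": 70.0,
    "employer_reputation": 90.0,
    "international_students": 60.0,
    "international_staff": 50.0,
}
# 0.4*100 + 0.2*80 + 0.2*70 + 0.1*90 + 0.05*60 + 0.05*50 = 84.5
print(overall_score(example))
```

The weighting makes the sensitivity to peer review plain: a 10-point swing in the survey score moves the overall result by 4 points, twice the effect of the same swing in employer reputation.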

Academic peer review

This is the most controversial part of the methodology. Drawing on a combination of purchased mailing lists, applications, and suggestions, this survey asks active academics across the world about the top universities in the fields they know best. QS has published the job titles and geographical distribution of the participants.[15]

The 2011 rankings drew on responses from 33,744 people in over 140 nations for the Academic Peer Review; votes from the previous two years were carried forward when no more recent response was available from the same individual. Participants can nominate up to 30 universities but may not vote for their own. With a median of about 20 nominations each, the survey yields over 500,000 data points.[15]

In 2004, when the rankings first appeared, academic peer review accounted for half of a university's possible score. In 2005, its share was cut to 40 per cent because of the introduction of the Recruiter Review.

Faculty student ratio

This indicator accounts for 20 per cent of a university's possible score in the rankings. It is a classic measure used in various ranking systems as a surrogate for teaching commitment, but QS has admitted that it is less than satisfactory.[16]

Citations per faculty

Citations of published research are among the most widely used inputs to national and global university rankings. The QS World University Rankings used citation data from Thomson (now Thomson Reuters) from 2004 to 2007, and have since used data from Scopus, part of Elsevier. The total number of citations over a five-year period is divided by the number of academic staff at a university to yield the score for this measure, which accounts for 20 per cent of a university's possible score in the Rankings.
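The computation just described is a simple ratio. As a sketch, with entirely hypothetical figures:

```python
def citations_per_faculty(total_citations_5yr: int, faculty_count: int) -> float:
    """Five-year citation total divided by the number of academic staff."""
    return total_citations_5yr / faculty_count

# Hypothetical institution: 150,000 citations over five years, 2,000 faculty.
print(citations_per_faculty(150_000, 2_000))  # 75.0
```

Because the denominator is head-count rather than paper-count, a university with many research-inactive staff is penalized even if its published papers are individually well cited.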

QS has explained that it uses this approach, rather than the citations-per-paper measure preferred by other systems, because it reduces the effect of biomedical science on the overall picture: biomedicine has a ferocious "publish or perish" culture. Instead, QS attempts to measure the density of research-active staff at each institution. Issues remain about the use of citations in ranking systems, however, especially the fact that the arts and humanities generate comparatively few citations.[17]

QS has conceded the presence of some data collection errors regarding citations per faculty in previous years' rankings.[18]

One interesting issue is the difference between the Scopus and Thomson Reuters databases. For major world universities, the two systems capture more or less the same publications and citations. For less mainstream institutions, Scopus includes more non-English-language and smaller-circulation journals. Because papers in those journals are less heavily cited, this can also mean fewer citations per paper for the universities that publish in them.[17] This area has been criticized for undermining universities that do not use English as their primary language:[19] publications in languages other than English are harder to find, and English, as the most internationalized language of scholarship, is the most commonly cited.

Recruiter review

This part of the ranking is obtained by a similar method to the Academic Peer Review, except that it samples recruiters who hire graduates on a global or significant national scale. The numbers are smaller – 16,875 responses from over 130 countries in the 2011 Rankings – and are used to produce 10 per cent of any university's possible score. This survey was introduced in 2005 in the belief that employers track graduate quality, making this a barometer of teaching quality, a famously problematic thing to measure. University standing here is of special interest to potential students.[20]

International orientation

The final ten per cent of a university's possible score is derived from measures intended to capture its internationalism: five per cent from its percentage of international students, and another five per cent from its percentage of international staff. This is of interest partly because it shows whether a university is putting effort into being global, but also because it indicates whether the institution is taken seriously enough by students and academics around the world for them to want to be there.[21]

Commentary

Reception

Several universities in the UK and the Asia-Pacific region have commented on the rankings positively. Professor Judith Kinnear, Vice-Chancellor of New Zealand's Massey University, called the Times Higher Education-QS ranking a "wonderful external acknowledgement of several University attributes, including the quality of its research, research training, teaching and employability." She said the rankings are a true measure of a university's ability to fly high internationally: "The Times Higher Education ranking provides a rather more sophisticated, robust and well rounded measure of international and national ranking than either New Zealand's Performance Based Research Fund (PBRF) measure or the Shanghai rankings."[22] In September 2012 the British newspaper The Independent described the QS World University Rankings as "widely recognised throughout higher education as the most trusted international tables".[23]

Martin Ince,[9] chair of the Advisory Board for the Rankings, points out that their volatility has been reduced since 2007 by the introduction of the Z-score calculation method, and that the quality of QS's data gathering has improved over time, reducing anomalies. In addition, the academic and employer reviews are now so large that even modestly ranked universities receive a statistically valid number of votes. QS has published extensive data[24] on who the respondents are, where they are based, and the subjects and industries to which the academics and employers respectively belong.
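The Z-score method mentioned above is the standard statistical transform: each raw indicator value is expressed in standard deviations from the mean, which compresses extreme outliers and so dampens year-to-year swings. A minimal sketch, with hypothetical raw scores (the exact QS normalization details are not given in this article):

```python
import statistics

def z_scores(values):
    """Standardize raw values: (x - mean) / population standard deviation."""
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)
    return [(v - mean) / sd for v in values]

# Hypothetical raw reputation scores for five universities:
raw = [95.0, 80.0, 75.0, 60.0, 40.0]
print([round(z, 2) for z in z_scores(raw)])  # [1.34, 0.53, 0.27, -0.53, -1.6]
```

Note the effect Alex Usher describes later in this article: after standardization, the gap between the very top score and the next tier shrinks relative to the raw-score gap, which is why the 2007 switch "shook things up at the top-end".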

Criticisms

Much of the criticism concerns the use, or misuse, of survey data.

Since the split from Times Higher Education, further concerns about the methodology QS uses for its rankings have been raised by several experts. Simon Marginson, professor of higher education at the University of Melbourne and a member of the THE editorial board, wrote in "Improving Latin American universities' global ranking" (University World News, 10 June 2012): "I will not discuss the QS ranking because the methodology is not sufficiently robust to provide data valid as social science."[25]

In an article for the New Statesman entitled "The QS World University Rankings are a load of old baloney", David Blanchflower, a leading labour economist, said: "This ranking is complete rubbish and nobody should place any credence in it. The results are based on an entirely flawed methodology that underweights the quality of research and overweights fluff... The QS is a flawed index and should be ignored."[26]

In an article titled The Globalisation of College and University Rankings and appearing in the January/February 2012 issue of Change magazine, Philip Altbach, professor of higher education at Boston College and also a member of the THE editorial board, said: "The QS World University Rankings are the most problematical. From the beginning, the QS has relied on reputational indicators for half of its analysis … it probably accounts for the significant variability in the QS rankings over the years. In addition, QS queries employers, introducing even more variability and unreliability into the mix. Whether the QS rankings should be taken seriously by the higher education community is questionable."[27]

The QS World University Rankings have been criticised by many for placing too much emphasis on peer review, which receives 40 per cent of the overall score. Some have also expressed concern about the manner in which the peer review has been carried out.[28] In a report,[29] Peter Wills from the University of Auckland, New Zealand wrote of the Times Higher Education-QS World University Rankings:


But we note also that this survey establishes its rankings by appealing to university staff, even offering financial enticements to participate (see Appendix II). Staff are likely to feel it is in their greatest interest to rank their own institution more highly than others. This means the results of the survey and any apparent change in ranking are highly questionable, and that a high ranking has no real intrinsic value in any case. We are vehemently opposed to the evaluation of the University according to the outcome of such PR competitions.

QS points out that no survey participant, academic or employer, has been offered a financial incentive to respond, and that academics cannot vote for their own institution.

Although THES-QS introduced several changes in methodology in 2007 aimed at addressing these criticisms,[30] the ranking has continued to attract criticism. In an article[31] in the peer-reviewed journal BMC Medicine, several scientists from the US and Greece pointed out:


If properly performed, most scientists would consider peer review to have very good construct validity; many may even consider it the gold standard for appraising excellence. However, even peers need some standardized input data to peer review. The Times simply asks each expert to list the 30 universities they regard as top institutions of their area without offering input data on any performance indicators. Research products may occasionally be more visible to outsiders, but it is unlikely that any expert possesses a global view of the inner workings of teaching at institutions worldwide. Moreover, the expert selection process of The Times is entirely unclear. The survey response rate among the selected experts was only <1% in 2006 (1,600 of 190,000 contacted). In the absence of any guarantee for protection from selection biases, measurement validity can be very problematic.

Alex Usher, vice president of Higher Education Strategy Associates in Canada, commented:


Most people in the rankings business think that the main problem with The Times is the opaque way it constructs its sample for its reputational rankings - a not-unimportant question given that reputation makes up 50% of the sample. Moreover, this year's switch from using raw reputation scores to using normalized Z-scores has really shaken things up at the top-end of the rankings by reducing the advantage held by really top universities - University of British Columbia (UBC) for instance, is now functionally equivalent to Harvard in the Peer Review score, which, no disrespect to UBC, is ludicrous. I'll be honest and say that at the moment the THES Rankings are an inferior product to the Shanghai Jiao Tong's Academic Ranking of World Universities.

Academicians have also been critical of the use of the citation database, arguing that it undervalues institutions which excel in the social sciences. Ian Diamond, former chief executive of the Economic and Social Research Council and now vice-chancellor of the University of Aberdeen and a member of the THE editorial board, wrote to Times Higher Education in 2007, saying:[32]


The use of a citation database must have an impact because such databases do not have as wide a cover of the social sciences (or arts and humanities) as the natural sciences. Hence the low position of the London School of Economics, caused primarily by its citations score, is a result not of the output of an outstanding institution but the database and the fact that the LSE does not have the counterweight of a large natural science base.

The most recent criticism of the old system came from Fred L. Bookstein, Horst Seidler, Martin Fieder and Georg Winckler, writing in the journal Scientometrics on the unreliability of QS's methods:


Several individual indicators from the Times Higher Education Survey (THES) data base – the overall score, the reported staff-to-student ratio, and the peer ratings – demonstrate unacceptably high fluctuation from year to year. The inappropriateness of the summary tabulations for assessing the majority of the "top 200" universities would be apparent purely for reason of this obvious statistical instability regardless of other grounds of criticism. There are far too many anomalies in the change scores of the various indices for them to be of use in the course of university management.[6]

The QS subject rankings have been dismissed as unreliable by some critics, including most notably Brian Leiter, who points out that programmes which are known to be high quality, and which rank highly in the Blackwell rankings (e.g., the University of Pittsburgh) fare poorly in the QS ranking for reasons that are not at all clear.[33]

Results

QS World University Rankings — Top 50[note 1]

Institution (Country) | 2010/11[34] | 2011/12[35] | 2012/13[36] | 2013/14[37] | 2014/15[38] | 2015/16[39]
Massachusetts Institute of Technology (United States) | 5 | 3 | 1 | 1 | 1 | 1
Harvard University (United States) | 2 | 2 | 3 | 2 | 4 | 2
Stanford University (United States) | 13 | 11 | 15 | 7 | 7 | 3
University of Cambridge (United Kingdom) | 1 | 1 | 2 | 3 | 2 | 3
California Institute of Technology (United States) | 9 | 12 | 10 | 10 | 8 | 5
University of Oxford (United Kingdom) | 6 | 5 | 5 | 6 | 5 | 6
University College London (United Kingdom) | 4 | 7 | 4 | 4 | 5 | 7
Imperial College London (United Kingdom) | 7 | 6 | 6 | 5 | 2 | 8
Swiss Federal Institute of Technology in Zurich (Switzerland) | 18 | 18 | 13 | 12 | 12 | 9
University of Chicago (United States) | 8 | 8 | 8 | 9 | 11 | 10
Princeton University (United States) | 10 | 13 | 9 | 10 | 9 | 11
National University of Singapore (Singapore) | 31 | 28 | 25 | 24 | 22 | 12
Nanyang Technological University (Singapore) | 74 | 58 | 47 | 41 | 39 | 13
Swiss Federal Institute of Technology in Lausanne (Switzerland) | 32 | 35 | 29 | 19 | 17 | 14
Yale University (United States) | 3 | 4 | 7 | 8 | 10 | 15
The Johns Hopkins University (United States) | 17 | 16 | 16 | 16 | 14 | 16
Cornell University (United States) | 16 | 15 | 14 | 15 | 19 | 17
University of Pennsylvania (United States) | 12 | 9 | 12 | 13 | 13 | 18
King's College London (United Kingdom) | 21 | 27 | 26 | 19 | 16 | 19
Australian National University (Australia) | 20 | 26 | 24 | 27 | 25 | 20
University of Edinburgh (United Kingdom) | 22 | 20 | 21 | 17 | 17 | 21
Columbia University (United States) | 11 | 10 | 11 | 14 | 14 | 22
Ecole Normale Supérieure (France) | 33 | 33 | 34 | 28 | 24 | 23
McGill University (Canada) | 19 | 17 | 18 | 21 | 21 | 24
Tsinghua University (China) | 54 | 47 | 48 | 48 | 47 | 25
University of California, Berkeley (United States) | 28 | 21 | 22 | 25 | 27 | 26
University of California, Los Angeles (United States) | 35 | 34 | 31 | 40 | 37 | 27
The Hong Kong University of Science and Technology (Hong Kong) | 40 | 40 | 33 | 34 | 40 | 28
Duke University (United States) | 14 | 19 | 20 | 23 | 25 | 29
The University of Hong Kong (Hong Kong) | 23 | 22 | 23 | 26 | 28 | 30
University of Michigan (United States) | 15 | 14 | 17 | 22 | 23 | 30
Northwestern University (United States) | 26 | 24 | 27 | 29 | 34 | 32
The University of Manchester (United Kingdom) | 30 | 29 | 32 | 33 | 30 | 33
University of Toronto (Canada) | 29 | 23 | 19 | 17 | 20 | 34
London School of Economics and Political Science (United Kingdom) | -- | -- | 69 | 68 | 71 | 35
Seoul National University (South Korea) | 50 | 42 | 37 | 35 | 31 | 36
University of Bristol (United Kingdom) | 27 | 30 | 28 | 30 | 29 | 37
Kyoto University (Japan) | 25 | 32 | 35 | 35 | 36 | 38
The University of Tokyo (Japan) | 24 | 25 | 30 | 32 | 31 | 39
Ecole Polytechnique (France) | 36 | 36 | 41 | 41 | 35 | 40
Peking University (China) | 47 | 46 | 44 | 46 | 57 | 41
The University of Melbourne (Australia) | 38 | 31 | 36 | 31 | 33 | 42
Korean Advanced Institute of Science and Technology (South Korea) | -- | -- | 63 | 60 | 51 | 43
University of California, San Diego (United States) | -- | -- | 70 | 63 | 59 | 44
The University of Sydney (Australia) | 37 | 38 | 39 | 38 | 37 | 45
University of New South Wales (Australia) | 39 | 39 | 52 | 52 | 48 | 46
The University of Queensland (Australia) | 43 | 48 | 46 | 43 | 43 | 46
The University of Warwick (United Kingdom) | -- | -- | -- | 58 | 64 | 48
Brown University (United States) | 39 | 39 | 42 | 47 | 52 | 49
University of British Columbia (Canada) | 44 | 51 | 45 | 49 | 43 | 50
QS also releases an annual QS Top 50 under 50 list, ranking universities that have been established for no more than 50 years. These institutions are judged on their positions in the overall table of the previous year.[40]

Faculties and subjects

QS also ranks universities by academic discipline, organized into five faculty areas: Arts & Humanities, Engineering & Technology, Life Sciences & Medicine, Natural Sciences, and Social Sciences & Management. These annual rankings are drawn up on the basis of academic opinion, recruiter opinion and citations.

Categories of QS World University Rankings by Faculty and Subject[41]

Arts & Humanities: Arts & Design; English Language & Literature; History; Linguistics; Modern Languages; Philosophy
Engineering & Technology: Architecture; Chemical Engineering; Civil & Structural Engineering; Computer Science; Electrical & Electronic Engineering; Mechanical, Aeronautical & Manufacturing; Surveying
Life Sciences & Medicine: Agriculture & Forestry; Biological Sciences; Dentistry; Medicine; Pharmacy & Pharmacology; Psychology; Veterinary Science
Natural Sciences:[note 2] Physics & Astronomy; Mathematics; Environmental Sciences; Earth & Marine Sciences; Chemistry; Materials Sciences; Geography
Social Sciences & Management: Accounting & Finance; Business & Management; Communication & Media Studies; Development Studies; Economics & Econometrics; Education; Law; Politics & International Studies; Sociology; Statistics

Regional rankings

Asia

In 2009, QS launched the QS Asian University Rankings or QS University Rankings: Asia in partnership with The Chosun Ilbo newspaper in Korea to rank universities in Asia independently.

These rankings use some of the same criteria as the world rankings, but with changed weightings and new criteria, such as the number of incoming and outgoing exchange students. As a result, the QS World University Rankings and the QS Asian University Rankings released in the same academic year can differ.[1] For example, The University of Hong Kong, placed 22nd and 23rd worldwide, was the best-ranked Asian institution in the QS World University Rankings of 2011 and 2012,[35][36] while The Hong Kong University of Science and Technology topped the QS Asian University Rankings in the same years.[42][43]

QS University Rankings: Asia — Top 50[note 1]

Institution (Country) | 2009[44] | 2010[45] | 2011[43] | 2012[42] | 2013[46] | 2014[47] | 2015[48]
National University of Singapore (Singapore) | 10 | 3 | 3 | 2 | 2 | 1 | 1
The University of Hong Kong (Hong Kong) | 1 | 1 | 2 | 3 | 2 | 3 | 2
Korea Advanced Institute of Science and Technology (South Korea) | 7 | 13 | 11 | 7 | 6 | 2 | 3
Nanyang Technological University (Singapore) | 14 | 18 | 17 | 17 | 10 | 7 | 4
The Hong Kong University of Science and Technology (Hong Kong) | 4 | 2 | 1 | 1 | 1 | 5 | 5
The Chinese University of Hong Kong (Hong Kong) | 2 | 4 | 5 | 5 | 7 | 6 | 6
Peking University (China) | 10 | 12 | 13 | 6 | 5 | 8 | 7
Seoul National University (South Korea) | 8 | 6 | 6 | 4 | 4 | 4 | 8
The City University of Hong Kong (Hong Kong) | 18 | 15 | 15 | 12 | 12 | 11 | 9
Pohang University of Science and Technology (South Korea) | 17 | 14 | 12 | 9 | 7 | 9 | 10

Latin America

The QS Latin American University Rankings or QS University Rankings: Latin America were launched in 2011. They use academic opinion (30%), employer opinion (20%), publications per faculty member, citations per paper, academic staff with a PhD, faculty/student ratio and web visibility (10 per cent each) as measures.[49]

BRICS

This set of rankings uses eight indicators to select the top 100 higher education institutions in the BRICS countries. Institutions in Hong Kong, Macau and Taiwan are not ranked here, so only mainland Chinese institutions represent China.

QS University Rankings: BRICS — Top 10[note 1]

Institution (Country) | 2013[50] | 2014[51] | 2015[52]
Tsinghua University (China) | 1 | 1 | 1
Peking University (China) | 2 | 2 | 2
Fudan University (China) | 4 | 5 | 3
Lomonosov Moscow State University (Russia) | 3 | 3 | 4
Indian Institute of Science Bangalore (India) | - | - | 5
Shanghai Jiao Tong University (China) | 6 | 8 | 6
University of Science and Technology of China (China) | 6 | 4 | 6
Nanjing University (China) | 5 | 6 | 8
Universidade de São Paulo (Brazil) | 8 | 7 | 9
Beijing Normal University (China) | 12 | 14 | 10

QS Stars

QS also offers universities a way of examining their own strengths and weaknesses in depth. Called QS Stars, this service is separate from the QS World University Rankings. It involves a detailed look at a range of functions that mark out a modern university. Universities can receive from one to five stars, or Five Star Plus for the truly exceptional.

QS Stars ratings are derived from scores on eight criteria. They are:

  • Research Quality
  • Teaching Quality
  • Graduate Employability
  • University Infrastructure
  • Internationalisation
  • Innovation and knowledge transfer
  • Third mission activity, measuring areas of social and civic engagement
  • Special criteria for specific subjects

Stars is an evaluation system, not a ranking. About 100 institutions had opted for the Stars evaluation as of early 2013. In 2012, fees to participate in the program were $9,850 for the initial audit plus an annual license fee of $6,850.[53]

Notes

  1. 1.0 1.1 1.2 Order shown in accordance with the latest result.
  2. The term "Natural Sciences" here actually refers to physical sciences since life sciences are also a branch of natural sciences.

References

  1. (reference unavailable)
  2. (reference unavailable)
  3. (reference unavailable)
  4. (reference unavailable)
  5. (reference unavailable)
  6. (reference unavailable)
  7. Lambert Review of Business-University Collaboration.
  8. Wildavsky, Ben. The Great Brain Race: How Global Universities are Reshaping the World. Princeton University Press, 2010.
  9. (reference unavailable)
  10. (reference unavailable)
  11. (reference unavailable)
  12. (reference unavailable)
  13. (reference unavailable)
  14. (reference unavailable)
  15. (reference unavailable)
  16. "Faculty Student Ratio". QS Intelligence Unit, iu.qs.com. Retrieved 12 August 2013.
  17. "Citations per Faculty". QS Intelligence Unit, iu.qs.com. Retrieved 12 August 2013.
  18. (reference unavailable)
  19. "Global university rankings and their impact". European University Association. Retrieved 3 September 2012.
  20. "Employer Reputation". QS Intelligence Unit, iu.qs.com. Retrieved 12 August 2013.
  21. "International Indicators". QS Intelligence Unit, iu.qs.com. Retrieved 12 August 2013.
  22. "Flying high internationally".[dead link]
  23. (reference unavailable)
  24. (reference unavailable)
  25. (reference unavailable)
  26. (reference unavailable)
  27. (reference unavailable)
  28. (reference unavailable)
  29. Wills, Peter. Response to Review of Strategic Plan.
  30. Sowter, Ben (1 November 2007). "THES – QS World University Rankings 2007 – Basic explanation of key enhancements in methodology for 2007".
  31. (reference unavailable)
  32. (reference unavailable)
  33. "Guardian and 'QS Rankings' Definitively Prove the Existence of the 'Halo Effect'". Leiter Reports: A Philosophy Blog, leiterreports.typepad.com (5 June 2011). Retrieved 12 August 2013.
  34. (reference unavailable)
  35. (reference unavailable)
  36. (reference unavailable)
  37. (reference unavailable)
  38. (reference unavailable)
  39. (reference unavailable)
  40. (reference unavailable)
  41. (reference unavailable)
  42. (reference unavailable)
  43. (reference unavailable)
  44. (reference unavailable)
  45. (reference unavailable)
  46. (reference unavailable)
  47. (reference unavailable)
  48. (reference unavailable)
  49. (reference unavailable)
  50. (reference unavailable)
  51. (reference unavailable)
  52. (reference unavailable)
  53. (reference unavailable)
