The Student Room Group

Reply 580
I've always preferred Cambridge to Oxford. Since Cambridge's science heritage is so strong, people have to regard Oxford as better in the arts for 'balance'.

Yet if you look at Cambridge's arts alumni over the centuries, they're arguably better than Oxford's, particularly in terms of groundbreaking changes like literary criticism. Oxford has had more well-known authors, but Cambridge has had more well-known poets and actors. Of course, Oxford is definitely regarded as better than Cambridge for Politics, but for every other arts subject it's about equal, or Cambridge is ahead in some cases. Oxford does have a more famous dictionary in its favour, though. But Cambridge is a grander, prettier-feeling place.

But Oxford - Alice in Wonderland and George Orwell's Animal Farm rock, so feel proud of that.
Reply 581
bodybuilder22
only by default; only place that offers the course.

come at me bro



Only it's not though, so.
Reply 582
I am a sixth form student in Cambridge and I love both the city and the uni. Now no one can doubt that Cambridge is the world's greatest university.

I am applying this year to St Catharine's College to read Law, but I'm really worried that the publication of this world ranking will mean more competition than ever before.

PS: I got 9 A*s at IGCSE and my A2 predictions are A*A*AAA.
Reply 583
Its strongest UG subject, maths, isn't even the best UG course in the world. :QED:
Reply 584
The league tables in general are hard to follow; after Oxbridge the table goes haywire, it's ******* useless. There's so much change year on year. I'm not even talking about a few places here and there :mob:
boooooo

oxford ftw :smile:
Reply 586
devan5
The league tables in general are hard to follow; after Oxbridge the table goes haywire, it's ******* useless. There's so much change year on year. I'm not even talking about a few places here and there :mob:



That is the power of statistics, and that is why statistics is so fascinating XD

As I made very clear at the beginning, the purpose here is to discuss the methodology, not which university is better or worse.

This is the official link for QS 2010, where the detailed methodology is given.

http://www.topuniversities.com/university-rankings/world-university-rankings/home

Shall we discuss the methodology rather than any university in particular?
Reply 587
lofter54
I am a sixth form student in Cambridge and I love both the city and the uni. Now no one can doubt that Cambridge is the world's greatest university.

I am applying this year to St Catharine's College to read Law, but I'm really worried that the publication of this world ranking will mean more competition than ever before.

PS: I got 9 A*s at IGCSE and my A2 predictions are A*A*AAA.


Hey, how was your Narnia holiday this summer?

This ranking shows that while St Andrews and Durham may do well in domestic rankings, with their small size and high satisfaction, when it comes to worldwide academic reputation, actual research power, and the ability to compete with the US now and in the future, they fall behind places like Manchester, Birmingham, and Nottingham, which rank worse domestically. Warwick and Bristol are clearly good at both.
Reply 589
Many people (particularly GCSE and A-level students) lack the very basic statistical knowledge and even INTELLIGENCE to understand university rankings, and talk stupid RUBBISH like "Warwick is top five in the UK but not top 10 or even top 20 in the world, so the world ranking is scandalous or flawed". The following is a simple summary of the methodologies used by the different newspapers and by ARWU and QS. Hopefully it gives those people some basic statistical sense for understanding rankings.

The Guardian:

1. Teaching quality - as rated by graduates of the course (10%) (data source: the National Student Survey)
2. Feedback - as rated by graduates of the course (5%)
3. Spending per student (17%)
4. Staff/student ratio (17%)
5. Job prospects (17%) (data source: DLHE)
6. Value added (17%)
7. Entry score (17%)

The Times:

1. Student Satisfaction (data source: the National Student Survey 2005)
2. Research (data source: 2008 Research Assessment Exercise)
3. Entry Standards - Average UCAS tariff score (data source: Higher Education Statistics Agency)
4. The student-staff ratio (data source: Higher Education Statistics Agency)
5. Library and Computing spending - Average expenditure per student (data source: Higher Education Statistics Agency)
6. Facilities spending - Average expenditure per student on sports, careers services, health and counselling
7. Good Honours - Percentage of students graduating with a good degree, 'good' being defined as a first or 2.1
8. Graduate prospects – Percentage of UK graduates in graduate employment or further study (data source: HESA's survey of Destination of Leavers from Higher Education (DLHE))
9. Completion - Percentage of students who manage to complete their degree.

The Sunday Times

1. Student satisfaction (200 points) - The results of national student surveys (NSS) are scored taking a theoretical minimum and maximum score of 50% and 90% respectively (data source: the National Student Survey)
2. Teaching excellence (50) - Excellence is defined as: subjects scoring at least 22/24 points, those ranked excellent, or those undertaken more recently in which there is confidence in academic standards and in which teaching and learning, student progression and learning resources have all been ranked commendable (data source: Quality Assurance Agency; Scottish Higher Education Funding Council; Higher Education Funding Council for Wales)
3. Heads’/peer assessments (100) - Heads are asked to identify the highest-quality undergraduate provision (data source: The Sunday Times heads’ survey and peer assessment)
4. Research quality (200) - Based upon the most recent Research Assessment Exercise (data source: Higher Education Funding Council for England (Hefce))
5. A-level/Higher points (250) - Nationally audited data for the subsequent academic year are used for league table calculations (data source: Higher Education Statistics Agency)
6. Unemployment (100) - The number of students assumed to be unemployed six months after graduation is calculated as a percentage of the total number of known destinations (data source: HESA, Destinations of Leavers from Higher Education)
7. Firsts/2:1s awarded (100) - The percentage of students who graduate with firsts or 2:1 degrees. Unclassified degrees are excluded (data source: HESA)
8. Student/staff ratio (100) - Student/staff ratio calculated by HESA by institution (a ratio of 10:1 is a benchmark for excellence, worthy of 100 points) (data source: HESA)
9. Dropout rate (variable) - The number of students who drop out before completing their courses is compared with the number expected to do so (the benchmark figure shown in brackets) (data source: Hefce, Performance Indicators in Higher Education)

The Independent

1. Student satisfaction - measure of the view of students of the teaching quality at the university (data source: the National Student Survey)
2. Research assessment/quality – measure of the average quality of the research undertaken in the university (data source: 2008 Research Assessment Exercise)
3. Entry standards - the average UCAS tariff score of new students under the age of 21 (data source: HESA data for 2008–09)
4. Student:staff ratio - measure of the average staffing level in the university (data source: HESA data for 2008–09)
5. Academic Services spend - the expenditure per student on all academic services (data source: HESA data for 2006–07, 2007–08 and 2008–09)
6. Facilities spend - the expenditure per student on staff and student facilities (data source: HESA data 2006–07, 2007–08 and 2008–09)
7. Good honours - proportion of firsts and upper seconds (data source: HESA data for 2008–09)
8. Graduate prospects - measure of the employability of a university's graduates (data source: HESA data for 2007–08)
9. Completion – measure of the completion rate of those studying at the university (data source: HESA performances indicators, based on data for 2008–09 and earlier years)
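
To make the arithmetic behind these newspaper tables concrete, here is a minimal sketch of how a weighted composite score is computed. The weights follow the Guardian's published scheme above; the indicator values, the sector-wide bounds, and the min-max normalisation are illustrative assumptions, not any paper's actual implementation.

```python
# Sketch of how a league table turns indicators into one overall score.
# Weights follow the Guardian's published scheme above; the raw values
# and the min-max normalisation are illustrative only.

WEIGHTS = {
    "teaching": 0.10, "feedback": 0.05, "spend": 0.17,
    "ssr": 0.17, "jobs": 0.17, "value_added": 0.17, "entry": 0.17,
}

def normalise(value, lo, hi):
    """Rescale a raw indicator onto 0-1 using sector-wide min and max."""
    return (value - lo) / (hi - lo)

def composite(raw, bounds):
    """Weighted sum of normalised indicators -> overall score out of 100."""
    total = 0.0
    for name, weight in WEIGHTS.items():
        lo, hi = bounds[name]
        total += weight * normalise(raw[name], lo, hi)
    return round(100 * total, 1)

# Hypothetical university, with made-up sector min/max for each indicator.
bounds = {name: (0, 100) for name in WEIGHTS}
uni = {"teaching": 88, "feedback": 70, "spend": 60,
       "ssr": 75, "jobs": 80, "value_added": 65, "entry": 90}
print(composite(uni, bounds))
```

The point of the sketch is that the final rank depends as much on the chosen weights and normalisation as on the underlying data, which is why the same university can sit in very different positions across tables.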
Reply 590
Now the following are the methodologies used by the global rankings. Hopefully, people can have the basic intelligence to appreciate the differences between national and global rankings.

Academic Ranking of World Universities

The ranking compared 1200 higher education institutions worldwide according to a formula that took into account

1. alumni winning Nobel Prizes and Fields Medals (10 percent)
2. staff winning Nobel Prizes and Fields Medals (20 percent)
3. highly-cited researchers in 21 broad subject categories (20 percent)
4. articles published in Nature and Science (20 percent)
5. the Science Citation Index and Social Sciences Citation Index (20 percent)
6. the per capita academic performance (on the indicators above) of an institution (10 percent).

QS World University Ranking

1. Academic Peer Review: weighting 40 per cent
2. Recruiter Review: weighting 10 per cent
3. Faculty Student Ratio: weighting 20 per cent
4. Citations per Faculty: weighting 20 per cent
5. International Orientation: weighting 10 per cent

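
The same weighted-sum idea applies to QS's five indicators. A minimal sketch, assuming each indicator has already been scaled to 0-100 as in the published tables (the institution's scores here are invented):

```python
# Weighted aggregation of the five QS indicators listed above.
# Assumes each indicator score is already on a 0-100 scale, as in
# the published tables; the example scores are made up.

QS_WEIGHTS = [
    ("academic_peer_review",  0.40),
    ("recruiter_review",      0.10),
    ("faculty_student",       0.20),
    ("citations_per_faculty", 0.20),
    ("international",         0.10),
]

def qs_overall(scores):
    """Overall score: weighted mean of the five indicator scores."""
    return round(sum(w * scores[name] for name, w in QS_WEIGHTS), 1)

# A large research university might score high on peer review and
# citations but lower on staff ratio; the weights decide the outcome.
example = {"academic_peer_review": 100, "recruiter_review": 90,
           "faculty_student": 60, "citations_per_faculty": 95,
           "international": 70}
print(qs_overall(example))
```

Note how the 40 per cent peer-review weight dominates: a university can lose heavily on staff ratio and still keep a high overall score.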
Part of the problem with university league tables is that it is not straightforward to compare like with like, as universities can validate their own qualifications in a way other educational establishments (e.g. schools and colleges) cannot.

National exam boards, for all their flaws, at least provide some level of objectivity in assessment, in that for each subject all candidates take similar exams. The same cannot be said when comparing institutions subject by subject. In any case, a lot of the newer universities have more course titles than I knew existed, making comparisons even harder.

Secondly, different people will attach different weightings to different aspects of universities. I'm surprised that no one in the statto world has developed a model that would let prospective students work out where their universities should rank given all the variables listed in real9999's post at http://www.thestudentroom.co.uk/showpost.php?p=27572691&postcount=600
Reply 592
danny111
You think that's bad, check out UC Berkeley. They are in the top 5 for ALL 5 individual subject rankings and yet they are 28 I think overall.


...

That is simply because UC Berkeley is one of the biggest of the XL (extra large) universities in the QS ranking......

China is the world's second-richest country by total GDP but is classified as a lower-middle-income country... The same reason.
Reply 593
Tarutaru
...

That is simply because UC Berkeley is one of the biggest of the XL (extra large) universities in the QS ranking......

China is the world's second-richest country by total GDP but is classified as a lower-middle-income country... The same reason.


So per academic?

Seems silly to me though.
Reply 594
danny111
So per academic?

Seems silly to me though.


Oh yes XD, I know you are certainly more intelligent than the people who made this table XD

Why not just spend some time studying their methodology before posting here?
Reply 595
danny111
So per academic?

Seems silly to me though.


Very simple logic, and I hope you can understand the following example.

A faculty with 10 members, 5 of whom have Nobel Prizes.

A faculty with 100 members, 10 of whom have Nobel Prizes.

So, question... Which faculty do you think should be ranked higher, based on these two sentences alone?
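
The per-capita point in that example is just a ratio, which a couple of lines make explicit:

```python
# Per-capita comparison from the example above: raw counts favour
# the big faculty, but the rate per member favours the small one.

faculties = {"small": (5, 10), "large": (10, 100)}  # (laureates, members)

for name, (prizes, members) in faculties.items():
    print(name, prizes / members)

# small: 5/10  = 0.5 laureates per member
# large: 10/100 = 0.1 laureates per member
# On a size-adjusted (per-capita) indicator the small faculty ranks
# higher, even though the large one has more laureates in total.
```

This is the size adjustment being discussed: unadjusted counts reward sheer scale, while per-capita indicators reward intensity.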
Reply 596
real9999
Many people (particularly GCSE and A-level students) lack the very basic statistical knowledge and even INTELLIGENCE to understand university rankings, and talk stupid RUBBISH like "Warwick is top five in the UK but not top 10 or even top 20 in the world, so the world ranking is scandalous or flawed". The following is a simple summary of the methodologies used by the different newspapers and by ARWU and QS. Hopefully it gives those people some basic statistical sense for understanding rankings.

[...]


Really don't understand why there are so many idiots (even at the most prestigious universities) who have so much faith in newspaper rankings, which are poorly designed and based on trivial ranking criteria O_o. Like QS, these newspaper tables give heavy weight to subjective opinions, even the subjective opinions of graduates of the very universities the tables are trying to rank.

A course with happier students (measured by satisfaction), higher completion rates (which may well be an indication of a low level of difficulty), higher computing spending etc. is not necessarily a good course :frown:

Perhaps world ranking tables should try to compare student happiness across countries XD
Reply 597
Tarutaru
Very simple logic, and I hope you can understand the following example.

A faculty with 10 members, 5 of whom have Nobel Prizes.

A faculty with 100 members, 10 of whom have Nobel Prizes.

So, question... Which faculty do you think should be ranked higher, based on these two sentences alone?


No.

Seems silly to differentiate subject rankings and overall rankings.

Surely for any given subject UCB has many more academics (if they have more overall, I assume it spreads somewhat evenly across departments), and if you applied the same criteria for subjects as for the overall table, they would be lower in the subject rankings too, as you say.

So why the difference in criteria between individual areas and overall?
Student satisfaction, good honours and completion rate are stupid criteria for national tables. I'd look more at:

-Average starting salary
-% employed in PROPER graduate jobs
-Entry requirements (very good indicator of prestige)
-Some sort of spend per pupil
-Number of fellows of royal societies/nobel prizes etc
-Teaching assessment
-Research assessment
Reply 599
danny111
No.

Seems silly to differentiate subject rankings and overall rankings.

Surely for any given subject UCB has many more academics (if they have more overall, I assume it spreads somewhat evenly across departments), and if you applied the same criteria for subjects as for the overall table, they would be lower in the subject rankings too, as you say.

So why the difference in criteria between individual areas and overall?


Spend some time studying the Berlin Principles for ranking...

http://www.topuniversities.com/university-rankings/world-university-rankings/methodology/classifications

QS also publishes a simple analysis of the top 100 institutions in each of the five faculty-level areas mentioned above: natural sciences, technology, biology and medicine, social sciences and the arts and humanities. These five tables list universities in order of their Academic Peer Review score. They also give the citations per paper for each institution.

http://en.wikipedia.org/wiki/QS_World_University_Rankings

"QS does not aggregate these scores and has said that doing so would not produce a meaningful result. It uses citations per paper rather than per person partly because it does not hold details of the academic staff in each subject area, and partly because the number of citations per paper should be a consistent indicator of impact within a specific field."

http://www.topuniversities.com/articles/rankings-comment/understanding-qs-world-university-rankings%E2%84%A2-methodology



"Size – Rankings have been criticised for favouring large universities. The Shanghai Jiao Tong Academic Ranking (SJT) certainly favours large, well funded institutions with an emphasis on science, and most of its indicators are not adjusted for size. In QS’s World University Rankings, by contrast, all the hard data indicators are adjusted for size. The unprompted nature of our survey design also means that respondents have to think about the universities which they actively know produce great research, and employers have to actively think about the universities they seek to recruit from."

Take an introductory statistics course if you still have problems...
