The Student Room Group

Guardian 2016 League Table - A Joke?

http://www.theguardian.com/education/ng-interactive/2015/may/25/university-league-tables-2016#S230

I know the Guardian has had some odd league tables before, but I think this wins.

People base their university choices on this, what are they doing?

Just some odd things:

Cambridge: 4th

Bath: 11th (below Greenwich/Dundee?)

UCL 13th

Bristol: 27th (below a whole range of WTF universities)

LSE: 38th

Manchester Met: 47th
Manchester: 58th


Don't get me wrong here - I'm not annoyed because my university has been ranked low; I'm at Warwick, so I'll gladly take 2nd place! I still don't think it's right, though...

People who choose their universities based on these tables should be made very aware how dodgy they are.


The Guardian rely heavily on external research and perform a crude compilation of statistics. I personally believe their league tables are considerably inaccurate, and students wishing to seek out the best departments for their chosen subject should look elsewhere.
Student satisfaction makes up a large share of the ranking factors, so a uni having top researchers does not make it more likely to be at the top. In fact, research quality is nowhere to be seen in these tables, which makes sense because they are for undergrads. I assume you are a Maths student; you shouldn't be so shocked at the results. Check the metrics.
Cambridge and Manchester seem to be the biggest mistakes made in that table.
I would debate where Oxford / Warwick / Imperial stand in relation to one another for maths as much as the next person. Nonetheless, I don't see how Imperial or Warwick could possibly score higher than Cambridge for mathematics.

Keep in mind that last year the Guardian said St. Andrews was the best university for mathematics in the UK; this year, it would seem, St. Andrews is 5th. I'll probably be at Warwick next year, so yeah, I'd take 2nd place, but I wouldn't take it seriously given the state of that table.
(edited 8 years ago)
Reply 4
I reckon Cambridge is down at 4th for Maths due to a missing Spend/Student data point - which is odd, as they do have spend per student for other subjects (e.g. Engineering and Medicine are both 10, the highest of all unis).
I'm having déjà vu...

I'm sure I've already said in a thread about this that the value added score column is pointless.
"10. The value-added score compares students’ individual degree results with their entry qualifications, to show how effective the teaching is. It is given as a rating out of 10"

Tfw your uni is 44th.
Reply 6
Original post by rayquaza17
I'm having déjà vu...

I'm sure I've already said in a thread about this that the value added score column is pointless.
"10. The value-added score compares students’ individual degree results with their entry qualifications, to show how effective the teaching is. It is given as a rating out of 10"

Tfw your uni is 44th.


Value Add is a perfectly sensible measure. But you can always ask The Guardian about the method here:

University guide Q&A: put your questions to the compiler
http://gu.com/p/493dy

Edit to add info from their Q&A
"d - Value Added Scores

Based upon a sophisticated indexing methodology that tracks students from enrolment to graduation, qualifications upon entry are compared with the award that a student receives at the end of their studies. Each full time student is given a probability of achieving a 1st or 2:1, based on the qualifications that they enter with. If they manage to earn a good degree then they score points which reflect how difficult it was to do so (in fact, they score the reciprocal of the probability of getting a 1st or 2:1). Thus an institution that is adept at taking in students with low entry qualifications, which are generally more difficult to convert into a 1st or 2:1, will score highly in the value-added measure if the number of students getting a 1st or 2:1 exceeds expectations. At least 30 students must be in a subject for a meaningful Value Added score to be calculated using 2012/13 data alone. If there are more than 15 students in 2012/13 and the total number across 2011/12 and 2012/13 reaches 30, then a 2-year average is calculated. This option could only be exercised when the subjects were consistent in definition between the two years.

We always regard students who are awarded an integrated masters as having a positive outcome.

A variant of the Value Added score is used in the three medical subjects Medicine, Dentistry and Veterinary Science. This is because medical degrees are often unclassified. For this reason, unclassified degrees in medical subjects are regarded as positive but the scope of the study population is broadened to encompass students who failed to complete their degree and who would count negatively in the Value Added score."
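
The scoring rule quoted above (each student earns the reciprocal of their predicted probability of a 1st/2:1, but only if they actually achieve one) can be sketched as follows. The probabilities here are hypothetical illustration values, not Guardian data, and the function name is mine, not theirs.

```python
# Hedged sketch of the Guardian's value-added scoring rule as described
# in their Q&A. Illustration only: probabilities below are made up.

def value_added_points(p_good_degree: float, got_good_degree: bool) -> float:
    """Points earned by one student: the reciprocal of their predicted
    probability of a 1st/2:1, awarded only if they actually achieve one."""
    return 1.0 / p_good_degree if got_good_degree else 0.0

# A student judged 25% likely to get a 1st/2:1 who does so earns 4 points;
# a student judged 90% likely earns only ~1.11 points for the same outcome,
# so unis converting weak entrants into good degrees score highest.
print(value_added_points(0.25, True))   # 4.0
print(value_added_points(0.90, True))   # ~1.11
print(value_added_points(0.25, False))  # 0.0
```

This is why, as discussed in the thread, highly selective departments can struggle to "add" much: their intakes already have probabilities close to 1, so each good degree earns barely more than one point.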

Posted from TSR Mobile
(edited 8 years ago)
I just saw this table

How does Cambridge have the top-rated Medicine course?? The ratings for student satisfaction, teaching, and feedback are incredibly low... only around 50% for the feedback rating.
Original post by jneill
Value Add is a perfectly sensible measure. But you can always ask The Guardian about the method here:

University guide Q&A: put your questions to the compiler
http://gu.com/p/493dy

Edit to add info from their Q&A
"d - Value Added Scores

Based upon a sophisticated indexing methodology that tracks students from enrolment to graduation, qualifications upon entry are compared with the award that a student receives at the end of their studies. Each full time student is given a probability of achieving a 1st or 2:1, based on the qualifications that they enter with. If they manage to earn a good degree then they score points which reflect how difficult it was to do so (in fact, they score the reciprocal of the probability of getting a 1st or 2:1). Thus an institution that is adept at taking in students with low entry qualifications, which are generally more difficult to convert into a 1st or 2:1, will score highly in the value-added measure if the number of students getting a 1st or 2:1 exceeds expectations. At least 30 students must be in a subject for a meaningful Value Added score to be calculated using 2012/13 data alone. If there are more than 15 students in 2012/13 and the total number across 2011/12 and 2012/13 reaches 30, then a 2-year average is calculated. This option could only be exercised when the subjects were consistent in definition between the two years.

We always regard students who are awarded an integrated masters as having a positive outcome.

A variant of the Value Added score is used in the three medical subjects Medicine, Dentistry and Veterinary Science. This is because medical degrees are often unclassified. For this reason, unclassified degrees in medical subjects are regarded as positive but the scope of the study population is broadened to encompass students who failed to complete their degree and who would count negatively in the Value Added score."

Posted from TSR Mobile


I'm sorry, but I still don't see the relevance of value added score - does that mean that top maths universities (A*A*A+ entry requirements) can never get a proper value added score?
Original post by Juichiro
Student satisfaction makes up a lot of the ranking position factors so just because a uni has top researchers it does not make it more likely to be at the top. In fact, research quality is nowhere to be seen in these tables. Which makes sense because they are for undergrads. I assume you are a Maths student, you shouldn't be so shocked at the results. Check the metrics.


I guess I would probably rank universities on different criteria then... Although I agree research quality shouldn't affect tables for undergrads.

I would do away with value added score in an instant.
Original post by TheIrrational
I guess I would probably rank universities on different criteria then... Although I agree research quality shouldn't affect tables for undergrads.

I would do away with value added score in an instant.


Why shouldn't it? High research quality, roughly speaking, means better mathematicians, and I think it is better to be taught by the top rather than the average. I know not all of them teach, but still.
Reply 11
Original post by TheIrrational
I'm sorry, but I still don't see the relevance of value added score - does that mean that top maths universities (A*A*A+ entry requirements) can never get a proper value added score?


Yes, but they could "reduce" value if they take A*A*A students who then get a below-average number of 1sts/2:1s.

Oxbridge seem to be getting a VA of 5 in most subjects, so they are taking in very good students who leave with very good degrees. They are not "adding value", but not reducing value either.

If a uni accepts students with poor A-levels but they leave with a high number of 1sts, then it gets a VA of 10.

If it accepts good students but they leave with poor degrees, it gets a VA of 1.
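
The idea of beating or falling short of expectations can be made concrete by comparing a cohort's actual good degrees against the expected number implied by each student's entry-based probability. The Guardian's exact normalisation onto the 1-10 band isn't published in the thread, so this is only an assumed illustration (function names and numbers are mine):

```python
# Hedged sketch: over/under-performance relative to entry-based expectations.
# The mapping of this ratio onto the Guardian's 1-10 VA band is NOT shown
# here, since their normalisation isn't given in the thread.

def expected_good_degrees(probabilities):
    """Expected number of 1sts/2:1s, given each student's predicted
    probability of a good degree based on entry qualifications."""
    return sum(probabilities)

def performance_ratio(probabilities, actual_good):
    """> 1 means the cohort beat expectations; < 1 means it fell short."""
    return actual_good / expected_good_degrees(probabilities)

# A cohort of 40 weak-entry students, each 25% likely to get a good degree,
# is expected to produce 10. If 20 actually do, the uni doubled expectations.
print(performance_ratio([0.25] * 40, 20))  # 2.0
```

A highly selective intake (probabilities near 1) leaves almost no headroom above a ratio of 1, which matches the observation above that Oxbridge hover around a middling VA score.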

Posted from TSR Mobile
(edited 8 years ago)
Original post by TheIrrational
X


(This is from the main table)
Surrey 4th
Liverpool 59th below John Moores
Falmouth above King's and Bristol
Sussex, Kent, Coventry above Edinburgh, Manchester and Nottingham
Lancaster above UCL, LSE, Birmingham, Southampton

Can confirm: it's a joke.

Original post by CrimsonDucati
I just saw this table

How does Cambridge have the top-rated Medicine course?? The ratings for student satisfaction, teaching, and feedback are incredibly low... only around 50% for the feedback rating.


Based on the table criteria, yeah, it seems strange it's rated top. But ignoring the criteria, you would expect it to be near the top tbf.
(edited 8 years ago)
Original post by jneill
Yes, but they could "reduce" value if they take A*A*A students who then get a below-average number of 1sts/2:1s.

Oxbridge seem to be getting a VA of 5 in most subjects, so they are taking in very good students who leave with very good degrees. They are not "adding value", but not reducing value either.

If a uni accepts students with poor A-levels but they leave with a high number of 1sts, then it gets a VA of 10.

If it accepts good students but they leave with poor degrees, it gets a VA of 1.

Posted from TSR Mobile


But it's up to the uni what degree classification its students get.

I know it's extreme, but what's stopping a uni taking in, say, BBB students and giving them all a first?

The value-added score would only make sense if exams were standardised across all universities, like how everyone sits the same A-level exams.


Posted from TSR Mobile
Reply 14
Original post by rayquaza17
But it's up to the uni what degree classification its students get.

I know it's extreme, but what's stopping a uni taking in, say, BBB students and giving them all a first?

The value-added score would only make sense if exams were standardised across all universities, like how everyone sits the same A-level exams.


External moderation. Although it's by no means a perfect system at the moment.
(edited 8 years ago)
Original post by TheIrrational
I guess I would probably rank universities on different criteria then... Although I agree research quality shouldn't affect tables for undergrads.

I would do away with value added score in an instant.


Not sure about value added, but I would also use different criteria. Student satisfaction is a highly subjective metric and it does not deserve two columns imo.
Staggering to see my alma mater spanking its way up through the pack year on year. The big difference between this league table and the others is clearly the emphasis on employability and student satisfaction rather than research quality, which for an undergraduate is exactly what you should be looking at. Good on the Guardian for thinking a bit differently.
Reply 17
Original post by jneill
External moderation. Although it's by no means a perfect system at the moment.


The point of external moderation is not really to make sure universities are awarding classifications in line with the standard of student. If it were, it would be interesting to know why the University of Portsmouth awards nearly 50% firsts to its mathematics students when only 15% of them managed AAA+ at A-level, while Oxford/Cambridge award only 30% firsts when AAA wouldn't even get you in. That's not just an imperfect system, it's a broken one :lol:
Reply 18
Original post by Noble.
The point of external moderation is not really to make sure universities are awarding classifications in line with the standard of student. If it were, it would be interesting to know why the University of Portsmouth awards nearly 50% firsts to its mathematics students when only 15% of them managed AAA+ at A-level, while Oxford/Cambridge award only 30% firsts when AAA wouldn't even get you in. That's not just an imperfect system, it's a broken one :lol:


Yes true.
Original post by jneill
Yes, but they could "reduce" value if they take A*A*A students who then get a below-average number of 1sts/2:1s.

Oxbridge seem to be getting a VA of 5 in most subjects, so they are taking in very good students who leave with very good degrees. They are not "adding value", but not reducing value either.

If a uni accepts students with poor A-levels but they leave with a high number of 1sts, then it gets a VA of 10.

If it accepts good students but they leave with poor degrees, it gets a VA of 1.

Posted from TSR Mobile


But a first at Oxbridge is totally different to one at, say, Surrey. People fairly routinely leave Cambridge because it is too hard and go on to get firsts at other universities. They are not really comparable.

Mind you, I don't think Cambridge should be top of a league table for maths. Even for the majority of people there, the course is pitched at too high a level. The difficulty is not appropriate for an undergraduate university course.
