The Student Room Group

The Economist's British university rankings

The Economist has published an interesting new league table which ranks universities by the 'value added' to your career five years after graduation.

The Economist:

WHICH university is best? Most published rankings tell you a great deal about an institution’s reputation, or how often its researchers publish in top academic journals. However, they don’t necessarily answer the question that prospective students really care about: what they have to gain by attending. For example, it is possible that graduates from the London School of Economics do well in the job market because of the university’s superior teaching. However, it is also possible that they earn so much simply because they were bright to begin with, or because they tend to choose courses that lead to lucrative careers.

To estimate which universities make the biggest positive impact on their students, we have built a statistical model that estimates how much graduates from each university can expect to earn. It is based on how selective a university is, what subjects its students study, the share of students from lower-income areas, how many students enter after age 20, what share of students attended private schools and where the university is located. We then compare these estimates of expected earnings with the actual figures. This difference, we argue, measures the value added by the institution.

The table below displays the results. We list both the actual and expected earnings of students from each institution, and rank them by the gap between the two figures. For the technically inclined, we offer a fuller explanation of our methodology here.


Here is the top 10:
Economist top 10.png

You can also see how individual courses at universities fare. The most lucrative course was E&M at Oxford, where expected earnings are £47,605 five years after graduation but actual earnings are £71,700, giving a value added of £24,095. Some of the unexpectedly poorly performing universities include St Andrews and Durham (and the methodology controls for location, so that's not the reason).

Note: the actual earnings figures use the median rather than the mean, so as not to be skewed by the very high earners. That could be considered a weakness, as it may under-emphasise the performance of the richest graduates from elite universities.
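To see why the median resists that kind of skew, here's a quick illustration with made-up salary figures (invented numbers, nothing to do with the actual dataset):

```python
# Made-up salaries: six typical earners plus one very high earner.
salaries = [24_000, 26_000, 27_000, 28_000, 30_000, 31_000, 250_000]

mean = sum(salaries) / len(salaries)           # pulled up by the £250k outlier
median = sorted(salaries)[len(salaries) // 2]  # middle value, outlier-proof

print(f"mean:   £{mean:,.0f}")    # → £59,429
print(f"median: £{median:,.0f}")  # → £28,000
```

The mean lands nowhere near what a typical graduate earns, which is exactly why median figures get used for earnings data like this.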


Original post by ♥Samantha♥
The Economist has published an interesting new league table which ranks universities by the 'value added' to your career five years after graduation.

The Economist:


Here is the top 10:
Economist top 10.png

You can also see how individual courses at universities fare. The most lucrative course was E&M at Oxford, where expected earnings are £47,605 five years after graduation but actual earnings are £71,700, giving a value added of £24,095. Some of the unexpectedly poorly performing universities include St Andrews and Durham (and the methodology controls for location, so that's not the reason).

Note: the actual earnings figures use the median rather than the mean, so as not to be skewed by the very high earners. That could be considered a weakness, as it may under-emphasise the performance of the richest graduates from elite universities.


I can't see the point of the table. All it's doing is showing a small variation between however they decided on the average expected v average actual earnings.

Most people would still prefer to go to the uni that offered 100% greater actual earnings than a possible 5% increase on a low expected earning.
According to this table, taking Computer Science at Oxford Brookes has a value added of +£6,134, whereas taking Computer Science at the University of Oxford has a value added of -£5,061. I don't know, perhaps there is some truth to that, but it makes it difficult for me to take this seriously as a measure of value. There are a lot of other very strange figures that come up. I can also imagine that this might fluctuate significantly year-on-year and suffer from sample-size effects, particularly for individual subjects.
I can see the universities doing it already... getting their PR team to plaster this shiny new ranking all over their website and clearing advertisements.
Original post by 999tigger
I can't see the point of the table. All it's doing is showing a small variation between however they decided on the average expected v average actual earnings.

Most people would still prefer to go to the uni that offered 100% greater actual earnings than a possible 5% increase on a low expected earning.


Yeah, that was my first thought. No one's going to think "oh, let's go to Portsmouth instead of Imperial". But it is useful for comparing similar universities, i.e. universities with similar expected earnings. E.g. UCL and St Andrews have similar expected earnings, indicating they take in similar students, but those who go to UCL go on to earn about £5k more after 5 years, which could influence your decision.

At the end of the day it's no use seeing that Oxford and LSE have the highest salaries if you're not at the standard to get in; instead you can see at which universities at your level students seem to punch above their weight in terms of income. You can even compare Oxford and LSE: LSE students have higher expected salaries, but Oxford students perform relatively better because they outperform what their grades etc. suggest they should earn, which means if you put two of the exact same students in each, they should do better at Oxford.

Just looking at salary data only tells you where the average student at each university earns more, not where the same student would earn more, because the average student differs at each university, if that makes sense. Obviously that doesn't hold when comparing vastly different standards of university, but you're unlikely to be doing that anyway, because when you're applying you're usually deciding between similar places.

Also, it appears as a small variation mainly because the large variations in some courses are cancelled out by little variation in other courses, which obscures the results. It's a lot more interesting if you look at it on a course-by-course basis. In their methodology they discuss how some courses are pretty flat in their outcomes (uni choice does not much affect income), such as medicine/dentistry, architecture, nursing and education, whilst others are highly dependent on where you study, such as economics, law, business, computer science and maths.
Original post by Plagioclase
According to this table, taking Computer Science at Oxford Brookes has a value added of +£6,134, whereas taking Computer Science at the University of Oxford has a value added of -£5,061. I don't know, perhaps there is some truth to that, but it makes it difficult for me to take this seriously as a measure of value. There are a lot of other very strange figures that come up. I can also imagine that this might fluctuate significantly year-on-year and suffer from sample-size effects, particularly for individual subjects.


Yes, definitely. I think that as a ranking this is basically useless, but as a tool it is useful. I think it is better to compare similar universities. There's no point comparing Brookes and Oxford when the students are of a completely different standard. But for someone trying to compare Oxford Brookes and Robert Gordon, it would be useful to see that students from Oxford Brookes largely outperform what their backgrounds would suggest they would earn, whilst students from Robert Gordon do not. And yes, it may suffer from sample-size issues or random variations as it is only based on data from one year, so it's a work in progress. They hope to compile new data and add it each year.
Original post by Blue_Cow
I can see the universities doing it already... getting their PR team to plaster this shiny new ranking all over their website and clearing advertisements.


I know, right?
You see different universities claiming they are the best for student satisfaction according to some ranking absolutely no one has ever heard of.
I expect the results of this one to be coming soon in an annoying advertisement on TSR.
Original post by Blue_Cow
I can see the universities doing it already... getting their PR team to plaster this shiny new ranking all over their website and clearing advertisements.


Haha yeah soon on the Portsmouth website it will say "Ranked the best value university in the nation by The Economist"
Bottom of the table is St Andrews. Just think how much better Kate Middleton would have done if she had gone to Portsmouth and she could have saved money by living at home.
Original post by ♥Samantha♥
Yes, definitely. I think that as a ranking this is basically useless, but as a tool it is useful. I think it is better to compare similar universities. There's no point comparing Brookes and Oxford when the students are of a completely different standard. But for someone trying to compare Oxford Brookes and Robert Gordon, it would be useful to see that students from Oxford Brookes largely outperform what their backgrounds would suggest they would earn, whilst students from Robert Gordon do not. And yes, it may suffer from sample-size issues or random variations as it is only based on data from one year, so it's a work in progress. They hope to compile new data and add it each year.

Well, if you could do it properly...

I haven't read the methodology, but I think it'd be difficult to come up with a reasonable number for the expected earnings of an Oxbridge graduate if they hadn't gone to uni, because people comparable to Oxbridge graduates almost all go to a university in the end, even if it's not Oxbridge. These people are choosing between unis.

At the other end of the scale there are presumably lots of people very comparable with Solent graduates (to pick on one uni) who never go to uni; the choice they face is whether to go to uni or get a job.

Some quite odd results: engineering at Imperial appears to decrease your earnings by £1,700, while engineering at Aberdeen and RGU apparently sends your earnings rocketing - the oil industry, presumably, but hiring there has been sluggish for several years now.

---
PS I see the opposite regarding Brookes and Robert Gordon over all courses.
RGU #5 +£2700
OBU #90 -£950
The problem with any type of student survey is the sample size. For such a specific criterion as expected earnings, how can they get a reliable and consistent reading?

And what this ranking basically says is "less **** than you thought".
Anyone spot the problems here?

econ.JPG


Beyond errors in presentation of the data, it's at least a useful metric for students from disadvantaged backgrounds who want to maximise what they get from university.

As with many things, not understanding the point of something doesn't usually indicate it's wrong, just that it's not relevant to your specific experience.
Original post by ♥Samantha♥
Yes, definitely. I think that as a ranking this is basically useless, but as a tool it is useful. I think it is better to compare similar universities. There's no point comparing Brookes and Oxford when the students are of a completely different standard. But for someone trying to compare Oxford Brookes and Robert Gordon, it would be useful to see that students from Oxford Brookes largely outperform what their backgrounds would suggest they would earn, whilst students from Robert Gordon do not. And yes, it may suffer from sample-size issues or random variations as it is only based on data from one year, so it's a work in progress. They hope to compile new data and add it each year.


I agree with this analysis. Comparing Oxford and Portsmouth doesn't make much sense - the pool of students each recruits from is quite distinct. Comparing Oxford and Cambridge, however - that is interesting. Overall a very revealing table.
Some data is pretty odd though. Expected earnings of Brighton medics are £51,561, whereas expected earnings of Cambridge medics (who have substantially higher UCAS points) are £42,000, nearly 19% lower. That's quite a pay cut!

I'd look in the methodology but paywall...
Original post by Plagioclase
According to this table, taking Computer Science at Oxford Brookes has a value added of +£6,134, whereas taking Computer Science at the University of Oxford has a value added of -£5,061. I don't know, perhaps there is some truth to that, but it makes it difficult for me to take this seriously as a measure of value. There are a lot of other very strange figures that come up. I can also imagine that this might fluctuate significantly year-on-year and suffer from sample-size effects, particularly for individual subjects.


Nah. Having worked in software, I would question the motives of people who study CS at Oxford. To get into Oxford you have to be highly academic and your face needs to fit. The CS industry generally does not seek academics. It therefore makes sense to me that a uni that provides real-world skills rather than theoretical academic knowledge would have more successful grads. Computing is one of the great bastions of meritocracy: what you can do counts for so much more than the colour of your tie.
Well, cool I guess? A rather pointless list.
As far as I can see this is simply the government's LEO data repackaged. And I don't understand the "expected" earnings bit - how is it relevant? As @999tigger said, the data point that counts is the actual earnings. Or am I missing something...
Original post by nexttime
Some data is pretty odd though. Expected earnings of Brighton medics is £51,561, whereas expected earnings of Cambridge medics (who have substantially higher UCAS points) is £42,000, a full 23% lower. That's quite a pay cut!

I'd look in the methodology but paywall...


Methodology:
OUR British university rankings are based primarily on “longitudinal education outcomes data” released in June by the Department for Education, which breaks down earnings for graduates by university, course and sex. We then match these figures to data on factors such as how selective each university is, as measured by UCAS entry standards. Wherever possible, we use course-specific entry requirements.

The model itself is an ordinary-least squares regression, where each observation is a specific course at a specific university and the dependent variable is median graduate earnings. The regression is weighted by the number of graduates in each course. Our independent variables are entry standards, field of study, the share of students from low-income areas, the share of older students, the share of students who attended private schools, the gross-value-added per person of the region the university is located in (eg, the north-east) and how far the university is from London, as measured by driving distance on Google Maps.

To produce an estimate of the value added by each particular university course, we first take the projected earnings numbers from our model and subtract them from the actual earnings numbers. We then aggregate these estimates across all courses to derive a university-level average. The advantage of this approach is that it allows us to compare like-for-like; Imperial College, for example, offers courses mainly in the more-lucrative fields of science and engineering. We argue that universities which offer students more choices in the arts and humanities should not be penalised for doing so.

We look forward to improving these calculations as more data become available. Currently, earnings figures five years after graduation are only available for a single graduating class. Oxford, for instance, is ranked far higher than Cambridge in our ranking because their graduates earned £3,640 a year more in our dataset—it’s possible that other classes of Cambridge graduates have outperformed their Oxford peers. Another potential pitfall is our use of median earnings, which compresses the variance in earnings across universities, and may under-emphasise the performance of the richest graduates from elite universities.

Finally, our analysis rests on the assumption that the residuals from our regression are a reliable proxy for the value added by a university. However, formally, all we can say is that our independent variables could not explain these remaining differences in earnings. They could well have been caused by some factor other than the education at a given university, or simply be the result of random chance. Future researchers would be served well by looking at individual-level statistics, rather than at data aggregated at the course level.

On a more conceptual level, it’s impossible to say how much of a graduate’s earnings can truly be attributed to value provided by the university they attended. A sceptic would argue that universities do not increase their students’ human capital, and simply serve as a filtering mechanism for employers to weed out weak job candidates. Another interpretation is that British university admissions are extremely meritocratic, and only the best pupils earn the privilege of enjoying superior educations at elite universities.


I don't know how it works, but I'd guess it was due to the independent variables? They don't publish their actual data, so I don't know. There may be some dodgy data in there, as Brighton and Sussex have different actual/predicted earnings but their medical school is joint. Also, the course names are confusing, e.g. Oxford E&M seems to be under business, which leaves me wondering what's under economics - PPE? So some courses may not be what they seem.
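For anyone curious what the regression in the methodology quoted above actually looks like, here's a rough sketch of a weighted-OLS value-added calculation in Python. Everything here is invented for illustration - the feature names, numbers and weights are not The Economist's actual data, which they don't publish:

```python
import numpy as np

# Toy course-level data. Each row is one course at one university:
# [entry_tariff, share_from_low_income_areas, share_private_school]
X = np.array([
    [520, 0.10, 0.40],   # highly selective course
    [480, 0.15, 0.30],
    [400, 0.25, 0.15],
    [350, 0.35, 0.08],
    [300, 0.45, 0.05],   # less selective course
], dtype=float)
y = np.array([45_000, 40_000, 30_000, 26_000, 23_000], dtype=float)  # median earnings
n = np.array([120, 300, 450, 250, 180], dtype=float)  # graduates per course (weights)

# Weighted OLS: scale rows and targets by sqrt(weight), then solve
# ordinary least squares on the rescaled system.
X1 = np.column_stack([np.ones(len(y)), X])   # add an intercept column
w = np.sqrt(n)
beta, *_ = np.linalg.lstsq(X1 * w[:, None], y * w, rcond=None)

expected = X1 @ beta          # what graduates "should" earn given intake/subject
value_added = y - expected    # the residual: what the table calls "value added"
for actual, va in zip(y, value_added):
    print(f"actual £{actual:,.0f}  value added £{va:+,.0f}")
```

The key idea: the model's prediction is what a course's graduates would be expected to earn given the intake, and whatever is left over (the residual) gets labelled value added - which, as the methodology itself admits, could equally be noise or some unmodelled factor.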
Original post by JohnGreek
I haven't had a chance to read through the methodology, but one of the issues that strike me with this is that the 'geographical area' thing is bs and disadvantages London unis in favour of those in the South. Hence LSE and UCL's negative ratings. Studying E&M at Oxford isn't going to make you magically set your career in that city as opposed to London - London Paddington is barely an hour away from Oxford train station. Most E&M grads aren't going to be deterred from living in London by the mere location of their university - that's where all the juicy, high paying grad jobs are! Take this principle and apply it to any sector which is London-based, and you realise that the 'benefit' that London unis have is illusory. Everyone in banking/law/consultancy/accounting wants to work in London, irrespective of where they went to uni.

Regional contextualisation also ignores the location of one's parents/family. I've met plenty of kids at Durham who, despite living in what is nominally the poorest area in England, come from the Home Counties, with families that can accommodate them living at home while working, or with high enough salaries that the Bank of Mum and Dad can fund a mortgage. The fact that they happened to live up North did not pose any structural barriers to them moving back into higher-paying, more prosperous areas upon graduation (the high relative presence of Durham grads in the City is testament to this).

Comparing value added also implies that the better unis have as much room to 'add value' to their students as the worse ones. I would humbly suggest that that's not the case - the relationship between value added and grades isn't linear. Top students who can get into any top RG aren't going to rely on that uni to save their careers - they already have the connections, grades, and likely know what they're doing. In fact, the benefits they'll get from that uni will come from the people they will study and network with, as opposed to anything the institution itself is responsible for providing (e.g. teaching). The uni could literally be a glorified library card with the occasional reading list and they'd still get into their grad scheme of choice. On the other hand, the CCCs who get into e.g. Bishop Grosseteste are likely to have far more untapped potential (be it academic or professional), that a uni can help build on, through higher numbers of contact hours, smaller seminars, etc.

And now for the good old critiques: this table is useless insofar as it doesn't have subject specific data, it doesn't compare like to like (no one has a dilemma between UCL and Aston, even for something like Engineering), and ultimately shows value added differences that are trivial (someone facing an Oxbridge dilemma, or a UCL vs LSE vs KCL dilemma, isn't going to be swayed by a few hundred pounds difference in value added salary - not absolute earnings - at age 27).


I agree with what you say but I don't think you have captured the whole of it.

Someone noted the disparity between Brighton and Cambridge medical salaries. Once you're dealing with people who manage to achieve a "good job", pay level is not the only consideration. The best jobs in medicine may not be the best-paid jobs in medicine. Someone may choose to stay in a research position, someone else may take a year out to work in a refugee camp, whilst a third person may be following the NHS career treadmill in a district general hospital. The jobs with the most attractive "non-monetary benefits" are likely not to be evenly distributed.

Likewise, civil service fast stream trainees are not particularly well paid but the attraction is being at the centre of power.
