To be honest, there is an easy way to disprove the point of A levels being an indicator of ability.
Let's compare the content of the A level IT syllabus (on which I obtained a C grade) to a typical computer science module from my university.
A level IT:
http://www.ealingindependentcollege.com/courses/as-and-a2-courses-a-level-retakes-a-level-retake-a-level/a-level-ict/

A typical module from university:
What looks harder?
So if anything, by your logic, I should have got a 3rd, or a 2.2 if I was lucky, because the university content is considerably harder.
Many did. People getting ordinary degrees on that course was not unheard of.
Easy.
One exam technique that I learnt later on:
practising past exam papers.
At my school anyway, they didn't even do this. By not doing so:
- Time management was poor on the day of the exam; I actually ran out of time on that IT A level.
- I wasn't smart with my study techniques. Instead of focusing on my strengths and answering those questions on the exam, I tried to learn the whole module.
- Doing past papers and getting feedback from teachers, again, never happened. They didn't care.
I don't blame them when the student:teacher ratio is about 20:1.
In a good independent school, it is not uncommon for students to have one-to-one tutoring, smaller class sizes, and better resources. If in that scenario they don't meet the grades, then they are poor students and your point stands.
People can go on about self-learning, but the whole point of going to a school is to be guided in the correct way at an age where many are immature. My generation also comes from the pre-YouTube era.
Edit:
Finally, if where you studied were such a big deal, then these corporates should encourage those who did not go to a top 10 but got 2.1s to do a masters at one. This isn't the case; the graduate will still be discriminated against.