
Could an algorithm ever actually award grades?

In an emergency situation like the pandemic, in which public examinations are cancelled, could an algorithm ever be trusted to award grades again? Is there any way it could work?

With their rejection for this year's exam series, I'm curious to see whether people could ever have faith in them again, or whether they believe they're too unreliable. The algorithm was still used to award grades in 2020 where its grade was higher than the CAG.

I think that they could work with a better appeals system but only God knows what that is.


I think one of the main problems is that a lot of smart people won't put effort into non-essential work like schoolwork. It would be grossly unfair to them to retrospectively grade that work.
Reply 2
An interesting, related side point is that AI is already being used to help sift through candidates applying for jobs. Here is what a company recently wrote in the preamble of a video assessment I completed:

Our team will review your submission for job-related skills and abilities with the help of computer-assisted evaluation. This experience is designed to help the team focus on what matters, reduce bias, and make better, more objective decisions.
Reply 3
Ah, I see. What about current AI makes it inadequate for this task?
The assumption that teachers' predicted grades are more accurate than an algorithm's prediction seems highly speculative to me. It is well known that predicted grades are highly inaccurate.

I think it would be fairly trivial to formulate an algorithm that best retrodicts achieved grades based on predicted grades and all the other factors you can throw in. But people probably wouldn't like it.

The common impression seems to be that the problem was the use of the algorithm last year. If you ask me, the use of CAGs was just as problematic.
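
Purely to illustrate the 'retrodiction' idea above (and not as a reconstruction of Ofqual's actual model), such an algorithm amounts to fitting historical achieved grades against predicted grades plus whatever other features are available, then checking the out-of-sample error against the teacher predictions alone. A minimal sketch in Python; the data, coefficients and feature names are entirely made up:

```python
# A minimal sketch of a "retrodiction" model, NOT the Ofqual algorithm.
# All data below is synthetic and the feature names are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500  # hypothetical historical pupils

predicted = rng.integers(4, 10, n)                        # teacher-predicted grade (4-9)
mock = np.clip(predicted + rng.normal(-1, 1, n), 1, 9)    # mock exam grade
prior = rng.normal(100, 15, n)                            # e.g. a CAT-style prior-attainment score
achieved = np.clip(0.5 * predicted + 0.5 * mock
                   + 0.02 * (prior - 100)
                   + rng.normal(0, 0.8, n), 1, 9)         # "actual" exam grade

X = np.column_stack([predicted, mock, prior])
model = LinearRegression()

# Out-of-sample error of the fitted model...
cv_mae = -cross_val_score(model, X, achieved, cv=5,
                          scoring="neg_mean_absolute_error").mean()
# ...versus simply taking the teacher prediction at face value.
baseline_mae = np.mean(np.abs(predicted - achieved))

print(f"Model MAE:              {cv_mae:.2f} grades")
print(f"Teacher prediction MAE: {baseline_mae:.2f} grades")
```

On real data, the interesting question is simply whether the fitted model's error beats the teacher-prediction baseline, which is the claim being made above.
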
Reply 5
Original post by chazwomaq
The assumption that teachers' predicted grades are more accurate than an algorithm's prediction seems highly speculative to me. It is well known that predicted grades are highly inaccurate.

I think it would be fairly trivial to formulate an algorithm that best retrodicts achieved grades based on predicted grades and all the other factors you can throw in. But people probably wouldn't like it.

The common impression seems to be that the problem was the use of the algorithm last year. If you ask me, the use of CAGs was just as problematic.

It's interesting to note, once again, that a pupil would receive their algorithm-based grade if it was higher than the CAG in any subject. Now, there probably is a reasonable justification for this (or maybe even a few), but it shows that they didn't completely forsake the 'mutant' algorithm in the end. It just made me wonder if we could ever put our trust in one again.

I honestly think that the problem with the algorithm was a lack of an appeals route. If they had put more thought into how pupils could challenge these grades, maybe things would have been different. I can't necessarily think of how Ofqual would go about doing that though.
True. It has been known for years that teachers over-predict on average, so obviously more people would be "downgraded" than "upgraded" by the algorithm. Then, when it happened, everyone acted outraged!
Reply 7
I did some analysis at my school some time ago where I tried predicting students' GCSE grades for some subjects based on the data I had on them (CAT scores, mock grades, etc.). On average I was better at it than the teacher in all but one of the subjects I tried it for - so overall my algorithm was closer to the final exam results than the teachers' predictions were. The sample wasn't huge, but I think it highlighted an issue: teachers are too influenced by students' behaviour, reliability in doing homework, general niceness and (in some cases) how much hassle the parents give.
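
For anyone wanting to repeat this kind of check, the comparison boils down to measuring, subject by subject, how far each set of predictions (teacher vs. algorithm) lands from the final results, for example by mean absolute error. A rough sketch with made-up numbers; the real exercise would use the school's own CAT scores, mocks and results:

```python
# Rough sketch of the comparison described above: per subject, how far do the
# teacher's predictions and an algorithm's predictions land from the real results?
# The numbers are made up; only the method is the point.
import numpy as np

rng = np.random.default_rng(1)

for subject in ["Maths", "English", "Science"]:
    n = 60                                                              # pupils in the sample
    final = rng.integers(1, 10, n)                                      # achieved GCSE grade (1-9)
    teacher = np.clip(final + rng.normal(0.7, 1.2, n).round(), 1, 9)    # tends to over-predict
    algorithm = np.clip(final + rng.normal(0.1, 0.9, n).round(), 1, 9)  # e.g. CAT + mock based

    teacher_mae = np.mean(np.abs(teacher - final))
    algo_mae = np.mean(np.abs(algorithm - final))
    print(f"{subject}: teacher MAE {teacher_mae:.2f}, algorithm MAE {algo_mae:.2f}")
```
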
Original post by Tolgash
In an emergency situation like the pandemic, in which public examinations are cancelled, could an algorithm ever be trusted to award grades again? Is there any way it could work?

With their rejection for this year's exam series, I'm curious to see whether people could ever have faith in them again, or whether they believe they're too unreliable. The algorithm was still used to award grades in 2020 where its grade was higher than the CAG.

I think that they could work with a better appeals system but only God knows what that is.

Interesting use of trust. Last year's algorithm was 100% trustworthy. The problem is that it just didn't work. It wasn't a rogue algorithm. It did exactly as it was told to. Just the concept was all wrong.

Personally, I think a solution might be to get teacher assessed grades and then run an algorithm on them to flatten out the peaks and troughs. Some teachers overestimated their kids by a country mile whilst others perhaps didn't do their kids a fair service. If the starting point is a teacher assessed grade, you can tweak grades to fit expected outcomes across schools and the country much more accurately and fairly.
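
As a concept sketch only (and not last year's actual standardisation model), 'flattening out the peaks and troughs' could mean keeping the teachers' rank order within a centre while mapping the grade mix onto an expected distribution, for example one based on the school's recent results. The function and numbers below are hypothetical:

```python
# Concept sketch of moderating teacher-assessed grades against an expected
# distribution. This is an illustration of the idea, not the 2020 model.
import numpy as np

def moderate(teacher_grades, expected_distribution):
    """Rank-preserving remap of teacher grades onto an expected grade mix.

    teacher_grades: teacher-assessed grades, one per pupil.
    expected_distribution: {grade: proportion} the centre is expected to produce.
    """
    n = len(teacher_grades)
    # Build the list of grades implied by the expected proportions, highest first.
    target = []
    for grade, proportion in sorted(expected_distribution.items(), reverse=True):
        target += [grade] * round(proportion * n)
    target = (target + [min(expected_distribution)] * n)[:n]  # pad/truncate to n pupils

    # Pupils ranked highest by their teacher get the highest target grades.
    rank = np.argsort(np.argsort(teacher_grades)[::-1])
    return [target[r] for r in rank]

# Hypothetical class: teacher assessments vs a historically based expectation.
teacher = [9, 9, 8, 8, 8, 7, 6, 6, 5, 4]
expected = {9: 0.1, 8: 0.2, 7: 0.3, 6: 0.2, 5: 0.1, 4: 0.1}
print(moderate(teacher, expected))  # e.g. [8, 9, 7, 7, 8, 7, 6, 6, 5, 4] (tie order is arbitrary)
```

The obvious weakness is that an individual pupil's grade then depends on their school's expected distribution, and on tie-breaking within it, rather than purely on their own work - essentially the objection people had in 2020, and the reason an appeals route matters so much.
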
It is teacher-predicted grades.
Reply 10
Original post by ByEeek
Interesting use of trust. Last year's algorithm was 100% trustworthy. The problem is that it just didn't work. It wasn't a rogue algorithm. It did exactly as it was told to. Just the concept was all wrong.

Personally, I think a solution might be to get teacher assessed grades and then run an algorithm on them to flatten out the peaks and troughs. Some teachers overestimated their kids by a country mile whilst others perhaps didn't do their kids a fair service. If the starting point is a teacher assessed grade, you can tweak grades to fit expected outcomes across schools and the country much more accurately and fairly.

I might have used 'trust' since Gavin Williamson did, aha. However, I agree with you. That's a little better than what they did for classes with over fifteen pupils, lol.

I understand that the concept was poor and the algorithm worked perfectly, but I guess I just mentioned algorithms because people don't believe there is a good concept for an algorithm, and thus fail to put any faith in using one. :smile:
Probably. It would not be perfect, but predicted grades are known to be an absolute disaster - in terms of overall accuracy, but also in terms of some schools consistently and absurdly over-predicting compared to others, and in terms of systematic bias. Honestly, it's a very low bar to have to exceed.


Erm, what are you proposing using an AI for exactly? OP just mentioned using an algorithm.

Original post by Tolgash
I honestly think that the problem with the algorithm was a lack of an appeals route. If they had put more thought into how pupils could challenge these grades, maybe things would have been different. I can't necessarily think of how Ofqual would go about doing that though.

The problem was (and is) that they would get literally hundreds of thousands, if not millions, of appeals. What workforce is going to work through all that and deal with each one adequately and fairly?

What we actually needed was either a) exams going ahead in the traditional format but with a short delay - Covid cases were actually really low by late June, and we did Eat Out to Help Out not long after (people will say I'm only saying this with hindsight, but honestly it was pretty obvious at the time) - or b) online exams taken from home. Again, people will probably say that's not realistic, but honestly... it was. A driven person, if not swaddled in red tape, could have done it. But that's a huge 'if' - this country, and especially education, just loves pointless red tape so that the bureaucrats can stay feeling important.
Reply 12
Original post by nexttime

The problem was (and is) that they would get literally hundreds of thousands, if not millions, of appeals. What workforce is going to work through all that and deal with each one adequately and fairly?

What we actually needed was either a) exams going ahead in the traditional format but with a short delay - Covid cases were actually really low by late June, and we did Eat Out to Help Out not long after (people will say I'm only saying this with hindsight, but honestly it was pretty obvious at the time) - or b) online exams taken from home. Again, people will probably say that's not realistic, but honestly... it was. A driven person, if not swaddled in red tape, could have done it. But that's a huge 'if' - this country, and especially education, just loves pointless red tape so that the bureaucrats can stay feeling important.


Ofqual actually wanted socially distanced exams to take place, but Gavin Williamson didn't agree. Also, wouldn't online exams at home lead to an increase in malpractice?

With regard to appeals, how are awarding bodies going to handle those with teacher-assessed grades this year? Will the proposed ‘mini-exams’ be enough?
Algorithms are responsible for student grades in normal years, so the question feels somewhat pointless.
Original post by Compost
I did some analysis at my school some time ago where I tried predicting students' GCSE grades for some subjects based on the data I had on them (CAT scores, mock grades, etc.). On average I was better at it than the teacher in all but one of the subjects I tried it for - so overall my algorithm was closer to the final exam results than the teachers' predictions were. The sample wasn't huge, but I think it highlighted an issue: teachers are too influenced by students' behaviour, reliability in doing homework, general niceness and (in some cases) how much hassle the parents give.


Maybe that was true in your school. We are held to account for our predictions - they form part of PM - and we are very accurate because of this. I don't see why more schools don't do the same.
Original post by 04MR17
Algorithms are responsible for student grades in normal years, so the question feels somewhat pointless.

PRSOM.
Exactly the point I came here to make!
Reply 16
Original post by 04MR17
Algorithms are responsible for student grades in normal years, so the question feels somewhat pointless.

Sorry, I should be more clear (although I think I mentioned what I meant in the post). I was referring to the kind of algorithm used in the absence of exams, as was the case last year.

I just wanted a snappy title, and I guess I didn't put too much thought into it.
Original post by Tolgash
Sorry, I should be more clear (although I think I mentioned what I meant in the post). I was referring to the kind of algorithm used in the absence of exams, as was the case last year.

I just wanted a snappy title, and I guess I didn't put too much thought into it.

Well if your question is that specific, then you've already answered it in this post - yes they could, last year.
Reply 18
Original post by 04MR17
Well if your question is that specific, then you've already answered it in this post - yes they could, last year.

But would anyone ever want to see one used again if a similar situation were to arise? Gavin Williamson already said that the DfE would put its trust in teachers ahead of algorithms for 2021 because of the circumstances (i.e. exams cancelled due to the disruption caused by the pandemic), but could there ever be a time when more trust could be put in the algorithm in those circumstances, or are they just too unreliable to ever be used in that way?

The algorithm awarded a very small number of grades. It was only officially used to award those higher than the CAGs. CAGs were officially used, so the algorithm wasn't really trusted that much.
Original post by Tolgash
But would anyone ever want to see one used again if a similar situation were to arise? Gavin Williamson already said that the DfE would put its trust in teachers ahead of algorithms for 2021 because of the circumstances (i.e. exams cancelled due to the disruption caused by the pandemic), but could there ever be a time when more trust could be put in the algorithm in those circumstances, or are they just too unreliable to ever be used in that way?

The algorithm awarded a very small number of grades. It was only officially used to award those higher than the CAGs. CAGs were officially used, so the algorithm wasn't really trusted that much.

Algorithms are not unreliable. Last year's algorithm did its job very well. "Put our trust in teachers" should be read as "we will blame teachers in August", since that is entirely what Gavin meant when he said it.

The algorithm was grossly unpopular, which is why it was U-turned on. That doesn't mean it wasn't trusted.
