# Key Stage 3 SATs

17 years ago
#1
Yes, I know you can argue about how important or how valid they are,

but

having last year rejoiced over our 9% increase in our KS3 Science SAT levels, only to find that nationally the levels have risen by 7%, I wondered if anyone had heard anything about this year's results?

Mark U
17 years ago
#2
Well, our results came in to school today. The way the statistics work is that you are always
comparing this year's SATs with last year's national statistics. Besides, the best analysis is the
value-added analysis completed on your own students using the gradients and offsets (progress lines)
derived from the Autumn Package. If you are really keen, check out the DfES Standards web site...
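To make the progress-line idea concrete, here is a minimal sketch of that value-added calculation. The gradient and offset figures are invented for illustration; the real ones come from the Autumn Package tables.

```python
# Sketch of a value-added calculation using a progress line.
# GRADIENT and OFFSET are hypothetical, not the real Autumn
# Package figures for any subject or year.

GRADIENT = 1.05   # assumed slope of the progress line
OFFSET = 0.4      # assumed intercept of the progress line

def value_added(ks2_level, ks3_level):
    """Difference between a pupil's actual KS3 level and the
    level predicted from their KS2 level by the progress line."""
    predicted = GRADIENT * ks2_level + OFFSET
    return ks3_level - predicted

# A pupil who got level 4 at KS2 and level 5 at KS3:
print(round(value_added(4, 5), 2))
```

A positive value means the pupil did better than the progress line predicted from their prior attainment; averaging it over a cohort gives the school-level value-added figure.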

Dominic

"NTL NEWS" <[email protected]> wrote in message
news:[email protected]...
[q1]> Yes I know you can argue how important or how valid they are?[/q1]
[q1]>[/q1]
[q1]> but[/q1]
[q1]>[/q1]
[q1]> Having last year rejoiced over our 9% increase in our KS3 Science SAT levels[/q1]
[q1]> only to find that nationally the levels have risen by 7%, I wondered if anyone had heard any[/q1]
[q1]>[/q1]
[q1]> Mark U[/q1]
[q1]>[/q1]
17 years ago
#3
In article <[email protected]>, D R Tester
<[email protected]> wrote:
[q1]> Well, our results came in to school today. The way the statistics work is that you are always[/q1]
[q1]> comparing this year's SATs with last year's national statistics. Besides, the best analysis is[/q1]
[q1]> the value-added analysis completed on your own students using the gradients and offsets[/q1]
[q1]> (progress lines) derived from the Autumn Package. If you are really keen, check out the DfES[/q1]
[q1]> Standards web site...[/q1]

Assuming your turnover in pupils doesn't make this analysis invalid...

.. in fact this analysis is invalid for most schools because of turnover and low numbers. Try doing
the work yourself and just compare those pupils' scores at KS1 with the very same pupils at KS2 (i.e.
exclude all pupils who didn't take both sets of tests). You will probably be very surprised at how
few pupils remain.

If you hope to compare this year's results with last year's, just remember that the tests are
different and not standardised; i.e. most of the work put into the analysis is wasted time, effort
and trees.
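That matched-pupil check is easy to try yourself. A minimal sketch, with invented pupil names and scores, of keeping only those who sat both sets of tests before comparing:

```python
# Sketch of the matched-pupil comparison: keep only pupils with
# scores at both key stages. Names and levels are made up.

ks1 = {"Alice": 2, "Bob": 3, "Carol": 2, "Dan": 3}  # KS1 levels
ks2 = {"Alice": 4, "Carol": 4, "Eve": 5}            # KS2 levels

# Pupils who took both sets of tests.
matched = sorted(set(ks1) & set(ks2))
everyone = set(ks1) | set(ks2)
print(len(matched), "of", len(everyone), "pupils remain")

# Progress (in levels) for the matched pupils only.
progress = {pupil: ks2[pupil] - ks1[pupil] for pupil in matched}
print(progress)
```

With real school data the surprise is usually how small `matched` is relative to the cohort, which is exactly the turnover problem described above.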

--
John Cartmell
17 years ago
#4
On Tue, 02 Jul 2002 21:52 +0100, John Cartmell wrote:

[q1]> In article <[email protected]>, D R Tester[/q1]
[q1]> <[email protected]> wrote:[/q1]
[q2]>> Well, our results came in to school today. The way the statistics work is that you are always[/q2]
[q2]>> comparing this year's SATs with last year's national statistics. Besides, the best analysis is[/q2]
[q2]>> the value-added analysis completed on your own students using the gradients and offsets[/q2]
[q2]>> (progress lines) derived from the Autumn Package. If you are really keen, check out the DfES[/q2]
[q2]>> Standards web site...[/q2]
[q1]>[/q1]
[q1]> Assuming your turnover in pupils doesn't make this analysis invalid...[/q1]
[q1]>[/q1]
[q1]> .. in fact this analysis is invalid for most schools because of turnover and low numbers. Try[/q1]
[q1]> doing the work yourself and just compare those pupils' scores at KS1 with the very same pupils[/q1]
[q1]> at KS2 (i.e. exclude all pupils who didn't take both sets of tests). You will probably be very[/q1]
[q1]> surprised at how few pupils remain.[/q1]
[q1]>[/q1]
[q1]> If you hope to compare this year's results with last year's, just remember that the tests are[/q1]
[q1]> different and not standardised; i.e. most of the work put into the analysis is wasted time,[/q1]
[q1]> effort and trees.[/q1]

Well said, that man. Strange how whole industries get built on invalid assumptions :-)

If you are in a secondary school and you take CAT scores in Year 7, and let's say last year's cohort
was a pretty normal distribution and so was this year's, and this year's did better in their SATs than
last year's, then either the SATs were easier or they made faster progress. Within the bounds of
statistical uncertainty, making broad judgements based on such methods is probably as good and as
valid as anything, so why waste a lot of time on detailed mathematics that really doesn't tell the
truth any more validly? I got used to doing this in the early days of OFSTED, when schools
(particularly those with low exam results) didn't really know whether those results were down to
weak pupils, or to what degree. I found that, if the raw stats were available, you could determine
the picture within reasonable degrees of uncertainty in about an hour, using a very rudimentary
spreadsheet on a PDA.

Things like "this school has an attendance problem, not specifically a teaching problem" can be
determined by showing that the progress of those who attend regularly is similar to that of the
national cohort, and therefore concentrating on attendance. It's actually surprisingly simple to do
this. It's all very well saying pupils will attend if the curriculum and teaching are good, but if
the majority of the teaching is similar to other schools you go into, and the majority are
attending, perhaps it's something beyond all that ;-)

Of course, it might be more politically advantageous to blame teaching than to look at the wider
ills of society in disaffecting pupils and families in terms of school attendance. It's a shame
that the unions can't take up some of these battles and gain some professional credibility, rather
than simply appearing reactionary to anything that the G throws up.
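The back-of-a-spreadsheet check described above can be sketched in a few lines. All the numbers here are invented, including the national average and the 90% attendance cut-off; the point is only the shape of the comparison.

```python
# Sketch of the attendance check: if regular attenders make
# progress in line with the national figure, the gap looks like
# an attendance problem rather than a teaching one.
# Every number below is hypothetical.

NATIONAL_MEAN_PROGRESS = 2.0   # assumed national average progress
REGULAR_THRESHOLD = 90         # assumed attendance % cut-off

pupils = [
    # (attendance %, progress in levels between key stages)
    (96, 2.1), (94, 1.9), (91, 2.0), (60, 0.8), (55, 1.0),
]

regular = [prog for att, prog in pupils if att >= REGULAR_THRESHOLD]
mean_regular = sum(regular) / len(regular)

if abs(mean_regular - NATIONAL_MEAN_PROGRESS) < 0.25:
    print("Regular attenders track the national cohort:"
          " concentrate on attendance.")
else:
    print("Regular attenders lag the national cohort too.")
```

With this made-up data the regular attenders average the national figure exactly, so the shortfall sits with the poor attenders, which is the pattern that points at attendance rather than teaching.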

Regards,

--
IanL
