
Cambridge University to introduce written admissions tests

    (Original post by shamika)
    Is this actually confirmed? I suppose it doesn't make too much of a difference, as the new A-levels are quite tightly specified, so if there is off-syllabus material, everyone will have to learn it.
    I think so: http://www.undergraduate.study.cam.a...ns-assessments

    "* No advance preparation will be needed, other than revision of relevant recent subject knowledge where appropriate."

    (Original post by jneill)
    The point about this news is not the interview. The interview is well established.

    The point is about the use of Oxford-style pre-interview tests for many courses...

    This gives rise to the possible implication that Cambridge will interview fewer applicants as a result (like Oxford).

    We will have to wait for Cambridge to clarify that in due course.


    Probably not... for the time being, anyway.

    The new tests will form part of Cambridge's assessments of candidates, rather than being a method of selecting students for interview, Dr Lucy said.
    http://www.telegraph.co.uk/education...-students.html

    It may become some kind of endurance test for Cambridge: how long will they be able to cope with the increased amount and more complicated nature of the work on such a tight schedule?
    (Original post by jneill)
    I think so: http://www.undergraduate.study.cam.a...ns-assessments

    "* No advance preparation will be needed, other than revision of relevant recent subject knowledge where appropriate."

    Fair enough. I really dislike Oxbridge's line that no advance preparation is needed. It might not be needed in a technical sense, but it's almost certainly required to be competitive. A very naive shamika took that at face value at 17 (luckily it didn't cost me an offer).
    Yes!!
    (Original post by shamika)
    Contrast that with Gavin Lowe (Oxford CompSci admissions tutor), who has repeatedly said that he considers interviews to be more predictive than the MAT or academic history. (This was in TSR's most recent MAT prep thread.)

    I don't think he had done a proper study though.
    It's actually rather hard to do a proper study, because of the missing data problem: we don't know how the candidates we rejected would have got on if they had been accepted. The only way to do a proper study would be to accept candidates who had done badly on the MAT or in interviews (and who would normally be rejected) and see how they coped. Of course, we're not going to do that.

    In most cases, performance on the MAT, performance in interviews and performance during the degree are fairly compatible. However, my experience is that where a student's performance during their degree has been out of line with their MAT or interview scores, it's the latter that is more reliable. (An exception to the above is that good teaching can help candidates do better on the MAT, and to a lesser extent in interviews, than candidates with equal potential who received less good teaching; we can adjust for that to a certain extent.)

    Gavin
    (Original post by gavinlowe)
    It's actually rather hard to do a proper study, because of the missing data problem: we don't know how the candidates we rejected would have got on if they had been accepted. The only way to do a proper study would be to accept candidates who had done badly on the MAT or in interviews (and who would normally be rejected) and see how they coped. Of course, we're not going to do that.

    In most cases, performance on the MAT, performance in interviews and performance during the degree are fairly compatible. However, my experience is that where a student's performance during their degree has been out of line with their MAT or interview scores, it's the latter that is more reliable. (An exception to the above is that good teaching can help candidates do better on the MAT, and to a lesser extent in interviews, than candidates with equal potential who received less good teaching; we can adjust for that to a certain extent.)

    Gavin
    Except it is easy* to answer the question "how correlated are interview scores with degree performance (for accepted students)?", which is really the only question people are interested in.

    *More accurately, easy if the data is readily available.
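
    For what it's worth, the calculation itself is a one-liner once such a dataset exists. Here's a minimal sketch on made-up numbers; every column name and value below is hypothetical rather than real admissions data:

    ```python
    # Minimal sketch: rank correlation between interview scores and degree marks.
    # All data below is invented for illustration.
    import pandas as pd
    from scipy.stats import spearmanr

    # Records for accepted students only -- which is exactly where Gavin's
    # "missing data" caveat bites: rejected applicants never appear here.
    df = pd.DataFrame({
        "interview_score": [7, 8, 6, 9, 7, 8, 5, 9],
        "degree_mark":     [64, 71, 62, 75, 68, 66, 58, 72],
    })

    # Spearman's rank correlation is a sensible default for ordinal-ish scores.
    rho, p = spearmanr(df["interview_score"], df["degree_mark"])
    print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
    ```

    Even with the data in hand, the selection effect means a low correlation among accepted students wouldn't prove the interview is uninformative.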
    (Original post by jneill)
    No.

    The whole point is they won't require any extra learning.
    But I wonder how they'll ensure this is the case.

    Each specification, whether it be AQA, WJEC, Edexcel, OCR, etc., is different, so there is a chance the test includes something covered by the OCR and AQA specifications but not by Edexcel, putting the Edexcel student at a disadvantage, for instance.
    (Original post by Excuse Me!)
    But I wonder how they'll ensure this is the case.

    Each specification, whether it be AQA, WJEC, Edexcel, OCR, etc., is different, so there is a chance the test includes something covered by the OCR and AQA specifications but not by Edexcel, putting the Edexcel student at a disadvantage, for instance.
    Possibly by having a reasonably wide range of questions, of which only a few need to be answered...

    All will become clearer when the more detailed test info becomes available soon.
    (Original post by shamika)
    Except it is easy* to answer the question "how correlated are interview scores with degree performance (for accepted students)?", which is really the only question people are interested in.

    *More accurately, easy if the data is readily available.
    The CAT (Christ's Admissions Tutor) has mentioned that one challenge is that different interviewers use the "score sheet" differently. Some don't score the interview at all per se; they just give an overall mark assessing the quality of the candidate's total application. Others do record an interview score, but one interviewer might rarely give high marks, so their "7" might be another interviewer's "8".

    Not ideal when trying to do a statistical analysis...
    (Original post by jneill)
    The CAT (Christ's Admissions Tutor) has mentioned that one challenge is that different interviewers use the "score sheet" differently. Some don't score the interview at all per se; they just give an overall mark assessing the quality of the candidate's total application. Others do record an interview score, but one interviewer might rarely give high marks, so their "7" might be another interviewer's "8".

    Not ideal when trying to do a statistical analysis...
    Quite. Interpreting straight correlations from data like this is a bit of a nightmare.

    A modern approach would be at least to model the "rater effect" as a random intercept in some sort of ordinal regression. Even this is probably not sufficient, as one might have to take into account subject effects.
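
    To make that concrete, here is a minimal sketch of the random-intercept idea in Python with statsmodels. Everything in it is assumed for illustration: the file and column names are hypothetical, and MixedLM is a linear rather than ordinal model, so the scores are treated as continuous (a true ordinal mixed model would need something like R's ordinal::clmm):

    ```python
    # Sketch only: a linear mixed model standing in for the ordinal regression
    # described above. "interviews.csv" and all column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    # Expected columns: degree_mark, interview_score, interviewer
    df = pd.read_csv("interviews.csv")

    # A random intercept per interviewer absorbs each rater's personal baseline,
    # so one rater's habitual "7" versus another's "8" stops biasing the slope.
    model = smf.mixedlm("degree_mark ~ interview_score", df, groups=df["interviewer"])
    result = model.fit()
    print(result.summary())
    ```

    The fitted random intercepts then double as a rough per-interviewer calibration offset.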
    (Original post by jneill)
    The point is about the use of Oxford-style pre-interview tests for many courses...

    This gives rise to the possible implication that Cambridge will interview fewer applicants as a result (like Oxford).
    Oxford interviews fewer applicants but, to my knowledge, conducts more interviews per candidate it does invite. I think the difference is due to a different use of interviews (investigating all possibilities versus extensively investigating the strong possibilities) rather than anything to do with admissions tests.

    It's also to do with the degree of college autonomy (there's less at Oxford).

    Though maybe these things are up for debate at the moment as well.

    (Original post by shamika)
    Fair enough. I really dislike Oxbridge's line that no advance preparation is needed. It might not be needed in a technical sense, but it's almost certainly required to be competitive. A very naive shamika took that at face value at 17 (luckily it didn't cost me an offer).
    I took that at face value, doing only one past paper and nothing else in preparation, and got 100% in BMAT Section 2 and 86% overall.

    I think extensive preparation gives minimal advantage. People fret over having to memorise all the tiny details that came up in the answers to past papers, when in reality the idea is that you use your understanding to derive the answer from basic principles. Memorisation doesn't work; being good at your subject does.

    (Original post by Excuse Me!)
    But I wonder how they'll ensure this is the case.

    Each specification whether it be AQA, WJEC, Edexcel, OCR etc is not the same so there is a chance they have something in the test covered by the OCR and AQA specification but not if you do Edexcel putting the Edexcel student at a disadvantage for instance.
    Similar to how they do it at the moment, e.g. with STEP and BMAT? Essentially, asking difficult questions based on basic material.
    (Original post by nexttime)
    Oxford interviews fewer applicants but, to my knowledge, conducts more interviews per candidate it does invite. I think the difference is due to a different use of interviews (investigating all possibilities versus extensively investigating the strong possibilities) rather than anything to do with admissions tests.

    It's also to do with the degree of college autonomy (there's less at Oxford).

    Though maybe these things are up for debate at the moment as well.

    I took that at face value, doing only one past paper and nothing else in preparation, and got 100% in BMAT Section 2 and 86% overall.

    I think extensive preparation gives minimal advantage. People fret over having to memorise all the tiny details that came up in the answers to past papers, when in reality the idea is that you use your understanding to derive the answer from basic principles. Memorisation doesn't work; being good at your subject does.

    Similar to how they do it at the moment, e.g. with STEP and BMAT? Essentially, asking difficult questions based on basic material.
    Totally agree.
    I've read somewhere (or was it the CAT?) that if an interviewer detects that a candidate has seen a similar question or problem before and has been coached or has practised it, they can swiftly switch to something else, so they can test the candidate on something new and unfamiliar and see how they work it out. And that's the main thing they want to do in interviews.
    (Original post by Gregorius)
    Quite. Interpreting straight correlations from data like this is a bit of a nightmare.

    A modern approach would be at least to model the "rater effect" as a random intercept in some sort of ordinal regression. Even this is probably not sufficient, as one might have to take into account subject effects.
    From what I gather from things I've read and heard, the interview score is little more than a numerical note or memo for each interviewer/DoS, to remind them of each candidate's performance at their interview.
    It's really the last piece of the jigsaw puzzle when they try to build a whole (3D) picture of what each candidate is like as an applicant, not a means of comparing applicants on the basis of interview scores. And each interview can be slightly different from candidate to candidate, so you can't really compare them on the same basis.
    So trying to find a correlation between interview score and future Tripos performance is a bit meaningless, I think.

    That's my understanding anyway.
    (Original post by jneill)
    The CAT (Christ's Admissions Tutor) has mentioned that one challenge is that different interviewers use the "score sheet" differently. Some don't score the interview at all per se; they just give an overall mark assessing the quality of the candidate's total application. Others do record an interview score, but one interviewer might rarely give high marks, so their "7" might be another interviewer's "8".

    Not ideal when trying to do a statistical analysis...
    Again, agreed. I am surprised that the CAT was so blunt about the problem, because it identifies a fundamental flaw in the admissions process. Unless admissions tutors are (implicitly) allowing for such bias, how can you select the best students to make offers to?
    (Original post by Gregorius)
    Quite. Interpreting straight correlations from data like this is a bit of a nightmare.

    A modern approach would be at least to model the "rater effect" as a random intercept in some sort of ordinal regression. Even this is probably not sufficient, as one might have to take into account subject effects.
    A more fundamental flaw: you have tiny amounts of data, so anything beyond a linear regression is almost certainly spurious.

    Spoiler:
    As an actuary, that has never stopped me from applying said techniques when it suits.
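
    A toy illustration of the point: with only a handful of observations, a sufficiently flexible model fits the noise exactly, so an impressive in-sample fit means nothing. All numbers below are invented:

    ```python
    # Toy demonstration: a degree-7 polynomial "explains" 8 noisy points perfectly.
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 8)                   # eight "candidates"
    y = 2 * x + rng.normal(0, 0.5, size=8)     # truly linear relationship + noise

    linear = np.polyfit(x, y, deg=1)           # sensible model
    flexible = np.polyfit(x, y, deg=7)         # interpolates all eight points

    print("linear residual SS:  ", np.sum((np.polyval(linear, x) - y) ** 2))
    print("degree-7 residual SS:", np.sum((np.polyval(flexible, x) - y) ** 2))  # ~0
    ```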
    (Original post by shamika)
    A more fundamental flaw: you have tiny amounts of data, so anything beyond a linear regression is almost certainly spurious.
    Spoiler:
    As an actuary, that has never stopped me from applying said techniques when it suits.
    Not quite sure I follow; Cambridge interviews thousands of students. That's a decent sample size, especially compared to the work I usually do!
    (Original post by vincrows)
    Totally agree.
    I've read somewhere (or was it the CAT?) that if an interviewer detects that a candidate has seen a similar question or problem before and has been coached or has practised it, they can swiftly switch to something else, so they can test the candidate on something new and unfamiliar and see how they work it out. And that's the main thing they want to do in interviews.
    They did that for me (though admittedly at Oxford). They gave me a set of chromosomes and literally asked me whether I'd heard of Down's syndrome, and I very hesitantly answered that I thought I had, and that all I knew was that it might be the one with three chromosomes. That alone was sufficient for them to completely cut the question and move on!
    (Original post by nexttime)
    They did that for me (though admittedly at Oxford). They gave me a set of chromosomes and literally asked me whether I'd heard of Down's syndrome, and I very hesitantly answered that I thought I had, and that all I knew was that it might be the one with three chromosomes. That alone was sufficient for them to completely cut the question and move on!
    Yeah, you're up against some of the top academics in the field in the UK, or even in the world, so there's no escaping!
    (Original post by Gregorius)
    Not quite sure I follow; Cambridge interviews thousands of students. That's a decent sample size, especially compared to the work I usually do!
    I was thinking of each college and course combination, but that doesn't really make much sense!
    (Original post by shamika)
    I was thinking of each college and course combination, but that doesn't really make much sense!
    Oh I think it does. You'd like to control for the rater effect mainly, but chucking in a course effect and a college effect would be fun. My computer is salivating at the thought...
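
    Continuing the hypothetical setup sketched earlier, crossed course and college effects can be added via statsmodels variance components; putting everything in one all-encompassing group makes the effects crossed rather than nested. As before, the file and column names are assumptions:

    ```python
    # Sketch: rater, course and college as crossed random effects.
    # "interviews.csv" and all column names are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("interviews.csv")

    vc = {
        "interviewer": "0 + C(interviewer)",  # rater effect
        "course":      "0 + C(course)",       # course effect
        "college":     "0 + C(college)",      # college effect
    }
    model = smf.mixedlm(
        "degree_mark ~ interview_score",
        df,
        groups=np.ones(len(df)),  # single group => crossed variance components
        re_formula="0",           # no extra random intercept for the dummy group
        vc_formula=vc,
    )
    print(model.fit().summary())
    ```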