They are talking about a conversion rate of 30% at several banks for the current summer interns.
Reminded me of this morning's FT:
Sex, lies and pitfalls of overblown statistics
By John Kay

A visit to the International Festival of Statistics in Dublin (yes, really) prompted me to offer advice to young scholars on the interpretation and use of economic data. Always ask yourself the question: "where does that data come from?".

"Long distance rail travel in Britain is expected to increase by 96 per cent by 2043." Note how the passive voice "is expected" avoids personal responsibility for this statement. Who expects this? And what is the basis of their expectation? For all I know, we might be using flying platforms in 2043, or be stranded at home by oil shortages: where did the authors of the prediction acquire their insight?
"On average, men think about sex every seven seconds." How did the researchers find this out? Did they ask men how often they thought about sex, or when they last thought about sex (3½ seconds ago, on average)? Did they give their subjects a buzzer to press every time they thought about sex? How did they confirm the validity of the responses? Is it possible that someone just made this statement up, and that it has been repeated frequently and without attribution ever since? Many of the numbers I hear at business conferences have that provenance.

In more intellectual environments, the figures presented may be the product of serious analysis and calculation. Always ask of such data "what is the question to which this number is the answer?". "Earnings before interest, tax, depreciation and amortisation on a like-for-like basis before allowance for exceptional restructuring costs" is the answer to the question "what is the highest profit number we can present without attracting flat disbelief?".

Beware explanations that are tautological: "gross domestic product is a measure of the income of the nation", "movements of the consumer prices index reflect changes in the cost of a basket of commodities compiled by the Office for National Statistics". Always probe descriptions – "GDP is not a measure of output, or of welfare" – that define what a statistic is not, rather than what it is.

"These figures are not forecasts, and should not be relied on by prospective investors." If they are not forecasts, then what are they, and if they are not to be relied on by prospective investors, what purpose was intended in distributing the information to them?

Be careful of data defined by reference to other documents that you are expected not to have read. "These accounts have been compiled in accordance with generally accepted accounting principles", or "these estimates are prepared in line with guidance given by HM Treasury and the Department of Transport".
Such statements are intended to give a false impression of authoritative endorsement. A data set compiled by a national statistics organisation or a respected international institution such as the Organisation for Economic Co-operation and Development or Eurostat will have been compiled conscientiously. That does not, however, imply that the numbers mean what the person using them thinks or asserts they mean.

When the data seem to point to an unexpected finding, always consider the possibility that the problem is a feature of the data, rather than a feature of the world. I recently saw a study of comparative productivity in financial services in which Italy came top and Britain and the US bottom. You might have thought alarm bells would ring, but no: the authors went on to comment that this divergence was serious because of the size of the financial services sectors of Britain and the US. A little thought might have directed the researchers' attention to questions such as "what is meant by output of financial services?".

But it is now easy to import data into a computer program without thought. The unwarranted precision of the projected growth in rail traffic – a 96 per cent increase, rather than a doubling – is a clue that the number was generated by a computer, not a skilled interpreter of evidence.

Statistics are only as valid as the sources from which they are drawn and the abilities of those who use them. When I discover something surprising in data, the most common explanation is that I made a mistake.
care to qualify your statistics? from talking to my boss this week, I'm aware that we are likely to take 6 interns in our product area, possibly 7 (from 11), with 4 more grad places open...
BarCap and DB will give out offers to interns on monday. then we will know their conversion rates, but not before... 30% sounds way too low. even if it is bad, it's gotta be at least 40-50%
Hi all - I'm currently researching with the intention of applying for 2012 graduate schemes. I'm on a post-uni gap year outside the UK, though, so I'm not sure how that will play with interviews and the like (especially as I also don't have any experience); we'll see.
BarCap and DB will give out offers to interns on monday. then we will know their conversion rates, but not before... 30% sounds way too low. even if it is bad, it's gotta be at least 40-50%
well that's not true. none of ours are in on monday, and they don't find out till friday
I'm expecting them to convert 1/2 on our desk, and that seems to be the general consensus from those in positions of power (~50%). However, we have historically had a higher conversion rate than the rest of the street, so who knows, maybe that's optimistic for other banks.