The Student Room Group

David Cameron looks to have stumped all paedophiles then!

Every single one who types "child porn" into Google to get at their material, that is. Good job, Dave, we're safe once that legislation comes into place!

It worries me that a man who sits talking with the CEOs of the most powerful companies in the world has no idea how these ideas would actually work.

Looks like we're going to be arresting lads aged 13-19 who type 'child porn' into their mate's computer for a joke.

Tragic.


Reply 1
Original post by paradoxicalme
Sorry, but that's one sick joke.

Have you got any alternative ideas to solve this child pornography consumer problem?


No it's not.

The person and cabinet who run our country shouldn't be so bloody stupid as to think every paedophile types "cp" into Google.
Reply 2
You sound as though you have something to worry about here, OP. I can't see many downsides to this if they got it to work on a technical level.

There's no reason to be searching for those sorts of things at all, and I'd be concerned if someone had a mate who would actually google child porn on his mate's computer "for a laugh".
Not to mention, the reason those rings are so hard to catch is that they don't use Google; they're often large syndicates exchanging content amongst themselves, not broadcasting it to the world.
Yes, I hear our police forces in Rochdale, Oxford, Keighley and Telford are doing very well at tackling such paedophile gangs. As are social services, who have been brilliant at stopping the trend.
Reply 5
Source?

I assume it would somehow check whether you'd actually looked at images? For example, if you were doing some sort of analysis of sexual abuse, rape cases, child abuse etc. and searched "child pornography arrest statistics" or something akin to that, would this system automatically pick up on those first two words and alert some sort of authority?

Also, I highly doubt sophisticated viewers of such content are going to use search terms like "child porn". Rather, as discussed previously, they would use peer-to-peer transfers or search for terms leading them to reservoirs of content whose names give an illusion of credibility or innocence.
Reply 6
Couple of points here.

First of all, I know it sounds like a silly question, but what qualifies as child porn? Isn't porn different things to different people? So where would Cameron draw the line?

Secondly, since paedophilia is a recognised mental abnormality, rather than spend millions hunting down and prosecuting people who type the words "child porn" into Google, perhaps that money could be better spent on healthcare for these people.
Reply 7
Why is David Cameron playing cricket with paedophiles? I mean, doesn't he have a job running the country?!
Original post by atastycrumpet
No it's not.

The person and cabinet who run our country shouldn't be so bloody stupid as to think every paedophile types "cp" into Google.


I meant, if 13-19 year olds are typing 'child pornography' into Google as a joke, then it's a pretty sickening joke.

No, they don't, but it's where a lot of paedophiles start.
Original post by Howard

Secondly, since paedophilia is a recognised mental abnormality, rather than spend millions hunting down and prosecuting people who type the words "child porn" into Google, perhaps that money could be better spent on healthcare for these people.


Good point.

A lot of people think 'paedophile' and visualise an old guy chasing a kid down the street with a butterfly net, and don't realise that what 'paedophilia' actually describes is an attraction towards children, which is a mental abnormality. There are probably many paedophiles living normally because they suppress their urges or just don't act on them. And that would also mean that women can be paedophiles (a view many don't have). People view them as the scum of society - and yes, the ones who act on their urges by kidnapping and assaulting children are - but really we need to get them psychiatric help.
Original post by paradoxicalme
I meant, if 13-19 year olds are typing 'child pornography' into Google as a joke, then it's a pretty sickening joke.

The number of times people put meatspin up on school computers was fairly substantial. Unless you lock the screen, a jackass friend will start doing dodgy searches on your behalf. Sick? Perhaps. That doesn't mean it's not a widespread occurrence.



This is a very slippery slope and the methods they are using are so naive. Pretty much all my university buddies have software installed to get around internet filters, so what good is it going to do to add more of them? Even citizens in China can access banned websites, so how do the government believe they can suddenly stop it now?

The additional regulations (filters by default, bans on certain types of porn, etc.) are either subjective or promote a lazy culture. Parents are naive to believe this will stop little Jimmy from finding porn - I doubt it will be long before school kids start setting up Dropbox accounts and selling access to porn they've downloaded. Sharing data is far too easy and will lead to a "worse" culture than there already is. What do they think they are solving? You don't even need the internet to share data... Most kids have a smartphone, so they'll just send the files over that, share USB sticks, or email them to each other. The workarounds are ridiculously simple.


Regarding child abuse, that absolutely must stop, but these proposals are aimed in entirely the wrong direction. It has taken us decades to uncover some child abuse cases because those who do it are trying not to be found! Spend the money on getting more police to hunt down the real criminals and to do proper investigative work. Who do they think is going to Google search for highly illegal porn anyway?

Subjectively blocking porn is a dangerous business as well. The definitions will only get broader. There are plenty of artistic nude photographers out there who may suffer and plenty of people who are likely to get in trouble because the government were over-zealous in choosing who can view what.

I wouldn't be surprised if they get a visit from "Anonymous" and I really hope the EU steps in to support the citizens on this one.
(edited 10 years ago)
The government must be aware that someone isn't actually going to go to Google and type 'child porn' to find it.
At the same time, the government are making cuts to the very police units and NGO groups that monitor child porn and attempt to apprehend offenders and rescue children in distress.

Pure politics and nothing whatsoever to do with actually making a difference - this has come from Lynton Crosby and is just another soundbite. They couldn't even be arsed to check their facts before they mouthed off - someone wrote this one on the back of a Chablis glass mat.

Ah well, it'll all be forgotten about come the next 'announcement'.
It seems like a good idea in principle, but will fail epically.

The BBC report mentioned a couple of people who had used Google to find such images. BUT once you've googled it once and found a site with lots of pictures/videos, you don't need to google it any more; you can just go straight to that site. Meaning that this ban will be useless at stopping the worst offenders, who would have found such sites years ago. And others who are part of paedophile rings (which are worse than lone individuals) probably didn't use Google in the first place; they have a website for their own ring and share pics/vids with their friends, and possibly email each other as well. This new legislation won't stop that at all.

One of the big problems is that the obvious words will be blocked, but they'll still find a way round it by inventing new words or using new code words, and that's if they even use Google to find these pictures/videos, which I suspect the worst people don't.

It's not quite the same, but... my biology teacher googled 'blue tits', as in the bird, during a lesson. Because of the second word, the results were blocked; it had obviously been flagged by the computer, and the head of ICT came in to see what was going on! It was completely innocent and relevant to our lesson, but it was blocked. In the end, she googled the Latin name for them and got what we needed for the lesson. Something similar will happen in real life. People might not be able to use the obvious words which are banned, but they'll find a way around it to get to the same results.

The problem with this ban on certain words, given how search engines work, is that it will end up blocking innocent things (like what happened in my biology lesson, although the security settings were stupidly tight on those computers) while still allowing a lot of pornography to be viewed.

Like someone on here mentioned, won't it end up blocking results where people are researching statistics or news reports about child abuse/pornography!?
(edited 10 years ago)
Reply 14
Original post by Howard
Couple of points here.

First of all, I know it sounds like a silly question, but what qualifies as child porn? Isn't porn different things to different people? So where would Cameron draw the line?

Secondly, since paedophilia is a recognised mental abnormality, rather than spend millions hunting down and prosecuting people who type the words "child porn" into Google, perhaps that money could be better spent on healthcare for these people.


But how do you identify them then?
Original post by paradoxicalme
Good point.

A lot of people think 'paedophile' and visualise an old guy chasing a kid down the street with a butterfly net, and don't realise that what 'paedophilia' actually describes is an attraction towards children, which is a mental abnormality. There are probably many paedophiles living normally because they suppress their urges or just don't act on them. And that would also mean that women can be paedophiles (a view many don't have). People view them as the scum of society - and yes, the ones who act on their urges by kidnapping and assaulting children are - but really we need to get them psychiatric help.


I completely agree. Except if a paedophile causes harm to a child they should still be punished and receive treatment, because a very small minority have no remorse for their actions and treatment alone simply won't do.
Do you agree?
Original post by fusensor
Why is David Cameron playing cricket with paedophiles? I mean, doesn't he have a job running the country?!


That's the Tories on a team-building exercise
Original post by anony.mouse
It's not quite the same, but... my biology teacher googled 'blue tits', as in the bird, during a lesson. Because of the second word, the results were blocked; it had obviously been flagged by the computer, and the head of ICT came in to see what was going on! It was completely innocent and relevant to our lesson, but it was blocked. In the end, she googled the Latin name for them and got what we needed for the lesson. Something similar will happen in real life. People might not be able to use the obvious words which are banned, but they'll find a way around it to get to the same results.


I bet that finally made the biology degree seem worthwhile!
This is such a stupid idea that it is difficult to know where to start.

The Chinese have text-based ban lists. At least one has been leaked, so we can see that they're pretty odd, even allowing for machine translation. But doing text matching on searches and results is simple.

Here, the technically illiterate PM and his media supporters imagine there is a way to automatically know whether an image or video could be classified as unacceptable (to whom?). This is harder, to put it mildly. If the ISPs and search engines have any balls, they will be asking for a precise, unambiguous, programmable definition of what to ban - rather than 'stuff I don't like' - with a guarantee that they will be indemnified against damages for 'false positives'. The government will either fail totally to come up with it, or it will solve several major problems in AI. Either way, result!

Even with text, there are problems. Ban a word or phrase and people use other ones. (It's probably why the Chinese lists are so weird - they're trying to keep up with an evolving target). I am hoping that 'conservative' becomes the new code word for porn, and such searches get banned.

Before that happens, do they really want to ban searches for 'block child porn' because of the last two words?

Oh, my stupid censorship story is a filter that blocked the entire bbc.co.uk site because somewhere on it was the word 'breasts'.
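To make the false-positive point concrete, here's a rough sketch in Python of the sort of naive substring matching being described. The blocklist, function name and example queries are all made up for illustration; nobody has published how any real ISP or search-engine filter actually works.

# Hypothetical sketch of a naive keyword filter; blocklist and examples are invented.
BLOCKED_TERMS = ["child porn", "tits"]

def is_blocked(query):
    """Flag a search query if any blocked term appears anywhere within it."""
    q = query.lower()
    return any(term in q for term in BLOCKED_TERMS)

print(is_blocked("blue tits nesting habits"))             # True  - the biology lesson, blocked
print(is_blocked("block child porn petition"))            # True  - trying to support the ban, blocked
print(is_blocked("child pornography arrest statistics"))  # True  - research query, blocked
print(is_blocked("c.p. collection"))                      # False - a trivial code word sails through

Crude substring matching flags the three innocent queries and misses the disguised one, which is exactly the problem with banning words rather than content.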
Original post by Howard
what qualifies as child porn?


The legal definition is stupider than I hope you can imagine.

See the Brass Eye programme on 'nonce sense', with the former head of the Met's vice squad being asked about various images. It's gotten sillier since then.
