Internet's biggest pro-Trump community banned for advocating murder of cops

#41 by 134841422 (thread starter), 3 weeks ago
(Original post by AperfectBalance)
Got your daily dose of smug?

Can we also agree there is a difference between me saying "I don't like you" in some online place and me walking into your house, standing next to you, and saying "I don't like you"?
there's literally no difference

let's say I make an online forum, right? I pay the domain fees and make other small investments (WITH MY OWN MONEY) such as renting server space to store the data of my site's users, etc., etc.
now, I can use their data to help pay my own bills by selling it to companies so they can advertise to specific demographics (and remember, this is all legal, it's the stuff the user agrees to when he 'accepts terms and conditions' and allows cookies)

now, some of these companies, which are paying the big bucks that are essential to keep my site running, threaten to stop doing business with me unless I curb some of the more extremist content on my site (as, like any normal company, they don't want the bad PR of being associated with such extremists)

now, what you're saying is that I should do nothing at all and allow my site to be shut down, and hence lose the money I invested/not turn a profit, just for the sake of free speech (which doesn't even apply to private platforms anyway)? even though this is a private, for-profit company that I made and that is not subsidized by the government at all

yeah, good logic there

when you get older, you'll thank me for this short lesson in business - it will make accepting reality easier
Last edited by 134841422; 3 weeks ago
#42 by Ascend, 3 weeks ago
Arseward tribalistic useful idiots.
#43 by AperfectBalance, 3 weeks ago
(Original post by 134841422)
there's literally no difference ...
Reddit seems to be doing fine with all the other very questionable content on the website, especially the leftist content that is just as violent.

And yes, there is a very clear difference. If I were online and said to someone "I am going to beat you up", you would probably brush it off as some idiot; if some guy turned up uninvited and said "I am going to beat you up", you would probably take things a bit more seriously, and rightly so. This whole 'private is private' thing really is flawed, and even with your lack of logic you do not even take into account whether something is right or wrong, just 'it makes money so it's fine'.

And sure, it is legal, but so what? Slavery was once legal; that doesn't make it morally right. The main argument here is not "IT'S AGAINST THE LAW TO DO THIS"; the main argument is that it isn't the right thing to do.

My argument was never about the financial side of this, so it's pretty pointless thinking that you have outsmarted me, especially with such smug comments.
Last edited by AperfectBalance; 3 weeks ago
#45 by 134841422 (thread starter), 3 weeks ago
(Original post by AperfectBalance)
Reddit seems to be doing fine with all the other very questionable content on the website ...
whatever you say buddy
I'm guessing you're planning on never being an employer and just remaining an employee
#46 by AperfectBalance, 3 weeks ago
(Original post by 134841422)
whatever you say buddy
I'm guessing you're planning on never being an employer and just remaining an employee
I'm waiting for you to push your MLM on me or something now.
#47 by 134841422 (thread starter), 3 weeks ago
(Original post by AperfectBalance)
I'm waiting for you to push your MLM on me or something now.
if I did, you'd be dumb enough to take it
#48 by fallen_acorns, 3 weeks ago
The internet giants are going to have to decide whether they want to be platforms or publishers soon enough. They can't keep being regulated like one and acting like the other.
#49 by AperfectBalance, 3 weeks ago
(Original post by 134841422)
if I did, you'd be dumb enough to take it
Yeah, sure I would. I'm probably talking to someone with a job at KFC who also has £50 in Bitcoin and now thinks he is some visionary 'self-made man'.
#50 by 134841422 (thread starter), 3 weeks ago
(Original post by fallen_acorns)
The internet giants are going to have to decide whether they want to be platforms or publishers soon enough. They can't keep being regulated like one and acting like the other.
wouldn't make a difference
no private platform has to allow something they don't want - at most they might get a fine for falsely advertising as a public forum - equivalent to a slap on the wrist - and maybe not even that, they'd just change their motto or something
Last edited by 134841422; 3 weeks ago
#51 by fallen_acorns, 3 weeks ago
(Original post by 134841422)
wouldn't make a difference
no private platform has to allow something they don't want - at most they might get a fine for falsely advertising as a public forum - equivalent to a slap on the wrist - and maybe not even that, they'd just change their motto or something
The problem with this line of argument, though, is that it ignores two things: how they are regulated, and how their societal role has shifted.

Their regulation under Section 230 of the Communications Decency Act treats them very much as a public space rather than a publishing platform. It gives them near immunity for the content posted on their site while giving those publishing the content no recourse to dispute edited or removed content. That worked fine when the act was created in 1996, but things have changed a little since then. The key act that now regulates social media platforms was created before social media platforms even existed, and was designed to regulate ISPs instead, at a time when the internet was nowhere near as pervasive in our public discourse as it is now and the biggest fear was an ISP being sued for content it was displaying. Since then a small handful of companies have grown to dominate the online landscape in a way we have never seen in business before (due to the scalable nature of online business compared to brick-and-mortar business), and we have seen the journalistic, political and social landscape become dominated and centralized by fewer than 10 major companies.

Obviously things have changed somewhat since then, but the legal framework hasn't kept up. For example, compare them with printed publications, like newspapers, or with broadcast media. You are correct that no one can force the media to say anything. They are privately run and can publish what they want. But they are acknowledged legally as publishers, not platforms. They are fully legally responsible for the content they publish, not the writers or contributors who create it. If the New York Times publishes an article by a journalist that is deemed legally problematic, it is the NYT that will be prosecuted rather than the author.

So you have two frameworks: that of the publisher or that of the platform. Currently the 'platform' option is the one we take, based on very outdated rules that give social media platforms no legal responsibility for content that is on their sites, because they are technological platforms, not publishers.

Legally, under Section 230 they are currently in a golden spot, though, because they can perfectly legally edit, delete or censor your content without you having any legal recourse, and are immune to all of the legal responsibilities that publications have. People's argument currently is that this outdated framework gives them the benefits of being a publisher with only the regulation of being a platform; it's a best-of-both-worlds scenario they are in. They can edit content, filter content, and broadcast the message they want, all traits of a publisher, whilst coming under none of the regulatory and legal scrutiny. For example, you can sue a newspaper for defamation if it publishes an article that contains defamatory material, but you can't sue a social media platform that merely allowed defamatory content to be published on it.

What people want changed has nothing to do with them being private or not, but everything to do with what type of private business they are, and how they can legally be held accountable for their actions and the content published on their site.

So, either people want them to be regulated as platforms, under a new and updated bill that keeps them from being prosecuted for work published on their platforms, but which also removes their ability to edit, remove, filter or otherwise affect the publication of content in any way beyond what is legally established by the government.

Or people want them to be regulated as publishers that can do what they want, ban any opinions they want, set their own publishing guidelines that are vastly different from the country's legal ones, and act in any way they like, but which are then legally responsible for content published through their domain, in the same way that other curated media is.

Neither option has anything to do with them being private or not private. No one (except some very radical left/right individuals) wants them to be nationalized or made public. Private or not private isn't the argument at all; it's all about how they are regulated and what type of private business they function as.
#52 by 134841422 (thread starter), 3 weeks ago
(Original post by fallen_acorns)
The problem with this line of argument, though, is that it ignores two things: how they are regulated, and how their societal role has shifted. ...
i stopped reading at 'societal role' - just because a specific private platform has a large userbase doesn't justify it being more regulated than any other private platform; it is nonsensical to punish a business just because it is successful

i can already tell that this is essentially an argument of libertarian vs authoritarian

people are still free to say whatever the fck they want on the internet - these sites are a business and should be treated as one, simple as that
if people don't like it, they will move on and create new ones and the balance of power will be restored - supply and demand and all that
Last edited by 134841422; 3 weeks ago
#53 by fallen_acorns, 3 weeks ago
(Original post by 134841422)
i stopped reading at 'societal role' ...
"i stopped reading at 'societal role'"

That's pretty clear.

What's the use in reading what someone writes, when you can presume that what they said fits in line with the perspective you want to argue against and just keep going...

I post a long explanation of how it's not about whether they are a business; it's all to do with the regulation of different types of businesses.

You reply: "these sites are a business and should be treated as one"

Spot on.
#54 by Napp, 3 weeks ago
(Original post by 423635)
So one or two people incited some violence on there
You see it non-stop from the left and it's left alone
Just double standards, used to justify censorship of a whole group
God, this pathetic victim complex you lot seem to have is ridiculous. I mean it's also ********.
But I guess you're right: letting off bombs in office blocks, driving cars into people, sending mail bombs, attempting to assassinate politicians, and being considered the biggest terror threat facing the US... oh wait... that's the 'right'.
Get your facts straight in future. And this bit I can't stress enough, in a broad sense (and irony abounds, as it's a 'right-wing' catchphrase): this snowflakery needs to stop; it's risible.
#55 by 134841422 (thread starter), 3 weeks ago
(Original post by fallen_acorns)
"i stopped reading at 'societal role'" ...
I've read it all now and still stand by my opinion
you still want unnecessary regulation, I still don't

Trump's Supreme Court appointee, Brett Kavanaugh, says it better than me:
In short, merely hosting speech by others is not a traditional, exclusive public function and does not alone transform private entities into state actors subject to First Amendment constraints. If the rule were otherwise, all private property owners and private lessees who open their property for speech would be subject to First Amendment constraints and would lose the ability to exercise what they deem to be appropriate editorial discretion within that open forum. Private property owners and private lessees would face the unappetizing choice of allowing all comers or closing the platform altogether.


I get what you're trying to do, but it's the wrong way to do it - when's the last time the government being this involved in the running of a business ended well? It's better to let the investors decide what they need to do/allow.
Last edited by 134841422; 3 weeks ago
#56 by fallen_acorns, 3 weeks ago
(Original post by 134841422)
I've read it all now and still stand by my opinion ...
I agree with him. I don't think they should be forced to become some kind of public entity.

But that doesn't get to the real issue, which is what type of private entity they should be treated as. One of your earlier posts said (paraphrasing) 'you can't force a private entity to publish something they don't want to', which I agree with, but you can hold different private companies accountable in different ways for the content they publish, or that is published through them. For example, as I mentioned, both social media platforms and news platforms are private businesses that operate as such, but currently they face wholly different sets of regulatory constraints and legal ramifications for their actions. You can fully support keeping them as private entities that do not need to be forced to serve the public, while arguing for their regulation to be brought in line with other forms of media (or, if you are a libertarian, you may argue the reverse: that other private businesses should have their regulation decreased to the level of social media platforms), but either way the discrepancy is where the argument lies.

For example, how do you justify the following:

If a national newspaper in the US published an article written by a journalist claiming "Donald Trump raped and killed 50 babies", it could be sued and held accountable for its actions.

If Facebook banned all accounts of journalists claiming he didn't rape and kill babies, but kept on its platform news sources claiming he did, it could not be held legally accountable for its actions.

Both are essentially achieving the same aim: within their capability, they are allowing a single opinion to be published on their platform. But one is accountable and the other is not. To be logically consistent you either need to be in favor of lowering the regulations and laws governing other forms of media, or of increasing them on social media platforms that edit their content.

That is a representation of the core of the censorship debate. It's not arguing whether everyone should or shouldn't be allowed to post; it's arguing at which point censorship overlaps into editorial publishing, and how that should be regulated. There are some extreme cases of people arguing for an entirely unregulated internet, but for most people it's the line of regulation that causes the conflict, even if they can't quite express it, so they just resort to saying that it should be public and open, because that's the easiest solution that comes to mind, until you think about it a bit.

For me, I am in favor of minimal regulation, set at a government level, that defines the scope of editorial decisions social media platforms can take regarding the content they publish. They remain not responsible for content posted on their site, but can't restrict content beyond the manner agreed with the government, presumably in line with or a bit in excess of the law itself.

(I also think this is what many social media platforms want. They want the government to step in and take the heat away from them, so they don't have to face any of the problems of censorship; they just want to make sure that it happens in a way that favors them rather than hurts them. I personally think that under the next Democratic president we will see regulatory and censorial responsibility transferred from social media platforms to the government, in exchange for their continued effective immunity.)
#57 by 134841422 (thread starter), 3 weeks ago
(Original post by fallen_acorns)
For me, I am in favor of minimal regulation, set at a government level, that defines the scope of editorial decisions social media platforms can take regarding the content they publish. They remain not responsible for content posted on their site, but can't restrict content beyond the manner agreed with the government, presumably in line with or a bit in excess of the law itself.
They want the government to step in and take the heat away from them, so they don't have to face any of the problems of censorship; they just want to make sure that it happens in a way that favors them rather than hurts them. I personally think that under the next Democratic president we will see regulatory and censorial responsibility transferred from social media platforms to the government, in exchange for their continued effective immunity.
that seems like a rational and fair compromise - but we can't say anything for sure until we actually know what these decisions are

I personally abhor and am disgusted by any action by the government which tells a private company what it can and cannot allow on its platform
because at the end of the day, what happens when these seemingly small federal decisions start affecting the company's profit margins? what happens when these decisions make it impossible for the platform to stay open? (see my previous comment on company PR and association and ad revenue)
Last edited by 134841422; 3 weeks ago