4chan shows how some platforms are accountable only to themselves
4chan did not respond to repeated requests for comment from CNN Business. A direct inquiry sent to 4chan’s current owner, Hiroyuki Nishimura, also went unanswered.
The site — a barebones forum reminiscent of the early internet, where users post anonymously — hosts a variety of communities in which hate speech is tolerated or celebrated. While major platforms like Facebook and Twitter have multi-faceted terms of service agreements that lay out prohibited behavior such as hate speech, harassment, and racist speech, 4chan has bucked the trend of social platforms adopting increasingly robust content moderation policies.
The site lists a series of rules and warns users that "if we reasonably think you haven't followed these rules, we may (at our own discretion) terminate your access to the site." But it's not clear if or how the rules — which prohibit, for example, posting personal information or sharing content that violates US law — are enforced. In some cases, they appear to be ignored; for example, the rules state that racist posts are only allowed on a certain thread, but rampant racism is easily found throughout the site.
In the wake of the Buffalo shooting, many of the major social media platforms "did go to significant lengths" to quickly remove content related to the attack, "but there's a real problem, which is that there are some platforms who are kind of holdouts that ruin it for everyone," said Tim Squirrell, communications head at the think tank Institute for Strategic Dialogue. "The consequence of that is that you can never complete the game of whack-a-mole. There's always going to be somewhere circulating [this content]," he said.
Squirrell added that such platforms' opposition to removing or moderating content is why footage of the 2019 racist mass shooting in Christchurch, New Zealand "is still available even now, three years later, because you can never stop them all." In the document believed to be authored by the alleged Buffalo shooter, the author described being radicalized by the livestream of that 2019 shooting.
Limits of the law
Despite that legal protection, many of the Big Tech platforms have in recent years ramped up their efforts to moderate and remove certain harmful content — including hate speech and conspiracy theories — in response to pressure from advertisers, to maintain a broad base of users, and to stay in the good graces of lawmakers.
But for sites like 4chan, which don't rely on mainstream advertisers and seek to be homes for content prohibited on other platforms, rather than platforms broadly adopted by many users, there are few incentives to remove harmful or dangerous content. In an email to CNN in 2016, 4chan owner Nishimura said he "personally [doesn't] like sexists and racists… [but] If I like[d] censorship, I would have already [done] that."
Limited options
An extreme intervention with historical precedent would be a move by the internet infrastructure companies that allow sites like 4chan to exist. A similar site called 8chan, which was spun out of 4chan several years ago, has struggled to stay online since the internet infrastructure company Cloudflare stopped supporting it in 2019, after authorities said they believed it was used by the alleged gunman in the El Paso Walmart shooting to post white nationalist writings.
4chan is "intentionally sort of this censorship-free platform, but they have cloud providers and other [internet service providers] they rely on to exist," said Decker, who is also CEO of digital investigations consultancy Memtica. In theory, those internet service providers could say, "we will not allow for this content anywhere, on any entity that uses our tech," which could force 4chan and sites like it to implement stronger moderation practices.
Still, even that is not a surefire means of reining in such platforms. As the ranks of online platforms dedicated to supporting "free speech" at all costs have grown, internet service providers espousing similar views have also emerged.