
Jack Bond, opinion editor, is a senior English major from Marion.
On February 28 and March 1, the Supreme Court will hear two cases that could change the way internet content is moderated: Gonzalez v. Google and Twitter v. Taamneh.
Both cases concern the provision of Section 230 stating that websites hosting user-created content cannot be held liable for what those users post. They will address whether the respective companies can be sued for promoting terrorist content by allowing it on their social media platforms.
The law has drawn bipartisan criticism. Democrats condemn it for letting companies get away with promoting hate speech. Republicans condemn the provision that allows websites to remove content they deem offensive, with many saying it serves as an excuse to silence conservative viewpoints.
So the big question: Should websites like YouTube and Twitter be held accountable for hosting terrorist content? Absolutely.
It is up to them to remove such content from their services, yet they have failed to do so. And because the law grants them immunity from liability, the courts can do nothing to hold them accountable.
Granted, there is one problem with trying to do so regardless of what the law says: the near-endless amount of content on these platforms. With so many users worldwide, it is inevitable that some dangerous content will fly under moderators' radar.
It may also be difficult in certain cases to determine what constitutes potentially dangerous content. Given the number of unhinged people who use the internet, it is sometimes impossible to tell irony from sincerity.
Take YouTube's recommendation algorithm, for example. It is an automated process that, while it can be monitored by humans, runs so frequently that controlling it completely would be impossible.
Any revision of the law needs to account for this. It must acknowledge that moderators can only do so much to ensure the content they host is safe. It needs to require moderation while accommodating inevitable slip-ups.
As it stands, the law is contradictory: it allows companies to moderate user content but removes any incentive to do so by granting them immunity from liability.
The law is a relic of an internet that no longer exists. It needs to be changed, and it needs to be changed now.