Partisanship has made the logjam worse. Republicans, some of whom have accused Facebook, Twitter and other sites of censoring them, have pressured the platforms to leave more content up. In contrast, Democrats have said the platforms should remove more content, like health misinformation.
The Supreme Court case that challenges Section 230 of the Communications Decency Act is likely to have many ripple effects. While newspapers and magazines can be sued over what they publish, Section 230 shields online platforms from lawsuits over most content posted by their users. It also protects platforms from lawsuits when they take down posts.
For years, judges cited the law in dismissing claims against Facebook, Twitter and YouTube, ensuring that the companies did not take on new legal liability with each status update, post and viral video. Critics said the law was a Get Out of Jail Free card for the tech giants.
“If they don’t have any liability on the back end for any of the harms that are facilitated, they’ve basically got a mandate to be as reckless as possible,” said Mary Anne Franks, a University of Miami law professor.
The Supreme Court previously declined to hear several cases challenging the statute. In 2020, the court turned down a lawsuit, brought by the families of people killed in terrorist attacks, that said Facebook was liable for promoting extremist content. In 2019, the court declined to hear the case of a man who said his former boyfriend had sent people to harass him using the dating app Grindr. The man sued the app, saying it had a flawed product.
But on Feb. 21, the court plans to hear the case of Gonzalez v. Google, which was brought by the family of an American killed in Paris during an attack by followers of the Islamic State. In its lawsuit, the family said Section 230 should not shield YouTube from the claim that the video site supported terrorism when its algorithms recommended Islamic State videos to users. The suit argues that recommendations can count as their own form of content produced by the platform, removing them from the protection of Section 230.
A day later, the court plans to consider a second case, Twitter v. Taamneh. It deals with a related question: when platforms are legally liable for supporting terrorism under federal law.