People walk across the plaza of the U.S. Supreme Court building on the first day of the court’s new term in Washington, U.S., October 3, 2022.
Jonathan Ernst | Reuters
The Supreme Court on Monday stepped into the politically divisive issue of whether tech companies should have immunity over problematic content posted by users, agreeing to hear a case alleging that YouTube helped aid and abet the killing of an American woman in the 2015 Islamic State terrorist attacks in Paris.
The family of Nohemi Gonzalez, one of 130 people killed in a series of linked attacks carried out by the militant Muslim group, argued that YouTube’s active role in recommending videos overcomes the liability shield for internet companies that Congress imposed in 1996 as part of the Communications Decency Act.
The provision, Section 230 of the act, says internet companies are not liable for content posted by users. It has come under heavy scrutiny from both the right and the left in recent years, with conservatives claiming that companies are inappropriately censoring content and liberals saying that social media companies are spreading dangerous right-wing rhetoric. The provision leaves it to companies to decide whether certain content should be removed and does not require them to be politically neutral.
Gonzalez was a 23-year-old college student studying in France when she was killed while dining at a restaurant during the wave of attacks, which also targeted the Bataclan concert hall.
Her family is seeking to sue Google-owned YouTube for allegedly allowing ISIS to spread its message. The lawsuit targets YouTube’s use of algorithms to suggest videos to users based on content they have previously viewed. YouTube’s active role goes beyond the sort of conduct Congress intended to protect with Section 230, the family’s lawyers allege. They say in court papers that the company “knowingly permitted ISIS to post on YouTube hundreds of radicalizing videos inciting violence” that helped the group recruit supporters, some of whom then carried out terrorist attacks. YouTube’s video recommendations were key to helping spread ISIS’s message, the lawyers say. The plaintiffs do not allege that YouTube had any direct role in the killing.
Gonzalez’s relatives, who filed their 2016 lawsuit in federal court in Northern California, hope to pursue claims that YouTube violated a federal law called the Anti-Terrorism Act, which allows people to sue individuals or entities who “aid and abet” terrorist acts. A federal judge dismissed the lawsuit, but it was revived by the San Francisco-based 9th U.S. Circuit Court of Appeals in a June 2021 decision that also resolved similar cases brought against tech companies by the families of victims of other terrorist attacks.
Google’s lawyers urged the court not to hear the Gonzalez case, saying in part that the lawsuit would likely fail regardless of whether Section 230 applies.
The Supreme Court has previously declined to take up cases on Section 230, although conservative Justice Clarence Thomas has criticized it, citing the market power and influence of tech giants.
Another related issue is likely heading to the Supreme Court concerning a law enacted by Republicans in Texas that seeks to prevent social media companies from barring users who make inflammatory political comments. On Sept. 16, a federal appeals court upheld the law, which the Supreme Court in May had prevented from going into effect.
In a separate move, the court also said it would hear a related appeal brought by Twitter on whether the company can be held liable under the Anti-Terrorism Act. The same appeals court that handled the Gonzalez case revived claims brought by relatives of Nawras Alassaf, a Jordanian citizen killed in an Islamist attack in Istanbul in 2017. The relatives accused Twitter, Google and Facebook of aiding and abetting the spread of militant Islamic ideology. In that case, the question of Section 230 immunity had not yet been addressed.