
US Supreme Court Shields Internet Platforms from Terrorism Content Liability

In a highly anticipated ruling, the US Supreme Court has decided that internet platforms cannot be held legally responsible for terrorism-related content posted by their users. The court's unanimous opinion in the companion cases of Twitter v. Taamneh and Gonzalez v. Google LLC clarifies the liability of social media platforms under the Anti-Terrorism Act, as amended by the Justice Against Sponsors of Terrorism Act (JASTA).

The Taamneh lawsuit was brought by relatives of a victim of the Reina nightclub attack in Istanbul, for which ISIS claimed responsibility. They sued Twitter, Facebook, and Google, alleging that the tech giants knew ISIS was using their platforms for recruitment and propaganda dissemination yet failed to take appropriate action.

In delivering the court's opinion, Justice Clarence Thomas focused on the plaintiffs' claim that the social media platforms aided and abetted ISIS in carrying out the attack. Under 18 USC § 2333(d)(2), US nationals injured by an act of international terrorism may hold civilly liable anyone who aids and abets the attack by knowingly providing substantial assistance. However, the court found that the plaintiffs' allegations were not sufficient to show that the platforms played a culpable role in the Reina nightclub attack.

The court established a three-prong test to determine whether someone “aided and abetted” a terrorist act under §2333(d)(2):

  • A known terrorist must commit a terrorist attack.
  • The defendants must have been generally aware of their role in the terrorist's activities.
  • The defendants must have provided “knowing and substantial assistance” that demonstrates their culpable participation in the attack.

While the court acknowledged that the first two prongs were satisfied in Twitter's case, the plaintiffs failed to meet the threshold for the third. The court treated the platform's recommendation algorithms, which formed the basis of the plaintiffs' claims, as part of the platform's general infrastructure rather than as culpable assistance to ISIS. It stated that attributing liability on that basis alone would be overly broad and unreasonable, noting that similar arguments could be made about other communication technologies such as cell phones, email, or the internet itself.

The court reasoned that holding the defendants liable on these facts would effectively make technology providers culpable for every terrorist act committed by extremist groups that use their services. Consequently, it reversed the lower court's ruling, concluding that the social media platforms, including Twitter, could not be held liable under §2333(d)(2) in this particular case.

In the related case of Gonzalez v. Google LLC, the court applied the same reasoning as in Twitter. It found that the plaintiffs failed to plausibly allege that Google, as the owner and operator of YouTube, knowingly provided substantial assistance to ISIS in connection with the disputed content posted on the platform. The court vacated the judgment below and remanded the case, instructing the lower court to reconsider it under the standard established in the Twitter ruling.

The Supreme Court's decision establishes an important precedent regarding the liability of internet platforms for terrorism-related content. It clarifies that while bad actors may use social media platforms for nefarious purposes, the platforms themselves are not automatically responsible for the actions of their users. The ruling acknowledges the role of technology providers in facilitating communication and information sharing, emphasizing that holding them liable for every criminal misuse of their services would sweep far too broadly.

This landmark ruling has significant implications for the ongoing debate surrounding the responsibility of internet platforms in regulating content and combating terrorism. It highlights the need for a balanced approach that considers the limitations of technology companies while also addressing legitimate concerns related to online extremism and terrorist activities.
