Twitter is not liable for providing material support to the Islamic State group, also referred to as ISIS, by allowing its members to sign up for and use accounts on its site, a federal judge in California ruled Wednesday.
The lawsuit against Twitter, filed by the families of two victims of a terror attack in Jordan, is similar to another filed against Twitter, Google and Facebook by the father of a victim of the November Paris attacks, which alleges that the companies provided material support to terrorists by giving them a forum for propaganda, fundraising and recruitment.
These lawsuits accuse the internet companies of violating provisions of the Anti-Terrorism Act and aim to deny them refuge under provisions of the Communications Decency Act, which protect publishers from liability for content posted to their sites by third parties.
Citing the Act, Judge William H. Orrick of the U.S. District Court for the Northern District of California wrote in his order that “as horrific as these deaths were, under the CDA Twitter cannot be treated as a publisher or speaker of ISIS’s hateful rhetoric and is not liable under the facts alleged.”
Section 230(c)(1) of the Communications Decency Act states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
In November 2015, Lloyd “Carl” Fields, Jr. and James Damon Creach were shot and killed by a Jordanian police officer, Anwar Abu Zaid, while working as U.S. government contractors at a law enforcement training center in Amman. ISIS claimed responsibility for the attack by the police officer, who was studying at the center, and described him as a “lone wolf.”
The families of Fields and Creach filed the suit, claiming that Twitter’s provision of material support to ISIS was a proximate cause of the shooting. 18 U.S. Code 2339A and 2339B prohibit the knowing provision of material support or resources for terrorist activities or foreign terrorist organizations, and the term “material support or resources” is defined to include “any property, tangible or intangible, or service,” including “communications equipment,” according to the court papers.
According to the victims’ families, Twitter’s alleged violations of the anti-terrorism laws cannot accurately be characterized as publishing activity, which the Communications Decency Act addresses, but rather as the provision of the means through which ISIS communicates. “Even if ISIS had never issued a single tweet, [Twitter’s] provision of material support to ISIS in the form of Twitter accounts would constitute a violation of the ATA,” they said in a filing.
The judge, however, noted that under either theory, “the alleged wrongdoing is the decision to permit third parties to post content – it is just that under plaintiffs’ provision of accounts theory, Twitter would be liable for granting permission to post (through the provision of Twitter accounts) instead of for allowing postings that have already occurred.” The judge added that he was not convinced that the provision of accounts theory treats Twitter as something other than a publisher of third-party content.
The families were also unable to establish a cause-and-effect link between Twitter’s provision of accounts to ISIS and the deaths of Fields and Creach. The only arguable connection identified between Abu Zaid and Twitter is that his brother told reporters that Abu Zaid had been very moved by an execution carried out by ISIS, which the group publicized through Twitter. That connection, however tenuous, is based on specific content disseminated through Twitter, not the mere provision of Twitter accounts, the judge noted.
In the other lawsuit filed against Google’s YouTube, Twitter and Facebook, the father of Paris terror victim Nohemi Gonzalez charges that the companies “have knowingly permitted the terrorist group ISIS [Islamic State group] to use their social networks as a tool for spreading extremist propaganda, raising funds and attracting new recruits.”
Social networks claim they are doing their best to weed out terrorist content, though the effort is turning out to be a game of whack-a-mole, with the proscribed content or new content resurfacing elsewhere. Twitter said in February that, as noted by many experts and other companies, “there is no ‘magic algorithm’ for identifying terrorist content on the internet.”
Judge Orrick allowed the families of the victims to file their second amended complaint, if any, within 20 days of his order.