Twitter Suspended 166,513 Accounts During the Second Half of 2018 for Promoting Terrorism

456,989 were removed for violations related to child sexual exploitation


Twitter suspended 166,513 accounts during the second half of 2018 for violations related to promotion of terrorism, along with a whopping 456,989 unique accounts for violations related to child sexual exploitation, the social network revealed in its latest biannual Twitter Transparency Report.

Legal, policy and trust and safety lead Vijaya Gadde shared key points from the report in a blog post Thursday.

She said the number of accounts suspended for promoting terrorism was down 19% from the previous reporting period (the first half of 2018), and that 91% of the suspended accounts were flagged by Twitter's internal tools.

Gadde wrote, “The trend we are observing year-on-year is a steady decrease in terrorist organizations attempting to use our service. This is due to zero-tolerance policy enforcement that has allowed us to take swift action on ban evaders and other identified forms of behavior used by terrorist entities and their affiliates. In the majority of cases, we take action at the account setup stage—before the account even tweets. We are encouraged by these metrics but will remain vigilant. Our goal is to stay one step ahead of emergent behaviors and new attempts to circumvent our robust approach.”

The number of accounts suspended for child sexual exploitation was down 6% from the first half of 2018, and Gadde said 96% of those were surfaced by Twitter's machine learning and artificial intelligence tools, as well as by other technology solutions, such as PhotoDNA.

Twitter continues to report its findings to the National Center for Missing and Exploited Children.

Gadde said Twitter received roughly the same number of legal requests for account information from governments and nongovernmental organizations as it did in the previous reporting period, noting a drop in such activity from the U.S., which submitted 6% fewer requests than in the first half of last year, with those requests specifying 58% fewer accounts.

Twitter received requests of this sort from 86 different countries in the second half of 2018, with the U.S. accounting for 30% of them, followed by Japan (24%), the U.K. (13%), India (6%), Germany (6%) and France (5%).

Global emergency disclosure requests were down 2% compared with the first half of 2018, with the U.K. (33% of the total) and the U.S. (30%) accounting for the bulk of them.

Gadde noted that while the number of legal requests for content removal was down 8% in the second half of last year compared with the previous reporting period, affecting approximately 2% fewer accounts, such requests jumped 84% in 2018 compared with 2017.

She added that 74% of the removal requests Twitter received came from Russia and Turkey, and that overall, requests of this type came from 48 different countries and specified 27,283 accounts.

Finally, Gadde outlined the number of accounts reported by known government entities for violating the six Twitter Rules categories specified in its Twitter Transparency Report: abuse, child sexual exploitation, hateful conduct, private information, sensitive media and violent threats.

She said 16,388 accounts were reported in this fashion in the second half of 2018, versus 5,461 in the prior reporting period, adding, “It is worth noting that the raw number of reported accounts is not a consistent indicator of the validity of the reports we receive. During our review process, we may consider whether reported content violates aspects of the Twitter Rules beyond what was initially reported. For example, content reported as a violation of our private information policy may also be a violation of our policies for hateful conduct. If the content is determined to violate any Twitter Rule, it is actioned accordingly. Not all reported accounts are found to violate the Twitter Rules, and reported accounts may be found to violate a different rule than was initially reported.”

Gadde added, “We may also determine that reported content does not violate the rules at all. The volumes often fluctuate significantly based on world events, including elections, national and international media stories and large conversational moments in social and political culture.”


David Cohen (david.cohen@adweek.com) is editor of Adweek's Social Pro Daily.
{"taxonomy":"","sortby":"","label":"","shouldShow":""}