TikTok under US government investigation over child sexual abuse material

TikTok is under investigation by U.S. government agencies over its handling of child sexual abuse material, as the fast-growing short-form video app struggles to moderate a flood of new content.

Dealing with sexual predators has been an ongoing challenge for social media platforms, but TikTok’s young user base has made it a particular target.

The U.S. Department of Homeland Security is investigating how TikTok handles child sexual abuse material, according to two sources familiar with the matter.

The Department of Justice is also reviewing how a specific privacy feature on TikTok is being exploited by predators, said a person with knowledge of the case. The DOJ has a long-standing policy of not confirming or denying the existence of ongoing investigations.

“It’s a perfect place for predators to meet, groom and engage children,” said Erin Burke, unit chief of the Child Exploitation Investigations Unit at Homeland Security’s cybercrime division, calling it “the preferred platform” for the behavior.

The investigations highlight how TikTok is struggling to cope with the flood of content generated by its more than 1 billion users. The company, owned by China’s ByteDance, has more than 10,000 human moderators worldwide and has been hiring rapidly in this area.

Business is booming. An Insider Intelligence forecast puts TikTok’s advertising revenue at $11.6 billion this year, triple last year’s $3.9 billion.

Mark Zuckerberg, Meta’s CEO, has blamed TikTok’s popularity among young people as a major reason for waning interest in the company’s longer-established social media platforms, Facebook and Instagram.

But Meta has more experience handling problematic material, with about 15,000 moderators worldwide, alongside automated systems designed to flag posts.

Between 2019 and 2021, the number of TikTok-related child exploitation investigations opened by Homeland Security increased sevenfold.

Social media networks use technology trained on a database of images collected by the National Center for Missing and Exploited Children (NCMEC), the centralized organization to which companies are legally required to report child sexual abuse material.

TikTok reported nearly 155,000 videos last year, while Instagram, which also has more than 1 billion users, made almost 3.4 million reports. TikTok received no removal requests from NCMEC last year, unlike its rivals Facebook, Instagram and YouTube.

“TikTok has zero tolerance for child sexual abuse material,” the company said. “When we find any attempt to post, procure or distribute [child sexual abuse material], we remove content, ban accounts and devices, report to NCMEC immediately, and cooperate with law enforcement as needed.”

However, Homeland Security’s Burke claimed that international companies such as TikTok were less motivated to work with U.S. law enforcement. “We want [social media companies] to proactively make sure children are not being exploited and abused on their sites, and I can’t say that they are doing that, and I can say that a lot of American companies are,” she added.

TikTok said it had removed 96 percent of the content that violated its minor safety policies before anyone had viewed it. Videos of minors drinking alcohol and smoking accounted for the majority of removals under these guidelines.

One pattern the Financial Times confirmed with law enforcement and child safety groups involved content being procured and traded through private accounts, with passwords shared among victims and other predators. Keywords are used in public videos, usernames and biographies, but the illegal content is uploaded using the app’s ‘Only Me’ feature, which makes videos visible only to someone logged into the profile.

Seara Adair, a child safety activist, reported this trend to U.S. law enforcement after first flagging the content to TikTok and being told that one video did not violate its policies. “TikTok constantly talks about the success of their artificial intelligence, but a clearly naked child is slipping through it,” Adair said. All accounts and videos the FT referred to TikTok have since been removed.

“We are deeply committed to the safety and well-being of minors, which is why we build youth safety into our policies, enable privacy and safety settings by default on teen accounts, and limit features by age,” TikTok added.