
TikTok moderators level serious allegations against the company

TikTok moderators have once again come forward with serious allegations against the company, raising questions about how TikTok's moderation team handles child sexual abuse material (CSAM).

A Forbes report raises questions about how TikTok’s moderation team handles child sexual abuse material, alleging extensive, unsafe access to illegal photos and videos.

Employees of Teleperformance, a third-party moderation firm that works with TikTok among other companies, claim they were asked to review a disturbing spreadsheet called the Daily Required Reading, or DRR, as part of TikTok's moderation standards. The spreadsheet allegedly contains content that violates TikTok's guidelines, including "hundreds of pictures" of nude or abused children. Employees say hundreds of people at TikTok and Teleperformance can access this content both inside and outside the office, opening up the possibility of a wider leak.

Teleperformance denied showing sexually exploitative content to its employees, and TikTok said its training materials "have strict access controls and do not include visual examples of CSAM", but the company did not confirm that all third-party vendors meet this standard.

Employee accounts differ, however, and as Forbes points out, the practice is legally fraught. Content moderators routinely have to deal with CSAM posted on many social media platforms, but images of child abuse are illegal in much of the world and must be handled with care. Companies are required to report such content to the National Center for Missing and Exploited Children (NCMEC) and retain it for 90 days, while minimizing the number of people who see it.

The allegations here go far beyond that limit. They state that Teleperformance shows employees graphic photos and videos as examples of what to tag on TikTok, while playing fast and loose with access to that content. One employee said they contacted the FBI to ask whether the practice constituted criminally disseminating CSAM, though it is unclear whether an investigation was opened.
