Instagram's algorithms promote pedophile content
Accounts found by researchers advertise using blatant hashtags such as #pedowhore, #preteensex and #pedobait. These accounts offer “menus” of content, including videos and images, that users can purchase or order.
Researchers said that when they created a test account and viewed content shared by these networks, they were immediately recommended more accounts to follow. “Following just a few of these recommendations was enough to flood a test account with content that sexualized children,” the WSJ said.
Meta says it is working on the issue
In response to the report, Meta said it had set up a unit to address the issues raised by the investigation. “Child abuse is a terrible crime. We are constantly exploring ways to actively defend against this behavior,” the company said.
Meta said it removed 490,000 accounts that violated its child safety policies in January alone, and took down 27 pedophile networks over the past two years. The company, which also owns Facebook and WhatsApp, said it had also blocked thousands of hashtags associated with the sexualization of children and restricted the use of these terms in user searches.
Other platforms also reviewed
Other platforms were also examined in the report, but they were found to be less conducive to the growth of such networks. Stanford researchers found 128 accounts offering to sell child sexual abuse material on Twitter, less than a third of the number they found on Instagram, even though Twitter has far fewer users. The report stated that such networks did not appear to be growing on TikTok or Snapchat.
David Thiel, chief technologist at the Stanford Internet Observatory, said in a statement that Instagram had failed to strike the right balance between recommendation systems designed to surface content and connect users, and safety features that review and remove abusive content.