Meta has a problem with pedophiles on its platforms, as a new report from The Wall Street Journal has found instances where the algorithms powering the company’s platforms have actually promoted pedophile accounts.
Back in June, the WSJ and Stanford University worked together to uncover an underground pedophile network on Instagram. The June report revealed that Instagram's algorithm was connecting a distribution network of underage sex content. Meta responded promptly by forming a child-safety task force dedicated to removing underage sexual content from the platform. However, it isn't just Instagram that has problems.
The WSJ report found that entire Facebook groups are still dedicated to sharing content that sexualizes children, that pedophile-related hashtags remain abundant, and that other Facebook groups openly celebrate incest and sex with children. The WSJ states it flagged these groups to Meta, which responded to the publication by saying the groups weren't violating any of the platform's community standards.
Meta has announced that it has removed more than 16,000 accounts since July for violating its child safety community standards, but if the WSJ report is anything to go by, much more work needs to be done before content sexualizing children is purged from its social media platforms.
Additionally, the Canadian Centre for Child Protection discovered several Instagram accounts with up to 10 million followers that livestreamed videos of children being sexually abused.