
A deeply disturbing new report from the Wall Street Journal found that Instagram’s algorithm has helped build a pedophile network on the app, including sellers of child pornography.

The algorithm is designed to identify niche interests and connect like-minded communities, but that same process makes it just as easy to connect child predators with one another.

According to the Journal, researchers discovered that Instagram allows users to search hashtags connected to explicit content involving minors. From there, the algorithm suggests similar accounts, including ones promoting child abuse.

Parent company Meta acknowledged the problem and is setting up a task force to address it.

“Child exploitation is a horrific crime. We’re continuously investigating ways to actively defend against this behavior,” the company said in a statement.

A spokesperson said Meta had taken down 490,000 accounts since January for violating its child safety policies. The company has also disabled a number of hashtags.
