Facebook and Instagram allow predators to 'trade child pornography,' according to lawsuit filed by New Mexico

Lawsuit alleges Meta promotes child pornography on its platforms

Facebook and Instagram allegedly promote minors’ accounts to apparent child predators and recommend sexual content to underage users, according to a lawsuit filed against parent company Meta Platforms and its CEO, Mark Zuckerberg. 

New Mexico Attorney General Raúl Torrez filed the civil lawsuit in New Mexico state court Tuesday, alleging that "Meta has allowed Facebook and Instagram to become a marketplace for predators in search of children upon whom to prey." The complaint claims the company failed to implement protections to keep users under 13 off its platforms and instead targeted young children's vulnerabilities to increase advertising revenue. Investigators at the AG's office said "certain child exploitative content" is more than ten times as prevalent on Facebook and Instagram as it is on Pornhub and OnlyFans. 

"Our investigation into Meta’s social media platforms demonstrates that they are not safe spaces for children but rather prime locations for predators to trade child pornography and solicit minors for sex," Torrez said in a press release. "As a career prosecutor who specialized in internet crimes against children, I am committed to using every available tool to put an end to these horrific practices and I will hold companies — and their executives — accountable whenever they put profits ahead of children’s safety." 

The state AG's office ran an investigation in which it created decoy accounts posing as children 14 years old and younger and found that the platforms directed underage users to "a stream of egregious, sexually explicit images — even when the child has expressed no interest in this content," enabled dozens of adults "to find, contact, and press children into providing sexually explicit pictures of themselves or participate in pornographic videos," recommended that children join unmoderated Facebook groups "devoted to facilitating commercial sex" and allowed users "to find, share, and sell an enormous volume of child pornography." 

AI NOW BEING USED TO GENERATE CHILD PORNOGRAPHY, BLACKMAIL TEENAGERS: DIGITAL SAFETY EXPERT

One Facebook account created by investigators at the AG's office was in the name of Rosalind Cereceres, a fictional 40-year-old "bad mother" to a 13-year-old daughter, Issa Bee, and it incorporated signals that Cereceres was interested in trafficking her. Most of Issa’s followers were males between 18 and 40 years old, and their comments ranged from admiring to sexually suggestive and sometimes outright threatening, the complaint said. 

"On Facebook Messenger, Issa’s messages and chats are filled with pictures and videos of genitalia, including exposed penises, which she receives at least 3-4 times per week," the complaint stated. "As the messages come in, she has no means of screening or previewing the messages."

In addition, "Issa’s [Instagram] Reels delivered a graphic sexual image (excluded here), followed by an advertisement for a law firm representing ‘trafficking survivors,’ suggesting that Meta had linked the sexual content and human trafficking, again, for purposes of generating content, but not compliance," the complaint stated. 

The lawsuit also alleges that the facts described in the complaint "make clear" that "Mark Zuckerberg called the shots" on the decisions that mattered to children and parents. 

SOCIAL MEDIA COMPANIES UNPREPARED FOR HAMAS ‘HIJACKING’ THEIR PLATFORMS, TECH EXPERT SAYS

"Mr. Zuckerberg and other Meta executives are aware of the serious harm their products can pose to young users, and yet they have failed to make sufficient changes to their platforms that would prevent the sexual exploitation of children," Torrez added in his statement. "Despite repeated assurances to Congress and the public that they can be trusted to police themselves, it is clear that Meta’s executives continue to prioritize engagement and ad revenue over the safety of the most vulnerable members of our society."

The Wall Street Journal conducted its own investigation into Instagram's algorithm and found that the platform serves videos to adults it thinks might have a prurient interest in children. 

The paper reported that it set up the test accounts after noticing that the followers of accounts featuring children included "large numbers" of adult men. In addition, "many of the accounts who followed those children also had demonstrated interest in sex content related to both children and adults," according to the report. When the Journal followed the accounts of those adult men, the outlet said the Instagram algorithm "produced more-disturbing content interspersed with ads."

Samantha Stetson, Meta’s vice president of Client Council and Industry Trade Relations, previously told Fox News Digital that the company doesn't want this kind of content on its platforms and that brands don’t want their ads to appear next to it. 

TEXAS LAW REQUIRING SCHOOLS TO RATE BOOKS BASED ON SEXUAL CONTENT CHALLENGED IN FEDERAL APPEALS COURT

"We continue to invest aggressively to stop it - and report every quarter on the prevalence of such content, which remains very low," she said. "Our systems are effective at reducing harmful content, and we’ve invested billions in safety, security and brand suitability solutions."

"These results are based on a manufactured experience that does not represent what billions of people around the world see every single day when they use our products and services," she added. "We tested Reels for nearly a year before releasing it widely - with a robust set of safety controls and measures. In 2023, we actioned over 4 million Reels per month across Facebook and Instagram globally for violating our policies."

"Child exploitation is a horrific crime and online predators are determined criminals," a Meta Spokesperson said in a statement provided to Fox News Digital on Wednesday. "We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators. In one month alone, we disabled more than half a million accounts for violating our child safety policies."

AG Torrez did not respond to Fox News Digital's request for comment.
