
Facebook and the fight for #opchildsafety

When the news broke on Wednesday, December 6th, that Facebook (Meta) and its offshoot Instagram were finally being served with a lawsuit for steering children toward child sexual abuse material (CSAM) and toward predators, those of us who work in #OpChildSafety cheered. It is incredibly reassuring to know that someone is listening and responding aggressively to an epidemic that has spiraled out of control.




Meta was served the lawsuit the previous day, and two days later, it dared to announce its decision to introduce end-to-end encryption across its platforms. In essence, such a privacy move will give predators the liberty to solicit and distribute CSAM and other harmful content across those platforms before Meta has an effective way to combat this epidemic.


Think about this for a moment.


I digress. New Mexico Attorney General Raúl Torrez announced the lawsuit that day in Santa Fe, releasing a lengthy and detailed account that laid out the scope of child abuse and child sex trafficking on the social media platforms, including censored screenshots of child abuse and evidence of trafficking.


The most insane thing is that this isn't really a question of how efficient the algorithms are. I say this because those of us who work in #OpChildSafety on those platforms already know they are not working. Rather, Meta's response to the accusations raises more unanswered questions, ones that demand immediate attention and an end to cookie-cutter corporate responses.


Even when the content wasn't being searched for, the advertising algorithms were hard at work, exposing children to the content and to its illicit peddlers. Ultimately, this also allowed child predators to find and message kids. Thus, the lawsuit was a necessary step toward protecting children from grooming, human trafficking, and the online solicitation of CSAM.


According to Torrez, Meta's lack of proactivity, and by extension its failure to build protections insulating children from this content, comes down to the possibility that doing so could hurt its advertising revenue.


His office launched an undercover investigation by setting up fake accounts for fictional underage persons, which worked as a honeypot to attract predators. Investigators used AI-generated images of fictional teens and preteens and watched as Meta's algorithms began recommending sexual content to the accounts, along with a wave of explicit messages and solicitations from adults.


"Meta has allowed Facebook and Instagram to become a marketplace for predators in search of children upon whom to prey," the lawsuit alleges.


This whole episode reminds me of the 2010 case against Craigslist, which was exposed for facilitating child prostitution and the trafficking of women, and which prompted swift changes to the platform.



What is Meta hiding?


According to the filing, the day after the lawsuit was filed, New Mexico investigators on the case were notified that Meta had shut down the test accounts used to investigate the profiles promoting CSAM. This meant the test accounts could no longer collect or access data. Included in the notice was a warning that the offending accounts would be "permanently disabled."


But why?


What about the child predators and traffickers that use them? Disabling these accounts should be the final action once the perpetrators operating behind these profiles have been brought to justice.


Because of this, Torrez asked a judge for an order barring Meta from deleting any data connected to the test accounts, after Meta had claimed it would retain only information relevant to the claims.


Meta aggressively disputed the Attorney General's claims. "We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators," said spokesperson Nkechi Nneji in a statement.


Meta spokesperson Andy Stone said in a statement responding to the accusations in the lawsuit, “We will, of course, preserve data consistent with our legal obligations.”


In my opinion, taken together, all the statements released on Meta's behalf show a company caught red-handed: it knew this was an epidemic on its platforms but wasn't exactly fighting it.


But why?


The filing states, “While it is unclear whether “permanently disabl[ing]” an account is the functional equivalent to deleting the account, the State believes that is the case. … Indeed, in a California social media litigation, another technology company ‘locked’ the plaintiffs’ accounts following initiation of the action … The company confirmed in a recent court filing there that these ‘locked’ accounts were inadvertently deleted by the company’s automated processes.”


This is exactly what the State hopes to prevent the social media tech giant from doing.


What is more alarming is that on the same day these accounts were disabled, the State asked Meta to confirm it would preserve all the information collected from the test accounts and the other accounts detailed in the complaint.


Interestingly enough, Meta's lawyers seemed to sidestep the request with a carefully worded response, stating only that they would take "reasonable steps" to analyze the referenced accounts and retain "relevant data."


It gets worse.


Meta didn't reply to a request for follow-up, the filing alleges. In other words, Meta provided no clarification on what data it would or would not deem "relevant." The filing also states that Meta has refused to preserve "all data" associated with the referenced accounts. Most importantly, a court order is necessary to preserve this data as key evidence for trial.


Facebook vs. #opchildsafety


The lawsuit against Meta platforms and CEO Mark Zuckerberg is vital to our fight against CSAM and sex trafficking because, as hunters, many of us feel the platform is stonewalling every effort to inspect, analyze, and resolve our complaints.


#OpChildSafety workers have been reporting accounts of this nature ad nauseam for as long as we have been operating, and Facebook isn't exactly prioritizing the removal of illegal content. Say something merely offensive, on the other hand, and you're put in "Facebook Jail" almost immediately.






By Jesse McGraw


