A former Facebook employee has told US authorities that the company’s efforts to remove child abuse material from the platform were “inadequate” and “under-resourced”.
The allegations are contained in documents viewed by BBC News and filed with the US Securities and Exchange Commission (SEC) two weeks ago.
The anonymous whistleblower states that the moderators “are not sufficiently trained and are ill-prepared”.
Facebook said in a statement: “We do not tolerate this abominable abuse of children and use sophisticated technologies to combat it.
“We funded and helped build the tools used by the industry to investigate this terrible crime, save the children and bring justice to the victims.”
Facebook added that it has shared its anti-abuse technologies with other companies.
The revelations come after former insider Frances Haugen told the US Congress earlier this month that Facebook platforms “harm children, fuel division and harm our democracy.”
This week Haugen also gave evidence to the UK parliamentary committee examining the Online Safety Bill.
Senior executives from Facebook, Twitter, Google, YouTube and TikTok are also expected to give evidence.
The latest revelations come from an unnamed whistleblower with insider knowledge of the teams within Facebook set up to detect and remove harmful material.
In an affidavit to the SEC, which regulates securities markets and protects investors, the individual said the problem of illegal material on Facebook would persist because there were not “adequate resources dedicated to the problem”.
They allege that a small team created to develop software capable of detecting indecent videos of children was broken up and redeployed because the work was considered “too complex”.
Facebook says it uses technology known as PhotoDNA and VideoDNA, which automatically scans for known child abuse images. Every image recovered by law enforcement agencies around the world and reported to the US National Center for Missing and Exploited Children receives a unique identification code.
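In broad terms, the matching described above works like a fingerprint lookup: each known image receives an identification code, and uploads are checked against a database of those codes. PhotoDNA itself is a proprietary perceptual hash that tolerates resizing and re-encoding; the sketch below is only a rough illustration of the lookup idea, using an ordinary cryptographic hash (which, unlike PhotoDNA, matches byte-identical files only). All function names here are hypothetical, not Facebook's actual API.

```python
import hashlib

# Hypothetical database of fingerprints for known images, standing in
# for the identification codes described in the article.
KNOWN_HASHES: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    # PhotoDNA uses a robust perceptual hash; plain SHA-256 is used
    # here purely for illustration and only matches identical bytes.
    return hashlib.sha256(image_bytes).hexdigest()

def register_known_image(image_bytes: bytes) -> None:
    # An image reported to the clearing house is added to the database.
    KNOWN_HASHES.add(fingerprint(image_bytes))

def is_known(image_bytes: bytes) -> bool:
    # An upload is flagged if its fingerprint matches a known code.
    return fingerprint(image_bytes) in KNOWN_HASHES
```

The design point is that platforms never need to compare raw images: only compact fingerprints are stored and exchanged, which is why a shared industry database of codes is practical.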
Other allegations by the whistleblower include:
- Facebook does not know the full scale of the child pornography problem because it “does not track it”
- A constant question supposedly asked by senior managers was “what is the return on investment?”
The whistleblower told the SEC that this was a legitimate business question, “but not when it comes to critical public safety issues such as child sexual abuse.”
In the five-page legal document there was also a warning about Facebook “Groups”, which were described as “facilitating harm”.
Groups, many of which are visible only to members, are places where “a lot of terrifying and repulsive behavior occurs.”
Pedophiles “use code words to describe the type of child and the type of sexual activity … they use Facebook’s encrypted Messenger service or WhatsApp to share these codes, which change regularly.
“Facebook’s system depends on a model of self-policing that cannot be rationally or reasonably relied upon.”
Facebook told the BBC that it scans private groups for content that violates its policies and has 40,000 people working on safety and security, with an investment of more than $13bn (£9.4bn) since 2016.
It said it had taken action on 25.7 million pieces of child sexual exploitation content in the second quarter of 2021.
Sir Peter Wanless, chief executive of the NSPCC, said: “These revelations raise deep and disturbing questions about Facebook’s commitment to fighting illegal child abuse on its services.
“For the first time, evidence from within Facebook suggests that they have abdicated their responsibility to comprehensively address child sexual abuse material.”
The former employee concluded their statement by writing: “Unless there is … the credible threat of legislative and/or legal action, Facebook will not change.”
This article is sourced from BBC News.