Plans to implement end-to-end encryption on Facebook and Instagram have been delayed amid concerns about child safety.
Meta, as Facebook’s parent company is now called, said message encryption across its apps would now arrive in 2023.
The process means that only the sender and recipient can read a message; neither law enforcement nor Meta can access it.
However, child protection groups and politicians have warned it could hinder police investigating child abuse.
The National Society for the Prevention of Cruelty to Children (NSPCC) said private messaging “is the front line of child sexual abuse.”
UK Interior Minister Priti Patel also criticized the technology, saying earlier this year that it could “severely hamper” law enforcement from prosecuting criminal activities, including online child abuse.
Privacy versus protection
End-to-end encryption works by “scrambling” or encrypting data as it travels between phones and other devices.
Usually, the only way for anyone else to read a message is to gain physical access to an unlocked device that sent or received it.
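The idea of “scrambling” can be illustrated with a toy example. The sketch below is a deliberately simplified one-time-pad demonstration, not the protocol Meta or WhatsApp actually uses (they rely on the far more sophisticated Signal protocol): a key shared only by sender and recipient scrambles the message, so anyone intercepting it in transit sees only unreadable bytes.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Scramble (or unscramble) data by XOR-ing it with a key of equal length."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"

# A random key shared only between sender and recipient.
key = secrets.token_bytes(len(message))

scrambled = xor_bytes(message, key)    # what a server or eavesdropper would see
recovered = xor_bytes(scrambled, key)  # only a key holder can reverse the scrambling

assert recovered == message
```

Because XOR is its own inverse, applying the same key twice restores the original message; without the key, the scrambled bytes carry no recoverable information.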
The technology is the default for the popular WhatsApp messaging service, also owned by Meta, but not for the company’s other apps.
The NSPCC has sent freedom of information requests to 46 police forces in England, Wales and Scotland asking them for a breakdown of the platforms used to commit sexual offenses against children last year.
The responses revealed:
- More than 9,470 cases of images of child sexual abuse and online pedophilia crimes have been reported to the police
- 52% of these occurred on Facebook-owned apps
- Over a third of cases occurred on Instagram, 13% on Facebook and Messenger, and very few via WhatsApp
This has led to fears that Meta’s plans to expand encryption to widely used Facebook Messenger and Instagram direct messages may protect most abusers from detection.
The NSPCC said that encrypting messages by default could make it easier to share images of child abuse or to solicit children online.
But proponents say encryption protects user privacy and prevents intrusion from both governments and unscrupulous hackers. Meta CEO Mark Zuckerberg personally voiced these arguments when he announced Facebook’s encryption plans in 2019.
‘Deal with it right’
Antigone Davis, Meta’s head of global safety, said the delay in implementing encryption until 2023 was because the company was taking its time “to get it right”.
The company previously said the change would take place no earlier than 2022.
Ms. Davis said, “As a company that connects billions of people around the world and has created industry-leading technology, we are determined to protect people’s private communications and keep people safe online.”
She also outlined a number of additional preventative measures the company had already put in place, including:
- “proactive detection technology” that scans for suspicious activity patterns such as a user who repeatedly sets up new profiles or sends messages to a large number of people they don’t know
- putting users under 18 into private or “friends only” accounts by default, and preventing adults from messaging them if they are not already connected
- educating young people with in-app tips on how to avoid unwanted interactions
Andy Burrows, head of online child safety policy at the NSPCC, welcomed Meta’s delay.
He said: “They should move forward with these measures only when they can demonstrate that they have the technology in place that will ensure that children are not at greater risk of abuse.
“More than 18 months after a global coalition of 130 child protection organizations, led by the NSPCC, sounded the alarm about the dangers of end-to-end encryption, Facebook must now show that it is serious about the child safety risks and not just playing for time while it weathers difficult headlines.”
This article is sourced from BBC News.