Facebook Messenger and Instagram delay default end-to-end encryption until 2023

Which does Meta value more: privacy or safety? The answer is not very clear-cut and seems to get more complicated every single day.

Meta’s head of safety, Antigone Davis, wrote a piece in The Telegraph about Meta’s plans to roll out end-to-end encryption (E2EE). In it, Davis said Meta does not plan to make E2EE the default on Messenger and Instagram until sometime in 2023.

Since they are part of the same parent company, Facebook Messenger and Instagram chats were merged into a cross-platform system last year. Both platforms already offer E2EE, but it is not enabled by default: in Messenger it comes in the form of ‘Secret Conversations’, while on Instagram it is simply called an ‘end-to-end encrypted chat’, and at the moment the feature appears to be limited to selected users in selected regions. WhatsApp, which is also owned by Meta, does have E2EE turned on by default.

Meta originally planned to roll out default E2EE on Instagram and Messenger in 2022, but they pushed it back one more year because they wanted to “get it right”.

Davis says the change is taking so long because E2EE makes it much harder to catch criminals: with an encrypted chat, only you and the recipient can read your messages, which makes surveillance very difficult. Once default E2EE rolls out, Meta plans to rely on non-encrypted data and user reports to help stop criminal activity instead. This is exactly what WhatsApp does, and it landed the app in controversy when it emerged that WhatsApp still provided metadata to law enforcement.
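To see why surveillance gets so much harder, here is a deliberately simplified sketch (a toy one-time pad, not Meta's actual protocol) of the core E2EE property: the relaying server only ever sees ciphertext, while the two endpoints, who share the key, recover the message.

```python
import secrets

# Toy illustration only -- real E2EE (e.g. the Signal protocol used by
# WhatsApp) uses authenticated key exchange and AEAD ciphers, not XOR.
def xor_bytes(data: bytes, key: bytes) -> bytes:
    """One-time-pad style XOR; same function encrypts and decrypts."""
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # shared only by the two endpoints

ciphertext = xor_bytes(message, key)     # all the relay server can observe
plaintext = xor_bytes(ciphertext, key)   # what the recipient recovers

assert plaintext == message
assert ciphertext != message  # the server learns nothing about the content
```

The point of the sketch is that without the key, the platform sitting in the middle has nothing readable to hand over, which is why Meta says it would fall back on metadata and reports.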

Meta is in a bit of a pickle. It has to choose between user privacy and user safety, and it seems to be delaying that decision. The UK’s upcoming Online Safety Bill isn’t making it any easier, with Home Secretary Priti Patel criticizing the use of E2EE, saying “Sadly, at a time when we need to be taking more action… Facebook is still pursuing E2EE plans that place the good work and the progress that has already been made at jeopardy.”

Apple went through a similar dilemma with its child safety plan, which compares hashes of your iCloud photos against known child sexual abuse material (CSAM). Even with privacy in mind, Apple is still able to manually review your photos if they “cross a threshold of known CSAM content”.
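The matching-plus-threshold idea can be sketched as follows. This is a hypothetical illustration: Apple's system uses a perceptual hash (NeuralHash) so near-duplicate images still match, and it has not published its real threshold; SHA-256 and the threshold value here are stand-ins to show the principle.

```python
import hashlib

# Assumed example data: a set of hashes of known-bad images, as a platform
# or clearinghouse might maintain. SHA-256 is a stand-in for a perceptual hash.
KNOWN_CSAM_HASHES = {hashlib.sha256(b"example-known-bad-bytes").hexdigest()}
THRESHOLD = 3  # assumed value; the real threshold is not public

def count_matches(images: list[bytes]) -> int:
    """Count how many of a user's images hash-match the known-bad list."""
    return sum(
        hashlib.sha256(img).hexdigest() in KNOWN_CSAM_HASHES
        for img in images
    )

def crosses_threshold(images: list[bytes]) -> bool:
    """Only accounts at or above the threshold get escalated for manual review."""
    return count_matches(images) >= THRESHOLD
```

The threshold is the privacy concession in this design: a single stray match never triggers human review, only repeated hits against the known list do.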

