Meta explains why it is taking so long to bring end-to-end encryption to Facebook Messenger and Instagram -- and what it is doing in the meantime
Meta recently announced that the protection, privacy and security offered by end-to-end encryption will not be coming to Facebook Messenger or Instagram until sometime in 2023. Until then, anyone looking to send secure messages through one of Meta's platforms will have to turn to WhatsApp.
What was not made particularly clear at the time of the announcement, however, is just why there is such a delay. Now Meta has opened up and revealed some of the thinking behind holding back on the roll-out of end-to-end encryption across all of its messaging services.
In a blog post entitled "Our Approach to Safer Private Messaging", Meta's Global Head of Safety, Antigone Davis, explains that the company is "taking [its] time to thoughtfully build and implement end-to-end encryption (E2EE) by default across Messenger and Instagram DMs". The reason given is that while end-to-end encryption helps to improve privacy and keep sensitive data out of the wrong hands, it also creates the potential for abuse.
Seemingly ignoring the fact that WhatsApp already offers end-to-end encryption, Davis says:
While most people use messaging services to connect with loved ones, a small minority use them to do tremendous harm. We have a responsibility to protect our users and that means setting a clear, thorough approach to safety. We also need to help protect people from abuse without weakening the protections that come with encryption. People should have confidence in their privacy while feeling in control to avoid unwanted interactions and respond to abuse. Privacy and safety go hand-in-hand, and our goal is to provide people with the safest private messaging apps.
Davis does not make it clear why end-to-end encryption is such a problem to implement for Facebook Messenger and Instagram when it is apparently fine for WhatsApp in its current form. She does, however, go to some lengths to explain what the company is doing in the meantime.
Now it could be argued that this is little more than a distraction from a lack of activity in the right areas, but Davis says Meta wants to take steps to prevent abuse from taking place while simultaneously working on "the complex build of default E2EE". She writes:
Preventing abuse from happening in the first place is the best way to keep people safe. In an end-to-end encrypted environment, we will use artificial intelligence to proactively detect accounts engaged in malicious patterns of behavior instead of scanning your private messages. Our machine learning technology will look across non-encrypted parts of our platforms -- like account information and photos uploaded to public spaces -- to detect suspicious activity and abuse.
For example, if an adult repeatedly sets up new profiles and tries to connect with minors they don’t know or messages a large number of strangers, we can intervene to take action, such as preventing them from interacting with minors. We can also default minors into private or “friends only” accounts. We've started to do this on Instagram and Facebook.
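Purely to illustrate the kind of metadata-only check Davis describes -- flagging an adult account that repeatedly contacts minors it doesn't know, or that messages large numbers of strangers -- here is a minimal, hypothetical sketch. The field names and thresholds below are assumptions for illustration only and say nothing about how Meta's actual systems work:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a metadata-based heuristic of the sort Davis describes:
# it looks only at non-encrypted signals (who an account contacts, whether those
# accounts are existing connections, whether they are minors), never at message
# content. Thresholds and field names are illustrative assumptions, not Meta's.

STRANGER_MESSAGE_LIMIT = 50   # assumed threshold: contacts who are not connections
MINOR_CONTACT_LIMIT = 3       # assumed threshold: unsolicited contacts with minors

@dataclass
class AccountActivity:
    account_id: str
    is_minor: bool
    connections: set[str] = field(default_factory=set)
    contacted: list[str] = field(default_factory=list)  # accounts messaged or sent requests

def should_restrict(actor: AccountActivity,
                    directory: dict[str, AccountActivity]) -> bool:
    """Return True if the account's contact pattern looks like potential abuse."""
    if actor.is_minor:
        return False
    # Contacts the account has no existing connection to ("strangers").
    strangers = [a for a in actor.contacted if a not in actor.connections]
    # Of those strangers, how many are minors?
    minors_contacted = [a for a in strangers if directory[a].is_minor]
    return (len(strangers) > STRANGER_MESSAGE_LIMIT
            or len(minors_contacted) >= MINOR_CONTACT_LIMIT)
```

The point of a check like this is that it operates entirely on account information and contact patterns rather than on message content, which is what Davis says Meta's machine learning will do across the non-encrypted parts of its platforms.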
Davis outlines other measures, such as empowering users to block people or words they don't want to see. She also makes it clear that even when end-to-end encryption is implemented, it will not be as complete as some might have hoped, while simultaneously stressing some of the benefits of non-encrypted traffic:
We'll continue to enforce our Community Standards on Messenger and Instagram DMs with end-to-end encryption. Reporting decrypts portions of the conversation that were previously encrypted and unavailable to us so that we can take immediate action if violations are detected -- whether it’s scams, bullying, harassment or violent crimes. In child exploitation cases, we'll continue to report these accounts to NCMEC. Whether the violation is found on or through non-encrypted parts of our platform or through user reports, we're able to share data like account information, account activity and inbox content from user reported messages for compliance with our Terms of Service and Community Standards.
In short, it seems that -- as ever -- Meta will try to appease people on both sides of an argument, and will end up disappointing everyone.
Image credit: Sergei Elagin / Shutterstock