No shit! Experts say backdoors and encryption limits are security risks
Adding backdoors so governments can access data is a "major security risk". This is the (perhaps slightly obvious) conclusion of security experts and cryptographers writing in a report entitled Keys Under Doormats: Mandating insecurity by requiring government access to all data and communications.
The report from the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Lab criticizes plans to allow law enforcement agencies unfettered access to encrypted data through the use of either front doors or backdoors. More importantly, it poses the question: "if we want to maintain the security of user information, is this sort of access even technically possible?"
Now that we are all more security conscious than ever before, news that governments around the world -- including the US and UK -- want to limit the strength of encryption that can be used has escaped few people. These governments want to be able to access whatever data they want, whenever they want, and have tried to force hardware and software manufacturers to build backdoors into their products for that purpose.
MIT's report says that calls for law enforcement agencies to be granted "exceptional access" to data fail to take into account the "risks inherent in exceptional access systems". In short, if a backdoor can be used by a government agency, it can be used by someone else; security is massively reduced. We have already seen the consequences with the FREAK SSL/TLS flaw, which exploited deliberately weakened "export-grade" encryption -- a legacy of historic government restrictions on the use of strong cryptography. And just this week Hacking Team -- a security firm used by governments around the world to spy on web activity -- was itself hacked and hundreds of gigabytes of data were made publicly available, highlighting the very real security risks that exist.
The report argues that granting access to encrypted data would prove counterproductive. It "will open doors through which criminals and malicious nation states can attack the very individuals law enforcement seeks to defend". It warns:
...exceptional access would create concentrated targets that could attract bad actors. Security credentials that unlock the data would have to be retained by the platform provider, law enforcement agencies, or some other trusted third party. If law enforcement's keys guaranteed access to everything, an attacker who gained access to these keys would enjoy the same privilege. Moreover, law enforcement's stated need for rapid access to data would make it impractical to store keys offline or split keys among multiple keyholders, as security engineers would normally do with extremely high-value credentials.
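The key-splitting practice the report says security engineers would normally apply to high-value credentials can be illustrated with a toy XOR-based secret split. This is a hypothetical sketch for illustration only -- it is not a scheme the report describes, and real deployments would use a proper threshold scheme such as Shamir's secret sharing:

```python
import secrets

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split a key into n shares; ALL n shares are required to reconstruct.
    Each share on its own is indistinguishable from random bytes, so a
    single compromised keyholder learns nothing about the key."""
    # n - 1 shares are pure random pads...
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    # ...and the final share is the key XORed with all of those pads.
    last = key
    for share in shares:
        last = bytes(a ^ b for a, b in zip(last, share))
    shares.append(last)
    return shares

def combine_key(shares: list[bytes]) -> bytes:
    """XOR every share together to recover the original key."""
    out = bytes(len(shares[0]))
    for share in shares:
        out = bytes(a ^ b for a, b in zip(out, share))
    return out

# Example: a 256-bit key split among three keyholders.
key = secrets.token_bytes(32)
shares = split_key(key, 3)
assert combine_key(shares) == key
```

This toy split demonstrates the report's point: protecting a key this way deliberately makes access slow, because every keyholder must cooperate to reconstruct it -- exactly the property that law enforcement's stated need for rapid access would rule out.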
It also highlights the problem of the global nature of the internet, asking how countries would interact with each other:
China has already intimated that it may require exceptional access. If a British-based developer deploys a messaging application used by citizens of China, must it provide exceptional access to Chinese law enforcement? Which countries have sufficient respect for the rule of law to participate in an international exceptional access framework? How would such determinations be made? How would timely approvals be given for the millions of new products with communications capabilities? And how would this new surveillance ecosystem be funded and supervised?
The authors suggest that proposals surrounding security backdoors raise more questions than they answer. They say that the onus is on governments not only to prove the need for exceptional access to data, but also to devise the mechanism for access themselves rather than passing the buck to hardware and software manufacturers. As is often the case, this is an issue that boils down to balancing the need for national and international security with personal privacy. It is also important to bear in mind the very great costs that go hand in hand with backdoors, and to ask the question: is this a price worth paying?