The question of building backdoors into encryption
Over the last ten years, technology companies have been on a mission to defend encryption as governments across the world have pushed for backdoors to be built into it, citing the need to combat terrorism and child abuse.
These clashes have come to a head several times, including when the FBI demanded Apple assist in unlocking the encrypted work phone of one of the San Bernardino shooters in December 2015, and again after a shooting in Pensacola, Florida in December 2019. More recently in the UK, Operation Venetic resulted in the arrest of 746 organised criminals after the National Crime Agency infiltrated EncroChat, a secretive and highly encrypted global mobile phone network whose handsets used built-in codes and timers that wiped data automatically.
It is difficult to argue against helping law enforcement put criminals behind bars, but the trade-off from these “anti-encryption” measures is simply too devastating to justify.
Personal safety
Tech companies focused heavily on privacy and security in the 2010s, and many rolled out products with improved encryption. Messaging platforms WhatsApp and Signal both added end-to-end encryption in 2014, and in the same year Apple enabled encryption by default on iPhones with the release of iOS 8.
While encryption comes in many forms, it always serves the same goal: protecting data confidentiality. End-to-end encryption achieves that goal by setting up an encrypted channel in which only the client applications themselves hold the decryption keys. In the case of WhatsApp, this means that even though users’ messages might traverse or be stored on WhatsApp’s servers, the company has no keys with which to decrypt and read them. The messages stay unreadable to everyone but the sender and the receiver.
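As a rough illustration of that key arrangement, here is a minimal sketch using the PyNaCl library. The key names and the message are placeholders, and real messaging protocols layer key verification, forward secrecy and more on top of this basic exchange.

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each client generates its own keypair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"Meet at noon?")

# Anything in between, including the provider's servers, only ever
# handles `ciphertext` and cannot decrypt it without a private key.

# Bob decrypts using his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"Meet at noon?"
```

The point of the sketch is the trust boundary: the server in the middle relays opaque bytes, which is precisely what a backdoor would have to break.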
In the case of encryption-at-rest, such as on the iPhone, the user’s password or PIN is used to derive the encryption key. When the phone boots up, the user has to enter that password or PIN to unlock the phone’s data. Any new data the phone receives or creates, like images or chat messages, is encrypted under that key. If the phone powers off or is put in a lockdown mode, the derived keys are flushed from the phone’s memory and the user must enter their password again to unlock it.
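Below is a minimal sketch of that derive-then-encrypt pattern using Python’s cryptography package. The hard-coded PIN, the salt handling and the sample data are illustrative assumptions; a real phone additionally entangles the passcode with a secret fused into the hardware, so the key cannot be derived off-device.

```python
# pip install cryptography
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def derive_key(pin: bytes, salt: bytes) -> bytes:
    """Stretch a short PIN into a 256-bit symmetric key."""
    kdf = PBKDF2HMAC(
        algorithm=hashes.SHA256(),
        length=32,
        salt=salt,
        iterations=600_000,  # slow derivation makes PIN guessing expensive
    )
    return base64.urlsafe_b64encode(kdf.derive(pin))

salt = os.urandom(16)            # stored alongside the ciphertext
key = derive_key(b"1234", salt)  # entered at boot or unlock

# New data is encrypted under the derived key as it is written.
token = Fernet(key).encrypt(b"photo bytes or chat message")

# On lock or power-off, `key` is discarded from memory; the data stays
# encrypted at rest until the PIN is entered and the key re-derived.
assert Fernet(derive_key(b"1234", salt)).decrypt(token) == b"photo bytes or chat message"
```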
Ramifications of forced backdoors
The FBI and other law enforcement agencies around the world are asking Apple and other manufacturers to create a “golden key”, so to speak, capable of decrypting all messages on all devices. Australia even managed to pass legislation in 2018, the Assistance and Access Act, that allows the government to compel companies to build backdoors into their encryption. While it is technically possible to accomplish that goal, the security and privacy ramifications would be massive.
There’s simply no such thing as a “good guys only” backdoor. Eventually, a cyber-criminal will get their hands on the golden key or exploit the intentional chink in the armor to break their way in. The NSA losing its stockpile of Windows zero-day exploits in 2016 should be proof enough that we shouldn’t be so quick to trust government agencies to act responsibly with security. Organisations rely on encryption to protect their intellectual property. Journalists rely on it to protect themselves and their sources. You can imagine the level of resources a hostile nation state would pour into finding such a backdoor if it existed.
What if we take a step back and examine the encryption debate using a physical safe as an analogy? People use safes to store important documents and items that they want to keep out of the hands of criminals. At the same time, people can use them to store evidence of crimes. Should safe manufacturers be required to intentionally add a weak point to every safe or create a master key? Or should law enforcement be required to go through legal channels to compel owners to give up their keys?
The former is exactly what governments are asking Apple, WhatsApp and others to do. Law enforcement already has the power to obtain massive amounts of data through the court system. In the case of the Pensacola shooter, for example, Apple handed over iCloud backups, account information and transactional data for multiple accounts. The FBI eventually gained access to the phone in question without Apple’s help, calling into question whether a backdoor is needed at all.
Pushback leads to covertness
Pushback against anti-encryption regulations has become strong enough that many governments are becoming far more covert in their attempts. For example, the EARN IT Act introduced in the US Senate doesn’t explicitly outlaw encryption; instead, it sets up a government commission that can define a checklist of “best practices” organisations must follow to keep their protection from civil and criminal liability for their users’ content under the Communications Decency Act. That list of best practices could easily come to include weakened-encryption requirements.
Even if most governments managed to pass anti-encryption laws, criminals would simply abandon compliant apps for ones that ignore the rules. Giving up the security and privacy of the masses is far too high a price to pay for something that is very unlikely to prevent crime and incredibly likely to be abused.