|Written by Nikos Vaggalis|
|Monday, 08 May 2017|
Blaming everything on encryption is a recurring event. Whenever something bad happens that the intelligence services have no control over, it's because the encryption is at fault.
The latest outcry against it came from UK Home Secretary Amber Rudd, whose take on WhatsApp's end-to-end encryption was that Britain's intelligence services must have:
“the ability to get into situations like encrypted WhatsApp”
adding to the voices supporting the weakening of encryption or the planting of backdoors to popular consumer-level applications.
In stark contradiction to the Home Secretary's fruitless political talk stands the position of ENISA, the European Union Agency for Network and Information Security and the centre of network and information security expertise for the member states, the private sector and Europe's citizens. In its December 2016 paper on cryptography, written in the context of proposals to reduce encryption's strength in order to facilitate the interception and decryption of communications by the security services, ENISA strongly advises against any such weakening.
Its key findings mix legal and technical considerations.
It goes without saying that terrorists and criminals should not be able to hide, but should be apprehended and punished accordingly. But is weakening encryption really the correct way to go about it?
The European Commission is more or less aligned with the views of the UK Home Secretary, which in itself is striking given that the forthcoming EU General Data Protection Regulation assigns a major role to encryption in protecting all kinds of assets:
"the GDPR (General Data Protection Regulation) now expressly states that such measures include:
In short, with the introduction of the GDPR, encryption and other security measures are established as data protection standards responsible organizations are expected to utilize or face the consequences. So if, prior to the breach taking place, the data were rendered unintelligible, for example by means of encryption, businesses will not have to notify the data subjects of the breach."
So how does planting a backdoor fit into the General Data Protection Regulation's landscape?
Alec Muffett of the Open Rights Group told the Guardian that:
“Government time and money would be better spent elsewhere – pursuing criminals through ‘human’ means and by building upon metadata – than in attempting to combat ‘secure communication across the internet’ as an abstract entity.”
Anne Jellema, CEO of the World Wide Web Foundation finds that:
“Today, our lives are online. We rely on encryption to bank safely, do business privately, protect sensitive information from terrorists and other criminals, or even just to chat privately with our loved ones. Companies must do everything in their power to keep our personal information safe, so we can trust that our online communications are private and secure.
Attempts to break encryption or otherwise weaken online security are misguided and will have the net effect of leaving us all less safe. Forcing companies to build backdoors into their products is equivalent to leaving a key for law enforcement under a mat, and then naively imagining that criminals won’t use that key too."
In his statement, WWW inventor Tim Berners-Lee says:
“It’s not possible to build a system, which you can guarantee that only a definition of good guys can break, what you should do is you should build a system which will work in a world where there’s a government in power that you do not trust at all. Giving that sort of power to the government is inappropriate.
If encryption were not a thing then huge amounts of modern life would be impossible. If you put a hole in encryption – if you decide WhatsApp shouldn’t be secure – then you do that to everything else that is equivalent to WhatsApp you’d have a battle in which you would have a huge number of disasters."
The reasoning is simple: if you leave a backdoor open for the good guys, it's only a matter of time before the bad guys find it too.
But let's put things into perspective. Much of the talk about planting backdoors never gets into specifics: what form would such a backdoor actually take? If it's realized by hiding secret keys inside applications or devices, then it's game over from minute one, as a recent Cisco vulnerability plainly demonstrates:
"Security experts at Cisco discovered default SSH Key in many Cisco security appliances, an attacker could use them to establish SSH connection and control the devices. The abuse of the SSH key could represent a serious problem for enterprises and organizations that are exposed to cyber attacks.
The vulnerability is due to the presence of a default authorized SSH key that is shared across all the installations of WSAv, ESAv, and SMAv. An attacker could exploit this vulnerability by obtaining the SSH private key and using it to connect to any WSAv, ESAv, or SMAv. An exploit could allow the attacker to access the system with the privileges of the root user."
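The danger is easy to demonstrate. Here is a hypothetical, heavily simplified Python model (the class, key and names are invented for illustration, not Cisco's actual code) of a fleet of appliances that all ship with the same baked-in authentication key. Extracting the key from a single unit is enough to authenticate to every other one:

```python
import hmac
import hashlib
import secrets

# Hypothetical factory default baked into every unit at build time.
FACTORY_DEFAULT_KEY = b"same-key-in-every-appliance"

class Appliance:
    """Toy device that authenticates admins via HMAC challenge-response."""
    def __init__(self, name, key=FACTORY_DEFAULT_KEY):
        self.name, self.key = name, key

    def challenge(self):
        return secrets.token_bytes(16)

    def verify(self, challenge, response):
        expected = hmac.new(self.key, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

# The attacker extracts the key from ONE unit, e.g. a second-hand purchase...
stolen_key = Appliance("lab-unit").key

# ...and can now pass the challenge on EVERY deployment sharing that default.
fleet = [Appliance(f"prod-{i}") for i in range(3)]
for device in fleet:
    c = device.challenge()
    forged = hmac.new(stolen_key, c, hashlib.sha256).digest()
    assert device.verify(c, forged)  # access granted across the whole fleet
```

The fix is equally obvious in the model: give each unit a key generated at first boot, so that extracting one key compromises one device rather than every installation in the world.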
So planting a secret key and sharing it with the intelligence services brings to mind the Benjamin Franklin quote:
"Three can keep a secret, if two of them are dead."
Now increase the number of interested parties by the hundreds, since those keys won't be locked in a safe but shared with other institutions, private contractors and the like, each a link in the long governmental chain.
In a sense, even the World Wide Web as we know it hangs by a very thin thread, and that's due to the way the secure client/server exchange works. All it takes is getting hold of a server's master RSA key to be able to decrypt all of its traffic, current and PAST:
"The BENIGNCERTAIN exploit, included in The Shadow Brokers data dump, could extract the RSA keys from PIX firewalls and thus decrypt any traffic going, or that had gone, through them."
That's why the switch to the Perfect Forward Secrecy model, based on ephemeral Diffie-Hellman key exchange, is necessary: there is no saved long-term key to recover.
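The idea behind forward secrecy can be sketched in a few lines of Python. This toy Diffie-Hellman exchange (a small Mersenne prime is used purely for illustration; real deployments use vetted 2048-bit-plus groups and authenticated handshakes) shows why a recorded transcript is useless once the per-session secrets are thrown away:

```python
import secrets

# Toy finite-field Diffie-Hellman parameters. P is a Mersenne prime
# (2^61 - 1), far too small for real use but fine for a demonstration.
P = 2**61 - 1
G = 3

def ephemeral_session():
    """One session: both sides pick FRESH secrets, derive a shared key,
    then discard the secrets. Nothing long-lived remains to steal."""
    a = secrets.randbelow(P - 2) + 1   # client's ephemeral secret
    b = secrets.randbelow(P - 2) + 1   # server's ephemeral secret
    A = pow(G, a, P)                   # public values: this is all an
    B = pow(G, b, P)                   # eavesdropper ever sees on the wire
    shared_client = pow(B, a, P)
    shared_server = pow(A, b, P)
    assert shared_client == shared_server
    # a and b go out of scope here; a recorded transcript (A, B) alone
    # does not reveal the shared key, so past sessions stay sealed.
    return shared_client

# Each session derives an unrelated key: compromising a server tomorrow
# reveals nothing about traffic captured yesterday.
key1 = ephemeral_session()
key2 = ephemeral_session()
assert key1 != key2
```

Contrast this with static RSA key transport, where one master private key, once extracted, unlocks every session ever recorded against it.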
Yes, everything, since modern society is structured on the Web; its very fabric is woven into it. If it weren't for encryption, the Web wouldn't be the Web we know, with e-banking, e-commerce, e-health and secure communications between businesses and people. In other words, things now taken for granted wouldn't be possible.
Yet encryption is not at fault every time. The press is filled with stories of terrorists travelling back and forth to and from Syria, already in the FBI's database of suspects or under close monitoring, who somehow manage to slip away, crossing national borders after having successfully launched an attack inside the very premises of the European Union.
There's something foul going on here, and it's not encryption. Could it be the inability of the intelligence services to stop criminals in their tracks?
Surely the challenges any such service faces are tough; it is literally finding a needle in a haystack. But should that still be so after the millions of pounds or dollars spent each year? To promptly dismiss that thought and simply blame encryption is childish, to say the least.
Maybe further advances in technology are needed to narrow the gap, such as better data-mining algorithms that can sift for patterns in the FBI's database of thousands of suspects, or, better yet, offloading such tasks to Artificial Intelligence agents that can compensate for human inefficiency.
|Last Updated ( Monday, 08 May 2017 )|