Intent To Deprecate HTTP
Written by Ian Elliot   
Thursday, 16 April 2015

A suggestion on the Mozilla Dev forum aims to deprecate HTTP in favour of HTTPS. Has it really come to this? Browser devs dictating the protocols we use? Of course, it is all in the name of freedom.


The idea is a fairly simple one. HTTP is a plain text protocol. If you intercept a TCP/IP packet you can read its payload, and you can also intercept packets, change their contents and send them on their way.
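The point is easy to demonstrate: an HTTP request on the wire is nothing but readable text. A minimal sketch, with a made-up host, path and cookie:

```python
# A raw HTTP/1.1 request is just ASCII text. Anyone who can capture
# the TCP packets sees exactly these bytes - headers, cookies and all.
# (example.com and the session token are placeholders.)
request = (
    b"GET /login?user=alice HTTP/1.1\r\n"
    b"Host: example.com\r\n"
    b"Cookie: session=secret-token\r\n"
    b"\r\n"
)

# No decryption needed: the payload decodes straight to readable text.
print(request.decode("ascii"))
```

An eavesdropper on any hop between client and server recovers the cookie, the query string and everything else with no more effort than this `decode` call.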

However, there are two distinct aspects to the security issues of HTTP. The first is tracking and surveillance: because HTTP is readable, anyone can find out what you are doing on the web. The second is the possibility of an actual attack on a website, or on the user of a website, to intercept valuable information such as a password or a PIN.

You can argue that HTTP is fine for websites that simply provide information, news, articles and so on because there is no exchange of private information. However, the surveillance aspect pushes many to claim that even in this case HTTP is not good enough because it lets agencies discover what you have been doing - even if it is totally harmless and seems to give little away of importance. 

In short, it is possible to conclude that all web traffic, no matter how unimportant, should be encrypted using something like HTTPS. This would mean that only the client and server can read the data in the TCP/IP packets, because only they hold the session keys negotiated during the TLS handshake. So you can't read intercepted HTTPS traffic, and hence you can't use it for surveillance or for exploits, unless you can crack the encryption.
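On the client side this is what Python's standard ssl module sets up, as a sketch - the default context verifies the server's certificate against the system trust store and checks the hostname, just as a browser does. No connection is actually made here, and example.com is only a placeholder:

```python
import ssl

# A TLS client context as a browser would use it: the server's
# certificate is verified and the hostname is checked, and all
# application data is encrypted with negotiated session keys,
# so intercepted traffic is just ciphertext to an observer.
context = ssl.create_default_context()

assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname

# Wrapping a real TCP socket (not done here) would look like:
#   with socket.create_connection(("example.com", 443)) as sock:
#       with context.wrap_socket(sock, server_hostname="example.com") as tls:
#           tls.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n")
```

Everything after the handshake - the request line, headers, cookies - travels as ciphertext, which is exactly what denies the eavesdropper the easy read that plain HTTP allows.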

Hence the move to deprecate HTTP. On the Mozilla.dev.platform Google Group, Richard Barnes writes:

In order to encourage web developers to move from HTTP to HTTPS, I would like to propose establishing a deprecation plan for HTTP without security. Broadly speaking, this plan would entail  limiting new features to secure contexts, followed by gradually removing legacy features from insecure contexts.  Having an overall program for HTTP deprecation makes a clear statement to the web community that the time for plaintext is over -- it tells the world that the new web uses HTTPS, so if you want to use new things, you need to provide security.

The proposed plan is that eventually all of the web is HTTPS and HTTP isn't used at all. 

There are a few problems with this plan.

The first is that HTTPS currently relies on certificates to provide the encryption keys, and these provide not only encryption but also authentication. A certificate issued by an appropriate authority is supposed to prove that you are who the certificate says you are. This authentication doesn't come cheap, and the cost of a certificate is certainly something that would put an end to the more casual uses of the web, such as personal home pages.

One solution is to make use of a self-signed certificate, which provides encryption but not authentication. At the moment this isn't an easy option, but initiatives like Let's Encrypt, backed by the EFF and Mozilla among others, promise a service that will provide free certificates with automatic domain validation and a public database of issued certificates. This makes using "lightly validated" certificates a possibility, but at the moment browsers tend to put up warning messages when you encounter a website that has a self-signed certificate. This makes an HTTPS site using a self-signed certificate look more risky than an HTTP site with no encryption at all.
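The asymmetry can be seen in Python's ssl module: the default client context enforces exactly the checks that make a self-signed certificate fail, and accepting one means explicitly switching authentication off - the "encryption without authentication" trade the article describes. This is only a sketch of the two configurations; no connection is made:

```python
import ssl

# Default behaviour: a self-signed certificate fails verification,
# which is why a browser shows a warning page instead of the site.
strict = ssl.create_default_context()

# Encryption-only: trusting a self-signed certificate in practice
# means disabling hostname checks and certificate verification.
# The connection is still encrypted, but you no longer know who
# is on the other end.
lenient = ssl.create_default_context()
lenient.check_hostname = False
lenient.verify_mode = ssl.CERT_NONE
```

Note that disabling verification like this leaves the connection open to an active man-in-the-middle, which is the risk the browser warnings are trying to communicate.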

This crazy mix-up between encryption and authentication requirements puts authentication at the top of the list of concerns.

Clearly before HTTPS can become the norm we need to solve the certificate problem.

However, there are other problems.

Encryption is not without its cost in terms of efficiency. If the web goes 100% encrypted, all devices will have to spend a lot of CPU cycles on the associated number crunching. This isn't a good thing either for small, under-powered devices or for big, heavily loaded web servers, and the Mozilla devs seem to want to force a one-size-fits-all approach.

There are lots of other minor downsides to encrypting everything. One is that intermediaries cannot cache encrypted web pages, because each delivered page is readable only by the client that holds the session keys. Many ISPs make use of large-scale caching to reduce network traffic; if HTTPS were the norm, this sort of shared caching would no longer be possible.

It isn't just Mozilla who are keen on making HTTPS the transport protocol of the web. Google is pushing the same idea in various ways, including reducing the search ranking of websites that do not use HTTPS.

Is this a step towards preserving freedom, or does it impose a different kind of loss of freedom? You could say that imposing certificates on every user and server removes the last trace of anonymity from the web, making it easier for agencies to track where you visit, even if not what you do when you get there.

This is a very complicated situation. It is clear that there are situations where HTTPS is essential, and there are many others where it is largely irrelevant and even actively harmful.

And that, to us, is not a decision that should be left to browser developers.
