
Snowden’s legacy: The open web could soon be encrypted by default


Well, here’s one way to stymie the NSA: in a couple of years, much if not most of the open web will be encrypted by default. Following recent discussions between the big browser makers, standards-setters and other industry folks, the Internet Engineering Task Force’s (IETF) HTTP Working Group announced on Wednesday that the upcoming second version of the HTTP protocol will only work with secure “https” web addresses.

Those “https” prefixes usually denote what’s known as Transport Layer Security (TLS), or sometimes its predecessor Secure Sockets Layer (SSL). Commonly used on banking or email services today — or really anything that needs protection of some kind — this kind of basic security technique was already on the rise before Edward Snowden revealed mass online surveillance. Now the movement has gained fresh urgency.
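For a sense of what switching on TLS actually involves on the server side, here’s a minimal sketch using Python’s standard library. The cert.pem and key.pem file names are illustrative assumptions; in a real deployment the certificate would come from a certificate authority.

```python
# Minimal sketch: turning a plain-HTTP server into an HTTPS (TLS) one with
# Python's standard library. cert.pem and key.pem are assumed, illustrative
# file names for a certificate/private-key pair.
import http.server
import ssl

server = http.server.HTTPServer(
    ("localhost", 4443), http.server.SimpleHTTPRequestHandler
)

# Wrapping the listening socket in TLS is the difference between an
# "http://" and an "https://" address from the server's point of view.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="cert.pem", keyfile="key.pem")
server.socket = context.wrap_socket(server.socket, server_side=True)

server.serve_forever()
```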

Engineering effort

A lot of this effort is taking place under the auspices of the IETF, which is developing and specifying HTTP 2 in coordination with the World Wide Web Consortium (W3C). The new version of the Hypertext Transfer Protocol is supposed to be faster and more resource-efficient than the current one, as well as safer. Currently, the plan is to submit HTTP 2 for formalization at the end of 2014, though of course these are standards-setting bodies we’re talking about, so that timeframe may well slip.

On Wednesday, Mark Nottingham (an Akamai emissary and chair of the working group) said there was “strong consensus to increase the use of encryption on the web, but there is less agreement about how to go about this.” The various options included only letting HTTP 2 work with “https” addresses, and a couple of halfway-house options based around a concept called “opportunistic encryption,” which would involve using “http” addresses but having the client and/or the server ignore that and institute encryption anyway. The former option won out, largely for the sake of simplicity and transparency.

As for precisely how this new feature of HTTP 2 will be implemented, well, that still needs to be worked out. As Nottingham put it:

“To be clear — we will still define how to use HTTP/2.0 with http URIs, because in some use cases, an implementer may make an informed choice to use the protocol without encryption. However, for the common case — browsing the open Web — you’ll need to use https URIs… if you want to use the newest version of HTTP.

“I believe this approach is as close to consensus as we’re going to get on this contentious subject right now. As HTTP/2 is deployed, we will evaluate adoption of the protocol and might revisit this decision if we identify ways to further improve security.”
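For the curious, here’s a rough sketch of what “HTTP/2 only over https” looks like from the client’s side: in practice the new protocol is negotiated inside the TLS handshake via ALPN (a mechanism not mentioned in the working group announcement, but the one browsers have converged on), so a plain “http” connection never gets the chance to upgrade. The hostname is purely illustrative.

```python
# Rough sketch: a client offering HTTP/2 ("h2") during the TLS handshake via
# ALPN, with HTTP/1.1 as the fallback. The hostname is illustrative.
import socket
import ssl

context = ssl.create_default_context()
context.set_alpn_protocols(["h2", "http/1.1"])  # prefer HTTP/2, fall back to 1.1

with socket.create_connection(("www.example.com", 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname="www.example.com") as tls_sock:
        # Prints "h2" if the server also speaks HTTP/2 over TLS, "http/1.1"
        # otherwise. A plain "http://" connection has no TLS handshake and
        # hence no ALPN step, which is one reason the https-only option is
        # the simpler one.
        print(tls_sock.selected_alpn_protocol())
```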

Professor Alan Woodward, a security expert from the University of Surrey in England, told me via email that the move was “definitely a step in the right direction” but not a panacea.

“False sense of security”

For one thing, Woodward noted, users of encrypted connections can still fall victim to so-called man-in-the-middle attacks, where the attacker effectively sits in between the user and the service, duping the user into thinking the attacker is the service they’re trying to reach. The NSA has certainly claimed to have capabilities against SSL/TLS encryption, although it’s likely that this only means the agency can break it in targeted situations – SSL/TLS is probably still effective as a way of blocking indiscriminate mass surveillance.

“There is a danger that if people think HTTP is secure by default they might develop a false sense of security,” Woodward warned, adding that webmasters and software vendors really should be implementing the existing HTTP Strict Transport Security (HSTS) tool anyway. “I fear if we try to insist on HTTP 2.0 it might be a little like the issue we’ve had with IPv6: momentum behind the older standards is considerable and effecting a change isn’t that successful,” he wrote. “Maybe better to see what we can do with HTTP as it is through initiatives such as HSTS.”
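HSTS itself is just a response header sent over an existing https connection, instructing the browser to insist on https for that host from then on. A minimal sketch follows; the handler name and max-age value are illustrative choices.

```python
# Minimal sketch of HSTS: an HTTPS response carrying a Strict-Transport-Security
# header, telling the browser to use https for this host for the next year.
# HSTSHandler and the max-age value are illustrative.
import http.server

class HSTSHandler(http.server.SimpleHTTPRequestHandler):
    def end_headers(self):
        # Browsers only honour this header when it arrives over TLS; it is
        # shown here simply to illustrate what webmasters would add.
        self.send_header(
            "Strict-Transport-Security", "max-age=31536000; includeSubDomains"
        )
        super().end_headers()

if __name__ == "__main__":
    http.server.HTTPServer(("localhost", 8443), HSTSHandler).serve_forever()
```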

However, Woodward also noted that HTTP 2 should be backwards-compatible with today’s HTTP 1.1, so work on other anti-surveillance mechanisms shouldn’t conflict too much with the IETF/W3C’s current work.

“What is interesting is that those thinking ahead are thinking about security by default which is really encouraging,” Woodward added. “The history of the internet is one of protocols and standards being developed without considering security at all… it has tended to be a bit of an afterthought, so seeing forethought is very welcome.”
