Mass surveillance prompts work on SSL deployment guidelines by Internet standards group
A newly created working group within the Internet Engineering Task Force (IETF) has set out to develop best practices for deploying SSL encryption for Internet communications.
The IETF describes itself as a large open international community of network designers, operators, vendors, and researchers concerned with the evolution of the Internet architecture and the smooth operation of the Internet.
The group’s creation follows revelations in recent months about mass Internet surveillance programs run by the U.S. National Security Agency, the U.K.’s Government Communications Headquarters (GCHQ), and other intelligence agencies.
IETF joined several other Internet infrastructure groups in October in expressing strong concern over what they called “the undermining of the trust and confidence of Internet users globally due to recent revelations of pervasive monitoring and surveillance.”
The IETF’s new “Using TLS in Applications” (UTA) group became active Dec. 11 when its charter was approved. It will focus on issuing guidance on using TLS (Transport Layer Security), the successor of SSL (Secure Sockets Layer), with several application protocols: SMTP (Simple Mail Transfer Protocol) used for email transmission across the Internet; POP (Post Office Protocol) and IMAP (Internet Message Access Protocol) used by email clients to retrieve emails from servers; XMPP (Extensible Messaging and Presence Protocol) used for instant messaging; and HTTP (Hypertext Transfer Protocol) version 1.1, the foundation of data communication on the World Wide Web.
This working group has its roots in the IETF “perpass” mailing list, which was created explicitly to coordinate ideas and discussions on pervasive monitoring and surveillance, said Leif Johansson, a member of the board of directors at Internet Exchange Point (IXP) operator Netnod and co-chair of the new IETF UTA group, via email.
Mass Internet surveillance was the topic that received the most attention at the 88th IETF Meeting in early November, according to IETF chair Jari Arkko. During that meeting’s technical plenary, cryptography and security expert Bruce Schneier, who had access to the cache of secret documents leaked by former NSA contractor Edward Snowden, said that the goal of the technical community should be to make eavesdropping expensive and force the NSA to abandon wholesale collection of data in favor of targeted collection.
“Ubiquitous encryption on the Internet backbone will do an enormous amount of good—provide some real security and cover traffic for those who need to use encryption,” he said. “The more you can encrypt data as it flows on the Internet, the better we’ll do.”
Later in November, the IETF working group responsible for developing the next version of the HTTP protocol—HTTP 2.0—said it’s considering making encryption a standard requirement for the protocol.
While this change would be a major improvement for the security of the Web, HTTP 2.0 is at least a year away from becoming a standard and it will probably take a long time for it to become widely adopted. In the meantime, the newly established IETF UTA working group aims to encourage the adoption of SSL/TLS encryption to secure existing Internet data transmissions.
The main problem right now is that most protocols that support TLS are either deployed without it or deployed with weak ciphers enabled, Johansson said. The new working group’s goal is to provide clear and simple operational guidelines that can inform real-world deployment of TLS in real-world protocols, he said.
According to its charter, the group has the following tasks:
- Update the definitions for using TLS over a set of representative application protocols. This includes communication with proxies, between servers, and between peers, where appropriate, in addition to client/server communication.
- Specify a set of best practices for TLS clients and servers, including but not limited to recommended versions of TLS, using forward secrecy, and one or more ciphersuites and extensions that are mandatory to implement.
- Consider, and possibly define, a standard way for an application client and server to use unauthenticated encryption through TLS when server and/or client authentication cannot be achieved.
- Create a document that helps application protocol developers use TLS in future application definitions.
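To make the charter’s client-side items concrete, the sketch below builds a hardened TLS client context with Python’s standard `ssl` module: certificate and hostname validation on, and older protocol versions refused. The specific settings are this article’s illustration, not guidance issued by the working group.

```python
import ssl

# Illustrative sketch (not WG guidance): a TLS client context that
# enforces several of the practices the charter mentions.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)  # enables cert + hostname checks
ctx.minimum_version = ssl.TLSVersion.TLSv1_2   # refuse SSL 3.0, TLS 1.0/1.1
ctx.load_default_certs()                       # trust the system CA store

# PROTOCOL_TLS_CLIENT turns on certificate validation by default.
print(ctx.verify_mode == ssl.CERT_REQUIRED, ctx.check_hostname)
```

The same context can then be passed to, for example, `http.client.HTTPSConnection(host, context=ctx)` so every connection inherits these settings.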
“The working group (WG) will make the fewest changes needed to achieve good interoperable security for the applications using TLS,” the group’s charter says. “No changes to TLS itself will be made in this WG, and the WG will ensure that changes to current versions of popular TLS libraries will not be required to conform to the WG’s specifications.”
Not easy to do
The main problem with deploying SSL/TLS is that there are many things to get wrong, from using configurations with insecure ciphers and insufficiently strong private keys to using older versions of TLS libraries that don’t have all security patches.
“SSL/TLS is a deceptively simple technology,” SSL experts from security firm Qualys said in a document describing SSL/TLS deployment best practices. “SSL is easy to deploy, but it turns out that it is not easy to deploy correctly. To ensure that SSL provides the necessary security, users must put extra effort into properly configuring their servers.”
In recent years, researchers demonstrated attacks against TLS configurations that use the RC4 stream cipher or block ciphers operating in cipher-block-chaining (CBC) mode, leaving ciphers that operate in Galois/Counter Mode (GCM) as the secure alternatives. However, GCM ciphers are only available in TLS 1.2, which is not widely deployed at the moment.
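In OpenSSL-based stacks, the attack-prone suites can be excluded with a cipher string. A minimal sketch using Python’s `ssl` module; the cipher string shown is an example, not a vetted recommendation:

```python
import ssl

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
# Example cipher string: prefer AES-GCM suites, and explicitly drop
# RC4 and the unauthenticated/null suites.
ctx.set_ciphers("AESGCM:!RC4:!aNULL:!eNULL")

# Inspect what the string actually enables on this OpenSSL build.
names = [c["name"] for c in ctx.get_ciphers()]
print(names)
```

Because the string is interpreted by the local OpenSSL library, the resulting list varies by build, which is exactly why auditing it with `get_ciphers()` is worthwhile.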
According to statistics from the SSL Pulse project, only around 22 percent of the world’s 161,000 most popular HTTPS (HTTP Secure) websites had support for TLS 1.2 as of Dec. 2. On the client-side, only recent versions of the most popular browsers support this version of the protocol.
“I believe there will be a lot of effort among large-scale deployers of HTTPS to move to TLS 1.2,” Johansson said.
SSL private keys
Following reports about NSA’s efforts to defeat encryption, security experts believe that breaking 1024-bit SSL private keys is within the agency’s ability given its financial resources and access to powerful computers.
Providers of popular web services such as Google, Facebook, Microsoft, and Twitter already use SSL certificates with 2048-bit keys. In addition, the Baseline Requirements for the Issuance and Management of Publicly Trusted Certificates, a set of guidelines published by the Certification Authority/Browser (CAB) Forum, mandate that all newly issued SSL certificates with a validity period ending after Dec. 31 use 2048-bit RSA keys.
However, cracking private keys using brute-force methods is not the only way to subvert encryption. An intelligence agency like the NSA could simply ask or coerce service providers to hand over their keys or they could break into servers and steal them. This would allow the decryption of all previously captured traffic.
To counter that, security experts recommend configuring SSL deployments to use key exchange algorithms that support a feature called perfect forward secrecy. These algorithms generate a separate, temporary key for each session, so obtaining a server’s long-term private key does not allow the decryption of previously captured traffic.
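In OpenSSL terms, forward secrecy comes down to allowing only ephemeral Diffie-Hellman key exchanges (ECDHE/DHE). A minimal server-side sketch with Python’s `ssl` module, again with an illustrative rather than authoritative cipher string:

```python
import ssl

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
# Allow only ephemeral (EC)DH key exchange, so each session negotiates
# its own short-lived keys; exclude unauthenticated/null suites.
ctx.set_ciphers("ECDHE:DHE:!aNULL:!eNULL")

names = [c["name"] for c in ctx.get_ciphers()]
print(names)
```

A real server would additionally load its certificate and key with `ctx.load_cert_chain(...)` before wrapping sockets; that step is omitted here.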
Such security considerations are just some of the factors that should guide a strategy for deploying TLS. There are also differences between using TLS with HTTP and using TLS with other application protocols, which can make things even more confusing for application developers, server administrators and other TLS implementers.
“The IETF at its best can bring together the best and the brightest, and as a chair I hope that efforts like the Qualys SSL Labs, the XMPP Manifesto and others will join together to inform UTA,” Johansson said.