Real World Cryptography

Originally by Luther Martin, Chief Security Architect, Voltage Security.

In January of 2014, the third Real World Cryptography Workshop was held at the City College of New York. The purpose of this workshop was described like this:

The Real World Cryptography Workshop aims to bring together cryptography researchers with developers implementing cryptography in real-world systems. The main goal of the workshop is to strengthen the dialogue between these two groups. Topics covered will focus on uses of cryptography in real-world environments such as the Internet, the cloud, and embedded devices.

Researchers and developers of cryptography have very different goals: researchers need to create new things while developers need to create things that are so useful that people will pay for them. These two goals lead to very different points of view. And while the Real World Cryptography Workshops are definitely getting these two groups to talk to each other, it isn’t clear that they’re actually making either of these groups more useful to the other. But this probably isn’t the fault of the organizers of the workshop or the participants. Instead, it’s just an inherent problem with the industry.

Introducing innovations in the field of cryptography is extremely hard because of entrenched standards. If you invent a new encryption algorithm that's perfectly secure, is much faster than any previous algorithm, and reduces the cost of key management by a factor of a hundred, the first question you'll be asked is whether it's FIPS 140-2 validated. It's very hard to get a new encryption algorithm approved for use under FIPS 140-2, and even the rare successful attempts take at least several years: enough time for NASA to land a man on the moon and return him safely to Earth. (Maybe cryptography really is harder than rocket science.)

But from the point of view of complying with some regulations, using encryption that isn't FIPS 140-2 validated can be treated as equivalent to using no encryption at all. So even if you've encrypted your data in a way that's perfectly secure, there are cases where your auditors will tell you that you committed a data breach because you used non-validated encryption to protect sensitive information and then sent the encrypted information across the internet.

This point of view is probably familiar to information security specialists in the commercial world. But it doesn't seem to be well understood by researchers, and that gap led to a rather heated discussion I recently had with a researcher about an algorithm he had invented.

This particular algorithm was very fast and secure, but it was only faster than existing alternatives for a particular range of key sizes. In this case, the new approach was best for keys roughly equivalent to a 900-bit RSA key. Today's standards require at least the equivalent of a 2,048-bit RSA key, so this newer and faster approach can't be used when compliance with those standards is needed. And if you were to use this new algorithm with one of the smaller keys, then from the point of view of auditors validating compliance with some standards, it would be just as if you weren't using any encryption at all.
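To put rough numbers on that gap, here's a back-of-the-envelope sketch of my own (not the researcher's analysis, and not an official mapping) that estimates the symmetric-equivalent strength of an RSA modulus using the usual general number field sieve cost heuristic. The figures it prints are only approximations; the numbers auditors actually use come from published tables such as NIST SP 800-57, which put 2,048-bit RSA at roughly the 112-bit security level, while a ~900-bit modulus lands somewhere around 80 bits.

```python
# Rough sketch: estimate the symmetric-equivalent security of an RSA modulus
# using the general number field sieve (GNFS) cost heuristic
#   L(N) = exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3)).
# This ignores lower-order terms, so treat the output as a ballpark figure,
# not the official equivalences published in standards like NIST SP 800-57.

import math


def gnfs_security_bits(modulus_bits: int) -> float:
    """Approximate bits of security offered by an RSA modulus of the given size."""
    ln_n = modulus_bits * math.log(2)          # natural log of a modulus of this size
    work = math.exp((64 / 9) ** (1 / 3)
                    * ln_n ** (1 / 3)
                    * math.log(ln_n) ** (2 / 3))
    return math.log2(work)                     # express the attack cost in bits


for bits in (900, 1024, 2048, 3072):
    print(f"{bits:>5}-bit RSA  ~ {gnfs_security_bits(bits):5.1f} bits of security")
```

The exact outputs don't matter much; what matters is that a 900-bit modulus sits well below the roughly 112-bit level that current standards demand, which is exactly why an auditor won't accept it no matter how fast the algorithm is.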

That doesn't mean such encryption isn't reasonably secure, of course. There are lots of applications where the equivalent of a 900-bit RSA key is perfectly good enough, because it will stop all realistic attackers from getting access to the sensitive information it's protecting. But from the point of view of some types of regulatory compliance, this type of encryption can be considered essentially useless.

The researcher didn't understand this at all. It just didn't make sense to him, because the target for his new algorithm was constrained devices, where using longer keys might not be practical, and he thought that weaker encryption was surely better than no encryption at all.

Other issues that commercial information security specialists deal with are often met with similar astonishment from their research-oriented counterparts, and these sorts of issues seem to plague the interactions between theoretical and practical cryptographers. One group invents lots of new technologies, and the other group explains why those technologies can't be used in the real world.

Unfortunately, these issues weren't discussed at the recent Real World Cryptography Workshop. But it seems to me that working around them is something future workshops really ought to address.
