Why I like RC5 and RC6: Part 1

In a previous entry, I said that RC5 is the greatest block cipher ever. A colleague thought I would declare the Goat Cipher the greatest (that's the one I designed), but I have to admit, Ron Rivest is a better block cipher designer than I am. Who'd have thought that?

By the way, Rivest, the inventor of RC5, is the "R" in RSA and also designed RC4, a stream cipher that, because of SSL, has almost certainly encrypted more data than any other cipher in the history of the world.

In the early 1990s, Ron Rivest developed RC5. It employed data-dependent rotations in its main mixing operation, something that was, at the time, largely new and untried in block ciphers. He and RSA Data Security patented the algorithm.

Later on, Rivest, along with RSA Labs researchers Matt Robshaw, Lisa Yin, and Ray Sidney, devised RC6, a variant of RC5, to be a contender in the AES competition. That algorithm was also patented.

You might be asking, "If RC5 is so great, why wasn't it the AES submission, and why didn't RC6 win?" Well, I'll get to that in parts 2 and 3. But first …

DISCLAIMER
I used to work for RSA Data Security. In fact, I worked for RSA when RC5 was released and was still working there when RC6 was released. I had nothing to do with the development of the algorithms, but because I worked for the company, you might think I'm biased. Maybe I am a little, but I think I would like RC5 and RC6 even if I had never worked for the company. Besides, I was laid off by RSA, so maybe I'm biased against the company instead; maybe I'm bitter about being let go after 9 years there.
END DISCLAIMER

One thing I'd like to point out is that Ron Rivest, in designing RC5, had the stated goal of building an algorithm that would be secure, fast in software, and small in size. And to obtain speed, he didn't think in terms of O(n) (order of the number of operations; see the earlier blog post, "Algorithm Performance in Academic Papers and Practice"); he thought in terms of possible assembly-code implementations. In other words, right from the beginning, how the operations would really run on a machine was incorporated into the design. I wonder how many other algorithm designers think at that low a level (I don't know; maybe most of them do, but from my experience writing assembly-code versions of several crypto algorithms, I haven't seen much evidence that they do).

The security of a block cipher is a complicated thing, but probably the best way to describe it is this: a block cipher is secure if the fastest attack against it is a brute-force search for the key. So when you present an algorithm to the world, you must also supply research showing that your algorithm withstands all known attacks, or that any "weakness" is exploitable only at a cost higher than brute force. For example, you have to show that a known-plaintext attack would require so many plaintext/ciphertext pairs that generating and analyzing them would take longer than simply trying every possible key.
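
To make that comparison concrete, here is a toy illustration in C with made-up numbers: a hypothetical 64-bit key and a hypothetical attack needing about 2^70 known plaintext/ciphertext pairs. This does not describe any real attack or any real cipher; it just shows the bookkeeping.

```c
#include <stdio.h>

/* Toy illustration (hypothetical numbers, not a real attack) of the
   "slower than brute force" test.  Work factors are powers of two,
   so we only need to compare the exponents. */
int main(void)
{
    int key_bits = 64;          /* hypothetical key size: brute force ~ 2^64 key trials  */
    int attack_work_log2 = 70;  /* hypothetical attack: ~2^70 plaintext/ciphertext pairs */

    if (attack_work_log2 >= key_bits)
        printf("attack (2^%d) is no better than brute force (2^%d): cipher still meets the bar\n",
               attack_work_log2, key_bits);
    else
        printf("attack (2^%d) beats brute force (2^%d): the cipher is considered broken\n",
               attack_work_log2, key_bits);
    return 0;
}
```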

Clearly there might be future attacks that have not been devised or discovered yet. But even the crypto world does not demand that you be able to see into the future to defend your design. However, you must do the work of analyzing your algorithm from a cryptanalytic point of view (or pay someone to do it) before submitting it to the world. You don't create an algorithm and then demand that the rest of the world analyze it. Of course, even though you do supply analysis, if the algorithm seems to be catching on, many other crypto people will subject it to more scrutiny.

It turns out that there are plenty of algorithms that meet this security criterion. So how does one choose a cipher? Look at speed and code/memory size. Look also at speed and size on several platforms and in several languages. Is an algorithm fast on a SPARC chip but not so fast on an Intel? Is it fast if you have 64-bit registers, but very slow if you don't, or so slow on smart cards that it would not be usable? Does it require 1 megabyte of memory? Or 512K of code size? Does that make it prohibitive on a cell phone?

Another thing to look at is how easy it is to analyze. A simple algorithm that is easy to understand and analyze will be studied more, and the analysis will inspire more confidence. There are lots of people who have the skills to analyze a simple algorithm; not so many have the skills to analyze a complicated one. If both the skilled and the highly skilled analyze an algorithm and it passes muster, that inspires confidence. If only a few people have the ability to analyze the complicated algorithms (and how can we know they really have the skills to overcome the greater complexity?), we might not feel so assured.

An analogy would be analyzing two mutual funds: one is an index fund, and the other pours its money into derivatives, futures, hedge funds, REIT buyback paper, and other hard-to-understand investments. Maybe the complicated fund is better, maybe not, but not many people have the skills to examine the complex fund, so how can we know for sure?

Back to RC5. Rivest published RC5 in 1994, along with accompanying analysis. Because it came from Rivest (one of the biggest names in crypto) and RSA Data Security (the crypto company with by far the most customers at the time, including Microsoft and IBM), it received attention. It also started getting traction in the market as an alternative to DES. Hence, cryptanalysts outside the RSA "family" started analyzing it. A few minor issues arose, the worst being that the algorithm was susceptible to differential cryptanalysis at 12 or fewer rounds. That was no real problem, because the recommended round count was 20, and at that level it was still 40 to 80 times faster than DES (which has a fixed round count of 16).

Note that every block cipher, and I mean every block cipher, is susceptible to attacks at low round counts. That's not a problem. The question is determining where the threshold is and showing that more rounds solve the problem without degrading performance too much. In other words, if an algorithm employs 20 rounds, you can't say it is broken because it is not safe at 12 rounds. That's like saying a six-step ladder is no good because if you stand on the second step you can't reach the ceiling. To declare a 20-round cipher insecure, you either have to find the weakness at 20 rounds, or else break it at 12, then 14, then 16 rounds, and show a trend.

One of the beauties of RC5, however, was that each round was so fast that adding more rounds for extra security was not a very big penalty.

The other beauty was code size. The encryption part of the algorithm was 7 lines of code. Add in key setup and data formatting, and an RC5 implementation could come in at less than 500 bytes of compiled code. The key table itself at 20 rounds would be less than 200 bytes of memory. That means when you add RC5 to your application, you add about 500 bytes of object code and increase your memory requirements by about 200 bytes. That's not megabytes or kilobytes; that's bytes.
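
As a back-of-the-envelope check on that memory figure, here is a minimal sketch assuming the common RC5-32 parameters (32-bit words, 20 rounds). The key schedule expands the user key into 2(r + 1) words; the constant names below are my own, not from any particular implementation.

```c
#include <stdint.h>
#include <stdio.h>

/* Rough size check for the RC5 expanded key table, assuming RC5-32
   (32-bit words) with r = 20 rounds.  The key schedule produces
   t = 2(r + 1) words. */
enum { ROUNDS = 20, TABLE_WORDS = 2 * (ROUNDS + 1) };

static uint32_t S[TABLE_WORDS];   /* the expanded key table */

int main(void)
{
    /* 42 words * 4 bytes = 168 bytes -- comfortably under 200 bytes */
    printf("key table: %zu bytes\n", sizeof S);
    return 0;
}
```

At the original 12 rounds the table would be even smaller: 26 words, or 104 bytes.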

Furthermore, this small, simple cipher was easy to analyze. Because it did so little (each round consisted of just 6 simple steps: XOR, rotate, add, XOR, rotate, add) and had no complicated tables, it was easy to trace data through it and watch the transformation. Many people had no problem testing its security.
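
To give a feel for how little there is, here is a minimal sketch of the encryption routine in C, written in the spirit of the reference code in Rivest's RC5 paper. It assumes 32-bit words, 20 rounds, and an expanded key table S[] already produced by the key-setup step (not shown); the function and variable names are mine.

```c
#include <stdint.h>

#define ROUNDS 20   /* the recommended round count discussed above */

/* Rotate x left by the low 5 bits of y (safe even when the count is 0). */
static uint32_t rotl32(uint32_t x, uint32_t y)
{
    y &= 31;
    return (x << y) | (x >> ((32 - y) & 31));
}

/* Encrypt one 64-bit block (two 32-bit words) with RC5-32/20.
   S is the expanded key table of 2*(ROUNDS + 1) words from key setup. */
void rc5_encrypt_block(const uint32_t S[2 * (ROUNDS + 1)],
                       const uint32_t pt[2], uint32_t ct[2])
{
    uint32_t A = pt[0] + S[0];
    uint32_t B = pt[1] + S[1];

    for (int i = 1; i <= ROUNDS; i++) {
        A = rotl32(A ^ B, B) + S[2 * i];       /* XOR, rotate, add */
        B = rotl32(B ^ A, A) + S[2 * i + 1];   /* XOR, rotate, add */
    }

    ct[0] = A;
    ct[1] = B;
}
```

The entire data path is the "XOR, rotate, add" pattern twice per round, and the round count is just a loop bound, which is why cranking it up from 12 to 20 rounds costs so little.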

RC5 is secure, easy to analyze, fast, and small. However, nothing is perfect, so I'll discuss the drawbacks in Part 2. Also, no matter how good something is, you can't look at it in a vacuum; the next step is to compare it to other algorithms. That will be Part 3.
