Why unary is impractical
I've been asked lots of questions about unary representations of numbers recently. Usually, this is stuff that absolutely nobody cares about, but because of the connection to Gentry's homomorphic encryption scheme, it's received lots of interest lately. Here's why the need to represent numbers in unary makes a cryptographic scheme totally impractical.
Unary is a way to represent numbers without a base at all. Suppose that we have a number that's 165 in base 10. If we write it in base 16 it's 0xa5. In binary it's 10100101. In unary, however, we use a number of 1s equal to the number itself. So for 165, we use 165 1s: 1111…111 (you'll have to imagine a string of 165 1s here – the software that runs this blog apparently has trouble displaying a string that long).
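You can check those three representations of 165 in a couple of lines of Python (a quick sketch; the `unary` string is just the "165 1s" described above):

```python
n = 165
print(hex(n))        # base 16: 0xa5
print(bin(n))        # base 2:  0b10100101
unary = "1" * n      # unary: one '1' per unit
print(len(unary))    # 165 digits, versus 8 binary digits
```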
That clearly doesn't scale well. In a number system using a base that's two or greater, it takes roughly log n digits to represent the number n. And because n increases much more rapidly than log n does, unary isn't really very practical, particularly when you're dealing with numbers like those used in cryptography.
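To see that gap concretely, here's a small sketch comparing the digit counts (using Python's `int.bit_length`, which is the number of base-2 digits):

```python
# Base-2 digit count grows like log2(n); unary digit count grows like n itself.
for n in (165, 10**6, 10**9):
    binary_digits = n.bit_length()  # roughly log2(n) + 1
    unary_digits = n                # one '1' per unit
    print(n, binary_digits, unary_digits)
```

By a billion, you're writing 30 binary digits but a billion unary ones.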
Consider a DES key. Ignoring the parity bits, this key has only 56 bits, which is trivial to store on a modern computer. If we use unary to represent a 56-bit number, however, we need to write 2^56 1s in a row. If we pack these unary digits eight to a byte, this takes 2^53 bytes to store, or about 9 petabytes. That's a lot of storage.
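The arithmetic behind those figures is easy to verify (a sketch, using the decimal definition of a petabyte, 10^15 bytes):

```python
ones = 2**56              # unary digits needed for a 56-bit number
bytes_needed = ones // 8  # eight unary digits per byte: 2**53 bytes
petabytes = bytes_needed / 10**15
print(bytes_needed == 2**53, round(petabytes))  # True, about 9 PB
```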
If you try to use unary to represent a number like those often used in public-key cryptography, it's even worse. If we have a 1,024-bit key and try to represent it in unary, it takes 2^1,021 bytes, or over 10^307 bytes. I'm not sure how accurate estimates like these are, but I've read that physicists estimate that there are roughly 10^78 protons in the entire universe. So if you're thinking of implementing a public-key scheme that uses unary, it's going to be totally impractical.
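The same back-of-the-envelope check works for the 1,024-bit case (a sketch using base-10 logarithms to get the decimal exponent):

```python
import math

byte_exponent = 1024 - 3  # 2**1024 unary digits / 8 per byte = 2**1021 bytes
decimal_exponent = byte_exponent * math.log10(2)
print(decimal_exponent)   # a bit over 307, i.e. more than 10**307 bytes
```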