Portable Code: Wait for the Bite

In my two previous posts (here and here), I said I don't like assertions or system calls when writing portable code. Looking back, I realize I may not have made something clear. So let me say it here.

Assertions and system calls are tools, sometimes powerful tools. They often make your life easier and are sometimes absolutely necessary to get the job done. However, they also cause problems in porting, so if you can avoid them, I think it's a great idea to avoid them. It may turn out that something you use does NOT come back to bite you. But you don't know in advance what is going to bite you and what is not, so I think a programmer who has to write portable code should simply avoid these things when possible. It might make your job a little harder now, but it can save you a ton of misery later on.
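When a system call really is unavoidable, the usual defensive move is to quarantine it behind a wrapper so the platform-specific part lives in exactly one place. Here's a minimal sketch (my own illustration, not something from the earlier posts), assuming Windows and POSIX targets; the name portable_sleep_ms is hypothetical:

```c
/* Quarantine one system call behind a wrapper, so only this file has to
   change when a new platform shows up. */
#ifdef _WIN32
#include <windows.h>
#else
#include <time.h>
#endif

void portable_sleep_ms(unsigned long ms)
{
#ifdef _WIN32
    Sleep(ms);                              /* Windows: takes milliseconds */
#else
    struct timespec ts;                     /* POSIX: seconds + nanoseconds */
    ts.tv_sec  = ms / 1000;
    ts.tv_nsec = (long)(ms % 1000) * 1000000L;
    nanosleep(&ts, NULL);
#endif
}
```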

Actually, there's one thing in that last paragraph that isn't completely correct. I said, "… you don't know in advance what is going to bite you and what is not…" Sometimes you can know. Of course, sometimes the only way of knowing is that something on a previous project bit you, so on future projects you don't do it again.

And I think that's why a lot of programmers don't believe me: they've never been bitten. Let me give you a couple of stories.

Two companies ago, I learned that void * is not always a generic pointer. I also learned that not all compilers treat void * the same in all situations. Then at my next company, I was happy to discover that one of the coding standards said "Don't use void *." But one day I was working on a new project (let's call it project lion cub), and another programmer was using void *. I pointed out that our coding standards said not to, and explained why one shouldn't use void * when writing portable code. The other programmer said I was being too paranoid and that void * was a great tool.
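The story doesn't hinge on the exact usage, but here's a minimal sketch of one classic way void * goes wrong: pointer arithmetic on void * is a compiler extension (GCC accepts it; standard C does not), so code that compiles cleanly on one toolchain can fail to compile, or behave differently, on another.

```c
#include <stdio.h>

int main(void)
{
    int values[4] = {10, 20, 30, 40};
    void *p = values;        /* fine: any object pointer converts to void * */

    /* Non-portable: arithmetic on void * is a GCC extension, not standard C.
       Another compiler may reject this line outright. */
    /* p = p + sizeof(int); */

    /* Portable: convert to a concrete pointer type before doing arithmetic. */
    int *q = (int *)p;
    q += 1;
    printf("%d\n", *q);      /* prints 20 */
    return 0;
}
```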

When we tried to port lion cub to Solaris, it wouldn't go. It turned out that the way Solaris and Windows treated void * (in the way it was being used in the project) was different enough to cause an incompatibility. The other programmer spent a lot of time (nights and weekends) trying to reconcile the two, but eventually decided to just get rid of void * and use something else.

Then about two years later, this same programmer was working on another project. I noticed he was using void * in the same way and suggested that was a bad idea. I reminded him of what had happened on project lion cub. He remembered and decided not to use void *.

Here's another story. At my previous company, I was working more or less as a consultant to another company that was writing crypto code. I noticed that the programmer was using enums. I pointed out that enums were not always portable. I told him why and described a situation in the past where someone who used enums found they didn't port and had a very difficult time getting rid of them while retaining backwards compatibility. This other programmer dismissed my concerns immediately.
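He wasn't wrong that enums are convenient; the trouble is that the underlying integer type of an enum is implementation-defined, so its size (and the layout of any struct containing it) can change from compiler to compiler. In crypto code, where structs tend to map onto wire and file formats, that's painful. Here's a minimal sketch; the names are hypothetical, not from that project:

```c
#include <stdio.h>

/* The underlying type of an enum is implementation-defined. One compiler
   may give this enum 1 byte, another 4 (compare GCC's -fshort-enums to
   its default), so sizeof and struct padding can change. */
enum alg { ALG_RSA = 1, ALG_DSA = 2 };

struct header {
    enum alg alg;            /* size and alignment vary by compiler */
    unsigned int length;
};

int main(void)
{
    /* If struct header is written to a file or a socket on one platform
       and read back on another, the layouts may not line up. */
    printf("sizeof(enum alg)      = %zu\n", sizeof(enum alg));
    printf("sizeof(struct header) = %zu\n", sizeof(struct header));
    return 0;
}
```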

Then he tried to port the code to another operating system. The new operating system treated enums just a little differently from his base OS, and the difference was just enough that his fastest course of action was to simply get rid of them.

In both those stories, someone had an opportunity to save themselves some headaches, but they didn't believe the advice. Once they were bitten, they did believe.

I think this is one thing that happens when I talk about portability. People just don't believe me until they see it with their own eyes. They have to be bitten to believe.

To be fair, there are some other reasons. First and foremost, I'm almost certainly TOO paranoid when it comes to porting. I believe in a very safe route, sometimes safer than it needs to be. Second, many of the porting issues I encountered 10 or 15 years ago aren't issues anymore. And third, it is possible to use void * and enums in a portable way.

That third point is an important one. I say, "Don't use void * and enums." Someone else might say, "Go ahead and use void * and enums, but do the work necessary to make sure they are portable."

My philosophy is, "Write to a lowest common denominator; don't even tempt fate by using things that have been problematic in the past." Another philosophy might be, "Use all the tools at your disposal, just make sure you know how to use them in a portable fashion."

Either philosophy requires more work up front but has big payoffs later on. I still prefer mine; after all, you can't really know what all the porting issues are until you try to port, so how can you know how to "use them in a portable fashion"? Or look at it this way: since you know you're going to do more work up front either way, why not choose the philosophy that's safer? Same work effort, safer porting.

What are you going to do when you run across a company that uses 16-bit chips or an old OS (maybe they've figured out a way to make a profit on outdated but extremely cheap technology), or a company that builds embedded devices, or something else you can't predict? Maybe you've used the tools in a portable way, but your portable way covers Windows, Linux, Solaris, mainframes, and most or all of the other common targets. We just don't know what someone is going to come up with.
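To make the 16-bit case concrete, here's a minimal sketch (my own example) of the kind of assumption that hides quietly on desktop platforms and surfaces the day someone ports to a small chip:

```c
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    int x = 30000;

    /* On a 16-bit chip, int may be only 16 bits, so x + x overflows:
       undefined behavior. On a 32-bit desktop it quietly works, which is
       exactly how this kind of bug hides until someone tries to port. */
    /* int sum = x + x; */

    /* The fixed-width types in <stdint.h> behave the same wherever they
       are available. */
    int32_t sum = (int32_t)x + x;
    printf("%ld\n", (long)sum);          /* prints 60000 everywhere */
    return 0;
}
```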
