Avoiding a technology monoculture

At the recent National Cyber Leap Year Summit, many ideas were discussed about how to make our current computing environment more secure. One of them was how to deal with the monoculture that success in the marketplace tends to create. Some people thought that a technical monoculture was a very bad idea that led to all sorts of problems. Others saw the benefits of a standard technology outweighing the problems it might cause. Still others didn't see a feasible way to get people to diversify the technology they use.

Having a single technology that's used everywhere has the potential to cause all sorts of problems, because if the technology fails in one place it fails everywhere. That's why applications that need to be extremely reliable, like those that run nuclear power plants or aircraft, use redundant systems that are also different from each other.
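A toy probability calculation shows why the redundant systems need to be dissimilar, not just duplicated. The failure probability and the independence assumption below are illustrative, not from any real reliability data:

```python
# Toy model: identical redundant systems share the same failure modes,
# so one shared flaw takes out every copy at once. Dissimilar systems
# are assumed here to fail independently (an idealization).

def joint_failure(p: float, n: int, identical: bool) -> float:
    """Probability that all n redundant systems fail together."""
    if identical:
        return p        # a single shared flaw defeats every copy
    return p ** n       # independent failures must all coincide

p = 0.01  # per-system failure probability (made up for illustration)
print(joint_failure(p, 2, identical=True))   # 0.01
print(joint_failure(p, 2, identical=False))  # 0.0001
```

Under these assumptions, two dissimilar systems are a hundred times less likely to fail simultaneously than two identical ones, which is the same logic that makes a monoculture risky at internet scale.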

If a single operating system runs on 100 percent of computers and hackers find a way to exploit it, they can exploit 100 percent of the computers. On the other hand, having only a single platform to support can bring significant savings in support and maintenance. According to people from large businesses at the NCLY Summit, a large business can save hundreds of millions of dollars by standardizing on a minimal set of supported applications, and it’s not clear that the increased ease with which hackers can exploit such a business causes anything close to hundreds of millions of dollars in losses. So there’s definitely a cost to standardizing technologies, but it’s not clear that it outweighs the benefits.
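The "100 percent" point can be made concrete with a small sketch: the worst-case blast radius of a single exploit is just the market share of the most popular platform. The shares below are invented for illustration, not real market data:

```python
# Toy model of monoculture exposure: an exploit against one platform
# reaches every machine running that platform, so the worst case is
# an exploit against the most widely deployed one.

def worst_case_exposure(shares: list[float]) -> float:
    """Fraction of machines reachable by a single exploit against
    the most popular platform. Shares should sum to 1.0."""
    return max(shares)

print(worst_case_exposure([1.0]))             # monoculture: 1.0
print(worst_case_exposure([0.5, 0.3, 0.2]))   # diversified: 0.5
```

Diversification doesn't eliminate the exposure, it only caps it at the largest share, which is part of why the cost-benefit question in the paragraph above is genuinely hard.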

On a more practical note, it’s not clear how you could get businesses to not standardize their technologies. Would you require each business to use a diverse set of technologies? Or would you let each business standardize on a set of technologies but force diversification of technologies from company to company? If you somehow required businesses to diversify the technologies that they use, how would you compensate them for the additional costs that they might experience?

And exactly how would you define diversity? Is version 2.0 of a product the same as or different from version 3.0? A quick look at the National Vulnerability Database suggests that the vulnerabilities a product has change a lot over its lifetime, so maybe it’s reasonable to say that version 4.0 is different enough from version 2.0 to be classified as a different product for this particular purpose. This, of course, could lead to the bizarre effect of encouraging businesses to patch and upgrade only part of their systems, so that they’ll have enough diversity to meet whatever monoculture-banning rules we come up with.

So maybe that’s a good challenge: find a reasonable way to get businesses to diversify their technology purchases without causing huge side effects. Can it even be done?
