The two-envelope problem in risk management

Does it make sense to never change your information security strategy? That's a possible consequence of the so-called two-envelope paradox, a problem in probability theory that has confused students for over 50 years. It goes like this.

Suppose that you're given two envelopes and you're told that one envelope contains twice as much money as the other. You then open one of the envelopes and see how much money it contains. Based on this information, you decide to either keep the contents of the first envelope or to switch its contents for the contents of the second, unopened envelope.

It might seem that it always pays to switch.

Suppose you find $2 in the first envelope. You know that the other envelope either contains $1, which happens with probability 0.5, or it contains $4, which also happens with probability 0.5. So you can calculate the expected value of the second envelope as $1 × 0.5 + $4 × 0.5 = $2.50. Because this is greater than $2, it always pays to switch.
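One way to see that this conclusion is suspect is to simulate the game with an explicit prior over the amounts. The Python sketch below is my own illustration, not part of the original puzzle: the prior on the smaller amount and the number of trials are arbitrary assumptions. With any such fixed prior, keeping and switching pay out the same on average.

```python
import random

# A sketch only: the prior over the smaller amount and the trial count
# are arbitrary assumptions, not part of the original puzzle.
def simulate(trials=100_000):
    keep_total = 0.0
    switch_total = 0.0
    for _ in range(trials):
        small = random.choice([1, 2, 4, 8])  # assumed prior on the smaller amount
        envelopes = [small, 2 * small]
        random.shuffle(envelopes)
        opened, other = envelopes
        keep_total += opened                 # strategy 1: always keep the opened envelope
        switch_total += other                # strategy 2: always switch to the other one
    return keep_total / trials, switch_total / trials

keep_avg, switch_avg = simulate()
print(f"always keep:   {keep_avg:.2f}")
print(f"always switch: {switch_avg:.2f}")
# Both averages come out near 1.5 times the mean of the prior (about 5.6 here),
# so switching buys you nothing once the prior is pinned down.
```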

There's a problem with this argument, of course, but it's fairly subtle. Even specialists in probability theory don't agree what the problem actually is, although they all agree that there's a problem with the argument.

Now let's suppose that we can't find a flaw in the above argument and we apply it to our information security strategy. Suppose our current set of technology, policies and procedures leaves us with some exposure to risk that we'll denote R, and that changing to a different set of technology, policies and procedures might either increase the risk to 2R or decrease it to R/2. If we apply the same reasoning that we applied above, we find that it never pays to change, because the alternative always has a greater expected risk than what we have now (the short calculation below makes this concrete). This clearly doesn't make sense, but it's the sort of conclusion you can reach from a risk analysis that isn't as careful as it could be.
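For concreteness, here's the arithmetic behind that "never change" conclusion, again assuming (as the envelope argument does) that the two outcomes are equally likely; setting R to 1.0 is purely for illustration.

```python
# The naive expected-risk arithmetic, assuming doubling and halving are equally likely.
R = 1.0                                              # current risk exposure, arbitrary units
expected_risk_after_change = 0.5 * (2 * R) + 0.5 * (R / 2)
print(expected_risk_after_change)                    # 1.25, i.e. 1.25R > R, so "never change"
```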

The bottom line is that probability is a complicated and subtle concept, and that risk management, which relies on it, is too.

  • Russell Thomas

    Luther,
    I do agree with your bottom line: risk management for InfoSec is “complicated and subtle” but I disagree with the implication that it should be avoided.
    Your example of the Two Envelopes Problem is interesting, but the paradox can be resolved by reframing it into “possible worlds”.
    Furthermore, you made some errors in your formulation of the corresponding InfoSec problem, which make it very different from Two Envelopes. Unintentionally, your post is a demonstration of the complications and subtlety of risk analysis.
    I’ve posted a more complete analysis here: http://newschoolsecurity.com/2009/09/is-risk-management-too-complicated-and-subtle-for-infosec/
    Russ
    P.S. I’m neither a risk analysis zealot nor a Pollyanna. I appreciate the critiques of John Adams and others. I just don’t believe that these cautions and critiques mean that risk analysis/management is a dead end.

