Odd intuitions

Via Bruce Schneier, a good description of a common fallacy:

Imagine you've invented a machine to detect terrorists. It's good, about 90% accurate.

…you receive urgent information … that a potential attacker is in the building. Security teams seal every exit and all 3,000 people inside are rounded up to be tested.

The first 30 pass. Then, dramatically, a man … fails. Police pounce, guns point.

How sure are you that this person is a terrorist?

A. 90%
B. 10%
C. 0.3%

16 Comments

  1. PLW  •  Jul 27, 2009 @9:40 am

    Depends on your prior… are we assuming one of the 3,000 is a terrorist to begin with, or might there be a cell?

  2. Ken  •  Jul 27, 2009 @9:50 am

    That was my thought as well.

  3. Robert  •  Jul 27, 2009 @10:10 am

    C is the best answer (*but not exactly right! You've already tested 30 people. I'm a mathematician)

    There's a 10% chance of a false positive for each person. There are 3000 people, so you'll have about 300 false positives on average. Let's assume there's one terrorist. Then only about 1 in 300 positives will be the terrorist.

    This same "paradox" is why mandatory HIV testing is a rotten idea, given the false positive rate. You'll have more false positives than real positives.

  4. Ken  •  Jul 27, 2009 @10:23 am

    Here's a question — are we also assuming that the device only gives false positives, not false negatives? It seems that you're assuming that the terrorist WILL be among the 300 positives. What if the terrorist is among the 2700 negatives?

  5. Robert  •  Jul 27, 2009 @10:28 am

    Yes, Ken. But this is a multiple choice test and the goal is to pick the "best" answer.

  6. David Schwartz  •  Jul 27, 2009 @11:47 am

    I think answer B is correct: it's ten percent. You don't know that your information is accurate. There could be zero terrorists in the building. Everyone in the building could be a terrorist; heck, it could be a terrorist headquarters and the first few people were all false negatives.

    Where does it say that there's exactly one terrorist in the building? Why is that a reasonable assumption?

  7. David Schwartz  •  Jul 27, 2009 @11:49 am

    Ahh, I see the problem. Critical information about the problem was eliminated in the excerpt. The original problem description gives you good reason to assume that there's likely to be one terrorist in the building.

    (The way I've usually heard this explained is with the example of a person who, based on their risk factors, has about a one in 10,000 chance of having AIDS. A 99% accurate AIDS test comes back positive. What are their odds of actually having AIDS?)

  8. PLW  •  Jul 27, 2009 @12:07 pm

    For 10% to be right, though, you'd need over 36 terrorists, which seems too high to me.

  9. David  •  Jul 27, 2009 @12:25 pm

    "Critical information about the problem was eliminated in the excerpt"

    Hence the ellipses and of course the link; the latter was the focus of the post and the only reason to provide the teaser….

    Rum thing, this interweb.

  10. David Schwartz  •  Jul 27, 2009 @2:00 pm

    PLW: And that's the key insight to avoiding the base rate fallacy. Given that the test returned a particular result, you need both the test's accuracy and the base rate to assess the probability that the result is correct.

    For a 90% accurate test, here's how it breaks down:
    If there is one terrorist in the building: 0.29%
    If there are ten: 2.9%
    If there are thirty: 8.3%
    You need 37 terrorists to make 10%.
    If there are 100 terrorists, it's 23%.
    If there are 300 terrorists, the odds that this guy is a terrorist are 50%.

    This assumes that the test is 90% likely to identify a terrorist as a terrorist and 90% likely to correctly clear an innocent person, i.e. a 10% chance of misidentifying an innocent person as a terrorist.

  11. Patrick  •  Jul 27, 2009 @2:20 pm

    More interesting than the question posed is how the machine's reliability was tested and established at 90% in the first place.

    But that's a question for engineers. Everyone's a lay mathematician. No one's a lay engineer.

  12. Charles  •  Jul 27, 2009 @2:47 pm

    "More interesting than the question posed is how the machine's reliability was tested and established at 90% in the first place."

    After testing a sample group, everyone was waterboarded until it could be accurately determined whether they were honest about their knowledge of terrorist networks.

  13. Ken  •  Jul 27, 2009 @2:58 pm

    That's why you should use a Detainee Detector instead. By definition it is 100% effective.

  14. Mark  •  Jul 27, 2009 @5:37 pm

    I invented a fantastic terrorist-detector, but it's hard to get terrorists to volunteer for product testing.

    My Craigslist ads go unanswered. :(

  15. Ken  •  Jul 28, 2009 @2:34 pm

    I'm thinking this puzzle might be of interest to Bill O'Reilly.

  16. David Schwartz  •  Jul 29, 2009 @12:37 am

    That quote from Bill O'Reilly is hilarious. Everyone should listen to it.
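
For anyone who wants to check the arithmetic in comments #3 and #10, here is a minimal Python sketch. It is an illustration only, not anyone's actual code, and it adopts the same reading of "90% accurate" that comment #10 spells out: the test flags a terrorist 90% of the time and flags an innocent person 10% of the time, with expected counts treated as exact.

    # Posterior probability that a person who tests positive is a terrorist,
    # for a test run on everyone in a building of 3,000 people.
    # Assumes P(positive | terrorist) = 0.9 and P(positive | innocent) = 0.1.
    def posterior(terrorists, population=3000, hit_rate=0.9, false_alarm=0.1):
        innocents = population - terrorists
        true_positives = hit_rate * terrorists      # expected terrorists flagged
        false_positives = false_alarm * innocents   # expected innocents flagged
        return true_positives / (true_positives + false_positives)

    for t in (1, 10, 30, 37, 100, 300):
        print(t, round(100 * posterior(t), 1))
    # Prints roughly 0.3, 2.9, 8.3, 10.1, 23.7, and 50.0 percent, in line with
    # the figures in comment #10 (37 is the smallest count that clears 10%).
    # Comment #7's example works the same way: posterior(1, 10000, 0.99, 0.01)
    # comes out near 0.0098, so a positive result on a 99% accurate test with a
    # one-in-10,000 prior still leaves only about a 1% chance of infection.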