Invisible Problems

If the FDA is too lenient with its testing or approves a drug too early, dozens or maybe even hundreds of people could die. It would be a national scandal, and if you heard about it you’d likely be at least a little upset. If the FDA is too conservative with its testing or approves a drug too late, many times that number – tens of thousands – could die for lack of the drug, but you’d never even notice.

If we take as a given that the FDA tries to be neither too lenient nor too conservative, but also that no one is perfect or has perfect information – sometimes you just have to make your best educated guess – which direction do you think the FDA is more likely to lean over time?

This is the concept behind the famous “broken window fallacy.” The fallacy is that breaking a window is actually a good thing for the economy, because look! The store owner has to replace that window, so he spends money! And then the glass-maker gets a new order, so he has more money to spend on tools! And the tool-maker gets a new order, so he has more money to spend on shoes! And so on. The problem is that while it’s easy to see the money spent on the window, it’s very hard to “see” what that same money would have been spent on had the window never broken. That money still would have been spent, and it still would have created a similar chain of economic activity, just in a different direction. But we never see that happen, so it’s hard to be emotionally affected by it.

In statistics, there are two named kinds of error: Type I Errors and Type II Errors. A Type I Error is a “false positive” – you believe something to be true and act accordingly, but it isn’t. A Type II Error is the reverse, a “false negative” – you believe something to be false and act accordingly, but it is true.
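If it helps to see it in code, here’s a minimal sketch in Python – all the numbers (100 flips, a cutoff of 60 heads, a true bias of 0.55) are made up purely for illustration – of the two error types, and of how the two rates trade off against each other:

```python
import random

# A toy illustration with made-up numbers: test whether a coin is biased.
# Type I error  (false positive): the coin is fair, but we declare it biased.
# Type II error (false negative): the coin is biased, but we declare it fair.

def looks_biased(p_heads, flips=100, threshold=60):
    """Flip `flips` times; call the coin biased if the heads count is extreme."""
    heads = sum(random.random() < p_heads for _ in range(flips))
    return heads >= threshold or heads <= flips - threshold

trials = 10_000
type_1 = sum(looks_biased(0.50) for _ in range(trials)) / trials      # fair coin flagged
type_2 = sum(not looks_biased(0.55) for _ in range(trials)) / trials  # biased coin missed

print(f"Type I rate  (false positive): {type_1:.3f}")
print(f"Type II rate (false negative): {type_2:.3f}")
```

Lower the threshold and the Type I rate climbs while the Type II rate falls; raise it and the reverse happens. There’s no free lunch – only a choice of which error you’d rather make.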

Even though they’re two sides of the same coin, there’s a HUGE difference between them. Did you know that people used to think tomatoes were poisonous? So folks didn’t eat them for a long time. Let’s take a look at two scenarios:

  1. People think a particular food is perfectly fine to eat. They’re wrong; it’s poisonous. Someone eats it and gets sick, maybe even dies – but then the belief is immediately corrected, and no one eats that again.
  2. People think a particular food is poisonous to eat. They’re wrong; it’s perfectly fine. No one eats it, so no one ever shows that it’s fine, and the belief persists.

That’s the difference between Type I Errors and Type II Errors. Type II Errors are Invisible Problems. Now, you might be thinking “okay, so no one eats that food, so what?” But let me expand on scenario 2 a little: What if the food that everyone thinks is poisonous isn’t just fine, it’s actually an incredibly healthy super-food that would double your lifespan if you ate it regularly? It would cut away fat, build muscle, and improve memory. But no one eats it, because they think it’s poisonous. No one ever even thinks to challenge this belief.

Why? Because the belief never challenges you. Falsely believing a poisonous fruit is okay to eat will have a pretty immediate impact on your life. But you could go your whole life and never even realize you’d made a mistake in the other direction.
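You can watch that asymmetry play out in a toy simulation (a hypothetical sketch in Python: 100 random foods, and eaters who only ever learn the truth about foods they actually eat):

```python
import random

# A hypothetical setup: 100 foods, each secretly safe or poisonous, and a
# population that starts with random beliefs about each one. The key rule:
# only foods *believed safe* ever get eaten, so only those beliefs get tested.

random.seed(1)
foods = [{"safe": random.random() < 0.5,
          "believed_safe": random.random() < 0.5}
         for _ in range(100)]

for _ in range(50):  # fifty years pass
    for food in foods:
        if food["believed_safe"]:
            # Someone eats it, and reality corrects the belief either way.
            food["believed_safe"] = food["safe"]
        # If it's believed poisonous, no one eats it, and no one ever learns.

type_1 = sum(f["believed_safe"] and not f["safe"] for f in foods)
type_2 = sum(f["safe"] and not f["believed_safe"] for f in foods)
print(f"Lingering Type I errors (eating poison):       {type_1}")
print(f"Lingering Type II errors (shunning safe food): {type_2}")
```

Run it and the Type I count drops to zero almost immediately – poison gets noticed – while the Type II count never moves, because a food no one eats generates no evidence.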

Invisible Problems.

If you take a bad drug that the FDA approved and get sick, you’ll be pretty mad at the FDA. But if you just get sick from something else, you’ll never think to blame the FDA for holding back a drug you aren’t even aware exists.

Thus we have the double dilemma of Invisible Problems. Not only are we losing tremendous personal and societal benefit every day to these kinds of issues, but we’re also forcing each other – and ourselves! – into incentive structures that reward making the problems worse.

If you try something new and it sucks, you’ll be sad. If you don’t try something new that would have been awesome, you won’t even notice – that’s just the status quo. So most people naturally push themselves into a life where they’re way more concerned about avoiding Type I Errors than Type II Errors, even though the reverse might make their lives much, much better.

And then we do it not only to ourselves, but to others. The FDA makes decisions that affect lives. Being too lenient might cost dozens of lives; being too conservative likely costs thousands every year. Despite this, they lean in the direction of being too conservative, because no one is going to yell at them for the thousands lost to that strategy, but people will get mad about the dozens that would be lost to the other strategy.

There are invisible problems everywhere. Sometimes there are ways to reveal them – almost always through an outside perspective, someone less emotionally tied to the risks involved and thus more objective. Seek out those people. Let others examine your choices, and live out loud.

Don’t just question your beliefs. Question what you don’t believe, what you haven’t even considered. You may catch a glimpse of an invisible problem, and if you can solve it, there’s tremendous value to be gained.
