I love “armchair epistemology.” I love thinking about how we come to know things, and how we come by the information we believe to be true.
Knowledge can only come to you in one of two ways when you boil it right down – firsthand or secondhand. You can discover something with your own intellect and senses, or you can accept it as truth from another source.
Both methods have their flaws; sadly, we don’t have any universally perfect way of discerning absolute truth. But I think the flaws in our ‘firsthand’ information-gathering systems are more apparent – certainly, as individuals we seem to spend more time refining them. If our senses aren’t giving us accurate information, we correct them with technology. If our intellect gives us the wrong answers, we study the flaws in our reasoning skills or we obtain new information and new models. In this way, we generally get better at acquiring firsthand knowledge, at least if we genuinely try.
But there are many flaws in secondhand knowledge as well, and we all too often simply ignore them. Secondhand knowledge has to come to you through The Three Great & Terrible Filters, each one stripping truth from information and scrambling the knowledge you receive. Despite their ubiquity (or perhaps because of it, like a fish being unaware of water), we barely acknowledge their existence.
The Three Great & Terrible Filters are Agenda, Noise, and Error. These are the Three Horsemen of Falsehood, and you’ve simply got to contend with them. If you don’t, you might as well just abandon knowing truth forever.
Agenda is… well, people have reasons to lie. Especially to big groups, and you’re part of a big group pretty much any time you receive secondhand information. I’m going to avoid anything majorly controversial here and pick a (hopefully!) less contentious example: the Moon Landing. Look, here’s the absolute reality: you don’t know that the Moon Landing was real. You don’t. You weren’t there. You don’t have firsthand knowledge unless you’re one of a very, very small number of people (and if Buzz Aldrin reads this blog, that is awesome).
So that means that your belief in the Moon Landing as a real event is based on secondhand information. Someone else told you that it happened. And here’s the thing – unless Dr. Aldrin himself is the one who told you, the person who told you also doesn’t know for sure that it happened. Secondhand becomes tenth-hand really quickly, and you don’t even realize it. So sure, you can think that the person yelling “the moon landing was faked!” is a crackpot, but they have exactly as much firsthand knowledge as you do. So the first Horseman you must slay is this one: does anyone have anything to gain by lying to me? If they do, they probably are lying. The reason I believe the Moon Landing is real is simply that I don’t think enough people have enough to gain by lying about it, versus the enormous cost of maintaining that lie. Not everything passes that test.
The Second Horseman is Noise. Look, you want to trust the experts. Sure… but there are a lot of experts, and they don’t all agree. Consensus is a myth. People, even very reasonable and smart people with expertise on the same subject, will disagree about that subject. They will use their firsthand systems to reach different conclusions, and then they will try to transmit those conclusions to other people. They won’t actually be lying, but they also won’t agree. To you, that’s noise. And people are really, really bad at admitting that two reasonable people can both be smart and yet disagree – instead, they start calling each other charlatans and accusing each other of being Agents of The First Horseman, and this only increases the effect of Noise as far as you, the secondhand recipient, are concerned. And other secondhand people have all sorts of tribal loyalties, which they’re basically using to stand in for certainty about information they can’t possibly be certain about. Whether they believe Reasonable Expert A or Reasonable Expert B will almost always boil down to which color flag that expert appears to be waving. Noise, noise, noise.
Okay, so you’ve got lots of people with reasons to lie, and lots of people with no reason to lie but who disagree with other people with no reason to lie, and lots of mixes between those two. But now, to compound all of that, comes the Third Horseman: Error. All of those people have the same problems gathering firsthand knowledge that you do, plus all of the problems with their own interpretations of the secondhand knowledge they need as part of their work, which means that on top of everything else, they can just be wrong. So in addition to getting skewed by passing through the Agenda and Noise filters, the information might not even have been right in the first place.
Okay, so this isn’t me saying you shouldn’t believe anything. It’s me saying you should be a lot less certain than you are about stuff. And the primary outcome of being a lot less certain is that you should be a lot less angry. Other people being wrong about stuff affects you every day. Someone else’s bad ideas about automotive safety can put your life at risk when they cut you off at high speed to run a red light. Someone else’s bad ideas about business can put your livelihood at risk. This is just… you know, life. Being angry about it helps exactly not at all, ever. Focus your energy on raising your own awareness of things that you can use to directly improve your life and the lives of those around you, and be less certain about everything else. Let people be wrong; insulate yourself from their wrongness before you try to be angry that they’re not right.
And when you hear anything, anything, remember that it is borne to you by the Three Terrible Horsemen, and their only weakness is the swift sword of detached skepticism. Be not afraid.