Who’s Lying?

(Warning: coming in hot today.)

When someone says something that isn’t true, there are two possibilities: they mistakenly believe that it is true, or they’re deliberately lying.

People have all sorts of reasons to deliberately lie, and just as many ways to come by a false belief “honestly” and repeat it. Telling the difference is a very useful skill. If someone is honestly mistaken, accusing them of deliberately lying isn’t very helpful – it may damage relationships you would otherwise want to salvage, and even at best it tends to push the person deeper into their position rather than opening them up to accepting the truth.

Meanwhile, if someone is lying deliberately, then trying to “correct” them is probably just playing into their hands. I see this particular pattern play out all the time with otherwise very smart people I know. A politician will say something that is obviously untrue to anyone with the tiniest bit of expertise in the subject. Some perfectly smart friend of mine will say, “Gosh, how do politicians that dumb get elected? They clearly don’t know anything at all about subject X.”

And I will look at this person like they just said the moon is made of cheese. If I were the kind of person who got into arguments, I might say: “You think that politician actually believes that thing they just said? They’re a politician. They said it because they know 51% of their voter base wants to hear it. There’s nothing more to it. They know perfectly well that it’s false.”

Believing that everyone lies will make your life very hard. Believing that no one lies will do the same. But it’s not random – there are patterns. They mostly follow incentives.

The random person on your Facebook friends list who tells you how great homeopathic medicine is? She probably genuinely believes it. The company that sells homeopathic “medicine”? They know perfectly well that it’s sugar water, and they are lying. How am I so certain of these things?

“Incentives versus proximity to knowledge.” Your friend Sam on Facebook has no great incentive to lie about the efficacy of snake oil; Sam gains very little personally if you believe it. And most other people have very little motivation to try to correct Sam, knowing that the battle isn’t likely to be fruitful. Sam also probably doesn’t do much research of the non-confirmation-bias-enforcing sort, so Sam’s incentives to lie and proximity to truth are both very low.

But now let’s look at a company that makes and sells this stuff. First, they obviously have a huge incentive to lie. If they told the honest truth about their product, they’d go out of business. So the incentive to lie is very high. And proximity to truth is also very high – you can’t be in the business of making and selling something for very long without finding out pretty much everything about it, and your position means that people will readily tell you when you’re wrong. So those folks know that they’re just pouring sugar water into bottles because they’ve been challenged a hundred times or more and they’ve had to lie their way out of those challenges. Just like in politics, it’s possible that someone initially got into it believing that their particular brand of snake oil was legitimate, but they would rapidly discover that it wasn’t once they were inside the actual snake oil factory. And at that point, some people quit in disgust, their position changed and their worldview perhaps a little more cynical; but most other people just start lying to uphold what they once may have honestly believed.

Here’s a little thought experiment I like to do when considering whether someone might be trustworthy: I think about how much that person would lose if they discovered evidence that they were wrong and admitted it. Example: Senator Smith got elected on Issue X, holding position X1. X is Smith’s signature issue, and X1 is Smith’s signature position; Smith’s entire brand is built around it, and Smith’s voter base is rabid about it. If concrete, irrefutable evidence emerged that X2 was the correct position and X1 was completely false and damaging, what do you think Smith would do? Get up on the pulpit and say “mea culpa?”

Nah. Smith would lie. That doesn’t by itself mean X1 is false, mind you. It just means that Smith saying it is neither evidence that it’s true nor evidence that Smith even thinks it’s true.

My father used to tell me that people were likely being honest if they got more trouble than reward for what they said. They might not have been right, but they probably weren’t lying. Your average person on the street probably doesn’t lie much about stuff like whether a particular policy is a good idea or whether a particular kind of medical intervention works well. They might be right or wrong, but they probably believe what they say. In order to really lie about something, you often have to be pretty deep into that thing. You have to have wrapped up enough of yourself into that thing to suffer personally if that thing falters.

But here’s the slipperiest, trickiest part of this: it can sometimes be very hard to tell the difference between someone who personally profits if they are right about X and someone who personally profits if people believe their particular position on X.

Both types of people may present as “experts in the industry.” Both might even be experts! But expertise often comes with entanglements, and those entanglements aren’t always obvious. So check those incentives.

And also, don’t buy anything with the word “homeopathic” on it.
