Imagine the following hypothetical:
You have a friend, let’s say Jill, who’s working on a difficult problem, a mathematical equation of some kind. It’s deep and complex, but Jill is good at this sort of thing. Generally, Jill’s foundation is secure – however, by a stroke of bad luck, she received some bad information somewhere and her answer is incorrect.
Now let’s say that you’re also good at this sort of thing, and you spot the point of error. You decide you’re going to let your friend know. But before you can do so, some other obnoxious person, let’s say Jack, starts screaming at her that he knows the right answer, because Space-Emperor Zorgnax revealed it to him in a drug-fueled fever dream. Not only is Jack obviously bananas, but in this hypothetical Jack is also being a real dingus about it – claiming not just that Jill happens to be in error, but that she’s generally unintelligent, unfit for her work, etc.
Now, let’s complicate things further. Let’s say by some coincidence, Jack actually is right. Not about the Space-Emperor Zorgnax stuff, but his actual end answer is the correct answer to the equation. You know, because you got there using actual math. He essentially just guessed, but even a broken clock is right twice a day and he happens to have gotten this one right.
What would you do?
It’s not as simple anymore as just saying to Jill, “Hey, you know how Jack is telling you that the answer is 48, in between screaming obscenities at you and threatening your death? Well, he’s right. It actually is 48.”
Your ability to be seen as objective has been stolen by a lunatic who happens to share your viewpoint.
That happens in real life, too. Not just in hypotheticals. More often than I’d like, I find myself in a similar situation:
“On the subject of X, I hold the A view. Lots of people hold the B view; I think they’re wrong, but they’re reasonable, polite, and I can see why they’ve reached the conclusion they have. Meanwhile, there are a bunch of drooling maniacs loudly shouting the A view – in my view, they’re correct, but not only are they ridiculous, they’re right for the wrong reasons.”
Why does it matter if you’re right for the wrong reasons?
Because your methodology won’t duplicate. Let’s say you decide which of fifty wires to cut on a live bomb by numbering them 1 to 50 and then picking a number out of a hat. If by some miracle you’re correct (and hey, there’s a chance), you still haven’t created some miraculous new bomb-defusing technique. Your methodology won’t withstand iteration.
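If you want to see why the hat trick fails under iteration, a quick simulation makes the point. This is a minimal sketch; the “correct” wire number and the trial count are arbitrary values I’ve chosen for illustration:

```python
import random

def pick_wire_at_random(num_wires=50):
    """The 'methodology': number the wires 1..num_wires, draw one from a hat."""
    return random.randint(1, num_wires)

def survival_rate(correct_wire=17, trials=100_000, num_wires=50):
    """Iterate the hat-draw strategy and measure how often it actually works."""
    survived = sum(
        pick_wire_at_random(num_wires) == correct_wire for _ in range(trials)
    )
    return survived / trials

# One lucky success proves nothing; repetition reveals the true odds (~1 in 50).
print(f"Survival rate over repeated attempts: {survival_rate():.1%}")
```

Any single run might get lucky, but across many runs the rate settles near 2% – which is exactly the difference between an answer and a technique.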
That’s why it’s so frustrating when that broken clock is right. It’s hard enough proving someone wrong when their bad methodology has led them to an incorrect answer. It’s downright impossible to convince someone that their thinking is flawed when it accidentally got them the right answer.
I once saw a trivia-style game show where the host asked a contestant, “If I’m drinking the penultimate beer of a six-pack, which number beer am I on?” (‘Penultimate’ is a cool word that means ‘second-to-last.’)
The contestant, who clearly didn’t know the word, uttered the following string of nonsense: “Okay, ‘penultimate,’ huh? Penultimate. Pen. Like pentagon. And a pentagon has five sides. So… five?”
Ha. Broken clocks, etc.
I have a rule of thumb that’s served me well: if I have a strong opinion and a lot of people agree with me, I check my work. I’m wary of strong consensus. I’m doubly wary if the people who agree with me seem unhinged. They might still be correct, but for all the wrong reasons. That causes me to double-check not only my results but my methodology. If I’m still right, I can be satisfied by that – but I’ll sleep better knowing that I did my extra homework before saying anything.
“It’s better to keep your mouth shut and appear stupid than open it and remove all doubt.” – Mark Twain