Over the years, I’ve met several individuals of immense capacity for logical deduction, genuinely impressive education, and sky-high IQ (we’re talking well north of 160). Many of these people have an almost supernatural ability that seems, to the merely mortal, to allow them to discover the truth rapidly and definitively – and in such a manner as to preclude any seemingly reasonable argument to the contrary.
If you are not one in a million and blessed with this combination, you might need a heuristic to filter out the bullshit that permeates the Internet – a place that contains almost the sum of all human knowledge mixed prodigiously with the sum of all human stupidity.
The Six Filters for Truth
Scott Adams, in his book How to Fail at Almost Everything and Still Win Big, has an effective heuristic he calls The Six Filters for Truth. They are as follows:
- Personal experience.
- Experience of people you know.
- Experts.
- Scientific studies.
- Common sense.
- Pattern recognition.
Each one of these has glaring flaws when individually given precedence over the others. Personal experience can be iffy and unreliable. People you know could be stupid. Experts could be dishonest, incompetent, or saying whatever the bankroll wants them to. Scientific studies are great at answering very specific questions – but correlation is not causation. Common sense is a great way to be wrong with supreme confidence and, in any event, as the saying goes – it is not so common. Pattern recognition could incorporate personal bias or be coincidental.
However, if several of those filters agree but one is completely out of line, it is reasonable to suspect the outlier might be bullshit.
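The outlier rule can be sketched as a toy function. The filter names come from Adams’ list, but the verdict strings and the structure of the code are my own illustration, not anything from the book:

```python
# A toy sketch of the outlier rule: if all filters but one agree,
# the lone dissenter is the one to suspect.
from collections import Counter

FILTERS = [
    "personal experience",
    "experience of people you know",
    "experts",
    "scientific studies",
    "common sense",
    "pattern recognition",
]

def suspect_bullshit(verdicts):
    """Given {filter_name: verdict}, return the lone dissenting
    filter if all the others agree, else None."""
    counts = Counter(verdicts.values())
    if len(counts) != 2:
        return None  # unanimous, or too muddled to call
    majority, minority = counts.most_common(2)
    if minority[1] == 1:  # exactly one filter disagrees
        for name, verdict in verdicts.items():
            if verdict == minority[0]:
                return name
    return None

# Tobacco example: industry-funded experts dissent from everything else.
verdicts = dict.fromkeys(FILTERS, "harmful")
verdicts["experts"] = "harmless"
print(suspect_bullshit(verdicts))  # → experts
```

In the tobacco case below, five filters say “harmful” and only the experts say otherwise, so the heuristic points straight at the experts.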
One of pop culture’s favorite examples of bullshit is the Tobacco Industry, which once denied that cigarettes caused harm and funded sympathetic or conflicting studies. Yet if someone had bothered to run the Six Filters for Truth, they probably would have seen through that right away. When the people you know who cough and get sick are the chain smokers, you can check #2. If you smoked yourself and suffered these issues, you could also check #1. #5 is great because merely thinking about breathing smoke into your lungs would have raised the thought “maybe this isn’t the healthiest thing to do.” #6 comes in when you notice that old people who smoked a lot generally had various unpleasant problems.
So even if the Tobacco Industry funded experts and studies – and their opponents funded other studies and experts (effectively rendering #3 and #4 a tie), you could say reasonably “eh, the Tobacco Industry is probably full of shit.”
The Internet is Full of Shit
Google – and by extension, the Internet it searches – creates a number of conflicts for #3 and #4. One of the most famous examples of this is the dispute over the minimum wage. Scott Alexander at SSC touches on this briefly:
“At some point in their education, most smart people usually learn not to credit arguments from authority. If someone says “Believe me about the minimum wage because I seem like a trustworthy guy,” most of them will have at least one neuron in their head that says “I should ask for some evidence”. If they’re really smart, they’ll use the magic words “peer-reviewed experimental studies.”
But I worry that most smart people have not learned that a list of dozens of studies, several meta-analyses, hundreds of experts, and expert surveys showing almost all academics support your thesis – can still be bullshit.
Which is too bad, because that’s exactly what people who want to bamboozle an educated audience are going to use.”
There is no definitive answer to this question that can be easily and casually proven. Many complex issues are like this: they are not monocausal, and there are so many second-order effects of any decision that it’s almost impossible to work out beforehand just what will happen. The minimum wage is one such issue, but on a meta level, many are.
In other words, for many of the most intensely debated political issues – at least as debated by midwits on social media – filters #3 and #4 are muddied so badly as to be inconclusive. That was the purpose of the Tobacco Industry funding competing research: to muddy the expert-opinion waters. Falling back on the other filters still provided a reasonable answer, however – smoking was probably harmful; just look at your cousin with lung cancer. An answer that eventually turned out to be right once the Tobacco Industry gave up its war.
Cigarette smoking was, however, a relatively easy case. Issues like the Minimum Wage and Global Warming are much more difficult to resolve and may never be conclusively answered the way cigarette smoking (eventually) was.
Google statistics, studies, and experts on almost any subject. You will get results sympathetic to your argument, ones against your position, and everything in between. If you’re a social media normie, you might get in a spat with your cousin’s friend about Global Warming and start a cycle of posting sympathetic sources, and in turn try to come up with ways to dismiss your opponent’s sources. Perhaps, your opponent suggests, your sources are funded by evil oil companies. You then point out that his studies come from government grants, and we all know the government wants more power.
You can go down that road forever. Nothing will come of it.
For your personal belief, you will have to employ other truth filters. For convincing your cousin’s friend, it’s hopeless, and it’s a waste of time to try.
Experts Are Not Equally Trustworthy
Tom Nichols – a man who loves to pontificate on why people should trust the experts (on everything) – frequently likes to compare experts to airline pilots. Generally speaking, airline pilots are very qualified, and we trust them to fly us safely all the time. So, in his thinking, we should ascribe similar levels of trust to, say, foreign policy experts.
This stems from a misconception: the notion that experts in one domain are equally honest and competent as experts in another.
If you employ the truth filtering, you’ll quickly realize how false that is.
Yes, we trust airline pilots a lot. There is common folk wisdom about it, like “flying is safer than driving” – folk wisdom that happens to be true – and it checks box #5. You can check box #2 in that, in all probability, you do not know anyone who personally died in an airliner crash due to pilot error. Your personal experiences with flying were probably good too. Box #1.
If you were so inclined, you could research box #4 and find that, statistically, airliners don’t crash very often. In fact, airline pilots are great in that you can usually check all six boxes. Tom, of course, wants you to think that all experts can be treated so graciously.
Apply the same thinking to another field of experts: auto mechanics.
Dear reader, I bet you are already feeling a bit skeevy at that. Most of us have probably encountered dumb and/or dishonest auto mechanics. I bet you have stories from friends about folks getting screwed by a local wrench turner. Local media loves to do exposés on auto mechanic dishonesty, overcharging, and incompetence. Yes, auto mechanics probably know more than most of my readers about fixing cars, but only an idiot would trust them the way they would an airline pilot.
As bad as auto mechanics can be, however, politicians are probably worse. Sure, they are probably more knowledgeable about the inner workings of politics than you are. This does not mean it is advisable to trust them.
There is a bit of the Motte & Bailey Fallacy in Tom Nichols’ arguments. When he says “trust the experts,” he trots out the nice, well-groomed, hyper-competent airline pilot. But what he really means is for you to listen to whichever experts thought the Afghanistan pull-out strategy was great or endorsed Janet Reno’s Waco strategy. It’s an “experts can do no wrong” argument dressed up in the best possible clothing.
One of my own personal heuristics here is to ascribe more trust to experts in the physical sciences and in engineering than I do in the soft sciences. For instance, I am more likely to trust a physicist or an aeronautical engineer than a sociologist or a political science expert. There is less wiggle room for bullshit when your plane either flies, or crashes in a burning pile of rubble.
Man-Made Global Warming
So just what do you do when presented with issues that defy easy resolution, but which have immense political power backing them? It’s clear that most powerful world governments (China excepted – which I find personally hilarious) want to employ various Global Warming mitigation strategies, most of which involve curtailing freedoms and raising taxes.
Let’s say that, like me, you find this a bit too convenient (in a Tobacco Industry research kind of way) and also do not like curtailing freedoms and raising taxes as a general rule. Good luck proving the experts wrong – you won’t be able to.
Of course, that doesn’t make them right either. Yet how can you oppose the political power backing such things without clear-cut proof of error? You might employ the filters for truth and come to the conclusion that the experts are full of shit, but a good portion of that list relies on anecdote, which, as we’ll touch on later, doesn’t work very well for convincing others.
For one, you can’t win that fight. A thousand victories in social media debates with your cousin’s second roommate will not move the political needle. Either the politicians have the power to get what they want, or they don’t. That is entirely orthogonal to whether or not the underlying position is true.
For the most part, I generally believe data that suggests the Earth has been warming in the last century or so. That is because this is a mostly physical science exercise. Obtain temperatures at various places around the world over a long period of time and compare these values, perform some math, average things out… and yeah, sure, you could probably get a general sense for temperature.
Causality, however, is an entirely different business. The pop culture narrative, generally supported by most major governments, is that the warming is predominantly man-made, that it is way out of line with prehistoric temperatures, and that it is a major crisis which will have terrible impacts, justifying all manner of government interventions.
All of those claims I find much more suspect. The Earth’s climate is so complex that I am inherently suspicious of any predominantly monocausal explanation, especially one that aligns with government interests. That alignment confuses me at times: Political Leftists are often suspicious of large corporate interests, yet are prepared to trust big government. The doom and gloom confuses me too. Certainly, rising temperatures will have various costs, drawbacks, and problems. But they probably have benefits as well – off the top of my head, probably more arable land.
Of course, I’m not saying it’s a good thing overall, either. I don’t know. I’m willing to generally trust the experts taking the temperature readings, but the various models, the political conclusions drawn from those models, and the blame assignment – those I am skeptical of.
It’s not very scientific, but those folks remind me a bit of the auto mechanics. Certainly not the airline pilots. In any case, the complexity of the issue renders most conclusions about it probably little better than guesses.
Anecdotal Evidence
Anecdotal evidence can be both the strongest form of evidence and the weakest. If you experience a thing yourself, that is strongly persuasive to you, personally. But its power to convince others is usually quite weak. This is a major source of disconnect in more formalized debates. You can know a thing to be true, to a very high degree of certainty, yet be completely unable to prove it to someone else. To observers watching the debate, it would then appear that your opponent is probably more right than you are.
Sometimes folks – usually midwits – will repeat the mantra “anecdotal evidence is the weakest form of evidence!” They probably heard about it in some class, or read it in a social media post, and it struck them as clever in a midwit sort of way. But the phrase’s conclusion really only applies to formal debate.
In truth, your anecdotal experiences can be valuable, or they can be bullshit. It depends greatly on the circumstances. Let’s say you were one of the rare people who lost a loved one in an airline crash. You might say “airline pilots are incompetent assbags.” You’d be full of shit, and the anecdotal experience led you wrong. But if you go through the filters for truth, you’d realize that five of the filters said “airline pilots are great” and only one – YOU – suggested that they sucked. In all probability, then, it is you who is wrong.
But let’s take a more contemporary example. I’ve read several news articles and statements from politicians suggesting that either inflation is completely within norms, or is hardly even there at all, or if it is there, it’s a good thing and your salaries are surely keeping up.
Okay. We have the expert opinion. What about anecdotes? I’ve seen prices on almost everything I buy on a regular basis go up by significant amounts. Very little – if anything – has gone down. Nor has my salary gone up during this period. Maybe that’s just me, right? So, I talk to friends and family. Nope, they are seeing the same things I am seeing. Pattern recognition is saying it too. In fact, everything except the experts says this is probably happening.
In that case, anecdote is a great data point (among others) to suggest maybe the experts are full of shit. It’s not always effective, but the six filters are a great heuristic. If one is out of line with the rest, you’re probably dealing with bullshit. And if the one out of line with everything else is you, then maybe it’s you who is full of shit.
The Internet is wonderful in many ways. So much knowledge and experience are buried in it. But Sturgeon’s Law still applies: 90% of everything is crap. The Internet is vast. When you consider that most of it is probably porn, and 90% of the remainder is probably bullshit, you are left with very little truly useful content, as a percentage.
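The back-of-envelope arithmetic above can be made explicit. The 50% porn share below is my own placeholder assumption (the text only says “most”), not a measurement:

```python
# Rough back-of-envelope: the porn share is a placeholder assumption
# (the text only says "most"); the 90% comes from Sturgeon's Law.
porn_fraction = 0.5   # assumed share of the Internet that is porn
crap_fraction = 0.9   # Sturgeon's Law: 90% of the remainder is crap
useful = (1 - porn_fraction) * (1 - crap_fraction)
print(f"{useful:.0%}")  # → 5%
```

Under those assumptions, only around one part in twenty of the Internet is truly useful, which is the author’s point in rougher language.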
Maybe it does contain almost the sum of all human knowledge. But this is like sifting through a landfill for valuables. They probably exist. But it might be wise to have some techniques and quick filters to make wading through the muck easier. And the bigger the landfill gets, the harder it might be to find what you’re looking for.
Sometimes I wonder if going back to books might become a thing again. For a while, it was easier to find a how-to on, say, fixing your specific car’s problem by Googling it. You would find more information, how-to videos, and documentation than you ever could in the Chilton Manual. Furthermore, you could usually get to the specific problem a lot faster. Perhaps you wanted to recharge the A/C and just needed to find the recharge port.
But eventually it may become hard enough to find all of that information, wading through all the clickbait, bullshit, long-winded videos, spam, and morons who think they understand the problem, that a return to just buying the manual might be easier. Perhaps many other knowledge domains are like this, and it would be easier to buy a book, or secure a membership to a website or something specific to that knowledge domain – one that is scoured of bullshit – than to use more generalized Internet searches.
For political issues, maybe we’re already there. Whatever lunatic position someone might have, there’s probably support for it from some corner of the Internet, from some expert or authority of some kind. It’s another problem with Tom Nichols and his take on expertise: which experts should you trust? He clearly has an idea of which ones he trusts and expects you to come to the same conclusion. Those largely consist of the politically connected and powerful experts, those who have been endorsed by the ruling body. That’s his path out of the Internet muck: trust the powerful. Whether this is out of principle or submission and cowardice, I can’t say, though I’d probably guess the latter.
Personally, if the experts are a confused mess of conflicting opinions, I usually just scratch that off the list and look for another way to filter for probable truth, knowing always that there’s a good chance I’m still wrong. But, in the manner of Socrates, maybe I’m slightly less dumb for realizing that.