Why I Don’t Trust Experts

Recently, I had a conversation with an acquaintance on the question of why one should trust experts when they are often wrong. Put too simply, he argued that experts couldn’t be trusted and I argued that they could. He based his argument on a litany of cases where the experts were wrong. As far as I know, in every case that he cited, the experts were indeed either completely wrong, there remained only a small kernel of truth from some earlier expert position, or there was little or no agreement among experts. Much of this discussion depended on how one defines “expert.” But the definition of expert is entangled with the extent to which an expert by any definition is reliable (by some definition of reliable).
I won’t try to reproduce the conversation here. What I will do is explain my side of the discussion in the context of his side of the discussion. I start with the assumptions that an expert is more or less wrong on many things within his or her field of expertise but that others are likely even more wrong. I rely on expert opinion out of a kind of existential laziness. I just don’t have the time or inclination to study everything at the required depth to reach meaningful conclusions. Life is too short to even think you are an expert in more than a very few subjects. So, insofar as there is general agreement among experts in a field about which I have such a lazy curiosity (and that is nearly every field), I rely on the experts. I rely on them not because I necessarily think they are correct in every detail or even because I necessarily think them correct on the larger issues but because of the process by which they become experts, a process that is in many ways similar to the way I have become somewhat knowledgeable on this or that.
First, they are trained, trained not just in the stuff of their field but in the ways in which people come to know the stuff of their field. That takes time. And for most experts it takes help – teachers who know the field much better than the budding expert. Yes, some experts are self-trained, but they are indeed exceptions. In any field of inquiry, this initial process takes more than focus and initiative. It takes hard work. A PhD or some other terminal degree in the field of supposed expertise is a common symbol that this first step is complete. Everyone knows of some honored professor, generally at a major research university, who lacks a terminal degree in his or her field of expertise, but they are rare and rightly so. In some fields, years, many years, of experience, critical reflection, and interaction with others fulfill the same requirement. Note that I do not have a PhD. Only my ego allows me to think that I am in any way an expert in anything.
But formal training is the beginning – not the end. Expertise is built in discussion with others who know the evidence and the methodologies as well as or better than the budding expert. The knowledgeable community grinds ideas in the mill of evidence and sifts them through the filter of parsimony. Generally, the experts propagate the better ideas and suppress the worse ideas, or they use the worse ideas as fodder for the development of better ideas. But once in a while, for a while, a good idea is ignored and a bad idea is propagated. Over the long haul, and that may take decades, in some cases centuries, the grinding and filtering process favors the better ideas. By this laborious process, expert opinion approaches, but probably never reaches, a more or less correct, if tentative, conclusion. Notice that the refining process results in iterative improvements in expert opinion rather than in certainty.
So, does that make expert opinion correct? No, it just makes it the best opinion available for those who are not experts.
Isaac Asimov summarized the process in an answer to a young skeptic,

John, when people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together.

Asimov’s whole essay is worth reading, or even rereading.
There is an important sign of expertise that speaks to how a person becomes an expert and to how one can identify an expert. An expert is always able and willing to accurately and charitably rehearse positions other than his or her own and to use evidence and logic to explain how and why his or her position is presumably better.
What about those cases where there is no or poor agreement among experts? This is really only a problem when the practical need for an opinion is great. Truthfully, that isn’t very often. In such cases, there is no reason to think that any opinion of a non-expert has more than a random chance of being helpful. So, in those cases should we just consult the Urim and Thummim or roll the dice? No, in those cases, we need to roll up our sleeves and understand why it is, in detail, that the experts differ. In other words, we need to become experts (or find experts if we are lazy) on the reasons for the lack of expert agreement and then make the best-informed, highest-probability choice we can make.
Now here’s the weird part. My acquaintance admitted that this was exactly how he went about working in a field in which he is a lesser expert. And in that field, when he needs the opinion of greater experts, he consults them, knowing that they might be wrong but that they have a better chance of being right than he does. He loathes the intrusion of non-experts into his own domain of expertise. But he just cannot see how his own experience and behavior translate to fields in which he is not even a lesser expert. In those areas, he views himself as not just a greater expert but a greater expert than the experts. Based solely on his own intuitions, he is certain of the truth of his beliefs in fields ranging from climate change to the history of the founding of the United States, to the transmission of the text of the Bible. After all, there have been many occasions when the experts were wrong.

One thought on “Why I Don’t Trust Experts”

  1. I wonder if you have had this experience (you probably have). Occasionally we run into that guy who churns out one wacko idea after another, and he is absolutely convinced about how accurate that idea is, providing pages of “evidence” in support of his argument. Each time I see a new idea from one of these guys (before I even read it) I’m already preparing to pull my hair out in frustration due to all the holes/faulty logic. But — on rare occasion — one of these ideas actually makes me stop and ponder if there might be something to what he is saying. I often then go on a line of investigation that either comes to a dead end or more or less counters his idea, but after those particular occasions, I find I have learned bits I had not had before. One of these days I wonder if I’ll find that one of these wacko ideas actually has something to it (monkeys and typewriters and all). Could these wackos be serving a purpose in making sure we don’t all fall into the trap of blindly following the herd-speak?
    A comment more applicable to what you wrote: The sifting analogy is a good one, but I always have a concern about those few bad ideas that get through. Once there is consensus around a bad idea, the experts also tend to start throwing out the ideas that come later which disagree with that idea (including ones that could point to the correct interpretation). If this occurs, you have a situation where bad bits are supporting other bad bits, and it’s harder to realize that you are dealing with a mess of bad bits. (The Earth-centric universe, and the fight for and against it, could be pointed to as an example.) Unfortunately, I can’t think of anything that would help prevent such things from occurring.