Expertise is a very specific quality. Dorion Sagan said of scientists ― but it applies to expertise in general ― that an expert who continues to make progress in their field “learns more and more about less and less until [they] know everything about nothing”. Admittedly, experts tend to retire or die before they reach Sagan’s end point, but it is an interesting starting point for understanding what experts are good for, why we should trust expertise in general and, from there, how one should police disciplines and expertise.
We all read and study. That’s why you’re here, at this blog and others better than it. But your dedication to reading doesn’t lead to expertise until you start to submit your understanding to peer review; to have your conceptions scrutinised, criticised and evaluated by other people interested in the field and in rational discourse. Even then, you are only informed. The next step is to take seriously the questions that cohort of people have about the field, and start directing your enquiry there. If you can start to present tentative ideas on those unanswered questions that basically survive scrutiny, then you are an expert in that small part of that field. Yes, the line between expert and nonexpert in an area is blurry. That binary divide is not wholly useful, as there is a middle ground of being “well read”, “informed” or even “qualified”.
I am qualified to talk about environmental management, eco-ethical questions (to a limit) and questions like climate change; I have a formal qualification from an institution that holds its cohort of experts to strict scrutiny (or, at least, outsources that scrutiny by demanding they submit their ideas to peer review in journal articles). I am well read, and arguably even informed, in certain areas of philosophy and science. But I wouldn’t want to mislead anyone ― I am not an expert in any field. And this is true of most people: not an expert in anything, qualified in something, informed about a few things (even if one of them is normally sports).
In general, we trust experts because we understand their standing in academic fields relies on their ability to both present valid progress in a field and tear down other people’s attempts at progress. We trust that bad ideas and bad evidence will get scrutiny that exorcises them from the field.
But we need to make two informed side-steps from absolute trust in ‘experts’: the fact that the immune system of these disciplines is all-too-human, and the fact that an expert is an expert in very little.
I recently wrote about Michael Egnor (a blogger, neurosurgeon and creationist) and his criticism of Sean Carroll (author, physicist, cosmologist, lecturer and atheist) and Carroll’s work. Egnor is literally a brain surgeon, but that doesn’t speak to an understanding of cosmology (the content of Carroll’s work) or philosophy (the game Egnor ultimately tried to play); it doesn’t even speak to an understanding of psychiatry, which Egnor tried to wrangle in as well. What I’m trying to point out here is that an expert is only an expert in their field. Outside of brain surgery, even brain surgeons can be uninformed or even unread on the issue. As soon as an ‘expert’ steps outside their field, the title doesn’t come with them. Unless they decide to start citing experts, their pronouncements carry no more weight than anyone else’s. Egnor’s post was that of an uninformed, unread, unqualified blogger ― not that of an expert in his field.
Reza Aslan (public speaker, PhD) knows this risk all too well. He claims to be a religious scholar with an expertise in Islamic culture or the history of religion. But he’s not. He has a Master’s in Theology and another in Fiction Writing, and I don’t consider myself an expert at Master’s level. He also has a PhD in Sociology, having submitted a thesis titled Global Jihadism as a Transnational Social Movement: A Theoretical Framework (the introduction to which is available here). Aslan knows he is an expert in a very narrow field, and he knows he has made a career out of talking about another field altogether. So he lies about his “expertise” and tries to use his PhD to bulldoze over criticism. The public sphere doesn’t have as reliable an immune system against fraudulent expertise and false credentials as the academic sphere does.
However, this immune system is human and imperfect. Groupthink can overtake a discipline, where certain foundational claims become the basis of all interpretation, even though honest, rational interpretation would cut them out. Take dietary science and obesity, for example: it simply does not appear to be the case that the best way to think about diet and obesity is “calories in – calories out”. Instead, the “quality” of those calories ― i.e. whether they are refined sugars, complex carbohydrates, fats or proteins ― and the ways particular foods affect metabolic function add so much complexity to the picture that “calories in – calories out” is so simplistic and misleading as to be fairly called “wrong”. But it was a foundational claim of dietary science, and so the “peers” who reviewed publications were lenient on work that interpreted the data that way.
There is not much doubt that dietary science has been working itself out of this rut for a long time. However, the controversy caused by Gary Taubes in 2002, with his publication of What if It’s All Been a Big Fat Lie?, highlights groupthink. Essentially, it was taken as a given that eating fat makes you fat. It made obvious sense and was backed up by the idea that calories made you fat ― with fat containing roughly twice the calories per gram of most carbohydrates. Taubes is a journalist and wrote his article not as an expert dietician, but as an informed sceptic. He has since written Good Calories, Bad Calories (2007) (The Diet Delusion in the UK). He does offer a review of the data and answers questions that are now emerging from medical and dietary science, so perhaps he is now an expert in the field. But I find he could well be called a ‘metaexpert’, in that he offers insight into ― and, to an extent, an opportunity to police ― the field in question. Taubes is not necessarily an expert in diet, but he knows enough about science in general (he’s a physicist) to have held the field to account.
Taubes is far from being the only “metaexpert”. There are fields of philosophy of science that professionally worry about the oligarchic structure of individual fields in science communities. Complex disciplines, like economics and sociology, fall into groupthink because their data are so statistically complex and ‘noisy’ that any falsification of a founding idea can be put down to statistical fluke or methodological error ― and that is generally acknowledged.
We do rely on experts to pass down knowledge. But expertise is a messy business, with some experts being mistaken in their field, some fields being wrong, and many experts simply not being all-knowing superintelligences that can speak as accurately on all topics as they can on their own field. All information passed on from experts is, necessarily, something to be sceptical of. The consensus of a field is less dubious. The utterances of an expert on secondment from another field are possibly no better than the utterances of day-drinking Dave. We need experts, but we also need to know why we need them.