Book review of “The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters”

Preface. Those who attack experts are exactly the people who will not read this book review (well, mainly some Kindle notes) of Nichols’ “The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters”. They scare me, and they scare the author — that’s why he wrote it. The Zombies are among us already on Fox News and hate-talk radio, brains not dead, but not functioning very well, and proud of it.

Energyskeptic is about the death by a thousand cuts that leads to collapse, with fossil fuel decline the main one. Rejecting expertise in favor of “gut feelings,” superstitions, and preferred notions, and rejecting science itself, is yet one more cut, one more factor leading to collapse. That rejection of expertise is manifested by those who voted for Trump, a man whose ignorance and incompetence are literally killing people, quite a few of them his own voters. It’s happening in the COVID-19 pandemic, in the attempts to get rid of Obamacare, and in the undoing of environmental rules and financial regulations that protected the poor and middle class from rapacious capitalists.

A few quotes from the book:

  • What I find so striking today is not that people dismiss expertise, but that they do so with such frequency, on so many issues, and with such anger.  
  • The death of expertise is not just a rejection of existing knowledge. It is fundamentally a rejection of science and dispassionate rationality, which are the foundations of modern civilization.
  • We have come full circle from a premodern age, in which folk wisdom filled unavoidable gaps in human knowledge, through a period of rapid development based heavily on specialization and expertise, and now to a postindustrial, information-oriented world where all citizens believe themselves to be experts on everything.
  • Some of us, as indelicate as it might be to say it, are not intelligent enough to know when we’re wrong, no matter how good our intentions.
  • There’s also the basic problem that some people just aren’t very bright. And as we’ll see, the people who are the most certain about being right tend to be the people with the least reason to have such self-confidence.  The reason unskilled or incompetent people overestimate their abilities far more than others is because they lack a key skill called “metacognition.”  
  • At the root of all this is an inability among laypeople to understand that experts being wrong on occasion about certain issues is not the same thing as experts being wrong consistently on everything. Experts are more often right than wrong, especially on essential matters of fact. And yet the public constantly searches for the loopholes in expert knowledge that will allow them to disregard all expert advice they don’t like.
  • We all have an inherent tendency to search for evidence that already meshes with our beliefs. Our brains are actually wired to work this way, which is why we argue even when we shouldn’t.  
  • Colleges also mislead their students about their competence through grade inflation. When college is a business, you can’t flunk the customers. A study of 200 colleges and universities up through 2009 found that A was the most common grade, an increase of 30% since 1960.

Related links:

2020 Trumpers are resistant to experts — even their own

Alice Friedemann, www.energyskeptic.com, author of “When Trucks Stop Running: Energy and the Future of Transportation” (2015, Springer), “Barriers to Making Algal Biofuels,” and “Crunch! Whole Grain Artisan Chips and Crackers”. Podcasts: Collapse Chronicles, Derrick Jensen, Practical Prepping, KunstlerCast 253, KunstlerCast 278, Peak Prosperity, XX2 report

***

Nichols, T. 2017. The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters. Oxford University Press.

The big problem is that we’re proud of not knowing things. Americans have reached a point where ignorance, especially of anything related to public policy, is an actual virtue. To reject the advice of experts is to assert autonomy, a way for Americans to insulate their increasingly fragile egos from ever being told they’re wrong about anything. It is a new Declaration of Independence: no longer do we hold these truths to be self-evident, we hold all truths to be self-evident, even the ones that aren’t true. All things are knowable and every opinion on any subject is as good as any other.

I wrote this because I’m worried. People don’t just believe dumb things; they actively resist further learning rather than let go of those beliefs. I was not alive in the Middle Ages, so I cannot say it is unprecedented, but within my living memory I’ve never seen anything like it.

Back in the late 1980s, when I was working in Washington, DC, I learned how quickly people in even casual conversation would immediately instruct me in what needed to be done in any number of areas, especially in my own areas of arms control and foreign policy. I was young and not yet a seasoned expert, but I was astonished at the way people who did not have the first clue about those subjects would confidently direct me on how best to make peace between Moscow and Washington. To some extent, this was understandable. Politics invites discussion. And especially during the Cold War, when the stakes were global annihilation, people wanted to be heard. I accepted that this was just part of the cost of doing business in the public policy world. Over time, I found that other specialists in various policy areas had the same experiences, with laypeople subjecting them to ill-informed disquisitions on taxes, budgets, immigration, the environment, and many other subjects. If you’re a policy expert, it goes with the job.

In later years, however, I started hearing the same stories from doctors. And from lawyers. And from teachers. And, as it turns out, from many other professionals whose advice is usually not contradicted easily. These stories astonished me: they were not about patients or clients asking sensible questions, but about those same patients and clients actively telling professionals why their advice was wrong. In every case, the idea that the expert knew what he or she was doing was dismissed almost out of hand.

Worse, what I find so striking today is not that people dismiss expertise, but that they do so with such frequency, on so many issues, and with such anger. Again, it may be that attacks on expertise are more obvious due to the ubiquity of the Internet, the undisciplined nature of conversation on social media, or the demands of the 24-hour news cycle. But there is a self-righteousness and fury to this new rejection of expertise that suggest that this isn’t just mistrust or questioning or the pursuit of alternatives: it is narcissism, coupled to a disdain for expertise as some sort of exercise in self-actualization.

This makes it all the harder for experts to push back and to insist that people come to their senses. No matter what the subject, the argument always goes down the drain of an enraged ego and ends with minds unchanged, sometimes with professional relationships or even friendships damaged. Instead of arguing, experts today are supposed to accept such disagreements as, at worst, an honest difference of opinion. We are supposed to “agree to disagree,” a phrase now used indiscriminately as little more than a conversational fire extinguisher. And if we insist that not everything is a matter of opinion, that some things are right and others are wrong … well, then we’re just being jerks, apparently.

There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that “my ignorance is just as good as your knowledge,” as Isaac Asimov once said.

In the early 1990s, a small group of “AIDS denialists,” including a University of California professor named Peter Duesberg, argued against virtually the entire medical establishment’s consensus that the human immunodeficiency virus (HIV) was the cause of Acquired Immune Deficiency Syndrome. There was no evidence for Duesberg’s beliefs, which turned out to be baseless. The Duesberg business might have ended as just another quirky theory defeated by research.

In this case, however, a discredited idea nonetheless managed to capture the attention of a national leader, with deadly results. Thabo Mbeki, then the president of South Africa, seized on the idea that AIDS was caused not by a virus but by other factors, such as malnourishment and poor health, and so he rejected offers of drugs and other forms of assistance to combat HIV infection in South Africa. By the mid-2000s, his government relented, but not before Mbeki’s fixation on AIDS denialism ended up costing, by the estimates of doctors at the Harvard School of Public Health, well over 300,000 lives and the births of some 35,000 HIV-positive children.

These are dangerous times. Never have so many people had so much access to so much knowledge and yet have been so resistant to learning anything. In the United States and other developed nations, otherwise intelligent people denigrate intellectual achievement and reject the advice of experts. Not only do increasing numbers of laypeople lack basic knowledge, they reject fundamental rules of evidence and refuse to learn how to make a logical argument. In doing so, they risk throwing away centuries of accumulated knowledge and undermining the practices and habits that allow us to develop new knowledge.

All of these choices, from a nutritious diet to national defense, require a conversation between citizens and experts. Increasingly, it seems, citizens don’t want to have that conversation. They’d rather believe they’ve gained enough information to make those decisions on their own, insofar as they care about making any of those decisions at all. For their part, many experts, and particularly those in the academy, have abandoned their duty to engage with the public. They have retreated into jargon and irrelevance, preferring to interact only with each other.

The death of expertise is not just a rejection of existing knowledge. It is fundamentally a rejection of science and dispassionate rationality, which are the foundations of modern civilization. It is a sign, as the art critic Robert Hughes once described late twentieth-century America, of “a polity obsessed with therapies and filled with distrust of formal politics,” chronically “skeptical of authority” and “prey to superstition.” We have come full circle from a premodern age, in which folk wisdom filled unavoidable gaps in human knowledge, through a period of rapid development based heavily on specialization and expertise, and now to a postindustrial, information-oriented world where all citizens believe themselves to be experts on everything.

Any assertion of expertise from an actual expert, meanwhile, produces an explosion of anger from certain quarters of the American public, who immediately complain that such claims are nothing more than fallacious “appeals to authority,” sure signs of dreadful “elitism,” and an obvious effort to use credentials to stifle the dialogue required by a “real” democracy. Americans now believe that having equal rights in a political system also means that each person’s opinion about anything must be accepted as equal to anyone else’s.

The immediate response from most people when confronted with the death of expertise is to blame the Internet. Professionals, especially, tend to point to the Internet as the culprit when faced with clients and customers who think they know better. As we’ll see, that’s not entirely wrong, but it is also too simple an explanation. Attacks on established knowledge have a long pedigree, and the Internet is only the most recent tool in a recurring problem that in the past misused television, radio, the printing press, and other innovations the same way.

The secrets of life are no longer hidden in giant marble mausoleums and the great libraries of the world. In the past, there was less tension between experts and laypeople, but only because citizens were simply unable to challenge experts in any substantive way. Moreover, there were few public venues in which to mount such challenges in the era before mass communications. We now live in a society where the acquisition of even a little learning is the endpoint, rather than the beginning, of education. And this is a dangerous thing.

Some of us, as indelicate as it might be to say it, are not intelligent enough to know when we’re wrong, no matter how good our intentions. Just as we are not all equally able to carry a tune or draw a straight line, many people simply cannot recognize the gaps in their own knowledge or understand their own inability to construct a logical argument.

Education is supposed to help us to recognize problems like “confirmation bias” and to overcome the gaps in our knowledge so that we can be better citizens.

In this hypercompetitive media environment, editors and producers no longer have the patience—or the financial luxury—to allow journalists to develop their own expertise or deep knowledge of a subject. Nor is there any evidence that most news consumers want such detail. Experts are often reduced to sound bites or “pull quotes,” if they are consulted at all. And everyone involved in the news industry knows that if the reports aren’t pretty or glossy or entertaining enough, the fickle viewing public can find other, less taxing alternatives with the click of a mouse or the press of a button on a television remote.

Maybe it’s not that people are any dumber or any less willing to listen to experts than they were a hundred years ago: it’s just that we can hear them all now.

A certain amount of conflict between people who know some things and people who know other things is inevitable. There were probably arguments between the first hunters and gatherers over what to have for dinner. As various areas of human achievement became the province of professionals, disagreements were bound to grow and to become sharper. And as the distance between experts and the rest of the citizenry grew, so did the social gulf and the mistrust between them. All societies, no matter how advanced, have an undercurrent of resentment against educated elites, as well as persistent cultural attachments to folk wisdom, urban legends, and other irrational but normal human reactions to the complexity and confusion of modern life.

Democracies, with their noisy public spaces, have always been especially prone to challenges to established knowledge. Actually, they’re more prone to challenges to established anything: it’s one of the characteristics that makes them “democratic”.

The United States, with its intense focus on the liberties of the individual, enshrines this resistance to intellectual authority even more than other democracies. Alexis de Tocqueville, the French observer, noted in 1835 that the denizens of the new United States were not exactly enamored of experts or their smarts. “In most of the operations of the mind, each American appeals only to the individual effort of his own understanding.” This distrust of intellectual authority was rooted, Tocqueville theorized, in the nature of American democracy. When “citizens, placed on an equal footing, are all closely seen by one another, they are constantly brought back to their own reason as the most obvious and proximate source of truth. It is not only confidence in this or that man which is destroyed, but the disposition to trust the authority of any man whatsoever.”

Such observations have not been limited to early America. Teachers, experts, and professional “knowers” have been venting about a lack of deference from their societies since Socrates was forced to drink his hemlock.

The Spanish philosopher José Ortega y Gasset decried in 1930 the “revolt of the masses” and the unfounded intellectual arrogance that characterized it.

The historian Richard Hofstadter argued back in 1963 that overwhelming complexity produced feelings of helplessness and anger among a citizenry that knew itself increasingly to be at the mercy of smarter elites. “What used to be a jocular and usually benign ridicule of intellect and formal training has turned into a malign resentment of the intellectual in his capacity as expert,” Hofstadter warned. “Once the intellectual was gently ridiculed because he was not needed; now he is fiercely resented because he is needed too much.”

The legal scholar Ilya Somin wrote in 2015 that the “size and complexity of government” have made it “more difficult for voters with limited knowledge to monitor and evaluate the government’s many activities. The result is a polity in which the people often cannot exercise their sovereignty responsibly and effectively.” More disturbing is that Americans have done little in those intervening decades to remedy the gap between their own knowledge and the level of information required to participate in an advanced democracy. “The low level of political knowledge in the American electorate,” Somin correctly notes, “is still one of the best-established findings in social science.”

The death of expertise, however, is a different problem than the historical fact of low levels of information among laypeople. The issue is not indifference to established knowledge; it’s the emergence of a positive hostility to such knowledge. This is new in American culture, and it represents the aggressive replacement of expert views or established knowledge with the insistence that every opinion on any matter is as good as every other. This is a remarkable change in our public discourse.

The death of expertise actually threatens to reverse the gains of years of knowledge among people who now assume they know more than they actually do. This is a threat to the material and civic well-being of citizens in a democracy.

Some folks seized on the contradictory news stories about eggs (much as they did on a bogus story about chocolate being a healthy snack that made the rounds earlier) to rationalize never listening to doctors, who clearly have a better track record than the average overweight American at keeping people alive with healthier diets.

At the root of all this is an inability among laypeople to understand that experts being wrong on occasion about certain issues is not the same thing as experts being wrong consistently on everything. The fact of the matter is that experts are more often right than wrong, especially on essential matters of fact. And yet the public constantly searches for the loopholes in expert knowledge that will allow them to disregard all expert advice they don’t like.

No one is arguing that experts can’t be wrong, but they are less likely to be wrong than nonexperts. The same people who anxiously point back in history to the thalidomide disaster routinely pop dozens of drugs into their mouths, from aspirin to antihistamines, which are among the thousands and thousands of medications shown to be safe by decades of trials and tests conducted by experts. It rarely occurs to the skeptics that for every terrible mistake, there are countless successes that prolong their lives.

There are many examples of these brawls among what pundits and analysts gently refer to now as “low-information voters.” Whether about science or policy, however, they all share the same disturbing characteristic: a solipsistic and thin-skinned insistence that every opinion be treated as truth. Americans no longer distinguish the phrase “you’re wrong” from the phrase “you’re stupid.” To disagree is to disrespect. To correct another is to insult. And to refuse to acknowledge all views as worthy of consideration, no matter how fantastic or inane they are, is to be closed-minded.

The epidemic of ignorance in public policy debates has real consequences for the quality of life and well-being of every American. During the debate in 2009 over the Affordable Care Act, for example, at least half of all Americans believed claims by opponents like former Republican vice presidential nominee Sarah Palin that the legislation included “death panels” that would decide who gets health care based on a bureaucratic decision about a patient’s worthiness to live. (Four years later, almost a third of surgeons apparently continued to believe this.) Nearly half of Americans also thought the ACA established a uniform government health plan. Love it or hate it, the program does none of these things. And two years after the bill passed, at least 40% of Americans weren’t even sure the program was still in force as a law.

First, while our clumsy dentist might not be the best tooth puller in town, he or she is better at it than you.  Second, and related to this point about relative skill, experts will make mistakes, but they are far less likely to make mistakes than a layperson. This is a crucial distinction between experts and everyone else, in that experts know better than anyone the pitfalls of their own profession. Both of these points should help us to understand why the pernicious idea that “everyone can be an expert” is so dangerous.

Knowing things is not the same as understanding them. Comprehension is not the same thing as analysis.

We all have an inherent and natural tendency to search for evidence that already meshes with our beliefs. Our brains are actually wired to work this way, which is why we argue even when we shouldn’t. And if we feel socially or personally threatened, we will argue until we’re blue in the face.

There’s also the basic problem that some people just aren’t very bright. And as we’ll see, the people who are the most certain about being right tend to be the people with the least reason to have such self-confidence.  The reason unskilled or incompetent people overestimate their abilities far more than others is because they lack a key skill called “metacognition.” This is the ability to know when you’re not good at something by stepping back, looking at what you’re doing, and then realizing that you’re doing it wrong. Good singers know when they’ve hit a sour note; good directors know when a scene in a play isn’t working; good marketers know when an ad campaign is going to be a flop. Their less competent counterparts, by comparison, have no such ability. They think they’re doing a great job.

Pair such people with experts, and, predictably enough, misery results. The lack of metacognition sets up a vicious loop, in which people who don’t know much about a subject do not know when they’re in over their head talking with an expert on that subject. An argument ensues, but people who have no idea how to make a logical argument cannot realize when they’re failing to make a logical argument. In short order, the expert is frustrated and the layperson is insulted. Everyone walks away angry.

Dunning described the research done at Cornell as something like comedian Jimmy Kimmel’s point that when people have no idea what they’re talking about, it does not deter them from talking anyway. As Dunning explained: “In our work, we ask survey respondents if they are familiar with certain technical concepts from physics, biology, politics, and geography. A fair number claim familiarity with genuine terms like centripetal force and photon. But interestingly, they also claim some familiarity with concepts that are entirely made up, such as the plates of parallax, ultra-lipid, and cholarine. In one study, roughly 90% claimed some knowledge of at least one of the nine fictitious concepts we asked them about.”

In other words, the least-competent people were the least likely to know they were wrong or to know that others were right, the most likely to try to fake it, and the least able to learn anything. Dunning and Kruger have several explanations for this problem. In general, people don’t like to hurt each other’s feelings, and in some workplaces, people and even supervisors might be reluctant to correct incompetent friends or colleagues. Some activities, like writing or speaking, do not have any evident means of producing immediate feedback. You can only miss so many swings in baseball before you have to admit you might not be a good hitter, but you can mangle grammar and syntax every day without ever realizing how poorly you speak.

Confirmation Bias

Not everyone, however, is incompetent, and almost no one is incompetent at everything. What kinds of errors do more intelligent or agile-minded people make in trying to comprehend complicated issues? Not surprisingly, ordinary citizens encounter pitfalls and biases that befall experts as well. “Confirmation bias” is the most common—and easily the most irritating—obstacle to productive conversation, and not just between experts and laypeople. The term refers to the tendency to look for information that only confirms what we believe, to accept facts that only strengthen our preferred explanations, and to dismiss data that challenge what we already accept as truth. If we’ve heard Boston drivers are rude, the next time we’re visiting Beantown we’ll remember the ones who honked at us or cut us off. We will promptly ignore or forget the ones who let us into traffic or waved a thank you. For the record, in 2014 the roadside assistance company AutoVantage rated Houston the worst city for rude drivers. Boston was fifth.

For people who believe flying is dangerous, there will never be enough safe landings to outweigh the fear of the one crash. “Confronted with these large numbers and with the correspondingly small probabilities associated with them,” the mathematician John Allen Paulos wrote in 2001, “the innumerate will inevitably respond with the non sequitur, ‘Yes, but what if you’re that one,’ and then nod knowingly, as if they’ve demolished your argument with their penetrating insight.”

We are gripped by irrational fear rather than irrational optimism because confirmation bias is, in a way, a kind of survival mechanism. Good things come and go, but dying is forever. Your brain doesn’t much care about all those other people who survived a plane ride.

Your intellect, operating on limited or erroneous information, is doing its job, trying to minimize any risk to your life, no matter how small. When we fight confirmation bias, we’re trying to correct for a basic function—a feature, not a bug—of the human mind.

Confirmation bias comes into play because people must rely on what they already know. They cannot approach every problem as though their minds are clean slates. This is not the way memory works, and more to the point, it would hardly be an effective strategy to begin every morning trying to figure everything out from scratch. Confirmation bias can lead even the most experienced experts astray. Doctors, for example, will sometimes get attached to a diagnosis and then look for evidence of the symptoms they suspect already exist in a patient while ignoring markers of another disease or injury.

In modern life outside of the academy, however, arguments and debates have no external review. Facts come and go as people find convenient at the moment. Thus, confirmation bias makes attempts at reasoned argument exhausting because it produces arguments and theories that are non-falsifiable. It is the nature of confirmation bias itself to dismiss all contradictory evidence as irrelevant, and so my evidence is always the rule, your evidence is always a mistake or an exception. It’s impossible to argue with this kind of explanation, because by definition it’s never wrong.

An additional problem is that most laypeople have never been taught, or have forgotten, the basics of the “scientific method.” This is the set of steps that lead from a general question to a hypothesis, testing, and analysis. Although people commonly use the word “evidence,” they use it too loosely; the tendency in conversation is to use “evidence” to mean “things which I perceive to be true,” rather than “things that have been subjected to a test of their factual nature by agreed-upon rules.”

Conspiracy Theories

The most extreme cases of confirmation bias are found not in the wives’ tales and superstitions of the ignorant, but in the conspiracy theories of more educated or intelligent people. Unlike superstitions, which are simple, conspiracy theories are horrendously complicated. Indeed, it takes a reasonably smart person to construct a really interesting conspiracy theory, because conspiracy theories are actually highly complex explanations.

Each rejoinder or contradiction only produces a more complicated theory. Conspiracy theorists manipulate all tangible evidence to fit their explanation, but worse, they will also point to the absence of evidence as even stronger confirmation. After all, what better sign of a really effective conspiracy is there than a complete lack of any trace that the conspiracy exists? Facts, the absence of facts, contradictory facts: everything is proof. Nothing can ever challenge the underlying belief.

One reason we all love a good conspiracy thriller is that it appeals to our sense of heroism. American culture in particular is attracted to the idea of the talented amateur (as opposed, say, to the experts and elites) who can take on entire governments—or even bigger organizations—and win.

More important and more relevant to the death of expertise, however, is that conspiracy theories are deeply attractive to people who have a hard time making sense of a complicated world and who have no patience for less dramatic explanations. Such theories also appeal to a strong streak of narcissism: there are people who would choose to believe in complicated nonsense rather than accept that their own circumstances are incomprehensible, the result of issues beyond their intellectual capacity to understand, or even their own fault.

Conspiracy theories are also a way for people to give context and meaning to events that frighten them. Without a coherent explanation for why terrible things happen to innocent people, they would have to accept such occurrences as nothing more than the random cruelty either of an uncaring universe or an incomprehensible deity.

The only way out of this dilemma is to imagine a world in which our troubles are the fault of powerful people who had it within their power to avert such misery. In such a world, a loved one’s incurable disease is not a natural event: it is the result of some larger malfeasance by industry or government.

Whatever it is, somebody is at fault, because otherwise we’re left blaming only God, pure chance, or ourselves.

Just as individuals facing grief and confusion look for reasons where none may exist, so, too, will entire societies gravitate toward outlandish theories when collectively subjected to a terrible national experience. Conspiracy theories and the flawed reasoning behind them, as the Canadian writer Jonathan Kay has noted, become especially seductive “in any society that has suffered an epic, collectively felt trauma. In the aftermath, millions of people find themselves casting about for an answer to the ancient question of why bad things happen to good people.” This is why conspiracy theories spiked in popularity after World War I, the Russian Revolution, the assassination of John F. Kennedy, and the terror attacks of September 2001, among other historical events.

Today, conspiracy theories are reactions mostly to the economic and social dislocations of globalization, just as they were to the aftermath of war and the advent of rapid industrialization in the 1920s and 1930s. This is not a trivial obstacle when it comes to the problems of expert engagement with the public: nearly 30% of Americans, for example, think “a secretive elite with a globalist agenda is conspiring to eventually rule the world.”

If trying to get around confirmation bias is difficult, trying to deal with a conspiracy theory is impossible. Someone who believes that the oil companies are suppressing a new car that can run on seaweed is unlikely to be impressed by your new Prius or Volt. The people who think alien bodies were housed at Area 51 won’t change their minds if they take a tour of the base. The alien research lab is underground.

Such theories are the ultimate bulwark against expertise, because of course every expert who contradicts the theory is ipso facto part of the conspiracy.

Stereotyping & Generalizations

Stereotyping is an ugly social habit, but generalization is at the root of every form of science. Generalizations are probabilistic statements, based in observable facts. They are not, however, explanations in themselves—another important difference from stereotypes. They’re measurable and verifiable. Sometimes generalizations can lead us to posit cause and effect, and in some cases, we might even observe enough to create a theory or a law that under constant circumstances is always true.

The hard work of explanation comes after generalization. Why are Americans taller than the Chinese? Is it genetic? Is it the result of a different diet? Are there environmental factors at work? There are answers to these questions somewhere, but whatever they are, it’s still not wrong to say that Americans tend to be taller than the Chinese, no matter how many slam-dunking exceptions we might find. To say that all Chinese people are short, however, is to stereotype. The key to a stereotype is that it is impervious to factual testing. A stereotype brooks no annoying interference with reality. Stereotypes are not predictions; they’re conclusions. That’s why it’s called “prejudice”: it relies on pre-judging.

Dispassionate discussion helps

Conversations among laypeople, and between laypeople and experts, can get difficult because human emotions are involved, especially if they are about things that are true in general but might not apply to any one case or circumstance. That’s why one of the most important characteristics of an expert is the ability to remain dispassionate, even on the most controversial issues.

Experts must treat everything from cancer to nuclear war as problems to be solved with detachment and objectivity. Their distance from the subject enables open debate and consideration of alternatives, in ways meant to defeat emotional temptations, including fear, that lead to bias. This is a tall order, but otherwise conversation is not only arduous but sometimes explosive.

There are other social and psychological realities that hobble our ability to exchange information. No matter how much we might suffer from confirmation bias or the heavy hand of the Dunning-Kruger Effect, for example, we don’t like to tell people we know or care about that they’re wrong. Likewise, as much as we enjoy the natural feeling of being right about something, we’re sometimes reluctant to defend our actual expertise.

Not wanting to offend can lead to poor decisions, social insecurity, faking it

When two people were involved in repeated discussions and decision making—and establishing a bond between the participants was a key part of the study—researchers found that the less capable people advocated for their views more than might have been expected, and that the more competent member of the conversation deferred to those points of view even when they were demonstrably wrong.

This might make for a pleasant afternoon, but it’s a lousy way to make decisions. As Chris Mooney, a Washington Post science writer, noted, this kind of social dynamic might grease the wheels of human relationships, but it can do real harm where facts are at stake. The study, he wrote, underscored “that we need to recognize experts more, respect them, and listen to them. But it also shows how our evolution in social groups binds us powerfully together and enforces collective norms, but can go haywire when it comes to recognizing and accepting inconvenient truths.”

The reality is that social insecurity trips up both the smart and the dumb. We all want to be liked. In a similar vein, few of us want to admit to being lost in a conversation, especially when so much information is now so easily accessible. Social pressure has always tempted even intelligent, well-informed people to pretend to know more than they do, but this impulse is magnified in the Information Age.

People skim headlines or articles and share them on social media, but they do not read them. Nonetheless, because people want to be perceived by others as intelligent and well informed, they fake it as best they can. As if all of this weren’t enough of a challenge, the addition of politics makes things even more complicated. Political beliefs among both laypeople and experts work in much the same way as confirmation bias. The difference is that beliefs about politics and other subjective matters are harder to shake, because our political views are deeply rooted in our self-image and our most cherished beliefs about who we are as people.

What we believe says something important about how we see ourselves as people. We can take being wrong about the kind of bird we just saw in our backyard, or who the first person was to circumnavigate the globe, but we cannot tolerate being wrong about the concepts and facts that we rely upon to govern how we live our lives. Take, for example, a fairly common American kitchen-table debate: the causes of unemployment. Bring up the problem of joblessness with almost any group of laypeople and every possible intellectual problem will rear its head. Stereotypes, confirmation bias, half-truths, and statistical incompetence all bedevil this discussion.

Consider a person who holds firmly, as many Americans do, to the idea that unemployed people are just lazy and that unemployment benefits might even encourage that laziness. Like so many examples of confirmation bias, this could spring from personal experience. Perhaps it proceeds from a lifetime of continuous employment, or it may be the result of knowing someone who’s genuinely averse to work. Every “help wanted” sign—which confirmation bias will note and file away—is further proof of the laziness of the unemployed. A page of job advertisements or a chronically irresponsible nephew constitutes irrefutable evidence that unemployment is a personal failing rather than a problem requiring government intervention.

Now imagine someone else at the table who believes that the nature of the American economy itself forces people into unemployment. This person might draw from experience as well: he or she may know someone who moved to follow a start-up company and ended up broke and far from home, or who was unjustly fired by a corrupt or incompetent supervisor. Every corporate downsizing, every racist or sexist boss, and every failed enterprise is proof that the system is stacked against innocent people who would never choose unemployment over work. Unemployment benefits, rather than subsidizing indolence, are a lifeline and perhaps the only thing standing between an honest person and complete ruin.

It’s unarguable that unemployment benefits suppress the urge to work in at least some people; it’s also undeniable that some corporations have a history of ruthlessness at the expense of their workers, whose reliance on benefits is reluctant and temporary. This conversation can go on forever, because both the Hard Worker on one side and the Kind Heart on the other can adduce anecdotes, carefully vetted by their own confirmation bias, that are always true.

There’s no way to win this argument, because in the end, there are no answers that will satisfy everyone. Laypeople want a definitive answer from the experts, but none can be had because there is not one answer but many, depending on circumstances. When do benefits encourage sloth? How often are people thrown out of work against their will, and for how long? These are nuances in a broad problem, and where our self-image is involved, nuance isn’t helpful. Unable to see their own biases, most people will simply drive each other crazy arguing rather than accept answers that contradict what they already think about the subject. The social psychologist Jonathan Haidt summed it up neatly when he observed that when facts conflict with our values, “almost everyone finds a way to stick with their values and reject the evidence.”

Dumbing down of education, lack of critical thinking taught

Many of those American higher educational institutions are failing to provide to their students the basic knowledge and skills that form expertise. More important, they are failing to provide the ability to recognize expertise and to engage productively with experts and other professionals in daily life. The most important of these intellectual capabilities, and the one most under attack in American universities, is critical thinking: the ability to examine new information and competing ideas dispassionately, logically, and without emotional or personal preconceptions. This is because attendance at a postsecondary institution no longer guarantees a “college education.” Instead, colleges and universities now provide a full-service experience of “going to college.” These are not remotely the same thing, and students now graduate believing they know a lot more than they actually do. Today, when an expert says, “Well, I went to college,” it’s hard to blame the public for answering, “Who hasn’t?” Americans with college degrees now broadly think of themselves as “educated” when in reality the best that many of them can say is that they’ve continued on in some kind of classroom setting after high school, with wildly varying results.

Students at most schools today are treated as clients, rather than as students. Younger people, barely out of high school, are pandered to both materially and intellectually, reinforcing some of the worst tendencies in students who have not yet learned the self-discipline that once was essential to the pursuit of higher education. Colleges now are marketed like multiyear vacation packages.

The new culture of education in the United States is that everyone should, and must, go to college. This cultural change is important to the death of expertise, because as programs proliferate to meet demand, schools become diploma mills whose actual degrees are indicative less of education than of training.

Young people who might have done better in a trade sign up for college without a lot of thought given to how to graduate, or what they’ll do when it all ends. Four years turns into five, and increasingly six or more. A limited course of study eventually turns into repeated visits to an expensive educational buffet laden mostly with intellectual junk food, with very little adult supervision to ensure that the students choose nutrition over nonsense.

Schools that are otherwise indistinguishable on the level of intellectual quality compete to offer better pizza in the food court, plushier dorms, and more activities besides the boring grind of actually going to class.  The cumulative result of too many “students,” too many “professors,” too many “universities,” and too many degrees is that college attendance is no longer a guarantee that people know what they’re talking about.

College is supposed to be an uncomfortable experience. It is where a person leaves behind the rote learning of childhood and accepts the anxiety, discomfort, and challenge of complexity that leads to the acquisition of deeper knowledge—hopefully, for a lifetime. A college degree, whether in physics or philosophy, is supposed to be the mark of a truly “educated” person who not only has command of a particular subject, but also has a wider understanding of his or her own culture and history. It’s not supposed to be easy.  

Over 75% of American undergraduates attend colleges that accept at least half their applicants. Only 4% attend schools that accept 25% or less, and fewer than 1% attend elite schools that accept fewer than 10% of their applicants. Students at these less competitive institutions then struggle to finish, with only half completing a bachelor’s degree within six years.

Many of these incoming students are not qualified to be in college and need significant remedial work. The colleges know it, but they accept students who are in over their heads, stick them in large (but cost-efficient) introductory courses, and hope for the best. Why would schools do this and obviously violate what few admissions standards they might still enforce? As James Piereson of the Manhattan Institute wrote in 2016, “Follow the money.”

Parenting obviously plays a major role here. Overprotective parents have become so intrusive that a former dean of first-year students at Stanford wrote an entire book in which she said that this “helicopter parenting” was ruining a generation of children.

More people than ever before are going to college, mostly by tapping a virtually inexhaustible supply of ruinous loans. Buoyed by this government-guaranteed money, and in response to aggressive marketing from tuition-driven institutions, teenagers from almost all of America’s social classes now shop for colleges the way the rest of us shop for cars. The idea that adolescents should first think about why they want to go to college at all, find schools that might best suit their abilities, apply only to those schools, and then visit the ones to which they’re accepted is now alien to many parents and their children.

This entire process means not only that children are in charge, but that they are already being taught to value schools for reasons other than the education they might provide. Schools know this, and they’re ready for it. In the same way the local car dealership knows exactly how to place a new model in the showroom, or a casino knows exactly how to perfume the air that hits patrons just as they walk in the door, colleges have all kinds of perks and programs at the ready as selling points, mostly to edge out their competitors over things that matter only to kids.

Driven to compete for teenagers and their loan dollars, educational institutions promise an experience rather than an education. I am leaving aside for-profit schools here, which are largely only factories that create debt and that in general I exclude from the definition of “higher education.” There’s nothing wrong with creating an attractive student center or offering a slew of activities, but at some point it’s like having a hospital entice heart patients to choose it for a coronary bypass because it has great food.

At many colleges, new students already have been introduced to their roommates on social media and live in luxurious apartment-like dorms. That ensures they basically never have to share a room or a bathroom, or even eat in the dining halls if they don’t want to. Those were the places where previous generations learned to get along with different people and manage conflicts when they were chosen at random to live with strangers in close and communal quarters.

In 2006, the New York Times asked college educators about their experiences with student email, and their frustration was evident. “These days,” the Times wrote, “students seem to view [faculty] as available around the clock, sending a steady stream of e-mail messages … that are too informal or downright inappropriate.” As a Georgetown theology professor told the Times, “The tone that they would take in e-mail was pretty astounding. ‘I need to know this and you need to tell me right now,’ with a familiarity that can sometimes border on imperative.”

Email, like social media, is a great equalizer, and it makes students comfortable with the idea of messages to teachers as being like any communication with a customer-service department. This has a direct impact on respect for expertise, because it erases any distinction between the students who ask questions and the teachers who answer them. As the Times noted, while once professors may have expected deference, their expertise seems to have become just another service that students, as consumers, are buying. So students may have no fear of giving offense, imposing on the professor’s time or even of asking a question that may reflect badly on their own judgment. Kathleen E. Jenkins, a sociology professor at the College of William and Mary in Virginia, said she had even received e-mail requests from students who missed class and wanted copies of her teaching notes.

Professors are not intellectual valets or on-call pen pals. They do not exist to resolve every student question instantly—including, as one UC Davis professor reported, advice about whether to use a binder or a subject notebook. One of the things students are supposed to learn in college is self-reliance, but why bother looking something up when the faculty member is only a few keystrokes away?

Small colleges do not have the resources—including the libraries, research facilities, and multiple programs—of large universities.

When rebranded universities offer courses and degree programs as though they are roughly equivalent to their better-known counterparts, they are not only misleading prospective students but also undermining later learning. The quality gap between programs risks producing a sense of resentment: if you and I both have university degrees in history, why is your view about the Russian Revolution any better than mine? Why should it matter that your degree is from a top-ranked department, but mine is from a program so small it has a single teacher? If I studied film at a local state college, and you went to the film program at the University of Southern California, who are you to think you know more than I? We have the same degree, don’t we?

We may not like any of these comparisons, but they matter in sorting out expertise and relative knowledge. It’s true that great universities can graduate complete dunderheads. Would-be universities, however, try to punch above their intellectual weight for all the wrong reasons, including marketing, money, and faculty ego. In the end, they are doing a disservice to both their students and society. Studying the same thing might give people a common language for further discussion of a subject, but it does not automatically make them peers.

Colleges also mislead their students about their competence through grade inflation. When college is a business, you can’t flunk the customers. A study of 200 colleges and universities up through 2009 found that A was the most common grade, an increase of 30% since 1960. Grades of A or B account for over 80% of all grades in all subjects. Even at Harvard, the most common grade was a straight A. Princeton tried to limit the faculty’s ability to give A grades in 2004, but the faculty fought it. When Wellesley tried to cap the average grade at a B+, those courses lost 20% of enrollments and participating departments lost a third of their majors.

In the end, grade inflation gives students unwarranted confidence in their abilities. Almost all institutions collude on grades, driven by market pressures: to make college fun, to make students attractive to employers, and to spare professors the wrath of dissatisfied students.

Kindle notes end

Next chapter: the internet, books, radio, Rush Limbaugh, and above all Fox News as the death of expertise. How people choose the news that suits them. People don’t hate the media, just the news they don’t like or that has views with which they don’t agree.

Helpful hints

Be humble. Assume that the people who wrote a story know more about the subject than you do and spent a lot more time on that issue.

Vary your diet, consume mixed sources of media, including from other countries.

Be less cynical, or at least not so cynical. It’s rare that someone is setting out intentionally to lie to you.

Lots of good stuff.  Too much to enter notes on.

Trump won because he connected with voters who believe that knowing about things like America’s nuclear deterrent is pointy-headed claptrap. They didn’t know or care that Trump was ignorant or wrong, and most didn’t even recognize his errors. Trump’s strongest supporters in 2016 were concentrated among people with low levels of education. “I love the poorly educated,” Trump exulted, and that love was clearly reciprocated. In Trump, Americans who believe shadowy forces are ruining their lives and that intellectual ability is a suspicious characteristic in a national leader found their champion. They believed that the political elite and their intellectual allies were conspiring against them.

Plummeting literacy and growth of willful ignorance is part of a vicious circle of disengagement between citizens and public policy. People know little and care less about how they are governed, or how their economic, scientific, or political structures actually function. And as these processes become more complex and incomprehensible, citizens feel more alienated.  Overwhelmed, they turn away from education and civic involvement and withdraw into other pursuits. This in turn makes them less capable citizens, and the cycle continues and strengthens, especially when there are so many entertainments to escape into.  Many Americans have become almost childlike in their refusal to learn enough to govern themselves or guide the policies that affect their lives.

And quite a bit more about what has resulted from Americans’ rejection of expertise.
