When does it become fair to say that offbeat, unscientific ideas are not just harmless intellectual errors but dangerous? Well, to take a few examples, maybe...
- when a trainer at hip Crunch Gym, according to a lawsuit, gives a woman with high blood pressure supplements that were meant to enhance her performance but instead caused a stroke (one of several cases prompting recent regulatory action against ephedra)...
- when rumors spread in Nigeria that the polio vaccine is really a Western plot to spread AIDS or other illnesses and vaccinations in some areas are suspended as a result (prompting Nigerian officials to travel abroad in search of proof the polio vaccine is safe while polio recurs in their country)...
- when the entire food industry seems to be retooling for a low-carb, meat-seeking, Atkins-dieting population despite a lack of solid evidence that the late Dr. Atkins' diet is any more effective than calorie reduction by any other means (prompting the radical animal rights/anti-meat group misleadingly known as the Physicians' Committee for Responsible Medicine to leak Atkins' private medical records, showing he was overweight and had chronic heart problems, in turn prompting a defensive Atkins spokesperson to make the implausible claim that Atkins gained about seventy pounds from water retention during the short time he was in the hospital before his death)...
...and the list, sadly, goes on endlessly. Well, not quite endlessly, but this marks roughly the 100th article I've written for HealthFactsAndFears and the American Council on Science and Health. (To do a better job of keeping up with the torrent of wackiness, HFAF will become a blog -- a "web log" of frequent, short, chronological posts with a lot of links -- in early spring.)
Sometimes I think the only way to get to the root of all the unscientific claims may be to ask what factors in human psychology draw people away from science. And I'm not merely saying that everyone is naive and gullible. I'm interested in why we are drawn to certain ideas and not others, regardless of our general lack of scientific sophistication. I've created a list of Four Big Problems with our brains worth keeping in mind.
1. Epistemological Perversion
For starters: the human brain, I suspect, is hard-wired to be on the lookout for weird but useful discoveries -- our ancestors' lives often depended upon spotting unexpected yams and antelopes amidst the shrubbery, after all, and that instinct served them well. But perhaps it also explains why a substantial subset of the population is drawn precisely to ideas that sound bizarre and implausible. The thought "It sounds fishy -- but, my goodness, imagine if it were true!" is irresistible to some people, like finding an unexpected yam (or Easter egg), while "This sounds unlikely" is boring. You might call this attraction to unlikely claims epistemological perversion.
Occasionally, the very absurdity of an idea creates feelings of wonder and bafflement that some people find irresistible. So, for instance, if you told me that cotton candy becomes twenty times harder than steel when it gets dipped in water -- but that the steel industry is covering up this fact to protect its business -- I would tend to be skeptical and demand very substantial evidence. However, some people would become very excited, as though let in on a great secret, and would only reluctantly acknowledge that the claim is insupportable.
That, I suspect, is how so many people end up believing that, say, the oil industry has been deep-sixing perpetual motion machines and cars fueled by water; that modern medicine -- in spite of the millions of lives saved and the millions more lives dramatically lengthened in recent decades by pharmaceuticals and vaccines -- is secretly aimed at keeping us sick and that primitive herbs and potions are more effective; or for that matter (to take a theory popular in France for a fleeting instant) that the Pentagon was never attacked on 9/11/01.
Precisely because an idea has been marginalized, driven to the fringes, some people find it attractive; seeing it marginalized even evokes their pity. But we must ask what people think is supposed to become of genuinely bad ideas. There will always be people who think that anti-Darwinists, Holocaust deniers, perpetual motion machine-builders, homeopathic cancer-curers, and Bigfoot trackers haven't gotten a fair hearing yet. But they have. They have failed to make their cases, and it is perfectly fair that the rest of us -- whether as voters, science journal editors, FDA researchers, or pundits -- move on at some point, always remaining open to substantive new evidence but not stopping the presses every time the fringe loonies want to repeat their discredited claims.
2. Intellectual Dishonesty (the Lies We Tell Ourselves -- and Millions of Other People)
In addition to the problem of bizarre ideas looking attractive, there is the problem of most people being guilty of at least some intellectual dishonesty -- I don't mean outright, conscious lies, but rather the all-too-natural tendency of our brains to veer away from the weak points in our own arguments and to dwell far more happily upon the weak points in our opponents'. Frankly, you can defend almost any bad philosophy imaginable with just a tiny dash of intellectual dishonesty. Any idea with the slightest intuitive appeal can be made to seem rock-solid (to at least some listeners) if you're just willing to glide quickly over its flaws and ignore its unfounded assumptions: Nazism, the Klan, communism, you name it.
Add to that the fact that many people instinctively feel that their anger at their opponents justifies using intellectually imperfect tactics against them (see: virtually every political argument).
One of the most common ways in which even "nice" people routinely engage in intellectual dishonesty is by appeals to what they "know in their hearts." What the sentimentalists usually mean by "heart" in such statements is really those portions or functions of the brain involving emotion rather than strict logic -- that is to say, fuzzy thinking instead of careful thinking. It should hardly be a surprise that sloppy, emotive thinking tends to (appear to) shore up precisely those things that we long to defend (such as the belief that your luck is changing for the better, belief that Rover somehow survived that encounter with the car years ago and will come back someday, belief that your family/friends/nation would never do anything wrong).
It is precisely to avoid that sort of bias and woolly thinking that we employ skepticism, asking ourselves not the softball question "Do I have a gut feeling I'm right about things?" but "Are there ways I may be wrong that I haven't carefully examined?" Without an earnest skeptical drive to counter wishful thinking, without the scientific method to filter out bad evidence (or non-evidence), without the basics of philosophy as a check on unwarranted leaps in logic, one is almost guaranteed to go astray. Sometimes I think we'd be much closer to sanity if "intuition" became a dirty word.
3. Aversion to Changing One's Mind or, Worse, Having It Changed By Someone Else
In childhood, people tend to form very short, concrete lists of things that exist in the world, like so:
People
Doggies
Cars...
and so on. And there are some rather abstract items that soon find their way onto the list:
Minds
Consciences
Lessons
Places...
and so on. And people get a bit frustrated if their lists start growing more and more complicated due to analysis that forces them to break down the tidy list of concretes into a longer list:
Domesticated doggies
Feral doggies
Wolf-dog hybrids...
They are even more resistant to breaking down the more abstract items on their lists, such as minds, into subcategories, since they soon realize they aren't entirely sure how to define them or where to draw the line between ambiguous cases:
Average human minds
Near-brain-dead minds with only minimal life systems still functioning
Computers with human-level intelligence, real or hypothetical...
People often get combative when shown that they must add complexity to their short lists, and they'll often go so far as to deny the truth of complexity-revealing analysis in order to keep their beloved lists short and simple. This rather anti-intellectual impulse crops up across the political spectrum. On the right, it can produce a fear of thinking too deeply about whether souls are real things or mere metaphorical constructs, whether religion might be an amalgam of truths and superstitions, whether illegal drugs might have their good points, whether some outré lifestyles might be more benign than others, whether some uses of cloning technology might be more ethical than others, and so forth. Similarly, once people on the left have, for instance, come to think of industries as "evil polluters," they aren't likely to be interested in cost-benefit analysis that might reveal the industries do far more good than harm. Leftists also tend to be resistant to such complicating notions as excessive union demands, wealthy ethnic minorities, militaristic socialist regimes, violent peace protesters, or all-natural carcinogens. These things don't fit easily into the left's usual ready-made categories. And I've written at length about how difficult it is to get some of my fellow libertarians to see that addiction and insanity are greyer areas than the routine purchase of widgets. People also don't like new facts that complicate a simple story, especially one they've long been familiar with (and especially one they've already repeated countless times).
Closely related to the desire to avoid changing one's mind or complicating one's worldview is a fourth problem...
4. Fear of the Unknown
Faced with a familiar threat and an unknown one, people will almost always behave as if the unknown threat is worse. This is to some extent a useful instinct: a familiar terrain, no matter how rocky, is to some extent one you know how to negotiate, while a valley you've never seen before, no matter how green and peaceful-looking, may hold hidden traps. People should be willing to adjust their fears to fit the facts as they acquire new, more accurate data, though. If solidly-documented statistics show that virtually no one has been injured in the valley while even the most experienced negotiators of the rocky terrain break a limb occasionally, you might want to consider spending more time in the valley, even if it seemed mysterious and thus scary at first.
Similarly, threats such as malaria, the flu, polio, measles, or malnutrition may seem so familiar as to be almost boring, but our familiarity and apathy shouldn't lull us into thinking they carry smaller risks than the weird, new-fangled pesticides, vaccines, or genetically-modified foods with which we might fight those familiar, boring threats. Boring things can kill, and strange new things can sometimes save us, so avoiding the unknown will only get you so far. Indeed, without any strange new things, we'd still be living in caves.
When I figure out why people are instantly drawn to some new things (see problem #1) but repelled by other new things, I'll really have a handle on how the mind works.
In the meantime, I admit I have my own broad idea of how the world ought to be and that that no doubt shapes and subtly biases my own perceptions -- I'd like to see the whole planet modernized, secularized, privatized, globalized, and thus in some sense Americanized (though doing all that would be more of a victory for Enlightenment-style rationality and for humanity in general than a narrowly nationalistic feat, and I wouldn't push a more chauvinistic agenda). I may tend to err on the side of skepticism when people say they can cure cancer with magic tree bark, outlaw industry without harming the economy, or lose weight by eating more calories and exercising less -- but I remain more than willing to look at new data, will be wary of my own inevitable intellectual errors, and am always happy to change my mind, even about the most basic assumptions, when the evidence warrants it.
I'll keep you posted on what I find out, at an even faster pace than I have over the past two years.
And if all that sounded recklessly abstract, here's an antidote: up-to-date specifics on osteoarthritis and its treatment, in the form of ACSH's new report and consumer booklet on the subject.