The Remarkable Human Capacity For Being Fooled

Sunday • December 14th 2025 • 5:26:26 pm

Here is a fact that I find endlessly fascinating: the human brain contains approximately 86 billion neurons, each capable of forming thousands of connections with other neurons, creating a network of such staggering complexity that if you tried to count every synaptic connection at a rate of one per second, you would need roughly 3 million years to finish. You would also, one imagines, be rather bored by the end.
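
For readers who want to check that arithmetic, here is a back-of-envelope sketch. It is illustrative only, and it assumes a deliberately conservative 1,000 connections per neuron (the real figure is "thousands," so the result only grows from here):

```python
# Back-of-envelope check of the "count every synapse" figure.
# Assumptions (illustrative, not a neuroscience reference):
#   86 billion neurons, ~1,000 synapses each -- a conservative
#   reading of "thousands of connections".
neurons = 86e9
synapses_per_neuron = 1_000
total_synapses = neurons * synapses_per_neuron     # ~8.6e13 connections

seconds_per_year = 60 * 60 * 24 * 365.25
years_needed = total_synapses / seconds_per_year   # counting one per second

print(f"About {years_needed / 1e6:.1f} million years")  # ~2.7 million years
```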

This magnificent organ—the most complex structure in the known universe, capable of composing symphonies, solving differential equations, and experiencing the sublime terror of existence—can also be convinced to buy a pet rock.

In 1975, a California advertising executive named Gary Dahl sat in a bar listening to his friends complain about their pets. Pets, they noted, required feeding, walking, veterinary care, and emotional attention. Dahl, in what one can only describe as either a stroke of genius or a devastating commentary on consumer culture, announced that he had the perfect pet: a rock.

He was joking. But then he stopped joking.

Dahl purchased smooth Rosarita Beach stones, imported from Mexico, from a builder's supply store for about a penny each. He packaged them in cardboard boxes designed to look like pet carriers, complete with air holes and a bed of straw. He wrote a 32-page instruction manual—"The Care and Training of Your Pet Rock"—filled with commands like "sit" and "stay" (at which, the manual noted, rocks excelled) and warnings about teaching your rock to "attack" (which required caution).

Within six months, Gary Dahl had sold over 1.5 million pet rocks at $3.95 each, becoming a millionaire from what was, by any objective measure, the sale of ordinary rocks in fancy packaging.

I mention this not to mock the purchasers—though one is tempted—but to make a point that will become rather important as we proceed: the same cognitive architecture that enables human beings to land spacecraft on distant moons also makes us susceptible to purchasing rocks as pets.

This is not a bug. It's a feature. Sort of.


The Brain That Fooled Itself

The human brain, you see, did not evolve to discern truth. It evolved to survive. These are not the same thing.

Consider our ancestors on the African savanna, some 200,000 years ago. When they heard rustling in the tall grass, they faced a choice: assume it was the wind, or assume it was a lion. Those who assumed "lion" and were wrong suffered only the minor cost of unnecessary vigilance. Those who assumed "wind" and were wrong got eaten.

Over countless generations, natural selection favored brains that erred on the side of seeing patterns, even when patterns weren't there. We are the descendants of the paranoid, the credulous, the ones who saw faces in the clouds and agency in the thunder. The skeptics, by and large, got eaten.

This is why we see faces in electrical outlets and hear meaningful messages when records are played backward. It's why we find it almost impossible to believe that random events are truly random—surely there must be some meaning to the fact that we thought of Aunt Edna just before she called. It's why conspiracy theories spread so easily: they offer patterns where official explanations offer only the unsatisfying messiness of coincidence and incompetence.

And it's why, when a tobacco executive looked into a camera in 1994 and said "I believe that nicotine is not addictive," millions of people found it easier to believe him than to accept the terrifying alternative: that corporations would knowingly sell products that killed their customers.

The alternative required accepting that the world was more dangerous, more corrupt, and more indifferent to human welfare than we wanted to believe. The lie was comforting. The truth was not.

Our brains are not truth-seeking machines. They are survival machines, pattern-seeking machines, comfort-seeking machines. They can be taught to seek truth, but it doesn't come naturally. It requires effort, training, and a willingness to be uncomfortable.

In short: it requires education.


A Brief Digression Concerning the Educational Philosophy of the Prussian Military

If you've ever wondered why schools look the way they do—why children sit in rows, facing forward, moving in unison at the sound of bells—I'm afraid the answer is not terribly inspiring.

In the early 19th century, Prussia (the German state that would eventually unify Germany, start two world wars, and give us both the kindergarten and the goose-step) faced a problem. They had just been rather thoroughly thrashed by Napoleon at the Battle of Jena-Auerstedt in 1806, and they wanted to make sure it never happened again.

Their solution was education reform.

But not education reform in the sense of "let's help people think more clearly." Rather, education reform in the sense of "let's produce soldiers who follow orders without question and workers who show up on time."

The Prussian system—which American educators like Horace Mann would later import to the United States with great enthusiasm—was designed to instill obedience, punctuality, and the ability to perform repetitive tasks on command. Students were sorted by age (like products on an assembly line), trained to respond to bells (like factory workers), and taught to defer to authority (like soldiers).

The system was, by its own standards, a magnificent success. It produced armies that conquered Europe and factories that powered the Industrial Revolution. It also produced, less intentionally, populations that were remarkably susceptible to propaganda, nationalist fervor, and authoritarian leadership.

When people ask why modern education seems so poorly designed to produce critical thinkers, the answer is uncomfortable but simple: it was never designed to produce critical thinkers. It was designed to produce obedient workers and compliant citizens. The fact that it still does exactly what it was designed to do is not a failure. It's a success.

Just not a success for students.


Thomas Midgley Jr. and the Art of Being Impressively Wrong

Let us now consider the remarkable case of Thomas Midgley Jr., a man who may have had more impact on the Earth's atmosphere than any other single organism in the planet's 4.5-billion-year history.

This is not a compliment.

Midgley was a brilliant mechanical engineer and chemist who, in the 1920s, was tasked by General Motors with solving the problem of "engine knock"—the unpleasant pinging sound that early automobile engines made when the fuel mixture ignited prematurely. He tested thousands of compounds before discovering, in December 1921, that tetraethyl lead eliminated knock entirely.

There was just one small problem: lead is horrifically poisonous.

This was not, it should be noted, a secret. The toxicity of lead had been recognized since antiquity; writing in the first century AD, the Greek physician Dioscorides noted that lead "makes the mind give way." The Romans knew their lead pipes were slowly poisoning them but kept using them anyway, in one of history's earliest examples of trading long-term health for short-term convenience. (Some historians have argued, somewhat controversially, that lead poisoning contributed to the fall of Rome itself. The evidence is mixed, but the irony is delicious.)

Midgley knew about lead poisoning. In fact, he experienced it himself while working on tetraethyl lead, and had to take a vacation to "get some fresh air" and recover. His colleagues were not so fortunate. At the DuPont plant in Deepwater, New Jersey—which workers nicknamed the "House of Butterflies" because of the hallucinations lead exposure caused—five workers died in the first two months of production.

Now, here is where it gets interesting.

Did General Motors halt production? Did they investigate alternatives? Did they, at minimum, warn workers of the dangers?

No. They formed a new company called the Ethyl Gasoline Corporation, marketed the product as "Ethyl" (carefully avoiding the word "lead"), and sent Thomas Midgley himself to give a press conference in which he poured tetraethyl lead over his hands and inhaled its fumes for sixty seconds to demonstrate its "safety."

He was, at the time, still recovering from lead poisoning.

The performance worked. The public was reassured. And for the next seventy years, lead was added to gasoline around the world, resulting in what historians now recognize as one of the largest mass poisonings in human history. Blood lead levels in Americans peaked in the 1970s at approximately 15 micrograms per deciliter—a level we now know causes measurable cognitive impairment. The economist Rick Nevin has calculated a disturbing correlation between lead exposure and violent crime rates, with crime waves following lead exposure by about twenty years.

Midgley, meanwhile, went on to develop chlorofluorocarbons (CFCs)—the chemicals that would later be discovered to be destroying the ozone layer. He thus achieved the remarkable distinction of being personally responsible for two of the most environmentally catastrophic inventions of the 20th century.

He died in 1944, strangled by a system of ropes and pulleys he had invented to help him get out of bed after contracting polio. I do not wish to suggest that the universe has a sense of irony, but I am also not in a position to deny it.


Why Smart People Believe Dumb Things

Here is something that may surprise you: intelligence, by itself, does not protect against deception. In some ways, it makes deception easier.

This seems counterintuitive. Surely smart people should be better at spotting lies? Surely higher education should inoculate against manipulation?

Alas, no.

The problem is that intelligence, without the specific training to recognize manipulation, simply makes you better at rationalizing. Smart people don't believe fewer wrong things than less intelligent people. They believe wrong things more cleverly. They are better at constructing elaborate justifications for beliefs they hold for emotional or tribal reasons. They are better at spotting flaws in arguments they disagree with while remaining blind to identical flaws in arguments they favor.

This phenomenon has a name: "motivated reasoning." And it's why some of the most educated people in history have believed some of the most absurd things.

Consider the doctors who, for decades, insisted that ulcers were caused by stress and spicy food, and dismissed the Australian researchers who suggested bacteria might be responsible. The researchers, Barry Marshall and Robin Warren, were correct—and would eventually win the Nobel Prize for their discovery. But for years, they were ignored or mocked by specialists who "knew" that bacteria couldn't survive in the acidic environment of the stomach.

Consider the geologists and geophysicists who ridiculed the idea of continental drift when Alfred Wegener proposed it in 1912, pointing out (quite correctly) that Wegener had no plausible mechanism for how continents could move. They were right that his proposed mechanism was wrong, but they were wrong about the fundamental fact: the continents do move. It took fifty years and the discovery of plate tectonics before the scientific establishment accepted what the evidence had been suggesting all along.

Consider the economists who, right up until 2008, insisted that the housing market was fundamentally sound and that sophisticated financial instruments had distributed risk so efficiently that a systemic crash was essentially impossible. These were brilliant people. They had PhDs from the finest universities. They had Nobel Prizes. They were, as the event revealed, catastrophically wrong.

Intelligence without humility is dangerous. Expertise without intellectual honesty is dangerous. The belief that your education makes you immune to error is the most dangerous belief of all.


The Manufacture of Doubt

In 1953, executives from the major tobacco companies met at the Plaza Hotel in New York to discuss a problem: science was beginning to suggest, rather convincingly, that their products killed people.

Their response was not to make safer products. Their response was not to warn consumers. Their response was to manufacture doubt.

They hired a public relations firm called Hill & Knowlton—the same firm that would later help the Kuwaiti government sell the first Gulf War (1990-1991) to the American public after Iraq's invasion of Kuwait in August 1990.

Hill & Knowlton, like a malevolent Forrest Gump, seems to appear at a remarkable number of pivotal moments in the history of deception. Their advice was simple: don't try to prove cigarettes are safe. That's a losing battle. Instead, emphasize the uncertainty. Fund research that asks questions without answering them. Create scientific controversy where consensus exists.

"Doubt is our product," one tobacco executive would later write in an internal memo, "since it is the best means of competing with the body of fact that exists in the mind of the general public."

The strategy was devastatingly effective. For fifty years, the tobacco industry maintained the pretense of scientific debate on a question that was, scientifically, not debatable. Millions of people who might have quit smoking continued, reassured by the existence of "controversy" that was entirely manufactured.

And here's the truly remarkable part: the strategy was so effective that it became a template.

When the chemical industry faced questions about the safety of DDT, they used the tobacco playbook. When the lead industry faced questions about leaded gasoline, they used the tobacco playbook. When the pharmaceutical industry needed to minimize concerns about opioid addiction, they used the tobacco playbook. When the fossil fuel industry needed to delay action on climate change, they used the tobacco playbook.

In some cases, they used the same consultants.

This is not conspiracy theory. This is documented fact. The internal documents are available. The same names appear across industries. The same strategies recur. The same language—"the science is uncertain," "more research is needed," "we shouldn't rush to judgment"—echoes across decades and industries.

The manufacture of doubt is an industry. It has practitioners, techniques, and a track record. And it works because our brains are not designed to handle it. We evolved to evaluate evidence, not to evaluate manufactured pseudo-evidence designed by professionals specifically to confuse us.


A Thought Experiment

Imagine, for a moment, that you are a physician in 1847. You work at the Vienna General Hospital, and you've noticed something troubling: women giving birth in the ward staffed by medical students die of childbed fever at a rate of about 10%. Women giving birth in the ward staffed by midwives die at a rate of about 4%.

The difference is stark, consistent, and unexplained.

Your colleague, Ignaz Semmelweis, proposes a theory. Medical students, he notes, often come to the maternity ward directly from the autopsy room, where they dissect cadavers. Midwives do not perform autopsies. Perhaps, Semmelweis suggests, the students are carrying something—some "cadaverous material"—on their hands that causes the fever.

He proposes an experiment: require medical students to wash their hands in a chlorine solution before examining patients. The result is dramatic. The mortality rate in the medical student ward drops from 10% to about 1%.

You might think this would be cause for celebration. You would be wrong.

The medical establishment rejected Semmelweis's findings. They rejected them not because of insufficient evidence—the evidence was overwhelming—but because accepting them would require accepting that doctors themselves were killing their patients. The psychological cost of this admission was too high.

Semmelweis was mocked, marginalized, and eventually dismissed from the hospital. He grew increasingly erratic, obsessively writing letters accusing obstetricians of murder. In 1865, he was committed to an asylum, where he died two weeks later, possibly beaten by guards.

Twenty years after his death, germ theory was established, and Semmelweis was vindicated. Handwashing became standard practice. The lives saved are incalculable.

I tell this story not because it's cheerful—it very much is not—but because it illustrates something important about how truth and institutions interact. The truth, by itself, is not enough. Truth requires institutions capable of recognizing it, and institutions are made of people, and people have egos, interests, and an extremely limited tolerance for being told they've been killing their patients.

The same dynamic plays out today. Scientists who discover inconvenient truths—about tobacco, about lead, about climate, about opioids—face the same institutional resistance Semmelweis faced. The details differ; the pattern persists.


The Peculiar Optimism of the Long View

At this point, you might be feeling rather depressed. I know I am.

We have established that the human brain is not designed for truth-seeking, that education systems were designed to produce compliance rather than critical thinking, that industries manufacture doubt as a business strategy, and that even well-intentioned institutions resist inconvenient truths.

It seems rather hopeless, doesn't it?

And yet.

Here's the thing about hopelessness: it's usually wrong.

Consider what we know now that we didn't know a hundred years ago. We know that lead is poisonous, and we've removed it from gasoline and paint. We know that tobacco causes cancer, and smoking rates have plummeted. We know that CFCs destroy the ozone layer, and we banned them, and the ozone layer is recovering. We know that corporations lie, and we've created systems—imperfect, inadequate, but real—to hold them accountable.

The Semmelweis of 1847 died in obscurity. The Semmelweises of today—the whistleblowers, the investigative journalists, the scientists who refuse to be silenced—have tools he never dreamed of. They have the internet, which makes suppressing information much harder. They have FOIA requests, which pry documents from resistant institutions. They have litigation, which forces disclosure through discovery. They have each other, connected across the globe in ways that isolated truth-tellers of the past could not have imagined.

The UCSF Industry Documents Library contains over 14 million pages of internal corporate documents—the smoking guns of countless deceptions, now available to anyone with an internet connection. The Climate Files database contains the evidence of what Exxon knew and when they knew it. The Monsanto Papers revealed the ghostwriting, the manipulation, the systematic corruption of science—and just this month, a key paper was retracted as a result.

These revelations happened not because corporations suddenly became honest, but because people refused to stop asking questions, because journalists refused to stop investigating, because lawyers refused to stop litigating, because scientists refused to stop researching.

The arc of history does not bend toward justice automatically. It bends because people bend it. It bends because individuals—often at great personal cost—insist on telling truths that institutions would prefer to suppress.


What Education Could Be

Let me propose something that might seem radical: education could be different.

Not just better in the sense of "more effective at teaching students to pass tests"—that's rearranging deck chairs on the Titanic. Different in the sense of serving a different purpose entirely.

What if schools taught students how to spot manipulation? What if, alongside mathematics and literature, students learned the history of corporate deception—the tobacco playbook, the lead industry cover-up, the climate denial campaign? What if they emerged from school not just with knowledge, but with pattern recognition—the ability to see the same old tricks dressed up in new clothing?

What if students learned, not as an afterthought but as a central purpose of education, to ask: Who benefits from this claim? What evidence supports it? What would change my mind? What are the interests of the person making this argument?

What if we taught children that doubt—real doubt, the kind that demands evidence rather than the manufactured doubt that serves incumbent interests—is a virtue? That changing your mind in response to new evidence is not weakness but strength? That the most important words in the English language might be "I was wrong"?

What if we taught moral courage? Not just ethics in the abstract, but the specific, practical courage required to speak up when everyone around you is silent? The courage Semmelweis needed and found? The courage that whistleblowers summon when they sacrifice their careers to expose wrongdoing?

This is not utopian fantasy. It's a design choice.

We chose to build education systems that prioritize compliance. We can choose to build education systems that prioritize critical thinking. The fact that powerful interests prefer compliant citizens does not mean those interests must prevail.


A Final Thought, Concerning Rocks

Let us return, for a moment, to the pet rock.

I confess that when I first learned about Gary Dahl and his geological pets, my reaction was mockery. How could anyone be foolish enough to buy a rock in a box?

But the more I thought about it, the more my mockery seemed misplaced.

The people who bought pet rocks were not fools. They were people who wanted a bit of absurdity in their lives, a bit of humor, a small rebellion against the dreary seriousness of adult existence. They were in on the joke. They were buying, not a rock, but a laugh—and by that standard, $3.95 was a reasonable price.

The real fools are not the people who bought pet rocks knowing they were buying a gag gift. The real fools are the people who buy lies sold as truths—who accept corporate assurances about product safety without skepticism, who believe manufactured doubt over scientific consensus, who mistake confident assertion for actual evidence.

The pet rock buyers knew they were being silly. They were in on the joke.

The question is: Are we?

When a pharmaceutical company tells us their opioid is safe, are we in on the joke? When an oil company tells us the science is uncertain, are we in on the joke? When a chemical company tells us their forever chemicals are as safe as table salt, are we in on the joke?

Or are we the marks?

The difference between buying a pet rock and buying a corporate lie is that the pet rock seller never claimed to be anything other than what he was. He was selling absurdity, and everyone knew it. The corporation, by contrast, presents itself as a trusted authority, wraps its lies in the language of science, and denies deceiving you even as it picks your pocket.

One transaction is honest silliness. The other is sophisticated fraud.

Education—real education, the kind that teaches you to think rather than what to think—is what allows you to tell the difference.


Conclusion: The View From Here

We have, in this text, ranged rather widely. We have visited the African savanna and the Vienna General Hospital. We have met Thomas Midgley Jr. and Ignaz Semmelweis. We have contemplated pet rocks and manufactured doubt. We have considered why smart people believe dumb things and why institutions resist inconvenient truths.

Yet, there is a thread connecting these disparate topics, and it is this: the human capacity for self-deception is matched only by the human capacity for self-correction.

We are the species that invented lead gasoline and the species that banned it. We are the species that created the tobacco industry and the species that exposed it. We are the species that builds systems of deception and the species that tears them down.

Which of these capacities prevails is not predetermined. It depends on choices—on whether we build educational systems that cultivate critical thinking, on whether we create institutions that reward truth-telling, on whether we summon the courage to ask uncomfortable questions and accept uncomfortable answers.

The view from here, I confess, is ambiguous. We can see both the scale of the deception and the possibility of its exposure. We can see both the forces arrayed against truth and the stubborn persistence of those who insist on telling it.

What we cannot see—what no one can see—is how the story ends. That depends on us.

But I will say this: the very fact that you are reading these words is evidence for hope. You have chosen to learn about the mechanisms of deception, which means you are less likely to fall for them. You have chosen to engage with uncomfortable truths, which means you are practicing the very skills that make manipulation harder.

A pet rock cannot read. It cannot think. It cannot question. It is exactly what it appears to be: an inert lump of mineral, doing nothing, changing nothing, learning nothing.

You are not a pet rock.

You are a human being, heir to 86 billion neurons and 200,000 years of survival and the accumulated knowledge of countless generations who asked questions, sought answers, and passed what they learned to those who came after.

Use it well.
