The First Principle: An Introduction to Not Being a Stupid Jackass

 “The first principle is that you must not fool yourself, and you are the easiest person to fool.” — Richard P. Feynman, Nobel Prize-winning physicist

We all like to think that we are rational actors, but I would like to invite you to consider the possibility that you are not as rational as you think you are. Now, if you are willing to entertain this possibility, to exist (if only for a moment) in this uncertainty, I promise you that this article will dramatically enhance your reasoning abilities, your capacity for impartial observation and experimentation, and your overall critical-thinking skills. So, without further ado, if you are interested in becoming a more reasonable person, open your mind, and read on.

We will begin with the brilliant conclusion of “Unafraid of the Dark” — the 13th and final episode in season one of Cosmos: A Spacetime Odyssey (2014) — wherein Neil deGrasse Tyson lays out five rules for discoverers:

1) Question authority. No idea is true just because someone says so, including me.

2) Think for yourself.

3) Question yourself. Don’t believe anything just because you want to; believing something doesn’t make it so.

4) Test ideas by the evidence gained from observation and experiment. If a favorite idea fails a well-designed test, it’s wrong. Get over it. Follow the evidence, wherever it leads. If you have no evidence, reserve judgment.

5) And perhaps the most important rule of all: remember, you could be wrong. Even the best scientists have been wrong about some things. Newton, Einstein, and every other great scientist in history: they all made mistakes. Of course they did; they’re human. Science is a way to keep from fooling ourselves and each other.

Cosmos: A Spacetime Odyssey is an altogether incredible series, well worth watching in its entirety. It was nominated for 12 Emmys and is based on Carl Sagan’s Cosmos: A Personal Voyage (1980). This stellar science documentary was written by Ann Druyan and Steven Soter, directed by Brannon Braga, Bill Pope, and Ann Druyan, and presented by Neil deGrasse Tyson, with music composed by Alan Silvestri. Seth MacFarlane, the beloved creator of Family Guy, was instrumental in bringing Cosmos: A Spacetime Odyssey to broadcast television, in addition to serving as an executive producer.

Logical Fallacies and A Demon Haunted World

“Trial, Error and the God Complex”

In his stellar presentation at TEDGlobal 2011, “Trial, Error and the God Complex”, economist Tim Harford urges listeners to recognize the unfathomable complexity of our world.

Harford starts his talk with a story about Archie Cochrane, a doctor and prisoner of war during World War II. Cochrane and the men under his care were suffering from a debilitating condition, the likes of which Cochrane had never encountered, that caused a horrible swelling of fluid under the skin. In spite of dreadful conditions, Cochrane managed to find a cure: he procured some Marmite (a rich source of vitamin B12), split his men into two equal groups, and carefully noted the effects of Marmite consumption. Upon presenting his results to their German captors, Cochrane provided incontrovertible evidence of malnutrition, and the Germans realized that failing to provide vitamin B12 would be a war crime.

Harford goes on to say:

I’m telling you this story because Archie Cochrane, all his life, fought against a terrible affliction, and he realized it was debilitating to individuals and it was corrosive to societies. And he had a name for it. He called it the God complex. Now I can describe the symptoms of the God complex very, very easily. So the symptoms of the complex are, no matter how complicated the problem, you have an absolutely overwhelming belief that you are infallibly right in your solution…

I see the God complex around me all the time in my fellow economists. I see it in our business leaders. I see it in the politicians we vote for — people who, in the face of an incredibly complicated world, are nevertheless absolutely convinced that they understand the way that the world works.

Harford then turns to the design of the high-pressure nozzles Unilever uses to spray liquid detergent in its factories. When Unilever first attempted to design a working nozzle, they found themselves a “little God”: a mathematician/physicist who understood fluid dynamics. Unfortunately, the problem was too complicated, and Unilever’s “little God” failed to create a working nozzle.

So, Harford explains, calling upon the work of geneticist Professor Steve Jones, Unilever employed a systematic trial, error, variation and selection method. The method worked brilliantly, and “after 45 generations” Unilever managed to create an incredible working nozzle. “The moment you step back from the God complex — let’s just try to have a bunch of stuff; let’s have a systematic way of determining what’s working and what’s not — you can solve your problem.”
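The variation-and-selection method Harford describes can be sketched in a few lines of code. The snippet below is a toy illustration, not Unilever's actual process: the fitness function and the parameters (10 variants per generation, a two-dimensional "design") are invented for demonstration; only the loop structure — mutate the current best, test the variants, keep the winner, repeat for 45 generations — mirrors the talk.

```python
import random

def evolve(fitness, initial, generations=45, variants=10, step=0.1):
    """Toy variation-and-selection loop: mutate the current best design,
    keep whichever candidate scores highest, and repeat."""
    best = initial
    for _ in range(generations):
        # The current best is always kept as a candidate, so fitness never drops.
        candidates = [best] + [
            [x + random.uniform(-step, step) for x in best]
            for _ in range(variants)
        ]
        best = max(candidates, key=fitness)
    return best

# Invented "nozzle" fitness: peaks when both design parameters equal 1.0.
def fitness(design):
    return -sum((x - 1.0) ** 2 for x in design)

best = evolve(fitness, initial=[0.0, 0.0])
print(best)  # after 45 generations, drifts toward [1.0, 1.0]
```

No "little God" insight into the fitness landscape is needed — the loop only ever asks "which of these variants works best?", which is exactly Harford's point.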

Next Harford confesses that, for as long as he has been giving this talk, “People sometimes say to me… ‘Obviously trial and error is very important. Obviously experimentation is very important. Now why are you just wandering around saying this obvious thing?’”

In response, Harford explains:

You think it’s obvious? I will admit it’s obvious when schools start teaching children that there are some problems that don’t have a correct answer… And if you can’t find the answers, you must be lazy or stupid… When a politician stands up campaigning for elected office and says, ‘I want to fix our health system. I want to fix our education system. I have no idea how to do it. I have half a dozen ideas. We’re going to test them out. They’ll probably all fail. Then we’ll test some other ideas out. We’ll find some that work. We’ll build on those. We’ll get rid of the ones that don’t.’ — When a politician campaigns on that platform, and more importantly, when voters like you and me are willing to vote for that kind of politician, then I will admit that it is obvious that trial and error works…

Until then, I’m going to keep banging on about trial and error and why we should abandon the God complex. Because it’s so hard to admit our own fallibility. It’s so uncomfortable.

In conclusion, Harford talks about being “haunted by something a Japanese mathematician said on the subject [of trial, error and the God complex].” He briefly relates the story of Yutaka Taniyama, a young mathematician living in post-WWII Japan, who developed the Taniyama-Shimura Conjecture (in association with Goro Shimura). This conjecture turned out to be instrumental in proving Fermat’s Last Theorem. “In fact… it’s equivalent to proving Fermat’s Last Theorem. You prove one, you prove the other.”

Sadly, in Taniyama’s lifetime, it remained conjecture, and, despite his best efforts, he could never prove it was true. Then, shortly before his 30th birthday in 1958, Taniyama killed himself. Decades later, Shimura, Taniyama’s friend and colleague, reflected on Taniyama’s life. “He was not a very careful person as a mathematician. He made a lot of mistakes. But he made mistakes in a good direction. I tried to emulate him, but I realized it is very difficult to make good mistakes.”

Harford’s talk is brilliant in its entirety and well worth your time.

Spotting Bad Science 101

“Nothing is more irredeemably irrelevant than bad science.” — John Polanyi, the Hungarian-Canadian chemist who won the 1986 Nobel Prize in Chemistry for his research in chemical kinetics

This section consists primarily of passages and concepts from “Spotting Bad Science 101: How Not to Trick Yourself” — an appendix in The 4-Hour Body: An Uncommon Guide to Rapid Fat-Loss, Incredible Sex, and Becoming Superhuman by Timothy Ferriss. In its entirety, The 4-Hour Body is an excellent, unconventional guide to radical self-transformation; I would wholeheartedly recommend it to anyone trying to become a healthier, more well-rounded person.

Ferriss begins by assuring us that “science isn’t arbitrary… You just need to learn a few simple concepts to separate truth (or probable truth) from complete fiction.” Next he emphasizes the importance of self-reliance — a common theme throughout The 4-Hour Body — and reminds us that it is dangerous to rely too heavily on doctors to solve our problems for us. “After reading the next eight pages, you will know more about research studies than the average MD.”

After his brief introduction, Ferriss gets into the five “tools most often used [by media or propagandists with agendas] to exaggerate and brainwash.” He refers to these as “The Big Five.” Each of “The Big Five” tools is phrased “as a question you should ask yourself when looking at diet advice or the ‘latest research.’”

1) “Is a relative change (like percentages) being used to convince?”

To illustrate this concept, Ferriss lays out two potential news headlines. First, “Studies Show People Who Avoid Saturated Fat Live Longer.” He then explains that you should “find out exactly what ‘longer’ means” before making the decision to start avoiding saturated fat.

Based on available data, it turns out that reducing your saturated fat intake to 10% of daily calories for your entire adult life would add only 3–30 days to your lifespan.

The second potential headline is, “People Who Drink Coffee Lose 20% More Fat Than Those Who Don’t.” Ferriss explains that, “Relative [in this case, the 20% more] isn’t enough. It’s critical to ask what the absolute increase or decrease was — in this case, how many pounds of fat did both groups actually lose?”

If it were 0.25 pounds lost for the control group and 0.30 pounds (20% more) for the coffee group over eight weeks at three cups per day, is picking up the coffee habit worth the side effects of high-dose caffeine? Nope… Distrust percentages in isolation.
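Ferriss's point reduces to simple arithmetic. A short sketch, using the numbers from his coffee example, shows how a trivial absolute difference can produce a headline-friendly relative one:

```python
# Pounds of fat lost over eight weeks (figures from Ferriss's example).
control_loss = 0.25  # control group
coffee_loss = 0.30   # coffee group

absolute_change = coffee_loss - control_loss
relative_change = absolute_change / control_loss

print(f"Absolute: {absolute_change:.2f} lb")  # 0.05 lb -- barely measurable
print(f"Relative: {relative_change:.0%}")     # 20% -- the headline number
</antml_block>```

The same 0.05-pound difference would read as "20% more fat loss" in a press release; always ask for the absolute figures.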

2) “Is this an observational study claiming to show cause and effect?”

Ferriss places special emphasis on this concept: “This is the mother lode. If you learn just one concept in this chapter, learn this one.” At this point you might be asking yourself, “What the hell is an observational study?”

Observational studies, also referred to as uncontrolled experiments, look at different groups or populations outside the lab and compare the occurrence of specific phenomena, usually diseases. One example is the often misinterpreted ‘China study.’

He goes on to deliver “the most important paragraph in this chapter”:

Observational studies cannot control or even document all of the variables involved. Observational studies can only show correlation: A and B both exist at the same time in one group. They cannot show cause and effect.

In contrast, Ferriss explains, “randomized and controlled experiments control variables and can therefore show cause and effect (causation): A causes B to happen.”

In an effort to further illustrate the importance of understanding the difference between correlation and causation, Ferriss calls on Pastafarianism — the satirical religion whose deity is the Flying Spaghetti Monster — to purposely confuse correlation and causation:

With a decrease in the number of pirates, there has been an increase in global warming over the same period. Therefore, global warming is caused by pirates… [And, even more compelling:] Somalia has the highest number of Pirates AND the lowest Carbon emissions of any country. Coincidence?

Drawing unwarranted cause-and-effect conclusions from observational studies is the bread-and-butter of media and cause- or financially-driven scientists blind to their own lack of ethics… Don’t fall for Pastafarianism in science.

Wrapping up this concept, Ferriss writes:

Observational studies are valuable for developing hypotheses (educated guesses that can then be tested in controlled settings) but they cannot and should not be used to show cause and effect. To do so is both irresponsible and potentially dangerous.
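The pirates example makes the trap concrete: any two quantities that merely trend in opposite directions over time will show a strong correlation, causal link or not. The snippet below uses invented yearly figures (chosen purely to mimic the pirates-versus-temperature joke) and computes the Pearson correlation coefficient by hand:

```python
# Invented yearly figures, purely to illustrate the point:
# pirate counts fall over time while global temperatures rise.
pirates = [35000, 45000, 20000, 15000, 5000, 400, 17]
temps_c = [14.10, 14.15, 14.25, 14.35, 14.45, 14.55, 14.65]

def pearson(xs, ys):
    """Pearson correlation coefficient: covariance over the
    product of standard deviations, ranging from -1 to 1."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(pirates, temps_c)
print(f"r = {r:.2f}")  # strongly negative -- yet pirates don't cool the planet
```

A correlation this strong would be front-page material in a press release, which is exactly why Ferriss insists that observational data alone can never establish causation.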

3) “Does this study depend on self-reporting or surveys?”

To the greatest extent possible, avoid studies that depend on after-the-fact self-reporting. Trust your own data. Just record it when things happen.

4) “Is this diet study claiming a control group?”

Desirable as it may be, it is almost impossible to change just one macronutrient variable (protein, carbohydrate, fat) in a diet study. It is therefore almost impossible to create a control group.

If a researcher makes such a claim and vilifies a single macronutrient, your skeptical spider sense should tingle.

Ferriss explains the advantages offered by self-experimentation in terms of establishing a “control”:

The control is everything you’ve tried up to a certain point that hasn’t produced a desired effect. Isolating one variable is often less important than the sum impact of a group of changes. In other words, has your bodyfat percentage gone up or down in the last two weeks of replacing diet A with diet B? If you weren’t losing fat on A and now you are, A was your control.

In an ideal (but unattractive) test, you would go back to A and see if bodyfat then moves in the other direction. Then repeat the switch again. This would minimize the possibility that the first change in bodyfat just happened to coincide with the change in diet to B.

Alas, this switching would also maximize your likelihood of going insane. If something seems to be working, just stick with it.

5) “Do the funders of the study have a vested interest in a certain outcome?”

Beware of unholy unions between scientists and funding sources.

Dr. Ben Goldacre

Dr. Ben Goldacre unpicks dodgy scientific claims made by scaremongering journalists, dubious government reports, pharmaceutical corporations, PR companies and quacks. He was trained in medicine at Oxford and London, and is the author of Bad Science (2009), Bad Pharma: How Drug Companies Mislead Doctors and Harm Patients (2013), and I Think You’ll Find it’s a Bit More Complicated Than That (2014).

Dr. Goldacre takes the stage at TEDGlobal 2011, delivering a brilliant talk wherein he shows us (at high speed) the ways evidence can be distorted, from blindingly obvious nutrition claims to the subtle tricks of the pharmaceutical industry.

Dr. Goldacre makes an appearance at TEDMED 2012, delivering another brilliant talk wherein he explains the potentially dangerous and misleading consequences of unpublished trials. When new drugs are tested, the results of the trials should be published for the rest of the medical world; yet, much of the time, negative and inconclusive findings go unreported, leaving doctors and researchers in the dark.

Dr. Goldacre also teamed up with Collins to create some resources for teachers: Teaching Science with Bad Science: Resources for Teachers.

Here Be Dragons and Cognitive Biases

Most people accept the paranormal and pseudo-scientific claims promoted by the mass media without critique. Here Be Dragons offers a toolbox for recognizing and understanding the dangers of pseudoscience, and an appreciation for the reality-based benefits offered by real science.

Here Be Dragons is written and presented by Brian Dunning, host and producer of the Skeptoid podcast, author of Skeptoid: Critical Analysis of Pop Phenomena, and Executive Producer of The Skeptologists and Truth Hurts.

Wikipedia's List of Cognitive Biases

Nassim Taleb’s 4-Volume Incerto Series

The ethical imperative driving Taleb’s Incerto series is: “If you see fraud and don’t shout fraud, you are a fraud.”

In Taleb’s own words, the Incerto series is:

[An] investigation of opacity, luck, uncertainty, probability, human error, risk, and decision making when we don’t understand the world, expressed in the form of a personal essay with autobiographical sections, stories, parables, and philosophical, historical, and scientific discussions in non-overlapping volumes that can be accessed in any order.

Additional Resources

The Work of David McRaney

Additional-additional Resources

This is not the end…

“They say it’s the last song. They don’t know us, you see. It’s only the last song if we let it be.” — Dancer in the Dark (2000)