A Field Guide to Lies: Critical Thinking in the Information Age

  • By Daniel J. Levitin
  • Dutton
  • 304 pp.
  • Reviewed by Danielle LaVaque-Manty
  • October 20, 2016

An entertaining, user-friendly primer on evaluating data wisely.

“Statistics,” as Daniel J. Levitin says in A Field Guide to Lies: Critical Thinking in the Information Age, “because they are numbers, can appear to us to be cold hard facts.” And this can lead us to accept them unquestioningly. But questions are crucial: Who counted what, and why? What kind of language do they use to explain what their numbers mean?

We can’t take claims about data at face value, Levitin warns, because people sometimes mislead us, whether deliberately or because they don’t understand the numbers they’re presenting. Levitin, dean of social sciences at the Minerva Schools at the Keck Graduate Institute in San Francisco and a professor of psychology at McGill University, offers tools anyone can use to evaluate the information we encounter in the news, at work, and online.

Don’t let Levitin’s academic background put you off. In addition to A Field Guide to Lies, he has written three international bestsellers: The Organized Mind, This Is Your Brain on Music, and The World in Six Songs. His style is conversational and often funny. (As an example of how averages aren’t always useful, he points out that “on average, humans have one testicle.”)

The book has three parts: “Evaluating Numbers,” “Evaluating Words,” and “Evaluating the World.” Each has its own focus — data, language, or logic — but all three emphasize the importance of asking questions, seeking alternative explanations, and recognizing our own cognitive weak spots. (We are easily misled by visual patterns, for instance, and most of us have a hard time calculating probabilities in our heads.)

Part one explains basic concepts needed for interpreting statistical information, such as the difference between mean, median, and mode when discussing “averages,” and why an apparent correlation between two phenomena doesn’t mean one causes the other. It also explains common biases that can result from faulty data collection, and ways that visual presentations can hide trends while seeming to reveal them.
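
To see why the choice of “average” matters, consider a small, invented salary sample; the sketch below (with figures that are illustrative, not Levitin’s) shows how the three measures can tell very different stories about the same data.

```python
# Three kinds of "average" telling different stories about the same data.
from statistics import mean, median, mode

# Hypothetical annual salaries at a small firm, in thousands of dollars;
# one executive salary is enough to skew the mean.
salaries = [35, 38, 38, 42, 45, 250]

print(mean(salaries))    # ~74.7 -- dragged upward by the outlier
print(median(salaries))  # 40.0  -- the middle value
print(mode(salaries))    # 38    -- the most common value
```

A firm could truthfully advertise an “average” salary of nearly $75,000 while most of its employees earn about half that, exactly the sort of slipperiness Levitin teaches readers to spot.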

Perhaps most helpful of all, part one offers a clear and concise introduction to probability in general and conditional probability in particular. Conditional probability is the likelihood that something is true given that something else is also true — the likelihood that a woman whose mammogram says she has breast cancer really does have it, for example. Levitin demonstrates how to calculate this with pencil and paper using a hand-drawn two-by-two grid. It doesn’t require mathematical skills beyond the ability to add, subtract, multiply, and divide.
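
As a rough sketch of how that grid reasoning works (with invented figures, not the book’s: a 1 percent base rate, a test that catches 90 percent of real cases and falsely flags 9 percent of healthy patients), the same arithmetic looks like this in code:

```python
# A sketch of the two-by-two-grid reasoning Levitin describes, using
# invented figures (assumptions for illustration, not from the book).
def prob_cancer_given_positive(base_rate, sensitivity, false_positive_rate,
                               population=10_000):
    """Fill a 2x2 grid of counts, then read off P(cancer | positive test)."""
    has_cancer = population * base_rate    # one column of the grid
    no_cancer = population - has_cancer    # the other column

    true_positives = has_cancer * sensitivity           # sick and flagged
    false_positives = no_cancer * false_positive_rate   # healthy but flagged

    # Of everyone the test flags, what fraction is actually sick?
    return true_positives / (true_positives + false_positives)

# Assumed: 1% of those screened have cancer; the test catches 90% of real
# cases and wrongly flags 9% of healthy patients.
print(prob_cancer_given_positive(0.01, 0.90, 0.09))  # ~0.092, about 9%
```

The counterintuitive result the grid makes visible is that even a reasonably accurate test yields mostly false positives when the condition it screens for is rare.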

Part two might as easily have been called “Evaluating Explanations and Explainers.” It describes how to assess “experts” and how to account for the fact that large numbers of people can come to believe things that simply aren’t true. “Humans are a storytelling species,” Levitin says, “easily seduced by a good tale,” especially when that tale includes “facts” that sound authoritative, like the claim that the Twin Towers wouldn’t have collapsed vertically in the 9/11 attack.

Those of us who know nothing about structural engineering might assume that anyone making such a claim knows more about building collapses than we do. “But a little bit of checking reveals that structural engineers have found nothing mysterious about the collapse of the towers.” Levitin’s advice about how to decide between competing perspectives connects directly back to what the reader learns in part one: “The difference between a false theory and a true theory is one of probability.”

Part three provides an introduction to logic, common logical fallacies, and the relationship between logic and science. It describes the critical thinking that led one young physician to discover that having doctors wash their hands before touching patients could save many lives, and outlines the “pitfalls in reasoning” that lead people to believe that vaccines cause autism. It then works through four case studies — an instance of medical decision-making, a historical mystery, a world record that may have been the result of a magician’s trickery, and a scientific possibility from the world of physics — to illustrate how to engage in systematic analysis.

Thinking critically takes time; we can’t evaluate every bit of information that comes our way. As Levitin notes, social media has generated a deluge: “We’ve created more human-made information in the last five years than in all of human history before them.”

But some wrong beliefs are costlier than others, and readers who want to avoid making reckless choices in high-stakes situations, whether at the doctor’s office or in the jury box, will appreciate having such a user-friendly guide to sorting things out.

Danielle LaVaque-Manty is a freelance editor and fiction writer living in Ann Arbor, Michigan. She is coauthor of Writing in Political Science: A Brief Guide (2016), and coeditor of Using Reflection and Metacognition to Improve Student Learning: Across the Disciplines, Across the Academy (2013) and Transforming Science and Engineering: Advancing Academic Women (2007).
