Lies, Damned Lies, And 'Proofiness'

Sen. Joseph McCarthy famously brandished a list of 205 known Communists in the State Department. Or was it 207? Or 57? Or maybe 81? No one really knew, mostly because McCarthy himself didn’t know. He made those numbers up, because he knew that numbers have power. And those 205 nonexistent Communists made Tail Gunner Joe into a household name.

Author Charles Seife has written a new book -- Proofiness: The Dark Arts of Mathematical Deception -- about the dirty tricks people can play with numbers. Seife tells NPR’s Mike Pesca that McCarthy’s list of Communists is a good example of proofiness.

“Numbers communicate how far you can trust them,” he says, “and a nice round number is signaling, well, you know, I’m in the area, that it’s not a very precise number.”

But an exact number, like McCarthy’s 205 Communists, is easier to believe.

“People know that that means exactness, that it has some real basis in reality, and it’s to be trusted,” Seife says.

But trusting a number too much can be dangerous, the author says. It’s a phenomenon he calls “disestimation,” and it happens when people take a number far too seriously.

Seife recalls the story of a docent at the American Museum of Natural History in New York, who gave the age of a dinosaur as 65 million and 38 years.

“The guide says, well, when I started at this museum 38 years ago, a scientist told me it was 65 million years old. Therefore, now, it’s 65 million and 38.” Seife says the docent was placing far too much value on the 65 million figure, “when in fact, the error involved in measuring the dinosaur was plus or minus a hundred thousand years. The 38 years is nothing.”
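
To see the arithmetic behind that point, here is a minimal sketch (ours, not Seife’s) of how the 38-year “update” disappears inside the measurement’s error bar:

    # Illustrative only: the numbers are the ones quoted above.
    age = 65_000_000       # reported age of the dinosaur, in years
    error_bar = 100_000    # measurement uncertainty, plus or minus
    elapsed = 38           # years since the docent learned the age

    updated = age + elapsed
    print(f"{updated:,} years, plus or minus {error_bar:,}")
    # -> 65,000,038 years, plus or minus 100,000

    # The correction is a vanishing fraction of the uncertainty:
    print(f"{elapsed / error_bar:.3%}")   # -> 0.038%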

And speaking of error, Seife has some choice words about opinion polls and the way they’re reported.

“When journalists report polls, they just don’t know that they shouldn’t take these results literally,” he explains.

Seife says the margin of error you see attached to a poll measures only one specific kind of error: the statistical error that comes from sampling too few people.
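
That sampling error follows a standard textbook formula. As a rough sketch (ours, not from the book), here is how the quoted margin is typically computed for a simple random sample, and why it reflects sample size and nothing else:

    import math

    def sampling_margin_of_error(n, p=0.5, z=1.96):
        # Textbook 95% margin of error for a simple random sample of
        # n respondents, with p the observed proportion. This is the
        # only error the quoted margin reflects; biased questions,
        # bad sampling frames, and lying respondents are invisible
        # to it.
        return z * math.sqrt(p * (1 - p) / n)

    # A typical national poll of 1,000 people:
    print(f"{sampling_margin_of_error(1000):.1%}")   # -> 3.1%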

“But in fact, when polls go wrong,” he says, “it’s due to a completely different type of error, called a systematic error.”

That means the poll hasn’t been set up correctly or the questions are misleading, or simply that people answering the poll are lying -- which Seife says happens quite frequently. “So when journalists report polls, most of which aren’t worth the paper they’re written on, I think they’re kind of innocently performing an act of proofiness.”

Copyright 2023 NPR. To see more, visit https://www.npr.org.