October 11, 2016 false positive
(1980s | doctorese | “bad diagnosis”)
An example of an older expression that has grown common and become less specialized (other examples: “blowback,” “grounded,” “politically correct,” “template”). In medicine, “false positive” goes back at least to the forties, probably earlier; for some reason, the only results in Google Books from those days have to do with the Wassermann test for syphilis. In the seventies, the phrase got a boost from the popularity of home pregnancy tests. In the eighties, it was employee drug testing. Both developments got plenty of press, so use of the phrase grew sharply, and as it spread it began to turn up outside strictly medical contexts. Now it can apply to virus or spam detection, security systems, internet search results, or even economic forecasting or earthquake warnings. The last two are notable because they involve not results but predictions, which adds a new twist. You said there would be a recession and it doesn’t materialize, as opposed to you said there was cancer and there was no cancer there. Another example from the scientific community: “A false positive is a claim that an effect exists when in actuality it doesn’t,” that is, detecting a correlation that exists only because of your misinterpretation of the data. All these meanings rely on presumably preventable misreadings of an empirical result: incorrectly assigning too broad a significance to a single symptom, or maybe just running the test wrong.
False positives are a big problem; they can creep into the work of the most careful scientists. Medical tests that show a disease that isn’t really present can result in unnecessary or dangerous treatment, and all the expense that goes with it. The effect is subtler in empirical science, but pressure to obtain statistically significant results can skew the perspectives even of conscientious experimenters. (This article explains how it happens.) Such errors are dangerous because it’s worse to be sure of something that isn’t true than to fail to know something that is. As a great American philosopher, possibly Josh Billings or maybe Will Rogers, said, “It ain’t what people don’t know that’s the problem; it’s what they know that ain’t so.”
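The scale of the problem is easy to see with a little arithmetic: when a condition is rare, even a fairly accurate test will hand out more false positives than true ones. A minimal sketch in Python (the prevalence and accuracy figures are made-up round numbers for illustration, not data from any real test):

```python
# Hypothetical round numbers, chosen only for illustration.
prevalence = 0.01           # 1% of people actually have the condition
sensitivity = 0.95          # P(test positive | condition present)
false_positive_rate = 0.05  # P(test positive | condition absent)

population = 100_000
sick = population * prevalence        # 1,000 people
healthy = population - sick           # 99,000 people

true_positives = sick * sensitivity              # 950
false_positives = healthy * false_positive_rate  # 4,950

# Of everyone who tests positive, most do NOT have the condition.
share_truly_sick = true_positives / (true_positives + false_positives)
print(f"{share_truly_sick:.0%} of positive results are real")
```

Under these assumed numbers, only about one positive in six reflects actual disease, which is why a “positive” result so often triggers the unnecessary treatment and expense mentioned above.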
The expression was well settled by 1980, but only in medical contexts. (“False negative” is just as old.) When it turned up in general-interest articles, it often came packaged in quotation marks. It had not yet become a regulation noun; in those days it was still normally a compound adjective, applied to readings, results, reactions, responses, rates. Now it is more common as a noun than as an adjective.
I’m sure I wasn’t the first or last kid to stumble over the counterintuitive meaning of “positive” in medicine. I thought “the test came back positive” was good news, whereupon my hard-working parents (I kept ’em hopping) had to explain that the word you wanted to hear was “negative.” Doctors test for the presence of a disease or condition, and a positive result means they’ve found it, and you’re stuck with an undesirable disorder. It’s the only zone in everyday language in which “positive” means “negative,” I do believe. (It reminds me of middle-aged parents in the seventies cheerily reminding each other that “bad” meant “good.”) We must ever observe the instructions in the song and accentuate the positive, but not in the lab, please!