
Lex maniac

Investigating changes in American English vocabulary over the last 50 years

Tag Archives: science


vegan

(1980’s | “non-dairy vegetarian”)

There is such a firm on-line consensus about when, where, and by whom the word “vegan” was invented that I’m inclined to believe it, though I might not if the OED didn’t back it up. No more suspense: In 1944, an Englishman named Donald Watson and a small group of like-minded “non-dairy vegetarians” founded a group to promote their way of life. The story goes that the word is an abridgement of “veg-etari-an” that evolved from group deliberations among Watson and his circle as they searched for a simple, memorable way of referring to themselves. They did not invent veganism, of course; many religious movements and illustrious individuals had attached themselves to it over the centuries. But they did change its course.

It takes a while for tiny groups with unpopular ideas to make headway, but it happens more often than you think. Outside of scientific journals, “vegan” seems to have occurred rarely before 1975, and infrequently until a decade or so after that. One supposes, without having looked for any evidence, that a critical mass had built by then within the culture — that is, enough people practiced veganism and were willing, nay, determined to talk about it — and therefore the existing word began to take up room in our common vocabulary. Well, it might be true, but how would you prove it?

Note on pronunciation: Apparently Watson preferred “veegan” from the beginning; dictionaries printed as late as 1990 gave “vejjan” as an alternative. (“Vaygan,” as in a being from the star Vega, and “veggan,” off-rhyming with legging, seem not to have been considered.) “Veegan” has definitely become standard; I don’t recall ever hearing it pronounced any other way. (I think I first encountered the word around 1990, probably in print.)

The roots of Watson’s veganism lay in an abhorrence of animal cruelty; it stems from anti-vivisectionism. Vegans despised the exploitation of animals and the violence that went with it. That is still true, but I sense that the case for veganism has come to rest more on nutritional and environmental grounds. Raising plants for food is much more efficient than raising animals. (There the argument can be made in terms of going beyond sparing animals to sparing the earth.) Nutritional justifications have had a harder time — vegans have had to combat the perception that their diet leads to various deficiencies, most of which can be corrected with supplements. But in comparison with the effects of meat-eating, veganism doesn’t look so bad.

Vegans must also reckon with our species’ prehistoric domestication of animals, and millennia of hunting before that — we’ve always killed and eaten animals, so why should we stop now? To which the vegan replies, there are many, many ancient practices that civilized people don’t perform any more, and killing our fellow animals for their products, edible or otherwise, ought to be on that list. As the earth continues to groan under us, it’s getting harder to deny that at least some forms of domestication will not be sustainable much longer. Just don’t make me live without potatoes fried in peanut oil.

The cruelty argument, powerful though it is, collapses if we discover that plants are conscious, feel pain, etc. It may be that we cannot feed ourselves without viciously exploiting one or another sentient product of the earth. I suppose we could try eating each other (wait, that sounds like a movie). If we devour ourselves like the cats of Kilkenny, that will save the planet, won’t it?


the science

(2000’s? | “the (scientific) consensus,” “(the best) scientific evidence,” “the latest studies”)

Trust the science. Follow the science. Believe in the science. Government policy will be determined by the science. (“Data” gets the same treatment; you must do what the data tell you. The article is less obtrusive in front of “data” than “science.”) The prescription has drawbacks, most notably that science can’t make up its mind right away and will issue conflicting decisions and rules as the evidence continues to roll in. This lack of certitude does create problems, which scientists themselves may exacerbate by showing certainty before it is warranted or just by talking down to the rest of us. Such problems are not permanent, however; one indication of good medical research is that it gets both more accurate and more sure of itself over time, leading to more effective diagnosis and treatment. Besides, given the complex and uncertain world we live in, the power to adapt to new information ought to inspire confidence rather than undermine it.

One trick of the definite article is that it suggests that science says only one thing, so that it can be counted on for unambiguous guidance. We have all encountered exceptions, but in the case of the coronavirus that has been largely true, I think. Dissension does arise within the scientific ranks; for the most part it is resolved as more tests are run and more results produced.

Of course it has always been possible to plop down a definite article before “science.” But it was almost always followed by something further — the science center, the science headlines, the science of . . . . But science solus has been lumbered constantly with the article during the pandemic, as doctors and public officials implore us to heed infectious-disease specialists. “The science” has become a mantra of sorts, asking us to accept medical research as a reliable source of knowledge that offers maximum protection from a weird and frightening virus. Not everyone wants to listen, of course, and COVID has confounded the experts from time to time, eroding their claim to be the most trustworthy voice.

The plea to “trust the science” is a quasi-religious gesture; we are enjoined to hope that scientists have our best interests at heart and will perform competently. That’s a watered-down version of what Jews, Christians, and Muslims believe about God. Most of us do not understand how the scientists arrive at their results any more than we understand the Lord’s mysterious ways, so our level of helplessness is about the same, for all that scientists can adduce a much longer list of verified empirical results than priests can. Science has what I think is a built-in problem: the more advanced it gets the more it looks like magic, which resembles religion in that it wins loyalty by producing wonders that defy comprehension. Contemporary physics is almost perversely counterintuitive, producing theories that flout what we thought were fundamental principles. Western medicine, whatever its shortcomings, continues to produce cures unthinkable a few generations ago. We can look up almost anything instantly on a cheap handheld device. What comes with these advances? An abandonment of earthbound common sense, and a profession of faith in a select group of mandarins who alone understand how the universe works. That’s not what Paine and Voltaire had in mind.

Ah, the humble definite article — let us not overlook its semantic power. (And prosodic: articles make the iambic a characteristic English meter, even though most of our words are accented on the first syllable.) In English, unlike many European languages, “the” transforms nouns from general to particular. (E.g., “keys” vs. “the keys.” Note that this rule holds in the case of “the science,” if you hear it as a reference to work in epidemiology or another specific branch of medicine.) Sometimes definite articles are indispensable — “make bed,” “walk dog,” or “rock boat” all sound ridiculous — yet other languages get along happily without them. (And their misuse is a quick way to recognize a non-native speaker.) We scatter them thoughtlessly and pay them no mind. We would do better to reckon with the power of “the.”


drill down

(1980’s | computerese | “dig deeper,” “bore in,” “focus”)

Somewhere in the late eighties, I believe, “drill down” moved from the petroleum industry to the computer industry. In the former, it referred to boring through rock to reach richer deposits of oil and gas. The connection is easy to see with the computerese usage, where one is in pursuit of more revealing (i.e., remunerative) data, getting past the obvious and ferreting out telling details (cf. “data mining”). In both cases, the surface is an obstacle, and we must look beneath it, perhaps far beneath it, to find real significance (cf. “unpack”). “Drill down” has another meaning in computerese, referring to navigating a complicated web site with multiple levels of menus; so one might say, “I had to drill down pretty far to get to the page I needed.” In either sense, it may be used as noun or adjective, and may appear with or without a preposition.
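The menu-navigation sense is concrete enough to sketch in a few lines of Python. The site map below is invented for illustration (no real site or program is implied):

```python
# A hypothetical site map: each level of menus is a nested dictionary,
# and "drilling down" means descending one level per menu choice.
site = {
    "products": {
        "software": {
            "editors": ["page: text editors"],
        },
        "hardware": {},
    },
}

def drill_down(tree, *choices):
    """Follow a sequence of menu choices down to the page you need."""
    node = tree
    for choice in choices:
        node = node[choice]  # one level deeper per choice
    return node

page = drill_down(site, "products", "software", "editors")
print(page)  # ['page: text editors']
```

Three menu choices, three levels down: the page you needed was buried under the surface all along.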

The idea is uncovering data which can be accurately broken down into useful categories and subjecting it to a more minute analysis (cf. “granular”), often with the implication that the process will make everything clearer without a lot of tedious work (cf. “get into the weeds”). Yet “drill down” is occasionally used to refer to attaining a general truth, getting past distractions and superficies to reach a greater understanding. Here again, the satori you seek must be exhumed from its buried lair. It seems to me, though, that a reference to upward movement would work just as well. Shouldn’t we rise into rarefied realms of knowledge rather than trap ourselves in a suffocating pit? Especially when considering philosophical enlightenment, a more elevated, Olympian perspective is indicated.

As I thought about this expression, I was struck by the number of related words and phrases that Lex Maniac has covered, some of which I have already alluded to. The spread into everyday language of such expressions tells us that we are more dependent than ever on quantitative analysis for guidance in public and private dealings, causing us to produce far more data than we can use. While solid correlations drawn from experience and observation are essential (cf. “data-driven”), we’re not always as careful as we should be in collecting and interpreting them. If you’re going to lionize data, it has to be sound, and making it sound has to be built into the process. Sometimes statistics are quite reliable — medicine has a pretty good track record, for example — and sometimes they aren’t. In a world where dubious surveys are presented as rock-solid fact, we have to be aware of where those numbers come from, how they were gathered, and their limitations. Incomplete or unreliable raw material will lead us astray sooner or later; the longer it takes us to notice, the deeper a hole we will find ourselves in — not a hole to drill farther into, but to haul our way out of.



data-driven

(1990’s | computerese | “by-the-numbers”)

When I wrote a follow-up post on coronavirus vocabulary a few weeks ago, I missed this one completely. I concede the oversight, because we’ve been hearing it a lot this year from the hang-on-and-let-science-take-care-of-it school of dealing with the pandemic (“science-driven” is a variant). I don’t know what the other school is — maybe let’s-wait-for-it-to-go-away-and-not-count-the-bodies. It’s not that scientists are perfect, but they hold our best chance of getting through all this. Winston Churchill is reported to have said that democracy is the worst form of government, except for all the others. We are in an analogous situation now — stuck with the scientists because the alternatives are very unlikely to improve matters and might ultimately make the damage worse.

“Data-driven” sounds better than “data-determined,” which is closer to the mark. In public policy, it implies that decisions are based rigorously on numerical results of tests, surveys, clinical trials, etc. If the numbers go up, you do one thing; if they go down, you do something else (not necessarily the opposite). The goal is to prevent other concerns or priorities from wresting control away from the objective and verifiable. When Gov. Cuomo talks of data-driven school re-opening (another COVID buzzword), he means that public health and education experts have come up with a set of standards for reducing risk, in order to keep contagion from getting out of hand and overwhelming the health care system — a fear that has driven (there’s that word again) most of the response to the pandemic. The observed rate of positive tests and hospitalizations guides government action, the closest it can get to acting for all the people, not special interests or wealthy groups. Data-driven action should be consistent, predictable, and unswayed by other considerations. It is based on the idea that sustained, careful observation yields trustworthy information, which if properly collected, combined, and correlated forms a sound basis for getting a grip on the problem and taking action to solve it.
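A data-driven rule of the kind just described can be caricatured in a few lines. The thresholds below are hypothetical, invented purely for illustration; they are not the actual New York standards:

```python
# Hypothetical thresholds: the observed positivity rate alone picks the action,
# keeping other concerns and priorities from wresting control away.
def school_policy(positivity_rate):
    """If the numbers go up, do one thing; if they go down, do another."""
    if positivity_rate < 0.03:
        return "open"
    elif positivity_rate < 0.09:
        return "hybrid"
    else:
        return "remote"

print(school_policy(0.02))   # open
print(school_policy(0.12))   # remote
```

Consistent, predictable, unswayed by other considerations — for better and for worse, the rule does exactly what the numbers tell it to.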

“Driven” is a funny adjective. A driven person is at least a little bit obsessive and type A, quite possibly successful, but not always pleasant. There’s an old expression, “pure as the driven snow,” which raises a completely different association (“driven,” short for “wind-driven,” indicates that the snow hasn’t hit the ground yet and is thus untainted). It might turn up in a movie review, describing a film as “plot-driven” or “character-driven.” In eighties computer magazines, you might have seen “menu-driven,” “technology-driven,” or “process-driven,” for example. I’ve alluded before to Pastor Rick Warren’s book “The Purpose-Driven Life.” It appears that “driven,” the appendage, is here to stay, able to form new compounds (and here and there the odd new concept) in a single bound.

When I was a kid, elected officials followed the advice of experts, or at least said that’s what they were doing. The use of “data-driven” may suggest that the experts are unnecessary middlemen; just give us the statistics. Of course, if the pandemic has taught us anything, it’s that we still need experts — people who have spent years studying the phenomenon, understand it better than the rest of us, and can offer informed counsel when a crisis blows up. A little learning is a dangerous thing, but a lot is useful.



randomize

(1980’s | academese (science) | “randomly generate”)

A term born of empirical science — experiment design and statistics. Now it is used primarily to talk about clinical trials; an essential part of testing a medication or treatment is “randomizing” the patients — that is, making sure that those getting the treatment and those getting the placebo are sorted by non-human means, to eliminate as much bias in the results as possible. Such processes are easiest to envision in a binary world, where there are only A and B, and the category you belong to is “decided” by mechanical means. Computer programmers picked it up very soon, before most of us knew there was such a thing as computer programming, so by 1980 “randomize” had a number of technical uses, which for the most part it still has. In the eighties and nineties, I found examples from other endeavors as well: poker; esthetics (choreographer Merce Cunningham “randomized” his decisions at particular junctures by throwing the I Ching to determine the outcome); CD players; creating standardized tests; listing candidates on a ballot. It most often has to do with some sort of testing, medical or otherwise.
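In the clinical-trial sense, sorting “by non-human means” amounts to letting a pseudorandom shuffle split the patient list. A minimal sketch in Python (the patient IDs are placeholders; real trials use far more elaborate protocols):

```python
import random

def randomize(patients, seed=None):
    """Split patients into treatment and placebo groups by mechanical means,
    keeping human judgment (and bias) out of the assignment."""
    rng = random.Random(seed)   # a fixed seed only so the example is repeatable
    shuffled = list(patients)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"treatment": shuffled[:half], "placebo": shuffled[half:]}

patients = ["P01", "P02", "P03", "P04", "P05", "P06"]
groups = randomize(patients, seed=42)
print(groups)
```

Who lands in which group is “decided” by the shuffle, which is the whole point: no human hand tips the scales.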

An “-ize” verb, “randomize” doesn’t sound as clunky (to me, at least) as “incentivize,” “weaponize,” or “monetize.” Probably because it’s rooted in science and mathematics; ize-itis is easier to take with technical terms. And “randomize” hasn’t filtered into everyday speech much. It’s a word you come across in print occasionally, but it hasn’t exactly taken the vernacular by storm. It seems like a modest enough word, filling a need without taking up too much room.

A related yet unrelated word is “rando.” It’s sort of a portmanteau of random and weirdo — the rando has a definite hint of unpleasantness, not someone you want to have to deal with. (Though the highest-ranked on-line definitions don’t give the term a negative implication, and at least one on-line source thinks randos are a good thing, so the jury is out.) An unrelated yet related word is “anonymize,” to which my attention was drawn by Lovely Liz from Queens, as in “anonymize data.” It’s how to divorce you from your personal information and preferences; more precisely, it’s how internet titans vacuum up everything worth knowing about your on-line habits while creating the illusion that your name and identity can’t be connected with any of it. But anonymizing is also part of randomizing; in fact, removing patients’ names is an essential step in the process.

Random isn’t as simple as it sounds. Take a simple example: if you flipped a coin and it came up heads ten times in a row, you wouldn’t think that was random at all. Some ordering force must be at work, right? Yet it’s perfectly possible for a fair coin to land on the same face ten times in a row. There doesn’t even have to be a balancing streak of ten tails later on, but over time the proportions of heads and tails will even out. In a truly random sequence or assortment, you will almost certainly find stretches that appear to be grouped logically, but that’s just how it shakes out; it’s not proof, or even evidence, of a master intelligence running things. We want to call random only that which is jumbled, devoid of an obvious organizing principle. But the random may look very organized if you focus on a small section.
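The point about streaks is easy to check by simulation. A quick Python sketch (with a fixed seed so the run is repeatable):

```python
import random

def longest_streak(flips):
    """Length of the longest run of identical consecutive outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

rng = random.Random(2024)
flips = [rng.choice("HT") for _ in range(100_000)]

# Long streaks turn up in genuinely random sequences --
# runs of ten or more are all but guaranteed at this length...
print(longest_streak(flips))
# ...while the overall proportion of heads still hovers near one half.
print(flips.count("H") / len(flips))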



sabermetrics

(1990’s | journalese (sports) | “percentage baseball”)

Few of my few devoted readers being baseball fans, it behooves me to offer some explanation of this odd word. (Don’t you always look for chances to use “behoove” in a sentence?) “Sabermetrics” refers to rigorous statistical analysis, which begins by establishing a reliable set of numbers measuring the performance of single players and entire teams and then reinterpreting them, taking them apart, recombining them, and generating new statistics, thought to be more revealing than the old ones. The word itself is built on the acronym SABR — the Society for American Baseball Research, founded in 1971 as a small organization devoted to using statistics to understand baseball history. Nowadays, sabermetrics attracts more attention as a way of helping executives and managers arrive at the most effective ways to evaluate and use their players, or decide how much they should be paid or traded for. Now other sports have been bitten by the bug, and the concept may even be familiar to non-fans; many baseball abstainers have heard of Michael Lewis’s book “Moneyball,” an account of the Oakland A’s under general manager Billy Beane, who adopted sabermetric insights wholesale and built a successful team with limited means. (If you missed that, there was a Simpsons episode in 2010.)
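The “taking apart and recombining” is concrete enough to show. On-base percentage, slugging, and their sum OPS are among the simplest sabermetric recombinations of the traditional counting stats; the stat line below is invented for illustration:

```python
# A hypothetical season stat line: hits, walks, hit-by-pitch, at-bats,
# sacrifice flies, doubles, triples, home runs.
player = {"H": 160, "BB": 70, "HBP": 5, "AB": 520, "SF": 5,
          "2B": 35, "3B": 3, "HR": 25}

def obp(p):
    """On-base percentage: times on base over plate appearances (simplified)."""
    return (p["H"] + p["BB"] + p["HBP"]) / (p["AB"] + p["BB"] + p["HBP"] + p["SF"])

def slg(p):
    """Slugging percentage: total bases per at-bat."""
    singles = p["H"] - p["2B"] - p["3B"] - p["HR"]
    total_bases = singles + 2 * p["2B"] + 3 * p["3B"] + 4 * p["HR"]
    return total_bases / p["AB"]

print(round(obp(player) + slg(player), 3))   # OPS: 0.922
```

Neither ingredient is exotic; the sabermetric move is noticing that the recombination predicts run production better than the old standbys like batting average.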

The term has always been credited to one of its leading practitioners, Bill James, who has — not single-handedly — revolutionized our understanding of baseball. (Full disclosure: my copy of his “New Historical Baseball Abstract” is pretty much disbound due to wear.) He began a one-man samizdat in the seventies, producing mimeographed collections of statistics and evaluations of major-league players; within a few years, the annual “Baseball Abstract” was picked up by a major publisher. Since then, he has written several compendious reference books that have laid out new frameworks for understanding how baseball works. In 2003 the Boston Red Sox hired him as a special advisor, a post he retains. He has indeed created some very complex and arcane statistics, but they have become common currency in discussions of baseball.

There are two inspiring stories here: James’s rise from outsider devoid of credentials to respected insider; and the triumph of empiricism and scholarship. The former proves that such storybook careers remain possible, but the latter, it seems to me, has wider cultural import. The SABR scholars, with little to offer except patient, unremunerated toil, have applied a version of the scientific method to baseball, emphasizing observation, data gathering, and statistical analysis in order to reach well-founded formulas for success. And to a great extent, it has worked. Baseball teams can no longer ignore sabermetrics; the insights of those nerdy statisticians — “statistorians” as a pre-James pioneer, L. Robert Davids, called them — have become so standard that ignoring them is a form of malpractice. It may give us a flicker of faith that in the face of a rising tide of obscurantism, that kind of work still proves its worth and compels respect, even in a game as anti-intellectual and tradition-bound as baseball.

Like the sciences, sabermetrics ultimately proves itself through successful prediction. Why is it that sabermetrics gets more credit than, say, climate science, despite the fact that the broad claims made by climatologists thirty years ago have been borne out? It’s a much smaller audience, for one thing; most people don’t care enough about baseball to set any store by ingenious statistical hermeneutics, but nearly everyone has an opinion about climate change. Baseball has a very long tradition of statistical study, and there have always been a few “figure Filberts,” as people like James used to be called; outside of baseball, most people don’t understand statistical analysis and don’t hold with it, unless it happens to confirm what they already believed. In baseball, the goal is to win, and winning is clearly defined and easily measured. That is much less true in the greater world, where a lot more people win by casting doubt on human-caused climate change than by taking issue with sabermetricians.



hard-wired

(1980’s | computerese? | “innate,” “(pre-)programmed,” “fixed,” “unalterable”)

The hard-wired smoke detector was already around in 1980; in that sense the term has not changed meaning since. “Hard-wired” meant connected directly to the building’s electrical system, meaning it was not powered by batteries, meaning that it would not infallibly begin making horrible chirping noises one morning at 3:00 and resist every sleep-fogged effort to silence it. A hard-wired telephone was similar in that it was harder to disconnect than the standard model you plug into a wall jack (already common in my youth, though far from universal). The cord connected to the system inside the wall rather than on the outside. Cable television might be hard-wired in that the cables connected to the source physically entered your house and attached themselves to a television set. Computer scientists had been using the term before that, generally to mean something like “automatic” or “built-in” — the only way to change it is to make a physical alteration to part of the equipment — and it remained firmly ensconced in the technical realm until the eighties. That’s when “hard-wired” became more visible, as computer jargon was becoming very hip. (PCMAG offers a current set of computer-related definitions.) In computer lingo, “hard-wired” came to mean “part of the hardware,” so “soft-wired” had to follow to describe a capability or process provided by software.

My father, erstwhile electrical engineer, pointed out that in his world, “hard-wired” was the opposite of “programmable.” In other words, the hard-wired feature did what it did no matter what; it couldn’t be changed simply by revising the code. Yet you don’t have to be too careless to equate “hard-wired” with “programmed” (see above) in the sense of predetermined. It’s not contradictory if you substitute “re-programmable” for “programmable,” but that requires an unusual level of precision, even for a techie. Every now and then you find odd little synonym-antonym confusions like that.

Still in wide technical use, this expression has reached its zenith in the soft sciences, in which it is commonly used to mean “part of one’s make-up,” with regard to instincts, reflexes, and basic capacities (bipedal walking, language, etc.), and more dubiously to describe less elemental manifestations such as behavior, attitude, or world-view. “Hard-wired” is not a technical term in hard sciences such as genetics or neurology. The usefulness of the expression is open to question: one team of psychologists noted, “The term ‘hard-wired’ has become enormously popular in press accounts and academic writings in reference to human psychological capacities that are presumed by some scholars to be partially innate, such as religion, cognitive biases, prejudice, or aggression . . . remarkably few psychological capacities in humans are genuinely hard-wired, that is, inflexible in their behavioral expression” (citation). Scientists may sniff at the term as used in pop psychology, but it does make for easy shorthand and probably won’t go away any time soon.

The reason we take so easily to applying the term “hard-wired” to the brain is that the computer, as developed over the last fifty years, forms the most comprehensive map yet for the workings of our minds. A contributing reason is the very common, casual linking of brain activity with electricity, as in referring to one’s “wiring” — even though one may also refer to one’s “chemistry” to explain mental quirks, probably a superior explanation. Watching a computer “think” helps us understand how our brains work, or maybe it just misleads us, causing us to disregard our own observations in order to define our own mentation with reference to the computer’s processing. There are obvious connections and obvious divergences; surely any device we concoct must reflect the workings of our own minds. But computers aren’t just for playing solitaire, calculating your tax refund, running a supercollider. They serve a humanistic function by giving us new ways to think about the old ways we think.


false positive

(1980’s | doctorese | “bad diagnosis”)

An example of an older expression that has grown common and become less specialized (other examples: “blowback,” “grounded,” “politically correct,” “template”). In medicine, “false positive” goes back at least to the forties, probably earlier; for some reason, the only results in Google Books from those days have to do with the Wassermann test for syphilis. In the seventies, the phrase got a boost from the popularity of home pregnancy tests. In the eighties, it was employee drug testing. Both developments got plenty of press, so use of the phrase grew sharply, and as it spread it began to turn up outside of strictly medical contexts. Now it can apply to virus or spam detection, security systems, internet search results, or even economic forecasting or earthquake warnings. The last two are notable because they involve not results but predictions, which adds a new twist. You said there would be a recession and it doesn’t materialize — instead of you said there was cancer and there was no cancer there. Another example from the scientific community: “A false positive is a claim that an effect exists when in actuality it doesn’t,” that is, detecting a correlation that exists only because of your misinterpretation of the data. All these meanings rely on presumably preventable misreadings of an empirical result, incorrectly assigning too broad a significance to a single symptom, or maybe just running the test wrong.

False positives are a big problem; they can creep into the work of the most careful scientists. Medical tests that show a disease that isn’t really present can result in unnecessary or dangerous treatment, and all the expense that goes with it. The effect is subtler in empirical science, but pressure to obtain statistically significant results can skew the perspectives even of conscientious experimenters. (This article explains how it happens.) Such errors are dangerous because it’s worse to be sure of something that isn’t true than to fail to know something that is. As a great American philosopher, possibly Josh Billings or maybe Will Rogers, said, “It ain’t what people don’t know that’s the problem; it’s what they know that ain’t so.”
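The pressure to find significance is easy to reproduce in miniature: run enough tests on pure noise and some will come back “significant.” A sketch in Python, using a crude z-test on made-up noise (this is my illustration, not the methodology of the article mentioned above):

```python
import random
import statistics

def false_positive_rate(trials=2000, n=50, z_cutoff=1.96, seed=11):
    """Test pure noise for an 'effect' many times; every hit is a false positive."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        sample = [rng.gauss(0, 1) for _ in range(n)]   # no real effect present
        mean = statistics.fmean(sample)
        stderr = statistics.stdev(sample) / n ** 0.5
        if abs(mean / stderr) > z_cutoff:              # "statistically significant"
            hits += 1
    return hits / trials

print(false_positive_rate())   # hovers near 0.05 -- significance by chance alone
```

Roughly one experiment in twenty detects an effect that was never there; an experimenter who runs twenty variations and reports the one that “worked” is manufacturing exactly the error the phrase names.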

The expression was well settled by 1980, but only in medical contexts. (“False negative” is just as old.) When it turned up in general-interest articles, it often came packaged in quotation marks. It had not become a regulation noun; in those days it was still normally a compound adjective, applied to readings, results, reactions, responses, rates. Now it is more common as a noun than as an adjective.

I’m sure I wasn’t the first or last kid to stumble over the counterintuitive meaning of “positive” in medicine. I thought “the test came back positive” was good news, whereupon my hard-working parents (I kept ’em hopping) had to explain that the word you wanted to hear was “negative.” Doctors test for the presence of a disease or condition, and a positive result means they’ve found it, and you’re stuck with an undesirable disorder. It’s the only zone in everyday language in which “positive” means “negative,” I do believe. (It reminds me of middle-aged parents in the seventies cheerily reminding each other that “bad” meant “good.”) We must ever observe the instructions in the song and accentuate the positive, but not in the lab, please!


junk science

(1980’s | legalese | “quackery”)

This is an expression with an agenda. We began to hear it regularly a little after 1990; a Washington Post editorial referred to “what some are beginning to call ‘junk science’” in March 1996. Google Books and LexisNexis cough up several instances from the eighties, and even this diamond in the rough from 1903: “But that conceited laugh of junk science, how laughable it is after all” (Peter Burrowes, “What is truth?” in Revolutionary Essays in Socialist Faith and Fancy, Comrade Publishing Co.). Whatever Burrowes may have meant, both the meaning and connotation of this expression were pretty clear when it came into its own at the other end of the century. The term was most often used by lawyers to complain about so-called expert witnesses purveying unsubstantiated theories about harms to plaintiffs and driving up the cost of judgments against well-meaning, God-fearing corporations. The phrase generally reared its head in discussions of tort law, that is, lawsuits filed to obtain compensation for wrongs not covered by criminal law. And it was generally used to assail dubious medical or technical testimony that swayed gullible juries (or judges).

It isn’t always so clear what “good” science is, even in our everyday Newtonian world; practicing scientists with good credentials may disagree vigorously on the interpretation of a piece of evidence even in simple cases. Attacks on junk science often rely on the unstated assumption that proper science is easily defined and recognized, not subject to controversy among scientists. That is true most of the time, but not all the time, and it does foreclose the possibility of finding value in the new or unconventional. The Supreme Court has ruled that scientific evidence should be peer-reviewed but stopped short of setting absolute limits on what can or can’t be presented in the courtroom.

No doubt many verdicts have been influenced by doubtful expert testimony. Peter Huber cited and documented several with relish in “Galileo’s Revenge: Junk Science in the Courtroom” (Basic Books, 1991); the subtitle probably played a role in popularizing the phrase. His plaintiff-bashing set the tone; it took several years before “junk science” came to be applied regularly to any doubtful theories propounded by big business. In its early days, junk science always had a bleeding heart, causing courts to fall for sob stories bolstered by expert witnesses who were far too sure of themselves. Crazed environmentalists, quack psychiatrists, doctors on the take — they were the ones who relied on junk science to con the scientifically illiterate. Nowadays, the phrase is comfortably used in a much wider variety of contexts, but it still seems to be favored by the right wing, though it is no longer solely their property. (I shudder when I ponder future semantic possibilities given the recent rise of “junk” as a slang term for “genitals.”)

The funny thing is that you would expect the forward-looking lefties to brandish science against the backward righties, but the right got in first on this term, which fit neatly into their strategy of attacking people seeking redress for injuries allegedly caused by corporate negligence. The web site Junk Science, opened in 1996, is unabashedly right-wing and contemptuous of the scientific establishment, debunking climate change, solar power, and other usual suspects, particularly government participation in scientific research. (Ironic, because the original definition of “junk science” as propounded by lawyers depends on conformity with scientific consensus.) The phrase allows right-wingers to dismiss a favorite left-wingers’ trump card and beat them at their own game. References to science make you sound serious and learned, and who’s going to make you explain why the object of your scorn violates this or that scientific principle? It has become one more way to say, “Shut up. You’re wrong.” But then, it never really was anything else.



(1990’s | counterculturese? journalese? | “hippie” (adj.), “tree-hugging”)

“Crunchy granola” (adjective or noun) is a common variant. I remember hearing “nutty-crunchy” first around 1990, and I had to have it explained to me. (Even then, your humble maniac was hard at work.) It’s not clear to me when this expression arose, but surely not before 1980. One is expected to suppress mental cross-references to the old sense of “nutty” (crazy), but detractors of the environmental movement cheerfully let them creep in. In fairness, some exponents also emphasize the “nutty” in “nutty-crunchy,” taking pride in their purity. But “crunchy” is the word you have to watch, for its overtones have changed. At first, it referred to environmentalists, with the implication that they lived off the land or at least made their own stuff. Now the implication is a little more rarefied, especially in the term “crunchy (granola) mom”: someone who gives birth with the aid of a midwife, breastfeeds, uses cloth diapers, makes her own organic baby food (but need not grow the vegetables herself), won’t eat meat, and maybe co-sleeps or refuses vaccinations. Not being a big player in the parenting game, I wasn’t familiar with this phrase until I started looking around, but we may measure its ubiquity by the number of on-line quizzes telling new mothers how crunchy they are.

A digression on “crunchy granola” used as an adjective: It continues to sound strange to me, but you do hear it; it may obliterate “nutty-crunchy,” which I sense has become less common. The short form, “crunchy,” at least sounds like an adjective. The full-length form reflects a certain exuberance — the “I’m weird and proud of it” attitude characteristic of the counterculture, the weirdness extending to the eccentric use of “granola” as an adjective. It is not clear to me whether this expression arose from the believers or the mockers, but in practice it may not matter, since the former steal from the latter all the time. The other odd thing about the yoking is that the connection between granola and the counterculture does not hinge on crunchiness. “Organic granola” would make more sense, or even “nutty granola.” “Crunchy” is more evocative than either of these, and “chewy” would be worse, but I haven’t quite figured out why it became the preferred shorthand for one who is environmentally conscious, or fanatical about one’s health or childrearing practices.

Crunchy beliefs and behavior do not belong exclusively to the left or right; they are where both extremes converge. A 2006 book by Rod Dreher, “Crunchy Cons,” points out that many right-wingers do crunchy things, too. The specific manifestations may differ — right-wingers seem to do more home-schooling, for example — but both modalities boil down to rejection of the way most people obtain the necessities of life and raise their children, powered by the middle-of-the-road scientific consensus that tells us how to live our lives in a thousand minute, complicated ways. It’s an old idea in this country, though in some instances it has relied on science rather than keeping it at bay. In the nineteenth century (the word “granola,” originally a trade name, goes back to 1875) we had Graham and Kellogg; before them countercultural ideas about nutrition or lifestyle often stemmed from outlying sects like the Shakers. I’m old enough to remember Euell Gibbons, who shilled for Grape Nuts (there’s that nut again). The sixties gave natural living another boost, and the tradition goes on.
