
Lex maniac

Investigating changes in American English vocabulary over the last 40 years

Tag Archives: psychology

word salad

(2000’s | journalese (politics) | “gibberish,” “incoherent speech,” “obfuscation”)

This expression recently underwent a significant change after a hundred stable years. The first citation I found dates from a psychiatric handbook of 1907, where it occurs in a discussion of dementia praecox, the old name for schizophrenia, more or less (they weren’t exactly the same, but that’s the closest term in modern mental health vocabulary). It hasn’t changed meaning in that context; a textbook published in 1970 gave the following: “A jumbled, unintelligible mixture of words, usually containing both real words or phrases and neologisms. This disturbance in verbal communication is most frequently found in advanced schizophrenic reactions.” By 1980, arts writers used it now and then to talk about writers like Gertrude Stein and James Joyce, both of whom were considerably more artful than your average schizo, but somewhat less syntactically or semantically forthright than Mickey Spillane, say. It took nearly thirty more years before the expression came to characterize political speeches; the first consistent victim was Sarah Palin in 2008, but in 2016, both Trump and Clinton, widely different speaking styles notwithstanding, were accused of producing word salad. (Somehow this expression doesn’t take to the plural.) The older uses are still found, but in ten short years the phrase has become quite common in political commentary, where it was essentially unknown before Palin took the national stage. Merriam-Webster On-line provides a history with plenty of examples.

Like “hive mind,” “word salad” has become a favored term of abuse, but it need not be an insult. When used to refer to the ramblings of the mentally ill, it probably was always implicitly insulting — and that origin continues to be felt as we use the phrase today — but literary critics may treat it as a neutral descriptor. Not long before the move into political discourse, “word salad” took on two new uses: one referred to a technique of creating spam e-mails that used blocks of unconnected words in order to fool the filters; more significantly, it started to imply deception, pointing the way to politics. The crucial difference has to do with volition; the schizophrenic babbles uncontrollably, but the purveyor of catch-phrases strung together so as to defeat interpretation is doing it on purpose. In political discourse, it may take either shading, and they’re equally insulting — a variation on the old Reagan cleft stick: if he knows what’s going on, he’s a criminal; if he doesn’t, he’s too out of it to be president. Whether you think Trump just doesn’t know any better or is deliberately snowing us, you probably think he shouldn’t have the job.
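
A technical aside: the spam trick is simple enough to sketch in a few lines of code. What follows is a minimal, hypothetical Python illustration of padding a message with unconnected words to dilute the statistics a keyword-based filter relies on; the word list and the sales pitch are invented, and real spammers drew on much larger vocabularies.

```python
import random

# Hypothetical pool of unrelated words; real spammers harvested
# theirs from dictionaries or scraped text.
WORDS = ["harvest", "violet", "ledger", "tundra", "quorum",
         "sandal", "pylon", "marmalade", "osprey", "gavel"]

def word_salad(n=40):
    """Return n unconnected words: noise meant to swamp the
    keyword frequencies a naive spam filter scores."""
    return " ".join(random.choice(WORDS) for _ in range(n))

# The pitch the filter is meant to overlook, padded with salad.
message = "Act now for an unbeatable offer!\n\n" + word_salad()
print(message)
```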

Now that “word salad” is firmly enmeshed in political journalism, it is anyone’s guess whether psychiatrists will continue to use it; they may be forced to find a new phrase if the old one changes connotation for good. As late as the nineties, it was pressed into service as the title of a computer game and an on-line poetry magazine, suggesting that it might yet be considered favorable, or at least eye-catching. Those days appear to be over.

Why salad, anyway? The idea of several heterogeneous ingredients, mixed but not blended together, seems to be at the bottom of it, though the expression probably hails from German or French originally, and I’m not certain “salad” carries the same mental picture in those languages. I’ve seen “word hash” offered as a synonym, but if there ever was a contest, “word salad” has won. It’s more memorable than “jumble” or “logorrhea,” that’s for sure (personally, I’d like to see “word avalanche”). And I like the idea of pouring oil (and vinegar) on troubled word salad.



hard-wired

(1980’s | computerese? | “innate,” “(pre-)programmed,” “fixed,” “unalterable”)

The hard-wired smoke detector was already around in 1980; in that sense the term has not changed meaning since. “Hard-wired” meant connected directly to the building’s electrical system, meaning it was not powered by batteries, meaning that it would not infallibly begin making horrible chirping noises one morning at 3:00 and resist every sleep-fogged effort to silence it. A hard-wired telephone was similar in that it was harder to disconnect than the standard model you plug into a wall jack (already common in my youth, though far from universal). The cord connected to the system inside the wall rather than on the outside. Cable television might be hard-wired in that the cables connected to the source physically entered your house and attached themselves to a television set. Computer scientists had been using the term before that, generally to mean something like “automatic” or “built-in” — the only way to change it is to make a physical alteration to part of the equipment — and it remained firmly ensconced in the technical realm until the eighties. That’s when “hard-wired” became more visible, as computer jargon was becoming very hip. (PCMAG offers a current set of computer-related definitions.) In computer lingo, “hard-wired” came to mean “part of the hardware,” so “soft-wired” had to follow to describe a capability or process provided by software.

My father, erstwhile electrical engineer, pointed out that in his world, “hard-wired” was the opposite of “programmable.” In other words, the hard-wired feature did what it did no matter what; it couldn’t be changed simply by revising the code. Yet you don’t have to be too careless to equate “hard-wired” with “programmed” (see above) in the sense of predetermined. It’s not contradictory if you substitute “re-programmable” for “programmable,” but that requires an unusual level of precision, even for a techie. Every now and then you find odd little synonym-antonym confusions like that.
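
To spell out my father’s distinction in modern terms, here is a minimal sketch, in Python with invented names and a made-up config file, of the difference: a hard-wired value is fixed in the code itself and changes only when the program is rewritten, while a programmable one is read in at run time.

```python
import json

# "Hard-wired": fixed in the source; changing it means editing the
# program and redeploying, the software analogue of a soldered part.
HARD_WIRED_THRESHOLD = 0.8

def load_threshold(path="config.json"):
    """'Programmable': the same setting read from a (hypothetical)
    config file at startup, adjustable without touching the code."""
    try:
        with open(path) as f:
            return json.load(f)["threshold"]
    except (FileNotFoundError, KeyError):
        # No config found: fall back to the built-in value.
        return HARD_WIRED_THRESHOLD
```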

Still in wide technical use, this expression has reached its zenith in the soft sciences, in which it is commonly used to mean “part of one’s make-up,” with regard to instincts, reflexes, and basic capacities (bipedal walking, language, etc.), and more dubiously to describe less elemental manifestations such as behavior, attitude, or world-view. “Hard-wired” is not a technical term in hard sciences such as genetics or neurology. The usefulness of the expression is open to question: one team of psychologists noted, “The term ‘hard-wired’ has become enormously popular in press accounts and academic writings in reference to human psychological capacities that are presumed by some scholars to be partially innate, such as religion, cognitive biases, prejudice, or aggression . . . remarkably few psychological capacities in humans are genuinely hard-wired, that is, inflexible in their behavioral expression” (citation). Scientists may sniff at the term as used in pop psychology, but it does make for easy shorthand and probably won’t go away any time soon.

The reason we take so easily to applying the term “hard-wired” to the brain is that the computer, as developed over the last fifty years, forms the most comprehensive map yet for the workings of our minds. A contributing reason is the very common, casual linking of brain activity with electricity, as in referring to one’s “wiring” — even though one may also refer to one’s “chemistry” to explain mental quirks, probably a superior explanation. Watching a computer “think” helps us understand how our brains work, or maybe it just misleads us, causing us to disregard our own observations in order to define our own mentation with reference to the computer’s processing. There are obvious connections and obvious divergences; surely any device we concoct must reflect the workings of our own minds. But computers aren’t just for playing solitaire, calculating your tax refund, running a supercollider. They serve a humanistic function by giving us new ways to think about the old ways we think.


blended family

(1980’s | therapese | “stepfamily”)

Contested terrain semantically, as in other, more obvious, ways. Start with the definition. Nowadays, most people would probably endorse a relatively loose definition of “blended family”: any family formed when an adult with one or more children takes up with a different adult, who may or may not have children. If you’re a purist, you might require that both adults have at least one child. In 1983, a writer defined it thus: “pop-psychology euphemism for members of two broken families living under the same roof, a mixture of step-parents, step-children and step-siblings.” Ten years before that, a psychology textbook defined it as a “family consisting of a husband and a wife, the children of either or both from a previous marriage, and children of the present marriage.” The new spouses had to have kids together, not just with former partners. The extra distinctions may have been made possible by a wider panoply of related terms than we can remember now. A surprisingly large amount of vocabulary sprang up around such filial configurations; in 1980, the New York Times propounded the following list: “conjugal continuation, second-marriage family, stepfamily, blended family, reconstituted family and metafamily.” (It missed “merged family,” also in use by 1980. “Mixed family” means that the parents are of different race, ethnicity, or religion.) Of these, only “stepfamily” would be familiar to most people in 2017, but Wikipedia distinguishes between stepfamilies (only one adult has a pre-existing kid) and blended families (both adults). According to the OED, “stepfamily” goes back to the 19th century; the earliest citation I found for “blended family” dated from 1964.

Why did “blended family” win out? Probably the usual mixture of euphony and accuracy, or intuitiveness. Most of us understood pretty quickly what it meant the first time we heard it in context, and it sounds good — not too long, not too short, scans nicely. “Second-marriage family” is clunky; “metafamily” is jargony and doesn’t make a whole lot of sense anyway. “Blended family” sounds a lot better than “reconstituted family” (just add water!), you have to admit. The only mystery: why didn’t “merged family” catch on?

We like to think that the quirks and foibles of our own generation are unprecedented, but blended families are hardly new. My father’s father grew up in one after his mother divorced his father and married her second husband. My mother’s mother was the daughter of a second marriage, an old widower and a young wife. Life expectancy was lower then, so remarriages were more often occasioned by death than divorce. Was there a decline in the number of blended families for a generation or two, long enough to forget how common such arrangements used to be? If so, the phenomenon has come roaring back. Somehow, before 1970 or so, we got along without a general term for it. Now we’ll never get rid of this one.

There may have been earlier examples on television, but “The Brady Bunch” was the first show to feature a blended family week after week, thus perhaps making the whole idea seem more wholesome. It is doubtful that the sitcom had much effect in its time, given its poor ratings and reviews, but pop-culture observers agree that it had a long and powerful afterlife among those of a certain age (mine), for whom the Brady Bunch is part of a comforting nostalgic penumbra (accent on “numb”). Several shows about different varieties of blended family have succeeded Mike and Carol and Sam* and Alice: Full House, Step by Step, Modern Family. The Bradys anticipated a trend; their descendants follow along behind, trying to catch up to everyday life. The Stepfamily Foundation started life in 1977; support groups and talks at the local library aimed at blended families seem to have arisen in the eighties, when the requisite self-help books also began to appear. New terms must surely arise to reflect new conditions, but the rule is that only one or two out of a larger number will make it to the next generation and a shot at immortality.

* The butcher. Remember?


in denial

(1980’s | therapese | “hiding one’s head in the sand”)

My guess is we owe today’s prominence of “denial” in psychological lingo to Elisabeth Kübler-Ross’s stages of grief. I doubt we would have “in denial” without the predecessor; the phrase as we use it now didn’t turn up before 1970 anywhere I looked. The term and associated concept — refusing to believe that which is clear to others, as by failing to acknowledge an emotional or psychological state, or even sheer physical reality — were already in existence, but Kübler-Ross’s “On Death and Dying” (1969) was very influential; one of its effects was to make the concept of denial familiar to nearly everyone. Not long after, the term became popular among counselors of alcoholics and other drug addicts who refused to admit they had a problem. “In denial” may be merely a compressed version of “in a state of denial.” It appears to be the most common phrase descended from “denial,” but not the only one; Pam Tillis hit the country charts in 1993 with a song about Cleopatra, Queen of Denial (though I’m pretty sure the redoubtable Rev. Billy C. Wirtz had used the joke before then).

“In denial” has been in use for a long time in other contexts, but the grammar is new. Now the phrase is most common as a predicate complement (e.g., “You’re in denial.”), possibly followed by “about,” but not “of.” In the old days, when it followed a verb it had to be active (e.g., “result in denial” or “engage in denial”). Of course, it appeared everywhere in legal prose (e.g., “in denial of the motion”), and it started to bob up in political contexts in the eighties, particularly around the time the Iran-Contra revelations were unraveling Reagan’s second term. It was kinder to say Reagan was in denial than to contend that he really didn’t know what was going on. Maybe this is one of the many terms Reagan helped into the language directly or indirectly, or maybe it would have happened anyway. By 1990 it had made its mark, though ace sportswriter Thomas Boswell put it in quotation marks as late as that spring. No surprise that it became popular — it’s compact and it packs a punch. The expression conjures a state of passive malignity or dangerous indifference, willful or not; like “passive-aggressive,” it’s always an insult.

Now “in denial” is entirely standard, eligible to be adapted to all sorts of uses, including humor, irony, and wordplay. (Here’s a bouquet of suggestions for compilers of rhyming dictionaries: “infantile,” “spin the dial,” “undefiled,” “linden aisle.”) I haven’t heard “SO in denial” or “in deep denial,” but I don’t get around much; both certainly lie within the universe of possible utterances. Or “Live in denial,” which may also be heard as “living denial” (as in “Girl, you are just living denial 24/7”). “Oh, he’s such an old in-denial crocodile” could be the next catch phrase. “Hit denial on the head” might be a self-help slogan, meaning something like overcoming obliviousness and seeing the world without illusions. Why not “The In Denial 500,” which pits the nation’s most noxiously clueless bachelors against each other to see who can act the most idiotic? For you tongue-twister fans out there, it’s not much, but it’s the best I can do: Say “undeniably in denial” five times fast.


blank on

(1990’s | journalese (arts? politics?) | “forget (temporarily),” “(have it and) lose it”)

I’m not very rigorous about it, but I try to avoid using the kind of new expressions I write about here, in everyday conversation and in posts alike, except to refer to them directly. But this one is an exception, and I catch myself using it fairly often. It has a host of predecessors. Probably descended directly from “draw a blank on” (be unable to remember whatever it is), it also recalls “blank look” and “let your mind go blank” (or the more involuntary “my mind is a blank”). The word implies a temporary but vertiginous mnemonic malfunction, a moment of vacuity that may lead to a deer-in-the-headlights look. A related verb is “blank out” in its intransitive sense, though that may cover a longer time span. “Blank on” means forget something and then recover it, a short-term lapse, more like a senior moment. It may also mean, on occasion, “fail to respond.” (“Shooting blanks” means something entirely different. “To blank” in sports lingo normally refers to holding the opposing team scoreless. Then there’s that charming if now unnecessary euphemism, “blankety-blank.” It is one of those linguistic oddities that “blank,” descended from the French word meaning “white,” looks and sounds much more like “black.”) “Blank on” has so many ancestors that some don’t even involve the word “blank”; doesn’t the phrase “(totally) blanked on it” remind you of “bank on it”? I continue to maintain, without proof, that such phonological resemblances influence new entries into the language.

One does hear occasional variations in meaning when this expression is used, but they never seem to catch on or persist. I saw this sentence recently in a food column in the Dayton Daily News: “Pasta is always a conundrum as a side dish. I want to pencil it into my weekly meal plan, but then I blank on how to sauce it: Cream? Tomato? Lots of cheese?” Here the emphasis falls on inability to choose among alternatives rather than failing to remember them. This usage may prove a solitary exception to the rule, but the contretemps is one we find ourselves in often enough that another word for it may be welcome.

The verb really did not exist before 1980, as far as I can tell. It started to turn up occasionally afterwards; in one of the first uses I found, Reagan was the subject of the verb, and this may be yet another expression to which his presidency gave a boost, on the strength of his well-known absent-mindedness rather than his policy initiatives. It had entered the language pretty definitively by 1995, often used by politicians and press secretaries, but actors also use it a lot. During the latest presidential campaign, it quickly became the standard verb to denote Libertarian candidate Gary Johnson’s inability to address the significance of Aleppo. As is often the case when a new phrase resembles an old one, or several old ones, the trail into everyday language is not well-blazed, and it may be impossible to determine, even in retrospect, how it wormed its way in.


racial profiling

(1990’s | legalese | “discrimination”)

The verb “to profile” has a relatively complicated recent history, even if you set aside the usual literal or technical meanings from geology, engineering, esthetics, etc. For most of the twentieth century, the most common usage had to do with interview-based journalism — describing a worthy individual or organization in detail. Usually an actor or comparable cultural phenomenon, hence the phrase “celebrity profile.” The word was available as both noun and verb, but from either angle it seems an odd choice. The classical meaning of “profile” — a face seen from the side — would seem, on the face of it (sorry), to have little to do with a revealing biographical portrait. To carry the metaphor to its logical conclusion implies that the reporter has left out half the relevant information, as a profile leaves out half the visage — although an art critic might argue that sometimes the profile is more revealing than a full-face view, and there’s no denying some faces are far more interesting in profile. Sometimes “profile” means little more than “categorize,” as in a corporate profile that provides statistics grouped under various measures of performance. In African-American slang, “profiling” was another word for “showing off.” But when we use the term in African-American contexts today, it has an entirely different slant.

Our use of “racial profiling” today is descended from the more sinister practice of psychological profiling; the OED lists its first example of this usage in 1951. The goal is to see beneath the surface presented by the soldier, teacher, or employee, the psychologist’s trained eye constructing an account of each personality that understands the subject better than she understands herself, or at least better than the boss understands her. Inevitably, it occurred to the criminal justice system that such a thing might be useful in dealing with malefactors, and the idea of profiling this depraved criminal or that deranged terrorist entered the mainstream in the seventies and eighties. By 1990, the concept had undergone further refinement in the form of DNA profiling, by which the expert found a unique way to identify any individual through a bit of hair or saliva, again finding a distinctive marker that was not apparent to the unaided eye or brain. A DNA profile is a hyper-detailed diagram constructing a definitive portrait that cannot be confused with that of anyone else. Though the technology is often used in the context of medical research, it turns up much more often in news accounts of criminals, which has paved the way for “racial profiling,” now the dominant locution in which “profiling” appears. (I append the ACLU’s definition along with a reasonably non-partisan discussion of various kinds of profiling.)

The extraordinary thing about the new expression is that it has turned the old idea on its head. Racial profiling dispenses entirely with a painstaking account of the individual, teasing out a detailed map of characteristics, and replaces it with a simple question: Do you belong to this or that dangerous group? (Profiling based on religion or nationality is also possible, of course.) On one view, the change in usage is a complete reversal, but from another it is more or less seamless — profiling is merely one more weapon in the eternal war against the bad guys — and therefore it may be entitled to a certain poetic license.

The illogic of widespread, systematic profiling has been proven so often that the practice has few defenders but many adherents. When Americans feel threatened — some of us don’t even have to feel threatened — we disregard the studies and the logic and reach for the easy, satisfying answer. If a few people from a certain group mean us harm, make all of them suspect. For that to have any chance to work, the group must be very small, but preferred objects of unequal treatment in our society number in the millions, most of whom are law-abiding and just trying to do their jobs and pay their taxes. Having been mistreated by the justice system, such members of minority groups have no incentive to work with police and a quite reasonable desire to avoid them. Police departments around the country have learned this the hard way. (An exchange between Sam Harris and Bruce Schneier may flesh out the argument sketched above.) But their experiences have not dissuaded an uncomfortably large percentage of us, who demand that the law be simple and punitive. In America, foolish and failed policies can be enacted over and over again, if they benefit — or harm — the right people.


mindfulness

(1980’s | academese | “attention,” “meditation,” “being in the moment”)

“Mindfulness” is not a new word, but it has a better claim to newness than “mindful,” long in common use, meaning “well aware” (occasionally it means “considerate”). You used the word when talking about something that demanded more than ordinary attention, or caution. One was mindful of bitter past experiences, or of threats, or of risks. I remember my Sunday school teacher exhorting us to be mindful of scriptural principles. In other words, it shouldn’t just be one more thing rattling around up there, it should fill your mind (if it were a noun, you might say a mindful). It wasn’t a casual word, but a portentous one. “Mindful” may still be used that way, but “mindfulness” never is.

The term today also refers to heightened awareness, but the object of attention is that which confronts your consciousness right now. In this sense it comes straight out of Buddhism. The eightfold path prescribes mental habits; number seven is “right mindfulness.” “Mindfulness” as we understand it was probably invented by Dr. Jon Kabat-Zinn, a professor at the University of Massachusetts Medical School, who founded the Mindfulness-Based Stress Reduction program in 1979. He turned up in the news occasionally in the eighties. Another longtime proponent, Dr. Ellen Langer of Harvard, seems to have come on board a few years later; she used the expression as a book title in 1990 (the book, according to amazon.com, did not cite Kabat-Zinn, or even mention his name). To be sure, their focuses were different. Kabat-Zinn emphasized bodily relaxation through techniques very similar, if not identical, to what we used to call meditation: concentrate on breathing and sensations, keep returning to monitor them when the mind wanders, note your feelings without judgment, step back and try to experience your mind and body neutrally. Langer thought of it as more of an intellectual exercise; one method she prescribed was to “watch [television] as though [you] were someone else — a politician, or an athlete or a criminal. The point is to break through people’s assumptions with an active attention that stimulates their thinking” (quoted in the New York Times, March 4, 1986). Langer doesn’t talk about listening to your body, but both theorists emphasize conscious control of the mind and keeping your focus on what’s going on right now.

Kabat-Zinn and Langer are high-powered professors with strong reputations. I’ve written elsewhere of the spread of Eastern religions in the U.S., but it’s not just gurus and rock stars; academics have often gotten into the act. It’s hard to see Kabat-Zinn as doing anything other than smuggling an anonymous version of Buddhist thought into the scientific community, buttressed with impressive studies proclaiming its beneficial effects. Langer seems more focused on business and social relations, so the connection to the mysterious East is weaker. But it’s there either way; mindfulness involves increased attention, though practitioners may differ on whether one should concentrate stubbornly on what’s going on inside the head or outside the body. To a good mindfulist, that’s an illegitimate distinction, or it should be. The point is to connect your consciousness with the movements and rhythms of the body, and that does tend to make any chasm between the two seem imaginary.

“No pain, no gain” and “karma,” also adapted from Asian religious wisdom, have been treated much more roughly than “mindfulness”; their American definitions contradict the original meanings and have pretty much obliterated them. Perhaps because mindfulness has always been promoted by academics and other elites, it has held onto something closer to its original denotation.

Business leaders, athletes, and politicians all discourse solemnly on the benefits of mindfulness. It has become an industry, with many experts and many web sites, and it has many of the characteristics of a fad. Its popularity is starting to provoke a backlash: a recent column on salon.com decries mindfulness training in scholastic settings, and other voices depict it as a convenient way to hold students (or employees) responsible for feelings of stress that actually are imposed by their superiors. If the bosses spring for a few mindfulness classes, they don’t have to try to create a better work environment. To still others, mindfulness is just a nice word for mindlessness, as the comic strip “Pickles” had it. Turn off your critical thinking and creativity, and vegetate yourself stupid.

My thoroughly brilliant girlfriend gave me this expression to write about. Where would I be without her?


comfort food

(1980’s | journalese (gastronomy) | “home cooking,” “favorite dish”)

You could construct a good personality test by asking subjects to define this expression and list examples. Food writers use it confidently, but it has a wide range of meaning, though the gradations can be pretty subtle. The bottom lines that seem to underlie every use of the phrase: it has to be something the diner is already familiar with, and likes. Beyond that, it can go in several directions with equal confidence. Obviously, there is some overlap among the categories below, but I find the taxonomy helpful:

-What you ate when you were a small child, therefore often mushy or liquid, the kind of thing that makes you feel like you’re in Mama’s arms again. In other words, comforting food. Things like macaroni and cheese or tomato soup.

-What lovely Liz from Queens calls “white food.” Also often mushy and associated with childhood, but the point is it’s uncomplicated — bland and starchy as well as pale in color. Mashed potatoes, bananas, vanilla ice cream.

-What people eat in the country. “Comfort food” is sometimes used as a synonym for down-home dishes, and it may have a strong regional tinge. Comfort foods in the South may differ from comfort foods in the Northwest, for example (Moon Pies are not big in Seattle). Burritos in the Southwest, lobster rolls in the Northeast.

-Anything plain and unsurprising. Sometimes “comfort food” refers to things that are simple to prepare as well as eat, perhaps with the implication that it’s for family consumption rather than guests. This covers the first two above and other areas as well. Oatmeal, spaghetti, scrambled eggs.

-Heavy or at least substantial preparations; usually meat, frying, or both are involved. Meat loaf, casseroles, pot roast, burger and fries. Don’t be alarmed if the word “rib-sticking” appears nearby.

-Whatever you happen to enjoy, whatever makes you feel better for having eaten it, or makes up for a bad day. This sense of the term really opens the floodgates; now fancy gourmet concoctions can sit right beside the humblest fare. Sushi or catfish, crème brûlée or egg custard, sweetbreads or scrapple. Such broad usage may be an abuse of the term, but you hear it a fair amount.

Notable by its absence from the lists above is the noble vegetable. The more effort it requires to eat, and the less obviously sweet, salty, or fatty it is, the less likely it will qualify as comfort food (except under the last definition, where anything goes).

There are some obvious faults — in the geological sense — in the meaning of “comfort food” that help explain the multiplicity sketched above. The main one: both personal preference and social custom are part of the field covered by this expression, and neither can be disregarded. Each person has their own, to some degree, but there is usually a fairly strong consensus on what most people in the same culture would consider comfort food. If your version of it is a rice cake with a shmear of tofu, that’s your business, but don’t expect your peers to share your tastes. Another fault: Lovers of exotic cuisine may depict “comfort food” with a sneer as unworthy of an adventurous palate, but more often it operates with reverse snobbery, as the lower classes contrast their chow lovingly with the pretentious, fussy gourmet variety. I also note in passing that “comfort food” partakes of nostalgia, real or imagined, especially when it summons our childhood diet or rural eating habits. But once again, the nostalgia may be deeply personal (childhood) or sociocultural (down home). Another point of negative interest: the expression is rarely used metaphorically (e.g., calling a novel “literary comfort food” as a reviewer in the New York Times did in 1987). We have chicken soup for the soul, but comfort food fills only the belly. To round off this sequence of unrelated points, I will suggest that there is no direct connection between the rises of “comfort zone” and “comfort food,” but they occurred at the same time, and it’s quite possible the two expressions helped each other into everyday language.

My brilliant, beautiful girlfriend gave me this expression months ago, and I finally decided to take a bite out of it. Thanks, baby!


mindset

(1980’s | therapese? | “basic assumptions,” “world view,” “framework,” “preconceived notions,” “idées fixes”)

This is one of those expansive words that has grown fat with use. “Mindset” goes back to the early twentieth century, but it didn’t spread until the seventies, when according to Google Books it started to appear regularly, particularly in writing having to do with therapy and religion, or politics. Now it is used everywhere, though if LexisNexis is to be believed, it is especially popular among athletes these days, a backhanded homage to the great Yogi Berra’s observation that ninety per cent of baseball is half mental. In recent years, some therapists have tried to retake control of the word by popularizing a standoff between “fixed mindset” (belonging to those who think they can’t get any smarter than they are) and “growth mindset” (those who rejoice in breaking through their mental barriers and blocks). It’s not clear to me how reputable this Manicheanism is, but it has gained traction in the on-line community.

We must pause to define the term, which I will do with reference to authorities. In 1983, William Safire described the evolution of “mindset”: “Tendency, attitude, or inclination used to be the primary meaning, akin to frame of mind; now the primacy goes to fixed state of mind or predetermined view.” The OED highlights “established set of attitudes, esp. regarded as typical of a particular group’s social or cultural values.” Safire’s contention, which is correct in my humble view, may result from the ambiguity, not to say polyguity, of the word “set,” which means “group” or “collection,” but also means “immobile” or “deep-rooted.” It’s a list of beliefs or assumptions that causes our minds to move predictably along certain paths, or it’s just the mind set in its ways.

When athletes use the word, it usually comes closest to “(mental) approach,” the quality that allows you to concentrate on the game and bear down harder than your opponents. Your mindset may need to change, or you may have trouble keeping the right mindset on the field. This does not correspond precisely to either of the primary definitions cited above, but it is related to the “growth mindset” discussed in the first paragraph. True, “mindset” doesn’t take prepositions as readily as “approach,” but a player might “bring the right mindset to the game.” The new word certainly does not preclude all the old clichés dear to athletes for generations: focus on winning, all I care about is the team, don’t worry about things you can’t control, etc.

There is a class of expression that lies dormant for decades, even centuries, and then bursts into the vocabulary. Among examples I have covered, “holistic,” “comfort zone,” and “artisanal” come from the twentieth century, and some are older still, like “hurtful,” “ramp up,” or “overthink.” The OED cites “mindset” as early as 1909, but the word didn’t hit its stride for another sixty or seventy years after that. It seems like it ought to have come from the students of altered consciousness who had their heyday in the sixties (Timothy Leary talked about “set and setting”), but as far as I can tell its rise cannot be attributed to any particular guru, professor, or Esalenite.


binge-watching

(2010’s | “overindulging,” “spending too much time in front of the TV”)

A binge has always had something disreputable about it, and the mixture of pride and shame with which binge-watchers confess their latest debauchery proves that it still does — it’s been but a year since the Washington Post declared binge-watching socially acceptable. A word that goes back to the nineteenth century, “binge” means the same thing as “spree.” A prolonged drunk, spending too much money in a short period of time, that sort of thing. It always meant excess. People started talking about “binge eating” and “binge drinking” in the seventies and eighties, probably the first time “binge” was used as an adjective in any widespread way. There was a rough equivalent to binge-watching in my youth, but we named the actor rather than the activity: couch potato (still in use, though it need not have anything to do with television any more). Couch potatoes’ preferred verb was “view,” anyway. Some people do say “binge-viewing,” though it is less common, at least in the States.

What is this thing called “binge-watching”? One psychologist notes that all it really means is “spending a longer time than normal watching television. . . . Netflix conducted a survey in 2014 where viewers defined binge watching as viewing between two to six episodes of a show in one sitting.” The phrase does conjure up red-eyed, addled viewers losing entire weekends to the new season of their favorite Netflix series, but does prolonged viewing become “binge-watching” only when it is obviously harmful? According to my limited research, the consensus answer is no; “binge-watching” may just denote a harmless way to spend a few stray hours. But the dubious heritage of the word “binge” will make that innocuousness hard to keep up.

The earliest unmistakable instance of “binge watching” in LexisNexis comes from Australia in 2006, and it trickled into American English shortly thereafter. Before the advent of home video recording, such a thing wasn’t really possible, and it didn’t become feasible until the practice of issuing entire seasons of television programs on DVD became prevalent — archaic as that seems in the days of Netflix and Hulu and lots of hipper streaming services I’ve never heard of. In my younger days, a complete retrospective of a certain director’s films, say, might have been called a marathon, or a festival, or maybe just a complete retrospective. (You come across expressions like “Game of Thrones marathon” even today.) In the nineties, it was possible to buy complete runs of at least a few television series on VHS, but the term did not arise then. So maybe this is a millennial thing: the idea that watching hours and hours of your favorite show, and dropping everything to do it, is a worthy activity. Not that you have to be a millennial. And now, new series must be written with an eye to the preferences of binge-watchers.

When I was in college, “Wheel of Fortune” turned the 1968 song “I’m a Girl Watcher” into an advertisement for itself. Then “Baywatch” was all the rage. The act of watching seems to have become linked ever more suffocatingly with television in the seventy years we have been groveling before the tube — I guess we have to call it “the screen” now, since there’s no tube any more, unless your television set is as old as mine. After “binge-watching” settled into our vocabulary, “hate-watching” arrived as well, meaning simply “binge-watching a show you hate,” with the implication that it’s the sort of show you love to hate, at least according to one writer. Perhaps inevitably, “purge watching” has sprung up, meaning “hate-watching” with less passion, more out of a desire to get the offending show over with than to enjoy noting how awful it is. Who knows what other “watch”-words will come?
