
Lex maniac

Investigating changes in American English vocabulary over the last 40 years

lanyard

(1990’s)

You’d be surprised at how many meanings this word has — anyway, I was. They’re all related to the notion of a strap (going back to Middle French and Old High German), and that association holds true even in these latter days. A hundred years ago, its primary meanings were: a strong rope used to help secure a ship’s rigging, a cord used to fire a cannon, a strip of leather used to hold a snowshoe (for example) together, and another nautical reference, which evolved directly into today’s use of the word: the cord a sailor hung his knife on so he could carry it around his neck. “Lanyard” has a few other miscellaneous meanings as well: chinstrap (on a hat), summer camp favorite (the lanyard as craft project seems to date back at least to the middle of the twentieth century), what the referee wears a whistle on (back to the sailors again), military decoration (worn on the shoulder), part of a safety harness. Now that lanyards help construction workers secure their tools, they aren’t just for necks any more; they can attach to shoulder or wrist as well. By and large, the old meanings are still active, and probably no less common than they ever were. Around them has sprung up a new field that calls on lucrative forces of the Zeitgeist like security, fashion, and commerce, commerce, commerce. Now we have Lanyards USA, Lanyards Tomorrow, and CustomLanyard.net.

If the old meanings are still around and there aren’t any new ones, why an entry? Just my sense that the word has become vastly more popular. It may have meant a lot of things in its storied past, but it always had a specialized air about it. Nowadays, though, you hear it everywhere — everybody from lowly janitors to Super Bowl spectators wears one. And it’s being applied in ways it never was before. Now “lanyard” is what you call the cord or chain you hang your glasses on around your neck; in my youth, plenty of people wore their glasses around their necks, but not on lanyards. It was always true that the lanyard, whatever it denoted, had a strongly utilitarian cast. But not any more; lanyards still serve everyday functions, but they also should match your clothes or sparkle or advertise or say something interesting. An accessory at the very least, potentially more.

The lanyard revolution is more than anything a consequence of our efforts to keep ourselves safe, which have made us ID-happy. Why do you need a lanyard at the Super Bowl, or to get into your office? So you can display your credentials and prove that you belong there. Sure, people carry keys and other household objects around their necks, too, but your standard lanyard nowadays comes with the clear plastic ID-holder, so you never have to dig out your card to show the guard. In the seventies, members of a few professions were using lanyards for that purpose, but now almost all public employees and lots of private ones wear them as a matter of course. It’s a tacit acknowledgment that we have accepted increasing restrictions on our movements in hopes of preventing, or at least limiting, mayhem and bloodshed. Lanyards are an emblem of that loss of freedom, another component of the uniform that the wealthy, ever security-conscious (and for good reason), force on the masses. Adding insult to injury, the bosses and big money want us to regard the badge of servitude as just one more consumer good. If they succeed, we lose again.


latchkey kid

(1980’s | therapese?)

Also latchkey child, though that wording seems almost archaic now. Some sources date the expression to the 19th century, but it’s probably later. Random House assigns an origin between 1940 and 1945, and Dorothy Zietz in “Child Welfare: Principles and Methods” (Wiley, 1959) cites not only “latchkey child” but “eight-hour orphan” and “dayshift orphan” as synonyms. Zietz points to “‘emergency’ day care programs which became prominent during World War II [that] are now regarded as part of the community’s basic child welfare services,” which will come as no surprise to anyone who has ever heard of Rosie the Riveter. Nonetheless, in 2017 it is generally assumed that Generation X both invented and perfected the concept of the latchkey kid. Scattered references can be found before 1980, but the phrase really took off afterwards, which explains why Gen X gets the credit. (Full disclosure: I’m a proud member of Generation X (the older end) but was not a latchkey kid.) I can’t find any sign that “latchkey child/kid” came along before World War II, certainly not as early as the nineteenth century. It’s easy to imagine a Victorian illustration of a disconsolate waif with a key on a string or chain (not a lanyard) around her neck, but the term was not needed then because the kids were working the same hours as their parents. We still have plenty of latchkey kids, of course, but the novelty has worn off. Today, Free Range Kids carries on the tradition of advocating unsupervised time for children.

God help us, a lot of those Gen X’ers are parents now, and they indulge in the eternal practice of contrasting their kids’ experience unfavorably with their own. The Generation Next of parents proclaims that all that time with no adults in the house made them resilient and self-reliant, and maybe it did. But then why have so many turned into helicopter parents who starve their own kids of opportunities to learn how to manage without adult intervention? I suspect such generational shifts aren’t all that unusual, because parents have a commendable desire to spare their children the traumas they had to go through. But the wider tendency to bewail these kids today goes back a long time, too long and steady to be wholly unfounded. Every generation of parents sees their own experiences as definitive and notices only that which has deteriorated. The thing is, a lot of the time they’re right; standards do change, sometimes for the worse, and good parents must be especially alert to such slippages.

We associate latchkey kids with working single mothers and always have, though plenty of them have working fathers. From this has arisen a certain stigma the phrase can never seem to shake. Even today, it is used as a class marker, one of many indications of poverty, crime, substandard education, and the rest of it. Numerous studies suggest that latchkey kids don’t generally do worse than average; they share the fate of all studies that call easy explanations into question. We just know that the kids are worse off now and/or will do worse as adults; don’t try to tell us different. It is common to read nostalgic accounts of eighties childhoods, but at the time most press coverage — and there was quite a bit — was marked by dubiety. Some researchers pointed to pervasive fear among latchkey kids of emergencies they were unequipped to handle, or of intruders, or just of being all alone in an empty house. Latchkey kids may not want to relate such feelings to their parents, knowing that expressing doubt or anxiety will disappoint or irritate their hard-working elders. Then again, some kids learned to keep house, manage their time, or just watch lots of television. It’s unlikely that most parents want to leave their kids alone day in and day out, but unless the kid shows obvious ill effects, there’s no point feeling guilty over it.


hive mind

(1990’s | science fiction | “zeitgeist,” “will of the people,” “conventional wisdom,” “groupthink”)

It all started with the bees. The British apiarist H.J. Wadey probably did not invent the term, but he used it in the 1940’s to describe the process by which lots and lots of bees, each of which has next to no mental capacity on its own, work together to create an intelligence that cannot be accounted for simply by adding up the microcapacities of each bee in the colony. There was something a bit mystical about it, and that transcendent quality was picked up by other authorities on bees. From there it became the property of science fiction writers, for whom the concept was tailor-made. In their hands, it could retain the sense of a purer intelligence emerging from the collective, or it could be a means of imposing zombie conformity and obedience on the rest of us. Science fiction runs to utopia or dystopia anyway, and the hive mind can be used to exemplify both, even in the same book. The phrase had not become common outside of science-fiction circles; I doubt most Americans were familiar with it when I was young.

There the matter rested until the mid-1990’s, when the expression received the benefit of two cultural megaphones: first Kevin Kelly, founding executive editor of Wired magazine, then the film Star Trek: First Contact. Kelly saw the hive mind as the result of amplifying human capability with computers (preferably implanted) to enhance our collective intelligence and create a larger force, human yet superhuman, that would change everything for the better — although individual drones might not fare so well. A year or two later, Star Trek: First Contact came out, featuring the Borg as the villain. The Borg had appeared on Star Trek: The Next Generation (the Patrick Stewart cast, which also populated the film), but this seems to have been the first time the phrase “hive mind” ever appeared in a Star Trek script. The Wired geeks and the Star Trek geeks between them formed a critical mass, and “hive mind” emerged from the sci-fi shadows and began to be encountered much more often. The onset of social media certainly didn’t slow the spread of the phrase; here again, the concept may be beneficent or noxious.

Kelly was an optimist, positing that the computer-aided hive mind would lead to a much greater capacity to solve human problems, whereas the Borg represents the dark side, gobbling up plucky individualists and producing numbing conformity while enriching its own hive mind with the contributions of other civilizations (sounds like imperialism, or the one percent). My sense is that today the pessimists are winning; “hive mind” has become a favored grenade to toss across the political divide, as stalwarts of the right and left accuse their opponents of stupidly parroting the sentiments put forth by their respective opinion makers. On this view, the hive mind is simply an overlord to which the bad guys pledge dumb fealty. (Of course, both left and right have their share of unreasoning myrmidons, but I wonder if they may be more characteristic of the right wing. “Dittohead” is no longer fashionable, but it’s worth noting that only right-wingers called themselves “dittoheads,” often with pride.) Even if the insulting use predominates right now, the more hopeful meaning may rise again. Take UNU, for example, which promises to help us “think together” by setting up a “swarm intelligence.”

Once you get away from the notion of a literal superbrain, the metaphorical uses of the expression come quickly into view. A single brain can itself be seen as a teeming hive mind, with neurons equivalent to drones, each doing its tiny duty but producing prodigious results by subordinating itself. (A more recent issue of Wired showcases an example of this sort of analogy, which has no counterpart for the queen bee.) More generally, the hive mind may serve as a symbol of our politics, in which millions combine to create and support a unified national government. (If that idealized picture makes you snicker, you’re not alone.) Our national motto, E pluribus unum, means “out of many, one,” and that’s not a bad summary of how a hive mind works. No single individual knows everything or can do it all by herself; the nation must muddle along making the most of whatever contributions it can get from hard-working citizens, who create the polity by banding together, at least partly unconsciously, to assert a collective will.

This post was inspired by the one and only lovely Liz from Queens, who nominated “hive mind” only last week, thereby sparing me the trouble of coming up with a new expression to write about. Thanks, baby!


coffee date

(1990’s | teenagese? | “first date,” “brief encounter,” “not even a date, really”)

It’s tempting to see the rise of the phrase “coffee date” as concomitant with the rise of the gourmet coffee craze (which hasn’t abated), and the expression did become a lot more common around the time Starbucks did, from the late eighties to the mid-nineties. On-line dating services made their mark only a few years later and produced many more coffee dates, but the term existed well before that. Google Books fishes up a solid reference in Mademoiselle magazine from 1966 (not that Google Books’s dating is all that reliable). That article explained that the coffee date was the college equivalent of the Coke date. There’s no obvious origin for either phrase that I can find in my limited corpora; maybe it bubbled up from below.

College students being so mature and all, naturally they prefer coffee. But the point of the coffee date is not what you consume; it’s a probationary first meeting, which the parties use to size each other up. So it must be short, inexpensive, casual, easy to escape, and in a neutral, public place. Nothing much can happen, and that’s the point. If you hit it off, maybe a lunch date next. “Lunch date,” “dinner date,” and “movie date” are older terms — or at least they became common earlier — that imply a progression whose first step now is the coffee date.

Coffee dates have become so firmly part of the romantic how-to manual that a reaction has developed. While conventional wisdom still recommends them as sensible first meetings, certain apostates, such as this eHarmony blogger, dismiss them as old hat and unlikely to lead to serious relationships; others question whether they should be called “dates” at all. There are always doubters, but even they can’t deny that the dating landscape has changed, tilting the playing field decisively toward Starbucks.

speed dating

(late 1990’s)

An expression, and concept, with a verifiable origin. The on-line consensus — unanimous as far as I can tell — says that Rabbi Yaakov Deyo and his wife invented speed dating in 1998 as a way to encourage Jewish singles to meet each other and form relationships. It goes like this: between five and ten women sit at individual tables. The same number of men wait nearby. At a signal, each man sits down at a table and talks with the woman for eight minutes, then moves to the next table and does it again. In slightly over an hour, you meet several candidates, at least one or two of whom might be worth a follow-up. Now several national organizations sponsor speed-dating events, which may or may not have any religious, ethnic, or gender restrictions. The practice is sometimes known as “turbo-dating.”

I was struck by the ritual character of speed dating, which was after all created by a rabbi. The basics of the process don’t vary much regardless of who’s in charge: several conversations in succession, each a fixed period of time; then participants notify the organizer which live one(s) are of further interest, whereupon the organizer puts two people in touch if they appear on each other’s lists. Perhaps the level of rigor does not measure up to the detailed ritualistic instructions of the Torah, but there’s a rule-bound quality all the same. One site notes the roots of speed dating in the traditional Jewish concept of the shidduch (match), basically an arranged marriage made with the help of a middleman or -woman. At any rate, like many concepts invented by Jews, from monotheism to relativity, speed dating has spread quickly and exercised tremendous influence.
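
In fact, the process is regular enough that the organizer’s bookkeeping fits in a few lines of code. Here is a minimal sketch in Python, with invented names and an invented data layout rather than anything a real organizer uses: a round-robin rotation, then a matching step that keeps only mutual interest.

    # Bookkeeping for one speed-dating event, as described above.
    # Every name and data structure here is hypothetical, invented for
    # illustration; this reproduces no organizer's actual system.

    ROUND_MINUTES = 8

    def rotation_schedule(men, women):
        # Round-robin: in round r, man i sits with woman (i + r) mod n,
        # so everyone meets everyone on the other side exactly once.
        n = len(women)
        assert len(men) == n, "equal numbers on each side"
        return [[(men[i], women[(i + r) % n]) for i in range(n)]
                for r in range(n)]

    def mutual_matches(interest):
        # interest maps each participant to the set of people they listed;
        # a pair counts only if each appears on the other's list.
        matches = set()
        for person, picks in interest.items():
            for other in picks:
                if person in interest.get(other, set()):
                    matches.add(frozenset((person, other)))
        return sorted(tuple(sorted(pair)) for pair in matches)

    men = ["Ari", "Ben", "Chaim"]
    women = ["Dina", "Esther", "Faye"]
    rounds = rotation_schedule(men, women)
    print(len(rounds) * ROUND_MINUTES, "minutes of conversation")  # 24

    # Afterward everyone hands in a list; only Ari and Dina chose each other.
    interest = {"Ari": {"Dina"}, "Ben": {"Dina"}, "Chaim": set(),
                "Dina": {"Ari"}, "Esther": {"Ben"}, "Faye": set()}
    print(mutual_matches(interest))  # [('Ari', 'Dina')]

With eight or ten tables instead of three, the same arithmetic yields the 64 or 80 minutes of conversation, the slightly-over-an-hour figure mentioned above.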

It’s another kind of prescribed first date and so is related to the coffee date, but it’s even more circumscribed. Like a coffee date, your chances of success are low but the investment of time and energy is small, and like a coffee date, it can only arguably be called a date at all. Speed dating is distinctive because of the sheer number of people involved; if you buy the theory that most of the time we decide in a matter of seconds whether we’re attracted to someone or not, the approach makes sense. Just get a bunch of generally like-minded, well-disposed people in the same room and let nature take its course. The irony is that while speed dating looks like it was designed to deal with a glut of possibly eligible partners, it was actually invented to keep members of a relatively small, insular group from finding mates elsewhere. (Of course, in a large city like Los Angeles, where Rabbi Deyo first tried out speed dating, there are thousands of unattached Jewish adults, still an impossible number to navigate on one’s own.)

When a reader asked advice columnist Carolyn Hax her opinion of speed dating, she replied, “I liked it a whole lot better when it was called a ‘cocktail party.’” The point is well taken; speed dating is a highly regulated version of what was once known as “mingling.” You went to a party with people you didn’t know, and you went around and talked to them, allowing you to determine who might be a possible romantic interest. No timekeepers or chaperones required, and if you wanted someone else to have your number, you gave it to them. I’ve never tried speed dating, but I was never much good at mingling, so something tells me I wouldn’t be much of a speed dater, either. Both of my long-term relationships began with dates that lasted several hours, so maybe that’s just how I roll.


all in

(2010’s | militarese? | “giving one’s all,” “bound and determined”)

“All in all.” “All-in-one.” “All in the wrist.” “All in your head.” “All in the same boat.” “All in good time.” Or you could just settle for “all in,” shorn of superfluous objects and uttered with quiet conviction. It means we won’t turn back; we won’t give in. But that’s not what it meant in my childhood. Back then “all in” meant “worn out,” “exhausted.” That definition was on its way out then, and the usage we see today represents a revival, doubtless an unnecessary one. In poker, it meant “having put all one’s chips in the pot” (which makes more sense). “All in” was a bit anomalous among the many vigorous expressions for states of lassitude. Most of them are straight predicate adjectives: “beat,” “pooped,” “spent,” “wrecked.” It reminds me a little of “done in,” but literally that means “murdered,” something much stronger. The old usage (citations date back to the nineteenth century in Lighter) is mostly gone, but I believe the term is still current in poker. (Ian Crouch gives a good account of the evolution of “all in” in the New Yorker.) In the modern sense, popularized by David Petraeus’s biography (2012), it also seems related to poker somehow, but in a more positive way — a confidence in the supremacy of your hand that causes you to bet your entire stack of chips without hesitation. But “all in” doesn’t connote arrogance or unseemly displays of power so much as steely resolve or unswerving attention to the task at hand. “All in” is what you are at the beginning of the day; it used to be what you are at the end of the day.

Theoretically it ought to be possible to be “all in” squared — bent on reaching the goal AND too tired to go on. But the effort required to maintain such commitment precludes helplessness born of weariness. Being all in implies that you have enough energy to figure out and make the next move, or enough force of will to overcome the newest obstacle. The other verb that precedes the expression is “go,” which reminds us of how closely it resembles “go all out,” a phrase much beloved of sports announcers in my youth. I don’t listen to play-by-play as much as I used to, but I have the impression we don’t hear “go all out” much any more.

“All” in itself implies a group, so “all in” should suggest effort toward a common goal, as in “we’re all in this together.” It may, but it doesn’t have to. It is possible to go all in on your own private project, but it might sound a little odd. When politicians and military people use it, there’s at least a hint of pulling together. That assumption of camaraderie is made explicit in what may prove to be yet another new meaning for the expression. Penn State University’s “All In” initiative provides an example, the motto being “A Commitment to Diversity and Inclusion.” Here the term is used very self-consciously to express the ideal of a tolerant, easy-going community. Donald Trump’s ascendance has given this sort of communitarianism a boost, and so I suspect we may see the expression used this way more and more. Keep your eyes peeled; “all in” may shed its skin yet again.


blended family

(1980’s | therapese | “stepfamily”)

Contested terrain semantically, as in other, more obvious, ways. Start with the definition. Nowadays, most people would probably endorse a relatively loose definition of “blended family”: any family formed when an adult with one or more children takes up with a different adult, who may or may not have children. If you’re a purist, you might require that both adults have at least one child. In 1983, a writer defined it thus: “pop-psychology euphemism for members of two broken families living under the same roof, a mixture of step-parents, step-children and step-siblings.” Ten years before that, a psychology textbook defined it as a “family consisting of a husband and a wife, the children of either or both from a previous marriage, and children of the present marriage.” The new spouses had to have kids together, not just with former partners. The extra distinctions may have been made possible by a wider panoply of related terms than we can remember now. A surprisingly large amount of vocabulary sprang up around such filial configurations; in 1980, the New York Times propounded the following list: “conjugal continuation, second-marriage family, stepfamily, blended family, reconstituted family and metafamily.” (It missed “merged family,” also in use by 1980. “Mixed family” means that the parents are of different race, ethnicity, or religion.) Of these, only “stepfamily” would be familiar to most people in 2017, but Wikipedia distinguishes between stepfamilies (only one adult has a pre-existing kid) and blended families (both adults). According to the OED, “stepfamily” goes back to the 19th century; the earliest citation I found for “blended family” dated from 1964.

Why did “blended family” win out? Probably the usual mixture of euphony and accuracy, or intuitiveness. Most of us understood pretty quickly what it meant the first time we heard it in context, and it sounds good — not too long, not too short, scans nicely. “Second-marriage family” is clunky; “metafamily” is jargony and doesn’t make a whole lot of sense anyway. “Blended family” sounds a lot better than “reconstituted family” (just add water!), you have to admit. The only mystery: why didn’t “merged family” catch on?

We like to think that the quirks and foibles of our own generation are unprecedented, but blended families are hardly new. My father’s father grew up in one after his mother divorced his father and married her second husband. My mother’s mother was the daughter of a second marriage, an old widower and a young wife. Life expectancy was lower then, so remarriages were more often occasioned by death than divorce. Was there a decline in the number of blended families for a generation or two, long enough to forget how common such arrangements used to be? If so, the phenomenon has come roaring back. Somehow, before 1970 or so, we got along without a general term for it. Now we’ll never get rid of this one.

There may have been earlier examples on television, but “The Brady Bunch” was the first show to feature a blended family week after week, thus perhaps making the whole idea seem more wholesome. It is doubtful that the sitcom had much effect in its time, given its poor ratings and reviews, but pop-culture observers agree that it had a long and powerful afterlife among those of a certain age (mine), for whom the Brady Bunch is part of a comforting nostalgic penumbra (accent on “numb”). Several shows about different varieties of blended family have succeeded Mike and Carol and Sam* and Alice: “Full House,” “Step by Step,” “Modern Family.” The Bradys anticipated a trend; their descendants follow along behind, trying to catch up to everyday life. The Stepfamily Foundation started life in 1977; support groups and talks at the local library aimed at blended families seem to have arisen in the eighties, when the requisite self-help books also began to appear. New terms must surely arise to reflect new conditions, but the rule is that only one or two out of a larger number will make it to the next generation and a shot at immortality.

* The butcher. Remember?


victory lap

(1980’s | athletese | “bow,” “youthful exuberance,” “rubbing it in”)

I’m not quite sure when the custom originated of taking a victory lap after a race. The first instances of the phrase I’ve found turn up in the context of auto racing in the fifties and sixties, but runners have probably been taking them at least that long. (Victory laps are reserved for runners and drivers; horses are spared.) For all I know the Lacedaemonians or Athenians made an extra circuit of the stadium after trouncing the other, and being Greek, they must have had a word for it. The purpose of the act seems simple enough: it gives the adrenaline a little time to subside and the athlete a little time to soak up adulation. As late as 1980, the expression was restricted to track, racing, or occasionally non-athletic contests, like the Miss America pageant, already a small deviation from the literal.

In its figurative sense, the term is used most often by political journalists, though not exclusively; musicians and fashion designers may take victory laps too, for example. (In student slang, a “victory lap” refers to a year of high school or college beyond the usual four.) The first specifically political use I found appeared in the Washington Post, describing Reagan’s meetings with assorted government officials after he won his first term in 1980 (he pledged “cooperation with all,” as new presidents customarily do). Non-racing athletes also rated the term around the same time; I was somewhat startled to discover that as early as 1983 Reds’ catcher Johnny Bench’s last season was referred to as a “victory lap.” When a well-known athlete announces retirement far enough in advance, he may reap respectful send-offs at opponents’ stadiums as well as his own. Sometimes it’s called a “victory tour” to give the whole exercise a grander sound; either way, it’s all about adoring crowds, which is what politicians are after, too. Even today, “victory lap” denotes the acts of elected officials more often than not. As far as I know, neither man used the term, but the post-election travels of both Obama and Trump were widely described as “victory laps”: Trump’s thank-you tour and Obama’s last round of visits to European capitals. In the latter case, the phrase didn’t evoke any particular triumph so much as a sense that it was Obama’s last chance to talk up his achievements.

The rise of this expression in political journalism has given it an unsavory connotation. Victory laps used to be joyful celebrations, perhaps not always spontaneous, but at least a moment of innocent exultation shared by athlete and audience. A certain amount of self-congratulation was involved, to be sure. But the politician’s victory lap generally has more to do with exaggerating an achievement or rubbing salt in the wounds of the defeated. It is a thoroughly calculated gesture, at worst malicious and at best indulged in purely for its own sake. Politicians are forever being taken to task for taking crass advantage of such opportunities for self-promotion, either because the victory is illusory or because the victor is crude and ungracious. That tendency hasn’t changed and seems unlikely to.


in denial

(1980’s | therapese | “hiding one’s head in the sand”)

My guess is we owe today’s prominence of “denial” in psychological lingo to Elisabeth Kübler-Ross’s stages of grief. I doubt we would have “in denial” without the predecessor; the phrase as we use it now didn’t turn up before 1970 anywhere I looked. The term and associated concept — refusing to believe that which is clear to others, as by failing to acknowledge an emotional or psychological state, or even sheer physical reality — were already in existence, but Kübler-Ross’s “On Death and Dying” (1969) was very influential; one of its effects was to make the experience of denial common to nearly everyone. Not long after, the term became popular among counselors of alcoholics and other drug addicts who refused to admit they had a problem. “In denial” may be merely a compressed version of “in a state of denial.” It appears to be the most common phrase descended from “denial,” but not the only one; Pam Tillis hit the country charts in 1993 with a song about Cleopatra, Queen of Denial (though I’m pretty sure the redoubtable Rev. Billy C. Wirtz had used the joke before then).

“In denial” has been in use for a long time in other contexts, but the grammar is new. Now the phrase is most common as a predicate complement (e.g., “You’re in denial.”), possibly followed by “about,” but not “of.” In the old days, when it followed a verb it had to be active (e.g., “result in denial” or “engage in denial”). Of course, it appeared everywhere in legal prose (e.g., “in denial of the motion”), and it started to bob up in political contexts in the eighties, particularly around the time the Iran-Contra revelations were unraveling Reagan’s second term. It was kinder to say Reagan was in denial than to contend that he really didn’t know what was going on. Maybe this is one of the many terms Reagan helped into the language directly or indirectly, or maybe it would have happened anyway. By 1990 it had made its mark, though ace sportswriter Thomas Boswell put it in quotation marks as late as that spring. No surprise that it became popular — it’s compact and it packs a punch. The expression conjures a state of passive malignity or dangerous indifference, willful or not; like “passive-aggressive,” it’s always an insult.

Now “in denial” is entirely standard, eligible to be adapted to all sorts of uses, including humor, irony, and wordplay. (Here’s a bouquet of suggestions for compilers of rhyming dictionaries: “infantile,” “spin the dial,” “undefiled,” “linden aisle.”) I haven’t heard “SO in denial” or “in deep denial,” but I don’t get around much; both certainly lie within the universe of possible utterances. Or “Live in denial,” which may also be heard as “living denial” (as in “Girl, you are just living denial 24/7”). “Oh, he’s such an old in-denial crocodile” could be the next catch phrase. “Hit denial on the head” might be a self-help slogan, meaning something like overcoming obliviousness and seeing the world without illusions. Why not “The In Denial 500,” which pits the nation’s most noxiously clueless bachelors against each other to see who can act the most idiotic? For you tongue-twister fans out there, it’s not much, but it’s the best I can do: Say “undeniably in denial” five times fast.


unclear on the concept

(1990’s | journalese (comics) | “missing the point,” “failing to grasp the situation,” “obtuse”)

Such a mild way to call someone an idiot, this expression imputes a genuine doltishness, a stark inability to understand the simplest processes or cause-and-effect relationships, which is strange, because on the surface it doesn’t sound so condemnatory. “Unclear” suggests nothing worse than temporary befuddlement, and “concept” lends a faintly professorial air to the whole affair. The phrase is not intuitive, but instantly understandable; one does not ask, “What concept?” Its rhythm and alliteration give it a rough poetry that has helped perpetuate it in our ears.

Not many new expressions have an easily identified source, but this one does: cartoonist Joe Martin, creator of “Mr. Boffo,” a comic strip that started showing up in the late eighties and had a running gag titled “people unclear on the concept,” illustrated each time by some species of magnificent cluelessness (examples here). Neither LexisNexis nor Google Books shows any instances of the phrase before Mr. Boffo came on the scene, but afterwards it popped up all over the place, so I think Martin gets credit for it. Only a select few expressions — factoid, glass ceiling, hot-button, irrational exuberance, tiger mother, trophy wife are all that come to mind — can be safely credited to a specific person. This one hasn’t become as widespread as it might have, but it turns up in numerous contexts: politics, computers, sports, arts, you name it. A notable feature: It’s most unusual to call oneself “unclear on the concept”; we use it about other people. The need for such an expression is obvious, as our fellow human beings continue to plumb new depths of stupidity. Our habit of believing the worst of those we disagree with, which I noted recently, has created fertile ground for any locution that helps call attention to the treacherous folly of others, sedulously contrasted with our own forbearance and rectitude.

There seems to be a sub-category of new expressions that deal, like this one, with varieties of denseness. “D’oh” starts at home — it’s what you say when you’re hoist by your own petard — but most of these expressions rely on the “dumbth” (a word coined by the, alas, largely forgotten Steve Allen) of others. “Hello?!” and “didn’t get the memo” tend to cover small-scale, interpersonal situations; “dumb down” and “special needs” have a larger, more civic role. It is an oddly heterogeneous group, ranging from personal to political, from desirable to pejorative, from jocular to confrontational. We forget sometimes how complex a concept stupidity can be.


blank on

(1990’s | journalese (arts? politics?) | “forget (temporarily),” “(have it and) lose it”)

I’m not very rigorous about it, but in everyday conversation I try to avoid using the kind of new expressions I write about here, just as I try to avoid them in posts except to refer to them directly. But this one is an exception, and I catch myself using it fairly often. It has a host of predecessors. Probably descended directly from “draw a blank on” (be unable to remember whatever it is), it also recalls “blank look” and “let your mind go blank” (or the more involuntary “my mind is a blank”). The phrase implies a temporary but vertiginous mnemonic malfunction, a moment of vacuity that may lead to a deer-in-the-headlights look. A related verb is “blank out” in its intransitive sense, though that may cover a longer time span. “Blank on” means to forget something and then recover it, a short-term lapse, more like a senior moment. It may also mean, on occasion, “fail to respond.” (“Shooting blanks” means something entirely different. “To blank” in sports lingo normally refers to holding the opposing team scoreless. Then there’s that charming if now unnecessary euphemism, “blankety-blank.” It is one of those linguistic oddities that “blank,” descended from the French word meaning “white,” looks and sounds much more like “black.”) “Blank on” has so many ancestors that some don’t even involve the word “blank”; doesn’t the phrase “(totally) blanked on it” remind you of “bank on it”? I continue to maintain, without proof, that such phonological resemblances influence new entries into the language.

One does hear occasional variations in meaning when this expression is used, but they never seem to catch on or persist. I saw this sentence recently in a food column in the Dayton Daily News: “Pasta is always a conundrum as a side dish. I want to pencil it into my weekly meal plan, but then I blank on how to sauce it: Cream? Tomato? Lots of cheese?” Here the emphasis falls on inability to choose among alternatives rather than failing to remember them. This usage may prove a solitary exception to the rule, but the contretemps is one we find ourselves in often enough that another word for it may be welcome.

The verb really did not exist before 1980, as far as I can tell. It started to turn up occasionally afterwards; in one of the first uses I found, Reagan was the subject of the verb, and this may be yet another expression to which his presidency gave a boost, on the strength of his well-known absent-mindedness rather than policy initiatives. It had entered the language pretty definitively by 1995, often used by politicians and press secretaries, but actors also use it a lot. During the latest presidential campaign, it quickly became the standard verb to denote Libertarian candidate Gary Johnson’s inability to address the significance of Aleppo. As is often the case when a new phrase resembles an old one, or several old ones, the trail into everyday language is not well-blazed and it may be impossible to determine, even in retrospect, how it wormed its way in.
