Lex maniac

Investigating changes in American English vocabulary over the last 50 years

it’s all good

(1990’s | African-American? | “everything’s fine OR cool,” “it’s o.k.,” “all better now”)

Sometimes it seems that the vocabulary of satisfaction has turned over completely since I was young. It hasn’t, of course, but it has added some heavy hitters to the lineup, and this is one. (“I know, right?” is another.) I assign it as a synonym to “everything’s fine,” and that comes close, but “it’s all good” brings a bit more enthusiasm to bear.

The phrase barreled into the American language during the nineties; some sources suggest an African-American origin. In 2000, it made the Banished Words List of overused new expressions. Here are a couple of examples from the previous decade that are precursors if not early sightings:

-in a 1988 review of a war memoir: “It’s all good, but the infantry assault and the glasshouse [military detention barracks] inhumanities are the high spots.” Here the writer means, pretty clearly, that the book as a whole is good.

-from a Sierra Club critique of Republican environmental policy (1988): “It’s all good, but they didn’t say how they are going to achieve the goals.” They are saying the right things but making no provision for accomplishing any of them. (Now it’s hard to imagine a Sierra Club official reacting to Republican policy with anything other than horror.)

Neither seems clear-cut to me. I suspect “it’s all good” results from a shortening of phrases like “it’s all good fun/stuff/news” or, less likely, from “it’s all good for the cause/country/planet.” In these instances, the antecedent would be clear, established in recent statements. Without the extra words after “good,” “it” loses specificity and becomes vague and (if you’re lucky) universal: I’m in a good frame of mind and there’s nothing to complain about. “It’s all good” often acts as a response to a question or apology indicating that the speaker harbors no ill will, analogous to the Australian “no worries,” now popular in the U.S. (Cf. “you’re good” — which kids use to mean “I accept your apology” or “no need to apologize” — and “I’m good,” which is pretty close to “it’s all good.”)

The expression is trotted out now in many different contexts; it has made its mark firmly on our language. Its primary quality is reassurance, even nonchalance, though it has an ironic side that implies that assent to the situation is coerced and all is not well. Still, we are to understand that the speaker, if not entirely pleased, is on board and will not make trouble down the road. Sometimes, like “in a good place,” it is a way to say a celebrity has survived detox. And sometimes it is used almost as a benediction, an “amen.” And why not? This simple sentence packs a lot of benignity in its short span.

There are many examples of this week’s entry in popular art and culture. I’ll cite only one, Bob Dylan’s song “It’s All Good” (2009), a sustained example of the ironic use mentioned above. Each stanza relates more and more serious misdeeds and injustices, then closes with the title phrase, brutally papering over the suffering and loss of the victims. “It’s all good” may be misused to obscure abuses of power, whether between two people or across whole societies.

forever (adj.)

(1990’s | “permanent,” “interminable,” “endless,” “unstoppable”)

“Forever” has long done time as a noun, an adverb (“forever young”), even an interjection (“forever and ever, amen”). What was left? Adjective. And it is coming to pass, led by three expressions detailed below. We shall see whether we can make a verb of it.

“Forever war,” familiar to anyone who has been following the news lately, apparently got its start as the title of Joe Haldeman’s science fiction novel (1974). According to LexisNexis, it took more than twenty years for the expression to gain currency in political commentary; it started appearing in the aughts, the decade in which we launched two prolonged, costly, unsuccessful wars in the name of a third, the war on terrorism. Its recent popularity, owed largely to Joe Biden, is spawning spinoffs; Eric Alterman gave us “forever warriors” and “forever nonsense” in the title of a recent column.

“Forever family” is first spotted in LexisNexis in the late eighties, attributed to foster children hoping to land in a stable environment. Here it has a wistful, aspirational sound, softened further by its connection with children in difficult straits. “Forever home,” which seems to have trailed it by a few years, is very similar, used for both children and pets who would benefit from adoption. Recently it has taken on another meaning, analogous to the old expression “dream house” — where a family intends to settle down. In the early seventies, Lady Bird Johnson used “forever home” to mean “childhood home,” not a particular dwelling so much as the place or region one can always go back to, a perfectly logical interpretation that has not stood the test of time.

Those three are established in everyday language. So far, “forever” hasn’t adopted many other nouns. The term “forever chemicals” (in polluted groundwater) seems to be spreading slowly, like the chemicals themselves. I’ve seen “forever prisoners” and “forever commitment.” The Forever Project in New Zealand devotes itself to mitigating the effects of climate change. The Forever Purge, a film about a white supremacist uprising, has done well at the box office this year. The adjective seems poised for greater things as we tremble on the verge of a forever pandemic.

“Forever” has a strong religious echo, yet earnest teenagers use it all the time, too (as in “BFF”). The word may at times denote the full span of eternity, but more often we use it to mean “as long as you or I live.” In “forever war,” it doesn’t even mean that — more like “taking an unreasonably long time to end.”

Lex Maniac has worked a whimsical vein lately, so here are more things “forever” could modify beyond death and taxes: beta version (I’m looking at you, Google), interim coach or other official (sometimes they hang around for a while), speech, movie, line, or wait (it works better in front of one word than in front of several). Then there are more serious possibilities: friend, pension, budget deficit, shortage. Some things do last forever, or come so close they might as well.

baked in

(2000’s | financese? | “built in”)

In a literal sense, we use “baked in” to refer to an ingredient incorporated before cooking, meaning that it is inseparable from the other ingredients and inextricable from the dish as a whole (as “baked-in flavor”). When we use it figuratively, it means something more like “inevitable.” It seems to have originated among financial types in the seventies and eighties (LexisNexis records the great banker Walter Wriston dropping it in 1979), generally in the form “(already) baked in the cake,” i.e., predetermined because macroeconomic conditions now in place (not necessarily because we planned it that way) must result in certain consequences no matter what we do.

Nowadays “baked in” retains that air of inevitability, but an alternate connotation has arisen: there from the start (a sense also inherent in the literal use). It is unreasonable to expect to get rid of it because it was always there, and everything around it has changed to reflect its presence. Any strongly held tenet of a political stance, a social movement, a scientific process, or a hunch can be baked in — and the phrase is still used often to talk about markets and marketing. Take a sentence like “In America, racism is baked in,” a proposition obvious to anyone with a glancing knowledge of our history. It was there from the beginning, it’s impossible (so far) to get rid of, and it continues to loom over contemporary politics and events.

“Baked in” doesn’t have to refer to a flaw, but it usually does. Here are two in the same ballpark: “hard-wired” and “overdetermined.” They are all generally used to explain after the fact why something happened and to tell us we should have seen it coming. The relation to the older “built in” — which “baked in” has not to date displaced — is obvious; the connection to “half-baked” is more subtle. “Steeped in” is another old culinary metaphor that works the same crowd.

Even now, “baked in” usually comes after the (linking) verb and spends little time acting as a verb itself. (You do see it occasionally, especially among techie writers.) It doesn’t often act as an adjective before a noun, either, but it could. It seems noteworthy that it is much more common in the passive voice than the active, a significant trait that may change over time. Poetic justice favors its use in discussions of climate change, but that turn does not seem to have been fully taken.

The descent from “baked in the cake” to “baked in” reminds us how many new expressions arrive at their final form simply by having pieces lopped off, usually at the end. An elaboration deemed necessary when an expression sounds new and daring grows tiresome over time, and we retreat gratefully to the shortened version. As with “lean in” and others, the process has yielded a new phrasal verb, or rather made an old one more common, operating over a much wider field.

Lovely Liz from Queens has ventured “baked in” more than once, which means she considers it a good candidate for the blog. Dead-on as usual, ba-bee! I know you will set me right if I have erred.

compassion fatigue

(1980’s | journalese?)

One way to sum up compassion fatigue is “from empathy to apathy.” That is, it results from exerting so much effort to care for others that one gets worn out and no longer has strength or patience to help. Some writers, in fact, prefer “empathy fatigue.” Another way is to think of it as a special case of burnout, although some would distinguish the two. And another: a stress disorder that you get from other people’s traumas rather than your own; “secondary traumatic stress” is another synonym.

If you Google “compassion fatigue,” you will get the impression that it is the sole province of health care workers — or more generally those whose job it is to help others — and properly considered at an individual level. That is, an exhausted and overworked nurse or caregiver is afflicted with it, and the patients bear the consequences. Originally, however, compassion fatigue occurred on a national level. The phrase appears first in LexisNexis in December 1980, thanks to Senator Alan Simpson, who was talking about allowing beleaguered foreigners to resettle in the U.S. Americans did not want to accommodate them, according to Simpson, because of “compassion fatigue.” As late as 2000, that was the primary connotation when the phrase occurred in the press. It is true that you don’t get national compassion fatigue without lots and lots of individuals with compassion fatigue. Yet the scale of the phenomenon is clearly different. In the first instance, you’re talking about, at most, direct effects on a few dozen people. In the second, it’s in the millions.

Senator Simpson may have given the phrase its final push into prominence, but it certainly predates his use of it. There is some on-line evidence that Norman Cousins, editor and leading light of the Saturday Review, invented the expression in the context of foreign aid. In medicine and psychology, Carla Joinson (1992) and C.R. Figley (1995) are often credited with steering the phrase into new fields. (Not only did Figley help popularize the term, he seems to have originated the idea of understanding it as a stress disorder.)

The treatment for personal compassion fatigue relies on two concepts that Lex Maniac has covered, self-care and me time. In order to refresh your empathy, it is necessary to take a break, meet your own needs, and do things because you want to do them, not because someone else is making you. Experts often advise that compassion fatigue results from an inadequate self-care regimen (yes, regimen), and me time is just one component of self-care. There is no cure for mass compassion fatigue, but when times are flush and we need lots of imported workers to keep things going, Americans may get more liberal about immigration.

I don’t think it’s gotten there yet, but “fatigue” is a suffix ripe for spreading. “Donor fatigue” is one example; it widens the field by linking fatigue with persons rather than qualities. Let’s widen it some more. Q. “Why were you late to the office?” A. “Commuting fatigue.” (A much larger problem now than a couple of years ago, or maybe it’s just more openly discussed.) Students might develop exam or term paper fatigue. Most of us have a bad case of politics fatigue these days. You name it, if you’re sick and tired of it, or have used it up, tack on “fatigue,” et voilà: a fun new phrase is born.

landline

(1990’s | “home phone”)

An old term in telephony, “landline” achieved its present status in the nineties, during the dawn of the cell phone era. Suddenly we needed a way to distinguish our home phone, which had up to then been known simply as a “phone,” from our portable phone, which has several different names (cellular, mobile, portable in French, Handy in German). That momentous shift forced this sleepy engineers’ term into prominence. The word is used less often because fewer and fewer people have landlines now, but they haven’t disappeared, and the expression will remain in our vocabulary for another couple of generations, at least.

There is a corresponding shift in denotation, of course. Landline(s) used to refer to cables and wires, not the set connected to them that lives in our houses, or the ten-digit number that goes with it. The evolution is so natural as to seem inevitable, a classic metonymy. “Home phone” was already distinguished from “business (work) phone,” so it wasn’t well-suited to serve as an antonym for “cell phone” — never “cell line,” never “land phone.” The split is strange, but maybe it reflects how quickly and unquestioningly we adopted cell phones and the terminology that came with them. It is almost axiomatic that the change from the telephone as something that sits in one place to something you carry around with you and use whenever you want is fundamental, epoch-making. That’s true especially if you get stuck with a flat tire in a remote place, or break your leg on a wilderness hike, but in more general and comprehensive ways as well. The cell phone revolution, followed immediately by the smartphone revolution, has forced dramatic and relatively sudden changes in how we manage and conduct work, leisure, politics, social life, family relations — everything. Now that we are content to have smartphones run our lives, it’s hard to remember how different it all was.

“Landline” must carry cultural baggage, too, due to an ever-strengthening association with organizations and old people, representing stodginess or its friendlier cousin stability. Those under forty generally don’t have landlines because they are superfluous. I keep mine partly because it transmits sound more accurately than any cell phone I am likely to have, and my hearing isn’t getting any better. Also because I find the stationary telephone comforting, even natural; I still plan my communications sometimes as if landlines were all we had, though I know there are options in these latter days, and I have access to several of them. (When I was a kid, the only way you could carry the phone from one room to the next was if you had a really long cord; now people walk for miles pursuing animated conversations.) But I also know that some day my beloved landlines will disappear, as the fiber-optic cable ages and becomes more trouble to maintain, and nevermore will we see the phone plugged into the wall — except when the battery is low.

proof of concept

(1970’s | enginese? militarese? | “practical demonstration”)

It’s neither proof nor concept, really. “Concept,” a philosopher’s term, is slumming here, donning the mantle of any crassly marketable consumer good. “Proof” — the bottom line for everyone, as Paul Simon said — is replaced by “we’re pretty sure it will work,” or “we got it to work once.” Proof of concept is an early milestone in the process of research and development; the expression has become essential in both technological and financial circles. It is a rudimentary demonstration of the viability or feasibility — terms that sound more ordinary and less jargony than they did — of a new idea or application of an idea, whether product, technology, or government program. Trying to answer the simplest of questions: Can it get off the ground?

There are a couple of related expressions. An obvious one is “prototype,” but that refers to something more advanced, a working model that approximates the mechanics and processes that cause your thing to go. Proof of concept merely establishes that what you want to do is possible, leaving many fine points unexplored. A prototype has to answer more questions. “Minimum viable product,” which I have alluded to before, goes beyond the prototype; no longer a model, it is the thing itself in its simplest form.

I don’t know if they were meaningfully related or not, but “proof of concept” reminds me of “proof of purchase.” By the mid-1970’s, the latter was quite common, having replaced “box tops” in consumer vocabulary as your means of earning a reward for buying enough of a particular product. Such lagniappes still exist, but they are less built into the system than they used to be. “Proof of” has a long history, and plenty of words may follow it. But I have a feeling that “purchase” was its most frequently heard adjunct at the crucial moment when “proof of concept” was formed.

The phrase dates from 1967, says Wikipedia. Executives and experts in military and technological fields tossed it around some back in the seventies. Available as a noun or adjective even then, it now takes an indefinite article and has become widespread, although still mostly associated with new technology and the wherewithal to develop it. But it’s used in other contexts and may creep into more.

Once again, my old buddy Charles comes through with a solid new expression, though not the one he thought he was supplying. If Charles says it, it must be good.

rabbit hole

(journalese (politics) | “cloud-cuckoo land”)

For all that Lewis Carroll gave us the rabbit hole, we use it today in more Kafkaesque terms. By definition, rabbit holes bring us to strange places where normal rules and conventions are suspended, just as Carroll imagined it in the beginning of Alice in Wonderland. But the term has taken on a wider sense, applicable to any confusing or indefinite process or any place where weird, unaccountable things happen. And unpleasant. A rabbit hole is a bad place to be, where the goings-on not only defy reason but have unwanted consequences. These are not Carroll’s “wonders.” When you go down a rabbit hole, you may know what comes next. You’ll just wish it hadn’t happened.

By now, we are prone to think of rabbit holes in political terms, specifically as a metaphor for the sort of conspiracy-minded interpretation of current events that apparently continues to grip ever more of us. Here the emphasis is on descent into folly, as eager cultists latch onto wilder and wilder allegations and hang on like pit bulls. As of 2021, “rabbit hole” can be used to denote many other things, but the socio-political usage has been gaining ground, and the phrase may bind itself closer and more irrevocably to QAnon and its ilk. Carroll would not be pleased; Kafka would not be surprised.

While “rabbit hole” has become one more verbal grenade in our political wars,* the phrase still implies a greater-than-usual departure from reality. Those who go down rabbit holes are not guilty of ordinary stubbornness or reasonable skepticism; they believe more or less fervently in preposterous propositions, patently absurd on their face, such as QAnon conspiracy “theories” — the word dignifies them; they are ravings, or merely vile jokes. Such beliefs may take root even in good minds, and when they do, they are tenacious, for we must believe hardest that which is hardest to believe. There’s no way around it. The less evidence for your particular notion there is, the harder you have to work to maintain faith in it. When there’s no evidence at all, it takes that much more effort to keep the lodestar in place — and when we put in that kind of effort, we don’t want to admit we’re wrong.

It seems to me that a phrase like this ought to come from hunting: going down the rabbit hole like a dog in hot pursuit of prey. And maybe there’s an echo of that among followers of QAnon, haring off down every new trail, amassing “clues” and piecing them together with the tenacity of a Sherlock Holmes. But all the uses of it I’ve found look back to Carroll. The phrase has been with us for a long time, but it grows more frequent by the decade as rabbit holes increase in number like unto rabbits. I fear we will hear it more and more.

* Depressingly long bonus list of terms that form part of the vocabulary of political abuse, as covered by Lex Maniac: aspirational, cancel culture, cognitive dissonance, death spiral, deplorables, dog whistle, feeding frenzy, first-world problem, flyover country, gotcha, hive mind, junk science, nothing-burger, out of the loop, politically correct, race to the bottom, snowflake, stay in your lane, stick a fork in him, he’s done, throw under the bus, trickle down, virtue signaling, wonk, word salad

note to self

(1990’s | journalese? | “mental note”)

You tied a string around your finger in order to remember something important, and it worked because every time you looked at your finger, it reminded you of whatever it was — generally a specific task. In truth, you didn’t really see people walking around with strings around their fingers cutting off circulation, but the concept was still current in my youth, if only figuratively, at least among older people. Now it has gone the way of so many colorful expressions of our forebears.

Never fear, we rustled up a verbal equivalent. “Note to self” appears to be a Briticism; the first hit in LexisNexis in the U.S. dates from 1994, by which time the phrase had become familiar in the U.K. and Canada. From the start, it was the property of lifestyle columnists, book reviewers, and the like — writers whose stock in trade is hipness. The phrase had and retains a humorous or ironic quality, making it easy prey for comedians. Most of all, it is rueful; if everything goes fine, there’s no need for a note to self. Normally the phrase introduces an observation that is either obvious or has recently impressed itself upon the speaker, with the implication that the reminder should not have been necessary. When it is used to set up a more far-fetched piece of advice, it adds an extra layer of irony. It is nearly always used to introduce a sentence, though it may appear on its own, as a response to oneself or to another speaker.

“Note to self” is commonly associated with recording one’s own voice, as with a dictaphone or mp3 recorder; picture a high-powered executive muttering a reminder to acquire a smaller company or buy chocolate for the spouse. That scene owes its ancestry in turn to spy movies and their parodies. (I don’t think Maxwell Smart ever said “note to self,” but it’s easy to imagine him doing it.) Yet the voice recorder was never essential even in early uses of the expression, which has always been available as a purely written flourish. Because the phrase prods us toward useful accomplishments, or at least away from pitfalls — no one ever says, “Note to self: alienate friends and ruin life” — it forms a twig on the great tree of self-improvement and self-help, which has overshadowed American culture. The note to self is both an admission that one continues to need to do better and a path to the goal. The phrase met quick acceptance partly because it fit in readily with one of our national obsessions.

We owe this week’s expression to lovely Lenny from Houston, who impressed it on my ear several years ago. It has been rattling around all this time, surprisingly long considering how fruitful it is.

metric

(1990’s | businese | “statistic,” “number,” “yardstick,” “standard,” “scale,” “category”)

Remember the metric system? You’re alone. Except for two-liter soda bottles, ordinary Americans have remained safely cocooned from dreaded decimal measurements. Well, gather ’round, kids. When I was a boy, we actually learned the metric system in school. We haven’t had much occasion to use it since, but we added the word to our active vocabulary, and we have an entirely new use for it: now a noun, omnipresent wherever numbers are crunched. At first I thought it was simple, but it has a sneaky range to it, sewing up several different meanings (see above) in one short, easily uttered word. It is successful because it covers a lot of ground.

The word was not commonly used as a noun before 1990; some time during that decade today’s meaning arose, or at least began occurring far more frequently, in the business press. Metrics were the glue that held all the other employee relations buzzwords together, because there have to be official, defined standards for everything that you might be judged on. All of which means that decisions about what to metric-ize have both local and wider political significance; DEI metrics are very fashionable this year. Any metric is only as good as the intelligence of its designers, and as reliable as the data collected.

“Performance metric” has a more specific and sinister application. A performance metric tells the boss whether the workers are productive — in other words, whether to crack down on them or not. The purpose of performance metrics is not to reward good employees but to find reasons to get rid of the ones you don’t like. The more detailed and thoroughgoing your data, the more intrusive you have to be to get it. There are also metrics to judge the performance of executives, of course, but management may choose not to pay attention to those.

They are not identical, but the businese use of “metric” reminds me a lot of “analytics” as used in baseball: roughly, making decisions about roster, strategy, etc. based on statistical analysis, correlation, and calculation. Both terms strain in the same direction, toward data-driven policy and action. While “analytics” has become the word for the entire discipline as practiced in major league front offices, “metrics” has not become a separate study — although figuring out what to measure and how to combine your measurements is its own science. The word could take on wider responsibilities, but so far it hasn’t.

Final note: There’s a significant distinction between a system of measurement and a unit of measurement. Metrics are units, never the systems that bind them together. They provide the details that make up the big picture, with each metric adding its mite to the common store. In the case of “metric system,” the most common adjectival use, the word encompasses rather than distinguishes.

This week’s expression comes to you courtesy of Marc from Palo Alto, a first-time contributor, although he didn’t know he was contributing. Lex Maniac takes ’em where he can get ’em.

mashup

(1990’s | journalese? | “collage,” “medley,” “pastiche”)

It all started with food. The idea of turning potatoes, or turnips, or whatever, into mush, to make them easier to eat or just because you like them better that way, forms the root of this week’s expression. Once we mastered the art of mashing up one item, an adventurous chef decided to combine two items, such as potatoes and celeriac in the manner of Lovely Liz from Queens. That’s a closer analogue to a mashup in music — combining two or more existing songs in an interesting way. The splicing may be done vertically (one track superimposed on another) or horizontally (placing snippets from different songs one after another). As with potatoes, a mashup involves compression and mixing (or meshing — why not “meshup”?), but not the same loss of structure. Mashups can be, usually are, a respectful way to “create something new out of something familiar,” in the words of one commentator. Now videos and other media may receive the same treatment; it will be interesting to see if the term spreads much beyond media.
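
For the technically curious, here is a minimal sketch of the two splices described above, in Python with the pydub audio library — my choice for illustration, not anything the artists discussed here are known to use; the file names and timings are hypothetical:

    # A minimal mashup sketch. Assumes pydub and ffmpeg are installed;
    # file names and timings are invented for illustration.
    from pydub import AudioSegment

    song_a = AudioSegment.from_file("song_a.mp3")
    song_b = AudioSegment.from_file("song_b.mp3")

    # "Vertical" splice: superimpose one track on the other.
    vertical = song_a.overlay(song_b)

    # "Horizontal" splice: snippets from different songs, one after
    # another. pydub slices by milliseconds.
    horizontal = song_a[:15000] + song_b[15000:30000]

    vertical.export("vertical_mashup.mp3", format="mp3")
    horizontal.export("horizontal_mashup.mp3", format="mp3")

Note that overlay() mixes the two tracks from the same starting instant, while + simply concatenates, which maps neatly onto the vertical and horizontal senses above.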

“Mashup” seems to come from Australia, or anyway some past or present dependency of the British crown. The first clear-cut instance I found in LexisNexis dates from Australia in 1988; by 2000 it was accepted and no longer even particularly hip in the British press. It does not seem to have found its footing in the U.S. until after 2000, although one can find examples before then in arts journalism. Before “mashup” stood on its own two feet, the idea was expressed in transitional forms, such as “mashed up” (adjective) or “mash up” (verb), both of which preceded the noun.

Plausible earlier exemplars may be found, from Charles Ives to Harry Nilsson, but for practical purposes this form as we understand it goes back to rap DJ’s of the 1980’s, who did a lot of work by stitching snatches of songs together. Sampling was essential, and it is the basis of the mashup as well. Plenty of mashup artists have hip-hop roots, though the form has spread far beyond. Digital audio and internet distribution make this sort of thing much simpler, so other favorite villains are implicated. Mashup culture, animated partly by animus to oppressive copyright laws, is a thing.

The analogy doubtless leaves something to be desired, but mashups remind me of fan fiction. Both rely heavily on work others have done, and for that reason both partake of a sense of homage noticeable even when the artist does not intend to be laudatory. Fan fiction generally demands more creativity of its author, although mashups may incorporate original passages. Yet the work has less to do with stirring in your own material and more to do with noticing interesting resemblances, and attending to striking connections or superimpositions. Those are what the artist contributes in these newly potent forms.

How valuable is such derivative art? In a sense, all art is derivative, at least by now, when there is nothing new under the sun even when everything is new. But mashups and fan fiction are so willfully derivative that they may seem cheaper or less worthy somehow — similar to the difference between summarizing an issue in your own words and quoting someone else at length. I hesitate to venture a definitive answer, or even an opinionated one. The culture has made room for such creations, and to some extent the market has as well. It doesn’t matter whether I think it’s worthy or not. Maybe that’s dodging the issue on a technicality, but it’s the best answer I’ve got.
