
Lex Maniac

Investigating changes in American English vocabulary over the last 50 years

crater

(1980’s | businese (finance) | “collapse,” “crash,” “tumble,” “drop,” “fall off a cliff,” “go south”)

“Crater” has always entailed a certain magnitude, not to mention violence. Its original meaning in English, the primary opening of a volcano, goes back to a Greek word for bowl. The noun has always denoted a big hole in the earth, which may be caused by rumblings from below or impact from above, from a meteor, say, or a bomb. Craters by implication are dangerous. If you don’t get engulfed in boiling lava, you may fall into one and never be heard from again. When used figuratively, say to describe a bullet wound, “crater” may be small in absolute terms but large in proportion to where it is. I dimly remember “crater face,” an affectionate nickname for high school students with heavy acne.

When “crater” takes on the mantle of a verb — a relatively new phenomenon, even if the Chambers etymological dictionary found a citation as far back as 1884 — it brings the same suggestions with it. I didn’t find many pre-2000 examples in LexisNexis; nearly all of them occurred in the financial press. Finance and corporate culture are reliable sources of American vocabulary, and analysts and executives often have a surprising flair for the dramatic. Whether the field is small (a single company) or great (the entire banking system), the verb signifies a sudden and unforeseeable decrease in value (we used to talk about the bottom falling out). The effect of the pandemic on certain industries — airlines, restaurants, petroleum, Hollywood, etc. — gave it a boost last year. As the economy craters, the frequency of the verb does the opposite. One might resort to “crater” when discussing roads or buildings, but such matter-of-fact use is rare.

By now the verb has become more common and has loosened up. Wherever it is used, the sense of a sudden sharp decrease is still the rule, but the magnitude may be more disconcerting than cataclysmic. Still used frequently in financial circles, the verb turns up more and more elsewhere, especially on the sports page, where it is not unusual to see reference to a team cratering (getting clobbered) in a single game, a stretch of games, even an entire season. Other mutations are gaining speed as well. The past participle does duty as an adjective, which was rare twenty years ago (except when used to describe terrain or pavement). “Crater” is very often intransitive, but the transitive form, which goes back a long way, has been asserting itself more. (A recent example from a financial blogger: “Restaurant traffic is picking up after the pandemic cratered most of the sector.”) There is another semi-transitive use, when the verb is followed by an amount: for example, a stock’s value “cratered 25%.” That was possible twenty years ago but much less conventional. As best I can figure, the percentage is really an adverb, but it certainly looks like a noun, and I’m not convinced it isn’t really an object.

“To crater” started as a narrow, specialized term; now we see it spreading its wings, eyeing new directions, acquiring variant usages, dropping from more lips. That probably means we will hear it more in the next twenty years than we have in the past twenty.


not o.k.

(1980’s | journalese? | “not (all) right,” “not so hot,” “serious,” “unacceptable”)

I encountered this expression first in the early nineties in the form of a reprimand, uttered by the supervisor when a co-worker and I were goofing off a little too obviously (part of growing up is learning how to goof off unobtrusively). There it ran “it’s not o.k. to . . .” with an impersonal subject. Instead of saying “Cut it out,” the boss said it’s not o.k. Seems like a bit of a retreat, doesn’t it? In this case, he didn’t have direct control over us — we were hired by the university, not by him — so maybe he didn’t feel like he could boss us around, or maybe he just didn’t have the nerve to tell a couple of giggling students to get back to work.

It stuck in my head, because it seemed so banal, yet it had a certain power. “O.k.” has reached a plateau of ubiquity higher than even that of culture-blanketing expressions like “iconic” or “don’t go there.” We all use it endlessly, here and around the world; you don’t have to speak English. It has had a lot of time to spread — “o.k.” was born right here in America in the late 1830’s as an abbreviation for “oll korrect,” an example of the “phunny spelling” that was a staple of nineteenth-century American humor. It appears in countless contexts, mostly to signify some sort of approval or acceptance, but not necessarily. (For example, if used to respond to verbal resistance, by drawing out the first syllable and cutting short the second while adding a rising intonation, it suggests the situation will cause more trouble than expected.) We use “o.k.” the same way we use “fine,” to convey a sense of contentment, or at least no trouble worth mentioning. When a person is o.k., everything is acceptable or better in his or her life. When a thing is o.k., there is nothing objectionable about it. “Not o.k.” is the opposite. Something is wrong, and it can’t be denied, papered over, or justified — even if all looks well on the surface.

More recently, “not o.k.” has taken a personal turn, normally the first person, as in “I’m not o.k.” That phrasing evokes disquiet or worse, especially in one’s mental or emotional state. Yet it might sound odd for a terminal cancer patient to say “I’m not o.k.” It’s for real problems, but not that serious. Uncertainty or suffering can bring on discomfort and anxiety; trauma may do greater damage, and one might use the expression in either case. Needless to say, “not o.k.” has profited from our collective misfortune over the past year, and a new manifestation has taken root: “it’s o.k. to be not o.k.,” meaning that you have good reason to be miserable or uneasy or disoriented. Then there’s “I’m not o.k. with . . .” (I don’t condone, support, or agree with . . .), which takes us right back to “it’s not o.k. to . . . .”

LexisNexis shows a few instances in the seventies, when it usually took the form of a rejoinder. One’s foil had described an act or policy as o.k. (meaning anything from adequate to great), to which one thundered, “No, it is not o.k.!” In other words, don’t try to palm this crap off on us. But it could have another shading, as when Pope John Paul II remarked, after some trouble with the microphone during his visit to Philadelphia in 1979, “I see, also in the United States, something might be not o.k.” — that it might not work properly. It sounds strange now to hear it in the context of an object like a microphone, rather than an action, an idea, or someone’s sense of well-being. But the pope was twitting our stereotypical national characteristics: can-do optimism, confidence in technology, and a preference for not acknowledging problems. We talk big, but we don’t always get it right. The American expression typified an American attitude.


epic

(1990’s | “grand,” “great,” “wonderful,” “incredible,” “memorable”)

When I was young, I knew all about “epic.” Noun or adjective, it referred to a long, heroic poem that explained how a nation came to be, or laid out legends of deeds lost to history. In the Greco-Roman West, we have Homer and Virgil, but the Mahabharata or the Kalevala also qualify, and dozens of others. (Nobody ever called Exodus an epic, as far as I can remember, but my scholarly readers may correct me.) When I got older, I found out about “epic theater,” as espoused by Erwin Piscator and Bertolt Brecht, which had nothing to do with Homer but plenty to do with Aristotle, as these German young Turks demanded new forms and purposes for the drama — less catharsis, more social criticism — and set about writing and producing their own exemplars.

Those are both literary definitions because “epic” was a literary term. It was available for metaphorical use even as an adjective in the 1970’s, but only on special occasions. The word indicated that the hearer was to imagine the story on the scale of Odysseus’s journey to Ithaca. So you might recount the epic saga of getting home after the car broke down, or an epic tale of a controversial bill worming its way through Congress. But mostly it described films and stories, according them the sweep and scope of the ancient poets singing still more ancient feats of arms and guile. Not just long and complicated, but stirring and uplifting as well.

You can probably guess what’s next. “Epic” has not lost all of its mojo, but it is used to describe many, many things that can’t reasonably be compared to the old poems, or even old Brecht. Resort visits, automotive performance, hamburger stands, t-shirts, a sports rivalry. Almost anything can attract the name, though it helps if it can claim a modicum of longevity and tradition. The change has occurred largely since 1990, and mostly on the adjective side; the noun has fared a little better. As the adjective has overrun the language, it has generated its own fixed phrases. A favorite example of mine has become familiar to ears of all ages but falls primarily from younger lips: “epic fail,” a punchy and rather appealing evocation of at least minor disaster caused by human action. When a kid rolls her eyes and says, “It was an epic fail!,” you can be sure it didn’t go well. From the world of commerce: I had not known that there is a cryptocurrency called “Epic Cash.”

There is an obvious connection to “iconic,” but I’m more inclined to compare “epic” to “awesome.” “Awe” had real power once, not that long ago, and “awesome” had replaced “awful” as its adjective. An awesome thing was magnificent, colossal, humbling. As everyone knows, that’s all over now. “Awesome” has become threadbare and reduced, mouthed apropos of anything cool or nifty (has that word died? It was better than “neat-o”) — all it says is that the speaker likes the thing in question. “Epic” has so far held onto more of its power, but who’s to say that will last? Another decade or two and it may be just as attenuated as “awesome.”


iconic

(1980’s | journalese (arts) | “classic,” “legendary,” “exemplary,” “representative”)

We have become so poundingly familiar with this word that we no longer think about what it means. The best informal definition I can come up with is “widely recognized and been around for a while” (at least a generation). It was hardly an unknown word in the 1970’s, but it was much less common and more technical. As for what it meant back then, I would hazard “visually representative.” (Then as now, it might modify an abstraction such as power or beauty.) The word was common in discussions of signs and symbols: “iconic” meant representing an object with a two-dimensional picture that resembled it somehow. In the memorable language of a U.S. army technical report (April 1977), “In iconic representation the symbol ‘looks like’ the feature it represents — it may simply be a pictorial representation of the ‘real thing.’” (Love those scare quotes.) It was not a word, or symbol, that had an arbitrary relation to the thing it denoted. It appealed more directly to the senses, and therefore was harder to misinterpret.

How matters have changed since then. On its way from technical term to hellishly overused buzzword, “iconic” has been engulfed by a swirl of meanings. In the eighties, there was a transitional sense visible in the arts press, applied to objects that were naively pictorial yet had a patina of greatness or perfection. A sterling example was Andy Warhol’s Campbell’s soup can, and Warhol was an early example of an “iconic” figure as we use the word today. The first pop icon, as lovely Liz from Queens points out, Warhol embodied in his art and persona a balance of representation and representativeness; that is, his art both looked like what it depicted AND captured the Zeitgeist, or came as close as anyone. “Icon” in this sense was available by the late seventies and widespread by the late eighties. The adjective followed along a bit more slowly, but not much. By the late eighties, “iconic” in the transitional sense outlined above was well settled, and the word continued to spread in the nineties over ever wider fields of language. The decisive turn toward the way we use it today took place then. Now it modifies everything: persons, places, images, buildings, bridges, brands, events, and on and on.

To understand the shift in usage, consider the phrase “iconic interface.” (Apologies for lapsing into computerese, but computers themselves are iconic, y’see.) Ca. 1990, that would have been understood to mean a computer screen that relied on little pictures to tell you where to click. It was iconic because it used images rather than words or arbitrary symbols. If you heard that now, you would think it meant something else, right? An iconic interface would be a touch old-fashioned, but still recognizable, and it would represent somehow an ideal form of the user interface. Iconic really means Platonic.

Am I right in thinking that even now, in its decadence, “iconic” still bears traces of its old mystic force? When we call a thing iconic, aren’t we endowing it with a superior standing? Or at least asking it to hold out hope that somewhere outside the cave there are higher standards and greater deeds? Let’s celebrate when someone really gets it right and creates something that we all respect or remember fondly, that becomes a point of reference. We want to honor that, and we reach for a religious term, begging it to grant superhuman power to an entirely human thing.

If I have counted correctly, this is the six hundredth new expression I have written about. While I’m preening, I am also nearing the tenth anniversary of Lex Maniac (March 23), and this is my 497th post. A near-confluence of round numbers. I always thank my readers in moments like these, and I’ll do it again, especially the regulars — never hesitate to fire off a comment, folks. I hope you’re enjoying the show.


off-putting

(1980’s | “annoying,” “disturbing,” “offensive,” “disconcerting”)

Altogether an odd expression. Let me count the ways:

1. Nearly all the uses I found from the 1970’s occurred in the Canadian press. Canada has not done well in the new-expressions-since-1970 sweepstakes. I’ve identified two demonstrably Canadian expressions, cougar and optics, along with a few maybes (list here). I’m not certain about this one, but when it appeared at all before 1980, it was mainly in the north-of-the-border press.

2. An adjective in this form ought to have a verb that goes with it, but “off-putting” doesn’t quite. (You know the old saying: never off-put till tomorrow what you can do today.) There is no such noun as “off-put,” either (man, that’s a real off-put). Not even “off with which I will not put.” While “put off” may mean “discourage” or “repel,” its most common meaning is “defer.” When you use “put off” to mean “repulse,” it sounds a little old-fashioned, or British. Then there’s a similar phrasal verb, “put out,” which, among other things, means “irritate” or “disrupt,” close to how we use “off-putting.” But if you said “out-putting,” it would sound like computerese, or something an audio engineer would say. “Off-putting” has singularly few close relatives. It reminds me of “put off one’s feed” or possibly “throw one off,” but neither of those has exactly the same force or sense.

3. The range of meaning of “off-putting” may be a little puzzling. It usually lies somewhere between a touch perturbing and genuinely repugnant, though it could stand in for either pole in a pinch. “Disagreeable” or “unpleasant” is probably the most reliable synonym, but “off-putting” has more specific shades. It goes with mannerisms of the body (tics, habitual gestures) or mind (political or religious beliefs). It might also describe an element of a performance that keeps the audience from enjoying the show, or a public official’s utterance that offends constituents. Broadly, if it makes you uncomfortable, or distracts you, or both, you can call it off-putting.

4. As point no. 2 may have hinted, “off-putting” has potential for drollery, in which I take a certain heavy-handed interest. Start with the obvious golf joke — if your short game is weak today, your putting is off, therefore off-putting; or maybe you were just off putting on the practice green. But I want more. I want “off-pissing” to be an adjective. The world is waiting to hear uncouth teenage boys grumbling over a particularly raw deal: that’s really off-pissing, dude. While we’re at it, I want “on-putting” to be an adjective, too. When a member of the group gets hoity-toity, the others can mutter, “What an on-putting display” (think of “putting on airs,” if that’s not too archaic). Similarly, if he tries to yank your chain, “he’s really on-putting today” (“putting you on”). And how come you put on your clothes, but you don’t put off your clothes? “Off-putting” unfairly denied another humorous avenue! We’ve only scratched the surface, and the sheer variety proves that “put” is one of our most versatile and unsung verbs, a true workhorse with a range of use most verbs only dream about.

My father donated this expression, having most likely encountered it in last week’s post, in which I broke my (not terribly strict) rule against using new expressions in blog entries. For the sake of posterity, I must note that Dad does not approve of “off-putting,” finding it ungraceful. I trust he will not be put off by my jocular tone.


sell-by date

(1980’s | bureaucratese? | “expiration date”; “peak,” “prime”)

We had this sort of consumer aid when I was a kid, but we called it an expiration date, applied to food as well as bus passes and insurance policies. A related expression that started in grocery stores and has likewise sprouted a wider significance is “shelf life.” In a 2009 article, The Guardian dated “sell-by date” to 1973 and declared the concept not much older, crediting it to Marks and Spencer, the British retailer. LexisNexis shows that the term emerged in quantity first in the British press, and it is still deployed more often there.

“Sell-by dates” aren’t seen much in American grocery stores any more. You’re more likely to encounter “best if used before” or something similar; in the U.K. they are called simply “use-by dates.” I was somewhat bemused to learn that government agencies and major news outlets still feel the need to issue Talmudic commentaries on the various types of expiration date (is there a general term?) after forty years of ubiquity. The real question is whether the date stamped on the package is there to help the seller or the buyer; in this light, the distinction between “sell by” and “use before” is obvious. Some have observed that the use of the dates increases sales and shields stores from liability. That’s true, but so be it. It’s a low-margin business, and if grocers can wring a little extra profit out of a helpful service for customers, we shouldn’t begrudge it to them.

While “sell-by date” as used in the grocery store is new enough to qualify for Lex Maniac treatment, it has evolved in its short life to take on a social and sometimes even moral dimension. Rather than signaling that food might be spoiled, the term may now convey that a person, idea, or fad has run its course and then some — not just off-putting, but objectionable. It is primarily a matter of fashion, but in the case of a noxious idea — e.g., antisemitism is way past its sell-by date — it also carries moral weight. The expression covers a range of unwelcomeness: old, passé, superannuated, out of style, out of favor, contemptible. Once this thing was important and we had to pay attention, but now it’s time to flush it (or, if we want to be really up to the minute, cancel it).

A wider field of operation gives the expression room to expand into other kinds of commodities. Why not stocks, for example? Those small investors who got swept up in the GameStop rally and proceeded to lose their shirts — well, they shouldn’t have kept those shares past their sell-by date. Your friend drives a jalopy? Tell him his crummy car is past its sell-by date. A piece of land in an area that is undergoing development? Better strike while the iron is hot, before the plot passes its sell-by date. You do see this sort of usage here and there already; I predict it will get more common.

Lovely Liz from Queens tossed this one off a few days ago with her usual sprezzatura bravura. All hail!


food desert

(1990’s | activese?)

Food deserts are not geographical features but products of the class system. A food desert is an area within a relatively prosperous nation in which it is difficult or impossible to buy fresh food, especially produce, to prepare at home. We may understand the phrase primarily in reference to densely populated urban areas with large minority populations, but largely white rural areas may be food deserts as well, and many are. (The word “underserved” — a little old for the blog but only a little — is never far away.) Just one more fact of life for the poor: the food on offer is processed in ways that make it unhealthy, while nutritious food costs more time and money than most people have. The use of “desert” is intended to emphasize the severity of the problem. In food deserts, there’s nothing like what any suburbanite would consider a well-stocked supermarket — no cheap, reliable places to get your groceries. Yet while “desert” conjures a vast expanse, food deserts are usually no bigger than several square miles.

“Health care desert” and “pharmacy desert” have been coming into use recently, as has “news desert.” The definitions differ somewhat, obviously, but we have adopted “desert” to refer to a locus of deprivation, where a necessary item or service is scarce or hard to find. In the case of health care, the class distinction persists, but the gap feels different when it comes to news, where the division falls along partisan lines. While there are broadly agreed-upon standards for judging whether people have enough to eat, there is much less agreement on how to determine that news is wholesome (that is, reliable and fair-minded).

I was surprised to find that “food desert” is a Briticism; it began turning up in the U.K. press in the mid-1990’s. It doesn’t seem to have spread in the U.S. until after 2000, like another close relative, “food insecurity.” There does not seem to have been an equivalent older term. I doubt that’s because the phenomenon did not occur before 1995; more likely it just took that long for someone to hit on a new locution to draw attention to hunger in first-world countries.

Politicians periodically pass, or at least introduce, legislation to encourage the infiltration of food deserts by farmers’ markets, supermarket chains, etc. that can provide fruit, vegetables, and other basics in relatively unprocessed form. It isn’t clear to me how useful such legislation has been, but it’s slightly heartening to see elected officials so much as trying to help their constituents. Four Senators — two Democrats and two Republicans — recently introduced a bill designed to mitigate the ravages of food deserts in both rural and urban areas. A bipartisan bill! Two cheers.


unicorn

(2000’s | “rara avis,” “one of a kind,” “exceptional person, etc.”)

If the unicorn is a mythical, non-existent creature (unless you count the oryx), why do we use the word to mean something or someone that does exist, but is exceedingly rare? Aside from the meaning of the term in financial circles — a start-up company that has achieved a valuation of one billion dollars — it has come to mean a remarkable person or thing, often with the implication that the so-named is desirable. Is that because a couple of generations have grown up thinking unicorns are cute and cuddly? Do large portions of the population have a secret fascination with unicorns that has caused the culture to wish fiction over the line into reality? (The culture has developed strong tendencies to wish fiction into reality, I’m told.) I don’t think we take the names of any other mythical beasts in vain in quite the same way, but what other name might we use? Hippogriff? Cockatrice?

The adoption of “unicorn” for this purpose seems partly intuitive and partly not; I feel like I understand it at first, but on reflection it slips out of my grasp. It looks like an elaboration of “unique” — a different part of speech, but beyond doubt a related word. And it sounds a lot like “unica,” a word collectors use to mean “the only thing like it in the world.” The quickly growing financial usage has already begun to water down that sense of uniqueness, and my guess is that in a decade or less, “unicorn” will be used much more loosely to refer to anything a little out of the ordinary. I suspect the connotation of desirability will persist.

If you look at the pre-1980 equivalents listed above, you see another incentive for a word like “unicorn” to work its way into the language. The first (literally “rare bird”) was already pretty “rara” when I was a boy; one encountered it in crossword puzzles but nowhere else. The second matches, in a longer, clunkier sort of way, but can’t take an article; the third has no grace. Rare, desirable things come up often in our conversation, albeit as something wished for rather than something had. To fill that slot, we want a nice, simple one-word noun that follows an article naturally, and if it has some cachet, all the better.

The financial meaning came along within the past ten years and is credited to Aileen Lee, a venture capitalist who first used “unicorn” that way in 2013. Well before then — I found an example in the late seventies, but only one — it was used to refer to anything sui generis. (It is a relatively short step from none to one.) Even that older usage did not become common until much later, after 2000; I have a notion I’ve been seeing it more often lately.

“Unicorn” names several different things with no obvious connection: a cryptocurrency, a motorcycle, a sitcom. The word has a definite pull; while it has never overrun everyday language in any of its figurative senses — if its primary referent is mythical, can it usefully be said to have a literal sense? — some writers really like it. Little kids, apparently, never tire of the mythical beasts.


outlier

(1980’s | mathematese? | “anomaly,” “exception (to the rule),” “outsider”)

Our definition of this term today owes much to statisticians, but it seems to have come into play earlier in geology and topography. An outlier is part of a formation that is physically separate from the rest of it, due to a fault line or possibly erosion. The OED cites several other meanings, most of which coalesce around the idea of an entity noticeably outside the norm. (One which doesn’t is my favorite: “person who sleeps or lives in the open air, or away from his or her place of business, duty, etc.,” now considered obsolete.) Available in statistical analysis for decades, the word showed up infrequently, if at all, in the press before 1980. By now it is ordinary, although I suppose it retains a slightly technical flavor.

My image of an outlier: picture a number of data points sprinkled on a graph. Most of the points cluster together, and it’s easy to visualize a straight line running through the plot. But then there are a couple of points that are nowhere near the line, wrecking your experiment. That’s what “outlier” means to scientists — and to the rest of us too, by now. In 2020, we use it much more often to designate persons, companies, cities, nations than in 1990. But that sort of usage was hardly unheard of. As early as 1994, the word had a vogue in relation to Japan; American economists liked to point out that Japan needed to be treated differently (i.e., more punitively) because it was an “outlier” that didn’t do business like the other nations. In medical insurance jargon, the outlier is a special case of the general definition: a patient who uses an unusually large amount of medical resources (as in “outlier payment,” “outlier policy”), generally due to unpredictable complications.
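For the statistically curious, here is that picture in code — a minimal sketch in Python (my own illustration, assuming NumPy; the function name and the cutoff k are inventions for the example, not anybody’s standard): fit a line to the points by least squares, then flag whatever sits too far from it.

    import numpy as np

    def flag_outliers(x, y, k=2.0):
        """Fit a straight line by least squares and flag points whose
        residual is more than k standard deviations from the line."""
        slope, intercept = np.polyfit(x, y, 1)   # best-fit line through the cluster
        residuals = y - (slope * x + intercept)  # vertical distance from the line
        return np.abs(residuals) > k * residuals.std()

    # Ten points that hug the line y = 2x, plus one that is nowhere near it.
    x = np.arange(11.0)
    noise = np.array([0.1, -0.2, 0.0, 0.3, -0.1, 0.2, 0.0, -0.3, 0.1, 0.2, 15.0])
    y = 2 * x + noise
    print(flag_outliers(x, y))  # True only for the last point

Note that the stray point drags the fitted line toward itself and inflates the spread of the residuals, which is exactly why statisticians argue about what to do with outliers, as the next paragraph suggests.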

The way you handle outliers says a lot about your approach to statistics. One way to look at them is as warning signs: they suggest an error of observation or a part of the experiment carried out incorrectly, and they tell us that further checking is needed. On the other hand, since outliers throw off the data and complicate simple conclusions, there is always a temptation to explain them away or throw them out. In statistical analysis, that may be the right response, but in public discourse it can too easily turn into suppression of dissenting voices or alternative ideas. A lot of outliers don’t, in fact, have much useful to say, but dismissing too quickly anyone who disrupts the meeting will over time cause decision-making to become too insular. Which may be the idea.

It strikes me, though it may not be true all the time, that “outlier” when used of a person conveys a negative connotation. Maybe it’s because it sounds suspiciously like “out-and-out liar.” When you want to commend someone for staying outside the herd, you call them something else — principled, honorable, and so forth. Whether due to stubbornness or incompetence, the outlier’s predicament is self-inflicted.

Every now and then I unearth an “-er” noun with no corresponding verb, and “outlier” is one — it may be that scientists say that certain data points “outlie” the normal distribution, but if so, it hasn’t penetrated mainstream vocabulary. “Whistleblower” is a classic instance. Others: caregiver, doorbuster, headhunter, rainmaker, stakeholder, warfighter. (There are a number of two-word examples, too.) Then there are the past participle adjectives that lack present indicatives, as in “handwritten.” Outliers. Hmph. What did you expect?


trust exercise

(1980’s | therapese? | “building teamwork,” “confidence-building exercise”)

“Exercise” is an interesting word, a bit slippery. The root is a Latin verb meaning “drive forth,” “put to work,” or “keep busy.” The twenty-first century associates it primarily with healthful physical activity, and that usage has a long history. In the more formal worlds of law, finance, and government, it means “invoke” or simply “use” (authority, power, veto, rights, financial option, judgment), but it often carries a further implication of wielding these things in restrained or responsible ways. I don’t think the word is used in modern textbooks, but I’m old enough to remember when the math problems at the end of a chapter were called exercises, aimed at developing intellectual muscles. Most generally, it refers to more or less regimented or ritualized routines designed to improve us one way or another. Related phrases: “point of the exercise” (the reason we did this in the first place) and “exercise in futility” (wasted effort).

“Trust exercise,” like other expressions I could name, is a distillation. In the eighties, it was possible to encounter the phrase tout court but more common to find longer versions of the same thing, such as “trust-building exercise,” “exercise in trust” (now that sounds financial!), or “exercise designed to build trust.” With enough such variants swirling around, the emergence of the compact two-word phrase seems inevitable. (Another synonym was “trust game,” which sounds a little too light-hearted somehow. The most widely known manifestation, the backwards fall into a partner’s arms, is also known as a “trust fall.”)

I suspect trust exercises existed in some form before the mid-eighties, but references to them started growing around then. I hadn’t realized it, but there are, broadly speaking, two kinds of trust exercise: indoor and outdoor. The outdoor kind — rappelling, obstacle courses, etc. — demanded vigorous, cooperative exertion. That sort of thing was usually reserved for those with high-stress, high-intensity jobs who really depend on each other — firefighters, athletes, stock traders, and so forth. But lots of people get more out of indoor trust exercises: couples, actors, low-impact office colleagues. There are trust exercises for kids, for couples, or, to stoop to that iniquitous, ubiquitous word, for teams of every sort.

The lesson of these rituals is that without risk there can be no trust. This is not the simple faith a baby has in its mother, or a religious believer in a particular messiah. To build confident interdependence you need at least the perception of danger that two or more people can unite against. Trust exercises may be so formalized and superficial that they actually work against their goal, especially now that they have had thirty years to settle into the folklore. They may also combat mutual wariness or indifference within the ranks, teaching participants to rely on each other enough to solve problems together. Anything that helps convince us we can count on our colleagues is good.
