go commando
(1990’s | teenagese?)
This evocative verb phrase is more of a head-scratcher than most, and I’m scratching as hard as anyone. There does not seem to be any convincing reason why “go commando” should mean “go without underwear.” The most common explanation found on-line is that commandos (see below) don’t wear underwear because it’s too much trouble keeping it clean when you’re on a mission, because it causes jungle rot, or because ferocious Scots warriors went without under their kilts. The problem is that even if it’s true that special forces never wear underwear, the reasons adduced for scanting scanties are not specific to commandos, but shared by all soldiers. I guess it sounds better than “go doughboy” or “go GI” or “go grunt,” but the association with the ilk of Navy SEALs seems fanciful at best. The expression does seem to be used more often of men than women; accordingly, it is not generally used to mean “go without a bra.” It’s the lower story.
The first citation in the OED dates from 1974, but it doesn’t start showing up in LexisNexis until 1996, when it was cited in a list of slang terms current among college students. More to the point, it was used on “Friends” by Joey (played by Matt LeBlanc) early that same year, which seems to have provided the impetus for “go commando” to enter our vocabulary. The phrase needed several years before it could be used without quotation marks and glosses, but most people recognize it by now. There is some dispute over whether the phrase is of British origin (probably not). The word “commando” goes back to the Boer War, where it referred to a raid or a participant in one; the word comes originally from Afrikaans. Certain British troops were called “commandos” during World War II, and from there it entered American vocabulary. If “go commando” meant anything fifty years ago, it meant “act like a commando” — notably brave, relentless, or capable of quick, decisive action. Today, “commando” has a slightly musty sound, and the armed services don’t use it, at least not in the U.S. But we continue to honour the valour of those daring British soldiers who carried out assassinations and rescues behind enemy lines by naming an eccentric sartorial practice after them.
Maybe that word “daring” forms the bridge, as lovely Liz from Queens suggests. Going without underwear requires a devil-may-care defiance of convention and homely wisdom; forget everything you learned about what to wear in case you’re in an accident. It also suggests forgoing protection or a degree of safety, demonstrating courage and nerve, in which qualities commandos are unmatched. There is something exciting about dispensing with drawers — I remember in high school one of the more advanced boys (he had moved east from California) bragged to a girl that he wasn’t wearing any (unfortunately, I can’t remember exactly how he said it, but I’m pretty sure it didn’t have anything to do with commandos); he lowered his waistband an inch or two to demonstrate. The girl was suitably impressed. But whatever the cachet, it is a deeply personal decision, and many of us find forgoing that bottom layer uncomfortable or unhygienic.
Some on-line sources identify “freeballing” as a pre-1980 equivalent. It’s not a word I know, but that doesn’t mean it didn’t exist. Otherwise, I’m not sure there was an old word for it. And now we have one — language marches on. I’d like to thank lovely Liz from Queens for giving me this week’s subject! Always a pleasure to hear from my dedicated readers.
incentivize
(1990’s | businese (finance) | “provide incentive,” “encourage,” “promote”)
Editors and writers made a sport of deploring this word in the eighties and nineties, when it was reviled as unnecessary and clumsy, an obvious instance of a noun turned awkwardly into a verb by adding the “-ize” suffix, like “prioritize.” As late as 1997, E.J. Dionne expressed hope that “incentivize” could be expunged from the language. But it was not to be; this word has taken root and become quite common, though we already had several equivalents and it sounds clunky and jargony. The contexts in which it arose — business and politics — remain the ones in which it is most regularly used. How do you incentivize car sales or job creation? Customers or executives? Agriculture or high tech? The expression straddles the divide between creating incentive TO do something and incentive FOR someone to do something, and can apply with equal facility to either. It always boils down to creating compelling reasons for people to act a certain way, but it is not always necessary to explain what you’re doing. Gov. (now hapless presidential candidate) George Pataki once talked about “incentivizing work” among welfare recipients; Congressman Bob Inglis asked how to incentivize good health. It’s the same idea, but they skipped the part about rewarding people for finding paying jobs or adopting salubrious habits.
“Incentivize” first appeared in LexisNexis credited to Federal Reserve chairman G. William Miller, who used the word repeatedly in testimony before Congress early in 1979. He certainly did not invent it; Google Books offers several examples as far back as 1969. In 1985, J. Peter Grace used it and was given tentative credit for the coinage by UPI. It’s all wishful thinking. Neither Miller nor Grace invented the term, but the fact that experienced reporters were inclined to give them the honors attests to its slow rise. “Incentivize” appeared now and then in the eighties (Jack Kemp used it in 1989) but did not really get rolling until the nineties.
A minor but nagging variant is the verb “incent,” which still makes most people with an ear for English wince, but does turn up occasionally. George W. Bush used it in the mid-1990’s as governor of Texas, though according to Texas Monthly, his aides made him stop. It was pretty new then and may not really have qualified as a word, depending on your standards, and it may not make the cut even now, for all that it appears in several dictionaries. It still sounds more illiterate than cutting-edge, and it seems to incense usage mavens more than “incentivize,” which is longer and windier but has the oddly comforting, or just anesthetizing, tone of bureaucratic language.
For while “incentivize” originated among our business mavens, it is a classic example of bureaucratese. No surprise that bureaucrats and financiers share a lingua franca, but one may wonder about the special needs of bureaucrats that bring forth words so obnoxious to the rest of us. To form a verb from a noun by adding a suffix — a practice nearly universally scorned among authorities on usage — may be seen as an attempt at precision, avoiding the use of a synonym, or near-synonym, that may be misconstrued or misunderstood, preferring to work with the exact word at hand. Cynics will reply that precision and clarity are the furthest things from the minds of bureaucrats, and their real intent is to bewilder us by creating strings of not-quite-comprehensible English sculpted to mean the opposite of what they appear to, or to avoid saying anything at all. Both sides are right at least part of the time, but the debate has less to do with language than with politics. Ha! Just try keeping them apart.
comfort food
(1980’s | journalese (gastronomy) | “home cooking,” “favorite dish”)
You could construct a good personality test by asking subjects to define this expression and list examples. Food writers use it confidently, but it covers a wide range of meaning, and the gradations can be pretty subtle. Two bottom lines seem to underlie every use of the phrase: it has to be something the diner is already familiar with, and likes. Beyond that, it can go in several directions with equal confidence. Obviously, there is some overlap among the categories below, but I find the taxonomy helpful:
-What you ate when you were a small child, therefore often mushy or liquid, that makes you feel like you’re in Mama’s arms again. In other words, comforting food. Things like macaroni and cheese or tomato soup.
-What lovely Liz from Queens calls “white food.” Also often mushy and associated with childhood, but the point is it’s uncomplicated — bland and starchy as well as pale in color. Mashed potatoes, bananas, vanilla ice cream.
-What people eat in the country. “Comfort food” is sometimes used as a synonym for down-home dishes, and it may have a strong regional tinge. Comfort foods in the South may differ from comfort foods in the Northwest, for example (Moon Pies are not big in Seattle). Burritos in the Southwest, lobster rolls in the Northeast.
-Anything plain and unsurprising. Sometimes “comfort food” refers to things that are simple to prepare as well as eat, perhaps with the implication that it’s for family consumption rather than guests. This covers the first two above and other areas as well. Oatmeal, spaghetti, scrambled eggs.
-Heavy or at least substantial preparations; usually meat, frying, or both are involved. Meat loaf, casseroles, pot roast, burger and fries. Don’t be alarmed if the word “rib-sticking” appears nearby.
-Whatever you happen to enjoy, whatever makes you feel better for having eaten it, or makes up for a bad day. This sense of the term really opens the floodgates; now fancy gourmet concoctions can sit right beside the humblest fare. Sushi or catfish, crème brûlée or egg custard, sweetbreads or scrapple. Such broad usage may be an abuse of the term, but you hear it a fair amount.
Notable by its absence from the lists above is the noble vegetable. The more effort it requires to eat, and the less obviously sweet, salty, or fatty it is, the less likely it will qualify as comfort food (except under the last definition, where anything goes).
There are some obvious faults — in the geological sense — in the meaning of “comfort food” that help explain the multiplicity sketched above. The main one: both personal preference and social custom are part of the field covered by this expression, and neither can be disregarded. Each person has their own, to some degree, but there is usually a fairly strong consensus on what most people in the same culture would consider comfort food. If your version of it is a rice cake with a shmear of tofu, that’s your business, but don’t expect your peers to share your tastes. Another fault: Lovers of exotic cuisine may depict “comfort food” with a sneer as unworthy of an adventurous palate, but more often it operates with reverse snobbery, as the lower classes contrast their chow lovingly with the pretentious, fussy gourmet variety. I also note in passing that “comfort food” partakes of nostalgia, real or imagined, especially when it summons our childhood diet or rural eating habits. But once again, the nostalgia may be deeply personal (childhood) or sociocultural (down home). Another point of negative interest: the expression is rarely used metaphorically (e.g., calling a novel “literary comfort food” as a reviewer in the New York Times did in 1987). We have chicken soup for the soul, but comfort food fills only the belly. To round off this sequence of unrelated points, I will suggest that there is no direct connection between the rises of “comfort zone” and “comfort food,” but they occurred at the same time, and it’s quite possible the two expressions helped each other into everyday language.
My brilliant, beautiful girlfriend gave me this expression months ago, and I finally decided to take a bite out of it. Thanks, baby!
fuzzy math
(1990’s | journalese (politics) | “creative accounting,” “fudging the numbers”)
The first thing you need to know about fuzzy mathematics is that it is a genuine discipline, invented in the 1960’s and well established. I’m no mathematician, a rather ancient B.S. in the field notwithstanding, but the idea at the heart of fuzzy mathematics is that it is non-binary, or rather superbinary. This branch of mathematics or logic allows for intermediate states between absolute membership and non-membership in a set, for example, or between the truth and falsehood of a proposition. I’m sure it’s every bit as incomprehensible as most advanced math or physics, but that doesn’t mean it’s any less rigorous than that old-time mathematics that’s good enough for us. The foregoing is a bit fuzzy, so allow me to append an example from Professor Bart Kosko, relayed in the New York Times (November 7, 2000): “A good example of a fuzzy concept is cool air — it has a clear meaning, but it is not black or white. Fuzzy logic builds rules out of such fuzzy terms and then embeds those rules in a computer. One rule might be ‘If the air is cool, then set the motor speed to slow.’ Another might be ‘If the air is warm, then set the motor speed to high.’ Fuzzy math lets an expert program a computer in English, but helps the computer interpret ‘cool’ according to a number of interacting variables — just as a human being does.” Cool example, dude!
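For the programmers in the audience, Kosko’s thermostat can be sketched in a few lines. To be clear, this is my own minimal illustration, not anything from the Times piece: the membership curves, temperature thresholds, and speed values are invented assumptions, chosen only to show how partial degrees of “cool” and “warm” blend into a single motor speed.

```python
# A toy fuzzy controller in the spirit of Kosko's example.
# All numbers here (60-75 degrees F, speeds 1-10) are arbitrary assumptions.

def cool(temp_f):
    """Degree (0 to 1) to which a temperature counts as 'cool'."""
    if temp_f <= 60:
        return 1.0
    if temp_f >= 75:
        return 0.0
    return (75 - temp_f) / 15  # linear ramp between 60 and 75 degrees

def warm(temp_f):
    """Degree (0 to 1) to which a temperature counts as 'warm'."""
    return 1.0 - cool(temp_f)

def motor_speed(temp_f, slow=1.0, fast=10.0):
    """Apply two rules -- 'if cool then slow', 'if warm then fast'.
    Each rule fires in proportion to its membership degree, and the
    results are combined as a weighted average (defuzzification)."""
    c, w = cool(temp_f), warm(temp_f)
    return (c * slow + w * fast) / (c + w)

print(motor_speed(60))    # fully cool  -> 1.0 (slow)
print(motor_speed(75))    # fully warm  -> 10.0 (fast)
print(motor_speed(67.5))  # half and half -> 5.5
```

The point is the middle case: at 67.5 degrees the air is neither cool nor warm in a binary sense, so both rules fire halfway and the controller lands in between, instead of snapping from one speed to the other at an arbitrary cutoff.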
In the mid-1990’s, critics began applying “fuzzy math” to a new mathematics curriculum for elementary schools based on collaboration among students, word problems drawn from practical contexts rather than memorization, and the use of calculators. In some ways, the curriculum was descended from the new math of the 1960’s, and in some quarters it was called “new new math.” According to LexisNexis, the term really took off in 1997, when it was used regularly in major media to deride this pedagogical approach. Culture warriors Lynne Cheney and Diane Ravitch adopted the expression; editorialists and columnists soon followed suit. At that point, the phrase had a very limited ambit, referring — invariably derisively — only to methods of teaching mathematics, and there was no sign of it in LexisNexis in any other context before 2000. The same charges are made today against Common Core’s method of teaching math, merely the latest chapter in a debate that has raged for fifty years.
October 3rd, 2000: That’s when George W. Bush vaulted the phrase into everyone’s vocabulary during the first presidential debate with Al Gore. He accused Gore of employing “fuzzy math” in his tax and budget proposals in an effort to cast doubt on promised surpluses and revenue increases. According to many commentators, Bush’s arithmetic was equally dubious, but he landed the punch. Bush can claim honors in two categories: pushing the relatively new expression into ubiquity and changing its meaning. He may not have been the first to use “fuzzy math” to mean suspect accounting, but he was the first one to use it on a national stage at a moment when most of us were paying attention. In another post I have sketched the impact of recent presidents on our vocabulary. Bush was not known for facility or felicity with language, but he gets credit for “fuzzy math,” along with “faith-based” and “surge.”
Right after Bush cannonballed the phrase into the lexicon, others took it up, and almost immediately it became a synonym for misleading calculations or misuse of statistics. Bush used “fuzzy math” to impute not just incompetence, but intent to deceive, and the phrase retains that implication, as well as a strong tendency to turn up in political discussions. It is also much more habitual on the right side of the political spectrum, like “junk science,” and it has the same quality of quick dismissal that is not required to justify itself, a quality sired by Bush in that fateful debate, when he dismissed Gore’s numbers without disputing them. The expression rarely has an affectionate side, for all that it contains the word “fuzzy.” (No word on what “warm fuzzy math” might look like.) It’s fuzzy as in imprecise, of course, not cuddly. But for all I know, the critics of new new math in the 1990’s intended it the other way, at least partly, as in coddling the children rather than giving them a dose of good old-fashioned multiplication tables.
no harm, no foul
(1980’s | athletese | “no harm done,” “nothing to worry about,” “let’s forget the whole thing”)
I’m afraid the obvious origin is the true one. This expression was first used by basketball referees and sportswriters to describe a “philosophy” of officiating; here is a lovely definition from 1958: “if the contact does not interfere with the progress of the game, a foul shall not be called.” The earliest uses I found attributed the maxim to referees in the Big Ten Conference, in America’s heartland. “No harm, no foul” remained the property of basketball people until around 1980, when it started to creep into the language of politics and law. It’s not hard to see that the expression might appeal to lawyers, in that it summarizes an important legal principle: There must be real injury — not just the potential for it or some theoretical wrong — or there can be no tort. The phrase is commonly used now in the law to denote an argument or strategy designed to undercut plaintiffs by demonstrating that they have suffered no damage. If no one has really been hurt, there’s no infraction, and we can all get on with our lives.
It’s an awfully convenient argument, and requiring a plaintiff to prove incontrovertible injury makes redress less probable. Harm isn’t always visible to the naked eye, and if the malefactor is clever, or powerful, enough, he or she may be able to do great harm without warning. If the government invades your privacy, or a corporation poisons your water, the effects may not be felt for years, but they are real. Sometimes “no harm, no foul” is used when there is obvious harm, as a way of obscuring it, or denying culpability. Thus the expression has developed a definite dark side in legalese; now it may go beyond time-honored principle to something a lot sleazier: “Yes, we broke the rules, but the same damage would have occurred if we hadn’t, so we’re not liable.” Only the government or another large institution can afford to take this position in court, illustrating another venerable legal principle: money and power almost always win.
In everyday speech, the phrase has become a stock response to an apology, loosely translated as “It doesn’t bother me, so you don’t have to feel guilty.” As Urban Dictionary notes, it has taken on a kinship with “No problem” or “no worries” to complement its persistent echo of the older “no harm done.” More broadly, “no harm, no foul” has become an all-purpose dismissal, with shades of meaning from “not my problem” to “everything’s fine.” These days, it is casually tossed off in myriad contexts, not just among athletes and lawyers, but chefs, farmers, art critics, you name it. And it has become almost empty of any specific meaning. It’s one of dozens of signals that there’s no need to take offense, or we’re all cool. What little rigor it had during its sheltered life among the basketball referees has vanished. And why should we care, after all? No harm, no foul.
mindset
(1980’s | therapese? | “basic assumptions,” “world view,” “framework,” “preconceived notions,” “idées fixes”)
This is one of those expansive words that has grown fat with use. “Mindset” goes back to the early twentieth century, but it didn’t spread until the seventies, when according to Google Books it started to appear regularly, particularly in writing having to do with therapy and religion, or politics. Now it is used everywhere, though if LexisNexis is to be believed, it is especially popular among athletes these days, a backhanded homage to the great Yogi Berra’s observation that ninety per cent of baseball is half mental. In recent years, some therapists have tried to retake control of the word by popularizing a standoff between “fixed mindset” (belonging to those who think they can’t get any smarter than they are) and “growth mindset” (those who rejoice in breaking through their mental barriers and blocks). It’s not clear to me how reputable this Manicheanism is, but it has gained traction in the on-line community.
We must pause to define the term, which I will do with reference to authorities. In 1983, William Safire described the evolution of “mindset”: “Tendency, attitude, or inclination used to be the primary meaning, akin to frame of mind; now the primacy goes to fixed state of mind or predetermined view.” The OED highlights “established set of attitudes, esp. regarded as typical of a particular group’s social or cultural values.” The evolution Safire describes — accurately, in my humble view — may result from the ambiguity, not to say polyguity, of the word “set,” which means “group” or “collection,” but also means “immobile” or “deep-rooted.” It’s a list of beliefs or assumptions that causes our minds to move predictably along certain paths, or it’s just the mind set in its ways.
When athletes use the word, it usually comes closest to “(mental) approach,” the quality that allows you to concentrate on the game and bear down harder than your opponents. Your mindset may need to change, or you may have trouble keeping the right mindset on the field. This does not correspond precisely to either of the primary definitions cited above, but it is related to the “growth mindset” discussed in the first paragraph. True, “mindset” doesn’t take prepositions as readily as “approach,” but a player might “bring the right mindset to the game.” The new word certainly does not preclude all the old clichés dear to athletes for generations: focus on winning, all I care about is the team, don’t worry about things you can’t control, etc.
There is a class of expression that lies dormant for decades, even centuries, and then bursts into the vocabulary. Among those I have covered, “holistic,” “comfort zone,” and “artisanal” are twentieth-century examples; others are older still, like “hurtful,” “ramp up,” or “overthink.” The OED cites “mindset” as early as 1909, but the word didn’t hit its stride for another sixty or seventy years. It seems like it ought to have come from the students of altered consciousness who had their heyday in the sixties (Timothy Leary talked about “set and setting”), but as far as I can tell its rise cannot be attributed to any particular guru, professor, or Esalenite.
special needs
(1980’s | therapese | “handicapped,” “disabled”)
Presumably descended from the already widespread phrases “special education” and “Special Olympics.” The crucial change in recent years has to do with part of speech; “special needs” has gone from noun-adjective phrase to unhyphenated compound adjective — not that the old formulation has disappeared. The compound adjective started getting tossed around in the eighties. Before it was applied wholesale to students, it went with orphans and foster children. As one commentator put it in 1984, the old word for “special needs” was “unadoptable.” (Another was “problem,” as in “problem child.”) Now it can apply even to pets. “Special needs” come in many forms, from familiar physical handicaps to mental or emotional instabilities of various kinds, or maybe your kid is just slow (excuse me, has a developmental disability). It has become standard to talk about special needs kids, or the institutions that serve them — classes, programs, transportation — or the group that they are part of; “special needs community” is a common expression now, and it wasn’t twenty years ago. When you’re talking about children, “special needs” refers to disorders of individuals; when it is used to talk about the elderly or anyone else, it normally encompasses conditions common to most members of the group.
That distinction is interesting, and to see why we’ll have to go back to the noun-adjective construction, which has been available for a long time. Kids generally do not claim special-needs status for themselves; there are plenty of people anxious to claim it for them. But other kinds of special needs are advertised by the group they belong to. Take a phrase like “special needs of the oil industry.” In 1975, this phrase could easily have been used (in fact, it was) not to emphasize the burdens fossil-fuel barons labored under, but the privileges that their circumstances entitled them to. It was the sort of thing a lobbyist or legislator might remark upon just before pushing through a big tax break. You didn’t have to be underprivileged (does anyone use that word any more? — it was all the rage back in the seventies) to have special needs. And you don’t now. But we are much more inclined to hear it that way thanks to the last thirty years’ worth of education policy. Before 1985 or so, “special needs” meant “I’m better than you” rather than “I’m worse off than you.”
What does “special” mean, anyway? When it doesn’t mean “specific or distinct” (as it did in the Middle Ages and the Renaissance) or “extraordinary,” as it did then and still does, it means “unique,” a much more recent definition dinned into us by pop psychology. When I was a kid, this use of “special” was common, but it had grown up only in the previous couple of decades. “I am special” came to mean “I am unique,” with the corollary that uniqueness entitled you to respect. It was a word used by eager kindergarten teachers to reassure children that they were valued. “Special needs” doesn’t rely on that definition, though there is a clear echo in parents’ insistence that each special needs child is unique (and adorable, and so forth). But lots of kids may have the same, or very similar, maladies, so that they can be grouped together for purposes of education or therapy. “Special needs” doesn’t have to refer to extreme or bizarre conditions; almost any kid with a problem may qualify if their parents are persistent enough, and some of ’em are, because special needs is where the money is.
The phrase seems more like a euphemism than anything else, a way of coating disabilities — mild or severe — with kindergarten cheer. Language so used is ripe for parody, and “special,” which for centuries had a generally favorable connotation, has become an insult. Uttered with a smirk, it means “substandard,” and every kid knows it, just as they understand that students with special needs have something wrong with them. Yet the expression has hung onto a palliative quality in spite of all the currents running the other way.
fashion statement
(1980’s | journalese | “statement”)
A bit of a cheat, given my avowed chronological limits, but only a bit: “fashion statement” arose in the 1970’s and became available for use outside the industry in the 1980’s. Google Books shows effectively no instances before 1970, when it started to creep into fashion journalism. By 1985 it could turn up anywhere in the entertainment press, from sports to theater reviews, and even in political reporting. “Fashion model” and “fashion sense” are much older, “fashion plate” is older still; any of them might have provided a model for the new coinage. “Fashion police” and “fashionista” came along later.
The phrase can mean a lot of things. As of 2015, it applies loosely to any wearing of clothes or accessories to get any sort of attention. Like a muumuu, the vagueness conceals many meanings. Let’s try a few on:
-announcement of a new line or even a trend, normally at a major show, but only by means of the clothes themselves (a designer’s description of her new line at a press conference would not be considered a fashion statement)
-declaration of allegiance to a particular designer or trend.
In these two cases, the statement is delivered by the clothes themselves, and it centers on a designer or trend. But fashion statements may say more about the person making them:
-using clothes and accessories to show that you are independent of the current mode, or have an interesting variation on it
or, more broadly,
-any expression of one’s character, preferences, passions, etc., etc. through the medium of apparel. I’m not sure if wearing an Aeropostale shirt counts as a fashion statement. More loosely still, the phrase means
-doing or wearing something because it’s chic
-drawing attention to oneself by means of what one is wearing.
But references to the world beyond the runway are possible, too. Fashion statements may take aim at a social or political issue (as in students wearing Confederate flag t-shirts, or showing solidarity with gay peers by wearing denim).
It doesn’t even have to be wearables: I came across an article in the Oberlin Review (April 3, 2015) about “decorative beards,” which are adorned with flowers, miniature Christmas lights, and who knows what. More than one student used “fashion statement” to talk about the new phenomenon. True, the donning of a three-dimensional object is still required to trigger use of the expression, but one wonders how long before beards or tattoos become potent fashion statements in themselves.
What you really have to watch with this expression is who (or what) makes the statement. It may be a designer, stunning this year’s audience with sheer audacity. It may be you or I, or it may just be the clothes. Who you are and what you wear may blend seamlessly, with your garments reflecting, nay, expounding your inner self through your carefully chosen wardrobe. Or you make your wardrobe as discordant or opinionated as possible in order to provoke reactions from bystanders. The gregarious looseness of this expression — abetted by the word “statement,” more general than declaration, announcement, or testimony — lets it cover such a broad range.
Fashion is often derided as superficial and trivial, but fashion statements, even light-hearted ones, are rarely dismissed out of hand. They are influential, or at least have the potential to be, and the power of a designer to inspire imitation through bold novelties remains considerable. Frankly, I would have expected the phrase to have taken on a negative cast over time, like “hipster” or “comfort zone.” No such derogatory usage has ever become the norm.
be careful out there
(1980’s | journalese (television) | “watch yourself,” “stay alert,” “pay attention”)
Now here is a phrase brought to us by television — or at least propelled into our vocabulary by television. In its full form, “Let’s roll, and hey, let’s be careful out there,” it was delivered near the beginning of each episode of Hill Street Blues by Sgt. Esterhaus, played by Michael Conrad. The drama debuted in 1981, and it seems to have been one of those rare instances of an offering of the popular arts that survived on sheer critical acclaim for a long time before it found a loyal audience. There was a lag of a year or two before the phrase began to appear regularly in the press, but its upward progress was swift. By the time President Reagan used it in May 1983, reporters cited Hill Street Blues knowingly, and there was no doubt about what had made it a household word. Of course the phrase, at least in its condensed form, is not catchy, an utterance utterly ordinary semantically and syntactically and a poor candidate for a cliché, yet it has gone from tag line to stock phrase. People who use the expression today may not know they are quoting Hill Street Blues, but they know they are quoting something. According to the New York Times (June 8, 1986), the writers of Hill Street Blues probably adapted the expression from The Police Tapes, a series of cinema vérité documentaries on police work shot in the South Bronx, where the sergeant ended roll call with a similar injunction.
The expression is as self-explanatory as any, I suppose, but it has one distinctive feature: as far as I can tell it is rarely used jocularly. Telling a person or group to be careful out there is not to be taken trivially. You say it when there is genuine danger, whether physical or financial. We have become more preoccupied with safety and security in the last thirty years, which may account partly for the spread of the expression.
There was a time when cop shows were a fertile source of catch phrases. “Who loves ya, baby?” (Kojak), “Book ’em, Danno” (Hawaii Five-O), “I pity the fool” (The A-Team). (The granddaddy of them all, “Just the facts, ma’am” from Dragnet, never appeared on the show in any of its incarnations, according to multiple on-line sources, although Friday did say “All we want are the facts, ma’am” once.) That era appears to have ended in the mid-eighties. Maybe I’ve missed something, but cop shows of the last thirty years or so don’t seem to have spawned any linguistic fads. Did Homicide or The Wire lend any expressions to the language? Miami Vice? NYPD Blue? Cops? I wouldn’t know, but I did come across several lists of cop-show catch phrases on the web, and none of them had anything later than Hill Street Blues or The A-Team. If this is so, can anyone explain why? Faithful readers?
I don’t think I ever watched Hill Street Blues back when it was new, but I watched an episode (o.k., half an episode) on Hulu to hear Sgt. Esterhaus for myself. More or less at random, I chose the first episode of the third season (1983), in which a nun has been raped and murdered and a man has gotten his head wedged immovably between a filthy toilet bowl and the wall. (That’s only two of the story lines, and the others were also pretty lurid.) The intent seemed to be to extort the rawest, most violent emotional response from viewers by assaulting them at every turn. If we do not respond viscerally, the producers have failed the advertisers. I had the same sensation about ten years ago, when I watched Desperate Housewives and Grey’s Anatomy back-to-back one fraught night. In each program, all of the half-dozen or so plot lines were grotesque, or nauseating, or perverse to the point of absurdity, almost as if the writers were challenging each other to make each plot twist more appalling than the last. I watch almost no television drama, and maybe I just have a knack for tuning in on the most excessive evenings. But if this sort of bombardment, or anything like it, is the norm, I don’t see how regular viewers can be anything other than numb. After weeks and years of this, how can pity and terror make themselves known? How can stories pushed far beyond anything like everyday experience — even the everyday experience of cops and emergency room doctors — tell us anything?
(1990’s | businese, computerese | “new business, firm, etc.”)
Hyphenated or not, this expression was well established as both a noun and an adjective by 1975, particularly in the business press, and it doesn’t appear to be much older than that. Google’s n-gram viewer finds almost no instances before 1950, and the curve doesn’t start to rise sharply until 1970, so it was fairly new then, but easy to use and soon absorbed. As a noun, it meant “commencement of operations,” or more colloquially, “opening” or “launch.” Normally it went with heavy industry, so it was common to talk of the startup of a plant or pipeline, for example. But businessmen love to scoff at grammatical distinctions — there’s no denying startups invariably entail startup costs, a startup period, or, heaven forbid, startup problems — so they converted it effortlessly into an adjective. “Startup” may also clock in as a verb, but in that part of speech it is usually written as two words (“start up”), even today.
By 1990, the concept of a “start[-]up company” had emerged, and occasionally the noun disappeared, leaving “startup” on its own. That wasn’t normal then, but today it is the rule. Back in the eighties, the shift from galumphing old factories to nimble new firms that didn’t make anything three-dimensional was driven by a hostile takeover of American life by the personal computer, a fait accompli by 1995. So many new companies concerned themselves with computer hardware and software that “startup” became common in computer publications by the late eighties. The word is older, but the way we use it today was probably driven by increasing computer sales, and computerese became the funnel for a businese expression — no surprise there. Michael Dell (of Dell Computers) was quoted recently on the “startup ecosystem” in India, and he even spoke of “meeting” (without “with”) several startups, not a use of “meet” I’ve encountered before. Since I haven’t actually offered a definition, here’s one I encountered on a German web site that does the job pretty well: “Startups sind Jungunternehmen mit besonderen Ideen – sehr oft im digitalen Bereich.” (Roughly, “Startups are young companies with distinctive ideas, very often in the digital sector.”)
My sense is that “startup” had a primarily favorable connotation when it was getting established between 1985 and 1995. Such budding concerns were generally pegged as plucky or scrappy, determined pioneers taking on long odds with heads held high and a sound business plan. In that respect, it was more or less the opposite of “upstart,” which was always uncomplimentary. But as the term has lost its novelty, it may also have lost its sheen; I no longer have the sense that it is complimentary. It seems more neutral than anything else.
The key related concept is the entrepreneur, always a figure celebrated in American mythology. Entrepreneurs breed startups, or shed them, or bring them forth from their heads, like Zeus giving birth to Athena. The crashes and recessions that have become frequent since the Nixon years may have dampened the spirits of some of these go-getters who start their own companies, but their flame burns bright as ever in our official worship of business. Entrepreneurs take the initiative, do their homework, embody healthy risk-taking, create jobs and prosperity, and otherwise exemplify the American way. Entrepreneurs are lauded especially on the right, because entrepreneurism is all about me rather than all about us. (That’s an oversimplification, but I’ll stick with it.)
According to my calculations, this is the 300th expression I have written about, at greater or lesser length. (I have become more loquacious over time, not less. Brevity is the soul of wit, indeed.) I chose “startup” as anniversary fodder partly because no operation was ever more shoestring or quixotic than this blog. I say thank you to my readers, to the ones who landed here once off of Google and never came back as well as the ones who read every post and comment faithfully. (You know who you are, and there ain’t very many of you.) I don’t do enough to encourage comments and feedback, but at least here I will say, if you ever feel an impulse to fire off a reply to one of my posts, or to send me an e-mail at usagemaven at verizon dot net, do it. Even if I don’t answer, I am grateful that you took the time, and I will profit from your wise words.