Lex maniac

Investigating changes in American English vocabulary over the last 40 years

food court

(1980’s | businese (real estate) | “dining area”)

Why “court”? It’s more like a dining mall, but in the early days food courts were found only inside of malls, and a mall within a mall would have caused confusion. And why “mall”? A mall was a wide pedestrian boulevard, often grassy, and it never had anything much to do with commerce — though shopping malls did typically have wide central corridors that one walked along. I haven’t done the research, but it seems to me that “mall” and “court” were adopted for these bastions of plebeian retail because of their grand associations with aristocracy, elegance, and luxury. Not that there’s anything particularly elegant or luxurious about your standard food court, yet “court,” with its echoes of royalty, lends the enterprise a touch of class. A more plebeian explanation is that the word conjures up a big open space, like a basketball or tennis court. Or it’s where you go to judge the food.

One chronicler of the food court lays it at the door of James Rouse, a developer who responded to Levittowns by creating the planned community Columbia, MD ten years before he opened Harborplace in 1980 in downtown Baltimore. (I grew up between those two landmarks, in the heart of Rouseland.) For a developer, he wasn’t that bad, according to the New York Times obituary. It’s not clear if the phrase “food court” is due to Rouse; he may be responsible for “shopping mall.”

Pioneering food courts stirred in the seventies, and by the mid-eighties they were de rigueur, and not just in newly constructed malls — older malls were forced to renovate in order to add them. The term followed quickly, arising in both Canada and the U.S. by the late seventies (the oldest hit in LexisNexis comes from a Toronto paper in 1979). The term came straight out of the oddly buoyant language of developers, but food courts themselves were symbols of adolescence then, understood as places for the disaffected young to get away from their parents and pretend they were adults. The emphasis on fast food (they were sometimes called “fast-food courts”) made them popular with kids. They turned up next on college campuses, heralding a revolution in campus food service. Adults had to get used to eating in them soon enough when they invaded hospitals, airports, and office buildings.

The idea of restaurants and specialty food stores in shopping malls was not new in the seventies, but gathering several of them around a large open seating area was an innovation that demanded a new expression. The malls I went to in my youth didn’t have food courts, but they had drugstore lunch counters and Orange Julius and Baskin-Robbins. I don’t remember fast food restaurants being common in malls back then, but I didn’t get around much and they may have been. (My beloved Gino’s on Frederick Road wasn’t part of any mall, I’ll tell you that. Now it’s a McDonald’s.) Some chains — Sbarro’s, Panda Express — really took off with the advent of food courts.

I find them more than a little repulsive, personally. The open space — bare except for nondescript tables and chairs and people who don’t want you anywhere near them — always feels hostile, and there’s nothing I want on any of the menus. Then there’s the indignity of figuring out how to punch the order into a machine that doesn’t work half the time. Whatever I order, it’s cold by the time I find a seat, and it wasn’t all that good when it was hot. Plastic furniture, plastic cutlery, and the food . . . Everything predictable and disposable. That’s partly why food courts are becoming passé after a thirty-year reign, as “food halls” supplant them. It’s the same idea, only the restaurants on offer are more varied and quirky (and pricier — this is about consumption, after all). “Food hall” makes more sense as a name, “hall” being a word for large open area with action at one end, but “food court” should remain in the language for at least another generation or two.



got you covered

(1980’s | athletese? | “backing you up,” “helping you out,” “taking care of you”)

got your back

(1980’s | athletese | “looking after you,” “watching your back,” “behind you”)

Odd thing about these two expressions. They don’t, or didn’t, mean the same thing, but there’s no reason they couldn’t, and they appear to be growing together. (Sometimes you even see “got your back covered.”) They arose in their latter-day definitions around the same time — “got you covered” a few years earlier, according to LexisNexis — and they both have rather unsettling antecedents. “Got you covered” was something police officers shouted when they had their guns drawn, and it followed “Don’t move” or “Put your hands up.” That use was out there at least as early as the dates listed above, if not earlier; it has developed a friendlier side. “Got your back” was once more often than not followed by “against the wall,” or perhaps “to the onrushing train (or whatever),” that is, in some kind of danger. Shorn of the prepositional phrase, it means the opposite: guarding someone’s blind spot, or more generally keeping them safe from harm. Both expressions seem to have taken root in athletese first, particularly “got your back,” which as far as I can tell was used mainly by African-Americans in the nineties.

The two expressions threaten to merge when they have both pivoted as far as they can from their risky cognates. “I’ve got you covered” can mean “I’m protecting you.” But it is used more in a jocular vein, not as a matter of life and death but as a matter of offering more choices to the customer than one’s rivals. It’s a matter of being all things to all people. “Got your back” traditionally carries higher stakes. At least in its youth it was not said lightly. It’s what a bodyguard says; it suggests a real threat. It is often used as an assertion of mutual loyalty, reminiscent of the older expression “you scratch my back, I’ll scratch yours.” (Definitely not “got your back up,” which meant irritated or offended.) But it doesn’t have to; no reciprocity is required.

More and more in everyday use, “got you covered” and “got your back” are becoming interchangeable, although the process is not complete and the two phrases retain separate identities. Most commonly, “got your back” is used to mean “got you covered”: I’m here to help rather than “I’ll protect you.” The blurring may occur during discussions of holiday gift-giving, for example, coinciding with the way “got you covered” is used by purveyors of goods and services. Without evidence, I suggest that the blurred distinction stems more from carelessness than from anything else. Yet why shouldn’t they merge? “Got you covered” sounds like something an insurance company would say (it was an Allstate slogan in the late seventies), and what does an insurance company do if not protect you against bad luck and disaster? Why couldn’t the adjuster have your back as well as having you covered? Two similar expressions, born around the same time and in the same place, gradually coming to be used in the same way despite the original distinction. We owe the confluence of the two expressions to consumerism rampant; “got you covered” went over to the dark side long ago, and “got your back” has become a good example of merchandese. One wonders if there will be any distinction left within a generation.


C-suite

(late 1990’s | businese | “executive suite,” “boardroom,” “front office”)

It’s “C” as in CEO, CFO, CIO, COO, etc. — the C-suite stands for the very top brass, possibly a physical location but, more often, shorthand for the executives themselves. So there are many things it doesn’t stand for: Caesarean, clostridium, century (as in “c-note”), control (as in “c-panel”), capitalist (o.k., maybe that one), California, the programming language, the middle note on the keyboard, etc. Or the mathematical sense of “constant.” My high school calculus teacher used to say, “Don’t forget the seven c’s” while we were taking quizzes, to remind us not to neglect an essential part of the solution to an indefinite integral. Then there’s the homonymy with “sea,” “see,” “si” . . . It’s a rich range of referents, but here it stands for something less even than “chief”; the “C” stands on its own and the abbreviations no longer need their spelled-out forms. “Chief-suite” would sound very strange.

I believe the expression was invented by consultants (another “c”), and it started to show up in LexisNexis in the late nineties. It soon made its presence felt all over the English-speaking world — in the U.S., U.K., and Australian business press — suggesting the easy global reach of the rentier class, through which new vocabulary gets around the world faster than the latest strain of the flu. “Executive suite” just took too darn long to say, I reckon, but the jargoneers reached for “C-suite” rather than “E-suite,” presumably for alliterative reasons. I started noticing it in news reports in the last year or two, so it is not a specialized term any more. It would still be theoretically possible to misunderstand the phrase, but it’s generally quite clear in context.

“C-suite” is fast and glib, and glibness goes with arrogance. The informality of the expression seeks to draw our attention away from the sheer power exercised by a very small group of men (mostly) at the top. Two quick spondaic, sort-of rhymed syllables represent the power to make or break thousands of people, yet they make that power seem less forbidding. If you get the right ear of the right boss at the right time, fortune smiles upon you, and if you don’t, well, there’s a tough adjustment period ahead. Not for the boys in the C-suite, for you. They will do fine, getting paid hundreds of times what the rest of us make whether they do the company any good or not. The bigger the screw-up, the more they take home, and even outright lawbreaking shaves only a few million off the total compensation package. In the U.S., we have historically tolerated this sort of thing until the infallible C-suite boys drive the entire economy into a ditch, which they came close to doing in 2008 before the government bailed them out, allowing a semblance of economic normality — the rich get richer and the rest of us don’t do as well — to persist. What happened in 1929 was worse, and now the government (both parties, but especially the Republicans) is busily replicating the conditions that made the Great Depression possible, by encouraging Brobdingnagian wealth disparities and refusing to regulate financial institutions. The only good thing that happens during such calamities caused by a tiny group of oligarchs is that the rest of us finally see them for what they are — amoral, irresponsible greedpigs — and stop buying their lies. But even then the C-suite never goes away or gives up much. At worst, they have to pull in their horns for a while.


Friendsgiving

(2010’s | internese | “alternate Thanksgiving”)

How old am I? Old enough not to have been aware of Friendsgiving when Adam from Pittsburgh, staunch member of the younger set, mooted it the other night. (Have I mentioned how much I like it when readers submit grist for Lex Maniac’s mill?) Well, older set, it is a sure-enough phenomenon, roughly a decade old, in general circulation for at least five years, part of advertising campaigns, celebrated in blog posts, and crowned with etiquette and customs of its own. Though it’s generally held on a different day and billed as a supplement to the traditional family Thanksgiving gathering, it may serve as a genuine alternative for those who just can’t bear one more big family get-together. It’s true: The only real, persistent difference between Thanksgiving and Friendsgiving seems to be the company. (For example: Friendsgiving and Thanksgiving menus overlap quite a bit; most Friendsgiving discussions I’ve come across on-line assume a turkey is involved, probably mashed potatoes and/or sweet potatoes, pumpkin pie, etc., etc. Due and cheerful allowance made for vegetarians, vegans, and the allergic and intolerant, of course.) That need not imply rejection of the family bosom; lots of students and young adults starting out in the workforce just don’t have the time and money to fly home for three days even if they want to. But if you have to go anyway, Friendsgiving gives you a pleasant memory to counteract your painful, dutiful family get-together.

Merriam-Webster dates the term back to 2007 in print — or on screen — and LexisNexis concurs. LexisNexis further suggests that it really burst forth in 2013, though Bailey’s (of Irish Cream fame) used the name in a 2011 promotional campaign. Already well-established, Friendsgiving has spawned, or at least served as a precursor of — precursed? — ersatz holidays such as “Friendsmas” or “Galentine’s Day” (that’s “gal” as in “woman,” o.k.? I don’t make this crap up.) It is soon to be the title of a major motion picture. Roughly coeval with Twitter and a creature of social media, it crossed into everyday language thanks to the mainstream press’s earnest monitoring of new vocabulary in the Twitterverse. All those anxious culture reporters, assigned to cover the latest developments in this or that tranche of the internet — they need something to do, so they fill us in on the latest trends, or the third-to-latest trends, at least.

Though the Thanksgiving episodes of “Friends” are often cited as examples of Friendsgivings past, the term did not exist then, as far as I can tell. Little doubt that the expression comes out of college lingo; a high percentage of its instances continue to occur in university student publications. But it is more widely available now; the business press has sniffed it out, having determined there’s money in it (the more Friendsgivings, the more turkeys sold). And there is certainly a whiff of the sumptuary about it; several on-line how-to guides prescribe lavish boards and elaborate place settings. There are some basic rules that seem to be pretty universal: They are potluck affairs; the host prepares the main course. The crowd consists of people of a certain age (quite a bit lower than mine) generally without any family connection. Everybody brings wine and pitches in.

Friendsgiving started as a distinctly young person’s phenomenon, but that was ten or fifteen years ago; now people in their thirties celebrate it. Some of those same people will still be doing it twenty or thirty years from now, by which time it will have become another tired old tradition that the young circumvent, or try to, while the old laud themselves for having been in the vanguard long ago. Thus the revolution spends itself.


said no one ever

(2010’s | internese | “that’s out of the question,” “no way”)

Consider the following sentence: Yogi Berra said no one ever goes there any more. This word string turned up quite often in LexisNexis — generally punctuated differently — which bedeviled me in the course of my research. But always as a component of a sentence followed by a verb, never as a set phrase with nothing after it except a period. (Urban Dictionary offers some reasonable definitions.) That all changed in April 2012, according to LexisNexis, quoting someone’s Twitter feed, and the first several instances seem to be concentrated in social media, not the best material to scout in LexisNexis. If someone who knows more of Twitter than I do wants to do some extra research and push the date back further, I will happily report the results. The expression has since leached into advertising and to some extent into general hip vocabulary, although I don’t hear people say it much; it occurs more in writing.

I first encountered the phrase in a subway advertisement, not more than three or four years ago. I had to look at it twice to grasp the full import, so it gets a few points for novelty, though to my ear, it’s annoying on most other fronts. Although it is not as obnoxious as “not!,” or as conspiratorial as “not in a good way,” it shares with them a sense of pulling back from a statement that you seemed to be espousing. There’s still a little spite there, but “said no one ever” is different, since it usually occurs after a line that would strike the average listener as dubious if not utterly implausible. The point of the expression is to signal that what has just been uttered is an impossibility — not logically but empirically. You couldn’t mean that, the expression says. Nothing or nobody could mean that.

“Said no one ever” belongs in the wearisomely hip category, alongside “baby on board,” “back atcha,” “been there, done that,” “don’t go there,” “no pain, no gain,” “same old, same old.” Most of these belong to a class of expressions that become so common so fast that a sort of whiplash sets in, maybe even a full-blown reaction, but the expression settles in hard and fast and can’t be rooted out, even if it becomes less relentlessly common. “Said no one ever” hasn’t attained that level; I suspect that in the heartland it still sounds unfamiliar, if it’s known at all. Besides, there’s not much of lexical interest there, no cleverness, no sting of repartee. (In fact, it’s designed to shut down repartee.) It has not had the meteoric impact that most of its cousins listed above had. But it may well become more popular in the next five or ten years; I suppose the incubation period must vary from expression to expression.

Thanks as ever to lovely Martha from Queens for proposing this expression! Your wish is my command.


raise your game

(1980’s | athletese | “give 110 percent,” “turn it up a notch,” “excel”)

Any possessive pronoun will do. The definite article would definitely imply something different, closer to “raise the bar.” An early elaboration was “raise the level of your game,” and in either form it seemed to emerge most readily from the mouths of soccer and tennis players. LexisNexis finds no instances before 1980, but I would expect there to be a few years’ lag for a phrase like this one. By the end of the decade it was available in any sport, but not beyond sports; by the end of the century, that was still largely true, and today it remains largely but not exclusively the property of athletes. (It seems to have become much more prominent in Great Britain than in the U.S. in the intervening thirty years.) But it has trickled into wider use, through arts writing, which is odd considering that artists don’t usually adopt athletic novelties, and now much more often among politicians, which is less surprising. When non-athletes use the phrase, it brings with it a hint of, if not competitive instinct, the idea of a particular skill that you polish in response to an external stimulus, which might come from a competitor, colleague, or coach.

Finally, in the last ten years, “raise one’s game” has become popular among candidates and elected officials; it puzzles me that it took so long, both because it is an athlete’s expression and because politicians are forever claiming they will do better at the same time they claim that their performance to date could not have been improved on. When athletes talk about raising their game, it’s fairly specific: execute a strategy, concentrate harder, or hone a particular skill. When a politician does, it means everything and nothing — an unbacked promise to improve. My sense is that politicians use it more often of others than themselves; they denigrate other politicians by suggesting that they need to raise their game rather than take responsibility for raising their own. Even so, the phrase has an optimistic sound, conjuring a picture of the game-raiser spurred to greater heights, overcoming obstacles to provide ever better service to an adoring public (or the opposite, when used by an opponent). It just seems like such a useful phrase that it should have caught on in politics faster than it did.

There are a number of related expressions: “top of your game,” “elevate your game”; now “up your game” is a hip substitute for “raise your game.” In this context, “game” is a rough synonym for “performance” with a dash of will to win. The priority appears to go to “top of your game,” which seems to be a bit older than the others. The apparent distinction between being “on top of your game,” which might mean managing your abilities exceptionally well, and “at the top of your game,” meaning playing at your absolute peak, rarely exists in practice; it means “doing your best” no matter which preposition is in play. “Elevate your game” came along a few years later and seems to be an uncomplicated synonym replacement. “Up your game” became visible after 2000, drawing on the already familiar use of “up” as a verb (“up the ante”). It has nothing to do with “the game is up,” as a detective would say to a criminal, meaning he has been found out and it’s all over.


self-medicate

(1980’s | doctorese? therapese? | “be your own doctor”)

From new expression to casus belli. Armies of people self-medicate (actually, we all do at least some of the time), and such practices are generally discouraged, particularly in the case of mental or emotional distress. While self-medication might involve a very wide range of substances, in practice it is generally brought up with reference to alcohol and popular recreational drugs. In some quarters it is assumed that regular use of intoxicants must mean that a person is dealing with some kind of mental health lapse. “Self-medicate” occurs with extraordinary frequency in discussions of addiction; the connection between drug addiction and (undiagnosed) mental illness is all but tautologous — you only get hooked if you have a problem. I’m sure that is often true, but there are exceptions. And of course, addiction may arise from physical pain as well as mental. Self-medicators don’t use only drugs; overeating is a common villain in this story, and other indulgences may qualify.

Google Books coughs up a couple of examples of the phrase pre-1970, but it didn’t start coming into its own until the late seventies. “Self-medicate” is not an idiom in the sense that there’s anything counterintuitive about it; the two components are readily available and easily combined, and the first time you heard it you wouldn’t know it was a fixed phrase. The OED dates “medicate” back to the seventeenth century, but the citations show that at first it more likely meant “cure” or perhaps “infuse (an object) with medicine”; it did not mean “treat (a person) with medicine” until the late nineteenth century. The word “medication” (now often simply “meds”) did not take off until after the war, when everybody started popping pills. Thus “self-medicate” as we know it today probably could have burgeoned only in the twentieth century.

Because health care providers almost always disapprove of self-medication, it has taken on a transgressive, rebellious quality — not so much a sign of foolishness as independence, or a refusal to cede the power to make decisions for yourself. The libertarian counter-narrative has a certain appeal. Truth is, the doctor doesn’t want you to come in every time you have the sniffles; minor ailments can be handled just as well with over-the-counter remedies. But if it’s the least bit serious, you’d better get down to the office. And bring your insurance card with you.

Obviously, the medical profession has a lot at stake and wishes to maintain a monopoly over patients’ life-and-death decisions, and the less grave ones as well. Self-medication is deprecated partly because it threatens that monopoly. They want us to use their high-priced fancy drugs that put money in their pockets all up and down the line. There is another side to the case, of course. Those fancy drugs have been tested and looked at carefully by a lot of experts (that’s called regulation, people — and we need lots of it), so their failures and side effects are known. Home remedies and nostrums may do all sorts of damage without doing any good, and doctors have a duty to warn us of that. Doctors are bound by the Hippocratic Oath — though pharmaceutical and insurance executives are not — and so must advise against doing ourselves harm. It’s tempting to regard the prescription regime simply as a way to keep us in thrall to the medical establishment, but like most things, it’s not that simple.


mental health day

(1980’s | businese? therapese? | “sick day,” “day off”)

Always closely linked to workplace stress (cf. “power nap” and “go postal” in this regard) and always tied to the much older concept of the sick day, that venerable custom which affords employees the right (nay, duty, in the case of contagious disease) to take an unscheduled day off due to unforeseen illness. In the seventies, the phrase “mental health day” was unusual, most often used about intensive care nurses or inner-city teachers; now anyone with a medium-stress job may need one. The expression became more common in the eighties, beating out competitors including “sick-and-tired leave,” which I rather like. I don’t remember hearing it before the mid-nineties, when I learned it from my worldly-wise girlfriend. That was just after I had started working nine-to-five following a stint in graduate school, where every day is a mental health day.

I should not fail to mention World Mental Health Day, which falls every year on October 10. This is not a day for everyone in the world to sick out (great idea, though), but a day to learn about and think about mental illness and how we may help those who are afflicted. That’s actually what you would expect from this construction; phrases that end in “day” often refer to such secular observances. (Weeks and months get the same treatment.) Oh, it’s Mental Health Day and the president of the Mental Health Society is giving an address at the bughouse. Or getting one. I apologize for the persiflage, but sometimes I just can’t resist. Anyway, if it weren’t for the fixed association with “sick day,” we might hear this phrase quite differently.

There has never been a generally effective way to prevent people from taking sick days when they feel fine physically, and employers resent that. But the mental health day partly redeems it; you’re skipping work to cope with excessive stress, which, left unchecked, will exact a much greater toll — physical and mental — than an occasional day off. The expression still carries the implication of an undeserved break, but that appears to be changing slowly as the old bosses die off. The next generation may be more willing to accept them as inevitable. Maybe union contracts of the future will include provision for mental health days. And power naps.

Lovely Liz from Queens, or maybe her daughter, pointed out recently that “mental health” means mental illness. It’s true, and it’s a big reason why troubled minds continue to attract less sympathy than injured bodies. If you are not demonstrably mentally ill, then mental health is not an issue; the subject just doesn’t come up. That isn’t true of corporeal health, which we understand in more complex terms than mere absence of obvious infirmities. Improved mental health is a goal only for those who know they are sick. There is such a thing as mental fitness, but it’s a legal expression. And it’s not analogous to physical fitness; it’s more like the minimum strength required to get around without keeling over. Just as most people have minor bodily ailments that don’t prevent them from getting through the day, most of us have observable but non-crippling deformities of the mind or spirit. But we take greater pains to ignore them, because of the shame and stigma they bring.


subtext

(1980’s | academese? literese? | “underlying idea,” “hidden meaning,” “substratum”)

Ah, the drama. Dramatists have given us hundreds of new expressions — Shakespeare alone is responsible for dozens — but this expression is different in that we owe it to a theorist of drama rather than a creator of it. At any rate, several sources point to Stanislavsky as the source of this expression. It might be defined in different ways. Story behind the story, undercurrents among the characters, unexplained background, unexpressed motivations. Most simply, it’s the unstated yet significant part(s) of the plot, and it may be made obvious to the audience or not. It is the result of what we used to call “reading between the lines,” even though it places itself under rather than within.

The OED’s examples go back to the late nineteenth century, so Stanislavsky didn’t invent it, but he doubtless gave it a powerful push. That great literary critic Freud’s “unconscious” (das Unbewusste) was carelessly rendered as “sub-conscious” for many years, which probably helped “subtext” gain a toehold in everyday language. Another precursor was “subliminal,” as in message, which spiraled into the language in the late fifties thanks to the underappreciated Vance Packard, who published an exposé of dubious Madison Avenue practices called “The Hidden Persuaders.” Subliminal advertising was intended to bypass conscious understanding or thought and appeal to a part of the mind over which we have limited control (there’s your subconscious), a bit like hypnosis. So you want to buy the product without knowing why. (“Liminal” means “of or pertaining to thresholds”; the messages were intended to stay below the threshold of conscious thought.) It’s not clear how effective subliminal advertising was, but pretty much everyone except the advertisers agreed that it was unethical.

Probably in the late seventies, “subtext” ventured forth from its theatrical cocoon and took wing. LexisNexis would have you believe it entered political contexts first, but that may be due to its indexing bias. Political scientist Larry Sabato recently defined it as “the between-the-lines character sketch that guides and sets the tone for press coverage.” In this definition it has a personal focus; the subtext gives us a frame for understanding coverage of political figures more than issues or developments. While it may come out of a pattern of undisputed facts adduced in previous reporting, it is always more or less subject to bias. That’s part of the reason Trump’s defenders and critics see him in such starkly different terms; they are starting from entirely different premises. Every word and act is measured against antipodal subtexts, both maintained with considerable rancor, each producing a radically different basis of interpretation.

Sabato’s definition is unusually precise. “Subtext” has come to refer generally to any underlying message or idea that must be divined, or ferreted out. Those who grasp it will understand the situation better and respond more effectively. As in politics and fiction, there is room for idiosyncratic judgments, so different observers may see different things underlying a situation, or assign greater or lesser significance to the same underlying element. Applying principles of drama criticism to real life is a touchy business, but it’s inevitable. If you really want to understand what’s going on, you need to look below the surface, in life as in literature.


granular

(2000’s | computerese | “precise,” “precisely categorized,” “well-organized,” “detailed,” “distinct”)

The new meaning of this expression has become ingrained (sorry) in our language rather quickly. Twenty years ago it turned up occasionally in computer talk, generally modifying “data” or “information”; now it turns up in all sorts of speech. Like many expressions born of computer jockeys, it is rather vague and indiscriminate, particularly ironic in light of its meaning, and so it has spread to modify lots of things since the dawn of the new millennium. A recent list drawn from LexisNexis: “focus,” “workloads,” “control,” “detail,” “list,” “insight,” “goals,” “analysis,” “stories,” and “urea” (just wanted to see if you were paying attention). And there are times when the equivocator in me wants it to be replaced by “granulated.” Take this clause from 1999: “Because precise capacity planning requires a highly granular collection of network traffic data . . .” It’s not the collection that’s granular; it’s the data. But I like the idea of a “granulated collection,” in which the data is chopped and ground ever more finely, perhaps to the point where we would have to call it “powder(ed).” Excuse me, I have to go powder my data.

There is at least a notional connection between the new meaning and the old, which was firmly literal, describing the consistency of sand or table salt, too coarse to be powder, too fine to be pellets. Useful in the laboratory and the kitchen, it had three fields, broadly speaking: industrial processes; meteorology, where it modified “snow”; and cuisine. “Granular” is about two hundred years old (“granulate” is older still), but only recently has it developed any kind of figurative life. In computerese, it suggests more of a sliding scale than an absolute state; data is stored, organized, and retrieved in more or less granular ways, with more granular understood to be better. Greater granularity implies more than taking a data set and channeling it into new and finer categories; it also implies more reliable access, and perhaps, as a consequence, data made useful in more contexts or fields.

Computerese has taken a number of terms with primarily physical applications and used them to talk about things that have little or no corporeality. The key change undergone by such three-dimensional words drafted into computerese is precisely that surrender of dimension. (Is data one-dimensional? Two? Three? N? Or none?) “Access,” “bug,” “folder,” “handshake,” “packet,” “virus.” “Granular” is an adjective, not a noun, which seems noteworthy. Not all computer terminology comes from existing words, but a lot of it does, and none of the examples adduced above seems to have taken any great semantic twists or turns as it settled into the new dialect. Even “granular” seems a logical enough borrowing, if not quite as right as some of the nouns. As noted above, computer whizzes don’t have much of a way with words, but their comprehension seems solid and straightforward in these instances, at least. Give the geeks their due.
