
Lex maniac

Investigating changes in American English vocabulary over the last 40 years

Tag Archives: social media

data mining

(1990’s | computerese)

I propose we blow up the boring old meaning of this phrase and start using it to mean the act of planting cybertraps in databases that will detect the intrusion of software designed to find patterns and connections within vast assortments of data and mess it up somehow — disable the software, infect it with malware that captures its proprietary information, flood the CEO’s inbox with threatening e-mails, I dunno. Someone who knows something about computers can make better suggestions. If you’re data mining and you step on a mine, what happens?

For “data mining” as we use it has no explosive traces. It is a computer term, pure and simple — always has been since it started turning up in the late eighties and early nineties — denoting the practice of analyzing large accumulations of data quickly with an eye to digging out buried yet useful trends and relationships. “Data mine” is much less common but may be employed as a verb and occasionally as a noun, though normally with an indication that the writer is aiming at drollery. Databases — why do we call them that, anyway? — don’t have ore, veins, or mother lodes, and you don’t freebase data. Data mining need not have anything to do with “undermining,” which goes back to the meaning of mining mooted above, though, when used for nefarious ends, it often does anyway.
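The kind of pattern-digging described above can be illustrated with a toy "market basket" sketch: given a pile of transactions, surface the item pairs that keep showing up together. All the data and names here are invented for illustration, not drawn from any real system.

```python
from collections import Counter
from itertools import combinations

# Invented toy data: each set is one customer's basket.
transactions = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "cereal"},
    {"bread", "butter", "cereal"},
    {"butter", "milk"},
]

def frequent_pairs(baskets, min_support=2):
    """Return item pairs that co-occur in at least `min_support` baskets."""
    counts = Counter()
    for basket in baskets:
        # Count every pair of items bought together in this basket.
        for pair in combinations(sorted(basket), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

print(frequent_pairs(transactions))
# → {('bread', 'butter'): 3, ('butter', 'milk'): 2}
```

Real data-mining systems work on millions of rows with cleverer algorithms, but the shape of the exercise is the same: buried in the heap, bread and butter travel together.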

One web site points out that data mining is useful in many professions, and we often associate it with marketing in particular. (I take an unwarranted pause to point out that “marketing” still meant “grocery shopping” when I was little, at least among older people. There’s a word that has changed forever.) That accounts for the suspicion we attach to data mining, which after all is capable of being perfectly innocuous (cf. “game the system”) and is often used for laudable ends, such as improving customer service or finding correlations in wide-ranging clinical trials that escape even the most observant doctors. Yet data mining leaves a bad taste in our mouths. How can this be? We resent it when vast, wealthy entities exert power over us, especially when they invade our privacy to do it. Worse yet, we know the data jockeys will use whatever they get to relieve us of our money. First they steal our secrets and then they use them against us. Oh, they’re just making a few recommendations, trying to help, but we know what’s really going on.

As I’ve pointed out before, one of the problems of our culture is that we heap up more data than we can possibly use; the more we have, the easier it is to miss something important. We’re in a permanent state of information overload, which is not the same as “too much information.” Data mining works to defend against such slips, and it certainly ought to be a useful tool. Would better data mining techniques have helped to prevent 9/11, for example? I don’t see how they could have hurt.

This expression was given to me years ago by my old buddy Charles, and I dusted it off in honor of a recent exchange of hospitality. High time.



hive mind

(1990’s | science fiction | “zeitgeist,” “will of the people,” “conventional wisdom,” “groupthink”)

It all started with the bees. The British apiarist H.J. Wadey probably did not invent the term, but he used it in the 1940’s to describe the process by which lots and lots of bees, each of which has next to no mental capacity on its own, work together to create an intelligence that cannot be accounted for simply by adding up the microcapacities of each bee in the colony. There was something a bit mystical about it, and that transcendent quality was picked up by other authorities on bees. From there it became the property of science fiction writers, for whom the concept was tailor-made. In their hands, it could retain the sense of a purer intelligence emerging from the collective, or it could be a means of imposing zombie conformity and obedience on the rest of us. Science fiction runs to utopia or dystopia anyway, and the hive mind can be used to exemplify both, even in the same book. The phrase had not become common outside of science-fiction circles; I doubt most Americans were familiar with it when I was young.

There the matter rested until the mid-1990’s, when the expression received the benefit of two cultural megaphones: first Kevin Kelly, founder of Wired magazine, then the film Star Trek: First Contact. Kelly saw the hive mind as the result of amplifying human capability with computers (preferably implanted) to enhance our collective intelligence and create a larger force, human yet superhuman, that would change everything for the better — although individual drones might not fare so well. A year or two later, Star Trek: First Contact came out, which featured the Borg as the villain. The Borg had appeared on Star Trek: The Next Generation (the Patrick Stewart cast, which also populated the film), but this seems to have been the first time the phrase “hive mind” ever appeared in the script. The Wired geeks and the Star Trek geeks between them formed a critical mass, and “hive mind” emerged from the sci-fi shadows and began to be encountered much more often. The onset of social media certainly didn’t slow the spread of the phrase; here again, the concept may be beneficent or noxious.

Kelly was an optimist, positing that the computer-aided hive mind would lead to a much greater capacity to solve human problems, whereas the Borg represents the dark side, gobbling up plucky individualists and producing numbing conformity while enriching its own hive mind with the contributions of other civilizations (sounds like imperialism, or the one percent). My sense is that today the pessimists are winning; “hive mind” has become a favored grenade to toss across the political divide, as stalwarts of the right and left accuse their opponents of stupidly parroting the sentiments put forth by their respective opinion makers. On this view, the hive mind is simply an overlord to which the bad guys pledge dumb fealty. (Of course, both left and right have their share of unreasoning myrmidons, but I wonder if they may be more characteristic of the right wing. “Dittohead” is no longer fashionable, but it’s worth noting that only right-wingers called themselves “dittoheads,” often with pride.) Even if the insulting use predominates right now, the more hopeful meaning may rise again. Take UNU, for example, which promises to help us “think together” by setting up a “swarm intelligence.”

Once you get away from the notion of a literal superbrain, the metaphorical uses of the expression come quickly into view. A single brain can itself be seen as a teeming hive mind, with neurons equivalent to drones, each doing its tiny duty but producing prodigious results by subordinating itself. (A more recent issue of Wired showcases an example of this sort of analogy, which has no counterpart for the queen bee.) More generally, the hive mind may serve as a symbol of our politics, in which millions combine to create and support a unified national government. (If that idealized picture makes you snicker, you’re not alone.) Our national motto, E pluribus unum, means “out of many, one,” and that’s not a bad summary of how a hive mind works. No single individual knows everything or can do it all by herself; the nation must muddle along making the most of whatever contributions it can get from hard-working citizens, who create the polity by banding together, at least partly unconsciously, to assert a collective will.

This post was inspired by the one and only lovely Liz from Queens, who nominated “hive mind” only last week, thereby sparing me the trouble of coming up with a new expression to write about. Thanks, baby!



avatar

(1990’s | computerese | “totem,” “alter ego”)

Occasionally a word will just flip around. For thousands of years, “avatar” meant corporeal embodiment of an abstraction, often a god (the word comes from Sanskrit and is mainly associated with the Hindu god Vishnu). What does it mean now? An incorporeal representation of a flesh-and-blood person. Now your avatar is a tiny image file that stands for you in the on-line universe — on Facebook, in a role-playing video game, or on a blogger’s profile. It could be a portrait but usually isn’t, and there are dozens of sites that will give you some raw material and a little coaching to help you make your own. Another word for it is “icon,” which is a little different but not that far off. “Avatar” was definitely available in its present form by 2000, though perhaps not all that widespread in the days before everybody needed one to keep up a virtual social life. Now it is all over everywhere.

Is it odd that both terms have a distinctly religious cast? By the time you’ve settled on which South Park character you want to represent you on your social media pages, the sense of majesty and mystery that religion must command has pretty well worn away. Both “avatar” and “icon” have had non-religious uses for at least a century now, or at least “avatar” has, but there’s still a distinct whiff of it. You might also argue that technology, the most up-to-date religion we have going, has simply appropriated sacred vocabulary and repurposed it.

The question leads to a favorite pastime of mine: constructing narratives of cultural decline. “Avatar” once had a mystical tone, associated either with the gods themselves or people who live out philosophically pure distillations of noble principles. Now it’s a few pixels thrown together that allows you to play video games. A decided downward drift, and all in my lifetime! A quick scan of search results does confirm that Google, for one, doesn’t think old-fashioned religious uses of the term count for much — though, of course, the results are skewed by the 2009 blockbuster film. I didn’t see it, but from what I gather the eponymous characters in the film had at least a touch of the guru or sage about them. (I remember the movie as being about blue characters who weren’t Smurfs, but that just shows you how primitive my cinematic consciousness is.)

On-line avatars remind me of choosing your token at the beginning of a Monopoly game (we usually called it simply a “piece,” but if I remember correctly, the rules used the word “token”). The dog, the shoe, the little car, etc. (I liked the wheelbarrow myself.) Most people had a preference, whether they considered it lucky or somehow apt. True, you couldn’t cobble your own avatar together in Monopoly; you had to take what Parker Brothers gave you. But those were the boring old days, my children. Image files are not my strong suit, but I came up with a few related user names, free to a good home. Want to be an actress? Ava Tardner. Fond of the Middle Ages? Avatar and Heloise. For fans of ancient video games, Avatari. For a historical touch, how about Amelia Earhart, avatrix? That’ll light up the chat boards.



hashtag

(enginese, computerese | “number sign,” “cross-hatch”)

Let’s start with a home truth. This symbol, in my youth, was commonly known as a “number sign” in the U.S. That was by far the most settled, widespread way of referring to it. I don’t remember ever seeing it used to denote “pound(s),” though apparently it was. The musical kids might have called it a “sharp sign,” although the pitch symbol is tilted upward and doesn’t look quite the same. It could be called a “hash mark,” although that isn’t how I remember seeing that term used. “Hash mark” in the military meant service stripe (a patch sewn onto the sleeve of the uniform), and it’s part of a football field, where it refers to yard markers between the yard lines that run the width of the field and mark multiples of five and ten. You might call the symbol a “cross-hatch,” or possibly a “grid” (another football echo: the football field was once known as the “gridiron”). And of course, a tic-tac-toe board, for that quickest of childhood games: four lines on a piece of paper and off you go. True, a tic-tac-toe board has all right angles, unlike the slanted lines necessary for the number sign or sharp sign.

The common name for this symbol has changed twice in the last thirty years, which is unusual, even striking. “@” has been revived by the onset of e-mail and then Twitter, but it is still generally referred to as the “at-symbol” or just “at,” as far as I know. (But who knows what our young, fast fellow citizens call it now?) “Star” has gained a lot of ground on “asterisk,” but it was common to call an asterisk a star before the dawn of the computer age, and “asterisk” has remained ordinary, partly due to its common use in discussions of baseball statistics. Typographical symbols, punctuation marks, oh, they may have more than one name, but four or five? No, “#” seems uniquely blessed in that department.

Some recent writers have erred on the side of credulity by citing “octotherp” (or “octothorpe”) as the proper technical term for this symbol. There are several versions of the story on-line. Bell Telephone introduced Touch-Tone dialing in 1963, but the pound key and star key did not appear until 1968. The engineers didn’t know what to call it — some say that it was called “pound sign” from the beginning, but evidence either way is sparse — and some of them began referring to it as “octotherp” (“therp” being a nonsense syllable) or “octothorpe” (in honor of Jim Thorpe). That may be what the guys down at the engineers’ lodge called it on wild Friday nights, but no one else ever uttered such a word until the internet — able to spread more misinformation faster than any previous medium — came along.

Our no longer new friend, the “pound sign,” seems to have entered our vocabulary around 1990 (I don’t remember when I first encountered it, but that sounds about right) in reference to the telephone keypad. You can find many elaborate explanations on-line of the “lb” glyph with a ligature evolving into the “pound” sign. Maybe so. As noted above, I don’t recall ever seeing “#” used that way until people started trying to figure out why the hell we were all calling it the pound sign all of a sudden. I would love to see some old photos, or movies, that showed an actual use of the symbol to stand for “lb.” Not that there are any more convincing explanations out there. A few brave souls try to derive it from the L-shaped symbol for British currency, but that seems less likely still. (For the most comprehensive exposition of the pound sign mystery, try the ever-reliable Language Log.) Thanks mainly to endless recorded instructions played over the telephone, we all learned the new name in short order, and it was even starting to worm its way into non-telephonic fields, when along came Twitter.

The new social media service was looking for a simple way for people to express common interests and form groups; in 2007 Chris Messina proposed using the pound sign as a prefix to allow easy searches for tags. The idea took off, and now “hashtag” is used even in spoken conversation. “Hash” is an older computerese term, and the “pound sign” has been called the “hash key” (presumably a corruption of “hatch”) for years in Britain. “Tag” was and remains a blogger’s term for a subject heading, a term appended to a post to make it easier to find with a search engine. So “hashtag” was ripe for the plucking, and “#” grew yet another name. While “pound sign” still rules telephony after 25 years, “hashtag” is moving beyond Twitter and teenage conversation. The fact of the matter is that outside of telephones and Twitter, we seldom have occasion to refer to “#” and therefore probably don’t really need a general term, much less two or five. Well, not five, now that “number sign” is extinct. A humble old name for a once-humble symbol, pushed aside by the usual suspect, aggressive technological change.
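Messina's convention is simple enough to sketch in a few lines: a "#" prefix turns an ordinary word into a searchable tag, so pulling the tags out of a post is a matter of pattern matching. This is a minimal sketch, not Twitter's actual implementation, which handles many more edge cases (Unicode, punctuation, and so on).

```python
import re

# "#" followed by word characters marks a tag, per the convention
# described above. Real platforms use a far more elaborate pattern.
HASHTAG = re.compile(r"#(\w+)")

def extract_hashtags(text):
    """Pull the tags out of a post, lowercased so searches can match."""
    return [tag.lower() for tag in HASHTAG.findall(text)]

print(extract_hashtags("Great show tonight! #NYC #LiveMusic"))
# → ['nyc', 'livemusic']
```

Lowercasing is what lets "#NYC" and "#nyc" land in the same search bucket, which is the whole point of the prefix.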

If Twitter remains part of everyone’s everyday life, it’s quite possible that “#” will remain “hashtag,” shed its other names, and settle into respectability. Maybe it’s another symbol’s turn to develop a promiscuous side. I nominate the caret (shift-6), to be renamed (at first) the “hat sign,” indicating one’s preference in headwear, as in “^fedora” or “^tarboosh.” #anotherbreedofhat



curate

(2000’s | academese | “select and display”)

Here is a verb that has begun covering a lot of new ground in the last decade. “Curate” once had a sharply limited set of objects. The word was nearly always used in the context of museums, galleries, or libraries, and it meant select, arrange, explain. The curator chooses a subset of the entire collection of the institution(s), usually based on an era, a specific person or group, or a theme. Coming up with the focus isn’t necessarily the curator’s job, but everything after that is. You pick your items, you array them within the exhibition space, you label them, you compose explanatory text, and you stay up nights making it all add up to a coherent, satisfying whole. The term didn’t always go with museums, but when it didn’t, it applied to a film festival or something that required the same kind of vision and similar coordination of diverse material.

Curators have other responsibilities, such as maintaining collections (i.e., choosing what to add and preserving it). But the verb “to curate” has always been used in a narrower way to denote preparing a set of items for display in a way that will inform, educate, and entertain visitors. Up until 2000, and probably for a few years after that (here LexisNexis and my memory agree), the verb was rarely used any other way. By 2010, though, “curate” had gobbled up many new things: content shared on social media, consumer goods, tourist attractions, even tidbits of wisdom derived from hard-won experience. One does the same sort of thing in these vast new fields that one did with paintings, manuscripts, or architectural designs. And one needs a word for it, so this shy wallflower, once the sole property of bespectacled museum employees, has spread its arms wide to conquer new worlds.

The growth of social media certainly seems to have played a role in the spread of “curate.” A site like Pinterest makes every person her own curator, plucking related bits and pieces from near and far off the web and grouping them for others to admire. But the spread of “curate” does not seem to depend entirely on Pinterest and its manifold kin. The proximate cause of this post was an offhand reference to a “carefully curated [music] playlist” in a recent New Yorker article by Michael Pollan. I stumbled over the phrase and asked myself Lex Maniac question number one: How would we have said that forty years ago? I didn’t have to cast around long for an answer: “selected” or possibly “organized.” After another minute’s thought, I settled on “selected” or “chosen” as a perfectly adequate substitute, even in 2015. Aside from alliteration, why had Pollan chosen “curated” in this context? One reason is that the tracks were chosen for a common purpose, to have a particular effect. But it’s also true that using the high-toned verb lent the whole enterprise more dignity. It’s not just a bunch of songs someone threw together; it has a coherent goal and requires a solidly respectable term.

Of course, “curate” means more than “select” (or at least it used to), and the term generally seems to retain the original sense of organizing and placing in context even now — whether it will in twenty years is anyone’s guess. The spread of “curate” into so many new areas is most likely caused by the constant striving for class and tone that our obsession with shopping and kitty pictures forces upon us. We would like to think that the daily tsunami of trivia and ephemera — whether in the form of tweets, video files, bizarre news stories, or spam and scam — has merit that may not be obvious to the unimpressed observer. Words like “curate” confer the extra class with very little effort. Unless he is reading carefully, the unimpressed observer may think he’s dealing with something more important than it really is.

Thanks to my sister, who proposed this word months ago. I’m not the quickest little brother in the world, but I did finally get a round tuit.


selfie (2010’s | internese | “self-portrait”)
twerk (2010’s | journalese (music) | “shake your booty”)

These words are too new to say much about, but they both effloresced violently recently, and they have occasioned no end of cultural commentary. Their chronological pattern is similar: sporadic appearances at best before 2012, followed by cautious acceptance, followed by the great outburst that was 2013. “Selfie” was named Word of the Year by Oxford Dictionaries last year, while “twerk” was flung into everyone’s consciousness by the lovely, infamous Miley Cyrus. “Twerk” probably has been around longer; it shows up a few years earlier in LexisNexis (hardly the most promising source of information on such an expression, I’ll admit). I’ve read it said that “twerk” goes back to the nineties, but I wasn’t frequenting the right clubs and can’t say one way or another. No one disputes that the word is characteristic of African-American youth culture; it may be a corruption of the exhortation “work it,” shouted to dancers. The Oxford blog records the first instance of “selfie” in 2002 in Australia and posits an Australian origin (I can add that an unusually high percentage of hits in LexisNexis come from Australian periodicals).

“Twerking” is a form of dancing, solo or with a partner, kind of a specialized, advanced form of what we used to call “shaking your booty.” You bend at the knees and grind or gyrate your tuchus. I’m not sure when it started showing up in rap lyrics (the Hip Hop Word Count doesn’t seem to be available), but that was about the only place it showed up for a long time. Around 2010, a rapper named Kstylis released some songs (and videos) with the sole purpose of encouraging female listeners to twerk. He seems to have played a role in the diffusion of the term, right around the same time it started turning up in disapproving editorials. (Coincidence? You be the judge!) It didn’t really become the property of mass mainstream culture until Miley Cyrus appropriated it last year, particularly after her performance with Robin Thicke at the VMA awards in August.

From its humble roots down under, “selfie” — a photographic self-portrait usually taken at arm’s length with a tablet, phone, or digital camera — also took a few years to get established. Hillary Clinton, of all people, gave the word a boost in 2012 in responding to a satirical web site entitled “Tweets from Hillary.” The brainchild of two Washington publicists, TFH lasted just long enough for Hillary herself to take note of it, crediting one of the authors with a “nice selfie.” That seems to have been about the first time anyone with any profile used the word in public. Now the word is almost as common as the thing; here in New York, it’s impossible to walk a block without passing someone smiling inanely into their smartphone. (The practice has cut down quite a bit on the old custom of asking passing strangers to take your picture so you can prove you were in New York.) “Selfie” seems more ripe for adaptation than “twerk.” One clever inventor has come up with a bicycle storage device called the “Shelfie.” A self-portrait taken by a tall, willowy young woman ought to be called a “sylphie.”

These words have come into their own largely due to the rise of social media. Twerking is a YouTube phenomenon, and selfies are inextricably linked with Facebook, either as the easiest way to generate a profile picture (although using any image except your own face seems to be the rule on Facebook) or simply as a place to show your friends what you’ve been up to. In this case, our new networks (net-twerks?) have acted more as megaphones, since both words predated widespread use of social media, but Facebook and Twitter and Pinterest are going to have an effect on our language. The main effect of social media on American English will probably amount to maiming it, but at least we’re getting some new words out of our new toys.



trend

(1990’s | businese (finance), advertese? | “drift (up or down),” “move or go (in a certain direction),” “peak”)

In 1980, “trend” was used as a verb in two contexts: geology and topography, and finance and economics. The same was still true in 1990. The verb as we think of it now doesn’t seem to have spread beyond those fields until after 2000, although the informed reader had to be familiar with the financial use by the mid-nineties. Truth is, the OED finds recognizable uses of the verb (defined as “to turn in some direction, to have a general tendency”) as far back as 1863; none of the three citations it lists has anything to do with numbers or statistics, which is overwhelmingly how we use it today. The topographical use, as in a mountain range “trending east” or a streambed “trending down a slope,” has its roots in an older use of the verb that sounds obsolete now, as in “trending along the coast” (i.e., sailing along a coastline without trying to make landfall). Pollsters discovered the verb, naturally enough, but not until ten or fifteen years ago, long after the bankers; advertisers had picked it up before then.

For a long time, the word was used primarily to talk about prices, interest rates, sales — and it still is most of the time, although it turns up now and then on the sports or entertainment pages. Is the average going up, or going down? What’s the pattern? What do the numbers show? We’ve been talking about upward trends and downward trends for a long time; in fact, it’s not unusual to see the phrase “trend up” or “trend down,” where “trend” is used as a noun with the adjective following rather than preceding. But now “trend up” and “trend down” may be readily construed as verbs, which was more of a wrench thirty years ago and all but unknown fifty years ago.

The onslaught of web-based news, blogs, and social media has given a new, if predictable, twist to the verb; now it means something like “hot” or “riding a wave,” almost always used as a present participle. When Yahoo or Facebook calls your attention to what’s “trending now,” they want you to think it’s the latest thing, hot off the wire. Even if it’s old or recycled (how many times have the Kardashians “trended” over the last five years?), it’s what people are talking about today (i.e., which hashtags are getting the fiercest workout). In the case of a search engine, when a word or phrase is trending it means that more people are searching it than yesterday or last week. Today’s fad is tomorrow’s has-been, as always, so even when we use “trend” in a way that suggests the only possible direction is up, the downward trend is always lurking there in the background, but we prefer that it remain unspoken. When the cyberaudience loses interest in one story, another invariably comes along to take its place. (Le trend est mort, vive le trend!)
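The search-engine sense described above — a term trends when more people are searching it today than yesterday — is concrete enough to sketch. This is a toy model with invented counts, not how any real platform computes its trending list (those weigh velocity, recency, and much else).

```python
# A term "trends" when today's search count beats yesterday's;
# sort the risers by how sharply they rose.
def trending_terms(today, yesterday):
    """Return terms searched more today than yesterday, biggest rise first."""
    rising = {term: count - yesterday.get(term, 0)
              for term, count in today.items()
              if count > yesterday.get(term, 0)}
    return sorted(rising, key=rising.get, reverse=True)

# Invented counts for illustration.
today = {"kardashians": 900, "eclipse": 400, "recipes": 120}
yesterday = {"kardashians": 850, "eclipse": 50, "recipes": 130}

print(trending_terms(today, yesterday))
# → ['eclipse', 'kardashians']
```

Note that the perennially searched term loses out to the sudden riser: "trending" measures the slope, not the height, which is why yesterday's fad drops off the list the moment its curve flattens.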

If you ask me, this more recent use goes straight back to the adjective “trendy,” already well established in my boyhood. “Trendy,” even when used contemptuously, meant popular, hip, with it — not following a path along with everyone else, but leading the crowd (although in order to be truly trendy you did have to be part of the pack). The use of “trend” to mean “peak” has now taken precedence over the intermediate meaning, as in “to trend up” or “to trend down.”



overshare

(2010’s | therapese? | “be overeager,” “say too much”)

I believe I heard this word for the first time last year. The first use I found in LexisNexis dated back to 1998, and it turned up occasionally after 2000. In 2008, Webster’s New World Dictionary named it Word of the Year: “the name given to ‘TMI (too much information),’ whether willingly offered or inadvertently revealed. It is the word for both the tedious minutiae on personal websites and blogs and the accidental slips of the tongue in public (or even private) situations. Both a verb and a noun . . . ” The term has always, in its short life, had an affinity with social media and on-line communication generally, but it isn’t restricted to embarrassing e-revelations; the word may also be used to describe someone’s behavior in old-fashioned conversation. In fast circles nowadays, you can use “overshare” as an interjection, exactly as one used “TMI” ten years ago.

One way “overshare” differs from “TMI” is that you use it to denote a leak of information that leads to fraud or identity theft (or even burglary — I’ve heard tell of people who noted on Facebook when and how long they would be away from home and returned to find someone had ripped them off). An “overshare” may have nothing to do with personal hygiene or medical history, in other words, but it has a damaging impact all the same. Here again, the connection with social media is clear. The connection may be reinforced by the knowledge that an English company, Exonar, makes a product called “Social Overshare” designed to protect companies against employees’ (presumably unintentional) leaks of sensitive data.

The expression is often applied to on-line behavior, and the web is full of explanations for the phenomenon. Aside from “some people don’t know any better,” oversharing may be diagnosed as an effort to get attention, an attempt to take a shortcut to close friendship or intimacy, or a symptom of the fact that telling others about ourselves activates pleasure centers in the brain. Whatever the cause, there is a redefinition of privacy going on here among the rising generation, a sense that practically everybody needs to know the details of your mother’s colonoscopy (or, worse, her maiden name), or how and when you pick your nose. Hard to say if this sort of thing will go out of style or become less and less noteworthy. There have always been people who said too much and cast a pall over the dinner table. Has Facebook made such behavior so commonplace that it will perforce become acceptable?

I haven’t heard anyone say, “Thanks for oversharing” yet, although someone must have. Several web sites collect examples of on-line oversharing — more small-minded sport for the wired masses — but I didn’t find Oversharers Anonymous anywhere on-line. What are we waiting for? It seems only a matter of time.

TMI (“too much information”)

(2000’s | computerese? | “more than I wanted to know,” “I wouldn’t have told that”)

I was hoping that this expression would turn out to have a simple origin (click here and scroll down to comments for some speculation). I don’t know why; we rarely get that satisfaction. Even phrases confidently attributed to this or that celebrity (I swear I saw “TMI” credited to Matthew Perry on “Friends,” which as far as I can tell is entirely baseless; another site cited Christopher Hitchens) turn out to have slithered onto the shore of everyday language from a dank, dark pond and curled around our tongues before we were aware. This one came along at the right time to be yet another internet abbreviation, but it doesn’t seem to have shown up in computerese any more often than anywhere else. The phrase probably predates the abbreviation anyway, according to my girlfriend’s (and my) recollection.

While this phrase could mean simply, “you’re telling me more than I can absorb,” and occasionally does, it almost invariably means “you’re telling me more than I want to know.” It covers embarrassment or distaste, a way to deflect a person who just doesn’t know when to shut up and avoid difficult scenes. Our need for such an expression has increased in my lifetime as we’ve placed more and more emphasis on making the culture sharing rather than shared, so it was necessary to find or invent an expression that fills that particular gap. Somewhere in the last ten or fifteen years, it became the rejoinder of choice to any intimate detail regarded as more icky than juicy.

But why doesn’t “TMI” serve as a reply to a glut of technical detail or just more data than we can use? “Too much information” is a situation we find ourselves in every day, whether we’re trying to figure out how an appliance works or following the news. The web makes it all too easy to turn up more facts or conjectures than we can possibly use or even process on almost any public issue. After 9/11, the grand poohbahs explained the failure of our multi-billion dollar system of spies and soldiers and high tech by saying, in effect, TMI. There was just too much data to sift, too many e-mails to go through. It was a plausible defense, but their proposed solution — more surveillance and more eavesdropping, trawling for more information with a still wider net — proved that they hadn’t grasped the fundamental problem. We couldn’t handle what we got, so we need to collect more — that’s a non sequitur. Unless you hire zillions more people to handle the increased load. That was the idea, but I get the feeling that the proliferation of secret, semi-secret, and semi-public agencies has outpaced the addition of competent people to the ranks of those sworn to protect us.


i’ll shut up now

(2000’s | “I’ve said enough,” “I’ve said too much,” “I’m boring you,” “you win”)

This expression has always been available without any sense of inevitability, at least as long as “shut up” has meant “stop talking,” or since the mid-nineteenth century, if the OED is anything to go by. It has taken on the quality of a fixed phrase only in the last twenty years. Most often it means “If I say anything else I’ll get in trouble,” or “I’m already in trouble,” or just “I’m getting tiresome.” But the range of possible meanings is quite broad — strikingly so. It can be as simple as “You need to concentrate, so I’ll stop distracting you.” Or “I’m tired of hearing my own voice” (not as common as it might be). Maybe “I’m not going to win this argument.” Sometimes it’s used to imply, “I don’t want to spoil it by going on too long.” Or even “you just put it better than I ever could.” Journalists like to use it to end a column, where it signals “don’t take what I just said too seriously.” And it has a corresponding range of moods: nervous, concerned, offended, resigned, jocular, self-deprecating, and so forth.

The fixed phrase came along in the 1990’s. The first instance I found on LexisNexis came from 1987, uttered by Madonna, who had ruffled feathers by referring to her “little smelly” hometown. When she explained herself afterwards, she closed her statement by saying, “I think I’ll shut up now.” A pretty clear instance of “I’ve already gotten in enough trouble.” I found scattered uses in the 1990’s. This is another expression that sounds like it ought to be a catch phrase from a television show or a comedian’s shtick, or some celebrity’s tag line. But there’s no evidence that Madonna’s use of it — in context, it’s not really possible to determine if she was composing a normal English sentence or deliberately echoing, or coining, a newly hatched phrase — helped push it into everyday language. It just trickled in over the course of a decade or so. (Compare “good luck with that,” which took a similarly indirect route into our consciousness.) Hell, now there’s even a country song called “I’ll Shut up Now.”

When this phrase is spoken, the emphasis falls not on “now” or “I’ll” but on “up.” Not only the wording but the rhythm is invariable. Google suggests that “I’ll shut up now” is particularly popular in blogs and social media, which may reflect Google’s bias or may reflect a real trend. There’s no question that the expression is becoming more common, and therefore more ordinary.
