
Lex maniac

Investigating changes in American English vocabulary over the last 40 years

Tag Archives: technology

crunch the numbers

(1980’s | computerese? enginese? | “wade through OR digest the figures”)

Some new expressions engulf the landscape, washing over us all and forcing themselves on every ear, if not every lip. When we talk about common expressions, those are usually the kind we mean. There is another kind, though, not so ubiquitous, but unavoidable because it is the preferred, or only, way to refer to a particular action, process, or concept. So it likewise forces itself on every ear, but without the same unrelenting insistence. “Crunch the numbers” is one of those. It has become inevitable, in a culture devoted to amassing vast reservoirs of data, that we have a word for getting something useful out of all those statistics — once you collect all those numbers, you have to do something with them. There’s really no other word for it, and the phrase has become invariably associated with statistical distillation. The commonplace is formed not only from sheer frequency; if you have no choice but to reach for the same expression every time, it makes its presence felt.

The point of “crunching” the numbers, I think, is that they are reduced in size and complexity, like a mouthful of bran flakes turning into easily swallowed mush. The computer — number-crunching is almost invariably associated with computers, occasionally with calculators — takes a huge, indigestible mass of data and breaks it down. The expression seems to have arisen in the engineering community in the sixties and moved beyond it by the early eighties. It gained ground quickly, and soon no longer required quotation marks or glosses (actually, it was never generally glossed). Some expressions, though slangy and therefore not reproduced in mainstream publications until well after they’ve become ordinary, at least in their field, take hold quickly once they do because they’re easy to grasp and enjoy.
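
At the risk of spoiling the mystique, here is a minimal sketch in Python of the basic move: take an indigestible mass of figures and boil it down to a handful of summary statistics. The data are invented for the occasion, and nothing here resembles what the engineers of the sixties actually ran.

    # A toy illustration of number crunching: reduce a big pile of figures
    # to a few digestible ones. The figures themselves are made up.
    import random
    import statistics

    random.seed(0)
    figures = [random.gauss(100, 15) for _ in range(10_000)]  # the indigestible mass

    summary = {
        "count": len(figures),
        "mean": statistics.mean(figures),
        "median": statistics.median(figures),
        "stdev": statistics.stdev(figures),
    }

    for name, value in summary.items():
        print(f"{name}: {value:,.2f}")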

“Crunch the numbers” was at one time the sole property of engineers and programmers; a few more professions may try it on now — accountants and statisticians primarily. The function of the computer, as envisioned in the post-war world, was to do many, many calculations per minute by brute force, placing vast amounts of computing power in one place and letting ’er rip. I haven’t done the research to determine what no doubt lively expressions the tech boys used in the decade or two before “crunch the numbers” came along, or maybe it arose earlier than I think. It seems likely that there was no predictable expression before we started using this one, because we so rarely needed to talk about that volume and density of computation.

“Crunch the numbers” doesn’t share the taint of “massage the numbers,” “game the system,” or “fuzzy math.” A ground-level, first-resort expression must remain neutral, and the phrase is not generally used to question the motives or competence of those doing the crunching. “Run the numbers” is a little different, meaning “execute the formula and get the answer.” It likewise lacks any dubious connotation, despite a superficial resemblance to that staple of urban gambling, “running numbers” (or “playing the numbers”).


hard-wired

(1980’s | computerese? | “innate,” “(pre-)programmed,” “fixed,” “unalterable”)

The hard-wired smoke detector was already around in 1980; in that sense the term has not changed meaning since. “Hard-wired” meant connected directly to the building’s electrical system, meaning it was not powered by batteries, meaning that it would not infallibly begin making horrible chirping noises one morning at 3:00 and resist every sleep-fogged effort to silence it. A hard-wired telephone was similar in that it was harder to disconnect than the standard model you plug into a wall jack (already common in my youth, though far from universal). The cord connected to the system inside the wall rather than on the outside. Cable television might be hard-wired in that the cables connected to the source physically entered your house and attached themselves to a television set. Computer scientists had been using the term before that, generally to mean something like “automatic” or “built-in” — the only way to change it is to make a physical alteration to part of the equipment — and it remained firmly ensconced in the technical realm until the eighties. That’s when “hard-wired” became more visible, as computer jargon was becoming very hip. (PCMAG offers a current set of computer-related definitions.) In computer lingo, “hard-wired” came to mean “part of the hardware,” so “soft-wired” had to follow to describe a capability or process provided by software.

My father, erstwhile electrical engineer, pointed out that in his world, “hard-wired” was the opposite of “programmable.” In other words, the hard-wired feature did what it did no matter what; it couldn’t be changed simply by revising the code. Yet you don’t have to be too careless to equate “hard-wired” with “programmed” (see above) in the sense of predetermined. It’s not contradictory if you substitute “re-programmable” for “programmable,” but that requires an unusual level of precision, even for a techie. Every now and then you find odd little synonym-antonym confusions like that.
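
For anyone who would rather see the distinction than read about it, here is a small, purely illustrative sketch in Python of the looser, software-side analogy: a hard-coded limit that behaves like something hard-wired, next to a programmable one. The names and numbers are invented for the example.

    # Illustrative only: the software-side analogy to hard-wired vs. programmable.
    # A hard-coded limit does what it does no matter what; a configurable one
    # can be revised without touching the code that uses it.

    HARD_WIRED_LIMIT = 100  # fixed at "manufacture" time; changing it means editing the code


    def check_hard_wired(value: int) -> bool:
        """Behaves the same way every time the program runs."""
        return value <= HARD_WIRED_LIMIT


    def check_programmable(value: int, limit: int = 100) -> bool:
        """The limit can be supplied (re-programmed) by the caller or a config file."""
        return value <= limit


    print(check_hard_wired(150))         # False, and always will be
    print(check_programmable(150, 200))  # True, because the limit was revised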

Still in wide technical use, this expression has reached its zenith in the soft sciences, in which it is commonly used to mean “part of one’s make-up,” with regard to instincts, reflexes, and basic capacities (bipedal walking, language, etc.), and more dubiously to describe less elemental manifestations such as behavior, attitude, or world-view. “Hard-wired” is not a technical term in hard sciences such as genetics or neurology. The usefulness of the expression is open to question: one team of psychologists noted, “The term ‘hard-wired’ has become enormously popular in press accounts and academic writings in reference to human psychological capacities that are presumed by some scholars to be partially innate, such as religion, cognitive biases, prejudice, or aggression . . . remarkably few psychological capacities in humans are genuinely hard-wired, that is, inflexible in their behavioral expression” (citation). Scientists may sniff at the term as used in pop psychology, but it does make for easy shorthand and probably won’t go away any time soon.

The reason we take so easily to applying the term “hard-wired” to the brain is that the computer, as developed over the last fifty years, forms the most comprehensive map yet for the workings of our minds. A contributing reason is the very common, casual linking of brain activity with electricity, as in referring to one’s “wiring” — even though one may also refer to one’s “chemistry” to explain mental quirks, probably a superior explanation. Watching a computer “think” helps us understand how our brains work, or maybe it just misleads us, causing us to disregard our own observations in order to define our own mentation with reference to the computer’s processing. There are obvious connections and obvious divergences; surely any device we concoct must reflect the workings of our own minds. But computers aren’t just for playing solitaire, calculating your tax refund, running a supercollider. They serve a humanistic function by giving us new ways to think about the old ways we think.


avatar

(1990’s | computerese | “totem,” “alter ego”)

Occasionally a word will just flip around. For thousands of years, “avatar” meant the corporeal embodiment of an abstraction, often a god (the word comes from Sanskrit and is mainly associated with the Hindu god Vishnu). What does it mean now? An incorporeal representation of a flesh-and-blood person. Now your avatar is a tiny image file that stands for you in the on-line universe — on Facebook, in a role-playing video game, or on a blogger’s profile. It could be a portrait but usually isn’t, and there are dozens of sites that will give you some raw material and a little coaching to help you make your own. Another word for it is “icon,” which is a little different but not that far off. “Avatar” was definitely available in its present form by 2000, though perhaps not all that widespread in the days before everybody needed one to keep up a virtual social life. Now it is all over everywhere.

Is it odd that both terms have a distinctly religious cast? By the time you’ve settled on which South Park character you want to represent you on your social media pages, the sense of majesty and mystery that religion must command has pretty well worn away. Both “avatar” and “icon” have had non-religious uses for at least a century now, or at least “avatar” has, but there’s still a distinct whiff of it. You might also argue that technology, the most up-to-date religion we have going, has simply appropriated sacred vocabulary and repurposed it.

The question leads to a favorite pastime of mine: constructing narratives of cultural decline. “Avatar” once had a mystical tone, associated either with the gods themselves or people who live out philosophically pure distillations of noble principles. Now it’s a few pixels thrown together that allows you to play video games. A decided downward drift, and all in my lifetime! A quick scan of search results does confirm that Google, for one, doesn’t think old-fashioned religious uses of the term count for much — though, of course, the results are skewed by the 2009 blockbuster film. I didn’t see it, but from what I gather the eponymous characters in the film had at least a touch of the guru or sage about them. (I remember the movie as being about blue characters who weren’t Smurfs, but that just shows you how primitive my cinematic consciousness is.)

On-line avatars remind me of choosing your token at the beginning of a Monopoly game (we usually called it simply a “piece,” but if I remember correctly, the rules used the word “token”). The dog, the shoe, the little car, etc. (I liked the wheelbarrow myself.) Most people had a preference, whether they considered it lucky or somehow apt. True, you couldn’t cobble your own avatar together in Monopoly; you had to take what Parker Brothers gave you. But those were the boring old days, my children. Image files are not my strong suit, but I came up with a few related user names, free to a good home. Want to be an actress? Ava Tardner. Fond of the Middle Ages? Avatar and Heloise. For fans of ancient video games, Avatari. For a historical touch, how about Amelia Earhart, avatrix? That’ll light up the chat boards.


vape

(2010’s | hipsterese? teenagese?)

Primarily a verb, I would say, but available as a noun (short for “vaporizer” or for the practice of “vaping”) and as a modifier in fanciful store names (there’s one on 14th Street called Beyond Vape). One who vapes is a vaper, which may remind antiquarians of “viper,” a very old word for marijuana smoker. “Vape” was not entirely new when we first encountered it between 2005 and 2010 (2009 is the first time it shows up in mainstream press sources, says LexisNexis); it had seen limited use before that as short for “vaporizer,” but that was before anyone thought of a vaporizer as a way to ingest nicotine or anything else. For that we had to wait until the early 2000’s, when a Chinese pharmacist invented the battery-powered nicotine delivery device, which heats liquid to form vapor rather than leaf to form smoke. It took a few years, but by 2010 electronic cigarettes had become noticeable. They looked suspiciously like cigarettes — and plenty of people were and remain suspicious — but they produced far less dangerous fumes, though probably not perfectly safe ones. A few short years later, vaping need have nothing to do with nicotine, and dispensers need not look like cigarettes, though the ever-popular vape pen retains the slim, cylindrical shape. It’s become an art and science and commerce all its own. Shops have sprung up everywhere, and vaporizers have supplanted hookahs as the hip smoking device. I see people vaping all the time now on the streets of New York. Professional worriers have stopped worrying about hookah lounges and started worrying about kids taking up vaping.

There are a number of associated terms, of course (and a legion of brands to match); if you want a chuckle, check out the alphabetical list of headwords on the right of Urban Dictionary’s “vape” page. I won’t try to go into all of them, but here’s one glossary (here’s another). The medium for the nicotine, flavoring, or whatever you put in your vaporizer is generally called “e-juice” or “e-liquid.” Another term for the device is “PV,” for “personal vaporizer.” Basic tools of the trade have been shortened to “atty” (atomizer), “cart” (cartridge), and “bat” (battery). A souped-up PV is called a “mod” (short for “modified”), which should not conjure up visions of the Mod Squad. A “clone” is a fake, basically, a knock-off or counterfeit. The sensation of a puff of vapor going down is called a “throat hit.” Regular old tobacco cigarettes are known as “analog cigarettes,” though there’s nothing digital about an e-cigarette; the association with e-mail and other computer-spawned e’s is fortuitous.

We are entitled to wonder why vaping became so popular so fast. Much is made of its role as an aid to giving up smoking, with accompanying debates over how safe it really is — debates that continue to rage, though most observers agree that e-cigarettes are less toxic than old-fashioned cigarettes. It seems likely that many vapers took it up for that reason. Vaping is cool rather in the way that smoking used to be: not rebellious exactly, but a bit transgressive, a little dangerous, developing a subculture recognized by the general population. But there’s also the technological factor. Vaping is in because it has produced new gadgets and lots of opportunities to mess around with them. The engineer types like having things to play with, and the techno-buffs revel in the latest improvements. There’s also the rage for anything new that occupies a surprising number of our fellow citizens, which I have cited before as a powerful force behind new concepts and expressions in our discourse.


warfighter

(1990’s | militarese | “combat soldier”)

My libertarian readers will need no reminding that this week’s expression became necessary only after dramatic changes in the functions of U.S. armed forces over the course of the twentieth century. But armies have always had numerous soldiers and hangers-on essential to the functioning of the machine who never see combat — who wants to serve in a battalion where all the cooks got shot? — and “warfighter” merely denotes a combat soldier as opposed to all the other kinds. Right-wingers like to grouse about Our Troops being used for the dreaded Nation-Building, and they are correct that we ask our armed forces to perform more, and more varied, duties and take on more roles in the world than we did before World War II. But that fact is merely a sidelight as far as this term is concerned.

Even now, I’m not sure the term counts as everyday language, since it still turns up predominantly in military or at least government publications, or journals published by and for military contractors. I ran across it last week in Newsday, which conjured up a few other foggy memories of seeing it in the last few years. The first instance I found in LexisNexis came from the illustrious pen of Sen. Mark Hatfield, but it was uncharacteristic (see below). Today’s meaning of the term started turning up regularly in the nineties, when it made occasional incursions into the mainstream press. Perhaps a few years earlier, military commanders began to talk about “warfighter exercises” designed to simulate combat situations more accurately than the old exercises had. (The use of the word as an adjective, or first half of a compound noun, still appears, but it has not become the norm.) It’s important to remember that “warfighters” is not the same as “boots on the ground”; a drone pilot thousands of miles away is every bit as much a warfighter as a wretched infantryman in Kabul (if we have any wretched infantrymen left in Kabul). It is settled wisdom in the military that the entire infrastructure and bureaucracy is there to serve the warfighter, to give U.S. soldiers the best possible chance in whatever sort of combat they are pursuing at the moment, most often in terms of technology and training. Yet so far the word has not come into use as a means of glorifying soldiers or making them objects of pity (as in “support the troops” or “brave men and women in uniform”).

Occasionally one sees this week’s expression used as the second part of a compound, as in “nuclear warfighter” or “guerrilla warfighter.” (The former appeared in Hatfield’s New York Times op-ed in 1986.) It turns up infrequently, but it’s not an unreasonable broadening of usage, actually. “Warrior” has a definite old-fashioned sound, more suited nowadays to movie franchises and computer games than actual warfare, though it might still be used of elite fighters. I think “warfarer” should be given consideration, but it looks too much like “wayfarer,” I suppose. By the way, there’s an Android game called Galaxy Warfighter; maybe this will be the rising generation’s cue to adopt the expression and push it irreversibly into our vocabulary.

“Warfighter” is an accidental addition to an accidental series formed loosely around the idea of strife, or making it go away. See “conflicted” and “-whisperer.” “Pushback” and “win-win” are other terms in this category. Peace out, y’all.


wow factor

(1980’s | journalese (film)? advertese? enginese? | “appeal,” “oomph,” “oohs and ahs,” “brilliance”)

Inasmuch as “wow” and “factor” both have relatively long and complicated histories, perhaps we should begin there before considering their union. “Wow” appears to go back to a Scots interjection, which could be laudatory or derogatory, and our modern understanding of the word emerged even before the beginning of the twentieth century; by 1925 it was going strong as an interjection and available as a noun or verb. The interjection is far more common than the other two today and probably always has been. “Factor” is an even older word that early in the twentieth century meant “gene,” basically (allowing for evolution in our understanding of genetics); now it is defined much more generally as “element or constituent, esp. one which contributes to or influences a process or result” (OED), especially if it’s important and its action is not well understood. “Factor” preceded by another term to denote a particular substance or catalyst is quite common in medicine, “Rh factor” being a longstanding example. “Risk factor” no doubt started life as a medical term but now flourishes in other fields. “Factor” became popular in Hollywood during the seventies, when it followed “Delta,” “Neptune,” “love,” and “human” (twice) in film titles (they all had to do with science fiction or espionage). And, to complete the picture — or the confusion — “wow factor” was used occasionally among stereophiles before 1980 to talk about irregularities in playback speed of tape decks and turntables, as in the phrase “wow and flutter.” So it seems the stage was well set.

By the mid-1980’s, the phrase started turning up in writing about entertainment (particularly films and television), computer software, merchandise more generally, and even service industries like banking. One early exponent was marketer Ken Hakuda, who used “wow factor” in 1987 to talk about his success in selling toys which he freely admitted were not useful or valuable except as a source of mindless fun. He used the term to refer to a highly visible feature of a product or its packaging that makes a strong, immediate impression, causing shoppers to whip out their wallets. That quality of impressiveness constitutes a common denominator among objects blessed with the wow factor. I’m not willing to take a firm position on the origin of this particular meaning. If I had to guess, I would say Hollywood, but advertese seems like an equally logical breeding ground, and I can’t say it didn’t start there. Because the phrase goes frequently with technological advance (especially when you’re talking about cinematic special effects), it is possible to argue that its source is enginese. While two of the earliest citations found in LexisNexis are due to Steven Spielberg and Harrison Ford, the very first (1984) was in the title of Miss Manners’s column, of all places. Did she supply the headline, or do we owe it to a forever anonymous editor? By the mid-1990’s, the expression was no longer extraordinary and had shed quotation marks, exclamation points, capital letters, and such tricks of the trade.

If you looked even cursorily at the pre-1980 equivalents listed at the top of this entry, you may have surmised, correctly, that I struggled to find convincing synonyms from the old days. That is because we used to say the same thing with adjectives — e.g., dazzling, eye-catching, awe-inspiring, cool — or verb phrases: knock your socks off, set something apart, jump off the shelves. Many new expressions have ensconced familiar ideas in new parts of speech, which usually is a net gain for the language. More ways to say the same thing reduces monotony and opens up room for small but significant variations in connotation. I’m inclined to consider the popularity of “wow factor” deserved. It’s short and to the point. And the ground meaning is quite clear, though it can imply two slightly different things, just as in the sixties, “wow” conveyed two different levels of excitement. One was the old gee-whillikers gob-smacked elation at seeing anything unexpected and pleasing. The other was quieter, more meditative, as in the pothead grokking the universe as he exhales. No squealing or jumping up and down, but the profound sense of something worthier than oneself that must be absorbed and appreciated with a drawn-out “wow.” “Wow factor” has always leaned more heavily in the direction of the former sense, but it can shade toward the latter sense as well, and seems to do so more often as time goes by. Not that the two meanings are all that far apart.

It has occurred to me to wonder if we should hear this expression with a whiff of the tawdry or meretricious. Given its early use and likely origins, it’s not hard at all for an old snob like myself to inflect it this way. But that would demand an ironic edge that I rarely or never hear when the phrase is used. A “wow factor” is a good thing that will impress the audience, sell the product, or make something stand out. The idea that there must be something cheap or slutty about it never seems to have taken root.


standalone

(1980’s | computerese, businese | “independent,” “unconnected,” “separate,” “isolated”)

The earliest instances of “standalone” (sometimes hyphenated, even in this day and age) in Google Books date from the sixties and seventies, nearly always in talking about non-networked computers. The first hits recorded in LexisNexis all date from 1979 in that trusty journal American Banker — but invariably in discussions of the use of computers in banking. The word was used often in the early days of ATM’s, which could, in the manner of computers, be divided into the ones clustered together for protection (e.g., in a bank lobby) and the ones out in the field, far from help. (The latter had to be connected to a mainframe somewhere or they wouldn’t have access to anyone’s account data, of course. And even a standalone computer had to be connected to a power source. No computer is an island; no computer stands alone.) ATM’s were brave and new in the eighties, and I suspect their spread pushed “standalone” into prominence. Other business types were certainly using the word by 1990, generally in reference to computers. It was widely understood by then but remained primarily a hardware term at least until 2000. One mildly interesting point about “standalone” is that it could apply to an entire system as well as to a single device. A standalone device can function even if it is not part of a larger system, but an entire system can also absorb the adjective if it doesn’t depend obviously on comparable systems.

“Standalone” retains a strong business bias, even today, but it is available to describe many things besides computers. A complicated piece of legislation might be broken up into standalone bills. Or a film for which no prequels or sequels are planned (or one in which a character that had been a supporting player in other films becomes the protagonist) might be so described. A government agency that doesn’t rely on another agency for its writ. A restaurant that isn’t part of a chain. “Standalone” is not generally used to mean “freestanding,” although it seems like it ought to be, literally speaking. I am a little surprised that I find almost no examples of the word used as a noun (one does see it used as a trade name), although that seems inevitable. All it takes is the careless insertion of one lousy one-letter article, and the deed is done. You’d think it would be harder to blur fundamental grammatical categories, but no.

The rise of this term inevitably accompanied a change in how we use computers. In the seventies and eighties, when we began to get serious about turning them into tools for business, the idea was that each employee’s work station had to be connected to the mainframe, where all the applications and data were stored. In the nineties, we shifted to the opposite model: each employee’s computer should have a big enough hard drive to store software and data; every work station became its own mainframe (or server, as we would say now). In the last few years, we’ve pushed the other way, and now minuscule laptops and tablets run software from the cloud, and store data there as well. The same shift has taken place outside the office; home computers have undergone a similar evolution. There are no doubt good reasons for the shift; the rules and conventions of the computer game have changed quite a bit. But like many such sizable shifts in our culture, it has taken place with little or no consideration of why we did it the other way. Are the once highly-touted advantages of standalone computers no longer real or significant? We don’t know, because the issue was never debated out where most of us could hear. We did it the old way because there was money in it, and now the powers that be have found a new way to make money. You’re stuck with it whether it helps you or not, and you’re not even entitled to an explanation. That should be surprising, but in practice, it isn’t. Our policy debates routinely fail to explore how things got to be the way they are. It’s as if we all woke up one day and said, “Look, a problem! Let’s fix it!” With insufficient historical understanding, we attack large-scale problems with little or no attention to how they arose and fail to acknowledge the evils the existing approach has successfully prevented.


user friendly

(1980’s | computerese | “easy to use or learn,” “accommodating,” “welcoming”)

Somewhere around 1980, lots of people started writing about the imperative of making the computer — whose rise was, by acclaim, inevitable and unstoppable — into a device that laypersons could operate. Much was said about the developers and programmers, lost in their own little world, unable to design a computer so the rest of us could use it without going stark staring mad. Strides have been made, but we are still talking today about software geniuses who just don’t understand how to make nifty new features readily available to non-experts. At least as much effort must be expended to adapt technological advances to the limitations of the average computerphobe as to create the advances themselves. (And all that effort must have paid off, because today, no one is ever stymied, balked, or thrown by a computer, right?) The phrase “user friendly” (often spelled as two words, though it ought to be hyphenated, at least according to this grumpy grammarian) bloomed when this kind of talk became prevalent thirty years ago.

A curious point about this expression: it did not turn up at all in LexisNexis before 1980. (The word was in use before then, of course, mainly among those writing about computer hardware and operating systems.) Both the Christian Science Monitor and the Washington Post described “user friendly” as a “buzzword” in 1980, however. Such an unheralded leap into prominence is unusual. It was not unusual to see “user friendly” glossed in the early eighties; sometimes it was placed in quotation marks, but that was not the rule. By the end of the decade, the expression applied not just to gadgets like computers, cars, and toys, but to airports, parks, police stations — almost any place that provided an amenity of some kind or that people had to find their way around. It was already turning up in reviews describing a novel or a band’s music (where it meant “accessible” or “easily grasped”). George Will, in 1989, used “user friendly” to describe a cat; he seems to have meant “affectionate.”

Beneath the rapid growth of the expression lies a change in the force of the word “user.” In my youth, it meant “exploiter” (someone who takes advantage of someone else) or “one who ingests illegal drugs” (“Users are losers” came along later, but that sense was already current). It was a malignant word, almost always bearing a negative connotation. But the computer revolution changed all that. “User” has become a neutral term, applied to anyone looking at your web site or opening up your software. We talk about “user interfaces” and “user statistics” very casually. No hint of the revulsion with which the word was imbued a scant forty years ago.

Since the beginnings of the Industrial Revolution, for all I know since the beginnings of agriculture, a gap has persisted between the creators of new technology and the ultimate operators to whom it trickles down. The assembly line is one way to adapt the grunt laborers to technological advance, but in more enlightened times, we prefer to adapt the advances to the laborers and make a wider range of functions and effects available with less effort. In the computer era, it seems like we have been talking more about this sort of thing than we used to; more experts have worked harder to bring all those time- and labor-saving features down to our level. In other words, the computer is more complex, more alien, than the internal combustion engine or the cathode-ray tube — still another indication that the personal computer marks a new plateau in the uneasy relationship between ourselves and our technology.


beta version

(1990’s | computerese | “dry run,” “preliminary version,” “demo”)

Ineligible for the blog on both historical and semantic grounds, but it seems to be creeping into contexts other than testing computer software. Even now, the association with computers remains strong if not quite inevitable: a stage of software development in which the product is not ready for sale but is ready to be tested on a large scale by real live users, not just the developer’s own engineers. (The latter stage is known as “alpha testing” when it is referred to at all.) The cynical way to look at this is that the developer gets lots of work done for nothing, although most will offer discounts to beta testers on the finished product. The phrase required some explaining at first, but now everyone knows what it means: On April Fool’s Day, Google invited users to test the beta version of Google Nose, which enables users to search by odor (Google, of course, is famous for keeping some of its products in the beta stage for years).
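
The convention shows up right in the version numbers, for anyone who cares to look. Here is a minimal sketch in Python, using the third-party packaging library (assuming it is installed), of how alphas, betas, and release candidates are flagged as pre-releases; the version strings themselves are invented.

    # Pre-release stages (alpha, beta, release candidate) are encoded in the
    # version string itself. The third-party "packaging" library understands
    # the convention; the version numbers below are made up.
    from packaging.version import Version

    for tag in ["2.0.0a1", "2.0.0b3", "2.0.0rc1", "2.0.0"]:
        version = Version(tag)
        label = "pre-release" if version.is_prerelease else "final release"
        print(f"{tag}: {label}")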

According to Wikipedia and at least one other source, “alpha” and “beta” in this sense were broken in by IBM computer jocks long ago, well before the personal computer was thought of. It was widely used in talking about both hardware and software certainly by the late eighties, if not before, but it didn’t turn up much in the mainstream until some time in the nineties. (I believe that’s when I became familiar with the concept.) Beta versions are for the expert or adventurous, or for employees of large corporations who act as guinea pigs. Ordinary shlubs like me avoid beta versions; let somebody else step on all the landmines. (Hey, it’s not like most software doesn’t have plenty of bugs and glitches even after rollout.)

One descendant, “beta reader,” has cropped up recently in the realm of fan fiction, where it means something like “editor.” Here again, it refers to a reader at a certain stage of the development of the text, not in its roughest form but not finished, either; the idea is that the beta reader will help the author improve the story and get it ready for publication, posting, or whatever comes next. In this sense it obviously derives from the old computer-industry use but may point the way to a new set of meanings. Watch this space.

early adopter

(1990’s | academese? advertese? | “pioneer,” “early bird”)

The interesting thing about “early adopter” is that its meaning has never varied in the slightest, and while its range of use has broadened a bit, neither its denotation nor connotation has changed to speak of. Someone who snaps up a new practice or product as it becomes available — someone interested in the latest technologies, brands, or services. From the beginning, the expression has had a strong connection with technological advance, and it still does, although nowadays it may freely be used in talking about customs, rules, or attitudes. That was not true in the 1980’s.

The earliest uses recorded in LexisNexis date from the early 1980’s, concentrated in the banking press. It was not long before “early adopter” was taken up in computer circles, and the term quickly became common in talking about (i.e., promoting) new personal computers, network technology, operating systems, etc. The term likely was coined by the sociologist Everett Rogers, who invented a field by publishing a book called “Diffusion of Innovations” in 1962, in which he classifies people according to how quickly they adopt new things; one of the classes was “early adopters,” who aren’t the very first to pick up the latest thing (those are the “innovators”) but who come right after and presage wide consumption or use. Most of us are content to follow our own English Pope:

Be not the first by whom the new are tried,
Nor yet the last to lay the old aside.

Marketers were probably the first to use the term regularly, and it was rarely seen outside the business or computer press until at least the mid-1990’s; it was rendered in quotation marks as late as 1999 in US News and World Report. But that wasn’t really typical. Mostly the phrase is used without any particular notice or explanation, and that has been true for a long time. (Rogers dubbed the last to lay the old aside “laggards” — those who take up innovations slowly (not until they are already obsolete) or not at all. I’m a laggard.)

The phrase has long had a strong correlation with favorable terms like “forward-thinking” or “progressive.” An early adopter typically is not seen as an uncritical, superficial customer who will walk out with anything that is sold as the dernier cri, but as a discerning shopper who is quick to see the advantages of the latest technology. Early adopters are usually thought to be knowledgeable and well-off — people you want to know and emulate. There’s no reason for this that I can see except that the people who use the phrase are also the people who have a strong interest in inducing early adopters to buy whatever it is they happen to be selling. So they need to flatter the adventurous ones willing to endure bugs and kinks, because success with that group portends general success. You don’t go describing your client base as gullible, hysterical, or lacking wisdom. With that goes a tendency to denigrate the laggards as stuck in the mud, out of the loop, and selfishly standing in the way of progress. So all those of us who didn’t spend money on Beta videocassettes, New Coke, the DeLorean, or WebTV, are losers. Time to repent. Go thou forth and bankrupt thyself on every crummy two-bit novelty that comes down the pike.


cutting-edge

(early 1990’s | “up to the minute,” “avant-garde,” “most advanced”)

Maybe there’s no more to the evolution of this term than the adoption of a new part of speech, a hyphenated adjective. It has commonly been used as a noun for ages, often quite literally; the “cutting edge” was what a knife or guillotine had; the idea was that it was sharp and designed to slice, not that it was at the head of the pack. When it was used figuratively, same thing — the tenor of the metaphor was the keen, whetted blade; the sense of leading the way was secondary at best. There is an obvious overlap, and it’s not hard to see how one sense might have given way to the other. As the eighties wore on, the noun phrase referred more and more often to the “leading edge,” and this meaning had taken precedence by 1990 or so. At that time, the noun probably occurred more often than the adjective; now I would venture to say the situation is reversed, although phrases like “on the cutting edge” are not unusual even today.

Unlike “state of the art,” “cutting-edge” can apply comfortably to almost any field of endeavor. Technology, research, science, yes, yes, and yes. But also in the arts and social sciences. “Cutting-edge sculpture” or “cutting-edge fashion” is a meaningful concept. In the arts, it’s whatever the avant-garde is doing this year. In the social sciences, it’s the newest theory about abnormal psychology or macroeconomics. The latest thing, in other words, in any field that relies on a sense of some sort of development.

There’s a cute paradox here: “state of the art” is restricted to technology, while “cutting edge,” which sounds industrial or mechanical, can be used to talk about art, social movements, or almost anything. “Art” could refer to lots of things with a technical aspect (is shop class still called “industrial arts”?), but “cutting edge” never got confined to the workshop its sound suggests; it has resisted being funneled into one single niche much more effectively than “state of the art.”

state of the art

(1980’s | enginese? | “very latest,” “top of the line,” “most advanced”)

Not a new expression by any means, but its use as an adjective phrase was just getting going in the late 1970’s and now I think has taken over. It’s hard to hear this phrase as a noun any more. The “state of the art” means simply “where we are now” in a particular field or area of knowledge — how far we have advanced. The funny thing about it is that it’s rarely used in reference to one of the arts. You don’t hear about state-of-the-art sculpture, music, or blown glass. While it’s true that the arts do not evolve in the same purposeful way that engineering does, they do undergo technological innovation and evolution just like any other field of human endeavor. (If you talk about a “state-of-the-art film studio,” for example, it doesn’t mean the most advanced films are made there, but that it boasts all the latest equipment.) There’s a mild irony in the use of the word “art” in an expression habitually applied to science, technology, or business.

The phrase has now taken on a gee-whiz quality that wasn’t necessarily part of it in the old days. Sometimes the state of the art was deficient; a scientist would lament that we couldn’t do what we needed to do because the technology just wasn’t there. Now it serves always as a compliment, part of an effort to puff up whatever innovation happens to be under discussion. It’s as good as it gets; it’s the most advanced technology available, even if it’s merely the best we can do until something better comes along. But the phrase does seem to have acquired a relentless optimism in the last thirty years, which suggests that we’re less critical than we used to be about the power of technology to solve our problems. And it’s not just for nuclear physics any more. A new umbrella promises “state of the art protection from sun, wind, and rain.”

The adjective phrase had started to turn up in the late 1970’s, and it became a favorite of the promoters of the personal computer revolution. It was well-established by the time Circuit City (now defunct) adopted it in the early 1990’s: “Welcome to Circuit City, where service is state of the art.” The slogan took advantage of the persistent association of the phrase with technological advances and proved memorable and long-lived. But in this case, the advertiser followed rather than led. Putting the phrase after the verb rather than before a noun may have cemented the adjectival usage in the public ear, but it wasn’t genuinely new.
