
Lex maniac

Investigating changes in American English vocabulary over the last 50 years

Tag Archives: technology

legacy

(1990’s | computerese? | “old,” “out-of-date,” “inherited,” “left over”)

I fear I am easing into a spell of griping over grammatical shifts. A few weeks ago it was “step up” idly taking a formerly unknown intransitive sense. “Legacy” has gone further, opening a whole new adjective department, one which is, as far as I can tell, based in computerese, worse yet. It has come to mean “hard to maintain because superannuated, but still useful or needed.” For example, an old computer that has to be kept around to run some indispensable software, or an old recording that has to be digitized. “Legacy” can be digital or analog. It is similar to other johnny-come-latelys such as “classic,” “retro,” and “vintage,” but more technical. It has spread; in politics “legacy issue” means “problem inherited from one’s predecessor,” turning the word into a convenient way to blame the previous administration. The noun “legacy” is still used literally and figuratively to refer to that which one leaves behind — something of value left in a will or, more often, an inspiration that lives on after one passes from the scene, or a series of achievements that needs to be preserved and augmented.

The academy offers another possible source for the concept of the legacy, in the sense of “descendant of an alum.” Phrases like “legacy preference” and “legacy admission” had appeared by 1990 but do not seem to have been common before then. The arrival of the adjective around the same time in computerese may be simply parallel evolution, or there may be some kind of connection. Both uses evoke the dead hand of the past, but in the academic context the state of being a legacy is desirable. That’s not how tech people use it.

“Legacy” sounds attractive, raising associations of class and financial advantage. But in computerese it is anything but a compliment, denoting a thing to be tolerated at best and a damnable nuisance at worst. The world is older than the personal computer, and it still has things in it that must be made legible to the machine brain. That’s legacy data, or legacy media, which may be thousands of years old, or as little as a decade. But the swath left by widespread computer use, after only about forty years, is already littered with many generations of hardware, software, operating systems, and standard file formats. Almost everyone who does a lot of work with computers has a legacy component somewhere, or has to help out someone else who does. The wages of computers is obsolescence. Concentrated and continuous technical advance must produce generations of disused hardware and outgrown software — even if they still work. But everything doesn’t die at the same time. Just as you can keep an antique car going far beyond its normal lifespan, so you can still run Windows 95, with all its limitations. The mere act of operating and maintaining computer systems over time breeds what you might call legacies (which has not become a collective plural, as far as I can tell, but probably should).

Many businesses prosper by helping corporations deal with legacy problems. There’s something threatening in the (not always implicit) message: if you don’t enlist our services, you will fall irrevocably behind and slide into failure. The problem is, being all state-of-the-art and having your legacy problems faithfully taken care of doesn’t guarantee you’ll be successful; it’s a necessary but not sufficient condition. It’s probably true that you need to spruce up your systems, but doing so doesn’t mean you’ll live happily ever after.


disruptive

(1990’s | businese? athletese? | “shaking things up,” “causing a stir”)

A word of long standing, but when did it take on a favorable connotation? Not everywhere, of course, but executives use it approvingly now, unthinkable in the days of Henry Ford or even Lee Iacocca. Successful corporations have traditionally avoided boat-rocking and sought the even keel, but now executives congratulate each other on their disruptive business practices. It is not solely a matter of hobbling the competition; a certain amount of disruption is tolerated within the organization if it keeps employees on their toes, for example, or pushes a complacent division into activity. The buttoned-down set seems to have loosened their vests.

The first occurrences in the press that I found date from the late nineties, a few due to far-sighted business gurus but more from coaches describing the defensive unit, particularly in football and basketball. (Often it applied to a single defensive player.) I couldn’t guess which source influenced the other, but there’s nothing new about businessmen borrowing vocabulary from athletes — in this case, giving it more of an offensive than a defensive cast. By 2010 the word was ordinary in business contexts. Nowadays artificial intelligence and business models or strategies attract the label “disruptive.”

It’s a very forward-looking buzzword, associated with innovation, technology, and improved corporate management. Senior executives sling it around confidently, extolling the virtues of novelty and adroit exploitation of one’s strengths, or just crowing about how they’re going to mess with their competitors. There’s the usual tension between the goal of making the world a better place (if only for p.r. purposes) and simply extracting greater profit from it.

“Disruptive” is close to a newer expression — “game-changing” — and an older one, “revolutionary.” But these are both stronger than “disruptive,” which encompasses lesser shocks to the system. You can be disruptive without altering the playing field permanently or overthrowing an old order. It reminds me of Joseph Schumpeter’s notion of “creative destruction,” a hallmark of capitalism, which requires not just that single enterprises should fall so that better ones might rise, but that the rules of doing business, or other received wisdom, must fall to the new and improved. (Schumpeter believed strongly in innovation and entrepreneurism, by the way.) In today’s world, disruptive tactics are mainly intended to weaken or drive out competitors, but getting rid of rivals was always part of the entrepreneur’s toolbox. The fine talk of less able businesses fertilizing their successors didn’t disguise the fact that Schumpeter was merely peddling social Darwinism dressed up as economic law — yet another instance of trahison des clercs.

We owe this week’s expression to Will from Paris, a first-rate student of the language and a damn fine host to boot. He says, based on recent dealings with the corporate set, that this word will soon take over the world, and Lex Maniac wants nothing more than to get in on the rez-de-chaussée. Merci!

January 28, 2020: An obituary of consultant and professor Clayton Christensen in today’s newspaper reveals that he introduced “disruptive” into businese starting in the mid-1990’s. His name did not come up in my sketchy research, but I’m perfectly willing to acknowledge his role in popularizing, if not inventing, the new expression.


crunch the numbers

(1980’s | computerese? enginese? | “wade through OR digest the figures”)

Some new expressions engulf the landscape, washing over us all and forcing themselves on every ear, if not every lip. When we talk about common expressions, those are usually the kind we mean. There is another kind, though, not so ubiquitous, but unavoidable because it is the preferred, or only, way to refer to a particular action, process, or concept. So it likewise forces itself on every ear, but without the same unrelenting insistence. “Crunch the numbers” is one of those. It has become inevitable, in a culture devoted to amassing vast reservoirs of data, that we have a word for getting something useful out of all those statistics — once you collect all those numbers, you have to do something with them. There’s really no other word for it, and the phrase has become invariably associated with statistical distillation. The commonplace is formed not only from sheer frequency; if you have no choice but to reach for the same expression every time, it makes its presence felt.

The point of “crunching” the numbers, I think, is that they are reduced in size and complexity, like a mouthful of bran flakes turning into easily swallowed mush. The computer — number-crunching is almost invariably associated with computers, occasionally with calculators — takes a huge, indigestible mass of data and breaks it down. The expression seems to have arisen in the engineering community in the sixties and moved beyond it by the early eighties. It gained ground quickly, and soon no longer required quotation marks or glosses (actually, it was never generally glossed). Some expressions, though slangy and therefore not reproduced in mainstream publications until well after they’ve become ordinary, at least in their field, take hold quickly once they do because they’re easy to grasp and enjoy.

“Crunch the numbers” was at one time the sole property of engineers and programmers; a few more professions may try it on now — accountants and statisticians primarily. The function of the computer, as envisioned in the post-war world, was to do many, many calculations per minute by brute force, placing vast amounts of computing power in one place and letting ’er rip. I haven’t done the research to determine the no doubt lively expressions the tech boys used in the decade or two before “crunch the numbers” came along, or maybe it arose earlier than I think. It seems likely that there was no predictable expression before we started using this one, because we so rarely needed to talk about that volume and density of computation.

“Crunch the numbers” doesn’t share the taint of “massage the numbers,” or “game the system” or “fuzzy math.” A ground-level, first-resort expression must remain neutral, and the phrase is not generally used to question the motives or competence of those doing the crunching. “Run the numbers” is a little different, meaning “execute the formula and get the answer.” It likewise lacks any dubious connotation, despite a superficial resemblance to that staple of urban gambling, “running numbers” (or “playing the numbers”).


hard-wired

(1980’s | computerese? | “innate,” “(pre-)programmed,” “fixed,” “unalterable”)

The hard-wired smoke detector was already around in 1980; in that sense the term has not changed meaning since. “Hard-wired” meant connected directly to the building’s electrical system, meaning it was not powered by batteries, meaning that it would not infallibly begin making horrible chirping noises one morning at 3:00 and resist every sleep-fogged effort to silence it. A hard-wired telephone was similar in that it was harder to disconnect than the standard model you plug into a wall jack (already common in my youth, though far from universal). The cord connected to the system inside the wall rather than on the outside. Cable television might be hard-wired in that the cables connected to the source physically entered your house and attached themselves to a television set. Computer scientists had been using the term before that, generally to mean something like “automatic” or “built-in” — the only way to change it is to make a physical alteration to part of the equipment — and it remained firmly ensconced in the technical realm until the eighties. That’s when “hard-wired” became more visible, as computer jargon was becoming very hip. (PCMAG offers a current set of computer-related definitions.) In computer lingo, “hard-wired” came to mean “part of the hardware,” so “soft-wired” had to follow to describe a capability or process provided by software.

My father, erstwhile electrical engineer, pointed out that in his world, “hard-wired” was the opposite of “programmable.” In other words, the hard-wired feature did what it did no matter what; it couldn’t be changed simply by revising the code. Yet you don’t have to be too careless to equate “hard-wired” with “programmed” (see above) in the sense of predetermined. It’s not contradictory if you substitute “re-programmable” for “programmable,” but that requires an unusual level of precision, even for a techie. Every now and then you find odd little synonym-antonym confusions like that.

Still in wide technical use, this expression has reached its zenith in the soft sciences, in which it is commonly used to mean “part of one’s make-up,” with regard to instincts, reflexes, and basic capacities (bipedal walking, language, etc.), and more dubiously to describe less elemental manifestations such as behavior, attitude, or world-view. “Hard-wired” is not a technical term in hard sciences such as genetics or neurology. The usefulness of the expression is open to question: one team of psychologists noted, “The term ‘hard-wired’ has become enormously popular in press accounts and academic writings in reference to human psychological capacities that are presumed by some scholars to be partially innate, such as religion, cognitive biases, prejudice, or aggression . . . remarkably few psychological capacities in humans are genuinely hard-wired, that is, inflexible in their behavioral expression” (citation). Scientists may sniff at the term as used in pop psychology, but it does make for easy shorthand and probably won’t go away any time soon.

The reason we take so easily to applying the term “hard-wired” to the brain is that the computer, as developed over the last fifty years, forms the most comprehensive map yet for the workings of our minds. A contributing reason is the very common, casual linking of brain activity with electricity, as in referring to one’s “wiring” — even though one may also refer to one’s “chemistry” to explain mental quirks, probably a superior explanation. Watching a computer “think” helps us understand how our brains work, or maybe it just misleads us, causing us to disregard our own observations in order to define our own mentation with reference to the computer’s processing. There are obvious connections and obvious divergences; surely any device we concoct must reflect the workings of our own minds. But computers aren’t just for playing solitaire, calculating your tax refund, running a supercollider. They serve a humanistic function by giving us new ways to think about the old ways we think.


avatar

(1990’s | computerese | “totem,” “alter ego”)

Occasionally a word will just flip around. For thousands of years, “avatar” meant corporeal embodiment of an abstraction, often a god (the word comes from Sanskrit and is mainly associated with the Hindu god Vishnu). What does it mean now? An incorporeal representation of a flesh-and-blood person. Now your avatar is a tiny image file that stands for you in the on-line universe — on Facebook, in a role-playing video game, or on a blogger’s profile. It could be a portrait but usually isn’t, and there are dozens of sites that will give you some raw material and a little coaching to help you make your own. Another word for it is “icon,” which is a little different but not that far off. “Avatar” was definitely available in its present form by 2000, though perhaps not all that widespread in the days before everybody needed one to keep up a virtual social life. Now it is all over everywhere.

Is it odd that both terms have a distinctly religious cast? By the time you’ve settled on which South Park character you want to represent you on your social media pages, the sense of majesty and mystery that religion must command has pretty well worn away. Both “avatar” and “icon” have had non-religious uses for at least a century now, or at least “avatar” has, but there’s still a distinct whiff of it. You might also argue that technology, the most up-to-date religion we have going, has simply appropriated sacred vocabulary and repurposed it.

The question leads to a favorite pastime of mine: constructing narratives of cultural decline. “Avatar” once had a mystical tone, associated either with the gods themselves or people who live out philosophically pure distillations of noble principles. Now it’s a few pixels thrown together that allows you to play video games. A decided downward drift, and all in my lifetime! A quick scan of search results does confirm that Google, for one, doesn’t think old-fashioned religious uses of the term count for much — though, of course, the results are skewed by the 2009 blockbuster film. I didn’t see it, but from what I gather the eponymous characters in the film had at least a touch of the guru or sage about them. (I remember the movie as being about blue characters who weren’t Smurfs, but that just shows you how primitive my cinematic consciousness is.)

On-line avatars remind me of choosing your token at the beginning of a Monopoly game (we usually called it simply a “piece,” but if I remember correctly, the rules used the word “token”). The dog, the shoe, the little car, etc. (I liked the wheelbarrow myself.) Most people had a preference, whether they considered it lucky or somehow apt. True, you couldn’t cobble your own avatar together in Monopoly; you had to take what Parker Brothers gave you. But those were the boring old days, my children. Image files are not my strong suit, but I came up with a few related user names, free to a good home. Want to be an actress? Ava Tardner. Fond of the Middle Ages? Avatar and Heloise. For fans of ancient video games, Avatari. For a historical touch, how about Amelia Earhart, avatrix? That’ll light up the chat boards.


vape

(2010’s | hipsterese? teenagese?)

Primarily a verb, I would say, but available as a noun (short for “vaporizer” or for the practice of “vaping”), or for modifying fanciful store names (there’s one on 14th Street called Beyond Vape). One who vapes is a vaper, which may remind antiquarians of “viper,” a very old word for marijuana smoker. “Vape” was not entirely new when we first encountered it between 2005 and 2010 — 2009 is the first time it shows up in mainstream press sources, says LexisNexis — it had seen limited use before that as short for “vaporizer,” but that was before anyone thought of a vaporizer as a way to ingest nicotine or anything else. For that we had to wait until the early 2000’s, when a Chinese pharmacist invented the battery-powered nicotine delivery device, which heats liquid to form vapor rather than leaf to form smoke. It took a few years, but by 2010 electronic cigarettes had become noticeable. They looked suspiciously like cigarettes — and plenty of people were and remain suspicious — but they produced far less dangerous fumes, though probably not perfectly safe. A few short years later, vaping need have nothing to do with nicotine, and dispensers need not look like cigarettes, though the ever-popular vape pen retains the slim, cylindrical shape. It’s become an art and science and commerce all its own. Shops have sprung up everywhere, and vaporizers have supplanted hookahs as the hip smoking device. I see people vaping all the time now on the streets of New York. Professional worriers have stopped worrying about hookah lounges and started worrying about kids taking up vaping.

There are a number of associated terms, of course (and a legion of brands to match); if you want a chuckle, check out the alphabetical list of headwords on the right of Urban Dictionary’s “vape” page. I won’t try to go into all of them, but here’s one glossary (here’s another). The medium for the nicotine, flavoring, or whatever you put in your vaporizer is generally called “e-juice” or “e-liquid.” Another term for the device is “PV,” for “personal vaporizer.” Basic tools of the trade have been shortened to “atty” (atomizer), “cart” (cartridge) and “bat” (battery). A souped-up PV is called a “mod” (short for “modified”), which should not conjure up visions of the Mod Squad. A “clone” is a fake, basically, a knock-off or counterfeit. The sensation of a puff of vapor going down is called a “throat hit.” Regular old tobacco cigarettes are known as “analog cigarettes,” though there’s nothing digital about an e-cigarette; the association with e-mail and other computer-spawned e’s is fortuitous.

We are entitled to wonder why vaping became so popular so fast. Much is made of its role as an aid to giving up smoking, with accompanying debates over how safe it really is — debates that continue to rage, though most observers agree that e-cigarettes are less toxic than old-fashioned cigarettes. It seems likely that many vapers took it up for that reason. Vaping is cool rather in the way that smoking used to be: not rebellious exactly, but a bit transgressive, a little dangerous, developing a subculture recognized by the general population. But there’s also the technological factor. Vaping is in because it has produced new gadgets and lots of opportunities to mess around with them. The engineer types like having things to play with, and the techno-buffs revel in the latest improvements. There’s also the rage for anything new that occupies a surprising number of our fellow citizens, which I have cited before as a powerful force behind new concepts and expressions in our discourse.


warfighter

(1990’s | militarese | “combat soldier”)

My libertarian readers will need no reminding that this week’s expression became necessary only after dramatic changes in the functions of U.S. armed forces over the course of the twentieth century. But armies have always had numerous soldiers and hangers-on essential to the functioning of the machine who never see combat — who wants to serve in a battalion where all the cooks got shot? — and “warfighter” merely denotes a combat soldier as opposed to all the other kinds. Right-wingers like to grouse about Our Troops used for the dreaded Nation-Building, and they are correct that we ask our armed forces to perform more, and more varied, duties and take on more roles in the world than we did before World War II. But that fact is but a sidelight as far as this term is concerned.

Even now, I’m not sure the term counts as everyday language, since it still turns up predominantly in military or at least government publications, or journals published by and for military contractors. I ran across it last week in Newsday, which conjured up a few other foggy memories of seeing it in the last few years. The first instance I found in LexisNexis came from the illustrious pen of Sen. Mark Hatfield, but it was uncharacteristic (see below). Today’s meaning of the term started turning up regularly in the nineties, when it made occasional incursions into the mainstream press. Perhaps a few years earlier, military commanders began to talk about “warfighter exercises” designed to simulate combat situations more accurately than the old exercises had. (The use of the word as an adjective, or first half of a compound noun, still appears, but it has not become the norm.) It’s important to remember that “warfighters” is not the same as “boots on the ground”; a drone pilot thousands of miles away is every bit as much a warfighter as a wretched infantryman in Kabul (if we have any wretched infantrymen left in Kabul). It is settled wisdom in the military that the entire infrastructure and bureaucracy is there to serve the warfighter, to give U.S. soldiers the best possible chance in whatever sort of combat they are pursuing at the moment, most often in terms of technology and training. Yet so far the word has not come into use as a means of glorifying soldiers or making them objects of pity (as in “support the troops” or “brave men and women in uniform”).

Occasionally one sees this week’s expression used as the second part of a compound, as in “nuclear warfighter” or “guerrilla warfighter.” (The former appeared in Hatfield’s New York Times op-ed in 1986.) It turns up infrequently, but it’s not an unreasonable broadening of usage, actually. “Warrior” has a definite old-fashioned sound, more suited nowadays to movie franchises and computer games than actual warfare, though it might still be used of elite fighters. I think “warfarer” should be given consideration, but it looks too much like “wayfarer,” or “Warfarin,” I suppose. By the way, there’s an Android game called Galaxy Warfighter; maybe this will be the rising generation’s cue to adopt the expression and push it irreversibly into our vocabulary.

“Warfighter” is an accidental addition to an accidental series formed loosely around the idea of strife, or making it go away. See “conflicted” and “-whisperer.” “Pushback” and “win-win” are other terms in this category. Peace out, y’all.


wow factor

(1980’s | journalese (film)? advertese? enginese? | “appeal,” “oomph,” “oohs and ahs,” “brilliance”)

Inasmuch as “wow” and “factor” both have relatively long and complicated histories, perhaps we should begin there before considering their union. “Wow” appears to go back to a Scots interjection, which could be laudatory or derogatory, and our modern understanding of the word emerged even before the beginning of the twentieth century; by 1925 it was going strong as an interjection and available as a noun or verb. The interjection is far more common than the other two today and probably always has been. “Factor” is an even older word that early in the twentieth century meant “gene,” basically (allowing for evolution in our understanding of genetics); now it is defined much more generally as “element or constituent, esp. one which contributes to or influences a process or result” (OED), especially if it’s important and its action is not well understood. “Factor” preceded by another term to denote a particular substance or catalyst is quite common in medicine, “Rh factor” being a longstanding example. “Risk factor” no doubt started life as a medical term but now flourishes in other fields. “Factor” became popular in Hollywood during the seventies, when it followed “Delta,” “Neptune,” “love,” and “human” (twice) in film titles (they all had to do with science fiction or espionage). And, to complete the picture — or the confusion — “wow factor” was used occasionally among stereophiles before 1980 to talk about irregularities in playback speed of tape decks and turntables, as in the phrase “wow and flutter.” So it seems the stage was well set.

By the mid-1980’s, the phrase started turning up in writing about entertainment (particularly films and television), computer software, merchandise more generally, and even service industries like banking. One early exponent was marketer Ken Hakuda, who used “wow factor” in 1987 to talk about his success in selling toys which he freely admitted were not useful or valuable except as a source of mindless fun. He used the term to refer to a highly visible feature of a product or its packaging that makes a strong, immediate impression, causing shoppers to whip out their wallets. That quality of impressiveness constitutes a common denominator among objects blessed with the wow factor. I’m not willing to take a firm position on the origin of this particular meaning. If I had to guess, I would say Hollywood, but advertese seems like an equally logical breeding ground, and I can’t say it didn’t start there. Because the phrase goes frequently with technological advance (especially when you’re talking about cinematic special effects), it is possible to argue that its source is enginese. While two of the earliest citations found in LexisNexis are due to Steven Spielberg and Harrison Ford, the very first (1984) was in the title of Miss Manners’s column, of all places. Did she supply the headline, or do we owe it to a forever anonymous editor? By the mid-1990’s, the expression was no longer extraordinary and had shed quotation marks, exclamation points, capital letters, and such tricks of the trade.

If you looked even cursorily at the pre-1980 equivalents listed at the top of this entry, you may have surmised, correctly, that I struggled to find convincing synonyms from the old days. That is because we used to say the same thing with adjectives — e.g., dazzling, eye-catching, awe-inspiring, cool — or verb phrases: knock your socks off, set something apart, jump off the shelves. Many new expressions have ensconced familiar ideas in new parts of speech, which usually is a net gain for the language. More ways to say the same thing reduces monotony and opens up room for small but significant variations in connotation. I’m inclined to consider the popularity of “wow factor” deserved. It’s short and to the point. And the ground meaning is quite clear, though it can imply two slightly different things, just as in the sixties, “wow” conveyed two different levels of excitement. One was the old gee-whillikers gob-smacked elation at seeing anything unexpected and pleasing. The other was quieter, more meditative, as in the pothead grokking the universe as he exhales. No squealing or jumping up and down, but the profound sense of something worthier than oneself that must be absorbed and appreciated with a drawn-out “wow.” “Wow factor” has always leaned more heavily in the direction of the former sense, but it can shade toward the latter sense as well, and seems to do so more often as time goes by. Not that the two meanings are all that far apart.

It has occurred to me to wonder if we should hear this expression with a whiff of the tawdry or meretricious. Given its early use and likely origins, it’s not hard at all for an old snob like myself to inflect it this way. But that would demand an ironic edge that I rarely or never hear when the phrase is used. A “wow factor” is a good thing that will impress the audience, sell the product, or make something stand out. The idea that there must be something cheap or slutty about it never seems to have taken root.


standalone

(1980’s | computerese, businese | “independent,” “unconnected,” “separate,” “isolated”)

The earliest instances of “standalone” (sometimes hyphenated, even in this day and age) in Google Books date from the sixties and seventies, nearly always in talking about non-networked computers. The first hits recorded in LexisNexis all date from 1979 in that trusty journal American Banker — but invariably in discussions of the use of computers in banking. The word was used often in the early days of ATM’s, which could, in the manner of computers, be divided into the ones clustered together for protection (e.g., in a bank lobby) and the ones out in the field, far from help. (The latter had to be connected to a mainframe somewhere or they wouldn’t have access to anyone’s account data, of course. And even a standalone computer had to be connected to a power source. No computer is an island; no computer stands alone.) ATM’s were brave and new in the eighties, and I suspect their spread pushed “standalone” into prominence. Other business types were certainly using the word by 1990, generally in reference to computers. It was widely understood by then but remained primarily a hardware term at least until 2000. One mildly interesting point about “standalone” is that it could apply to an entire system as well as to a single device. A standalone device can function even if it is not part of a larger system, but an entire system can also absorb the adjective if it doesn’t depend obviously on comparable systems.

“Standalone” retains a strong business bias, even today, but it is available to describe many things besides computers. A complicated piece of legislation might be broken up into standalone bills. Or a film for which no prequels or sequels are planned (or one in which a character that had been a supporting player in other films becomes the protagonist) might be so described. A government agency that doesn’t rely on another agency for its writ. A restaurant that isn’t part of a chain. “Standalone” is not generally used to mean “freestanding,” although it seems like it ought to be, literally speaking. I am a little surprised that I find almost no examples of the word used as a noun (one does see it used as a trade name), although that seems inevitable. All it takes is the careless insertion of one lousy one-letter article, and the deed is done. You’d think it would be harder to blur fundamental grammatical categories, but no.

The rise of this term inevitably accompanied a change in how we use computers. In the seventies and eighties, when we began to get serious about turning them into tools for business, the idea was that each employee’s work station had to be connected to the mainframe, where all the applications and data were stored. In the nineties, we shifted to the opposite model: each employee’s computer should have a big enough hard drive to store software and data; every work station became its own mainframe (or server, as we would say now). In the last few years, we’ve pushed the other way, and now minuscule laptops and tablets run software from the cloud, and store data there as well. The same shift has taken place outside the office; home computers have undergone a similar evolution. There are no doubt good reasons for the shift; the rules and conventions of the computer game have changed quite a bit. But like many such sizable shifts in our culture, it has taken place with little or no consideration of why we did it the other way. Are the once highly-touted advantages of standalone computers no longer real or significant? We don’t know, because the issue was never debated out where most of us could hear. We did it the old way because there was money in it, and now the powers that be have found a new way to make money. You’re stuck with it whether it helps you or not, and you’re not even entitled to an explanation. That should be surprising, but in practice, it isn’t. Our policy debates routinely fail to explore how things got to be the way they are. It’s as if we all woke up one day and said, “Look, a problem! Let’s fix it!” With insufficient historical understanding, we attack large-scale problems with little or no attention to how they arose and fail to acknowledge the evils the existing approach has successfully prevented.


user friendly

(1980’s | computerese | “easy to use or learn,” “accommodating,” “welcoming”)

Somewhere around 1980, lots of people started writing about the imperative of making the computer — whose rise was, by acclaim, inevitable and unstoppable — into a device that laypersons could operate. Much was said about the developers and programmers, lost in their own little world, unable to design a computer so the rest of us could use it without going stark staring mad. Strides have been made, but we are still talking today about software geniuses who just don’t understand how to make nifty new features readily available to non-experts. At least as much effort must be expended to adapt technological advances to the limitations of the average computerphobe as to create the advances themselves. (And all that effort must have paid off, because today, no one is ever stymied, balked, or thrown by a computer, right?) The phrase “user friendly” (often spelled as two words, though it ought to be hyphenated, at least according to this grumpy grammarian) bloomed when this kind of talk became prevalent thirty years ago.

A curious point about this expression: it did not turn up at all in LexisNexis before 1980. (The word was in use before then, of course, mainly among those writing about computer hardware and operating systems.) Both the Christian Science Monitor and the Washington Post described “user friendly” as a “buzzword” in 1980, however. Such an unheralded leap into prominence is unusual. It was not unusual to see “user friendly” glossed in the early eighties; sometimes it was placed in quotation marks, but that was not the rule. By the end of the decade, the expression applied not just to gadgets like computers, cars, and toys, but to airports, parks, police stations — almost any place that provided an amenity of some kind or that people had to find their way around. It was already turning up in reviews describing a novel or a band’s music (where it meant “accessible” or “easily grasped”). George Will, in 1989, used “user friendly” to describe a cat; he seems to have meant “affectionate.”

Beneath the rapid growth of the expression lies a change in the force of the word “user.” In my youth, it meant “exploiter” (someone who takes advantage of someone else) or “one who ingests illegal drugs” (“Users are losers” came along later, but that sense was already current). It was a malignant word, almost always bearing a negative connotation. But the computer revolution changed all that. “User” has become a neutral term, applied to anyone looking at your web site or opening up your software. We talk about “user interfaces” and “user statistics” very casually. No hint of the revulsion with which the word was imbued a scant forty years ago.

Since the beginnings of the Industrial Revolution, for all I know since the beginnings of agriculture, a gap has persisted between the creators of new technology and the ultimate operators to whom it trickles down. The assembly line is one way to adapt the grunt laborers to technological advance, but in more enlightened times, we prefer to adapt the advances to the laborers and make a wider range of functions and effects available with less effort. In the computer era, it seems like we have been talking more about this sort of thing than we used to; more experts have worked harder to bring all those time- and labor-saving features down to our level. In other words, the computer is more complex, more alien, than the internal combustion engine or the cathode-ray tube — still another indication that the personal computer marks a new plateau in the uneasy relationship between ourselves and our technology.
