
Lex maniac

Investigating changes in American English vocabulary over the last 50 years


legacy

(1990’s | computerese? | “old,” “out-of-date,” “inherited,” “left over”)

I fear I am easing into a spell of griping over grammatical shifts. A few weeks ago it was “step up” idly taking a formerly unknown intransitive sense. “Legacy” has gone further, opening a whole new adjective department, one which, as far as I can tell, is based in computerese, worse yet. It has come to mean “hard to maintain because superannuated, but still useful or needed.” For example, an old computer that has to be kept around to run some indispensable software, or an old recording that has to be digitized. “Legacy” can be digital or analog. It is similar to other johnny-come-latelys such as “classic,” “retro,” and “vintage,” but more technical. It has spread; in politics “legacy issue” means “problem inherited from one’s predecessor,” turning the word into a convenient way to blame the previous administration. The noun “legacy” is still used literally and figuratively to refer to that which one leaves behind — something of value left in a will or, more often, an inspiration that lives on after one passes from the scene, or a series of achievements that needs to be preserved and augmented.

The academy offers another possible source for the concept of the legacy, in the sense of “descendant of an alum.” Phrases like “legacy preference” and “legacy admission” had appeared by 1990 but do not seem to have been common before then. The arrival of the adjective around the same time in computerese may be simply parallel evolution, or there may be some kind of connection. Both uses evoke the dead hand of the past, but in the academic context the state of being a legacy is desirable. That’s not how tech people use it.

“Legacy” sounds attractive, raising associations of class and financial advantage. But in computerese it is anything but a compliment, denoting a thing to be tolerated at best and a damnable nuisance at worst. The world is older than the personal computer, and it still has things in it that must be made legible to the machine brain. That’s legacy data, or legacy media, which may be thousands of years old, or as little as a decade. But the swath left by widespread computer use, after only about forty years, is already littered with many generations of hardware, software, operating systems, and standard file formats. Almost everyone who does a lot of work with computers has a legacy component somewhere, or has to help out someone else who does. The wages of computers is obsolescence. Concentrated and continuous technical advance must produce generations of disused hardware and outgrown software — even if they still work. But everything doesn’t die at the same time. Just as you can keep an antique car going far beyond its normal lifespan, so you can still run Windows 95, with all its limitations. The mere act of operating and maintaining computer systems over time breeds what you might call legacies (which has not become a collective plural, as far as I can tell, but probably should).

Many businesses prosper by helping corporations deal with legacy problems. There’s something threatening in the (not always implicit) message: if you don’t enlist our services, you will fall irrevocably behind and slide into failure. The problem is, being all state-of-the-art and having your legacy problems faithfully taken care of doesn’t guarantee you’ll be successful; it’s a necessary but not sufficient condition. It’s probably true that you need to spruce up your systems, but doing so doesn’t mean you’ll live happily ever after.


morph

(1990’s | computerese? advertese? | “turn (into),” “transform,” “change”)

History first. This term starts to turn up in LexisNexis after 1990, although it seems likely to have formed part of professional jargon before then in film studios and computer labs. A Gumby-like shape-shifting clay animation figure named “Morph” was invented in 1977 by Tony Hart, a British animator. I came across a stray reference to a toy known as “Morph-o-Droid,” a kind of vehicle that turns into other things, about the same time (1985) as the Transformers socked the culture in the eye. These characters all were capable of changing form; if you knew your Greek, using “morph” to refer to form (or shape) wasn’t much of a stretch. I first learned the term as a combining form, not an independent word, as in “ectomorph.” Experts agree that our use today is no more than a shortening of “metamorphose.” But don’t look to caterpillars to provide an analogy; “morphing” as we know it is an exquisitely artificial process.

It’s tempting to see “morph” as a synonym for “transform,” which can also be used transitively or intransitively (oddly, however, “transform” seems to have a definite bias toward the transitive, while “morph” goes the other way). But it’s a little more specific. “Morph” almost always carries the idea of changing from one distinct image to another through a series of defined and visible stages. And it meant that by 1994, when it was chosen as a word of the year by the American Dialect Society. That was the year it plunged irrevocably into the mainstream — more on that in a moment.

The term was nearly always associated with computer graphics or digital animation in the beginning. Films like “Willow” and “Terminator 2” used visual sequences that amounted to primitive CGI, involving gradual transformations from one image to another, and technicians seem already to have been using the word, although I’m not sure when they started. (According to the Hollywood Reporter (1993), computer scientist Tom Brigham was responsible for “‘the original concept and pioneering work’ involved with the technology,” which he demonstrated first in 1982. But what did he call it?) I suspect the word arose solely because the need for it arose; once morphing became relatively simple after some technical advances, we had to have a word for it. I have no real evidence for the surmise, but I’ll take it until something better comes along. Such an origin myth suggests that the term comes from computerese.
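For readers who like to see the machinery: half of the classic morphing trick is a cross-dissolve, in which each in-between frame is a weighted average of the starting and ending images. The sketch below shows only that blending half; real morphing also warps matching features of the two images toward each other, which is what makes the intermediate stages look continuous rather than ghostly. The tiny number-grid “images” are my own invention for illustration.

```python
def cross_dissolve(img_a, img_b, steps):
    """Return `steps` frames blending img_a into img_b.

    Images here are plain lists of pixel intensities (0-255) of equal
    length -- a flattened grayscale picture. Frame 0 is img_a, the last
    frame is img_b, and each frame in between is a weighted average.
    """
    frames = []
    for i in range(steps):
        t = i / (steps - 1)  # 0.0 at the first frame, 1.0 at the last
        frame = [round((1 - t) * a + t * b) for a, b in zip(img_a, img_b)]
        frames.append(frame)
    return frames

start = [0, 0, 0, 0]        # a "black" 2x2 image, flattened
end = [200, 100, 50, 255]   # the target image

frames = cross_dissolve(start, end, 5)
print(frames[0])   # identical to the start image
print(frames[2])   # the halfway frame: each pixel averaged
print(frames[-1])  # identical to the end image
```

The “series of defined and visible stages” the word implies is exactly the `frames` list: each step is a distinct, inspectable picture on the way from one image to the other.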

But the ancestry of a new word isn’t necessarily what puts it on the map. “Morph” did get some early exposure in writing about movies and computer software. But the twin evils of advertising and public relations probably did more to spread the word. A rather lengthy quotation from Advertising Age (September 23, 1991) makes the point: “Advertising is in the grip of Morphing Mania. Morph, short for metamorphosis, is a captivating computer animation process that enables us to watch one shape, one image, speedily transform itself into another, and another, and another, in seemingly endless progressions.” The next year, the proposed mascot for the 1996 Olympics in Atlanta was known as the “Whatizit” (later “Izzy”), which could transform itself instantly into all sorts of things, and commentators (most of whom deplored the design, to put it mildly) frequently used “morph” or a form of it to describe the protean creature. Finally, in 1994, along came the Mighty Morphin Power Rangers (someone — William Bennett, perhaps — must have noticed the uncomfortable proximity to “morphine”), a popular television show and then a can’t-do-without set of action figures that Christmas shopping season, which powered the new word solidly into everyday use. The Power Rangers were kids who turned into superheroes — the emphasis wasn’t so much on a change in form (although one did take place) as on becoming a different sort of being. Now, “morph” often is used casually to mean simply “change.” A pair of recent headlines (techcrunch.org and financial writer Jim Cramer, respectively) make the point: “As Mobile Devices Morph into Wearables” and “How Perfect Stocks Morph Into Lemons.” It turns up as a noun every now and then, too.

About ten years ago, I was startled to encounter this word in a text by a writer I knew as an upholder of traditional grammar and diction, who scorned those who adopt every new word that comes down the pike. He used it — without gloss or quotation marks — to describe an image changing through stages into another. In that narrow sense, I don’t think there was a precise, brief equivalent before 1990 or so.


beta version

(1990’s | computerese | “dry run,” “preliminary version,” “demo”)

Ineligible for the blog on both historical and semantic grounds, but the phrase seems to be creeping into contexts other than testing computer software. Even now, the association with computers remains strong if not quite inevitable: a stage of software development in which the product is not ready for sale but is ready to be tested on a large scale by real live users, not just the developer’s own engineers. (The latter stage is known as “alpha testing” when it is referred to at all.) The cynical way to look at this is that the developer gets lots of work done for nothing, although most will offer discounts to beta testers on the finished product. The phrase required some explaining at first, but now everyone knows what it means: On April Fool’s Day, Google invited users to test the beta version of Google Nose, which enables users to search by odor (Google, of course, is famous for keeping some of its products in the beta stage for years).

According to Wikipedia and at least one other source, “alpha” and “beta” in this sense were broken in by IBM computer jocks long ago, well before the personal computer was thought of. Certainly by the late eighties, if not before, the phrase was widely used in talking about both hardware and software, but it didn’t turn up much in the mainstream until some time in the nineties. (I believe that’s when I became familiar with the concept.) Beta versions are for the expert or adventurous, or for employees of large corporations who act as guinea pigs. Ordinary shlubs like me avoid beta versions; let somebody else step on all the landmines. (Hey, it’s not like most software doesn’t have plenty of bugs and glitches even after rollout.)

One descendant, “beta reader,” has cropped up recently in the realm of fan fiction, where it means something like “editor.” Here again, it refers to a reader at a certain stage of the development of the text, not in its roughest form but not finished, either; the idea is that the beta reader will help the author improve the story and get it ready for publication, posting, or whatever comes next. In this sense it obviously derives from the old computer-industry use but may point the way to a new set of meanings. Watch this space.

early adopter

(1990’s | academese? advertese? | “pioneer,” “early bird” )

The interesting thing about “early adopter” is that its meaning has never varied in the slightest, and while its range of use has broadened a bit, neither its denotation nor connotation has changed to speak of. Someone who snaps up a new practice or product as it becomes available — someone interested in the latest technologies, brands, or services. From the beginning, the expression has had a strong connection with technological advance, and it still does, although nowadays it may freely be used in talking about customs, rules, or attitudes. That was not true in the 1980’s.

The earliest uses recorded in LexisNexis date from the early 1980’s, concentrated in the banking press. It was not long before “early adopter” was taken up in computer circles, and the term quickly became common in talking about (i.e., promoting) new personal computers, network technology, operating systems, etc. The term likely was coined by the sociologist Everett Rogers, who invented a field by publishing a book called “Diffusion of Innovations” in 1962, in which he classifies people according to how quickly they adopt new things; one of the classes was “early adopters,” who aren’t the very first to pick up the latest thing (those are the “innovators”) but who come right after and presage wide consumption or use. Most of us are content to follow our own English Pope:

Be not the first by whom the new is tried,
Nor yet the last to lay the old aside.

Marketers were probably the first to use the term regularly, and it was rarely seen outside the business or computer press until at least the mid-1990’s; it was rendered in quotation marks as late as 1999 in US News and World Report. But that wasn’t really typical. Mostly the phrase is used without any particular notice or explanation, and that has been true for a long time. (Rogers dubbed the last to lay the old aside “laggards” — those who take up innovations slowly (not until they are already obsolete) or not at all. I’m a laggard.)

The phrase has long had a strong correlation with favorable terms like “forward-thinking” or “progressive.” An early adopter typically is not seen as an uncritical, superficial customer who will walk out with anything that is sold as the dernier cri, but as a discerning shopper who is quick to see the advantages of the latest technology. Early adopters are usually thought to be knowledgeable and well-off — people you want to know and emulate. There’s no reason for this that I can see except that the people who use the phrase are also the people who have a strong interest in inducing early adopters to buy whatever it is they happen to be selling. So they need to flatter the adventurous ones willing to endure bugs and kinks, because success with that group portends general success. You don’t go describing your client base as gullible, hysterical, or lacking wisdom. With that goes a tendency to denigrate the laggards as stuck in the mud, out of the loop, and selfishly standing in the way of progress. So all those of us who didn’t spend money on Beta videocassettes, New Coke, the DeLorean, or WebTV are losers. Time to repent. Go thou forth and bankrupt thyself on every crummy two-bit novelty that comes down the pike.


your mileage may vary

(late 1990’s | advertese? computerese? | “no guarantees,” “you may not get the same results,” “you may have a different opinion”)

I am just old enough to remember gas lines and the energy crisis. Before then, nobody cared much about gas mileage, and therefore this phrase could not have been born. When you can fill your tank for five bucks, energy efficiency is not much of a concern. But by the late seventies, everyone wanted to know the MPG of the latest model, and we had to learn the difference between city and highway mileage. And with all the new statistics came the necessity of noting that there’s no guarantee that you’ll actually get 30 mpg highway in your daily driving life, or anything close to it. Thence sprang a new expression, which quickly became standard in car advertisements as well as in government reports. The more prosaic phrase, to which this one is closely related, is “(Actual) results may vary,” which may also be used fancifully.

The point of the expression was, we tested the car under certain assumptions and conditions, and these are the numbers we got. You probably won’t do so well, and it’s not our problem if you don’t. “Your mileage may vary” is one of the classic disclaimers. The phrase always invokes a single statistical, theoretically empirical standard, whether it was determined by the manufacturer or the watchdog.

It doesn’t seem to have eased into non-automotive use for another ten years or so. By the late eighties, it was starting to creep into other contexts, particularly computer journalism. It was common in racier circles by the mid-nineties. Texas Monthly (Feb. 1995) glossed it as “what works for me may not work for you.” By 2000, it could be used to talk about investment strategies, or the effectiveness of prescription drugs — it was no longer only about cars and computers. More recently, it has become looser and sometimes means little more than “we may disagree,” used in reference to matters of opinion, as in this recent example from the Off the Kuff sports blog: “I’d rather identify and be identified with something small and independent than something big and corporate. Obviously, your mileage may vary.” There is no gesture at a numerical standard here; it’s purely a matter of personal preference. The older use still predominates, I would say, but maybe the spread will continue.

This phrase was born of the auto industry, but its metaphorical angle came to us through computer journalism. It was used regularly in evaluating software as early as the late eighties, and it was often cited as an early example of an e-mail or newsgroup abbreviation (yes, even before texting). “YMMV” turned up in guides to internet slang circa 1995 along with “FAQ,” “LOL,” and “IMHO.” It doesn’t seem to have lasted as well as these others. But in its day, it had that brand spanking new ultra-modern sound, a phrase liberated from stodgy automotive roots to be adopted by the darlings of the future. Only later did it become more comfortable among the rest of us.


template

(1990’s | computerese | “blueprint,” “model,” “example”)

A word that had one or two narrow meanings for a long time but turns up everywhere now. It comes out of architecture, where it originally meant a horizontal weight-bearing beam. It was originally spelled templet, and a few people, like me, still pronounce it that way, but you hear it often with equal stress on both syllables, the second syllable pronounced with a long “a.”

The word acquired a specific meaning in the nineteenth century that persists into the twenty-first, although it is no longer primary. A template was a guide or form along which one ran a pencil, drill, or saw to make precision cuts or drawings. In other words, it was a fancy stencil. While it turned up often in industrial and home-improvement contexts, there were humbler domestic examples, like sewing patterns or cookie cutters. Then there were a couple of more specialized senses. “Template” could mean a three-dimensional scale model, such as a model of a room in which each piece of furniture was sized and placed precisely. In the oil industry, a template is a platform laid on the ocean floor which guides the drills and is connected to the drilling apparatus above. In chemistry, it was (and is) the “recipe” a cell follows to replicate itself. Occasionally it meant a standard against which something was measured, usually literally. As late as 1980, these definitions were what we invoked when we used the word.

Now, “template” has no end of uses. Here’s a recent implausible example, from the New York Times (March 17, 2012): “the experience [of Irish immigrants in the 1840’s] established a stereotype, a template, applied ever since to whichever national or ethnic group happened to be the latest impoverished arrivals.” The prejudice and abuse good Americans heaped on the Irish back then continues to guide us in dealing with all those who came after. (And they say Americans don’t know anything about history.) There remains a resemblance to the old sense of guide or form, but in a much less concrete way. Another common contemporary example describes Mitt Romney’s health care law in Massachusetts as a template for Obama’s federal plan passed in 2010. Here is a fine example of “template” used the way we used to say “blueprint.” The administration followed the outlines of the state plan and adopted many of the same proposals. It’s vaguely reminiscent of the idea of a stencil, but it also harks back to the sense of a scale model.

What’s changed? Now when you think of a template, you don’t think of an object any more; it’s abstract. Politicians and planners use it a lot — any kind of public policy that can be adopted or imitated where people are trying to solve a similar problem is called a template. Every temporarily successful stratagem is so crowned, and others are urged to adopt it without considering whether it will work under completely different conditions.

Back in the awful eighties, the growing allure of computerese gave “template” its entrée into everyday speech. It had two separate meanings. One was a printed guide to keyboard commands (no mice back then, remember, kids) that you laid out alongside the function keys so you didn’t have to remember what alt-F4 does in this software. It’s an odd use of the term: the idea of laying the guide over the work area is retained, but it’s more like a crib sheet than a form. Another meaning erupted at the same time, as glossed in American Banker (October 25, 1989): “A spreadsheet with formulas and text already entered for a specific use. Example: IRS Form 1040 set up for use with SuperCalc, a spreadsheet program.” It didn’t have to involve spreadsheets, of course; any software could come with built-in forms that turned complicated tasks into fill-in-the-blanks. This sense has paved the way for our ubiquitous usage, but what exactly is it about this meaning of “template” that has taken over? It’s the prefab quality. All you have to do is enter some data and the template does the work, whether it’s getting your taxes ready or displaying a web page or designing a health-care system. The real work has been done by the programmer or designer or think-tank wizard, and you don’t mess with that part. If the template is the Massachusetts health-care plan, the federal government has to put in different information, but the basic moves are already there — the ways the plan crunches the numbers (and gathers them in the first place), the policy prescriptions, the legal requirements, etc. — and you stick in your data and pull out a multi-volume federal law. It’s not quite as simple as what we used to call “crank-turning” in math class, but that’s what it boils down to.
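That fill-in-the-blanks quality survives almost unchanged in modern programming. Python’s standard-library string.Template is about as bare-bones as the idea gets: the fixed structure is built in advance, and the user supplies only the blanks. The refund letter below is my own invented example, not anything from the American Banker gloss.

```python
from string import Template

# The template: everything but the blanks ($name, $amount) is prefab.
# "$$" is Template's escape for a literal dollar sign.
letter = Template("Dear $name, your refund of $$$amount is on its way.")

# The user "enters some data and the template does the work":
print(letter.substitute(name="Pat", amount="1,040"))
# -> Dear Pat, your refund of $1,040 is on its way.
```

The division of labor is exactly the one described above: the real work (the wording, the structure) was done once by whoever wrote the template; filling it in is crank-turning.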
