
Lex maniac

Investigating changes in American English vocabulary over the last 40 years

Tag Archives: computers

real time

(1970’s | computerese | “clock time”)

Another departure from my chronological standards, “real time” was well established by 1980, though mainly in technical contexts. The expression has a few slightly different meanings that pretty much come down to simultaneity — one system changes more or less instantly as a parallel system does, generally as the result of new information. The other notion it conveys is time neither expanded (as in slow-motion replay) nor compressed (as in time-lapse photography). “Real time” demands strict observance of the clock, giving it still greater power to circumscribe our every thought and sensation.

As the expression has become more colloquial, it has leaned more on a preposition: “In real time” corresponds to “live” or “as it unfolds,” which seems like a perfectly natural development; sometimes it means no more than “up to the minute” or “at this moment.” The expression retains a strong technical bias, but it has been available to arts writers for at least thirty years. The concept is easily grasped and we all labor under the computer’s yoke, so it has become common property; most of us are capable of using the phrase in ordinary conversation, without quotation marks. It’s also available as an adjective. Despite a superficial resemblance, “real time” probably has nothing to do with the older “have a real time of it” — a rough time — which is passing from the scene.

Improving communication speed has been a primary technical goal for many centuries now. The days are over when Cecil Rhodes (according to Mark Twain in “Following the Equator”) could land in Sydney and make a fortune because he caught and gutted a shark that thousands of miles away had eaten a man who happened to be carrying a newspaper with significant financial news — news much more recent than the “latest,” which came by steamship from England and was a month or two old. Those days ended with the invention of the telephone, the first long-distance real-time communications device. (It took several decades before intercontinental telephone calls became feasible, of course.) A hundred years later, in the 1970’s and ’80’s, a lot of money and effort were still being spent to improve data transmission speed and the ability of various kinds of software to incorporate fresh observations or calculations quickly and accurately. Data velocity has not decreased in the years since, and the packages have grown enormous. Files that would have taken days to send thirty years ago, if they could be sent at all without overwhelming the network, now arrive in seconds. The combination of increasing speed and vast volume has made possible dizzying advances in a range of fields, not to mention terrifying information overload.

The changes real time hath wrought — in banking, medicine, journalism, and on and on — are too numerous and well-known to list. We may think of it mainly in economic terms, counting the ways faster movement of bytes and more and more seamless coordination between networks and devices have enabled us to make money. But there are other forces at work. One is simply the drive to innovate and improve, so fundamental to technological advance. The other is its complement, greed for novelty, not necessarily caused by cupidity, which creates the cheering section for the engineers and programmers who find ways to make it all work faster and better. The early adopters, in turn, make it financially possible to maintain the techies by enabling the middlemen to make a profit off their work, and we’re back to money.

If my count is correct, this is the 400th expression Lex Maniac has written about at greater or lesser length. My first association is with the Four Hundred of nineteenth century New York society, perhaps not the most fortunate. Second is auto races, also a little out of place. “Into the valley of death”? Doesn’t sound right, either. An inauspicious milestone.



crunch the numbers

(1980’s | computerese? enginese? | “wade through OR digest the figures”)

Some new expressions engulf the landscape, washing over us all and forcing themselves on every ear, if not every lip. When we talk about common expressions, those are usually the kind we mean. There is another kind, though, not so ubiquitous, but unavoidable because it is the preferred, or only, way to refer to a particular action, process, or concept. So it likewise forces itself on every ear, but without the same unrelenting insistence. “Crunch the numbers” is one of those. It has become inevitable, in a culture devoted to amassing vast reservoirs of data, that we have a word for getting something useful out of all those statistics — once you collect all those numbers, you have to do something with them. There’s really no other word for it, and the phrase has become invariably associated with statistical distillation. The commonplace is formed not only from sheer frequency; if you have no choice but to reach for the same expression every time, it makes its presence felt.

The point of “crunching” the numbers, I think, is that they are reduced in size and complexity, like a mouthful of bran flakes turning into easily swallowed mush. The computer — number-crunching is almost invariably associated with computers, occasionally with calculators — takes a huge, indigestible mass of data and breaks it down. The expression seems to have arisen in the engineering community in the sixties and moved beyond it by the early eighties. It gained ground quickly, and soon no longer required quotation marks or glosses (actually, it was never generally glossed). Some expressions, though slangy and therefore not reproduced in mainstream publications until well after they’ve become ordinary, at least in their field, take hold quickly once they do because they’re easy to grasp and enjoy.

“Crunch the numbers” was at one time the sole property of engineers and programmers; a few more professions may try it on now — accountants and statisticians primarily. The function of the computer, as envisioned in the post-war world, was to do many, many calculations per minute by brute force, placing vast amounts of computing power in one place and letting ‘er rip. I haven’t done the research to determine the no doubt lively expressions the tech boys used in the decade or two before “crunch the numbers” came along, or maybe it arose earlier than I think. It seems likely that there was no predictable expression before we started using this one, because we so rarely needed to talk about that volume and density of computation.

“Crunch the numbers” doesn’t share the taint of “massage the numbers,” or “game the system” or “fuzzy math.” A ground-level, first-resort expression must remain neutral, and the phrase is not generally used to question the motives or competence of those doing the crunching. “Run the numbers” is a little different, meaning “execute the formula and get the answer.” It likewise lacks any dubious connotation, despite a superficial resemblance to that staple of urban gambling, “running numbers” (or “playing the numbers”).



hard-wired

(1980’s | computerese? | “innate,” “(pre-)programmed,” “fixed,” “unalterable”)

The hard-wired smoke detector was already around in 1980; in that sense the term has not changed meaning since. “Hard-wired” meant connected directly to the building’s electrical system, meaning it was not powered by batteries, meaning that it would not infallibly begin making horrible chirping noises one morning at 3:00 and resist every sleep-fogged effort to silence it. A hard-wired telephone was similar in that it was harder to disconnect than the standard model you plug into a wall jack (already common in my youth, though far from universal). The cord connected to the system inside the wall rather than on the outside. Cable television might be hard-wired in that the cables connected to the source physically entered your house and attached themselves to a television set. Computer scientists had been using the term before that, generally to mean something like “automatic” or “built-in” — the only way to change it is to make a physical alteration to part of the equipment — and it remained firmly ensconced in the technical realm until the eighties. That’s when “hard-wired” became more visible, as computer jargon was becoming very hip. (PCMAG offers a current set of computer-related definitions.) In computer lingo, “hard-wired” came to mean “part of the hardware,” so “soft-wired” had to follow to describe a capability or process provided by software.

My father, erstwhile electrical engineer, pointed out that in his world, “hard-wired” was the opposite of “programmable.” In other words, the hard-wired feature did what it did no matter what; it couldn’t be changed simply by revising the code. Yet you don’t have to be too careless to equate “hard-wired” with “programmed” (see above) in the sense of predetermined. It’s not contradictory if you substitute “re-programmable” for “programmable,” but that requires an unusual level of precision, even for a techie. Every now and then you find odd little synonym-antonym confusions like that.

Still in wide technical use, this expression has reached its zenith in the soft sciences, in which it is commonly used to mean “part of one’s make-up,” with regard to instincts, reflexes, and basic capacities (bipedal walking, language, etc.), and more dubiously to describe less elemental manifestations such as behavior, attitude, or world-view. “Hard-wired” is not a technical term in hard sciences such as genetics or neurology. The usefulness of the expression is open to question: one team of psychologists noted, “The term ‘hard-wired’ has become enormously popular in press accounts and academic writings in reference to human psychological capacities that are presumed by some scholars to be partially innate, such as religion, cognitive biases, prejudice, or aggression . . . remarkably few psychological capacities in humans are genuinely hard-wired, that is, inflexible in their behavioral expression” (citation). Scientists may sniff at the term as used in pop psychology, but it does make for easy shorthand and probably won’t go away any time soon.

The reason we take so easily to applying the term “hard-wired” to the brain is that the computer, as developed over the last fifty years, forms the most comprehensive map yet for the workings of our minds. A contributing reason is the very common, casual linking of brain activity with electricity, as in referring to one’s “wiring” — even though one may also refer to one’s “chemistry” to explain mental quirks, probably a superior explanation. Watching a computer “think” helps us understand how our brains work, or maybe it just misleads us, causing us to disregard our own observations in order to define our own mentation with reference to the computer’s processing. There are obvious connections and obvious divergences; surely any device we concoct must reflect the workings of our own minds. But computers aren’t just for playing solitaire, calculating your tax refund, or running a supercollider. They serve a humanistic function by giving us new ways to think about the old ways we think.



slam

(1980’s | journalese (politics) | “lambaste,” “lash out at,” “rap”)


slammed

(2000’s | computerese? | “snowed under,” “overburdened”)

If you persist in associating slamming with doors, home runs, telephones, or fists on the table, you are behind the times. If you think of poetry or dancing, you’re in better shape, but the verb has taken on two intriguing and non-obvious definitions since 1980, both of which are technically transitive, but one of which is generally used in a way that disguises its transitivity, to the point that it may not be transitive at all any more — more like an adjective. To be fair, only doors and home runs got purely transitive treatment in the old days; telephones and fists required an adverb, usually “down,” although Erle Stanley Gardner, creator of Perry Mason, sometimes wrote of a telephone receiver being “slammed up,” which may reflect an older usage or may have been idiosyncratic. Sometimes a preposition is required, usually “into,” as in slamming a car into an abutment. (But you might also say the car slammed into an abutment.) That’s if you neglected to slam on the brakes.

“Slam” today means “attack” or “harshly criticize,” while “slammed” means overwhelmed by work or life in general, when it isn’t merely serving as a past participle. The former emerged first, before 1990 — I found very few examples before 1980 — primarily in political contexts, though it could also be used to talk about entertainment, as in slamming an actor, or his performance. It appears to have been more common in England and Australia; I doubt it originated there but our anglophone cousins may have taken it up faster than we did. “Slammed” came along later; I found only a few examples before 2000, mainly among computer jockeys.

How are the two meanings related? They both rest on deliberate infliction of metaphorical violence, obvious when one politician slams another, less so when one feels slammed “at” or “with” (not “by”) work. When I first encountered that usage, I understood it to mean the boss had assigned a whole bunch of work without recognizing that the employee already had too much to do. That doesn’t seem particularly true any more. “Slammed” no longer automatically imputes malice — if it ever did — and need not suggest anything other than adverse but impersonal circumstances. Gradually it has spread so that it need not refer strictly to having too much to do; in recent years it has developed into a synonym for “exhausted.” It has somewhat more potential for expansion than “slam,” which has not strayed from the basic idea of heated verbal assault.

Is there a direct link between the two? We might expect to discern a path from the older meaning to the newer, but how would it work? The boss can excoriate your performance, or he can dump too many tasks on you, but they would seem to be separate operations. If you’re no good to begin with, why would the boss ask you to louse up still more projects? It’s a compliment if the boss piles work on you, not an insult. The linguistic pathways that led to these two recent additions to the dictionary may remain mysterious, but there should be no confusion about why they have become so popular in the last thirty years. Our pleasure in believing the worst of each other has led inescapably to uglier discourse, offering numerous opportunities to use the older verb. On the job front, whatever productivity increases we’ve wrung out of the workforce since 1970 have come from longer hours and fewer people; so those who still have a job must work harder. Conditions favored harsher language, and there was versatile “slam(med)” to fill the gap.



avatar

(1990’s | computerese | “totem,” “alter ego”)

Occasionally a word will just flip around. For thousands of years, “avatar” meant corporeal embodiment of an abstraction, often a god (the word comes from Sanskrit and is mainly associated with the Hindu god Vishnu). What does it mean now? An incorporeal representation of a flesh-and-blood person. Now your avatar is a tiny image file that stands for you in the on-line universe — on Facebook, in a role-playing video game, or on a blogger’s profile. It could be a portrait but usually isn’t, and there are dozens of sites that will give you some raw material and a little coaching to help you make your own. Another word for it is “icon,” which is a little different but not that far off. “Avatar” was definitely available in its present form by 2000, though perhaps not all that widespread in the days before everybody needed one to keep up a virtual social life. Now it is all over everywhere.

Is it odd that both terms have a distinctly religious cast? By the time you’ve settled on which South Park character you want to represent you on your social media pages, the sense of majesty and mystery that religion must command has pretty well worn away. Both “avatar” and “icon” have had non-religious uses for at least a century now, or at least “avatar” has, but there’s still a distinct whiff of it. You might also argue that technology, the most up-to-date religion we have going, has simply appropriated sacred vocabulary and repurposed it.

The question leads to a favorite pastime of mine: constructing narratives of cultural decline. “Avatar” once had a mystical tone, associated either with the gods themselves or people who live out philosophically pure distillations of noble principles. Now it’s a few pixels thrown together that allow you to play video games. A decided downward drift, and all in my lifetime! A quick scan of search results does confirm that Google, for one, doesn’t think old-fashioned religious uses of the term count for much — though, of course, the results are skewed by the 2009 blockbuster film. I didn’t see it, but from what I gather the eponymous characters in the film had at least a touch of the guru or sage about them. (I remember the movie as being about blue characters who weren’t Smurfs, but that just shows you how primitive my cinematic consciousness is.)

On-line avatars remind me of choosing your token at the beginning of a Monopoly game (we usually called it simply a “piece,” but if I remember correctly, the rules used the word “token”). The dog, the shoe, the little car, etc. (I liked the wheelbarrow myself.) Most people had a preference, whether they considered it lucky or somehow apt. True, you couldn’t cobble your own avatar together in Monopoly; you had to take what Parker Brothers gave you. But those were the boring old days, my children. Image files are not my strong suit, but I came up with a few related user names, free to a good home. Want to be an actress? Ava Tardner. Fond of the Middle Ages? Avatar and Heloise. For fans of ancient video games, Avatari. For a historical touch, how about Amelia Earhart, avatrix? That’ll light up the chat boards.


on demand

(1980’s | businese (finance) | “on request,” “when you want it”)

When did “by request” become “on demand”? The expression in financial circles is quite old; a note or loan might be payable “on demand” (all at once when the lender calls for it) rather than on a fixed schedule over time. But somewhere in there, it took on a much wider range of use. The campaign for abortion rights certainly played a role; by 1970 it was not unusual to hear talk of abortion on demand, which became a rallying cry as laws banning abortion came under attack. That trend has been going the other way for the last two decades, too late to stop the expansion of “on demand,” which now applies to nearly everything that can be ordered over the internet, from groceries to streamed movies to academic courses. All you have to do is snap your fingers, or tap your phone. (Doesn’t sound right, does it? Even though it’s a literal description. But the old meaning of tapping a phone continues to get in the way.) You may have to wait longer than you did when you left the house to supply this need or that, but we are beguiled by the ease of letting a credit card and a delivery service do all the work, making the new “order” seem all the more attractive.

So a staid and venerable financial term has sprawled all over the place like a lava flow from an angry volcano, aided first by medical and cultural trends (not just abortion — drug treatment and medical care more generally glommed onto the phrase in the seventies and eighties) and then by the rise of the personal computer, which even before the internet infiltrated our lives occasioned much talk of providing computational or word-processing services on demand. The phrase has become a hyphenated adjective as well. “On-demand economy,” based on people spending money from their smartphones, is a phrase you will hear more and more.

There seems to be an implicit democratization at work, too. If you have enough money, just about anything is available on demand, and that’s been true for centuries, making allowances for the fact that the number of things we want, or think we need, has grown over time. Now you don’t need much money to acquire goods or entertainment on demand. If money can’t buy it, it’s not so easy. We may forget that not everything desirable can be had at the click of a mouse.

I’ve suspected for a long time that the internet has completed our transformation into a nation of three-year-olds, a trend initiated by the Sears Roebuck catalogue and the rise of advertising in the late nineteenth century. The consumer economy requires people to come up with new stuff to want and must continually devise quicker and more reliable ways to get it to them. eBay, for example, consummates a huge number of “buy it now” transactions every day. Is that much different from “Want it NOW” or “gimme NOW”? When it comes to tangible items, it’s not even instant gratification — that CD or toaster won’t fall into your lap the minute you click “confirm and pay” on Paypal. But we’ve learned to treat it as instant gratification; making the purchase is as good as holding the object of desire in our hands. Amazon wants to use drones to deliver packages faster than ever; next year it will be something else. We have created an economic monster that requires our appetites, and the means to sate them, to continue growing indefinitely. How long can we keep it up?


bells and whistles

(1980’s | advertese? | “additional features,” “doodads,” “frills”)

There’s plenty of on-line speculation about the origin of this expression. It is a puzzler because there are several possibilities, none of which has anything obvious to do with “bells and whistles” as used since 1970 or so: features of a product — a car, computer, camera, etc. — not needed to make it work but which add power, capability, or luxury, and cost. In today’s language, it doesn’t even have to be tangible; a web site, business plan, or legislation might have bells and whistles. What I saw in Google Books makes me think that the main conduit into everyday language was computerese, although some on-line authorities say car dealers used it first. Generally bells and whistles are thought to be a good thing, but there is a persistent undercurrent dogging the phrase. Sometimes bells and whistles are considered distracting, superfluous, or excuses to drive prices up without delivering better performance. An investment manager “aims to provide simple, yet solid guidelines that work for any investment plan in the long run, devoid of any quick fixes or bells and whistles.” A writer deplores over-elaborate restaurant desserts: “There were tuiles, there were chocolate towers, there were flowers made of spun sugar. Good luck finding the actual dessert amid all the bells and whistles.”

All right, you ask, why use “bells and whistles,” devices not normally associated with ease or comfort, to refer to such things? World Wide Words, WiseGeek, and Phrase Finder have all taken a swing at this, and the consensus seems to be that it has to do with fairground or movie-house organs, which incorporated many sound effects, including bells and whistles, the better to hold the crowd’s attention. Or it may derive from model railroading, as in “This train set is so true to life it has all the bells and whistles.” There are other possibilities as well — factory time signals, parties and celebrations, buoys, alarm systems — but they seem less plausible. The fact is, no one knows for sure how this popular expression crept, seeped, or slithered into our vocabulary between 1960 and 1990. That’s not a dig or swipe; that’s a respectful acknowledgment of the mysteries of language.

Actually, I found one reference as far back as 1977 to a writer who explained the origin of the phrase, in a short-lived magazine called “ROM.” Unfortunately, Google Books’ snippet view, which I have complained about before, didn’t show me the answer, only the set-up. The nearest library that has copies of this periodical is in Rochester. But if any of my faithful readers can track this one down, I will award you a free subscription without hesitation. I’ll bet the proposed derivation has something to do with circus organs or locomotives. But even if it doesn’t, it won’t be definitive.

What “bells and whistles” ought to denote is “means of getting your attention.” It’s not that they’re cool or make your machine better, it’s that they make you sit up and take notice, like a loud noise going off in your ear. It should be what marketers do, not what designers and engineers do (arguably, the engineers create the features and the marketers turn them into bells and whistles). That’s why I suggest that we thank advertese for pushing a new, improved meaning for this expression into the forefront.



startup

(1990’s | businese, computerese | “new business, firm, etc.”)

Hyphenated or not, this expression was well established as both a noun and an adjective by 1975, particularly in the business press, and it doesn’t appear to be much older than that. Google’s n-gram viewer finds almost no instances before 1950, and the curve doesn’t start to rise sharply until 1970, so it was fairly new then, but easy to use and soon absorbed. When it was a noun, it meant “commencement of operations,” or more colloquially, “opening” or “launch.” Normally it went with heavy industry, so it was common to talk of the startup of a plant or pipeline, for example. But businessmen love to scoff at grammar distinctions — there’s no denying startups invariably entail startup costs, a startup period, or, heaven forbid, startup problems — so they converted it effortlessly into an adjective. “Startup” may also clock in as a verb, but in that part of speech it is usually two words, even today.

By 1990, the concept of a “start[-]up company” had emerged, and occasionally the noun disappeared, leaving “startup” on its own. That wasn’t normal then, but today it is the rule. Back in the eighties, the shift from galumphing old factories to nimble new firms that didn’t make anything three-dimensional was driven by a hostile takeover of American life by the personal computer, a fait accompli by 1995. So many new companies concerned themselves with computer hardware and software that “startup” became common in computer publications by the late eighties. The word is older, but the way we use it today was probably driven by increasing computer sales, and computerese became the funnel for a businese expression — no surprise there. Michael Dell (of Dell Computers) was quoted recently on the “startup ecosystem” in India, and he even spoke of “meeting” (without “with”) several startups, not a use of “meet” I’ve encountered before. Since I haven’t actually offered a definition, here’s one I encountered on a German web site that does the job pretty well: “Startups sind Jungunternehmen mit besonderen Ideen – sehr oft im digitalen Bereich.” (Roughly, “Startups are new enterprises with unusual ideas, most often in the computer sector.”)

My sense is that “startup” had primarily a favorable connotation when it was getting established between 1985 and 1995. Such budding concerns were generally pegged as plucky or scrappy, determined pioneers taking on long odds with heads held high and a sound business plan. In that respect, it was more or less the opposite of “upstart,” which was always uncomplimentary. But as the term has lost novelty, it may have lost its sheen. Anyway, I don’t have the sense any more that it is complimentary. It seems more neutral than anything else.

The key related concept is the entrepreneur, always a figure celebrated in American mythology. Entrepreneurs breed startups, or shed them, or bring them forth from their heads, like Zeus giving birth to Athena. The crashes and recessions that have become frequent since the Nixon years may have dampened the spirits of some of these go-getters who start their own companies, but their flame burns bright as ever in our official worship of business. Entrepreneurs take the initiative, do their homework, embody healthy risk-taking, create jobs and prosperity, and otherwise exemplify the American way. Entrepreneurs are lauded especially on the right, because entrepreneurism is all about me rather than all about us. (That’s an oversimplification, but I’ll stick with it.)

According to my calculations, this is the 300th expression I have written about, at greater or lesser length. (I have become more loquacious over time, not less. Brevity is the soul of wit, indeed.) I chose “startup” as anniversary fodder partly because no operation was ever more shoestring or quixotic than this blog. I say thank you to my readers, to the ones who landed here once off of Google and never came back as well as the ones who read every post and comment faithfully. (You know who you are, and there ain’t very many of you.) I don’t do enough to encourage comments and feedback, but at least here I will say, if you ever feel an impulse to fire off a reply to one of my posts, or to send me an e-mail at usagemaven at verizon dot net, do it. Even if I don’t answer, I am grateful that you took the time, and I will profit from your wise words.



fraudster

(1990’s | businese (finance) | “con man,” “crook,” “trickster”)

As far as I can tell, we owe this one to the Brits, or maybe their ex-colonials. There are very few LexisNexis results from U.S. sources before 1995; nearly all come from Great Britain and the recent colonies. We’re certainly not above borrowing Briticisms and their cousins (“at the end of the day,” “over the moon,” “selfie“) in these parts, and the venerable “-ster” suffix (see below) rolls off the tongue easily in America, the land of gangsters and mobsters.

This word probably is not necessary in either hemisphere, but it does have the advantage of incorporating the very act its embodiments practice. The old equivalents (see above) do not. “Con[fidence] man” is itself a bit deceptive; it really refers to someone you should not have confidence in, and he has to convince you to do so. “Crook” is more general than “fraudster,” but the two words line up pretty well, so it’s rarely jarring to substitute one for the other. A fraudster sees an opening and uses it dishonestly for personal advantage. The word seems indifferent to distinctions like preying on individuals vs. cheating corporate bodies, or large-scale vs. small-scale crime. As the OED (first citation: 1975) notes, “fraudster” denotes in particular one who lies or cheats in the course of a business transaction. But it is not restricted to such use and has already spread out.

I cannot help but wonder if the rise of this expression since the seventies does not result from a simple (or rather geometric) increase in chicanery. It may be that an ever more complicated and less regulated financial system, coupled with increased criminal activity enabled by widespread use of computers, has made it ever easier to pull scams, causing a new expression to erupt, a boon to harassed writers if no one else. It’s such a relief to have a new, yet easily grasped, synonym to haul out once in a while.

The “-ster” suffix repays study. My feeling is that it has had a bit of an underbelly for centuries, but any negative connotation probably became more pronounced in the twentieth century, at least in the U.S. “Gangster” and “mobster” both date back to somewhere around 1900, according to Random House; I suspect that “gangster” relies on “teamster,” an eighteenth-century expression that did not, as far as I know, develop a dark side until the twentieth, when the Teamsters’ Union for a time became synonymous with corruption. (Tapster, tipster, and trickster, all dubious trades, are much older words. Speaking of deplorable occupations, where do “barrister” and “monster” fit into all this?) The suffix doesn’t always have a negative connotation, even today; when connected to a name, it may be affectionate. For example, Tom Bergeron used to call Whoopi Goldberg “Whoopster” on Hollywood Squares. Then again, after fifty years as a compliment, “hipster” finally became a dirty word somewhere around 2000. I’m not enough of a linguist to offer a proper history of the suffix, but “baxter” (female baker) and “brewster” (female brewer) are very old. According to Chambers Etymological Dictionary (thanks, Liz!), “-ster” comes from Anglo-Saxon, where it denoted specifically a female practitioner, but well before the Elizabethan era the gender distinction had disappeared. Chambers also notes that the suffix, originally attached to verbs (bake, brew), as befits an equivalent of the “-er” suffix, now hooks more readily to nouns. It has gone on yoking itself to new words for centuries now, and it usually seems to have something shady or untrustworthy about it. “Fraudster” thus takes its place in a long, rich tradition.


standalone

(1980’s | computerese, businese | “independent,” “unconnected,” “separate,” “isolated”)

The earliest instances of “standalone” (sometimes hyphenated, even in this day and age) in Google Books date from the sixties and seventies, nearly always in talking about non-networked computers. The first hits recorded in LexisNexis all date from 1979 in that trusty journal American Banker, again invariably in discussions of the use of computers in banking. The word was used often in the early days of ATM’s, which could, in the manner of computers, be divided into the ones clustered together for protection (e.g., in a bank lobby) and the ones out in the field, far from help. (The latter had to be connected to a mainframe somewhere or they wouldn’t have access to anyone’s account data, of course. And even a standalone computer had to be connected to a power source. No computer is an island; no computer stands alone.) ATM’s were brave and new in the eighties, and I suspect their spread pushed “standalone” into prominence. Other business types were certainly using the word by 1990, generally in reference to computers. It was widely understood by then but remained primarily a hardware term at least until 2000. One mildly interesting point about “standalone” is that it could apply to an entire system as well as to a single device. A standalone device can function even if it is not part of a larger system, but an entire system can also absorb the adjective if it doesn’t depend obviously on comparable systems.

“Standalone” retains a strong business bias, even today, but it is available to describe many things besides computers. A complicated piece of legislation might be broken up into standalone bills. Or a film for which no prequels or sequels are planned (or one in which a character that had been a supporting player in other films becomes the protagonist) might be so described. A government agency that doesn’t rely on another agency for its writ. A restaurant that isn’t part of a chain. “Standalone” is not generally used to mean “freestanding,” although it seems like it ought to be, literally speaking. I am a little surprised that I find almost no examples of the word used as a noun (one does see it used as a trade name), although that seems inevitable. All it takes is the careless insertion of one lousy one-letter article, and the deed is done. You’d think it would be harder to blur fundamental grammatical categories, but no.

The rise of this term inevitably accompanied a change in how we use computers. In the seventies and eighties, when we began to get serious about turning them into tools for business, the idea was that each employee’s work station had to be connected to the mainframe, where all the applications and data were stored. In the nineties, we shifted to the opposite model: each employee’s computer should have a big enough hard drive to store software and data; every work station became its own mainframe (or server, as we would say now). In the last few years, we’ve pushed the other way, and now minuscule laptops and tablets run software from the cloud, and store data there as well. The same shift has taken place outside the office, where home computers have followed a similar path. There are no doubt good reasons for the shift; the rules and conventions of the computer game have changed quite a bit. But like many such sizable shifts in our culture, it has taken place with little or no consideration of why we did it the other way. Are the once highly touted advantages of standalone computers no longer real or significant? We don’t know, because the issue was never debated out where most of us could hear. We did it the old way because there was money in it, and now the powers that be have found a new way to make money. You’re stuck with it whether it helps you or not, and you’re not even entitled to an explanation. That should be surprising, but in practice, it isn’t. Our policy debates routinely fail to explore how things got to be the way they are. It’s as if we all woke up one day and said, “Look, a problem! Let’s fix it!” With insufficient historical understanding, we attack large-scale problems with little or no attention to how they arose and fail to acknowledge the evils the existing approach has successfully prevented.
