Tag Archives: computers
hard-wired
(1980’s | computerese? | “innate,” “(pre-)programmed,” “fixed,” “unalterable”)
The hard-wired smoke detector was already around in 1980; in that sense the term has not changed meaning since. “Hard-wired” meant connected directly to the building’s electrical system, meaning it was not powered by batteries, meaning that it would not infallibly begin making horrible chirping noises one morning at 3:00 and resist every sleep-fogged effort to silence it. A hard-wired telephone was similar in that it was harder to disconnect than the standard model you plug into a wall jack (already common in my youth, though far from universal). The cord connected to the system inside the wall rather than on the outside. Cable television might be hard-wired in that the cables connected to the source physically entered your house and attached themselves to a television set. Computer scientists had been using the term before that, generally to mean something like “automatic” or “built-in” — the only way to change it is to make a physical alteration to part of the equipment — and it remained firmly ensconced in the technical realm until the eighties. That’s when “hard-wired” became more visible, as computer jargon was becoming very hip. (PCMAG offers a current set of computer-related definitions.) In computer lingo, “hard-wired” came to mean “part of the hardware,” so “soft-wired” had to follow to describe a capability or process provided by software.
My father, erstwhile electrical engineer, pointed out that in his world, “hard-wired” was the opposite of “programmable.” In other words, the hard-wired feature did what it did no matter what; it couldn’t be changed simply by revising the code. Yet you don’t have to be too careless to equate “hard-wired” with “programmed” (see above) in the sense of predetermined. It’s not contradictory if you substitute “re-programmable” for “programmable,” but that requires an unusual level of precision, even for a techie. Every now and then you find odd little synonym-antonym confusions like that.
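The engineer's distinction lends itself to a quick sketch in code. Here is a toy illustration of my own (the smoke detector and every name in it are invented for the purpose, not drawn from the post or from any real device): the hard-wired threshold is fixed when the device is built and can only be changed by altering the "hardware" itself, while the programmable one can be revised in software at any time.

```python
# Hard-wired vs. programmable, in miniature. The constant below plays the
# role of a value baked into the circuitry: changing it means editing the
# "device" itself, not merely reconfiguring it.
HARD_WIRED_THRESHOLD = 70

class SmokeDetector:
    def __init__(self, threshold=None):
        # With no threshold supplied, the detector falls back on the
        # hard-wired value; supplying one makes the behavior "soft-wired."
        self.threshold = threshold if threshold is not None else HARD_WIRED_THRESHOLD

    def alarm(self, smoke_level):
        # Sound the alarm when the reading meets or exceeds the threshold.
        return smoke_level >= self.threshold

fixed = SmokeDetector()                 # hard-wired: does what it does, no matter what
tunable = SmokeDetector(threshold=50)   # programmable: behavior set (and re-settable) in software
```

In my father's sense, `fixed` is hard-wired because no amount of code revision short of editing the constant will change it, whereas `tunable` is the re-programmable case the eighties usage blurred into "programmed."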
Still in wide technical use, this expression has reached its zenith in the soft sciences, in which it is commonly used to mean “part of one’s make-up,” with regard to instincts, reflexes, and basic capacities (bipedal walking, language, etc.), and more dubiously to describe less elemental manifestations such as behavior, attitude, or world-view. “Hard-wired” is not a technical term in hard sciences such as genetics or neurology. The usefulness of the expression is open to question: one team of psychologists noted, “The term ‘hard-wired’ has become enormously popular in press accounts and academic writings in reference to human psychological capacities that are presumed by some scholars to be partially innate, such as religion, cognitive biases, prejudice, or aggression . . . remarkably few psychological capacities in humans are genuinely hard-wired, that is, inflexible in their behavioral expression” (citation). Scientists may sniff at the term as used in pop psychology, but it does make for easy shorthand and probably won’t go away any time soon.
The reason we take so easily to applying the term “hard-wired” to the brain is that the computer, as developed over the last fifty years, forms the most comprehensive map yet for the workings of our minds. A contributing reason is the very common, casual linking of brain activity with electricity, as in referring to one’s “wiring” — even though one may also refer to one’s “chemistry” to explain mental quirks, probably a superior explanation. Watching a computer “think” helps us understand how our brains work, or maybe it just misleads us, causing us to disregard our own observations in order to define our own mentation with reference to the computer’s processing. There are obvious connections and obvious divergences; surely any device we concoct must reflect the workings of our own minds. But computers aren’t just for playing solitaire, calculating your tax refund, or running a supercollider. They serve a humanistic function by giving us new ways to think about the old ways we think.
slam, slammed
(1980’s | journalese (politics) | “lambaste,” “lash out at,” “rap”)
(2000’s | computerese? | “snowed under,” “overburdened”)
If you persist in associating slamming with doors, home runs, telephones, or fists on the table, you are behind the times. If you think of poetry or dancing, you’re in better shape, but the verb has taken on two intriguing and non-obvious definitions since 1980, both of which are technically transitive, but one of which is generally used in a way that disguises its transitivity, to the point that it may not be transitive at all any more — more like an adjective. To be fair, only doors and home runs got purely transitive treatment in the old days; telephones and fists required an adverb, usually “down,” although Erle Stanley Gardner, creator of Perry Mason, sometimes wrote of a telephone receiver being “slammed up,” which may reflect an older usage or may have been idiosyncratic. Sometimes a preposition is required, usually “into,” as in slamming a car into an abutment. (But you might also say the car slammed into an abutment.) That’s if you neglected to slam on the brakes.
“Slam” today means “attack” or “harshly criticize,” while “slammed” means overwhelmed by work or life in general, when it isn’t merely serving as a past participle. The former emerged first, before 1990 — I found very few examples before 1980 — primarily in political contexts, though it could also be used to talk about entertainment, as in slamming an actor, or his performance. It appears to have been more common in England and Australia; I doubt it originated there but our anglophone cousins may have taken it up faster than we did. “Slammed” came along later; I found only a few examples before 2000, mainly among computer jockeys.
How are the two meanings related? They both rest on deliberate infliction of metaphorical violence, obvious when one politician slams another, less so when one feels slammed “at” or “with” (not “by”) work. When I first encountered that usage, I understood it to mean the boss had assigned a whole bunch of work without recognizing that the employee already had too much to do. That doesn’t seem particularly true any more. “Slammed” no longer automatically imputes malice — if it ever did — and need not suggest anything other than adverse but impersonal circumstances. Gradually it has spread so that it need not refer strictly to having too much to do; in recent years it has developed into a synonym for “exhausted.” It has somewhat more potential for expansion than “slam,” which has not strayed from the basic idea of heated verbal assault.
Is there a direct link between the two? We might expect to discern a path from the older meaning to the newer, but how would it work? The boss can excoriate your performance, or he can dump too many tasks on you, but they would seem to be separate operations. If you’re no good to begin with, why would the boss ask you to louse up still more projects? It’s a compliment if the boss piles work on you, not an insult. The linguistic pathways that led to these two recent additions to the dictionary may remain mysterious, but there should be no confusion about why they have become so popular in the last thirty years. Our pleasure in believing the worst of each other has led inescapably to uglier discourse, offering numerous opportunities to use the older verb. On the job front, whatever productivity increases we’ve wrung out of the workforce since 1970 have come from longer hours and fewer people; so those who still have a job must work harder. Conditions favored harsher language, and there was versatile “slam(med)” to fill the gap.
on demand
(1980’s | businese (finance) | “on request,” “when you want it”)
When did “by request” become “on demand”? The expression in financial circles is quite old; a note or loan might be payable “on demand” (all at once when the lender calls for it) rather than on a fixed schedule over time. But somewhere in there, it took on a much wider range of use. The campaign for abortion rights certainly played a role; by 1970 it was not unusual to hear talk of abortion on demand, which became a rallying cry as laws banning abortion came under attack. That trend has been going the other way for the last two decades, too late to stop the expansion of “on demand,” which now applies to nearly everything that can be ordered over the internet, from groceries to streamed movies to academic courses. All you have to do is snap your fingers, or tap your phone. (Doesn’t sound right, does it? even though it’s a literal description. But the old meaning of tapping a phone continues to get in the way.) You may have to wait longer than you did when you left the house to supply this need or that, but we are beguiled by the ease of letting a credit card and a delivery service do all the work, making the new “order” seem all the more attractive.
So a staid and venerable financial term has sprawled all over the place like lava flow from an angry volcano, aided first by medical and cultural trends (not just abortion — drug treatment and medical care more generally glommed onto the phrase in the seventies and eighties) and then by the rise of the personal computer, which even before the internet infiltrated our lives occasioned much talk of providing computational or word-processing services on demand. The phrase has become a hyphenated adjective as well. “On-demand economy,” based on people spending money from their smartphones, is a phrase you will hear more and more.
There seems to be an implicit democratization at work, too. If you have enough money, just about anything is available on demand, and that’s been true for centuries, making allowances for the fact that the number of things we want, or think we need, has grown over time. Now you don’t need much money to acquire goods or entertainment on demand. If money can’t buy it, it’s not so easy. We may forget that not everything desirable can be had at the click of a mouse.
I’ve suspected for a long time that the internet has completed our transformation into a nation of three-year-olds, a trend initiated by the Sears Roebuck catalogue and the rise of advertising in the late nineteenth century. The consumer economy requires people to come up with new stuff to want and must continually come up with quicker and more reliable ways to get it to them. eBay, for example, consummates a huge number of “buy it now” transactions every day. Is that much different from “Want it NOW” or “gimme NOW”? When it comes to tangible items, it’s not even instant gratification — that CD or toaster won’t fall into your lap the minute you click “confirm and pay” on PayPal. But we’ve learned to treat it as instant gratification; making the purchase is as good as holding the object of desire in our hands. Amazon wants to use drones to deliver packages faster than ever; next year it will be something else. We have created an economic monster that requires our appetites, and the means to sate them, to continue growing indefinitely. How long can we keep it up?
bells and whistles
(1980’s | advertese? | “additional features,” “doodads,” “frills”)
There’s plenty of on-line speculation about the origin of this expression. It is a puzzler because there are several possibilities, none of which has anything obvious to do with “bells and whistles” as used since 1970 or so: features of a product — a car, computer, camera, etc. — not needed to make it work but which add power, capability, or luxury, and cost. In today’s language, it doesn’t even have to be tangible; a web site, business plan, or legislation might have bells and whistles. What I saw in Google Books makes me think that the main conduit into everyday language was computerese, although some on-line authorities say car dealers used it first. Generally bells and whistles are thought to be a good thing, but there is a persistent undercurrent dogging the phrase. Sometimes bells and whistles are considered distracting, superfluous, or excuses to drive prices up without delivering better performance. An investment manager “aims to provide simple, yet solid guidelines that work for any investment plan in the long run, devoid of any quick fixes or bells and whistles.” A writer deplores over-elaborate restaurant desserts: “There were tuiles, there were chocolate towers, there were flowers made of spun sugar. Good luck finding the actual dessert amid all the bells and whistles.”
All right, you ask, why use “bells and whistles,” devices not normally associated with ease or comfort, to refer to such things? World Wide Words, WiseGeek, and Phrase Finder have all taken a swing at this, and the consensus seems to be that it has to do with fairground or movie-house organs, which incorporated many sound effects, including bells and whistles, the better to hold the crowd’s attention. Or it may derive from model railroading, as in “This train set is so true to life it has all the bells and whistles.” There are other possibilities as well — factory time signals, parties and celebrations, buoys, alarm systems — but they seem less plausible. The fact is, no one knows for sure how this popular expression crept, seeped, or slithered into our vocabulary between 1960 and 1990. That’s not a dig or swipe; that’s a respectful acknowledgment of the mysteries of language.
Actually, I found one reference as far back as 1977 to a writer who explained the origin of the phrase, in a short-lived magazine called “ROM.” Unfortunately, Google Books’ snippet view, which I have complained about before, didn’t show me the answer, only the set-up. The nearest library that has copies of this periodical is in Rochester. But if any of my faithful readers can track this one down, I will award you a free subscription without hesitation. I’ll bet the proposed derivation has something to do with circus organs or locomotives. But even if it doesn’t, it won’t be definitive.
What “bells and whistles” ought to denote is “means of getting your attention.” It’s not that they’re cool or make your machine better, it’s that they make you sit up and take notice, like a loud noise going off in your ear. It should be what marketers do, not what designers and engineers do (arguably, the engineers create the features and the marketers turn them into bells and whistles). That’s why I suggest that we thank advertese for pushing a new, improved meaning for this expression into the forefront.
startup
(1990’s | businese, computerese | “new business, firm, etc.”)
Hyphenated or not, this expression was well established as both a noun and an adjective by 1975, particularly in the business press, and it doesn’t appear to be much older than that. Google’s n-gram viewer finds almost no instances before 1950, and the curve doesn’t start to rise sharply until 1970, so it was fairly new then, but easy to use and soon absorbed. When it was a noun, it meant “commencement of operations,” or more colloquially, “opening” or “launch.” Normally it went with heavy industry, so it was common to talk of the startup of a plant or pipeline, for example. But businessmen love to scoff at grammar distinctions — there’s no denying startups invariably entail startup costs, a startup period, or, heaven forbid, startup problems — so they converted it effortlessly into an adjective. “Startup” may also clock in as a verb, but in that part of speech it is usually two words, even today.
By 1990, the concept of a “start[-]up company” had emerged, and occasionally the noun disappeared, leaving “startup” on its own. That wasn’t normal then, but today it is the rule. Back in the eighties, the shift from galumphing old factories to nimble new firms that didn’t make anything three-dimensional was driven by a hostile takeover of American life by the personal computer, a fait accompli by 1995. So many new companies concerned themselves with computer hardware and software that “startup” became common in computer publications by the late eighties. The word is older, but the way we use it today was probably driven by increasing computer sales, and computerese became the funnel for a businese expression — no surprise there. Michael Dell (of Dell Computers) was quoted recently on the “startup ecosystem” in India, and he even spoke of “meeting” (without “with”) several startups, not a use of “meet” I’ve encountered before. Since I haven’t actually offered a definition, here’s one I encountered on a German web site that does the job pretty well: “Startups sind Jungunternehmen mit besonderen Ideen – sehr oft im digitalen Bereich.” (Roughly, “Startups are new enterprises with unusual ideas, most often in the computer sector.”)
My sense is that “startup” had primarily a favorable connotation when it was getting established between 1985 and 1995. Such budding concerns were generally pegged as plucky or scrappy, determined pioneers taking on long odds with heads held high and a sound business plan. In that respect, it was more or less the opposite of “upstart,” which was always uncomplimentary. But as the term has lost novelty, it may have lost its sheen. Anyway, I don’t have the sense any more that it is complimentary. It seems more neutral than anything else.
The key related concept is the entrepreneur, always a figure celebrated in American mythology. Entrepreneurs breed startups, or shed them, or bring them forth from their heads, like Zeus giving birth to Athena. The crashes and recessions that have become frequent since the Nixon years may have dampened the spirits of some of these go-getters who start their own companies, but their flame burns bright as ever in our official worship of business. Entrepreneurs take the initiative, do their homework, embody healthy risk-taking, create jobs and prosperity, and otherwise exemplify the American way. Entrepreneurs are lauded especially on the right, because entrepreneurism is all about me rather than all about us. (That’s an oversimplification, but I’ll stick with it.)
According to my calculations, this is the 300th expression I have written about, at greater or lesser length. (I have become more loquacious over time, not less. Brevity is the soul of wit, indeed.) I chose “startup” as anniversary fodder partly because no operation was ever more shoestring or quixotic than this blog. I say thank you to my readers, to the ones who landed here once off of Google and never came back as well as the ones who read every post and comment faithfully. (You know who you are, and there ain’t very many of you.) I don’t do enough to encourage comments and feedback, but at least here I will say, if you ever feel an impulse to fire off a reply to one of my posts, or to send me an e-mail at usagemaven at verizon dot net, do it. Even if I don’t answer, I am grateful that you took the time, and I will profit from your wise words.
standalone
(1980’s | computerese, businese | “independent,” “unconnected,” “separate,” “isolated”)
The earliest instances of “standalone” (sometimes hyphenated, even in this day and age) in Google Books date from the sixties and seventies, nearly always in talking about non-networked computers. The first hits recorded in LexisNexis all date from 1979 in that trusty journal American Banker — but invariably in discussions of the use of computers in banking. The word was used often in the early days of ATM’s, which could, in the manner of computers, be divided into the ones clustered together for protection (e.g., in a bank lobby) and the ones out in the field, far from help. (The latter had to be connected to a mainframe somewhere or they wouldn’t have access to anyone’s account data, of course. And even a standalone computer had to be connected to a power source. No computer is an island; no computer stands alone.) ATM’s were brave and new in the eighties, and I suspect their spread pushed “standalone” into prominence. Other business types were certainly using the word by 1990, generally in reference to computers. It was widely understood by then but remained primarily a hardware term at least until 2000. One mildly interesting point about “standalone” is that it could apply to an entire system as well as to a single device. A standalone device can function even if it is not part of a larger system, but an entire system can also absorb the adjective if it doesn’t depend obviously on comparable systems.
“Standalone” retains a strong business bias, even today, but it is available to describe many things besides computers. A complicated piece of legislation might be broken up into standalone bills. Or a film for which no prequels or sequels are planned (or one in which a character that had been a supporting player in other films becomes the protagonist) might be so described. A government agency that doesn’t rely on another agency for its writ. A restaurant that isn’t part of a chain. “Standalone” is not generally used to mean “freestanding,” although it seems like it ought to be, literally speaking. I am a little surprised that I find almost no examples of the word used as a noun (one does see it used as a trade name), although that seems inevitable. All it takes is the careless insertion of one lousy one-letter article, and the deed is done. You’d think it would be harder to blur fundamental grammatical categories, but no.
The rise of this term inevitably accompanied a change in how we use computers. In the seventies and eighties, when we began to get serious about turning them into tools for business, the idea was that each employee’s work station had to be connected to the mainframe, where all the applications and data were stored. In the nineties, we shifted to the opposite model: each employee’s computer should have a big enough hard drive to store software and data; every work station became its own mainframe (or server, as we would say now). In the last few years, we’ve pushed the other way, and now minuscule laptops and tablets run software from the cloud, and store data there as well. The same shift has taken place outside the office; home computers have undergone a similar evolution. There are no doubt good reasons for the shift; the rules and conventions of the computer game have changed quite a bit. But like many such sizable shifts in our culture, it has taken place with little or no consideration of why we did it the other way. Are the once highly touted advantages of standalone computers no longer real or significant? We don’t know, because the issue was never debated out where most of us could hear. We did it the old way because there was money in it, and now the powers that be have found a new way to make money. You’re stuck with it whether it helps you or not, and you’re not even entitled to an explanation. That should be surprising, but in practice, it isn’t. Our policy debates routinely fail to explore how things got to be the way they are. It’s as if we all woke up one day and said, “Look, a problem! Let’s fix it!” With insufficient historical understanding, we attack large-scale problems with little or no attention to how they arose and fail to acknowledge the evils the existing approach has successfully prevented.
low-hanging fruit
(1990’s | businese | “the easy part or stuff,” “easy pickings,” “quick results”)
The primary point of this expression is quick, easy, and beneficial; the secondary point is making yourself look good. New managers often go after low-hanging fruit to get quick, eye-catching results. This may lead others to denigrate their accomplishments as cheap, but fixing obvious problems for the sake of an obvious improvement (in the bottom line, productivity, or morale) is something no one ought to apologize for. The expression is generally used to hint that it will be impossible to continue to make progress at the same pace, but it may also suggest the quick progress made so far promises more of the same. My sense is that the expression has never really borne the dishonorable connotation of “easy way out,” although I have seen a few examples very recently, so the concept may be coarsening as we speak. A recent post on greentechmedia.com defines the expression as “do a few small things, and big results will happen.”
I can’t discern a definite origin, but this expression was used most often in the business community and started to show up regularly after 1990, with scattered use at best before then. Executives, consultants, and bankers used it, usually with “pick the” in front of it. Politicians, ever keen to be where the money is, latched onto it quickly, and it mostly remains the property of those with power. The meaning of the phrase has changed little: obvious ways to improve efficiency or profit, or maybe just your life. (Wisegeek has the best discussion I found in two minutes closeted with Google.) The phrase can cover more ground now, of course. In the nineties, cheerleaders for technology used the phrase to refer to savings or gains in output rendered by computers. More recently, the emphasis has shifted. Now, rather than harvesting the fruit, you try to avoid being harvested, that is, avoid becoming an easy target for hackers and cyberthieves, ever on the cyberprowl for low-hanging cyberfruit.
One interesting point about this expression is that it is nearly always used with the past tense. By the time anyone mentions it, it is all gone; its notable absence reminds everyone that the easy part is over, and everything from now on will be more costly and harder to obtain. It is beloved of managers warning their bosses that they can’t be expected to keep producing at the same rate, but it might also be an executive claiming that an industry has done everything reasonable to meet regulators’ demands, or a salesman telling you the most likely customers are already sewn up. When it’s all picked, we’ve reached the point of diminishing returns. The processes or upgrades that constitute low-hanging fruit change over time, and yesterday’s complications are today’s low-hanging fruit: “There comes a time when new technologies are no longer new and become a series of low-hanging fruit components to assemble into new and disruptive opportunities.” (citation)
Another interesting point about this phrase is that even after all the low-hanging fruit has been picked, there must be more opportunities; it can never be used to mean that we have exhausted all the possibilities. There can be no low-hanging fruit without high-hanging fruit.
brick and mortar
(2000’s | businese? computerese? | “with a fixed address”)
I was surprised to learn that “bricks and mortar” is, or at least was, heard as often as “brick and mortar.” The former may come from England, and my ear tells me loud and clear that “brick and mortar” is much more common. But both forms come up often enough to be taken seriously. American Heritage rules it a hyphenated adjective, but it doesn’t seem to be hyphenated very often in the corpora, and it can also be used as a noun. There’s no doubt it is predominantly an attributive adjective. I can imagine someone using it as the complement of a copula (“The store is brick and mortar”), but I’d notice if I actually heard it. A related expression, which I never encountered until I got to wondering about “brick and mortar,” is “click(s) and mortar.” That describes a business that operates both on-line and in physical locations (“bricks and clicks” is another variation). Anyway, the opposite of “brick and mortar” doesn’t have to be “on-line,” however likely most of us are to think of that first. It could be through a mail-order catalogue or even the old stand-by, door-to-door sales, which were antiquated by my childhood and which require a building somewhere, anyway, even if it’s not used for direct customer service. But so do on-line businesses. You can’t leave all those high-powered servers out in the rain.
Indulge me as I drag in one other related term, “showrooming,” which they say is mushrooming. (But one writer says “reverse showrooming” is more common.) It denotes the practice of examining a product in a store, then buying it on-line. I encountered this word only a few years ago, but it has surely leaped the gap between specialized vocabulary and everyday language. It’s almost always used as a gerund. Showrooming is a form of freeloading — you’re using the retailer’s facilities without paying for them. And if all there is to shopping is convenience and saving money, most of the time you can do better on-line, although the Internet ain’t perfect, either.
“Brick and mortar” is older than I thought, and I was probably wrong about its lineage, too. I had assumed it came out of computerese, but it turns up earlier in marketing lingo and earlier still in that surprisingly fecund source of new expressions, American Banker (cf. “firewall,” “takeaway,” “best practices“). The first examples in LexisNexis date from the early eighties, and they’re in articles about changes in banking that make ATM’s and telephone banking more profitable than maintaining branches with parking lots and bullet-proof glass. I can’t rule out the possibility that the bankers got the term from primal computer geeks, but I don’t want to give the geeks too much credit. The New York Times soon provided a sterling example from the wide world of shopping (or “teleshopping” — there’s a neologism that didn’t catch on) in April 1984, and the expression slipped into consumer lingo. It was possible to buy on-line even then, but mail-order catalogues were more the rule. The computer industry was nascent, and very few people had figured out how to make it pay reliably (which, come to think of it, is still true).
Before remote shopping was dreamed of, “brick(s) and mortar” referred to housing; it could also refer to the value of a house (as in: don’t tie up all your capital in bricks and mortar). Businessdictionary.com offers the following: “Originally, a firm’s investment in buildings housing its offices, warehouses, and other facilities.” “The brick and mortar business” was occasionally used in the American press as a set phrase to refer to the building industry.
One impetus for this post was the announcement that Amazon, scourge of brick-and-mortar stores, is about to open one on W. 34th Street in Manhattan. Surely the second coming is at hand! Is this a case of “it takes one to know one” or “if you can’t beat ’em, join ’em”? Actually, it will be more of a take-out joint than a three-star shopping experience; the Wall Street Journal reports that it is designed to give impatient New Yorkers a way to go pick up their Amazon orders rather than waiting for the poky old Postal Service to shlep them to their door. It will be what they call a “fulfillment center” — doesn’t that sound like a health resort for new agers? One more temple to the gods of consumerism. Apparently Amazon is lowering expectations by calling it an experiment rather than a shift in policy. Wouldn’t it be funny if Amazon became a card-carrying hod carrier?