‘Scratch Paper’ or ‘Scrap Paper’?

Andrew Marc Greene e-mails: “My son’s fourth-grade class was debating whether paper on which one scribbles offhand notes is scrap paper or scratch paper. Scrap paper describes where it comes from, and scratch paper defines what it’s used for. We were wondering if the phonetic similarity is just coincidence, or if one term was derived from the other.”

For speakers of American English, at least, the dividing line between scratch paper and scrap paper is none too clear. The linguist Bert Vaux conducted an online survey of American dialects from 2000 to 2005, and he included this question: “What do you call paper that has already been used for something or is otherwise imperfect?” More than 10,000 people responded, and the overall results were evenly split, with about 31 percent saying scratch paper and an equal number saying scrap paper. A third survey choice garnered 36 percent: “Scratch paper is still usable (for example, the paper you bring to do extra work on a test); scrap paper is paper that isn’t needed anymore and can be thrown away.”

Dig deeper into Vaux’s data, and you’ll find distinct regional patterns: respondents from the West and Midwest prefer scratch paper, while Northeasterners go for scrap paper. (Southerners are more likely to split the difference and choose the third option.) Outside of the United States, scratch paper is rarely used, and it gets marked as an Americanism in dictionaries from Oxford and Cambridge. British speakers plump for scrap paper — or if the activity of quick note-taking is foregrounded, scribbling paper. Likewise, what some Americans would call a scratch pad is known in Britain as a scribbling pad or scribbling block.

Given this state of affairs, you might think scratch paper shows up much later than scrap paper in the documentary record. The Google Books database shows scrap paper in use from 1838, but surprisingly enough it also contains an instance of scratch paper from five years earlier, in colonial India of all places. The Bombay Gazette complained that the books in the local literary society’s library “have been most unmercifully scribbled on,” and “various attempts have been made to put a stop to such a scratch-paper practice.”

The Bombay example turns out to be something of an outlier, however. First, even if those library vandals were scratching away on book pages, that’s different from scratch paper in its later incarnation as cheap paper, loose or in a pad, for jotting notes. Scratch paper and its British counterpart scribbling paper did not truly take off until the late 19th century, no doubt helped along by advances in wood-pulp papermaking and the mass production of pencils. Scrap paper, meanwhile, had already been in circulation as a name for waste paper that could be recycled or reused, with note-taking emerging as one prominent type of reuse.

Scratch paper, then, likely owes some of its success in American usage to the fact that it happens to resemble the more widespread scrap paper. That would make scratch paper a potential “eggcorn,” to use a term coined by linguists for a misconstrued word or phrase that gets reshaped with a new semantic motivation. Scratch paper makes sense in a new way, as it describes the note-taker’s hurried writing rather than the cheap source of the paper. Since the two variants are now equally available to Americans, the choice between scrap paper and scratch paper ultimately comes down to a question of the medium vs. the message.

Ben Zimmer will answer one reader question every other week.


Full article: http://www.nytimes.com/2010/12/05/magazine/05FOB-onlanguage-t.html


Allan Curry writes: “What’s to account for the ubiquity of the word resonate, once largely confined to the concert hall, now more (and more) often used to suggest receptivity (to an idea, a political message, etc)?”

Twenty years ago in this space, William Safire pegged resonate as a “vogue word” that had “gone out of control” in the 1980s. He said he would gladly join the crusade of a longtime correspondent, the linguist Louis Jay Herman, against resonate and other words that suffered from “pretentious overuse,” like frisson.

A quick check of the Corpus of Historical American English, an endlessly useful resource made available by Mark Davies at Brigham Young University, finds that the vogue for frisson seems to have peaked in the 1990s. Resonate, on the other hand, shows no sign of abatement. Among the sources collected by Davies, the frequency of resonate has risen steadily, from about two appearances per million words during its supposed heyday in the 1980s, to more than five per million in the past decade.

For most of its history in English, resonate led a peaceful life. Its Latin root, resonare, meaning “to make a prolonged or echoing sound,” had already entered the language by Chaucer’s time in the form resound. It was reborrowed with a more classical air as resonate in the 17th century. The word took on a more technical meaning in the science of acoustics, where resonance is understood as “the reinforcement or prolongation of sound by reflection or by the synchronous vibration of a surrounding space or a neighboring object,” to quote the Oxford English Dictionary.

The noun resonance and the adjective resonant first made the semantic trip from sonorous acoustic qualities to more metaphorical vibrations, suggesting a person’s sympathetic response to something — “striking a chord,” to use another musical figure of speech. In 1607, for instance, an English translation of Henri Estienne’s “World of Wonders” included the line, “So ought our hearts … to have no other resonance but of good thoughts.”

By the early 20th century, the verb resonate began to shimmer with sympathetic vibes. The O.E.D. credits H.G. Wells with the first known figurative use in 1903: “The men and women of wisdom, insight and creation, as distinguished from those who merely resonate to the note of the popular mind.” Wells wrote “resonate to,” but as the metaphorical meaning took off in later decades, the word more typically took the preposition with. Other acoustical metaphors have followed suit: if someone else’s ideas resonate with you, you could also say that the two of you are “on the same wavelength” or “in sync” (two idioms that haven’t aged particularly well, either).

There’s nothing wrong with transferring sonic lingo to the realm of personal sympathies, but if Safire and Herman found resonate hackneyed in 1990, the increased usage in the intervening years has done it no favors. These days we can blame management types in particular for overuse, as the term frequently gets hauled out to convey how “resonant leaders” connect emotionally with a team or audience. No matter what your line of work is, it’s best to use resonate sparingly if you want your words to fall on receptive ears.



Full article: http://www.nytimes.com/2010/11/21/magazine/21FOB-onlanguage-t.html


The origins of our Webified age were hardly auspicious. Two decades ago, Tim Berners-Lee, a British software programmer at the CERN physics-research laboratory outside Geneva, was sketching out a global system for sharing information over the Internet. A March 1989 document that he drafted with the drab title “Information Management: A Proposal” had met with minimal internal interest. Berners-Lee’s group leader, Mike Sendall, was mildly intrigued and allowed him to keep tinkering with the project, calling it “vague, but exciting.”

On Nov. 12, 1990, Berners-Lee tried his hand at a new proposal, now collaborating with the Belgian engineer Robert Cailliau. As Berners-Lee would later recount in his memoir, “Weaving the Web,” he decided some rebranding was in order, and he ran through a number of potential names for the project. One idea was Mesh, “but it sounded a little too much like mess.” Mine of Information might seem “too egocentric” when treated as an acronym, MOI, French for “me.” The Information Mine could be seen as “even more egocentric” based on its acronym: TIM, Berners-Lee’s first name.

“We had been trying to find a good name for the thing for a while,” Cailliau told me via e-mail. “CERN’s experiments and projects were usually given names of Greek or Egyptian mythological figures, and I specifically did not want that because I wanted something for the future and different. I had looked at Nordic mythology but not found anything suitable.”

Finally, Berners-Lee came up with a three-word name that suitably described the global reach of the system they were envisioning: World Wide Web. Cailliau recalls that Berners-Lee put forward the name “as a temporary measure.” They agreed to use it for their revamped proposal for CERN management, as the proposal could not be delayed any further. “If the proposal was accepted,” Cailliau said, “we would find a better name.”

They never did manage to replace that stopgap label, of course, and the 1990 proposal would forever change the English lexicon. In the original title, the three words were run together as WorldWideWeb, but Berners-Lee and Cailliau would soon separate it into World Wide Web (despite the fact that worldwide is best treated as a single word), underscoring the alliteration.

How to abbreviate the name was problematic from the beginning. “Friends at CERN gave me a hard time, saying it would never take off,” Berners-Lee wrote in his memoir, “especially since it yielded an acronym that was nine syllables long when spoken”: double-u, double-u, double-u. Cailliau, who hailed from the Dutch-speaking region of Belgium, told me that it was not so troublesome for him, because in Dutch and other Northern European languages WWW is simply pronounced weh-weh-weh.

Another possible abbreviation was W3, but that alphanumeric version never caught on, lingering only in W3C, the abbreviation for the World Wide Web Consortium, an international standards organization that Berners-Lee founded. A 1992 paper by Berners-Lee and Cailliau pointed the way to future usage: “The W3 worldview is of documents referring to each other by links,” they wrote. “For its likeness to a spider’s construction, this world is called the Web.”

That single spidery word, capitalized or uncapitalized, would bear countless offspring. The online edition of the Oxford English Dictionary catalogs some of the most common web compounds, like web address, web browser, webcam, webcast, web crawler, web developer, web design, webinar, weblog, webmaster, webmistress, web page, web publisher, web server, web site, web surfer and webzine. (The O.E.D. might have gone overboard by including a couple of iffy web-words: webliography, for a Web-based bibliography, and webmeister, a silly alternative to webmaster.)

But that’s not all: weblog, first used in 1997 on Jorn Barger’s “Robot Wisdom Weblog,” made lexical history two years later when Peter Merholz playfully shortened it to blog. Blog soon begat a whole new generation of techno-neologisms in the blogosphere, where bloggers compile blogrolls, celebrate blogiversaries and suffer from blogorrhea. The vowel of blog can mutate, as when law blogs are called blawgs or requests via blog posts are called blegs (combining blog and beg). The “b” in these words is all that remains from its ancestor, Berners-Lee’s Web, and even that slim vestige can be lost when blog blends with other words, as in vlog (a video blog) and splog (a spam blog).

Though the appearance of the phrase World Wide Web 20 years ago was surely a crucial linguistic milestone, Berners-Lee wasn’t the first to hit upon that happy collocation. It has long been a handy journalistic designation for international spy rings, as in the 1853 notice in the London-based Weekly News and Chronicle warning of “a world-wide web of espionage” conducted by Russia under Czar Nicholas or the sensational headline in The Boston Globe of November 1914, “World-Wide Web of German Spies.”

More pertinent, the phrase also has a history among writers imagining complex communication networks. In an 1867 lecture, the English clergyman and novelist Charles Kingsley warned that the scientific advances of the day could be abused in the service of centralized oppression. “I can conceive — may God avert the omen! — centuries hence, some future world-ruler sitting at the junction of all railroads, at the centre of all telegraph-wires — a world-spider in the omphalos of his world-wide web.”

Thankfully, the web of information that Berners-Lee and Cailliau first wove has developed in a highly decentralized fashion, with no “world-spider” in control of the whole thing. And that has allowed the language of the Web to flourish in ways that its innovators could never have foreseen, all derived from a name that was merely a “temporary measure.”



Full article and photo: http://www.nytimes.com/2010/11/14/magazine/14FOB-onlanguage-t.html


In my recent column on Stephen Colbert’s coinage of “truthiness,” I wrote that the title of Charles Seife’s new book, “Proofiness,” is “very much a homage to Colbert.” Laura Kozin e-mails: “My brain stubbed its toe on this. I thought we pronounced it ‘an (h)omage.’ Did I miss a change to spoken ‘h’? Is it now ‘a herb garden’ as well?” Steve Penn e-mails: “Your last column indicated that you pronounce homage with the ‘h.’ Me too. A few years ago, on the radio, I was jolted to hear this word pronounced oh-MAZH. A real stomach-turner. Since then, I’ve heard this pronunciation fairly frequently on the radio and occasionally on television. Are the broadcast folks cooking up a new pronunciation, or do they intend some word other than homage?”

The New York Times style guide does not specifically address the word homage, and in such matters the copy desk typically turns to Webster’s New World Dictionary for guidance. As with other leading American dictionaries, Webster’s New World currently recognizes two equally accepted pronunciations of the word: either HOM-ij or OM-ij. Since the pronunciation with “h” is listed first, that would favor “a homage” over “an homage.” (The Times has not been terribly consistent on this score, however. Since 2001, “a homage” has appeared in the paper 500 times, but “an homage” has appeared 407 times.)

While most U.S. dictionaries list HOM-ij first, one exception is Merriam-Webster’s Collegiate Dictionary. Joshua S. Guenter, Merriam-Webster’s pronunciation editor, explained to me that prior to the Tenth Edition of the dictionary in 1993, the pronunciation of homage was given with the initial “h” in parentheses, “indicating the two variants were about equally common.” Starting with the Tenth, they began giving a slight edge to OM-ij. “Our citation files do show the ‘h’-less variant to be more common than the ‘h’-ful one, though not by a huge degree,” Guenter said.

Dropping the “h” sound from homage appears to be gathering steam in American speech, and other dictionaries will no doubt begin to reflect this move. This actually represents a return to a much older pronunciation pattern. As with many other imports from Norman French into Middle English, the initial “h” was not originally pronounced in homage. Eventually, so-called spelling pronunciation introduced the “h” sound to words like habit, host, hospital and human. Some words resisted the extra puff of aspiration, like heir, honest, honor and hour. Still others took on the “h” only in certain dialects: herb, for instance, stayed unaspirated in American pronunciation while it gained the “h” sound in British English. Starting around the 18th century, homage joined the “h”-ful crowd.

The shifting status of homage is further muddied by the modish French-influenced version, oh-MAZH. Strictly speaking, that pronunciation ought to be limited to artistic contexts where the French word hommage has been reintroduced into English as a term for a work that respectfully emulates that of another artist. Something similar happened with the word auteur, which cinephiles borrowed from French to refer to directors with distinctive styles, even though the word had already entered the lexicon centuries ago as author.

The oh-MAZH pronunciation is gaining a foothold beyond the arts world, and for some that’s a cause for alarm. In his book “The Accidents of Style,” Charles Harrington Elster calls this a “preposterous de-Anglicization” that is “becoming fashionable among the literati.” Elster had previously complained that good old HOM-ij was losing out to OM-ij “in havens for the better-educated like National Public Radio,” and for defenders of the “h” pronunciation oh-MAZH just adds insult to injury.

A check of NPR’s audio archives corroborates Elster’s hunch. Listening to 10 recent uses of the word homage by on-the-air personalities, I found an even split: five for oh-MAZH and five for OM-ij, with the latter generally reserved for the “respect” meaning, as in pay homage. The HOM-ij pronunciation, meanwhile, seems to be losing out to its trendier h-less rivals, despite the protestations of traditionalists. And since it’s a fight of two against one, “a homage” may, over time, become increasingly rare.



Full article: http://www.nytimes.com/2010/11/07/magazine/07FOB-onlanguage-t.html

Creeper! Rando! Sketchball!

When Liana Roux, a junior at the University of North Carolina at Chapel Hill, was reading a Facebook event page for her friend’s birthday party recently, she noticed a terse proviso at the end of the announcement: “No randos.” The friend wanted only people she knew to come to her party and thus sought to bar any random strangers, or randos, in collegiate parlance.

Roux is keeping track of words like rando for an assignment in a class she is taking on the grammar of current English, taught by Connie C. Eble, the resident linguist in U.N.C.’s English department. Since 1972, Eble has asked her students to compile lists of slang that they encounter in their everyday interactions, and this semester, rando is going on Roux’s list.

Rando is one of a surprisingly large number of words that U.N.C. students use to refer to unfamiliar, suspicious or anxiety-producing outsiders. Skimming the lists that Eble has collected from recent classes, I kept spotting a familiar pattern: along with rando, there are nouns like creeper, sketcher and sketchball and adjectives like dubious, grimy, sketchy, sketch and skeazy. Sketchy and sketch have, in fact, been among the most frequently attested words culled from Eble’s students for the past several semesters.

These treacherous terms have been percolating for years on many American campuses. A list of slang compiled from students at the University of Arkansas, Fayetteville, published in the journal American Speech in 1975, included sketch as an adjective meaning “dangerous, risky” (“I think we’re in a sketch situation”). By 1996, one of Eble’s U.N.C. students offered sketch as a noun meaning “someone who is hard to figure out.” The variations sketchball, sketcher and sketchmaster followed thereafter, all sharing an air of suspicion and possible danger or at least discomfort.

The creep family is much older, originally describing people you can’t trust because they’re always “creeping around.” In early-20th-century America, a creep or creeper could refer to a sneaky thief, a cheating lover or a despicable person more generally. In later years, the annoying or shady creep begat creepo, creepazoid and creepshow. (And just as you can be creeped out by a creepy person, you can be sketched out by a sketchy person.)

We can thank the fine minds at the Massachusetts Institute of Technology for moving random into the realm of the weird. As early as 1971, according to the Oxford English Dictionary, M.I.T.’s student paper, The Tech, was using random as an adjective meaning “peculiar, strange” or as a noun to disparage people outside a community, particularly the community of computer hackers. (The 1991 New Hacker’s Dictionary provides the example “The audience was full of randoms asking bogus questions.”) Eventually it could refer to unfamiliar faces in any social situation, like a party or a bar, with rando as a slangy 21st-century shortening.

Even if these terms describing creepy outsiders aren’t necessarily novel, the question remains: Why do they occur in such profusion on the U.N.C. slang lists? Eble points out that the words are typically used by women, who currently make up nearly 60 percent of U.N.C.’s student population. Compared with past generations, Eble said, “female students are putting themselves into more dangerous situations than they did in my day,” especially when it comes to dating and partying. Terms like creeper, rando and sketchball come in handy as women deal with men who may try to give them unwanted attention.

In interviews I conducted with Eble’s students, one recurring theme that emerged was the impact of technology and social media on the need to patrol social boundaries. “With Facebook and texting,” Natasha Duarte said, “it’s easier to contact someone you’re interested in, even if you only met them once and don’t really know them. To the person receiving them, these texts and Facebook friend requests or wall posts can seem premature and unwarranted, or sketchy.”

Facebook in particular lends itself to “stalkerish” behavior, Christina Clark explained, and indeed the compound verb Facebook stalk (meaning “excessively or surreptitiously peruse another’s Facebook profile”) shows up in the latest slang lists. “People put things on Facebook a lot of the time to show off pictures of themselves and to meet new people, but some of these new people are undesirables,” Clark said. “Unfortunately, it can be hard to filter these people out without feeling unkind, so this information is available to them, and often it is alarming if they seem to be looking through pictures or constantly trying to find out what you’re up to. These people then become stalkers or ‘creepers.’ ”

Lilly Kantarakias said she believes that the shift to technologically mediated exchanges among students is leading to a “loss of intimacy” and that this failure to engage in human contact is responsible for the rise in all of the “sketchy” talk. “People have lost both their sense of communication and social-interaction skills,” Kantarakias said. “We know only how to judge people off of a Facebook page or we easily misinterpret texts or e-mails. You can see it in the way people walk around campus, texting on their cells, being completely oblivious to the hundreds of people surrounding them. We’ve become lazy with our speech and our social profiling of fellow human beings.”

Roux observed that “as college students, we navigate through an enormous social landscape every day.” The slang words for suspicious outsiders “create a distance between ‘us’ and ‘them,’ between our clique and the creepers.” These “terms of exclusion,” as Roux sees them, don’t just separate an in-group of students from potentially dangerous people but also from “people we just dislike or people who are perceived as different or weird.” And that type of behavior, even if it is complicated these days by new technology, new social pressures and new slang expressions, is surely as old as the hills.

Ben Zimmer, New York Times


Full article and photo: http://www.nytimes.com/2010/10/31/magazine/31FOB-onlanguage-t.html


Should the word be used for things we can actually count?

McKay Stangler e-mails: ‘‘I was curious about your thoughts on the modern usage of ‘countless.’ The Oxford English Dictionary defines it as ‘that cannot be counted’; in other words, too many of something to count. I’ve noticed, however, that it has very nearly become a synonym for ‘many’ or ‘numerous.’ Do you have a sense of how and when the word started adopting this newly evolved meaning?’’

Countless falls into a family of adjectives that, when taken literally, imply an infinitude but in practice refer more loosely to a vast number. Others in this family include incalculable, immeasurable, inestimable, limitless and measureless. Even infinite gets used in this hyperbolic fashion, and has since the age of Chaucer. (Think of Hamlet’s line: ‘‘What a piece of work is a man! How noble in reason? How infinite in faculty?’’)

As for countless, its traditional use has been for quantities that are, if not strictly uncountable, at least too immense to allow for easy enumeration. A 1916 dictionary of similes supplies some typical literary examples to fill the slot in the phrase ‘‘as countless as ___’’: stars in the sky, grains of sand in the desert, motes of dust in a sunbeam, drops of water in the ocean. Thus, canonically countless items are natural objects that are so profuse that they defy human attempts to number them.

Has the sense of countless been weakening in recent years? Anecdotal evidence might suggest so. Stangler provides a few examples of usage from a week of New York Times coverage that he finds questionable. On Cuba: ‘‘Workers were being laid off in countless industries, from hospitals to hotels.’’ In an obituary of a voiceover actor: ‘‘As the narrator of countless movie trailers (his wife estimated he did 3,000), Mr. Gilmore was an especially effective pitchman.’’ In an article about community farming: ‘‘Even without the community agriculture program, there is tree pruning all winter and countless other tasks.’’

In all three cases, the nitpickiest among the Times readership could find grounds for complaint. If we set our minds to it, we probably could count all the industries in Cuba, the trailers narrated by Mr. Gilmore or the tasks on a community farm. But these things would not be easy to count, and that is generally what countless now implies. (The first sentence is guilty of a graver journalistic transgression: ‘‘from hospitals to hotels’’ is a false range generally scorned by copy editors.)

Though I haven’t found any complaints about the exaggerated use of countless in any of the standard usage guides, the writer David Foster Wallace, a well-known stickler on grammatical matters, seems to have been attuned to the word’s overextension. In a short story called ‘‘My Appearance,’’ he tells of an actress going on David Letterman’s late-night talk show. Letterman mentions her ‘‘three quality television series’’ and ‘‘countless guest-appearances on other programs.’’ The actress replies matter-of-factly, ‘‘A hundred and eight.’’ Letterman corrects himself with ‘‘virtually countless guest-credits.’’ A hedging word like virtually, nearly or almost can help to tone down the hype of countless. Or why not mix it up with another adjective like myriad or multitudinous? The possibilities are limitless (well, not quite).

Ben Zimmer, New York Times


Full article: http://www.nytimes.com/2010/10/24/magazine/24onlanguage.html


Around 4 p.m. on Oct. 17, 2005, Stephen Colbert was searching for a word. Not just any word, but one that would fit the blowhard persona that he was presenting that night on the premiere episode of Comedy Central’s “Colbert Report.” He once described his faux-pundit character as a “well-intentioned, poorly informed, high-status idiot,” and the word he was looking for had to be sublimely idiotic.

During the rehearsal, Colbert was stuck on what term to feature for the inaugural segment of “The Word,” a spoof of Bill O’Reilly’s “Talking Points.” Originally, he and the writers selected the word truth, as distinguished from those pesky facts. But as Colbert told me in a recent interview (refreshingly, he spoke to me as the real Colbert and not his alter ego), truth just wasn’t “dumb enough.” “I wanted a silly word that would feel wrong in your mouth,” he said.

What he was driving at wasn’t truth anyway, but a mere approximation of it — something truthish or truthy, unburdened by the factual. And so, in a flash of inspiration, truthiness was born. In that night’s broadcast, he imagined the disdain his coinage would engender among elitist dictionary types. “Now I’m sure some of the Word Police, the wordinistas over at Webster’s, are gonna say, ‘Hey, that’s not a word,’ ” he said. As I pointed out at the time on the linguistics blog Language Log, truthiness already appeared in the Oxford English Dictionary under the adjective truthy. To be sure, it was exceedingly rare before 2005, but it had been recorded as a somewhat playful variant of truthfulness since the early 19th century.

Regardless of its pre-Colbert history, truthiness in its satirical new meaning charmed many a wordinista. A few months after its debut on “The Colbert Report,” at the annual meeting of the American Dialect Society (A.D.S.) in Albuquerque, it was selected as the 2005 Word of the Year. At the meeting, I was an unabashed supporter of the choice, doing my part to make sure it beat out such worthy adversaries as podcast and sudoku. The selection received a surprising amount of press attention, with Colbert himself stoking the flames by picking a fight with the Associated Press, which had unaccountably omitted any mention of “The Colbert Report” in the Word of the Year article that went out over the wires.

On the A.D.S. mailing list, society members expressed bemusement at the role they had unexpectedly played in bringing truthiness into wider circulation. “Like astronomers witnessing the birth of a nova,” wrote Allan Metcalf, the society’s executive secretary, “we are watching the nativity and infancy of a new word that has the possibility of becoming a permanent addition to the vocabulary. And we have been midwives.”

Ronald R. Butters, the former chairman of Duke University’s linguistics program, was not impressed. “Truthiness is not a lexicological nova,” he countered on the mailing list, predicting that it was a flash in the pan that would “go the way of bushlips, and about as quickly.” Bushlips, meaning “insincere political rhetoric,” was the first A.D.S. Word of the Year, in 1990 (when Bush the elder reneged on his “no new taxes” pledge), and it soon ended up on the scrapheap of history.

Five years later, truthiness has proved to be no bushlips. It has even entered the latest edition of the New Oxford American Dictionary, published earlier this year, with Colbert explicitly credited in the etymology. In an e-mail, Butters acknowledged that he was clearly wrong about the word’s staying power but said he still considers it nothing more than a “stunt word,” calling it “a hokey, unnaturally contrived coinage.”

For many other observers, though, there is something undeniably appealing about how truthiness signifies ersatz truth, so much so that the neologism has spawned numerous imitators ending in -iness — what the Stanford linguist Arnold Zwicky has called “the Colbert suffix.” In 2007, Meghan Daum of The Los Angeles Times used “fame-iness” to refer to Paris Hilton-style celebrity, while Ben Goldacre of The Guardian mocked an author’s superficial footnotes as providing “an air of ‘referenciness.’ ” The latest in the “X-iness” parade is the title of Charles Seife’s new book, “Proofiness,” defined by Seife as “the art of using bogus mathematical arguments to prove something that you know in your heart is true — even when it’s not.” Seife, an associate professor of journalism at New York University, told me that the title is very much a homage to Colbert. He credits his wife with recognizing during the writing of the book that his topic was “the mathematical analogue of truthiness.”

The enduring influence of truthiness has also been felt at Indiana University, where a team of information scientists has designed software to detect the propagation of political misinformation on Twitter. The project leader, Filippo Menczer, recalled that while the team was brainstorming about a name for the research tool, one of his graduate students suggested Truthy. “Everyone agreed it was perfect,” Menczer said. Contributors are now busy disentangling reliable political Twitter posts from those that are merely truthy.

Colbert, for his part, said that he’s amazed at how far truthiness has come. But as others have spread the word, he hasn’t felt the need to use it much himself. After Glenn Beck held a “Restoring Honor” rally at the Lincoln Memorial, Colbert’s fans clamored for their own “Restoring Truthiness” event; Colbert has chosen instead to lead a “March to Keep Fear Alive” in Washington on Oct. 30. Truthiness, Colbert pointed out, is in no need of restoring, since it continues to define those who appeal to raw feelings at the expense of facts. “I doubt that many people in American politics are acting on the facts,” he observed ruefully. “Everybody on both sides is acting on the things that move them emotionally the most.”

Ben Zimmer will answer one reader question every other week.


Full article and photo: http://www.nytimes.com/2010/10/17/magazine/17FOB-onlanguage-t.html

‘Making an Amends’

Meg e-mails: “I am a member of a 12-step program in which the eighth and ninth steps refer to ‘making amends.’ When people share their experience with these steps, they often talk about ‘making an amends’ as if it were a combination of singular and plural. I find this so annoying that I may need to make amends for interrupting people to correct their grammar. But perhaps I am in error. Could you please advise as to the correctness of ‘making an amends’?”

The 12 steps to recovery first outlined by the founders of Alcoholics Anonymous, Bill Wilson and Bob Smith, have been enshrined in A.A.’s “Big Book” for more than seven decades. Over the years, the remorseful focus on “making amends” in Steps 8 and 9 has extended beyond the A.A. movement to the language of recovery more generally, even making an appearance in the public statement by Tiger Woods earlier this year apologizing for his marital infidelities.

While Woods said in his prepared statement, “It’s now up to me to make amends,” he modified the idiom in an interview with ESPN the following month, speaking of the “many people I have to make an amends to.” Woods is hardly alone in treating the word amends as a singular noun, or even alternating between singular and plural interpretations of the word.

Uncertainty over how to treat amends is far from new. The Oxford English Dictionary has examples of amends used in a distinctly singular fashion all the way back to the 15th century. The English essayist Joseph Addison wrote of making “an honorable amends,” and T. S. Eliot, in his poem “Portrait of a Lady,” posed the question, “How can I make a cowardly amends / For what she has said to me?”

Amends came into English from the Old French word amendes, meaning “fines” or “penalties,” the plural of amende, meaning “reparation.” But while the singular form persisted in French, it dropped out of English, leaving us with a plural noun that has no proper singular equivalent. Something similar happened with other words in the language, like alms, odds, pains and riches.

Noah Webster tried to sort out this confusion in his 1789 book, “Dissertations on the English Language.” Webster held that “amends may properly be considered as in the singular number,” but concluded that judgment of the word as singular or plural was ultimately “at the choice of the writer.” He saw the word means as a parallel case: if means expresses a single action to achieve a result, it can be thought of as singular despite the -s ending, but if it encompasses more than one action, it can take the plural reading.

Sadly, idioms don’t always accord with logical argumentation. The singular version of means survives in the frozen phrase, a means to an end, but singular amends has never made much headway in standard English. Make an amends is vastly outnumbered by make amends in written use, though it is likely more popular in everyday speech, as Tiger Woods demonstrated when he went off-script. Notwithstanding illustrious predecessors like Addison and Eliot, it’s best to make amends and not an amends, lest your act of contrition turn into a grammatical squabble.

Ben Zimmer, New York Times


Full article: http://www.nytimes.com/2010/10/10/magazine/10onlanguage.html


Theodore Rockwell, who served as technical director for the U.S. Navy’s nuclear-propulsion program in the 1950s and ’60s, shared a telling anecdote about his onetime boss, the famously irascible Adm. Hyman G. Rickover. “One time he caught me using the editorial we, as in ‘we will get back to you by. . . .’ ” Rockwell recalled in his memoir, “The Rickover Effect.” “He explained brusquely that only three types of individual were entitled to such usage: ‘The head of a sovereign state, a schizophrenic and a pregnant woman. Which are you, Rockwell?’ ”

Rickover was hardly alone in his abhorrence of the editorial we — so called because of its usage by anonymous opinion columnists. In fact, his barb has been told in many different ways over the years. Consider another volatile personality, Roscoe Conkling, who served as senator from New York after the Civil War. In 1877, Conkling objected to how the new president, Rutherford B. Hayes, overused the word we, and The St. Louis Globe-Democrat reported his rejoinder: “Yes, I have noticed there are three classes of people who always say ‘we’ instead of ‘I.’ They are emperors, editors and men with a tapeworm.”

Conkling’s formulation was picked up by we-haters far and wide. The trifecta of “kings, editors and people with tapeworm” has been widely attributed to Mark Twain, but like so many witticisms credited to him, there’s no record he ever said it. It’s also unlikely that Henry David Thoreau ever made the remark once ascribed to him: “We is used by royalty, editors, pregnant women and people who eat worms.”

Worms, or more specifically tapeworms, figure prominently in we-­related humor. The earliest known joke to combine parasites and pronouns comes from George Horatio Derby, a humorist from California who assumed the pen name John Phoenix. “I do not think I have a tapeworm,” he wrote in 1855, “therefore I have no claim whatever to call myself ‘we,’ and I shall by no means fall into that editorial absurdity.”

What is it about the presumptuous use of we that inspires so much outrage, facetious or otherwise? The roots of these adverse reactions lie in the haughtiness of the majestic plural, or royal we, shared by languages of Western Europe since the days of ancient Roman emperors. British sovereigns have historically referred to themselves in the plural, but by the time of Queen Victoria, it was already a figure of fun. Victoria, of course, is remembered for the chilly line, “We are not amused” — her reaction, according to Sir Arthur Helps, the clerk of the privy council, to his telling of a joke to the ladies in waiting at a royal dinner party. Margaret Thatcher invited mocking Victorian comparisons when she announced in 1989, “We have become a grandmother.”

Nameless authors of editorials may find the pronoun we handy for representing the voice of collective wisdom, but their word choice opens them up to charges of gutlessness and self-importance. As the fiery preacher Thomas De Witt Talmage wrote in 1875: “They who go skulking about under the editorial ‘we,’ unwilling to acknowledge their identity, are more fit for Delaware whipping-posts than the position of public educators.”

Given the accumulated resentment of “nosism” (using we for I, from the Latin pronoun nos), it’s little wonder that modern literary writers have rarely tried to write narratives in the first-person plural. But the device of collective narration has worked effectively on occasion, from William Faulkner’s “A Rose for Emily” to Joshua Ferris’s “Then We Came to the End.” Most recently, Lisa Birnbach has taken the nosist route in “True Prep,” her 30-year follow-up to “The Official Preppy Handbook.” (Opening lines: “Wake up, Muffy, we’re back. O.K., now where were we?”)

The royal and editorial we are examples of the exclusive we, meaning that the person being addressed is not included in the scope of the pronoun. English, like many languages, uses the same word for the inclusive first-person plural, encompassing the notional “you” along with “me.” The inclusive we seeks out a bond of empathy or common understanding between the speaker and the receiver of a message. Writers rely on it to establish rapport with readers, and teachers with students (“as we shall see”). But this is not always a welcome rhetorical move, especially when it comes across as pedantic or condescending. At worst, it can recall the we of caregivers for the very young and very old: “How are we feeling today?”

The overreaching effect of the inclusive we has sparked its own humorous traditions. In August 1956, the Los Angeles Times columnist Gene Sherman introduced into print what was already a well-traveled story about the Lone Ranger and his faithful sidekick, Tonto. Surrounded by “wild, screaming Indians,” the Lone Ranger desperately asks Tonto, “What will we do?” Tonto replies, “What do you mean ‘we,’ paleface?” Later versions changed “paleface” to “white man” or “kemo sabe,” Tonto’s endearing epithet for the Ranger. The joke is so well known in the United States that just the punch line is usually sufficient for rebuffing an overly inclusive we.

An equally colorful but less common American retort to the inclusive first-person plural pronoun is “We? You got a mouse in your pocket?” Curt Johnson, publisher of the Chicago literary magazine December, remarked in a 1966 article that he heard the line from a student talking back to a college instructor. Many other regional variants have sprung up, with “rat” or “frog” standing in for “mouse.” Another more sex-specific inquiry is about “a mouse in your purse.” Dabblers in nosism beware: whether it’s tapeworms or rodents, saying we where I would do can expose you to accusations of infestation.


Full article and photo: http://www.nytimes.com/2010/10/03/magazine/03FOB-onlanguage-t.html


My ebullient 4-year-old son, Blake, is a big fan of the CDs and DVDs that the band They Might Be Giants recently produced for the kiddie market. He’ll gleefully sing along to “Seven,” a catchy tune from their 2008 album “Here Come the 123s” that tells of a house overrun by anthropomorphic number sevens. The first one is greeted at the door: “Oh, there’s the doorbell. Let’s see who’s out there. Oh, it’s a seven. Hello, Seven. Won’t you come in, Seven? Make yourself at home.”

Despite the song’s playful surrealism (more and more sevens arrive, filling up the living room), the opening lines are routine and formulaic. The polite ritual of answering the door and inviting a guest into your house relies on certain fixed phrases in English: “Won’t you come in?” “Make yourself at home.”

As Blake learned these pleasantries through the song and its video, I wondered how much — or how little — his grasp of basic linguistic etiquette is grounded in the syntactical rules that structure how words are combined in English. An idiom like “Make yourself at home” is rather tricky if you stop to think about it: the imperative verb “make” is followed by a second-person reflexive pronoun (“yourself”) and an adverbial phrase (“at home”), but it’s difficult to break the phrase into its components. Instead, we grasp the whole thing at once.

Ritualized moments of everyday communication — greeting someone, answering a telephone call, wishing someone a happy birthday — are full of these canned phrases that we learn to perform with rote precision at an early age. Words work as social lubricants in such situations, and a language learner like Blake is primarily getting a handle on the pragmatics of set phrases in English, or how they create concrete effects in real-life interactions. The abstract rules of sentence structure are secondary.

In recent decades, the study of language acquisition and instruction has increasingly focused on “chunking”: how children learn language not so much on a word-by-word basis but in larger “lexical chunks” or meaningful strings of words that are committed to memory. Chunks may consist of fixed idioms or conventional speech routines, but they can also simply be combinations of words that appear together frequently, in patterns that are known as “collocations.” In the 1960s, the linguist Michael Halliday pointed out that we tend to talk of “strong tea” instead of “powerful tea,” even though the phrases make equal sense. Rain, on the other hand, is much more likely to be described as “heavy” than “strong.”

A native speaker picks up thousands of chunks like “heavy rain” or “make yourself at home” in childhood, and psycholinguistic research suggests that these phrases are stored and processed in the brain as individual units. As the University of Nottingham linguist Norbert Schmitt has explained, it is much less taxing cognitively to have a set of ready-made lexical chunks at our disposal than to have to work through all the possibilities of word selection and sequencing every time we open our mouths.

Cognitive studies of chunking have been bolstered by computer-driven analysis of usage patterns in large databases of texts called “corpora.” As linguists and lexicographers build bigger and bigger corpora (a major-league corpus now contains billions of words, thanks to readily available online texts), it becomes clearer just how “chunky” the language is, with certain words showing undeniable attractions to certain others.
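
The co-occurrence patterns that corpora reveal are easy to compute mechanically. As a toy sketch (the miniature corpus below is invented for illustration; real corpora run to billions of words), counting adjacent word pairs in Python shows how “heavy” attaches to “rain” while “powerful” does not:

```python
from collections import Counter
from itertools import islice

# A tiny invented "corpus"; corpus linguists do the same counting
# over billions of words of real text.
corpus = (
    "heavy rain fell all day . strong tea was served . "
    "heavy rain again . strong tea and biscuits . heavy traffic . "
    "powerful engine . powerful computer . strong coffee ."
).split()

# Count adjacent word pairs (bigrams) -- the simplest collocation measure.
bigrams = Counter(zip(corpus, islice(corpus, 1, None)))

# Which adjective does "rain" attract?
for adj in ("heavy", "strong", "powerful"):
    print(adj, "rain:", bigrams[(adj, "rain")])
# heavy rain: 2, strong rain: 0, powerful rain: 0
```

In practice, raw counts give way to association scores like pointwise mutual information, which adjust for how common each word is on its own.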

Many English-language teachers have been eager to apply corpus findings in the classroom to zero in on salient chunks rather than individual vocabulary words. This is especially so among teachers of English as a second language, since it’s mainly the knowledge of chunks that allows non-native speakers to advance toward nativelike fluency. In his 1993 book, “The Lexical Approach,” Michael Lewis made an influential case for chunk-centered instruction, an approach carried forward in such recent titles as “From Corpus to Classroom: Language Use and Language Teaching” and “Teaching Chunks of Language: From Noticing to Remembering.”

Not everyone is on board, however. Michael Swan, a British writer on language pedagogy, has emerged as a prominent critic of the lexical-chunk approach. Though he acknowledges, as he told me in an e-mail, that “high-priority chunks need to be taught,” he worries that “the ‘new toy’ effect can mean that formulaic expressions get more attention than they deserve, and other aspects of language — ordinary vocabulary, grammar, pronunciation and skills — get sidelined.”

Swan also finds it unrealistic to expect that teaching chunks will produce nativelike proficiency in language learners. “Native English speakers have tens or hundreds of thousands — estimates vary — of these formulae at their command,” he says. “A student could learn 10 a day for years and still not approach native-speaker competence.”

Besides, Swan warns, “overemphasizing ‘scripts’ in our teaching can lead to a phrase-book approach, where formulaic learning is privileged and the more generative parts of language — in particular the grammatical system — are backgrounded.” Formulaic language is all well and good when talking about the familiar and the recurrent, he argues, but it is inadequate for dealing with novel ideas and situations, where the more open-ended aspects of language are paramount.

The methodology of the chunking approach is still open to this type of criticism, but data-driven reliance on corpus research will most likely dominate English instruction in coming years. Lexical chunks have entered the house of language teaching, and they’re making themselves at home.

Ben Zimmer will answer one reader question every other week. Send your queries to onlanguage@nytimes.com.


Full article and photo: http://www.nytimes.com/2010/09/19/magazine/19FOB-OnLanguage-Zimmer.html

‘All’s I Know . . .’

Jan Conaway writes: “The first time I heard someone say ‘All’s I know is . . .’ was in the 1980s. As I’m sure you have noticed, this annoying expression has since become more and more prevalent, and I realize there is no hope that it will disappear. Do you know where all’s actually originated?”

The Dictionary of American Regional English, the go-to reference for local American speech patterns, explains that all’s started off as a contraction of all as, with as working like the relative pronoun that. In a speech survey of Amherst, Mass., in 1967, DARE reports, expressions like “All’s I get is” or “All’s he can do is” were in frequent use among some locals in their mid- to late 20s. A 1975 guide to Maine lingo by the columnist John Gould spelled the regional version as alst, giving the example, “Alst I know is what they tell me.”

The use of as in place of that or who was once widespread in both England and the United States. Shakespeare used it: a character in “The Merry Wives of Windsor” speaks of “those as sleep and think not on their sins.” In contemporary English, such usage has been considered nonstandard, and even the dialects that retain this kind of as don’t use it in all cases. Sentences that introduce a subordinate clause with all (what linguists call all-clefts) are one part of the language where as meaning “that” has lingered.

Nineteenth-century writers, both British and American, frequently put all as in the mouths of rural or lower-class types, as in “All as we’ve got to do is to trusten,” from George Eliot’s “Silas Marner.” Contracting the two words into all’s or alls seems like an obvious way of representing a rapid pronunciation. One early example is from “Hearts of Oak,” an American play from 1879, in which a hardy Massachusetts fisherman says, “All’s I got to say is, Heaven bless the gal as you’d take hum for a wife.” Keeping with the nautical theme, a story published by The Los Angeles Times in 1894 has an old sailor telling a tale of a sea serpent: “Did we hit the beast? Well, that I can’t say. All’s I know there was a sudwint swish, and next minute the Betsy B. went plungin’ down. . . .”

You don’t have to be a wizened seaman to use “All’s I know” these days, but it’s still quite colloquial. It’s hard to say if there has been a marked increase in spoken usage over recent years or if we’re simply seeing it represented in print and online more often, along with other colloquialisms that pepper easygoing prose. In any case, it has become a fixed idiom, no longer connected to bygone dialects where as could serve as a relative pronoun. Much like “How’s about it?” or “Just so’s you know . . .”, the extra ’s in “All’s I know” might now simply add a breezy air of informality.


Full article:  http://www.nytimes.com/2010/09/12/magazine/12onlanguage.html

The Meaning of ‘Man Up’

The New York Mets lost their closer Francisco Rodriguez, a k a K-Rod, to season-ending surgery on a torn thumb ligament last month. But really the Mets lost him to two simple words: “man up.” According to The New York Daily News, that’s what Carlos Peña, the father of Rodriguez’s girlfriend, told him outside the Mets clubhouse, inciting an altercation that led to K-Rod busting his thumb and getting arrested on third-degree-assault charges for good measure.

While man up may have served as fighting words for Rodriguez, the exhortation is taking on many guises in American popular culture right now. Advertisers courting young male consumers are spreading the manly message. The Web site for the No Fear energy drink smacks the “Man Up” slogan across the screen, accompanied by an aggressive rock soundtrack. Meanwhile, Miller Lite has been running television commercials featuring a voice-over that growls, “Man up, because if you’re drinking a light beer without great pilsner taste, you’re missing the point of drinking beer.” (Light beer ads often amp up the masculinity, perhaps to compensate for their watered-down product.)

But man up isn’t just being used to package machismo as a commodity. Its spectrum of meanings runs from “Don’t be a sissy; toughen up” all the way to “Do the right thing; be a mensch,” to use the Yiddishism for an honorable or upright person. The Man Up Campaign, for instance, is a new global initiative that engages youth to stop gender-based violence: “Our call to action challenges each of us to ‘man up’ and declare that violence against women and girls must end,” its mission statement reads.

Not too long ago, man up was simply an alternative to the verb man, in the sense of “to supply with adequate manpower.” (Staff or staff up would be the more politically correct choices nowadays.) The Oxford English Dictionary cites a 1947 letter to the editor of The Times of London from Henry Strauss, a Conservative member of Parliament, complaining about man up as an insidious Americanism. “Must industries be fully ‘manned up’ rather than ‘manned’?” Strauss asked. “Must the strong, simple transitive verb, which is one of the main glories of our tongue, become as obsolete in England as it appears to be in America?”

Strauss might have thought of man up as less than virile, but phrasal verbs with up have filled a wide variety of roles in both British and American English. Up gets used for literal upward movement (lift up, stand up) or more figuratively to indicate greater intensity (stir up, fire up) or completion of an act (drink up, burn up). It’s particularly handy for blunt imperatives calling for resolute action: think of wake up!, grow up!, hurry up! and put up or shut up!

One notable forerunner of man up as we know it today is cowboy up, a phrase that has been used in rodeo circles for decades. In Douglas Kent Hall’s 1973 book on rodeo life, “Let ’Er Buck!,” an old hand tells a rookie rider, “It looks like we’re going to have to cowboy up a little.” Another rider, in a 1975 article in The Reno Evening Gazette, talked about what it’s like to get clobbered in a bull wreck, with the rodeo instructor “right behind you saying: ‘Cowboy up. Get tough. Get tough.’ ”

Cowboy up wasn’t much known outside of rodeo country until 2003, when it became the rallying cry for the Boston Red Sox, thanks to the players Kevin Millar and Mike Timlin — both Texans, not coincidentally. Millar and Timlin injected this bit of rodeo slang into Red Sox parlance to fire up a team (and a fan base) that had long been ruled by mopey fatalism. As one T-shirt of the time put it, “Are You Gonna Cowboy Up or Just Lay There and Bleed?”

Man up owes its early popularization to another American sport: football, where it originally had a more technical meaning relating to man-to-man pass defense. In 1985, for example, the New York Jets head coach Joe Walton praised the work of his defensive coordinator to The Times: “They’re playing the kind of defense that I wanted and that Bud Carson teaches — aggressive, man up, getting after it, hustling all over the field.” A year before that, a high-school coach in Texas previewed a coming game for the local paper, The Baytown Sun, by saying, “We’re expecting them to use an eight-man front with their secondary manning up on us.”

Describing man-to-man defense as manning up on the opposing team is an easy linguistic step to make — so easy, in fact, that the same expression can be found in the early ’80s in Canadian and Australian football as well. But it was in the American variety that man up took on a more general idea of resilience in the face of adversity. The earliest example I’ve found of this extended use is from 1987, when the San Diego Chargers defensive tackle Mike Charles told The Union Tribune: “Right now, by the grace of God, we’re hanging by the skin of our teeth. Now we’ve got to man up and take care of ourselves.”

In recent years, man up and cowboy up have been joined by other “X up” macho-isms. Some evoke what might be politely termed testicular fortitude, like sack up and nut up, dated by the slang lexicographer Grant Barrett to 1994 and 1999, respectively. Last year’s movie “Zombieland” even showcased the provocative tagline “Nut up or shut up.” It’s not all about cartoonish masculinity, though. There’s still the notion of the stand-up guy, the mensch. In a nice mash-up of idioms, Rabbi Daniel Polish has interpreted the Torah story of Joseph and his brothers as a parable of — what else? — mensching up.


Full article and photo: http://www.nytimes.com/2010/09/05/magazine/05FOB-onlanguage-t.html

‘Jersey’ as a Nickname for New Jersey

Commenting on my column about beach lingo from the Jersey Shore and elsewhere, Gary Muldoon writes: “If someone from the Garden State comes from ‘Jersey,’ do those residents refer to someone from the Empire State as being from ‘York’?”

“Jersey” as a nickname for New Jersey is an oddity: there’s no corresponding clipping of “New York” to “York,” “New Hampshire” to “Hampshire,” and certainly not “New Mexico” to “Mexico.” Some have complained that the use of “Jersey” is “demeaning” to the state. David Lavery, author of “This Thing of Ours: Investigating the Sopranos,” agrees: plain old “Jersey,” with its “familiar and slangy” feel, “does not elicit respect.”

Those who dislike the “Jersey” label may be surprised to discover that it has a distinguished historical pedigree. I asked Maxine N. Lurie, professor of history at Seton Hall University and co-editor of the Encyclopedia of New Jersey, about the usage, and she traced its origins to the end of the 17th century, when there were actually two “Jerseys”: the provinces of East and West Jersey, dividing the territory of New Jersey along a diagonal. (New Jersey was named in honor of the proprietor of East Jersey, George Carteret, who hailed from the Island of Jersey.)

Because of this split, it was common to talk of “the Jerseys,” even after the provinces were united in 1702. Lurie suspects it was “easier to refer to the ‘Jerseys’ and people from ‘Jersey’ than to say ‘East New Jersey’ and ‘West New Jersey.'” The historical record bears this out: 18th-century documents are peppered with mentions of “the Jerseys,” and colonial accounts from 1735 and 1746 refer simply to “the province of Jersey.”

Another factor that helped “Jersey” shed the “New” was the proliferation of compounds with “Jersey” as the first element. The Oxford English Dictionary lists “Jerseyman” from 1679, “Jersey maid” from 1713, and “Jersey blues” (the name of the local militia regiment) from 1758. Inhabitants of New Jersey could also be called “Jerseys,” as in a 1756 letter from George Washington that read, “The Jerseys and New Yorkers, I do not remember what it is they give.”

Many of these colonial vestiges carried over into statehood, though instead of East and West Jersey, the state has more typically been divided into North and South. Standalone “Jersey” worked its way into place names, too. Notably, in 1804, The Jersey Company incorporated the City of Jersey, counterbalancing the City of New York on the other side of the Hudson River. A few decades later it was reincorporated as Jersey City.

Making compound forms with “Jersey” has certainly never let up: consider the Jersey Shore and the Jersey Devil, Jersey justice (the rough kind) and Jersey lightning (strong liquor, usually applejack), Jersey boys and Jersey girls. Jersey Joe Walcott won the world heavyweight boxing title in 1951, and concrete highway dividers have been called “Jersey barriers” since the late ’60s.

New York has, for the most part, missed out on all of this “New”-less naming. New Yorkers were sometimes called “Yorkers” back in the revolutionary era (Abigail Adams wrote that “a regiment of Yorkers refused to quit the city” in a 1776 letter), but the epithet never stuck. There wasn’t much compounding of “York” either, except for the “York shilling,” a bygone local currency that lingered until 1865, according to H.L. Mencken.

That only makes the “Jersey” legacy even more peculiar to the state. As a born-and-bred Jersey boy, I embrace this idiosyncratic heritage. Sure, “Jersey” (or, God forbid, “Joisey”) can be used derisively at times, but Jersey-bashing won’t suddenly disappear just because people use the state’s full name. Better to revel in the timeless lyric of Bruce Springsteen, “My machine she’s a dud, out stuck in the mud somewhere in the swamps of Jersey.”

Ben Zimmer, New York Times


Full article: http://www.nytimes.com/2010/08/29/magazine/29onlanguage.html


When is a leak not a leak? Last month’s release of the Afghan war logs — tens of thousands of classified documents unveiled by the Web site WikiLeaks — stretched the semantics of leak to a bursting point.

“The word ‘leak’ just doesn’t seem adequate for a data dump and security breach of this magnitude,” wrote Peter Feaver, a professor of political science at Duke University, in a blog post for Foreign Policy. “This is not so much a leak as a gusher.” Jack Shafer of Slate concurred: “To call the torrent of information about the Afghanistan war released by WikiLeaks a mere leak is to insult the gods of hydrodynamics.”

Our canonical images of leakiness involve liquid seeping out through small openings in something — a dripping faucet, a roof letting in rain, a boat with a cracked hull. Physical leaks can be stopped with a patch or some other reinforcement, as when the little Dutch boy plugged that faulty dike with his finger. But political leaks have strayed far from their literal foundation.

The metaphor of confidential information leaking out is, in fact, an ancient one. In “The Eunuch,” a comedy by the Roman playwright Terence from the second century B.C., one character says of his inability to keep a secret, “I am full of holes, I leak at every point” (“Plenus rimarum sum, hac atque illac perfluo”). In English, blabby talkers (stereotypically women) have been called leaky since the late 17th century. And the phrasal verb leak out has been used for the revelation of secrets since at least 1806, when the British journalist William Cobbett, an advocate for parliamentary reform, wrote, “When any valuable information leaks out, let us note it down.”

An early glimpse of how leak entered American political vocabulary comes in John C. Frémont’s 1887 memoirs, which recount a political event leading up to the Mexican-American War, when Secretary of State James Buchanan “discovered a leak in his department.” Buchanan needed to patch a leak from below, but by the end of World War II, leaks could just as likely come from above, in the form of information revealed to reporters by high-ranking officials who didn’t want to be identified. As James Reston wrote in a 1946 New York Times dispatch on postwar peace negotiations, “Governments are the only vessels that leak from the top.”

Reston’s observation rang true with Daniel Schorr, the veteran newsman who died last month at 93. Schorr was an old hand at the leaking game, having reported for CBS News on the damaging disclosures that befell the Nixon administration, from Daniel Ellsberg’s release of the Pentagon Papers to the Watergate secrets passed on to The Washington Post by Mark Felt, known at the time only as Deep Throat. (Nixon’s would-be leak-pluggers, the “plumbers,” only made matters worse, of course.)

Schorr lost his job at CBS over a leak, which he described in his autobiography, “Staying Tuned,” as “the most tumultuous experience of my career.” In 1976 he received a draft copy of a secret House Intelligence Committee report on illegal C.I.A. and F.B.I. activities, which he in turn leaked to The Village Voice. Schorr was revealed as The Voice’s source, but he refused on First Amendment grounds to divulge who gave him the report.

A keen observer and instigator of Washington leaks, Schorr was equally perceptive about the word leak itself. “Originally, when information ‘leaked,’ ” he was quoted as saying by William Safire in a 1982 On Language column, “it was thought of as an accidental seepage — a lost document, a chauffeur’s unwary anecdote, loose lips in the Pentagon. Today, when information ‘is leaked,’ it is a witting (if sometimes witless) action. One leaks (active) to float or sink an idea, aggrandize self (the ‘senior official on the secretary’s plane’) or derogate an opponent.”

It was astute of Schorr to spot the transformation of leak into an active, intransitive verb with the source of information as the subject. The usage isn’t entirely new — for example, an 1897 article in The Daily Argus News of Crawfordsville, Ind., referred to attempts to find “the man that ‘leaked’ about the blackballing” of Gov. James A. Mount. But in modern political parlance, it’s not necessary to say that someone “leaked something” or even “leaked about something”; the verb can stand alone. When private postings from the e-mail list JournoList got some unwelcome exposure in June, a headline on Politico read, “JournoList wonders who leaked.”

Schorr died two days before the mother of all leaks made the news, when The New York Times and other papers published reports based on the WikiLeaks data dump from Afghanistan. In its very name, WikiLeaks marries old-fashioned political leaking with Web 2.0 methods of sharing information in a collaborative, bottom-up wiki style. (The original “wiki,” even before Wikipedia, was WikiWikiWeb, named in 1995 by the computer scientist Ward Cunningham after the Hawaiian word for “fast” — inspired by Honolulu International Airport’s Wiki Wiki Shuttle.)

Do we need new terminology for leaking on such an immense scale? Perhaps we can take a cue from linguistic debates over BP’s notorious oil-well blowout in the Gulf of Mexico. After the catastrophic extent of April’s accident became apparent, puny words like spill and leak suddenly seemed inadequate to many commentators. Wendalyn Nichols, editor of the newsletter Copyediting, proposed rupture as an alternative label, evoking “a wound that can’t clot, that is not self-healing.” With WikiLeaks capable of uploading even more classified material from the Afghan theater, the rupture in our wartime intelligence apparatus may prove equally difficult to repair.

Ben Zimmer will answer one reader question every other week.


Full article and photo: http://www.nytimes.com/2010/08/22/magazine/22FOB-onlanguage-t.html

The Origins of ‘Relatable’

Jane E. Wohl writes: “I have noticed among my students a growing use of the word ‘relatable,’ as in ‘I like Sarah Palin. She’s relatable’ (meaning, ‘I can relate to her’). Do you know the origins of this usage? It turns the verb ‘to relate to’ into a very odd adjective.”

Applying the word relatable to someone or something you can relate to is a modern peculiarity, but it’s not wholly without precedent. The usage draws on a meaning of relate to (“to understand, to empathize with, to feel a connection with”) that is itself rather new, recorded by the Oxford English Dictionary only since 1947 — first showing up in the literature of social work and childhood education.

When this touchy-feely use of relate to took off in the ’60s, the adjective form relatable also made its appearance. (Before that, relatable more predictably meant “able to be related”: a relatable story is one that can be told.) A 1965 article in the education journal Theory Into Practice showcased the new meaning when it detailed research findings that “boys saw teachers as more directive, while girls saw them as more ‘relatable.’ ”

From educational circles relatability eventually spread to television programming, where the concept flourished. In 1981, the game-show host Bob Eubanks told The Washington Post that “The Newlywed Game” featured “relatable humor, the kind that takes place in every home.” The following year, The New York Times quoted a press release for the syndicated series “Couples”: “The real difficulties, conflicts and problems of married, dating, living-together and divorced couples rival any type of fictional format for personal and relatable drama.”

“Couples” was an early example of reality television, a genre that plays on viewers’ feelings of identification and empathy. But TV executives would come to demand relatability in their dramas and sitcoms as well. By 2006, a Times profile of the new CW network could joke about the creakiness of the cliché: “Someday, there will be an article about television in which no executive uses the word ‘relatable,’ industry jargon for something with which viewers are supposed to identify or connect. Alas, this is not that article.”

Now that relatable is creeping into more mainstream usage, especially among younger speakers, it’s raising eyebrows. Some people object to how the word is formed, because it differs from most adjectives ending in -able. Usually the suffix -able attaches to a transitive verb: an enjoyable movie is one you can enjoy, a catchable ball is one you can catch and so forth. But relate in this case is intransitive, and the object of “relate to (someone)” is locked in a prepositional phrase. Shouldn’t it be relate-to-able?

The very same objection was once made about a word that we now find utterly commonplace: reliable. Beginning in the mid-19th century, according to Merriam-Webster’s Dictionary of English Usage, language commentators criticized reliable as a vile abomination, because it should technically mean “able to be relied,” not “able to be relied on.” Thomas De Quincey attacked Samuel Coleridge for using the negative version, unreliable, suggesting unrelyuponable as a “more correct” alternative.

Defenders of reliable pointed out that we have no trouble omitting the prepositions in laughable (“able to be laughed at”), dependable (“able to be depended on”), or indispensable (“unable to be dispensed with”). Just as reliable seemed novel and strange more than a century ago, relatable doesn’t sound quite right to many observers today. Over time, though, the word may move from laughable to indispensable.

Ben Zimmer, New York Times


Full article: http://www.nytimes.com/2010/08/15/magazine/15onlanguage.html

How Should ‘Microphone’ Be Abbreviated?

In my recent column on the expression “rock the mic,” I wrote that “the M.C.’s of early hip-hop took the verb [rock] in a new direction, transforming the microphone (abbreviated in rap circles as mic, not mike) into an emblem of stylish display.” Laurence Reich e-mails regarding mic: “I must confess I have never seen that word before. I’ve only seen mike for that usage.” Ted Estersohn e-mails: “As far as I can tell mic the short form has always been spelled in audio and engineering circles with a ‘c,’ like an abbreviation and not like the boy’s name.”

The respondents on this one fell evenly into two camps: those like Reich who were unfamiliar with the shortening of microphone as mic and those like Estersohn who noted that mic is the prevailing form not just in rap circles but also among recording professionals more generally.

Mike came first, documented from the early days of radio. In the June 1923 issue of The Wireless Age, a photo caption of Samuel L. Rothafel (who was known as Roxy and who was broadcasting concert programs from New York’s Capitol Theater) reads, “When you hear Roxy talk about ‘Mike’ he means the microphone.” This suggests the abbreviation arose as a kind of nickname, playfully anthropomorphizing the microphone as Mike. But by 1926, when the pioneering broadcaster Graham McNamee published his book “You’re on the Air,” mike appeared in lowercase, not as a name. During broadcasts of baseball games, McNamee wrote, “the man at the ‘mike’ watches each play.”

Mic didn’t begin appearing in written works for another few decades, first recorded by the Oxford English Dictionary in Al Berkman’s 1961 “Singers’ Glossary of Show Business Jargon.” Berkman offered both mike and mic as possible clippings of microphone. Since then, mic has grown in popularity among those who work with recording equipment. The preference for mic likely stems from the way the abbreviation is rendered on the equipment itself: a microphone might be labeled “Mic No. 1,” for instance. And if you’re in the market for a microphone preamplifier, you’ll find it written as “mic preamp.”

It makes sense, then, that the early rappers of the South Bronx, intimately familiar with the sound systems that powered their performances, would take to the mic spelling. It also explains why The Associated Press Stylebook earlier this year reversed its advice to abbreviate microphone as mike. As the stylebook’s editors told the American Copy Editors Society in April, the A.P.’s broadcast division was unhappy with mike, and so the entry was revised to recommend mic instead.

Some of the copy editors voiced objections to the A.P.’s amended edict, on the grounds that mic could confuse readers who might be tempted to pronounce it as “mick.” The Washington Post’s Bill Walsh pressed the stylebook editors on the verb form: is a person mic’ed or miked? The A.P. style gurus allowed that the verb could be miked, even if the noun is mic.

The grumbling over mic emerges from its seeming violation of English pronunciation rules. Bicycle is abbreviated as bike, after all, not bic. But we do occasionally allow a mismatch between the spelling of an abbreviation and how it looks like it ought to be pronounced. Vegetable is shortened to veg, and Reginald to Reg, but the final g is not a “hard” one as in peg or leg. So let the musicians and broadcasters have their mic, but as for me, I still like mike.



Full article: http://www.nytimes.com/2010/08/01/magazine/01-onlanguage-t.html

‘Mad Men’-ese

As the fourth season of the AMC series “Mad Men” kicks off, some of the show’s fans are gearing up to play another round of a peculiar language game: trying to spot flaws in the meticulously constructed dialogue portraying 1960s Madison Avenue.

No show in American television history, it is safe to say, has ever put so much effort into maintaining historically appropriate ways of speaking — and no show has attracted so much scrutiny for its efforts. The three seasons that have been broadcast, set between 1960 and 1963, triggered endless arguments in online discussion forums, with entire threads devoted to potential anachronisms. Among recent small-screen forays into historical fiction, only “Deadwood,” which ran on HBO from 2004 to 2006, generated remotely comparable discussion about the authenticity of its language. (Commenters on that series tended to focus on whether its torrents of colorful, modern-sounding cursing were out of place for a South Dakota mining camp in the 1870s — which they almost certainly were.)

When I spoke recently with Matthew Weiner, the creator, executive producer and head writer of “Mad Men,” he readily admitted that goofs sneak through on his show. He said he still regrets allowing the character Joan to say “The medium is the message” in the first season, four years before Marshall McLuhan introduced the dictum in print. But he defends Joan’s year-end valedictory, “1960, I am so over you,” by pointing to the Cole Porter song “So in Love” from “Kiss Me, Kate.” Scholars of semantics might disagree, seeing a nuance between Porter’s use of the adverb so, which quantifies the extent to which the character is in love, and the later Generation X-style spin on the word as an intensifier meaning “extremely” or “completely” without any comparison of relative degree.

Other lines that have struck a discordant note with quibblers include Don’s “The window for this apology is closing” and Roger’s “I know you have to be on the same page as him.” Window in its metaphorical sense (as in a window of opportunity) and on the same page evidently date to the late ’70s. In a piece in The New Republic, the linguist John McWhorter complained that Peggy’s line “I’m in a very good place right now” is actually in a bad place, historically speaking. Even interjections can come under fire. When the character Sal reacts to the abrupt end of a screening of “Bye Bye Birdie” by exclaiming “awwa!” his falling-and-rising intonation has a 21st-century tinge, according to the linguist Neal Whitman.

Very often, however, fans will discern anachronisms that aren’t there — “un-achronisms,” as they were dubbed in the online forum Television Without Pity. Deborah Lipp, who runs the “Mad Men” fan blog Basket of Kisses with her sister Roberta, has dispelled fans’ concerns about the appearance of words like intense, lifestyle, self-worth, regroup and recon. She credits the hard work of the “Mad Men” brain trust with making sure that the true clunkers are few and far between.

To a large extent, Weiner and his staff members brought this festival of nitpickery on themselves through their own perfectionism. The show is famous for its loving attention to retro details, most notably in the set design (Weiner has been known to halt production over matters as subtle as the size of fruit in a bowl) and wardrobe (the actresses bravely suffer through the exquisite discomfort of vintage undergarments). Language naturally comes under the same microscope. To try to ensure accuracy, Weiner and his fellow writers sometimes take cues from the films and books of the era, but, as Weiner told me, those sources don’t necessarily provide the best window into genuine speech patterns. “You’re much better off if you can find a letter from your grandmother,” he said. He did acknowledge that Joan owes much of her sultry style to the writings of Helen Gurley Brown, the author of ’60s advice books like “Sex and the Single Girl” and “Sex and the Office.”

Even after a script is painstakingly developed, Weiner said, certain words and phrases can be flagged as questionable during the table read, when the cast runs through the dialogue for the first time. Whenever there is a question of usage, the research staff consults the Oxford English Dictionary, slang guides and online databases to determine whether an expression is documented from the era and could have been plausibly uttered. “When in doubt,” Weiner said, “I don’t use it.”

Despite his aversion to revealing anything about the new season, Weiner did let slip two examples of words from coming episodes that had to be researched thoroughly before they were deemed acceptable. One is humorless, a pedestrian adjective that is recorded back to the mid-19th century but nonetheless sounded “really modern” in the portion of dialogue where it appears. The other word is much more vivid — too vivid for print here, in fact, but suffice to say it’s a scatological slur for a person’s head. Though cursing on “Mad Men” isn’t as rampant as it was on “Deadwood” or “The Sopranos” (on which Weiner previously worked), it has its place in the show and promises to become more prominent as the characters move through the ever-liberalizing ’60s.

As the show progresses, new linguistic pitfalls await the writers. Weiner says he welcomes the fault-finding from fans, because he identifies himself as “one of the most nitpicky people in the world.” “I’m glad that we’re held to a high standard, and I’m glad that people get pleasure from picking it apart,” he said. “But I’ll tell you, it’s a battle for me to make sure it’s right.”



Full article and photo: http://www.nytimes.com/2010/07/25/magazine/25FOB-onlanguage-t.html

Is ‘One-Year Anniversary’ Redundant?

Neale Gifford writes: “One practice that annoys me is the use of ‘one-year anniversary’ or ‘five-year anniversary’ instead of ‘first’ or ‘fifth.’ Reason? Anniversary is derived from the Latin annus meaning ‘year.’ ” Ed Morman writes of “n-year anniversary,” “It’s not very mellifluous and it is, of course redundant. Help me bring back ‘nth anniversary.’ ”

Gifford and Morman aren’t alone in their irritation. A few years ago, when The San Francisco Chronicle ran a headline reading, “Four-year anniversary draws protests,” an irate reader left a profanity-riddled voicemail that unkindly insinuated what substance could be found between the ears of Phil Bronstein, then the editor of The Chronicle. (The same reader was even more offended by the appearance of “pilotless drone” in the newspaper, and his recorded rant became something of a Web sensation.)

The New York Times, too, is hardly immune to this redundancy. The June 28 edition, for instance, featured an article on Dylan Ratigan’s “one-year anniversary at MSNBC” and another stating that New York’s first gay pride parade commemorated “the one-year anniversary of the Stonewall uprising.” Philip Corbett, the newsroom’s style guru, has frequently complained about this usage in his “After Deadline” blog but laments that his reminders have “little effect.”

What has happened to the word anniversary? Even though the idea of yearly recurrence is built into the word etymologically, that idea has been clouded over centuries of use. And when an element of a word’s meaning becomes more opaque, redundancy is one method, however inelegant, that gets used to unmuddy the waters.

As the annual aspect has moved to the background of anniversary, the shift has opened the door for use of the term to mark the passing of shorter units of time. The 11th edition of Merriam-Webster’s Collegiate Dictionary, in its entry for the word, states that anniversary can refer more broadly to a date following a notable event “by a specified period of time measured in units other than years,” giving the example, “the 6-month anniversary of the accident.”

Linguists call this process “semantic bleaching”: the lessening of a word’s force through generalization. The bleaching of anniversary has been going on for quite a while, even if dictionaries are only now catching up. For more than a century, English speakers have been modifying anniversary with numbers of days, weeks or months. An article in the May 5, 1901, Atlanta Constitution described the Vanderbilt family’s Biltmore mansion in North Carolina, where little Cornelia Stuyvesant Vanderbilt “celebrated the three-months anniversary of her birth by planting out a tree on the estate.” A few years later, The San Jose Evening News reported on the unseemly nature of a widow marrying on “the third-month anniversary of her husband’s death.”

Anniversary has been pressed into service for nonannual commemorations in part because English has no other commonly used terms that can fill the gap. At various times since the 19th century, the monthly equivalent of anniversary has been dubbed a mensiversary, using the Latin root mensis for “month,” but this ad-hoc coinage has never caught on. (Members of the Facebook group “Make ‘Mensiversary’ a Word” continue to fight the good fight.) A more recent suggestion is the clunky English-Latin hybrid monthiversary.

With pressure on anniversary to expand its reach to subannual units, it’s no wonder that “nth anniversary” can somehow feel insufficient for traditional yearly celebrations, in need of the more explicit form, “n-year anniversary.” But if you don’t want to see the meaning of the word weakened any further, stick to the “nth” version and trust that others will have the good sense to discern anniversary’s annualness.



Full article: http://www.nytimes.com/2010/07/18/magazine/18onlanguage-anniversary.html

The Origins of ‘One-Off’

Donn Barclay e-mails: “I have been hearing and reading the phrase ‘one-off’ more and more lately. It seems to me that it is a serious bastardization of ‘one-of,’ as in short for ‘one of a kind.’ Frankly, it makes absolutely no sense at all.”

As William Safire observed in a 2007 On Language column, one-off meaning “something unique” is a British expression that has been creeping into American speech and writing in recent years. And as with other Briticisms that impinge on these shores (gone missing comes to mind), the idiomatic origins of one-off are mostly lost on American ears.

The off in one-off does not, in fact, stem from some corruption of the word of. Rather, this British usage of off typically appears with a number to indicate a quantity of items produced in some manufacturing process. The Oxford English Dictionary, Safire notes, takes this back to a 1934 quotation from the Proceedings of the Institute of British Foundrymen: “A splendid one-off pattern can be swept up in a very little time.” Other numbers can fit the bill, as in the O.E.D.’s 1973 example of an advertisement for “Kienzle printers, 6 off, surplus to manufacturing requirements.”

Because Americans by and large have never encountered this use of off preceded by a number, one-off ends up, as Barclay puts it, making “absolutely no sense at all.” But when idioms make no sense, we often try to invest them with a new kind of sense. This etymological impulse has led some outside of Britain to conclude that one-off ultimately comes from one of a kind by way of the shortening one of.

I first heard about this proposed origin for one-off in my capacity as a contributor to the Eggcorn Database, a repository for words and phrases that have been reshaped to give them new semantic footing. The word eggcorn is itself such a reshaping: it’s a misspelling of the word acorn that makes perfect sense if you think of an acorn as roughly egg-shaped. Since its founding in 2004 by Chris Waigl, the database has cataloged more than 600 creatively reworked expressions, from the mainstream (chaise lounge for chaise longue) to the peculiar (grow like top seed for grow like Topsy).

Arnold Zwicky, a fellow eggcorn collector and a linguist at Stanford University, reported in 2005 that a friend suggested adding one-off to the database, in the belief that it derived from one of (a kind). But it turns out that one-off is the original (as the historical record proves), and one-of is actually the eggcorn, a seeming “correction” by those who think that one-off must be an error. For instance, a transcript of a 2007 speech by Francis Fukuyama at the Brookings Institution refers to 9/11 as “a pretty lucky one-of event.”

Though some Americans might still find one-off annoyingly British-sounding, there’s no need to fear it as a solecism. It’s a well-manufactured word, even if it feels a little off-kilter.



Full article: http://www.nytimes.com/2010/07/04/magazine/04FOB-onlanguage-t.html


The spelling of English is a bizarre mishmash, no doubt about it. Why do we spell acclimation with an “i” in the middle but acclamation with an “a”? Why do we distinguish between carat, caret, carrot and karat? For those who feel strongly that something needs to be done, there’s no better place to vent some orthographic rage than the Scripps National Spelling Bee. The 2010 bee, held earlier this month, was no exception, as a handful of protesters from the American Literacy Council and the British-based Spelling Society picketed the Grand Hyatt in Washington, while inside young spellers braved such obscurities as paravane (an underwater mine remover) and ochidore (a shore crab).

When talk turns to the irrationality of English spelling conventions, a five-letter emblem of our language’s foolishness inevitably surfaces: ghoti. The Christian Science Monitor, reporting on the spelling-bee protesters, laid out the familiar story (while casting some doubt on its veracity): “The Irish playwright George Bernard Shaw is said to have joked that the word ‘fish’ could legitimately be spelled ‘ghoti,’ by using the ‘gh’ sound from ‘enough,’ the ‘o’ sound from ‘women’ and the ‘ti’ sound from ‘action.’ ”

Just one problem with the well-worn anecdote: there’s not a shred of evidence that Shaw, though a noted advocate for spelling reform, ever brought up ghoti. Scholars have searched high and low through Shaw’s writings and have never found him suggesting ghoti as a comical respelling of fish.

The true origins of ghoti go back to 1855, before Shaw was even born. In December of that year, the publisher Charles Ollier sent a letter to his good friend Leigh Hunt, a noted poet and literary critic. “My son William has hit upon a new method of spelling ‘Fish,’ ” Ollier wrote. You guessed it: good old ghoti. Little is known about William Ollier, who was 31 at the time his father wrote the letter. According to Charles E. Robinson, a professor of English at the University of Delaware who came across the ghoti letter during research on the Ollier family about 30 years ago, William was a journalist whose correspondence reveals a fascination with English etymology.

As a language fancier in mid-19th-century England, William Ollier would surely have come into contact with the strong current of spelling reform — championed by the likes of Isaac Pitman, now remembered for inventing a popular system of phonetic shorthand: what Pitman called “phonography.” In 1845, Pitman’s Phonographic Institution published “A Plea for Phonotypy and Phonography,” by Alexander J. Ellis, a call to arms that laid the groundwork for ghoti and other mockeries of English spelling. To make the case for reform, Ellis presented a number of absurd respellings, like turning scissors into schiesourrhce by combining parts of SCHism, sIEve, aS, honOUr, myRRH and sacrifiCE. (If you’re wondering about the last part, the word sacrifice has historically had a variant pronunciation ending in the “z” sound.)

Ellis thought scissors was a downright preposterous spelling of sizerz, and he went about calculating how many other ways the word could be rendered. At first he worked out 1,745,226 spellings for scissors, then adjusted the number upward to 58,366,440, before finally settling on a whopping 81,997,920 possibilities. Isaac Pitman and his brothers liked to use the scissors example when proselytizing for phonetic spelling, and the 58 million number even worked its way into “Ripley’s Believe It or Not!”

Don’t believe it. Ellis admitted that “the real number would not be quite so large,” since English spelling does not actually work by stitching together parts of words in Frankensteinian fashion. Ghoti falls down for the same reason, if you stop to think about it. Do we ever represent the “f” sound as gh at the beginning of a word or the “sh” sound as ti at the end of a word? And for that matter, is the vowel of fish ever spelled with an “o” in any word other than women? English spelling might be messy, but it does follow some rules.

Robinson suggested to me that William Ollier could have come up with ghoti in a parlor game of Ellis-inspired silly spellings. Victorians often amused themselves with genteel language games, so why not one involving the rejiggering of common words? Into the 20th century, other jokey respellings made the rounds, such as ghoughphtheightteeau for potato (that’s gh as in hiccough, ough as in though, phth as in phthisis, eigh as in neigh, tte as in gazette and eau as in beau).

Ghoti was elevated above these other spelling gags when it became attached to the illustrious name of Shaw — who, like Churchill and Twain, seems to attract free-floating anecdotes. If Shaw never said it, who was responsible for the attribution? I blame the philologist Mario Pei, who spread the tale in The Los Angeles Times in 1946 and then again in his widely read 1949 book, “The Story of Language.” Pei could have been confusing Shaw with another prominent British spelling reformer, the phonetician Daniel Jones (said to be one of the models for Shaw’s Henry Higgins in “Pygmalion” ), since Jones really did make use of the ghoti joke in a 1943 speech.

With Shaw’s supposed imprimatur, ghoti lingers with us. Jack Bovill, chairman of the Spelling Society, told me that despite its jocularity, ghoti is nonetheless “useful as an example of how illogical English spelling can be.” I beg to differ: if presented with ghoti, most people would simply pronounce it as goaty. You don’t have to be a spelling-bee champ to know that written English isn’t entirely a free-for-all.



Full article and photo: http://www.nytimes.com/2010/06/27/magazine/27FOB-onlanguage-t.html


In my recent column on cool, I wrote, “From Old English to the ages of Chaucer and Shakespeare all the way to the present, cool has been able to mean ‘dispassionate, calm, self-composed.’ Some of our latter-day cool expressions — ‘stay cool,’ ‘play it cool,’ ‘cool as a cucumber,’ ‘cool customer’ — play off this ancient connotation of implacability.” Catherine Harris e-mails: “I believe you meant to use imperturbability rather than implacability, describing the calm sense of cool. Someone who is implacable is relentless and unappeasable, not necessarily perpetually unruffled.”

If there’s one lesson I learned from many years of reading my illustrious predecessor William Safire, it’s to show humility when called out by the Gotcha Gang. In a 1984 column, he introduced the Gotcha Gang as “that shock troop of Lexicographic Irregulars who specialize in correcting other language mavens.” Safire later revealed the hidden benefit of publicizing Gotchas: “By ostentatiously wolfing down one slim slice of humble pie, I buy the license to take pops at everybody else for months without appearing to be a wiseguy.”

Let this be my first On Language mea culpa: Harris (along with fellow Gotcha-ers Brian Hoffman, Peter Glasgow and Wade Richardson) rightly questions my use of implacability, which does not quite fit in the context of cool. Imperturbability would indeed have been a better choice, as would the more modern unflappability.

I cop to the malaprop, though my mix-up was not as blatant or farcical as the eponymous Mrs. Malaprop’s constant misuse of language in Richard Brinsley Sheridan’s 1775 play “The Rivals” (e.g., “He is the very pineapple of politeness,” with pineapple standing in for pinnacle). I might have been influenced in my word choice by such cool words as placid and placate, but of course, implacability tends to imply just the opposite, an inability to be made calm.

In my modest defense, the “relentless, steadfast” shade of implacability can intersect with the unruffled nature of imperturbability, and in fact the adjectives implacable and imperturbable often appear side by side — or as linguists would say, they can easily “collocate” with each other. For instance, the newly published translation of Simone de Beauvoir’s “Second Sex” by Constance Borde and Sheila Malovany-Chevallier contains a line about “precise, imperturbable, implacable logic.”

The overlap in usage between the two words doesn’t absolve the linguistic error, however. In a vicious dissection of James Fenimore Cooper’s many “literary offenses,” Mark Twain offered some simple rules for writing as an antidote. My favorite is: “Use the right word, not its second cousin.” This advice hits close to home, since I work as executive producer of the Visual Thesaurus, which essentially creates interactive family trees of words with similar meanings. Implacability and imperturbability might have a passing family resemblance, but it’s best not to treat them as blood brothers.

Ben Zimmer, New York Times


Full article: http://www.nytimes.com/2010/06/20/magazine/20FOB-onlanguage-t.html


If you use Facebook much, you know the drill. You log on for your social-networking fix, and the site looks different from how you remembered it, either subtly or radically so. An endless stream of design changes can leave users feeling unsettled — or even irate, as has been the case with the latest uproar over Facebook’s privacy settings. Why all the behind-the-scenes tinkering? It’s the way of the Web these days: everything is iterating.

“In the tech industry, a company like Facebook likes to say that it ‘iterates,’ ” Caroline McCarthy explained in a recent article on the technology site CNET. “Old products are killed. New ones are rolled out one at a time, rather than bundled together in a huge annual relaunch. Experimental features emerge and disappear.”

Facebook’s founder, Mark Zuckerberg, underscored the company’s shape-shifting attitude in his sheepish response to the privacy brouhaha. “We’re trying to be innovative and iterative with our development,” Zuckerberg said at a press conference. “Today, Facebook is very different than it was when we first started.”

Facebook is hardly alone in exalting the verb iterate, along with the noun iteration and the adjective iterative. Developers of Web-based services big and small have jumped on the iterating bandwagon. Neeru Paharia, a founder of the online educational project Peer 2 Peer University, recently told The New York Times that the company’s mission is to “experiment and iterate.” Engineers at the search behemoth Google, meanwhile, live by the mantra “Launch Early and Iterate.”

This is exciting new terrain for a word that had been associated with little more than dull monotony. From the Latin verb iterare (“to repeat”) formed from iterum (“again”), iterate entered English as a participial adjective meaning “done again,” recorded by the Oxford English Dictionary from 1471. The verb came on the scene in the 1530s, more than a century after its repetitive comrade reiterate grew out of the Latin reiterare (sharing the sense of iterare). That chronology should put to rest the notion that reiterate is nothing more than a redundant form of iterate (a pet peeve for some), since the re- prefix could be used simply for emphasis in Late Latin and its European inheritors. The seemingly superfluous re- shows up in other English words, like reduplicate and redouble.

While reiterate took hold in common parlance with the meaning “to state something (over and over) again,” iterate and iteration retreated to more scholarly corners of the language. In mathematics, iterate came to be used for the application of a formula in repeated fashion, taking the result from one pass and feeding it back as the input for the next go-round. An iterative approach can result in ever-closer approximations of a solution to an equation, as accuracy improves with each step.
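
For readers who want to see that mathematical sense in action, here is a minimal Python sketch (my own illustration, not anything from the column) of the Babylonian square-root rule, in which each pass's output becomes the next pass's input:

```python
# The mathematical sense of "iterate": apply an update rule repeatedly,
# feeding each result back in as the next input. The Babylonian rule
# x -> (x + n/x) / 2 converges toward the square root of n.
def iterate_sqrt(n, x=1.0, steps=20):
    """Repeatedly apply the update rule; each pass refines the last."""
    for _ in range(steps):
        x = (x + n / x) / 2  # output of one pass is input to the next
    return x

print(round(iterate_sqrt(2), 6))  # 1.414214, an approximation of sqrt(2)
```

Each successive pass roughly doubles the number of correct digits, which is the "ever-closer approximations" behavior described above.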

Early computer programmers embraced iterate to refer to the repetitive processes that were their bread and butter, looping through a sequence of instructions with mechanical regularity to achieve a desired result. New technical shades of meaning began appearing in the late 1960s, as programmers extended iteration to broader computational problems. At I.B.M.’s Thomas J. Watson Research Center, software designers pioneered the model of “iterative enhancement,” in which a system is developed incrementally, taking advantage of lessons learned from earlier versions.

This gradual, evolutionary approach would later come to be known as “agile” software development, a term coined by the authors of “The Agile Manifesto” at a Utah ski resort in 2001. A similar philosophy of innovation can be found in other industries: Toyota, at least before its recent recall woes, was renowned for a commitment to kaizen, a Japanese word for “continuous improvement.” “Instead of trying to throw long touchdown passes,” James Surowiecki wrote in a 2008 article in The New Yorker, “Toyota moves down the field by means of short and steady gains.”

On the Internet, tiny attempts to improve a Web site’s design can be made with little risk, because failed experiments simply vanish into the cybervoid. Netflix became an online success story in large part by fine-tuning its site to meet the growing demands of DVD renters. In a 2006 company profile for User Interface Engineering, Joshua Porter said of Netflix, “They want to get even better, and for them that means iterate, iterate, iterate.” Google, the acknowledged master of iterative design, can sometimes take this tactic to absurd lengths, as when it tested 41 different shades of blue for its toolbar last year. (Google’s lead designer, Douglas Bowman, quit soon thereafter, explaining on his blog that he had “grown tired of debating such minuscule design decisions.”)

Now, being skilled at iterating has emerged as a prized trait among the digital set. As the curiously named video-game designer American McGee told GamePro magazine last month, “There’s a lot to be said for smaller development teams, shorter schedules and a higher degree of innovation driven by a willingness to iterate.” No object for the verb is necessary, but it is often followed by the preposition on (as in “to iterate on a previous version”). A recent job listing for a software-development engineer at Amazon singled out the “ability to iterate on an idea” as a critical qualification.

In a landmark 1980 paper, the computer scientists Bill Buxton and Richard Sniderman summed up the iterative approach to design as “Keep trying until you get it right,” acknowledging that the key question was “How do you know when you have got it ‘right’?” The schemers behind Facebook may be rolling out iterations like crazy, but only time will tell if they’re iterating their way into oblivion.

Ben Zimmer will answer one reader question every other week.


Full article and photo: http://www.nytimes.com/2010/06/13/magazine/13FOB-onlanguage-t.html



The Times Literary Supplement, the erudite British weekly, isn’t the first place you would expect to find an outbreak of cool. But for a recent stretch of a few months, its letters page was home to a protracted debate over exactly how cool got cool.

It all started in January, when Toby Lichtig reviewed “Journey by Moonlight,” a 1937 novel by the Hungarian writer Antal Szerb that has recently been translated into English by Len Rix. Lichtig gave a thumbs up to Rix’s rendering, but he complained about the text’s occasional anachronisms, particularly the use of cool “in its contemporary sense” — that is, in the “stylish” or “admirable” meaning popularized by the cool cats and chicks of the postwar era and exemplified by the all-purpose expression of appreciation or approval, “That’s cool.”

A parade of nine T.L.S. readers questioned how modern the “contemporary sense” of cool actually is, pulling out 19th- and early-20th-century quotations from writers as diverse as Wilkie Collins and T. E. Lawrence to support the idea that our current understanding of cool is not so new after all. E. D. Hirsch Jr., the American author of “Cultural Literacy,” even chipped in with a line from Abraham Lincoln (“That is cool”). The whole discussion, unfortunately, drifted into a muddle of anecdotes without any firm grip on the semantics of cool.

The letter writers would have been well served to consult some cool lexicography, particularly the thorough treatment of the word in Jonathan Lighter’s Historical Dictionary of American Slang or the even more comprehensive entry for cool in the online Oxford English Dictionary. What the dictionaries tell us is that some shades of cool are quite old indeed. Already by the time of “Beowulf,” a millennium ago, the original low-temperature meaning of cool had veered into the realm of human emotion — or rather the lack thereof. From Old English to the ages of Chaucer and Shakespeare all the way to the present, cool has been able to mean “dispassionate, calm, self-composed.” Some of our latter-day cool expressions — “stay cool,” “play it cool,” “cool as a cucumber,” “cool customer” — play off this ancient connotation of imperturbability.

By the early 18th century, emotional coolness had branched off in another direction: “assured and unabashed where diffidence and hesitation would be expected,” as the O.E.D. has it. This impudent style of cool — no longer in common usage — is the one that turns up in the examples from Abraham Lincoln and Wilkie Collins given by the T.L.S. readers. Lincoln’s line, “That is cool,” from his 1860 speech at Cooper Union, was a response to the audacity of secessionist demands. Collins, likewise, has a character in his 1868 novel, “The Moonstone,” say, “Cool!” when presented with an insolent request. In both cases, cool was used disapprovingly, quite distinct from later, more positive uses.

Those early instances of cool are easy enough to explain, but what of the intriguing contribution to the T.L.S. colloquy from Allan Peskin, a biographer of President James A. Garfield? Peskin found an 1881 letter by Garfield’s teenage daughter Mollie to a friend, telling of her crush on her father’s private secretary, Joseph Stanley-Brown. “Isn’t he cool!” Mollie gushed in the letter. The “audaciously impudent” sense of cool wouldn’t seem to work here, since, as Peskin points out, Mollie went on to marry Stanley-Brown when she came of age. Could Mollie have been ahead of her time, already using cool to mean “sophisticated, stylish” or “admirable, excellent”?

Though it would be indubitably cool to find a hidden connection between schoolgirl talk of the 1880s and later hipster slang, my best guess is that Mollie was describing her future husband with the older “cool, calm and collected” nuance. “As a private secretary,” Peskin told me when I asked about Mollie’s letter, “Stanley-Brown demonstrated the customary diffidence that was expected of someone in his position.” Still, Peskin said he finds it difficult to believe that a teenage girl would be infatuated with a man for being dispassionate.

Sad to say, the historical evidence isn’t on the side of Mollie as a proto-cool chick. Even in African-American usage, where the later slang developments of cool would percolate, nothing definitive has been found to establish it as a general term of approval before the 20th century. An 1884 article by J. A. Harrison on “Negro English” includes “Dat’s cool!” in a list of undefined interjections, but there’s no way of knowing if the exclamation was merely a comment on a person’s assuredness or audacity, fitting in with one of the earlier meanings.

A half century later, cool was still by no means widespread as a mark of admiration among African-Americans. Zora Neale Hurston, a great chronicler of black speech, used the expression “whut make it so cool” several times in her writings, both fictional and ethnographic, beginning in 1933. But cool is conspicuously absent from lexicons of “jive slang” compiled in the late ’30s by the bandleader Cab Calloway and the New York Amsterdam News editor Dan Burley.

What we think of as modern cool (“Cool, man!”) does not come on the scene in a serious way until the early 1940s, in jazz circles — with the credit often given to the hipper-than-hip saxophonist Lester Young. It would take another decade for the slang word to hit the American mainstream, taking off among white teeny-boppers circa 1952. From our current vantage point, it’s easy to read older examples of cool as variations on the now-entrenched colloquial use. But for lovers of linguistic verisimilitude, that’s just uncool.

Ben Zimmer, New York Times


Full article and photo: http://www.nytimes.com/2010/05/30/magazine/30FOB-onlanguage-t.html


Lynne Geyser writes: “My son and I are in disagreement concerning the use of the word revert. The only usage I find acceptable is ‘to return to a previous state.’ He uses it (and claims that his Bahamian/English friends use it) to mean ‘to get back to someone.’ That is, instead of saying, ‘I will get back to you with that information or with respect to that issue,’ they say, ‘I will revert to you.’ Can you shed some light on this?”

Revert, as the major American and British dictionaries have it, does indeed primarily refer to a return to a former condition, belief or practice. For instance, George Vecsey, a sports columnist for The New York Times, recently wrote of the woeful New Jersey Nets, “Jason Kidd, Kenyon Martin and Richard Jefferson reached the finals in 2002 and 2003, but the Nets have since reverted to their haunted roots.”

Secondary senses of the verb include the legal one, denoting the return of something to its previous owner, as in the contractual boilerplate, “All rights revert to the author after publishing.” In evolutionary biology, it can signal an atavistic throwback to an ancestral type. To someone familiar with only these traditional meanings of the word, “I will revert to you” might sound like a bizarrely personal take on either property law or genetics.

It turns out that unbeknownst to most dictionaries, revert has been leading another life in several varieties of world English, notably the kind spoken on the Indian subcontinent. The usage has finally garnered the attention of the Oxford Advanced Learner’s Dictionary, which amended the definition of revert for its newly published eighth edition to include the meaning “to reply.” Marked in the OALD as “Indian English,” the use of the word is exemplified by the sentences: “Excellent openings – kindly revert with your updated CV,” and “We request you to kindly revert back if you have any further requirements.”

(Sticklers who are not already up in arms about this change in meaning will surely bristle at the redundancy of the second sentence: why revert back when you can simply revert? The same criticism can be leveled at reply back, with the superfluous addition of back resulting in “pleonasm,” or the use of more words than are strictly necessary.)

As Alison Waters, a lexicographer at Oxford University Press, told The Indian Express, revert in the sense of “reply” is one of eight contributions from Indian English included in the latest batch of OALD additions. It has spread beyond India, however, cropping up in the English of Singapore, Malaysia, Hong Kong and elsewhere. In these countries, the usage has occasionally been deemed improper by language authorities. Singapore’s Speak Good English Movement, for example, labels it “a mistake” that should be avoided in official correspondence.

Given the established use of revert in several Anglophone countries, often appearing in formal letter writing, it would be unfair to treat the “reply” meaning as simply erroneous. Paul Brians, an emeritus professor of English at Washington State University, previously catalogued revert in his online compendium of “Common Errors in English Usage.” Alerted to its prevalence in South Asia, Brians recently revised the entry, while still recommending that “it is best to stick with ‘reply’ when dealing with non-South Asian correspondents.” This is sound advice for now, but if Geyser’s son and his friends are any indication, revert may be in the midst of a global shift from which there is no turning back.

Ben Zimmer, New York Times


Full article: http://www.nytimes.com/2010/06/06/magazine/06FOB-onlanguage-t.html


W.F. Young asks: “Long ago I developed the expectation that on encountering the word fraught, I’d find it associated with a prepositional phrase, ‘with [something].’  Now, in old age, I find that expectation dashed, often.  Can you say what it was based on and anything about when, how or by whom it was undercut?”

Here we have a case of a very old word undergoing a rapid shift in contemporary usage. In Middle English, fraught (an etymological cousin of freight) was a verb meaning “to load (a ship),” and the identical form could serve as a past participle meaning “laden (with).” While the verb dropped out of the language almost entirely, the past participle stuck around, typically followed by “with” and an object — often a burden, whether real or figurative.

Fraught as a standalone adjective meaning “distressed, anxious, tense,” without an accompanying prepositional phrase, is a 20th-century innovation. When the word cropped up on William Safire’s radar in 2006, he offered a line from “King Lear” as a putative early example: Goneril tells her father to “make use of that good wisdom, whereof I know you are fraught.” But Shakespeare did use fraught with a preposition, whereof, and an object, wisdom, so it is in fact very much in line with the usage of the era. Lear was surely in a distressed emotional state, but that wasn’t what his daughter was driving at.

By the 19th century, the metaphorical extension of the word had developed a new twist. Instead of the traditional phrasing, fraught with followed by an object (something usually unpleasant or unfortunate), the object could appear before fraught in a hyphenated compound, such as danger-fraught, pain-fraught or war-fraught. Thus if a moment was fraught with emotion, it could just as well have been described as emotion-fraught or, in time, as emotionally fraught, with the adverb signaling the implied object.

The first glimmers of fraught without even a hint of an object start appearing in the 1920s and ’30s. The earliest example I’ve found so far comes from a 1925 serialized story by Henry Leyford Gates about a flapper named Joanna. In one installment Gates writes, “It was Joanna who at last broke the fraught silence.” The lyrical phrase fraught silence, perhaps evoking pregnant pause, shows up again in books from 1934, 1946 and 1958. Another early use is in George O’Neil’s 1931 novel about the poet John Keats, “Special Hunger”: “For Keats this was a singularly fraught circumstance.” Circumstances, along with anxiety-ridden situations, issues and relationships, would soon become familiar companions for fraught.

Standalone fraught picked up steam in the 1960s, attracting the notice of dictionaries and usage guides, but the last couple of decades have seen an even stronger uptick. In the texts collected in the Corpus of Contemporary American English from 1990 to 1994, only about 9 percent of the instances of fraught do not take the preposition with. From 2005 to 2009, however, the rate jumps to a whopping 30 percent. The usage has become a journalistic commonplace, as in the recent New York Times headlines, “For New Stadium, a Fraught Coin Flip” and “Opera Companies’ Fraught Seasons.” No doubt about it, we’re living in fraught times.

Ben Zimmer will answer one reader question every other week.


Full article: http://www.nytimes.com/2010/05/23/magazine/23FOB-onlanguage-t.html

Corporate Etymologies

When Keds kicked off a retro advertising campaign in March as “the original sneaker,” the venerable brand tripped over its own laces. Claiming that “Keds are the first shoes to be called sneakers,” the company’s Web site provided a seemingly authoritative origin story: “The term, coined in 1917 by Henry Nelson McKinney, an agent for the advertising firm N. W. Ayer & Son, refers to their soft, noiseless rubber soles, which allow the wearer to ‘sneak’ up on unsuspecting friends or family.”

Reporting on the marketing campaign for The New York Times, Andrew Adam Newman decided to check out the historical record. With the help of the lexicographer Grant Barrett, Newman determined that sneakers had been sneaking around since at least 1887, when The Boston Journal of Education observed, “It is only the harassed schoolmaster who can fully appreciate the pertinency of the name boys give to tennis shoes — sneakers.” Presented with the evidence, Keds toned down its marketing copy: “Keds created an American classic,” went the diluted version.

Barrett unearthed the early mention of sneakers by searching The Times’s digital archive to find a column of “Crisp Sayings” quoting the Boston source. A similar trawl through back issues of The Boston Globe turns up the word sneakers in tennis-shoe ads from the Jordan Marsh department store starting in the summer of 1889. But it isn’t even necessary to hit the newspaper databases to debunk the Keds anecdote: the Oxford English Dictionary has long had an entry for sneaker with citations for the shoe sense back to 1895, when it appeared in the Funk & Wagnalls Standard Dictionary.

The unfolding Keds drama, from the company’s bold coinage claim to its swift refutation, is depressingly familiar for word watchers. All too often, a business stakes its claim to a common term based solely on romantic corporate mythmaking. The lore travels down from one generation of employees to the next, accepted along with other tales of illustrious company founders. Think of these just-so stories as exercises in “etymythology,” to use a clever term from the Yale linguist Laurence R. Horn.

Corporate word myths are rarely obvious fabrications, though out-and-out whoppers are not unheard of. In 1992, the information-security company Rainbow Technologies ran a series of tongue-in-cheek ads in computing magazines, giving the origin of the name of the dongle, a device that plugs into a computer to allow the use of protected software. The explanation that the gadget was named after its supposed inventor, Don Gall, was so egregiously false that the company happily owned up to it as a marketing ploy when pressed by Eric S. Raymond, who maintains the Jargon File, an online lexicon of hacker slang.

More typically, as Keds did with sneakers, a company locates an ostensible coinage in a pioneering ad campaign. Histories of the Haggar clothing company assert that in the 1940s, the Haggars, working with the adman Morris Hite, came up with the term slacks, so called because they were to be worn during leisurely “slack time.” Too bad the O.E.D. dates slacks, meaning “loosely cut trousers for informal wear,” all the way to 1824.

Names for sweet foodstuffs seem to attract fanciful factoids like flies to honey. Barry Popik, an inveterate corrector of word misinformation, points to the legend of Cracker Jack. When the popcorn and peanut confection was perfected in 1896, the story goes, an enthusiastic salesman yelled, “That’s a crackerjack!” — at once naming the product and coining the phrase. But by then, crackerjack was already in wide circulation to refer to fast horses, skillful baseball players or anything of superior quality.

Dictionaries will tell you that jimmies, the word for candy sprinkles, is of unknown origin, but that hasn’t stopped the Just Born candy company from peddling its own history of the term. James Bartholomew, it’s claimed, operated the first machine for making Just Born’s sprinkles, circa 1930, and jimmies were named in his honor. David Wilton, author of the excellent book “Word Myths,” says there’s no credible evidence of this. Wilton also dispels the widespread but undocumented notion that jimmies have a racist past, born out of some association of chocolate sprinkles with Jim Crow. A better guess is that the word originated as a diminutive of jim-jams, a term for “little doodads” that goes back to the 16th century.

The chocolate giant Hershey has mythologized its early candy-making machinery in the back story it tells for its teardrop-shaped Kisses. “While it’s not known exactly how Kisses got their name,” Hershey’s Web site coyly states, “it is a popular theory that the candy was named for the sound or motion of the chocolate being deposited during the manufacturing process.” Samira Kawash, an independent scholar working on a cultural history of candy, recently dismantled this “popular theory” on her blog, Candy Professor. When Hershey first rolled out its Kisses in 1907, Kawash observes, “a candy ‘kiss’ was just another name for a small bite-size candy, typically something with a softer texture.”

By promoting the “sound of the machine” origin for the once-generic kisses, Hershey is engaging in what Kawash calls “strategic corporate forgetting”: “they invent an original story for marketing purposes to make it seem unique to their candy.” Notably, Hershey’s historical whitewash took shape in the late ’90s, just about when the company’s lawyers were beginning an ultimately successful battle to trademark kisses. They didn’t use the story in their legal arguments, but it played right into their efforts to associate kisses uniquely with the Hershey brand. When a company is trying to make its product iconic in the minds of consumers, it doesn’t hurt to inject a pleasant etymological tidbit, no matter how easy it is to disprove.

Ben Zimmer, New York Times


Full article and photo: http://www.nytimes.com/2010/05/02/magazine/02FOB-onlanguage-t.html


“Wellness,” intoned Dan Rather in November 1979, introducing a “60 Minutes” segment on a new health movement known by that name. “There’s a word you don’t hear every day.”

More than three decades later, wellness is, in fact, a word that Americans might hear every day, or close to it. You can sign up for your company’s employee-wellness program, relax in a wellness spa treatment or even plan some “wellness tourism” for your next vacation. Your cat or dog can get in on the action, too, since the W-word has been pressed into service as a brand of all-natural pet food.

Wellness is here to stay, despite misgivings over the years that it isn’t such a well-formed word. How did it take over, and whatever happened to good old health?

Though the Oxford English Dictionary traces wellness (meaning the opposite of illness) to the 1650s, the story of the wellness movement really begins in the 1950s. New approaches to healthful living were emerging then, inspired in part by the preamble to the World Health Organization’s 1948 constitution: “Health is a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity.” Halbert L. Dunn, chief of the National Office of Vital Statistics, was looking for new terminology to convey the positive aspects of health that people could achieve, beyond simply avoiding sickness. In a series of papers and lectures in the late ’50s, Dunn sketched out his concept of “high-level wellness,” defined as “an integrated method of functioning, which is oriented toward maximizing the potential of which the individual is capable.”

Dunn collected his presentations in a 1961 book, “High-Level Wellness,” but it would take another decade for his work to resonate with a committed group of followers. An early acolyte was John W. Travis, who picked up Dunn’s book in 1972 from a $2 clearance table at the bookstore of Johns Hopkins Medical School, where he was enrolled in a preventive-medicine residency program. Travis didn’t think much of Dunn’s buzzword at first. “I thought the word wellness was stupid, and it would never catch on,” he recently told me. But Travis was enamored with the way Dunn presented his ideas, and he put those ideas into action — and reluctantly embraced the word itself — when he opened the Wellness Resource Center in Mill Valley, Calif., in November 1975. The center promoted self-directed approaches to well-being as an alternative to the traditional illness-oriented care of physicians.

Wellness was so unfamiliar at the time, Travis recalls, that he constantly had to spell the word when using it over the phone. It soon got national attention when a young doctoral student named Donald B. Ardell profiled Travis’s center in the April 1976 issue of Prevention magazine. In a sidebar, Prevention’s editor, Robert Rodale, welcomed the “exciting field of wellness enhancement,” promising that the magazine would “examine all aspects of wellness promotion.” Even greater exposure came with Rather’s “60 Minutes” piece, which focused on Travis and the Mill Valley center.

Travis and Ardell found a kindred spirit in Bill Hettler, a staff physician at the University of Wisconsin-Stevens Point. Influenced by their work, Hettler founded the annual National Wellness Conference at Stevens Point, now in its 35th year. The conference lent valuable academic prestige to the wellness movement. It also caught the attention of Tom Dickey, who was working with the New York publisher Rodney Friedman in the early 1980s to set up a monthly newsletter on health, based at the University of California, Berkeley. Friedman wanted the publication to compete with the Harvard Medical School Health Letter, and Dickey suggested using wellness in the title as a contrast. In 1984, the Berkeley Wellness Letter was born.

Joyce Lashof, then the dean of Berkeley’s School of Public Health, remembers that wellness was initially a tough sell at the school. Not much was known on campus about the earlier work of Travis and his fellow wellness advocates, but Lashof’s colleagues associated the term wellness with the “flakiness” of Mill Valley and surrounding Marin County. The NBC newsman Edwin Newman had televised an exposé of Marin County’s hedonistic lifestyle, which notoriously opened with a woman getting a peacock-feather massage from two nude men. The Berkeley Wellness Letter, however, managed to avoid such unseemly associations by publishing serious, evidence-based articles on health promotion, while debunking many of the holistic health fads of the day.

Though the Berkeley newsletter, which at its peak reached a million subscribers, did much to establish the credibility of wellness in the ’80s, language pundits continued to raise their eyebrows. Newman, who also moonlighted as a usage commentator, belittled wellness, calling it an example of “bloating” in the language. In 1988, a survey of the Usage Panel for the American Heritage Dictionary of the English Language found that a whopping 68 percent of panelists disapproved of the word when used to refer to employee-wellness programs and the like, and a critical note was included in the dictionary’s 1992 edition.

But carping over wellness faded away in the ’90s as the term gained a foothold in everyday use. The American Heritage Dictionary silently dropped the usage note on wellness in its fourth edition in 2000, a decision that its supervising editor, Steve Kleinedler, chalks up to the growing prevalence of wellness programs in the workplace and beyond. A word that once sounded strange and unnecessary, even to its original boosters, has become tacitly accepted as part of our lexicon of health. Well, well, well.

Reader Question
Erskine Kelly asks: “What is the difference between preventive and preventative? Like use versus utilize, my ears turn off when I hear the use of utilize or preventative by some so-called expert.”

There is, in fact, no difference in meaning between preventive and preventative. Some, including William Safire in a 1993 On Language column, have suggested using preventive as an adjective and preventative as a noun, but both forms of the word have alternated freely as adjective and noun since they entered the language in the 17th century. Despite their introduction into English at roughly the same time (the Oxford English Dictionary dates preventive back to 1626 and preventative to 1655), preventive has won out as the preferred version.

Many usage guides have disparaged preventative as improper, because it doesn’t accord with classical roots: the Latin past participle stem praevent- adds -ion to form prevention and -ive to form preventive. Words ending in -ative ought to have the -at- in the root already: demonstrat- begets demonstration and demonstrative, narrat- begets narration and narrative, and so forth. Since we don’t have preventation, then preventative is equally misbegotten, by this way of thinking.

English word formation isn’t always that tidy, however. The -ative ending often shows up even when there isn’t a corresponding noun ending in -ation: we have authoritative without authoritation, qualitative without qualitation and talkative without talkation. Talk, of course, isn’t even from Latin, but the friendly -ative suffix clung to it anyway, by analogy with other verbs that form adjectives by appending -ative, like affirm and affirmative, or represent and representative. Preventative got created from prevent by this same analogical pattern.

Complaints about preventative go back to the late 18th century. The spelling reformer James Elphinston wrote in 1787 that preventative could be heard among Londoners in unguarded speech, along with other disapproved pronunciations like umberella and mischievious that sneak in an extra syllable (a process that linguists call “epenthesis”). A 1795 review of the Earl of Lauderdale’s “Letters to the Peers of Scotland” criticized the appearance of preventative in the text, declaring that it was “not English.” Similarly, Francis Barnett took Andrew Reed’s “No Fiction” to task in 1823 for including the word: “In the English language there is no such word as preventative, preventive there is.”

Preventative soon drew enough attention that The New York Mirror devoted a whole column to it in the newspaper’s March 6, 1824, edition. “The conversion of preventive into preventative is an error too common,” wrote the anonymous scribe. “Some fall into it from ignorance, and others from inadvertence.” The writer attributed its use to “a disposition in people to spell words with more letters than belong to them; or to insert a syllable or syllables, where addition, so far from being advisable or requisite, proves injurious.” The column concluded with a call to action: “Let those, then, who from carelessness or any other cause, have been in the habit of using preventative, make it henceforth an invariable rule, whether in writing or in utterance, to prefer the proper and unexceptionable term preventive.”

That call went unheeded, however, as preventative has managed to survive as a variant, albeit a much less popular one. A search on the Corpus of Contemporary American English finds preventive beating preventative by a ratio of about 6 to 1 in current written usage, across both academic and nonacademic texts. You can still get away with using preventative in standard English, but that extraneous syllable won’t gain you anything, other than disdain from the sticklers.

Ben Zimmer will answer one reader question every other week.


Full article and photo: http://www.nytimes.com/2010/04/18/magazine/18FOB-onlanguage-t.html


Two of the most basic words in the English language, yes and no, are locked in a constant struggle, embodying abstract forces of agreement and opposition, positivity and negativity, acceptance and denial. Just look at the recent Congressional wrangling over health care reform, where the words have come to stand for much more than simply the up or down votes that legislators may cast. Democrats seeking a final compromise over health care legislation have talked optimistically about getting to yes. “I just wish and hope some of my colleagues will be willing to help us get to yes on this,” Senator Mark Warner of Virginia said. And a House leadership aide told The Huffington Post, “We do have an environment where people can now get to yes.”

Getting to yes has become a creaky cliché in political and business circles thanks to a best-selling negotiation manual with that title first published in 1981. The authors of “Getting to Yes,” Roger Fisher and William Ury of the Harvard Negotiation Project, outlined the best strategies for reaching a settlement by identifying “options for mutual gain.” Fisher had been experimenting with the word yes for quite a while. Back in 1969, he argued in the book “International Conflict for Beginners” that the key to getting the other side of the bargaining table to agree is to present them with a “yesable proposition.” It is no surprise that supporters of Barack (Yes We Can) Obama would draw on the conciliatory rhetoric of getting to yes.

Blocking the road to yes on health care and other initiatives, however, is the Republican Congressional minority, which has been painted by Democrats as “the party of no.” It’s been a common refrain since the early months of the Obama administration, when the Democratic National Committee introduced a “Party of No” clock on its Web site, tallying the time Republicans spent criticizing Obama’s budget plan without offering their own alternative.

Leading Republicans, for their part, have either disavowed the “party of no” label or have found canny ways to reclaim it. Mitt Romney told the Conservative Political Action Conference to embrace no proudly, explaining, “it is right and praiseworthy to say no to bad things.” Representative Pete Sessions of Texas, meanwhile, has offered a punny twist: “We’re the party of know: k-n-o-w.”

When Democrats use “the party of no” to criticize Republican obstructionism, they are unwittingly echoing Ronald Reagan. The expression has historical resonances, like Truman’s election-year complaints in 1948 about “the do-nothing Congress” or Spiro Agnew’s memorable line (penned in 1970 by his speechwriter, William Safire) excoriating “the nattering nabobs of negativism.”

But Reagan was the first to paint the opposing party so forcefully with the brush of “no.” At the welcoming rally of the 1988 Republican National Convention, Reagan blasted the opposition as “the party of no,” while appealing to “rank-and-file Democrats” who had been alienated by the “strident liberalism and negativism” of party leaders. “The party of ‘yes’ has become the party of ‘no,’ ” Reagan, himself a former Democrat, said. “The liberal leadership of your party has been saying no to you, and now it’s time for you to start saying no to them.” The boisterous crowd reacted to Reagan by chanting the antidrug slogan his wife, Nancy, did so much to popularize: “Just say no!”

James Carville, who knew a powerful political catchphrase when he heard one, turned “the party of no” back on Republicans in the spring of 1993 when President Clinton was working to pass a stimulus bill. Clinton himself went further the following year, calling the Republicans the party of “no, no, no, no, no, no, no, no, no,” in an angry speech at a Boston fund-raiser. Other leading Democrats like Al Gore and Dick Gephardt used the phrase to disparage Republicans in the Clinton era.

At the beginning of George W. Bush’s second term, “the party of no” became a Republican talking point again, spearheaded by Tom DeLay, then the House majority leader. After Bush’s 2005 State of the Union address, DeLay said of the Democrats: “They’ve become the party of ‘no.’ No ideas, no solutions, no agenda. You know, ‘Just say no’ is not an agenda.” Now, five years later, the no wheel has turned yet again, proving that Reagan’s original phrase is endlessly adaptable to changing political circumstances.

Yes and no can accrue symbolic heft through what linguists call “zero nominalization,” whereby a noun is created from some other part of speech without adding a typical suffix like -ness or -ation. Nouny versions of yes and no have enjoyed quite a ride from the political class, but they also get plenty of play in pop culture. On the positive side of the ledger, Wendy MacLeod’s play and subsequent movie adaptation “The House of Yes” tells the story of an entitled rich girl who will not be denied. Maria Dahvana Headley’s 2006 memoir of a year spent accepting dates from any man who asked her out is titled, naturally enough, “The Year of Yes.”

But the power of no is even more primal, perhaps because it is so often among the first words that English speakers learn as children. The poet James Tate imagines it as a territory of sorts, writing, “I went out of myself into no, into nowhere.” In slangy vernacular, no can turn into a material substance: the teenage title character in the 2007 movie “Juno” protests, “That’s a big, fat sack of no!” Bauer-Griffin Online, a paparazzi photo blog, critiques celebrities with snarky headlines like “Kelly Preston Is a Bucket of ‘No’ ” or “Phoebe Price, Pile of ‘No.’ ” In our culture of negativity, all too often the noes have it.

Reader Question
Barbara Orris asks: “The word avuncular means ‘of, pertaining to, or characteristic of an uncle.’ Is there a word that means pertaining to or characteristic of an aunt?”

A handful of adjectives related to aunts have been recorded in English, though none are as common as the male counterpart, avuncular. The most straightforward is auntly, modeled after motherly, fatherly, sisterly and brotherly, with scattered usage back to the 1830s. (Uncly is even rarer.) Auntish and auntlike are other alternatives that attach familiar suffixes to aunt.

If we take the Latinate approach à la avuncular, then the Oxford English Dictionary provides materteral, attested since 1823 in humorous use, meaning “characteristic or typical of an aunt.” Classics majors would be quick to point out that the Latin roots of these words only cover maternal siblings: avunculus means “mother’s brother” and matertera means “mother’s sister.” On the paternal side, there’s patruus for “father’s brother” and amita for “father’s sister.”

In 1982, when William Safire asked his loyal Lexicographic Irregulars for “a word to fill this black hole in our vocabulary,” creative suggestions included auntique, tantular, tantoid and tantative. One of his correspondents, Arianna Stassinopoulos (now better known as Arianna Huffington), was drawn to the Latin root amita, noting that “so far no English word derived from it exists.” She added, “There is, however, no reason why we cannot here and now invent it and proclaim that all aunts on the father’s side must from now on end their letters ‘Yours amitally.’” Safire declared amital the best of the lot, despite the “happily drugged” sound of it.

Amital has, in fact, cropped up in anthropological studies of kinship since the 1940s but, like materteral, it has never made any headway in general use. Avuncular, meanwhile, has proved a durable descriptor for older public figures with a kindly demeanor, like Ronald Reagan and Walter Cronkite. It would probably take a thesis in gender studies to explain this terminological imbalance. Notably, traditional stereotypes about aunts have often been less than flattering: a literature search for auntish turns up many examples of “maiden auntish,” evoking images of forlorn spinsterhood.

Our new language columnist, Ben Zimmer, will answer one reader question every other week.


Full article and photo: http://www.nytimes.com/2010/03/21/magazine/21FOB-onlanguage-t.html

Vocabulary Size

Trying to convince speakers of English that they need to expand their vocabularies is one of the oldest strategies for selling word books. The very first English dictionary, A Table Alphabeticall, published in 1604, stated on its title page that its approximately 2,500 words (most of them relatively obscure) were intended for the “benefit and help of ladies, gentlewomen or any other unskillful persons.” For the next 100 years, most published dictionaries were concerned with helping the verbally unsophisticated (or at least the insecure) feel more comfortable with their verbal repertories. These early dictionaries were filled only with “hard” words and did not bother with defining cow or apple, reasoning that everyone knew what words like that meant. Instead they sought to explain to the uninitiated the meaning of terms like desticate (to cry like a rat) and antipelargy (the reciprocal love children have for their parents).

In the years since, we English speakers have become only more concerned with the size and quality of our own personal word hoards. We seem to be under the impression that a small vocabulary is one of those things, like bad teeth or poor manners, that can hold us back in life. There are thousands of books and Web sites, many of them quite commercially successful, that promise to help redress the problem. The cover of the audio book “Wordmaster” exhorts you to “Improve your word power and improve your life!” The Web site Verbalsuccess.com avers that its method will earn you greater respect in life and allow you to make more money after using it for just a week. Others make promises of a more modest nature, like the cautious claim in the book “The Words You Should Know to Sound Smart” that it is “possible” that learning its words “may even put some money in your pocket.” Claims by some services are simply baffling, like those emblazoned on the cover of “The Yo Momma Vocabulary Builder” (sample entry: “Yo momma’s so peripatetic, she went out for a walk and never came back”), which assures readers that it will assist not only in verbal combat and achieving higher test scores but also in losing weight.

Study after study over the past hundred years has tied vocabulary size to higher socioeconomic status, greater educational achievement and a host of other goods. Of all the benefits, real or imaginary, of a robust vocabulary, perhaps the most appealing is that vocabulary is heritable — that you can pass it along to your children like an acquired trait in Lamarckian evolution. The Educational Testing Service, which has been concerned with improving vocabularies since 1947, issued a report in 2009, “Parsing the Achievement Gap II,” which explained some of the benefits of an extensive vocabulary. Among the more notable benefits it cited was that children who are raised in higher socioeconomic brackets tend to have vocabularies that are remarkably larger than those of children raised in poorer ones. By the age of 3, children who are raised in a professional household know twice as many words as do children raised on welfare.

Yet before you set aside that copy of “Goodnight Moon” in favor of reading to your progeny from Merriam-Webster in the evening, consider that it is not simply the number of words but also how they are used that is important. Most famous quotations, for instance, are not full of polysyllabic Latinisms. Brandishing 25-cent words unnecessarily will mark you as a blowhard, not an effective communicator. Winston Churchill’s oft-repeated statement about how he had nothing to offer but “blood, toil, tears and sweat” would have elicited nothing but puzzlement had he replaced that quartet of short nouns with the synonyms vermeil, moiling, delacrimation and sudorification.

It should be noted that while people have been trying to puff up their vocabulary for hundreds of years, for just as long there has been a small number of dissenters complaining about the practice. In 1664 an anonymously written pamphlet, “Vindex Anglicus,” railed against the practice of trying to use the absurd Latinate words that were then creeping into English. The author of the screed gave a list of examples of words that offended him, including catillate (to lick dishes), brochity (having crooked teeth) and vitulate (to rejoice wantonly). These are not the words that will improve your scores on standardized tests or indeed serve any useful practical function, which would explain why they are not found in any of today’s vocabulary-improvement books. You should apparently take care to sound smart but not too smart.

Most of today’s popular methods of building your vocabulary have an explicitly instrumental mind-set, avowing that if you learn lots of new words, you will get something tangible for your trouble. Few people, it seems, are thought to be content with learning new words merely to have something pleasant to think about. Knowing that there’s a word — groak — for staring silently at someone while they eat, perhaps in the hope that they will give you some food, or that the word undisonant denotes the sound that waves make when crashing on the shore will gain you nothing except the joy of knowing it. Is this not enough? Oddly, the contemporary book that gives perhaps the most romantic explanation for why you should learn words is “The Yo Momma Vocabulary Builder,” which says that the “primary reason to seek a larger vocabulary has nothing to do with impressing people, cultivating professional gain or building scholarly achievement.” The reason the authors of this curious book give for learning new words? “It makes life more interesting.”

Ammon Shea is the author of “Reading the O.E.D.: One Man, One Year, 21,730 Pages.” He is a consulting editor of American dictionaries for Oxford University Press.


Full article and photo: http://www.nytimes.com/2010/03/14/magazine/14FOB-onlanguage-t.html


When President Obama responded to the failed Christmas airliner bombing while on vacation in his native state of Hawaii, some Republicans claimed it was “bad optics.” “Hawaii to many Americans seems like a foreign place,” the Republican strategist Kevin Madden told CNN. “And I think those images, the optics, hurt President Obama very badly.”

A month later, the shoe was on the other foot when the Republican National Committee held its winter meeting in, yes, Hawaii. Then it was the party’s chairman, Michael Steele, who had to answer questions about the “optics” of gathering the party faithful at a beachfront resort in Waikiki.

How did optics achieve buzzword status in American politics? In his final On Language column last September, William Safire noted the trend: “ ‘Optics’ is hot, rivaling content.” When politicians fret about the public perception of a decision more than the substance of the decision itself, we’re living in a world of optics. Of course, elected officials have worried about outward appearances since time immemorial, but optics puts a new spin on things, giving a scientific-sounding gloss to P.R. and image-making.

Though the metaphorical expansion of optics into the political arena feels novel, it has actually been brewing for a few decades. On May 31, 1978, The Wall Street Journal quoted Jimmy Carter’s special counselor on inflation, Robert Strauss, as saying that business leaders who went along with Carter’s anti-inflation measures might be invited to the White House as a token of appreciation. “It would be a nice optical step,” Strauss said. The Journal was not impressed by the idea: the following day, an editorial rebuffed Strauss’s overtures with the line “Optics will not cure inflation.”

Over the course of the 1980s, optics gained a foothold in political discussions — not in the United States but in Canada. An April 7, 1983, Toronto Globe and Mail article headlined “Optics Is Name of Game” explained: “They say in Larry Grossman’s health ministry, it’s all a matter of optics. This has nothing to do with the eyes, but it has everything to do with the way the public sees things.” In 1986, The Globe and Mail reported that “Industry minister Hugh O’Neil showed up to get the premier’s ‘guidance’ on how to handle the political ‘optics’ of a series of massive layoffs at Algoma Steel.” And in Greg Weston’s book “Reign of Error,” about John Turner’s brief stint as prime minister in 1984, Senator Keith Davey of Canada is quoted as declining an offer to run Turner’s campaign with the excuse, “the optics would be all wrong.”

Even now, optics in the sense of political appearances is far more prevalent in Canada than stateside. I asked Stefan Dollinger, a lexicographer at the University of British Columbia who is leading a revision of the Dictionary of Canadianisms on Historical Principles, why this might be. Dollinger pointed out that bilingual Canadians would be familiar with a similar French term, optique. In standard French, optique can refer to the science of optics or it can mean “perspective, point of view.” Beyond those core meanings, optique has been extended to visual appearances in general (much like the German equivalent Optik). Canadian-French usage adds a more politically focused angle, which seems to have been imported across the bilingual divide.

The interplay of English optics and French optique on Canada’s political scene has long fascinated Beryl Wajsman, president of the Montreal-based Institute for Public Affairs and editor of The Suburban, Quebec’s largest English-language weekly. “The ‘optique,’ as it is called in very politically savvy Quebec, is everything,” he wrote in a 2007 column for Canada Free Press. Wajsman told me that optics and optique may have first commingled in Montreal around the time of the 1980 referendum on Quebec’s sovereignty. Independence for the province was voted down, with the “No” side bolstered by a stirring speech from Pierre Trudeau at Montreal’s Paul Sauvé Arena just days before the referendum. Trudeau’s bold intervention, Wajsman recalls, created some powerful optics.

Not long after Canadian political insiders got caught up in optics, some American business leaders began finding the term useful, too. In a December 1985 Wall Street Journal profile, the Allied Signal executive Michael D. Dingman dismissed speculation over clashes with his company’s C.E.O. as “a matter of ‘optics,’ a word he uses frequently to mean ‘perception.’ ” Then in May 1987, the Ohio Edison Company president Justin T. Rogers Jr. defended the construction of more electrical power plants by saying: “Optically, we’ve got all the energy we want in this country. Energy policies are being set today more on the optics of the situation than they are on the realities.”

Were these corporate titans influenced by political goings-on north of the border? Possibly, but the fact that optics showed up in relation to Carter’s inflation czar back in 1978 shows that this wasn’t an entirely Canadian invention. Rather, as is the case with so many buzzwords, we most likely can’t trace a tidy linear path from a single original source to subsequent adopters.

Optics has many sources of appeal: for bilingual Canadians it resonates with optique; for monolingual Americans it brings to mind a panoply of other associations. As Jan Freeman wrote in The Boston Globe, the word “invokes a whole set of tech-and-science terms like ‘physics,’ ‘statistics’ and ‘tectonics,’ as well as Greek-derived high-concept nouns like ‘hermeneutics,’ ‘aesthetics’ and ‘pragmatics,’ all with an aura of brainy precision.” Fittingly enough, the beauty of optics is in the eye of the beholder.

Ben Zimmer is executive producer of visualthesaurus.com.


Full article and photo: http://www.nytimes.com/2010/03/07/magazine/07FOB-onlanguage-t.html