Five Best Groundbreaking Memoirs

The Education of Henry Adams

By Henry Adams (1918)

With its narrator dolefully pointing the way toward modernism, insistently (and convincingly) writing in the third person, “The Education of Henry Adams” is a one-man kaleidoscope of American history: its politics and pretenses, its turn from a patrician, Victorian society toward the unknowable chaos of the 20th century. Adams regarded his efforts at education as a lifelong exercise in passionate failure. Though 100 copies of the book were printed privately in 1907, he withheld general publication until after his death in 1918. What he didn’t write revealed an intimate truth: Adams omitted the story of his wife’s depression and suicide in 1885. Here was a seminal memoir, required reading for every student of intellectual history, in which the Rubicon of a life had been left out! Adams lifts the veil just twice: once when describing his sister’s death, and again when he returns to America and visits the bronze statue at Rock Creek cemetery in Washington, commissioned from Augustus Saint-Gaudens in his wife’s honor.

Survival in Auschwitz

By Primo Levi (1958)

From the opening sentence—“I was captured by the Fascist Militia on 13 December 1943”—this searingly quiet account by Primo Levi, an Italian chemist, of his 10 months in Auschwitz is a monument of dignity. First published in Italy in 1947 with a title that translates as “If This Is a Man,” the book became a blueprint for every such story that followed, not only as a portrait of the camp’s atrocities but also as a testament to the moments when humanity prevailed. On a mile-long trip with a fellow prisoner to retrieve a 100-pound soup ration, Levi begins to teach his friend “The Canto of Ulysses” from Dante. Completing the lesson becomes urgent, then vital: “It is late, it is late,” Levi realizes, “we have reached the kitchen, I must finish.” No candle has ever shone more brilliantly from within the caverns of evil.

Slouching Towards Bethlehem


Joan Didion in 1981.

By Joan Didion (1968)

If Joan Didion’s first nonfiction collection now seems tethered to the 1960s, it’s partly because so many writers would try to imitate her style: The tenor and cadence were as precise as an atomic clock. She mapped a prevailing culture from the badlands of Southern California and the streets of Haight-Ashbury to the province of her own paranoia, all of it cloaked in jasmine-scented doom. As both background character and prevailing sensibility, Didion brings the reader into her lair: “You see the point. I want to tell you the truth, and already I have told you about the wide rivers.” “Slouching Towards Bethlehem” suggested that memoir was about voice as well as facts. Didion didn’t just intimate a decade of upheaval; she announced it with a starter pistol’s report.


Dispatches

By Michael Herr (1977)

Every war has its Stephen Crane, its Robert Graves—and Vietnam had Michael Herr. He spent a year in-country in 1967, then nearly a decade turning what he saw there into a surreal narrative of the war’s geography, from its napalmed landscape to the craters of a soldier’s mind. Soldiers talked to Herr—told him things they hadn’t said before or maybe even known. “I should have had ‘Born to Listen’ written on my helmet,” he told me in London in 1988. What Herr dared to write about was war’s primal allure: “the death space and the life you found inside it.” That he created this gunmetal narrative with a blend of fact and creative memory was acknowledged from the first; his netherland of “truth” mirrored the dream-like quality of the war and influenced its literature for a decade to come.

Darkness Visible

By William Styron (1990)

Certainly there have been other literary memoirs of personal anguish, but Styron’s brutal account of his cliffwalk with suicidal despair blew the door open on the subject. Depression and alcoholism in writers had too often been viewed through a lens of romantic ruin—the destiny-ridden price of creative genius. “Darkness Visible” put an end to all that. Literary lion, second lieutenant during World War II, Styron was brought to his knees in his own brooding woods. His story hauled plenty of ideas about clinical depression out of the 19th century and into the light of day, where they belonged.

Ms. Caldwell is the author of “Let’s Take the Long Way Home: A Memoir of Friendship.” The former chief book critic of the Boston Globe, she was awarded the 2001 Pulitzer Prize for Distinguished Criticism.



What Ahmadinejad Knows

Iran’s president appeals to 9/11 Truthers.

Let’s put a few facts on the table.

• The recent floods in Pakistan are acts neither of God nor of nature. Rather, they are the result of a secret U.S. military project called HAARP, based out of Fairbanks, Alaska, which controls the weather by sending electromagnetic waves into the upper atmosphere. HAARP may also be responsible for the recent spate of tsunamis and earthquakes.

• Not only did the U.S. invade Iraq for its oil, but also to harvest the organs of dead Iraqis, in which it does a thriving trade.

• Faisal Shahzad was not the perpetrator of the May 1 Times Square bombing, notwithstanding his own guilty plea. Rather, the bombing was orchestrated by an American think tank, though its exact identity has yet to be established.

• Oh, and 9/11 was an inside job. Just ask Mahmoud Ahmadinejad.

The U.S. and its European allies were quick to walk out on the Iranian president after he mounted the podium at the U.N. last week to air his three “theories” on the attacks, each a conspiratorial shade of the other. But somebody should give him his due: He is a provocateur with a purpose. Like any expert manipulator, he knew exactly what he was doing when he pushed those most sensitive of buttons.

He knew, for instance, that the Obama administration and its allies are desperate to resume negotiations over Iran’s nuclear programs. What better way to set the diplomatic mood than to spit in their eye when, as he sees it, they are already coming to him on bended knee?

He also knew that the more outrageous his remarks, the more grateful the West would be for whatever crumbs of reasonableness Iran might scatter on the table. This is what foreign ministers are for.

Finally, he knew that the Muslim world would be paying attention to his speech. That’s a world in which his view of 9/11 isn’t on the fringe but in the mainstream. Crackpots the world over—some of whom are reading this column now—want a voice. Ahmadinejad’s speech was a bid to become theirs.

This is the ideological component of Ahmadinejad’s grand strategy: To overcome the limitations imposed on Iran by its culture, geography, religion and sect, he seeks to become the champion of radical anti-Americans everywhere. That’s why so much of his speech last week was devoted to denouncing capitalism, the hardy perennial of the anti-American playbook. But that playbook needs an update, which is where 9/11 “Truth” fits in.

Could it work? Like any politician, Ahmadinejad knows his demographic. The University of Maryland’s World Public Opinion surveys have found that just 2% of Pakistanis believe al Qaeda perpetrated the attacks, whereas 27% believe it was the U.S. government. (Most respondents say they don’t know.)

Among Egyptians, 43% say Israel is the culprit, while another 12% blame the U.S. Just 16% of Egyptians think al Qaeda did it. In Turkey, opinion is evenly split: 39% blame al Qaeda, another 39% blame the U.S. or Israel. Even in Europe, Ahmadinejad has his corner. Fifteen percent of Italians and 23% of Germans finger the U.S. for the attacks.

Deeper than the polling data are the circumstances from which they arise. There’s always the temptation to argue that the problem is lack of education, which on the margins might be true. But the conspiracy theories cited earlier are retailed throughout the Muslim world by its most literate classes, journalists in particular. Irrationalism is not solely, or even mainly, the province of the illiterate.

Nor is it especially persuasive to suggest that the Muslim world needs more abundant proofs of American goodwill: The HAARP fantasy, for example, is being peddled at precisely the moment when Pakistanis are being fed and airlifted to safety by U.S. Marine helicopters operating off the USS Peleliu.

What Ahmadinejad knows is that there will always be a political place for what Michel Foucault called “the sovereign enterprise of Unreason.” This is an enterprise whose domain encompasses the politics of identity, of religious zeal, of race or class or national resentment, of victimization, of cheek and self-assertion. It is the politics that uses conspiracy theory not just because it sells, which it surely does, or because it manipulates and controls, which it does also, but because it offends. It is politics as a revolt against empiricism, logic, utility, pragmatism. It is the proverbial rage against the machine.

Chances are you know people to whom this kind of politics appeals in some way, large or small. They are Ahmadinejad’s constituency. They may be irrational; he isn’t crazy.

Bret Stephens, Wall Street Journal



So wrong it’s right

The ‘eggcorn’ has its day

Over the past 10 days, language bloggers have been exchanging virtual high-fives at the news of an honor bestowed on one of their coinages. In its most recent quarterly update, the Oxford English Dictionary Online announced that its word-hoard now includes the shiny new term eggcorn.

An eggcorn, as regular readers of this column may recall, is — well, here’s the official new definition: “an alteration of a word or phrase through the mishearing or reinterpretation of one or more of its elements as a similar-sounding word.” If you write “let’s nip it in the butt” (instead of “bud”) or “to the manor born” (instead of “manner”), you’re using an eggcorn.

The term derives from “egg corn” as a substitution for “acorn,” whose earliest appearance comes in an 1844 letter from an American frontiersman: “I hope you are as harty as you ust to be and that you have plenty of egg corn bread which I can not get her and I hop to help you eat some of it soon.”

Why would eggcorn (as we now spell it) replace acorn in the writer’s lexicon? As the OED editors comment, “acorns are, after all, seeds which are somewhat egg-shaped, and in many dialects the formations acorn and eggcorn sound very similar.” (And, like corn kernels, acorns can be ground into meal or flour.) This coinage came to the attention of the linguists blogging at Language Log in 2003, and at the suggestion of Geoffrey Pullum, one of the site’s founders, it was adopted as the term for all such expressions.

Eggcorns needed their own label, the Language Loggers decided, because they were mistakes of a distinct sort — variants on the traditional phrasing, but ones that still made at least a bit of sense. “Nip it in the bud,” for instance, is a horticultural metaphor, perhaps not so widely understood as it once was; the newer “nip it in the butt” describes a different strategy for getting rid of some unwelcome visitation, but it’s not illogical. Hamlet said he was “to the manner born,” but the modern alteration, “to the manor born,” is also a useful formula.

And because they make sense, eggcorns are interesting in a way that mere disfluencies and malapropisms are not: They show our minds at work on the language, reshaping an opaque phrase into something more plausible. They’re tiny linguistic treasures, pearls of imagination created by clothing an unfamiliar usage in a more recognizable costume.

Even before the eggcorn era, most of us had heard (or experienced) pop-song versions of the phenomenon, like “’Scuse me while I kiss this guy” (for Jimi Hendrix’s “kiss the sky” line), but these have had their own label, mondegreen, for more than half a century. The word was coined in 1954 by Sylvia Wright, in commemoration of her mishearing of a Scottish ballad: “They have slain the Earl o’ Moray/ And laid him on the green,” went the lament, but Wright thought the villains had slain the earl “and Lady Mondegreen.”

Then there are malapropisms, word substitutions that sound similar but make no sense at all. They’re named for Mrs. Malaprop, a character in the 1775 play “The Rivals,” whose childrearing philosophy illustrates her vocabulary problem: “I would by no means wish a daughter of mine to be a progeny of learning….I would have her instructed in geometry, that she might know something of the contagious countries.”

And when the misconceived word or expression has spread so widely that we all use it, it’s a folk etymology — or, to most of us, just another word. Bridegroom, hangnail, Jerusalem artichoke — all started out as mistakes.

But we no longer beat ourselves up because our forebears substituted groom for the Old English guma (“man”), or modified agnail (“painful nail”) into hangnail, or reshaped girasole (“sunflower” in Italian) into the more familiar Jerusalem.

The border between these folk-etymologized words, blessed by history and usage, and the newer eggcorns is fuzzy, and there’s been some debate already at the American Dialect Society’s listserv, ADS-L, about whether the distinction is real. Probably there is no bright line; to me, “you’ve got another thing coming” and “wile away the hours” are eggcorns — recent reshapings of expressions I learned as “another think” and “while away” — but to you they may be normal.

But we face the same problem in deciding which senses are valid for everyday, non-eggcornish words. When does nonplussed for “unfazed” or enormity for “hugeness” become the standard sense? We can only wait and see; the variants may duke it out for decades, but if a change takes hold, the battle will one day be forgotten.

The little eggcorn is in the same situation: It’s struggling to overcome its mixed-up heritage and grow into the kind of respectable adulthood enjoyed by the Jerusalem artichoke. We’re not obliged to help it along, but while it’s here, we might as well enjoy its wacky poetry.

Jan Freeman blogs about language at Throw Grammar from the Train.



The Non-Economist’s Economist

John Kenneth Galbraith avoided technical jargon and wrote witty prose—too bad he got so much wrong

The Dow Jones Industrials spent 25 years in the wilderness after the 1929 Crash. Not until 1954 did the disgraced 30-stock average regain its Sept. 3, 1929, high. And then, its penance complete, it soared. In March 1955, the U.S. Senate Banking and Currency Committee, J. William Fulbright of Arkansas, presiding, opened hearings to determine what dangers lurked in this new bull market. Was it 1929 all over again?

John Kenneth Galbraith (1908-2006), photographed by Richard Avedon in Boston in 1993

One of the witnesses, John Kenneth Galbraith, a 46-year-old Harvard economics professor, seemed especially well-credentialed. His new history of the event that still transfixed America, “The Great Crash, 1929,” was on its way to the bookstores and to what would prove to be a commercial triumph. An alumnus of Ontario Agricultural College and the holder of a doctorate in agricultural economics from the University of California at Berkeley, Galbraith had written articles for Fortune magazine and speeches for Adlai Stevenson, the defeated 1952 Democratic presidential candidate. He was a World War II price controller and the author of “American Capitalism: The Concept of Countervailing Power.” When he stepped into a crowded elevator, strangers tried not to stare: he stood 6 feet 8 inches tall.

On the one hand, Galbraith observed, the stock market was not so speculatively charged in 1955 as it had been in 1929. On the other, he insisted, there were worrying signs of excess. Stocks were not so cheap as they had been in the slack and demoralized market of 1953 (though, at 4%, they still outyielded corporate bonds). “The relation of share prices to book value is showing some of the same tendencies as in 1929,” Galbraith went on. “And while it would be a gross exaggeration to say that there has been the same escape from reality that there was in 1929, it does seem to me that enough has happened to indicate that we haven’t yet lost our capacity for speculative self-delusion.”


Reading List: If Not Galbraith, Who?

Maury Klein tells a great story in “Rainbow’s End: The Crash of 1929” (Oxford, 2001), but he also attempts to answer the great question: What went wrong? For the financial specialist in search of a tree-by-tree history of the forest of the Depression, look no further than Barrie A. Wigmore’s “The Crash and Its Aftermath: A History of the Securities Markets in the United States, 1929-33” (Greenwood Press, 1985).

In the quality of certitude, the libertarian Murray Rothbard yielded to no economist. His revisionist history, “America’s Great Depression” (available through the website of the Mises Institute), contends that it was the meddling Hoover administration that turned recession into calamity. Amity Shlaes draws up a persuasive indictment of the New Deal in her “The Forgotten Man” (HarperCollins, 2007).

“Economics and the Public Welfare” by Benjamin Anderson (Liberty Press, 1979) is in strong contention for the lamest title ever fastened by a publisher on a deserving book. Better, the subtitle: “A Financial and Economic History of the United States: 1914-1946.”

“Where are the Customers’ Yachts? Or A Good Hard Look at Wall Street,” by Fred Schwed Jr. (Simon & Schuster, 1940) is the perfect antidote for any who imagine that the reduced salaries and status of today’s financiers are anything new. Page for page, Schwed’s unassuming survey of the financial field might be the best investment book ever written. Hands down, it’s the funniest.

An unfunny but essential contribution to the literature of the Federal Reserve is the long-neglected “Theory and Practice of Central Banking” (Harper, 1936) by Henry Parker Willis, the first secretary of the Federal Reserve Board. Willis wrote to protest against the central bank’s reinvention of itself, quite against the intentions of its founders, as a kind of infernal economic planning machine. He should see it now.

Freeman Tilden’s “A World in Debt” (privately printed, 1983) is a quirky, elegant, long out-of-print treatise by a non-economist on an all-too-timely subject. “The world,” wrote Tilden in 1936, “has several times, and perhaps many times, squandered itself into a position where a total deflation of debt was imperative and unavoidable. We may be entering one more such receivership of civilization.”

If the Obama economic program leaves you cold, puzzled or hot under the collar, turn to Hunter Lewis’s “Where Keynes Went Wrong” (Axios Press, 2009) or “The Critics of Keynesian Economics,” edited by Henry Hazlitt (Arlington House, 1977).

—James Grant


Re-reading Galbraith is like watching black-and-white footage of the 1955 World Series. The Brooklyn Dodgers are gone—and so is much of the economy over which Galbraith lavished so much of his eviscerating wit. In 1955, “globalization” was a word yet uncoined. Imports and exports each represented only about 4% of GDP, compared with 16.1% and 12.5%, respectively, today. In 1955, regulation was constricting (this feature of the Eisenhower-era economy seems to be making a reappearance) and unions were powerful. There was a lingering, Depression-era suspicion of business and, especially, of Wall Street. The sleep of corporate managements was yet undisturbed by the threat of a hostile takeover financed with junk bonds.

Half a century ago, the “conventional wisdom,” in Galbraith’s familiar phrase, was statism. In “American Capitalism,” the professor heaped scorn on the CEOs and Chamber of Commerce presidents and Republican statesmen who protested against federal regimentation. “In the United States at this time,” noted the critic Lionel Trilling in 1950, “liberalism is not only the dominant but even the sole intellectual tradition.” William F. Buckley’s upstart conservative magazine, National Review, made its debut in 1955 with the now-famous opening line that it “stands athwart history, yelling Stop.” Galbraith seemed not to have noticed that history and he were arm in arm. His was the conventional wisdom.

Concerning the emphatic Milton Friedman, someone once borrowed the Victorian-era quip, “I wish I was as sure of anything as he is of everything.” Galbraith and the author of “Capitalism and Freedom” were oil and water, but they did share certitude. To Galbraith, “free-market capitalism” was an empty Rotary slogan. It didn’t exist and, in Eisenhower-era America, couldn’t. Industrial oligopolies had rendered it obsolete.

Only in the introductory economics textbooks, he believed, did the free interplay between supply and demand determine price. Fortune 500 companies set their own prices. They chaffered with their vendors and customers, who themselves were big enough to throw their weight around in the market. As a system of decentralized decision-making, there was something to be said for capitalism, Galbraith allowed. As a network of oligopolistic fiefdoms, however, it needed federal direction. The day of Adam Smith’s “invisible hand” was over or ending. “Countervailing power,” in the Galbraith formulation, was the new idea.

Corporate bureaucrats—collectively, the “technostructure”—had pushed aside the entrepreneurs, proposed Galbraith, channeling Thorstein Veblen. While, under the robber baron model, the firm existed to make profits, the modern behemoth exists to perpetuate itself in power while incidentally earning a profit. Planning is what the technostructure does best—it seems to hate surprises. “This planning,” wrote Galbraith, in “The New Industrial State,” “replaces prices that are established by the market with prices that are established by the firm. The firm, in tacit collaboration with the other firms in the industry, has wholly sufficient power to set and maintain minimum prices.” What was to be done? “The market having been abandoned in favor of planning of prices and demand,” he prescribed, “there is no hope that it will supply [the] last missing element of restraint. All that remains is the state.” It was fine with the former price controller of the Office of Price Administration.

As for the stockholder, he or she was as much a cipher as the manipulated consumer. “He (or she) is a passive and functionless figure, remarkable only in his capacity to share, without effort or even without appreciable risk, in the gains from the growth by which the technostructure measures its success,” according to Galbraith. “No grant of feudal privilege has ever equaled, for effortless return, that of the grandparents who bought and endowed his descendants with a thousand shares of General Motors or General Electric or IBM.” Galbraith was writing near the top of the bull market he had failed to anticipate in 1955. Shareholders were about to re-learn (if they had forgotten) the lessons of “risk.”

In its way, “The New Industrial State” was as mistimed as “The Great Crash.” In 1968, a year after the appearance of the first edition, the planning wheels started to turn at Leasco Data Processing Corp., Great Neck, N.Y. But Leasco’s “planning” took the distinctly un-Galbraithian turn of an unsolicited bid for control of the blue-blooded Chemical Bank of New York. Here was something new under the sun. Saul Steinberg, would-be revolutionary at the head of Leasco, ultimately surrendered before the massed opposition of the New York banking community. (“I always knew there was an Establishment,” Mr. Steinberg mused—”I just used to think I was a part of it.”) But the important thing was the example Mr. Steinberg had set by trying. The barbarians were beginning to form at the corporate gates.

The cosseted, self-perpetuating corporate bureaucracy that Galbraith described in “The New Industrial State” was in for a rude awakening. Deregulation became a Washington watchword under President Carter, capitalism got back its good name under President Reagan and trade barriers fell under President Clinton. Presently came the junk-bond revolution and the growth in an American market for corporate control. Hedge funds and private equity funds prowled for under- and mismanaged public companies to take over, resuscitate and—to be sure, all too often—to overload with debt. The collapse of communism and the rise of digital technology opened up vast new fields of competitive enterprise. Hundreds of millions of eager new hands joined the world labor force, putting downward pressure on costs, prices and profit margins. Wal-Mart delivered everyday low, and lower, prices, and MCI knocked AT&T off its monopolistic pedestal. The technostructure must have been astounded.

Galbraith in his home in Cambridge, Mass., in 1981

Here are the opening lines of “American Capitalism”: “It is told that such are the aerodynamics and wing-loading of the bumblebee that, in principle, it cannot fly. It does, and the knowledge that it defied the august authority of Isaac Newton and Orville Wright must keep the bee in constant fear of a crack-up.” You keep reading because of the promise of more in the same delightful vein. And, indeed, there is much more, including a charming annotated chronology of Galbraith’s life by his son and the editor of this volume, James K. Galbraith.

John F. Kennedy’s ambassador to India, muse to the Democratic left, two-time recipient of the Presidential Medal of Freedom, celebrity author, Galbraith in life was even larger than his towering height. His “A Theory of Price Control,” which was published in 1952 to favorable reviews but infinitesimal sales, was his one and only contribution to the purely professional economics literature. Thereafter this most acerbic critic of free markets prospered by giving the market what it wanted.

Now comes the test of whether his popular writings will endure longer than the memory of his celebrity and the pleasure of his prose. “The Great Crash” has a fighting chance, because of its very lack of analytical pretense. “History that reads like a poem,” raved Mark Van Doren in his review of the 1929 book. Or, he might have judged, that eats like whipped cream.

But the other books in this volume seem destined for only that kind of immortality conferred on amusing period pieces. When, for example, Galbraith complains in “The Affluent Society” that governments can’t borrow enough, or that the Federal Reserve is powerless to resist inflation, you wonder what country he was writing about, or even what planet he was living on.

Not that the professor refused to learn. In the first edition of “The New Industrial State,” for instance, he writes confidently: “While there may be difficulties, and interim failures or retreats are possible and indeed probable, a system of wage and price restraint is inevitable in the industrial system.” A decade or so later, in the edition selected for this volume, that sentence is gone. In its place is another not quite so confident: “The history of controls, in some form or other and by some nomenclature, is still incomplete.”

At the 1955 stock-market hearings, Galbraith was followed at the witness table by the aging speculator and “adviser to presidents” Bernard M. Baruch. The committee wanted to know what the Wall Street legend thought of the learned economist. “I know nothing about him to his detriment,” Baruch replied. “I think economists as a rule—and it is not personal to him—take for granted they know a lot of things. If they really knew so much, they would have all of the money, and we would have none.”

Mr. Grant, the editor of Grant’s Interest Rate Observer, is the author, most recently, of “Mr. Market Miscalculates” (Axios, 2009)



Uncommon knowledge

A surprise benefit of minimum wage

The minimum wage has been politically controversial for most of the last century, even though it affects a marginal share of the labor force and evidence of significant job loss is inconclusive. Now one economist would like us to consider another effect of the minimum wage: finishing high school. By curtailing low-wage/low-skill jobs, the minimum wage motivates young people to stay in school and become skilled. This effect then generates what the author calls an “educational cascade” by setting an example for the upcoming class of students. He estimates that the average male born in 1951 gained 0.2 years — and the average male born in 1986 gained 0.7 years — of high school due to the cumulative effect of the minimum wage.

Sutch, R., “The Unexpected Long-Run Impact of the Minimum Wage: An Educational Cascade,” National Bureau of Economic Research (September 2010).

Bearing false witness

False confessions and false eyewitness testimony are never-ending challenges for the judicial process. Although coercive interrogation is blamed in many of these situations, new research illustrates just how little coercion is needed. In an experiment, people played a quiz game for money. Later, they were told that the person who had sat next to them during the game was suspected of cheating. They were shown a 15-second video clip of the person sitting next to them cheating, even though the video clip was doctored and no cheating actually happened. They were asked to sign a witness statement against the cheater, but were explicitly told not to sign unless they had directly witnessed the cheating themselves, beyond what the video showed. Nevertheless, almost half of those who saw the video signed the statement. Some of those who signed the statement even volunteered additional incriminating information.

Wade, K. et al., “Can Fabricated Evidence Induce False Eyewitness Testimony?” Applied Cognitive Psychology (October 2010).

The cure for sadness: pain

For most people, pain is not fun. However, a recent study finds that, when you’re not having fun, pain can help. Several hundred people were tested to see how much pain — in the form of increasing pressure or heat applied to their hands — they could tolerate. Not surprisingly, people reported being less happy after the experiment. But less happy is not necessarily the same as more unhappy. Indeed, negative emotions were also attenuated after the experiment, especially for women and people with more sensitive emotions. In other words, physical pain helped dull emotional pain.

Bresin, K. et al., “No Pain, No Change: Reductions in Prior Negative Affect following Physical Pain,” Motivation and Emotion (September 2010).

That reminds me of…me!

In a series of experiments, researchers have transformed Descartes’s famous phrase (“I think, therefore I am”) into something like this: “I am reminded of myself, therefore I will think.” People presented with a resume or product paid more attention to it if it happened to have a name similar to their own. As a result of this increased attention, a high-quality resume or product got a boost, while a low-quality resume or product was further handicapped. However, in a strange twist, people who sat in front of a mirror while evaluating a product exhibited the opposite effect: Quality didn’t matter for a product with a similar name but did matter otherwise. The authors speculate that too much self-referential thinking overloads one’s ability to think objectively.

Howard, D. & Kerin, R., “The Effects of Name Similarity on Message Processing and Persuasion,” Journal of Experimental Social Psychology (forthcoming).

Defensive sleeping

The odds that you’ll need to fend off an attacker entering your bedroom at night are pretty small. Yet, according to a recent study, our evolutionary heritage — formed when we had to survive sleeping outdoors — instills a strong preference for bedrooms designed less by the principles of Architectural Digest than by those of “Home Alone” or “Panic Room.” When shown a floor plan for a simple rectangular bedroom and asked to arrange the furniture, most people positioned the bed so that it faced the door. They also positioned the bed on the side of the room behind the door as it would be opening, and as far back from the door as possible, a position that would seem to give the occupant the most time to respond. If the floor plan included a window on the opposite side of the room from the door, people were inclined to move the bed away from the window, too.

Spörrle, M. & Stich, J., “Sleeping in Safe Places: An Experimental Investigation of Human Sleeping Place Preferences from an Evolutionary Perspective,” Evolutionary Psychology (August 2010).

Kevin Lewis is an Ideas columnist.



Randy Britton e-mails: “I’ve noticed in much of the coverage of the BP oil spill that the press has taken to calling the oil well ‘busted.’ Since when is ‘busted’ the proper way to describe a broken oil well?  It seems very colloquial and not a form I would expect to see in proper journalistic forums.”

Even now that BP’s troubled oil well in the Gulf of Mexico is being permanently sealed, news reports continue to refer to the “busted well,” particularly wire services like the Associated Press and AFP. Reuters was an early adopter, reporting on efforts to contain the “busted well” on May 3. Alternatively, busted has modified oil rig, or just plain rig. A database search of coverage of the BP spill finds the first recorded use of busted came nine days into the crisis on April 29, when the MSNBC host Ed Schultz said, “The busted rig is leaking — get this — 200,000 gallons of oil a day.”

Is busted overly informal for journalists? The verb bust certainly has colloquial roots, beginning its life on the American scene as a folksy variant of burst. (The same dropping of the “r” turned curse into cuss, horse into hoss and parcel into passel.) Building on earlier use as a noun, bust busted out as a verb as early as 1806, when Meriwether Lewis, while on his famous expedition with William Clark, wrote in his journal, “Windsor busted his rifle near the muzzle.” Since then, bust has worked its way into a wide variety of American expressions.

“Bust runs the gamut from slang to standard,” explain David K. Barnhart and Allan A. Metcalf in their book “America in So Many Words.” “When it is used to mean ‘to explode or fall apart or be arrested,’ bust is generally slang. In the sense of failing (especially financially) it is informal, as busting the bank in gambling lingo, while in the specialized sense of taming a horse it is standard, the only way to say busting a bronco.”

Despite its potential slanginess, busted is “not actually forbidden” in the news media, as the Boston Globe language columnist Jan Freeman wrote in August. Indeed, reporters often latch onto the occasional colloquialism that seems particularly expressive, and in this case, Freeman surmises they were drawn to the term’s “criminal-cowboy-macho connotations.”

Regardless of the reasons for its current vogue, it’s notable that busted was rarely relied on by the press to describe stricken oil wells before the BP disaster — even in incidents that were highly similar, such as the 1979 blowout of the Ixtoc I well in the Gulf of Mexico. Most of the precursors I found come from more literary sources. It was appropriate, for instance, in some light verse by J.W. Foley published in The New York Times in 1904:

Dear friend, there’s a question I’d like to ask you,
(Your pardon I crave if it vexes)
Have you ever invested a hundred or two
In an oil well somewhere down in Texas?

Have you ridden in autos (I mean in your mind),
With the profits you honestly trusted
Would flow from your venture in oil stocks — to find
That the oil well was hopelessly busted?

I can’t fault reporters for drawing on the rich history of bust and busted in American English to add a little extra oomph to their dispatches from the gulf. Calling the well busted does evoke a looser, wilder state of disrepair than broken, or the more technically accurate blown-out. But after many months of news coverage, the phrase “busted well” has now turned into little more than a cliché. That’s a far worse journalistic offense than a bit of well-placed slang.

Ben Zimmer will answer one reader question every other week.


Unpacking Imagination

In an age of childhood obesity and children tethered to electronic consoles, playgrounds have rarely been more important. In an age of constrained government budgets, playgrounds have rarely been a harder sell. Fortunately, the cost of play doesn’t have to be prohibitive. In creating the Imagination Playground in Lower Manhattan — a playground with lots of loose parts for children to create their own play spaces — we realized that many of the elements with the greatest value to children were inexpensive and portable. Although traditional playgrounds can easily cost in the millions to build, boxed imagination playgrounds can be put together for under $10,000. (Land costs not included!) The design below is one that my architecture firm has done in collaboration with the New York City Parks Department and KaBoom, a nonprofit organization. But it needn’t be the only one out there. There are a lot of ways to build a playground — and a lot of communities in need of one. Let a thousand portable playgrounds bloom.

David Rockwell, New York Times


New England’s hidden history

More than we like to think, the North was built on slavery.

In the year 1755, a black slave named Mark Codman plotted to kill his abusive master. A God-fearing man, Codman had resolved to use poison, reasoning that if he could kill without shedding blood, it would be no sin. Arsenic in hand, he and two female slaves poisoned the tea and porridge of John Codman repeatedly. The plan worked — but like so many stories of slave rebellion, this one ended in brutal death for the slaves as well. After a trial by jury, Mark Codman was hanged, tarred, and then suspended in a metal gibbet on the main road to town, where his body remained for more than 20 years.

It sounds like a classic account of Southern slavery. But Codman’s body didn’t hang in Savannah, Ga.; it hung in present-day Somerville, Mass. And the reason we know just how long Mark the slave was left on view is that Paul Revere passed it on his midnight ride. In a fleeting mention from Revere’s account, the horseman described galloping past “Charlestown Neck, and got nearly opposite where Mark was hung in chains.”

When it comes to slavery, the story that New England has long told itself goes like this: Slavery happened in the South, and it ended thanks to the North. Maybe we had a little slavery, early on. But it wasn’t real slavery. We never had many slaves, and the ones we did have were practically family. We let them marry, we taught them to read, and soon enough, we freed them. New England is the home of abolitionists and underground railroads. In the story of slavery — and by extension, the story of race and racism in modern-day America — we’re the heroes. Aren’t we?

As the nation prepares to mark the 150th anniversary of the American Civil War in 2011, with commemorations that reinforce the North/South divide, researchers are offering uncomfortable answers to that question, unearthing more and more of the hidden stories of New England slavery — its brutality, its staying power, and its silent presence in the very places that have become synonymous with freedom. With the markers of slavery forgotten even as they lurk beneath our feet — from graveyards to historic homes, from Lexington and Concord to the halls of Harvard University — historians say it is time to radically rewrite America’s slavery story to include its buried history in New England.

“The story of slavery in New England is like a landscape that you learn to see,” said Anne Farrow, who co-wrote “Complicity: How the North Promoted, Prolonged, and Profited From Slavery” and who is researching a new book about slavery and memory. “Once you begin to see these great seaports and these great historic houses, everywhere you look, you can follow it back to the agricultural trade of the West Indies, to the trade of bodies in Africa, to the unpaid labor of black people.”

It was the 1991 discovery of an African burial ground in New York City that first revived the study of Northern slavery. Since then, fueled by educators, preservationists, and others, momentum has been building to recognize histories hidden in plain sight. Last year, Connecticut became the first New England state to formally apologize for slavery. In classrooms across the country, popularity has soared for educational programs on New England slavery designed at Brown University. In February, Emory University will hold a major conference on the role slavery’s profits played in establishing American colleges and universities, including in New England. And in Brookline, Mass., a program called Hidden Brookline is designing a virtual walking tour to illuminate its little-known slavery history: At one time, nearly half the town’s land was held by slave owners.

“What people need to understand is that, here in the North, while there were not the large plantations of the South or the Caribbean islands, there were families who owned slaves,” said Stephen Bressler, director of Brookline’s Human Relations-Youth Resources Commission. “There were businesses actively involved in the slave trade, either directly in the importation or selling of slaves on our shores, or in the shipbuilding, insurance, manufacturing of shackles, processing of sugar into rum, and so on. Slavery was a major stimulus to the Northern economy.”

Turning over the stones to find those histories isn’t just a matter of correcting the record, he and others say. It’s crucial to our understanding of the New England we live in now.

“The absolute amnesia about slavery here on the one hand, and the gradualness of slavery ending on the other, work together to make race a very distinctive thing in New England,” said Joanne Pope Melish, who teaches history at the University of Kentucky and wrote the book “Disowning Slavery: Gradual Emancipation and ‘Race’ in New England, 1780-1860.” “If you have obliterated the historical memory of actual slavery — because we’re the free states, right? — that makes it possible to turn around and look at a population that is disproportionately poor and say, it must be their own inferiority. That is where New England’s particular brand of racism comes from.”

Dismantling the myths of slavery doesn’t mean ignoring New England’s role in ending it. In the 1830s and ’40s, an entire network of white Connecticut abolitionists emerged to house, feed, clothe, and aid in the legal defense of Africans from the slave ship Amistad, a legendary case that went all the way to the US Supreme Court and helped mobilize the fight against slavery. Perhaps nowhere were abolition leaders more diehard than in Massachusetts: Pacifist William Lloyd Garrison and writer Henry David Thoreau were engines of the antislavery movement. Thoreau famously refused to pay his taxes in protest of slavery, part of a philosophy of civil disobedience that would later influence Martin Luther King Jr. But Thoreau was tame compared to Garrison, a flame-thrower known for shocking audiences. Founder of the New England Anti-Slavery Society and the newspaper The Liberator, Garrison once burned a copy of the US Constitution at a July Fourth rally, calling it “a covenant with death.” His cry for total, immediate emancipation made him a target of death threats and kept the slavery question at a perpetual boil, fueling the moral argument that, in time, would come to frame the Civil War.

But to focus on crusaders like Garrison is to ignore ugly truths about how unwillingly New England as a whole turned the page on slavery. Across the region, scholars have found, slavery here died a painfully gradual death, with emancipation laws and judicial rulings that either were unclear, poorly enforced, or written with provisions that kept slaves and the children born to them in bondage for years.

Meanwhile, whites who had trained slaves to do skilled work refused to hire the same blacks who were now free, driving an emerging class of skilled workers back to the lowest rungs of unskilled labor. Many whites, driven by reward money and racial hatred, continued to capture and return runaway Southern slaves; some even sent free New England blacks south, knowing no questions about identity would be asked at the other end. And as surely as there was abolition, there was “bobalition” — the mocking name given to graphic, racist broadsides printed through the 1830s, ridiculing free blacks with characters like Cezar Blubberlip and Mungo Mufflechops. Plastered around Boston, the posters had a subtext that seemed to boil down to this: Who do these people think they are? Citizens?

“Is Garrison important? Yes. Is it dangerous to be an abolitionist at that time? Absolutely,” said Melish. “What is conveniently forgotten is the number of people making a living snagging free black people in a dark alley and shipping them south.”

Growing up in Lincoln, Mass., historian Elise Lemire vividly remembers learning of the horrors of a slaveocracy far, far away. “You knew, for example, that families were split up, that people were broken psychologically and kept compliant by the fear of your husband or wife being sold away, or your children being sold away,” said Lemire, author of the 2009 book “Black Walden,” who became fascinated with former slaves banished to squatter communities in Walden Woods.

As she peeled back the layers, Lemire discovered a history rarely seen by the generations of tourists and schoolchildren who have learned to see Concord as a hotbed of antislavery activism. “Slaves [here] were split up in the same way,” she said. “You didn’t have any rights over your children. Slave children were given away all the time, sometimes when they were very young.”

In Lemire’s Concord, slave owners once filled half of town government seats, and in one episode town residents rose up to chase down a runaway slave. Some women remained enslaved into the 1820s, more than 30 years after census figures recorded no existing slaves in Massachusetts. According to one account, a former slave named Brister Freeman, for whom Brister’s Hill in Walden Woods is named, was locked inside a slaughterhouse shed with an enraged bull as his white tormentors laughed outside the door. And in Concord, Lemire argues, black families were not so much liberated as they were abandoned to their freedom, released by masters increasingly fearful their slaves would side with the British enemy. With freedom, she said, came immediate poverty: Blacks were forced to squat on small plots of the town’s least arable land, and eventually pushed out of Concord altogether — a precursor to the geographic segregation that continues to divide black and white in New England.

“This may be the birthplace of a certain kind of liberty,” Lemire said, “but Concord was a slave town. That’s what it was.”

If Concord was a slave town, historians say, Connecticut was a slave state. It didn’t abolish slavery until 1848, a little more than a decade before the Civil War. (A judge’s ruling ended legal slavery in Massachusetts in 1783, though the date is still hotly debated by historians.) It’s a history Connecticut author and former Hartford Courant journalist Anne Farrow knew nothing about — until she got drawn into an assignment to find the untold story of one local slave.

Once she started pulling the thread, Farrow said, countless histories unfurled: accounts of thousand-acre slave plantations and a livestock industry that bred the horses that turned the giant turnstiles of West Indian sugar mills. Each discovery punctured another slavery myth. “A mentor of mine has said New England really democratized slavery,” said Farrow. “Where in the South a few people owned so many slaves, here in the North, many people owned a few. There was a widespread ownership of black people.”

Perhaps no New England colony or state profited more from the unpaid labor of blacks than Rhode Island: Following the Revolution, scholars estimate, slave traders in the tiny Ocean State controlled between two-thirds and 90 percent of America’s trade in enslaved Africans. On the rolling farms of Narragansett, nearly one-third of the population was black — a proportion not much different from Southern plantations. In 2003, the push to reckon with that legacy hit a turning point when Brown University, led by its first African-American president, launched a highly controversial effort to account for its ties to Rhode Island’s slave trade. Today, that ongoing effort includes the CHOICES program, an education initiative whose curriculum on New England slavery is now taught in over 2,000 classrooms.

As Brown’s decision made national headlines, Katrina Browne, a Boston filmmaker, was on a more private journey through New England slavery, tracing her bloodlines back to her Rhode Island forebears, the DeWolf family. As it turned out, the DeWolfs were the biggest slave-trading family in the nation’s biggest slave-trading state. Browne’s journey, which she chronicled in the acclaimed documentary “Traces of the Trade: A Story from the Deep North,” led her to a trove of records of the family’s business at every point in slavery’s triangle trade. Interspersed among the canceled checks and ship logs, Browne said, she caught glimpses into everyday life under slavery, like the diary entry by an overseer in Cuba that began, “I hit my first Negro today for laughing at prayers.” Today, Browne runs the Tracing Center, a nonprofit to foster education about the North’s complicity in slavery.

“I recently picked up a middle school textbook at an independent school in Philadelphia, and it had sub-chapter headings for the Colonial period that said ‘New England,’ and then ‘The South and Slavery,’ ” said Browne, who has trained park rangers to talk about Northern complicity in tours of sites like Philadelphia’s Liberty Bell. “Since learning about my family and the whole North’s role in slavery, I now consider these things to be my problem in a way that I didn’t before.”

If New England’s amnesia has been pervasive, it has also been willful, argues C.S. Manegold, author of the new book “Ten Hills Farm: The Forgotten History of Slavery in the North.” That’s because many of slavery’s markers aren’t hidden or buried. In New England, one need look no further than a symbol that graces welcome mats, door knockers, bedposts, and all manner of household decor: the pineapple. That exotic fruit, said Manegold, is as intertwined with slavery as the Confederate flag: When New England ships came to port, captains would impale pineapples on a fence post, a sign to everyone that they were home and open for business, bearing the bounty of slave labor and sometimes slaves themselves.

“It’s a symbol everyone knows the benign version of — the happy story that pineapples signify hospitality and welcome,” said Manegold, whose book centers on five generations of slaveholders tied to one Colonial era estate, the Royall House and Slave Quarters in Medford, Mass., now a museum. The house features two carved pineapples at its gateposts.

By Manegold’s account, pineapples were just the beginning at this particular Massachusetts farm: Generation after generation, history at the Royall House collides with myths of freedom in New England — starting with one of the most mythical figures of all, John Winthrop. Author of the celebrated “City Upon a Hill” sermon and first governor of the Massachusetts Bay Colony, Winthrop not only owned slaves at Ten Hills Farm, but in 1641, he helped pass one of the first laws making chattel slavery legal in North America.

When the house passed to the Royalls, Manegold said, it entered a family line whose massive fortune came from slave plantations in Antigua. Members of the Royall family would eventually give land and money that helped establish Harvard Law School. To this day, the law school bears a seal borrowed from the Royall family crest, and for years the Royall Professorship of Law remained the school’s most prestigious faculty post, almost always occupied by the law school dean. It wasn’t until 2003 that an incoming dean — now Supreme Court Justice Elena Kagan — quietly turned the title down.

Kagan didn’t publicly explain her decision. But her actions speak to something Manegold and others say could happen more broadly: not just inserting footnotes to New England heritage tours and history books, but truly recasting that heritage in all its painful complexity.

“In Concord,” Lemire said, “the Minutemen clashed with the British at the Old North Bridge within sight of a man enslaved in the local minister’s house. The fact that there was slavery in the town that helped birth American liberty doesn’t mean we shouldn’t celebrate the sacrifices made by the Minutemen. But it does mean New England has to catch up with the rest of the country, in much of which residents have already wrestled with their dual legacies of freedom and slavery.”

Francie Latour is an associate editor at Wellesley magazine and a former Globe reporter.


A short history of presidential primaries

Although a Niagara of vitriol is drenching politics, the two parties are acting sensibly and in tandem about something once considered a matter of constitutional significance — the process by which presidential nominations are won.

The 2012 process will begin 17 months from now — in February rather than January. Under rules adopted by both parties’ national committees, no delegates to the national conventions shall be selected before the first Tuesday in March — except for delegates from New Hampshire, South Carolina and Nevada. Iowa may still conduct its caucuses, which do not select delegates, in February.

It is not graven on the heart of man by the finger of God that the Entitled Four shall go first, but it might as well be. Although they have just 3.8 percent of the nation’s population, they do represent four regions. Anyway, they shall have the spotlight to themselves until the deluge of delegate selections begins — perhaps in March but preferably in April.

Any Republican delegate-selection event held before the first day of April shall be penalized: The result cannot be, as many Republicans prefer, a winner-take-all allocation of delegates. March events “shall provide for the allocation of delegates on a proportional basis.” This means only that some of the delegates must be allocated in proportion to the total vote.

Because Democrats are severe democrats, they have no winner-take-all events, so they do not have this stick with which to discipline disobedient states. Instead, they brandish — they are, after all, liberals — a carrot: States will be offered bonus delegates for moving their nominating events deeper into the nominating season, and for clustering their contests with those of neighboring states.

Each party wants to maximize its chance of nominating a strong candidate and — this is sometimes an afterthought — one who would not embarrass it as president. So both parties have equal interests in lengthening the nominating process to reduce the likelihood that a cascade of early victories will settle nomination contests before they have performed their proper testing-and-winnowing function.

With states jockeying for early positions, the danger has been that the process will become compressed into something similar to an early national primary. This would heavily favor well-known and well-funded candidates and would virtually exclude everyone else.

There have been other proposals. One would divide the nation into four regions voting on monthly intervals, with the order of voting rotating every four years. Another would spread voting over 10 two-week intervals, with the largest states voting last, thereby giving lesser-known candidates a chance to build strength.

Such plans, however, require cooperation approaching altruism among the states, which should not be counted on. Instead, the two parties are in a Madisonian mood, understanding that incentives are more reliable than moral exhortations in changing behavior.

Speaking of the sainted Madison, the parties’ reforms are a small step back toward what the Constitution envisioned: settled rules for something important. The nation’s Founders considered the selection of presidential candidates so crucial that they wanted the process to be controlled by the Constitution. So they devised a system under which the nomination of presidential candidates and the election of a president occurred simultaneously:

Electors meeting in their respective states, in numbers equal to their states’ senators and representatives, would vote for two candidates for president. When Congress counted the votes, the one with the most would become president, the runner-up vice president.

This did not survive the quick emergence of parties. After the presidential election of 1800, which was settled in the House after 36 votes, the 12th Amendment was adopted, and suddenly the nation had what it has had ever since — a process of paramount importance but without settled rules. The process has been a political version of the “tragedy of the commons” — by everyone acting self-interestedly, everyone’s interests are injured.

In 1952, Sen. Estes Kefauver of Tennessee won every Democratic primary he entered except Florida’s, which was won by Sen. Richard Russell of Georgia. So the nominee was . . . Illinois Gov. Adlai Stevenson. Party bosses, a species as dead as the dinosaurs, disliked Kefauver.

Today, the parties’ modest reforms — the best kind — have somewhat reduced the risks inherent in thorough democratization of the nomination process. Certainly the democratization has not correlated with dramatic improvements in the caliber of nominees. And the current president, whose campaign was his qualification for the office, is proof that even a protracted and shrewd campaign is not an infallible predictor of skillful governance.

George F. Will, Washington Post


How to Raise Boys Who Read

Hint: Not with gross-out books and video-game bribes.

When I was a young boy, America’s elite schools and universities were almost entirely reserved for males. That seems incredible now, in an era when headlines suggest that boys are largely unfit for the classroom. In particular, they can’t read.

According to a recent report from the Center on Education Policy, for example, substantially more boys than girls score below the proficiency level on the annual National Assessment of Educational Progress reading test. This disparity goes back to 1992, and in some states the percentage of boys proficient in reading is now more than ten points below that of girls. The male-female reading gap is found in every socio-economic and ethnic category, including the children of white, college-educated parents.

The good news is that influential people have noticed this problem. The bad news is that many of them have perfectly awful ideas for solving it.

Everyone agrees that if boys don’t read well, it’s because they don’t read enough. But why don’t they read? A considerable number of teachers and librarians believe that boys are simply bored by the “stuffy” literature they encounter in school. According to a revealing Associated Press story in July, these experts insist that we must “meet them where they are”—that is, pander to boys’ untutored tastes.

For elementary- and middle-school boys, that means “books that exploit [their] love of bodily functions and gross-out humor.” AP reported that one school librarian treats her pupils to “grossology” parties. “Just get ’em reading,” she counsels cheerily. “Worry about what they’re reading later.”

There certainly is no shortage of publishers ready to meet boys where they are. Scholastic has profitably catered to the gross-out market for years with its “Goosebumps” and “Captain Underpants” series. Its latest bestsellers are the “Butt Books,” a series that began with “The Day My Butt Went Psycho.”

The more venerable houses are just as willing to aim low. Penguin, which once used the slogan, “the library of every educated person,” has its own “Gross Out” line for boys, including such new classics as “Sir Fartsalot Hunts the Booger.”

Workman Publishing made its name telling women “What to Expect When You’re Expecting.” How many of them expected they’d be buying “Oh, Yuck! The Encyclopedia of Everything Nasty” a few years later from the same publisher? Even a self-published author like Raymond Bean—nom de plume of the fourth-grade teacher who wrote “SweetFarts”—can make it big in this genre. His flatulence-themed opus hit no. 3 in children’s humor on Amazon. The sequel debuts this fall.

Education was once understood as training for freedom. Not merely the transmission of information, education entailed the formation of manners and taste. Aristotle thought we should be raised “so as both to delight in and to be pained by the things that we ought; this is the right education.”

“Plato before him,” writes C. S. Lewis, “had said the same. The little human animal will not at first have the right responses. It must be trained to feel pleasure, liking, disgust, and hatred at those things which really are pleasant, likeable, disgusting, and hateful.”

This kind of training goes against the grain, and who has time for that? How much easier to meet children where they are.

One obvious problem with the SweetFarts philosophy of education is that it is more suited to producing a generation of barbarians and morons than to raising the sort of men who make good husbands, fathers and professionals. If you keep meeting a boy where he is, he doesn’t go very far.

The other problem is that pandering doesn’t address the real reason boys won’t read. My own experience with six sons is that even the squirmiest boy does not require lurid or vulgar material to sustain his interest in a book.

So why won’t boys read? The AP story drops a clue when it describes the efforts of one frustrated couple with their 13-year-old unlettered son: “They’ve tried bribing him with new video games.” Good grief.

The appearance of the boy-girl literacy gap happens to coincide with the proliferation of video games and other electronic forms of entertainment over the last decade or two. Boys spend far more time “plugged in” than girls do. Could the reading gap have more to do with competition for boys’ attention than with their supposed inability to focus on anything other than outhouse humor?

Dr. Robert Weis, a psychology professor at Denison University, confirmed this suspicion in a randomized controlled trial of the effect of video games on academic ability. Boys with video games at home, he found, spend more time playing them than reading, and their academic performance suffers substantially. Hard to believe, isn’t it, but Science has spoken.

The secret to raising boys who read, I submit, is pretty simple—keep electronic media, especially video games and recreational Internet, under control (that is to say, almost completely absent). Then fill your shelves with good books.

People who think that a book—even R.L. Stine’s grossest masterpiece—can compete with the powerful stimulation of an electronic screen are kidding themselves. But on the level playing field of a quiet den or bedroom, a good book like “Treasure Island” will hold a boy’s attention quite as well as “Zombie Butts from Uranus.” Who knows—a boy deprived of electronic stimulation might even become desperate enough to read Jane Austen.

Most importantly, a boy raised on great literature is more likely to grow up to think, to speak, and to write like a civilized man. Whom would you prefer to have shaped the boyhood imagination of your daughter’s husband—Raymond Bean or Robert Louis Stevenson?

I offer a final piece of evidence that is perhaps unanswerable: There is no literacy gap between home-schooled boys and girls. How many of these families, do you suppose, have thrown grossology parties?

Mr. Spence is president of Spence Publishing Company in Dallas.


Visigoths at the gate?

When facing a tsunami, what do you do? Pray, and tell yourself stories. I am not privy to the Democrats’ private prayers, but I do hear the stories they’re telling themselves. The new meme is that there’s a civil war raging in the Republican Party. The Tea Party will wreck it from within and prove to be the Democrats’ salvation.

I don’t blame anyone for seeking a deus ex machina when about to be swept out to sea. But this salvation du jour is flimsier than most.

In fact, the big political story of the year is the contrary: that a spontaneous and quite anarchic movement with no recognized leadership or discernible organization has been merged with such relative ease into the Republican Party.

The Tea Party could have become Perot ’92, an anti-government movement that spurned the Republicans, went third-party and cost George H.W. Bush reelection, ending 12 years of Republican rule. Had the Tea Party gone that route, it would have drained the Republican Party of its most mobilized supporters and deprived Republicans of the sweeping victory that awaits them on Nov. 2.

Instead, it planted its flag within the party and, with its remarkable energy, created the enthusiasm gap. Such gaps are measurable. This one is a chasm. This year’s turnout for the Democratic primaries (as a percentage of eligible voters) was the lowest ever recorded. Republican turnout was the highest since 1970.

True, Christine O’Donnell’s nomination in Delaware may cost the Republicans an otherwise safe seat (and possibly control of the Senate), and Sharron Angle in Nevada is running only neck-and-neck with an unpopular Harry Reid. On balance, however, the Tea Party contribution is a large net plus, with its support for such strong candidates as Marco Rubio of Florida, Pat Toomey of Pennsylvania, Joe Miller of Alaska and Mike Lee of Utah. Even Rand Paul, he of the shaky start in Kentucky, sports an eight-point lead. All this in addition to the significant Tea Party contribution to the tide that will carry dozens of Republicans into the House.

Nonetheless, some Democrats have convinced themselves that they have found the issue with which to salvage 2010. “President Obama’s political advisers,” reports the New York Times, “are considering a range of ideas, including national advertisements, to cast the Republican Party as all but taken over by Tea Party extremists.”

Sweet irony. Fear-over-hope rides again, this time with Democrats in the saddle warning darkly about “the Republican Tea Party” (Joe Biden). Message: Vote Democratic and save the nation from a Visigoth mob with a barely concealed tinge of racism.

First, this is so at variance with reality that it’s hard to believe even liberals believe it. The largest Tea Party event yet was the recent Glenn Beck rally on the Mall. The hordes descending turned out to be several hundred thousand cheerful folks in what, by all accounts, had the feel of a church picnic. And they left the place nearly spotless — the first revolution in recorded history that collected its own trash.

Second, the general public is fairly evenly split in its views of the Tea Party. It experiences none of the horror that liberals do — and think others should. Moreover, the electorate supports by 2-to-1 the Tea Party signature issues of smaller government and lower taxes.

Third, you would hardly vote against the Republican in your state just because there might be a (perceived) too-conservative Republican running somewhere else. How would, say, Paul running in Kentucky deter someone from voting for Mark Kirk in Illinois? Or, to flip the parties, will anyone in Nevada refuse to vote for Harry Reid because Chris Coons, a once self-described “bearded Marxist,” is running as a Democrat in Delaware?

Fourth, what sane Democrat wants to nationalize an election at a time of 9.6 percent unemployment and such disappointment with Obama that just this week several of his own dreamy 2008 supporters turned on him at a cozy town hall? The Democrats’ only hope is to run local campaigns on local issues. That’s how John Murtha’s former district director hung on to his boss’s seat in a special election in Pennsylvania.

Newt Gingrich had to work hard — getting Republican candidates to sign the Contract with America — to nationalize the election that swept Republicans to victory in 1994. A Democratic anti-Tea Party campaign would do that for the Republicans — nationalize the election, gratis — in 2010. As a very recent former president — now preferred (Public Policy Polling, Sept. 1) in bellwether Ohio over the current one by 50 percent to 42 percent — once said: Bring ’em on.

Charles Krauthammer, Washington Post



Can a president lead with Woodward watching?

Question of the day: Why do presidents give the White House keys to Bob Woodward?

I ask this with all due deference, respect, hat in hand, cape over puddle and other sundry gestures owed by ink-stained wretches like me to the Most Famous Journalist on the Planet.

Through several administrations, Woodward has become president ex officio — or at least reporter in chief, a human tape recorder who issues history’s first draft even as history is still tying its shoes.

For years he’s been the best-selling first read on a president’s inner struggles. His latest, “Obama’s Wars,” exposes infighting in the West Wing over how to handle Afghanistan.

The suggestion that there was discord in the Oval Office over whether to increase troop numbers in a brutal war theater is, frankly, of great consolation. If we don’t worry ourselves sick about putting lives on the line, what exactly would we concern ourselves with? Who’s dancing next with the stars?

What is of some concern — at least based on those excerpts that have leaked thus far — is that the president gets pushed around by the generals. And that impression feeds into the larger one that Barack Obama is not quite commander in chief. He seems far more concerned with being politically savvy than with winning what he has called the good war.

Cognitive dissonance sets in when Obama declares that “it’s time to turn the page” in the war that he didn’t like — Iraq — and that is not in fact over. Fifty thousand troops remain in Iraq, while the surge in Afghanistan seems to be not enough — or too much for too long, already.

Whatever one’s view of circumstances on the ground, whether in the wars abroad or in domestic skirmishes on Wall Street, Obama seems not to be the man in charge. Nor does it seem that he is even sure of his own intentions. One telling exchange reported by Woodward took place with Sen. Lindsey Graham (R-S.C.). In explaining his July 2011 deadline to begin withdrawing troops from Afghanistan, Obama told Graham:

“I have to say that. I can’t let this be a war without end, and I can’t lose the whole Democratic Party.”

How’s that? We tell the enemy when we’re leaving so the party base doesn’t get upset? Well, of course, public opinion matters in war, as in all things. As we’ve seen before, wars can’t be won without the will of the people at home. But a commander in chief at least ought to know what he’s fighting for and why he’s asking Americans to risk their lives. If it’s not a good enough reason to warrant victory, then maybe it isn’t any longer a good war.

In another telling anecdote, the president asked his aides for a plan “about how we’re going to hand it off and get out of Afghanistan.” Apparently, he didn’t get such a plan. Whose presidency is this anyway?

The White House reportedly isn’t upset with the way the president comes across. His portrayal is consistent with what they consider a positive profile: Obama as thoughtful and reflective. To the list might we add ponderous?

We all want a thoughtful president. As few Democrats tire of reminding us, America and the world have had quite enough of cowboys. But surely we can discard the caricatures and settle on a thoughtful commander who is neither a gunslinger nor a chalk-dusted harrumpher. Surely the twain can meet.

The Woodward Syndrome, meanwhile, presents a dilemma for all presidents. By his presence, events are affected. By our knowledge of what he witnesses, even as history is being created in real time, we can also affect these same events. Is it fair to Obama to critique him as he navigates his own thoughts? Or are we interfering with outcomes by inserting ourselves into conversations to which we were never supposed to be privy?

It’s a conundrum unlikely to be resolved. If anything, in our tell-all, see-all political culture, no struggle will go unrecorded or un-critiqued. Strong leadership is, therefore, all the more necessary.

There’s a saying that seems applicable here: Work like you don’t need money, love like you’ve never been hurt, dance like no one’s watching.

Note to President Obama: Lead like there’s no tomorrow. No midterm election, no presidential reelection, no party base. Liberate yourself from the Woodward Syndrome, figure out what you think, and lead.

You are commander in chief, after all. Half the country may disagree with you, but they’ll respect you in the morning.

Kathleen Parker, Washington Post



Homo administrans

The biology of business

Biologists have brought rigour to psychology, sociology and even economics. Now they are turning their attention to the softest science of all: management

SCURRYING around the corridors of the business school at the National University of Singapore (NUS) in his white lab coat last year, Michael Zyphur must have made an incongruous sight. Visitors to management schools usually expect the staff to sport suits and ties. Dr Zyphur’s garb was, however, no provocative fashion statement. It is de rigueur for anyone dealing with biological samples, and he routinely collects such samples as part of his research on, of all things, organisational hierarchies. He uses them to look for biological markers, in the form of hormones, that might either cause or reflect patterns of behaviour that are relevant to business.

Since its inception in the early 20th century, management science has been dominated by what Leda Cosmides and John Tooby, two evolutionary psychologists, refer to disparagingly as the standard social science model (SSSM). This assumes that most behavioural differences between individuals are explicable by culture and socialisation, with biology playing at best the softest of second fiddles. Dr Zyphur is part of an insurgency against this idea. What Dr Cosmides and Dr Tooby have done to psychology and sociology, and others have done to economics, he wants to do to management. Consultants often talk of the idea of “scientific” management. He, and others like him, want to make that term meaningful, by applying the rigour of biology.

To do so, they will need to weave together several disparate strands of the subject—genetics, endocrinology, molecular biology and even psychology. If that works, the resulting mixture may provide a new set of tools for the hard-pressed business manager.

To the management born

Say “biology” and “behaviour” in the same sentence, and most minds think of genetics and the vexed question of nature and nurture. In a business context such questions of heredity and environment are the realm of Scott Shane, a professor of management at Case Western Reserve University in Ohio. In a recent book*, Dr Shane proffers a review of the field. Many of his data come from studies of twins—a traditional tool of human geneticists, who are denied the possibility of experimental breeding enjoyed by their confrères who study other species, such as flies and mice.

Identical twins share all of their DNA. Non-identical twins share only half (like all other siblings). Despite a murky past involving the probable fabrication of data by one of the field’s pioneers, Sir Cyril Burt, the science of comparing identical with non-identical twins is still seen as a good way of distinguishing the effects of genes from those of upbringing.
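The arithmetic behind such comparisons is captured by Falconer's formula, the classic (and admittedly simplified) textbook device of twin research: because identical twins share roughly twice as much DNA as fraternal ones, doubling the gap between the two groups' trait correlations gives a rough estimate of heritability. A minimal sketch, with made-up correlations for illustration:

```python
# Falconer's formula: a simplified estimate of heritability from
# trait correlations in identical (MZ) and fraternal (DZ) twin pairs.
def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """MZ twins share ~100% of their DNA, DZ twins ~50%, so the excess
    similarity of MZ pairs is attributed to the extra ~50% of shared
    genes: h^2 = 2 * (r_MZ - r_DZ)."""
    h2 = 2.0 * (r_mz - r_dz)
    return max(0.0, min(1.0, h2))  # clamp to the meaningful [0, 1] range

# Hypothetical numbers, not Dr Shane's data: if identical twins
# correlate 0.75 on some work-related trait and fraternal twins 0.5,
# the implied heritability is 0.5.
print(falconer_heritability(0.75, 0.5))  # 0.5
```

Real studies use more elaborate structural-equation models, but this is the intuition: the larger the MZ-DZ gap, the larger the genetic share of the variation.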

The consensus from twin studies is that genes really do account for a substantial proportion of the differences between individuals—and that applies to business as much as it does to the rest of life. Dr Shane observes genetic influence over which jobs people choose (see chart), how satisfied they are with those jobs, how frequently they change jobs, how important work is to them and how well they perform (or strictly speaking, how poorly: genes account for over a third of variation between individuals in “censured job performance”, a measure that incorporates reprimands, probation and performance-related firings). Salary also depends on DNA. Around 40% of the variation between people’s incomes is attributable to genetics. Genes do not, however, operate in isolation. Environment is important, too. Part of the mistake made by supporters of the SSSM was to treat the two as independent variables when, in reality, they interact in subtle ways.

Richard Arvey, the head of the NUS business school’s department of management and organisation, has been looking into precisely how genes interact with different types of environment to create such things as entrepreneurial zeal and the ability to lead others. Previous research had shown that people exhibiting personality traits like sensation-seeking are more likely to become entrepreneurs than their less outgoing and more level-headed peers. Dr Arvey and his colleagues found the same effect for extroversion (of which sensation-seeking is but one facet). There was, however, an interesting twist. Their study—of 1,285 pairs of identical twins and 849 pairs of same-sex fraternal ones—suggests that genes help explain extroversion only in women. In men, this trait is instilled environmentally. Businesswomen, it seems, are born. But businessmen are made.

In a second twin study, this time just on men, Dr Arvey asked to what extent leaders are born, and to what extent they are made. Inborn leadership traits certainly do exist, but upbringing, he found, matters too. The influence of genes on leadership potential is weakest in boys brought up in rich, supportive families and strongest in those raised in harsher circumstances. The quip that the battle of Waterloo was won on the playing fields of Eton thus seems to have some truth.

Pathways to success

Twin studies such as these point the way, but they provide only superficial explanations of what is going on. To get at the nitty gritty it is necessary to dive into molecular biology. And that is the province of people like Song Zhaoli, who is also at the NUS.

One way genes affect behaviour is through the agency of neurotransmitters, the chemicals that carry messages between nerve cells. Among these chemicals, two of the most important are dopamine and serotonin. Dopamine controls feelings of pleasure and reward. Serotonin regulates mood. Some personality traits have been shown to depend on the amounts of these neurotransmitters that slosh around the junctions between nerve cells. Novelty-seeking, for example, is associated with lots of dopamine. A tendency to depression may mean too little serotonin. And the levels of both are regulated by genes, with different variants of the same underlying gene having different effects.

Recent years have seen a surge of research into the links between particular versions of neurotransmitter-related genes and behavioural outcomes, such as voter turnout, risk-aversion, personal popularity and sexual promiscuity. However, studies of work-related traits have hitherto been conspicuous by their absence.

Dr Song has tried to fill this gap. His team have gathered and analysed DNA from 123 Singaporean couples to see if it can be matched with a host of work-related variables, starting with job satisfaction.

In this case Dr Song first checked how prone each participant in the study was to the doldrums, in order to establish a baseline. He also asked whether they had experienced any particularly stressful events, like sustaining serious injury, getting the sack or losing a lot of money, within the previous year. Then he told participants to report moments of negative mood (anger, guilt, sadness or worry) and job satisfaction (measured on a seven-point scale) four times a day for a week, using a survey app installed on their mobile phones.

He knew from previous research that some forms of melancholia, such as seasonal affective disorder (or winter blues), have been linked to particular versions of a serotonin-receptor gene called HTR2A. When he collated the DNA and survey data from his volunteers, he found those with a particular variant of HTR2A were less likely than those carrying one of its two other possible variants to experience momentary negative mood, even if they had had a more stress-ridden year. Dr Song also found that when carriers of that same variant reported lower negative mood, they also tended to report higher job satisfaction—an effect which was absent among people who had inherited the remaining two versions of the gene.

This suggests that for people fortunate enough to come equipped with the pertinent version of HTR2A, stressful events are less likely to have a negative effect on transient mood. What is more, for these optimists, better mood turns out to be directly related to contentment with their job. In other words, it may be a particular genetic mutation of a serotonin-receptor gene, and not the employer’s incentives, say, that is making people happier with their work.

The hormonal balance-sheet

Neurotransmitters are not the only way an individual’s genetic make-up is translated into action. Hormones also play a part. For example, oxytocin, which is secreted by part of the brain called the hypothalamus, has been shown to promote trust—a crucial factor in all manner of business dealings. The stress hormone cortisol, meanwhile, affects the assessment of the time value of money.

That, at least, was the conclusion of a study by Taiki Takahashi of Hokkaido University in Japan. After taking saliva samples from 18 volunteers, Dr Takahashi asked them what minimum amount of money they would accept in a year’s time in order to forgo an immediate payout of ¥10,000 (around $90 at the time). He found those with a lower base level of the hormone tended to prefer immediate payment, even when the sum in question was piffling compared with the promised future compensation.
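The logic of such delay-discounting experiments is simple arithmetic: the minimum future sum a volunteer will accept reveals how steeply he discounts delayed money. A sketch, using hypothetical figures rather than Dr Takahashi's actual data:

```python
# The arithmetic behind a delay-discounting question: "what minimum
# amount in a year would you accept to forgo an immediate payout?"
def implied_annual_discount_rate(immediate: float, future_required: float) -> float:
    """The smallest acceptable future amount implies an annual
    discount rate of future/immediate - 1."""
    return round(future_required / immediate - 1.0, 6)

# A volunteer who demands ¥12,000 in a year to forgo ¥10,000 today is
# discounting the future at 20% a year; the higher the rate, the
# stronger the preference for immediate payment.
print(implied_annual_discount_rate(10_000, 12_000))  # 0.2
```

On this reading, the low-cortisol volunteers in the study were the steep discounters, demanding a large premium before they would wait.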

Then there is testosterone, the principal male sex hormone (though women make it too). The literature on this hormone’s behavioural effects is vast. High levels of the stuff have been correlated with risk tolerance, creativity and the creation of new ventures. But testosterone is principally about dominance and hierarchy. This is where Dr Zyphur’s mouth swabs come in.

When Dr Zyphur (who is now at the University of Melbourne) was at the NUS, he led a study of how testosterone is related to status and collective effectiveness in groups. He and his colleagues examined levels of the hormone in 92 mixed-sex groups of about half a dozen individuals. Surprisingly, a group member’s testosterone level did not predict his or her status within the group. What the researchers did discover, though, is that the greater the mismatch between testosterone and status, the less effectively a group’s members co-operate. In a corporate setting that lower productivity translates into lower income.

Testosterone crops up in another part of the business equation, too: sales. It appears, for instance, to be a by-product of conspicuous consumption. In an oft-cited study Gad Saad and John Vongas of Concordia University in Montreal found that men’s testosterone levels responded precisely to changes in how they perceived their status. Testosterone shot up, for example, when they got behind the wheel of a sexy sports car and fell when they were made to drive a clunky family saloon car. The researchers also reported that when a man’s status was threatened in the presence of a female by a display of wealth by a male acquaintance, his testosterone levels surged.

As Dr Saad and Dr Vongas point out, a better understanding of this mechanism could help explain many aspects both of marketing and of who makes a successful salesman. Car salesmen, for example, are stereotypically male and aggressive, which tends to indicate high levels of testosterone. Whether that is really the right approach with male customers is, in light of this research, a moot point.

Natural selection

Results such as these are preliminary. But they do offer the possibility of turning aspects of management science into a real science—and an applied science, to boot. Decisions based on an accurate picture of human nature have a better chance of succeeding than those that are not. For instance, if job satisfaction and leadership turn out to have large genetic components, greater emphasis might be placed on selection than on training.

Not everyone is convinced. One quibble is that many investigations of genetics and behaviour have relied on participants’ retrospective reports of their earlier psychological states, which are often inaccurate. This concern, however, is being allayed with the advent of techniques such as Dr Song’s mobile-sampling method.

Another worry is that, despite the fact that most twin studies have been extensively replicated, they may be subject to systematic flaws. If parents exhibit a tendency to treat identical twins more similarly than fraternal ones, for instance, then what researchers see as genetic factors could turn out to be environmental ones.

That particular problem can be examined by looking at twins who have been fostered or adopted apart, and thus raised in separate households. A more serious one, though, has emerged recently. This is that identical twins may not be as identical as appears at first sight. A process called epigenesis, which shuts down genes in response to environmental prompts, may make their effective genomes different from their actual ones.

Statistically, that would not matter too much if the amount of epigenesis were the same in identical and fraternal twins, but research published last year by Art Petronis of the Centre for Addiction and Mental Health in Toronto and his colleagues suggests it is not. Instead, identical twins are epigenetically closer to each other than the fraternal sort. That means environmentally induced effects that are translated into action by this sort of epigenesis might be being confused by researchers with inherited ones.

Still, this and other concerns about the effectiveness of the new science should pass as more data are gathered. But a separate set of concerns may be increased by better data. These are ethical concerns, which pop up whenever scientists broach the nature-nurture nexus. Broadly, they divide into three sorts.

The first involves the fear that genetic determinism cheapens human volition. But as Dr Shane is at pains to stress, researchers like him are by no means genetic fatalists. He draws an analogy with sports wagers. Knowing that you have the favourable version of a gene may shift the odds somewhat, but it no more guarantees that you will be satisfied with your job than knowing of a player’s injury ensures that you will cash in on his team’s loss. Indeed, it might be argued that a better understanding of humanity can help direct efforts to counteract those propensities viewed as detrimental or undesirable, thus ensuring people are less, rather than more, in thrall to their biology.

The second set of ethical worriers are those who fret that biological knowledge may be used to serve nefarious ends. Whenever biology meets behaviour the spectre of social Darwinism and eugenics looms menacingly in the background. Yet, just because genetic information can serve evil ends need not mean that it has to. Dr Shane observes that pretending DNA has no bearing on working life does not make those influences go away; it just makes everyone ignorant of what they are: “Everyone, that is, except those who want to misuse the information.”

The third ethical qualm involves the thorny issue of fairness. Ought employers to use genetic testing to select their workers? Will this not lead down a slippery slope to genetic segregation of the sort depicted in the genetic dystopias beloved of science-fiction?

This pass, however, has already been sold. Workers are already sometimes hired on the basis of personality tests that try to tease out the very genetic predispositions that biologists are looking for. The difference is that the hiring methods do this indirectly, and probably clumsily. Moreover, in a rare example of legislative foresight, politicians in many countries have anticipated the problem. In 2008, for example, America’s Congress passed the Genetic Information Nondiscrimination Act, banning the use of genetic information in job recruitment. Similar measures had previously been adopted in several European countries, including Denmark, Finland, France and Sweden.


There is one other group of critics. These are those who worry that applying biology to business is dangerous not because it is powerful, but because it isn’t. To the extent they are genetic at all, behavioural outcomes are probably the result of the interaction of myriad genes in ways that are decades from being fully understood. That applies as much to business-related behaviour as to behaviour in any other facet of life.

Still, as Dr Zyphur is keen to note, not all academic work has to be about hard-nosed application in the here and now. Often, the practical applications of science are serendipitous—and may take a long time to arrive. And even if they never arrive, understanding human behaviour is just plain interesting for its own sake. “We in business schools often act like technicians in the way we conceptualise and teach our topics of study,” he laments. “This owes much to the fact that a business school is more like a trade school than it is a part of classic academia.” Now, largely as a result of efforts by Dr Zyphur and others like him, management science looks set for a thorough, biology-inspired overhaul. Expect plenty more lab coats in business-school corridors.

*“Born Entrepreneurs, Born Leaders: How Your Genes Affect Your Work Life”. Oxford University Press. $29.95



Ten of the best disguises in literature

The Odyssey, by Homer

Odysseus arrives back at his island of Ithaca disguised as a beggar. He is recognised only by his old dog Argus (animals always see through disguises), which dies of joy on the spot. In his disguise, our hero is able to see who has been loyal to him and who has not.

Measure for Measure, by William Shakespeare

The Duke who governs Vienna wants to see what his underlings will get up to in his absence. So he asks his friend Friar Thomas for some monkish garb: “Supply me with the habit and instruct me / How I may formally in person bear me / Like a true friar”. It works, and not even his most devoted courtiers recognise him until he finally unveils himself.

The Monk, by Matthew Lewis

Another monkish disguise. Sexy young Matilda lusts after Father Ambrosio, the most pious monk in Madrid. So she dresses up as a young novice monk and finds her way into the monastery. In her cell she reveals herself to Ambrosio, who cannot resist her charms. It turns out that she is in fact a demon.

Jane Eyre, by Charlotte Brontë

One of the great episodes of transvestism in literature comes when Rochester togs himself up as a Gypsy woman to read the palms of the guests he has invited to Thornfield. Blanche Ingram, Jane’s rival for his affections, gets uncomforting news, but Jane is told “the cup of bliss” is going to be offered to her.

East Lynne, by Mrs Henry Wood

Lady Isabel Vane loses her happy home and family when she conducts an adulterous affair with the utterly caddish Francis Levinson. Having learned the error of her ways, she returns to be governess to her own children, disguised by blue-lensed glasses, hair turned white from shock after a train crash and a scarred mouth.

The Mystery of Edwin Drood, by Charles Dickens

Dick Datchery arrives in the town of Cloisterham, apparently a detective in disguise (he wears a wig). He (or she?) keeps watch over John Jasper, choirmaster and secret drug addict. Drood has disappeared: is he disguised as Datchery? Or is it another character, investigating Drood’s murder? Dickens did not finish the book, so we will never know.

“The Man with the Twisted Lip”, by Arthur Conan Doyle

Watson visits a squalid London opium den in search of genteel addict Isa Whitney. Watson finds his man and notices an old man, “absorbed” in his drug-taking: “very thin, very wrinkled, bent with age, an opium pipe dangling down from between his knees”. Of course, it is Sherlock Holmes, conducting field research!

Charley’s Aunt, by Brandon Thomas

In the Victorians’ favourite farce, two Oxford students, Charley and Jack, persuade their friend Lord Fancourt Babberly to impersonate Charley’s aunt from Brazil. With “her” as chaperone, they can entertain Amy and Kitty, the two girls they fancy. Jack’s father and Amy’s father both fall for the fake aunt, before the real one turns up.

Third Girl, by Agatha Christie

False identities proliferate in Christie’s novels, but Third Girl satisfies by being peculiarly dependent on wigs. The plot turns on the capacity of Norma Restarick’s stepmother Mary to assume different identities by changing wigs. Her disguise is so successful she manages to pose as Norma’s flatmate without her stepdaughter noticing who she really is.

Madame Doubtfire, by Anne Fine

Unemployed actor Daniel Hilliard dresses up as a woman and applies for the job as nanny to his children, who live with his estranged wife Miranda. The elder two of his three children recognise him immediately, though his wife is completely fooled. When she discovers his ruse, she agrees to give him more access.



The Enraged vs. the Exhausted

If you thought the 1994 election was historic, just wait till this year.

All anyone in America who cares about politics was talking about this week was the searing encounter that captured, in a way that hasn’t been done before, the essence of the political moment we’re in. When 2010 is reviewed, it will be the clip producers pick to illustrate the president’s disastrous fall.

It is Monday, Sept. 20, the middle of the day, in Washington. CNBC is holding a town hall for the president. A woman stands—handsome, dignified, black, a person with presence. She looks as if she may be what she turns out to be, an Obama supporter who in 2008 put up street signs, passed out literature and tried to win over co-workers. As she later told the Washington Post, “I was thinking that the people who were against him and didn’t believe in his agenda were completely insane.”

The president looked relieved when she stood. Perhaps he thought she might lob a sympathetic question that would allow him to hit a reply out of the park. Instead, and in the nicest possible way, Velma Hart lobbed a hand grenade.

“I’m a mother. I’m a wife. I’m an American veteran, and I’m one of your middle-class Americans. And quite frankly I’m exhausted. I’m exhausted of defending you, defending your administration, defending the mantle of change that I voted for, and deeply disappointed with where we are.” She said, “The financial recession has taken an enormous toll on my family.” She said, “My husband and I have joked for years that we thought we were well beyond the hot-dogs-and-beans era of our lives. But, quite frankly, it is starting to knock on our door and ring true that that might be where we are headed.”

What a testimony. And this is the president’s base. He got that look public figures adopt when they know they just took one right in the chops on national TV and cannot show their dismay. He could have responded with an engagement and conviction equal to the moment. But this was our president—calm, detached, even-keeled to the point of insensate. He offered a recital of his administration’s achievements: tuition assistance, health care. It seemed so off point. Like his first two years.

But it was the word Mrs. Hart used that captured everything: “exhausted.” From what I see, that’s how a lot of Democrats feel. They’ve turned silent, too, like people who witnessed a car crash and can’t talk anymore about the reasons for the accident or how many were injured.

This election is more and more shaping up into a contest between the Exhausted and the Enraged.

In a contest like that, who wins? That’s like asking, “Who would win a sporting event between the depressed and the anxious?” The anxious are wide awake. The wide awake win.

But Rep. Marsha Blackburn of Tennessee suggests I have the wrong word for the Republican base. The word, she says, is not enraged but “livid.”

The three-term Republican deputy whip has been campaigning in Alabama, Colorado, Georgia, Mississippi, Missouri, New York, Ohio, Pennsylvania and South Carolina. We spoke by phone about what she is seeing, and she sounded like the exact opposite of exhausted.

There are two major developments, she says, that are new this year and insufficiently noted, but they’re going to shape election outcomes in 2010 and beyond.

First, Washington is being revealed in a new way.

The American people now know, “with real sophistication,” everything that happens in the capital. “I find a much more knowledgeable electorate, and it is a real-time response,” Ms. Blackburn says. “We hear about it even as the vote is taking place.”

Voters come to rallies carrying research—”things they pulled off the Internet, forwarded emails,” copies of bills, roll-call votes. The Internet isn’t just a tool for organization and fund-raising. It has given citizens access to information they never had before. “The more they know,” Ms. Blackburn observes, “the less they like Washington.”

Second is the rise of women as a force. They “are the drivers in this election cycle,” Ms. Blackburn says. “Something is going on.” At tea party events the past 18 months, she started to notice “60% of the crowd is women.”

She tells of a political rally that drew thousands in Nashville, at the State Capitol plaza. She had brought her year-old grandson. When the mic was handed to her, she was holding him. “I said, ‘How many of you are grandmothers?’ The hands! That was the moment I realized that the majority of the people at the political events now are women. I saw this in town halls in ’09—it was women showing up at my listening events, it was women talking about health care.”

Why would more women be focusing more intently on politics this year than before?

Ms. Blackburn hypothesizes: “Women are always focusing on a generation or two down the road. Women make the education and health-care decisions for their families, for their kids, their spouse, their parents. And so they have become more politically involved. They are worried about will people have enough money, how are they going to pay the bills, the tuition, get the kids through school and college.”

Ms. Blackburn suggested, further in the conversation, that government’s reach into the personal lives of families, including new health-care rules and the prospect of higher taxes, plus the rise in public information on how Washington works and what it does, had prompted mothers to rebel.

The media called 1994 “the year of the angry white male.” That was the year of the Republican wave that yielded a GOP House for the first time in 40 years. “I look at this year as the Rage of the Bill-Paying Moms,” Ms. Blackburn says. “They are saying ‘How dare you, in your arrogance, cap the opportunities my child will have? You’ll burden them with so much debt they won’t be able to buy a house—all because you can’t balance the budget.'”

How does 2010 compare with 1994 in terms of historical significance? Ms. Blackburn says there’s an unnoted story there, too. Whereas 1994 was historic as a party victory, a shift in political power, this year feels more organic, more from-the-ground, and potentially deeper. She believes 2010 will mark “a philosophical shift,” the beginning of a change in national thinking regarding the role of the individual and the government.

This “will be remembered as the year the American people said no” to the status quo. The people “do not trust” those who make the decisions far away. They want to restore balance.

What is the mainstream media getting wrong about this election, and what is it getting right? The media, Ms. Blackburn says, do not fully appreciate “how livid people are with Washington.” They see the anger but don’t understand its implications. “They’re getting right that people want change, but they’re wrong about what that change is going to be.” The media, she said, “are going to be amazed when Carly Fiorina and Sharron Angle win.”

The mainstream media famously like the horse race—red is up, blue is down; Smith is in, Jones is out. But if Ms. Blackburn is right, the election, and its meaning, will be more interesting than the old, classic jockeying. And the outcomes won’t be controlled by the good ol’ boys but by those she calls “the great new gals.”

Peggy Noonan, Wall Street Journal



The Carter-Obama Comparisons Grow

Walter Mondale himself sees a parallel.

Comparisons between the Obama White House and the failed presidency of Jimmy Carter are increasingly being made—and by Democrats.

Walter Mondale, Mr. Carter’s vice president, told The New Yorker this week that anxious and angry voters in the late 1970s “just turned against us—same as with Obama.” As the polls turned against his administration, Mr. Mondale recalled that Mr. Carter “began to lose confidence in his ability to move the public.” Democrats on Capitol Hill are now saying this is happening to Mr. Obama.

Mr. Mondale says it’s time for the president “to get rid of those teleprompters and connect” with voters. Another of Mr. Obama’s clear errors has been to turn over the drafting of key legislation to the Democratic Congress: “That doesn’t work even when you own Congress,” he said. “You have to ride ’em.”

Mr. Carter himself is heightening comparisons with his own presidency by publishing his White House diaries this week. “I overburdened Congress with an array of controversial and politically costly requests,” he said on Monday. The parallels to Mr. Obama’s experience are clear.

Comparisons between the two men were made frequently during the 2008 campaign, but in a favorable way. Princeton University historian Sean Wilentz, for instance, told Fox News in August 2008 that Mr. Obama’s “rhetoric is more like Jimmy Carter’s than any other Democratic president in recent memory.” Syndicated columnist Jonah Goldberg noted more recently that Mr. Obama, like Mr. Carter in his 1976 campaign, “promised a transformational presidency, a new accommodation with religion, a new centrism, a changed tone.”

But within a few months, liberals were already finding fault with his rhetoric. “He’s the great earnest bore at the dinner party,” wrote Michael Wolff, a contributor to Vanity Fair. “He’s cold; he’s prickly; he’s uncomfortable; he’s not funny; and he’s getting awfully tedious. He thinks it’s all about him.” That sounds like a critique of Mr. Carter.

Foreign policy experts are also picking up on similarities. Walter Russell Mead, then a fellow at the Council on Foreign Relations, told the Economist magazine earlier this year that Mr. Obama is “avoiding the worst mistakes that plagued Carter.” But he warns that presidents like Mr. Obama who emphasize “human rights” can fall prey to the temptation of picking on weak countries while ignoring more dire human rights issues in powerful countries (Russia, China, Iran). Over time that can “hollow out an administration’s credibility and make a president look weak.” Mr. Mead warned that Mr. Obama’s foreign policy “to some degree makes him dependent on people who wish neither him nor America well. This doesn’t have to end badly and I hope that it doesn’t—but it’s not an ideal position after one’s first year in power.”

Liberals increasingly can’t avoid making connections between Mr. Carter’s political troubles and those of Mr. Obama. In July, MSNBC’s Chris Matthews asked his guests if Democrats up for re-election will “run away from President O’Carter.” After much laughter, John Heilemann of New York Magazine quipped “Calling Dr. Freud.” To which Mr. Matthews, a former Carter speechwriter, sighed “I know.”

Pat Caddell, who was Mr. Carter’s pollster while he was in the White House, thinks some comparisons between the two men are overblown. But he notes that any White House that is sinking in the polls takes on a “bunker mentality” that leads the president to become isolated and consult with fewer and fewer people from the outside. Mr. Caddell told me that his Democratic friends think that’s happening to Mr. Obama—and that the president’s ability to pull himself out of a political tailspin is hampered by his resistance to seek out fresh thinking.

The Obama White House is clearly cognizant of the comparisons being made between the two presidents. This month, environmental activist Bill McKibben met with White House aides to convince them to reinstall a set of solar panels that Mr. Carter had placed on the White House roof. They were taken down in 1986 following roof repairs. Mr. McKibben said it was time to bring them back to demonstrate Mr. Obama’s support for alternative energy.

But Mr. McKibben told reporters that the White House “refused to take the Carter-era panel that we brought with us” and only said that they would continue to ponder “what is appropriate” for the White House’s energy needs. Britain’s Guardian newspaper reported that the Obama aides were “twitchy perhaps about inviting any comparison (to Mr. Carter) in the run-up to the very difficult mid-term elections.” Democrats need no reminding that Mr. Carter wound up costing them dearly in 1978 and 1980 as Republicans made major gains in Congress.

Mr. Fund is a columnist for the Wall Street Journal.



Val McDermid’s top 10 Oxford novels

What to read? … Students walk past the Radcliffe Camera building in Oxford city centre.

Val McDermid is the award-winning author of numerous crime novels, including a series of books starring her most famous creation, clinical psychologist Dr Tony Hill. She read English at St Hilda’s College, Oxford – at 17, one of the youngest undergraduates the college had ever taken, and the first from a Scottish state school. Her latest novel, Trick of the Dark, is set in Oxford, and is published by Little, Brown.

“I spent three years at St Hilda’s College, Oxford. I took a degree in English, but more valuable was what I learned outside tutorials. And finally, with Trick of the Dark, I’ve managed to write about it. Oxford exerts a strong influence on those it touches, whether they love it or hate it, whether they embrace it or resist it, whether they admit it or deny it. I didn’t know much about it when I arrived, but thanks in large part to the dozens of books written about it, I know a lot more now.”

1. Brideshead Revisited by Evelyn Waugh

I was instantly seduced by Waugh’s portrait of the collision between a decent middle-class chap and a dysfunctional bunch of Catholic toffs. Although superficially I had nothing in common with his characters apart from studying at Oxford, I couldn’t avoid all sorts of emotional identification with them. This is the quintessential novel of Oxford gilded youth flying too close to the sun.

2. The Way Through the Woods by Colin Dexter

Impossible to avoid Inspector Morse, whose TV adventures have amplified the city’s tourist magnetism. I’ve chosen this one because it features crucially one of my favourite Oxford streets, Park Town. I remember particularly the day Richard Nixon resigned. I had spent the afternoon reading in a hammock in a garden in Park Town, eating figs and drinking Italian wine, then went indoors as the sun went down to turn on the TV and watch history being made.

3. The Moving Toyshop by Edmund Crispin

A classic crime novel that brings a streak of surrealism to the genre. Featuring the anarchic English literature don Gervase Fen, the mystery gets under way when a visiting poet finds a dead body in a toyshop in the middle of the night. By morning, it’s been transformed into a grocery store. Written with wit and brio, this is a clever, energetic romp that still entertains.

4. An Instance of the Fingerpost by Iain Pears

Set just after the Restoration, when conspiracies were rife, this epistolary novel features a quartet of unreliable narrators giving their versions of the same series of events. Cleverly constructed and completely fascinating, it’s loosely based on historical happenings and is crammed with fascinating period detail. It’s as much a novel of ideas as it is of character, but none the less compelling for that.

5. Zuleika Dobson by Max Beerbohm

A mad fantasy, subtitled “An Oxford love story”, this is a satire on the sheltered world of Oxford colleges a century ago. Zuleika, granddaughter of the warden of Judas College, is a conjuror whose charms bewitch all the men who come into contact with her. Rejection drives them to mass suicide and Zuleika sets her sights on Cambridge. Beerbohm’s a class act whose wit makes this still worth a read.

6. To Say Nothing of the Dog by Connie Willis

A science-fiction fantasy dressed in the vestments of a Victorian novel, complete with epigraphs, chapter outlines and sidelong nods to Dorothy L Sayers, Conan Doyle, Jerome K Jerome and Wilkie Collins. There’s time travel; a McGuffin (the bishop’s bird stump); a Gothic villainess (Lady Schrapnell); and enough fun and games to fill a rainy weekend.

7. Lyra’s Oxford by Philip Pullman

Strictly speaking, a short story, but an irresistible add-on to the His Dark Materials trilogy. It takes place two years after the trilogy, in the alternate Oxford introduced in Northern Lights. The story itself is intriguing but slight; its main interest comes from the extras that accompany it – a map of Lyra’s Oxford, adverts and tourist information from her universe. An amusing divertissement, but still, you should read the trilogy …

8. Dirty Tricks by Michael Dibdin

No one has ever cast a colder eye on respectability than Michael Dibdin. Here, a north Oxford couple’s perfect life is shattered when a dinner guest seduces the wife in her own kitchen. This triggers a series of escalating events that strip bare the superficiality of their lives and end in ruthless murder. Weaving a terrifying thread of sex and violence, this is a brilliant and satisfying thriller.

9. The Lessons by Naomi Alderman

A recent addition to the canon of Oxford fiction, Alderman’s second novel gives a tip of the hat to Brideshead, featuring its own version of a more contemporary gilded youth and an updated take on the grip of the church and its consequences. Alderman is a gifted, witty writer and The Lessons is a sharp, insightful overview of a journey that starts out hopeful and ends horrible.

10. Gaudy Night by Dorothy L Sayers

I am no lover of Sayers – I find her style overblown, her snobbishness irritating and Lord Peter Wimsey infuriating – but no list of Oxford fiction would be complete without Gaudy Night. So I will cheat and quote my fellow crime writer Andrew Taylor: “She tried to use a detective story both as a vehicle for serious themes — the value of scholarship, and the price it exacts — and as a novel of character and manners with an attendant love story. It is a book that has given some of its readers their first glimpse of the intellectual excitement a university can offer.”



The Best of Enemies

Leaked documents, dirty tricks, nasty rumors: Richard Nixon and Jack Anderson deserved each other.

The CliffsNotes version of The Fall of Richard Nixon is straightforward enough: The most corrupt president in American history was brought down by courageous young newspaper muckrakers who rescued the republic. But there is a supple revisionist narrative that adds more than a few layers of complexity to the established account and makes the tale much more interesting.

In “Poisoning the Press,” Mark Feldstein tells the story of the long, feral struggle between two poor, driven boys born 30 miles apart in the West who grew up to be Richard Nixon and Jack Anderson. A protégé of the columnist Drew Pearson and a devout Mormon, Anderson tormented Nixon, a fighting (non-pacifist) Quaker, throughout his 30-year political career and, Mr. Feldstein says, taught Nixon some of the dirty tricks that would later destroy his presidency.

In Mr. Feldstein’s telling, it’s hard to decide whether Nixon or Anderson was the greater rogue. For every one of Nixon’s well-known crimes against the republic, there turns out to be an equal and opposite crime against journalism by Anderson.

Dirty reporting tricks were Anderson’s M.O. He bugged the hotel room of the notorious Bernard Goldfine, a Boston textile manufacturer who had bestowed an Oriental rug and a vicuna overcoat on President Eisenhower’s starchy chief of staff, Sherman Adams. Anderson even rooted through J. Edgar Hoover’s garbage searching for evidence that the G-man and his handsome deputy, Clyde Tolson, were lovers but found nothing more provocative than empty bottles of the antacid Gelusil. Anderson splashed stolen documents in his syndicated column, “Washington Merry-Go-Round,” routinely made false accusations of homosexuality and drunkenness—and took payoffs from a mob-connected fixer.

Starting as a legman for the patrician, ruthless Pearson and then on his own, Anderson drew first blood on most of the scandals that tainted Nixon almost from the start. There was the infamous $205,000 “loan”—$1.6 million today—that was passed to Nixon through his squirrelly brother Donald by Howard Hughes right after Nixon was re-elected as Eisenhower’s vice president in 1956. In just the first three months of 1972, Anderson broke the stories of the Nixon administration’s secret support for Pakistan on the eve of the India-Pakistan war; an additional $100,000 payoff from Hughes; the fixing of an antitrust case against the conglomerate ITT in return for a $400,000 pledge to underwrite the 1972 Republican National Convention; and the CIA plot against Salvador Allende, the Marxist president of Chile. Stolen or leaked secret documents fueled each Anderson scoop.

Nixon absorbed the lessons and fought back. He tried to insinuate a spy into Anderson’s staff, arranged for counterfeit secret documents to be slipped to the columnist, even had the CIA dog him in an episode straight from the Keystone Kops. Nothing worked, and the president became so enraged by Anderson’s relentless snooping that he uttered his own version of Henry II’s famous death sentence for Thomas à Becket: “Will nobody rid me of this turbulent priest?”

After a hideaway chat with Nixon, his consigliere, Chuck Colson, concluded that it was imperative “to stop Anderson at all costs.” Soon plumbers E. Howard Hunt and G. Gordon Liddy were meeting with a CIA poison expert to explore slipping LSD to Anderson so that he would trip out while driving and die in a car crash. According to Mr. Feldstein, Liddy even volunteered to stab Anderson to death or break his neck in what would look like a street mugging before the hit was finally shelved as impractical.

Cataloging Nixon’s villainies, Mr. Feldstein, a TV newsman turned academic, mines fresh treasures from the president’s trove of secret Oval Office tapes. Nixon is even more foul-mouthed than we remember—and weirder.

At one point, the president orders his men to find out whether Anderson is the gay lover of a Navy yeoman who leaked to Anderson the secret documents proving the U.S. tilt to Pakistan. After all, Nixon says, Whittaker Chambers and Alger Hiss were romantically entangled: “They were both—that way.” Earlier he lectures his aides on how “homosexuality destroyed” Greece and Rome. “Aristotle was a homo. We all know that,” he explains. “So was Socrates. You know what happened to the Romans? The last six emperors were fags.”

Nixon’s path to Watergate was predictable, Mr. Feldstein suggests, given his character and his conviction that he had to defeat his implacable political and press enemies by any means. During the 1968 campaign, Pearson and Anderson prophesied that a President Nixon would “revert to type,” create “dossiers on all potential rivals” and direct “personal goons” to do his dirty work.

They were right, but Jack Anderson, the crack sleuth, blew the biggest Nixon scandal of all. He had a tip about a Republican espionage operation against the Democratic National Committee offices in the Watergate hotel. The columnist even ran into one of the burglars—a Cuban he knew—at the Washington airport a few hours before the break-in. But Anderson didn’t work the tip hard and didn’t pursue the Cuban, even though the man, before dashing off, blurted that he was on “top secret” business.

So a couple of young reporters named Woodward and Bernstein cultivated “Deep Throat” and carried Anderson’s crusade to Nixon’s doom. Seconding W. Joseph Campbell’s recent book, “Getting It Wrong,” Mr. Feldstein astutely notes: “All mythmaking to the contrary, Watergate journalism was largely derivative, reporting on investigations that were already under way before news outlets began covering them.”

He nails the baleful Nixon-Anderson legacy, too. “The ghosts of Richard Nixon and Jack Anderson continue to haunt Washington long after their departure,” he concludes. “The poisoning of politics and the press that marked their careers has tainted governance and public discourse ever since.”

Mr. Kosner is the author of “It’s News to Me,” a memoir of his career as the editor of Newsweek, New York magazine, Esquire and the New York Daily News.



Democrats Run From Pelosi

And the GOP prepares its ‘Pledge to America.’

Sometimes the impending loss of power can cause people to say strange things. Consider House Speaker Nancy Pelosi, who told reporters last week, “I don’t really even have the time to pay attention” to the attacks on her. “This is what campaigns are about. I sort of, like, thrive on them.”

Really? It’s hard to imagine Mrs. Pelosi likes the ads run by at least seven Democratic House incumbents distancing themselves from her agenda, such as the stimulus, cap and trade, and ObamaCare. Or the comments in recent weeks by Reps. Chet Edwards (a trusted Texas lieutenant), Heath Shuler (North Carolina) and Zack Space (Ohio), all of whom declined to support her re-election, saying they don’t even know who will run for speaker. Does she appreciate Alabama Rep. Bobby Bright, who said late last month, “Heck, she might even get sick and die”?

Mrs. Pelosi also faces an uprising by 37 House Democrats who back extending all the Bush tax cuts. Most of them signed a letter on Sept. 15 saying “given the continued fragility of our economy and slow pace of recovery . . . raising any taxes right now could negatively impact economic growth.” With 179 Republicans in the House, just one more Democratic defection and there could be a majority for continuing the Bush tax cuts right now.

There is similar discontent among Senate Democrats. It appears impossible that Majority Leader Harry Reid can pass any tax bill. Senate Finance Chairman Max Baucus is rumored to be unveiling his proposal within days, but no one seems to know what will be in it. There have been no substantive discussions among the finance committee’s members, a precondition for any sincere attempt to legislate.

Meanwhile, the president refuses to provide his own proposal. This is especially disappointing given that Mr. Obama’s budget requires that the $3 trillion of Bush tax cuts he favors be offset by tax increases. So whose trillions of oxen does Mr. Obama want to gore with higher taxes just 40 days before the election? He won’t say, proving he’s not really serious about resolving his tax mess now.

Instead, he’s content to ensnare Democrats in a losing game by asking them to extend the Bush tax cuts before they adjourn—only for those making less than $250,000. But with less than two weeks before Congress adjourns, Democrats can’t pass a tax cut through either chamber.

So why are they even trying to take it up now? It will leave the president and Democratic lawmakers looking disorganized, incompetent and impotent. No wonder Sen. Dianne Feinstein (D., Calif.) questioned the sanity of Democratic leaders. “I don’t know who takes a tax vote, in their right mind, just before an election,” she told the Daily Caller on Tuesday.

Mrs. Feinstein knows of what she speaks. Depending on how the question is asked, polls show as many as two out of every three Americans want to continue the Bush tax cuts and oppose raising taxes on anyone right now because of the feeble economy.

Still, Democrats have achieved something significant. Just before a crucial election, they have cemented their party’s reputation as tax-happy.

Given this ineptness, there will be a temptation for Republicans to ease up, say little of substance, and play out the clock. But in politics, it is never wise to count on the opposition to keep making mistakes. Democrats will get their act together sometime.

Republicans must reinvigorate the national conversation about jobs and economic growth, the stimulus, spending, deficits and ObamaCare, and then present constructive proposals of their own to meet the nation’s challenges.

That’s why today’s release of the House GOP’s “Pledge to America” is so important. It presents practical steps to create jobs, control spending, repeal ObamaCare, reform Washington and keep America secure. Much of it is embodied in legislation that can be voted on right now.

The only thing Congress must do before it leaves town is fund the government. The “Pledge” would freeze the tax code for two years and fund non-defense spending at 2008 levels—before the bailouts and stimulus. Mrs. Pelosi would lose if this were voted upon, even with her current huge majority. So it’s unlikely she’ll allow the GOP proposal to be considered. But she can’t stop Republicans from making their point on spending and taxes.

What’s brought Republicans so close to victory are their deep differences with Democrats. Now’s the time to emphasize those policy disagreements in every way possible. Keeping the fight on the big issues will strengthen the powerful current that’s set to sweep Democrats from office.

Mr. Rove, the former senior adviser and deputy chief of staff to President George W. Bush, is the author of “Courage and Consequence” (Threshold Editions, 2010).



Uncommon knowledge

Simple steps to happier politics

It’s easy to be discouraged by our polarized political environment. A new study suggests there may be an easy way out. Right before the 2008 presidential election, prospective voters were asked to complete an online survey. Some of the participants were assigned a brief self-affirmation exercise, where they had to choose the personal trait (from a list of 10) that was most important to them and write a sentence or two explaining that choice. Other participants encountered the same list but had to choose the trait that was least important and explain why someone else might find it important. All participants then viewed video clips from the last presidential debate. Those who were “affirmed,” who wrote about what was most important to them, moderated their partisan views of Obama, with Republicans becoming less harshly critical and Democrats less gushing in their enthusiasm. Even more surprising, though, was that this pattern held up after the election: When the researchers e-mailed Republicans 10 days after the election, the affirmed Republicans had a significantly more favorable outlook on Obama’s presidency. So maybe Senator Al Franken had the right idea with his famous Saturday Night Live skit “Daily Affirmation with Stuart Smalley.”

Binning, K. et al., “Seeing the Other Side: Reducing Political Partisanship via Self-Affirmation in the 2008 Presidential Election,” Analyses of Social Issues and Public Policy (forthcoming).

Green = weak?

Marketers may assume that “green” products are more appealing to consumers, especially to environmentally conscious consumers. But according to a recent study, green branding sends a signal that can undermine other essential features of a product. Specifically, green products tend to be associated with gentleness, not strength. For example, people were more interested in eco-friendly baby shampoo than eco-friendly car shampoo, tires, or laundry detergent. The researchers also found a similar effect in an experiment with hand sanitizer during flu season. They put two bottles of sanitizer — one was green-colored and labeled “eco-friendly,” while the other was just clear and labeled “regular” — on a table. If they knew they were being watched, most people used the green version, but if no one seemed to be watching, most people used the regular version.

Luchs, M. et al., “The Sustainability Liability: Potential Negative Effects of Ethicality on Product Preference,” Journal of Marketing (September 2010).

The problem with talking about it

A common refrain in conflicts is to “talk it out.” While this may be effective in certain situations, some forms of talk may make the problem worse. A psychologist at Princeton University conducted an experiment in war-torn eastern Congo with a radio-broadcast soap opera designed to reduce ethnic hostility. In some broadcast areas, the soap opera was followed by a 15-minute talk show. After a year of the broadcast, researchers interviewed a large sample of Congolese in the listening area. Although the talk show had the intended effect of increasing discussion among listeners, it also had the unintended effect of increasing intolerance. Apparently, the talk show provoked more contentious discussion and made people even more aware of ethnic grievances.

Paluck, E., “Is It Better Not to Talk? Group Polarization, Extended Contact, and Perspective Taking in Eastern Democratic Republic of Congo,” Personality and Social Psychology Bulletin (September 2010).

Who prays, stays

In a previous column, I wrote about a study showing that prayer can reduce alcohol consumption. The researchers behind that study have now come out with a study showing that prayer can curtail another vice: infidelity. Among a sample of several hundred college students, those who reported praying more for their partner were less likely to report cheating six weeks later. Of course, this pattern could simply mean that the kind of people who pray don’t tend to be the kind of people who cheat. So the researchers randomly assigned students to pray (in this case, for their partner) for four weeks. Compared to those who were assigned to undirected prayer or to think positive thoughts about their partner, praying for one’s partner reduced reported cheating behavior. The researchers videotaped a bunch of couples actually discussing the future of their relationship. Independent assessments of these discussions found greater commitment by those who had prayed for their partner.

Fincham, F. et al., “Faith and Unfaithfulness: Can Praying for Your Partner Reduce Infidelity?” Journal of Personality and Social Psychology (forthcoming).

Write unclearly

It almost goes without saying that one should write clearly. But that depends. According to a new study, if your goal is education, you may not want to write too clearly. In one experiment, people read a short story by Mark Twain that was printed in a font that was either easy or difficult to read; the story was also presented as either a “Historical Analysis Study” or a “Short Story Study.” When read as a short story for enjoyment, the story was rated better in the easy-to-read font, but, as a historical analysis, the story was rated better in the hard-to-read font. In another experiment, while reading the same Twain story, some people were asked to furrow their brow, an action that has been shown to induce the perception of complexity. Among those who furrowed their brow, the story was rated better when read as a historical analysis, but worse when read for enjoyment.

Galak, J. & Nelson, L., “The Virtues of Opaque Prose: How Lay Beliefs about Fluency Influence Perceptions of Quality,” Journal of Experimental Social Psychology (forthcoming).

Kevin Lewis is an Ideas columnist.



Lost libraries

The strange afterlife of authors’ book collections

A few weeks ago, Annecy Liddell was flipping through a used copy of Don DeLillo’s ”White Noise” when she saw that the previous owner had written his name inside the cover: David Markson. Liddell bought the novel anyway and, when she got home, looked the name up on Wikipedia.

Markson, she discovered, was an important novelist himself–an experimental writer with a cult following in the literary world. David Foster Wallace considered Markson’s ”Wittgenstein’s Mistress”–a novel that had been rejected by 54 publishers–”pretty much the high point of experimental fiction in this country.” When it turned out that Markson had written notes throughout Liddell’s copy of ”White Noise,” she posted a Facebook update about her find. ”i wanted to call him up and tell him his notes are funny, but then i realized he DIED A MONTH AGO. bummer.”

The news of Liddell’s discovery quickly spread through Facebook and Twitter’s literary districts, and Markson’s fans realized that his personal library, about 2,500 books in all, had been sold off and was now anonymously scattered throughout The Strand, the vast Manhattan bookstore where Liddell had bought her book. And that’s when something remarkable happened: Markson’s fans began trying to reassemble his books. They used the Internet to coordinate trips to The Strand, to compile a list of their purchases, to swap scanned images of his notes, and to share tips. (The easiest way to spot a Markson book, they found, was to look for the high-quality hardcovers.) Markson’s fans told stories about watching strangers buy his books without understanding their origin, even after Strand clerks pointed out Markson’s signature. They also started asking questions, each one a variation on this: How could the books of one of this generation’s most interesting novelists end up on a bookstore’s dollar clearance carts?

What Markson’s fans had stumbled on was the strange and disorienting world of authors’ personal libraries. Most people might imagine that authors’ libraries matter–that scholars and readers should care what books authors read, what they thought about them, what they scribbled in the margins. But far more libraries get dispersed than saved. In fact, David Markson can now take his place in a long and distinguished line of writers whose personal libraries were quickly, casually broken down. Herman Melville’s books? One bookstore bought an assortment for $120, then scrapped the theological titles for paper. Stephen Crane’s? His widow died a brothel madam, and her estate (and his books) were auctioned off on the steps of a Florida courthouse. Ernest Hemingway’s? To this day, all 9,000 titles remain trapped in his Cuban villa.

The issues at stake when libraries vanish are bigger than any one author and his books. An author’s library offers unique access to a mind at work, and its treatment provides a look at what exactly the literary world decides to value in an author’s life. John Wronoski, a longtime book dealer in Cambridge, has seen the libraries of many prestigious authors pass through his store without securing a permanent home. ”Most readers would see these names and think, ’My god, shouldn’t they be in a library?’” Wronoski says. ”But most readers have no idea how this system works.”

The literary world is full of treasures and talismans, not all of them especially literary–a lock of Byron’s hair has been sold at auction; Harvard has archived John Updike’s golf score cards.

For private collectors and university libraries, though, the most important targets are manuscripts and letters and research materials–what’s collectively known as an author’s papers–and rare, individually valuable books. In the first category, especially, things can get expensive. The University of Texas’s Harry Ransom Center recently bought Bob Woodward and Carl Bernstein’s papers for $5 million and Norman Mailer’s for $2.5 million. Compared to the papers, the author’s own library takes a back seat. ”An author’s books are important,” says Tom Staley, the Ransom Center’s director, ”but they’re no substitute for the manuscripts and the correspondence. The books are gravy.”

Updike would seem to have agreed. After his death in 2009, Harvard’s Houghton Library bought Updike’s archive, more than 125 shelves of material that he assembled himself. Updike chose to include 1,500 books, but that number is inflated by his own work–at least one copy of every edition of every book in every language it was issued. ”He was not so comprehensive in the books that he read,” says Leslie Morris, Harvard’s curator for the Updike archive. In fact, Updike was known to donate old books to church book sales and to hand them out to friends’ wives. Late in life, he made a deal with Mark Stolle, who owns a bookstore in Manchester-by-the-Sea. ”He would call me once his garage was filled,” Stolle remembers, ”and I would go over and buy them.”

While he didn’t seem to value them, Updike’s books begin to show how and why an author’s library does matter. In his copy of Tom Wolfe’s ”A Man in Full,” which was one of Stolle’s garage finds, Updike wrote comments like ”adjectival monotony” and ”semi cliché in every sentence.” A comparison with Updike’s eventual New Yorker review suggests that authors will write things in their books that they won’t say in public.

An author’s library, like anyone else’s, reveals something about its owner. Mark Twain loved to present himself as self-taught and under-read, but his carefully annotated books tell a different story. Books can offer hints about an author’s social and personal life. After David Foster Wallace’s death in 2008, the Ransom Center bought his papers and 200 of his books, including two David Markson novels that Wallace not only annotated, but also had Markson sign when they met in New York in 1990. Most of all, though, authors’ libraries serve as a kind of intellectual biography. Melville’s most heavily annotated book was an edition of John Milton’s poems, and it proves he reread ”Paradise Lost” while struggling with ”Moby-Dick.”

And yet these libraries rarely survive intact. The reasons for this can range from money problems to squabbling heirs to poorly executed auctions. Twain’s library makes for an especially cringe-worthy case study because, unlike a lot of now-classic authors, he saw no ebb in his reputation–and, thus, there was no excuse for the careless handling of his books. In 1908, Twain donated 500 books to the library he helped establish in Redding, Conn. After Twain’s death in 1910, his daughter, Clara, gave the library another 1,700 books. The Redding library began circulating Twain’s books, many of which contained his notes, and souvenir hunters began cutting out every page that had Twain’s handwriting. This was bad enough, but in the 1950s the library decided to thin its inventory, unloading the unwanted books on a book dealer who soon realized he now possessed more than 60 titles annotated by Mark Twain. Today, academic libraries across the country own Twain books in which ”REDDING LIBRARY” has been stamped in purple ink.

But the 1950s also marked the start of a shift in the way many scholars and librarians appraised an author’s books. They began trying to reassemble the most famous authors’ libraries–or, in worst-case scenarios like Twain’s, to compile detailed lists of every book a writer had owned. The effort and ingenuity behind these lists can be astounding, as scholars will sift through diaries, receipts, even old library call slips. A good example is Alan Gribben’s ”Mark Twain’s Library: A Reconstruction,” which runs to two volumes and took nine years to complete.

This raises an obvious question: Why not make the list of an author’s books before dispersing them? The answer, usually, is time. Book dealers, Wronoski says, can’t assemble scholarly lists while also moving enough inventory to stay in business. When Wallace’s widow and his literary agent, Bonnie Nadell, sorted through his library, they sent only the books he had annotated to the Ransom Center. The others, more than 30 boxes’ worth, they donated to charity. There was no chance to make a list, Nadell says, because another professor needed to move into Wallace’s office. ”We were just speed skimming for markings of any kind.”

Still, the gap between the labor required on the front end and the back end can make such choices seem baffling and even–a curious charge to make when discussing archives–short-sighted. Libraries, for their part, must also allocate limited resources, and they do so based on a calculus of demand, precedent, and prestige. This means the big winners are historical authors (in the 1980s, Melville’s copy of Milton sold at an auction for $100,000) and those who fit into a library’s targeted specialties. ”We tend to focus on Harvard-educated authors,” Morris says. ”The Houghton Library is pretty much full and has been for the last 10 years.”

In David Markson’s case, the easiest explanation for why his books ended up at The Strand is that he wanted them to. Markson, who lived near the bookstore, would stop by three or four times a week. The Strand, in turn, hosted his book signings and maintained a table of his books, and Markson’s daughter, Johanna, says he frequently told her in his final years to take his books to The Strand. ”He said they’d take good care of us,” she says.

And so, after Johanna and her brother saved some books that were important to them–”I want my children to see what kind of reader their grandfather was,” Johanna says–a truck from The Strand picked up the rest, 63 boxes in all. Fred Bass, The Strand’s owner, says he had to break Markson’s library apart because of the size of his operation. ”We do it with most personal libraries,” Bass says. ”We don’t have room to set up special collections.”

Markson had sold books to The Strand before. In fact, over the years, he sold off his most valuable books and even small batches of his literary correspondence simply to make ends meet. Markson recalled in one interview that, when he asked Jack Kerouac to sign a book for him, Kerouac was so drunk he stabbed the pen through the front page. Bass said he personally looked through Markson’s books hoping to find items like this. ”But David had picked it pretty clean.”

Selling his literary past became a way for Markson to sustain his literary future. In ”Wittgenstein’s Mistress” and the four novels that followed, Markson abandoned characters and plots in favor of meticulously ordered allusions and historical anecdotes–a style he called ”seminonfictional semifiction.” That style, along with the skill with which he prosecuted it, explains both the size and the passion of Markson’s audience.

Markson’s late style also explains the special relevance of his library, and it’s a wonderful twist that these elements all came together in the campaign to crowdsource it. Through a Facebook group and an informal collection of blog posts, Markson’s fans have put together a representative sample of his books. The results won’t satisfy the scholarly completist, but they reveal the range of Markson’s reading–not just fiction and poetry, but classical literature, philosophy, literary criticism, and art history. They also illuminate aspects of Markson’s life (one fan got the textbooks Markson used while a graduate student) and his art (another got his copy of ”Foxe’s Book of Martyrs,” where Markson had underlined passages that resurface in his later novels). Most of all, they capture Markson’s mind as it plays across the page. In his copy of ”Agape Agape,” the final novel from postmodern wizard William Gaddis, Markson wrote: ”Monotonous. Tedious. Repetitious. One note, all the way through. Theme inordinately stale + old hat. Alas, Willie.”

Markson’s letters to and from Gaddis were one of the things he sold off–they’re now in the Gaddis collection at Washington University–but Johanna Markson says he left some papers behind. ”He always told us, ’When I die, that’s when I’ll be famous,’” she says, and she’s saving eight large bins full of Markson’s edited manuscripts, the note cards he used to write his late novels, and his remaining correspondence. A library like Ohio State’s, which specializes in contemporary fiction, seems like a good match. In fact, Geoffrey Smith, head of Ohio State’s Rare Books and Manuscripts Library, says he would have liked to look at Markson’s library, in addition to his papers. ”We would have been interested, to say the least,” Smith says.

But if Markson’s library–and a potential scholarly foothold–has been lost, other things have been gained. A dead man’s wishes have been honored. A few fans have been blessed. And an author has found a new reader. ”I’m glad I got that book,” Annecy Liddell says. ”I really wouldn’t know who Markson is if I hadn’t found that. I haven’t finished ‘White Noise’ yet but I’m almost done with ‘Wittgenstein’s Mistress’–it’s weird and great and way more fun to read.”

Craig Fehrman is working on a book about presidents and their books.



The me-sized universe

Some parts of the cosmos are right within our grasp

If you happen to think about the universe during the course of your day, you will likely be overwhelmed.

The universe seems vast, distant, and unknowable. It is, for example, unimaginably large and old: The number of stars in our galaxy alone exceeds 100 billion, and the Earth is 4.5 billion years old. In the eyes of the universe, we’re nothing. We humans are tiny and brief. And much of the physics that drives the universe occurs on the other end of the scale, almost inconceivably small and fast. Chemical changes can occur faster than the blink of an eye, and atoms make the head of a pin seem like a mountain (really more like three Mount Everests).

Clearly, our brains are not built to handle numbers on this astronomical scale. While we are certainly a part of the cosmos, we are unable to grasp its physical truths. To call a number astronomical is to say that it is extreme, but also, in some sense, unknowable. We may recognize our relative insignificance, but leave dwelling on it to those equipped with scientific notation.

However, there actually are properties of the cosmos that can be expressed at the scale of the everyday. We can hold the tail of this beast of a universe–even if only for a moment. Shall we try?

Let’s begin at the human scale of time: It turns out that there is one supernova, a cataclysmic explosion of a star that marks the end of its life, about every 50 years in the Milky Way. The interval between these stellar explosions fits comfortably within the life span of a single person, and not even a particularly long-lived one. So throughout human history, each person has likely been around for one or two of these bursts that can briefly burn brighter than an entire galaxy.

New stars, on the other hand, are formed in our galaxy at a faster rate, but one that is still nice and manageable: about seven new stars in the Milky Way each year. So, over the course of an average American lifetime, each of us will have gone about our business while nearly 550 new stars were born.
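That figure follows from simple multiplication; a quick sketch, assuming the seven-stars-per-year rate above and a US life expectancy of about 78 years (the life-expectancy figure is my assumption, not from the article):

```python
# New stars born in the Milky Way during one average American lifetime,
# assuming ~7 new stars per year and a ~78-year life expectancy.
STAR_FORMATION_RATE = 7   # stars per year
LIFE_EXPECTANCY = 78      # years

print(STAR_FORMATION_RATE * LIFE_EXPECTANCY)  # 546 -- "nearly 550"
```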

But stars are always incomprehensibly large, right? Well, not always. Sometimes, near the end of a star’s life, it doesn’t explode. Instead, it collapses in on itself. Some of these are massive enough to become black holes, where space and time become all loopy. But just short of that, some stars collapse and become massive objects known as neutron stars. While these stars have incredible gravitational fields and can be detected from distances very far away, they are actually not very large. They are often only about 12 miles in diameter, which is about the distance from MIT to Wellesley College. While a neutron star’s mass is about 500,000 times that of the Earth, the star itself is actually very easy to picture, at least in terms of size.

Moving to the other end of the size spectrum, hydrogen atoms are unbelievably small: You would need to line up over 10 billion of them in a row to reach the average adult arm span. However, the wavelength of the energy a neutral hydrogen atom releases is right in our comfort zone: about 21 centimeters (or 8 inches). This is only about one-eighth the average height of a human being. This fact was even encoded pictorially on the plaques on the Pioneer probes, in order to show human height to any extraterrestrials that might eventually find these probes now hurtling out of the solar system, and who might be interested in how big we are.

And let’s not forget energy, though it might seem hard to find energetic examples on the human scale. For example, the sun, a fairly unimpressive star, releases over 300 yottajoules of energy each second, where yotta- is the highest prefix created in the metric system and is a one followed by 24 zeroes. Nonetheless, there are energy quantities we can handle. The most energetic cosmic rays–highly energetic particles of mysterious origin that come from somewhere deep in space–have about the same amount of energy as a pitcher throwing a baseball at 60 miles per hour. This is the low end of the speed of a knuckleball, which is one of the slowest pitches in baseball. While the fact that a tiny subatomic particle has that much energy is truly astounding, it’s no Josh Beckett fastball.
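The baseball comparison is easy to verify with back-of-the-envelope arithmetic; a quick sketch, assuming a regulation 145-gram baseball and the roughly 3 × 10²⁰ electron volts measured for the most extreme cosmic-ray events (both specific figures are my assumptions, not from the article):

```python
# Kinetic energy of a 60 mph baseball vs. one of the most energetic
# cosmic rays ever detected (~3e20 eV -- an assumed, representative figure).
MASS_KG = 0.145          # regulation baseball
MPH_TO_MS = 0.44704      # miles per hour -> meters per second
EV_TO_J = 1.602e-19      # electron volts -> joules

baseball_j = 0.5 * MASS_KG * (60 * MPH_TO_MS) ** 2
cosmic_ray_j = 3e20 * EV_TO_J

print(f"baseball: {baseball_j:.0f} J, cosmic ray: {cosmic_ray_j:.0f} J")
# baseball: 52 J, cosmic ray: 48 J
```

The two numbers land within about 10 percent of each other, which is all the comparison claims.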

While these examples might seem few and far between, there is good news: The universe is actually becoming less impersonal. Through science and technology, we are getting better at bringing cosmic quantities to the human scale. For example, the number of stars in our Milky Way galaxy is less than half the total number of bits that can be stored on a Blu-ray disc. The everyday is slowly but surely inching towards the cosmic.
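The Blu-ray comparison also checks out with simple arithmetic; a quick sketch, assuming roughly 100 billion stars and a 50 GB dual-layer disc (the disc capacity is my assumption):

```python
# Stars in the Milky Way vs. bits on a dual-layer Blu-ray disc,
# assuming ~100 billion stars and a 50 GB (4e11-bit) disc.
STARS = 100e9
DISC_BYTES = 50e9
disc_bits = DISC_BYTES * 8

print(STARS / disc_bits)  # 0.25 -- the star count is a quarter of the bits
```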

Yes, the universe is big and we are small. But we must treasure the exceptions, and see a little bit of the human in the cosmic, even if only for a moment.

Samuel Arbesman is a postdoctoral fellow in the Department of Health Care Policy at Harvard Medical School and is affiliated with the Institute for Quantitative Social Science at Harvard University. He is a regular contributor to Ideas.



Limning a controversy

Hate that headline? You have company

It is probably a bit too harsh to call those upset by The Baltimore Sun’s recent use of the word limn in a headline word-haters, but I assume they’d be even more offended by the fancy word misologists.

If you didn’t catch the (admittedly brief) controversy, it went a bit like this. On Sept. 7, The Baltimore Sun used the word limn in a front-page headline (“Opposing votes limn difference in race”). That same day, Carol N. Shaw sent a letter to the editor complaining about the paper’s use of the word, calling it “unbelievably arrogant and patronizing” to use a word that she, having graduated magna cum laude and Phi Beta Kappa from the University of Maryland, didn’t immediately understand.

Although the Sun has used the word limn twice before in headlines (and 47 times, total, in the paper’s history), those previous uses didn’t occasion much, if any, comment. The Sun’s level-headed and pragmatic grammar and usage blogger, John McIntyre, supported the use of limn in the headline, especially as it’s one of the limited stock of short verbs in English that are (as he put it) “neither scatological nor obscene.”

At first glance, it’s hard to see why limn should be considered verba non grata: It’s related, etymologically, to illuminate, and has been in use in English since the 1400s, at first to mean “to paint with gold or bright color” (as in illuminated manuscripts) and then (metaphorically) to mean painting a picture in words. That metaphorical use has proven to be irresistible to book reviewers, especially: Michiko Kakutani, the book reviewer for The New York Times, has been criticized for overuse of limn.

It’s not very frequent, but limn isn’t any more specialized or opaque than the words burgeon or kiosk; all three were estimated by the Living Word Vocabulary (a 1981 vocabulary study) to be understood by less than a third of college graduates.

Ben Zimmer, writing on the Visual Thesaurus website, pointed out that limn, in particular, has come in for more than its fair share of abuse over the years: Michael Dirda, The Washington Post book critic, has called it an example of an “ugly, pushy” word; writer Ben Yagoda called it a word that has “never been said aloud in the history of English”; and David Foster Wallace admitted that limn could seem “just off-the-charts pretentious.” William Safire, back in 2002, called limn a “vogue word” and gave it a life span of “six more months.” (Here at the Globe, Page One editor Charles Mansbach says he’d avoid limning anything in a headline: “It probably would baffle too many readers.”)

Leaving aside for the moment the question of whether limn is a reasonable word to expect readers to understand, the interesting part of this controversy is how clearly it divides people into two groups: those who feel intrigued and excited when they encounter a new word, and those who feel irritated or defensive.

Some of the latter would explain their irritation in terms of efficiency: why add another hurdle to comprehension by throwing in a word you’re not confident your audience will understand? And it’s true — a word that sticks out can distract a reader to the point of ignoring everything else that’s been written. But the dogged pursuit of the most-widely-understood word can leave both precision and elegance behind, resulting in colorless and boring writing.

Those who feel defensive are almost certainly reacting to years of assertions by word-lovers that a large vocabulary is a sign of an educated and cultured person. Hundreds of books and thousands of websites imply that a large vocabulary is the ticket to success in business and life. (One list of increase-your-vocabulary books states baldly: “It’s useless to be intelligent if you cannot express those ideas”; another suggests that a large vocabulary is necessary to be accepted as a “mature person.”) After all that hype, why are we surprised when the use of an unusual word is felt to be an implicit criticism of those who don’t immediately recognize it, a slur on their education, intellect, maturity, and literacy?

And there’s no denying an element of showoffishness is present in many uses of rare words. It would be peculiar if the all-too-human desire for status — the motivation behind name-dropping, wearing luxury brands, listening to obscure bands, or checking in to velvet-rope places on Foursquare — didn’t manifest itself in word choice, as well.

It shouldn’t be too hard to broker a truce, here, though. If the word-lovers can agree to throw in an acknowledgment whenever we use a geason word — one that’s rare or extraordinary — and the word-avoiders can agree to be a little less impatient with us when we do (and not take it personally), then problem solved. And we can all just paint — or limn — a happier world.

Erin McKean is a lexicographer and founder of



Aren’t We Clever?

What a contrast. In a year that’s on track to be our planet’s hottest on record, America turned “climate change” into a four-letter word that many U.S. politicians won’t even dare utter in public. If this were just some parlor game, it wouldn’t matter. But the totally bogus “discrediting” of climate science has had serious implications. For starters, it helped scuttle Senate passage of the energy-climate bill needed to scale U.S.-made clean technologies, leaving America at a distinct disadvantage in the next great global industry. And that brings me to the contrast: While American Republicans were turning climate change into a wedge issue, the Chinese Communists were turning it into a work issue.

“There is really no debate about climate change in China,” said Peggy Liu, chairwoman of the Joint U.S.-China Collaboration on Clean Energy, a nonprofit group working to accelerate the greening of China. “China’s leaders are mostly engineers and scientists, so they don’t waste time questioning scientific data.” The push for green in China, she added, “is a practical discussion on health and wealth. There is no need to emphasize future consequences when people already see, eat and breathe pollution every day.”

And because runaway pollution in China means wasted lives, air, water, ecosystems and money — and wasted money means fewer jobs and more political instability — China’s leaders would never go a year (like we will) without energy legislation mandating new ways to do more with less. It’s a three-for-one shot for them. By becoming more energy efficient per unit of G.D.P., China saves money, takes the lead in the next great global industry and earns credit with the world for mitigating climate change.

So while America’s Republicans turned “climate change” into a four-letter word — J-O-K-E — China’s Communists also turned it into a four-letter word — J-O-B-S.

“China is changing from the factory of the world to the clean-tech laboratory of the world,” said Liu. “It has the unique ability to pit low-cost capital with large-scale experiments to find models that work.” China has designated and invested in pilot cities for electric vehicles, smart grids, LED lighting, rural biomass and low-carbon communities. “They’re able to quickly throw spaghetti on the wall to see what clean-tech models stick, and then have the political will to scale them quickly across the country,” Liu added. “This allows China to create jobs and learn quickly.”

But China’s capability limitations require that it reach out for partners. This is a great opportunity for U.S. clean-tech firms — if we nurture them. “While the U.S. is known for radical innovation, China is better at tweak-ovation,” said Liu. Chinese companies are good at making a billion widgets at a penny each but not good at complex system integration or customer service.

We (sort of) have those capabilities. At the World Economic Forum meeting here, I met Mike Biddle, founder of MBA Polymers, which has invented processes for separating plastic from piles of junked computers, appliances and cars and then recycling it into pellets to make new plastic using less than 10 percent of the energy required to make virgin plastic from crude oil. Biddle calls it “above-ground mining.” In the last three years, his company has mined 100 million pounds of new plastic from old plastic.

Biddle’s seed money was provided mostly by U.S. taxpayers through federal research grants, yet today only his tiny headquarters are in the U.S. His factories are in Austria, China and Britain. “I employ 25 people in California and 250 overseas,” he says. His dream is to have a factory in America that would repay all those research grants, but that would require a smart U.S. energy bill. Why?

Americans recycle about 25 percent of their plastic bottles. Most of the rest ends up in landfills or gets shipped to China to be recycled here. Getting people to recycle regularly is a hassle. To overcome that, the European Union, Japan, Taiwan and South Korea — and next year, China — have enacted producer-responsibility laws requiring that anything with a cord or battery — from an electric toothbrush to a laptop to a washing machine — has to be collected and recycled at the manufacturers’ cost. That gives Biddle the assured source of raw material he needs at a reasonable price. (Because recyclers now compete in these countries for junk, the cost to the manufacturers for collecting it is steadily falling.)

“I am in the E.U. and China because the above-ground plastic mines are there or are being created there,” said Biddle, who just won The Economist magazine’s 2010 Innovation Award for energy/environment. “I am not in the U.S. because there aren’t sufficient mines.”

Biddle had enough money to hire one lobbyist to try to persuade the U.S. Congress to copy the recycling regulations of Europe, Japan and China in our energy bill, but, in the end, there was no bill. So we educated him, we paid for his tech breakthroughs — and now Chinese and European workers will harvest his fruit. Aren’t we clever?

Thomas L. Friedman, New York Times




My ebullient 4-year-old son, Blake, is a big fan of the CDs and DVDs that the band They Might Be Giants recently produced for the kiddie market. He’ll gleefully sing along to “Seven,” a catchy tune from their 2008 album “Here Come the 123s” that tells of a house overrun by anthropomorphic number sevens. The first one is greeted at the door: “Oh, there’s the doorbell. Let’s see who’s out there. Oh, it’s a seven. Hello, Seven. Won’t you come in, Seven? Make yourself at home.”

Despite the song’s playful surrealism (more and more sevens arrive, filling up the living room), the opening lines are routine and formulaic. The polite ritual of answering the door and inviting a guest into your house relies on certain fixed phrases in English: “Won’t you come in?” “Make yourself at home.”

As Blake learned these pleasantries through the song and its video, I wondered how much — or how little — his grasp of basic linguistic etiquette is grounded in the syntactical rules that structure how words are combined in English. An idiom like “Make yourself at home” is rather tricky if you stop to think about it: the imperative verb “make” is followed by a second-person reflexive pronoun (“yourself”) and an adverbial phrase (“at home”), but it’s difficult to break the phrase into its components. Instead, we grasp the whole thing at once.

Ritualized moments of everyday communication — greeting someone, answering a telephone call, wishing someone a happy birthday — are full of these canned phrases that we learn to perform with rote precision at an early age. Words work as social lubricants in such situations, and a language learner like Blake is primarily getting a handle on the pragmatics of set phrases in English, or how they create concrete effects in real-life interactions. The abstract rules of sentence structure are secondary.

In recent decades, the study of language acquisition and instruction has increasingly focused on “chunking”: how children learn language not so much on a word-by-word basis but in larger “lexical chunks” or meaningful strings of words that are committed to memory. Chunks may consist of fixed idioms or conventional speech routines, but they can also simply be combinations of words that appear together frequently, in patterns that are known as “collocations.” In the 1960s, the linguist Michael Halliday pointed out that we tend to talk of “strong tea” instead of “powerful tea,” even though the phrases make equal sense. Rain, on the other hand, is much more likely to be described as “heavy” than “strong.”

A native speaker picks up thousands of chunks like “heavy rain” or “make yourself at home” in childhood, and psycholinguistic research suggests that these phrases are stored and processed in the brain as individual units. As the University of Nottingham linguist Norbert Schmitt has explained, it is much less taxing cognitively to have a set of ready-made lexical chunks at our disposal than to have to work through all the possibilities of word selection and sequencing every time we open our mouths.

Cognitive studies of chunking have been bolstered by computer-driven analysis of usage patterns in large databases of texts called “corpora.” As linguists and lexicographers build bigger and bigger corpora (a major-league corpus now contains billions of words, thanks to readily available online texts), it becomes clearer just how “chunky” the language is, with certain words showing undeniable attractions to certain others.
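The kind of collocation-counting this corpus research relies on can be sketched in miniature. This toy example (a few invented sentences, not a real corpus tool) tallies adjacent word pairs to find which word most often follows “heavy”:

```python
from collections import Counter

# Toy collocation finder: tally adjacent word pairs (bigrams) in a
# miniature "corpus" to see which words attract which neighbors.
corpus = (
    "heavy rain fell all night . strong tea helps . "
    "the forecast called for heavy rain , not strong wind"
).split()

bigrams = Counter(zip(corpus, corpus[1:]))

# Words that follow "heavy", with counts -- the collocation "heavy rain"
# dominates, just as corpus studies find for real English.
following_heavy = {w2: n for (w1, w2), n in bigrams.items() if w1 == "heavy"}
print(following_heavy)  # {'rain': 2}
```

Real corpora run to billions of words, but the principle is the same: some pairings turn up far more often than chance would predict.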

Many English-language teachers have been eager to apply corpus findings in the classroom to zero in on salient chunks rather than individual vocabulary words. This is especially so among teachers of English as a second language, since it’s mainly the knowledge of chunks that allows non-native speakers to advance toward nativelike fluency. In his 1993 book, “The Lexical Approach,” Michael Lewis argued for making such chunks the centerpiece of language instruction, an approach elaborated in later titles like “From Corpus to Classroom: Language Use and Language Teaching” and “Teaching Chunks of Language: From Noticing to Remembering.”

Not everyone is on board, however. Michael Swan, a British writer on language pedagogy, has emerged as a prominent critic of the lexical-chunk approach. Though he acknowledges, as he told me in an e-mail, that “high-priority chunks need to be taught,” he worries that “the ‘new toy’ effect can mean that formulaic expressions get more attention than they deserve, and other aspects of language — ordinary vocabulary, grammar, pronunciation and skills — get sidelined.”

Swan also finds it unrealistic to expect that teaching chunks will produce nativelike proficiency in language learners. “Native English speakers have tens or hundreds of thousands — estimates vary — of these formulae at their command,” he says. “A student could learn 10 a day for years and still not approach native-speaker competence.”

Besides, Swan warns, “overemphasizing ‘scripts’ in our teaching can lead to a phrase-book approach, where formulaic learning is privileged and the more generative parts of language — in particular the grammatical system — are backgrounded.” Formulaic language is all well and good when talking about the familiar and the recurrent, he argues, but it is inadequate for dealing with novel ideas and situations, where the more open-ended aspects of language are paramount.

The methodology of the chunking approach is still open to this type of criticism, but data-driven reliance on corpus research will most likely dominate English instruction in coming years. Lexical chunks have entered the house of language teaching, and they’re making themselves at home.

Ben Zimmer will answer one reader question every other week.



The Pen That Never Forgets

In the spring, Cincia Dervishaj was struggling with a take-home math quiz. It was testing her knowledge of exponential notation — translating numbers like “3.87 x 10²” into a regular form. Dervishaj is a 13-year-old student at St. John’s Lutheran School in Staten Island, and like many students grappling with exponents, she got confused about where to place the decimal point. “I didn’t get them at all,” Dervishaj told me in June when I visited her math class, which was crowded with four-year-old Dell computers, plastic posters of geometry formulas and a big bowl of Lego bricks.

To refresh her memory, Dervishaj pulled out her math notebook. But her class notes were not great: she had copied several sample problems but hadn’t written a clear explanation of how exponents work.

She didn’t need to. Dervishaj’s entire grade 7 math class has been outfitted with “smart pens” made by Livescribe, a start-up based in Oakland, Calif. The pens perform an interesting trick: when Dervishaj and her classmates write in their notebooks, the pen records audio of whatever is going on around it and links the audio to the handwritten words. If her written notes are inadequate, she can tap the pen on a sentence or word, and the pen plays what the teacher was saying at that precise point.

Dervishaj showed me how it works, flipping to her page of notes on exponents and tapping a set of numbers in the middle of the page. Out of a tiny speaker in the thick, cigar-shaped pen, I could hear her teacher, Brian Licata, explaining that precise problem. “It’s like having your own little personal teacher there, with you at all times,” Dervishaj said.
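The link between ink and audio can be pictured as a simple data structure. Livescribe’s actual format is proprietary; the sketch below is a hypothetical illustration in which each stroke is stored with the elapsed recording time at which it was written, so a tap on the page looks up the nearest stroke and returns the matching moment in the audio:

```python
class SmartPenNotes:
    """Hypothetical sketch of a smart pen's ink-to-audio index: each
    stroke records the elapsed audio time at which it was written, and
    tapping a spot on the page seeks playback to that moment."""

    def __init__(self):
        self.strokes = []  # list of (audio_time_sec, (x, y)) entries

    def write(self, audio_time_sec, xy):
        self.strokes.append((audio_time_sec, xy))

    def tap(self, xy):
        # Replay from the stroke written nearest the tapped position.
        nearest = min(self.strokes,
                      key=lambda s: (s[1][0] - xy[0]) ** 2 + (s[1][1] - xy[1]) ** 2)
        return nearest[0]

pen = SmartPenNotes()
pen.write(12.0, (10, 20))   # a heading written 12 seconds into class
pen.write(95.5, (10, 80))   # a sample problem written at 95.5 seconds
print(pen.tap((11, 79)))    # 95.5
```

The real pen adds the dot-patterned paper described later in the article, which is what lets it know where on the page each stroke lands; the indexing idea is the same.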

Having a pen that listens, the students told me, has changed the class in curious ways. Some found the pens make class less stressful; because they don’t need to worry about missing something, they feel freer to listen to what Licata says. When they do take notes, the pen alters their writing style: instead of verbatim snippets of Licata’s instructions, they can write “key words” — essentially little handwritten tags that let them quickly locate a crucial moment in the audio stream. Licata himself uses a Livescribe pen to provide the students with extra lessons. Sitting at home, he’ll draw out a complicated math problem while describing out loud how to solve it. Then he’ll upload the result to a class Web site. There his students will see Licata’s handwriting slowly fill the page while hearing his voice explaining what’s going on. If students have trouble remembering how to tackle that type of problem, these little videos — “pencasts” — are online 24 hours a day. All the students I spoke to said they watch them.

LIKE MOST PIECES of classroom technology, the pens cause plenty of digital-age hassles. They can crash. The software for loading students’ notes onto their computers or from there onto the Web can be finicky. And the pens work only with special notepaper that enables the pen to track where it’s writing; regular paper doesn’t work. (Most students buy notepads from Livescribe, though it’s possible to print the paper on a color printer.) There are also some unusual social side-effects. The presence of so many recording devices in the classroom creates a sort of panopticon — or panaudiocon, as it were. Dervishaj has found herself whispering to her seatmate, only to realize the pen was on, “so we’re like, whoa!” — their gossip has been recorded alongside her notes. Although you can pause a recording, there’s currently no way to selectively delete a few seconds of audio from the pen, so she’s forced to make a decision: Delete all the audio for that lesson, or keep it in and hope nobody else ever hears her private chatter. She usually deletes.

Nonetheless, Licata is a convert. As the students started working quietly on review problems, their pens making tiny “boop” noises as the students began or paused their recording, Licata pulled me aside to say the pens had “transformed” his class. Compact and bristling with energy, Licata is a self-professed geek; in his 10 years of teaching, he has seen plenty of classroom gadgets come and go, from Web-based collaboration software to pricey whiteboards that let children play with geometric figures the way they’d manipulate an iPhone screen. Most of these gewgaws don’t impress him. “Two or three times a year teachers whip out some new technology and use it, but it doesn’t do anything better and it’s never seen again,” he said.

But this time, he said, was different. This is because the pen is based on an age-old classroom technique that requires no learning curve: pen-and-paper writing. Livescribe first released the pen in 2008; Licata encountered it when a colleague brought his own to work. Intrigued, he persuaded Livescribe to donate 20 pens to the school to outfit his entire class. (The pens sell for around $129.) “I’ve made more gains with this class this year than I’ve made with any class,” he told me. In his evenings, Licata is pursuing a master’s degree in education; separately, he intends to study how the smart pens might affect the way students learn, write and think. “Two years ago I would have told you that note-taking is a lost art, that handwriting was a lost art,” he said. “But now I think handwriting is crucial.”

TAKING NOTES HAS long posed a challenge in education. Decades of research have found a strong correlation between good notes and good grades: the more detailed and accurate your notes, the better you do in school. That’s partly because the act of taking notes forces you to pay closer attention. But what’s more important, according to some researchers, is that good notes provide a record: most of the benefits from notes come not from taking them but from reviewing them, because no matter how closely we pay attention, we forget things soon after we leave class. “We have feeble memories,” says Ken Kiewra, a professor of educational psychology at the University of Nebraska and one of the world’s leading researchers into note-taking.

Yet most students are very bad at taking notes. Kiewra’s research has found that students record about a third of the critical information they hear in class. Why? Because note-taking is a surprisingly complex mental activity. It heavily taxes our “working memory” — the volume of information we can consciously hold in our heads and manipulate. Note-taking requires a student to listen to a teacher, pick out the most important points and summarize and record them, while trying not to lose the overall drift of the lecture. (The very best students do even more mental work: they blend what they’re hearing with material they already know and reframe the concepts in their own words.) Given how jampacked this task is, “transcription fluency” matters: the less you have to think about the way you’re recording notes, the better. When you’re taking notes, you want to be as fast and as automatic as possible.

All note-taking methods have downsides. Handwriting is the most common and easiest, but a lecturer speaks at 150 to 200 words per minute, while even the speediest high-school students write no more than 40 words per minute. The more you struggle to keep up, the more you’re focusing on the act of writing, not the act of paying attention.

Typing can be much faster. A skilled typist can manage 60 words a minute or more. And notes typed into a computer have other advantages: they can be quickly searched (unlike regular handwritten notes) and backed up or shared online with other students. They’re also neater and thus easier to review. But they come with other problems, not least of which is that typing can’t capture the diagrammatic notes that classes in math, engineering or biology often require. What’s more, while personal computers and laptops may be common in college, that isn’t the case in cash-strapped high schools. Laptops in class also bring a host of distractions — from Facebook to Twitter — that teachers loathe. And students today are rarely taught touch typing; some note-taking studies have found that students can be even slower at typing than at handwriting.

One of the most complete ways to document what is said in class is to make an audio record: all 150-plus words a minute can be captured with no mental effort on the part of the student. Kiewra’s research has found that audio can have a powerful effect on learning. In a 1991 experiment, he had four groups of students listen to a lecture. One group was allowed to listen once, another twice, the third three times and the fourth was free to scroll back and forth through the recording at will, listening to whatever snippets the students wanted to review. Those who relistened were increasingly likely to write down crucial “secondary” ideas — concepts in a lecture that add nuance to the main points but that we tend to miss when we’re focused on writing down the core ideas. And the students who were able to move in and out of the audio stream performed as well as those who listened to the lecture three times in a row. (Students who recorded more secondary ideas also scored higher in a later quiz.) But as anyone who has tried to scroll back and forth through an audio file has discovered, reviewing audio is frustrating and clumsy. Audio may be richer in detail, but it is not, like writing and typescript, skimmable.

JIM MARGGRAFF, the 52-year-old inventor of the Livescribe pen, has a particular knack for blending audio and text. In the ’90s, appalled by Americans’ poor grasp of geography, he invented a globe that would speak the name of any city or country when you touched the location with a pen. In 1998, his firm was absorbed by Leapfrog, the educational-toy maker, where Marggraff invented toys that linked audio to paper. His first device, the LeapPad, was a book that would speak words and play other sounds whenever a child pointed a stylus at it. It quickly became Leapfrog’s biggest hit.

In 2001, Marggraff was browsing a copy of Wired magazine when he read an article about Anoto, a Swedish firm that patented a clever pen technology: it imprinted sheets of paper with tiny dots that a camera-equipped pen could use to track precisely where it was on any page. Several firms were licensing the technology to create pens that would record pen strokes, allowing users to keep digital copies of whatever they wrote on the patterned paper. But Marggraff had a different idea. If the pen recorded audio while it wrote, he figured, it would borrow the best parts from almost every style of note-taking. The audio record would help note-takers find details missing from their written notes, and the handwritten notes would serve as a guide to the audio record, letting users quickly dart to the words they wanted to rehear. Marggraff quit Leapfrog in 2005 to work on his new idea, and three years later he released the first Livescribe pen. He has sold close to 500,000 pens in the last two years, mostly to teachers, students and businesspeople.

I met Marggraff in his San Francisco office this summer. He and Andrew Van Schaack, a professor in the Peabody College of Education at Vanderbilt University and Livescribe’s science adviser, explained that the pen operated, in their view, as a supplement to your working memory. If you’re not worried about catching every last word, you can allocate more of your attention to processing what you’re hearing.

“I think people can be more confident in taking fewer notes, recognizing that they can go back if there’s something important that they need,” Van Schaack said. “As a teacher, I want to free up some cognitive ability. You know that little dial on there, your little brain tachometer? I want to drop off this one so I can use it on my thinking.” Marggraff told me Livescribe has surveyed its customers on how they use the pen. “A lot of adults say that it helps them with A.D.H.D.,” he said. “Students say: ‘It helps me improve my grades in specific classes. I can think and listen, rather than writing.’ They get more confident.”

Livescribe pens often inspire proselytizing among users. I spoke to students at several colleges and schools who insisted that the pen had improved their performance significantly; one swore it helped boost his G.P.A. to 3.9 from 3.5. Others said they had evolved highly personalized short notations — even pictograms — to make it easier to relocate important bits of audio. (Whenever his professor reeled off a long list of facts, one student would simply write “LIST” if he couldn’t keep up, then go back later to fill in the details after class.) A few students pointed to the handwriting recognition in Livescribe’s desktop software: once an individual user has transferred the contents of a pen to his or her computer, the software makes it possible to search that handwriting — so long as it’s reasonably legible — by keyword. That, students said, markedly sped up studying for tests, because they could rapidly find notes on specific topics. The pen can also load “apps”: for example, a user can draw an octave of a piano keyboard and play it (with the notes coming out of the pen’s speaker), or write a word in English and have the pen translate it into Spanish on the pen’s tiny L.E.D. display.

Still, it’s hard to know whether Marggraff’s rosiest ambitions are realistic. No one has yet published independent studies testing whether the Livescribe style of enhanced note-taking seriously improves educational performance. One of the only studies thus far is by Van Schaack himself. In the spring, he conducted an unpublished experiment in which he had 40 students watch a video of a 30-minute lecture on primatology. The students took notes with a Livescribe pen, and were also given an iPod with a recording of the lecture. Afterward, when asked to locate specific facts on both devices, the students were 2.5 times faster at retrieving the facts on the pen than on the iPod. It was, Van Schaack argues, evidence that the pen can make an audio stream genuinely accessible, potentially helping students tap into those important secondary ideas that we miss when we’re scrambling to write solely by hand.

Marggraff suspects the deeper impact of the pen may not be in taking notes when you’re listening to someone else, but when you’re alone — and thinking through a problem by yourself. For example, he said, a book can overwhelm a reader with thoughts. “You’re going to get ideas like crazy when you’re reading,” Marggraff says. “The issue is that it’s too slow to sit down and write them” — but if you don’t record them, you’ll usually forget them. So when Marggraff is reading a book at home or even on a plane, he’ll pull out his pen, hit record and start talking about what he’s thinking, while jotting down some keywords. Later on, when he listens to the notes, “it’s just astounding how relevant it is, and how much value it brings.” No matter how good his written notes are, audio includes many more flashes of insight — the difference between the 30 words per minute of his writing and the 150 words per minute of his speech, as it were.

Marggraff pulls out his laptop to show me notes he took while reading Malcolm Gladwell’s book “Outliers.” The notes are neat and legible, but the audio is even richer; when he taps on the middle of the note, I can hear his voice chattering away at high speed. When he listens to the notes, he’ll often get new ideas, so he’ll add notes, layering analysis on top of analysis.

“This is game-changing,” he says. “This is a dialogue with yourself.” He has used the technique to brainstorm patent ideas for hours at a time.

Similarly, in his class at St. John’s, Licata has found the pen is useful in capturing the students’ dialogues with themselves. For instance, he asks his students to talk to their pens while they do their take-home quizzes, recording their logic in audio. That way, if they go off the rails, Licata can click through the page to hear what, precisely, went wrong and why. “I’m actually able to follow their train of thought,” he says.

Some experts have doubts about Livescribe as a silver bullet. As Kiewra points out, plenty of technologies in the past have been hailed as salvations of education. “There’s been the radio, there’s been the phonograph, moving pictures, the VCR” — and, of course, the computer. But the average student’s note-taking ability remains as dismal as ever. Kiewra says he now believes the only way to seriously improve it is by painstakingly teaching students the core skills: how to listen for key concepts, how to review your notes and how to organize them to make meaning, teasing out interesting associations between bits of information. (As an example, he points out that students taking notes on the planets will learn lots of individual facts. But if they organize them into a chart, they’ll make discoveries on their own: sort the planets by distance from the sun and speed of rotation, and you’ll discover that the farther you go out, the more slowly they spin.) Kiewra also says that an effective way to get around the problem of incomplete and disorganized note-taking is for teachers to give out “partial” notes — handouts that summarize key concepts in the lecture but leave blanks that the students must fill in, forcing them to pay attention. Some studies have found that students using partial notes capture a majority of the main concepts in a lecture, more than doubling their usual performance.

Indeed, many modern educators say that students shouldn’t be taking notes in class at all. If it’s true that note-taking taxes their working memory, they argue, then teachers should simply hand out complete sets of notes that reflect everything in the lecture — leaving students free to listen and reflect. After all, if the Internet has done anything, it has made it trivially easy for instructors to distribute materials.

“I don’t think anyone should be writing down what the teacher’s saying in class,” is the blunt assessment of Lisa Nielsen, author of a blog, “The Innovative Educator,” who also heads up a division of the New York City Department of Education devoted to finding uses for new digital tools in classrooms. “Teachers should be pulling in YouTube videos or lectures from experts around the world, piping in great people into their classrooms, and all those things can be captured online — on Facebook, on a blog, on a wiki or Web site — for students to be looking at later,” she says. “Now, should students be making meaning of what they’re hearing or coming up with questions? Yes. But they don’t need to write down everything the teacher’s said.” There is some social-science support for the no-note-taking view. In one experiment, Kiewra took several groups of students and subjected them to different note-taking situations: some attended a lecture and reviewed their own notes; others didn’t attend but were given a set of notes from the instructor. Those who heard the lecture and took notes scored 51 percent on a subsequent test, while those who only read the instructor’s notes scored 69 percent.

Of course, if Marggraff has his way, smart pens could become so common — and so much cheaper — that bad notes, or at least incomplete ones, will become a thing of the past. Indeed, if most pen-and-paper writing could be easily copied and swapped online, the impacts on education could be intriguing and widespread. Marggraff intends to release software that lets teachers print their students’ work on dot-patterned paper; students could do their assignment, e-mail it in, then receive a graded paper e-mailed back with handwritten and spoken feedback from the teacher. Students would most likely swap notes more often; perhaps an entire class could designate one really good note-taker and let him write while everyone else listens, sharing the notes online later. Marggraff even foresees textbooks in which students could make notes in the margins and have a permanent digital record of their written and spoken thoughts beside the text. “Now we really have bridged the paper and the digital worlds,” he adds. Perhaps the future of the pen is on the screen.

Clive Thompson, a contributing writer for the magazine, writes frequently about technology and science.



Harvest Moons and the Seeds of Our Faith

How the fall equinox, and the science of ancient astronomy, helped shape religions

Next Wednesday heralds the official end of summer — the autumnal equinox — when day and night are equal in length (circa 11:09 p.m. ET). In the 21st century, this astronomical event is little more than a passing curiosity. But rewind by about three millennia to the time of the ancient Babylonians, and the autumnal equinox marked the start of the “minor new year.” Not only did celestial events define sacred festivals; religion, in turn, powered the development of astronomy, the first science.

Today, science and religion are often thought to be very different, unconnected disciplines. But looking back at our ancient past, we see that the development of religion and early science have really gone hand-in-hand, shaping some of the characteristics of mainstream religion in ways we may not realize.

For instance, while the Babylonians celebrated their “main new year” in the spring, their tradition of having a minor autumnal new year has carried over into both mainstream religion and secular practice. Nick Campion, a historian of cultural astronomy at the University of Wales, notes two echoes of ancient autumn observances today. “It’s a custom inherited by Jews—hence Rosh Hashanah,” he told me, “while the beginning of the academic year in autumn is a secular legacy.”

The Babylonians made meticulous records of celestial events. To them, as to many ancient civilizations, the sky was thought to be the writing pad of the gods, while the stars and planets were the ink used to communicate divine messages.

Through today’s lens, the practices of star-gazing Babylonian priests may appear to be based mostly in superstition. Each night they searched the sky for omens sent by the great god Marduk or one of his entourage of lesser deities. Unexpected wanderings of the planets might foreshadow a poor harvest in the village, while the early risings of the moon could portend malformed births. By far the worst harbinger was a lunar eclipse, which signaled that the gods were angry with the king and called for his death.


Much early astronomy dealt with developing techniques to predict these omens, allowing crucial time for pre-emptive prayers and rituals to ward off misfortune.

Despite being tied to religious ritual (and often to gruesome sacrifice), the work of these priests marks the beginnings of science, says John Steele, a historian of ancient astronomy at Brown University. “They were making mathematical predictions based on empirical observations, which is astronomy by definition,” he says.

An even more detailed understanding of celestial phenomena influenced the decline of polytheism. As more sophisticated science showed that the astronomical events were routine and could be predicted, they lost their ability to inspire fear. By the 5th century B.C., Greek philosophers were developing a view that the universe originated from one divine source.

Nick Campion adds that with the rise of the monotheistic Abrahamic religions, the need to “secularize the planets”—stripping them of divine agency—became even more pressing. Astronomy could not be written out of religion completely, in part because in people’s minds the celestial patterns were so clearly tied to the changing of the seasons. So monotheistic religious leaders emphasized the importance of sacred calendars governed by predictable celestial motions.

They also argued that understanding the behavior of planets and stars was the route to revealing what Mr. Campion calls the “unfolding of God’s plan.” For a time, he notes, astronomy actually became a tool of power for the religious elite to wield. The better religious scholars were at predicting astronomical events, the more society was seen to be successfully harmonizing with God.

In the early Islamic empire, astronomical patterns dictated not only the calendar, but also the architecture of cities. Mr. Campion has studied the original plans for building Baghdad, which was designed to be laid out in seven concentric circles — to mimic the geocentric view of the cosmos held at the time, with earth at its center, and the sun, the moon and the five then-known planets in orbit around it.

Although those plans were partially abandoned, the ultimate framework of the city was indeed circular and its foundations were laid on a day calculated to coincide with the time that Jupiter, then thought to be the supreme power-giving planet, rose above the Eastern horizon. “Baghdad was literally a cosmopolis,” says Mr. Campion.

Baghdad is one example of how an ancient society was built to celestial blueprints. To fully appreciate some of our religious practices today — and sometimes even the layout of the ground under our feet — we must look back to the earliest science and the influence of the night skies.

Ms. Merali is a science writer and documentary producer based in London.



The Foreign Devil’s Dictionary

The Oxford Chinese Dictionary is a fresh, modern bridge between two languages that can still seem a world apart.

The Chinese were hardly enthusiastic about learning the language of the English barbarians when East India Company ships first turned up on the shores of the Celestial Kingdom. Only in the 18th century did traders begin to pick up a few words, using pamphlets like the one entitled “Those Words of the Devilish Language of Red-Bristled People Commonly Used in Buying and Selling.”

Today, practicing English is practically China’s national pastime, with the number of English students and speakers reckoned at between 200 million and 350 million. And with the release of the Oxford Chinese Dictionary last week, they have a better guide to the devilishly difficult language. Oxford University Press describes it as “the largest, the most up-to-date, the most accurate, and the most authoritative English-Chinese/Chinese-English dictionary ever published.”

But a bilingual dictionary in one volume can hardly be an exhaustive catalogue of every word in two languages. Instead it aims to be accessible to learners and users of both languages, presenting a broad sweep of modern English and Chinese. It’s especially useful for the student of Chinese struggling to keep up with the breakneck development of slang and allusion.

As such, it’s more a compendium of the cultural climate than an official standard-bearer for the language. There are entries for renzishi tuoxie (“flip-flops”), shua zuipizi (to “be all talk and no action”) and zhaguo (to “get excited and angry”). In the entry for san (“three”), one can peruse threes of all kinds: a three-pointer in basketball (sanfenqiu), Sun Yat-sen’s Three Principles of the People (sanminzhuyi) and an escort who provides three kinds of services (sanpei)—although exactly which three is left to the reader’s imagination.

There is breadth enough for the novice and depth enough for the specialist. Those who have breakfasted in China may already know that youtiao are deep-fried dough strips but may learn that the word is also used to refer to an untrustworthy person. Likewise with ku; those who know that it means “bitter” may not know the full span of hardship that it can describe: a thankless job (kuchai), mental vexation (kunao), lost appetite during summer (kuxia) and the feigning of injury to win others’ confidence (kurouji).

The new dictionary includes many words that are new not only to the world of Chinese-English dictionaries, but also to the language itself—the lexical footprints of a culture on the move. Fans—and in China there are many, of all sorts—will find the most current ways to call themselves: There is a straight transliteration from English (fensi, the same word for thin rice noodles) but also a more evocative rendering (fashaoyou, which literally means “fever friend”). Hangers-on will learn to keep an ear out for the word zhuixingzu to know when they have crossed the line and are officially groupies.

The Internet, predictably, contributes a jumbled heap of fresh idiom, but some of the new vocabulary represents social change along more established dimensions. China’s nouveaux riches may now be known among their jealous neighbors as “moneybags” (dakuan), “big shots” (dawan) or “bigwigs” (daheng). Graduates who have taken a little too long to leave the nest are said to be “gnawing” on their parents (kenlaozu). Adulterers who might once have called their paramours concubines (qie) for lack of a lowlier term can now aspire to precision; an ernai is a kept woman of less-official standing.

The Oxford dictionary reaches back in the language’s history, too. China’s many chengyu, or idiomatic phrases derived from traditional fables and classical texts, present a high hurdle for learners. Often arcane and literary, they still pervade everyday conversation.

To convey in Chinese that a situation is paradoxical (zixiang maodun), for instance, is to invoke a story about an arms dealer who oversells his wares. Expressing the need for perseverance (tiechu mocheng zhen) means referring to the story of a man who made a needle by rolling a steel pole in his hands for weeks and weeks. The new dictionary does crackerjack work with these and others. Some experts put the number of chengyu expressions at 5,000, though others settle for no fewer than 20,000. Statistics on the Chinese language, like folk tales, change depending on who’s telling them.

Browsing the new dictionary reinforces a sense of the deep pools from which the Chinese language springs, even at its colloquial cutting-edge. Today’s Chinese, profoundly rooted yet full of novelty, reflects a people who revere tradition but also seek constantly to reinterpret that tradition. The Oxford dictionary is an apt monument to this ambivalence.

Nevertheless, monuments always decay. If it follows the example of the flagship Oxford English Dictionary, the Oxford Chinese Dictionary may in its next edition retreat from the printed page, living exclusively online. It would be a fitting development. The words and phrases of modern Chinese may have at last been captured and recorded. But in Shenzhen’s Internet cafés and Beijing’s rock clubs, the language is still being taken apart and rebuilt.

Mr. Zhong is a Princeton-in-Asia fellow at the Wall Street Journal Asia’s editorial page.



Boxing Lessons

I offer training in both philosophy and boxing. Over the years, some of my colleagues have groused that my work is a contradiction, building minds and cultivating rational discourse while teaching violence and helping to remove brain cells. Truth be told, I think philosophers with this gripe should give some thought to what really counts as violence.  I would rather take a punch in the nose any day than be subjected to some of the attacks that I have witnessed in philosophy colloquia.  However, I have a more positive case for including boxing in my curriculum for sentimental education. 

Western philosophy, even before Descartes’ influential case for mind-body dualism, has been dismissive of the body. Plato — even though he competed as a wrestler — and most of the sages who followed him taught us to think of our arms and legs as nothing but a poor carriage for the mind. In “Phaedo,” Plato presents his teacher Socrates on his deathbed as a sort of Mr. Spock yearning to be free from the shackles of the flesh so he can really begin thinking seriously. In this account, the body gives rise to desires that will not listen to reason and that becloud our ability to think clearly.

In much of Eastern philosophy, in contrast, the search for wisdom is more holistic. The body is considered inseparable from the mind, and is regarded as a vehicle for, rather than an impediment to, enlightenment. The unmindful attitude toward the body so prevalent in the West blinkers us to profound truths that the skin, muscles and breath can deliver like a punch.

While different physical practices may open us to different truths, there is a lot of wisdom to be gained in the ring. Socrates, of course, maintained that the unexamined life was not worth living, that self-knowledge is of supreme importance. One thing is certain: boxing can compel a person to take a quick self-inventory and gut check about what he or she is willing to endure and risk. As Joyce Carol Oates observes in her minor classic, “On Boxing”:

Boxers are there to establish an absolute experience, a public accounting of the outermost limits of their beings; they will know, as few of us can know of ourselves, what physical and psychic power they possess — of how much, or how little, they are capable.

Though the German idealist philosopher G.W.F. Hegel (1770-1831) never slipped on the gloves, I think he would have at least supported the study of the sweet science. In his famous Lord and Bondsman allegory,[1] Hegel suggests that it is in mortal combat with the other, and ultimately in our willingness to give up our lives, that we rise to a higher level of freedom and consciousness. If Hegel is correct, the lofty image that the warrior holds in our society has something to do with the fact that in her willingness to sacrifice her own life, she has escaped the otherwise universal choke hold of death anxiety. Boxing can be seen as a stylized version of Hegel’s proverbial trial by battle and as such affords new possibilities of freedom and selfhood.

Viewed purely psychologically, practice in what used to be termed the “manly art” makes people feel more at home in themselves, and so less defensive and perhaps less aggressive. The way we cope with the elemental feelings of anger and fear determines to no small extent what kind of person we will become. Enlisting Aristotle, I shall have more to say about fear in a moment, but I don’t think it takes a Freud to recognize that many people are mired in their own bottled-up anger. In our society, expressions of anger are more taboo than libidinal impulses. Yet, as our entertainment industry so powerfully bears out, there is plenty of fury to go around. I have trained boxers, often women, who find it extremely liberating to learn that they can strike out, throw a punch, express some rage, and that no one is going to die as a result.

And let’s be clear, life is filled with blows. It requires toughness and resiliency. There are few better places than the squared circle to receive concentrated lessons in the dire need to be able to absorb punishment and carry on, “to get off the canvas” and “roll with the punches.” It is little wonder that boxing, more than any other sport, has functioned as a metaphor for life. Aside from the possibilities for self-fulfillment, boxing can also contribute to our moral lives.

In his “Nicomachean Ethics,” Aristotle argues that the final end for human beings is eudaimonia: the good life, or as it is most often translated, happiness. In an immortal sentence Aristotle announces, “The Good of man (eudaimonia) is the active exercise of his soul’s faculties in conformity with excellence or virtue, or if there be several human excellences or virtues, in conformity with the best and most perfect among them.”[2]

A few pages later, Aristotle acknowledges that there are in fact two kinds of virtue or excellence, namely, intellectual and moral.[3] Intellectual excellence is simply book learning, or theoretical smarts. Unlike his teacher Plato and his teacher’s teacher, Socrates, Aristotle recognized that a person could know a great deal about the Good and not lead a good life. “With regard to excellence,” says Aristotle, “it is not enough to know, but we must try to have and use it.”[4]

Aristotle offers a table of the moral virtues that includes, among other qualities, temperance, justice, pride, friendliness and truthfulness. Each semester when I teach ethics, I press my students to generate their own list of the moral virtues. “What,” I ask, “are the traits that you connect with having character?” Tolerance, kindness, self-respect and creativity always make it onto the board, but it is usually only with prodding that courage gets a nod. And yet, courage seems absolutely essential to leading a moral life. After all, if you do not have mettle, you will not be able to abide by your moral judgments. Doing the right thing often demands acting against our immediate and even our long-range self-interest. It frequently involves sacrifices that we do not much care for, sometimes of friendships or jobs; sometimes, as in the case of Socrates, even of our lives. Making these sacrifices is impossible without courage.

According to Aristotle, courage is a mean between rashness and cowardice;[5] that is, between having too little trepidation and too much. Aristotle reckoned that in order to be able to hit the mean, we need practice in dealing with the emotions and choices corresponding to that virtue. So far as developing grit is concerned, it helps to get some swings at dealing with manageable doses of fear. And yet, even in our approach to education, many of us tend to think of anything that causes a shiver as traumatic. Consider, for example, the demise of dodge ball in public schools. It was banned because of the terror that the flying red balls caused in some children and because of the damage to self-esteem that might come with always being the first one knocked out of the game. But how are we supposed to learn to stand up to our fears if we never have any supervised practice in dealing with the jitters? Of course, our young people are very familiar with aggressive and often gruesome video games that simulate physical harm and self-defense, but without, of course, any of the consequences and risks that might come with putting on the gloves.

Boxing provides practice with fear, and with the right, attentive supervision, in quite manageable increments. In their first sparring session, boxers usually erupt in “fight or flight” mode. When the bell rings, novices forget everything they have learned and simply flail away. If they stick with it for a few months, their fears diminish; they can begin to see things in the ring that their emotions blinded them to before. More important, they become more at home with feeling afraid. Fear is painful, but it can be faced, and in time a boxer learns not to panic about the blows that will be coming his way.

While Aristotle is able to define courage, the study and practice of boxing can enable us not only to comprehend courage but “to have and use” it. By getting into the ring with our fears, we will be less likely to succumb to trepidation when doing the right thing demands taking a hit. To be sure, there is an important difference between physical and moral courage. After all, the world has seen many a brave monster. The willingness to endure physical risks is not enough to guarantee uprightness; nevertheless, it can, I think, contribute in powerful ways to the development of moral virtue.


[1] G.W.F. Hegel, “Phenomenology of Spirit,” Chapter 4.
[2] Aristotle, “Nicomachean Ethics,” Book I, Chapter 7.
[3] Ibid., Book I, Chapter 13.
[4] Ibid., Book X, Chapter 9.
[5] Ibid., Book III, Chapter 7.

Gordon Marino is an active boxing trainer and professor of philosophy at St. Olaf College. He covers boxing for the Wall Street Journal, is the editor of “Ethics: The Essential Writings” (Modern Library Classics, 2010) and is at work on a book about boxing and philosophy.

