Obama vs. Honduran Democracy

The Obama administration is using its brass knuckles to support Latin American thugs.

If the Obama administration were a flotilla of ships, it might be sending out an SOS right about now. ObamaCare has hit the political equivalent of an iceberg. And last week the president’s international prestige was broadsided by the Scots, who set free the Lockerbie bomber without the least consideration of American concerns. Mr. Obama’s campaign promise of restoring common sense to budget management is sleeping with the fishes.

This administration needs a win. Or more accurately, it can’t bear another loss right now. Most especially it can’t afford to be defeated by the government of a puny Central American country that doesn’t seem to know its place in the world and dares to defy the imperial orders of Uncle Sam.

I’m referring, of course, to Honduras, which despite two months of intense pressure from Washington is still refusing to reinstate Manuel Zelaya, its deposed president. Last week the administration took off the gloves and sent a message that it would use everything it has to break the neck of the Honduran democracy. Its bullying might work. But it will never be able to brag about what it has done.

The most recent example of the Obama-style Good Neighbor Policy was the announcement last week that visa services for Hondurans are suspended indefinitely, and that some $135 million in bilateral aid might be cut. But these are only the public examples of its hardball tactics. Much nastier stuff is going on behind the scenes, practiced by a presidency that once promised the American people greater transparency and a less interventionist foreign policy.

Supporters of Honduran President Roberto Micheletti (Aug. 24). The U.S. continues to implement punitive measures against the country.

To recap, the Honduran military in June executed a Supreme Court arrest warrant against Mr. Zelaya for trying to hold a referendum on whether he should be able to run for a second term. Article 239 of the Honduran constitution states that any president who tries for a second term automatically loses the privilege of his office. By insisting that Mr. Zelaya be returned to power, the U.S. is trying to force Honduras to violate its own constitution.

It is also asking Hondurans to risk the fate of Venezuela. They know how Venezuela’s Hugo Chávez went from being democratically elected the first time, in 1998, to making himself dictator for life. He did it by destroying his country’s institutional checks and balances. When Mr. Zelaya moved to do the same in Honduras, the nation cut him off at the pass.

For Mr. Chávez, Mr. Zelaya’s return to power is crucial. The Venezuelan is actively spreading his Marxist gospel around the region and Mr. Zelaya was his man in Tegucigalpa.

The Honduran push-back is a major setback for Caracas. That’s why Mr. Chávez has mobilized the Latin left to demand Mr. Zelaya’s return. Last week, Dominican Republic President Leonel Fernández joined the fray, calling for Honduras to be kicked out of the Central American Free Trade Agreement (Cafta). Mr. Fernández is a close friend of Mr. Chávez and a beneficiary of Venezuela’s oil-for-obedience program in the Caribbean.

Mr. Obama apparently wants in on this leftie-fest. He ran for president, in essence, against George W. Bush. Mr. Bush was unpopular in socialist circles. This administration wants to show that it can be cool with Mr. Chávez and friends.

Mr. Obama’s methods are decidedly uncool. Prominent Hondurans, including leading members of the business community, complain that a State Department official has been pressuring them to push the interim government to accept the return of Mr. Zelaya to power.

When I asked the State Department whether it was employing such dirty tricks, a spokeswoman would only say the U.S. has been “encouraging all members of civil society to support the San José accord”—which calls for Mr. Zelaya to be restored to power. Perhaps something was lost in the translation, but threats to use U.S. power against a small, poor nation hardly qualify as encouragement.

Elsewhere in the region there are reports that U.S. officials have been calling Latin governments to demand that they support the U.S. position. When I asked State whether that was true, a spokeswoman would not answer the question. She would only say that the U.S. is “cooperating with the [Organization of American States] and [Costa Rican President] Oscar Arias to support the San José accord.”

In other words, though it won’t admit to coercion, it is fully engaged in arm-twisting at the OAS in order to advance its agenda.

This not only seems unfair to the Honduran democracy but it also seems to contradict an earlier U.S. position. In a letter to Sen. Richard Lugar on Aug. 4, the State Department claimed that its “strategy for engagement is not based on any particular politician or individual” but rather on finding “a resolution that best serves the Honduran people and their democratic aspirations.”

A lot of Hondurans believe that the U.S. isn’t using its brass knuckles to serve their “democratic aspirations” at all, but the quite-opposite aspirations of a neighborhood thug.

Full article and photo: http://online.wsj.com/article/SB10001424052970204731804574382872711784150.html

Sorting Fact From Fiction on Health Care

Current congressional proposals would significantly change your relationship with your doctor.

In recent town-hall meetings, President Barack Obama has called for a national debate on health-care reform based on facts. It is a fact that more than 40 million Americans lack coverage and that spiraling costs are a burden on individuals, families and our economy. There is broad consensus that these problems must be addressed. But the public is skeptical of the claims that their current clinical care is substandard and that no government bureaucrat will come between them and their doctor. Americans have good reason for their doubts—key assertions about gaps in care are flawed, and reform proposals to oversee care could sharply shift decisions away from patients and their physicians.

Consider these myths and mantras of the current debate:

Americans only receive 55% of recommended care. This would be a frightening statistic, if it were true. It is not. Yet in March 2009 it was presented as fact to the Senate Health and Finance Committees, which are writing reform bills, by the Agency for Healthcare Research and Quality (the federal body that sets priorities to improve the nation’s health care).

The statistic comes from a flawed study published in 2003 by the Rand Corporation. That study was supposed to be based on telephone interviews with 13,000 Americans in 12 metropolitan areas, followed up by a review of each person’s medical records and then matched against 439 indicators of quality health practices. But two-thirds of the people contacted declined to participate, making the study biased, by Rand’s own admission. To make matters worse, Rand had incomplete medical records on many of those who participated and could not accurately document the care that these patients received.

For example, Rand found that only 15% of the patients had received a flu vaccine based on available medical records. But when asked directly, 85% of the patients said that they had been vaccinated. Most importantly, there were no data that indicated whether following the best practices defined by Rand’s experts made any difference in the health of the patients.

In March 2007, a team of Harvard researchers published a study in the New England Journal of Medicine that looked at nearly 10,000 patients at community health centers and assessed whether implementing similar quality measures would improve the health of patients with three costly disorders: diabetes, asthma and hypertension. It found that there was no improvement in any of these three maladies.

Dr. Rodney Hayward, a respected health-services professor at the University of Michigan, wrote about this negative result, “It sounds terrible when we hear that 50 percent of recommended care is not received, but much of the care recommended by subspecialty groups is of a modest or unproven value, and mandating adherence to these recommendations is not necessarily in the best interest of patients or society.”

The World Health Organization ranks the U.S. 37th in the world in quality. This is another frightening statistic. It is also not accurate. Yet the head of the National Committee for Quality Assurance, a powerful organization influencing both the government and private insurers in defining quality of care, has stated this as fact.

The World Health Organization ranks the U.S. No. 1 among all countries in “responsiveness.” Responsiveness has two components: respect for persons (including dignity, confidentiality and autonomy of individuals and families to make decisions about their own care), and client orientation (including prompt attention, access to social support networks during care, quality of basic amenities and choice of provider). This is what Americans rightly understand as quality care and worry will be lost in the upheaval of reform. Our country’s composite score fell to 37 primarily because we lack universal coverage and care is a financial burden for many citizens.

We need to implement “best practices.” Mr. Obama and his advisers believe in implementing “best practices” that physicians and hospitals should follow. A federal commission would identify these practices.

On June 24, 2009, the president appeared on “Good Morning America” with Diane Sawyer. When Ms. Sawyer asked whether “best practices” would be implemented by “encouragement” or “by law,” the president did not answer directly. He said that he was confident doctors “want to engage in best practices” and “patients are going to insist on it.” The president also said there should be financial incentives to “allow doctors to do the right thing.”

There are domains of medicine where a patient has no control and depends on the physician and the hospital to provide best practices. Strict protocols have been developed to prevent infections during procedures and to reduce the risk of surgical mishaps. There are also emergency situations like a patient arriving in the midst of a heart attack where standardized advanced treatments save many lives.

But once we leave safety measures and emergency therapies where patients have scant say, what is “the right thing”? Data from clinical studies provide averages from populations and may not apply to individual patients. Clinical studies routinely exclude patients with more than one medical condition and often the elderly or people on multiple medications. Conclusions about what works and what doesn’t work change much too quickly for policy makers to dictate clinical practice.

An analysis from the Ottawa Health Research Institute published in the Annals of Internal Medicine in 2007 reveals how long it takes for conclusions derived from clinical studies about drugs, devices and procedures to become outdated. Within one year, 15 of 100 recommendations based on the “best evidence” had to be significantly reversed; within two years, 23 were reversed, and at 5 1/2 years, half were contradicted. Americans have witnessed these reversals firsthand as firm “expert” recommendations about the benefits of estrogen replacement therapy for postmenopausal women, low-fat diets for obesity, and tight control of blood sugar were overturned.

Even when experts examine the same data, they can come to different conclusions. For example, millions of Americans have elevated cholesterol levels and no heart disease. Guidelines developed in the U.S. about whom to treat with cholesterol-lowering drugs are much more aggressive than guidelines in the European Union or the United Kingdom, even though experts here and abroad are extrapolating from the same scientific studies. An illuminating publication from researchers in Munich, Germany, published in March 2003 in the Journal of General Internal Medicine showed that of 100 consecutive patients seen in their clinic with high cholesterol, 52% would be treated with a statin drug in the U.S. based on our guidelines while only 26% would be prescribed statins in Germany and 35% in the U.K. So, different experts define “best practice” differently. Many prominent American cardiologists and specialists in preventive medicine believe the U.S. guidelines lead to overtreatment and the Europeans are more sensible. After hearing of this controversy, some patients will still want to take the drug and some will not.

This is how doctors and patients make shared decisions—by considering expert guidelines, weighing why other experts may disagree with the guidelines, and then customizing the therapy to the individual. With respect to “best practices,” prudent doctors think, not just follow, and informed patients consider and then choose, not just comply.

No government bureaucrat will come between you and your doctor. The president has repeatedly stated this in town-hall meetings. But his proposal to provide financial incentives to “allow doctors to do the right thing” could undermine this promise. If doctors and hospitals are rewarded for complying with government mandated treatment measures or penalized if they do not comply, clearly federal bureaucrats are directing health decisions.

Further, at the AMA convention in June 2009, the president proposed protecting physicians from malpractice lawsuits if they strictly adhered to government-sponsored treatment guidelines. We need tort reform, but this proposal is misconceived and again clearly inserts the bureaucrat directly into clinical decision making. If doctors are legally protected when they follow government mandates, the converse is that doctors risk lawsuits if they deviate from federal guidelines—even if they believe the government mandate is not in the patient’s best interest. With this kind of legislation, physicians might well pressure the patient to comply with treatments even if the therapy clashes with the individual’s values and preferences.

The devil is in the regulations. Federal legislation is written with general principles and imperatives. The current House bill, H.R. 3200, in Title IV, Part D, has very broad language about identifying and implementing best practices in the delivery of health care. It rightly sets initial priorities around measures to protect patient safety. But the bill does not set limits on what “best practices” federal officials can implement. If it becomes law, bureaucrats could well write regulations mandating treatment measures that violate patient autonomy.

Private insurers are already doing this, and both physicians and patients are chafing at their arbitrary intervention. As Congress works to extend coverage and contain costs, any legislation must clearly codify the promise to preserve for Americans the principle of control over their health-care decisions.

Dr. Groopman, a staff writer for the New Yorker, and Dr. Hartzband are on the staff of Beth Israel Deaconess Medical Center in Boston and on the faculty of Harvard Medical School.


Full article and photo: http://online.wsj.com/article/SB10001424052970203706604574378542143891778.html

Israel, Iran and Obama

Conflict is inevitable unless the West moves quickly to stop a nuclear Tehran.

The International Atomic Energy Agency has produced another alarming report on Iran’s nuclear programs, though it hasn’t released it publicly, only to governments that would also rather not disclose more details of Iran’s progress toward becoming a nuclear theocracy. Meanwhile, Iran intends to introduce a resolution, backed by more than 100 members of the so-called Non-Aligned Movement, that would ban military attacks on nuclear facilities. No actual mention of Israel, of course.

The mullahs understand that the only real challenge to their nuclear ambitions is likely to come from Israel. They’ve long concluded that the U.N. is no threat, as IAEA chief Mohamed ElBaradei has in practice become an apologist for Iran’s program. They can also see that the West lacks the will to do anything, as the Obama Administration continues to plead for Tehran to negotiate even as Iran holds show trials of opposition leaders and journalists for saying the recent re-election of Mahmoud Ahmadinejad was fraudulent. The irony is that the weaker the West and U.N. appear, the more probable an Israeli attack becomes.


The reality that Western leaders don’t want to admit is that preventing Iran from getting the bomb is an Israeli national imperative, not a mere policy choice. That’s a view shared across Israel’s political spectrum, from traditional hawks like Prime Minister Benjamin Netanyahu to current Defense Minister and former Labor Prime Minister Ehud Barak. Israelis can see the relentless progress Iran is making toward enriching uranium, building a plutonium-breeding facility and improving on its ballistic missiles—all the while violating U.N. sanctions without consequence. Iran’s march to the bomb also alarms its Arab neighbors, but it represents an existential threat to an Israeli nation that Iran has promised to destroy and has waged decades of proxy war against.

This threat has only increased in the wake of Iran’s stolen election and crackdown. The nature of the regime seems to be changing from a revolutionary theocracy to a military-theocratic state that is becoming fascist in operation. The Revolutionary Guard Corps is gaining power at the expense of the traditional military and a divided clerical establishment.

On the weekend, Ahmadinejad called for the arrest and punishment of opposition leaders, and last week he nominated Ahmad Vahidi, a commander in Iran’s Revolutionary Guards Corps, to become defense minister. Vahidi is wanted on an Interpol arrest warrant for his role in masterminding the 1994 attack on a Jewish cultural center in Buenos Aires. That attack killed 85 people and wounded 200 others. Vahidi’s nomination shows that when Ahmadinejad talks of wiping Israel off the map, no Israeli leader can afford to dismiss it as a religious allegory.

Israel also looks warily on the Obama Administration’s policy of diplomatic pleading with Iran, which comes after six years of failed diplomatic overtures by the European Union and Bush Administration. Secretary of State Hillary Clinton’s suggestion in July that the U.S. would extend a “defense umbrella” over its allies in the Middle East “once [Iranians] have a nuclear weapon” may have been a slip of the lip. But Israelis can be forgiven for wondering if the U.S. would sooner accept a nuclear Iran as a fait accompli than do whatever is necessary to stop it.

It’s no wonder, then, that the Israeli military has been intensively—and very publicly—war-gaming attack scenarios on Iran’s nuclear installations. This has included sending warships through the Suez Canal (with Egypt’s blessing), testing its Arrow antiballistic missile systems and conducting nation-wide emergency drills. U.S. and Israeli military officials we’ve spoken to are confident an Israeli strike could deal a significant blow to Iran’s programs, even if some elements would survive. The longer Israel waits, however, the more steps Iran can take to protect its installations.

The consequences of an Israeli attack are impossible to predict, but there is no doubt they would implicate U.S. interests throughout the Middle East. Iran would accuse the U.S. of complicity, whether or not the U.S. gave its assent to an attack. Iran could also attack U.S. targets, drawing America into a larger Mideast war.

Short of an Islamist revolution in Pakistan, an Israeli strike on Iran would be the most dangerous foreign policy issue President Obama could face, throwing all his diplomatic ambitions into a cocked hat. Yet in its first seven months, the Administration has spent more diplomatic effort warning Israel not to strike than it has rallying the world to stop Iran.


In recent days, the Administration has begun taking a harder line against Tehran, with talk of “crippling” sanctions on Iran’s imports of gasoline if the mullahs don’t negotiate by the end of September. Rhetorically, that’s a step in the right direction. But unless Mr. Obama gets serious, and soon, about stopping Iran from getting a bomb, he’ll be forced to deal with the consequences of Israel acting in its own defense. 


Full article: http://online.wsj.com/article/SB10001424052970203863204574348533106427974.html

Diplomacy in the Age of No Secrets

Today’s quiet deal could be tomorrow’s headline.

To the list of industries undermined by the Internet, from music to the Yellow Pages, we can add another: diplomacy. By all appearances, the early release of the Libyan convicted of blowing up Pan Am Flight 103 over Lockerbie, Scotland, was part of a program of quiet diplomacy by the British government to appeal to Moammar Gadhafi. This favor turned out to be anything but quiet.

The freeing of the Libyan intelligence officer convicted of the 1988 bombing, Abdel Basset al-Megrahi, is a case study in how people now expect a free and instant flow of information about what their politicians have done and how hard it has become to keep secret deals secret.

The Scottish minister of justice announced the release of the bomber as an act of compassion, citing Megrahi’s prostate cancer. But other murderers have been ill and died in Scottish prisons. Suspicions grew with the leak of a letter from the Foreign Office in London that had assured the justice minister there was no legal barrier to Megrahi’s early release. The letter expressed the “hope on this basis you will now feel able to consider the Libyan application.” The “judicial” decision was exposed as political.


Lockerbie bomber Abdel Basset al-Megrahi and Seif al-Islam Gadhafi at an airport in Tripoli, Libya.

It didn’t help the British government that the Libyans didn’t play along, ignoring the ground rules of quiet diplomacy. Megrahi, who was released after serving only eight years of a 27-year sentence, got a hero’s welcome in Tripoli that included the flying of the Scottish and Libyan flags. Gadhafi thanked British Prime Minister Gordon Brown (“my friend Brown”) and others, including the queen, for “encouraging the Scottish government to make this historical and courageous decision.” Gadhafi’s son bragged that the release of the Libyan bomber was “always on the negotiating table” during discussions of “commercial contracts for oil and gas with Britain.”

Under further pressure, Downing Street released a letter Prime Minister Brown had sent Gadhafi urging a low-key welcome for the state intelligence officer who killed 270 people, mostly Americans. A “high-profile return would cause further unnecessary pain for the families of the Lockerbie victims,” the letter said. It also said, “You will be aware that the Scottish executive’s public announcement on Megrahi’s future is expected very shortly. I understand that their decision is to transfer Megrahi back to Libya on compassionate grounds,” contradicting earlier claims that the decision was known only when the Scottish minister announced it.

Reports then emerged that a procession of cabinet ministers had gone cap in hand to Libya in recent months and that Prince Andrew had even been scheduled to attend tomorrow’s celebration of the 40th anniversary of Gadhafi’s one-man rule. As these facts emerged, Chris Patten, a former chairman of the Conservative Party, pointed out that people would assume the next British company to win a contract in Libya was “all part of the payoff for complicity in an ill-judged decision.”

The Web made political sentiment easy to track. Comments on the BBC site last week included an American who wrote that he’ll boycott Scottish goods, including Scotch whisky. “As an American of Scottish descent, this is particularly painful, though not as painful as watching a mass murderer set free,” John from Washington wrote. “On the bright side, I will instead enjoy my Jack Daniel’s Tennessee whisky and Kentucky Bourbon, which will actually save me money.” Frank in Edinburgh posted, “I think the boycott’s a good idea. Alas, I shall not participate. As an Edinburgh resident, I would have to drive to Berwick [England] to buy my groceries and that’s not feasible.”

Polls found that twice as many Britons think the release had more to do with oil than with Megrahi’s health and that people in Scotland opposed the release by a margin of nearly 2 to 1. The Scottish Parliament begins hearings today, so expect further details of how this release happened.

We’ve come a long way from the days when a diplomatic wink and nod were the end of the discussion. It’s real progress that Libya, which out of caution stepped back from some of its activities following the overthrow of Saddam Hussein in Iraq, is now largely the focus of trade deals, but this doesn’t mean that people will ever forgive terrorism.

Indeed, one lesson for the U.S. is that politicians can’t avoid responsibility for anything to do with terrorism. The British government couldn’t blame a Scottish justice minister for releasing a terrorist. Likewise, a White House wouldn’t be able to escape political repercussions if a terrorist is freed because of the difficulties in trying these cases in criminal courts instead of as acts of war.

Diplomacy was once satirically defined as the patriotic art of lying for one’s country. This approach is hard to sustain in a world that demands transparency. For diplomats, there’s no negotiating around the fact that confidential deals today could be headlines tomorrow.


Full article and photo: http://online.wsj.com/article/SB10001424052970204731804574382571184933610.html

Society Meets The Sixties

The aristos flock to a party. The brownies are spiked with hashish.

Anyone who has seen “Gosford Park” (2001) knows that Julian Fellowes, the movie’s screenwriter, has a knack for mocking the foibles of the British upper crust. In his novel “Snobs” (2005) he skewered the inhabitants of the same milieu even more savagely. “Past Imperfect” shows Mr. Fellowes’s satirical talents to be undiminished. Here, though, he offers a rounded portrait of an aristocratic gratin fighting to preserve its customs and defend its turf.

Mr. Fellowes chooses his moment carefully—precisely a decade after Queen Elizabeth II had summarily ended the ceremonies at which young ladies were presented at court. Until 1958, this rite of passage was the sine qua non for debutantes. How, in the new dispensation, were aristocratic parents (and parvenus) going to marry off their progeny? “Past Imperfect” offers a portrait of Society in 1968, unwilling to yield to democratic norms and fighting to retain its habits and mores.

The effort now centers on a charity ball held at—how the mighty have fallen!—a hotel, albeit a grand one on London’s fashionable Park Lane. “There was hardly a parent there,” Mr. Fellowes writes of the event, “who thought their daughters’ future would be anything more than an extended repeat of their own present. How can they have been so secure in their expectations? Didn’t it occur to them that more change might be on its way? After all, their generation had lived through enough of it to push the world off its axis.”

The novel’s whirl of parties and dances takes place in London’s fabled Swinging Sixties, and some of the era’s telltale iconography makes an appearance. The first-person narrator, a student at Cambridge at the time and now recalling his youth, notes that part of British culture in the 1960s was “about pop and drugs and happenings, and Marianne Faithfull and Mars Bars and free love.” But another part looked back to a traditional England, “where behaviour was laid down according to the practice of, if not many centuries, at least the century immediately before, where everything from clothes to sexual morality was rigidly determined and, if we did not always obey the rules, we knew what they were.”

The view of this social class in “Past Imperfect” is often less than flattering. By the late 1960s, its members are nervous about their status, toying with “the new” and, as ever, eager for the money to keep up appearances. The men still dress in white tie, and even know when it is right to wear it, though ever fewer people care. Other customs threaten to fall away. A debutante attempts to attend the races at Ascot but is forbidden admission because she is—shockingly—wearing trousers. She decides to remove them on the spot, sending nearby photographers into a frenzy. “I suppose I can come in now,” she says calmly to the bow-tied gateman. “I suppose you can,” he answers. Later in the story, an American heiress—the family’s name is Vitkov—arrives in London and tries to crash London society by renting Madame Tussauds, of all places. The aristos flock to her party even so, consuming the brownies that are handed around without quite grasping, until it is too late, that they have been spiked with hashish by a spiteful guest.

The novel’s narrator, a member of the aristocratic class himself, is aware of its foolish side but cannot feel happy about what eventually comes to replace the old order. He observes that a new and affluent group is now living the high life, but its members “do not, unlike their predecessors a century ago, take much responsibility for those less blessed. This new breed feel no need to lead the public in public.”

Mr. Fellowes, it should be said, is not merely committing sociology in “Past Imperfect.” He offers a narrative crowded with incident and memorable characters. The device by which he builds his story—a dying billionaire entrusts the narrator with a quest to discover which one of a half-dozen ladies gave birth to his only heir—is a trifle contrived. But it does give Mr. Fellowes the chance to romp through bedroom and ballroom, not to mention down memory lane. A disastrous dinner party referred to throughout the novel—think shattered crockery and illusions—is not described in full until near the end. Even after all that build-up, it proves worth the wait.

Mr. Rubin is a writer in Pasadena, Calif.


Full article and photo: http://online.wsj.com/article/SB10001424052970203706604574371522977817250.html

Japan Throws the Bums Out

But does the new crowd have better ideas?

It was inevitable that even the Japanese would eventually get fed up with patronage politics, governance gaffes and decades of economic drift.

Yesterday’s election victory of the Democratic Party of Japan and party leader Yukio Hatoyama is no small thing. It ends nearly 54 years of Liberal Democratic Party dominance in Tokyo. The last time this happened—in 1993—a motley coalition of eight parties held power for merely 11 months. The DPJ, by contrast, has been a party for more than a decade and wants to stay for the Lower House’s full four-year term.

Mr. Hatoyama also wields the biggest popular mandate in more than a decade after winning a resounding majority in the Lower House yesterday. The DPJ and its allies now control both legislative houses. Such a political earthquake was last witnessed in 2005, when former Prime Minister Junichiro Koizumi called a snap election to get a popular mandate to reform Japan’s economy and oust antireform MPs.

Yukio Hatoyama, new leader of the Democratic Party of Japan

Mr. Hatoyama follows three lackluster LDP governments. He is—as Barack Obama was to George W. Bush—the anti-Koizumi. His political mantra is yuai, or friendship and love. Mr. Koizumi touted reform. Mr. Hatoyama is an anticapitalist. Mr. Koizumi embraced competition. Mr. Hatoyama wants to embrace Asia and the United Nations. Mr. Koizumi drew closer to the U.S. Mr. Hatoyama thinks China’s rise is inevitable and that Japan should resign itself to making do with the prosperity it accumulated in the past. Mr. Koizumi took a firm line toward Beijing and wanted Japan to start growing again to support a strong defense.

These are not small differences, nor are they marginal to American interests in Asia. At home, Mr. Hatoyama’s Keynesian worship may spell another lost decade of growth for the world’s second-largest economy. He stands for agricultural protectionism, higher minimum wages, higher taxes in the name of environmental responsibility and more handouts to the elderly, parents and unemployed. He wants to protect small- and medium-sized businesses from competition. His pledges to cut taxes are minimal; his goal to cut fat from the budget, vague; and his commitment to free trade, marginal. The phrase “economic growth” scored nary a mention in his campaign pledges.

Mr. Hatoyama’s big reform idea is to attack the bureaucracy, which is a worthy goal and scores big points with voters. But he won’t touch the shibboleth of Japan’s political establishment—the postal service. He wants politicians to make policy, which in any normal democracy would seem banal. But if the policies themselves aren’t better, will that really matter?

On foreign policy, too, the DPJ marks a change from the Koizumi era. Mr. Hatoyama, like other U.S. allies in Asia-Pacific, maintains that the relationship with Washington will continue to be the cornerstone of Japan’s security. He doesn’t have much choice in the matter; Japan has yet to fully normalize its military, and even if it did, it’s unreasonable to think Tokyo could match China’s single-minded military buildup and raw numbers. The DPJ, like the LDP before it, needs the U.S.

But Mr. Hatoyama is intent on scoring populist points at home by talking about distancing Japan from that very alliance. The first thing he’s likely to do is stop Japanese self-defense forces in the Indian Ocean from refueling the U.S.-led coalition in Afghanistan. That won’t have much practical effect, but the symbolism matters. The DPJ also wants to renegotiate U.S. basing agreements and do more with the United Nations, that most effective of fighting forces. Like Mr. Obama, the Japanese leader also likes the utopian idea of a nuclear-free world. North Korea’s recent tests and missile launches make that kind of thinking seem naive.

The remarkable thing is that the Obama administration seems almost wholly unaware of this anticapitalist, anti-U.S. turn of events in its cornerstone ally in North Asia. This is a mistake. Mr. Hatoyama has little experience governing and could use some guidance from Japan’s best and closest ally.

Ms. Kissel is editor of The Wall Street Journal Asia’s editorial page.


Full article and photo: http://online.wsj.com/article/SB10001424052970203706604574381700306393382.html

Why Oil Still Has a Future

Demand in the developing world trumps new technology.

On Aug. 28, 1859, in the backwoods of northwest Pennsylvania, the first successful oil well went into production in the United States, ushering in an energy revolution that would make whale oil obsolete and eventually transform the industrial world. Yet 150 years later, even as demand increases in developing countries, oil’s position in the global economy is being questioned and challenged as never before.

Why this debate about the single most important source of energy—and a very convenient one—that provides 40% of the world’s total energy? There are the traditional concerns—energy security, diversification, political risk, and the potential for conflict among nations over resources. The huge shifts in global income flows raise anxieties about the possible impact on the global balance of power. Some worry that physical supply will run out, although examination of the world’s resource base—including a new analysis of over 800 oil fields—shows ample physical resources below ground. The politics above ground is a separate question.

But two new factors are now fueling the debate. One is the way in which oil has taken on a second identity. It is no longer only a physical commodity. It has also become a financial asset, along with stocks, bonds, currencies and the rest of the world’s financial portfolio. The resulting price volatility—from less than $40 in 2004, to as high as $147.27 in July 2008, back down to $32.40 in December 2008, and now back over $70—has enormous consequences, and not only at the gas station and in terms of public anger. It makes it much more difficult to plan future energy investments, whether in oil and gas or in renewable and alternative fuels. And it can have enormous economic impact; Detroit was sent reeling by what happened at the gas pump in 2007 and 2008 even before the credit crisis. Such volatility can fuel future recessions and inflation.

That volatility has become an explosive political issue. British Prime Minister Gordon Brown and French President Nicolas Sarkozy recently called in these pages for a global solution to “destructive volatility,” although they added that there are “no easy solutions.”

The other new factor is climate change. Whatever the outcome of the upcoming mammoth United Nations climate-change conference in Copenhagen this December, carbon regulation is now part of the future of oil.

But are big cuts in world oil usage possible? Both the U.S. Department of Energy and the International Energy Agency project that global energy use will increase almost 50% between 2006 and 2030—with oil still providing 30% or more of the world’s energy.

The reason is something else that is new—the globalization of demand. No longer are the growth markets for petroleum to be found in North America, Western Europe and Japan. The United States has already hit “peak gasoline demand.”

The demand growth has now shifted, massively, to the fast-growing emerging markets—China, India and the Middle East. Between 2000 and 2007, 85% of the growth in world oil demand was in the developing world. This shift continues: This year, more new cars have been sold in China than in the United States. When economic recovery takes hold, what happens in emerging countries will be the defining factor in the path for overall consumption.

There are two obvious ways to temper demand growth—either roll back economic growth, or find new technologies. The former is not acceptable. Thus, the answer has to lie in technology. The challenge is to find alternatives to oil that can be economically competitive—and convenient and reliable—at the massive scale required.

What will those alternatives be? Batteries and plug-ins and other electric cars—today’s favorite? Advanced biofuels? Natural-gas vehicles? The evolving smart grid, which can integrate plug-ins with greener electric generation? Or advances in the internal combustion engine, increasing fuel efficiency two or three times over?

In truth, we don’t know, and we won’t know for some time. For now, however, it is clear that the much higher levels of support for innovation—and large government incentives and subsidies—will inevitably drive technological change.

For oil, the focus is on transportation. After all, only 2% of America’s electricity is generated by oil. Until recently, it appeared that the race between the electric car and the gasoline-powered car had been decided a century ago, with a decisive win by the gasoline-powered car on the basis of cost and performance. But the race is clearly on again.

Yet, whatever the breakthroughs, the actual impact on fuel use for the next 20 years will be incremental due to the time it takes to get large-scale mass production up and running and the massive scale of the global auto industry. My firm, IHS CERA, projects that with aggressive sales volumes and no major bumps in the road (unusual for new technologies), plug-in hybrids and pure electric vehicles could constitute 25% of new car sales by 2030. But because of the slow turnover of the overall fleet, gasoline consumption would be reduced only modestly below what it would otherwise be. Thereafter, of course, the impact could grow, perhaps very substantially.

But, in the U.S., at least for the next two decades, greater efficiency in the internal combustion engine, advanced diesels, and regular hybrids, combined with second-generation biofuels and new lighter materials, would have a bigger impact sooner. There is, however, a global twist. If small, low-cost electric vehicles really catch on in the auto growth markets in Asia, that would certainly lower the global growth curve for future oil demand.

As to the next 150 years of petroleum, we can hardly even begin to guess. For the next 20 years at least, the unfolding economic saga in emerging markets will continue to make oil a global growth business.

Mr. Yergin, chairman of IHS CERA, is author of “The Prize: The Epic Quest for Oil, Money, and Power” (Free Press), out in a revised edition this year. His article on the future of oil appears in the most recent issue of Foreign Policy.

Full article: http://online.wsj.com/article/SB10001424052970203706604574370511700484236.html

Health Care and the Democratic Soul

It’s time for Obama to channel Harry Truman.

What is at stake in the debate over health care is more than the mere crafting of policy. The issue is now the identity of the Democratic Party.

By now we know that Democrats can bail out traditional Republican constituencies like Wall Street, but it remains to be seen whether they can enact a convincing version of their own signature issue, health-care reform.

At this point, it’s fair to ask whether Democrats remember why health care is their issue in the first place. As health-care debates always have done, this one has pushed to the fore all the big questions about the rightful role of government, and too many Democrats have sought to avoid them with mushy appeals to consensus and bipartisanship. The war is on and if Democrats want to win they need to start fighting.

In the early years of the campaign for national health insurance, the battle lines were more clearly drawn. Back in the ’40s, the issue was part of an “economic bill of rights,” a grand Rooseveltian idea pushed by President Harry S. Truman.

Truman had a knack for populist phrasing. “In 1932 we were attacking the citadel of special privilege and greed,” he declared in accepting the Democratic presidential nomination in 1948. “We were fighting to drive the money changers from the temple. Today, in 1948, we are now the defenders of the stronghold of democracy and of equal opportunity, the haven of the ordinary people of this land and not of the favored classes or the powerful few.”

The Democrats won that particular battle with “the powerful few” but, fighting among themselves as usual, failed to enact national health insurance. Health-care reform nonetheless remained their great cause, their high-voltage appeal to average voters, even those who otherwise saw them as a Harvard-and-Hollywood elite. And even during feeble reform campaigns like President Bill Clinton’s 1993 attempt, the opposite half of the populist melodrama—in that case, the insurance industry—duly acted out its corporate bad-guy role.

This year things were supposed to be different. Democrats hold good-sized majorities in both houses of Congress and are led by an eloquent president who won an undeniable mandate last November. This time, the Democrats got the traditional opponents of health-care reform on board: “Ex-Foes of Health-Care Reform Emerge as Supporters” declared a headline in the Washington Post in March, over a story describing a friendly summit meeting between Mr. Obama and various health-care industry representatives.

This time the health-care fight was to be what official Washington loves: An act of cold consensus, not of hot idealism or Trumanesque populism. All the “stakeholders” would be taken care of. No one would need to get his suit ruffled.

And all it took to send the whole thing crashing to the ground, it now appears, were a few groundless rumors and a handful of angry right wingers who figured out how to game town-hall meetings and get themselves on TV. “Today there is another populist revolt afoot,” wrote Gary Bauer in Human Events last week, hailing the righteous grassroots outrage he sees in the town-hall protests.

So we have come full circle: The reformers shake hands with the special interests, while conservatives denounce the whole thing in the name of the common man and the Founding Fathers.

After I listened to a few angry town-hall meetings on the radio, the situation was clear to me. Democrats had to meet this pseudo-populist challenge by rolling out the real thing, the New Deal vision that is their party’s raison d’être.

So far, however, many in the party’s leadership haven’t been able to awaken from their bipartisan reverie. When Mr. Obama found his plans under attack, for example, he promptly began to downplay the “public option,” an obvious predicate to cutting a deal and placating the insurance industry. In other words, the prospect of a populist outburst from the right apparently moved him toward abandoning the most populist element of his party’s plans and toward an even more Beltwayist position—to move that much closer to the caricature of Democrats traditionally drawn by the right.

Mr. Obama still has time to reverse course. A great deal depends on it. To fail on health care yet again might well be the “Waterloo” Republicans dream of. And yet, as the party’s leaders click through their PowerPoint presentations and review the complicated details, they seem unable to confront the biggest questions that the right is asking, the ones about the eternal perfidy of government.

Maybe Democrats are afraid it will hurt their standing with those generous fellows on K Street if they channel Harry Truman and say what needs to be said: That government can be made to work for average people. But it will hurt even worse if they refuse to say it.

Thomas Frank, Wall Street Journal


Full article: http://online.wsj.com/article/SB10001424052970203706604574373000964995482.html

Early dinner

Fried egg

For a lower-middle class boy from Liverpool, a plate of egg and chips at five o’clock was not the done thing, recalls Laurie Taylor in his weekly column.

It was the egg and chips which first made me realise that Jim lived in a different world.

We’d gone back to his terrace house in Bootle one day after school and were sitting at the table in the back room when his dad came home from a long shift on the railways. I remember him saying “Hello” as he saw his son and me at the table but he then vanished into the tiny kitchen.

Jim and I went on chatting for a few minutes about school and Liverpool’s chances in the coming Saturday game until suddenly his dad re-appeared and without a single word placed a big plate of egg and chips and a steaming sugary mug of tea in front of each of us.

The egg and chips were delicious. No doubt about it. And the tea was just great. But even as I followed Jim’s example and finger-dipped my chips in the runny yolk I felt confused by their sudden appearance.

Had Jim exchanged some hidden sign with his dad that said he was ready for egg and chips and tea? And why had I been automatically included? And why had it been so readily assumed that I wanted or even liked egg and chips? And why had no one even asked how much sugar I wanted in my mug of tea? And why, come to think of it, were we so happily wading into such a substantial meal at just after five in the afternoon?

Of course, the answer to all these questions was quite straightforward. Jim and his dad were working class. And members of the working class at that time thought it completely natural to eat at five o’clock in the afternoon. But even more, as members of the working class they took it for granted that everyone else ate at that time and would happily regard egg and chips and tea as the perfect meal for the occasion.

Terraced life

How different from my own dear lower middle-class home where eating a heavy meal in the late afternoon would have been regarded as dangerously close to a satanic rite. Neither would my mother have ever tolerated egg and chips on her dinner table, or, even, in her wildest dreams, have allowed any member of the family to accompany any meal at all with a mug of steaming tea.

The more time I spent with Jim the more I came to realise the taken-for-granted aspect of so much of the terraced life around him. In Jim’s road everyone seemed to smoke Woodbines, read the Daily Mirror, take coach trips to Blackpool to see the lights, have a regular flutter on the horses, eat tins of assorted biscuits, drink mild and bitter (“mixed”), and finish off any evening out with a bag of fish and chips. No-one I knew in my road in Crosby did any of these things.

I realised, of course, that it was hard cash which determined some of these choices, but I also sensed that everybody did much the same as everybody else because that was a way of saying that you weren’t too posh or stuck up or different.

When I went on from school to college in Kent and began to talk in this way about working class life in Liverpool I was accused of being sentimental and romantic. My new friends pointed out the sins of the working class – their drunkenness and violence and sexism.

At the time I was snobbish enough to accept much of this argument. I began to wonder how I could ever have seen life in Bootle as somehow worth celebrating.

But at the end of my first year I came across a copy of Richard Hoggart’s The Uses of Literacy. I read every word, placing ticks in the margin to record the similarities between life in my Bootle and his Hunslet. And I insisted on reading out chunks to my snooty new friends. Compare this, I said imperiously, with your own isolated, miserable, bourgeois lives.

But reading Hoggart did make me wonder why not one of my Bootle friends had ever expressed any personal pleasure at the way they lived their lives. Had they been no more able than I was to see its great strength and vitality? And then one day in the early 70s I heard the perfect answer.

The Liverpool sculptor, Arthur Dooley, was talking on the radio about the destruction of even more Liverpool terraces. The architect who was responsible for this latest bout of demolition sought to justify his action by telling Dooley that not one of the residents had complained about being moved out to the new tower block estates on the edges of the city.

Dooley was not convinced. “Let me tell you this,” he said in his strong Liverpool accent, “there’s no-one as easy to rob of their culture as those folks that don’t know they’ve got one.”


Full article and photo: http://news.bbc.co.uk/2/hi/uk_news/magazine/8223453.stm

Interrogating the CIA


A clever, streetwise classmate of mine at the Central Intelligence Agency’s junior officer training program—a former Delta Force officer—quickly and rudely discovered that counterterrorism in the much-vaunted Reagan years wasn’t a serious endeavor at Langley. He had original and provocative ideas on using physical force to scare the bejesus out of terrorist suspects who had American blood on their hands. Although the CIA was then filling up with operatives pretending to be engaged against a growing terrorist menace, Langley’s counterterrorist data bank and real operational planning were near zero. My friend’s ideas were too unsettling. He resigned. By the time I resigned in 1994, CIA counterterrorism had become an inflexible, lumbering creature, incapable of countering the wicked anti-American forces gaining strength in the Middle East.

Fast forward to eight years after 9/11: Has Attorney General Eric Holder damaged the CIA’s improved counterterrorist capacity by his decision to employ a special prosecutor to investigate whether crimes were committed by the agency’s interrogators? From the moment Barack Obama won the presidency, Langley’s use of “enhanced interrogation” was obviously over. The appointment of a prosecutor guarantees that unless the United States is again devastated by a terrorist attack—on a scale greater than 9/11—CIA operatives will certainly decline any future order by a Republican president to roughly interrogate a jihadist. Langley’s junior officers may still receive survival and escape training, which is the baptismal font for the agency’s enhanced interrogation techniques. But members of al Qaeda will not similarly get to enjoy the experience.

Constrained by new rules and hostile lawyers, can the CIA in the future successfully interrogate uncooperative jihadists, like self-described 9/11 mastermind Khalid Sheikh Mohammed, who remained as close-mouthed as a clam when questioned without physical coercion? The Obama White House has been enamored of the possibilities of soft power; jihadists, too, are now supposed to yield to the psychological prowess of interrogators who play by the rules of the Federal Bureau of Investigation. Will Langley be able to develop and retain interrogators culturally and linguistically qualified under the administration’s new plans, which will have the White House and the FBI overseeing all counterterrorist interrogations? Such outside control is, among other things, meant to ensure that the CIA, which originally generated the idea of enhanced interrogation, will never again be a font of such unpleasant creativity.

Regardless of whether one believes CIA-inflicted waterboarding, sleep deprivation or severe psychological coercion (suggesting that harm could come to a family member of a taciturn al Qaeda detainee) constitute torture, such actions may have produced an intelligence bonanza and saved thousands of lives. The released and heavily redacted 2004 CIA Inspector General’s report on interrogations doesn’t make a crystal clear case in favor of enhanced interrogation, but it certainly does suggest—and one has the distinct impression that the Inspector General was personally inclined against the rough treatment—that senior officers in the Directorate of Operations consistently found the interrogations to be valuable in collecting critical information against some members of al Qaeda, especially Mr. Mohammed.

We will never know whether being nicer to Mr. Mohammed—building rapport—would have eventually worked and produced the same or better results than CIA methods did. It’s possible. But what those who argue this position are really saying is that even the hardest holy warriors—men who live to die and slaughter infidels as an expression of divine love and vengeance—will always yield to physically noncoercive methods that don’t have that much psychological punch either. This is a very Christian way of looking at interrogation: FBI agents are supposed to reach into the souls of jihadists and as father-confessors get them to voluntarily cooperate. It’s morally redemptive for all concerned. It’s neat and clean. No deadly plots go unbroken.

Even if the 3,000 intelligence reports produced between Sept. 11, 2001 and April 2003 from the CIA’s “high value detainees,” that is, the folks who likely received rough treatment, were released, we might not resolve the debate between those who believe exclusively in the utility of rapport-building interrogations and those more skeptical about FBI methods applied to those who fly airplanes into skyscrapers. But the publication of these documents would probably help. Former Vice President Dick Cheney, a busy man, undoubtedly just read the summaries given him by the CIA and believed them; Mr. Obama, a busier man, has certainly by now perused the same operational assessments and dismissed them. (Mr. Obama could hardly do otherwise since he’d so emphatically declared during the campaign, before having classified access to Langley’s work, that enhanced interrogation had both disgraced us and made us less safe.)

Until these reports are made public, or at a minimum the detailed and regular agency assessments of the reports’ value are released, we on the outside cannot better assess whether enhanced interrogation techniques worked. Mr. Obama has certainly set the stage for an enormous row that could well consume much of the energy of his administration if the special prosecutor brings charges against any CIA official for the way he interrogated an al Qaeda terrorist with 9/11 blood on his hands. And it’s an excellent bet that if the Justice Department starts prosecuting CIA officers, some hard-left European magistrates, who are still furious that their governments abetted the Bush administration’s counterterrorism in clandestine ways, won’t be far behind in bringing lawsuits against U.S. officials. Euro-American security ties are strong (self-interest is a strong glue) and have been mostly immune to the storms that regularly strike the trans-Atlantic community (the invasion of Iraq actually deepened the intelligence exchanges between us and the antiwar French and Germans).

But the prosecution of high-profile CIA and Bush administration officials for “torture” could well spotlight U.S.-European clandestine dealings sufficiently to make them subject to political litmus tests—something that has rarely happened even with ardently leftist European governments.

As difficult as these problems could prove for the Obama administration, the CIA and the Justice Department, there’s a more immediate operational issue for the clandestine service: Langley, once again, probably cannot field a competent group of counterterrorist interrogators.

It’s a very good guess that the organization right now has no volunteers coming forward for this work, and those who are currently indentured will free themselves from this profession as soon as possible. This may not be a pressing problem if the CIA doesn’t have anyone to interrogate, which was the case throughout most of the 1990s. That changed after 9/11, but even then it’s very unlikely that the best and the brightest at the agency involved themselves with the nuts and bolts and unpleasantness of interrogating “high-value” al Qaeda detainees. Complex debriefings—let alone more aggressive interrogations—in foreign languages have rarely been an agency forte. Such things are very hard work and don’t guarantee promotions.

Inexperienced officers have usually been on the agency’s frontline. And as the 2004 Inspector General report makes clear, CIA officials were early on nervous about the interrogations. Rest assured that this meant that most case officers—especially those with field experience, Middle Eastern language skills and good opportunities for traditional, perk-filled assignments abroad—kept far away from anything touching upon the interrogation of al Qaeda terrorists.

Standard job rotations in the CIA have always been enough to keep professionalism from developing inside the clandestine service against most targets—operatives work two or three years on a subject or country and then move on. From discussions with active-duty CIA officers since 9/11, I have the strong impression that counterterrorism hasn’t been exempted from Langley’s constantly revolving doors. When real competence develops among an operational cadre it is inevitably because individual officers have a special drive to develop it, usually because of an insufficiently requited love of a subject.

A good case officer with Middle Eastern languages and a penchant for understanding Islamic radicalism would now have to be insane to accept an assignment that detailed him to interrogate Islamic terrorist suspects. No self-respecting case officer wants to be constantly surveilled by his boss. That’s not the way the intelligence business works, which is, when it works, an idiosyncratic, intimate affair. We should be horrified by the idea that holy warriors will now be questioned by operatives who tolerate all the cover-your-tush paperwork, who don’t mind being videoed when they go to work, who want to be second-guessed by their CIA bosses, let alone by FBI agents, and intelligence-committee Congressional staffers, and now White House officials.

The war on terrorism obviously isn’t what it used to be (invading countries initially produces a lot of intelligence work). If the White House is unwilling to detain terrorist suspects in facilities like Guantanamo, it is doubtful that it will want to capture many individuals for interrogations. Since the Obama administration has retained rendition, it has an escape valve that it can use to discard suspected or confirmed terrorists whom the administration wouldn’t want to prosecute in the U.S. criminal justice system (a position not at all unlikely given the difficulties of using intelligence information in U.S. courts).

Rendition isn’t risk-free, as President George W. Bush learned all too well. The Obama administration will surely use rendition when it must (quietly transporting suspects out of the country just to empty the jails of Guantanamo would be tricky and politically precarious). But it will likely not use rendition often enough to allow case officers in the field a means to examine would-be terrorists without stultifying concerns about what to do with them after the heart-to-heart chats.

Case officers only get good at hunting their prey—at prying into the minds of their targets—by constant work and by pushing the envelope. With enhanced interrogation off-limits, CIA operatives could easily find themselves face-to-face with a jihadist who tells them to bugger off. What are they then to do? Will their superiors be professionally sensitive to their inability to make further progress? Could they get promoted after they pass suspected jihadists to the FBI? Would the FBI even take them, knowing that they might have to be rendered to an unsavory foreign power and thereby quite possibly compromise the bureau’s more pristine image? (It will be a near-miracle if the Obama administration can long hide its renditions from the press given the number of Democrats within the administration in sensitive positions who may strongly oppose rendition to any country willing to take in suspected jihadists.)

American counterterrorism has now enthusiastically shifted from the “gloves coming off” to a post-post-9/11 determination to return American virtue to what it supposedly once was. Unless Langley now piles on cash bonuses—and CIA bonuses usually aren’t compelling—the incentives for agency officers to join the White House’s new plans for a multiagency “professional” cadre of interrogators will go nowhere. Langley will be lucky if it can get the third-rate among its own to sign on. And one has to wonder about the better agents at the FBI, which still hasn’t happily made the transition into a counterterrorist organization. Who would want to join an interrogation outfit that sounds so politically correct and sensitive?

Throughout the 1990s, FBI offices grew rapidly overseas. In some places, the bureau’s men actually took over the offices of CIA station chiefs, pushing the bureaucratic equivalent of four-star generals into much smaller digs. Returning rapidly to a pre-9/11 world, the Obama administration seems poised to give the FBI overwhelming responsibility for counterterrorism at home and abroad. The CIA is no longer the pre-eminent agency in the fight against Islamic militancy. It hardly did a superlative job. But many will not be rejoicing at the rise once again of the FBI in counterterrorism. Being “virtuous” may not look so good in hindsight.

Mr. Gerecht, a senior fellow at the Foundation for Defense of Democracies, is a former operative in the CIA’s clandestine service.


Full article and photo: http://online.wsj.com/article/SB10001424052970203706604574377130844113174.html#mod=article-outset-box