Lunacy on the left — and some sanity

Sunday, October 21, 2001

On any important issue, you can expect to hear a vigorous argument between liberals and conservatives. But when it comes to dealing with the Sept. 11 attacks, the most noteworthy debate is the one going on between people on the left and other people on the left.

In that discussion, some commentators have shown a firm grasp of reality and unapologetic patriotism. Others, however, have been striving mightily to make themselves into national laughingstocks.

There are plenty of conservatives, of course, who have used the opportunity to permanently disqualify themselves from ever being taken seriously again — including Ann Coulter, Jerry Falwell and Pat Robertson. But for sheer volume of imbecility, it’s hard to outdo the blame-America-first crowd of mindless peaceniks and twitchy conspiracy theorists. The New Republic, a moderately liberal political weekly, has been running a regular feature called “Idiocy Watch,” highlighting particularly lamebrained comments, and the forum has so far been dominated by lefties.

In the immediate aftermath of the Sept. 11 terrorism, writer Susan Sontag found herself barraged with criticism for decrying the “self-righteous drivel and outright deception being peddled by public figures and TV commentators,” while assailing our “cowardly” approach to war.

Weeks later, her bout of temporary insanity is beginning to look permanent. “I don’t like throwing biscuits and peanut butter and jam and napkins, little snack packages produced in a small city in Texas, so we can say, ‘Look, we’re doing something humanitarian,’ ” sneers Sontag in an interview with the online magazine Salon. She fears that the government is about to deport all Muslims and declare martial law, which is about as likely as Susan Sontag being appointed secretary of defense.

Humorist Michael Moore, who made the anti-capitalist movie “Roger & Me,” finds reasons to yuk it up. “Finally, the bombs are raining down on Afghanistan, and as Martha Stewart says, that’s a good thing,” he brays on his Web site. “Yesiree, I say, BOMBS AWAY! Rockets red glare. We are all WHITE WITH FOAM!” I don’t know about you, but I laughed till my ribs hurt.

Professional moviemaker and amateur paranoid Oliver Stone saw the terrorist attacks as the fault of the Republican Party. “Does anybody make a connection between the 2000 election and the events of Sept. 11?” he asked at a recent panel discussion in New York, which I take to mean that the terrorists were bitter that Al Gore didn’t win.

Stone searched hard and found something good to say about Osama bin Laden and Co.: “The new world order is about order and control. This attack was pure chaos, and chaos is energy. All great changes have come from people or events that were initially misunderstood, and seemed frightening, like madmen.”

Does it seem odd to you that Oliver Stone would feel affinity with madmen? Me neither.

Fortunately, you can find plenty of left-of-center commentators who prefer their country and its ideals over those of the enemy. At the New York forum, writer Christopher Hitchens lambasted Stone as a “moral idiot, as well as an intellectual idiot.” The attack, he said, was “state-supported mass murder, using civilians as missiles.”

The Nation magazine, perhaps the best-known organ of leftist thought, published a column by Katha Pollitt on why she refused to let her daughter decorate their living-room window with an American flag — which Mom regards as a symbol of “jingoism and vengeance and war.” (The daughter retorts that the flag “means standing together and honoring the dead and saying no to terrorism.” Where does she get such ideas?)

But many other Nation contributors have taken a sharply different view from Pollitt’s. Writing in the latest issue, Princeton professor Richard Falk agrees with those who fault the U.S. as an imperialist power, but says that critique is “dangerously inappropriate in addressing the challenge posed by the massive crime against humanity committed on Sept. 11.” The American role in world affairs, he argues, “cannot be addressed so long as this movement of global terrorism is at large and prepared to carry on with its demonic work.”

Nation columnist Eric Alterman has no charity toward those who feel no patriotism at a time like this. Some of them, he says, “really do hate their country. These leftists find nothing to admire in its magnificent Constitution; its fitful history of struggle toward greater freedom for women, minorities and other historically oppressed groups; and its values, however imperfectly or hypocritically manifested in everyday life.” For Alterman, “patriotism requires no apologies.”

Those who feel differently — who can’t take their own country’s side when it is under attack by murderous foreign theocrats — should find themselves disgraced and ignored long after this struggle is over. In wartime, as leftists like Alterman understand, stupidity is not forgivable.

Should we use torture to stop terrorism?

Thursday, November 1, 2001

It’s the sort of question that, way back in spring semester, would have made for a good late-night bull session in a college dorm room: If an atomic bomb were about to be detonated in Manhattan, would police be justified in torturing the terrorist who planted it to learn its location and save the city? But today, the debates are starting up in the higher reaches of the federal government. And this time, the answers really matter.

Last week, The Washington Post reported great frustration in the FBI and Justice Department over the stubborn silence of four suspected terrorists arrested after Sept. 11, including one who wanted lessons in steering a commercial aircraft but had no interest in taking off or landing. Unless they can administer truth serum or torture, law enforcement officials fear, they may never get information about planned attacks that still are in the works. American lives could therefore be lost.

The question posed above is easy to answer. No one could possibly justify sacrificing millions of lives to spare a murderous psychopath a brief spell of intense pain, which he can end by his own choice. When the threat is so gigantic and the solution so simple, we are all in the camp of the Shakespeare character who said, “There is no virtue like necessity.”

This concession to reality requires no great rethinking of fundamental principles. Rules that suffice for normal circumstances often have to be suspended in emergencies. We have laws against burglary and theft, and for good reason: Society couldn’t function if homes and property had no protection. But if a starving plane-crash victim stranded in the wild broke into a locked cabin to get food, he wouldn’t be sent to prison.

The complications of the torture issue arise once you move from the extreme hypothetical case to the messiness and uncertainty of the real world. Almost everyone would agree it’s permissible to use forcible interrogation methods to prevent nuclear holocaust. But it’s impossible to write a law that restricts the use of torture to cases where 1) a considerable number of lives are in peril, and 2) police are sure they have a guilty party who can provide the information needed to avert the catastrophe. The brutal techniques are therefore likely to spread.

We know that from experience. Most states that employ torture do it pretty much anytime it suits their law enforcement purposes. And Israel, the rare government to attempt to impose clear standards and limits on the use of coercion, found that the exception threatened to swallow the rule.

With an eye to the “ticking bomb” scenario, Israel authorized the use of “moderate physical pressure” to persuade suspected terrorists to talk — including shaking them, covering their heads with foul-smelling hoods, putting them in cold showers, depriving them of sleep for days on end, forcing them to crouch in awkward positions, and the like. These were needed, the government said, because of the chronic threat of Palestinian attacks on civilian and military targets. And, besides, they weren’t really torture.

But this option quickly expanded beyond the cases where it might be excused. An Israeli human-rights group that successfully challenged these methods in court said that 85 percent of Arabs arrested each year by the General Security Service — including many never charged with a crime — were subjected to such abuse. That works out to thousands of victims over the years.

Israel found its carefully controlled approach escaping control in two ways. First, the brutal techniques were soon used in routine cases, not just extreme ones. Second, “moderate” pressure sometimes became immoderate: An estimated 10 detainees died from their mistreatment.

The problem is not with Israel but with human nature. To a man with a hammer, the old saying goes, everything looks like a nail. Give police and security agents in any country a tool and they’ll want to use it, and even overuse it. If the government were to torture the suspects arrested after Sept. 11, it might find they don’t know anything important.

There are, of course, other options for inducing cooperation from suspected lawbreakers, including carrots (light sentences, money, relocation with a new identity) and sticks (long sentences, extradition to countries known for harsh punishments). That strategy has worked on other terrorists, like the one caught trying to sneak explosives into the U.S. for a millennium attack.

So it would not be wise to formally authorize the use of torture to combat terrorism. And what if the cops someday have to try it to save New York City from a nuclear blast? I trust they’ll do what they have to do, and forgiveness will follow.

Is John Walker a failure of liberalism?

Sunday, December 16, 2001

He’s an ordinary suburban kid who was dissatisfied with the anything-goes culture of modern America. So he set off on his own to serve God and follow a strict code of morality, though the trendy people he grew up with might scoff. He even became a soldier.

In another context — say, if he had become a born-again Christian and joined up with rebels in Iraq — John Walker might be a conservative hero. Instead, since he went off to Afghanistan to fight with the Taliban, he and his parents are being used as a prime example of what’s wrong with liberals.

Walker, you see, grew up in affluent and left-leaning Marin County, Calif., which tells some conservatives everything they need to know. “He was prepared for this seduction not just by the wispy relativism of Marin County, but also by a much broader post-60s cultural liberalism that gave his every step toward treason a feel of authenticity and authority,” pronounced Hoover Institution scholar Shelby Steele.

The Wall Street Journal said Walker and CIA officer Johnny Michael Spann, who was killed in a riot at the prison where Walker was being held, came from “two Americas that don’t even speak the same language.” The Journal editors said they found Spann’s world “refreshingly unenlightened” compared to the squishy permissiveness that infects Walker’s hometown.

Critics on the right had a field day deriding Walker’s parents, who were found guilty of a variety of sins. The kid was named for John Lennon! He went to an alternative high school! His mother had an interest in Buddhism! They sent him money even after he fell in with Islamic zealots abroad!

And what do you think Dad had to say after his son was found carrying an AK-47 against his own country? “I don’t think John was doing anything wrong,” Frank Lindh offered, in words that seemed designed to evoke winces. “We want to give him a big hug and then a little kick in the butt for not telling us what he was up to.”

By my lights, that’s taking parental understanding a bit too far. But what is a father supposed to do when a child he has loved and cherished from birth goes astray, placing himself in mortal danger? Maybe there are some parents out there who would say, “Kill the traitor,” but not many.

Most parents, if one of their children faced possible execution for his crimes, would choose to support him rather than abandon him. That is not the same thing as excusing his conduct. If Lindh plans to castigate his son for his grossly repellent choices — and for all we know, he does — he can hardly be blamed for preferring to do it in a private family conversation rather than on “Good Morning America.”

The rush to blame Walker’s crimes on his free-thinking parents and his wealthy, liberal hometown is way too facile. Timothy McVeigh came from what conservatives might call a “refreshingly unenlightened” place — Pendleton, N.Y., a blue-collar town of 5,000 people near Buffalo. He was also an Army veteran who saw combat during the Gulf War. But I don’t recall any conservatives saying that something rotten in the culture of Pendleton or Ft. Riley, Kan., brought on the Oklahoma City bombing.

Likewise, Theodore Kaczynski grew up in the heavily Catholic, salt-of-the-earth Chicago suburb of Evergreen Park, which is known as “The Village of Churches.” But when the Unabomber was finally caught, no one blamed his murderous attacks on the pervasiveness of Christianity and patriotism in his youthful surroundings.

Plenty of bad people have grown up in wealthy, permissive, liberal towns — and plenty have grown up in middle-class, authoritarian, conservative ones. Human nature is the same in both places, and neither environment guarantees good citizenship. If old-fashioned moral attitudes are more likely to provide a reliable check on our baser impulses, why is it that murder rates are higher in Bible-Belt states like Mississippi and Alabama than in more liberal locales? Why do mass school shootings typically take place in Norman Rockwell country instead of Cambridge or Berkeley?

As for judging the influence of Walker’s family, long-distance psychiatry is not terribly reliable. Maybe his parents did a poor job raising him, or maybe he was headed for trouble no matter what they did. Evil and stupidity are often hard to comprehend. Good parents can produce bad kids, just as bad parents can yield good kids. We shouldn’t assume that someone else, with a stronger backbone and clear rules, would have had any more success with Walker than his parents had.

Conservatives insist the Walker case proves that if you don’t raise children with traditional moral values, some of them will veer wildly out of control. They’re right, of course. But they neglect to mention that if you do raise children with traditional moral values, some of them will do exactly the same thing.

Some lessons we have learned

Monday, December 31, 2001

A year ago, Americans were still recovering from a profound trauma: an excruciatingly close and bitterly disputed presidential election. It had to be resolved by the courts after a five-week legal battle that left George W. Bush, who came in second in the popular vote, with the most tainted victory in our history.

Well, we thought that was a profound trauma. Viewed through the haze cast by certain events that took place one morning in September, the case of Bush vs. Gore now looks like a petty squabble of unprincipled partisans, not the momentous struggle between good and evil that advocates on either side claimed at the time. Today, if you asked people for a personal embodiment of evil, not many Americans would name George W. Bush or Al Gore. And I doubt one in 50 could identify David Boies, who last December was famous — really — as Gore’s attorney.

Living through a horror is not really something to be recommended. Legend has it that after former Treasury Secretary John Connally and his wife had to declare bankruptcy back in 1987, a friend assured Nellie that the experience would make her a better, stronger person. “I didn’t want to be better or stronger,” she replied. Champions of collective discipline think the war on terrorism will firm up our national character. But there was really nothing wrong with our national character, as our determined response to Sept. 11 makes clear.

Still, finding ourselves suddenly at war had the same useful effects that a brush with death can have: illuminating truths that were not apparent before, and sharpening our sense of what is important. The government has been forcefully reminded that of all the countless responsibilities it has assumed in recent decades, none can match the gravity of its first duty: protecting its citizens from foreign enemies. Next to that, providing a Medicare prescription drug benefit — one of the main issues of that long-ago 2000 presidential campaign — seems to fall short of being absolutely essential.

Our leaders in Washington have also learned that when war is truly necessary, Americans will support it with almost universal fervor and resolve. A handful of left-wing critics, it’s true, did us the favor of proving that they are incapable of speaking up for their country even when it is under attack. Many commentators expected that ordinary people would likewise turn against our action in Afghanistan if things didn’t go well, something that had happened in previous conflicts.

But this war was fundamentally different from every war Americans have been asked to support over the last half-century. We didn’t embark on it because someone said it was needed to deter communist aggression, or preserve our credibility abroad, or prevent dominoes from falling, or shore up NATO, or avert a humanitarian crisis in one place, or enhance stability in another. We embarked on it because someone killed thousands of our fellow citizens and had every intention of killing more.

Osama bin Laden no doubt learned a lot as well from the aftermath of the attacks. He saw us leave Lebanon and Somalia when we suffered some casualties. He watched us respond ineffectually when terrorists attacked American embassies in Kenya and Tanzania, an American military installation in Saudi Arabia, and an American destroyer in Yemen. So he thought he could carry out horrendous massacres on U.S. soil and pay no price.

That turned out to be the biggest miscalculation since Sen. James Chesnut of South Carolina, ridiculing the notion that the North would fight to preserve the Union, offered to drink all the blood that would be shed over secession. Like Chesnut, bin Laden has discovered that though Americans are a peaceable folk, their forbearance has limits.

At the same time, Americans have refused to let their fear override their principles. Free speech is alive and well. Though dissenters may be disregarded, they haven’t been persecuted. Civil liberties advocates are justified in asking why the government has detained hundreds of non-citizens without showing they pose a danger. But compared to what happened in past wars, that’s a small matter. The gross overreactions that some people feared haven’t emerged.

The most striking fact about our response to this crisis is not how badly suspect groups (such as Muslims, Arabs and other dusky-complexioned individuals) have fared but how well. One poll found that American attitudes about Muslims improved after the attacks. In March, 45 percent of those surveyed expressed a positive view of Muslims. By November, 59 percent had a favorable opinion of them.

Not long ago, Muslims and other Americans wondered if normality could survive the Sept. 11 attacks. Today, it certainly looks that way.

Gay adoption: In a child’s best interest

A law allowing gay parents to adopt children is not “depriving” children of the benefits of a mom and dad if they don’t have them to begin with

Sunday, February 10, 2002

When the American Academy of Pediatrics published a new policy statement in favor of letting gays and lesbians adopt children, it drew the expected response from hard-line conservatives who, upon hearing the word “homosexual,” have the urge to run screaming from the room.

“There is an abundance of research that children do best when raised by a mother and a father who are committed to one another in marriage,” asserted Ken Connor, president of the Family Research Council. “To support a policy that would intentionally deprive a child of such benefits is unconscionable.” Sandy Rios, head of Concerned Women for America, said, “As the single mother of a son, I can see quite clearly that having a mother and a father together would be far better for my son.”

This reaction was an exercise in missing the point. Maybe a child living with his biological mother and her lesbian partner would be better off living with a heterosexual married couple. Maybe he would be better off living with the queen of England.

But the fact is, the child is not living with a heterosexual married couple. A law allowing custodial gay parents to adopt their children is not “depriving” a child of the benefits of a mother and father if the child doesn’t have them to begin with. A woman or man may be widowed after having a child and later move in with someone of the same sex. Or a woman may choose to bear a child on her own and then enter a long-term lesbian relationship. Or two women may decide together that one should be artificially inseminated to conceive a baby that they will raise together. None of these arrangements is illegal or uncommon. In any of them, the result is a child being brought up by a same-sex couple.

Though it may distress some conservatives to hear it, there is nothing to prevent homosexuals from falling in love and setting up housekeeping. We are pretty much past the stage of persecuting people for their sexual orientation. And if gays and lesbians are allowed to have stable homosexual relationships, some of those are bound to involve children.

Estimates are that anywhere from 1 million to 9 million children have at least one homosexual parent, and many of these live with same-sex couples. The pediatric group’s statement acknowledges that the evidence on how these children fare is based on “small and nonrepresentative samples,” but says the available information suggests “there is no systematic difference between gay and nongay parents” and no “risk to children as a result of growing up in a family with one or more gay parents.”

Experts may argue about the data, but it’s irrelevant to the issue addressed by the academy: whether the law should allow “second-parent” adoption by one member of a gay couple. Some states ban such adoptions, and many others leave the matter to the whim of judges. Such adoptions are clearly sanctioned in only seven states (including Illinois).

But many children are already living with same-sex couples — most often, a biological parent and his or her partner. No one, even at the FRC and CWA, is proposing that these kids be removed from these homes and placed with the Osmond family. Like it or not, they will be raised by gays.

The question is how to protect the kids in these homes. What the critics insist on ignoring is that the debate is not about the rights of homosexuals. It’s about the welfare of children.

Children gain nothing from laws that prevent adoption by second parents. Just the opposite. If Heather has two mommies but Mommy No. 2 can’t adopt her, she’s worse off than a child with two legally recognized parents.

As the Human Rights Campaign Foundation notes, if she gets sick, Mommy No. 2’s health insurance policy may not cover her. If her second mother dies, she won’t be eligible for Social Security survivor benefits, as other children are, and she may inherit nothing. If her biological mother dies, she may be removed from the only home she’s known and placed somewhere else.

Gay couples, like other couples, sometimes split up, and in that case, Heather may not be able to get child support. Or her second mom may not get visitation rights, depriving the child of contact with a loved parent. A lot of things can happen if second-parent adoptions are not allowed, and all of them are bad.

The effect of such laws is to say that homosexual parents may not adopt the children they raise even if adoption would be the best thing for the children. Anti-gay conservatives say they want to protect kids. Their policies say something else.

Silencing dissent near ground zero

Sunday, February 24, 2002

William Harvey has a publicist’s uncanny knack for knowing how and where to place a message to make sure it gets the maximum response from an interested audience. But Harvey is not a publicist. He’s an opinionated New Yorker whose talent for communication has earned him a criminal indictment.

On Oct. 4, just a few weeks after the terrorist attacks on New York City and Washington, D.C., he showed up in military fatigues on a corner not far from ground zero. He was carrying a sign with Osama bin Laden superimposed over the World Trade Center buildings and handing out leaflets setting forth his belief that “America is getting paid back for what it’s doing to Islamic countries.”

A crowd quickly gathered on the sidewalk around him, and it didn’t consist of well-wishers. With memories of the collapsing towers still painfully vivid, passersby screamed obscenities, demanded that he be locked up, and even threatened to kill him. A police officer surveyed the scene and placed him under arrest.

Harvey was charged with disorderly conduct. Why? Because in the view of the officer, he deliberately “obstructed vehicular and pedestrian traffic.” The Manhattan district attorney’s office decided to pursue the case, and earlier this month, a county judge rejected Harvey’s claim that he was fully within his constitutional rights. He’s scheduled to go on trial in April.

The defendant says he is being punished merely for expressing unpopular views in a public place. The judge, however, insists that his views are not at issue. “It is the reaction which speech engenders, not the content of the speech, that is the heart of disorderly conduct,” he declared. It’s reasonable to assume, said the judge, that he knew he was going to create “public inconvenience, annoyance or alarm.” By his thinking, if someone becomes disorderly because he’s angry over what Harvey said, then Harvey rather than his listener is in violation of the law.

But the 1st Amendment does not exist merely to protect Harvey’s right to say things that won’t upset anyone, or to say them only in places where no one will care enough to stop and listen. And it’s not needed to assure the freedom of Americans to call Osama bin Laden an evil terrorist whose actions cannot possibly be justified. People with that view (which includes me) don’t have to worry about police and prosecutors coming after them.

No, the constitutional mandate was created specifically to safeguard opinions that most of us despise and many of us would like to silence. It was meant to uphold the minority’s right to speak, especially in the face of majority opposition — no matter how stupid the minority or how vehement the majority. Harvey’s indictment, however, is based on the assumption that listeners have a right not to hear anything that may throw them into a fury.

If getting in the way of pedestrian traffic is a crime, of course, it’s not just Harvey but his disgruntled listeners who are guilty. But the police apparently didn’t arrest any of the others. And it’s impossible to believe that the cop would have arrested Harvey if he had drawn a crowd by denouncing bin Laden. “It’s a heckler’s veto,” says UCLA law professor Eugene Volokh. “Anytime I threaten a guy, he gets arrested and I don’t.”

But the heckler’s veto has been rejected by the Supreme Court over and over, in cases where the threat to public order was far greater than it was this time. In a 1949 case, for example, a man was arrested for disorderly conduct after delivering a speech so inflammatory it produced disturbances in a crowd of some 1,000 people outside the Chicago auditorium where he was speaking. But the court threw out his conviction.

“A function of free speech is to invite dispute,” wrote Justice William O. Douglas then. “That is why freedom of speech, though not absolute, is nevertheless protected against censorship or punishment, unless shown likely to produce a clear and present danger of a serious substantive evil that rises far above public inconvenience, annoyance or unrest.” That opinion could have been written as a direct rebuke of Harvey’s prosecution.

The fact that this incident took place in wartime doesn’t give the authorities any more power to silence dissent. Nothing Harvey did suggested that he was bent on terrorism. All he was doing was challenging the wisdom of American policies. That’s the sort of message that is especially important to hear at a time when the public is so united in believing we’re in the right.

If we really are in the right, we can certainly survive the criticisms of people like William Harvey. And someday, when we’re in the wrong, we may need someone like him to let us know.

Writing for fame, profit — the easy way

Sunday, March 3, 2002

Conscientious journalists inevitably say they are shocked and saddened when they discover that a fellow practitioner of the writing trade has defrauded readers. Lately, there has been a veritable Tournament of Roses parade of writers who have confessed to making up stories or stealing words.

Stephen Ambrose, who became prosperous chronicling the heroism of ordinary men in World War II, committed the unheroic act of plagiarizing from other accounts in several of his books. Doris Kearns Goodwin, a former Harvard professor, had to acknowledge — 15 years after the matter was brought to her attention — that she appropriated passages from multiple sources in her book “The Fitzgeralds and the Kennedys.”

But while we workaday scribblers regret to see such lapses mar our noble profession, a part of us is also delighted and grateful to these worthies for unintentionally raising the meager value of our work. Those of us who have never written critically acclaimed best-sellers can at least boast that the non-fiction material we have written, we actually wrote, and it’s really not fictional.

Once upon a time, that seemed only obvious. But it’s a claim that a dwindling number of writers can make. We few, we happy few. . . (Whoops! It was some guy named Shakespeare who wrote that, not me.)

Every time you think the cascade of bamboozlement is over, it starts up again. Last week, The New York Times ran one of the most entertaining editors’ notes ever written, regretfully informing readers that an article published in its Sunday magazine was a king-sized whopper.

In the piece, about the indentured servitude of an African boy in Ivory Coast, veteran freelance writer Michael Finkel employed a variety of what you might call unapproved techniques. He created a composite main character, attributed experiences to the character that “did not apply specifically to any single individual,” invented scenes, and composed the entire 6,000-word article “without consulting his notes” — notes that, upon inspection, turned out to contradict much of what he reported. The editors’ note could have just said, “Remember that article? Forget everything you read.”

I would like to say I was shocked and saddened by Finkel’s misconduct. But I was too busy rolling on the floor laughing at his explanations to feel the faintest pang of woe. “I slipped,” he confessed manfully. “It deserved a correction. But there is a great deal of accuracy. Not once has the prose been called into question.”

Well, he’s right. The prose was just fine. The only problem with the article was that it made “Lord of the Rings” look like a PBS documentary.

But you will be relieved to know that Finkel’s bald-faced lies had a noble motive. Said he: “I hope readers know that this was an attempt to reach higher — to make something beautiful, frankly.” What’s the matter, reader — you against beauty?

Ambrose and Goodwin were only slightly less brazen. Ambrose acted as though only schoolmarms and pedants would care if the stories that have made him so rich sprang from his pen or somebody else’s. It was just “six or seven sentences in three or four of my books,” he said dismissively — and besides, his only mistake was that he “failed to put some words and sentences into quotation marks.”

This is a bit like a shoplifter saying that his only mistake was failing to pay. Does Ambrose think there is some other and more terrible form of plagiarism than using someone else’s sentences and pretending you wrote them?

Goodwin, whose publisher settled with one of her victims in 1987 — without bothering to inform readers — insisted that her plagiarism was the product of simple absent-mindedness on her part. She had taken notes from other books in longhand, she said, and then, when it came time to write, mistook the notes for her own original compositions.

How did it happen that vast tracts of her book were lifted from other sources? “The mechanical process of checking things was not as sophisticated as it should have been,” she allowed. Well, surely you wouldn’t expect a simple barefoot Harvard Ph.D. to employ a “sophisticated” approach for documenting her sources.

All these revelations will serve as an everlasting inspiration to ordinary scribes whose work is destined to line birdcages or swell remainder bins rather than make us rich and famous. While churning out unmemorable prose by the sweat of our brow may not seem like a great achievement, it’s more than some celebrated writers can manage.

And the next time my editor thinks my column isn’t up to snuff, I have the perfect comeback: “You think Stephen Ambrose could do better?”

Does free trade breed poverty?

Sunday, March 24, 2002

In the minds of some environmentalists, free trade ranks right up there with the Exxon Valdez as a despoiler of nature. A group called Global Exchange claims that free trade has caused poor countries to “cut down their forests, overfish their waters and exploit other natural resources.” Says the Sierra Club, “Current trade rules are too often used to undermine environmental protections . . . in the name of free trade.”

That’s one common indictment of globalization. Another popular one is the charge that expanded commerce among nations has put downward pressure on wages in Third World countries, aggravating income inequality and worsening the lives of the poor.

In the mythology of the left, this all fits together. Free trade, it’s said, unleashes a ruthless dynamic that punishes poor nations and rich ones alike by forcing all into a brutal and debilitating competition. Poor nations that allow rampant pollution and ecological destruction can take industry away from countries that try to protect the environment. Places where workers toil for pennies can attract multinational corporations, which see big profits from paying starvation wages.

Socially responsible countries are forced to lower their standards to compete. The more trade there is across borders, the more intense the pressure. It’s an endless race to the bottom that everyone loses.

But mythology, while it can produce entertaining tales, doesn’t have to be rooted in reality, and most of the left’s case against free trade is not. In fact, a growing body of evidence indicates that globalization advances the very goals it is accused of sabotaging.

The latest proof comes from a study published last year in the American Economic Review by economists Werner Antweiler, Brian Copeland and M. Scott Taylor. They looked at data on a major pollutant, sulfur dioxide, over the period from 1971 to 1996, when trade barriers were coming down and international commerce was expanding.

Environmentalists will be reassured to learn that countries which opened up to trade generated faster economic growth — which, these scholars say, produced more pollution. But that was not the only effect. It turns out that as nations grow richer, they and their people demand a cleaner environment. The first result comes sooner, but for the vast majority of countries, the second one has been much more important.

“That effect becomes pretty dominant,” says Antweiler, an economist at the University of British Columbia. As a general rule, “if trade liberalization raises gross domestic product per person by 1 percent, then pollution concentrations fall by about 1 percent.” In other words, Antweiler and his colleagues conclude, “free trade is good for the environment.”

Poor people may prefer pollution to starvation, but as their income rises, they no longer have to make that cruel tradeoff. The AER study, it should be noted, did find an exception to the rule — communist countries, whose governments were not overly influenced by the desires of their citizens.

The claim that free trade will increase poverty and inequity also appears to be built on sand. It discounts the most important economic experiment of our time — China’s decision in 1978 to open up an economy that was among the most tightly closed on earth. Since then, World Bank economists David Dollar and Aart Kraay point out in a recent issue of Foreign Affairs magazine, “China has seen the most spectacular reduction of poverty in world history.”

Over the last two decades, as trade has proliferated, the number of poor people in the world has dropped by 200 million, even as the Earth’s population rose. And countries that have liberalized trade have seen their average growth rates rise, while closed economies lagged.

Dollar and Kraay accept the prevailing view that in China, at least, progress against poverty has coincided with rising inequality — that though the poor have gotten richer, the rich have gotten richer faster. If that were the price of progress against poverty, it would certainly be worth paying. But another study suggests that far from increasing income gaps, free trade shrinks them.

In a working paper for the respected National Bureau of Economic Research, economist Shang-Jin Wei of the International Monetary Fund and Yi Wu of Georgetown University looked at urban-rural earnings disparities in some 100 different cities in China and nearby rural areas. What they found was the opposite of what is claimed by critics of globalization: The more open the city is to trade, the smaller the urban-rural gap. “Globalization has helped to reduce, rather than increase, urban-rural income inequality,” they conclude.

For a long time, many people on the left have said they’re 1) against poverty and pollution and 2) against free trade. One of these days, they’re going to have to make up their minds.

‘Delinquent’ no longer has to follow ‘juvenile’

Thursday, April 11, 2002

We adults often complain and despair over the ways of young people. Their headache-inducing music. Their drug and alcohol use. Their casual approach to sex. Their tattoos and body piercings. Their underdeveloped work ethic. Their sense of entitlement. Their attraction to gangs and guns.

So if you’re over the age of 30, you would not find it surprising to learn that the crime rate among American youngsters has risen sharply in recent years. That would merely confirm what so many people suspect: Our permissive culture has failed to instill self-respect and self-discipline in our children.

But that assumption is confounded by a fact that really is surprising: Between 1994 and 2000, the total number of juvenile arrests fell by 13 percent.

Actually, I’m grossly understating the good news. As the Urban Institute notes in a recent study by Jeffrey Butts and Jeremy Travis, the drop in violent crimes among children under 18 was even bigger.

Arrests for aggravated assault declined by 22 percent. They were down 25 percent for rape and 51 percent for robbery. Best of all, murder arrests plunged by 68 percent.

All this happened at the same time that the number of youngsters was growing. “The rate of juvenile crime in 2000 was lower than at any time in the previous decade,” report Butts and Travis.

This trend came as a surprise, to say the least. In the late 1980s and early ’90s, kids went on a violent binge, boosting their murder rates by half. Conservatives blamed the trend on our lenient treatment of teenage hoodlums. Many experts looked toward the future and assured us that things were about to get even worse.

In 1995, for example, criminologist James Q. Wilson of UCLA issued a dire warning that the number of teenagers would grow and that many of the new kids would be “high-rate, repeat offenders — 30,000 more young muggers, killers and thieves than we have now.” The Council on Crime in America predicted “a coming storm of juvenile violence.”

If you turned those predictions upside down, you’d get a pretty good picture of what has happened in reality. Since 1994, the juvenile arrest rate for violent crimes has dropped by more than a third. In 1994, an estimated 3,700 kids were arrested for murder. In 2000, the number was down to 1,200.

Instead of adding 30,000 new violent criminals, we have subtracted more than 50,000, including some 2,500 killers. Large numbers of serious criminals have disappeared and haven’t been replaced.

Not only did the storm never materialize, but the weather, as far as juvenile crime is concerned, has been warm and sunny.

What happened? One answer is that some bad things stopped happening. The crack epidemic, which was deadly not because of the effects of the drug but because of its disruptive effect on illegal drug markets, had unleashed a surge of violence as well-armed dealers and gangs fought over turf. When the epidemic subsided, so did the bloodshed.

The economy, which had turned down in the early 1990s, experienced a record boom. Welfare reform induced many recipients to take jobs, giving their kids the message that honest work is a normal way of life.

These factors help to explain why adult as well as juvenile crime has ebbed. But juvenile crime had risen so fast in the preceding years that you could almost say it had nowhere to go but down.

It’s tempting to think that tougher treatment of young delinquents finally made them change their ways. Many states have begun prosecuting more and more violent kids as adults and packing them off to prison for long periods of time. But that trend didn’t really get rolling until after youth crime started to fall. And it doesn’t necessarily work: Florida’s ostentatiously fierce approach has left it with a juvenile crime rate 80 percent above the national average.

California, by contrast, spent the 1990s obstinately refusing to adopt tougher policies against juvenile offenders — leading voters to approve a 2000 ballot initiative mandating a crackdown. What was the price for its habit of shamelessly coddling young criminals over the previous decade? California’s youth crime rate “dropped like a stone,” says Franklin Zimring, a criminologist at the University of California at Berkeley and author of the 1998 book “American Youth Violence.”

Obviously something else was going on to steer kids away from violence, and it appears to be part of a bigger trend toward more sensible behavior. During the 1990s, teen pregnancy, drinking, smoking and drug use all became less common. For all the freedom they have, today’s teenagers show an inclination toward healthy behavior and self-preservation that past generations — their parents, for example — didn’t acquire until they were older.

We may never know exactly what precipitated this welcome development. But it’s nice to think that maybe we’re doing something right.

Back to the future on gasoline prices

Sunday, May 5, 2002

Lots of us have watched “That ’70s Show” — so many that some politicians think we’re eager to live it as well. Americans of a certain age can remember, without great fondness, the energy crisis of that decade. It featured government price controls, claims of obscene profits, and congressional investigations of alleged oil company conspiracies.

Most people, as they waited in long lines for the chance to buy a few gallons of fuel, learned a lesson about the value of trusting market forces over government regulation. But some forgetful types apparently need a refresher.

Gasoline prices are on the rise lately. Now, prices of innumerable commodities rise and fall all the time, as you may notice when you visit the produce section of your supermarket. But most prices are not posted in giant numerals at every busy intersection. So many people overreact to any jump in pump prices.

Since mid-March, the average price nationally for unleaded regular has risen to $1.39 a gallon — up from $1.11 in February. Painful? Actually, you might consider the current price grounds for cheering. It’s down 30 cents from a year ago.

There is no cheering, though, on Capitol Hill, where memories are short and demagogues are plentiful. Last week, a Senate investigative subcommittee released a report lamenting the instability of petroleum prices and accusing oil companies of rigging markets to gouge consumers.

Meanwhile, on the other side of the country, the Hawaii legislature voted to put controls on pump prices, starting in 2004. It didn’t occur to the lawmakers that one reason the state has the highest gasoline prices in the country is that it has the highest gasoline taxes. Corporate profiteering, bad; government profiteering, good.

The Senate subcommittee report was proof that, in the words of energy economist Michael Lynch of DRI-WEFA, an economic consulting firm, “if you go on a witch hunt far enough, you’ll find a witch.” It claimed Marathon Ashland Petroleum withheld supplies of reformulated gasoline in 2000 “so as not to depress prices.” The company replied that it produced 33 percent more of the stuff that year than the year before and “sold every drop we made.” The Washington Post noted, “The report found no evidence of antitrust violations or collusion.”

Never mind that. Among the outrages unearthed by our courageous investigators was an internal Marathon memo from 1998 that said, “Nature stepped in to lend the oil producers a helping hand in the form of Hurricane Georges,” referring to a devastating tropical storm.

Sen. Carl Levin (D-Mich.), who apparently had lived 67 years without ever encountering black humor, pronounced this “an amazing document” and angrily demanded an explanation. A Marathon executive dutifully apologized “that my company would take pleasure in a hurricane.”

The report, like the Hawaii price caps, is based on the assumption that oil companies have the power to enrich themselves at will by shutting off supplies, driving up prices, and turning motorists upside down to shake all the change out of their pockets. It claims, at the same time, that “the price of gasoline has also become more volatile than ever.”

But if oil companies are so good at colluding to fleece the consumer, why the volatility? Why can’t they push prices up to the sky and keep them there? Last year, when the price was plunging, the oil barons looked like they couldn’t execute a plot to rob a lemonade stand.

An industry that enjoys vast market power ought to be a lucrative one. But as the federal Energy Information Administration reports, the return on investment in refining and marketing petroleum averaged a paltry 4 percent in the 1990s.

How could that be? It might have something to do with the fact that consumers have enjoyed a long stretch of luxuriously low gasoline prices, owing to an abundance of the stuff. Gasoline sells today for about what it did in 1981. If pump prices had just kept pace with inflation over the last two decades, we’d be paying about $2.75 a gallon, not $1.39.

Our good fortune isn’t an accident. It’s the result of getting the government to butt out. Price controls discouraged production, with the perverse effect that prices rose. When President Reagan got rid of the caps, prices soon began to drop. And though they have jumped up and down on occasion, they have generally stayed low.

But if we’re tired of having it easy and want to bring back high prices, long lines and energy chaos, we know how to do it. The Hawaii legislature and the Senate subcommittee are off to a good start.

On Internet speech, librarians to the rescue

Thursday, June 6, 2002

The people who have advanced the cause of free speech have often been wild, radical or dangerous types — communists, anti-Semites, pornographers, war resisters, flag-burners, and the like. Today, storming the barricades of censorship and rejecting the demands of conformity, we have a different group of firebrands: America’s librarians.

Your image of a librarian may be a prim spinster whose idea of proper communication is to put a finger to her lips and say, “Shhhh!” This time, though, the librarians’ message to the federal government is: “Don’t you dare shush my patrons!”

The battle is over government regulation of access to cyberspace. The Children’s Internet Protection Act, passed in late 2000, requires all federally funded libraries and schools to install computer filters to block sites offering child pornography, obscenity or anything “harmful to minors.” Noting that the Internet offers a lot of images and text that would make Hugh Hefner blush, our elected representatives decreed that libraries should prevent patrons from seeing such material, inadvertently or by choice.

This is a worthwhile goal, but in practical terms, the only way to seal off the stuff that falls outside the bounds of free speech is to seal off a lot of stuff Americans have a right to see and produce. That’s why the American Library Association went to court to challenge the law — arguing that the job of librarians is to help children and adults make use of their 1st Amendment rights, not to violate those rights. It’s also why a special federal court panel last week overturned the CIPA, finding that it was burning a lot of wheat along with the chaff.

The Internet boasts some 2 billion Web pages and is growing like kudzu in a greenhouse, adding 1.5 million pages every day, or more than 1,000 a minute. Companies that sell filters can’t possibly put human eyeballs on more than a microscopic fraction of those sites, so they have to rely on key words and other identifiers to figure out which ones to block.

But this is not a very accurate method. Since key word searches can’t evaluate photos, dirty pictures can get through. Meanwhile, a lot of things that should get through somehow don’t. One expert called by the government in this case admitted that between 6 percent and 15 percent of the sites blocked by filters didn’t meet the filter companies’ definition of sexually explicit material, never mind the law’s.

This approach is worse than official censorship. It’s officially sponsored censorship that delegates to private vendors the task of deciding what is fit to see and what is not. And the people in Washington don’t even know what’s being censored — because filter companies treat that information as a proprietary secret. Congress told these suppliers, “We’ll let you decide what to suppress, even though we don’t know what you’re suppressing.”

The judicial panel noted that among the sites that were put off limits were those set up by a Knights of Columbus group, a Christian orphanage in Honduras, a Libertarian candidate for the California legislature, a Louisiana cancer treatment facility, a bed and breakfast in North Carolina and Southern Alberta Fly Fishing Outfitters — which may have gotten in trouble for glistening shots of naked trout. And, wouldn’t you know it, one of the library filters blocked a satirical Web site called “Dumb Laws.” Like, maybe, the Children’s Internet Protection Act?

The unreliability of filters, unfortunately, is in the nature of the beast. As the judges explained, the evidence showed “not only that filtering programs bar access to a substantial amount of speech on the Internet that is clearly constitutionally protected for adults and minors, but also that these programs are intrinsically unable to block only illegal Internet content while simultaneously allowing access to all protected speech.” CIPA is the moral equivalent of trying to eliminate pornographic magazines by burning down every other newsstand.

So does the ruling mean libraries can do nothing to keep smut away from our children? Of course not. Even before the law was passed, libraries had created policies designed to minimize the dangers posed by the Internet without sacrificing its immense value. Some allow youngsters to use only filtered computers, while providing unfiltered access to adults. Some have policies that bar patrons from looking at illegal sites, with violators losing their library privileges. Others put computers in highly visible public areas to discourage children from going to pornographic sites.

None of these alternatives is as satisfying as a foolproof technological fix, but that perfect option turns out to be a fantasy. So maybe we should learn to trust our librarians.

Even in the scariest of times, Americans still prize fun

Thursday, July 4, 2002

On the 226th anniversary of our independence from Great Britain, Americans will remember the achievement of the founders who championed the ideals of liberty and democracy. They might also reflect on the contributions of Arthur “Spud” Melin, who died at 77 last week after a lifetime dedicated to that other revolutionary concept: the pursuit of happiness.

Among the more inspiring contributions of Washington, Adams and Jefferson was establishing a nation where a corporation could adopt a name like Wham-O. That firm, co-founded by Melin in 1948, embodied the assumption that, even during times of peril, Americans insist on their right to have fun. That impulse will no doubt be on display today, despite everything that has happened since the last Independence Day celebration.

We might wonder why the 4th of July is more about idle pleasure than sober remembrance. All those fireworks and barbecues can distract from the historic importance of the day, and from the many tasks left to us, the heirs of the revolution.

But the signing of the Declaration of Independence, amid all the uncertainties of the war, was a festive occasion, with bands playing and bells ringing. Wrote John Adams, “It ought to be solemnized with pomp and parade, with shows, games, sports, guns, bells, bonfires and illuminations from one end of this continent to the other, from this time forward for evermore.” America was not founded on a zeal to wring the frivolity out of life.

Good thing for Melin. His company brought out its most successful products in the 1950s, which are usually recalled as a time of dull contentment. In fact, that decade was frequently turbulent and terrifying — thanks to a Cold War that fostered anti-communist panic and raised the specter of all-out nuclear war. One of the company’s forgotten flops offered protection from this latter threat: the Do-It-Yourself Fallout Shelter.

Americans may have doubted the effectiveness of that product, but never mind: They refused to let worries about radioactive incineration dampen their spirits. Wham-O had better luck with more carefree items, like the Hula Hoop, the Frisbee, the Slip’N Slide, and the Chubby Checker Limbo Set. These toys were aimed at youngsters whose parents, after enduring the hardships of the Great Depression and the horrors of World War II, were determined to give their offspring peace and abundance.

The much-derided conformity of the period was really an effort to restore calm to a world that had gone mad. More than any previous generation of American parents, the grownups who generated the Baby Boom expected that their kids would enjoy life.

It’s no coincidence that one of the greatest monuments to the indulgence of children, Disneyland, opened in 1955. Barbie made her first appearance four years later. Wham-O introduced the Hula Hoop in 1958 and set off perhaps the biggest toy craze in history. Eventually, Melin’s company sold 25 million of them.

Wham-O got used to dealing with large numbers. The ultrabouncy Superball racked up 7 million sales in its first six months after debuting in 1965. Over the years, the firm sold some 100 million Frisbees — plastic discs it originally called Pluto Platters in an effort to capitalize on American interest in UFOs from outer space.

Occasionally, the big numbers were of the negative type, like when Wham-O brought out a product with one of the more memorable names in the annals of retailing. Instant Fish were not, as you might suspect, the early equivalent of pet rocks. They were lumps of African mud containing real fish eggs, which were supposed to hatch after being soaked in water. But the theory didn’t pan out, and more than $1 million in orders had to be refunded. That didn’t stop people from flocking to stores to buy any number of other goofy Wham-O creations, from Silly String to Hacky Sacks.

The Soviet government denounced the Hula Hoop as proof of the “emptiness of American culture,” and the Islamic fanatics who have declared war on the United States would no doubt agree. American culture is an affront to those who think everything ought to have a serious purpose. It has far too much room for lighthearted activities, like tossing a Frisbee or hurtling down a Slip’N Slide, that are outwardly useless.

Americans will not forget Sept. 11 today, and they will not forget that the war on terrorism is still to be won, but they are not about to let such matters spoil a good birthday party. We all know what Spud Melin could have told us: Having fun is the best revenge.

Securing a bigger government

With Bush’s approach, Washington is likely to get downright obese

Sunday, September 15, 2002

Like many a president before him, George W. Bush sees an urgent national problem and offers an interior-decorating solution: adding another chair to the Cabinet table.

He proposes to create a federal department to protect the homeland from attack — which apparently is too much to expect from a Defense Department that is supposed to get $369 billion next year. With 170,000 employees, the new Department of Homeland Security would be large enough to have its own National Basketball Association franchise. Bush says this change will yield better coordination, more accountability, greater effectiveness and no increase in expenditures.

Somehow I don’t think Osama bin Laden is sitting in a cave somewhere fretting that if we create a Department of Homeland Security, he may be in a real pickle. And anytime someone from the government tells you this won’t cost anything, you’re advised to hold onto your wallet with a pair of Vise-Grips.

But let’s assume that a new department is justified. Let’s assume it will greatly enhance our safety. And let’s assume it won’t further balloon the budget deficit. Does that mean we need yet another department in the federal government?

Of course not. If your doctor said you need more fruits and vegetables in your diet, you wouldn’t start eating four meals a day to make room for them. You’d stick with three meals and eat less of other foods — meat, pasta, french fries, Krispy Kremes, or something. Otherwise, you could expect to add a lot of pounds eventually.

With Bush’s approach, Washington is also likely to get fatter and fatter over the long run. That’s what comes of failing to take measures to restrict growth. So I propose a new rule: Upon creating a new bureaucracy, our leaders have to get rid of at least one existing agency. You want a department focused on homeland security? Fine, but first, decide which of the ones we’ve already got are no longer needed.

No one doubts that terrorism is an urgent problem demanding government action. But a national emergency, more than any other time, requires setting priorities. If saving Americans from attack is a higher priority than before, then something else must be a lower priority than it used to be.

So we ought to admit as much and reconfigure the federal government to reflect the changes in our world. We’ve got 14 Cabinet departments, after all. Surely we could dispense with at least one of them.

Policymakers, however, assume that once an issue has been elevated to the top of the national agenda, it must stay there forever, no matter how crowded the higher rungs of the ladder may get. The Education Department was created two decades ago to show our deep commitment to improving schools. The Energy Department sprang up in response to the energy crisis of the 1970s.

In 2002, though, it’s not so easy to justify these bureaucracies. The Education Department today accounts for only 7 percent of all the money the nation spends on schools. The energy crisis, meanwhile, was consigned to the history books as soon as President Ronald Reagan junked price controls on oil and gasoline, ushering in a long and continuing era of energy non-crisis.

Either or both of these agencies could be closed down, with their non-essential programs scrapped and their essential ones — let’s be generous and assume there are some — transferred to other departments.

Or we might get rid of the Commerce Department, which rests on the mystifying notion that American capitalism can’t flourish without active government support and assistance. Someone could also take a good hard look at the Department of Agriculture, which was created at a time when 25 percent of Americans lived on farms but endures today, when less than 2 percent do.

The USDA’s persistence brings to mind the joke about the civil servant found weeping at his desk in the Bureau of Indian Affairs. “What’s wrong?” someone asked. “My Indian died,” he answered. At worst, Agriculture and Commerce could be merged into a single agency dedicated to wasting money on people who are perfectly capable of supporting themselves.

There’s a precedent for making reorganization something other than a euphemism for expanding government. Bush says his plan is modeled on President Harry Truman’s 1947 National Security Act, which united all the military services in a new Department of Defense. But when Truman created a new Cabinet department, he got rid of two old ones — the Navy Department and the War Department, both of which had been around for roughly 150 years.

Bush says we need to bring the DHS into being to “secure our homeland” from ruthless enemies. But it wouldn’t hurt to also protect us all from too much government.

Appeasement myths, the realities of Iraq

Sunday, October 6, 2002

Should we go to war to stop Hitler? That question may surprise you — at least if you operate on the assumption that Hitler is dead and not about to go anywhere.

But conservatives insist that Hitler has been reincarnated in the form of Saddam Hussein. They say that like the British of the 1930s, who had to choose between the concessions offered by Prime Minister Neville Chamberlain and the military action urged by Winston Churchill, we have to decide between cowardice and courage.

The Weekly Standard magazine labels all the opponents of this pre-emptive war “the axis of appeasement.” The Daily Telegraph of London sneers, “Just as the prospect of invading Iraq provokes clerical and secular hand-wringing now, so did the prospect of taking up arms against Nazism then.” When Illinois Sen. Dick Durbin announced he would vote against a resolution authorizing the president to invade Iraq, his Republican opponent Jim Durkin immediately detected the stench of “appeasement.”

Exhuming the Nazis to justify war is not a tactic unique to conservatives. Liberals accused the United States of shameless appeasement in refusing to send troops to stop the war in Bosnia. Both sides claim to have learned the lessons of history, but the only episode they can ever seem to remember is the rise of the Third Reich.

But they don’t even know much of that history. Anyone trying to apply the experience of Nazi Germany to the case of Iraq can see two obvious things: Saddam Hussein is no Hitler, and our policy over the last 11 years looks nothing like appeasement.

Hitler had been in power just five years when he annexed Austria in 1938. Before that year was over, he had coerced Britain and France to surrender part of Czechoslovakia. In 1939, he invaded Poland. Denmark, Norway, Belgium and France soon followed. In 1941, he marched on Moscow.

It was a plan of conquest breathtaking in its speed and scope. Just eight years after gaining power, Hitler was on the verge of controlling an empire stretching from the Atlantic to the Pacific.

And where is Saddam’s imperial plan? He has been in charge of Iraq for some 30 years, and so far he’s initiated hostilities with only two countries, Iran and Kuwait. Hitler dreamed of ruling the world. Hussein’s grand vision was to control the whole of the Shatt al Arab waterway and some oil fields to his south.

For all his vicious nature, he has shown no interest in building an empire. In any case, that would be an impossibility for Iraq, which has just 23 million people and is surrounded by bigger nations.

As for his domestic realm, Hussein is unquestionably a ruthless despot willing to kill anyone who stands in his way. But that description would not begin to capture Hitler, who slaughtered innocents across the continent on a gargantuan scale. To equate Hussein with Hitler is like equating a snow flurry with an ice age.

If finding someone to impersonate the Fuhrer is tough, finding a modern-day Neville Chamberlain is even harder. When Hitler demanded the Sudetenland from Czechoslovakia, Britain and France meekly gave it to him. When he proceeded to swallow up the rest of the country, nobody tried to stop him. When Hussein invaded Kuwait, by contrast, he unleashed Operation Desert Storm on himself.

No one has been appeasing him since then, either. On the contrary, we’ve kept the Iraqi regime confined to a tight little cage.

The two no-fly zones enforced by British and American fighters cover most of Iraq. Meanwhile, economic sanctions have kept him from buying weapons and spare parts, or doing much of anything to rebuild his army. “Hitler got more powerful with time, while Saddam has gotten weaker,” notes John Mearsheimer, a defense scholar at the University of Chicago.

We’ve stationed thousands of troops in Kuwait, we have air bases in Saudi Arabia, and we generally keep an aircraft carrier within striking distance of Iraq at all times. In short, we’ve let Hussein know that if he ever sets one toe across any of his borders, we’ll stomp him flatter than a straw hat on the interstate.

“Everyone agrees we have to take action against him,” says Mearsheimer, who frames the choice not as war versus appeasement but as “containment versus rollback.” The policy of containment, backed by our nuclear deterrent, is the same policy the United States employed against the Soviet Union for 40 years, with successful results.

Hawks claim to be rejecting the policies of Neville Chamberlain that brought on World War II. What they’re really rejecting is the policy of Harry Truman and Ronald Reagan — which won the Cold War and can win this one.

Time to evict police from our bedrooms

Laws against sodomy have become outmoded

Thursday, December 5, 2002

One night four years ago, sheriff’s officers, acting on a complaint of an armed man creating a disturbance in a Pasadena, Texas, apartment, entered the dwelling and barged into a bedroom. But all they found was a couple enjoying a pastime commonly enjoyed by couples in their bedrooms, and I don’t mean organizing the closet.

You might guess that at this point, the cops would have blushed, apologized and left as fast as their feet would carry them. Wrong. They arrested the couple under the Texas anti-sodomy statute.

You see, John Lawrence and Tyron Garner are both males, and the state prohibits acts of sodomy between people of the same sex. Lawrence and Garner were arrested, convicted and fined $200 apiece.

There are some laws that exist only because no one would ever dream of enforcing them. Anti-sodomy statutes, which forbid carnal deeds that have been committed by the overwhelming majority of American adults, are a prime example. Until 1961, every state prohibited sodomy. But most people have lost interest in regulating what others do between the sheets. Today, sodomy laws exist in only 13 states.

Most of those put these forms of gratification off-limits to all their citizens. Texas has a different approach. Its criminal code, while permitting “deviate sexual intercourse” by heterosexual partners, outlaws it for homosexuals. If John and Tyra had been caught doing what John and Tyron were caught doing, the police would have been powerless to stop them.

Even in Texas, though, no one enforces the law, no one expects it to be enforced, and hardly anyone wants it to be enforced. There are some 43,000 gay and lesbian couples in the Lone Star State, according to the last census, as well as thousands of other homosexual individuals who may pair off on any given night. A gay population with even a minimally active libido would present a law enforcement challenge beyond measure.

But occasionally police stumble onto illegal conduct, and then the law turns out to be more than a dead letter. For Garner and Lawrence, there was the indignity of being jailed, hauled into court and fined for consensual acts carried out in private. On top of that, their lawyers note, they are now disqualified or restricted “from practicing dozens of professions in Texas, from physician to athletic trainer to bus driver.” If they move to some states, they’ll have to register as sex offenders.

The two men have appealed to the U.S. Supreme Court, insisting that the law violates their right to privacy and discriminates against them on unconscionable grounds, and the court has agreed to hear the case. They say states have no business telling adults what intimacies they may choose to enjoy, and certainly can’t deny specific pleasures to gays while allowing them to straights.

Either theory would require the court to stake out new constitutional ground. Just 16 years ago, it upheld a Georgia man’s sodomy conviction, ridiculing the idea that he had “a fundamental right to engage in homosexual sodomy.” It has yet to treat discrimination against gays as the equivalent of discrimination against blacks or women.

But there are good grounds for putting this type of regulation off-limits. The court has long accepted that the Constitution establishes an unassailable zone of individual privacy. In 1965, it said the government can’t forbid contraceptives to married people, and later it said the same for people who haven’t tied the knot.

Supporters of the law object that the Constitution says nothing about sexual privacy. But the 9th Amendment says the reference to specific liberties in the Bill of Rights “shall not be construed to deny or disparage others retained by the people.” Recognizing those other rights is the task of the Supreme Court. Given modern notions of sexual autonomy, it makes perfect sense for the court to find that the government has no business telling consenting adults what they can put where.

The Texas prosecutors argue that we can outlaw gay sodomy because we’ve always outlawed gay sodomy. “Fundamental rights must be grounded in the nation’s history and legal traditions,” they say, and in this case, history and tradition support the ban. But discrimination against women has its own basis in history and tradition, and even conservatives today show no interest in making the case that the government can treat half the human race as an inferior species.

Laws against sodomy are just as outmoded. No state actually enforces these laws the way other laws are enforced. Why? Because very few Americans see policing the bedroom as a legitimate function of government.

We as a people already accept that sexual freedom and privacy are fundamental rights of every rational adult. Maybe the Supreme Court is ready to do likewise.

TV, Cokie Roberts and the real Sunday morning news

Sunday, January 5, 2003

Cokie Roberts recently left her post as co-anchor of ABC’s Sunday morning news show, “This Week,” and I’m happy to say I won’t miss her. That’s not because I don’t like her — in fact, I like her a lot — but because I never watch the show, or its competitors. On this, she and I are in perfect agreement: Both of us think our lives are better without “This Week.”

Why did she leave a job most journalists would run over their grandmothers to get? In an interview on CNN’s “Reliable Sources,” Roberts said it was simple: “Because I had reached a point in my life when it was time for me not to be getting up at 5:00 every Sunday and going to work, and then coming back and going to church, and then taking a nap, and not really having the opportunity to spend time with people I wanted to spend time with, mainly my family.”

As it happens, “This Week” conflicts with my paramount Sunday morning obligation, which is taking my 11-year-old daughter to the local ice arena so she can practice her figure skating. But that’s not really my reason for not watching. Truth is, I’d rather go to an empty rink and stare at the ice than subject myself to those programs.

In many journalistic circles, it is considered a solemn obligation, and even a pleasure, to pass Sunday morning in front of the tube. The only acceptable excuse for not watching the talk shows is if you’re actually participating in one of them — something no one has ever been addled enough to invite me to do. In my younger days as a journalist, I used to make a habit of tuning in, feeling it was essential to keeping abreast of national events and writing well-informed commentary.

But eventually it became apparent to me that what I was gaining in information, I was losing in perspective. I was mistaking the current for the important.

I often got so caught up in what the chairman of the House Ways and Means Committee had to say, or how the secretary of energy saw the world, that I lost sight of other matters — like whether they were discussing anything that deserved two seconds of anyone’s time. I found the voice of Sam Donaldson or Tim Russert or George Will barging into my cranial cavity just when I was trying to figure out what I thought about something.

So I gave up the routine for good. In the last decade and a half, the only time I relapsed was when the Monica Lewinsky scandal broke and there was talk that President Clinton might have to resign. But that prediction turned out to be a specimen of inside-the-beltway wisdom from journalists talking too much to one another. It reconfirmed that I was better off doing something else — even if it was only raking leaves.

Another reason for my inattention is that anything important said on these shows will be in the newspapers on Monday, since news is usually scarce on Sunday. And if you want the whole context of the distinguished senator’s thoughts about this nomination or that bill, you can always find a transcript online — and read it in a tiny fraction of the time it takes to watch the show. All without having to wonder what kind of mousse George Stephanopoulos puts on his hair.

In fact, most of the “news” on these programs is as predictable as Wonder Bread. Roberts admitted frustration with the format: “Over the years politicians have gotten more scripted. They have media consultants who tell them how to do it. They’ve learned how to speak in 11-second sound bites. ... So it is irritating, and you do try to think of something that maybe they haven’t thought of so that they are thrown off the script a bit, but it’s hard to do.”

But maybe the best reason for going AWON — absent without news — is the one Roberts mentioned: It frees up time for the ordinary stuff of life, whether it’s walking the dog, reading a book, going to church, meeting a friend, talking to your spouse, or supporting your local ice arena.

Those things are important, even to journalists. Like everyone else, we need regular reminders that there is a lot more to life than government and politics. Sunday-morning news shows focus a journalist’s mind on work when it ought to be focused on anything else.

I’m glad I’m not the only one who feels that way. Nobody on her deathbed ever wished she had spent more time with “This Week.” I don’t think Cokie Roberts will be the first.

Racial preferences: Unpleasant facts

Bias in universities is a policy of pretense

Sunday, January 19, 2003

The University of Michigan, in its effort to achieve racial diversity in its student body, gives preference to black, Hispanic and American Indian applicants. Without such consideration, it says, these minority groups would practically vanish from campus.

So here’s a quiz: If you’re applying for undergraduate admission to the school, is it better to have a) a perfect score on the SAT or b) a dark complexion? The answer, of course, is b). On Michigan’s 150-point scale, a perfect board score gets you 12 points. Being black or Hispanic gets you 20.

That helps explain why President Bush decided last week to support a constitutional challenge to the program, calling it “a quota system that unfairly rewards or penalizes prospective students based solely on their race.” The president obviously doesn’t like to take actions that can be construed as racially insensitive, especially right after Trent Lott got caught waxing nostalgic for segregation. But the Michigan program is so blatant in its reverse discrimination that Bush didn’t really have much choice.

Advocates of preferences in higher education give the impression that they’re a matter of just a slight thumb on the scale. In fact, these policies essentially require two entirely different scales. They mean appreciably lowering admissions standards for minority applicants. Among students with board scores and grades in the middle range of Michigan Law School applicants, one appeals court judge noted, nearly four out of five whites and Asian-Americans were rejected. But 100 percent of blacks and Hispanics in that range were accepted.

The university used to judge different races by explicitly different standards, with the law school operating a “special admissions program” to ensure that at least 10 percent of each class would be black, Hispanic or Native American. It abandoned that system because it was vulnerable to legal challenge, replacing it with a stress on “diversity” that is designed to achieve a “critical mass” of minority students.

In practice, though, the new approach bears a striking resemblance to the old one. It virtually guarantees admission to minority students with academic credentials that would usually disqualify a white candidate. And it has an impressive habit of keeping minority representation high. “Critical mass” seems to mean a quota with a little fuzz around the edges.

There is no dispute that for blacks and Hispanics, Michigan greatly de-emphasizes grades and board scores. It’s not hard to understand why. Administrators want a substantial number of minority students, but there aren’t a lot of minority students with the stellar academic credentials that the university normally demands. The law school says that each year, it gets about 900 applications from white students in the top range of grades and test scores — but only about 35 from minority candidates.

What the whole affirmative action debate passes over is the unpleasant fact behind it: the racial gulf in academic achievement. Asian-Americans don’t need special help in university admissions because they have no trouble competing with whites in the classroom. But on average, blacks and Hispanics lag behind.

You might assume that’s the lingering consequence of racism. But Abigail Thernstrom, a member of the U.S. Commission on Civil Rights and co-author of the book “America in Black and White,” points out that from the mid-1970s to the late 1980s, the gap between blacks and whites, as measured by the National Assessment of Educational Progress test, shrank — and since then, it’s widened. Does anyone think racism has more impact today than it did 15 years ago?

Nor is poverty a convincing excuse. “Black students from families with incomes above $70,000 a year score lower on the SAT than white students from families with incomes of less than $10,000 a year,” notes Shelby Steele, an African-American scholar at the Hoover Institution.

So how can the gap be explained? Elementary and secondary schools are obviously not adequately preparing many students for higher education. The breakdown of the black family — two out of three black children are born out of wedlock, compared to 27 percent of whites — puts African-American youngsters at a great disadvantage. And African-Americans in general place less importance on education than more successful ethnic groups.

The widespread use of preferences is a policy of pretense. It pretends, against all evidence, that the racial gap in academic performance doesn’t really matter. It tells blacks and Hispanics they don’t need to meet the same standards as everyone else.

The policy is supposed to be a boon to minorities that would otherwise be “under-represented” on university campuses. But if racial preferences have failed to close the academic gap, maybe it’s because they’re part of the problem.

Thoroughly bogus case for war

Bush’s plan for Iraq turns out to be full of holes

Sunday, February 2, 2003

Conservatives fancy themselves to be hardheaded realists, immune to cheap emotional appeals. But last week, you could barely recognize them. Hearing George W. Bush rail theatrically against the savagery of Saddam Hussein in his State of the Union address, members of the war party practically quivered in ecstasy.

“The president was able to show his resolve, his sober determination, his moral vision,” exulted David Brooks in The Weekly Standard. The Wall Street Journal’s editorial writers got a thrill from “the look in his eyes” as he “seethed with determination.” Peggy Noonan, a speechwriter for President Reagan and the first President Bush, wrote in perfect seriousness, “For a moment I thought of earnest Clark Kent moving, at the moment of maximum danger, to shed his suit, tear open his shirt and reveal the big ‘S’ on his chest.”

Well, there is no accounting for what goes through Peggy Noonan’s mind in the presence of a Republican politician. But it’s understandable that conservatives responded to the speech with their hearts, because it didn’t have much to appeal to the brain. All the inflammatory denunciations and ostentatious muscle-flexing couldn’t disguise the flimsiness of Bush’s case.

Consider the reasons he cited:

- Iraq has weapons of mass destruction and hopes to get more. The president unrolled a list of nasty weapons that Iraq has long possessed — anthrax, mustard gas, sarin, and VX nerve agents, which could be used to kill millions of people. But that raised an inconvenient question: Why hasn’t he used them against us? Answer: He knows he would be destroyed. That hasn’t changed.

- The only reason Hussein wants such weapons is for aggression. Bush says that’s “the only possible use” they could have. Nonsense. Half a century of experience with the Bomb makes it clear that weapons of mass destruction are valuable only for deterring attack, not facilitating it.

That’s why we spend billions on nuclear missiles we never use. Given our desire for “regime change” in Iraq, Hussein has understandable motives for wanting such protection. It’s worked for North Korea, hasn’t it?

- Hussein is too crazy to control. Bush got a rousing ovation when he declared, “Trusting in the sanity and restraint of Saddam Hussein is not a strategy, and it is not an option.” In fact, Bush himself has relied on it for more than two years. If it’s not an option, why didn’t Bush set out to attack him immediately after taking office?

The truth is, Hussein has sometimes been aggressive but never suicidal. We don’t have to wonder if he can be deterred. He already has been, over and over. He could have used his chemical and biological armaments during the Gulf War or anytime in the last 12 years. But he didn’t. Trusting Saddam Hussein to place his personal and political survival first has not only been a strategy, it’s been a successful one.

- He might give unconventional weapons to Al Qaeda. “Imagine those 19 hijackers with other weapons and other plans — this time armed by Saddam Hussein,” said the president. This is a fantasy. The administration has tried in vain to prove that Iraq had a hand in the Sept. 11 attacks. And the Central Intelligence Agency, in a classified assessment last fall, dismissed the possibility that Hussein would give his most lethal weapons to an uncontrollable terrorist organization that might turn against him.

The only instance in which he might do that, said the CIA, was if the U.S. were to launch an all-out war — because he would no longer have anything to lose. Bush’s solution is the surest way to precipitate the very nightmare it’s supposed to prevent.

- He’s a sadistic dictator who tortures his people in horrible ways. A recent report from Amnesty International found, “Detainees in their custody are tortured with electro-shocks, suffocated with plastic bags over their heads, burned by cigarettes, beaten with metal pipes and gun barrels, and have chili peppers put in their eyes or on their genitals.” But that wasn’t Iraq — it was the Philippines, where U.S. troops were sent last year to train government soldiers in fighting Islamic extremists.

We work with a lot of countries where torture is reportedly common — including Turkey, Pakistan, Russia and Egypt. Amnesty International says there are about 70 such countries around the world. There is only one, though, that bothers Bush enough to invade. Reciting gruesome tales from Iraq is good for stirring an audience, but as grounds for war, it’s completely bogus.

This State of the Union address resembled one of those fast-paced thrillers that manages to keep you on the edge of your seat even though the plot is full of holes. It was easy to get swept away by it, but only if you didn’t think too much.

Listen up: Short memories fuel the U.S. drive for war

To understand that America can fail badly in a war, you have to be old enough to recall the carnage of Vietnam

Thursday, March 6, 2003

The rest of the world may be opposed to a U.S. attack on Iraq, but here in America, there is general agreement that we are right and everybody else on Earth is wrong. American public opinion was in favor of taking out Saddam Hussein after the Sept. 11 attacks, and it still is. Who cares if this war, which we intend to fight for the good of humanity, doesn’t appeal to most of humanity?

Polls suggest that a large majority of Americans endorses the Bush administration’s drive toward war, though support has eroded in recent months. In January 2002, an ABC News/Washington Post poll found that 71 percent of the citizenry favored military action to get rid of Hussein. Today, support is down to 63 percent. Opposition, meanwhile, has risen from 24 to 31 percent. But that’s still a 32-point gap, the polling equivalent of a landslide.

Why do Americans take such a different view from Europeans and other pesky foreigners? One reason is that we have great faith in our own good intentions, and no amount of contact with reality can shake that confidence. Right now, we’re prepared to do whatever is necessary to rebuild Iraq as a peaceful and prosperous country. Never mind that we had the same mission in mind for Afghanistan, but lost interest about 10 minutes after the Taliban fell. A short memory is a great boon to self-esteem.

George W. Bush himself, who once scorned nation-building, now offers himself as the nation-builder extraordinaire. His mission is not just to remove a menace, but to plant a flowering democracy that will blow the seeds of liberation across the Middle East.

Skeptics abroad have their doubts, both about our motives and our staying power. That has something to do with all the Middle Eastern dictators we’ve been happy to snuggle with over the years out of lust for their oil. If we wanted to promote human rights in the Arab world, we could have started with Saudi Arabia, whose ruling family is repressive enough to make Gen. Franco look like Captain Kangaroo.

Let’s face it: Human rights and democracy have never been a big factor in our foreign policy. We have no trouble working with the military dictator who rules Pakistan. We don’t mind seeking help on North Korea from China, which is about as democratic as an Alabama jail. Bush has forged a partnership with Russian President Vladimir Putin even though his tactics against rebels in Chechnya would make a buzzard retch.

Americans undoubtedly approve of invading Iraq because, as American Enterprise Institute polling expert Karlyn Bowman puts it, “people have known Saddam Hussein for a decade and think he’s a thug.” But they also start from the assumption that taking care of him will be easy.

They have that belief for a simple reason: In recent decades, almost all our military ventures have been successful and virtually painless. The first Gulf War spilled amazingly little American blood, given the scale of the undertaking. U.S. soldiers in Bosnia were less likely to die than U.S. soldiers not in Bosnia. Not a single American boy or girl died in the war in Kosovo. Those interventions that have gone badly (Somalia, Lebanon) were so brief and small-scale that they can be forgotten.

To understand that America can fail badly in a war, you have to be old enough to remember the endless, pointless carnage of Vietnam. That’s why, in the ABC News/Washington Post poll, the highest support for the Iraq invasion comes from 18-to-34-year-olds, while the lowest comes from those 65 or older.

It’s hard for a lot of Americans, particularly young ones, to imagine things going very wrong — either during the war itself or in the occupation that follows. So war has regained its allure of romance and glory.

If you wonder why people support the war, you might consider why people buy sport-utility vehicles. It’s not because SUVs fill an urgent practical need, but because they carry an aura that a lot of Americans like to project: brawny, rugged, fearless. Enthusiasm for this war serves likewise to convey toughness and bravery in a manner requiring no effort.

It’s no surprise that two-thirds of men favor military action, compared to only half of women. Tough guys aren’t afraid of a little bloodshed, at least if it’s on the other side of the planet. Only women and wimps — like those effeminate Europeans — bother looking for ways to avoid a fight.

But pride goeth before a fall, and if Americans persist in launching military crusades around the globe, we’ll eventually rediscover that they can end tragically.

That’s a lesson only experience can teach.

Does dissent have a place in wartime?

Playing patriotism card is shameless

Thursday, March 27, 2003

Are you patriotic or anti-war? If you think that’s a false choice, you probably weren’t in attendance at one of the “Pro-America/Support Our Troops” rallies held in cities across the country last weekend.

In the view of many citizens who favor the invasion of Iraq, opposition is symptomatic of anti-Americanism, and open dissent during a time of war comes close to treason. At some rallies, marchers carried signs saying, “America — Love It or Leave It.”

It’s hard to see why people should be expected to leave a free country because they have the gall to exercise their freedom. Maybe the ones who should leave are their critics, who would be more comfortable in a country whose government tolerates no criticism — say, Iraq. Or maybe they think we can’t deliver liberty to the Iraqi people unless we first confiscate it from the American people.

There is no contradiction between loving your country and wanting it to stay out of unwise wars that expose American soldiers and civilians to needless dangers. Nor does demonstrating against the war imply a desire to see the United States lose. I can’t speak for all critics of the war, but once the bombs started falling, I wanted exactly what the supporters want: a swift victory and the safe return of all our soldiers, marines, sailors and aviators.

As is often the case when the nation is embroiled in military conflict, however, those who favor war make every effort to appropriate the flag as their own political symbol. They insist that public opposition to the war provides comfort to Saddam Hussein and betrays those risking their lives in Iraq. Fox News host Bill O’Reilly said a few weeks ago, “It is our duty as loyal Americans to shut up once the fighting begins, unless facts prove the operation wrong, as was the case in Vietnam.”

Shut up once the fighting begins? You first, Bill. People who opposed the war have no duty to gag themselves once the war is underway — any more than Bill Clinton’s enemies had an obligation to cease their criticism once he won his impeachment trial.

Nor is blind support of government policy any favor to those in uniform. Supporters of the war often suggest that the debate is between those with military experience and those without. Not so. Many of its advocates in the administration haven’t served — including Vice President Dick Cheney and Deputy Secretary of Defense Paul Wolfowitz. The main skeptic has been Colin Powell, former chairman of the Joint Chiefs of Staff.

Some prominent veterans have criticized the administration for invading Iraq rather than simply keeping Hussein in the cage to which he has been confined for 12 years. Retired Marine Gen. Anthony Zinni, former commander of U.S. forces in the Middle East, came out against the invasion last year. So did Brent Scowcroft, a retired Air Force general who served as national security adviser to the first President Bush.

They have plenty of company. Shortly before the war began, an organization called Veterans for Common Sense sent a public letter to the White House signed by 986 veterans of every rank and service, saying that they “strongly question the need for war at this time.”

One of them was Charles Sheehan-Miles, a decorated Army combat veteran of the Persian Gulf war and a co-founder of the group. What did he think about protesters back then? “It made me happy that there were people who cared enough to take a stand on the issue,” he says.

As for the reaction of his fellow soldiers, Sheehan-Miles recalls, “It was mixed. Some thought nobody should protest, and some thought it was OK, and a lot didn’t care one way or the other.” It doesn’t show much regard for our military people to think they would fall to pieces upon hearing that some people question the president’s mission.

Supporters of the war don’t really believe that dissent is intolerable in wartime. Even O’Reilly said it would be defensible if “facts prove the operation wrong.” You can be sure conservatives will object loudly if they think the administration is waging the war with insufficient force or resolve. But if that sort of criticism isn’t dangerous to the war effort, why is criticism from the other side?

Playing the patriotism card or the veterans card is a shameless attempt to discredit and intimidate dissenters, which is easier than proving them wrong. The real divide is between those who see open debate in a democracy as a weakness and those who see it as a strength. The anti-war demonstrators may be wrong about some things, but they’re right about that.

Our endless quest for invulnerability

Sunday, March 30, 2003

America is the most secure nation on Earth — and the most insecure. The war in Iraq baffles the rest of the world because it reflects our tendency to see urgent perils that others don’t. We spend as much on defense as the rest of the world combined. But we regard Saddam Hussein, the beleaguered dictator of a small, poor, faraway nation, as a threat too great to tolerate.

This is a different kind of war from what the world is used to. In Afghanistan, we were pursuing an enemy that had killed thousands of Americans. But Iraq hasn’t attacked the United States, hasn’t threatened to attack the United States, has nothing to gain by attacking the United States, and hasn’t acquired the capacity to do us any serious harm. The Bush administration has gone to war solely because Iraq might, someday, put us at risk.

One reason Americans support this war, whether it proceeds quickly or slowly, is that they look forward to being rid of this chronic nuisance so we can enjoy a more peaceful world. But the march to Baghdad looks to be just the opening battle in a broader and more dangerous war — against any potential adversary, anyplace in the world.

That’s the message of the new national security strategy unveiled by the administration last year. It asserts the right of the U.S. to launch preventive wars, if necessary — and not just to eliminate immediate threats, but to head off “emerging threats before they are fully formed.”

Calvin Coolidge once said, “If you see 10 troubles coming down the road, you can be sure that nine of them will run into the ditch before they reach you.” George W. Bush, by contrast, worries that the troubles will not only stay out of the ditch but will bear offspring on the way.

That’s the whole basis of the war against Iraq. For 12 years, the U.S. and its allies have proven that Saddam Hussein can be contained by a significant military presence and the occasional use of force. During the first Gulf War, we learned he can be deterred from using weapons of mass destruction. But suddenly, those methods are deemed inadequate.

The old maxim is, “If you want peace, prepare for war.” The idea is that enemies wouldn’t challenge a nation that is well-armed and ready to respond to aggression. Hitler, for example, was encouraged in his predations by the reluctance of Britain and France to fight.

Hussein has not been so lucky. Since he invaded Kuwait — which he did largely because the U.S. government gave him the idea we’d let him — he has been acutely aware that he will never be allowed to aggress again. The first President Bush also left him no doubt that if he used his worst weapons, he would be obliterated.
