Tuesday, August 26, 2008


(This article originally appeared in the weekly Easy Reader, Hermosa Beach, Calif.)

November 15, 2007

Here, take this little True/False quiz on honeybees. See if you know more about them than Jerry Seinfeld does.

1. Honeybees have yellow bodies with black stripes.
2. Male bees have stingers.
3. Male bees go out to gather nectar from flowers and are the principal workforce inside the hive.
4. Worker bees select one job in the hive when they are young and do it for the rest of their lives.
5. All the bees in a colony are cousins.
6. Bees have no use for pollen themselves but suck it up and spray it over flowers because they somehow know pollination is important for the ecology.
7. If a colony of bees has enough honey to meet their needs, they will stop working.
8. Beekeepers enslave the bees for their own profit. Their slogan is, "They make the honey, and we make the money."
9. Beekeepers use smoke to suffocate the bees.
10. Many people are petrified of bees.

Here are the answers:

1. False. Honeybees have brown bodies with black stripes. The yellow-and-black insects are yellowjackets, the wasps that go after your picnic and give honeybees a bad name.

2. False. Only female bees have stingers. The male bee’s similar organ is for sex.

3. False. Male bees, appropriately named drones, do nothing at all except to fly out to look for and mate with a virgin queen (and to die in the process). The rest of the time, they lounge around inside the hive, being fed and cared for by the females, who outnumber them about 200 to 1. In the fall, the females push them all outside, where they starve to death.

4. False. Worker bees, all sterile females, perform many different tasks in the hive, depending on their age. They spend the last half of their six-week lives as foragers, gathering nectar and pollen from flowering plants.

5. False. All the bees in a colony are sisters and brothers, the offspring of the queen bee.

6. False. Bees bring back pollen to the hive and convert it into "bee bread," their source of protein. Honey is their carbohydrate. They eat nothing else besides these two foods.

7. False. As long as there are enough flowers, enough workers, and enough room in the hive, bees will continue to make honey, even though it’s too much for them to use. This is why beekeepers can take the surplus honey without depriving the bees.

8. False. Unlike cows, bees cannot be domesticated or trained; they will do whatever they want. The best that beekeepers can do is give them a decent home and fields of flowers and hope they’ll stick around.

9. False. Smoke calms the bees and when used in moderation will not harm them.

10. True. One tiny insect, especially in a car, will turn many people frenetic.

How did you do? Better than Seinfeld, I’m sure. Each of the above questions is based on scenes from his DreamWorks animated feature, Bee Movie. Only the last one is true, and his depictions of bee paranoia are uproariously accurate.

As a beekeeper who often gives talks to both adults and children, I wonder whether there’s any harm in dishing out all this misinformation. Mostly, though, I’m glad this movie’s out there, since nothing makes a person realize the truth better than unmasking the lies. And after all, it’s just a cartoon. If you can make bees speak English, why can’t you make bee colonies look like the male-dominated American society of, say, 1967?

That’s what this movie does. As you probably already know, having seen it yourself, heard the reviews, or read the McDonald’s promotional packaging, it’s a fly-weight Bildungsroman starring Seinfeld as The Graduate. Returning from Bee College on the other side of the hive and smartly dressed in black and yellow ("My sweater is Ralph Lauren, and I wear no pants") (see fallacy #1 above), Barry B. (for Benjamin?) Benson is pressured by his "parents" (fallacy #5) to get a job (fallacy #3), not in plastics but in honey, the only industry in this company town. Dreading the thought of spending the rest of his life doing a single task (fallacy #4), he sneaks out to accompany the macho Pollen Jocks air squadron (#3 again) on their flight to vacuum up nectar and spew around pollen from the flowers in Central Park (fallacy #6, except that there actually are flowers in Central Park). Separated from his unit and after brushes with death by tennis ball and windshield wiper, he finds himself in a flower shop and is saved from the swatter by the human owner, cartoon-comely Vanessa Bloom (no relation to Molly), voiced by Renee Zellweger. He does exhibit the drone’s drive to mate, but since the PG rating would be jeopardized and he doesn’t have the right fixtures anyway (fallacy #2), they settle for a platonically passionate relationship, giving new meaning to a woman’s cry, "You insect, you!" It’s a pity there’s no Mrs. Robinson, but there is a funny remake of the swimming pool scene.

The last half of the movie turns Marxist. Beekeepers are portrayed as capitalist exploiters of the apian working class (fallacies #8 and #9). Barry courageously takes the human race to human court and wins. All commercial honey is returned to the bees, who then grow so lazy on the surfeit that they quit working (fallacy #7), creating a pollination crisis that is solved by . . . well, you gotta see the rest for yourself.

Or else just forget about it. Like Seinfeld used to say about his TV series, Bee Movie is a show about nothing. Despite all the save-the-pollinators advertising (including a pre-movie plug by a chief exploiter, bushy-bearded Burt of Burt’s Bees), it has little to do with either nature or human nature. It’s clever and often funny, though you may find yourself wishing Jerry would ditch the bee costume — his face has always been at least as entertaining as his lines.

For millennia, at least as far back as the Roman poet Virgil, humans have looked to honeybee society as a model, utopian or dystopian, for their own. More interesting than Seinfeld’s drone-world would be a feminist treatment reflecting the actual world of the hive.

My mind is reeling. Imagine the queen bee in an asbestos pants-suit.

November 8, 2007

It’s not that anybody was expecting surprises when President Bush mounted the podium at the State Department on October 24 to talk about Cuba.

"In this building," he began, "President John F. Kennedy spoke about the U.S. economic embargo against Cuba’s dictatorship. . . . Today another president comes with hope to discuss a new era for the United States and Cuba."

Actually, after 45 years, it was just more of the same.

Among the wonderful things he named to usher in the new era are a "Freedom Fund" for economic development, admission to the Partnership for Latin American Youth scholarship program, and — "Here’s an interesting idea to help the Cuban people," he said, as if he’d just thought it up — blanketing the country with internet-ready computers to be supplied by non-governmental organizations and faith-based groups. Such benefits will become available, however, "only if the Cuban regime, the ruling class, gets out of the way." Until then, the embargo and travel restrictions will continue, diplomatic relations will remain severed, and no negotiation will take place.

This is a new era?

Perhaps the most convincing proof of the futility of America’s intransigent policy toward Cuba is that Fidel Castro has outlasted nine American presidents and may very well outlast the tenth. Bush calls Castro’s "a failed regime." If that’s failure, it’s certainly a durable one.

If ever there was a time for a major shift in Cuba policy, it is now. The once-blustering Fidel is sick and has turned the reins over to his younger, more circumspect brother Raul. A practical man, Raul masterminded a now-thriving tourist industry to bring in hard currency after Soviet aid dried up in the 1990s. He also developed a creative urban-farming plan with a free-market component to lessen dependence on foreign food imports. There are signs that he is favorable toward a Chinese-style economy. Nothing could create the atmosphere for a "velvet revolution" more than normal relations with the U.S.

But Bush will not deal with dictators. Period.

Sound familiar?

There’s no doubt that this administration’s policy, like that of its predecessors, has been shaped by the politically powerful Cuban exile community in Miami. But there is something more radical at work here. The rhetoric of this speech was pure Bush, the Bush who brought you Iraq and may soon bring you Iran. It was Bush the idealist, Bush the ideologue, who was on display.

With his now familiar but no less frightening axis-of-evil hyperbole, he identified Cuba as "a tropical gulag," a place of "terror and trauma," with "horrors still unknown to the rest of the world."

Consistent with his approach toward repressive governments in Iraq, Iran, and Syria among others, and toward Hamas in Palestine and Hezbollah in Lebanon, Bush sees no room for compromise or tolerance toward the Castros or their likely Communist successors. "Life will not improve for Cubans under their current system of government," he declared. "It will not improve if we seek accommodation with a new tyranny in the interests of ‘stability.’ . . . The operative word in our future dealings with Cuba is not ‘stability.’ The operative word is ‘freedom.’"

What his own failed international policies have not taught him is that freedom is better nurtured by stability — by negotiation rather than confrontation, by inducement rather than sanction.

In an interview with Charlie Rose on PBS last week, the head of the U.N.’s International Atomic Energy Agency, Mohamed ElBaradei, gave a refreshingly frank and lucid analysis of the present dynamic with Iran: The more the U.S. threatens, he said, the more the Iranian people will hunker down behind their government, no matter how oppressive it is and no matter how attracted so many of them are to Western ideas and lifestyles.
It is the same with Cuba, and then some.

Despite economic hardship, collectivization, and restrictions on free speech, reports suggest that most Cubans hardly consider their island "a tropical gulag." They are grateful for their educational and health-care systems and fiercely proud of their culture. Though they are obviously dissatisfied with the Communist system and eager for change, there is little apparent desire for overthrow. Many, in fact, retain a genuine affection for Fidel and for the ideals, if not the implementation, of his revolution.

What Cubans on the island fear more is attempts by the United States to engineer a "transition" — or even an Iraq-like invasion — when the Castro era ends. They want nothing more than free and open relations with the U.S. — but on their terms, not ours.

As one friend of mine, an organizer in the urban agriculture movement who has often visited Cuba to research its sustainable food production program, told me: "What I hear people saying is, ‘We want change, but we want to do it our way, not America’s way. Bush wants to liberate us, but we want to liberate ourselves.’"

You can’t help hoping that Bush might still surprise the world by emulating his predecessors. Richard Nixon’s initiative toward China — a total turnaround by the consummate Cold Warrior — has become a political paradigm of the triumph of common sense over ideology. While continuing to castigate Communist repression, he unexpectedly offered more amicable relations and freer trade, a move that sparked the systemic changes that have propelled China’s astounding economic growth. Much the same can be said about Ronald Reagan’s overtures to Soviet leader Mikhail Gorbachev after years of Evil Empire rhetoric. Both Nixon and Reagan proved to be practical politicians who saw the opportunity to induce change by acting synergistically with systems poised for change.

Bush’s speech indicates that he’s neither Nixon nor Reagan. But he’s still got over a year to surprise us.

Sunday, August 24, 2008


November 1, 2007

When I arrived in the South Bronx in the fall of 1990, the reality matched the legend. Half the buildings in the neighborhood were burnt-out shells. The sidewalks were covered with broken glass and empty crack vials. The nights roared with raucous music, argument, and gunshots. Drug dealers brazenly hawked their products right from their own front steps. Friends advised me to avoid subway violence by riding in the front car, where the engineer might protect you. In chic midtown Manhattan, you had to step around or over the many homeless stretched out on the subway grates for warmth. On summer days, rats undulated in packs through piles of putrid garbage bags waiting on the streets for sporadic pickup.

People were afraid the whole place would either implode or explode. In 1993, polls showed that almost half of the residents of New York would move away if they could.

David Dinkins was the mayor. He had defeated three-term incumbent Ed Koch in the 1989 Democratic primary and had gone on to beat Republican Rudolph Giuliani, the former U.S. Attorney for the Southern District of New York, in the general election. There was no trick to that: Dinkins, the former borough president of Manhattan, was a respected public figure; an African-American, he was thought best suited to ease racial tensions; and of course, he was a Democrat in a straight-ticket Democratic town.

Four years later, he was blamed by many for bringing the city to its knees.

"My verdict on the David Dinkins years is simple," Ed Koch later wrote. "Very nice man, very poor mayor."

Dinkins was unable to come to grips with the circumstances and events that were afflicting most major cities at the time: crime, homelessness, racial division, loss of the tax base by suburban flight, compromise of the infrastructure. And he dithered in crisis, most notoriously during the rioting between Blacks and Jews in Crown Heights, Brooklyn, in 1991, when he waited three days before sending the police in force to put it down.

Giuliani saw his chance. In 1993, running on a law-and-order platform, he faced Dinkins a second time. Voters were not happy with either choice. Koch wrote that people would come up to him on the street and implore, "Mayor, you must run again, you must run again!" "No," he would respond, "the people threw me out and now the people must be punished!" — at which they would cry, "Oh, Mayor, we have been punished enough!"

Democrat Koch eventually endorsed Giuliani, fearing that New York was "on the brink of disintegration." Giuliani won by two percent.

The tough guy, the prosecutor, the commander: Many who voted against him, myself included, were secretly relieved to see him in.

Things turned around. Under Giuliani’s first police commissioner, William Bratton of Boston, the crime rate fell sharply. In my neighborhood, undercover cops began using the bell tower of St. Augustine Catholic Church to monitor the corner "drug stores"; soon the dealers went undercover themselves and the open hawking stopped. Violence in the public housing projects and subways was tamped down by folding the ill-trained housing and transit police forces into the NYPD, and by enacting stringent gun-control laws. People in poor communities like ours were fearful of the police, but grudgingly glad they were there.

Then there was the "quality of life." Despite protests, often justified, homeless persons were swept off the grates and into shelters. Subway stations were refurbished, and the ubiquitous graffiti on the cars were promptly removed. The "squeegee men," the bane of every urban driver, were driven away by the police. Big-bomb firecrackers, available year-round in Chinatown, were banned and confiscated, and noise-abatement laws curtailed the late-night throb of ultra-bass speakers in apartments and roving cars.

Giuliani was the right man at the right time. The economic achievements of his administration that he touts in his present sparring with Mitt Romney — balancing budgets, cutting taxes — were mostly the result of the general surge in the economy during the Clinton years. His fundamental accomplishment was to restore order to a city verging on chaos. With order, confidence returned, and New Yorkers once more began to feel that their destiny was in their own hands.

Their hands, not his.

When Giuliani ran for a second term in 1997, people were happier with themselves than they were with their mayor. Over four years, he had shown himself to be, as Ed Koch wrote in his 1999 book with the telling title, Giuliani: Nasty Man, "a ruthless control freak who governs by imposing a state of terror on members of his administration and claiming credit for accomplishments he had nothing to do with."

He had forced Bratton out of his job, piqued that the police commissioner had made the cover of Time magazine. He had hectored Schools Chancellor Ramon Cortines as "precious" and "the little victim." Without investigation, he had sided with the police in a number of incidents of alleged brutality, including the sodomizing of Abner Louima, who may or may not have heard the officer with the broomstick cry, "It’s Giuliani time!"

But it was order the voters were grateful for, and they re-elected him overwhelmingly.

Giuliani’s second term was simply bizarre. As with all dictators, the personality that had facilitated his achievements curdled into hubris. He openly battled with his second wife, Donna Hanover, with their two young children caught in between. He alienated the artistic community by trying to close down a provocative exhibit at the Brooklyn Museum. He virtually severed relations with the Black community by refusing to speak for two years to Virginia Fields, the Manhattan borough president, and by his unrepentant attitude after the 41-shot slaying of unarmed African immigrant Amadou Diallo by undercover cops.

When he left office in January, 2002, despite his unquestioned leadership during the World Trade Center trauma, people were as glad to see him go as they had been to see him come. He’d done his job, but now his time had passed. His suave and savvy successor, Michael Bloomberg, would become the city’s Everyman that Giuliani had never been.

The question before today’s voters should be: Is Giuliani the right man for this time? After eight years of pig-headed presidency and the polarization of national and international politics, is it really "Giuliani time" anymore?


October 25, 2007

There is much about being a good mayor of New York that commends itself to being a good president of the United States. With a population of over eight million (more than Switzerland’s), a Gross Domestic Product of around $420 billion (more than Saudi Arabia’s), and an annual budget of $52 billion, New York could be a city-state, not just a city. The New York Police Department is more like a standing army, larger and better funded than many countries’ entire military and with bureaus in capitals throughout the globe. The mayor is an international figure, frequently abroad and often treated like a head of state.

But in addition to being chief executive, the mayor is expected by New Yorkers to be the city’s Everyman — not regally aloof but grittily involved. They want him on the spot at every emergency small and large, at every gas-leak or noxious smell, at the bedside — or the funeral — of every fallen cop and firefighter; they want him out there taking batting practice with the sand-lot hopefuls in Central Park, sampling an icee at a South Bronx push-cart or a garlic-dill at the Lower East Side’s Pickle Day; they want to see him scrunched into a child’s desk on the first day of school.

The mayor of New York is an icon of the city, the emblem of its identity and its self-esteem, and several mayors have become national icons, too: Fiorello La Guardia, guiding the city through the Depression and the War while reading the Sunday comic strips to the kids over the radio; and more recently, Ed Koch, leading thousands of stranded citizens over the Brooklyn Bridge to their jobs during a transit strike, and almost every day for eight years popping up all over the five boroughs to shake hands and ask, "How’m I doin’?"

So too, the president of the United States is the icon of the country. Beside and beyond all the policies and programs, it is the personal image the president projects that most affects our sense of identity and self-esteem, and the identity and esteem accorded it abroad.

Most of us would like to think we’re beyond that — that we weigh our mayors, our presidents, and our candidates rationally, on the basis of issues, not image — but in fact the mental process of evaluating our leaders and prospective leaders is to a great extent non-rational, intuitive, right-brained. We may try to puzzle out the newspapers’ comparison charts of candidates’ health-care proposals, but in the back of our heads we treat these people, whom we’ve never seen in the flesh — much less had a chance to chat with — just as we do our co-workers, friends, and lovers.

Thus President Bush, who first won election as a compassionate conservative, the man you’d rather have a beer with, gradually alienated much of the country and most of the rest of the world, more by personality than by policy. By kicking back at the ranch after Hurricane Katrina struck, by ignoring the funerals of soldiers killed in Iraq, by an on-the-air attitude of arrogance, flippancy, and often befuddlement, he has at the end become an embarrassment, an anti-icon, an image of America as the home of the shallow, the stupid, the self-obsessed.

It is this iconic, non-rational element in our appraisal of leaders that has given Rudolph Giuliani his paradoxical traction in the race for president. By any objective standard, he should simply be unelectable. His historical stances on gun control, gay rights, abortion, and immigration (on all of which he now waffles but does not wholly retract), to say nothing of his tawdry personal life, should make him anathema to the Republican base, but the base has not brushed him off. His more-of-the-same positions on everything from Iraq to environment to health care should make every Democrat and independent cringe, yet they are looking at him. He remains at the top of the Republican polls and the strongest contender in election-today hypotheticals against every Democratic opponent.

It’s the power of myth. When people need an idol, they will not check for clay feet. Even though on the policy page Giuliani is basically Bush, in the misty realm of icons he is the anti-Bush, the man who was on the spot at the World Trade Center, who didn’t get doe-eyed at the news, who brought the city and the country through the gravest assault on American soil since Pearl Harbor. He’s the man, you’d love to think, who’d have planted himself squarely on the field of the Superdome after Katrina and personally consoled every single military widow and widower: the Mayor of the United States.

Early in his campaign, most of the pundits predicted that Giuliani could not last, that he could not run on 9/11 alone. But he has played the terror card well, tapping into Americans’ lingering fears, and of late has broadened his mythic scope, portraying himself as the Man Who Cleaned Up New York and can clean up Washington too.

How sturdy is the Giuliani myth among New Yorkers?

There is no doubt that most, myself included, applauded his public persona on 9/11 and the days that followed. Many, in fact, initially favored his own proposal to the state legislature that the newly-enacted term-limit law be waived and his name be put on the ballot in the November 2001 elections. Needing an anchor in the storm and fixed on the moment of terror, we experienced a temporary amnesia, forgetting the Giuliani we had come to know over the previous eight years and up to that point were glad to be rid of.

Response to overwhelming crisis is one thing, but most of life is day-to-day. How did he do with the day-to-day as mayor?

We’ll test the pre-9/11 myth in the next column.


October 18, 2007

"America must speak candidly about the past not only to help heal the wounds of the survivors and the families of the victims, but to give the United States the moral authority it needs to take action against other genocides like that taking place today in Darfur."

So said Adam Schiff, the Democratic Congressman from Pasadena, following the passage of H.Res. 106 by the House Foreign Affairs Committee last week. This non-binding resolution, which he authored, calls upon the president "to accurately characterize the systematic and deliberate annihilation of 1,500,000 Armenians as genocide."

Candor, healing, moral authority: all noble ideals, all in critically short supply today. Would the presidential utterance of that one word, genocide, help bring these about?

It doesn’t look like it. Immediately after the vote, the Republic of Turkey, at which the bill’s sponsors claim it is not directed, recalled its ambassador and issued a statement that passage of the measure by the Congress might force the country to "cut logistical support to the U.S." Jittery, the administration presented Defense Secretary Gates to remind legislators that 70 percent of the military hardware going to Iraq and 30 percent of its fuel comes through U.S. bases in Turkey. Besides, said White House Press Secretary Dana Perino, President Bush has often "expressed on behalf of the American people our horror at the tragedy of 1915" — he just won’t use the G-word to describe it.

Not much healing going on as yet; what about candor?

According to its sponsors, the bill has nothing to do with modern Turkey; it merely wants to set the historical record straight. Beginning in 1915, the desperate and disintegrating Ottoman Empire attempted to exile its Armenian population, which it considered an internal threat; at the outbreak of World War I, the Empire had aligned itself with the Central Powers, and the Orthodox Christian Armenians were accused of aiding the enemy, Russia. Reporters and diplomats from the U.S., the neutral countries of Europe, and even allied Germany extensively documented the deportations and the deaths, and at the end of the war an international tribunal condemned them as "offenses against the laws and customs of war and the principle of humanity." Subsequent research by scholars worldwide overwhelmingly concluded that the leaders of the Empire systematically planned and actively attempted the mass extermination of the Armenians. At present, 22 nations have identified those acts as "genocide."

In 1923, the Republic of Turkey, a secular state modeled on Western governments and embracing Western culture, was shaped from the ruins of the Ottoman sultanate. Its radical disengagement from the old ways in some sense legitimizes the contention of the genocide bill’s supporters that it levels no accusations against the people of Turkey or their government but is directed, in the words of House Majority Leader Steny Hoyer, at "another government, at another time." It wishes only to place before the world the atrocities against the Armenians alongside other deplorable efforts at extermination, past and present.

But cultures are always slow to admit their own shame and will manipulate their histories intensively to avoid it. The official Turkish position, as I understand it, is that most of the Armenian deaths were caused not by execution but by starvation, disease, and their own hands — horrible enough, but circumstantial, not intentional, and thus not technically "genocide."

Wriggling on the issue of atrocities, while inexcusable, is understandable and by no means unusual. Germany kept the Holocaust out of its textbooks for years. The Soviet Union denied its horrors against Jews, Ukrainians, and even its own people during the Stalinist era. And what about our own government’s treatment of the American Indian? It took almost two centuries before our children began to be taught that this was not the heroic civilizing of a savage people but something close to . . . genocide. I doubt that that word appears much in textbooks even now, or that the Congress in its quest for candor has ever passed a resolution acknowledging it. And indeed, no matter how repentant we Americans may become of our history of racial maltreatment, be it against Indian tribes, the Chinese in the nineteenth century, the Japanese internees of the twentieth, and African-Americans to this day, would we not still be insulted if, say, the Turkish parliament chose to call it to our attention?

Which brings up the question of restoring our moral authority. If there’s one positive cultural result of Abu Ghraib, Guantanamo, and the secret detention-and-torture centers outside our borders, it may be that we Americans are becoming more open-eyed about the actual weight of our moral authority. Humiliated and chastened by our government’s actions and attitudes, we may be becoming less likely to cast the first stone. Violations of human rights are every country’s problem, including our own. Our moral authority will not be restored by self-righteous declarations but by positive initiatives for peace, international justice, and economic development.

Judging from the experiences of other countries, including our own, it may take a long while yet for Turkey to acknowledge the historical fact of the Armenian genocide. It may come sooner rather than later if the U.S. continues to nurture its relationship with Turkey as an open, forward-looking secular state.

Representative Mike Pence of Indiana, a Foreign Affairs Committee member who voted against the genocide resolution after supporting identical ones in 2000 and 2005, summed things up exactly: "While this is still the right position, it is not the right time."


October 11, 2007

"All right, you can print this, but don’t use my name. I don’t want to get painted as an un-American guy. I’m proud of my war record, but I just want to be left alone."

My friend, a retired small-business owner in Westchester County, just north of New York City, served in the South Pacific during World War II as a tugboat captain with the Merchant Marine.

"I enlisted in 1942," he told me. "I was 21 years old. I wanted to sail, not to shoot big guns, so I picked the Merchant Marine. I fibbed a bit to the recruiter about my vast sailing experience; all I’d ever skippered was a little catboat my brother and I owned. They took me right away. They didn’t check much in those days."

He got his training on the job, and spent the war pulling barges and supply ships into and out of port, first in New Guinea and then in the Philippines. "I believe in Divine Providence," he said. "I got through three major invasions without a scratch. Sometimes I even had fun at it."

When I asked him if he had watched Ken Burns’s PBS documentary, The War, he told me, "No. I had enough of the real thing. I don’t want to be reminded of all that stuff again. Besides, most of what you see on TV and in the movies is one-sided: The Japs are all heartless killers who will fight to the death, and the Americans are all good guys just doing a job. It wasn’t that clean-cut.

"I was at Wewak and Hollandia in ’43. Of course, you never know what is really going on in a battle. You don’t see anything beyond your immediate surroundings, and you don’t have time to think about grand strategy anyway because you’re dodging bullets and bombs and trying to stay alive. Much later, I learned about what happened there from a Japanese friend of mine whose father was trapped at Wewak. MacArthur cut them off at Wewak and took Hollandia. The Japs were isolated in the jungle, thousands of them, dying of hunger. They had their backs against the mountains and couldn’t do a thing. We treated them like a prison colony; I think we even dropped supplies on them to feed them.

"My friend’s father was a radio operator, and he said he sent radio messages to the Americans: ‘We surrender.’ They weren’t all suicidal, like we’re used to hearing - they valued their lives too. It was us who didn’t want to take prisoners because we didn’t know what to do with them. So we made them kill themselves.

"Did you ever see those movies about the Battle of Iwo Jima? They made everybody think this was an essential objective, but that’s bullshit. This was a god-forsaken island in the middle of nowhere. They all claim they needed that airstrip for bombing raids on Japan, but it had no strategic value at all, and after we got it we hardly used it.

"It was terrible. There were thousands of Japs there, dug into the caves. By that time, in ’45, there was practically nothing left of the Japanese navy and air force. They were finished. We had the island surrounded by ships, we bombarded it for weeks, and the Japs were completely isolated. They could not leave. All we had to do was let them alone.

"But the Marines were crazy. They just wanted to fight. The war was almost over for God’s sake, but they had all this ammunition to use up.

"We lost 3,000 Marines just trying to get those Japs out of the caves. If they wanted to stay there, we should have left them. They were helpless. Instead, the Marines decided to clear them out with flame-throwers. What a way to die. Do you know how flame-throwers kill? They draw all the oxygen out of the cave, so the poor guys suffocate, then they incinerate them. That must be the worst kind of death.

"Was it suicide? It was more like desperation. By the time we did all that bombardment, they were out of their minds. My God, it was like shooting fish in a barrel. It was inhumane. It was the most disgusting goddamn thing we ever did.

"And that flag-raising picture? That was just a publicity stunt. You know they did it once for real, and then a second time for the photographer. And then half of those guys were killed later on trying to get the Japs out of the caves. The Marines had all those toys they knew they wouldn’t be able to play with anymore, and they used them up on those poor bastards. They confused war with a goddamn football game. It was just another sporting event to them, a grudge match.

"It was goddamn vindictive, that’s what it was.

"Well, at least that’s what I think, but I’m only one person, and I’m second-guessing admirals. But I think the idea should be to lessen violence wherever possible, not increase it.

"I tried to do some good for humanity whenever I could in that hell. I’ll tell you two stories.
"I was in the battle of Leyte Gulf in the Philippines in ’44, the biggest naval battle of the war. One time my tug was out at sea, far away from other ships with an air battle going on overhead. Our boys shot down a Japanese marine pilot and he jumped out but his parachute wouldn’t open all the way. We were the only vessel for miles around. When he hit the water, he hit it hard and landed maybe within 50 yards of us. He was unconscious, bleeding from his nose. We fished him out and got him on deck, but then we didn’t know what to do with him. Eventually he came to, and we put him on a hospital ship headed for Leyte Gulf. They put a ladder down the side, and that guy was able to climb up that ladder into the ship. I never found out what became of him, but I’ve often asked myself: Did he die? Did he go home and have a family? Did he end up in jail? I wish I knew.

"I’ll tell you another one. In the last six months of the war, I got dengue fever. They took me off the tug and put me on a small hospital ship of maybe 30 beds. I was in charge of the medicine supply. At that time there was a big black market in medicines.

"Late one night, a couple Filipinos came out to us in a canoe. You weren’t supposed to let anyone get close to your vessel because you never know. But this guy was begging and pleading that his son was sick with pneumonia and he would buy penicillin at any price. I listened to his story but wouldn’t sell him anything, it’s against all the rules. But he was so sincere that I got some penicillin, which I shouldn’t have done, and just gave it to him.

"Two weeks later we were tied up unloading some potatoes and a Filipino man came up and invited me to dinner at his boss’s house. It turned out his boss was the guy who had begged penicillin for his son. He had a beautiful home with gorgeous mahogany floors, and two lovely daughters, and he gave us a wonderful dinner. I’d made a nice friend, and in fact we invited his two daughters to dinner on the hospital boat. They came with their brother, whose life had been saved by the penicillin I gave his dad.

"I’m glad I did it."

Not exactly un-American.


October 4, 2007

"My God did not make this world to be like this! He made it for peace and love, but man just wants to fight."

Leonard Fraser leaned back in his chair in the room he shares with his wife Dorothy at the Fulton Special Care nursing home here in the South Bronx. I’d come over to talk with him about World War II.

He and Dorothy have been watching Ken Burns’s documentary, The War. "They got it basically right," he told me. "What you saw on TV is pretty much what it was like."

Leonard grew up in Harlem and was working in a midtown warehouse when he was drafted into the Army in 1942. After basic training at Fort Huachuca in Arizona, the camp reserved for African-Americans, he was sent to New Guinea. "First Battalion, Negro Division, 93rd Cavalry Reconnaissance: That was us. I was trained as an armorer. I could fix any gun you can name, big or small."

He served in the South Pacific for the duration of the war. The battles are mostly a blur to him now, or so he says. "Finschhafen . . . Morotai . . . all the big ones. The enemy was tough. We had to get them out of caves where they dug themselves in. They would die rather than surrender. They would throw themselves off of cliffs to commit suicide. I took a beautiful silver sword off the body of one of the officers. I have it in storage."

He paused, and his face grew blank. "I don’t remember a lot of what happened, but I remember how I felt. I was scared all the time — we all were — but there was nothing we could do about that. We just kept on."

"Malaria," his wife prompted. "You got malaria."

"Yes, I got malaria, we all did. It was jungle warfare. We just kept on."

"He doesn’t remember details anymore," Dorothy said. "He told me so many things, but now I can’t remember them either."

"It’s not a good thing to think about," Leonard said. "You have to forget."

Leonard Fraser got through over two years of combat uninjured. When he heard the news of the Japanese surrender, he summed up his feelings in one word: "Overjoyed."

They sent the troops back to the States packed in the ship like cattle. "He had to sleep on the floor all the way back," Dorothy said. "There were no cots for them."

Did he experience racial prejudice in the service?

"No, except that we were segregated. We fought hard and well, and the white soldiers came to respect us. The only prejudice I remember was stepping off the train in New Orleans for a break on the way out to Arizona. We had to use ‘colored’ bathrooms and eat in the ‘colored’ section of the restaurant. I didn’t like that. But that was just the South, not the Army."

When he returned home, his old job at the warehouse was waiting for him. Soon he met Dorothy, a native of Jamaica who had just come to New York to study fashion design and dress-making. It was mutual love at first sight.

"He was such a handsome guy," she recalled, "and a great dancer. But I wanted to make sure about him. The first thing I asked him was, ‘Do you have a job?’ The second thing was, ‘Do you have a bank account?’ When he took me to meet his mother, I knew we would all get along. I even looked a little like her. We all fit in together. We got married in 1947."
"That’s a war in itself," Leonard laughed.

Their life together was not easy, Dorothy told me. When the warehouse closed down in the mid-1950’s, Leonard did handyman work while Dorothy supplemented the family income designing and making dresses. They had two children, Lenny and June; Lenny died suddenly in 1993. They have three grandchildren.

Tormented by memories of battle, he was in and out of V.A. hospitals all his life. "He has ‘war syndrome,’" Dorothy said. "It’s like he’s there sometimes."

In the mid-1960’s, the family moved into a spacious apartment in the new Webster Houses, a city project in the South Bronx, where they remained for 40 years while the neighborhood disintegrated around them. Leonard became what’s known today as an "outsider artist." He would bring home items he found in the street and make colorful arrays of them in little shrines around the apartment. In the foyer, he had a collection of unmatched baby shoes, meticulously arranged by size and surrounded by American flags. On my visits to their home over the years, Dorothy would complain about the clutter but was resigned to it: "It’s Lenny’s way."

Last year, when Dorothy’s diabetes made her unable to walk, their daughter found a place for both of them in Fulton Special Care, a spotless facility with a pleasant and helpful staff, and good food — too good, according to Dorothy.

"Look at him!" she complained. "He used to be so skinny, and here he’s put on 20 pounds!"

That doesn’t bother Leonard. Untroubled by physical ailments, still youthful in appearance, and preserving the impish smile of a fresh recruit, he celebrated his 86th birthday last week.

But he keeps pondering the war he was in, and all the wars that have happened since.

"When you’re out there under fire, you ask yourself many times: What are you here for? What are you doing? You’ve got to fight for your life."

He turned silent a moment, then said, "If I had to do it again, it would be questionable. Was it worth it? Sixty years later, you turn around and people are still fighting."

"You can write this down: Don’t ever call up Sergeant Fraser again."


September 27, 2007

When I was growing up in Norwalk, Calif., in the 1950’s, World War II was just a game. At about the age of ten I had a shoe-box full of small olive-drab plastic soldiers and an impressive array of complementary military hardware — mortars, howitzers, jeeps, tanks, amphibious vehicles. When my neighbor Sammy came over with his own battalions to play war, we had a fine fighting force.

The dusty ground beneath the nectarine tree in my family’s back yard became some Pacific island, and we would dig miniature foxholes into it and riddle it with mud-scattering explosions scooped out by hand. Our tongues and throats perfected convincing sound-effects for screaming bombs, machine-gun fire, flame-throwers, and what he called "hang grenades." When our dads came home from work and our moms summoned us to our respective dinners, we’d call a cease-fire and troop home caked with dirt and pumped with adrenaline from our after-school battle for America.

In the evening, washed and fed and homework done, I’d convince my parents to tune the TV in to "Victory at Sea," "Navy Log," "The Big Picture," and other wartime documentaries and dramas. That way, before hitting the subject in school, I learned about Pearl Harbor, Corregidor, Guadalcanal. I heard Hitler, Churchill, and FDR and tried to imitate their oratorical styles. I saw the night bombings and the dogfights and the kamikaze attacks, and the bodies of dead soldiers bobbing in the surf.

War was a terrible beauty to me then, cinematic, shot by the camera behind my eyes.

Sammy’s dad had served in the Pacific, which is why in our backyards we always fought "the Japs," whoever they were; through random conversation he had taught his curious son about military equipment and gear, and supplied him with cuss-words and randy ditties that appalled my parents. My dad had spent the war running turret lathes in a machine shop in South Gate; he was rejected for service because of anemia (at his induction physical, he often recalled, the medical officer looked at his chart and said, "Your blood count is so low you should be dead"), but he felt perfectly healthy and always considered it something of a lucky mistake.

There were many veterans among my parents’ friends, and the stories they told after a couple highballs only reinforced my romantic image of war: the camaraderie, the pranks, the exotic places they went to on leave. If they had been wounded in battle, they neither showed it nor discussed it, and they never said a word about the killing they may have seen and done. How could they? Highballs and horror don’t mix.

Years later, seeking perspective, I would try to get veterans to talk about their battlefield experiences. Most were evasive. About 20 years ago, a man who was in the D-Day invasion told me bluntly: "It was beyond words, and I try not to think about it. What I can’t get rid of is the stench of dead bodies. To this day it’s still in my nostrils." At around the same time, a neighbor in Van Nuys, whose loopy, happy-go-lucky husband came home from Germany with a metal plate under his skull, often confided, "I didn’t sleep last night. Lou was screaming in his dreams. He’s still fighting the war there."

These recollections from years past are being unlocked by Ken Burns’s documentary The War, airing on PBS this week. The film, hyped for months and supplemented by local productions such as California at War and Latino Stories of World War II on KCET and New York War Stories on WNET here, tries to get at The Big Picture by looking at the small, from the point of view of ordinary Americans who were caught up in it.

Burns has said that he could not have done this type of study even ten years ago because so many veterans were still reluctant to talk. Suffering from what we now call Post-Traumatic Stress Disorder, they could cope with what they had seen and done, and live apparently normal lives, only by banishing it from waking consciousness and releasing it in their nightmares. Burns’s own father opened up to his history-obsessed son just once, close to his death in 2001, as the project was beginning to take shape. Only now, Burns claims, with veterans dying at the rate of a thousand per day, have more been able and willing to loose their tongues and share their memories with the world before they are lost forever.

This is not entirely true, of course; over the last decade there have been many first-person accounts of specific battles and domestic wartime life presented on PBS and the History Channel, but it is Burns’s scope that is significant — the tracking of individuals from the war’s beginning to its end, and the changes it effected in their lives and in the life of the four small towns they came from.

Burns determinedly avoids the usual documentary commentary "from above" — the talking heads in this film are not historians and military experts but the octogenarians who had known the war personally, in the battlefield, in the factory, and in the case of Japanese-Americans, in the internment camps. The historians stay behind the scenes, however, in the well-researched narration and vintage film footage that put the experiences of these people in context.

Viewed for itself, the film is often unsatisfying, at least in the two episodes I had seen before sending this column off. Burns too frequently returns to quotidian small-town life, leaving the film lacking the sustained tension that should drive the heroic story forward. Wynton Marsalis’s sound-track is decidedly dull, even though the music of the Big Bands was the cement that kept home and abroad together, and there is no haunting, memorable theme-song unifying the film as "Ashokan Farewell" did in his 1990 documentary The Civil War. And the consistently monotone narration by Keith David is positively soporific.

The War is worth watching, less for what it is as a cinematic piece and more for what it opens in our collective consciousness — for those who lived through the period, for those like myself who knew it in shadow form from their parents, and for the generations for whom it is as distant as the Revolution. It taps the well of memory.

I lost track of Sammy a long time ago. I wonder if he’s been watching.


September 20, 2007

Iraq Week at the capital has come and gone, and what a week it was. General Petraeus and Ambassador Crocker flew in to give their mandated progress reports to Congress. President Bush followed up with his own address to the nation. The candidates had their opportunity to grandstand. And MoveOn.org revealed its hubris with that vile "General Betray Us" ad. (Punning on a person’s name is the lowest form of the lowest form of humor, and it may cause a self-immolation comparable only to Howard Dean’s "I Have a Scream" speech that shot down his high-flying presidential campaign in 2004.)

What can we make of the week that was?

While there is no doubt that General Petraeus’s report and recommendations were his own, as he made a point of emphasizing at the hearings, there is also no doubt that he and Ambassador Crocker were part of a masterful administration plan to shift the political ground. Nowhere was it clearer than in the president’s address last Thursday.

The speech was laced with the usual self-vindicating hyperbole about the terrorist threat (though the general, when asked by Senator Warner of Virginia whether operations in Iraq are making America safer, said, "I don’t know"). But the word "victory," so long a staple of Bush rhetoric, was conspicuously absent. It was replaced by the equivocal "success," "succeed," and "successful," which he used eleven times. And success, even to the gung-ho president, was now only a grasp at straws. In Anbar province, he proudly noted, some Sunni sheiks, formerly enemy insurgents, have formed alliances with American forces to flush out (also Sunni) Al Qaeda cells (of course, one of those sheiks was murdered the very day of the speech, and the alliances themselves may be of only temporary convenience, typical of the fluidity of the sects and sub-sects all over the country). And yes, he said, "the Iraqi army is becoming more capable, although there is a great deal of work to be done to improve the national police" (actually, the report recently issued by the James Jones commission called the police force "operationally ineffective" and recommended it be disbanded).

These "successes" — neither Petraeus nor Crocker were quite able to call them that — were enough to project sending 5,700 troops home by Christmas and the rest of the surge-force home by July.

"The principle guiding my decisions on troop levels in Iraq," Bush declared, "is ‘return on success.’ The more successful we are, the more American troops can return home." But, he cautioned, "success will require U.S. political, economic, and security engagement that extends beyond my presidency."

So there you have it: A few bars of "I’ll Be Home for Christmas" and many more of "The Long Run."

Despite the paucity of good news, the political ground shifted. Even though the promised troop reductions are nothing more than a validation of the word "surge," they sparked a faint flame of hope for an end to the occupation. And conversely, even though the predictions of keeping troops in Iraq for years to come were something no one wanted to hear, they gave the sense that at last our leaders were speaking realistically about the situation, with neither the blather of "victory" from the president nor the pseudo-bliss of immediate pullout from dovish Democrats. The president will get his funding. His successor will get it too.

Still, there is no comprehensive plan. The general was there, the ambassador was there, the secretary of defense made the talk-show rounds at the end of the week — but where was the secretary of state? If there ever were a time to reveal a true change of course, a vision for the future calling into play all the nations with a stake in the stabilization and viability of Iraq, this was it. But there is no such vision, from anywhere. This isn’t about Anbar or the body-count in Baghdad; it’s about Iran and Turkey and Syria and Jordan and Israel and Lebanon and Saudi Arabia and Dubai and the European Union too. It’s about that despised and sidelined but potentially effective body, the United Nations. And it’s about those two million Iraqi refugees whose flight has sucked the talent and creativity from the country.

And lastly but not leastly, it’s about the Democrats. It is a sad thing to see that sorry bunch of presidential aspirants sniping at each other when they should be working as a team to do what the current administration has failed to do: present to the public a unified eight-year plan for peace across the Middle East. They have no lack of resources for this task - the brightest military and diplomatic minds have been churning out books, articles, and proposals with astounding speed and with no lack of insight. Why couldn’t the candidates and the Democratic leadership subvert the dysfunctional primary process, gather in a smokeless room, and come up with a candidate, a cabinet, and a plan?

In a phrase that tarred the first President Bush, it’s "the vision thing." For four years now, neither the administration nor its opponents have been able to see past the next hurdle. Beyond the fuzzy utopia of "a free Iraq," the "way forward" needs a broad and practical program.


September 13, 2007

With just 15 months left in office — a twinkling of an eye to history, an eternity to most of the world today — George W. Bush is looking toward his legacy. Lyndon B. Johnson looms in his side-view mirror, closer than he may appear.

Nearly 40 years ago, Johnson, battered by Vietnam, virtually abdicated his presidency and left office in humiliation. Bush is determined not to let Iraq do that to him.

You can see that attitude in his August 22 speech to the Veterans of Foreign Wars. He refuses to follow the path of withdrawal in Iraq, as Johnson and his successors Nixon and Ford did in Vietnam. "Then as now," he declared, "people argued the real problem was America’s presence and that if we would just withdraw, the killing would end."

After quoting several anti-war naysayers in Congress and the press, he resumed: "Three decades later, there is a legitimate debate about how we got into the Vietnam War and how we left. . . . Whatever your position is on that debate, one unmistakable legacy of Vietnam is that the price of America’s withdrawal was paid by millions of innocent citizens whose agonies would add to our vocabulary new terms like ‘boat people,’ ‘re-education camps,’ and ‘killing fields.’"

Another price of withdrawal, he went on, was the loss of "American credibility" that haunts us to this day; he quoted the taunt of Al Qaeda leader Zawahiri that "there is no hope in victory. The Vietnam specter is closing every outlet."

His explicit linkage of Iraq to Vietnam was characteristically weak: Not a single person today argues that withdrawing from Iraq would end the killing; and as for the Iraqi version of "boat people," refugees are not waiting for an American exit — already two million have fled the country, with another million displaced internally.

However, it was a surprise to many that he compared Vietnam to Iraq at all, after repeatedly denying any such thing, particularly the word "quagmire." The way he finally compared them was even more of a surprise: Was he actually implying that the U.S. should have stayed in Vietnam?

Robert Dallek, author of biographies of both Johnson and Nixon, complained to the press with incredulity, "What is Bush suggesting? That we didn’t fight hard enough, stay long enough? That’s nonsense. We were in Vietnam for ten years. We dropped more bombs on Vietnam than we did in all of World War II in every theater. We lost 58,700 American lives. And we couldn’t work our will."

Of course, the Bush speech was no historical analysis, only dangling assertions begging to be filled in by inference. But those bald statements excite the mind to "alternate history," that fascinating game of "what-ifs": What would have happened if the U.S. had remained in Vietnam?

Johnson’s consistent policy for Vietnam was "containment," not conquest. As Truman had done in Korea, Johnson sought to keep Communist regimes from toppling their pro-Western neighbors and setting off the feared "domino effect." The incursions and bombings in North Vietnam were acts of deterrence, not aggression. But the North Vietnamese could not be contained; they were focused on unification with the South, on their terms alone.

In late March of 1968, in the same speech in which he announced he would not seek re-election, Johnson called a partial halt to the bombing and invited peace talks with the North. What if he hadn’t? Would "staying the course" have eventually so worn down Northern resources and morale that Ho Chi Minh would have initiated peace talks instead? Possibly - but at what further cost? By that time, conditions at home were verging on anarchy, and within that same year Martin Luther King and Robert Kennedy would be dead; could Johnson have continued the war of attrition without destroying his own country in the process?

There is a second what-if: What if Johnson had changed his war policy from containment to conquest? What if he had gone for the throat of North Vietnam as some "hawks" had suggested, obliterating Hanoi, eliminating the Communist leadership, and forcing unification on his own terms? Would that "victory" have saved his presidency, reversed the nation’s slumping morale, and restored the image of America the Invincible? On the other hand, would victory in Vietnam have precipitated a disastrous conflict with the Soviet Union and/or created an American colony, angry and restive and ready to explode anew?

Because of its complexity, the possible alternative histories of Vietnam are endless. What Bush’s own may be, we’ll probably never know, just as we do not know his scenario for "victory" in Iraq, over which he has some actual control. But it is clear that like Johnson in Vietnam, Bush has no real vision of victory, despite his rhetoric. He and his agents General Petraeus and Ambassador Crocker, who are testifying to Congress this week, cling to the hope that given more time, U.S. military presence will prevent total chaos and allow for a kind of natural shakedown among the warring factions that will eventually result in social reconfiguration and political stabilization. That’s hardly victory, but then again, this is hardly a war anymore; it’s a waiting game that may take decades to resolve.

An air of resignation seems to have settled on the country. Calls for immediate pullout grow fewer and fainter as the reality of the situation sets in. With Petraeus and Crocker paving the way, President Bush is slowly positioning himself to hand the tar-baby of Iraq to his successor without acknowledging defeat.

He has Lyndon Johnson in the mirror.


September 6, 2007

In his speech to the Veterans of Foreign Wars convention on August 22, President Bush made an unprecedented appeal to history to support his policies on Iraq and Afghanistan. While it was atypical for him to make any reference to history at all, it was typical for him not to reason from premise to conclusion: Thanks to American occupation after World War II, he summarily stated, Japan was transformed from a totalitarian state into a thriving democracy; thanks to American intervention, South Korea was not overrun by the North and became an economic powerhouse; and conversely, thanks to the anti-war movement, South Vietnam was overrun by the North, causing "agonies" to "millions of innocent citizens." History in the Bush mind is a simple "was/is," with no steps in between — very much like his "was/will be" vision of a democratic Iraq.

His use of history in this speech was, of course, not analytic but polemic: to denounce naysayers in Congress and the press, past and present. "Will today’s generation of Americans," he asked, "resist the lure of retreat, and will we do in the Middle East what the veterans in this room did in Asia?"

Despite his omissions, or even because of them, the speech is valuable as a challenge to refresh our own historical knowledge and memory and do the reasoning that he refuses to do. It’s a ready-made assignment for a history class.

In last week’s column I tried to fill in Bush’s blanks on post-war Japan, concluding that its dramatic transformation into a Western-style democracy was the result of a comprehensive American reorganization and re-education plan for Japanese society, something entirely lacking in the administration’s policy on Iraq.

What about his history lesson on Korea? "After the North Koreans crossed the 38th Parallel in 1950," Bush begins, "President Harry Truman came to the defense of the South." Not true; the United Nations did it. Rather than bypassing the U.N. as Bush did in invading Iraq, Truman immediately went to the Security Council for resolutions condemning the Communist aggression and asking member nations to assist the Republic of Korea in its self-defense. Though the U.S. was given overall command of the operations and supplied much of the manpower and equipment, 15 other U.N. members sent troops and over 50 contributed medical or economic aid.

On July 19, 1950, less than a month after the Communist invasion began, Truman addressed the nation on TV. As David McCullough writes in his biography, Truman, the president stressed that American forces "were fighting under a U.N. command and a U.N. flag, and this was ‘a landmark in mankind’s long search for a rule of law among nations.’"

"Nor was he the least evasive about what would be asked of the country," McCulloch continues. "The ‘job’ was long and difficult. It meant increased taxes, rationing if necessary, ‘stern days ahead.’" Contrast the Bush approach to national commitment to "getting the job done" in Iraq.

But what was the job in Korea? After pushing the invading forces back across the 38th parallel in October of 1950, U.N. commander Douglas MacArthur proposed continuing the drive northward to unite the two Koreas. Since unification was an expressed goal of the United Nations that had been blocked by the Soviet Union, Truman agreed. Then Communist China entered the conflict, surprising the advancing U.N. forces, driving them back down the peninsula, and eventually retaking much of the South. MacArthur pleaded to employ Nationalist Chinese troops from Taiwan and even to use up to 50 atomic bombs to take out major cities on the Chinese mainland. As summarized by McCullough, General George Marshall warned Truman that "the United States must not get ‘sewed up’ in Korea, but find a way to ‘get out with honor.’" "There was no doubt in my mind," Truman later wrote, "that we should not allow the action in Korea to extend to a general war. All-out military action against China had to be avoided, if for no other reason than because it was a gigantic booby trap." He reconsidered the unification goal, and when MacArthur questioned his judgment, Truman fired him.

MacArthur was unrepentant. In his "old soldiers never die" farewell speech to Congress in April of 1951, he defiantly declared: "In war, indeed, there can be no substitute for victory. There were some who, for varying reasons, would appease Red China. They were blind to history’s clear lesson, for history teaches, with unmistakable emphasis, that appeasement begets new and bloodier war."

Under the new U.N. commander, General Matthew Ridgway, Communist forces were turned back and the situation eventually stabilized into skirmishes and mutual bombardments across the 38th Parallel while the U.N. and the Communists squabbled over the terms of a truce. An armistice establishing the present borders of North and South Korea was at last signed in July of 1953.

In his speech, Bush quotes (but, strangely, does not name) many politicians and pundits critical of U.S. involvement in the Korean conflict, an implicit chastisement of his own critics on Iraq. He never utters his favorite word, "victory"; how could he? He would sound more like the fanatical MacArthur than Truman, who thought it wiser to settle for a stalemate that contains violence rather than a rash move in the name of honor that spreads it.

What Bush also misses is what may be the real comparison of Korea to Iraq: that it has taken 50 years and countless coups and juntas for South Korea to settle down, and that there are still 30,000 U.S. troops stationed there.

"I recognize," Bush concludes, "that history cannot predict the future with absolute certainty." In this case, it might.


August 30, 2007

The speech George Bush delivered to the Veterans of Foreign Wars convention in Kansas City, Mo., last week was surely the most intriguing one of his presidency. It is actually thought-provoking - something extremely rare from any contemporary politician - though I’m not sure he intended it to be. In it, he turns to history to buttress his position on terrorism in general and Iraq and Afghanistan in particular — this from a man who, aside from the tired Munich/appeasement cliché applied by everyone to any confrontation deemed non-negotiable, has demonstrated no prior regard for history other than the history he is making for himself.

The speech is so uncharacteristic that you have to wonder who got his ear, who wrote it for him. Whoever it was, we should be grateful. It is still pure, unnuanced Bush, but with consciousness expanded. His understanding of the American dynamic in post-war Japan, in Korea, and even in Vietnam — a subject he’s heretofore denied as an analogy to his own adventures in Iraq — now provides us with a broader perspective on his worldview, self-image, and sense of destiny.

After his introductory remarks, in which he proudly declares that "I stand before you as a wartime President" engaged in "a struggle for civilization," he initiates his thesis with "a story that begins on a sunny morning, when thousands of Americans were murdered in a surprise attack" by an enemy that "despises freedom" and "turns to a strategy of surprise attacks destined to create so much carnage that the American people will tire of the violence and give up the fight.

"If this story sounds familiar," he continues, "it is — except for one thing": The enemy he means is not Al Qaeda (or Saddam Hussein, whom he neglects to mention) but Imperial Japan. The attack is not 9/11 but Pearl Harbor, and the ultimate result of the American response to it was a Japan transformed from a totalitarian state into a Western-style democracy that "has brought peace and prosperity to its people" and "helped jump-start the economies of others in the region."

How did this transformation occur? Bush does not say. What he does say, with copious citations, is that critics both inside and outside the Truman administration believed "Japanese culture was inherently incompatible with democracy," and that "Americans were imposing their ideals on the Japanese," especially in regard to women’s suffrage and freedom of religion.

"You know," he wryly concludes with one eye to the Congress and the other to the academy, "the experts sometimes get it wrong."

Typically, he lets his implicit comparison between the American occupation of Japan and the American occupation of Iraq dangle in the air, presuming it self-evident, without need for further thought. But such allusions should send every citizen to the history books or their on-line equivalents, seeking either to confirm his assertions or call his bluff.

My own brief research into post-war Japan brought up the following. Check it against U.S. policy in post-war (i.e., after the toppling of the Hussein regime in May, 2003) Iraq.

When the United States occupied Japan in the fall of 1945, complete control over the society was placed under General of the Army Douglas MacArthur as Supreme Commander for the Allied Powers. Immediately, what was left of the Japanese military was demobilized and all war equipment was seized and destroyed. Conspicuous war criminals were tried by military tribunals, but all governmental structures and most of their personnel were left in place. Then the reconstruction began.

Bush is right that the prevailing strategy of the Truman administration for the rebuilding of defeated Japan was to actively shape it into a Western-style democracy. The plan was headily ambitious: The entire culture was targeted for transformation, from top to bottom.

From the start, political parties cooperative with the occupation’s goals were allowed to emerge, and the first free elections were held in April of 1946, eight months after the surrender. Later that year, an American-drafted constitution was adopted by the interim Japanese government. It relegated the Emperor’s once all-powerful role to a "symbol of the unity of the state," disestablished Shinto as the state religion and guaranteed the other freedoms and legal safeguards of the U.S. Bill of Rights, decentralized the government, and officially abolished the military, making Japan a pacifist state.

In addition, social programs modeled on Franklin Roosevelt’s New Deal were set in place: The huge landed estates were redistributed among the tenant farmers, the largest corporations and financial institutions were broken up, laws regulating labor practices were enacted, and a social security system was established. Further, rigid traditional family structures were tempered by laws guaranteeing the rights of wives and grown children. Most especially, the educational system was completely "de-ideologized" to encourage historical objectivity and a respect for human rights and democratic government.

The American command implemented these goals indirectly. On every level of government, American advisors, both military and civilian, communicated the command’s directives as "suggestions" to the Japanese officials with whom they closely worked, and who through fear or deference eagerly carried them out.

Despite the reach of this enterprise, there was little opposition. Perhaps the Japanese people, after centuries of imperial rule, were culturally conditioned to respect and accept authority. Perhaps, as Bush believes, they responded to "the universal appeal of liberty." In any case, they seemed to see something in the Americans and their policies that they liked.

In September of 1951, six years after the surrender, the United States relinquished control and returned full sovereignty to a Japan that was adjusting to its new identity and poised for prosperity.

Bush, we see from his speech, takes Japan’s revitalization as one of his models for the future Iraq and Afghanistan, if only he were allowed to "finish the job."

But what job is that? There may have been parallels to Japan when the totalitarian regimes of the Taliban and Saddam were overthrown, but Truman and MacArthur did not allow Japan to dissolve into chaos in the name of self-determination. Now there may be no parallels at all.


August 23, 2007

It’s the summer of the honeybee.

In my eight years of keeping bees, I’ve not seen such intense interest in these insects as I have this season. Enticed by best-selling books like The Secret Life of Bees, people have become enamored; alerted by apocalyptic media coverage of the disappearing bees, people have become concerned. Local folks see me at work with my three hives at Genesis Park Community Garden and shout over the fence, "Hey, Mistah, what’s happening to the bees?" Reporters Google "Beekeeping New York City," find something written by me or about me, and track me down for interviews. Photographers and film-makers documenting unusual aspects of urban life lug over their cumbersome equipment and shoot closeups of the un-self-conscious creatures going about their work and long-shots of the self-conscious beekeeper in coveralls and veil, pacifying the hives with smoke. The talks and gloves-on workshops that I give to community gardeners and to visitors at Wave Hill, the forested mansion overlooking the Hudson River in the Bronx, always fill up quickly.

Children are particularly attracted. Last Thursday the Bronx Helpers, a neighborhood youth organization, brought a dozen rambunctious pre-teens to the garden for a beekeeping demonstration. When they arrived, they were twitching in panic, batting away flies and yellow-jackets and screeching in fear of stings. Two hours later, after I’d opened a hive and brought combs thick with docile and disinterested bees right in front of their faces, they left with a new, positive attitude toward the honeybee.

Last Friday I brought some equipment and an observation hive - a slim wooden box with plexiglass sides for viewing the bees at work on their combs - to Camp Kiwi, a suburban summer refuge 50 miles north of here. Kids and counselors alike were fascinated, seeing for themselves what they’d read about in school and watched on the Discovery Channel. I felt a bit ashamed that some of the children knew more about bees than I do.

The growing interest in the local-foods movement brings people to the garden to taste and buy honey. Last week, Santa Cruz surfer-turned-restaurateur Jim Denevan paired it with cheeses and wines from Long Island at his annual gourmet dinner to benefit New York urban farmers. Allergy sufferers claim near-miraculous cures from daily doses of honey made in their vicinity - theory has it that honey made from the same flowers that provoke their allergic reactions acts as a natural antidote.

Others come to learn about beekeeping first-hand, hoping to set up a hive or two in their back yard or community garden, or even to try a career at it. On Saturday a Haitian man and his eleven-year-old son helped me harvest honey and extract it from the comb. The son was serious and attentive, wide-eyed with wonder; the father was practical and inquisitive about equipment and hive-management; he and a partner envision a large beekeeping operation in Haiti.

I spent the first week of August at the annual convention of the Eastern Apicultural Society, held this year at the University of Delaware at Newark. A couple hundred beekeepers from Maine to Florida, from Long Island to Iowa, gathered to hear scientists discuss the latest diseases, mead-makers and chefs share their honey-based recipes and give generous tastings, and craftspeople demonstrate making candles, soaps, and tinctures of propolis.

Usually I can’t stand conventions, but this one was different. There is something about honeybees that shapes the people who work with them and brings them together in a kind of shared priesthood, mediating the natural and the human. Beyond all the science - the experts reported they still aren’t sure why the bees are disappearing - and the practicalities - how best to purify beeswax for making candles - were the lunchtime conversations that sometimes bordered on the mystical. A wizened career beekeeper from upstate New York, after recounting his woeful experience with bears ("They outsmart even electric fences to get at that honey. I tell you, Winnie the Pooh they ain’t"), concluded, "Still, I wouldn’t be doing anything else. I love working with the bees in the apple orchards in spring - open sky, fresh air, smell of blossoms. Nothing better." A young marine archaeologist from Delaware told me that when she decided to start beekeeping last spring, she set up an empty hive-box in her back yard and was about to buy a colony from a local bee farm when a swarm of wild bees came out of nowhere, found the box, and settled in. Next month she’s having a priest bless them.

At the whimsical social event called the Bee Bawl, where prizes are awarded for the best bee costume (imagine that happening at a wasp convention), the group celebrated almost a dozen wedding anniversaries of couples who had met at the convention in years past. Two engagements were also announced.

It’s the summer of the honeybee.

Sunday, August 17, 2008


August 9, 2007

I drink tap water. In the fridge there’s always a full pitcher, plus several refilled plastic bottles ready for the car and the backpack. Until recently, there’s never been any particular principle for my doing this, except the pleasure principle.

Improbable as it seems, New York City water is delicious, if such a word can be applied to water at all. Crisp and neutral-tasting, it comes to us from snow-pack and rainfall in the Catskill Mountains a hundred miles to the northwest. It is held in a network of reservoirs buffered by city-owned forest land to protect them from encroaching development and its pollutants. It travels to the city through one huge tunnel, propelled almost exclusively by the force of gravity. This water is so pure that last week the U.S. Environmental Protection Agency granted a ten-year waiver of the federal requirement that all drinking water be filtered. Only four other large cities also qualified for the exemption: Boston, Seattle, Portland, Ore., and San Francisco.

The present system began in 1842 with the Croton Reservoir in nearby Westchester County, and was gradually expanded to include 19 reservoirs and three lakes. It now provides a billion gallons of water a day to New York City and other communities in its area.

On hot summer days, some New Yorkers still find ways to open what are supposed to be tamper-proof fire hydrants, sending tons of pristine mountain water to the sewers. When I first came to New York from drought-ridden Southern California in the 1990’s, I was appalled by this practice. I soon discovered that opening fire hydrants was a summer tradition, like stickball.

Even with a municipal swimming pool nearby, nothing was surer to start a spontaneous block party than a gushing hydrant. Little public thought was apparently given to the dangers involved: Open hydrants lower water pressure, and too many open at once keep both firefighters and people in upper-floor apartments from getting the water they need. In addition, every summer a number of children were injured or even killed by the powerful, natural force of water seeking its level.

In my first sweltering summer here, I was on Lafayette Street in Greenwich Village, where a short, bare-chested man with a can of Bud and a cigarette in one hand and a pipe-wrench in the other stood by a gushing hydrant. "Why are you wasting all this water?" I asked indignantly. "Because I want to," he snarled. Welcome to New York.

In the late 1990’s, in the face of a severe drought, this revered custom was challenged by Mayor Giuliani. Some hydrants were fitted with caps that only a special key could open, and others with sprinklers that provided a gentle spray for kids to play in. People found ways around both. Here in the Bronx, it is still common to find open hydrants manned by dads and grandpas, knocking kids down and giving spontaneous car-washes to passing traffic. I’ve gotten many a wash myself this way, before calling the Fire Department to report a violation.

While driving past open hydrants for my car-wash, I often notice people cooling their feet in the ready-made river and drinking, of all things, bottled water.

Native New Yorkers of a certain age are still very proud of the quality of their water. Even when the city was crumbling a decade or two ago, they’d say, "Well, at least the water is still great." Consumer Reports magazine, based in nearby Yonkers, runs blind tastings of pricy bottled waters along with a selection of municipal sources; almost invariably, NYC tap comes out on top.

But people here, as everywhere, have developed a fixation on the water bottle. It is their constant companion, filling the psychological place that smoking had in more innocent times: a momentary distraction, a break in consciousness from whatever thought or action happens to be happening. And it seems just as addictive, maybe even more so because of its acceptability. In my church, no less, it is now common to see people, including choir members, readers, ushers, and even the priest himself nipping at their bottles all through the service.

I suppose, as the bottled-water industry marketers say, it’s better to be addicted to water than to all those other substances we compulsively put in our mouths. But why not just refill the bottles you already have with tap water?

It would save money, for one thing. The mayor’s office, which has recently started a campaign to promote New York’s fine liquid product, points out that drinking your recommended eight twelve-ounce glasses of water a day will cost $1,400 a year if you go bottled, and 49 cents if you go faucet. And considering that many of the top-selling waters like Aquafina are, as Aquafina’s label abbreviates, "Bottled at the Source P.W.S." - Public Water Supply - it’s hard to imagine why otherwise savvy shoppers would choose the stuff from the store.

There are other costs involved too, including the impact of plastic bottles on the environment, from manufacture to shipping to disposal, and the grim fact that 60 percent of the world’s population has no reliable access to safe drinking water.

Bottled water is one of those issues that people never thought was an issue. Slowly we are putting the pieces together and beginning to discover that it is.

I’ve been drinking New York City tap water on the pleasure principle. Now other principles are entering in.


August 2, 2007

MILWAUKEE - Wisconsin Avenue, Milwaukee’s Main Street, is anchored by two radically different structures, the Pabst Mansion on the west and the Milwaukee Art Museum complex on the east. They’re tales of two cities.

A family wedding brought me back here after years away. The Milwaukee I remember from long-ago visits probably remains the rest of the country’s stereotypical image, the town that beer made famous. Miller, Blatz, Schlitz, Pabst all had their major operations right downtown, and the first thing you did as a tourist was take a brewery tour or two, or three. You’d creep along the catwalk, peering at the enormous steel holding tanks, taking in the sweet smell of fermenting grains and watching the clattering march of bottles, cans, and kegs being filled and capped. Then you’d hit the tasting room and leave with a souvenir mug or coaster set.

You can still take a tour at Miller’s, as well as at a couple upstart micro-breweries, but the rest of the Big Guys are all gone. The last vestige of the old Pabst Brewery, its skyscraping brick chimney, is now half-dismantled; the huge vertical lettering on its side has been shaved down to "BST."

To fully appreciate the Era of Beer, visitors should make a point of touring the residence of Frederick Pabst, the mastermind of macro-brewing and macro-marketing in the nineteenth century.

What Rockefeller was to oil, Pabst was to beer. A German immigrant, he worked steamers on Lake Michigan until he married into the family of Phillip Best, who owned a small brewery in Milwaukee. In 1864, at age 28, Pabst bought a half interest in the Best Brewing Company and embarked on an ingenious expansion plan that included not only beer but all the German Gemütlichkeit that surrounds beer. By 1889, when he bought the rest of the company and changed its name to his own, Pabst had built a nationwide distribution system, a vast chain of taverns, several restaurants and resorts, and prestigious hotels from New York to San Francisco.

In 1892 he moved his family from their modest dwelling on the brewery grounds into an opulent new home on what was then aptly called Grand Avenue, Milwaukee’s Mansion Row. He also moved the Pabst Pavilion - an ornate, glass-domed basilica to beer that had showcased Pabst products at the 1893 Columbian Exposition in Chicago - next to his house, to be used as the family sun room.

Frederick Pabst died in 1904, his wife two years later. In 1908, the family sold the property to the Roman Catholic Archdiocese of Milwaukee for the residence of its own potentate, the archbishop. Ironically, the pavilion, lavishly adorned with terra-cotta hop vines, beer steins, and statues of the mythical gods of brew-making, was converted into a chapel.

Over the years, the neighborhood disintegrated. Most of the elegant mansions were razed and replaced with offices, stores and low-income housing. Fortunately, by the time the archdiocese decided to vacate, the preservationist movement had taken hold, and in 1975 the property was transferred to a non-profit organization which has been restoring the mansion to its original condition and furnishings.

That’s one Milwaukee, and other than the Mansion there isn’t much left of it, except for what is called Old World Third Street toward the city center. There you still find Usinger’s sausage store and Mader’s German restaurant with its oaken interior, its waiters in lederhosen, and its dense menu of pork, dumplings, and red cabbage.

At the opposite end of Wisconsin Avenue, on the shore of Lake Michigan, is the symbol of the new Milwaukee, the Museum complex. Pabst built more than his share of civic structures in his time - the Pabst Building and the Pabst Theater are still downtown landmarks - but this cluster of buildings is something he could scarcely have imagined.

The edifices he commissioned were of his day - weighty, dark, ornate, symbols of power and amassed wealth. The museum buildings defy gravity and capture light and air. The War Memorial Center, designed by the Finnish architect Eero Saarinen and opened in 1957, is a stark rectangle of concrete, floating on cantilevered stilts. Adjoining it is the building that put Milwaukee on the international architectural map, the Quadracci Pavilion. This masterpiece by Santiago Calatrava of Spain is a cavern of concrete and glass 90 feet high, shaded by its signature "wings" that open and close like an umbrella. Photos of the place cannot do it justice; it is a dynamic of light and movement that must be experienced, and it’s reason enough to visit Milwaukee.

Between the mansion and the museum is a revitalizing center city, with apartments, shops, and restaurants replacing the breweries and grimy factories that for so long lined the Milwaukee River that the town was built around. Except for Frederick Pabst and his crowd, the Good Old Days weren’t as good as this.

To me, one sign of a healthy community is bookstores, and I was pleasantly surprised to find that despite the encroachment of the national chains, the locals are holding their own. Besides the five-store chain of Harry W. Schwartz Bookshops, there are several used-book specialists downtown, including the cluttered Renaissance on Plankinton Avenue and the scrupulously ordered Downtown Books on Wisconsin, to which I immediately became addicted.

Summer outdoor entertainment is extraordinary. The weekends are filled with festivals reflecting the city’s ethnic diversity: German, of course, but also French, Italian, Mexican, American Indian, African, Polish, Arab. Something for everybody.

Two Milwaukees - one to imagine nostalgically, and one to get to know.


July 18, 2007

Well, what would you do about Iraq? Go ahead: fantasize, free-associate, be imaginative, develop your own scenario. Your ideas are just about as good as those of the President, the candidates, the legislators, the generals, the so-called intelligence community, the media commentators, and the professors and ex-operatives who’ve written all those books that everybody talks about but nobody reads.

Remember "Stay the course"? It used to be Bush’s mantra of gritty resolve, contrasted with the wimpish "Cut and run." Now it’s an apt description of national immobility - foreign policy on autopilot, mere maintenance, acknowledgment of helpless stuckness. You know where Condi Rice has been lately? Staying on the golf course.

Though the President discarded "Stay the course" some months ago in favor of "creative engagement" or whatever he called it, he continues to repeat the word "victory." He once had a personal definition of victory, remember? - "An Iraq that can govern itself, sustain itself, and defend itself." But he’s gone minimalist lately; the word "victory" just dangles there out of his mouth, alone, a beautiful Cartesian self-evident truth. Victory is victory.

The problem with the word "victory" is that it is a war-word. And the problem with the country as a whole, from the politicians to the generals to the media to the man in the street, is that it thinks it is at war when it is not.

Yes, there once was a war in Iraq. It took place in 2003, beginning with the invasion and ending with the capture of Saddam Hussein. "Mission accomplished," the banner on that aircraft carrier proclaimed. We won the war four years ago.

Then came the period of occupation, the deployment of forces to maintain order while elections were held, a new government was created and recognized, a constitution was devised, and an army and police force were re-formed. Had those steps resulted in an Iraq that could govern itself, sustain itself, and defend itself, the occupation would have ended right there. The troops would have gone home, proud that another mission for good had been accomplished.

But that has not yet happened, nor is there much hope that under present circumstances it ever will. The work of the armed forces in the reconstituted Iraq is mostly policing, holding together what we keep telling ourselves is an autonomous government and an independent nation, while caught in the middle of multiplying factions waging their own mini-wars against each other, and us if we’re in the way.

With policing, as every cop from New York to L.A. knows, there is no such thing as victory. Fighting crime is forever. The only question for us in Iraq now seems to be: How long will we let forever go on? And on that question, everyone is paralyzed.

Just look at the Congress, with its dozens of bills floated but unpassed: immediate withdrawal, drawdown in 120 days, wait till October and see, pull into enclaves to protect the borders, draw up some more benchmarks and see, etc., etc. The only bills that pass are the full-funding ones, continuing the vicious circle.

To break the impasse, one must look again at why the Bush administration invaded Iraq in the first place. As we all know now, the weapons of mass destruction were just a pretext. The invasion was primarily a social experiment, an attempt by the neoconservative cabal to implement their utopian conviction that given the opportunity, all human beings will naturally choose democracy. Like all utopias, the experiment died a-borning. Democracies are not natural; they must be formed in a society’s collective mind over decades. Still clinging to the naïve hope that given enough time, Iraq can form itself into a model democracy for the Middle East, the great superpower has tied its own hands.

Given the possible alternatives of withdrawal - social chaos or radical Islamic takeover - why not consider the opposite approach: good old-fashioned American imperialism?

Bush says he admires Theodore Roosevelt. What would T.R. do in Iraq? Typical of his era, he was unafraid to take over countries and shape them to his purposes. This is a venerable American tradition, from the Monroe Doctrine forward. As he did with Panama and across Latin America, he’d depose the Iraqi government, set up a puppet regime to serve American interests, have the opposition disposed of, appropriate the oil fields, put everything under his thumb - in short, colonize Iraq.

Maybe it’s time to quit accommodating and just take over. To hell with those benchmarks; install a government that will do what we want it to do, now. To hell with the "surge," too; it’s emblematic of the chronic short-term thinking that has plagued this project from the start.

If the President had all the guts and resolution he always claims to have, he’d set before the people a 20-year plan for the pacification and reconstruction of Iraq.

Forget "embedding" a few U.S. troops in that clownish Iraqi army; make it a subset of our own. Commit all the forces possible, and immediately reinstate the draft to guarantee a long-term supply.

At the same time, give Iraq a New Deal, a massive Marshall Plan. Put everybody to work, WPA-style; fill the universities with American visiting professors of economics, political philosophy, and law; replace Halliburton with real businesses, foreign and domestic, large and small. And as partial payment, make every drop of oil the property of the U.S. government, with the oil companies as contractors.

The neocons thought that an Iraq free of Saddam would Americanize itself. They were wrong. Only America can Americanize Iraq. Make it a possession like Hawaii was - or California, for that matter. Teach them to think like Americans, show them how to play baseball, and give them their freedom when we’re satisfied they can handle it.

American imperialism just ain’t what it used to be. If you’re going to invade a country, do it right and do it all.

Now there’s a fantasy for you. Got any better ideas?


July 5, 2007

Sicko. When I first saw the title of Michael Moore’s new movie on the state of health care at home and abroad, I thought: Uh-oh, he’s doing it again, squandering an opportunity to be a real player in a critical national issue by veering to his base of leftist absurdists and letting the right write him off as the Holy Fool. Why not call it something less flippant, more engaging, more Al Gore-like? Well, maybe not.

Moore likes allusions to the classics. His last work, Fahrenheit 9/11, was a faint thematic echo of Ray Bradbury’s dystopic book and film. Sicko may well be a distant homage to Alfred Hitchcock. Perhaps he wants our subconscious to imagine the American health care "system" as a gargantuan Bates Motel, with the insurance companies as the bloodthirsty proprietors and the federal government as their embalmed mother.

However off-putting the title may be to some, the topic itself is so timely that the movie may draw the curious of all persuasions into the theaters just to see what Moore does with it - or to it.

They won’t be disappointed. Aesthetically, the film is great satire, pitting irony against irony at every turn. Politically, it is provocative in the best sense, challenging the country and its leaders to quit dancing around the question and face it head-on: Why not universal health care?

Moore began his project with a simple little inquiry on his website, michaelmoore.com: If you’ve had any problems with your health insurance, contact me. Within weeks, e-mail responses ran to the tens of thousands. With his characteristic genius for culling through mountains of data to find the perfect illustrations, he selected witnesses with both telling cases and good camera presence, people like you and me with stories like yours and mine. Among them are a middle-aged couple forced into bankruptcy by life-threatening diseases, moving in with one of their children; a wiry, wry retiree working as a supermarket janitor to meet his drug expenses; a bearded, good-natured craftsman who cut off two of his fingertips at his table saw and left them that way rather than pay the $75,000 his insurance company wouldn’t; a sampling of ailing 9/11 rescue volunteers excluded from the government’s paltry compensation plan; and, from a widely reported news story, the deranged indigent woman dropped off at the door of the Union Rescue Mission in L.A. by Kaiser Permanente Hospital.

Unlike his previous films, Moore made no attempt to barge into the offices of the corporate big shots to gather embarrassing off-the-cuff comments, mostly because he didn’t have to: Insurance company employees were coming out of the woodwork by the hundreds with plenty of heart-rending tales of their own. There is, for example, the young customer service rep, tearfully describing her daily agony of knowing that her desperate applicants would be denied coverage because of preexisting conditions; and then the former adjuster, burdened with guilt over the thousands of people whose claims he rejected - and received bonuses for.

Also unlike Moore’s other movies, this one actually shows a very bright side to the issue - unfortunately, he has to go across the border to find it, always with a clever segue. First he drops up to Canada, where a single mother from Detroit has established a common-law marriage of convenience with an acquaintance and drives up as needed for checkups and treatment for herself and her daughter. Then he flies off to England, where an American Beatles fan who broke his arm while attempting a somersault across Abbey Road (all documented on a friend’s home video) found to his amazement that the British National Health Service fixed him up immediately, for free. Then down to France, where young American expatriates relax with Moore in a lovely restaurant, telling stories of French health care complete with doctors who make house calls and nannies supplied by the government not only to do child-care but the laundry too - all without a bill.

And finally, across to Cuba, where Moore loads his dispirited 9/11 volunteers onto a motorboat, sails within bullhorn distance of the Guantanamo prison, and pleads for admittance, since Pentagon officials had proudly testified that the enemy combatants there receive top-notch medical care, including dentistry, eyeglasses, and routine annual colonoscopies, at full government expense. Failing that, he takes them to a Havana hospital, where they are welcomed effusively and given all the MRIs and dental work and medications denied them in the U.S.A.

The Cuban episode is a reversion to those stretch-the-truth shenanigans of his earlier works, quite obviously contrived and thus the weakest part of the film. It is also the cleverest.

After all this, Moore pops his simple question. He reminds us of all the public things that benefit us throughout our lives and over which there is no controversy: free public education, free public libraries, free public museums, free public fire and police protection. Why not, he asks, free public health care?

Of course, for all of the above, "free" is never really free, and Moore, the polemicist, sidesteps the problems of financing and cost-containment both abroad and at home. What he does do is to brilliantly illustrate the real meaning of "free" in these cases: The complete leveling of society, where rich and poor have equal access to those things essential to human well-being. Yes, taxes are higher in countries with universal health care, but most of their citizens agree that it’s a small price to pay for peace of mind.

As the population ages, general sentiment in favor of universal health care keeps growing. Nevertheless, it will take a popular revolt, led by visionary public officials, to break the back of big insurance and big pharma. This is not forthcoming - just review the half-hearted positions of the major presidential candidates on the issue, all of whom are beholden to the drug and insurance lobbies.

In the final scene, Michael Moore, that corpulent gadfly with the basset-hound eyes, lumbers up the steps of the U.S. Capitol with his bag of dirty laundry. If the French can do it, why can’t we?