Drone Casualties: The New Body Count (Updated)

A method of war, not a strategy

W.J. Astore

In President Obama’s drone wars, how many innocent civilians have been killed?  An official U.S. government report is expected to suggest that roughly 100 civilians have been killed in drone strikes since 2009, a surprisingly small number.  According to NBC News:

The Long War Journal, a project of the right-leaning Foundation for Defense of Democracies think tank whose numbers tend to be the most favorable for U.S. policy-makers, tallied 207 civilian casualties since 2009 in 492 strikes in Pakistan and Yemen. That does not include strikes in Somalia and Libya, which the Obama administration includes in its count of around 100 [civilians killed].

New America, a left-leaning Washington think tank, counted between 244 and 294 civilians killed in 547 attacks in Pakistan, Yemen and Somalia.

The Bureau of Investigative Journalism estimates that as many as 1068 civilians were killed in Pakistan, Yemen and Somalia, the vast majority since 2009.

So it’s unclear whether the Obama administration’s drone strikes have killed 100 innocents, 300 innocents, or over 1000 innocents.  Part of the discrepancy involves who counts as a “militant” and who counts as an innocent civilian.  The U.S. government tends to count all military-age males killed in drone strikes as “militants,” effectively narrowing the meaning of “civilian” to “women and children.”

In one respect, this body count doesn’t matter.  Dead is dead, whether you’re talking about 100 people or 1000.  And isn’t the death of 100 innocents enough to provoke protest, if not outrage?  Think of the reaction in the U.S. to the killing of 49 innocent civilians in Orlando.  Better yet, imagine if a foreign government were flying drones over our skies, taking out American “terrorists” while killing a few innocent civilians now and again.  Would we dismiss 100 dead American civilians as “collateral damage,” regrettable but necessary in this foreign power’s war on terror?

Of course not.  Americans would memorialize the dead, honor them, and make them a cause for vengeance.

For all the people the U.S. government is killing overseas in hundreds of deadly drone strikes, it’s not obvious that any progress is being made in the war on terror. The wars continue, with the Taliban gaining strength in Afghanistan.  ISIS is on the wane, until it rebounds or morphs into another form.  What is essentially terror bombing as a weapon against terror has little chance of ending a war on terror.  Meanwhile, hammer blows from the sky against fractured societies only serve to propagate the fractures, creating new fault lines and divisions that are exploitable by the determined and the fanatical.

Indeed, we really have no clear idea whether these multi-billion dollar air campaigns are making any progress in war. Much of the data and results of these campaigns are both classified and open to bias, with reports of casualties being manipulated or “spun” by all sides.  All we really know is that innocents are killed (whether 100 or 1000) as the wars persist with no end in sight.

Meanwhile, American exceptionalism rules.  As Tom Engelhardt noted back in May of 2015:

In his public apology for deaths [of innocents by drones] that were clearly embarrassing to him, President Obama managed to fall back on a trope that has become ever more politically commonplace in these years.  Even in the context of a situation in which two innocent hostages had been killed, he congratulated himself and all Americans for the exceptional nature of this country. “It is a cruel and bitter truth,” he said, “that in the fog of war generally and our fight against terrorists specifically, mistakes — sometimes deadly mistakes — can occur.  But one of the things that sets America apart from many other nations, one of the things that makes us exceptional is our willingness to confront squarely our imperfections and to learn from our mistakes.”

Whatever our missteps, in other words, we Americans are exceptional killers in a world of ordinary ones.  This attitude has infused Obama’s global assassination program and the White House “kill list” that goes with it and that the president has personally overseen.

Drone strikes are a method of war, but they’ve become the American strategy.  The strategy, so it seems, is to keep killing bad guys until the rest give up and go home.  But the deaths of innocents, whether 100 or 1000, serve to perpetuate cycles of violence and revenge.

We have, in essence, created a perpetual killing machine.

Update (7/2/2016): Well, the Obama administration has done it again, releasing its report on drone casualties on the afternoon of Friday, July 1st, just before the long Independence Day weekend, ensuring minimal media coverage.  The report excludes “active” war zones such as Iraq, Afghanistan, and Syria, a convenient definition that serves to lower the death toll.

According to the report, U.S. drone strikes in places like Yemen, Libya, tribal Pakistan, and Somalia have accounted for about 2500 “terrorists” while killing 64 to 116 civilian bystanders.  The tacit message: We’re killing 25 times (or perhaps 40 times) as many “terrorists” as we are innocent civilians, a very effective (even humane?) kill ratio.
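The arithmetic behind that claim is easy to reproduce.  Here is a minimal sketch using only the figures cited above (the roughly 2,500 claimed “terrorist” deaths, the report’s range of 64 to 116 civilians, and the administration’s earlier figure of around 100); it is illustrative bookkeeping, not an official dataset:

```python
# Back-of-the-envelope check of the administration's implied "kill ratio,"
# using only the figures reported in the post above (not an official dataset).
claimed_terrorist_deaths = 2500          # "about 2500 'terrorists'"
civilian_low, civilian_high = 64, 116    # the report's range of civilian deaths
earlier_estimate = 100                   # the administration's earlier "around 100"

for civilians in (civilian_low, earlier_estimate, civilian_high):
    ratio = claimed_terrorist_deaths / civilians
    print(f"{civilians} civilians killed -> roughly {ratio:.0f} 'terrorists' per civilian")
```

Depending on which civilian count you accept, the flattering ratio swings from roughly 22-to-1 to nearly 40-to-1, which is exactly why the choice of denominator matters so much.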

Talk about an exercise in cynical bookkeeping!  One can guess what happened here. Someone high up in the government began with the civilian body count judged acceptable: I’m guessing that figure was roughly 100.  Then, they worked backwards from that.  How do we get 100?  Well, if we exclude “active” war zones such as Iraq, Afghanistan, and Syria, and if we squint sideways …

Well, you probably know the saying: the first casualty of war is truth.  As this latest report from the Obama administration shows, an honest accounting of civilian casualties follows close behind.

Two Big Reasons Not to Vote for Trump

May 29, 2016
Fear his ignorance

W.J. Astore

Nuclear proliferation and global warming are two big issues that Donald Trump is wrong about.  They’re also the two biggest threats to our planet.  Nuclear war followed by nuclear winter could end most life on earth within a matter of weeks or months.  Global warming/climate change, though not as immediate a threat as nuclear war and its fallout, is inexorably leading to a more dangerous and less hospitable planet for our children and their children.

What does “The Donald” believe?  On nuclear proliferation, which only makes nuclear war more likely, Trump is essentially agnostic, even favoring other nations joining the nuclear club: Japan, South Korea, even Saudi Arabia.  When all countries should be earnestly working to reduce and then eliminate nuclear stockpiles, Trump is advocating their expansion.  (An aside: recall that in a previous debate Trump had no idea what America’s nuclear triad is; add intellectual sloth to his many sins.)

On global warming, Trump is essentially a skeptic on whether it even exists (“hoax” and “con job” are his expressions of choice), even as he seeks to protect his resorts from its effects.  Along with this rank hypocrisy, Trump is advocating an energy plan that is vintage 1980, calling for more burning of fossil fuels, more drilling and digging, more pipelines, as if fossil fuel consumption were totally benign to the environment and to human health.

Along with his tyrannical and fascist tendencies, Trump is wrong on two of the biggest issues facing our planet today.  His ignorance and recklessness render him totally unfit to be president.

This Modern and Dystopic World

My copy of Orwell’s 1984

W.J. Astore

The modern world is a kluge of Ray Bradbury’s Fahrenheit 451, with screens everywhere in which people submerge themselves; Aldous Huxley’s Brave New World, with “soma” of all sorts to keep us drugged and happy; and of course George Orwell’s 1984, with constant surveillance and the “Two Minutes Hate,” directed mainly at “the enemy,” especially the enemy within, known in 1984 as Goldstein (for some Americans today, “Goldstein” is Donald Trump; for others, it’s Hillary Clinton; for a few, it’s Ted Cruz, or perhaps all of the above).

Dystopic elements characterize our American moment, hence the appropriateness of dystopic science fiction novels.  Bradbury was especially good at poking holes in the idea that technology was in essence a liberating force.  He captured the way people might submerge their identities within screens, neglecting the real people around them, even those closest to them, for the “virtual reality” of infotainment.  Huxley was keen to debunk mass production as a liberating force, but his invention of “soma,” a mood-enhancing drug that leads to detachment and inaction, captured our overly medicated ways.  (I can’t watch network news without being bombarded by drug ads that promise me release from pain or acne or other nuisances and hence a better life, as long as I take this pill or use this inhaler.)  Finally, Orwell captured the total surveillance state, one driven by fear, obsessed by enemies created by the state to cow the masses.  His vision, perhaps the darkest of the three, left little hope for the “little man” oppressed under the jackboot of a militaristic and totalitarian state.

The times are not quite that dark in America today, but these three classic novels offer warnings we’d do well to heed.  An aspect of these dystopias we most definitely see in America today is the degeneration of news, of information, of knowledge.  As a society, America is arguably less fact-based today than at any point in its history.  Even as we’re immersed in information via the Internet, the news itself has become shallower, or trivial, or frivolous, when it’s not out-and-out propaganda.

I grew up watching the news.  Before going to school, I used to watch the “Today” show in the morning in the 1970s.  It was a decent show.  Some real and serious news made the cut.  Now it’s largely a laugh-fest featuring celebrities making sales-pitches.  The news as soap opera; the news as vanity.

To state the obvious: The network “news” has been dumbed down.  Image is nearly everything.  Stories are far shorter and without context.  Designed for people with limited attention spans, they’re also designed to keep people watching, so they feature sensationalism and “quick hits” — nothing too taxing or disturbing.

Of course, the real news is still out there, as Tom Engelhardt notes in his latest probing article at TomDispatch.com.  It’s just much harder to find on the network “news”:

What’s left out?  Well, more or less everything that truly matters much of the time: any large, generally unphotogenic process, for instance, like the crumbling of America’s infrastructure (unless cameras can fortuitously zoom in on a bridge collapsing or a natural gas pipeline in the process of blowing up in a neighborhood — all so much more likely in an age in which no imaginable situation lacks its amateur video); poverty (who the hell cares?); the growing inequality gap locally or globally (a no-interest barrier the WikiLeaks-style Panama Papers recently managed to break through); almost anything that happens in the places where most of the people on this planet actually live (Asia and Africa); the rise of the national security state and of militarism in an era of permanent war and permanent (in)security in the “homeland”; and don’t even get me started on climate change…

Coming to grips with the real news would require thought and necessitate action – changes, radical ones, to the status quo.  And what powerbroker wants that?

Focus instead, America, on your screens.  Take your soma.  Hate your Goldstein.  That’s the method driving our madness.  Dystopia, anyone?

 

The Challenger Shuttle Disaster, Thirty Years Later

The Crew of the Challenger

W.J. Astore

When the Challenger blew up thirty years ago this January, I was a young Air Force lieutenant working an exercise in Cheyenne Mountain Command Center near Colorado Springs, Colorado.  I remember the call coming in to the colonel behind me.  I heard him say something like, “Is this real world?”  In other words, is this really happening, or is it part of the exercise?  The answer at the other end was grim, our exercise was promptly cancelled, and we turned on the TV and watched the explosion.

Our initial speculation that day was that an engine had malfunctioned (the explosion appeared to have occurred when the shuttle’s engines were reaching maximum thrust).  But it turned out the shuttle had a known technical flaw that had not been adequately addressed.  Something similar would happen to the Columbia in 2003: a known technical flaw, inadequately addressed, ended up crippling the shuttle.

When I taught a course on “technology and society” at the collegiate level, I had my students address the non-technical causes of the Challenger and Columbia disasters.  Here is the question I put to them in the course syllabus:

NASA lost two space shuttles: the Challenger in 1986 and the Columbia in 2003.  Tragically, both these accidents were preventable.  Both had clear technical causes.  In 1986, faulty O-rings on a solid rocket booster allowed hot gas to escape, leading to the structural failure and explosion of the external fuel tank.  In 2003, insulation foam that detached from the external tank upon liftoff damaged the thermal protection on the leading edge of Columbia’s left wing; during reentry, superheated gas penetrated the wing and the orbiter broke apart.

Both accidents also highlighted wider issues involving risk management, institutional culture, and control of highly complex machinery.  Before each accident, NASA engineers had warned managers of preexisting dangers.  In the case of the Challenger, it was the risk of launching in low temperatures, as shown by previous data of gas leakage at O-ring seals when the air temperature was below sixty degrees Fahrenheit.  In the case of the Columbia, visual data suggested the shuttle had sustained damage soon after liftoff, a fact that could have been confirmed by cameras and/or a space walk.  In both cases, managers overruled or disregarded the engineers’ concerns, leading to catastrophe.

Question: What do you think were the key non-technical factors that interacted with the technical flaws?  What lessons can we learn from these accidents about controlling complex technical systems?

I wanted my students to focus on issues such as groupthink, on management concerns about cost and schedule and how those might cloud judgment, on the difficulty of managing risk, and on the possibilities of miscommunication among well-intentioned people operating under stress.
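For readers who want to see how such a warning looks in quantitative terms, here is a minimal sketch of the sort of before-launch tabulation the engineers’ temperature data supported: grouping prior flights by launch temperature and comparing O-ring incident rates above and below sixty degrees Fahrenheit.  The flight records in the sketch are hypothetical placeholders for illustration; they are not the actual NASA figures.

```python
# A sketch of the kind of pre-launch check the engineers' O-ring data supported:
# group prior flights by launch temperature and compare incident rates.
# NOTE: these flight records are hypothetical placeholders for illustration only;
# they are NOT the actual NASA data.
hypothetical_flights = [
    # (launch temperature in degrees F, O-ring incidents observed on that flight)
    (53, 3), (57, 1), (58, 1), (63, 1),
    (66, 0), (67, 0), (70, 1), (70, 0),
    (72, 0), (75, 2), (76, 0), (79, 0),
]

def incidents_per_flight(records, keep):
    """Average O-ring incidents per flight for the subset selected by `keep`."""
    subset = [count for temp, count in records if keep(temp)]
    return sum(subset) / len(subset) if subset else float("nan")

cold_rate = incidents_per_flight(hypothetical_flights, lambda t: t < 60)
warm_rate = incidents_per_flight(hypothetical_flights, lambda t: t >= 60)
print(f"Average incidents per flight below 60 F:    {cold_rate:.2f}")
print(f"Average incidents per flight at/above 60 F: {warm_rate:.2f}")
```

Even a crude comparison like this surfaces the kind of cold-weather signal the engineers were pointing to; the harder lesson, and the one the question above is after, is why that signal failed to carry the day.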

I ended the lesson with a quote from Richard Feynman, the Nobel Prize winning scientist who had served on the Challenger board of inquiry after the accident.  Feynman’s honest assessment of the critical flaws in NASA’s scheme of management was shunted to an appendix of the official report.  It’s available in his book, “What Do You Care What Other People Think?”

This is what Feynman had to say:

“For a successful technology, reality must take precedence over public relations, for Nature cannot be fooled.”

It was a devastating conclusion – a much needed one then, and arguably even more needed today.

It Should Never Be Done Again: Hiroshima, 70 Years Later

Hiroshima after the bomb

W.J. Astore

August 6, 1945.  Hiroshima.  A Japanese city roughly the size of Houston.  Incinerated by the first atomic bomb.  Three days later, Nagasaki.  Japanese surrender followed.  It seemed the bombs had been worth it, saving countless American (and Japanese) lives, since a major invasion of the Japanese home islands was no longer needed.  But was the A-bomb truly decisive in convincing the Japanese to surrender?

President Truman’s decision to use atomic bombs against Japan is perhaps the most analyzed, and, in the United States, most controversial decision made during World War II.  The controversy usually creates more heat than light, with hardliners entrenched on opposing sides.  The traditional interpretation is that Truman used the A-bombs to convince a recalcitrant Japanese Emperor that the war was truly lost.  A quick Japanese surrender appeared to justify Truman’s choice.  It also saved tens of thousands of Allied lives in the Pacific (while killing approximately 250,000 Japanese).  This thesis is best summed up in Paul Fussell’s famous essay, “Thank God for the Atomic Bomb.”

Even before Hiroshima, however, a small number of scientists argued that the A-bomb should not be used against Japan without a prior demonstration in a remote and uninhabited location.  Later, as the horrible nature of radiation casualties became clearer to the American people, and as the Soviet Union developed its own arsenal of atomic weapons, threatening the United States with nuclear Armageddon, Americans began to reexamine Truman’s decision in the context of the Cold War and the nuclear arms race.  Gar Alperovitz’s revisionist view that Truman was practicing “atomic diplomacy” won its share of advocates in the 1960s. (Alperovitz expanded upon this thesis in the 1990s.)  Other historians suggested that racism and motives of revenge played a significant role in shaping the U.S. decision.  This debate reached its boiling point in the early 1990s, as the Smithsonian’s attempt to create a “revisionist” display to mark the bomb’s 50th anniversary became a lightning rod in the “culture wars” between a Democratic administration and a resurgent Republican Congress.

Were the atomic bombs necessary to get the Japanese to surrender?  Would other, more humane, options have worked, such as a demonstration to the Japanese of the bomb’s power?  We’ll never know the answer to such questions with certainty.  Perhaps if the U.S. had been more explicit in its negotiations with Japan that “unconditional surrender” did not mean the end of Japan’s Emperor, the Japanese might have surrendered earlier, before the A-bomb was fully ready.  Then again, U.S. flexibility could have been interpreted by Japanese hardliners as a sign of American weakness or war fatigue.

Unwilling to risk appearing weak or weary, U.S. leaders dropped the A-bomb to shock the Japanese into surrendering. Together with Stalin’s entry into the war against Japan, these shocks were sufficient to convince the Japanese emperor “to bear the unbearable,” in this case total capitulation, a national disgrace.

A longer war in the Pacific — if only a matter of weeks — would indeed have meant higher casualties among the Allies, since the Japanese were prepared to mount large-scale Kamikaze attacks.  Certainly, the Allies were unwilling to risk losing men when they had a bomb available that promised results.  The mentality seems to have been: We developed it.  We have it.  Let’s use it.  Anything to get this war over with as quickly as possible.

That mentality was not humane, but it was human.  Truman had a weapon that promised decisiveness, so he used it.  The attack on Hiroshima  was basically business as usual, especially when you consider the earlier firebombing raids led by General Curtis LeMay.  Indeed, such “conventional” firebombing raids continued after Hiroshima and Nagasaki until the Japanese finally sent a clear signal of surrender.

Of course, an event as momentous, as horrific, as Hiroshima took on extra meaning after the war, given the nuclear arms race, the Cold War, and a climate represented by the telling acronym of MAD (mutually assured destruction).  U.S. decisionmakers like Truman were portrayed as callous, as racist, as war criminals.  Yet in the context of 1945, it’s difficult to see any other U.S. president making a different decision, especially given Japan’s apparent reluctance to surrender and its proven fanaticism at Iwo Jima, Okinawa, and elsewhere.

As Andrew Rotter notes in Hiroshima: The World’s Bomb (2008), World War II witnessed the weakening, if not erasure, of distinctions between combatants and non-combatants, notably during LeMay’s firebombing of Tokyo in March 1945 but in many other raids as well (Rotterdam and Coventry and Hamburg and Dresden, among so many others). In his book, Rotter supports the American belief that Japan would fight even more fanatically for its home islands than it did at Iwo Jima and Okinawa, two horrendous battles in 1945 that preceded the bomb. But he argues that Truman and Secretary of War Henry Stimson engaged in “self-deception” when they envisioned that the effects of the atomic bomb could be limited to “a purely military” target.

A quarter of a million Japanese died at Hiroshima and Nagasaki and in the years and decades following.  They died horrible deaths.  And their deaths serve as a warning to us all of the awful nature of war and the terrible destructiveness of nuclear weapons.

Hans Bethe worked on the bomb during the Manhattan Project.  A decent, humane, and thoughtful man, he nevertheless worked hard to create a weapon of mass destruction. His words of reflection have always stayed with me.  They come in Jon Else’s powerful documentary, “The Day After Trinity: J. Robert Oppenheimer and the Atomic Bomb.”

Here is what Bethe said (edited slightly):

The first reaction we [scientists] had [after Hiroshima] was one of fulfillment.  Now it has been done.  The second reaction was one of shock and awe: What have we done?  What have we done.  The third reaction was it should never be done again.

It should never be done again: Just typing those words here from memory sends chills up my spine.

Let us hope it is never done again.  Let us hope a nuclear weapon is never used again.  For that way madness lies.

Major Sporting Events and Air Shows: Too Corporatized, Too Controlling, Too Much

The New Yankee Stadium: The House that Corporations Built

W.J. Astore

Back in 2010, I wrote the following article on American sports for Huffington Post.  With the end of “March Madness” and the beginning of baseball season, the time seems right to revive it.  I love watching my hometown teams and experiencing the vicarious thrill of victory (as well as the agony of defeat), and I’ll never give up sports and the fun of being a fan.  But professional sports in America sure make me want to stop watching at times, as you’ll read below:

Been to a major American sporting event lately? If not, consider yourself fortunate. The NFL and NASCAR are already over-the-top when it comes to manufactured noise, exaggerated pyrotechnics, and wall-to-wall corporate advertisements. Even my beloved sport of baseball has fallen victim to sensory saturation and techniques of crowd control that would make a dictator proud. The grace and spontaneity of America’s pastime is increasingly lost in Jumbotrons, overly loud and canned music, and choreographed cheering.

With Jumbotrons and other video screens everywhere, people no longer focus on the game as it unfolds on the field, or turn to a neighbor for an explanation when they miss a play or nuance. Instead, they look to the screens to follow the game. Indeed, sight lines from some seats at Yankee Stadium are so poor that the only way to watch the action on the field is on video screens posted at strategic locations.

Speaking of Yankee Stadium, last month a friend of mine went to a game there and found the experience “shocking.” In his words:

“The new stadium is flooded with noise from constant speakers as well as screens everywhere. It was so loud that there was really not much independent reaction from the crowd. I got a feeling like I was in a scene from Triumph of the Will. The noise would come out of the speakers and people would chant. When it stopped so did the people. The entire experience left me dying to get out of there!”

Mediocre seats are $110 each, and an $11 beer only compounds the pain. Attending a Yankees game “used to be something of a social leveler, where people of all classes would come and meet to support the team… Although the place was packed for a Red Sox game, it was a largely white crowd, looking nothing like the mix of people who actually inhabit New York,” my friend concluded.

I share my friend’s concerns. I hate being coerced by screens and speakers telling me when to cheer and what to say. Even at my local Single-A baseball games, the post-game fireworks are set to music, usually of a patriotic tenor. I’ve got nothing against music, but why can’t I just enjoy the fireworks? I don’t need “Proud to be an American” blaring to make me proud to be an American.

But it seems like many fans are happy being told when to cheer, what to say, even what to feel. Or they’ve simply become accustomed to being controlled, which has the added benefit to owners of suppressing any inconvenient spontaneity.

More and more, our senses are saturated so we cannot pause to converse or even to think. If the game grows tiresome, people turn to cell phones, palm pilots, and other personal technologies for stimulation. And the phenomenon is hardly limited to sporting events. Today’s version of “Sesame Street” is an exercise in frenetic action and hyperkinetic stimulation; one wonders whether it’s designed for ADHD kids, or to create ADHD kids.

More and more, we’re surrounded by and immersed in near-total sensory saturation; the stifling effect such an environment has on individual spontaneity and thought can’t be disregarded (nor can it be accidental).

And it appears in the most unlikely of places. I used to watch air shows at the U.S. Air Force Academy. Few things are more viscerally thrilling (or chilling) than a formation of F-16s screaming overhead. But that effect apparently wasn’t enough. The powers-that-be “augmented” the air show with loud rock music (call it the “Top Gun” effect) along with an especially annoying (and superfluous) narrator. There was even a proposal to add huge video screens and even bigger speakers to the performance until it got shot down due to charges of contractual cronyism.

In a way, it’s sad to compare today’s thunderingly loud yet sterile air shows to their Depression-era counterparts. The latter, as another friend reminded me, were far simpler affairs. In his words:

“No concrete runways, no visitor stands, just grass in a field on the edge of town. I loved planes so much that as an eleven year old I would take the two streetcars … then walk a mile to the airport. There was always one or two old biplanes and the small crowd would wait expectantly for the pilots and the daredevils to appear. What excitement just to see those little planes taxiing across the grass and getting into position to take off. Gunning their little engines and racing along into the wind. Loops, upside down and then the big thrill, the ‘wing walkers.’ Try that on a jet.”

Bigger, faster, louder doesn’t always mean “better.” Whether it’s an air show or ball game today, we seem saturated by noise, video images, and other sensory distractions, often advertised as “necessary” to broaden the appeal to non-fans or casual spectators who simply want to feel that they’ve witnessed a spectacle, whatever its meaning.

It’s hard to develop an inner life when you’re constantly plugged-in and distracted. It’s also hard to take independent political stances when you’re constantly bombarded by infotainment, not just in the mainstream media but in the sports world as well. I don’t care about off-field shenanigans or contract disputes or manufactured grudges between teams, nor do I want to watch pre-game and post-game shows that last longer than the games: I just want to watch the game and marvel at the accomplishments of world-class athletes while cheering for my home team.

Sports have always been a form of entertainment, of course, but today’s events are being packaged as life-consuming pursuits, e.g. fantasy football leagues. And if we’re spending most of our free time picking and tracking “our” players and teams, it leaves us a lot less time to criticize our leaders and political elites for their exploitation of the public treasury – and betrayal of the public trust. I wonder, at times, if we’re heading in the direction of “Rollerball” (the original movie version with James Caan), in which a few corporations dominate the world and keep the little people (you and me) distracted with ultra-violent sports and hedonistic consumption, so much so that people can’t recognize their own powerlessness and the empty misery of their lives.

Until our sporting events and air shows return to a time when players and fans and enthusiasts collectively showed up simply for the love of the game and the purity of it all (and I can hear my brother mischievously singing, “Until the twelfth of never”), count me out. I can be more spontaneous in my living room with friends — and the beer sure is cheaper.

Update (4/1/2015):  Chicago’s Wrigley Field is a vintage ballpark with a lot of character.  So how do you ruin some of that?  By installing a massive Jumbotron.

Now this makes me proud to be an American

They still haven’t finished the bleachers, but they have the humongous TV glaring and dominating the skyline in left field.  Error, Cubs.

The Nuclear Triad Is Not the Holy Trinity

An Ohio-Class Submarine, armed with Trident nuclear missiles

W.J. Astore

America’s nuclear triad of land-based intercontinental ballistic missiles (ICBMs), sub-launched ballistic missiles (Ohio-class nuclear submarines), and nuclear-capable bombers is a relic of the Cold War.  The triad may have made some sense in a MAD (as in mutually assured destruction) way in the 1960s and 1970s, at the height of the Cold War with the USSR.  But it makes no strategic or financial (or moral) sense today.  Nevertheless, the U.S. is investing $10 billion over the next six years to update land-based ICBMs, missiles that should be decommissioned rather than updated precisely because they are both outdated and redundant.

The most survivable leg of the nuclear triad remains the U.S. Navy’s nuclear submarines, which carry Trident II missiles with multiple warheads.  These submarines are virtually impossible for any potential American foe to locate and sink in any timely fashion, thereby ensuring a survivable nuclear deterrent that is more than sufficient in any conceivable crisis.

Indeed, it’s arguable whether the U.S. needs any nuclear deterrent, given the size of the U.S. military and the power of its conventional military forces.  Even old Cold War warriors like Henry Kissinger have come out in favor of eliminating nuclear weapons from the earth, as did Barack Obama when he first ran for president in 2008.

But morality and common sense quickly disappear when politics and fear-mongering intervene.  States where nuclear missiles are currently based, such as North Dakota and Wyoming, want to keep them in their silos so that federal dollars continue to flow into local and state economies.  Fearful “hawks” point to the existence of nuclear missiles in China or Russia (or even Pakistan!) as the reason why the U.S. needs to maintain nuclear superiority, even though no country comes close to the power and survivability of the U.S. Navy’s Trident submarines.

And let’s not, of course, forget morality.  With Christmas coming, I recall something about “Thou Shalt Not Kill” and loving thy neighbor.  Spending scores of billions (maybe even a trillion dollars!) to update America’s nuclear arsenal, an arsenal that has the capacity to unleash genocide against multiple enemies while plunging the planet into nuclear winter, seems more than a little contrary to the Christian spirit, whether at Christmas or indeed any time of the year.

The decision to “invest” in outdated and redundant land-based ICBMs says much about the American moment.  It’s almost as if our government believes the nuclear triad really is the Holy Trinity.  Heck — why else did our country choose to anoint genocidal nuclear missiles as “Peacekeepers”?

It should sadden us all that some American leader of the future may yet utter the line, “We had to destroy the planet to save it.”  Such is the horrifying potential and maddening logic of our nuclear forces.

 

World War I: The Paradox of Semi-Modern War

British Machine Gun Team, 1916

Dennis Showalter.  Introduction by William Astore.

Over the next four years, historians around the world will grapple with the meaning and legacies of the “Great War” fought one hundred years ago (1914-1918).  An epochal event in world history, World War I has as many meanings as it has had historians.  Among those historians, Dennis Showalter is one of the very best.  In this article, Showalter argues that the war was, in many ways, not “modern” at all.  The enormity of the war, including its staggering wastage, generated primitivism as much as it stimulated innovation.  On the Western Front, site of industrialized mass destruction, troops fought with modern machine guns and chemical weapons even as they revived maces and mail armor of medieval vintage.

Most remarkable, as Showalter notes, was the resilience of home front support.  As dreams of quick, decisive battles turned into long, murderous slogs of nightmarish proportions, control of events was ceded to military men who saw only one way to victory — exhaustion through attrition and economic warfare.  When Germany finally collapsed near the end of 1918, few people were as surprised as the victors or as shocked as the losers.  As the victors exulted, the losers licked wounds — and vowed vengeance.

So it was that the “war to end all wars” became just one major act in a never-ending tragedy in a century dominated by war.  Even today, warfare in places like the Middle East reflects the poor choices and conflicting promises made during the Great War by the major powers.  In fact, what was perhaps most “modern” about World War I was the blowback that plagued its putative victors.  Consider, for example, France’s decision to ignore requests in 1919 by a young Ho Chi Minh for greater autonomy to be granted to Vietnamese in French Indochina.  France had leaned on Vietnamese labor during the Great War (with as many as 140,000 Vietnamese doing grunt work such as digging trenches), and the Vietnamese expected something in return.  They got nothing, a decision that set the stage for Vietnam’s revolt and France’s eventual defeat at Dien Bien Phu in 1954.  W.J. Astore

Dennis Showalter on the Paradox of World War I: A Semi-Modern War

The looming centennial of the Great War has inspired a predictable abundance of conferences, books, articles, and blog posts. Most are built on a familiar meme: the war as a symbol of futility. Soldiers and societies alike are presented as victims of flawed intentions and defective methods, which in turn reflected inability or unwillingness to adapt to the spectrum of innovations (material, intellectual, and emotional) that made the Great War the first modern conflict. That perspective is reinforced by the war’s rechristening, backlit by a later and greater struggle, as World War I—which confers a preliminary, test-bed status.

Homeward bound troops pose on the ship’s deck and in a lifeboat, 1919. The original image was printed on postal card (“AZO”) stock. Public Domain via Wikimedia Commons.

In point of fact, the defining aspect of World War I is its semi-modern character. The “classic” Great War, the war of myth, memory, and image, could be waged only in a limited area: a narrow belt in Western Europe, extending vertically five hundred miles from the North Sea to Switzerland, and horizontally about a hundred miles in either direction. War waged outside of the northwest European quadrilateral tended quite rapidly to follow a pattern of de-modernization. Peacetime armies and their cadres melted away in combat, were submerged by repeated infusions of unprepared conscripts, and saw their support systems, equine and material, melt irretrievably away.

Russia and the Balkans, the Middle East, and East Africa offer a plethora of case studies, ranging from combatants left without rifles in Russia, to the breakdown of British medical services in Mesopotamia, to the dismounting of entire regiments in East Africa by the tsetse fly. Nor was de-modernization confined to combat zones. Russia, Austria-Hungary, the Ottoman Empire, and arguably Italy, strained themselves to the breaking point and beyond in coping with the demands of an enduring total war. Infrastructures from railways to hospitals to bureaucracies that had functioned reasonably, if not optimally, saw their levels of performance and their levels of competence tested to destruction. Stress combined with famine and plague to nurture catastrophic levels of disorder, from the Armenian genocide to the Bolshevik Revolution.

Semi-modernity posed a corresponding and fundamental challenge to the wartime relationship of armed forces to governments. In 1914, for practical purposes, the warring states turned over control to the generals and admirals. This in part reflected the general belief in a short, decisive war—one that would end before the combatants’ social and political matrices had been permanently reconfigured. It also reflected civil authorities’ lack of faith in their ability to manage war-making’s arcana—and a corresponding willingness to accept the military as “competent by definition.”

Western Battle Front 1916. From J. Reynolds, Allen L. Churchill, Francis Trevelyan Miller (eds.): The Story of the Great War, Volume V. New York. Specified year 1916, actual year more likely 1917 or 1918. Public Domain via Wikimedia Commons.

The extended stalemate that actually developed had two consequences. A major, unacknowledged subtext of thinking about and planning for war prior to 1914 was that future conflict would be so horrible that the home fronts would collapse under the stress. Instead, by 1915 the generals and the politicians were able to count on unprecedented – and unexpected – commitment from their populations. The precise mix of patriotism, conformity, and passivity underpinning that phenomenon remains debatable. But it provided a massive hammer. The second question was how that hammer could best be wielded. In Russia, Austria-Hungary, and the Ottoman Empire, neither soldiers nor politicians were up to the task. In Germany the military’s control metastasized after 1916 into a de facto dictatorship. But that dictatorship was contingent on a victory the armed forces could not deliver. In France and Britain, civil and military authorities beginning in 1915 came to more or less sustainable modi vivendi that endured to the armistice. Their durability over a longer run was considered best untested.

Even in the war’s final stages, on the Western Front that was its defining theater, innovations in methods and technology could not significantly reduce casualties. They could only improve the ratio of gains. The Germans and the Allies each lost over three-quarters of a million men during the war’s final months. French general Charles Mangin put it bluntly and accurately: “whatever you do, you lose a lot of men.” In contemplating future wars—a process well antedating 11 November 1918—soldiers and politicians faced a disconcerting fact. The war’s true turning point for any state came when its people hated their government more than they feared their enemies. From there it was a matter of time: whose clock would run out first. Changing that paradigm became—and arguably remains—a fundamental challenge confronting a state contemplating war.

Dennis Showalter is professor of history at Colorado College, where he has been on the faculty since 1969. He is Editor in Chief of Oxford Bibliographies in Military History, wrote “World War I Origins,” and blogged about “The Wehrmacht Invades Norway.” He is Past President of the Society for Military History, joint editor of War in History, and a widely-published scholar of military affairs. His recent books include Armor and Blood: The Battle of Kursk (2013), Frederick the Great: A Military History (2012), Hitler’s Panzers (2009), and Patton and Rommel: Men of War in the Twentieth Century (2005).

Article used by permission of the author. See more at http://blog.oup.com/2014/06/first-world-war-paradox-of-semi-modern-war/#sthash.opNivppW.dpuf

A Century of Mass Slaughter

Big Bertha (wiki)
Big Bertha (wiki)

W.J. Astore.  Also featured at Huffington Post.

This August marks the 100th anniversary of the start of World War I. That “Great War” was many things, but it was most certainly a war of machines, of dreadnought battleships and “Big Bertha” artillery, of newfangled airplanes and tortoise-like tanks. Industrial juggernauts like Great Britain, France, and Germany succeeded more or less in mobilizing their economies fully for war; their reward was reaping the horrors of death-dealing machinery on a scale theretofore thought impossible.

In that summer of 1914, most experts expected a short war, so plans for sustaining machine-age warfare through economic mobilization were lacking. Confronted by trench warfare and stalemate on the Western Front, a stalemate that owed everything to modern industrialism and machinery, the “big three” antagonists strove to break it using the means that had produced it: weapons and munitions. Those empires caught up in the war that were still industrializing (e.g., Russia, Austria-Hungary, the Ottoman Empire) found themselves at a serious disadvantage.

Together, Britain and France forged an industrial alliance that proved (with help from the U.S.) to be a war-winning “arsenal of democracy.” Yet this alliance contributed to an overvaluing of machines and munitions at the soldiers’ expense. For Entente leaders — even for old-school cavalry officers like Britain’s Field Marshal Sir Douglas Haig — new artillery with massive stockpiles of shells promised to produce the elusive breakthrough and a return to mobile warfare and glorious victory.

Thus it was that at the Battle of the Somme, which began on July 1, 1916, British soldiers were reduced to trained occupiers. Lengthy pre-battle artillery barrages, it was believed, would annihilate German defenders, leaving British troops to slog uncontested across no-man’s land and occupy the enemy’s shattered and empty trenches.

But those trenches were not empty. Germany’s defenses survived Britain’s storm of steel largely intact. And Britain’s soldiers paid the price of misplaced faith in machine warfare: nearly 20,000 dead on that first day, with a further 40,000 wounded.

The Somme is but one example of British and French commanders being overwhelmed by the conditions of machine warfare, so much so that they placed their faith in more machines and more munitions as the means to victory. After underestimating the impact of technology on the battlefield up to 1914, commanders quickly came to overestimate it. As a result, troops were inadequately trained and tactics inadequately developed.

As commanders consumed vast quantities of machinery and munitions, they became accustomed to expending lives on a similarly profligate scale. Bodies piled up even as more economic means were tapped. Meanwhile, the staggering sacrifices required by destructive industrialism drove nations to inflate strategic ends. Industrialized warfare that spat out lead and steel while consuming flesh and bone served only to inflame political demands, negating opportunities for compromise. Total victory became the only acceptable result for both sides.

In retrospect it’s remarkable how quickly leaders placed their faith in the machinery of war, so much so that military power revved uncontrollably, red-lined, then exploded in the faces of its creators. Industrialized destruction and mass slaughter were the predictable outcomes of a crisis whose resolution was driven by hardware — more weaponry, more machinery, more bodies. The minds of the men who drove events in that war could not sanction negotiation or compromise; those were forms of “weakness” that neither side could accept. Such murderous inflexibility was captured in the postwar observation of novelist Virginia Woolf that “It was a shock to see the faces of our rulers in the light of the shell fire. So ugly they looked — German, English, French — so stupid.” Note how she includes her own countrymen, the English, in the mix of the ugly and the stupid.

In World War I, Carl von Clausewitz’s dictum of war as an extreme form of politics became tragically twisted to war as the only means of politics, with industrialized mass destruction as the only means of war. The resulting failure to negotiate a lasting peace came as no surprise since the war had raced not only beyond politics, but beyond the minds of its military and political leaders.

The Great War had unleashed a virus, a dynamic of destruction, that would only be suppressed, and even then only imperfectly, by the wanton destruction of World War II. For what was Auschwitz but a factory of death, a center for mass destruction, a mechanized and murderous machine for efficient and impersonal slaughter, a culmination of the industrialized slaughter (to include mass gassing) of World War I?

The age of mass warfare and mass destruction was both catalyst for and byproduct of the age of machinery and mass production. Today’s age is less industrial but no less driven by machinery and mass consumption (which requires a form of mass destruction inflicted largely on the environment).

Aerial drones and cyber warfare are already providing disturbing evidence that the early 21st century may yet echo its predecessor in introducing yet another age of misplaced faith in the machinery of warfare. The commonality remains the vulnerability of human flesh to steel, as well as human minds to manipulation.

A century has passed, yet we’re still placing far too much faith in the machinery of war.

Technology and the Role of Scientists and Engineers in Society

Earth as seen from orbit by Apollo 11 in 1969
Earth as seen from orbit by Apollo 11 in 1969

W.J. Astore

Twenty-five years ago, I wrote the following paper for a class in the history of technology.  Back then, chlorofluorocarbons (CFCs) and acid rain, as well as global warming, were the issues highlighting the drawbacks of technology.  CFCs were damaging the ozone layer, acid rain was poisoning our lakes and streams and damaging trees, and the buildup of greenhouse gases loomed as a future threat.  The future is now, of course, since we’ve done virtually nothing to address global warming.  If anything, the debate in 1989 was far more sober, since back then there were no “climate change deniers.”

Written at the tail end of the Cold War, my paper from 1989 is colored by the threat of nuclear annihilation, another threat (like acid rain and CFCs) that has abated in the last two decades.  Reason for hope, perhaps?

Yet in those 25 years, technology has only proliferated even as compassion for those less fortunate has declined.  I wrote this paper before there was an Internet and World Wide Web, before cell phones and smart phones became ubiquitous, before we had so much conclusive evidence of the dangers of man-accelerated global warming.  I was attempting to argue that scientists and engineers had an obligation to consider the larger impact of their work, to include the moral implications of their research.

I’ve made one major change to this paper as written 25 years ago.  Back then, I concluded with the idea that an ethics based on Christianity needed to inform the work of scientists and engineers.  Today, this argument seems far too parochial and limiting, so I have removed it.

Technology and the Role of Scientists and Engineers in Modern Society (1989)

What is the proper role of scientists and engineers in modern society?  This question is especially relevant today, as can readily be confirmed by opening the September 1989 special issue of Scientific American entitled “Managing Planet Earth.”  Technology, it seems, has spawned many monsters: chlorofluorocarbons that tear holes in our protective ozone shield, factory smoke that turns our rain acidic, carbon dioxide that threatens to convert our planet into one big greenhouse.  The contributors to Scientific American assert that humanity must regain control over technology before its monsters inflict irreparable damage to the earth.

Defenders of technology, not surprisingly, advance the opposite thesis.  Samuel Florman, an engineer and the author of Blaming Technology, counters that “technology is still very much under society’s control, that it is in fact an expression of our very human desires, fancies, and fears.”  In Florman’s opinion, engineers should dedicate themselves to doing good work for society, but they should not try to define what is good for society.  Their mission, Florman holds, is to achieve society’s goals rather than to set them.

Florman does not exonerate engineers from all responsibility, however.  He asserts that engineers must be guided by their individual consciences, but he also suggests that society should not expect any “special compassion” from its engineers.  In fact he implies that society must resign itself to emotionally-detached engineers: “If we accept the single-minded dedication of ballet dancers and other artists,” Florman analogizes, “we should be able to accept, however regretfully, the same characteristic in a number of scientists and engineers.”

But a serious flaw lies at the heart of Florman’s plea for the sanctity of the engineering profession.   He disregards the vastly different societal roles of artists versus scientists and engineers, as well as the serious dangers of a powerful technical elite.  The philosopher Hannah Arendt noted these dangers in the context of atomic experimentation:

     The simple fact that physicists split the atom without any hesitations … although they realized full well the enormous destructive potentialities … demonstrates that the scientist qua scientist does not even care about the survival of the human race on earth or, for that matter, about the survival of the planet itself.

Arendt makes an important point here.  Scientists and engineers sometimes pursue their interests even when they threaten the survival of humanity (or themselves for that matter).  Evidence from the Manhattan Project lends credibility to this argument.  Most scientists who worked on the project were too caught up in the technical challenges of building the atomic bomb to entertain moral qualms about the bomb’s purpose.  Robert R.  Wilson, the leader of the cyclotron group during the Project, observed that he never considered quitting:

We were the heroes of our epic, and there was no turning back.  We were working on a problem to which we were completely committed; there was little time to re-examine our moral position from day to day.

The atomic bomb was the grail for these knights of science; they focused on their pursuit and little else.  Perhaps they believed they could wash their hands clean of the stains of Hiroshima and Nagasaki, for they neither made the decision to drop the bombs nor did they pilot the planes.  Yet they could not deny that it was their expertise that brought humanity to the brink of its own destruction during the Cold War.

So what does our nuclear heritage teach us?  It teaches us that humanity needs a more humane technology and more humane engineers.  In sum, we need a new purpose for technology, one that is inspired by social and humanitarian concerns.

Jules Verne captured the risk of failing to do so.  “If men go on inventing machinery, they’ll end by being swallowed up by their own inventions,” Verne prophesied.  There are still some people, however, who continue to believe that technological advances themselves will eliminate technology’s harms.  Charles F. Kettering, a remarkably inventive General Motors executive and a quintessential company man, captured this idea.  In Paul de Kruif’s words, Kettering felt that

You cannot put the brakes on any discovery … you’ve got to go on with it even if we’re all blown to hell with it.  What you should do is step up the study of human nature, you may even find a chemical, a vitamin, a hormone, a simple pill to take the devil out of human nature….

Here one cannot help but be reminded of Aldous Huxley’s Brave New World, where another automotive engineer, Henry Ford, was god, morality was but a faint memory, and drugs were the panacea for human ills.

Elting Morison, in Men, Machines, and Modern Times (1984), suggests that since technology forces humanity into its categories, humanity has no choice but to create a new culture to accommodate it.  He proposes that a series of small experiments be performed world-wide, with “man as the great criterion” (or, perhaps more accurately, the great guinea pig).  Apparently, a successful experiment will be one in which humans thrive, while an unsuccessful one will be one where humans “break down.”  Rather oddly, Morison believes the military provides us with the paradigm of how to proceed.  In his words:

They [the military] have the nuclear weapon that has fulfilled the exaggerated extreme toward which the system always tends … But for practical purposes they have created around this extreme a whole arsenal of carefully graded instruments of limited destruction – old-fashioned armaments of lesser power and new weapons of modulated nuclear energy.

It’s shocking how Morison waxes nostalgic over those “old-fashioned” weapons, and his addition of “modulation” to atomic bombs makes them seem downright cozy.  As George Orwell observed in his famous 1946 essay entitled “Politics and the English Language,” “such phraseology is needed if one wants to name things without calling up mental pictures of them.”  Thus cluster bombs that send shrieking hunks of shrapnel through the air, napalm that sears lungs and burns human skin, and atomic artillery shells that annihilate armies (but not cities, we hope) become, for Morison, “modest examples of how to begin to proceed.”

A more pessimistic prospectus for the future of technology is held by Arnold Pacey in The Maze of Ingenuity (1980).  For Pacey, history reveals that technology cannot “easily accommodate the broad aims and the mixture of human and technical factors which a socially-orientated direction of progress in technology … require[s].  Thus the efforts made to encourage a more directly social form of technical progress … have been relatively ineffective.”

Pacey attributes this failure to the dominance of the mechanical world view.  Beginning with Galileo, Pacey maintains, scientists and engineers restricted their own view of the world, blinding themselves to the larger purposes of technology.

Pacey does more than lament, though.  He offers several potential solutions, all of which seem flawed.  He assumes that new, less destructive, technologies are needed to meet human needs, or to ease poverty, yet the world currently has enough resources to end poverty, and present technology could doubtless be used more constructively.  Pacey also unconsciously undermines his argument by citing education and medical care as “examples of how continuous improvement is possible without any large accompanying drain on material resources.”  Unfortunately for Pacey, both education and medical care are currently (and rightly) under siege in this country.  Despite large sums of money spent and countless reform proposals, education remains mediocre, while medical care remains compassionless and costly.

No wonder Pacey despairs.  He half-heartedly mentions other potential balms, e.g. critical science, which pursues “careful, rigorous researches into the relationship between technical innovation, nature and society,” and general systems theory, yet it is unclear from reading Pacey how critical science differs from general systems theory.   In the end, Pacey supplies the reader with little in the way of hope, for he despondently observes that systems theory is corruptible.

We are left, then, with today’s dehumanizing technological imperative, as captured by Carlo Cipolla, a distinguished historian of technology, in this passage:

Each new machine … creates new needs, besides satisfying existing ones, and breeds newer machines.  The new contrivances modify and shape our lives and our thoughts; they affect the arts and philosophy, and they intrude even into our spare time.

To prevent this dominance of the machine, science and technology need to serve social and humanitarian needs more directly.  In “Thinking about Human Extinction,” George Kateb holds that individuals must attach themselves first and foremost to existence.  This attachment “cannot be cultivated by way of a theology that bestows [from the outside] meaning or worth on existence,” and it must be able to withstand “all temptations to go along with policies that may lead to human and natural extinction.”

Existence is justified by a sense of beauty; specifically, Martin Heidegger’s wonderment at the very indefiniteness of existence.  For Kateb, “because there could have been earthly nothingness … one must finally attach oneself to earthly existence, whatever it is, and act to preserve it … [To this end] persons must be schooled in beauty to acquire the disposition to sustain wonder that there is earthly existence rather than none.”  In sum, we must learn to revel in the very fact of humanity’s existence against the longest of cosmic odds.

In a world that grows ever more fragile with each passing day, an appreciation for the fragility of our existence, as well as an abiding compassion for humanity, is exactly what we need from our scientists and engineers.

________________________

Sources in order of citation

Samuel C. Florman, Blaming Technology: The Irrational Search for Scapegoats (New York: St. Martin’s Press, 1981).

Hannah Arendt, “A Symposium on Space: Has Man’s Conquest of Space Increased or Diminished his Stature?”, The Great Ideas Today 1963 (Chicago: Encyclopedia Britannica, Inc., 1963).

Robert R. Wilson, “The Scientists who Made the Atom Bomb,” Science, Conflict and Society (San Francisco: W.H. Freeman, 1969).

Jules Verne, Five Weeks in a Balloon (1862), quoted in James R. Newman, “The History and Present State of Science Fiction,” Science, Conflict and Society (San Francisco: W.H.  Freeman, 1969).

Paul de Kruif, Life Among the Doctors (New York: Harcourt, Brace, 1949), p. 445, quoted in William Leslie, Boss Kettering (New York: Columbia University Press, 1983).

Elting E. Morison, Men, Machines, and Modern Times (Cambridge, Mass: MIT Press, 1966, 1984).

Arnold Pacey, The Maze of Ingenuity: Ideas and Idealism in the Development of Technology (New York: Holmes/Meier, 1974, 1980).

Carlo M. Cipolla, Clocks and Culture 1300-1700 (New York: W.W. Norton & Co., 1978).

George Kateb, “Thinking about Human Extinction: (I) Nietzsche and Heidegger,” Raritan (Fall 1986), pp. 1-28.