It seems Americans can’t rally support for something without declaring a “war” on it. The war on poverty. On drugs. On gangs and crime. On terror. And these wars have become open-ended, or “generational” in Pentagon-speak, with a dynamic of crisis-surge-“progress”-new crisis-new surge-repeat that sustains large bureaucracies and huge government spending.
To these “wars” we must add a new one, notes Michael Klare at TomDispatch.com: the climate change war. As Texas and Florida were being clobbered by powerful hurricanes, the U.S. military and Homeland Security took the lead role in responding to these disasters. Yet, even as the U.S. National Security State was mobilized to respond, identifying and seeking to mitigate a root cause of this “war” — the role global warming plays in exacerbating these storms — was and is very much forbidden by the Trump administration.
This is nothing new. As with so many other wars, the U.S. military is deployed to address symptoms rather than root causes. Worse than that, we often deny our own role in creating or worsening those root causes.
With respect to climate change, we Americans have made our choice. We’ve come to believe the advertising slogans that “we can have it all.” We’ve dismissed the dangers of wanton fossil fuel consumption, and indeed wanton materialism in general. Corporations have worked hard to persuade us that global warming might just be a hoax, or at the very least dodgy science. Many of us have willingly bought the message that coal is “clean,” that fracking along with new pipelines are safe and create jobs, even though it’s clean(er) energy like wind and solar that is the better job-creator.
Those are facts that lead me to a different “war” in America, the one being waged against truth. Basic truths are denied (e.g. that human activity contributes to global warming) in the interests of profits enjoyed by powerful industries. But denial in “war” is not a path to victory (except for the profiteers). Denial is a path only to generational conflict, one that is sure to lead to more disasters and end only in defeat.
So, two things are most definitely certain: the climate change war will be generational. And, much like that other generational war — the war on terror — our military won’t win it. For no one wins a war against Mother Nature — not when we’re going out of our way to piss her off.
Did you know the U.S. has built nearly 70,000 nuclear weapons since 1945? Did you know the U.S. Air Force lost a B-52 and two hydrogen bombs in an accident over North Carolina in 1961, and that one of those H-bombs was a single safety-switch away from exploding with a blast equivalent to three or four million tons of TNT (roughly 200 Hiroshima-type bombs)? Did you know a U.S. nuclear missile exploded in its silo in Arkansas in 1980, throwing its thermonuclear warhead into the countryside?
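That scale comparison is simple back-of-the-envelope arithmetic. A minimal sketch in Python, assuming the figures cited above — roughly three million tons of TNT for the H-bomb and roughly fifteen thousand tons for the Hiroshima bomb, both approximate public estimates rather than authoritative values:

```python
# Back-of-the-envelope comparison of the 1961 Goldsboro H-bomb's yield
# to the Hiroshima bomb's. Both figures are approximate public estimates.
hbomb_yield_tons = 3.0e6       # ~3 million tons (3 megatons) of TNT equivalent
hiroshima_yield_tons = 15.0e3  # ~15 thousand tons (15 kilotons) of TNT equivalent

ratio = hbomb_yield_tons / hiroshima_yield_tons
print(f"Equivalent to roughly {ratio:.0f} Hiroshima-type bombs")  # roughly 200
```

At the upper end of the three-to-four-megaton range the ratio climbs toward 270, which is why “roughly 200” is, if anything, a conservative figure.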
That last accident is the subject of a PBS American Experience documentary that I watched last night, “Command and Control.” I highly recommend it to all Americans, not just for what it reveals about nuclear accidents and the lack of safety, but for what it reveals about the U.S. military.
Here are a few things I learned about U.S. nuclear weapons and the military from the documentary:
During the silo accident, the Strategic Air Command (SAC) general in charge of nuclear missiles was a pilot with no experience in missiles. His order to activate a venting fan during a fuel leak led to the explosion that destroyed the missile and killed an airman. (Experts from Martin Marietta, the military contractor that built the Titan II missile, advised against such action.)
Airmen who courageously tried against long odds to mitigate the accident, and who were wounded in the explosion, were subsequently punished by the Air Force.
The Air Force refused to provide timely and reliable knowledge to local law enforcement as well as to the Arkansas governor (then Bill Clinton) and senators. Even Vice President Walter Mondale was denied a full and honest accounting of the accident.
Nuclear safety experts concluded that “luck” played a role in the fact that the Titan’s warhead didn’t explode. It was ejected from the silo without its power source, but if that power source had accompanied the warhead as it flew out of the silo, an explosion equivalent to two or three megatons could conceivably have happened.
Finally, the number of accidents involving U.S. nuclear weapons is far greater than the military has previously reported. Indeed, even the nation’s foremost expert in nuclear weapons development was not privy to all the data from these accidents.
In short, the U.S. has been very fortunate not to have nuked itself with multiple hydrogen bombs over the last 70 years. Talk today of a threat from North Korea pales in comparison to the threat posed to the U.S. by its own nuclear weapons programs and their hair-raising record of serious accidents and safety violations.
Despite this record, President Obama and now President Trump have asked for nearly a trillion dollars over the next generation to modernize and improve U.S. nuclear forces. Talk about rewarding failure!
Threatening genocidal murder is what passes for “deterrence,” then and now. This madness will continue as long as people acquiesce to the idea the government knows best and can be trusted with nuclear weapons that can destroy vast areas of our own country, along with most of the world.
To end the insanity, we must commit to eliminating nuclear weapons. Ronald Reagan saw the wisdom of total nuclear disarmament. So should we all.
An Addendum: In my Air Force career, I knew many missileers who worked in silos. They were dedicated professionals. But accidents happen, and complex weapons systems often fail in complex and unpredictable ways. Again, it’s nuclear experts themselves who say that luck has played a significant role in the fact that America hasn’t yet nuked itself. (Of course, we performed a lot of above-ground nuclear testing in places like Nevada, making them “no-go” places to this day due to radiation.)
Update (4/27/17): I’d heard of Air Force plans to base nuclear weapons on the moon, but today I learned that a nuclear test was contemplated on or near the moon as a way of showcasing American might during the Cold War. As the New York Times reported, “Dr. [Leonard] Reiffel revealed that the Air Force had been interested in staging a surprise lunar explosion, and that its goal was propaganda. ‘The foremost intent was to impress the world with the prowess of the United States.’ It was a P.R. device, without question, in the minds of the people from the Air Force.” Dr. Reiffel further noted that, “The cost to science of destroying the pristine lunar environment did not seem of concern to our sponsors [the U.S. military] — but it certainly was to us, as I made clear at the time.”
The U.S. military wasn’t just content to pollute the earth with nuclear radiation: they wanted to pollute space and the moon as well. All in the name of “deterrence.”
Two pictures of above-ground nuclear testing in Nevada in 1955
In the crusade against Communism, otherwise known as the Cold War, the U.S. saw “freedom” as its core strength. Our liberties were contrasted with the repression of our chief rival, the USSR. We drew strength from the idea that our system of government, which empowered people whose individualism was guided by ethics based on shared values, would ultimately prevail over godless centralism and state-enforced conformity. An important sign of this was our belief in citizen-soldiers rather than warriors, and a military controlled by democratically-elected civilians rather than by dictators and strong men.
Of course, U.S. foreign policy during the Cold War could be amoral or immoral, and ethics were often shunted aside in the name of Realpolitik. Even so, morality was nevertheless treated as important, and so too were ethics. They weren’t dismissed out of hand.
Fast forward to today. We no longer see “freedom” as a core U.S. strength. Instead, too many of us see freedom as a weakness. In the name of defeating radical Islamic terrorism, we’ve become more repressive, even within the USA itself. Obedience and conformity are embraced instead of individualism and liberty. In place of citizen-soldiers, professional warriors are now celebrated and the military is given the lion’s share of federal resources without debate. Trump, a CEO rather than a statesman, exacerbates this trend as he surrounds himself with generals while promising to obliterate enemies and to revive torture.
In short, we’ve increasingly come to see a core national strength (liberty, individualism, openness to others) as a weakness. Thus, America’s new crusades no longer have the ethical underpinnings (however fragile they often proved) of the Cold War. Yes, the Cold War was often unethical, but as Tom Engelhardt notes at TomDispatch.com today, the dirty work was largely covert, i.e. we were in some sense embarrassed by it. Contrast this to today, where the new ethos is that America needs to go hard, to embrace the dark side, to torture and kill, all done more or less openly and proudly.
Along with this open and proud embrace of the dark side, America has come increasingly to reject science. During the Cold War, science and democracy advanced together. Indeed, the superior record of American science vis-à-vis that of the Soviet Union was considered proof of the strength and value of democracy. Today, that is no longer the case in America. Science is increasingly questioned; evidence is dismissed as if it’s irrelevant. “Inconvenient truths” are no longer recognized as inconvenient — they’re simply rejected as untrue. Consider the astonishing fact that we have a president-elect who’s suggested climate change is a hoax perpetrated by China.
Yesterday, I saw the following comment online, a comment that summed up the new American ethos: “Evidence and facts are for losers.” After all, President-elect Trump promised America we’d win again. Let’s not let facts get in the way of “victory.”
That’s what a close-minded crusader says. That the truth doesn’t matter. All that matters is belief and faith. Obey or suffer the consequences.
Where liberty is eroded and scientific evidence is denied, you don’t have democracy. You have something meaner. And dumber. Something like autocracy, kleptocracy, idiocracy. And tyranny.
Two weeks ago, I did an interview with TheoFantastique on the military in science fiction. I’d like to thank John Morehead, the site’s creator, for inviting me to answer a few questions on a subject near and dear to my heart.
TheoFantastique: Bill, thanks for making a little time to respond to a few questions related to the subject matter of your article. What are some general observations you have made about the shift in science fiction film depictions of the American military from the post-World War II period to the present?
Bill Astore: Thanks for inviting me, John. I grew up in the late 1960s and 1970s, in the immediate aftermath of the Vietnam War and Watergate. Films of that era were generally critical of the establishment, including sci-fi films. I fondly recall Planet of the Apes with its anti-nuclear message. Also Soylent Green with its warning about over-population, but even more dire was the way in which the authorities hid from the people the true nature of their new food source. Think also of Capricorn One, hardly a great film, but one which exposed a government conspiracy at the heart of the first manned mission to Mars. And Silent Running with Bruce Dern. The basic message was how humans were destroying planet earth, often due to nuclear war or environmental destruction, or both. Finally, Logan’s Run was a favorite of mine, but again the message was how the government of that world hid from the people the true nature of life outside of the bubble.
I remember seeing Alien in the theater and being blown away by the alien “birth” scene. But again the theme of that film was you can’t trust the authorities, who wanted the “alien” at any cost, i.e. the crew was expendable. Think of Outland as well with Sean Connery: yet more corruption among the establishment, this time involving drugs and production quotas in space mining. Here the workers were expendable.
I know I’m digressing from your question, but my general point is this: Sci-Fi films (and stories) are generally questioning (or questing, perhaps). They are usually not pro-military or pro-authority. Put differently, for every Starship Troopers there’s a Bill the Galactic Hero as a counterweight.
Think of one of my all-time favorite films, The Day the Earth Stood Still. The military is completely ineffectual in that film. Worse: the military contributes to the problem. Similarly, in the 1950s lots of films were made about the dangers of nuclear war and radiation. The military usually didn’t emerge in a favorable light in those films, if I recall correctly.
I think this began to change with films like Star Wars and Close Encounters of the Third Kind. Star Wars could be read as apolitical (“a long time ago, in a galaxy far, far away”), even if that wasn’t George Lucas’s intent. In Close Encounters, a terrific film that I saw in the theater, the authorities actually know what they’re doing. They greet the alien mothership peacefully, and communicate with music and light instead of guns and nukes. Again, I don’t think Spielberg was making a pro-authority or pro-military film, but I believe he didn’t want to make a political film, a film like The Day the Earth Stood Still.
After these two films, Hollywood embraced space operas and feel-good movies. There were exceptions, of course. One of my favorite movies is Starman with Jeff Bridges. Again, the authorities only want the alien for the powers he brings with him. Think too of The Man Who Fell to Earth and the way in which his life is corrupted by human excess. Doesn’t he get addicted to television?
The movie that really changed it all was Independence Day, a perfect film in the aftermath of Desert Storm (the expulsion of Iraq from Kuwait). Here, of course, the militaries of various countries come together to defeat the aliens, led by an American president who climbs into the cockpit to lead the charge himself. This proved so popular that it’s no surprise George W. Bush tried to replicate the scene in the aftermath of the U.S. invasion of Iraq in 2003 (his infamous landing on an aircraft carrier, followed by his “Mission Accomplished” victory speech).
TheoFantastique: What represents much of the portrayal of the U.S. and its military, and what does this say back to us by way of reflection on American militarism around the world?
Bill Astore: I think many, if not most, Americans now want to see the U.S. military portrayed in a positive light in films. Since the 1980s, and especially since the 1990s, Americans have been told to “support our troops.” After 9/11, ordinary Americans were taught and told we live in a dangerous world filled with “alien” terrorists, and that we had to submit to authority to combat and defeat those “aliens.”
Some recent sci-fi films, I believe, have come to celebrate the military, its weaponry, and its can-do spirit of “warriors.” They’ve played it safe, in other words. In some cases, film makers may have curried favor with the Pentagon as a way of securing military cooperation in filming: access to bases, to advanced technologies such as the F-22 and F-35 jet fighters, and so on. It makes their films “sexier” to have such access.
I’m sure some would say, So what? What’s wrong with a summer blockbuster that portrays military action in a favorable light? To that I’d say: reel war is nothing like real war. The best science fiction films — or the memorable ones — inspire us to dream of bettering ourselves as individuals and as a species. And I think the best films still seek to challenge us to be more noble, more benevolent, more compassionate.
TheoFantastique: How do you feel as a retired Air Force officer about current science fiction’s perspective on the U.S. military?
Bill Astore: I have mixed feelings. On the one hand, I’m glad that films are not universally anti-military. On the other hand, I’m upset that many films tend to glorify battle and war. War often looks very sexy and exciting in today’s crop of sci-fi action flicks. We need to remember that war is bloody awful, and that lasers and light sabers would not make it any less awful.
Check out TheoFantastique, a meeting place for myth, imagination, and mystery in pop culture.
When the Challenger blew up thirty years ago this January, I was a young Air Force lieutenant working an exercise in Cheyenne Mountain Command Center near Colorado Springs, Colorado. I remember the call coming in to the colonel behind me. I heard him say something like, “Is this real world?” In other words, is this really happening, or is it part of the exercise? The answer at the other end was grim, our exercise was promptly cancelled, and we turned on the TV and watched the explosion.
Our initial speculation that day was that an engine had malfunctioned (the explosion appeared to have occurred when the shuttle’s engines were reaching maximum thrust). But it turned out the shuttle had a known technical flaw that had not been adequately addressed. Something similar would happen to the Columbia in 2003: a known technical flaw, inadequately addressed, ended up crippling the shuttle.
When I taught a course on “technology and society” at the collegiate level, I had my students address the non-technical causes of the Challenger and Columbia disasters. Here is the question I put to them in the course syllabus:
NASA lost two space shuttles: the Challenger in 1986 and the Columbia in 2003. Tragically, both these accidents were preventable. Both had clear technical causes. In 1986, faulty O-rings on the solid rocket boosters allowed gas to escape, leading to an explosion of the center fuel tank. In 2003, insulation foam that detached from the shuttle upon liftoff damaged the heat insulation tiles that protect the shuttle from the intense heat of reentry, leading to internal explosions as the Columbia reentered the atmosphere.
Both accidents also highlighted wider issues involving risk management, institutional culture, and control of highly complex machinery. Before each accident, NASA engineers had warned managers of preexisting dangers. In the case of the Challenger, it was the risk of launching in low temperatures, as shown by previous data of gas leakage at O-ring seals when the air temperature was below sixty degrees Fahrenheit. In the case of the Columbia, visual data suggested the shuttle had sustained damage soon after liftoff, a fact that could have been confirmed by cameras and/or a space walk. In both cases, managers overruled or disregarded the engineers’ concerns, leading to catastrophe.
Question: What do you think were the key non-technical factors that interacted with the technical flaws? What lessons can we learn from these accidents about controlling complex technical systems?
I wanted my students to focus on issues such as group think, on management concerns about cost and schedule and how those might cloud judgment, on the difficulty of managing risk, on the possibilities of miscommunication among well-intentioned people operating under stress.
I ended the lesson with a quote from Richard Feynman, the Nobel Prize winning scientist who had served on the Challenger board of inquiry after the accident. Feynman’s honest assessment of the critical flaws in NASA’s scheme of management was shunted to an appendix of the official report. It’s available in his book, “What Do You Care What Other People Think?”
This is what Feynman had to say:
“For a successful technology, reality must take precedence over public relations, for Nature cannot be fooled.”
It was a devastating conclusion – a much needed one then, and arguably even more needed today.
America’s thinking about military action is impoverished. The U.S. military speaks of precision munitions and surgical strikes, suggesting a process that is controllable and predictable. Experts cite Prussian military theorist Carl von Clausewitz for his axiom that war is a continuation of political discourse with the admixture of violent means. Here, military action is normalized as an extreme form of politics, suggesting again a measure of controllability and predictability.
But what if war is almost entirely imprecise and unpredictable? What if military action and its impacts are often wildly out of line with what the “experts” anticipate? In fact, this is precisely what military history shows, time and time again, to include recent U.S. military actions in Iraq and Afghanistan.
U.S. military action essentially acts like hammer blows that upset the state of nature within the complex ecologies of societies like Iraq and Afghanistan. These blows ripple in unpredictable directions, creating new states of nature that change the ecologies of these societies in fundamental ways. They further generate fault lines that are often contrary to U.S. goals and interests.
Charles Darwin can lend a hand in explaining why this is so. Darwin is best known for his theory of evolution with its idea of “the survival of the fittest,” although Darwin did not use that term when he originally published The Origin of Species in 1859. Indeed, Darwin’s view of evolution was highly complex and multifaceted, as befits a man who studied the natural world in great detail for his entire adult life.
In an earlier, unpublished version of his masterwork, Darwin employed a complex image, known as the “wedge” metaphor, to explain interactions within the natural world that led to species extinction. Here is the way Darwin described “The Struggle for Existence” in his Notebook prior to The Origin of Species:
Nature may be compared to a surface covered with ten‐thousand sharp wedges, many of the same shape & many of different shapes representing different species, all packed closely together & all driven in by incessant blows: the blows being far severer at one time than at another; sometimes a wedge of one form & sometimes another being struck; the one driven deeply in forcing out others; with the jar & shock often transmitted very far to other wedges in many lines of direction: beneath the surface we may suppose that there lies a hard layer, fluctuating in its level, & which may represent the minimum amount of food required by each living being, & which layer will be impenetrable by the sharpest wedge.
In his model of the face of nature, Darwin showcases the interconnectedness of all species, together with the way in which changes to that face (the hammer blows) favor some species (wedges) while forcing out others. The hard layer, which represents the minimum amount of food for all, and which Darwin says cannot be penetrated, suggests an ecology that will continue to sustain life even as some species (wedges) are forced out and die off. The face of nature constantly changes, some species perish, but life itself endures.
How does Darwin’s wedge metaphor apply to military action? Consider, for example, U.S. airstrikes in the Middle East. They are the hammer blows, if you will, to the face of nature in the region. The wedges are various groups/sects/factions/tribes in the region. The U.S. believes its hammer blows will force out “bad” wedges, driving them toward extinction, which will ultimately improve the prospects of “good” wedges, such as so-called moderates in Syria. But what if U.S. blows (airstrikes and other violent military action) are driving radical sects (wedges) more deeply into the face of nature (in this case, the face of politics and society in the Middle East)? What if these radical sects, like Darwin’s driven wedges, are forcing out rival sects that are more moderate? What if the “jar & shock” of these U.S. military hammer blows is being propagated throughout Middle Eastern societies and Islam in ways that are as unpredictable as they are long-lasting?
Darwin’s complex wedge metaphor should make us think more deeply about the results of blows to complex, interconnected, and interdependent systems. Using military strikes in an attempt to destroy “bad” wedges may have the very opposite effect than the one intended. Instead of being destroyed, such wedges (such as the Islamic State) are driven deeper into the ecology of their communities, helping them to thrive, even as they send out vibrations “in many lines of direction” that harden the new ecology of the region against U.S. interests.
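The dynamics described above can be sketched as a toy simulation. This is purely illustrative, with every number invented for the sketch, and the geometry is inverted slightly: here “depth” measures how firmly a wedge is seated, each blow drives one wedge deeper while transmitting half its force outward to each neighbor, and a wedge whose depth falls to the hard layer has been forced out.

```python
import random

# Toy sketch of Darwin's "wedge" metaphor (all parameters invented for
# illustration): N wedges share a crowded surface; each blow seats one
# wedge more firmly while the "jar & shock" unseats its neighbors.
random.seed(42)

N_WEDGES = 20
depths = [1.0] * N_WEDGES  # how firmly each wedge (species/faction) is seated
HARD_LAYER = 0.0           # the floor below which a wedge counts as forced out

for blow in range(500):
    struck = random.randrange(N_WEDGES)
    force = random.uniform(0.1, 0.5)
    depths[struck] += force  # the struck wedge is driven deeper...
    # ...while the shock is transmitted to its neighbors, unseating them.
    # Half the force goes to each side, so the crowded surface is zero-sum.
    for neighbor in (struck - 1, struck + 1):
        depths[neighbor % N_WEDGES] -= force / 2

survivors = sum(d > HARD_LAYER for d in depths)
print(f"{survivors} of {N_WEDGES} wedges still seated after 500 blows")
```

Even in this crude model, blows aimed at one wedge reshape the whole surface: total “seatedness” is conserved, so every wedge driven deeper displaces others, some past the point of recovery — which is the metaphor’s warning about hammer blows aimed at “bad” wedges.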
What, then, to make of Darwin’s “hard layer” in his wedge metaphor, which varies in its level but which persists in that no wedge may penetrate it? The “hard layer” represents that which all wedges can’t do without. All species are dependent on a source of food and energy, a source of sustenance to sustain reproduction. Darwin notes that the hard layer fluctuates, and though he doesn’t explicitly state it, those fluctuations must also act much like blows, displacing some wedges while favoring others with effects that ripple across the face of nature.
Rise or fall, the “hard layer” persists, meaning life on earth persists, even as individual species perish. Darwin explicitly states that no wedge can penetrate the hard layer, but here his metaphor breaks down when we consider humans as a wedge. Because humans can and do penetrate that layer. As a species, we do have the capacity to damage, even to destroy, the hard layer of nature upon which all species are dependent. We’re the killer wedge in the wedge metaphor.
Politically speaking, piercing that hard layer in the Middle East would be equivalent to igniting a new Crusade that leads to world war, one involving nuclear weapons or other forms of WMD. Devolution in place of evolution.
Of course, one shouldn’t push any metaphor too far. That said, Darwin’s “wedge” metaphor, in its imagery and subtlety, is more useful in understanding the complexity and unpredictability of military action than analogies that reduce war to exercises in precision surgery or power politics.
William Astore is a retired lieutenant colonel (USAF) and former professor of history who edits the blog The Contrary Perspective.
Twenty-five years ago, I wrote the following paper for a class in the history of technology. Back then, chlorofluorocarbons (CFCs), acid rain, and global warming were the issues highlighting the drawbacks of technology. CFCs were damaging the ozone layer, acid rain was poisoning our lakes and streams and damaging trees, and the buildup of greenhouse gases loomed as a future threat. The future is now, of course, since we’ve done virtually nothing to address global warming. If anything, the debate in 1989 was far more sober, since back then there were no “climate change deniers.”
Written at the tail end of the Cold War, my paper from 1989 is colored by the threat of nuclear annihilation, another threat (like acid rain and CFCs) that has abated in the last two decades. Reason for hope, perhaps?
Yet in those 25 years, technology has only proliferated even as compassion for those less fortunate has declined. I wrote this paper before there was an Internet and World Wide Web, before cell phones and smart phones became ubiquitous, before we had so much conclusive evidence of the dangers of man-accelerated global warming. I was attempting to argue that scientists and engineers had an obligation to consider the larger impact of their work, to include the moral implications of their research.
I’ve made one major change to this paper as written 25 years ago. Back then, I concluded with the idea that an ethics based on Christianity needed to inform the work of scientists and engineers. Today, this argument seems far too parochial and limiting, so I have removed it.
Technology and the Role of Scientists and Engineers in Modern Society (1989)
What is the proper role of scientists and engineers in modern society? This question is especially relevant today, as can readily be confirmed by opening the September 1989 special issue of Scientific American entitled “Managing Planet Earth.” Technology, it seems, has spawned many monsters: chlorofluorocarbons that tear holes in our protective ozone shield, factory smoke that turns our rain acidic, carbon dioxide that threatens to convert our planet into one big greenhouse. The contributors to Scientific American assert that humanity must regain control over technology before its monsters inflict irreparable damage to the earth.
Defenders of technology, not surprisingly, advance the opposite thesis. Samuel Florman, an engineer and the author of Blaming Technology, counters that “technology is still very much under society’s control, that it is in fact an expression of our very human desires, fancies, and fears.” In Florman’s opinion, engineers should dedicate themselves to doing works for the good of society, but they should not try to define what is good for society. Their mission, Florman holds, is to achieve rather than to set society’s goals.
Florman does not exonerate engineers from all responsibility, however. He asserts that engineers must be guided by their individual consciences, but he also suggests that society should not expect any “special compassion” from its engineers. In fact he implies that society must resign itself to emotionally-detached engineers: “If we accept the single-minded dedication of ballet dancers and other artists,” Florman analogizes, “we should be able to accept, however regretfully, the same characteristic in a number of scientists and engineers.”
But a serious flaw lies at the heart of Florman’s plea for the sanctity of the engineering profession. He disregards the vastly different societal roles of artists versus scientists and engineers, as well as the serious dangers of a powerful technical elite. The philosopher Hannah Arendt noted these dangers in the context of atomic experimentation:
The simple fact that physicists split the atom without any hesitations … although they realized full well the enormous destructive potentialities … demonstrates that the scientist qua scientist does not even care about the survival of the human race on earth or, for that matter, about the survival of the planet itself.
Arendt makes an important point here. Scientists and engineers sometimes pursue their interests even when they threaten the survival of humanity (or themselves for that matter). Evidence from the Manhattan Project lends credibility to this argument. Most scientists who worked on the project were too caught up in the technical challenges of building the atomic bomb to entertain moral qualms about the bomb’s purpose. Robert R. Wilson, the leader of the cyclotron group during the Project, observed that he never considered quitting:
We were the heroes of our epic, and there was no turning back. We were working on a problem to which we were completely committed; there was little time to re-examine our moral position from day to day.
The atomic bomb was the grail for these knights of science; they focused on their pursuit and little else. Perhaps they believed they could wash their hands clean of the stains of Hiroshima and Nagasaki, for they neither made the decision to drop the bombs nor did they pilot the planes. Yet they could not deny that it was their expertise that brought humanity to the brink of its own destruction during the Cold War.
So what does our nuclear heritage teach us? It teaches us that humanity needs a more humane technology and more humane engineers. In sum, we need a new purpose for technology, one that is inspired by social and humanitarian concerns.
Jules Verne captured the risk of failing to do so. “If men go on inventing machinery, they’ll end by being swallowed up by their own inventions,” Verne prophesied. There are still some people, however, who continue to believe that technological advances themselves will eliminate technology’s harms. Charles F. Kettering, a remarkably inventive General Motors executive and a quintessential company man, captured this idea. In Paul de Kruif’s words, Kettering felt that
You cannot put the brakes on any discovery … you’ve got to go on with it even if we’re all blown to hell with it. What you should do is step up the study of human nature, you may even find a chemical, a vitamin, a hormone, a simple pill to take the devil out of human nature….
Here one cannot help but be reminded of Aldous Huxley’s Brave New World, where another automotive engineer, Henry Ford, was god, morality was but a faint memory, and drugs were the panacea for human ills.
Elting Morison, in Men, Machines, and Modern Times (1984), suggests that since technology forces humanity into its categories, humanity has no choice but to create a new culture to accommodate it. He proposes that a series of small experiments be performed worldwide, with “man as the great criterion” (or, perhaps more accurately, the great guinea pig). A successful experiment, apparently, will be one in which humans thrive; an unsuccessful one, presumably, will be one in which humans “break down.” Rather oddly, Morison believes the military provides us with the paradigm of how to proceed. In his words:
They [the military] have the nuclear weapon that has fulfilled the exaggerated extreme toward which the system always tends … But for practical purposes they have created around this extreme a whole arsenal of carefully graded instruments of limited destruction – old-fashioned armaments of lesser power and new weapons of modulated nuclear energy.
It’s shocking how Morison waxes nostalgic over those “old-fashioned” weapons, and his addition of “modulation” to atomic bombs makes them seem downright cozy. As George Orwell observed in his famous 1946 essay “Politics and the English Language,” “such phraseology is needed if one wants to name things without calling up mental pictures of them.” Thus cluster bombs that send shrieking hunks of shrapnel through the air, napalm that sears lungs and burns human skin, and atomic artillery shells that annihilate armies (but not cities, we hope) become, for Morison, “modest examples of how to begin to proceed.”
A more pessimistic prospectus for the future of technology is held by Arnold Pacey in The Maze of Ingenuity (1980). For Pacey, history reveals that technology cannot “easily accommodate the broad aims and the mixture of human and technical factors which a socially-orientated direction of progress in technology … require[s]. Thus the efforts made to encourage a more directly social form of technical progress … have been relatively ineffective.”
Pacey attributes this failure to the dominance of the mechanical world view. Beginning with Galileo, Pacey maintains, scientists and engineers restricted their own view of the world, blinding themselves to the larger purposes of technology.
Pacey does more than lament, though. He offers several potential solutions, all of which seem flawed. He assumes that new, less destructive, technologies are needed to meet human needs, or to ease poverty, yet the world currently has enough resources to end poverty, and present technology could doubtless be used more constructively. Pacey also unconsciously undermines his argument by citing education and medical care as “examples of how continuous improvement is possible without any large accompanying drain on material resources.” Unfortunately for Pacey, both education and medical care are currently (and rightly) under siege in this country. Despite large sums of money spent and countless reform proposals, education remains mediocre, while medical care remains compassionless and costly.
No wonder Pacey despairs. He half-heartedly mentions other potential balms, e.g. critical science, which pursues “careful, rigorous researches into the relationship between technical innovation, nature and society,” and general systems theory, yet it is unclear from reading Pacey how critical science differs from general systems theory. In the end, Pacey supplies the reader with little in the way of hope, for he despondently observes that systems theory is corruptible.
In the end, we’re left with today’s dehumanizing technological imperative, as described by Carlo Cipolla, a noted historian of technology, in this passage:
Each new machine … creates new needs, besides satisfying existing ones, and breeds newer machines. The new contrivances modify and shape our lives and our thoughts; they affect the arts and philosophy, and they intrude even into our spare time.
To prevent this dominance of the machine, science and technology need to serve social and humanitarian needs more directly. In “Thinking about Human Extinction,” George Kateb holds that individuals must attach themselves first and foremost to existence. This attachment “cannot be cultivated by way of a theology that bestows [from the outside] meaning or worth on existence,” and it must be able to withstand “all temptations to go along with policies that may lead to human and natural extinction.”
Existence is justified by a sense of beauty; specifically, Martin Heidegger’s wonderment at the very indefiniteness of existence. For Kateb, “because there could have been earthly nothingness … one must finally attach oneself to earthly existence, whatever it is, and act to preserve it … [To this end] persons must be schooled in beauty to acquire the disposition to sustain wonder that there is earthly existence rather than none.” In sum, we must learn to revel in the very fact of humanity’s existence against the longest of cosmic odds.
In a world that grows ever more fragile with each passing day, an appreciation for the fragility of our existence, as well as an abiding compassion for humanity, is exactly what we need from our scientists and engineers.
Sources in order of citation
Samuel C. Florman, Blaming Technology: The Irrational Search for Scapegoats (New York: St. Martin’s Press, 1981).
Hannah Arendt, “A Symposium on Space: Has Man’s Conquest of Space Increased or Diminished his Stature?”, The Great Ideas Today 1963 (Chicago: Encyclopedia Britannica, Inc., 1963).
Robert R. Wilson, “The Scientists who Made the Atom Bomb,” Science, Conflict and Society (San Francisco: W.H. Freeman, 1969).
Jules Verne, Five Weeks in a Balloon (1862), quoted in James R. Newman, “The History and Present State of Science Fiction,” Science, Conflict and Society (San Francisco: W.H. Freeman, 1969).
Paul de Kruif, Life Among the Doctors (New York: Harcourt, Brace, 1949), p. 445, quoted in William Leslie, Boss Kettering (New York: Columbia University Press, 1983).
Elting E. Morison, Men, Machines, and Modern Times (Cambridge, Mass.: MIT Press, 1966, 1984).
Arnold Pacey, The Maze of Ingenuity: Ideas and Idealism in the Development of Technology (New York: Holmes/Meier, 1974, 1980).
Carlo M. Cipolla, Clocks and Culture 1300-1700 (New York: W.W. Norton & Co., 1978).
George Kateb, “Thinking about Human Extinction: (I) Nietzsche and Heidegger,” Raritan (Fall 1986), pp. 1-28.