Technology as Diversion from Social Inequality


W.J. Astore

Today, access to technology and its services is often associated with equality of opportunity in society.  In education, for example, getting computers and Internet service to low-income students is considered a vitally important step toward students’ maturation and the development of skill sets for a competitive global marketplace.  The “digital divide” must be bridged, else disadvantaged students will be stuck in the dark ages and left behind.  Focusing on technology as both “bridging” mechanism and source of enlightenment has the added benefit of being easily measurable and “correctable,” e.g. by increasing the number of computers per class, the number of connected classrooms, and so on.

Spending (or, as they say, “investing”) money on classroom technology, moreover, is obviously favored by tech companies both for present and future profits (raise a child on Apple devices and perhaps as adults they’ll always favor Apple).  Parents like it too: perhaps Johnny and Susie mainly play games on their school-provided iPads, but at least they’re occupied while “learning” computer skills.

Of course, the digital divide does exist, and computer skills are valuable.  But hyping access to technology is often a distraction from much bigger issues of inequality, as George Orwell noted back in the 1930s in “The Road to Wigan Pier.”

Back then, Orwell was concerned with electricity rather than computers and connectivity.  But what he says about electrification could be said about any technology presented as a panacea for social ills.

Here’s what Orwell wrote at the end of chapter 5 of his book:

And then there is the queer spectacle of modern electrical science showering miracles upon people with empty bellies. You may shiver all night for lack of bedclothes, but in the morning you can go to the public library and read the news that has been telegraphed for your benefit from San Francisco and Singapore. Twenty million people are underfed but literally everyone in England has access to a radio. What we have lost in food we have gained in electricity. Whole sections of the working class who have been plundered of all they really need are being compensated, in part, by cheap luxuries which mitigate the surface of life.

Orwell was rightly skeptical of technological “miracles” like electricity that were sold as mitigating fundamental inequalities such as access to healthy food and warm and adequate housing.  Empty bellies and empty prospects are not filled by instant news, whether via the telegraph and wireless radio or via the smartphone and wireless LAN.

The point is not to blame technology.  The point is to highlight technology as a choice, one that often doesn’t address fundamental inequities in society.

Trump: A Worrisome Commander-in-Chief

Trump holds a rally with supporters at the Suburban Collection Showplace in Novi, Michigan, U.S.
He doesn’t speak softly, even as he now inherits a very big U.S. military stick. REUTERS/Jonathan Ernst

W.J. Astore

I’d never watched a U.S. presidential candidate who scared me – truly scared me – until the Republican debate on March 3, 2016.  This candidate literally gave me the creeps.  As a historian and as a retired U.S. military officer, his answer to a question on torture and the potential illegality of his orders if he became the military’s civilian commander-in-chief horrified me.  The next day, I wrote a short blog post in which I argued that this candidate had disqualified himself for the presidency.  That candidate’s name was Donald Trump.

What did candidate Trump say that so horrified me?  He said this: They [U.S. military leaders] won’t refuse [my illegal orders]. They’re not going to refuse me. Believe me.  After again calling for waterboarding and more extreme forms of (illegal) torture, as well as not denying he’d target terrorists’ families in murderous reprisal raids, candidate Trump then said this: I’m a leader. I’m a leader. I’ve always been a leader. I’ve never had any problem leading people. If I say do it, they’re going to do it. That’s what leadership is all about.

As I wrote at the time, “Our military does not follow blindly orders issued by ‘The Leader.’ Our military swears an oath to the Constitution.  We swear to uphold the law of the land. We don’t swear allegiance to a single man (or woman) as president.”

“Trump’s performance … reminded me of Richard Nixon’s infamous answer to David Frost about Watergate: ‘When the president does it, that means it’s not illegal.’ No, no, a thousand times no.  The president has to obey the law of the land, just as everyone else has to.  No person is above the law, an American ideal that Trump seems neither to understand nor to embrace.”

“And that disqualifies him to be president and commander-in-chief.”

Yes, I wrote those words just before the Ides of March.  And yet here we are, with Trump as our president-elect and, come January 2017, the U.S. military’s next commander-in-chief.  What the hell?

Confronted with criticism of his remarks that the U.S. military would follow his orders irrespective of their legality, Donald Trump soon walked them back.  But for me his dictatorial instincts, his imperiousness, and, worst of all, his ignorance of or indifference to the U.S. Constitution, stood revealed in horrifyingly stark relief.  Little that Trump said or did after this major, to my mind disqualifying, gaffe convinced me that he was fit to serve as commander-in-chief.

Here’s what I wrote back in March about the prospect of Trump serving as commander-in-chief:

Donald Trump: Lacks an understanding of the U.S. Constitution and his role and responsibilities as commander-in-chief.  Though he has shown a willingness to depart from orthodoxies, e.g. by criticizing the Iraq War and the idea of nation-building, Trump’s temperament is highly suspect.  His bombast amplified by his ignorance could make for a deadly combination.  Hysterical calls for medieval-like torture practices are especially disturbing.

Another disturbing tack he took was to suggest that he’d clean house among the military’s senior ranks — apparently, America today doesn’t have enough men like George Patton and Douglas MacArthur, Trump’s all-time favorite generals.  Patton was a notorious hothead, and MacArthur was vainglorious, egotistical, and insubordinate.  Leaving that aside, Trump doesn’t seem to understand that the president is not a dictator who can purge the military officer corps. Officers are commissioned under authority granted by Congress and confirmed by the Senate, not appointed at the president’s pleasure, and they serve at the will of the American people, not at the whim of the president.

Combine Trump’s ignorance of the U.S. Constitution with his cavalier attitude toward nuclear weapons and you truly have a combustible formula.  Clearly, Trump had no idea what America’s nuclear triad was during the Republican primary debates, but few people in the media seemed to care.  (Gary Johnson, meanwhile, was pilloried by the press for not knowing about Aleppo.)  Trump made statements that seemed to favor nuclear proliferation and suggested he saw nuclear weapons as little different from conventional ones.  He also repeated that hoary chestnut, vintage 1960, that some sort of “missile gap” existed between the U.S. and Russia: the lie that Russia was modernizing its nuclear forces and the USA was falling hopelessly behind.  Again, there was little pushback from the press on Trump’s ignorance and lies: they were enjoying the spectacle and profits too much.

When it comes to nuclear war, ignorance and lies are not bliss.  Can Trump grow up?  Can he become an adequate commander-in-chief? America’s future, indeed the world’s, may hinge on this question.

A Century of Mass Slaughter

Big Bertha (wiki)

W.J. Astore.  Also featured at Huffington Post.

This August marks the 100th anniversary of the start of World War I. That “Great War” was many things, but it was most certainly a war of machines, of dreadnought battleships and “Big Bertha” artillery, of newfangled airplanes and tortoise-like tanks. Industrial juggernauts like Great Britain, France, and Germany succeeded more or less in mobilizing their economies fully for war; their reward was reaping the horrors of death-dealing machinery on a scale theretofore thought impossible.

In that summer of 1914, most experts expected a short war, so plans for sustaining machine-age warfare through economic mobilization were lacking. Confronted by trench warfare and stalemate on the Western Front, which owed everything to modern industrialism and machinery, the “big three” antagonists strove to break that stalemate using the means that had produced it: weapons and munitions. Those empires caught up in the war that were still industrializing, e.g. Russia, Austria-Hungary, the Ottoman Empire, found themselves at a serious disadvantage.

Together, Britain and France forged an industrial alliance that proved (with help from the U.S.) to be a war-winning “arsenal of democracy.” Yet this alliance contributed to an overvaluing of machines and munitions at the soldiers’ expense. For Entente leaders — even for old-school cavalry officers like Britain’s Field Marshal Sir Douglas Haig — new artillery with massive stockpiles of shells promised to produce the elusive breakthrough and a return to mobile warfare and glorious victory.

Thus it was that at the Battle of the Somme that began on July 1, 1916, British soldiers were reduced to trained occupiers. Lengthy pre-battle artillery barrages, it was believed, would annihilate German defenders, leaving British troops to slog uncontested across no-man’s land to occupy the enemy’s shattered and empty trenches.

But those trenches were not empty. Germany’s defenses survived Britain’s storm of steel largely intact. And Britain’s soldiers paid the price of misplaced faith in machine warfare: nearly 20,000 dead on that first day, with a further 40,000 wounded.

The Somme is but one example of British and French commanders being overwhelmed by the conditions of machine warfare, so much so that they placed their faith in more machines and more munitions as the means to victory. After underestimating the impact of technology on the battlefield up to 1914, commanders quickly came to overestimate it. As a result, troops were inadequately trained and tactics inadequately developed.

As commanders consumed vast quantities of machinery and munitions, they became accustomed to expending lives on a similarly profligate scale. Bodies piled up even as more economic means were tapped. Meanwhile, the staggering sacrifices required by destructive industrialism drove nations to inflate strategic ends. Industrialized warfare that spat out lead and steel while consuming flesh and bone served only to inflame political demands, negating opportunities for compromise. Total victory became the only acceptable result for both sides.

In retrospect it’s remarkable how quickly leaders placed their faith in the machinery of war, so much so that military power revved uncontrollably, red-lined, then exploded in the faces of its creators. Industrialized destruction and mass slaughter were the predictable outcomes of a crisis whose resolution was driven by hardware — more weaponry, more machinery, more bodies. The minds of the men who drove events in that war could not sanction negotiation or compromise; those were forms of “weakness” that neither side could accept. Such murderous inflexibility was captured in the postwar observation of novelist Virginia Woolf that “It was a shock to see the faces of our rulers in the light of the shell fire. So ugly they looked — German, English, French — so stupid.” Note how she includes her own countrymen, the English, in the mix of the ugly and the stupid.

In World War I, Carl von Clausewitz’s dictum of war as an extreme form of politics became tragically twisted to war as the only means of politics, with industrialized mass destruction as the only means of war. The resulting failure to negotiate a lasting peace came as no surprise since the war had raced not only beyond politics, but beyond the minds of its military and political leaders.

The Great War had unleashed a virus, a dynamic of destruction, that would only be suppressed, and even then only imperfectly, by the wanton destruction of World War II. For what was Auschwitz but a factory of death, a center for mass destruction, a mechanized and murderous machine for efficient and impersonal slaughter, a culmination of the industrialized slaughter (to include mass gassing) of World War I?

The age of mass warfare and mass destruction was both catalyst for and byproduct of the age of machinery and mass production. Today’s age is less industrial but no less driven by machinery and mass consumption (which requires a form of mass destruction inflicted largely on the environment).

Aerial drones and cyber warfare are already providing disturbing evidence that the early 21st century may yet echo its predecessor in introducing yet another age of misplaced faith in the machinery of warfare. The commonality remains the vulnerability of human flesh to steel, as well as human minds to manipulation.

A century has passed, yet we’re still placing far too much faith in the machinery of war.

The Pentagon’s Sky is Falling!

Secretary of Defense Chuck Hagel

W.J. Astore

You’ve probably seen the headline: Secretary of Defense Chuck Hagel is gutting the Army to numbers not seen since the sleepy days before Pearl Harbor!   Senior Republicans like Lindsey Graham and John McCain have already declared that these cuts are DOA (dead on arrival) in the Senate.  Why? Allegedly because they endanger our national defense.  Naturally, such claims are often politically motivated.  Former Vice President Dick Cheney has already gone on record as claiming that President Obama prefers to fund food stamps and other social entitlement programs to funding the military at adequate levels.

Should we be worried?  Conor Friedersdorf has an excellent article at The Atlantic to explain why Hagel’s proposed cuts to today’s Army should not be compared to the Army’s end strength in 1940.  The U.S. military has obviously changed greatly since then.  Today, the military relies much more on technologies that serve as “force multipliers.”   There is simply no military on the planet as high-tech and capable of projecting power as the U.S. military.  Moreover, because we’re not fighting simultaneous wars in Iraq and Afghanistan, we simply no longer need as many soldiers in the Army as we did during the Surge years.

Today’s military is far less concerned with end strength than it is with capabilities.  I recall talking to an Army lieutenant colonel and experienced battalion commander in Iraq.  He explained that one of his infantry companies (of approximately 100 men) could easily defeat an enemy battalion (of approximately 500 men). He wasn’t boasting; just stating facts.  An American company, assuming it could tap its technology as well as all of its fire support units (artillery, helicopters, and close air support from the U.S. Air Force), would simply move faster and hit harder and more accurately than its enemy.  Again, it’s not about numbers; it’s about capabilities.

The US military is enormously powerful.  Its naval and air assets are second to none.  So is its ability to hit hard at a distance.  So is its equipment — its force multipliers — from divisional/brigade levels down to the platoon/squad level.   Reversing the old Soviet dictum, in this case quality has a quantity all its own.

To suggest that Hagel’s proposed retrenchment in Army end strength would return us to 1940 is the ultimate in ignorance — or the ultimate in deliberate disinformation for political gain.

America’s weakness has nothing to do with its military.  America’s weakness is the rampant dishonesty of its political discourse.  Even adding a million soldiers to our Army’s rolls won’t fix that.

Is the Digital World Too Ephemeral?

Give me hardcopy!

W.J. Astore

A concern I have about the new borderless digital world is its ephemeral nature.  Even though I keep a blog and write a lot online, I still prefer books and hardcopy.  I clip newspaper articles.  I file them away and then occasionally resuscitate them and use them in class when I teach.

Hardcopy has a sense of permanence to it.  A certain heft.  Whereas our new digital world, as powerful as it is for instant access and personal customization, seems much more ephemeral to me.

I know similar complaints have been made throughout history.  The proliferation of books was deplored as leading to the decline of visual memory skills.  Television was equated with the end of civilization, with the medium becoming the message.

Perhaps what I’m truly lamenting is the slow decline of context, together with the erosion of deep memory.  The digital world we increasingly inhabit seems to encourage an ephemeral outlook in which history just becomes one damn thing after another.

To switch metaphorical images, the dynamism and flash of the digital world is much like a landscape with lots of beautiful shiny leaves and glistening flowers to attract our attention.

Yet, at least in our minds, the landscape is rootless.  Our gaze is enraptured, our minds are intrigued, but the moment is fleeting, and we fail to act.  We fail to act because we are entertained without being nurtured.

Let’s take smartphones, for example.  With their instant access to data, they seem to make us very smart indeed.  But access to knowledge (data recall) isn’t intelligence.  There’s simply no substitute for deep-seated intellectual curiosity and the desire to learn.

Smartphones are useful tools — a gateway to a dynamic digital world. But they’re not making us any smarter.  Perhaps they’re helping us to connect certain dots a little faster.  But are we connecting them in the right way?  And are they the right dots to connect?

Those are questions that smartphones can’t answer.  Those are questions that require deep, contextual thinking.  And group discussion. Think Socrates and his followers, debating and discoursing. And acting.

Sometimes it’s best to disconnect from the matrix, find a quiet place for reflection, sink down some roots, and hit the books.  Then find other informed people and bounce your ideas off them.  Collisions of minds in informed discourse. Competing ideas feed the completing of actions for the common good.

As the Moody Blues might say, it’s a question of balance. The astral planes of the digital world can open new vistas, but let’s not forget the need to return to earth and get things done.

STEM Education Is Not Enough

Sir Peter Medawar

W.J. Astore

If you’re in education, you’ve heard the acronym STEM. It stands for science, technology, engineering, and mathematics.  As a country, the USA is behind in STEM, so there are lots of calls (and lots of federal money available) for improvements in STEM.  Usually the stated agenda is competitiveness.  If the US wants to compete with China, Japan, Europe, India, and other economies, our students must do better in science and math, else our economy will atrophy.

Here’s a sample rationale that can stand in for hundreds of others: “International comparisons place the U.S. in the middle of the [STEM] pack globally,” said Debbie Myers, general manager of Discovery Communications.  And for corporate managers like Myers, that’s not good enough when competition in the global market is both endless and the means to the end, the end being profit.

I’m all for STEM.  I got my BS in mechanical engineering and worked as an engineer in the Air Force.  I love science and got my master’s and Ph.D. in the history of science and technology.  I love science fiction and movies/documentaries that explore the natural world around us.

And that’s one thing that bugs me about all this emphasis on STEM.  It’s not about curiosity and fun; it’s not even about creativity.  STEM is almost always pushed in the US in terms of market competitiveness.  STEM, in other words, is just another commodity tied to profit in the marketplace.

My other bugaboo is our educational establishment’s focus on STEM to the exclusion of the humanities.  At the same time as the humanities are undervalued, STEM is reduced to a set of skills as mediated and measured by standardized tests.  Can you solve that equation?  Can you calculate that coefficient of friction? Can you troubleshoot that server?  Results, man.  Give me results.

Sir Peter Medawar, a great medical researcher and a fine writer on science, spoke of scientific discovery as an act of creation akin to poetry and other so-called liberal arts.  Nowadays, we simply don’t hear such views being aired in US discourse.  STEM as an act of creation?  As a joyful pursuit? Bah, humbug.  Give me results.  Give me market share.  Make me Number One.

If we as a nation want to encourage STEM, we should be focusing not on rubrics and metrics and scores.  We should instead be focusing on the joy of learning about nature and the natural world. How we model it, manipulate it, understand it, and honor it by preserving it.  STEM, in other words, must be infused with, not divorced from, the humanities.  Why?  Because STEM is a human pursuit.

As we pursue STEM, we should also honor our human past, a past in which we’ve learned a lot about ethics, morality, and humane values.  The problem is that STEM education in the US is often present- and future-focused, with little time for the past.

In American society, those with respect for old ways and traditional values are often dismissed as Luddites or tolerated as quaint misfits (like the Amish).  After all, Luddites aren’t competitive. And Amish quilts and buggies won’t return America to preeminence in science and technology.  The US as a nation has nothing to gain from them.  Right?

Here’s the problem.  We connect STEM to material prosperity.  We dismiss those who question all this feverish attention to STEM as anti-science or hopelessly old-fashioned.  But there’s a lot we can learn from the humanities about ourselves and our world.

To cite just one example: Consider this passage from Jacob Burckhardt, a great historian writing during the industrial revolution of the late 19th century:

“material wealth and refinement of living conditions are no guarantee against barbarism. The social classes that have benefited from this kind of progress are often, under a veneer of luxury, crude and vulgar in the extreme, and those whom it has left untouched even more so. Besides, progress brings with it the exploitation and exhaustion of the earth’s surface, as well as the increase and consequent proletarianization of the urban population, in short, everything that leads inevitably to decline, to the condition in which the world casts about for ‘refreshment’ from the yet untapped powers of Nature, that is, for a new ‘primitiveness’ – or barbarism.”

What a party-pooper he was, right? Most of what the US defines as STEM is about “material wealth” and “refinement of living conditions,” the very definition of “progress,” at least for those out to make a buck off of it.

Burckhardt was warning us that “progress” tied to STEM had its drawbacks, to include the exhaustion of the earth’s resources as well as the exploitation of human labor. Divorced from ethics and morality, STEM was likely to lead to “primitiveness,” a new barbarism.

Tragically, Burckhardt was right. Consider the industrialized mass murder of two world wars. Consider the “scientific” mass murder committed by the Nazis. (By the way, the Nazis were great at STEM, valuing it highly.)

In a democracy, STEM divorced from the humanities is not “competitive,” unless your idea of competition is barbaric. Disconnected from humane values, a narrow education in STEM will serve mainly to widen the gap between the 1% and the rest of us while continuing to stretch the earth’s resources to the breaking point.

Education in STEM, in short, is not enough. But you won’t learn that by listening to corporate CEOs or presidents prattle on about competitiveness.

For that wisdom, you need to study the humanities.

Of MOOCs and Technology: Why True Education Is Not Content Delivery

Robin Williams in "Dead Poets Society"

W.J. Astore

Massive open online courses (MOOCs) are one of those “pedagogical practices that are current and relevant to the new generation of learners,” to use a description featured prominently in promotional literature. Sure sounds trendy, doesn’t it? But education is not simply about content delivery. Education is about inspiration. It’s about lighting a fire in the mind (and maybe the belly too). Call me skeptical, but I don’t think a MOOC can do that.

OK, I haven’t tried a MOOC, but I have experienced distance learning. As a military officer, I took ACSC (Air Command and Staff College) by “correspondence.” The Air Force sent me the books and study materials, I did the reading and studying — and learned absolutely nothing. Why? First you memorized content, then you took multiple-choice tests to measure your “mastery” of that content. I passed with flying colors — and retained nothing.

As a professor I’ve also advised a graduate student via distance learning. It was an adequate experience for both of us, but we never met. The mentoring experience was impoverished. I felt little connection to the student, and I’d wager he felt little connection to me.

Distance learning and MOOCs reduce education to content delivery. And they require an exceptional student to get the most out of them. When I query my students in class about on-line courses, most of them are ambivalent or opposed to them. When they favor them, they say things like: “It was easy to skate by” or “I took it only because it fit my work schedule.”

To be blunt, administrators are looking for ways to reduce costs, and on-line learning is being pushed for that very reason. No classrooms needed. Little or no cost for electricity, facilities, classroom materials and the like. Combine cost-cutting imperatives with growing privatization of education and you have a recipe for education delivered as a commodity driven by the profit motive.

What’s wrong with that, you say? Nothing. Just say “goodbye” to any radical or even fresh ideas being pushed by profit-driven vendors.

Even as we’re overvaluing MOOCs and distance learning, we’re overhyping glitzy technology in the classroom. When it’s appropriate, I use technology in the classroom, but not because I’m trying to be trendy, i.e. not because I think Twitter or tablets or other gimmicks and gizmos are how you “connect” with today’s students.

Indeed, exactly because my students are perpetually staring at screens, I often use an old-school approach of engaging them in class with vivid stories and amusing anecdotes and open-ended discussion.

Today’s students don’t need more technology; they don’t need more PowerPoint and computer-based learning platforms. What they need are enthusiastic and talented and creative teachers and professors who see education not as a job but as a calling.

I bet every person reading this remembers a teacher or professor who truly inspired them. And I bet he or she did so without glitzy technology and without genuflecting before “current pedagogical practices.”

My father was fond of saying, “The more things change, the more they remain the same.” Give me passion in the classroom. Give me a teacher who throws off sparks, and students with combustible minds. Give me that, and I’ll show you true education.

An Addendum: After writing this, I came across a Northeastern University survey featured at the Chronicle of Higher Education that addressed MOOCs, among other issues.  This is what the survey found:

“Slightly more than half of the respondents believe that MOOCs will fundamentally transform how students are taught, but just 27 percent think the online classes are of the same quality as traditional, in-person education. And yet more than half of the respondents predicted that in five to seven years an online education would be seen as of equal quality to a traditional one.”

So whatever I think about MOOCs, I think it’s fair to say that they are here to stay, and that their influence and reach will continue to grow.

Astore writes regularly for TomDispatch.com and The Contrary Perspective and can be reached at wjastore@gmail.com.