I was educated in public schools by dedicated teachers in the pre-digital age. My teachers read books to me and had me read books. I learned math, partly by rote, but also through friendly student competitions. Science I learned by doing, like chemistry with Bunsen burners and test tubes. I had classes in art and music, and even though I had little talent in drawing or playing an instrument, I still learned to appreciate both subjects. My high school was big and diverse, so I took electives in courses I really enjoyed, like science fiction, photography, even a course in aquariology, in which I built my own aquarium. And I must say I’m glad there wasn’t the distraction of Facebook, Instagram, Snapchat, and similar social media sites to torment me; video games, meanwhile, were in my day still crude, so I spent more time outside, playing tennis, riding my bike, hanging with friends, being in the world and nature (fishing was a favorite pursuit).
When I was a teen, we learned a lot about history and civics and the humanities. We spent time in the library, researching and writing. I took a debate course and learned how to construct an argument and speak before an audience. When I graduated from high school, I felt like I had a solid grounding: that I knew enough to make educated choices; that I could participate as a citizen by voting intelligently when I was eighteen.
Something has happened to education in America. You can see it in the big trends that are being hyped, including STEM, vocational training, computers and online courses, and privatization (charter schools). What suffers from these trends is the humanities, the arts, unionized teachers, critical and creative thinking skills, and, most especially, civics and ethics.
STEM is all about science, technology, engineering, and mathematics. My BS is in mechanical engineering and I love science and math, so I’m sympathetic to STEM classes. The problem is how STEM is justified – it’s usually couched in terms of keeping America competitive vis-à-vis other nations. STEM is seen as a driver for economic success and growth, a servant of industry, innovation, and profit. It’s not usually sold as developing critical thinking skills, even though STEM classes do help to develop such skills.
From STEM we turn to vocational training. Many students seek a career, of course, and not all students wish to go to a four-year college, or to college, period. But once again vocational training is mainly justified as a feeder to business and industry. It’s often reduced to education as training for labor, where the primary goal is to learn to earn. It may produce decent plumbers and welders and electricians and the like, but also ones who are indoctrinated to accept the system as it is.
In The Baffler, Tarence Ray has an article, “Hollowed Out: Against the sham revitalization of Appalachia.” Ray critiques ARC (the Appalachian Regional Commission) in the following passage that resonated with my own experiences teaching at a vocational college:
“The ARC [in the late 1960s and early ‘70s] also placed a lot of emphasis on career and vocational education. This appealed to President Nixon, who was desperate to counteract the student activism of antiwar and environmental groups. ‘Vocational education is more politically neutral,’ one White House aide put it. But it was also advantageous for the multinational corporations who controlled Appalachia’s coal resources and most of its institutions of power–the goal was to create a workforce that was skilled but also obedient. An education in the humanities emphasizes critical thinking, which might lead to political consciousness, a risk that the ARC could not afford to take.” [emphasis added]
My dad liked the old saying: the more things change, the more they stay the same. A vocational education sounds good, especially to those in power. Doubtless young people need marketable skills. The shame of it all is that the final “product” of vocational colleges, skilled graduates who are “workforce-ready,” is by design a limited one: an obedient one. America needs active and informed citizens as well, and they need the skills and mindset to question their bosses, their so-called betters, because if they lack such a mindset, nothing will change for the better in our society.
Along with STEM and vocational training is an emphasis on computers and online courses. Nowadays most school administrators would rather fund computers and networked classrooms than raise pay for teachers. In fact, online courses are advertised as a way to replace teachers, or at least to reduce the number of full-time teachers needed on staff. But I question whether one can learn sociology or art or philosophy or ethics by taking an online course. And I remain skeptical of big “investments” in computers, SMART boards, and the like. They may have their place, but they’re no substitute for education that’s truly student-centered, and one that’s focused on civics and ethics, right and wrong.
The final trend we’re seeing is privatization, as with charter schools. The (false) narrative here is that teachers in unions are overpaid, unaccountable, and otherwise inflexible or incompetent. Somehow the magical free market will solve this. If only one could get rid of unions while privatizing everything, all will be well in America’s schools. Private corporations, driven by profit and “efficiency,” will somehow produce a better product, a word I choose deliberately, for they see education as a product. And while some charter schools have been innovative and effective, many others have failed, mainly because education isn’t education when it’s reduced to a “deliverable” – a commodity driven by and reduced to money.
At a time when the United States desperately needs critical and creative thinkers educated in the arts and humanities as well as STEM and vocational subjects, our schools and especially our legislators are rejecting their duty to serve democratic ideals, choosing instead to embrace business, industry, economic competitiveness, and obedience, all in service of the bottom line measured in dollars and cents. Now more than ever, America needs young people who are engaged civically and ethically, who value more than money and materialism. Yet many of our schools are pursuing a much different agenda.
For most Americans, patriotism means love of country. But I’d like to suggest this “love” is misplaced for three reasons. First, “country” is an imaginary construct. Second, patriotism is misused and abused by the powerful, most infamously by President Donald Trump. And third, we need a new form of patriotism, the love of the tangible, by which I mean our fellow human beings.
“Country” as an imaginary construct
“Imagine there’s no countries,” John Lennon wrote nearly fifty years ago. Generally, citizens of a given country insist they love their nation. But can one truly “love America,” or any other country or nation? For that matter can you love any state, city, town, or sports team?
In general semantics, a branch of linguistics that is itself a branch of philosophy, the word is not the thing, the map is not the territory. Canada, France, the Red Sox: these are only names, concepts, phenomena of consciousness. Or, if you adhere to the Western materialist worldview, patterns of neural activity in the brain.
Think about it: You can’t see, touch, feel, hear, or taste “France.” But you can taste a French pastry made in “France” and see and touch the Eiffel Tower. “Vive la France” does not mean that French people collectively are going to live a long life. In fact, the concept of France vanishes if there are no human beings left after, say, France is devastated by a massive nuclear attack.
Now, one can literally love the beauty of the land that comprises the legal territory of a given country. I love the mountains and the deserts of the Western U.S., the woods of northern Maine, the seacoasts of California. I love Fourth of July celebrations, the fireworks and cookouts. I even love the old Frank Sinatra song, “The House I Live In,” because it names things in America that you can put your hands on, such as the line “the ‘howdy’ and the handshake.” And then the concluding lyric, “that’s America to me.” (Notice the song doesn’t insinuate that an “America” exists out there; the phrase carries only symbolic meaning.)
Love of country, in short, is nonsensical because a country, a nation, is an abstraction, a conceptual phenomenon, a byproduct of mental processes, that has no existence in the material universe. Perhaps Lennon’s dream of “imagine there’s no countries” will only become reality when we no longer perceive people as enemies or opponents merely because they live elsewhere or look different.
The misuse and abuse of patriotism
Politicians and journalists tend to affirm, for obvious reasons, that it’s important to state how much you love America. Not to do so could easily result in your career or ambitions heading south. Still, proclaiming your love of country, whatever country that is, all too often has undesirable and destructive consequences. For instance, it becomes easier to support a government taking the country to war. Or colossal military budgets in the name of “defending” the “country.”
To an unreflective patriot the country is not seen as the sum of its parts but as a reality sui generis, perhaps symbolized by a father figure like Uncle Sam.
If I can make a sweeping generalization, among rural chauvinists “country” is part of the “God, Country, and Guns” trinity. The idea is well captured by Merle Haggard’s 1970 lyric: “When they’re runnin’ down our country, man/They’re walkin’ on the fightin’ side of me.”
President Trump’s recent call for members of the so-called squad, the four progressive Congresswomen of color, to “go back” to where they came from (a takeoff on “love it or leave it”) is one step away from “I will hurt you if I see you again.” Obviously, there is no place natural-born U.S. citizens can go back to. And even if they were not citizens by birth, why should they have to leave after having become U.S. citizens? Trump’s “patriotism” is racist nationalism – and shamelessly so.
Patriotism, in the narrow Trumpian usage of that word, demands opponents, sides, an “us versus them” mentality. And that’s a mentality calculated for division, distraction, and destruction.
We humans can’t see national borders from space, but we do see our planet. Our real “homeland.” Nevertheless, the false choice of “America: love it or leave it” has recently been revived from the days when protesters against the Vietnam War were denounced as unpatriotic. In truth, they were performing the most patriotic act imaginable, if patriotism is properly defined as love of one’s fellow human beings. In that sense, real patriotism is humanitarianism. It’s focused on humans and the home where we live, not on constructs that are insensible.
False patriotism may remain “the last refuge of a scoundrel,” as the 18th-century English writer Samuel Johnson observed. Even so, a literal belief in “my country, right or wrong” could still do us all in some sunny day. A dangerous myth, indeed.
Is the American male dead? I’ve seen enough articles and books espousing a “war” on men and boys, amounting to a concerted attack on masculinity, to suggest that males are, if not dead, very much in decline in America, threatened by a “feminized” society that devalues manly virtues.
An article at National Review, “Understanding the Inescapable Reality of Masculinity,” suggests that men as men have an “essential nature,” one that is “physical, aggressive, violent,” but that these traits are under attack as wider American society works to deny men their “inherent masculinity.” The article further argues there aren’t enough male role models in the lives of young boys – especially fathers and father-figures. This is a well-worn argument about the vital importance of the nuclear family with a man like Ward Cleaver in charge of it. There’s nothing wrong with that, except not all fathers are patient, kind, and intelligent mentors like Ward on “Leave It to Beaver.” Sadly, more than a few drive young boys to be aggressive and violent in selfish and dangerous ways.
Leaving that aside, it seems odd that this narrative of the decline of masculinity persists so strongly in Trump’s America. Now there’s a man! He’s physical, aggressive, unafraid to boast of pussy-grabbing or the size of his penis. He’s urged his followers at rallies to get physical with protesters. He supports torture and even hints at shooting immigrants as a rational “get tough” policy. Posing like Winston Churchill, he scowls and frowns in a simulacrum of manly determination. If the president is America’s chief role model, Trump’s doing his best to project masculinity as he understands it.
Indeed, you might argue Trump won the presidency in part because of his unapologetic “masculine” posing. Contrast this to Hillary Clinton, often portrayed as a “ball-buster,” an emasculating female. (I even had a Hillary nutcracker, a novelty gift from a friend.) Male voters (joined by a majority of White women) in 2016, perhaps looking for a “real” man to vote for and turned off by an alleged nut-cracking harridan, broke for Trump.
Trump’s win, and the continued tolerance of his bullying, boastful, and bellicose manner, gives the lie to the decline-of-masculinity narrative in America. Why does it persist, then? Because it’s yet another way to divide us. Consider similar narratives of an alleged war on Christianity, or claims that higher education is driven by hegemonic liberal/leftist agendas. In fact, Christianity is more powerful than ever in America (just look at Mike Pence and the influence of evangelicals in the U.S. government), and higher education is increasingly about serving the needs of business, industry, and the military-industrial complex.
But truth is unimportant when the object is stirring up divisiveness. Tell American men they’re threatened: that radical feminists, effete city dwellers, Ivy League elites, and other disreputable elements are out to get them. Then urge “threatened” males to vote for retrograde (fake) tough guys like Trump. It may not be the most subtle tactic, but it works.
In this narrative, masculinity is defined in “can-do,” action-oriented ways. Man as Alpha male, as doer, as fighter, whether in a bad way (as a killer) or in a good way (as a protector). It’s warrior- and empire-friendly. And indeed U.S. foreign policy today is distinctly masculine, with loads of emphasis on domination, on bossing other peoples around, simply because we’re bigger and badder than they are.
What’s truly worrisome is not false narratives about masculinity’s decline but how masculinity itself is narrowly defined in violent and aggressive ways. We forget that macho posturing by America’s “leaders” has created enormous problems. Just think of George W. Bush and all his macho strutting before and during the Iraq War.
America needs fewer calls about putting on “big boy” pants and more emphasis on engaging in negotiation and diplomacy, along with action to end America’s chaotic and unwinnable wars. America is already carrying a big stick. It can afford to speak softly instead of shouting.
Perhaps the profession that requires job security more than any other is teaching, especially college teaching. Tenure traditionally meant that a teacher or professor could be terminated only for moral turpitude (e.g., sexual abuse of students), blatant racism, unfair or unjust grading, gross incompetence, failure to obey basic institutional rules such as showing up to class on time, or not teaching the subject matter he or she was hired to teach. Nowadays, however, a vague “Just Cause” clause is often the standard for terminating a tenured faculty member. But “Just Cause” in any work contract is far too flexible an instrument for employers and far too vague for employees who rightly worry about job security.
Job insecurity prior to acquiring tenure and tenure granted with a “Just Cause” basis for termination of employment work to stifle academicians’ free expression of creative ideas, theories, and perspectives in and outside of the classroom. Any psychiatric or psychological clinician knows, or should know, that the threat of losing one’s livelihood produces stress and anxiety. Going to work each day, knowing your job is “contingent,” can become a dreaded and stressful experience.
Not only does academic tenure reduce or eliminate anxiety and stress: it ensures the free expression in the classroom of controversial and unorthodox ideas and pedagogical methods. Colleges, indeed all schools, should remain faithful to the ultimate purpose of education, to lead students out of darkness (e-ducere in Latin). It therefore should be difficult to dismiss a teacher or professor once that person has acquired tenure.
Alas, much has changed in the groves of academe. “Make America Great Again” has come to mean—long before Trump—make life easier for administrators of educational institutions, especially those who primarily view education as preparation for the world of work. Colleges and universities are top-heavy with administrators. In fact, it’s easier to find employment as an administrator than it is as a full-time faculty member.
Colleges are also becoming increasingly technocratic in their organizational structure. Form is becoming more important than content. The typical teacher/professor is expected to be virtually robotic in his/her performance. (God help a member of a college faculty nowadays who does not know the finer points of PowerPoint or refuses to use technology at all in the classroom.) Scores on multiple-choice faculty evaluations are more valued than what students are learning. The goal (often unstated) of pedagogy is to prepare students for becoming employees who will fit neatly and quietly into niches in the business and corporate world. Professors are subtly urged, sometimes threatened, to become unindicted co-conspirators in what appears to be the ultimate purpose of education in contemporary American society: to produce graduates who will unreflectively accept the status quo.
Today’s system of compromised tenure limits the ability of teachers/professors to encourage students to question and challenge the status quo. At its best, traditional tenure promoted an atmosphere in the classroom where teachers felt free to discuss contemporary political, social, and science/technology issues. Job security encouraged teachers to provide the cognitive tools for what Neil Postman called “crap detecting” (critical thinking) in his book, “Teaching as a Subversive Activity.” Education for Postman included the ability to distinguish reality from propaganda—and it often worked. For example, college-educated students were more likely to resist the draft, protest the Vietnam War, and oppose Richard Nixon’s invasion of Cambodia. In short, they questioned authority because they had the tools, mindset, and commitment to do so.
In his 1923 book, “The Goose Step: A Study of American Education,” Upton Sinclair had this to say regarding colleges and universities: “Suppose I was to tell you that this education machine has been stolen? That a bandit crew have got hold of it and have set it to work, not for your benefit, nor for the benefit of your sons and daughters, but for an end very far from these? That our six hundred thousand young people (supposedly in higher education) are being taught, deliberately and of set purpose, not wisdom but folly, not justice but greed, not freedom but slavery, not love but hate.” Worshiping or conforming to a socio-economic system based on the values and goals of capitalism is, according to Sinclair, the leading obstacle to an education that promotes democratic and humanitarian values.
Sinclair further argued that college professors should not “merely have job security” but also should have “collective control of that job.” He insisted that the faculty “must take from the trustees, and from the man they hired, the president, the greater part of their present functions.” Sinclair’s message is telling: It’s undesirable for democracy for administrators to treat professors as employees who are readily dismissible.
“Readily dismissible” is an apt description of adjunct/contingent faculty today. Adjuncts teaching college courses now outnumber full-time tenured faculty. On the adjunct level there is no job security from semester to semester. The academic goosestep is always outside the door.
Teachers on all levels of formal education have vital roles to play in getting all of us to question authority. How can they do that, however, when their jobs can be eliminated by administrators whose first loyalty is often to an establishment that sustains that authority? To challenge hegemonic social systems and structures, teachers and professors need job security. They need tenure. Is that why they’re not getting it?
Richard Sahn, a retired professor of sociology, taught at the collegiate level for four decades.
I know: who cares about the education of our kids as the redacted Mueller Report dominates the airwaves on CNN, MSNBC, and similar cable “news” networks?
I care. I spent fifteen years as a history professor, teaching mostly undergraduates at technically-oriented colleges (the Air Force Academy; the Pennsylvania College of Technology). What I experienced was the slow death of education in America. The decline of the ideal of fostering creative and critical thinking; the abandonment of the notion of developing and challenging young people to participate intelligently and passionately in the American democratic experiment. Instead, education is often a form of social control, or merely a means to an end, purely instrumental rather than inspirational. Zombie education.
Nowadays, education in America is about training for a vocation, at least for some. It’s about learning for the sake of earning, i.e. developing so-called marketable skills that end (one hopes) in a respectable paycheck. At Penn College, I was encouraged to meet my students “at their point of need.” I was told they were my “customers” and I was their “provider.” Education, in sum, was transactional rather than transformational. Keep students in class (and paying tuition) and pray you can inspire them to see that the humanities are something more than “filler” to their schedules — and their lives.
As a college professor, I was lucky. I taught five classes a semester (a typical teaching load at community colleges), often in two or three subjects. Class sizes averaged 25-30 students, so I got to know some of my students; I had the equivalent of tenure, with good pay and decent benefits, unlike the adjunct professors of today who suffer from low pay and few if any benefits. I liked my students and tried to challenge and inspire them to the best of my ability.
All this is a preface to Belle Chesler’s stunning article at TomDispatch.com, “Making American Schools Less Great Again: A Lesson in Educational Nihilism on a Grand Scale.” A high school visual arts teacher, Chesler writes from the heart about the chronic underfunding of education and how it is constricting democracy in America. Here she talks about the frustrations of classes that are simply too big to teach:
[Class sizes grew so large] I couldn’t remember my students’ names, was unable to keep up with the usual grading and assessments we’re supposed to do, and was overwhelmed by stress and anxiety. Worst of all, I was unable to provide the emotional support I normally try to give my students. I couldn’t listen because there wasn’t time.
On the drive to work, I was paralyzed by dread; on the drive home, cowed by feelings of failure. The experience of that year was demoralizing and humiliating. My love for my students, my passion for the subjects I teach, and ultimately my professional identity were all stripped from me. And what was lost for the students? Quality instruction and adult mentorship, as well as access to vital resources — not to mention a loss of faith in one of America’s supposedly bedrock institutions, the public school…
The truth of the matter is that a society that refuses to adequately invest in the education of its children is refusing to invest in the future. Think of it as nihilism on a grand scale.
Nihilism, indeed. Why believe in anything? Talk about zombie education!
What America is witnessing, she writes, is nothing short of a national tragedy:
Public schools represent one of the bedrock institutions of American democracy. Yet as a society we’ve stood aside as the very institutions that actually made America great were gutted and undermined by short-term thinking, corporate greed, and unconscionable disrespect for our collective future.
The truth is that there is money for education, for schools, for teachers, and for students. We just don’t choose to prioritize education spending and so send a loud-and-clear message to students that education doesn’t truly matter. And when you essentially defund education for more than 40 years, you leave kids with ever less faith in American institutions, which is a genuine tragedy.
Please read all of her article here at TomDispatch.com. And ask yourself, Why are we shortchanging our children’s future? Why are we graduating gormless zombies rather than mindful citizens?
Perhaps Trump does have some relevance to this article after all: “I love the poorly educated,” sayeth Trump. Who says Trump always lies?
About fifteen years ago, I wrote a short history of World War II for an encyclopedia on military history. I was supposed to be paid for it, but apparently the money ran out, though my article and the encyclopedia did appear in 2006. Having not been paid, I still own the rights to my article, so I’m posting it today, hoping it may serve as a brief introduction for a wider audience to a very complex subject. A short bibliography is included at the end.
Dr. William J. Astore
World War II (1939-1945): Calamitous global war that resulted in the death of sixty million people. The war’s onset and course cannot be understood without reference to World War I. While combat in the European theater of operations (ETO) lasted six years, in Asia and the Pacific combat lasted fourteen years, starting with the Japanese invasion of Manchuria in 1931. Unprecedented in scale, World War II witnessed deliberate and systematic killing of innocents. Especially horrific was Germany’s genocidal Endlösung (Final Solution), during which the Nazis attempted to murder all Jewish, Sinti, and Roma peoples, in what later became known as the Holocaust.
Rapid campaigns, such as Germany’s stunning seven-week Blitzkrieg (lightning war) against France, characterized the war’s early years. Ultimately, quick victories gave way to lengthy and punishing campaigns from mid-1942 to 1945. Early and rapid German and Japanese advances proved reversible, although at tremendous cost, as the Soviet Union and the United States geared their economies fully for war. The chief Axis powers (Germany, Japan, and Italy) were ultimately defeated as much by their own strategic blunders and poorly coordinated efforts as by the weight of men and matériel fielded by the “Big Three” Allies (Soviet Union, United States, and Great Britain).
Militant fascist regimes in Italy and Germany and an expansionist military regime in Japan exploited inherent flaws in the Versailles settlement, together with economic and social turmoil made worse by the Great Depression. In Germany, Adolf Hitler dedicated himself to reversing what he termed the Diktat of Versailles through rearmament, remilitarization of the Rhineland, and territorial expansion ostensibly justified by national representation.
Concealing his megalomaniac intent within a cloak of reasoned rhetoric, Hitler persuaded Britain’s Neville Chamberlain and France’s Édouard Daladier that his territorial demands could be appeased. But there was no appeasing Hitler, who sought to subjugate Europe from the Atlantic to the Urals, re-establish an African empire, and ultimately settle accounts with the United States. For Hitler, only a ruthless rooting out of a worldwide “Jewish-Bolshevik conspiracy” would gain the Lebensraum (living space) a supposedly superior Aryan race needed to survive and thrive.
Less ambitious, if equally vainglorious, was Italy’s Benito Mussolini. Italian limitations forced Il Duce to follow Germany. Disparities in timing made the “Pact of Steel,” forged by these countries in 1939, fundamentally flawed. The Wehrmacht marched to war in 1939, four years before the Italian military would be ready (it was still recovering from fighting in Ethiopia and Spain). Yet Mussolini persevered with schemes to dominate the Mediterranean.
Japan considered its war plans defensive and preemptive, although in scope they nearly equaled Hitler’s expansionist ambitions. The Japanese perceived the alignment of the ABCD powers (America, Britain, China, and the Dutch East Indies) as targeted directly against them. The ABCD powers, in contrast, saw themselves as deterring an increasingly bellicose and aggressive Japan. As the ABCD powers tightened the economic noose to compel Japan to withdraw from China, Japan concluded it faced two alternatives: humiliating capitulation or honorable war. Each side saw itself as resisting the unreasonable demands of the other; neither side proved willing to compromise.
Nevertheless, Japan looked for more than a restoration of the status quo. Cloaked in the rhetoric of liberating Asia from Western imperialism, Japanese plans envisioned a “Greater East Asian Co-Prosperity Sphere,” in which Japan would obtain autarky and Chinese, Koreans, and Filipinos would be colonial subjects of the Japanese master race. In their racial component and genocidal logic, made manifest in the Rape of Nanking (1937), Japanese war plans resembled their Nazi equivalents.
European Theater of Operations (ETO), 1939-1941
1939-1941 witnessed astonishing successes by the Wehrmacht. With its eastern border secured by the Molotov-Ribbentrop Non-Aggression Pact, Germany invaded Poland on 1 September 1939. Two days later, Britain and France declared war on Germany. As French forces demonstrated feebly along Germany’s western border, Panzer spearheads supported by Luftwaffe dive-bombers sliced through Poland. Attacked from the east by Soviet forces on 17 September, the Poles had no choice but to surrender.
Turning west, Hitler attacked and subdued Denmark and Norway in April 1940. By gaining Norway, Germany safeguarded its supply of iron ore from neutral Sweden and acquired ports for the Kriegsmarine and bases for the Luftwaffe to interdict shipping in the North Sea, Arctic, and North Atlantic. Throughout the preceding winter, Germany and France had faced each other passively in the Sitzkrieg, or Phony War.
Phony War gave way on 10 May 1940 to a massive German invasion of the Low Countries and France. A feint on the extreme right by Germany’s Army Group B in Belgium drew French and British forces forward, while the main German thrust cut through the hilly and forested Ardennes region between Dinant and Sedan. The German plan worked to perfection since the French strategy was to engage German forces as far as possible from France’s border. The Wehrmacht’s crossing of the Meuse River outpaced France’s ability to react. Their best divisions outflanked, the Franco-British army retreated to Dunkirk, where the Allies evacuated 335,000 men in Operation Dynamo. The fall of Paris fatally sapped France’s will to resist. The eighty-four-year-old Marshal Philippe Pétain oversaw France’s ignominious surrender, although the French preserved nominal control over their colonies and the rump state of Vichy.
Surprise, a flexible command structure that encouraged boldness and initiative, high morale and strong ideological commitment based on a shared racial and national identity (Volksgemeinschaft), and speed were key ingredients to the Wehrmacht’s success. Intoxicated by victory, the Wehrmacht’s rank-and-file looked on the Führer as the reincarnation of Friedrich Barbarossa. Higher-ranking officers who disagreed were bribed or otherwise silenced.
Hitler next turned to Britain, which under Winston Churchill refused to surrender. During the Battle of Britain the Luftwaffe sought air superiority to facilitate a cross-channel invasion (Operation Sea Lion). This goal was beyond the Luftwaffe’s means, however, especially after Hitler redirected the bombing from airfields to London. By October the Luftwaffe had lost 1887 aircraft and 2662 pilots as opposed to RAF losses of 1023 aircraft and 537 pilots. Temporarily stymied, Hitler ordered plans drawn up for the invasion of the Soviet Union. Stalin’s defeat, Hitler hoped, would compel Churchill to sue for peace.
Hitler’s victories stimulated Japan to conclude, on 27 September 1940, the Tripartite Pact with Germany and Italy. Japan also expanded its war against China while looking avariciously towards U.S., British, Dutch, and French possessions in Southeast Asia and the Pacific. Meanwhile, Mussolini, envious of Hitler’s run of victories, invaded Greece in October. The resulting Italo-Greek conflict ran until April 1941 and exposed the Italian military’s lack of preparedness, unreliable equipment, and incompetent leadership. Italian blunders in North Africa also led in Libya to Britain’s first victory on land. The arrival of German reinforcements under General Erwin Rommel reversed the tide, however. Rommel’s Afrika Korps drove British and Dominion forces eastwards to Egypt even faster than the latter had driven Italian forces westwards. Yet Rommel lacked sufficient forces to press his advantage. Meanwhile, German paratroopers assaulted Crete in May 1941, incurring heavy losses before taking the island. Events in the Mediterranean and North Africa soon took a backseat to the titanic struggle brewing between Hitler and Stalin.
The Eastern Front, 1941
After rescuing the Italians in Greece and seizing the Balkans to secure his southern flank, Hitler turned to Operation Barbarossa and the invasion of the Soviet Union. Deluded by his previous victories and a racial ideology that viewed Slavs as Untermenschen (sub-humans), Hitler predicted a Soviet collapse within three months. Previous Soviet incompetence in the Russo-Finnish War (1939-40) seemed to support this prediction. The monumental struggle began when Germany and its allies, including Hungary, Slovakia, Rumania, Bulgaria, Italy, and Finland, together with volunteer units from all over Europe, invaded the USSR along a 1300-mile front on 22 June 1941. The resulting death struggle pitted fascist and anti-Bolshevik Europe against Stalin’s Red Army. For Hitler the crusade against Bolshevism was a Vernichtungskrieg (war of annihilation). Under the notorious Commissar Order, the Wehrmacht shot Red Army commissars (political officers) outright. Mobile killing units (Einsatzgruppen) rampaged behind the lines, murdering Jews and other racial and ethnic undesirables.
The first weeks of combat brought elation for the Germans. Nearly 170 Soviet divisions ceased to exist as the Germans encircled vast Soviet armies. Leningrad was surrounded and endured a 900-day siege. But by diverting forces towards the vast breadbasket of the Ukraine and the heavy manufacturing and coal of the Donets Basin, Hitler delayed the march on Moscow (Operation Typhoon) for 78 days. By December, sub-zero temperatures, snow, and fresh Soviet divisions halted exhausted German soldiers on the outskirts of Moscow. A Soviet counteroffensive launched in early December threw Hitler’s legions back 200 miles, leading him to relieve two field marshals and 35 corps and division commanders. Hitler also dismissed the commander-in-chief of the army, Walter von Brauchitsch, and assumed command himself. His subsequent “stand fast” order saved the Wehrmacht the fate of Napoleon’s army of 1812, but this temporary respite came at the price of half a million casualties from sickness and frostbite.
A crucial Soviet accomplishment was the wholesale evacuation of its military-industrial complex. By November the Soviets disassembled 1500 industrial plants and 1300 military enterprises and shipped them east, along with ten million workers, to prepared sites along the Volga, in the Urals, and in western Siberia. Out of the range of the Luftwaffe, Soviet factories churned out an arsenal of increasingly effective weapons, including 50,000 T-34s, arguably the best tank of the war. Hitler now faced a two-front war of exhaustion, the same strategic dilemma that in World War I had led to the Second Reich’s demise.
Hitler arguably lost the war in December 1941, especially after declaring war on 11 December on the United States, which soon became the “arsenal of democracy” whose Lend-Lease policy shored up a reeling Red Army. Operation Barbarossa, moreover, highlighted a failure of intelligence of colossal proportions, as the Wehrmacht fatally underestimated the reserves Stalin could call on. As Franz Halder, chief of the army general staff, noted in his diary, “We reckoned with 200 [Soviet] divisions, but now [in August 1941] we have already identified 360.” As German forces plunged deeper into Soviet territory, they had to defend a wider frontage: a front of 1300 miles nearly doubled to 2500 miles. The vastness, harshness, and primitiveness of Mother Russia attenuated the force of the Panzer spearheads, giving Soviet forces space and time to recover from the initial blows of the German juggernaut. When the Red Army refused to die, Germany was at a loss as to what to do next. Well might the Wehrmacht have heeded the words of the famed military strategist Antoine Jomini: “Russia is a country which it is easy to get into, but very difficult to get out of.”
The Eastern Front, 1942-1945
Soviet strategy was to draw Germany into vast, equipment-draining confrontations. Germany, meanwhile, launched another Blitzkrieg, hoping to precipitate a Soviet collapse. Due to the previous year’s losses, the Wehrmacht in 1942 could attack along only a portion of the front. Hitler chose the southern half, seeking to secure the Volga River and oil fields in the Caucasus. Initial success soon became calamity when Hitler diverted forces to take Stalingrad.
The battle of Stalingrad lasted from August 1942 to February 1943 as the city’s blasted terrain negated German advantages in speed and operational art. As more German units were fed into the grinding street fighting, the Soviets prepared a counteroffensive (Operation Uranus) that targeted the weaker Hungarian, Italian, and Rumanian armies guarding the German flanks. Launched on 19 November, Uranus took the Germans completely by surprise. Encircled by 60 Red Army divisions, the 20 divisions of Germany’s Sixth Army lacked adequate strength to break out. The failure of Erich von Manstein’s relief force to reach Sixth Army condemned it to death. Although Hitler forbade it, the remnants of Sixth Army capitulated on 2 February 1943.
Stalingrad was a monumental moral victory for the Soviets and the first major land defeat for the Wehrmacht. After losing the equivalent of six months’ production at Stalingrad, Hitler belatedly placed the German economy on a wartime footing, but by then it was too late to close an ever-widening production gap. The Wehrmacht bounced back at Kharkov in March 1943, but it was to be its last significant victory. In July Hitler launched Operation Citadel at Kursk, which resulted in a colossal battle involving 1.5 million soldiers and thousands of tanks. Remaining on the defensive, the Red Army allowed the Wehrmacht to expend its offensive power in costly attacks. After fighting the Wehrmacht to a standstill, the Red Army drove it back to the Dnieper.
The dénouement was devastating for Germany. Preceded by a skillful deception campaign, Operation Bagration in Byelorussia in June 1944 led to the collapse of Germany’s Army Group Center. When Hitler ordered German forces to stand fast, 28 German divisions ceased to exist. By 1945, the Wehrmacht could only sacrifice itself in futile attempts to slow the Soviet steamroller. Soviet second-line forces used terror, rape, and wanton pillaging and destruction to avenge Nazi atrocities. Soviet forces had prevailed in the “Great Patriotic War” but at the staggering price of ten million soldiers killed, another 18 million wounded. Soviet civilian deaths exceeded 17 million. The Germans and their allies lost six million killed and another six million wounded. Hitler’s overweening ambition and fatal underestimation of Soviet resources and will led directly to Germany’s destruction.
The Anglo-American Alliance and the ETO, 1942-1945
In 1942 two-thirds of Americans wanted to defeat Japan first, but Franklin Delano Roosevelt and Churchill agreed instead on a “Germany first” policy. Their decision reflected concerns that Germany might defeat the Soviet Union in 1942. That year U.S. Army Chief of Staff George C. Marshall argued for a cross-channel assault, but the British preferred to bomb Germany, invade North Africa, and advance through Italy and the Balkans. This indirect approach reflected British memories of the Western Front in World War I and a desire to secure lines of communication in the Mediterranean to the Suez Canal and ultimately to India. British ideas prevailed because of superior staff preparation and the reality that the Allies had to win the Battle of the Atlantic before assaulting Germany’s Atlantic Wall in France.
Operation Torch in November 1942 saw Anglo-American landings in North Africa, in part to assure Stalin that the United States and Britain remained committed to a second front. Superior numbers were telling as Allied forces drove their Axis counterparts towards Tunisia, although the U.S. setback at Kasserine Pass in February 1943 reflected the learning curve for mass citizen armies. Fortunately for the Allies, Hitler sent additional German units in a foolhardy attempt to hold the remaining territory. With the fall of Tunisia in May 1943 the Axis lost 250,000 troops.
The Allies next invaded Sicily in July but failed to prevent the Wehrmacht’s withdrawal across the Straits of Messina. Nevertheless, the Sicilian Campaign precipitated Mussolini’s fall from power and Italy’s unconditional surrender on 8 September. Forced to occupy Italy, Hitler also rushed 17 divisions to the Balkans and Greece to replace Italian occupation forces. Churchillian rhetoric of a “soft underbelly” in the Italian peninsula soon proved misleading. The Allied advance became a slogging match in terrain that favored German defenders. At Salerno in September, Allied amphibious landings were nearly thrown back into the sea. At Anzio in January 1944, an overly cautious advance forfeited surprise and allowed German forces time to recover. Allied forces finally entered Rome on 4 June 1944 but failed to reach the Po River valley in northern Italy until April 1945.
The Italian campaign became a sideshow as the Allies gathered forces for a concerted cross-channel thrust (Operation Overlord) in 1944. It came in a five-division assault on 6 June at Normandy. Despite heavy casualties at Omaha Beach, the Allies gained a strong foothold in France. Success was due to brilliant Allied deception (Operation Fortitude) in which the Allies convinced Hitler that the main attack was still to come at Pas de Calais and that they had 79 divisions in Britain (they had 52). Germany’s best chance was to drive the Allies into the sea on the first day, but Hitler refused to release reserves. Once ashore in force, and with artificial harbors (Mulberries), Allied numbers and air supremacy took hold. In 80 days the Allies moved two million men, half a million vehicles, and three million tons of equipment and supplies to France. Once the Allies broke out into open country, there was little to slow them except their own shortages of fuel and supplies. After destroying Germany’s Army Group B at Falaise, the Allies liberated Paris on 25 August. Field-Marshal Bernard Montgomery’s attempt in September at vertical envelopment (Operation Market Garden) failed miserably, however, as paratroopers dropped into the midst of Panzer divisions. High hopes that the war might be over in 1944 faded as German resistance stiffened and Allied momentum weakened.
Hitler chose December 1944 to commit his strategic reserve in a high-stakes offensive in the Ardennes. In what became known as the Battle of the Bulge, initial Allied disorder and panic gave way to determined defense at St. Vith and Bastogne. Once the weather cleared, Allied airpower and armor administered the coup de grâce. The following year the Allies pursued a broad front offensive against Germany proper; U.S. First Army seized an intact bridge over the Rhine River at Remagen in March, while George S. Patton’s Third Army crossed the river to the south. Anglo-American forces met the Red Army on the Elbe River in April, with Soviet forces being awarded the honor of taking Berlin.
The second front in France was vital to Germany’s defeat. Yet even after D-Day German forces fighting the Red Army exceeded those in France by 210 percent. Indeed, 88 percent of the Wehrmacht’s casualties in World War II came on the Eastern Front. That the U.S. Army got by with just 90 combat divisions was testimony to the fact that the bulk of German and Japanese land forces were tied down fighting Soviet and Chinese armies, respectively. Helping the Allies to husband resources in the ETO was a synergistic Anglo-American alliance, manifested by joint staffs, sharing of intelligence, and (mostly) common goals.
The Air War in Europe
The air forces of all the major combatants, the USAAF and RAF excepted, primarily supported ground operations. U.S. and British air power theory, however, called for concerted strategic bombing campaigns against enemy industry and will. Thus these countries orchestrated a combined bomber offensive (CBO) as a surrogate second front in the air. While the USAAF attempted precision bombing in daylight, RAF Bomber Command employed area bombing by night. The CBO devastated Cologne (1942), Hamburg (1943), and Dresden (1945), but Germany’s will remained unbroken. The CBO succeeded, however, in breaking the back of the Luftwaffe during “Big Week” (February 1944) in a deadly battle of attrition. Eighty-one thousand Allied airmen died in the ETO, with the death rate in RAF Bomber Command alone reaching a mind-numbing 47.5 percent. Hard fought and hard-won, air supremacy proved vital to the success of Allied armies on D-Day and after.
Battle of the Atlantic
Nothing worried Churchill more than the Kriegsmarine’s U-boats (submarines). Surface raiders like the Bismarck or Graf Spee posed a challenge the Royal Navy both understood and embraced with relish. Combating U-boats, however, presented severe difficulties, including weeks of tedious escort duty in horrendous weather. Despite Allied convoys and fast merchantmen, U-boats sank an average of 450,000 tons of shipping each month from March 1941 to December 1942. In March 1943 the Allies lost 627,000 tons, which exceeded the rate of replacement.
Yet only two months later, the tide turned against Germany. Allied successes in reading the Kriegsmarine’s Enigma codes proved vital both in steering convoys away from U-boat “wolf packs” and in directing naval and air units to attack them. Decimetric radar and high-frequency directional finding helped the Allies detect U-boats; B-24 Liberators armed with depth charges closed a dangerous gap in air coverage; and escort groups (including carriers) made it perilous for U-boats to attack, especially in daylight. These elements combined in May 1943 to account for the loss of 41 U-boats, 23 of which were destroyed by air action. Faced with devastating losses of experienced crews, Grand-Admiral Karl Dönitz withdrew his U-boats from the North Atlantic. They never regained the initiative. Germany ultimately lost 510 U-boats while sinking 94 Allied warships and 1900 merchant ships. Because the Kriegsmarine pursued lofty ambitions of building a blue-water navy, however, Germany never could produce enough U-boats to cut Britain’s economic lifeline. Poor resource allocation and strategic mirror imaging ultimately doomed the Kriegsmarine to defeat.
The Rising Sun Ascendant, 1937-1942
By 1938 the Imperial Japanese Army (IJA) had 700,000 soldiers in China. In 1939 the IJA attempted to punish the Soviets for supplying China only to be defeated at the battle of Khalkhin Gol. After this defeat, and spurred on by the Imperial Japanese Navy (IJN), Japanese leaders increasingly looked southward, especially as British, Dutch, and French possessions became vulnerable when Germany ran rampant in the ETO. Bogged down in an expensive war with China, and facing economic blockade, Japan decided to seize outright the oil, rubber, tin, bauxite, and extensive food resources of the Malay Peninsula, the Dutch East Indies, and Southeast Asia.
After concluding a non-aggression pact with Stalin in April 1941, Japan viewed Britain’s Royal Navy and the U.S. Pacific Fleet as its chief obstacles. To destroy the latter, Japan launched a surprise attack on Pearl Harbor on 7 December 1941, followed by attacks on British and Dutch naval units and invasions of Burma, Malaya, the Philippines, and other island groups using quick-moving, light infantry. Employing islands as unsinkable aircraft carriers, the Japanese hoped to establish a strong defensive perimeter as a shield, with the IJN acting as a mobile strike force or javelin. When the Allies confronted this “shield and javelin” strategy, Japan hoped their losses would prove prohibitive, thereby encouraging them to seek an accommodation that would preserve Japan’s acquisitions.
Japan’s key strategic blunder was that of underestimating the will of the United States, partly due to faulty intelligence that mistakenly stressed American isolationism. Pearl Harbor became for Americans “a date which will live in infamy,” one that permitted neither negotiation nor compromise. Japanese leaders knew they could not compete with U.S. industry (U.S. industrial capacity was nine times that of Japan’s), but they failed to develop feasible plans for ending the war quickly.
Nevertheless, until April 1942 the Japanese enjoyed a string of successes. Pearl Harbor was followed by attacks against the Philippines, where the United States lost half its aircraft on the ground. British attempts to reinforce Singapore led to the sinking of the battlecruiser Repulse and battleship Prince of Wales. At the Battle of the Java Sea in February 1942 the IJN destroyed the Dutch navy. For the Allies, disaster followed disaster. At minimal cost, Japan seized Hong Kong, Malaya, most of Burma, and Singapore. Singapore’s surrender on 15 February was psychologically catastrophic to the British since they had failed at what they believed they did best: mounting a staunch defense. From this shattering blow the British Empire never fully recovered. By May 1942 remnants of the U.S. Army at Bataan and Corregidor had surrendered, and the Japanese were in New Guinea. To this point not one of the IJN’s eleven battleships, ten carriers, or cruisers had been sunk or even badly damaged.
Eclipse of the Rising Sun, 1942-1945
The IJN suffered its first setback in May 1942 at the Battle of Coral Sea, where the USN stopped Japanese preparatory moves to invade Australia. The IJN next moved against Midway Island, hoping to draw out the U.S. Pacific Fleet and destroy it. The Japanese plan, however, was overcomplicated. It included coordination of eight separate forces and a diversionary assault on the Aleutians. Admiral Isoroku Yamamoto planned Midway as a battleship fight, but the USN relied instead on carrier strike forces. Japanese indecision and American boldness, enhanced by effective code-breaking (known as MAGIC in the Pacific), led to the loss of four Japanese carriers. Midway was the major turning point in the Pacific theater. After this battle, the USN and IJN were equal in carrier strength, but the United States could build at a much faster rate. From 1942 to 1945 the USN launched 17 fleet carriers and 42 escort carriers, whereas the Japanese launched only four, two of which were sunk on their maiden voyage. Japan also lost its best admiral when U.S. code-breaking led, in April 1943, to the shooting down of Yamamoto’s plane.
The Japanese compounded defeat at Midway by failing to build an adequate merchant marine or to pursue anti-submarine warfare to defend what they had. Constituting less than two percent of USN manpower, American submariners accounted for 55 percent of Japanese losses at sea, virtually cutting off Japan’s supply of oil and reducing imports by 40 percent. By the end of 1944 U.S. submarines had sunk half of Japan’s merchant fleet and two-thirds of its tankers.
Much difficult fighting on land and sea remained. The United States adopted a Twin-Axis strategy designed to give the army and navy equal roles. While General Douglas MacArthur advanced through New Guinea in the southwest Pacific, neutralizing the major Japanese base at Rabaul to prepare for the invasion of the Philippines, Admiral Chester Nimitz island-hopped through the central Pacific. Guadalcanal (Operation Watchtower) in the Solomons turned into a bloody battle of attrition from August 1942 to February 1943 that ultimately favored U.S. forces. Tarawa in the Gilberts (Operation Galvanic) was the first test of the Fleet Marine Concept (FMC) that shortened the logistical tail of the U.S. Pacific Fleet. U.S. landings nearly proved disastrous, however, when Japanese defenders inflicted 42 percent casualties on the invading force. But the USN and Marines learned from their mistakes, and subsequent island operations had high yet sustainable casualty rates.
Battles such as Tarawa highlighted the astonishing viciousness and racism of both sides in the Pacific, with Americans depicting Japanese as “monkeys” or “rats” to be exterminated. Reinforcing the fight-to-the-death nature of warfare was the Japanese warrior code of Bushido that considered surrender as dishonorable. Jungle warfare on isolated islands left little room for maneuver or retreat and bred claustrophobia and desperate last stands. Ruthlessness extended to the U.S. air campaign against Japan that included the firebombing of major cities such as Tokyo, where firestorms killed at least 83,000 Japanese and consumed 270,000 dwellings.
The U.S. invasion of Saipan in June 1944 led to the “Great Marianas Turkey Shoot” in which U.S. pilots shot down 243 of 373 attacking Japanese aircraft while losing only 29 aircraft. Most devastating to Japan was the irreplaceable loss of experienced pilots. To pierce American defenses, Japan employed suicide pilots or Kamikazes at the Battle of Leyte Gulf in October and in subsequent battles. Leyte Gulf in the Philippines was a decisive if close-run victory for the USN, since the IJN missed a golden opportunity to crush Allied landing forces. Costly U.S. campaigns in 1945 led to the capture of Iwo Jima in March and Okinawa in June before the United States dropped atomic bombs on Hiroshima and Nagasaki in August. Together with Soviet entry into the war against Japan, these atomic attacks convinced the Japanese Emperor to surrender, with formal ceremonies being held on the USS Missouri on 2 September 1945.
Japan’s unconditional surrender highlighted what had been a fundamental, and ultimately fatal, schism between the IJA and IJN. Whereas the IJA had focused on the Asian continent to neutralize China and the Soviet Union, the IJN had identified the United States and Britain as its principal enemies. The IJA had been more influential in Japanese politics and dominated Imperial general headquarters. Interservice rivalry led to haphazard coordination and bureaucratic infighting that degraded the Japanese war effort. Like its nominal ally Germany, Japan had essentially engaged in a two-front war of exhaustion against foes possessing superior resources. IJA gains in the China-Burma-India theater had not been sustainable, especially as British, Chinese, and Indian forces learned to counter Japanese infantry tactics under the determined tutelage of William Slim, Orde Wingate, and “Vinegar Joe” Stilwell.
Technology and Medicine
World War II is known as the “physicist’s war” due to the success of the U.S./British/Canadian Manhattan Project that developed atomic bombs, as well as the invention and use of radar. Germany was especially innovative, developing the V-1 cruise missile and V-2 ballistic missile as “vengeance” weapons. While a remarkable technical achievement, the V-2 was ultimately a waste of precious resources. Its circular error probable (CEP) of 20 kilometers and small one-ton warhead made it little more than a deadly nuisance. Germany also developed the Me-262, the world’s first operational jet fighter, but its late deployment in small numbers had little impact on the air war. Less spectacular, but more telling, was the Allied emphasis on fielding large numbers of proven weapons, such as Soviet T-34 and U.S. M-4 Sherman tanks; aircraft such as P-51 long-range escort fighters and Lancaster four-engine bombers; and Higgins boats for amphibious operations.
Penicillin and DDT, both developed by the Allies, were the leading medical developments. Penicillin saved the lives of untold tens of thousands of wounded Allied troops, and DDT vastly reduced casualties due to mosquito-borne diseases in the Pacific. The Germans developed nerve gas but decided against employing it, apparently because they (wrongly) believed the Allies also had it. Unlike the previous world war, chemical weapons were rarely used. Finally, Allied code-breaking efforts such as ULTRA saw the development of primitive computers.
Legacies of the War
World War II saw the emergence of the United States and Soviet Union as superpowers. The resulting Cold War between them created a bipolar world until the Soviet Union’s collapse in the early 1990s. With the end of the myth of Western superiority came the decline of colonial empires and the independence of countries such as India (1947). The war also resulted in the division of Germany (reunited in 1990) and the occupation and democratization of Japan; the creation of the United Nations and the state of Israel; and the rise of leaders formed in the crucible of war, such as Dwight D. Eisenhower and Charles de Gaulle. A vastly destructive war with tragic consequences, World War II nevertheless saw the demise of Hitler’s Third Reich, a regime based on mass slavery of “inferiors” and the categorical extermination of “undesirables” (Jews, Gypsies, the handicapped and mentally ill, etc.), as well as the overthrow of a Japanese regime that glorified militarism and justified slavery and racial discrimination on a massive scale.
In the 19th century, many people believed in polygenism, and others used the concept of “the races of man,” where by “race” they often meant species. At home, I have a framed copy of the races of man taken from an encyclopedia published in the 1890s. Here’s a photo of it:
Of course, there’s always an assumed hierarchy to the races of man concept. White Europeans are at the top, since it’s they who defined and ordered the hierarchy. Surprise!
In my photo, White Europeans take pride of place in the center, with some swarthy Italians at the top right (I’m half-Italian). Meanwhile, Polynesian (pink flowers in hair) and Indian (from South America) women are shown with bare breasts. “Primitives” are primitive precisely because they’re “immodest” in dress, a convention that allowed publishers to show nudity in illustrations and photos without being accused of pornography. You might call this the “National Geographic” dispensation for nudity.
My college students were often amazed when I told them that science shows that all of us — all humans — came out of Africa. Far too many people today still think of race as both definitive and as a rung on a ladder, and naturally they always place their own “race” on the top rung.
Even more disturbing is the resurgence of racialized (and often racist) thinking in the United States. The idea of the races of man and the “scientific” ordering of the same was debunked a century ago, yet it’s back with a vengeance in much of the U.S.
Naturally, those who promote racialized thinking always put their own perceived race at the top. In that sense, nothing whatsoever has changed since the 19th century and the “races of man” concept.