New Orleans and Its Disappearing Confederate Statues

One Thing after Another has noticed over the last several months that national politics has crowded just about everything else out of the news. Stories about history’s contemporary relevance or impact are sometimes difficult to find these days. So if you weren’t paying attention, you might have missed the saga now taking place in New Orleans.

In July 2015, in the wake of the mass shooting at the Emanuel African Methodist Episcopal Church in Charleston, South Carolina, Mitch Landrieu, the mayor of New Orleans, asked his city council to remove four monuments from the city. Five months later, after much public debate, the city council voted 6-1 to do so. Three of the monuments celebrated Confederate heroes: Jefferson Davis (president of the Confederacy), Robert E. Lee (commander of the Confederate Army of Northern Virginia), and P.G.T. Beauregard (a prominent Confederate general born outside of New Orleans). The fourth, the Liberty Monument (erected in 1891), memorialized the so-called Battle of Liberty Place (1874). This armed struggle pitted the Crescent City White League, which sought to settle a disputed election by seating a Democratic governor by force, against the metropolitan police (along with elements of the state militia) which fought to defend a Republican regime associated with racial equality. An inscription added in 1932 explicitly celebrated the battle as a step in the direction of white supremacy.

On Monday, April 24, the Liberty Monument was disassembled. Over two weeks later, on Thursday, May 11, the statue of Davis was removed. The workers who took away the Davis statue wore flak jackets for protection and masks to conceal their identity. Such precautions should come as no surprise; the whole exercise has been incredibly controversial, and the statues have been the scenes of protests as well as counter-protests.

What position should one take on the removal of these statues? One Thing after Another believes that the following Slate interview with Professor David Blight (an expert on the history of slavery and the American Civil War who teaches at Yale and directs the Gilder Lehrman Center for the Study of Slavery, Resistance, and Abolition) contains a great deal of good sense:

http://www.slate.com/articles/news_and_politics/interrogation/2017/05/should_new_orleans_remove_its_civil_war_monuments_historian_david_blight.html

Yes, One Thing after Another understands what its readers have come to expect—that this blog usually refers to articles only to criticize them. This case, however, is different. Blight makes a number of thoughtful points throughout his interview. Anybody who has read this blog’s discussion of Tony Horwitz’s Confederates in the Attic will be familiar with several of the ideas that emerge from this exchange. The three most important and relevant ones are as follows.

First, the Confederates fought valorously (much—if not all—of the time) but for a bad cause that was inextricably tied to slavery. One Thing after Another ought to remind readers that such is not merely the verdict of contemporary historians. This blog recalls Ulysses S. Grant’s verdict in his Memoirs (1885), which describes the preliminaries preceding Lee’s surrender at Appomattox:

What General Lee’s feelings were I do not know. As he was a man of much dignity, with an impassible face, it was impossible to say whether he felt inwardly glad that the end had finally come, or felt sad over the result, and was too manly to show it. Whatever his feelings, they were entirely concealed from my observation; but my own feelings, which had been quite jubilant on the receipt of his letter, were sad and depressed. I felt like anything rather than rejoicing at the downfall of a foe who had fought so long and valiantly, and had suffered so much for a cause, though that cause was, I believe, one of the worst for which a people ever fought, and one for which there was the least excuse. I do not question, however, the sincerity of the great mass of those who were opposed to us.

Blight argues, then, that those inclined to defend the memorials ought to admit that the Confederate cause was “deeply flawed or terrible.” However, they ought to also realize that the contemporary South should feel neither shame nor pride for what Southerners did over 150 years ago. As Professor Randy Sparks (a scholar at Tulane University whom the interviewer refers to and with whom Blight agrees) asserts, Confederates were “men of their time and place.”

Second, people need to see, as Sparks argues, that now “is our time, and our place.” We cannot change what our ancestors did, but we can influence the world that our descendants inherit. Much of the controversy surrounding the removal of Confederate statues really has to do with contemporary issues (an argument that Horwitz also makes). For sure, a number of these issues are rooted in the legacies of slavery and the war (e.g. the underprivileged position of African Americans today). Still, when people argue about, say, the Confederate battle flag, more often than not, they are projecting today’s concerns onto the past. Such debates are often truly about present-day disputes concerning inequality, race, economic opportunity, identity, the basis of community, the limits of government authority, and so on. We ought to have conversations about these issues without making inapt, ahistorical, or anachronistic references to the Civil War.

Third, having recognized these points, we can’t and shouldn’t destroy every Confederate memorial. Attempting to stamp out such memorials would pose to communities questions that admit no easy solution (e.g. Is this or that a memorial? What does it commemorate?). Such a policy would also come to feel oppressive as localities fell under the shadow of a memorial police. As Blight points out, iconoclasm is dangerous because no one quite knows where it will lead. American history without Davis, Lee, and Beauregard would be incomplete, so we cannot erase them from the past. But we can, as Blight suggests, erect “tasteful, important, meaningful new memorials” that show how history has moved on from the Lost Cause fable. In this fashion, we can bring memory and history closer together, an achievement that would prove a public service. Blight refers to the Robert Gould Shaw and 54th Massachusetts Memorial on the edge of Boston Common (a patinated plaster cast of which is pictured above) as a possible model for future monuments, and rightly so. If we are compelled to remember Confederate leaders like Davis, Lee and Beauregard, justice demands that we do a better job of representing the complexity of the American Iliad. That task involves publicizing the stories of those who have been pushed to the margins by traditional memorialization of the war (e.g. African Americans, poor Southern whites, and women) but who played such an important role in the conflict.

Furthermore, I consider that the myth of the unemployable History major must be destroyed.

Wars Are Not Always Won by Military Genius or Decisive Battle, but Attrition Is Not the Answer

Cathal Nolan, who teaches military history (among other things) at Boston University, recently wrote an essay entitled “Wars are not Won by Military Genius or Decisive Battles” in the online journal Aeon.

https://aeon.co/ideas/wars-are-not-won-by-military-genius-or-decisive-battles

In this piece, Nolan criticizes traditional military history for focusing on battles—something that misleads the public into thinking that wars are won “in an hour or an afternoon of blood and bone.” Such a view of war also entices “generals and statesmen with the idea that a hard red day can be decisive, and allow us to avoid attrition” which many see as “morally vulgar and without redemptive heroism.” If we begin to understand that wars are a matter of “joining weight of material to strength of will,” we come to comprehend that victory is attained less by military genius than by “grinding,” “resolve,” and “strategic depth.” Having recognized that war is about attrition, we must embrace that fact. As Nolan puts it:

With humility and full moral awareness of its terrible costs, if we decide that a war is worth fighting, we should praise attrition more and battle less. There is as much room for courage and character in a war of attrition as in a battle.

Before writing anything else, One Thing after Another must concede that Nolan is correct about a number of things. Clearly, as he argues, there is much more to war than battle. There are the operational, strategic, and political dimensions of war, and these involve areas as diverse as culture and economics. He is also on the mark in arguing that, quite frequently, wars are drawn-out affairs in which the defeated party is vanquished as much by material exhaustion as by anything else. The spirit behind this essay, which requires us to accept that there is no short-cut to military victory, is commendable. In the same way that one cannot make an omelet without breaking eggs, one cannot win a war without a commitment that involves many soldiers getting killed. Finally, every military historian must, as Nolan does, give the Tolstoyan view of warfare its due; fighting is a chaotic enterprise over which generals find it difficult to assert control.

One Thing after Another also understands that Nolan probably seeks to offer a kind of intellectual provocation. Even so, in responding, the blogosphere must do its best to keep him honest. And honesty compels this blog to disagree with Nolan’s argument on a number of grounds. To start with, Nolan’s terms are often ill-defined and his argument overdrawn when he discusses the current state of military history as well as the public’s understanding of war. What literature is he referring to when he mentions “traditional military history” which presents battles “as fulcrum moments where empires rose or fell in a day”? What exactly is the “drums-and-trumpets style,” and with what frequency are “popular histories” written that way? Which historians celebrate “even failed campaigns as glorious”? One Thing after Another does not recognize the current state of military history in these statements. Are academic and professional military historians implicated in Nolan’s charges? If not, he should make that point clear. If so, he is wrong. Nolan’s charges concerning war movies also seem problematic. Are they universally about “raw courage and red days, the thrill of vicarious violence and spectacle”? This blog can think of numerous and substantial exceptions to this claim. And on what basis does Nolan assert that “most people” still think wars are won “in an afternoon”? In light of current events, such a claim appears questionable.

The argument that all wars are more or less won by attrition also seems like something of an overstatement. Every conflict witnesses a degree of attrition, but if one claims that they are all won through this process, attrition ceases to be a particularly useful category of analysis. Moreover, insisting that attrition is central to all wars would iron out the uniqueness of each conflict, and as historians we are bound to recognize this uniqueness. Most important, though, is the fact that many wars clearly are not won by attrition. Off the top of its head, One Thing after Another can think of several conflicts that more or less consisted of a single major battle (e.g. Hastings, Jena-Auerstedt, and Königgrätz). In many more cases, there are wars that were decided by a great battle (e.g. Gaugamela) or wars that were in no way won by attrition (e.g. the Falklands War).

Even if the notion that wars are won by attrition were entirely correct, we would still be justified in studying battles (although not to the exclusion of all else). It is, after all, through battle that attrition often takes place. In this context, one recalls Friedrich Engels’ paraphrasing of Carl von Clausewitz (which appeared in John Keegan’s The Face of Battle—a book, by the way, that completely reconfigured the approach to battle history for the better over forty years ago): “Fighting is to war what cash payment is to trade, for however rarely it may be necessary for it actually to occur, everything is directed towards it, and eventually it must take place all the same and must be decisive.” Even if it is not decisive in an afternoon, battle is decisive nonetheless. One thinks here of William Philpott’s Three Armies on the Somme (2010). This battle history argues that the Somme was an attritional fight that played a major role in hollowing out the German army and paving the way for Allied victory during World War I. In other words, by attriting the German army, the Somme contributed to decision and is worthy of study.

Of course, if battle is significant, so is generalship. After all, one of the reasons our armed forces study military history—and particularly battle history—is to cultivate leadership to fight future wars as well as we can. Nolan counsels, however, that we should not worship “military genius”; instead, we must value “sound generalship.” The distinction is not entirely clear. One Thing after Another is put in mind of Clausewitz’s famous statement about friction that appears in On War: “Everything in war is simple, but the simplest thing is difficult.” Is Nolan advising, then, that we should give up on brilliance and hope for nothing more than military leaders who can execute simple operations in the name of attrition?

The problem is that deliberately embracing a classic strategy of attrition (that is, where attrition is preeminent—for attrition is always present) leads to significant ethical problems. For one thing, it places us on the path of reducing humanity to an instrument or an object rather than treating human life as an end in itself (think of Immanuel Kant here). For another, as scholars operating in the Just War tradition have pointed out, attrition often leads to violations of the criterion of proportionality in jus in bello. By its very nature, a strategy of attrition inclines generals to unleash violence of great intensity on an enormous scale, violence that can be inordinate when compared to the aims sought. Such an approach to war is wasteful of human life and is therefore condemnable, especially when other strategies are available. Of course, Nolan’s point seems to be that, generally speaking, no other strategies are truly available; despite our best efforts, wars are de facto about attrition, so we may as well call a spade a spade and get on with it. There is perhaps some merit in such honesty, but this kind of truthfulness places us on a terrible and slippery slope.

After Waterloo, which capped almost a quarter century of continuous fighting in Europe, military men became enamored of Napoleon. They studied Napoleon through his leading interpreter, Antoine-Henri Jomini, in an attempt to understand the secret of attaining decision on the battlefield, and they largely reconceived military history as the story of decisive battles. Since 1945, more often than not, the United States has found itself involved in frustrating “protracted” wars (to use Mao Zedong’s phrase) in which the enemy has often targeted this country’s will to sustain the struggle. Indeed, at this moment, America still finds itself mired in wars of long duration in Central Asia and the Middle East. Considering these circumstances, is it any surprise that a contemporary scholar is willing to throw up his hands, claim that the age of decisive battle never was, and tell us to embrace attrition? In his prescriptions, Nolan is very much unlike Napoleon’s successors; the former counsels attrition, the latter sought decision on the battlefield. Where they are similar, though, is in their tendency to recast the past in the image of their own time. Admittedly, to quote Benedetto Croce, “All history is contemporary history.”  Yet if we allow our current preoccupations to color our view of the past too much, we run the risk of producing ahistorical interpretations.

Furthermore, I consider that the myth of the unemployable History major must be destroyed.

David Brooks is Wrong about the “Crisis of Western Civ”

David Brooks, one of the regular op-ed columnists at The New York Times, is very upset with university professors, especially those who teach history. According to Brooks, they are responsible for the “crisis of Western Civ.”

https://www.nytimes.com/2017/04/21/opinion/the-crisis-of-western-civ.html?_r=0

According to Brooks, there once was a time when people in Europe and North America believed in a “Western civilization narrative” that was “confidently progressive” and helped “explain their place in the world and in time.” This narrative promoted certain values, including the “importance of reasoned discourse, the importance of property rights, and the need for a public square that was religiously informed but not theocratically dominated.” This view of history, Brooks claims, provided “diverse people” with a “sense of shared mission and a common vocabulary” which in turn promoted “a framework within which political argument could happen” and “common goals” could be attained. This narrative was best articulated by Will and Ariel Durant’s eleven-volume series, The Story of Civilization (1935-1975), which focused on a number of key figures and described Western history as “an accumulation of great ideas and innovations.”

At some point, for reasons that Brooks never really explains, “many people,” but especially those teaching in universities, “lost faith in the Western civilization narrative.” It stopped being taught. If it was mentioned at all, it was described as a “history of oppression.” Brooks claims that terrible consequences have flowed from this change in the intellectual wind: the rise of illiberal and authoritarian figures “who don’t even pretend to believe” in the narrative; the collapse of the political center that once had faith in the democratic capitalism that was upheld by the narrative; and the undermining of liberal values in America. Brooks closes by arguing that:

These days, the whole idea of Western civ is assumed to be reactionary and oppressive. All I can say is, if you think that was reactionary and oppressive, wait until you get a load of the world that comes after it.

One Thing after Another has enjoyed much time to reflect on the utility of Western Civ; while in graduate school, this blog served as a teaching assistant in Western Civ courses for three years (nine quarters in a row!) before spending another three years teaching Western Civ as a visiting assistant professor at two different institutions. These experiences lead One Thing after Another to think (although it pains this blog to be so blunt) that Brooks has ventured into territory he does not understand.

For one thing, who believed in the kind of Western Civ narrative that Brooks summarizes, and when did they believe it? Brooks’ assertions are rather vague. At one point “people” believed this narrative. Then “many people . . . lost faith” in it. These claims resemble those C essays One Thing after Another used to read in Western Civ classes where that indistinct and monolithic entity, “the people,” did this and that for no discernible reason (e.g. “the French Revolution began because the people rose up to fight for their rights”). In an attempt to prove the power of the Durants’ narrative, Brooks does mention that The Story of Civilization sold two million copies (many through the Book of the Month Club), but One Thing after Another has seen enough mint copies of this eleven-volume work in used bookstores to wonder how many readers actually stumbled through its 10,000 pages. What this blog does know is that historians at the time did not think much of the Durants’ efforts. Will Durant was not a historian (he had earned a Ph.D. in Philosophy which is not exactly the same thing as History), and he did not always engage with the complexity of the past. As Crane Brinton pointed out in his review of The Age of Voltaire (Volume 9 of The Story of Civilization):

It is difficult for a professor of history to say good things about their work without seeming to unbend, if not to patronize. Clearly they are readable. They can produce the telling anecdote, the picturesque detail, [and] the sense of movement in events and ideas. . . . Above all, though, they are often mildly epigrammatic. Though they can be comfortably realistic about human nature, the Durants are never uncomfortably realistic, never daring, never surprising. Theirs is the enlightenment that still enlightens, basically kindly, hopeful, progressive, reasonable, democratic.

In other words, it was history that was neither taxing nor challenging to mainstream liberal opinion in mid-1960s America. This verdict is especially telling, appearing as it does in the obituary of Will Durant produced by Brooks’ own newspaper, The New York Times.

One Thing after Another will try to leave to one side the conceptual problems associated with the whole Western Civ project (e.g. where and when was the “West,” and on what basis are certain people and places included in this “West”?). Instead, this blog is interested in Brooks’ description of the Western Civ narrative as a collection of great ideas, people, and values whose sole purpose seems to consist of upholding a liberal consensus that seeks to bind our fragile body politic.

It is not clear if Brooks believes that this narrative is an accurate representation of the past or if it is a convenient and useful myth. If the former, he is wrong; if the latter, he must realize that, like most myths, it is bound to be exposed. Whatever the case, Brooks’ essay does not seem to recognize that “history” as a discipline does not “tell” us this thing or that about the past (in much the same way that “science” does not “say” this thing or that about the natural world). Rather, historians marshal documentary evidence on behalf of arguments that seek to represent the past. Some of these arguments are more persuasive than others, and they may become dominant in their subfield for some time. But in their bridging of the gap between the present and the past, none could be said to be “the truth.” At best, they are credible inasmuch as they seem to jibe with extant documents of the past.

The point to remember is that history is constantly contested. The discipline does not set forth a series of immutable truths about Western Civ or anything else. Instead, historians present rival interpretations of past events. These rival interpretations stem, in part, from the fact that documentary evidence is often unclear and contradictory. But these conflicting readings of the past are also a product of historians’ own concerns and world views. As Benedetto Croce argued, “All history is contemporary history.” These are the reasons why, for instance, various scholars argue over whether class, culture, or politics was the main driving force behind the French Revolution.

History, then, is often messy and paradoxical. Brooks’ vision of Western Civilization (and the Durants’, from which he takes inspiration) does not seem to recognize this messiness and paradox, and that goes a long way toward explaining why historians no longer find that vision compelling. Western Civilization is no greater and no worse than the common run of humanity. It has done great good, great evil, and very much in between. Its unfolding has been unpredictable and full of surprises. It does not point in any particular direction. Take Rousseau (to name one of the “great figures” of Western Civilization to whom Brooks refers). His legacy is conflicted. This ambivalence is reflected by the fact that the two greatest near-contemporaries who felt Rousseau’s intellectual influence most forcefully were Kant and Robespierre. Not surprisingly, then, there are those who see Rousseau as absolutely indispensable to the development of modern liberalism and democracy, while others consider him the intellectual forebear of modern authoritarianism. Freedom and tyranny—these are the twin faces of the Western tradition, and any narrative that purports to describe this tradition must come to grips with both.

The main problem with Brooks’ argument is that it identifies or conflates a particular narrative of Western Civilization with liberal democratic ideals. It is his anguish about the decline of the latter that provides the driving force for his essay. But there is no need to make historians the focus of his ire. One can love liberal democracy without clinging to a fairy-tale version of Western history. The widely perceived decline of liberal democracy in the West probably has many origins; it seems disproportionate to point to so inconsequential a force as history professors as the main culprits. Defenders of liberal democracy should fight for what they think is right, but they should not criticize historians for refusing to embrace a narrative that does not do justice to the complexity of the Western tradition.

Brooks’ conclusion puts One Thing after Another in mind of a line from George Orwell’s classic, semi-autobiographical essay, “Shooting an Elephant” (1936). In the introduction, the narrator describes himself in terms that would have fit Orwell himself:

I did not even know that the British Empire is dying, still less did I know that it is a great deal better than the younger empires that are going to supplant it.

The problem was, of course, that even if one believed the British Empire was a “great deal better” than its successors, there was no point in wishing for its survival; its position was untenable. The same goes for the Durants’ narrative of Western Civ. Even if one believes it was a great deal better, its position, too, is untenable.

Furthermore, I consider that the myth of the unemployable History major must be destroyed.

Review: Richard Overy’s The Bombing War: Europe 1939-1945

Richard Overy, The Bombing War: Europe 1939-1945 (London: Penguin, 2013).  

Richard Overy is one of the leading historians of World War II alive today, and while he has written on a number of topics associated with that conflict, the fighting in the air is his area of special expertise. While The Bombing War is not as comprehensive as some of his other works, such as The Air War, 1939-1945 (1980), it is one of his most powerful books. For those interested in the topic of strategic bombing during World War II, The Bombing War is indispensable. It combines the meticulous research and broad vision that only an expert of Overy’s caliber can produce.

One of Overy’s purposes in writing The Bombing War is to provide “the first full narrative history of the bombing war in Europe” (xxiv). This narrative, he argues, is more complete than previous efforts because a) it covers all of Europe, b) it integrates bombing into the “broad strategic picture” (xxiv), and c) it links the narratives of those who did the bombing with those who were bombed. Overy’s other main objective consists of “re-examining the established narratives on the bombing war” which have been shaped, especially in the British and American cases, by official histories (xxv-xxvi). (The United States’ seven-volume official history, The Army Air Forces in World War II, was published between 1948 and 1958, while Britain’s four-volume equivalent, The Strategic Air Offensive against Germany, appeared in 1961). Overy has conducted this re-examination by studying the “private papers of individuals and institutions” as well as parts of the official record that “were originally closed to public scrutiny because they raised awkward questions” (xxvi). At 642 pages of small, densely printed text, The Bombing War is long (maybe overlong), but it never loses sight of two related theses. First, strategic bombing during the war never lived up to the hype of its proponents; there was a big discrepancy between promise and achievement. Second, strategic bombing, as practiced during the conflict, was a bludgeon that did not achieve enough to justify the enormous collateral damage that it inflicted on both lives and property.

Overy’s story begins with a discussion of World War I and the interwar period. Here, he focuses on two major developments that helped make strategic bombing possible during World War II. The massive mobilization of World War I as well as the rhetoric that followed it led everyone to assume that the next war would be “total” and that civilians would naturally be targets in this conflict. This discourse meshed well with assumptions among airmen and statesmen that the great conurbations of the modern era were particularly susceptible to dislocation from aerial bombing. Based on little evidence, those who contemplated the course of air war in the future believed that industry was vulnerable to destruction and that civilians living in big cities would panic easily. These attitudes, however, did not make strategic bombing during World War II inevitable; Overy argues that it was only events during the war that made such a thing possible.

Among the many limits that prevented airmen from immediately and deliberately dropping bombs indiscriminately on civilians in 1939 was the fact that many air forces believed that their primary mission consisted of supporting the army in a ground-attack role. And indeed, Overy argues that two incidents widely seen as initiating “terror” bombing during the war—the Luftwaffe’s bombardments of Warsaw and Rotterdam—were not that at all. In both cases, he claims that German aircraft sought out enemy ground forces that happened to be ensconced in or near urban areas. These two attacks resulted in large numbers of civilians being killed. The air assault against Rotterdam proved especially tragic since German and Dutch forces were then negotiating the surrender of the city but could not get word to the Luftwaffe fast enough to halt the air attack.

The first real strategic bombing campaign took place over the skies of Britain between 1940 and 1941. Overall German strategy was muddled from the start, constantly shifting from one objective to the next. On the eve of the Battle of Britain, Hitler could not decide whether to encourage the British to enter negotiations, invade southern England and dictate a settlement, or use ships, submarines, and aircraft to impose a blockade on British ports. As Overy puts it, “Hitler opted for all three possibilities, and achieved none of them” (68). Whatever the case, all three required the Luftwaffe to play an important role and demanded a heavy commitment from Hitler’s airmen. Forces, however, were frittered away as “the German offensive hovered between trying to gain air superiority against the RAF, preparation for invasion, contributing to the blockade by sea of British trade, degrading Britain’s industrial war potential and vague expectations of a crisis afflicting the enemy’s morale” (611). The failure to fix on an appropriate target and destroy it (along with the inability to match ends with means) accounted in large part for the frustration of German aims. This frustration occurred in spite of Britain’s weaknesses in civil defense (which were not made good until the latter part of 1941) and huge deficiencies in the RAF’s night-fighting capacity.

Although, as Overy points out, each strategic bombing campaign of the war differed in a number of ways, the German attack on Britain was emblematic in that it was planned and launched on the fly; almost no research or preparation for such an effort had been performed during the pre-war period (which accounts for the strategic confusion). This problem would also plague Allied campaigns throughout the conflict. The German campaign was also important in that it stretched notions of what was considered permissible during the war. The British in particular subjected the German campaign to very close scrutiny. In some cases, the RAF’s Bomber Command learned important lessons (e.g. dense concentrations of incendiaries mixed with high explosive bombs were particularly useful in destroying large parts of towns). In others, the British misconstrued what the Luftwaffe had been up to (e.g. they assumed the Germans were engaged in mere terror bombing). In still others, the RAF totally missed the boat (e.g. the British ramped up their bombing of German cities in the hope of demoralizing civilians and dislocating the economy without pausing to think that the Germans had failed to do the very same thing in the very same way).

With these observations in mind, it should come as no surprise that Overy is extremely critical of Bomber Command’s own effort against Germany and occupied Europe. Initially, the RAF’s campaign was too piecemeal, light, inaccurate, and scattered to have much effect. Starting in late 1941, however, the British more or less decided on the area bombing of German cities in an attempt to demoralize, dehouse, and decimate German civilians (which is what they thought the Germans had attempted to do to them). Although Britain’s political and military leadership always felt ambivalent about this decision, the appointment of Sir Arthur Harris as the head of Bomber Command in February 1942 gave the force an aggressive and intractable advocate who was fully committed to the air war against German civilians to the exclusion of all else. Nonetheless, progress was stymied by a number of shortcomings. There was a lack of appropriate, heavy four-engined bombers (as late as 1942, the number of Avro Lancasters was limited). The British were also plagued by “the slow development of target-finding and marking, [and] the dilatory development of effective electronic aids, marker bombs and bombsights.”  And then there was “the inability to relate means and ends more rationally to maximize effectiveness and cope with enemy defenses”—a problem that had also hampered the Germans (300). Despite its ineffectiveness, Bomber Command was allowed to persist in its campaign which swallowed a very large proportion of available British resources (about 7% of total British man-hours during the conflict)—no small victory for Harris and his subordinates who sought to safeguard their bailiwick.

The entry of the United States into the war did not change the British situation a great deal. The Americans made clear that they would not divert bombers from their factories to supply the British. Not surprisingly, considering the many demands placed on the United States, it took the Americans some time to organize, equip, and train a large bomber force that could exercise any influence in the European theater. The Allies made much fuss about a “combined offensive” and “round-the-clock” bombing (Americans during the day, British at night), which seemed to suggest that their bombers acted in concert. The truth of the matter was that their campaigns operated merely in parallel and did not reinforce each other at all. The Americans did not think much of bombing cities for the sake of depressing German morale. They were more interested in employing daytime precision attacks and destroying specific targets that would slow down German production (although Overy admits that when visibility was limited, American blind bombing was just as indiscriminate as anything Bomber Command did). Overy intimates that although American forces experienced difficulty in finding the bottlenecks that could bring the German economy to a halt, they expressed a much more thoughtful and sophisticated approach to bombing than Harris ever did. Bomber Command continued its nocturnal attempt to destroy city after city in the hope that the cumulative destruction would eventually end the war somehow.

In the end, Overy argues, Allied strategic bombing did not end the war, but it did influence the manner in which Germany was defeated. In early 1944, American forces finally made a commitment to using the bombing campaign as a means of destroying the Luftwaffe in the skies over Germany. The delay in reaching this decision was not merely a matter of technology; it was also a matter of placing commanders in the European theater who shared that vision. By that date, Carl Spaatz (commander of US strategic air forces), Jimmy Doolittle (Eighth Air Force), and William Kepner (VIII Fighter Command) occupied the key American positions in Europe and agreed that it was necessary to combine “the indirect assault on air force production and supplies through bombing with the calculated attrition of the German fighter force through air-to-air combat and fighter sweeps over German soil” (361). Initially spearheaded by P-47s with drop tanks (the P-51s came later), fighters loosely accompanying American bombers sought out German aircraft, leading to huge air battles with massive casualties on both sides. It was a campaign of attrition for which the Germans were ill-suited. Two major developments occurred as a result. First, the Germans redistributed resources—personnel, fighter aircraft, and anti-aircraft guns—to the homeland on a large scale to counter this threat. These were resources that could not be deployed on other fronts to support German ground forces (including anti-aircraft weapons which could double as anti-tank guns). Second, having forced the Germans to concentrate their aircraft in Germany, the Americans proceeded to destroy the Luftwaffe, shooting down enormous numbers of planes and killing their pilots. By mid-year, the Americans had achieved air supremacy over France and Germany. And then strategic bombing lurched forward on a much larger scale than ever before; three-quarters of the total tonnage of bombs dropped on Germany fell between September 1944 and May 1945. The Allies persisted in heavy bombing largely because they were worried that the Germans might suddenly produce new weapons that could turn the tide (the V-weapons as well as the Messerschmitt Me 262 jet fighter certainly gave them reason to think this way). They also hoped that more bombing could bring the war to a swifter end—the British thinking that obliterating more cities would tip Germany over the edge, while the Americans believed that the destruction of oil and transportation targets would undermine the German war effort. Still, German productivity reached its height in the last three months of 1944, when bombing was extraordinarily heavy. Allied victory eventually came at an extremely high cost to victor and vanquished, but the impact of bombing was only one of several factors that defeated the Axis powers.

Many readers familiar with the topic will have seen parts of this narrative before, but Overy presents a version of the story that is very much his own in which a number of key arguments, great and small, are modified. Overy’s book is particularly interesting when it comes to discussing civil defense and the impact of the war on civilians, something that most histories of strategic bombing do not study in a systematic way. The Bombing War stresses the degree to which different circumstances obtained in different countries. For instance, civil defense in Britain was characterized by friction between the voluntarist tradition of a free society and the centralizing tendency of the state. In Germany and the Soviet Union, however, the party saw civil defense mainly as a means of political and social mobilization. Whatever the case, the experience of civil defense was similar to that of the bomber forces in that its preparations were incomplete upon the war’s outbreak; capacity and sophistication generally grew as the war continued. It is hard to make generalizations about bombing’s impact on the various peoples of Europe, though, as every country was different. Overy points out that a good case could be made that bombing helped topple Mussolini in 1943, but he proceeds to argue that the collapse of the Fascist regime had more to do with its overall inability to cope with the various stresses of modern war. In cases where the state or party was more or less equal to the challenges of fulfilling civilians’ needs (e.g. Britain and Germany), heavy bombing generally did not enhance or undermine the population’s will to resist. If anything, it made civilians more reliant on the authorities which reduced the potential for dissent. The picture Overy paints of civilian populations under sustained air attacks is one of anxiety, exhaustion, and deprivation. Moreover, these populations were highly mobile as they left destroyed urban areas in search of shelter, food, and working utilities. It is not surprising that people in such a position would turn to the state for succor.

Conquered territories, particularly in western Europe, found themselves in a unique position. Generally hostile to the German occupation, they initially supported the Allied bombing of military targets. The RAF hoped that a campaign in these regions would damage German military installations (e.g. submarine pens) and slow down production in factories that had worked on German contracts. Later, in preparation for the cross-Channel invasion, the Allies sought to destroy most of northern France’s transportation infrastructure (and once troops had landed in Normandy, heavy bombers were used for ground support). In these regions, the British always saw bombing as a propaganda act that could demoralize collaborators and give resistance a boost. Unfortunately, once the RAF began bombing France and the Low Countries without restriction in February 1942, opinion in these countries turned against the British initiative. Just as in Germany, Allied bombing tended to be inaccurate and destructive, resulting in many civilian casualties (almost 60,000 French civilians were killed by Allied bombs). In the conclusion of his chapter on the bombing of occupied Europe, Overy notes, “Bombing was a blunt instrument as the Allies knew full well, but its bluntness was more evident and more awkward when the bombs fell outside Germany” (606).

Not surprisingly, Overy concludes that strategic bombing as practiced during World War II was a crude, wasteful, and illegal strategy. Moreover, it was a failure on its own terms. It sought to win the war singlehandedly by destroying the enemy economy, demoralizing the enemy population, and deracinating the enemy’s political system. In all of these areas, the impact of bombing was limited. Strategic bombing’s main contribution to Allied victory—the destruction of the Luftwaffe—was almost incidental. The obsession with the “weight and scale” of attacks, rather than accuracy, paved the way for post-war nuclear arsenals that sought to do the same thing but on a much larger scale. This approach to strategic bombing would prove a dead-end; precision-guided munitions, Overy argues, were the “way forward” (613). We can be thankful, then, that “profound changes in available weapons, the transformation of geopolitical reality and post-war ethical sensibilities have all combined to make the bombing war between 1939 and 1945 a unique phenomenon in modern European history, not possible earlier and not reproducible since” (633).

Furthermore, I consider that the myth of the unemployable History major must be destroyed.

Hugh Dubrulle

NOTE: This essay reviews the Penguin UK version of Overy’s book, not the Penguin USA edition (entitled The Bombers and the Bombed: Allied Air War over Europe 1940-1945). The latter was heavily edited and is much shorter than the former. The reviewer recommends that you purchase the British version.

The Myth of the Unemployable History Major Must Be Destroyed

One Thing after Another has a son in high school, so this blog knows a number of parents who have completed the college application grind. Among these are “K” (we feel obliged to protect her anonymity), whose son was considering Saint Anselm College. At one point, she told One Thing after Another that her son liked history, but since “he wanted to make sure he had a job after he graduated,” he was going to major in politics. In the end, K’s son went to another school so, in a sense, his choice of major did not matter.

K’s reasoning, however, does matter to this blog. For years, One Thing after Another has heard this line of argument over and over again. A history major is an unaffordable luxury, so the argument goes, because one cannot merely go to college to study one’s interests. The cost is so great that students must major in something that will guarantee them a job. Since the only kinds of jobs supposedly open to history majors are teaching and positions related to history (e.g. museum staff), students often look to other majors that give them better opportunities.

This blog understands why parents feel this way. One Thing after Another remembers the anxious expression on K’s careworn face as she explained the decisions that she and her son had to make. The stakes are high. College is so expensive that parents cannot avoid thinking in terms of return on investment. At Saint Anselm College, tuition for 2017-2018 will be $38,960, room and board will reach $14,146, and mandatory fees will come in at $1,030. Obviously, not everyone will pay this kind of money. The discount rate at our school is around 49% (much to the dismay of our CFO), which means that the average student will pay just over half of the $38,960 in tuition (somewhere around $19,960) for a total bill of about $35,136. Spending that kind of money over four years, one could buy about six 2017 Honda CR-Vs or pay for almost 60% of the median home price in Goffstown ($247,000 for the period between January and April). Finding this kind of cash is an enormous burden for middle-class families—let alone poor ones. It’s no wonder that students rush to major in disciplines where the connection between the field of study and a remunerative job seems obvious. It seems fairly easy to understand, then, why students are more hesitant to take the plunge in a major where connecting the dots between academic work and employment appears more difficult.
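For readers who want to check the arithmetic behind these figures, here is a minimal back-of-the-envelope sketch in Python. It assumes the roughly 49% discount rate applies to tuition alone (not to room, board, or fees), so its outputs land close to, but not exactly on, the averages quoted above.

```python
# A back-of-the-envelope sketch of the cost arithmetic quoted above.
# Assumption (not an official calculation): the ~49% discount rate applies
# to tuition only, so these outputs are approximations of the figures cited.

STICKER_TUITION = 38_960          # 2017-2018 tuition
ROOM_AND_BOARD = 14_146
MANDATORY_FEES = 1_030
DISCOUNT_RATE = 0.49              # approximate institutional discount rate
GOFFSTOWN_MEDIAN_HOME = 247_000   # median price, January-April

net_tuition = STICKER_TUITION * (1 - DISCOUNT_RATE)           # roughly $19,900
annual_bill = net_tuition + ROOM_AND_BOARD + MANDATORY_FEES   # roughly $35,000
four_year_total = 4 * annual_bill                             # roughly $140,000

print(f"Average net tuition:  ${net_tuition:,.0f}")
print(f"Average annual bill:  ${annual_bill:,.0f}")
print(f"Four-year total:      ${four_year_total:,.0f}")
print(f"Share of median home: {four_year_total / GOFFSTOWN_MEDIAN_HOME:.0%}")
```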

But the dots are there, and they can be connected if only people show a little patience.

History classes stress the analysis of various media—usually texts but also sources like film, music, painting, and so on. History majors ask and answer questions such as, “Who produced this source?” “Why did she produce it?” and “Under what circumstances was this source produced?” Ours is a reading-intensive discipline because reading is the only way to become practiced at this sort of thing. Doing this kind of work requires the development of analytical skills that lead students to sharpen their judgment. They come to understand what is likely or what is true. At the same time, they are required to synthesize a great deal of material to form a comprehensive picture of how people, places, and things have worked in the past—and how they may work in the future. They are then prepared to answer questions such as, “Why did this happen?” and “How did it occur?” What’s more, students in History are compelled by the nature of the discipline to articulate their thoughts in a systematic and compelling manner, both through discussion and on paper. In addition to being a reading-intensive discipline, we are also a writing-intensive one. Finally, the study of history leaves students with an enormous amount of cultural capital. Among other things, they encounter great literature, music, painting, movies, and rhetoric.  At the same time, they also learn about important events and noteworthy civilizations that we should all know something about—such as Han China, the French Revolution, the Zulu Kingdom, the Progressive Era in America, and World War II. Students educated in this fashion thus add to their stock of experience which helps them confront the challenges of the present.

To summarize, the course of study that History majors undergo provides them with high-level analytical skills, a capacity to synthesize large chunks of information, and an ability to present logical arguments in a persuasive fashion. Not only that, but their training offers them knowledge that helps them navigate and understand the world. These are the kinds of attributes employers are looking for even in an age where STEM seems to be king (see here, here, here, here, here, and here—you get the idea).

We know these things to be true because we see what happens to our own majors after they graduate from Saint Anselm College. Our department recently surveyed alums who graduated between 2012 and 2015 with a degree in History. We determined that out of the three-quarters who responded to the survey, 100% were employed or attending graduate school. We also found they attained success in a wide variety of fields, most of which have nothing to do with history. For sure, we always have a number of students who double-major in history and secondary education. We are proud of these students, many of whom are high achievers; in 2014 and 2015, the winner of the Chancellor’s Award for the highest GPA in the graduating class was a history major who went on to teach. And yes, we also have a small number of graduates who go on to work in history-related fields (see here and here). But around 75% of our graduates are scattered among a wide range of other jobs.

Recently, One Thing after Another engaged in the exercise of naming all the positions held by History alumni whom the blog personally knows. This list is obviously not scientific; other members of the History Department know different alums who hold even more positions. Yet what follows ought to give the reader a sense of the wild diversity of jobs open to those who major in History. One Thing after Another knows many history majors who have gone on to law school and have since hung out their shingle as attorneys. Many of our alumni also work for the FBI, the CIA, and the DHS. Others have found employment as police officers and state troopers. We have a number of alumni who currently serve as commissioned officers in the armed forces. Many have gone into politics, serving as lobbyists, political consultants, legislative aides, and town administrators. Others have been on the staffs of governors and mayors. Large numbers work in sales for a variety of industries. We have managers at investment firms and folks who work on Wall Street. Other history majors this blog knows are in the health insurance business, serve as economic consultants, hold positions in import-export businesses, have become construction executives, and work in public relations. They have also become dentists, software engineers, filmmakers, nurses, social workers, journalists, translators, college coaches, and executive recruiters. Some work in the hospitality industry as the managers of resorts, hotels, and convention centers. Others are to be found on college campuses as administrators, financial aid officers, reference librarians, and so on. And then there are the archivists, curators, and museum staffers. Remember, this list (which was compiled in a somewhat off-hand manner) is not exhaustive. It only consists of alumni whom One Thing after Another knows personally. There are many other history alums out there doing even more things.

This blog must close with a reference to Cato the Elder (portrayed above). In the years before the Third Punic War (149 BC-146 BC), this prominent soldier, politician, and historian was convinced that Carthage still presented the greatest threat to Roman power in the Mediterranean. His obsession with Carthage is captured in the story that he concluded every speech in the Senate, no matter what the topic, with “Ceterum autem censeo Carthaginem esse delendam”—which means in English, “Furthermore, I consider that Carthage must be destroyed.” This phrase has often been shortened to “Carthago delenda est” or “Carthage must be destroyed.” From this point forward, in defense of history, One Thing after Another must be as implacable as Cato the Elder, and thus, this blog will conclude every post with, “The myth of the unemployable History major must be destroyed.”

Gaughan and Jack Experience Woodside Priory School

During spring break, Education Professor Terri Greene Henning accompanied five Saint Anselm College Secondary Education students as they visited Woodside Priory School, a Catholic Benedictine middle and high school in Portola Valley, California, connected to Saint Anselm Abbey. Among those students were Colleen Gaughan ’18 and Randy Jack ’18, both history and secondary education double majors. One Thing after Another asked them to share some thoughts about their experience.

The trip began on Saturday, March 4, with a flight to San Francisco and two days of sightseeing. Students visited the Golden Gate Bridge, the University of San Francisco, the Ferry Building, the Pacific Ocean, and Alcatraz. Randy Jack said of the sightseeing, “Being a history major provided a unique perspective, because it gave Colleen and me an opportunity to appreciate the rich history of the city. Colleen and I freaked out when we saw Alcatraz for the first time! Seeing the Golden Gate Bridge was an absolute bucket list item for me so it was an incredible moment when I first laid my eyes on it.”

On Sunday, the group made their way to Portola Valley and the Woodside Priory School. The group was housed on campus for the week, which made for an inclusive and immersive experience. Each day, the group attended Mass in the morning, observed classes with students, went to sporting events, and explored the campus. Both Gaughan and Jack were placed in classrooms to observe and teach lessons.

Colleen Gaughan, who is passionate about both history and English, was placed in an English classroom for the first few days and attended a middle school US History class later in the week. Colleen said of her experience: “I think being a history major really made a difference specifically in the history classrooms. It was great to see how they were teaching history to students, especially middle school children. In the middle school US History class, they were listening to the Broadway musical Hamilton and using that to keep students engaged in the material. I think sometimes it’s difficult to get students interested when they think history is just lectures about dead people. So making history fun and come alive was helpful for the students. I think that being a history major, I was able to recall what made me love history; I saw that same passion in the students and how they were being taught.”

Randy Jack was placed almost exclusively in a Social Studies classroom and was able to teach lessons in a US History class. Jack called the experience “fantastic” and discussed how “being a history major absolutely plays a big part in how I would like to teach. While we learn a great deal about the particular strategies in our education classes, taking history classes at Saint A’s has been important in informing how I want to utilize the strategies I’ve learned in a historical context.” Gaughan shared these sentiments: “In my future history classes . . . I want students to understand concepts and how events relate to each other, rather than being nitpicky about memorization of dates. The focus of my classes will be making sure students can apply what they are learning in their history class to what they are seeing in the world. Understanding where we’ve been can help inform us on where we will go.”

When asked what moments of the trip stood out to them, Gaughan and Jack both referred to their time teaching in classrooms. For Jack, “seeing the students be so engaged and laughing and having fun while learning was a good reminder of exactly why I want to be a teacher.” For Gaughan, the spiritual value of the trip was as important as the educational value. There are three Benedictine monks from Saint Anselm Abbey at the school, and the Saint Anselm students had dinner with them one night. Gaughan said, “Since there are only three monks, there was room for real discussion. Father Martin is an alum of Saint Anselm, and he used to live here before he was asked to move out to California, so it was interesting hearing his stories about how the school has changed over the years. . . . It was also very cool from a historical perspective to hear the stories of Father Pius and Father Maurus, who were two of the Hungarian monks who escaped communism by coming to the United States and eventually set up the monastic community at Priory. That dinner was one of the most memorable events of my trip.”

Their experiences at the Woodside Priory School confirmed both Gaughan and Jack in their decision to teach history. Jack admitted, “My decision to become a history teacher was one that I pondered for a long time. It started out with my love for history; growing up I always loved talking about history. Eventually I decided I would love to be able to get a job using my love for history, and I figured education would be a good fit. However, when I finally entered the classroom as an educator my sophomore year, I realized it was so much more than that. I realized that being able to work with students and help them develop their own appreciation for history was equally important to me.”

For Gaughan, teaching history is a way to initiate a new generation of informed students. As she put it, “I love how history informs us of the past and helps us to understand the present. I think that by studying the people of the past we can understand what worked, what hasn’t, and what we might want to try. Understanding cause and effect is a pivotal part of understanding the past and the present, and I think that it’s a skill that is really important to develop and one I want to foster in my students.”

NOTE: In the photo above, Gaughan is third from left while Jack is fourth from left; both are holding the banner. Professor Terri Greene Henning is far left. 

Hitchen Saves the World at the NH Department of Environmental Services

This semester, Lily-Gre Hitchen ’18, a History major from Auburn, New Hampshire, is interning with the New Hampshire Department of Environmental Services. One Thing after Another caught up with Hitchen and asked her about her experiences.

Q: What made you decide to do an internship?

A: Ever since I was in grade school I’ve been interested in the courts, lawyers, and the law in general. When I first entered Saint Anselm College, I seriously considered the possibility of going to law school afterwards. Now being a junior, I decided to do an internship that would answer questions that have been brewing since I was a freshman. What do lawyers do? What type of work can a person do in the legal field? What is it like to work with the law? Would I like that type of work? This internship for me was all about discovery; I wanted my questions answered with experience.

Based on my time spent at the New Hampshire Department of Environmental Services (NH DES), I would recommend internships to my fellow history majors. I think they are very important in seeing firsthand how skills learned in history classes can be applied to the “real world.” If there is the time and opportunity, I highly recommend completing an internship that is of interest.

Q: What intrigued you about NH DES in particular?

A: The initial thought of working at the NH DES excited me. The main reason that I was intrigued is that I am a huge nature lover, and I care for New Hampshire’s environment in particular. I am from a small town, and I have always enjoyed the outdoors. My home state is very special to me, and I wanted to be able to protect its environment. DES was a great place to pursue my passions.

Q: Can you describe a typical day at the office?

A: I am responsible for a wide variety of tasks at the DES. During a typical day, most of my time is spent working independently on an assigned project. My assignments can range from proofreading legal documents for cross-referencing errors and creating tables expressing the changes in a set of laws to creating draft decisions on environmental fine cases. Drafting fine case decisions is my favorite project to work on because there are so many facets to making a decision. I read through the file while analyzing the case’s chronology of events, I listen to the fine hearing, and then I draft a document explaining if I think the respondents committed a violation. After I complete the draft, I pass it on for review. When I am not working independently, I am sitting in on meetings or fine hearings.

Q: Are you finding your history skills useful in your legal work?

A: My history skills have helped me in ways that I had never expected. I think the most important history skills that I have used are reading critically, paying attention to detail, not making assumptions, and being skeptical. Using these skills I have spotted mistakes in numerous documents, whether it is in their structure or chronology. Also, being able to formulate a chronology and possessing the ability to point out errors in an already provided chronology is an expertise that history majors are taught and expected to master. However, I never knew that this particular skill would be useful in the working world.

Q: What has been the hardest part of translating your classroom skills into the workplace?

A: The most difficult part of translating my classroom abilities to the workplace was asking questions. In a classroom, a professor is either always open to questions, or specifically asks, “Are there any questions?” However, in a new workplace it is sometimes a balancing act trying to find the appropriate time to ask a question. I did not want to be an annoyance, so at first I was reluctant to speak up. Over the first week, I realized that I did not need to be reluctant when asking questions; I just needed to be respectful. Everyone is busy in the office, so I only ask questions when I cannot continue my work without it being answered. Being concise when asking questions is also a part of respecting their time.

Q: So what do you do after a busy week of classes and internship?

A: Most of my time outside my classes and internship goes towards the family business. My mother owns a hair salon, Salon OPA, so I have many responsibilities there that I am proud of. I manage the inventory, cash customers out, answer phone calls, and make appointments. I also have my apprentice license in cosmetology and makeup certification, so I can perform some services. One of the most rewarding parts of working at the salon is selling wigs to women who are going through cancer treatment or have alopecia. Working at the salon has given me a joy for business, and the appreciation of entrepreneurs of all kinds.