Month: October 2014

Meet Sarah Hardin

Sarah Hardin

This fall, the History Department acquired a new faculty member, Sarah Hardin. Although World History and Conversatio will constitute part of her class rotation, her primary teaching area will be African history. A number of our students have already met her in class, but One Thing after Another hasn’t properly introduced her yet. Seeing as Sarah has had the benefit of a couple of months to settle in, One Thing after Another asked her a few questions so everybody could get to know her.

Q: Where are you from originally? What was it like growing up there?

A: My earliest years were spent in a lake house in the Piney Woods of East Texas and on Galveston Island, where I played a lot outdoors and developed my love for nature. For a short while, my dad drove a bookmobile, which I liked as a van I could play in, but at the age of 5 I hated reading. We moved to Austin, and I took school field trips to the Greenbelt, Big Bend National Park, and Mexico (to go caving), something I really miss.

Q: How did you get interested in African history?

A: Lots of little things over a long time.

In middle and high school, I was fortunate to take world history, world literature, and anthropology. I was also fortunate to have teachers who assigned Colin Turnbull’s The Forest People (a highly romanticized and problematic ethnography of the Batwa in the Belgian Congo, but amazing to my young mind) and several great novels written by West, East, and South Africans. These classes and books made me curious about how people around the world had experienced Western imperialism and what their own histories were.

In college, after finishing my Spanish minor, I started taking French since I was frustrated by not knowing how to pronounce the French words I was reading in the books in my anthropology and history classes. In French class, I learned of the Francophone world.  I also took Korean and wrote a research paper on Korean resistance to Japanese occupation.  My senior thesis was on the ways Freya Stark, a British travel writer, envisioned the end of the British empire and the future of the Middle East, Egypt, and Yemen.

After I graduated, I used my French, Spanish, and Korean language skills a little as an international student advisor at the Texas Intensive English Program in Austin for four years. I worked with students from every continent on the planet. It was great!

As I considered going to graduate school, I was interested in Latin American and Asian history, but I found that African history would enable me to combine all my interests and get a job, so here I am!

Q: How did you end up researching the relationships between cotton, pesticides, and the Fulbe people in Senegal?

A: Again, I pursued lots of little things. I originally proposed doing an environmental history of western Côte d’Ivoire, since little has been written about that area in English. At the University of Wisconsin-Madison, I got a Foreign Language and Area Studies fellowship to study Hausa, one of the most widely spoken languages in West Africa. Hausaland lies in northern Nigeria and southern Niger and is known for cotton cultivation, among other things. I thought researching cotton would be a good way to do a comparative environmental history, plus I recalled how my grandmother spoke about picking cotton as a child in Texas. I then got a Jan Vansina travel grant from the UW History Department to do research for my master’s thesis at the archives for French West Africa in Dakar, Senegal. I looked for records on northern Côte d’Ivoire and southern Niger, but I found more on southeastern Senegal. French colonizers, who tended to stereotype ethnic groups, reported that the main people who grew cotton were Pulaar-speaking Fulbe. Indeed, when I left the archives and visited that part of Senegal, people there told me the same thing. I then studied Pulaar, and when I was doing my interviews, I asked folks how cotton cultivation had changed since their grandmothers’ era; the biggest change they remembered was the introduction of pesticides. I’m now researching why certain chemicals were introduced and how they affected people’s lives. Even though a lot of pesticides were used, not much is widely known about their exact effects in those places. That information could help influence public policy.

Q: Of all the books you use in teaching various courses in African history, which one is your favorite and why?

A: One of my favorites is Jan Bender Shetler’s Imagining Serengeti: A History of Landscape Memory in Tanzania from Earliest Times to the Present because, even though it is technical and a little difficult for beginners in places, undergraduate students tell me they learn a lot from it. The book does almost everything a college-level book on African history should do. It challenges common views of the Serengeti as a natural wilderness. It undermines common notions of “tribes” and shows how social affiliations are formed. It explains the use of alternative sources and methods in history (archaeological, linguistic, and oral). And it chronicles how the Serengeti National Park we see today was created. Shetler presents the perspectives of people who have been called “poachers” and argues that it is crucial that their histories be taken seriously when considering public policies. (Given that its subject is the Serengeti, where there are no large, densely populated cities, it is not the best book for urban history, but there are other books for that.)

Q: Since the great majority of the department is not from New England, we have noticed all sorts of peculiarities about this region. Have you noticed anything unusual about life here?

A: So far I’ve found that people drive really friendly in Manchester, NH. That’s a good thing since I’ve seen a lot of motorcyclists exercising their freedom to drive without helmets!

Q: What activities do you do in your free time?

A: Exploring the beautiful hiking trails and the great bookstores in New England!

Descent into Hell: Japanese Civilians and “The Bomb”

Hiroshima Bombing Victim

“Would the Japanese have surrendered without Hiroshima?” is the question that opens Jonathan Mirsky’s review of Descent into Hell: Civilian Memories of the Battle of Okinawa.

http://www.nybooks.com/blogs/nyrblog/2014/oct/23/descent-hell/

In some ways, such an opening is unfair to the book. First-hand accounts of Japanese civilian life during World War II have not appeared in English with much frequency, let alone narratives about Japanese civilians literally caught in the crossfire of combat on land. For this reason alone, Descent into Hell should attract the attention of anybody interested in the fighting that took place in the Pacific theater during this conflict. It does not, however, necessarily answer the question of whether the Japanese would have surrendered without the atomic bomb.

The connection between Descent into Hell and Hiroshima amounts to this: Mirsky claims that the book suggests Japanese civilians were devoted to the emperor and unwilling to surrender, so only the atomic bomb could have shaken their will to continue the fight. This argument does not seem to recognize the forces that truly led the Japanese government toward surrender in 1945.

In the last year of the war, the Japanese leadership clearly understood that the United States had obtained the upper hand in the Pacific. Japanese objectives had shrunk since the heady days of late 1941. As American forces began to close in on the home islands, the Japanese hoped to preserve their independence and avoid unconditional surrender by bringing the Americans to the negotiating table. The only way to do that was by inflicting heavy losses on the United States and making the war as terrible as possible. The Japanese no longer had a navy to speak of, and they had few trained pilots at their disposal, but they believed their willingness to take and inflict casualties would eventually allow them to demoralize the Americans. In other words, the Japanese leadership did not seem overly concerned about its own losses. And certainly, one part of this calculation held true: in the Pacific, American losses in 1945 rose dramatically. Once the United States became involved in large-scale ground combat in Normandy (June 1944), its casualties began averaging about 16,000 to 19,000 men per month. Large numbers of these casualties were suffered in the Pacific: Leyte (17,000), Luzon (31,000), Iwo Jima (20,000), and Okinawa (46,000). To put these losses in perspective, the first thirty days of the Normandy campaign, which was extremely hard fought by European standards, led to 42,000 casualties. (By way of comparison, it is interesting to note that American casualties in Iraq between 2003 and 2012 amounted to about 36,000.)

If the Americans were appalled by the obstinacy of Japanese resistance, they were still capable of applying enormous military pressure on the home islands. Their submarines continued to wipe out the Japanese merchant fleet, their planes had mined Japanese waters and brought the coastal trade to a halt, and their bombers had started to lay waste to Japanese cities. Among other things, Japan was running out of food, and in October 1945, after the war was over, famine was averted only by massive American aid.

Yet the Japanese believed they had additional cards to play. The Soviet Union had remained neutral in the Pacific war, and the Japanese set great store by being able to use the Soviets as an intermediary in talks with the United States. There was even some hope that Japan could foment discord between the United States and the Soviet Union, whose alliance had always been somewhat awkward. Such thinking was delusional, but it kept the Japanese leadership hanging tough.

At Yalta (February 1945), the Soviets agreed to join the Pacific war within three months of the end of the conflict in Europe. As German resistance collapsed, the Soviets feverishly prepared to launch an invasion of Japanese-occupied Manchuria. They kept these plans secret, of course, from the Japanese. The Americans, for their part, had tested their first atomic bomb by mid-July and were now keen to end the war before the Soviets became heavily involved in Asia. The problem was that the Japanese showed little sign of wishing to surrender on terms that the Americans were willing to accept. The Americans wrestled with what kind of terms they might be willing to concede to bring the Japanese to capitulation, but the lure of a quick, unconditional surrender that would not cost many American lives proved impossible to resist. The Americans prepared for an invasion of the home islands, but they also deployed their atomic bombs.

The first atomic bomb hit Hiroshima on August 6, and a second bomb fell on Nagasaki on August 9. On the latter date, the Soviets invaded Manchuria. Which of these events convinced the Japanese government to surrender has been at the center of a lively debate. On one side, Tsuyoshi Hasegawa, author of Racing the Enemy (2005), has argued that the Soviet declaration of war played the preponderant part in the Japanese decision to capitulate. On the other, scholars like Richard Frank, who is perhaps best known for Downfall (1999), maintain the more traditional view that the atomic bombs brought the Pacific war to an end.

Of course, if one tends toward Hasegawa’s view, then Mirsky’s opening question is irrelevant because the atomic bombs did not really end the war. But even if one does not see eye-to-eye with Hasegawa, Mirsky’s question is still irrelevant. While these historians might disagree on what prompted Japan to throw in the towel, there is one thing on which they all agree: the feelings of Japanese civilians did not enter into the matter. At the end of the day, it was the Japanese cabinet and emperor that made the decision. As Max Hastings has argued in several of his works about World War II, when it comes to waging armed conflict, authoritarian regimes have this great advantage over democracies: they can exert more coercion against their own people and need not engage in consultation. For that reason, they are capable of extracting more from their populations and suffering losses that more representative forms of government would never countenance. And so it was with the Japanese in August 1945. Read Descent into Hell, and read it for any number of reasons. But it won’t tell you whether the atomic bombs were necessary.

How Many Native Americans Were There before Columbus, and Why Should We Care?

The manner of their attire.

The online edition of The Atlantic recently republished the following article that first appeared in 2002:

http://www.theatlantic.com/magazine/archive/2002/03/1491/302445/

The Atlantic wants us to take advantage of Columbus Day to reflect on the historiographical debate concerning the pre-Columbian population of the Americas. We should take the monthly up on its offer because this debate is, in many ways, exemplary: it captures what is so interesting and significant about these kinds of controversies.

First, it indicates very clearly what historians do. Historians, of course, dispute what happened and how it happened. They almost always have to do so with limited evidence. Perhaps even more important, they also argue about the significance of what occurred. The debate over America’s pre-Columbian population revolves around a number of very big and difficult questions. Before Europeans exerted any influence on the Western hemisphere, how many people lived in the Americas? How did they live, and what was the nature of their influence on the land? What was their quality of life when compared with the Europeans who arrived on their shores? How did they die, and how did this death lead to important changes?

Second, this debate involves a variety of fields. Any historical debate of significance is, to some extent, interdisciplinary. That’s because big questions bleed into a variety of fields. This particular argument involves not only historians, but also archeologists, anthropologists, geographers, epidemiologists, ethnographers, demographers, botanists, and ecologists. In some ways, the interdisciplinary nature of this discussion is a virtue in that a variety of fields can see a question in the round. At the same time, however, conversations between disciplines can become chaotic because they employ varying approaches and see the world from different perspectives. That kind of situation can lead to specialists talking past one another, making it difficult to reach agreement.

Third, as all the participants seem to recognize, this discussion informs a series of important contemporary arguments. The most significant ones have to do with, first, our relationships to each other and, second, our relationship to the land. To start with the first one, as Mann points out, “given the charged relations between white societies and native peoples, inquiry into Indian culture and history is inevitably contentious.” Controversies revolving around such issues, of course, touch upon the responsibilities of white societies to native peoples. While the contemporary discussion influences how each group sees the other, it also shapes the way each group understands itself. In other words, this debate has much to do with identity. This feature of the argument partially explains its ferocity. All historiographical debates of any worth involve a collision of world views, but this collision is fraught with emotion.

As for the second contemporary argument, the one revolving around our relationship to the land, this historiographical debate has great significance. For centuries, the myth of the noble savage led many to believe that native Americans were a part of nature rather than actors who molded that nature. The modern environmental movement took inspiration from that vision. If you don’t believe One Thing after Another, take a look at the famous public service announcement produced by Keep America Beautiful in 1971 (otherwise known as the “Crying Indian ad”). Here, the iconic native American represents nature and serves as the standard by which to criticize a dysfunctional and polluted modern world. But as experts in a variety of fields increasingly appear to believe, native Americans were a sort of “keystone species” who influenced the land to suit their own needs. In other words, as long as there have been humans in the Americas, there has been no such thing as “pristine” nature.

Because of their political consequences, these kinds of debates tend to smolder for decades, occasionally breaking out into open flame. For the same reason, the findings that issue from these arguments often find their way to the public in distorted form via the news media. It is for these reasons that it makes sense to read up on these controversies from the beginning. We should all take a look at Dobyns, Crosby, and Cronon’s works.  Of course, who has time for that except historians?

The Beliefs We Profess

medieval professor

Our students, colleagues, and the general public almost always see faculty members in their “professor” roles. The word originally referred to one who declared (professed) a faith or belief, which points to its origins in the monastic realm (monks still profess their vows). It also shows how thoroughly academics were expected to know their material; they were literally professing that which they understood to be true and real, not guesses or speculation.

Regular readers of One Thing after Another will know that historians rarely profess facts alone, but rather the intellectual beliefs that make facts relevant and bind them together into evidence-supported explanations for past events. The true faith of the historian lies not in the facts themselves, but in our training to see which facts are more likely to be true, which are important to weave together, and which questions to ask in order to arrive at answers that help us make sense of the past.

Yet particularly when we spend the majority of our time teaching, even we historians fall into the trap of seeing ourselves as others see us – as knowing a huge number of things and sharing that knowledge, lecture by lecture and reading by reading, with those who know less. This is especially true when a student asks a question and we can draw on all we know to construct a rich, complex, and, on our very best days, fascinating answer – it feels like we are on top of our field, a true professor.

Even when we teach students how to do what we do in the senior research class, it can feel as if we are working with apprentices to whom we dole out the techniques and tricks that have made us masters. Even when their topic is not our own, we know how to find the central works in the field, lay out the historiography, identify the central research questions, place our arguments in the discussion, locate the primary sources, and read past and around the inherent biases. This is what we do, and we teach others to do it as well as we can. At our best, we become colleagues, not masters, encouraging students to be historians, to take the lead. We ask them, at their best, to show us not only that they know the techniques, but that they can use them creatively to form their own interpretations.

So it can come as a rather great surprise when we take on a new research project of our own to realize that we feel remarkably like apprentices ourselves. It is the little things, the lack of basic answers to basic questions. For example: In which of the four dozen books published on nineteenth-century American religion in the past ten years can I find an answer to the seemingly simple question of how many people were members of churches in 1820? Why does this author keep using the terms “orthodox” and “evangelical” without explaining them, and where can I look up what they mean in this context? Since most letters are saved by the person to whom they were sent, not the person who sent them, how do I figure out with whom my subject corresponded, without having a list of names?! The problems differ by field and by question. But each of us, when we do serious research, is brought up short by what we do not know, what we cannot answer, what we do not understand – yet.

And in that yet is the historian’s real faith, what we profess. For it is the historian’s central belief that if we ask enough carefully constructed questions, search widely enough for the facts, and infer deeply enough from what is present and what is missing, we will find some way to tell the story. What we produce may not be the definitive answer for all time; more likely it will be complete and compelling for a time, but subject to later additions or revisions as new questions are asked and new sources found. But we stake out our ground and do our best. We trust that no matter how much we are the apprentice before a new question or a new set of sources, dogged application of the historian’s craft will – sometimes ever so agonizingly slowly – point us toward answers, sources, and connections.

As medieval craft masters knew, there is great economic value in protecting your trade secrets, sharing them rarely, and always appearing all-knowing. Students and parents might well prefer to pay for dispensers of wisdom rather than for professors of faith. But just between us, it is the historian’s faith that may be our most valuable property, that which produces the ultimate goal – new historical understanding and new historical thinkers. In our research, we remind ourselves of that, and it perhaps makes us better teachers as well.

Do You Favor Independence for Scotland?

Devolution

Over the last month, a number of people (mostly students) have asked One Thing after Another, “What do you think about the referendum on independence for Scotland?” One Thing after Another has always hesitated to respond because such a question involves predicting the future (i.e., determining whether Scotland would be better off alone). The study of history sharpens our judgment and allows us to meet the challenges of today’s world in an informed manner. It does not, however, allow anybody to make prognostications with any kind of accuracy.

While ruminating upon this question, One Thing after Another noticed the following article in The Atlantic which uses the referendum in Scotland as a launch pad to discuss the future course of world politics:

http://www.theatlantic.com/international/archive/2014/09/stronger-than-democracy/380774/?single_page=true

One Thing after Another had something of an “a-ha” moment (an epiphany, not a flashback to the band), and thought this article called for a historically informed response that addressed some major issues associated with Scottish independence.

In tackling the particular case of Scotland, Parag Khanna, the author, makes much sense. He is correct that, from a political perspective, those who favored greater autonomy for Scotland would win no matter what the outcome of the referendum. If the measure passed, Scotland would obtain independence. If it failed, the Scots would nevertheless obtain many devolved powers. It is when he wanders from the example of Scotland that Khanna encounters some semantic difficulties and makes a number of questionable assumptions.

There are two semantic problems with Khanna’s argument about devolution. First, Khanna leads his essay by claiming, “The 21st century’s strongest political force is not democracy but devolution.” However, demands for devolution of the sort that Khanna refers to are expressions of nationalism. And nationalism, especially in its voluntarist version, has long been tied to democracy. As John Stuart Mill (1806-1873) wrote in Chapter XVI (entitled “Of Nationality, as Connected with Representative Government”) of Considerations on Representative Government (1861), “Where the sentiment of nationality exists in any force, there is a prima facie case for uniting all the members of the nationality under the same government, and a government to themselves apart. This is merely saying that the question of government ought to be decided by the governed. One hardly knows what any division of the human race should be free to do if not to determine with which of the various collective bodies of human beings they choose to associate themselves.” If, as Mill argues, the “question of government ought to be decided by the governed,” the case is a democratic one. In other words, pace Khanna, devolution (and the desire for self-determination that lies behind it) is not distinct from democracy—rather, it is an expression of democracy.

In this context, One Thing after Another can think of no better authority than Ernest Renan (1823-1892), the French historian who wrote, among other things, the famous essay “What is a Nation?” (1882). Renan argued that the most important element of nationality was a willingness on the part of the nation’s members to live together as part of a national community. As he famously put it: “The existence of a nation . . . is a daily referendum, just as the continuing existence of an individual is a perpetual affirmation of life.” What an appropriate analogy in the case of Scotland—the Scots voted in a referendum and decided, just barely, to continue living as part of a national community of Britons.

Second, Khanna uses the word “devolution” to describe both independence and increased autonomy. Independence for Scotland is placed in this category, as are the 75 new states that resulted from decolonization in the post-World War II era and the 15 states that emerged from the disintegration of the Soviet Union. But in the course of his argument, so are Texas, California, Western Australia, British Columbia, Quebec, Catalonia, and the Basque Country. Unfortunately for this type of taxonomy, there is a world of difference between an independent state and one that constitutes part of a federal union (or possesses the status of a semi-autonomous region). Independent states are sovereign, while members of a federal union are not.

Two of Khanna’s assumptions are also questionable, and both of them seem to stem from the kind of breathless, contemporary forecasting that assumes globalization, connectivity, and information technology have changed everything beyond recognition.  First, even though we have only reached 2014, he declares that devolution is the “political force of the 21st century.” As any historian can tell you, Fortune’s wheel can turn rather suddenly. Think about what the world looked like 86 years ago (1928). The world was dominated by European empires, and although an intelligent observer might have spotted the approach of world-wide devolution (George Orwell was one such figure), such a vision would have required much prescience. Who is to say that the globe of 2100 will resemble our devolving world? Making statements about the 21st century eight-and-a-half decades in anticipation of its completion leaves many hostages to fortune.

This point is related to a second one. Khanna does not provide us with a good idea of what the future holds because he completely disregards the fact that small states live in a world dominated by big ones. The United States, China, and Russia all have important spheres of influence that they seek to shape through military and economic means. While they have not necessarily always been successful, they are more movers than moved. Certainly, they have intervened decisively in smaller states—ask the Iraqis, Tibetans, and Georgians, to name just three examples. At the same time, the massive economic influence of large powers has yielded important political results. To take the crudest example, the Russians have used their control of energy to manipulate Europe. Large states and empires may be unwieldy, as Khanna claims, but collections of small states acting together in pursuit of common interests are just as unwieldy. The European Union is not exactly a nimble beast. Moreover, at the end of the day, such supranational organizations rely on large or medium-sized powers as backstops. Germany has come to dominate the European Union, and a Scotland escaping English control might, like the Greeks, eventually find itself at the mercy of Teutonic bankers. Likewise, Eastern European states with large Russian minorities (e.g. Latvia) are turning to NATO for support because they see what has happened to Ukraine—but what credible deterrent does NATO present if not the fact of American power? It is for this very reason that František Palacký (1798-1876), the historian and leader of the Czech national revival, famously claimed that if the Austrian Empire did not exist, it would have to be invented. In other words, the Czech people needed room to grow, but they could only do so within a larger, multi-ethnic Austrian Empire; otherwise, they would fall prey to either the Germans or the Russians. In a devolved world, where can small states find security?

Finally, devolution is not a universal solution to political problems. In many cases, ethnicities and nationalities are so thoroughly intermingled that it is impossible to redraw territorial boundaries in a way that suits everybody (a point that even John Stuart Mill understood in 1861). The Balkans are, of course, the obvious example, but the problem exists in many other parts of the world. This difficulty of intermingling raises another question: to what level should devolution descend? How small can a devolved state be? Historians of nationalism have long noted a “threshold principle”—that is, a nation has to be large enough to be viable. Nationalists used to speak of economic or cultural viability, but what about military viability? Where is that threshold, especially in a world where bigger states can prey upon smaller ones? To use one example, the collapse of the Austro-Hungarian Empire and the forging of the Paris settlement of 1919 led to the creation of many small Eastern European states—almost all of which fell first into the German orbit before succumbing to rule from Moscow.

Should one be for or against independence for Scotland? For sure, there are many local or parochial issues to consider with regard to the relationship between the Scots and the English (e.g. changes in taxation). But as one contemplates these questions, one should also explore the larger global context that will influence how they are resolved.