So let’s play a history game.
If you were going to teach a course on the American Revolution, what would be your starting point?
(a) 1776 (b) 1775 (c) 1763 (d) 1754
There are arguments for each of the above. You might argue for (a) on the grounds that colonial resistance to British authority did not become an official struggle for independence until the Declaration of Independence of that year. You might opt for (b) by reasoning that the real line of no return leading to revolution was crossed in 1775 when “the shot heard round the world” rang out on Lexington Green. You might choose (c), figuring that the chain of events that ultimately culminated in revolution was triggered by a change in England’s policy toward her American colonies in the aftermath of the Seven Years’ War. Or you might advocate for (d) on the basis that the Albany Plan of Union of that year marked the first halting expression of intercolonial cooperation and was thus a foreshadowing of the revolution to come. Which would you choose?
My answer would be (e) “none of the above.” This is not because I would choose a different year as my starting point. I wouldn’t choose any year. I don’t like the idea of diving into the topic of the course until we’ve spent some time thinking about what we’re going to be doing and why it might matter to us. Remember the concluding point of my last post? Historical knowledge is all well and good, but I want my students and me to be after what Christian historian Herbert Butterfield called “the deeper wisdom.”
Historical knowledge is most valuable, Butterfield maintained, when it is “transmuted into a deeper wisdom that melts into the rest of experience and is incorporated in the fabric of the mind itself.” The goal is not to master a bunch of historical facts, although they do come in handy for Jeopardy or Trivial Pursuit. The goal is wisdom: transforming insight that changes how we see the world. We’ll never achieve that with a Sergeant Joe Friday “just the facts, ma’am” approach to the past. I want my students to approach the American Revolution (1) thinking about their thinking, (2) scrutinizing their hearts, and (3) expecting transformation. Each is essential. I’ll share some thoughts about the first of these goals in this post, and take up the second and third goals next time we talk.
With all three practices in mind, I began my current course on the American Revolution with a two-week unit titled “What We’re Doing and Why.” We started with a pair of basic (but hardly simple) questions aimed at helping them to practice metacognition, i.e., to think about their thinking. “What is history?” we asked, and “What is involved in thinking historically?” We approached these questions from a number of angles, but I think my favorite point of entry involves a comparison to what I call the You Are There approach to the past.
To convince my students that I am just a little bit younger than dirt, I told them about a Saturday morning TV program that I got hooked on during junior high, back in the dark ages before cable TV. The show was titled You Are There, and for several years CBS aired it on weekends with the goal of luring kids away from Saturday-morning cartoons by making history “come alive” on the television screen. It was hosted by news anchorman Walter Cronkite, a highly respected TV journalist affectionately known as “Uncle Walter” and frequently touted as “the most trusted man in America.”
The program would begin with Cronkite introducing a crucial episode in history from his news desk, then shift to “live coverage” of the moment as real network correspondents “interviewed” key figures (like Julius Caesar, Abraham Lincoln, or Elizabeth Cady Stanton) and narrated events “as they unfolded.” Before “going live on location,” Cronkite would assure viewers, “Everything you see here was as it happened that day, except . . . [pause for dramatic effect] You Are There.” It was a clever premise, and, nerd that I was, I watched a bunch of episodes. Indeed, it’s but a slight exaggeration to trace my lifelong passion for history to those Saturday mornings with Uncle Walter.
Mock me if you will, but the kind of history that most of us are drawn to has a lot in common with that TV show. They may be less hokey, but the history books we are drawn to are the ones that make the past “come alive” for us, and as a general rule they follow the same basic strategy. Like You Are There, they seemingly transport us to another time, enabling us to observe the past directly and listen in as figures from the past speak for themselves. The good news is that we can learn a lot of history from such an approach, as I believe I did on those Saturday mornings long ago. The bad news is that we don’t learn a single iota about thinking historically. For all its attractions, history of the You Are There variety discourages us from distinguishing between (1) what actually happened in the past, (2) our understanding of what actually happened, and (3) the art of reconstructing what actually happened. At bottom, it misleads us as to what history is and what historians do, and it’s the features that we like most about it that turn out to be the most pernicious.
In the process of making the past “come alive,” the You Are There approach obscures the absolutely fundamental distinction between “the past,” on the one hand, and “history” on the other. The past is everything that has happened before us, what C. S. Lewis memorably likened to a “roaring cataract of billions upon billions” of individual moments. (Click here for my thoughts on Lewis’s marvelous metaphor.) History, in contrast, concerns subsequent human understanding of that awesome totality. The difference is immense. It brings to mind Walt Whitman’s famous dictum about the American Civil War. Having witnessed its carnage firsthand, the poet was certain that mere writers with pen and ink could never capture the conflict’s horrific human cost. Try though they might, he concluded, “The real war will never get in the books.” Whitman was right, but his insight applies more broadly than he realized. The real past never “gets in the books,” not completely and objectively, for the simple reason that the past itself is gone forever. Coming to grips with this truth is the first step to thinking historically.
In like manner, history of the You Are There variety obscures the absolutely indispensable role of the historian, who becomes little more than a reporter “on location” telling us just what she sees. To say that the past is gone forever is not to say that it is wholly unknowable, but rather to underscore that the process of gaining historical knowledge is much more complicated than is commonly understood. Because we cannot observe the past directly, we must puzzle instead over vestiges of that vanished reality, traces that endure in what historians call “primary sources”: artifacts such as diaries and memoirs, newspapers and correspondence, legal records and census data, architecture and archaeological remains.
Complicating our task is the reality that these echoes are always woefully incomplete. Whatever the topic that interests us, we never have all the relevant facts at our disposal; we work instead with a subset, often a minuscule proportion. What is more—tired clichés notwithstanding—those facts that remain never “speak for themselves.” They lie silent and inert until the historian breathes life into them, in effect resurrecting them, by fashioning them into a persuasive interpretation. Interpretation is at the very core of the historian’s task.
This is why history, as a serious field of study, involves so much more than the mastery of discrete facts about the past. At its richest, history is both a branch of knowledge and an intellectual discipline that trains the mind in ways of thinking that enrich historical understanding. As we learn to think historically, we develop habits of mind that stick with us long after we’ve forgotten the forgettable facts that cram the pages of history texts. We search reflexively for patterns of change over time. We think critically about cause and effect. We’re aware of historical contingency and sensitive to context and complexity. We know that sound historical interpretations require a foundation of reliable evidence, and we practice the critical thinking skills that allow us to analyze historical sources effectively and evaluate their testimony wisely.
So much for the discipline of historical thinking. But why should we study history at all? Why pay attention to the past in the ever-changing 21st century? Good question. We’ll talk about that next time.