A MOVIE FOR LINCOLN’S BIRTHDAY

Today is Abraham Lincoln’s birthday (he was born 207 years ago, if you’re wondering), and a great way to commemorate the occasion would be to watch one of the best movies about American history ever made, the 2012 Steven Spielberg film Lincoln, starring Daniel Day-Lewis, Tommy Lee Jones, and Sally Field. On the whole, academic historians praised the movie when it came out, and I generally concur. Lincoln can be criticized for numerous factual inaccuracies (most of them minor), but by Hollywood standards, the film makes room for an unusual degree of historical complexity. I recommend it highly.

To begin with, the entire structure of the film drives home the complicated interrelationship between the issues of slavery and race in mid-nineteenth century America. One of the most important things to understand about the coming of the Civil War is that southern whites tended to believe that the defense of slavery and white supremacy were inseparable, while northern whites thought otherwise. As the sectional crisis of the 1850s intensified, southern whites tended to see any criticism of slavery as an assault on racial hierarchy. Northern whites, in contrast, were divided on the matter. While northern Democrats regularly condemned abolitionism as part of a fanatical crusade for racial equality, northern Republicans went out of their way to separate the issues of slavery and race. Indeed, they had no choice if they wanted any kind of political future. Northern voters were not ready to embrace racial equality, even as a hypothetical goal, but the majority, at least, might be convinced to support the end of slavery if emancipation did not seem to threaten the privileged position of whites in American society.

Lincoln makes this point wonderfully in the scene in which Pennsylvania Republican congressman Thaddeus Stevens (played by Jones) disavows support of political or social equality for former slaves, even though he had long been a supporter of both. The clear message of the scene—a historically accurate one—is that passage of the Thirteenth Amendment required that the party of Lincoln frame the racial implications of emancipation as conservatively as possible.

The movie also illustrates nicely the considerable diversity within the Republican Party itself with regard to emancipation and racial equality. Whereas scenes situated in the House of Representatives commonly pit Republicans against Democrats, many of the movie’s more intimate conversations—in the president’s cabinet room, the executive office, even the White House kitchen—were designed to highlight differences of opinion among Republicans themselves. So, for example, we see Jones’ Thaddeus Stevens chiding Lincoln for his timidity and telling the president that the only acceptable course is to free the slaves, expropriate the land of their masters, and totally remake the southern social and racial structure. But we also listen in as Maryland Republican Francis P. Blair (played by Hal Holbrook) lectures Lincoln that conservative Republicans will never support emancipation at all unless they can convince their constituents that the measure is absolutely necessary to win the war. The movie does an outstanding job in helping us to imagine just how difficult a task it was for Lincoln to satisfy the disparate factions of his own party and still fashion a reasonably coherent public policy.

Yes, Lincoln gets a lot of its history right, and in a medium in which that rarely occurs. And yet the message of the movie is historically inaccurate and anachronistic. What is Lincoln trying to say to us? I suspect that historian Louis Masur is correct (writing in The Chronicle Review) when he observes that the film aims “to restore our faith in what political leaders, under the most trying of circumstances, can sometimes accomplish.” I’m no movie critic, and I don’t know for sure what director Steven Spielberg or screenwriter Tony Kushner intended, but this certainly seems to be the message that emerges. Not coincidentally, it is a message that many Hollywood liberals would find comforting: a determined leader uses the power of government to push a reluctant nation toward a self-evidently righteous end.

With this central point in mind, I thought one of the most dramatically critical moments of the movie came in the scene in which Lincoln grows angry at naysayers in his cabinet. As they insist that the votes necessary to pass the Thirteenth Amendment in the House simply aren’t there, Daniel Day-Lewis’s Lincoln rises to his feet and thunders, “I am the President of the United States of America, clothed in immense power! You will procure me these votes.”

In fairness, I don’t think that such a reading of Lincoln’s leadership is entirely off base. Lincoln was an adept politician who successfully held together a diverse coalition during the greatest trial our nation has endured. More specifically, the movie’s portrayal of Lincoln’s sense of urgency in pressing for a vote on an emancipation amendment before the war’s conclusion is well grounded in historical evidence. And in the end, it is undeniable that our sixteenth president forcefully promoted a measure—the abolition of slavery—that a substantial majority of the nation’s free population opposed. At the same time, however, the movie’s simplistic message requires a selective reading of Lincoln’s private papers and public pronouncements. Such a selective reading is facilitated by the chronological focus of the movie, which centers almost entirely on the first few weeks of 1865. A broader focus might have complicated the film’s central message enormously.

Ever since Lincoln’s assassination, well-meaning Christians have insisted that “the Great Emancipator” was a sincere follower of Jesus. I would never say dogmatically that he was not (who can know the human heart save God alone?), but I will say that almost none of Lincoln’s closest contemporaries viewed him as a man of orthodox faith. The best modern scholarly study of Lincoln’s religious beliefs—by a nationally respected Christian historian, Allen Guelzo—argues persuasively that Lincoln never fully accepted the Christian concept of a God who intervenes in the world to effect the salvation of individual sinners who trust in Him. (I highly recommend his biography Abraham Lincoln: Redeemer President.) And yet Lincoln did believe in Providence. In his early adult years such faith amounted to little more than a belief in a “First Cause” or “Prime Mover,” but by the beginning of the war Lincoln had come to believe in a God who actively superintended human affairs. As the war grew long and its human cost soared, furthermore, it is clear that the president ached to find some larger meaning or divine purpose in the conflict.

Long before the events dramatized by Steven Spielberg, Lincoln had begun to ask profoundly religious questions about the war. Possessing a logical bent of mind (the movie rightly hints at his appreciation for Euclid’s theorems), the lawyer Lincoln wrestled with the possible implications of the war’s unexpected length and butcher’s bill. Sometime in 1862 he jotted down his inchoate thoughts on the matter, and the undated memorandum was later preserved by his personal secretaries and given the title “Meditation on the Divine Will.” Lincoln’s memo to himself begins with this bedrock assumption: “The will of God prevails.” In the brief paragraph that follows, Lincoln noted that God could bring victory to either side instantly, and “yet the contest proceeds.” This suggested a conclusion that Lincoln was “almost ready” to accept as true, namely, that “God’s purpose is something different from the purpose of either party.”

Lincoln’s suspicion that God was at work for some larger purpose continued to grow as the war dragged on, and increasingly he suspected that the divine design was to bring an end to slavery. Lincoln understood full well that the North had not gone to war in 1861 with that objective in mind, and over time he came to believe that God was prolonging the war until the North embraced and accomplished that goal. If Salmon Chase and Gideon Welles can be trusted (two of Lincoln’s cabinet members who kept careful diaries during the war), Lincoln privately explained his decision to issue the preliminary Emancipation Proclamation as the result of a vow to “his maker.” If God allowed the Union army to repulse Robert E. Lee’s invasion of Maryland, Lincoln told his assembled cabinet, he had resolved to “consider it an indication of the divine will and that it [would be] his duty to move forward in the cause of emancipation.”

Lincoln gradually developed this theme more publicly as the war continued. In the spring of 1864, for example, in a speech in Baltimore he observed that neither side had anticipated “that domestic slavery would be much affected by the war.” “So true it is,” Lincoln noted, “that man proposes, and God disposes.” That same month Lincoln wrote similarly to a Kentucky newspaper editor. “I claim not to have controlled events,” he related, “but confess plainly that events have controlled me. Now, at the end of three years struggle the nation’s condition is not what either party, or any man devised, or expected. God alone can claim it.” A few months later Lincoln wrote to a political supporter that “the purposes of the Almighty are perfect, and must prevail, though we erring mortals may fail to accurately perceive them in advance. . . . Surely,” Lincoln concluded, the Lord “intends some great good to follow this mighty convulsion, which no mortal could make, and no mortal could stay.”

The culmination of such reasoning came in Lincoln’s rightly admired second inaugural address, a speech that also serves as the culmination of Lincoln the movie. Yet screenwriter Tony Kushner chose to include only the final fourth of that very short speech (the original ran only 703 words), and he leaves out the most religiously significant passages of an address that is arguably the most profoundly religious public reflection ever uttered by an American president. The movie ends with Lincoln’s famous call for “malice toward none” and “charity for all,” but that plea can only be understood in the context of what had preceded it. Echoing the insight that had come to define Lincoln’s personal understanding of the war, the president had told the assembled throng that neither side had anticipated the end of slavery and both had hoped for an outcome “less fundamental and astounding.” Although both sides “pray[ed] to the same God,” the prayers of neither side had been fully answered. “The Almighty has His own purposes.” Since neither side had been fully in step with God’s will, it made no sense for the victorious side to impose a self-righteous and vengeful peace.

I have observed in this blog that history can function in a number of valuable ways as we go to the past for enlightenment. As a form of memory it aids our understanding. As a kind of mirror it sharpens our self-perception. History is also a kind of conversation across the ages. In the midst of our nation’s greatest trial, Abraham Lincoln wrestled with questions of profound importance. We would benefit from hearing him and from wrestling ourselves with his conclusions. For all its virtues, Lincoln won’t help us with that.

HAPPY LINCOLN’S BIRTHDAY

When I was in grade school (not quite half a century ago) I used to love the month of February. Not only did we typically have a class Valentine’s Day party, but there were not one but two school holidays in rapid succession: Abraham Lincoln’s birthday on February 12, and George Washington’s birthday ten days later.

Although I didn’t know it at the time, we owed the second holiday to the U. S. Congress, but the first holiday came courtesy of the great state of Tennessee.  That is, Washington’s Birthday was a federal holiday–and had been since the 1880s–but Lincoln’s Birthday was a state holiday, observed in parts of the country and not in others.  On the eve of WWII, it was an official holiday in twenty-six states, the District of Columbia, and the territory of Alaska.

The number of states with a Lincoln holiday fell sharply after 1971. The culprit was the Uniform Monday Holiday Act, passed in 1968 and effective that year, a pragmatic measure that moved a number of federal holidays to the nearest Monday, primarily to generate more three-day weekends for federal employees. The act shifted observance of Washington’s Birthday from February 22nd to the third Monday of the month, and although merchants and advertisers soon began referring to the holiday as “Presidents’ Day” (or “President’s Day,” or “Presidents Day”), there is technically no federal Presidents’ Day holiday. The day off many of us will get next Monday (February 15th) comes courtesy of the federal observance of George Washington’s Birthday, even though many of us assume that the Monday holiday was intended to honor both presidents.

At any rate, by 1990, the number of states with a Lincoln’s Birthday holiday had fallen from twenty-six to ten, and now it has dwindled to four: New York, Connecticut, Missouri, and (of course) Illinois, the state where I now reside, with “Land of Lincoln” on its license plates.

Even though most of us don’t remember Lincoln’s birthday anymore, my new friend Wayne Shepherd does. After more than three decades with Moody Broadcasting in Chicago, Wayne now hosts several nationally syndicated radio programs as well as a weekly thirty-minute podcast called First Person with Wayne Shepherd. With Lincoln’s birthday in mind, he invited me to his studio not long ago to discuss our nation’s sixteenth president.

If you’re interested, you can listen to our conversation here.

Happy Lincoln’s Birthday, and do remember the man’s own wise warning: don’t believe everything you read on the internet.

WHY I LOVE WHEATON COLLEGE—PART ONE

Wheaton College’s Blanchard Hall

These are difficult days at Wheaton College, dark, discouraging days. A storm broke over our heads last December. It erupted when our colleague, Dr. Larycia Hawkins, posted comments online that some readers interpreted as equating Islam and Christianity. It intensified when the college’s administration first suspended Dr. Hawkins, then announced that it would seek to dismiss her from the faculty. Perhaps an end is now in sight. Over the weekend the administration announced that it was withdrawing its request to terminate Dr. Hawkins and then disclosed that Hawkins and the administration had mutually agreed to “part ways.” How these steps will be received—what they will mean to faculty, staff, students, alumni, and the larger world—is an open question.

What is certain is that the controversy has exacted a heavy toll. For the past two months we’ve been besieged from left and right. Liberal detractors have denounced Wheaton’s fundamentalism and Islamophobia, even as conservative critics have lamented the school’s surrender to theological liberalism and political correctness. “Woe to you when all men speak well of you,” Jesus said. At least we don’t have to worry about that.

Just as sloshing a coffee cup reveals what’s inside it, the stress and strain of the controversy have shown the world the truth behind our admissions brochures. We’re a fallen institution staffed by fallen men and women. More precisely, we’re sinners—to use an unpopular term—and I’m the chief of them. As a recent speaker on our campus put it, it’s wholly fallacious to think that we’re in the business of receiving innocent Christian teenagers (they’re not) with the goal of preserving their innocence (we can’t). Instead, we’re a community committed to joining a two-thousand-year-old conversation about the meaning of the claim that “Jesus Christ is Lord.” Together, we explore the implications of that declaration, both for our innermost selves and for the way that we engage the world. Yes, we are fallen, but our calling is high and wonderful, and the opportunity to pursue it is unspeakably precious.

That is why I love Wheaton College.

I don’t love it because it’s perfect. (See above.) And I’m not saying that I love it at this moment in order to make a point about who’s been right in the current controversy. I’m making this declaration—I feel compelled to make it—because I’m sick at heart and I’ll burst if I stay silent. Too much recent criticism of the college goes beyond the matter at hand to call into question Christian education more generally. In reply, I want to follow the example of generations of evangelicals before me and share my testimony. I use the term advisedly. What follows isn’t a systematic argument about the pros and cons of Christian education. I’m just going to testify to my experience. You can make of it what you will.

You should know that my perceptions of Wheaton College are inseparable from the twenty-two years that I spent at the University of Washington before coming here. William Faulkner is famous for observing that “the past is never dead. It’s not even past.” More poetically, in Intruder in the Dust, one of Faulkner’s characters explains, “It’s all now you see. Yesterday won’t be over until tomorrow and tomorrow began ten thousand years ago.” Faulkner meant that we never meet the present in pristine purity. The past is ever with us, shaping who we are, what we notice, how we see. Surely my story bears this out. Every day that I come to work, I see and feel and experience Wheaton in the light of my time in the secular Academy. How could it be otherwise? It was in the secular Academy that I first learned to think, to research, to teach and to write. It was there that my sense of vocation was originally conceived and nurtured. And it was there, above all, that I developed a longing for a kind of education that the secular Academy could never deliver.

The University of Washington’s “Cathedral of Learning,” Suzzallo Library. I had a private study on the library’s fifth floor.

As I reflect on it, my time at the University of Washington divides neatly into two periods. The first was the tenure-track years, when my highest priority was not to think about my job but to keep my job. If you’re not familiar with the process, most colleges and universities give their new full-time faculty six years or so to earn tenure, and if they fall short of the institution’s standards, they’re sent packing. You’ll probably think that’s more than generous if you earn your living in the business world, where employees are regularly fired or laid off on short notice. The difference is that in the academic world—in large part because of the tenure system—job turnover and new job creation are minimal. Professors who are denied tenure rarely find other academic positions. You don’t start over at another school. You start over in another line of work. And if you’ve already spent six to eight years (or more) toiling on a Ph.D. and another six years of 60-70-hour work weeks as an assistant professor, you can understandably conclude that you’ve just wasted a good part of your life. The stakes are enormous, and that has a way of keeping you focused.

At the time, I would have described these years primarily in terms of their intensity. Now, I remember them more as a period of sleepwalking and inertia. With little self-awareness, I jumped onto the academic treadmill and did what the Academy asked of me. It wasn’t unpleasant. I benefited from UW’s exceptional resources, worked with bright students, and learned from supportive colleagues. And if you had asked me during those years, I would have said that I was being faithful to my calling as a Christian university professor. I was teaching a college Sunday School class, occasionally witnessing to unbelieving students, and (as a good Southern Baptist) saying “no” to wine at faculty parties. Above all, I was pursuing excellence in my field, loving God with my mind by pressing toward the prize of tenure, promotion, and professional recognition.

Or so I thought. And then I got tenure.

Isn’t it funny how God can expose the emptiness of our ambitions by fulfilling them? In the spring of 1994 I received two momentous pieces of mail almost simultaneously, and in tandem they changed the direction of my life. First, I received an advance copy of my first book, soon to be published by Cambridge University Press. It was pretty typical of first books that begin life as doctoral dissertations. It was deeply researched but narrowly focused. Specialists praised it—it won two professional book prizes—but almost no one else could understand it or desired to. Worse, there were no eternal issues in its pages, no engagement with Permanent Things, no grappling with questions of importance to my local church or to the broader community of faith. It was of the Academy, to the Academy, and for the Academy.

And the Academy, for its part, said “Well done, good and faithful servant! Enter into the joy of your lord.” That same week I received formal notification from the UW trustees that I had been promoted and granted tenure. The real decision on my tenure application had been made much earlier—once Cambridge had offered me a book contract the outcome was certain—but there was still something symbolically jarring about receiving the book and the promotion letter in the same week. I weighed these two “successes,” figuratively holding one in each hand and reflecting on what my university chose to value and reward. What I felt wasn’t elation or affirmation or gratification, but a profound sense of emptiness. I was thirty-three years old; at the salary I was earning, I knew I would have to work until I died, and I couldn’t imagine being able to continue for much longer.

Humanly speaking, I was experiencing what academics know as the post-tenure letdown. It’s so common that it’s become a cliché, so I don’t pretend for a moment that my experience was unique. But I believe that God used this time of discouragement and searching to help me think critically and deeply—really for the first time—about the pluralistic multiversity of which I was a part. I began to read—more enthusiastically than systematically—about the relationship between the love of God, the life of the mind, and the nature of true education. And as I did so, I began to see the university with new eyes. Then I began to see myself with new eyes, as I realized how effectively the Academy had shaped me into its mold.

Peter Kreeft writes that our culture wants us to be “well-adjusted citizens of the Kingdom of This World.” Through years of osmosis, I had come to be a well-adjusted citizen of the Academy. It didn’t strike me as odd that the university had no cohering vision, that it denied the unity of truth, that it sought to expand knowledge while ignoring wisdom. I swallowed the Academy’s claim that it was ideologically neutral. Most troubling, I accepted as natural its compartmentalization of religious belief, with the attendant assumption that we can understand vast domains of human experience without reference to God.

I began to see these things, little by little, in the years following my promotion and tenure. This wasn’t a Damascus Road experience—no scales suddenly fell from my eyes. It was more like coming out of anesthesia, a gradual awakening to reality. And like a patient just out of surgery, I grew more uncomfortable as the anesthesia wore off. As I began to see my surroundings differently, I also began to experience what Harry Blamires called “the loneliness of the thinking Christian.”

My Christian friends in Seattle regularly assumed that life was hard for a Christian professor in a place like the University of Washington, and they were right, but not for the reasons they supposed. They imagined that the environment was openly hostile to believers and figured that I must be the target of ostracism or even persecution. That was never my experience. Oh, there were continual reminders that I wasn’t in church: the student government association distributing “condom grams” in honor of Valentine’s Day, drag queens performing in the library courtyard (for course credit, no less), the school newspaper proclaiming “Jesus Should Have Been Aborted,” the department colleague who was a transvestite, to mention a few.

Such things were disturbing, but it’s not like I’d been unaware of them earlier. What distressed me far more were the limitations that I faced in the classroom. I hadn’t felt them when I first arrived at UW fresh from grad school. My primary goal was to help students understand the past on its own terms and largely for its own sake. And because they typically came to the university with pretty simplistic historical views, I would inevitably explode many myths that they harbored and complicate their understanding both of the past itself and of the craft of the historian. In the process, I was quick to assure them, I would also teach them critical thinking skills that would help them land good-paying jobs at Boeing or Microsoft or Amazon.

And then my sense of vocation began to change, in large part because of the reading I was doing about the nature of true education. I came to believe that my highest goal was not to help my students make a better living, but to help them wrestle with what it means to live well. I came to believe that authentic education is not the same thing as vocational training (important though that is), that it is a transformative experience that changes who we are. And as I began to take that goal seriously, I began to struggle with an ever-increasing sense of futility.

In his 1943 meditation The Abolition of Man, C. S. Lewis wrote that “the pressing educational need of the moment” was not primarily to debunk our students’ unsubstantiated convictions. “The task of the modern educator,” Lewis maintained, “is not to cut down jungles but to irrigate deserts.” Lewis’s challenge both inspired and depressed me. Every day I taught students who had learned at the university that it was not necessary to have a consistent philosophy of life, that rationality was a “western construction,” that ideas were merely “convenient perceptions” and moral claims only rationalizations for self-interest. And because of the authoritative rules of the secular Academy, when those students came into my classes, I was free to pose religious questions to them but never answer them authoritatively. I was allowed to introduce religious perspectives to them but never endorse one above the rest. I could demonstrate the contradictions of particular belief systems but never proclaim the good news of a consistent alternative. In sum, if I was going to irrigate deserts at UW, I would have to do so without ever testifying to “the fountain of living waters” (Jeremiah 2:13).

This was frustrating, as well as profoundly alienating. I never really felt alone as a Christian in the secular university until I began to try to think like one. As I did, I came to see myself, as Blamires put it, as “caught up, entangled, in the lumbering day-to-day operations of a machinery working in many respects in the service of ends that I rejected.” And so, by the year 2000, I had begun to pray for an opportunity to teach in a different setting built on a firmer foundation. A decade later, God answered that prayer.

I’ll be back with Part Two in a few days.

WHY THOMAS JEFFERSON WOULD HAVE CONDEMNED THE SUPER BOWL . . . AND SUPPORTED THE NRA

(I don’t much care about the Super Bowl and rarely watch unless Peyton Manning–an old Tennessee Volunteer, like me–happens to be playing. I originally wrote the short post below two years ago, when Manning and the Broncos were also in the championship game and got clobbered. I’m pulling for a different outcome this year, though I’m not too optimistic about it. At any rate, I thought you might be interested in Jefferson’s thoughts on the kind of exercise most conducive to health. Given all that we are learning about the connection between football and the danger of repetitive brain injury, there is something prescient about his warning against violent sports. I don’t expect that his recommended substitute will get much traction, however.)

**********

These last couple of days I have been reading a fair amount in the correspondence of Thomas Jefferson, and just yesterday I came across some of Jefferson’s ruminations on the importance of exercise that might interest you, especially given the likelihood that almost all of us will soon be watching a certain athletic contest.

The comments I have in mind come from a letter that Jefferson wrote to his nephew, Peter Carr, in the summer of 1785. Carr was the son of Jefferson’s sister Martha and of his close friend Dabney Carr, and Jefferson seems to have taken great interest in the boy’s upbringing after his father died when young Peter was only three. By 1785 Jefferson’s wife had also passed away, and perhaps Jefferson saw in the teenaged Peter the son he would never have. Whatever his motive, Jefferson devoted considerable attention to Carr’s education, and he counseled him frequently on the path his nephew must follow if he aspired to a career of accomplishment and service worthy of a Virginia gentleman.

Thomas Jefferson sat for this portrait by Mather Brown in 1786, the year after he wrote to his nephew Peter Carr.

Although only forty-two years old, Jefferson in 1785 had already compiled an amazing record of public service. In the past decade alone, he had served as a delegate to the Second Continental Congress, been the principal author of the Declaration of Independence, held the post of governor of Virginia, and was now representing the United States as minister to France. “Mortified” by reports of his nephew’s slow academic progress, Jefferson wrote from Paris on the 19th of August to exhort his fifteen-year-old nephew to greater effort. Did Peter not realize that “every day you lose, will retard a day your entrance on that public stage whereon you may begin to be useful to yourself?” (It’s just a suspicion, but Jefferson may not have been cut out to work with teenagers.)

Jefferson began by focusing on his young charge’s character. “The purest integrity” and “the most chaste honour” must be his nephew’s “first object.” “The defect of these virtues can never be made up” by other accomplishments. Then after integrity comes intellect. “An honest heart being the first blessing,” Jefferson explained, “a knowing head is the second.” And so the future president laid out a course of reading for the teenager. He must begin with “antient” history–“reading everything in the original” language, of course–and proceed from there to Greek and Roman poetry, followed by a systematic study of philosophy and ethics beginning with Plato and Cicero.

But the body was important as well as the heart and head. To maximize his academic progress, Peter should set aside at least two hours every day for exercise, “for health must not be sacrificed to learning.” “A strong body,” Jefferson lectured, “makes the mind strong.” (Michelle Obama could not have said it better.) But what kind of exercise should Peter pursue? Jefferson left no doubt. “Walking is the best possible exercise,” he instructed, but not just any kind of walking. “Never think of taking a book with you” while you walk, Jefferson stressed. Instead, “let your gun . . . be the constant companion of your walks.” While “this gives a moderate exercise to the body, it gives boldness, enterprize, and independance [sic] to the mind.”

And what about more modern kinds of sports, you ask? Peter should avoid “games played with the ball and others of that nature,” Jefferson cautioned his nephew. They “are too violent for the body and stamp no character on the mind.”

Thomas Jefferson would advise him not to play tomorrow.

ACADEMIC FREEDOM IN A CHRISTIAN CONTEXT–MORE THOUGHTS

It’s been too long since I last posted. I marvel at bloggers who are constantly connected and constantly conversing with the rest of us. All I can figure is that they have more hours in their days than the paltry twenty-four I get.

It’s been nearly two weeks—the internet equivalent of an “eon”—since I wrote about the academic freedom I’ve known at Wheaton College these past six years. I wasn’t trying to make a systematic argument comparing secular and Christian contexts. I just wanted to testify to my experience. Ever since my colleague Dr. Larycia Hawkins posted comments comparing Christianity and Islam last month, Wheaton has been the focal point of a social media frenzy. Champions and critics have rushed to do battle, one side denouncing Hawkins for her fanaticism, the other condemning the college for its bigotry. Charity has been scarce, but there’s been more than enough dogmatism to go around. As Alan Jacobs has observed, both sides seem able “to read the minds and hearts of people they don’t know.”

The goal of my previous post was not to take sides in the dispute, but to take issue with critics who insist that academic freedom can’t exist in a confessional community. For me, joining the Wheaton faculty in 2010 after twenty-two years at the University of Washington was a profoundly liberating experience. Day after day, I enjoy a degree of freedom in the classroom here that far exceeds what I knew at UW. And day after day, the freedom that I feel here thrills my heart and nourishes my soul.

This post elicited some thoughtful responses, and I’d like to reply to one of them briefly. One commenter, Steve, says that his experience teaching at two Christian colleges was less positive than mine has been, and I take his testimony seriously. I also respect his conclusion that he “functions much more effectively . . . at a secular institution.” If that is true, then a secular institution is where he should be. And let me add here that I have no doubt that God often calls believing scholars to secular schools and empowers them to labor faithfully. But Steve’s point is not simply that his story is different from mine. While he respects the “many excellent scholars at Christian” institutions and the “amazing work” that they do, he insists that no school that requires its faculty to affirm a statement of faith can pretend that it also honors academic freedom. The two are simply “incompatible.”

So where does this leave us? I observed that I feel greater academic freedom at Wheaton than I experienced at my previous secular institution. Steve replied in so many words, “No, you don’t.” Wheaton may be a “good fit” for me, but what I’m experiencing here can’t be true academic freedom because, as he understands it, academic freedom can’t exist here.

We’re at an impasse. But before we throw up our hands and drop the matter, it might be worthwhile to go back and define our terms. To paraphrase the inimitable Inigo Montoya, let’s make sure that “academic freedom” means what we think it means. The early-twentieth-century historian Carl Becker once wrote, “When I meet a word with which I am entirely unfamiliar, I find it a good plan to look it up in the dictionary and find out what someone thinks it means. But when I have frequently to use words with which everyone is perfectly familiar . . . the wise thing to do is take a week off and think about them. The result is often astonishing; for as often as not I find that I have been talking about words instead of real things.”

Maybe we need to take Becker’s advice and revisit what we mean by “academic freedom.” What we cannot mean by the phrase is the liberty publicly to explore, espouse, and promote any conceivable value or set of values as an employee of an academic institution. Such a definition would be utterly useless, for I know of no place where it exists.

The unsubstantiated, near-universal assumption suffusing the present controversy is the fiction that secular schools erect no boundaries to academic expression. When Steve says that secular universities “do not require people to hold a certain perspective,” I don’t begin to know how to respond. I could quickly tick off a long list of conservative political or moral positions that are unacceptable across a broad swath of today’s secular Academy. There are countless positions which, if not kept private, would effectively preclude those who hold them from promotion and tenure, or even from employment in the first place. There’s no need to make such a list, however, because one simple example will suffice.

Today’s secular Academy insists that faculty adhere, at least publicly, to a materialist, rationalist world view. Its credo, to quote atheist Matthew Stewart, is that “there is nothing outside the world that may explain anything within it.” In theory, at least, faculty are utterly “free” to pursue truth wherever it leads, as long as they do nothing to challenge this a priori answer to the most fundamental of all human questions.

Again, Alan Jacobs puts it well:

Imagine a tenured professor of history at a public university who announces, “After much study and reflection I have come to believe that the Incarnation of Jesus Christ holds the full meaning of historical experience, and henceforth I will teach all my classes from that point of view.” Would the university’s declared commitment to academic freedom allow him to keep his job? No, because he will be said to have violated one of the core principles of that particular academic community, which is to bracket questions of religious belief rather than advocate for a particular religious view.

I would add to Jacobs’ example that if the hypothetical public university in question ousted this troublemaker, it would deny that it had infringed on his academic freedom. If this looks like hypocrisy to an objective bystander, technically it’s not. This is because when the twenty-first-century university speaks of freedom, it really has in mind a concept closer to the seventeenth- and eighteenth-century meaning of liberty. Three centuries ago, liberty meant the freedom to behave uprightly. It was commonly contrasted with license, the practice of abusing freedom by behaving immorally. From the dominant viewpoint of the secular Academy, appeals to religious truths are intrinsically illegitimate, which means that no educator has a moral right to make them in the classroom, and an institution committed to academic freedom has every moral right to prohibit them. It’s a comforting rationale.

Let’s be clear: neither Christian nor secular institutions exalt unfettered academic freedom as their highest good or as an end in itself. This is because both claim to serve something larger, whether they speak in terms of “the public good” or of “Christ and His Kingdom” (goals that are hardly mutually exclusive, by the way). In pursuit of these greater goods, both Christian and secular schools establish boundaries within which they expect their faculty to operate.

The main difference I see is that the secular Academy denies that it does so.

FINDING GOD’S GRACE AT THE AHA

After a lazy Sunday afternoon watching Peyton Manning strike a blow for old geezers, I’m feeling much too “do-less” (my grandmother’s word) to grade papers, so I thought I’d share some scattered thoughts about the recent American Historical Association annual meeting that I attended two weeks ago in Atlanta. The AHA is the premier professional organization for academic historians, and typically four to five thousand of us show up for each year’s conference. We’re a raucous bunch—not. Actually, think of every stereotype of historians that comes to mind and then double them, and you’ll be on the right track.

I’ve never much enjoyed the AHA, to be honest. The sessions can be interesting, but as a concept they’ve never made sense to me. Believe it or not, historians read their professional conference papers word for word, which means that a typical AHA session involves a room of extraordinarily educated individuals (most of whom have PhDs) sitting around while someone reads to them. Given that we are literate, a cheaper and more efficient approach would be for all of the presenters to post their papers online. We could then read them at leisure from our laptops in coffee shops or while watching football on our couches, instead of having to travel across the country to have them read to us in a hotel conference room.

But that would defeat the real purpose of these gatherings, which is all about connecting with people: reuniting with old friends, making new acquaintances, giving “elevator pitches,” talking to publishers, impressing potential employers, interviewing and being interviewed, seeing and being seen. What happens in the formal academic sessions—in conference rooms with names like “Salon West” or “Grand Ballroom D”—is not quite a sideshow, but it’s close. The real work is done in the numerous receptions and banquets, the book exhibit and the hotel bar.

Which is another reason I’ve never much enjoyed the AHA. I hate to schmooze. I also hate the self-conscious isolation that comes with not schmoozing. Standing by myself in an academic reception reminds me too much of junior high (though without the fear of bodily harm). When our firstborn was fifteen months old, my wife and I traveled together to a historians’ convention in New Orleans and brought our daughter along. On the second night of the meeting, we stopped by a reception sponsored by my alma mater. The room was stuffy, loud, and crowded, with folks standing shoulder to shoulder, drinks and hors d’oeuvres in hand, while they shouted in each other’s ears about the historiographical contributions of their doctoral dissertations. In the hubbub our toddler managed to slip away from us, and I’ll never forget where we found her. She had somehow made her way through the forest of grown-up legs to the far side of the room. There she stood, pressed into the corner with her back to the crowd and her forehead against the wall. Many a time I’ve wanted to do the same thing.

My discomfort at these meetings is more than just a matter of temperament, however—the lonely-in-a-crowd feeling that an introvert in such a setting should expect. It comes also from a sense of not wholly belonging, from the palpable tension that washes over me between the values of my profession and the demands of my vocation. Professionally, I’m a member of the guild of PhD’ed historians at work in the Academy. Vocationally, I’m called generally to be a Christ follower, and more particularly (I believe) to serve the Church by helping her learn from history and remember the past faithfully. My profession and my vocation aren’t blatantly at odds—I’d have to abandon my profession if they were—but neither are they wholly complementary.

In her marvelous essay “Seeing Things: Knowledge and Love in History,” Christian historian Beth Barton Schweiger observes that “professionalization” is a process of “narrowing allegiances and priorities in order to conform to the rigid standards of the guild.” Professionalization is particularly a problem for the Christian historian, she goes on to explain, because our profession practices “knowledge as power,” eschewing the “deeper purpose of historical knowledge . . . which is to serve the ends of love.” She drives home her point with a series of rhetorical questions:

Where is mercy at the American Historical Association? What form does justice take in the job register? Who considers love in the array of bloodless panels at professional meetings?

I take her basic point. Considered as a whole, Schweiger’s surely right that “the world of professional history does not reward charity or wisdom.” But that doesn’t prevent countless individuals from being agents of God’s grace amidst the striving for professional place and power. I know this for a fact, for I was the beneficiary at least a half dozen times in three days.

The last three years have been a time of prolonged trial in the McKenzie family, and my wife and I are chronically weary and often discouraged as a result. The last thing that I expected was that the AHA would be a respite, a time of encouragement and refreshment, but that’s exactly what happened. It began on the second night of the conference with the opportunity to find a quiet corner in the Hilton and talk for an hour with a former student of mine. This young man is the complete package—great mind, exemplary character, extraordinary determination—and yet he’s encountered a series of roadblocks that have left him discouraged. We talked freely, and I had the privilege of reminding him of God’s faithfulness and love, and he responded with genuine concern for my family and for me. I left encouraged by his caring, and grateful for the opportunity to teach in a setting where connections of such depth develop frequently.

The next morning was the breakfast reception of the Conference on Faith and History. Among several conversations that were uplifting, two stand out. First was the opportunity to talk with an older friend, a scholar of national reputation who, for reasons that I have never comprehended, has always jumped at the chance to help me whenever he can. He helped get me on my first professional panel twenty-six years ago, he’s given me feedback on manuscripts, and when I needed a professional reference when I was being considered for an opening at Wheaton, he wrote the longest letter of recommendation that I have ever laid eyes on, so laudatory that I could scarcely recognize the person he was writing about. Before we parted he reminded me that he prayed for my family regularly, and he asked for prayer for one of his grandchildren. I felt loved.

Before leaving I sat down beside another friend of long standing, mainly to pat his arm and say “hi” before he had to leave. At this point the breakfast was ending and most of the CFH members were scattering for various academic sessions, but this loving man asked me how I was and, what is more, he really wanted to know. We talked for an hour and a half in the empty dining room while the hotel staff set up for the next event. He encouraged me professionally, getting excited about my academic projects as I described them, reminding me that my labor was not in vain. He encouraged me personally, as we shared about our families and our hopes and concerns for our adult children. I left that conversation feeling affirmed, and encouraged, and loved.

This litany will soon grow tedious unless I summarize. That afternoon I ran into a historian who has been praying for my family for the past couple of years, and I took the opportunity to share with him some of the ways that I see his prayers being answered. He wanted to hear more, and we talked and walked and shared for three quarters of an hour.

By prior arrangement, I then met with the man who taught the very first college history class I ever sat in, when I was a seventeen-year-old freshman at the University of Tennessee. Thirty-eight years later, this man who inspired me to pursue an academic career wanted to connect with me. As we sat in the Marriott lobby he told me what he had seen in me nearly four decades ago, and then he trusted me enough to talk about the son who had returned from Afghanistan with PTSD, and the sense of helplessness and sorrow that had turned his own heart toward God.

Finally, before catching the airport shuttle the next morning, I was able to grab breakfast with a wonderful historian whom I’ve known since the 1990s. We hadn’t connected in several years, and after finding out how I was doing, he was transparent enough to share about a personal heartache as well as God’s subsequent kindness, and again I walked away encouraged.

Does the historical profession, as a profession, reward wisdom and charity? No, it doesn’t, but I found instances of both this year at the AHA, along with encouragement and kindness and grace. May God alone be praised.

MY REVIEW OF “DEFENDERS OF THE UNBORN”

Some time ago I mentioned that I had written a review for Christianity Today of a new work of history that I can enthusiastically recommend, Daniel Williams’s Defenders of the Unborn: The Pro-Life Movement Before Roe v. Wade. My review, at last, is available online to all readers (no subscription to CT necessary). If you are interested, you can read the review here.

Have a great weekend.
