In my last post, which seems like a year ago, I wrote about the populist strands I see pervading the current Illinois gubernatorial race and how they reminded me of a famous political contest from nearly two centuries ago, the presidential election of 1828. In that election, the supporters of General Andrew Jackson championed their candidate as a man of the people, in contrast to his supposedly elitist rival, incumbent John Quincy Adams.

By 1828 Jackson was one of the wealthiest men in America (the fruit of savvy land speculation and effective political networking), but you would never know that by listening to the claims of his supporters. To “Jacksonians” (who would soon begin to use the label Democrat), Jackson was a man of the people whose humble roots, limited education, and minimal political experience all were strong arguments in his favor. Adams, in contrast, suffered from the liabilities of a prominent family name, a Harvard degree, and a lifetime of political service. As Jackson’s campaign managers put it, the contest came down to a choice between “Adams, who can write,” and “Jackson, who can fight.” The people chose the fighter in a landslide.

I concluded the post by promising to discuss the implications of that choice, and now I am back to make good on that pledge. To be honest, I am still working through my own thinking on the question, and I don’t pretend to have any profound pronouncements to make. Instead, I would rather direct you to an assessment of American politics in the age of Andrew Jackson that is still one of the most insightful commentaries on the relationship of liberty, equality, and religion that I know of. The work I have in mind is Alexis de Tocqueville’s classic Democracy in America.

Title Page of the first American edition of Tocqueville's classic, published in 1838.

Democracy in America is one of those books that is cited much more than read. We’ve almost all heard of it. It rates a paragraph or two in most U. S. history textbooks, and A.P. instructors may occasionally assign a tiny portion of it. But apart from political science majors and American government teachers, it is a rare American who knows much of anything about this important work. We’re the poorer because of it.

I like to frame the importance of history in terms of metaphors. History is a form of memory and is critical to our sense of who we are. It also functions as a mirror, enabling us to see ourselves with new eyes and greater clarity. It is certainly a story as well, a grand narrative in which we situate our lives, defining where we have come from, what we believe in, and where (we hope) we are headed. But history also is potentially a rich, even life-changing conversation about perennial human questions—“a conversation with the dead about what we should value and how we should live.”

So as a historian, one of the things I want to be doing regularly is entering into conversation with the best that has been thought and said in the past. And as a history teacher, one of the things I need to be doing regularly is introducing my students to the conversation partners that they most need to meet. Alexis de Tocqueville is one of those conversation partners. He has much to say to us, if we are willing to listen.

“Time makes more converts than reason,” Thomas Paine observed at the beginning of Common Sense. What he meant was that most of us tend to accept as “natural” the way that the world is when we come into it. When Paine sounded the call for independence in 1776, he was writing to Americans who had lived their entire lives under monarchy. They accepted that form of government, Paine was convinced, less because of its merits than because it was all that they had ever known.

When we see any human institution or custom as natural or inevitable, it’s really hard for us to think about it deeply. Why agonize over something that can’t be any other way than it is? Part of what Paine did in Common Sense was to take his readers back to the origins of the English monarchy in an effort to help them see it—really see it—so they might think critically about it.

This is potentially one of history’s greatest benefits. It allows us to go back in time to a moment when institutions and customs that we now take for granted were new and strange and even controversial, when the matter wasn’t settled, when the outcome wasn’t inevitable. And by listening in on the conversation from that time, what we once saw as utterly familiar can begin to seem strange to us, and this, in turn, can both inspire and enable us to think about it. This is true, as Paine recognized, because we give what is familiar the benefit of the doubt; the strange we feel forced to explain.

This is part of why C. S. Lewis so strongly advocated the reading of old books. In an assertion that first seems counter-intuitive, Lewis argued that old books have the potential to help us understand the present better than works from our own day. On the whole, contemporary works reinforce our blind spots rather than exposing them, and the truths that they teach us are often “truths which we half knew already.” By comparison, old books have a greater capacity to challenge and change us. Where they are wrong, they are unlikely to harm us. Where we are blind, however, they may open our eyes.

Revisiting a classic work like Democracy in America is one way “to keep the clean sea breeze of the centuries blowing through our minds,” in Lewis’s memorable phrase. Tocqueville was writing at a time when democracy was still a novel experiment in the world. Its future was uncertain. Its impact was unclear. And although he was writing about democracy in a very specific historical context (he arrived in the United States at the midpoint of Andrew Jackson’s first term as president), his investigation was driven by questions as relevant today as they were in the 1830s: How do you reconcile the rights of individuals with the needs of society? How do you maximize individual freedom while promoting stability and order? How do you simultaneously advance liberty and ensure justice, and what role does religious belief play in that delicate balancing act?

I can’t recommend Democracy in America highly enough.   In my next two posts I plan to share some of my favorite passages from the book, focusing on what Tocqueville had to say about (1) the effects of democratic culture on society and politics and (2) the role of religion as a bulwark against tyranny.  In the meantime, you might consider ordering a copy, if you don’t have one lying around, and spending some time with it.  The full text can be rather daunting–most editions come in at somewhere around 800 pages–but there are several good abridged editions also available.  (The edition featured in the Bedford Books series of St. Martin’s Press is my favorite of this type.  It features an excellent brief introduction by distinguished historian Michael Kammen, followed by a lean sampling of the meatiest chapters adding up to about a fifth of the original.  You can read about the edition here.)

Democracy in America

If you do decide to read the book for yourself, may I share a word of exhortation and a little bit of context before you get started?  First the exhortation: as much as possible, take seriously the idea of entering into a conversation with the author.  Hospitality is a historical as well as a Christian virtue.  As you would with any other guest that you invite into your home, purpose to treat Tocqueville considerately.   Invite him to speak.  Listen to him respectfully.  Don’t respond defensively, indeed, don’t respond at all until you have thought carefully about what he has to say.

Now for the context.  Anytime you pick up a historical document, it is a good idea to find out all that you can about the author, the author’s audience, and the author’s purpose for writing.  Here are a few details that are helpful to know:

Alexis de Tocqueville was the third son of an aristocratic French family that traced its noble lineage at least as far back as the Norman conquest of England in 1066.  In 1831, at the age of twenty-six, Tocqueville was commissioned by the French government, in tandem with another young aristocratic Frenchman, Gustave de Beaumont, to travel to the United States to investigate and report on the American penitentiary system.  Tocqueville and Beaumont arrived in the U. S. in May of 1831, and for the next nine months they explored the country, traveling by stagecoach, steamboat, and on horseback from the urban northeast to the edge of the western frontier and back again.  Upon returning to France, they filed their report on penitentiaries and then Tocqueville began to pen a much broader set of reflections on American politics, American institutions, American culture, and the American people.  The first volume of Democracy in America was published in 1835, and volume II followed five years later.  The first American translation of volume I appeared in 1838.

Tocqueville posed for this portrait around 1850, nearly two decades after his American odyssey.

Tocqueville’s audience was always first and foremost his fellow countrymen.  He wrote about America but not for America, at least not primarily.  Indeed, understanding the French context is crucial to understanding the book.  Writing to the English translator of his work,  Tocqueville explained,

I came into the world at the end of a long revolution, which after having destroyed the former state of things had created nothing lasting in its place.  Aristocracy was already dead when I began to live, and democracy was not yet in existence.

As the French Revolution of 1789 gave way to the Great Terror of 1793, Tocqueville’s grandfather went to the guillotine and his parents, then young adults, went to the dungeon and barely escaped with their lives.  By the time that Tocqueville was born a dozen years later, Napoleon Bonaparte was emperor of France.  The implications of these events were clear: proclaiming liberty was not the same thing as preserving it, and the establishment of political equality guaranteed neither liberty nor justice.  These lessons haunted Tocqueville his whole life long, and Democracy in America cannot be understood apart from them.

You should know that though Tocqueville was an aristocrat in temperament and lineage, he both foresaw and accepted that democracy represented the wave of the future.  He hoped to refine the trend, not resist it.  If he was critical of what he saw in America–and he often was–he was on the whole a sympathetic critic.  He was fascinated with the United States because he believed it to be the freest nation in the world, and he always hoped that his native France could learn from the American example.

Finally, we should respect just how seriously Tocqueville approached his subject.  The stakes were almost incalculably high, he believed.  “The nations of our day cannot prevent conditions of equality from spreading in their midst,” Tocqueville wrote in the very last paragraph of volume II.  “But it depends upon themselves whether equality is to lead to servitude or freedom, knowledge or barbarism, prosperity or wretchedness.”


Last week was political primary week in Illinois, and that means that the stretch run to the general election this November is now officially underway. I’ve been at Wheaton College for nearly four years now, long enough to conclude that Illinois voters are more cynical than most I’ve encountered. I suppose you get that way when state governors regularly end up in jail and convicted felons are serious contenders for the Chicago board of aldermen. (Riddle: You’re sitting in a room with a former Illinois governor to your left and another former Illinois governor to your right. Where are you? Answer: Prison.)

Simultaneously ignoring and feeding such widespread cynicism, the day after the primary the Chicago Tribune repeatedly warned readers that the campaign for governor will be brutal. “The Brawl Is On” proclaimed the page-one headline (Chicago Tribune, March 19, 2014). “It’s going to be ugly,” an editorial agreed, quoting an unnamed senior Illinois politician. Not one, but two front-page “news” stories told voters what to expect. It will be a “particularly contentious,” “bruising fight,” short on serious reflection, long on “raw emotion,” and punctuated by a slew of “scorched-earth attacks.”

In a word, the style of the campaign promises to be vicious. The campaign’s substance—if it can be said to have any—will be populist. The word populist comes from a Latin root meaning “people.” When applied to politics, the word connotes a relentless emphasis on the people (always vaguely defined) and threats to their well-being (whether real or invented). Populist politicians present themselves as one of “the people,” portray their opponents as out of touch with “the people,” and define political questions as a struggle between “the people” and the elites and “special interests” who would exploit them. In the months to come there will be countless charges and countercharges about concrete political issues, e.g., the state’s debt crisis, rising tax rates, the death penalty, and gay marriage, to name only a few. But one issue will both permeate and transcend all others: which candidate will be more responsive to “the people”? Or more simply, which candidate is more truly one of “us”?

Both gubernatorial candidates will lay claim to the title of the people’s champion, although not necessarily in the same way. As the Tribune observes, the campaign “will feature dueling brands of populism.” The Republican challenger’s “style of populism is the classic throw-the-bums-out.” The GOP nominee, a wealthy businessman named Bruce Rauner, is already denouncing the Democratic incumbent as a “career politician” held captive by special interests.  Chief among these are the powerful state employee unions, supposedly gorging on padded salaries and bloated pensions funded at taxpayer expense. The people of Illinois, so the Republican message goes, are the victims of an unholy alliance of “union leaders and establishment politicians.”

The Democratic populist response, according to the Tribune, will be to declare “class war.” The sitting governor, Democrat Pat Quinn, is already denouncing his wealthy challenger as “out of touch” with the working class, “the real everyday heroes of our state.” “I believe in everyday people,” Quinn noted in his victory speech after the polls closed. “I’m not a billionaire.” (The Rauner camp denies that their man is that wealthy, but the alliteration of the epithet “Billionaire Bruce” is too much for the Quinn campaign to pass up.)

The Tribune is almost certainly correct in its predictions. The campaign will be a street fight. And its primary message—its all-pervasive message—will be populist. Because I am a historian, however, I think that historical perspective can help us in thinking about this present moment. There is nothing new about vicious, populist campaigns in American politics. Indeed, they appeared on the scene pretty much simultaneously with the rise of American democracy.  If anything, twenty-first-century elections are dignified and well-mannered in comparison with those of two centuries ago.

I am mindful of this because my class on U. S. History here at Wheaton has just finished an in-depth review of the presidential election of 1828. The 1828 election was an important transitional milestone in American political and cultural history. It is easy to overstate the case, but it is not too much of an exaggeration to describe politics prior to the 1820s as a gentlemen’s affair. By the culmination of that decade, however, the political world as we know it was coming into focus.

In colonial America, political campaigns—at least as we would define them today—did not exist, nor did formal political parties. On election day, eligible voters (i.e., white male landowners) would congregate at the county seat and learn which of the local gentry had agreed to “stand” for office. According to custom, the candidates would rarely speak on their own behalf. An individual who desired office was presumed to be power-hungry, and thus disqualified from the public trust. The absence of speeches was made up for by a great deal of drinking, however, since custom dictated that the wealthy nominees “treat” the voters to large quantities of free alcohol.

When George Washington was a candidate for the Virginia House of Burgesses in 1758, for example, his personal papers reveal that he supplied voters with 28 gallons of rum, 50 gallons of rum punch, 34 gallons of wine, 46 gallons of beer, and two gallons of “cider royal.” (This amounted to a total of 160 gallons for 391 registered voters, or about 1 1/2 quarts per voter.) A few years later, James Madison, the future father of the U. S. Constitution, also ran for the House of Burgesses but followed a different strategy. Madison was disturbed by “the corrupting influence of spirituous liquors.” He viewed the tradition of “treating” voters as “inconsistent with the purity of moral and republican principles.” The future U. S. president was committed to “a more chaste mode of conducting elections” and declined to treat voters. He was defeated.

Beyond the flowing alcohol, the most prominent feature of colonial elections was how deferential and personal they were. Voters took for granted that candidates would come from the social elite—the oldest and wealthiest families. And because there were no established political parties in colonial America—and no party platforms—almost the only “issues” in an election involved the character of the candidates involved.

To put it differently, colonial politics was largely a politics of reputation. According to the dominant political values of the day, the only non-negotiable prerequisite for public office was virtue—the willingness to sacrifice self-interest for the common good. And because it was assumed that the local elites who stood for office would frequently understand political issues more thoroughly than their neighbors (thanks to superior education and the leisure time necessary to stay well-informed), it was assumed that virtuous officeholders would sometimes have to contradict the wishes of their constituents.

This view of politics informed the earliest presidential elections after the ratification of the Constitution. If anything, they were more elitist than the colonial pattern described above. The delegates to the 1787 Constitutional Convention made no explicit allowance for popular involvement in the election of the nation’s executive. The president was to be chosen by the vote of the Electoral College, and the implicit expectation was that the electors who composed this bizarre institution would be prominent statesmen appointed by the various state legislatures. The executive, in other words, would be identified by the vote of a comparative handful of prominent men. (Only sixty-nine electors cast ballots when George Washington became the first U. S. president in 1789.)

And so in 1796—after George Washington announced only two months before the election that he would not stand for a third term as president—the presidential “campaign” that ensued primarily involved prominent men writing private letters to other prominent men about the qualifications of the leading contenders, John Adams and Thomas Jefferson. Four years later—when the same two statesmen again squared off—the same elitist air survived but had weakened. In addition to writing letters, interested statesmen were now more willing to write public pamphlets, and the country’s small but growing number of newspaper editors was beginning to weigh in as well. The times were changing.

Yet as late as 1824 the aristocratic tone of presidential elections largely survived. State laws had changed in the intervening quarter century, so that now most presidential electors were to be popularly elected rather than appointed by the state legislatures. Even so, scarcely a fourth of eligible voters bothered to cast ballots in 1824, and campaign managers for the various candidates still assumed that the “public opinion” that needed to be courted was the opinion of the wealthy and powerful. For their part, the rest of the electorate seemed not to care.

This changed in 1828. Describing the change is easy. The number of votes cast more than tripled, and all across the United States a much broader swath of adult white males paid attention to national politics than ever before. Why this occurred is a complicated question. There were several factors at play, but for our purposes, one factor is paramount: the outcome of the 1824 election and the way that one candidate and his supporters responded to it.

1824 Election Map

The 1824 election had actually played out pretty much the way that the framers of the Constitution had expected most elections to unfold. First of all, there had been a large number of serious candidates: Secretary of the Treasury William Crawford, of Georgia; Kentuckian Henry Clay, Speaker of the House of Representatives; Secretary of State John Quincy Adams, of Massachusetts; and Major General Andrew Jackson, of Tennessee. Second, as might be expected in the absence of well-defined political parties, all of the candidates had attracted more of a regional than a national following. Third, and predictably given such a large field of candidates, no individual had received a majority in the Electoral College, which meant that the outcome had to be determined by a run-off election among the top three finishers in the House of Representatives. (Clay, who finished fourth, was the odd man out.) Finally, in the run-off in the House the congressmen had cast their ballots without necessarily feeling constrained by the popular vote in their home states. Although many did so, overall they favored the second-place finisher, Adams, over the first-place finisher, Jackson. There was nothing unconstitutional about their doing so, and nothing necessarily insidious in their decision that Adams was the more qualified. (In terms of political experience, he unquestionably was.)

Andrew Jackson, in an 1824 portrait by artist Thomas Sully

But neither Jackson nor his supporters ever accepted the validity of the outcome. Jackson had finished first in the popular vote (with about 43% of the total), and to his partisans nothing else mattered. Days later their anger turned to outrage, when president-elect Adams named Henry Clay as his future secretary of state. Because Clay had cast his support to Adams on the eve of the run-off, Jackson and his supporters concluded that there had been a backroom deal, that Adams had bought off the Kentuckian with the promise of a plum post in his administration. Although no “smoking gun” ever proved the (probably false) allegation, the Jackson camp screamed “Corrupt Bargain!” and the charge stuck.

Although Adams and Clay were both supposedly parties to the dastardly deed, the Jacksonians reserved their greatest scorn for Clay, whom Jackson privately labeled “the Judas of the West.” Clay had justified his support of Adams by questioning Jackson’s fitness for the presidency. The Tennessean was a “military chieftain,” Clay had declared in a public letter, and history was full of military leaders who had begun as heroes and ended as tyrants. The House Speaker strongly implied that a Jackson presidency would end in the downfall of the republic, and his conscience would not allow him to stand idly by if it was within his power to prevent such a tragedy.

In this 1825 letter to a political ally, Jackson wrote of Clay: "The Judas of the West has closed the contract and will receive the thirty pieces of silver."

“Hypocrite!” cried Jackson supporters. “The selfish ambition of Henry Clay is visible in every line of his letter,” cried the pro-Jackson Washington Gazette. “It is but a thin disguise to a foul purpose.” Back in Nashville, a livid Jackson agreed, writing to a political ally that “demagogues” were bartering the interests of the people “for their own views, and personal aggrandizement.”

Henry Clay sat for this portrait shortly before Jackson denounced him as "The Judas of the West"

Clay’s greatest alleged crime was neither hypocrisy nor political ambition, however. His chief offense, thundered the editor of the Gazette, was that he had “insulted and struck down the majesty of the People”; he had “impugned their sovereignty”; he had “gambled away the[ir] rights.” Jackson concurred. “The will of the people has been thwarted,” he wrote to an ally in 1825. “The voice of the people has been disregarded.”

Four years later, Jackson would have the chance to vindicate both himself and the “majesty of the people.” He was again a candidate for president, this time in a head-to-head match-up against the incumbent Adams. The 1828 presidential contest would be one of the dirtiest campaigns in history. One historian of the election has written that it may have “splattered more filth in more different directions and upon more innocent people than any other in American history.” As with earlier presidential elections, it was a campaign of personalities. What was new in 1828 was how public the charges and countercharges would be.

Four years earlier, Henry Clay had hinted that Andrew Jackson could not be trusted. In 1828, his political rivals went much further. John Quincy Adams’ supporters condemned the general in no uncertain terms: Jackson was the son of a prostitute and a slave, they announced; he was an adulterer, and he was a murderer. The adultery charge was dredged up from more than three decades earlier, when Jackson had unwittingly married supposed divorcee Rachel Donelson on the Tennessee frontier before her divorce had been officially approved by the Kentucky state legislature. The murder charge referred disingenuously to executions that Jackson had ordered during his military career. The Adams camp highlighted the latter with an infamous broadside now remembered as the “Coffin Handbill,” a poster featuring some seventeen coffins in silhouette, one for each man the blackguard Jackson had supposedly cut down in cold blood over his lifetime.

Anti-Jackson "Coffin Handbill" from the Presidential Campaign of 1828

The Jackson campaign counterattacked with admirable creativity. They led with the accusation that Adams had stolen the presidency four years earlier, a claim valued less for its truthfulness than for its effectiveness. Beyond that, their strategy was clearly to show that the incumbent president was an effete intellectual out of touch with the common man. Adams was a Harvard graduate who spoke multiple languages and boasted an extensive record of public service.  He had served as both congressman and senator from Massachusetts; as ambassador to the Netherlands, Prussia, and Russia; as Secretary of State; and now, of course, he occupied the White House.

John Quincy Adams' silk underwear disqualified him for the presidency, in the view of Jackson supporters.

Jackson’s supporters turned these assets into liabilities by denouncing Adams as a career politician, a child of privilege (son of President John Adams) who had never held a job that wasn’t handed to him. What was worse, his prolonged residence in European courts had corrupted his character and addicted him to debilitating luxury. As evidence of the latter, the Jacksonians cited Adams’ use of taxpayer money to buy a pool table and chess set for the White House, as well as his purported fondness for wearing silk “inexpressibles.” (How they knew what kind of underwear the president wore they never made clear.)

Jackson, in contrast, had been born into poverty in the southern Backcountry.  His father had died before he was born, and the subsequent death of his mother and brothers from smallpox left him an orphan as a young teenager.  He had almost no formal education and not too much regard for those who did.  (A notoriously abysmal speller, he is supposed to have said that he couldn’t trust a man who could only spell a word one way.)  What is more, Jackson had precious little political experience.  He had twice been elected to Congress, and both times he had left the capital in disgust in a matter of months.  When the Adams campaign ridiculed his lack of qualifications, however, the Jacksonians had the perfect, quintessentially populist retort: “Who would you rather have as president?”  they asked.  “Adams who can write . . . or Jackson who can fight?”

Fifty-six percent of voters chose the fighter.

We’ll discuss the implications of that decision next time.

The Church and the Christian Scholar: A Tribute to a Friend

A good friend of mine had a heart attack two days ago.  For twenty-one years James Felak and I were colleagues in the History Department at the University of Washington.  For most of that time, James was my only close Christian friend in a research institution that boasted some three thousand full-time faculty members.  I haven’t been able to talk to James yet, but my understanding is that his prognosis is encouraging.  Yesterday he asked for coffee and a laptop, and I count that a good sign, if more than a little premature.

Two posts ago I began a series of reflections on “The Church and the Christian Scholar.”  In that context, I want to pay tribute to James publicly, for he has both encouraged and challenged me greatly over the years as I have tried to figure out what it means to be a Christian scholar.  The Scripture calls believers to “walk worthy of the calling” with which we have been called, but it does not call us to walk alone.  In James, I encountered another Christian scholar willing to walk alongside me, and I will be forever grateful.

James joined the UW faculty the year after I did, and we eventually became fast friends.  I remember distinctly the first time we really connected.  A few months after James’s arrival, Seattle was hit by a freak snow storm.  (It rarely snows there, and large accumulations are almost unheard of.)  I had walked to the graduate library after lunch on a cool, damp, overcast day, which is another way of saying that it was a typical winter afternoon in the Pacific Northwest.  After six hours of reading microfilm, I came out to find that there were already 8-9 inches of snow on the ground, the public bus system on which I depended was effectively shut down, and I had no way of getting to my home some thirteen miles from campus.  Expecting to spend the night in my office, I went to the student center to grab dinner before the grill closed, and there I happened upon James, who was doing course prep at one of the tables there.  I knew that James lived a couple of miles away and regularly walked to work, and so I asked if I could sleep on his couch for the night.  He readily agreed.

I’ll never forget the trek to his house that followed.  Neither of us was dressed for snow, which was by now up to our shins.  We were walking in regular street shoes and thin jackets, the wind was howling, the snow seemed to be coming almost horizontally–so thick that we could hardly see–and what was most bizarre of all, the entire sky was repeatedly illuminated with truly awe-inspiring flashes of lightning.  I was miserable, weary, and more than a little distracted by the prospect of being electrocuted in a blizzard.

But not James.  James was energized by the opportunity to talk about ideas–his lifelong passion–and talk he did.  Although the wind and thunder were often so loud that he had to shout into my ear, James excitedly shared his views on Communism, socialism, Christianity, the Cold War, East European history, and the conjugation of Hungarian verbs.  I was simultaneously flabbergasted and enthralled.  We have been friends ever since.

The conversations that followed over the years were less memorable but more meaningful.  As we discovered our common faith, the focus of our discussions centered more and more on the question of calling, and in particular what it meant for us to labor faithfully in the academic contexts in which God had placed us.  These conversations were inspiring and revitalizing, and I could have them with no one else.

Most of the Christian scholars I know laboring at secular colleges and universities feel profoundly alone.  At work, they are surrounded by co-workers who cannot relate to their faith, who may even equate Christianity with superstition and ignorance.  In their churches, they are often surrounded by fellow believers who cannot relate to their vocation, who may even doubt whether genuine Christians exist within the Academy.  As a result, they are soon worn down by what the late Harry Blamires called “the loneliness of the thinking Christian.”

God used James to spare me from such loneliness.  Over scores of brown-bag lunches, James loved me by listening.  G. K. Chesterton once warned that “thinking in isolation and with pride ends in being an idiot.”  James’s friendship kept me from thinking in isolation, and it probably also pulled me back from any number of idiotic conclusions.  (I know that he thinks so, at any rate.)

Along the way, James challenged me in a number of specific ways.  First, he called me to take seriously the wisdom of Christian writers over the centuries.  Ironically for a historian, there was an element of “historylessness” (to quote historian Sidney Mead) in my approach to the faith.  Like most American evangelicals, I paid attention to the history of the early church as revealed in the New Testament, but once I finished the book of Revelation I jumped to C. S. Lewis and Billy Graham.  James invited me to meet with him regularly to discuss various Christian works, and the first suggestion on his list was the Confessions of St. Augustine, a sixteen-hundred-year-old work of startling contemporary relevance.

Second, James pushed me to broaden my scholarship with an eye to finding points of intersection with the interests of Christians outside the Academy.  The son of a mechanic and the product of a western Pennsylvania steel town, James has unbounded appreciation for the life of the mind.  In a way that I found bracing, however, he also scorned intellectual pretension and rejected the common academic view of scholarship as a closed conversation for privileged professors to have among themselves.  The latter was a view I had unconsciously embraced myself, and James tried to show me this by teasing me, which is his default way of relating to almost everyone.  (I lost track of the number of times he shared the eulogy he planned to deliver at my funeral.  It involved the audience wailing in grief as he read from my curriculum vitae.)

Finally, and most importantly, James encouraged me to believe that I had something to say to the church that was worth saying, that God could use me, as a scholar, to bless other believers.  The encouragement was priceless.  I wrote James just as soon as I heard a rumor that he was in the hospital, and I heard back from him literally in the midst of writing this post.  Not surprisingly, the e-mail was short.  “I’m alive–getting discharged today,” he began.  James went on to relate how the main artery to his heart had been 95% blocked, and that the attack that he suffered is the kind cardiologists refer to as “the widowmaker.”  “For years I’ve been afraid of living too long,” he confessed, “now I have the opposite concern.”  And then in the very next sentence, so characteristically encouraging and selfless, “I finished your book–fantastic job.”

Thank you, James.  Thank you, Lord.


I’ve been thinking a lot recently about Thomas Jefferson.  He has figured prominently the last couple of weeks in both of the courses that I am currently teaching, an upper-division class on U. S. history to 1865, and a general-education course on race and ethnicity as themes in the American past.  I get excited when I teach about Jefferson, not only because he played such a crucial role in our national history, but also because he has loomed so large in American memory.  My goal in all of my classes is to encourage life-long learning.  I’m not concerned that my students memorize a bunch of discrete historical facts; I want them to get a glimpse of how engagement with the past can enrich their lives for all of their lives.  This means, in part, helping them to see history as a living conversation, an ongoing dialogue with the past that occurs in the present with an eye to the future.

Historian David Harlan has written that “at its best, the study of American history can be a conversation with the past about what we should value and how we should live.”  When Americans approach the past in this vein–when we study history to understand who we have been and to contemplate who we want to be–our nation’s third president inevitably becomes central to the conversation.  As Jefferson biographer Joseph Ellis has observed, Thomas Jefferson is “the dead white male who matters most” to us.

Why this is true is an open question, but I suspect that the paramount reason has to do with Jefferson’s principal role in crafting the Declaration of Independence.  The Second Continental Congress considerably edited the draft that Jefferson constructed in 1776 (in consultation with delegates John Adams, Benjamin Franklin, Robert Livingston, and Roger Sherman).  The wording was still predominantly his, however, and by the end of his life it had become exclusively his, at least in the memory of the American people.

If Americans remembered George Washington as the sword of the Revolution, in other words, they venerated Jefferson as the pen.  The general may have secured independence on the battlefield, but it was the sage of Monticello who (along with Thomas Paine) had justified the Revolution and explained its meaning to posterity.  Ever since, Americans across the political spectrum–liberals and conservatives, Christians and secularists, patriots and cynics–have looked to Jefferson to define what the United States stood for at its birth.

Thomas Jefferson sat for this portrait by Charles Willson Peale in 1791.


Two examples come quickly to mind.  The first involves questions of social justice and equality.  From the argument over slavery before the Civil War to the struggle for civil rights a century later, Americans have debated what Jefferson meant–and what contemporaries thought that he meant–in asserting in 1776 that “all men are created equal.”  A second example, particularly important to evangelical Christians in recent decades, concerns the proper place of religious belief in the public square.  At least since Jefferson was cited authoritatively by the Supreme Court in 1947, Americans have contested the meaning and validity of his oft-quoted 1802 assertion that the First Amendment erected “a wall of separation between Church & State.”

Both issues are unquestionably important, and there is nothing intrinsically wrong with seeking to understand Jefferson’s position on either one.  And yet, because both subjects are so controversial, because they are fraught with policy implications and partisan consequences, the temptation to label Jefferson rather than learn from him has been immense.  Caught up in contemporary debates, our goal becomes primarily to prove that Jefferson is on “our side.”

This is especially true of the ongoing contest to define the extent of America’s Christian heritage, a struggle nicely encapsulated in the title of John Fea’s book, Was America Founded as a Christian Nation?  Given Jefferson’s stature as the author of the nation’s founding charter, combined with his seminal early role in debates over the public place of religion in American life, it is understandable that Jefferson’s religious beliefs have become a battleground in the contest over this larger question.

Understandable, but also unfortunate.

There is a cost to using history primarily as a weapon.  Rather than facilitating our understanding, it actually gets in our way, making it harder–not easier–to see the past rightly.  Complex answers don’t fare well in public debates, even when they’re true.  One of my favorite observations on this point comes from the pen of Alexis de Tocqueville, the French visitor to the United States who related his observations in the classic Democracy in America.  Tocqueville concluded, “A false but clear and precise idea always has more power in the world than one which is true but complex.”  Tocqueville nailed it.  Simple, appealing answers are always preferable when your goal is to win the battle for public opinion.

Beyond distorting our vision, what I call the “history-as-ammunition” approach also commonly feeds our pride.  Self-righteousness is often one of its first fruits.  After triumphantly “discovering” what we had predetermined to find, we applaud our superior understanding, congratulate ourselves on our disinterested commitment to truth, and condemn our opponents for their blindness and bias.

But when the debate that we’re drawn into concerns the nature of the religious beliefs of the nation’s founders, there is something more important at stake than historical accuracy or our personal character.  In assessing whether our nation’s founders were Christian, we’re inevitably saying something as well about the Christian faith and Christ himself.  Stephen Nichols makes this point marvelously in his book Jesus: Made in America.  As Nichols puts it, when we exaggerate the degree to which the founders were Christian, we not only “do injustice to the past and to the true thought of the founders,” but we also do “injustice to Christianity and the true picture of Jesus.”

I read Nichols’ book over Christmas vacation, and his point finally convinced me to speak out about David Barton’s book, The Jefferson Lies: Exposing the Myths You’ve Always Believed about Thomas Jefferson (Thomas Nelson, 2012).  I had long hesitated to write about Barton’s contentious “scholarship.”  Numerous historians (several conservative Christian scholars among them) have already called attention to its many factual errors, half-truths, and misinterpretations, but it seems to me that Nichols’ point has somehow gotten lost in their critique of Barton’s historical claims.

Certainly, as a piece of historical scholarship, the book is awful.  I take no pleasure in saying so, but no other word will do.  But it has another quality which may ultimately be more detrimental in its effect on Christian readers: it is relentlessly anti-intellectual.  Barton prepares his readers for the criticism his views will elicit by means of a preemptive first strike.  The views about Jefferson that he disagrees with are “lies.”  (It follows that those who promote such views are liars.)  Those who side with his critics are “ill informed or ill intentioned.”  Academic historians disagree with him because they have been corrupted by a range of “isms” that lead to historical “malpractice.”  Almost every work of U. S. history written since 1900 is suspect (except his own, of course).  Barton’s advice: flush the last century of historical scholarship and depend on earlier works less likely to be “infected with our modern agendas.”


Note also that there is a “bait-and-switch” dimension to Barton’s promise to sweep away the “lies.”  After explaining in the opening pages why academic scholarship cannot be trusted, when Barton actually shifts his attention in subsequent chapters to specific claims about Jefferson, the “lies” that he exposes often come from sources that few academic scholars would find credible, such as journalistic essays, personal web sites, and Facebook pages.  (Can’t we all find someone on the internet who disagrees with us?)  In some instances, at least, Barton is clearly toppling a straw man.

Let me be clear: my goal is not primarily to defend the Academy against an outsider.  Is some modern scholarship ideologically driven and hostile to traditional Christian values?  Absolutely.  But Barton’s approach is not preparing us to think Christianly or to argue persuasively about other perspectives.  He is training us simply to attack the character of those who disagree with us.  This is not a winsome witness to the world.  It is more like schoolyard name-calling.

Why is this a big deal?  It is a big deal because an important part of what historical interpretations teach us has little to do with the past per se.  Our historical interpretations always contain a “teaching behind the teaching,” to borrow a phrase from Christian writer Parker Palmer.  Even when the “teaching behind the teaching” is not explicit, works of history are modeling to us a particular way of thinking about the past and of engaging with the present.  Even apart from its contentious claims about Thomas Jefferson, I shudder to think that my brothers and sisters in the Church are learning from books like The Jefferson Lies about what it means to love God with our minds.

But in the end I think Stephen Nichols’ observation presents us with by far the most important concern to raise about the book: what does Barton’s representation of Jefferson teach readers about Christ and the Christian faith?   Of the seven “lies” that Barton claims to refute, the final one is the most pertinent to this question, the supposed claim that “Thomas Jefferson was an atheist and not a Christian.”

Logically, this seventh “lie” involves two claims rather than one, and Barton should never have joined the two.  If Jefferson was indeed an atheist, of course, it is necessarily the case that he was also not a Christian.  The converse, however, is far from true, i.e., to establish that Jefferson was not an atheist in no way proves that he was a Christian.  (The world is full of non-Christians who believe in God.)   I can’t read Barton’s mind, but I find myself wondering whether he formulated this illogical proposition intentionally.  Linking the two claims into one proposition helps to obscure the weakness of his argument about Jefferson’s supposed Christian faith.  Jefferson was no atheist, and proving that he wasn’t is easy.  (There is scarcely a single reputable scholar who argues that he was an atheist, by the way, a fact you wouldn’t learn by reading The Jefferson Lies.)  The evidence that Jefferson was not an orthodox Christian, on the other hand, is irrefutable.

I can imagine that you might be uncomfortable with my making such a dogmatic statement about Jefferson’s personal faith.  Who am I, after all, to claim to have penetrated the man’s heart?  But that is not what I am claiming at all.  Yes, only God knows our hearts perfectly, so when someone claims to have made a profession of Christian faith, we are rightly hesitant to declare that God has not done a work in his or her heart, even if there seems to be much evidence to the contrary.   But when someone comes to us and explicitly renounces the central pillars of historical Christian orthodoxy, it does not require divine insight to categorize that person as not a Christian, at least according to the historic creeds that have defined the boundaries of orthodoxy for centuries.

Take, for example, the Apostles’ Creed, a distillation of Christian belief that took its final form in the seventh century.  By my calculation, Jefferson explicitly repudiated at least two thirds of its indicative statements.  Did Jefferson “believe in God, the Father almighty, Creator of heaven and earth”?  Yes.  Did he believe in “the resurrection of the body and the life everlasting”?  Possibly.  Did he believe that Jesus was God’s “only Son,” that he “was conceived of the Holy Spirit” and “born of the virgin Mary,” that he “rose again from the dead,” that he “ascended to heaven and is seated at the right hand of God the Father almighty,” or that Jesus will come again “to judge the living and the dead”?  The answers are no, no, no, no, and no.  Measured by the Apostles’ Creed, Jefferson was a heretic, and we don’t need to plumb the depths of his heart to conclude this.

To his credit, Barton concedes (in a masterpiece of understatement) that “in his later years” Jefferson’s views “do not comport with an orthodox understanding of what it means to be a Christian.”  But he immediately goes on to insist–in the very same sentence–that “throughout his life Jefferson was pro-Christian and pro-Jesus in his beliefs.”  Barton’s assertion that Jefferson only fell into heresy late in life is almost certainly wrong, but I am not going to take the time here to address it systematically.  For our purposes, it is enough to examine how Barton characterizes Jefferson’s religious views in his old age.  According to Barton, they were BOTH (1) outside the bounds of orthodox Christianity, AND (2) “pro-Christian and pro-Jesus.”  How can both of these conclusions be true?

They can both be true only if Barton is separating our perceptions of Jesus from the historic, orthodox understanding of Christ embodied in the creeds, which is precisely what enabled Jefferson to move toward heresy in the first place.  After divorcing his understanding of Jesus from the historic creeds, Jefferson went on to jettison much of the Bible as well.  Scripture was not the ultimate arbiter of truth by Jefferson’s reckoning; it was riddled with fabrications, embellishments, and the misunderstandings of human authors less enlightened than he.  God had not left his creation without testimony, however.  The creator had given to all mankind a moral sense by which to determine right from wrong, and he had also inculcated in humans the faculty of reason, by which they could distinguish between truth and superstition.

“Fix reason firmly in her seat,” Jefferson counseled his teenaged nephew in 1787, “and call to her tribunal every fact, every opinion.”  In an appalling misreading of Jefferson’s counsel, Barton insists that Jefferson was merely trying to train his nephew to be a good apologist for Christianity, someone who would have a logically consistent and intellectually formidable defense of the faith (an 18th-century Josh McDowell, say).  In reality, because Jefferson did not believe that the biblical canon was inspired, he explicitly encouraged his nephew to read the Bible in the same way that he read pagan literature, accepting whatever seemed to be in accord with reason and rejecting all else.

“Your reason is the only oracle given you by heaven,” Jefferson stressed to his nephew.  With amazing obtuseness, Barton quotes Jefferson’s counsel and underscores (literally italicizes) the phrase “given you by heaven.”  It is evidence, he contends, that Jefferson “definitely held a strong, personal, pro-God position.”  And yet the idea that reason was a faculty given to man by God was a common Enlightenment belief and hardly uniquely Christian.  Far more revealing is Jefferson’s assertion that reason is our ONLY oracle.  Indeed, in the very sentence that Barton cites as evidence of Jefferson’s “pro-God” views, Jefferson was actually denying the inspiration of scripture!

Jefferson would have been gratified by Barton’s conclusion that he was “pro-Jesus.”  In truth, Jefferson did have unbounded admiration for Jesus–as long as he himself could define who Jesus was, unconstrained by either Scripture or centuries of Church teaching.  “I am a sect by myself, as far as I know,” Jefferson once confessed to a Presbyterian minister.  Echoing Thomas Paine’s earlier declaration (“My mind is my own church,” Paine had written in The Age of Reason), Jefferson nicely foreshadowed the radical individualism, relativism, and insistent autonomy so pervasive in twenty-first-century America.

By his own testimony, the Jesus that Jefferson admired was an enlightened philosopher, a moral teacher, and a “benevolent and sublime reformer.”  The Jesus of Jefferson’s creation was appalled at the superstition of Judaism (“the depraved religion of his own country”).  He sought to reform the Jews’ “moral doctrines to the standard of reason.”  The God of the Jews was “vindictive, capricious, and unjust,” but Jesus had re-envisioned the Deity by imputing to him “the best qualities of the human head and heart.”  In so doing, he had given the world a “Supreme Being . . . really worthy of their adoration.”  God became truly worthy of our worship, in other words, once enlightened minds created God in their own image.

And yes, by his own reckoning, Jefferson did think of himself as a Christian, but should that surprise us?  (Once Jefferson had finished with Jesus, Jesus looked a lot like Jefferson–why wouldn’t he wish to follow him?)  “I am a Christian,” then-president Jefferson wrote confidently to his friend Benjamin Rush in 1803, “in the only sense he wished any one to be; sincerely attached to his doctrines; in preference to all others; ascribing to himself every human excellence; & believing he never claimed any other.”  It is more than a little ironic, as Christian scholar Stephen Nichols has pointed out, that so many evangelicals quote the first half of that sentence, given that Jefferson undermined his apparent confession of faith in the second half.  “I am a follower of Jesus,” Jefferson was saying, “as long as we understand that he was not the son of God nor ever claimed to be.”

And why does any of this matter, apart from a desire for historical accuracy?  It matters because of what Stephen Nichols warned us about–in exaggerating Jefferson’s endorsement of Christianity, Barton is not only making claims about Jefferson.  His findings reflect on Christianity, and what is more, they reflect on Christ.  For all of its gross historical flaws, what bothers me most about The Jefferson Lies is how its author–himself a former pastor–minimizes the gravity of Jefferson’s heresy.

In all candor, I am at a loss to know how to explain this.  My best guess is that Barton has become so all-consumed with his campaign to prove that the Founders weren’t hostile to religion that nothing else matters.  Determined to prove the point, Barton drowns out everything else that the past has to say to us, including much that American Christians need to hear.

“Who do men say that I, the Son of Man, am?” Jesus asked His disciples in Matthew chapter 16.  The correct answer to this question lies at the very heart of the Christian gospel.  When Peter proclaims “You are the Christ, the Son of the living God,” Jesus tells Peter he is blessed indeed, because His Father in heaven has revealed this truth to him–the truth upon which Jesus promised to “build My church.”

Today American Christians, including American evangelicals, are increasingly confused about the person of Jesus.  A 2009 survey of self-described Christians by the Barna Group found that roughly two-fifths of American Christians believe that Jesus sinned when he lived on earth.  An intensive 2010 study of Christian teenagers by Mike Nappa (The Jesus Survey: What Christian Teens Really Believe and Why) found that a clear majority doubted that the Bible was trustworthy, and fully one-third rejected the scriptural teaching that belief in Christ is essential for salvation.

In part, such findings may reflect the impact of a wider culture that glorifies “tolerance” and rejects all exclusive truth claims as narrow-minded or bigoted.  But as Kenda Creasy Dean observes in her book Almost Christian: What the Faith of Our Teenagers is Telling the American Church, it may also reflect the “watered-down gospel” that Christian teens are receiving from the church itself.  Nappa agrees, concluding that such enormous misunderstandings of basic Christian truths “wouldn’t be widespread in our youth groups if adult Christians in our churches weren’t also embracing” them.

David Barton apparently believes that the greatest need of the moment is to re-establish the cultural authority of the church in the public square.  To further that end, he is determined to prove, at all costs, that one of our most eminent Founders wasn’t as opposed to religion as the Supreme Court seems to think.  But it will be a hollow victory for Christians to increase their public presence if they have no more to say about Jesus than what Jefferson himself thought was true.

Barton concludes The Jefferson Lies by characterizing our third president as a man sent by “Divine Providence” to “serve and inspire” us.  We would do better to view him as a cautionary tale.  Thomas Jefferson had many virtues, but with regard to life’s most important question–the question that Jesus asks each of us, “Who do you think that I am?”–Jefferson got it wrong.


It was the headline that got my attention.  “Professors, We Need You!” trumpeted the title of a recent editorial by liberal political columnist Nicholas Kristof.  In a masterpiece of left-handed compliments, Kristof observes that “some of the smartest thinkers on problems at home and around the world are university professors,” and yet “most of them don’t matter in today’s great debates.”  And why is that, he asks?  Part of the explanation lies in a widespread anti-intellectualism in American culture, but the sadder reality is that academics have also been authors of their own irrelevance.  “It’s not just that America has marginalized some of its sharpest minds,” Kristof writes.  “They have also marginalized themselves.”

Kristof devotes most of his essay to explaining how such a thing has come to pass.  As academic disciplines have become progressively more specialized, their subject matter has become progressively less accessible.  To compound the problem, Ph.D. programs “have fostered a culture that glorifies arcane unintelligibility.”  Aspiring academics are forced to “encode their insights into turgid prose.”  Then, to make sure that their findings have little public impact, they bury their “gobbledygook” in “obscure journals” or books published by university presses with “reputations for soporifics.”  (Presumably, this limits their audience to readers who don’t need to look up the definition of words like soporifics.)

There is much that I could quibble with in the details of Kristof’s explanation.  It is simplistic and one-sided, although I suppose that goes with the territory of a thousand-word op-ed.  The one concrete recommendation that he offers—professors should “cast pearls through Twitter and Facebook”—is also disappointingly superficial.  And yet the fundamental problem that Kristof points to is undeniable.  In more cases than not, there is a chasm between academic scholars and the general public.  This is true, furthermore, not only of scholars in highly technical fields, where the gulf is perhaps unavoidable.  (Think Sheldon Cooper in The Big Bang Theory—does anyone really expect a theoretical physicist to relate to normal people?)  It is also true of professors in the humanities who have much to offer the general public and much less excuse for the isolation in which they labor.

When I think about this problem, my heart and mind go directly to a particular subset of the larger pattern, namely the gulf that too often separates Christian scholars from their brothers and sisters in the church pews.  As I have shared on numerous occasions, my heart’s desire is to find ways to bridge this gap.  This is why I left the University of Washington for Wheaton College, why I want to write books like The First Thanksgiving, why I even started this blog.

In this post and the next one, I want to think out loud with you about the possible causes of the chasm and what might be done to narrow it.  I think Kristof is on to something when he suggests that the root of the problem lies both (a) in the culture of American society and (b) in the culture of the Academy.  I’ll speak to the culture of the Academy first—I think I have more to add to this part of the conversation—and then next time I will share some thoughts and invite your input into how the church might be contributing to the marginalization of Christian scholarship.

To begin with, let me emphasize that I think the Academy is full of Christian scholars who think of their vocations as a form of service to the church.  Many of them work in small Christian colleges where they labor in obscurity, pouring out their lives in return for meager salaries and minimal professional reward.  They are teachers first and foremost, and they serve the greater good by equipping generations of students to think faithfully about the life of the mind and the love of God.  May their tribe increase.

And yet the vast majority of Christians in the United States will never darken the door of a Christian college or university.  To connect with the remainder, Christian professors cannot rely on their teaching alone, and therein lies the rub.  Kristof is right when he says that academic culture discourages efforts to speak to a broad audience.  Since the rise of the American university toward the end of the nineteenth century, the Academy has effectively separated the purposes of teaching and scholarship.  To teach is to convey knowledge only; to engage in scholarship is to push back the boundaries of knowledge.  The former centers on simple communication, which supposedly anyone can master; the latter requires innovation, by comparison a rare and precious commodity.  At the research university where I taught for more than two decades, the unstated assumption in faculty hiring discussions was that anyone who could write a “cutting-edge” doctoral dissertation would automatically be dynamic in the classroom.  Generations of American college students might think differently.

All this is to say that Christian professors who want to write for a popular Christian audience face a double whammy.  To the degree that the result is palpably Christian, it will run afoul of an academic culture that equates traditional Christian beliefs with superstition and ignorance.  To the degree that it is clearly intended for a non-academic audience, it will trigger the Academy’s contempt for anything “popular.”  Serious scholars write for each other, after all.

My point is not to absolve Christian professors of all responsibility.  Allowing for many exceptions, on the whole we have accepted the values of the Academy too readily.  We have passively allowed the academy to establish our vocational priorities, to determine which questions should be important to us, to define what should pass for excellence and success.  We need to be bolder than we have been, less dependent on professional affirmation, and less reticent about declaring our ultimate loyalties.

To do this we will need God’s strength, wisdom, courage, and grace.  We will also need help from the body of Christ.  That’s where you come in.  Let’s talk again soon.


These last couple of days I have been reading a fair amount in the correspondence of Thomas Jefferson, and just yesterday I came across some of Jefferson’s ruminations on the importance of exercise that might interest you, especially given the likelihood that almost all of us will soon be watching a certain athletic contest.

The comments I have in mind come from a letter that Jefferson wrote to his nephew, Peter Carr, in the summer of 1785.  Carr was the son of Jefferson’s sister Martha and his close friend Dabney Carr, and Jefferson seems to have taken great interest in the boy’s upbringing after Carr’s father died when young Peter was only three.  By 1785 Jefferson’s wife had also passed away, and perhaps he saw in the teenaged Peter the son he would never have.  Whatever his motive, Jefferson devoted considerable attention to Carr’s education, and he counseled him frequently on the path his nephew must follow if he aspired to a career of accomplishment and service worthy of a Virginia gentleman.

Thomas Jefferson sat for this portrait by Mather Brown in 1786, the year after he wrote to his nephew Peter Carr.

Although only forty-two years old, Jefferson in 1785 had already compiled an amazing record of public service.  In the past decade alone, he had served as a delegate to the Second Continental Congress, been the principal author of the Declaration of Independence, and held the post of governor of Virginia, and he was now representing the United States as ambassador to France.  “Mortified” by reports of his nephew’s slow academic progress, Jefferson wrote from Paris on the 19th of August to exhort the fifteen-year-old to greater effort.  Did Peter not realize that “every day you lose, will retard a day your entrance on that public stage whereon you may begin to be useful to yourself?” (It’s just a suspicion, but Jefferson may not have been cut out to work with teenagers.)

Jefferson began by focusing on his young charge’s character.  “The purest integrity” and “the most chaste honour” must be his nephew’s “first object.”  “The defect of these virtues can never be made up” by other accomplishments.  Then after integrity comes intellect.  “An honest heart being the first blessing,” Jefferson explained, “a knowing head is the second.”  And so the future president laid out a course of reading for the teenager.  He must begin with “antient” history–reading “everything in the original” language, of course–and proceed from there to Greek and Roman poetry, followed by a systematic study of philosophy and ethics beginning with Plato and Cicero.

But the body was important as well as the heart and head.  To maximize his academic progress, Peter should set aside at least two hours every day for exercise, “for health must not be sacrificed to learning.”  “A strong body,” Jefferson lectured, “makes the mind strong.”  (Michelle Obama could not have said it better.)  But what kind of exercise should Peter pursue?   Jefferson left nothing to doubt.  “Walking is the best possible exercise,” he instructed, but not just any kind of walking.  “Never think of taking a book with you” while you walk, Jefferson stressed.  Instead,  “let your gun . . . be the constant companion of your walks.”  While “this gives a moderate exercise to the body, it gives boldness, enterprize, and independance [sic] to the mind.”

And what about more modern kinds of sports, you ask?  Peter should avoid “games played with the ball and others of that nature,” Jefferson cautioned his nephew.  They “are too violent for the body and stamp no character on the mind.”

Thomas Jefferson probably wouldn’t approve.

Go Broncos!


In my last post (which was too long ago, I know) I recommended a new book  just out by historian Margaret Bendroth titled The Spiritual Practice of Remembering.  One of Bendroth’s many spot-on observations is the following:

Historical perspective should make us more humble and cautious about ourselves.  People from the past were not the only ones operating within a cultural context–we have one, too.  Just like them we cannot imagine life any other way than it is: everyone assumes that “what is” is what was meant to be.

What Bendroth is telling us is that we do not naturally think of the values that we hold (what we believe, how we think, how we behave) as influenced by the historical context in which we live.  The beliefs of people in other times and places may strike us as peculiar, but not our own.  No, our way of looking at the world strikes us (if we stop to question it at all) as obvious, self-evident, natural.  Our way of thinking requires no explanation.  It just is.  It’s the deviations from our pattern that demand justification.

Reflecting on this very human trait always brings to mind an observation that my younger daughter made many years ago, when she was about four years old.  To understand this illustration, you have to know that both my wife and I were born and raised in the South, and when we moved to Seattle (and the University of Washington) right after I finished graduate school, among the baggage we brought with us were two substantial southern accents.  My wife’s drawl was considerably moderated by several years of classical voice training, but I struggled (and to some degree, still do) not to sound like Gomer Pyle with a Ph.D., and our children, quite naturally, absorbed much of their parents’ speech patterns.  And so for years after we settled in the Pacific Northwest, it was not uncommon for guests to our home to comment on our accents.

On one such occasion, our precocious four-year-old overheard a visitor talking about her daddy’s accent, and as soon as the opportunity presented itself, she tugged on my pants leg and pulled me aside to ask a question.  “What’s an accent?” she asked, and I did my best to explain the concept.  She seemed to follow my explanation, but she still looked troubled.  “You don’t have an accent, Daddy,” she declared emphatically, her southern drawl reminiscent of molasses oozing across a plate.  “You talk just like I do!”

One of history’s priceless benefits, potentially, is that it can help us see with new eyes what we would otherwise take for granted.  It rattles our complacency, challenging us to think more deeply about the things we see as too self-evident to require explanation.  By introducing us to people from other times and places who saw things differently, history can put our own values to the test.  And in doing so, it makes it easier for us to fulfill the biblical injunction to “take every thought into captivity to the obedience of Christ” (II Corinthians 10:5).

Over Christmas vacation I read three related books that powerfully illustrate this benefit.  This semester I am co-teaching for the first time a course on race and ethnicity in U. S. history, and with that course in mind, I picked up The Color of Christ: The Son of God and the Saga of Race in America (University of North Carolina Press, 2012), by Edward J. Blum and Paul Harvey.  As the title suggests, the authors are interested in how Americans have imagined Jesus in racial terms over the course of U. S. history.  (Was Jesus fair-skinned?  Dark-complexioned?)  Although focused specifically on attitudes about race, the book offers a convicting case study of the ways that cultural values inform–and often distort–the substance of our religious faith.

For broader context, I also read two other works of history that speak to this larger topic:  American Jesus: How the Son of God Became a National Icon (Farrar, Straus, and Giroux, 2003), by Stephen Prothero; and Jesus: Made in America (InterVarsity Press, 2008), by Stephen J. Nichols.  Both works echo The Color of Christ in documenting the myriad ways that Americans’ changing values over the centuries have influenced their religious convictions as reflected in their perceptions of the nature of Jesus.

I don’t endorse each of these books equally.  Prothero, professor of religion at Boston University, is the kind of public authority on religion who gets invited to appear on Oprah, The Daily Show with Jon Stewart, or the Colbert Report.  These are not venues that typically promote deep reflection.  American Jesus is an entertaining read, but maybe more clever than insightful.  I couldn’t help suspecting that Prothero writes at times with shock value in mind.  As he explains early on, when he refers to “Jesus” he does not mean the man from Galilee whom Christians believe to be the Son of God; rather, he has in mind the “American Jesus” (hence the title of the book), i.e., Jesus as Americans have perceived him.  Whether their perceptions are true to who Jesus really was (and is) does not interest him, and it is not at all clear that he would even view the question as important.  Perceptions of Jesus should change over time, he says.  “Only dead religions stay the same; living faiths adapt continuously to changes in their environment.”

The Color of Christ is a very different book.  Published by a university press, its primary intended audience (I am inferring here) is readers within the Academy.  (Both authors work within the Academy themselves, Blum at San Diego State University and Harvey at the University of Colorado.)  The prose is denser than Prothero’s, the tone far more serious.  Blum and Harvey begin the book with a sobering vignette: an account of the 1963 bombing by white supremacists of the all-black Sixteenth Street Baptist Church of Birmingham, Alabama.  In addition to taking the lives of four little girls, the dynamite planted by opponents of integration also uncannily (miraculously?) shattered the face of Jesus in the church’s stained-glass window.  “In the blink of an eye,” Blum and Harvey write, “the prince of peace was made a casualty of race war.”  The authors see the tragic episode as a kind of parable, underscoring how central images of Jesus have been to American understandings of race.

Jesus: Made in America is yet another kind of book, with a different emphasis.  While Stephen Prothero breezily reviews how Americans of all races and creeds have thought about Jesus, and Blum and Harvey focus specifically on how a broad sampling of Americans have imputed racial characteristics to Jesus, Stephen Nichols is interested particularly in the perceptions of American evangelicals regarding the Man from Nazareth.  An evangelical himself (a graduate of Westminster Theological Seminary and a professor of theology at Lancaster Bible College), Nichols writes openly to evangelicals as well as about them.

For all their differences in background and approach, all three works arrive at strikingly similar conclusions.  Taking the long view of the past four centuries, the evidence is overwhelming that Americans–including American evangelicals–have allowed the values of their culture to influence significantly how they envision Jesus.  Focusing on American racial attitudes, Blum and Harvey conclude that, as Americans imagined visual images of Jesus (who began to become white in their minds’ eye in the early nineteenth century), “they made a sacred window through which they could see their hopes, fears, dreams, and conflicts in racial and religious forms.”  Reviewing American attitudes more broadly, Prothero observes, “In the book of Genesis, God creates humans in His own image: in the United States, Americans have created Jesus, over and over again, in theirs.”  We have constantly imagined a Jesus who affirmed precisely the values we already hold, Prothero determines.  From our particular vantage points, we have imagined him as manly and effeminate, “a socialist and a capitalist, a pacifist and a warrior, a Ku Klux Klansman and a civil rights agitator.”  Nichols heartily agrees.  His review of American evangelicals’ perceptions of Jesus over time demonstrates “the ways in which we have capitulated to our culture and have subjected Christ to our cultural predilections.”

Unlike the other two works, however, Nichols’ Jesus: Made in America seeks to edify as well as educate.  Repeatedly, he challenges evangelical readers to find lessons in the story that he tells.  If, after reading his book, we simply cluck our tongues in judgment of our ancestors for their blindness to the ways that they conformed to the culture, then Nichols will have failed.  Rather, he wants us to see ourselves–at least potentially–in the pages he has written.  He insists that his account should serve “as a parable for contemporary American evangelicals.”  The trap that ensnared previous generations can capture us as well.  What arrogance to think that we will be immune to the temptation to let the culture shape our faith!  In this sense, the movement away from an orthodox understanding of Jesus across American history should make us fearful rather than judgmental.

In the course of his study, Nichols points to several aspects of evangelical belief that make us especially vulnerable to being conformed to the world without even knowing it.  Rather than summarize his argument, I will recommend that you spend time with the book yourself.  I can’t resist sharing one of the factors that he pinpoints, however: we ignore the past as a source of wisdom.  This dismissive attitude toward history–what one specialist on American religious values has termed “historylessness”–“leaves American evangelicals more vulnerable than most when it comes to cultural pressures and influences.”

It’s a sobering assessment.