
FROM MY COMMONPLACE BOOK: SEEING MORE CLEARLY WHOM WE SERVE

Sociologist Christian Smith–a believing scholar at Notre Dame, formerly at UNC–has spent most of his career systematically surveying American religious beliefs.  A prolific author, he is perhaps best known for his 2005 book (coauthored with Melinda L. Denton) Soul Searching: The Religious and Spiritual Lives of American Teenagers. Less well known outside academic circles is his 1998 study focused specifically on the values and beliefs of evangelical Christians in the U.S.–American Evangelicalism: Embattled and Thriving.

The book’s title nicely captures its main argument.  After undertaking extensive polling and conducting thousands of interviews, Smith and his team of researchers concluded that American evangelicals were thriving  in large part because they were embattled.  Evangelicalism was growing rapidly, in other words, “very much because of and not in spite of its confrontation with modern pluralism.”  Evangelicals see themselves as taking part in an ongoing struggle with an unbelieving culture, Smith found, and that sense of struggle has given evangelicalism much of its religious strength.

The sense of cultural struggle Smith alludes to has surely had its benefits for the life of the mind.  Most notably, as Smith points out, it has kept American evangelicals from either blandly blending into the secular mainstream or wholly withdrawing into fundamentalist ghettos.  That’s a good thing.  But when it comes to our engagement with the past, our sense of being engaged in a cultural struggle has been a mixed blessing.

On the one hand, it has led countless believers to value the past, to believe that it is vitally important that American Christians not lose touch with their religious and national history.  Although we historians often bemoan our culture’s “chronological snobbery” and relentless present-mindedness, every time I attend a home-school gathering or speak to a private Christian school, I am reminded that there is an enormous population of American Christians who take history with the utmost seriousness.

On the other hand, the embattled mindset that Smith writes about has encouraged countless Christian leaders and thinkers to study the past with an agenda in mind.  The most influential contributors to the popular view that America was founded as a Christian nation are also among the most egregious practitioners of what I call the “history-as-ammunition” approach to the past.

Although their intentions may be honorable, those who adopt this strategy are more interested in proving points and winning arguments than in gaining greater understanding of a complex past.  They know in advance what they want to find in their investigations, and they can already envision how their anticipated “discoveries” will reinforce values that they already hold.  I cannot overstate the costs of such an approach.  When we employ the history-as-ammunition approach, we predictably find what we are looking for, but we rob history of its power in the process. History loses its potential to surprise and unnerve us, ultimately to teach us anything at all. We learn nothing beyond what we already “know.”

Here is an extended quote from my commonplace book that calls Christians to a different standard.  The author is Duane Litfin, who for seventeen years (1993-2010) was president of my current institution, Wheaton College.  The passage is from his 2004 work Conceiving the Christian College.  In context, Litfin is exploring the possible motivations for Christian scholarship and challenging Christians engaged in the life of the mind “to see more fully whom we serve.”  Listen to what he has to say:

I am highly motivated to be about the business of cultivating our minds and our learning, but it seems to me that our first motives must be intrinsic rather than instrumental.  In other words, we must learn to love God with our minds, to use our artistic gifts for Christ, to embody him in serving our neighbor and our society.  But our primary motive for doing so must not be the transformation of our culture.  Our prime motive must be obedience to Jesus Christ.  Then, if the living Christ graciously chooses to use our efforts to mold our culture into more of what he wants it to be, we will be grateful.  On the other hand, if he does not so choose–and let us be clear about it, he does not always so choose–and the culture remains resistant, even hostile, to our Christian influence, we must not be cast down.  Our motivation is not dependent on the acceptance and approval of our culture; in the end we care preeminently about the approval of Jesus Christ.  Our goal is to love God with our minds, whether the culture comes to appreciate our efforts or not.

AN ATHEIST’S HISTORY OF THE AMERICAN FOUNDING

Earlier this month I had an opportunity to review a new book on the American founding for Christianity Today. The book is Nature’s God: The Heretical Origins of the American Republic. The author, Matthew Stewart, is an independent writer, a philosopher by training, and an atheist by conviction. (If you missed the review, you can read it here.) Summarizing broadly, Nature’s God argues that the vision of the leading Founders was aggressively secular. Their worldview centered on a radical deism that was tantamount to atheism, and their ultimate objective was not freedom of religion but freedom from religion. What is more, their views were widely shared by common Americans in the revolutionary era.

Although Stewart cloaks his argument in a 400-page narrative, the heart of his reasoning boils down to a simple syllogism: The ideas that matter in history are the ones that are true. Religious beliefs are, by definition, false. Ergo (philosophers say ergo a lot), religious beliefs couldn’t have mattered in the American founding. If lots of colonists back in ’76 thought otherwise, that’s because they weren’t as enlightened as the author. Too bad for them.

The thrust of my review was to call attention to Stewart’s a priori assumptions and to remind readers of historians’ quaint belief that historical assertions should be grounded in historical evidence. Stewart is correct to point out that the religious beliefs of many of the leading Founders were unorthodox, David Barton’s wish-dreams to the contrary notwithstanding. But Stewart errs badly in equating the views of the leading Founders with atheism, and he provides almost no evidence at all for his insistence that radical philosophy was widespread among the rank and file of colonial patriots.  In short, the emperor has no clothes.

I was under a strict word limitation in my review for CT, and there was quite a bit that I wanted to say that space didn’t allow. Before the buzz about the book fades completely—hopefully not too long from now—I thought I would share some thoughts that didn’t get into the formal review. Here are two somewhat lengthy additional reflections:

First, a great deal of what Stewart wants to do in Nature’s God is challenge the intellectual coherence of orthodox Christianity. Debates about the past are almost always debates about the present in disguise, and Stewart’s claims about the origins of the American Revolution are no exception. The author openly longs for the day when religious belief is wholly “confined to the private sphere, as a purely inward matter, where it is rendered harmless.” He recognizes that it’s easier to justify the banishment of faith from public life in 2014 if you can prove that it was irrelevant in 1776.

Yet for a study that is so determined to discredit orthodox Christianity, the author is curiously averse to engaging Christian scholars, whether historians or theologians. When it comes to the religious beliefs of the revolutionary generation, quite a number of Christian historians have anticipated many of Stewart’s findings, albeit with vastly greater nuance and balance, but you’d never know it from his account. And as for the teachings of Scripture and the elements of orthodoxy, Stewart’s strategy is to ignore theologians altogether and instead lampoon the purported beliefs of “the common religious consciousness.”

Stewart alludes to “the common religious consciousness” incessantly (on pages 72, 92, 131, 158, 173, 174, 322, 339, 370, 374, 387, 389, 397, 427, among other places).  When he tires of the phrase he ridicules instead “the common view of things,” “the religious conception,” “the common sense of the matter,” “conventional wisdom,” “the common conception,” “common intuition,” “common ideas about things,” “a common line of interpretation” and the “widely accepted view today.” The one thing that unifies all of these phrases is that not one of them comes with a single specific citation of supporting evidence. The “common religious consciousness” is simply Stewart’s rhetorical whipping boy.  It stands for whatever straw man he needs at the moment to make Christianity appear ludicrous.

Don’t get me wrong. At times Nature’s God is an impressively scholarly work. The end notes are ninety pages long, and Stewart can split hairs with the best of them in exploring the subtleties of Epicurean philosophy or the writings of Benedict de Spinoza. But when it comes to defining the Christianity he so detests, the book becomes appallingly unscholarly, even anti-intellectual. Christianity is simply whatever Stewart says it is. And that makes Stewart’s job of ridiculing it a lot easier. “Nice work, if you can get it,” as we like to say around the McKenzie household.

Second, although Stewart would wince at the comparison, I kept thinking while reading Nature’s God that the book has a lot in common with the works of David Barton. A recurring theme in Barton’s “Christian America” interpretation is that the true history of America’s origins has been intentionally hidden by secularists who hate the truth. With almost perfect symmetry, Stewart argues that Christian apologists have “lobotomized” the more radical leaders of the Revolution and covered up the reality that they were religious heretics. From the founding all the way to our day, “there have been many attempts,” Stewart charges, “most of them misinformed, some shamelessly deceitful—to deny or emend this basic fact of American history.”

Like Barton, Stewart also contends that he has no agenda other than a zealous commitment to discover the truth. He claims that he was “eager to see what I might learn” from the writings of Barton, Tim LaHaye, Gary DeMar, and company—a whopper if I’ve ever heard one—and he insists that he was repeatedly surprised by the conclusions that his unbiased examination of the evidence thrust upon him. As I followed Stewart’s description of his approach in the book’s preface, the image that came to mind was an academic version of Sgt. Joe Friday, the relentless Dragnet detective who followed the evidence wherever it led. Just the facts, ma’am.

The reality is much different.  Stewart–like Barton–approaches the past more like a defense attorney than a police detective.   His job is not to present the whole truth to the jury, but rather to make the strongest case that he can for his client.  To put it differently, Stewart–just like Barton–is focused more on scoring points in the culture wars than on wrestling with the complexities of the past.  Winning the argument trumps understanding the issues.

I’ll take the time to share one appalling example of this from Nature’s God.  In chapter two (titled “Pathologies of Freedom”), Stewart introduces the villain in his melodrama, namely the Protestant Christianity that was widespread in the American colonies in the aftermath of the Great Awakening.  His primary goal for the chapter is to demonstrate how utterly anti-intellectual Christianity was (and is).

To that end, Stewart frames the chapter in terms of a relentless struggle between science and religion.  The former is defined by an open-ended commitment to truth, the latter by narrow-minded bigotry and hostility to free inquiry.  Stewart begins the chapter with an anecdote involving Ethan Allen, the free-thinking backwoodsman who would go on to fame during the Revolution as leader of Vermont’s “Green Mountain Boys.”  In 1764 Allen was arrested in Salisbury, Connecticut, for defying a town ordinance prohibiting the administration of smallpox inoculations.  According to Stewart, the town’s council of “selectmen” had caved in to religious arguments that inoculation interfered with divine sovereignty.  In an end note buried 414 pages later, he acknowledges that “opinion on the subject of inoculation did not consistently divide along theological lines.”  But in the text he notes only that Allen’s arrest “could be seen as one of many collisions between religion and science.”

Having used the vignette to illustrate the supposed hostility between faith and reason, Stewart then devotes the heart of the chapter to an overview of the theology of the Great Awakening, focusing most of his attention on an extended character sketch of the famous preacher and theologian Jonathan Edwards.  According to Stewart’s contemptuous caricature, Edwards fomented hate, taught “strikingly cruel doctrines,” and brainwashed his congregation into worshiping “an angry God who demands absolute humiliation upon pain of eternal damnation.”  What offends Stewart most is Edwards’ purported war on reason.  His followers were sheep who succumbed to Edwards’ insistence on “absolute submission,” on “obedience without sense or purpose.”  Finding no intellectually respectable grounds for Christian conviction, Stewart dismisses the Christianity of colonial America as a form of “madness.”

At this point, I could almost feel myself pulling for those brave colonial atheists who refused to shut off their brains even as waves of religious superstition rolled across the land.  But although Stewart’s prose is colorful and engaging, the author’s characterization of Edwards is more ignorant rant than serious scholarship.  Jonathan Edwards was one of the preeminent intellectuals of colonial America.  He read widely, thought deeply about literature and art and philosophy, and was throughout his life an advocate, not an opponent, of science.  When he died prematurely in 1758, he had just assumed the presidency of one of the leading institutions of higher education in North America, the College of New Jersey (later Princeton).  He was the last person to cast faith and reason as unalterable enemies.  That view belongs to Matthew Stewart, not Jonathan Edwards.

And the cause of Edwards’ premature death?  The point is hardly irrelevant to the chapter on colonial religion as Stewart frames it.  Edwards died from a smallpox inoculation, having concluded that the risk of contracting a mild case of the disease was justified by the statistical likelihood that the procedure would protect him.  Stewart never once hints at this fact.  He is either unaware of it–which is possible, though I find it unlikely–or the truth simply didn’t fit with his predetermined agenda to discredit the Christianity he so despises.

FROM MY COMMONPLACE BOOK: SCREWTAPE ON “THE HISTORICAL POINT OF VIEW”

As a historian, one of the things I most appreciate about C. S. Lewis is his conviction that the present has much to learn from the past.  As a teacher, one of the things I admire most about Lewis is his ability to communicate that conviction in an accessible, memorable, and imaginative way.  The extended quote below from my commonplace book wonderfully embodies both of these features.

The quote comes from Lewis’s WWII-era classic The Screwtape Letters.  If you’re not familiar with the book, I heartily recommend it.  It is a great example of Lewis’s genius at using imaginative literature to convey spiritual truth.  The book consists of a series of 31 letters from a senior devil named Screwtape to his nephew, a junior devil named Wormwood.  Throughout the letters, Screwtape lavishes his nephew with advice on how to cause Christians to stumble.   The book is both engaging and convicting, as long as you remember that everything comes from a diabolical perspective.  What Screwtape is recommending, Lewis is warning us to avoid.

Toward the end of the book, in letter 27, Screwtape tells Wormwood about what he calls “the Historical Point of View.”  In context, Screwtape has been explaining to his nephew how best to undermine the effectiveness of human prayers.  He notes that an ancient writer had shared insights that, if humans took them to heart, would badly undermine the devils’ strategy.  There is no need to worry, however, Screwtape assures his nephew.  “Only the learned read old books, and we [he means the devils of Hell] have now so dealt with the learned that they are of all men the least likely to acquire wisdom by doing so.”  Screwtape then goes on to explain the reason for this hellish success:

We have done this by inculcating the Historical Point of View.  The Historical Point of View, put briefly, means that when a learned man is presented with any statement in an ancient author, the one question he never asks is whether it is true.  He asks who influenced the ancient writer, and how far the statement is consistent with what he said in other books, and what phase in the writer’s development, or in the general history of thought, it illustrates, and how it affected later writers, and how often it has been misunderstood (specially by the learned man’s own colleagues) and what the general course of criticism on it has been for the last ten years, and what is the “present state of the question.” To regard the ancient writer as a possible source of knowledge–to anticipate that what he said could possibly modify your thoughts or your behavior–this would be rejected as unutterably simple-minded.  And since we cannot deceive the whole human race all the time, it is most important thus to cut every generation off from all others; for where learning makes a free commerce between the ages there is always the danger that the characteristic errors of one may be corrected by the characteristic truths of another.  But thanks be to Our Father [i.e., Satan] and the Historical Point of View, great scholars are now as little nourished by the past as the most ignorant mechanic who holds that “history is bunk.”

Isn’t that a delightful passage?  I could go on and on about it, but let me share just a few observations.  First, Lewis is reminding us that there are moral consequences to our ready dismissal of the past.  By cutting ourselves off from all those who have gone before us, we forfeit the hard-won wisdom of experience that our ancestors might otherwise bequeath to us.  This lessens our ability to live virtuously.  Our contempt for the past is itself a sign of a moral shortcoming on our part, namely intellectual pride–or what Lewis elsewhere labeled “chronological snobbery.”

Second, while there are many reasons why western society as a whole learns little from history, be sure to notice that in this passage Lewis is focused on “the learned.”  The Historical Point of View that he describes is most pronounced among the well educated, but I think we can be even more specific:  In the United States, at least, the Historical Point of View–the mindset that finds it “unutterably simple-minded” to suppose that one could learn how to live by studying the past–is most pronounced among academic historians. At its best–in the words of historian David Harlan–the study of history should be “a conversation with the dead about what we should value and how we should live.”  Too often in today’s universities, however, the study of history is a closed, self-referential conversation that individuals with Ph.D.s have with each other.

Finally, don’t miss the potshot that Lewis takes at the individual who was then the wealthiest man in the world.  During WWI, automobile tycoon Henry Ford had famously lectured Congress on the worthlessness of the past. “I don’t know much about history, and I wouldn’t give a nickel for all the history in the world,” Ford proclaimed.   “History is more or less bunk. It’s tradition. We don’t want tradition. We want to live in the present and the only history that is worth a tinker’s damn is the history we make today.”

Somewhere in the Lowerarchy of Hell, Screwtape smiled.

THOUGHTS ON THE SECULAR UNIVERSITY—Pt. II

Last week I responded to a rant in the Chronicle of Higher Education by University of Pennsylvania Professor Peter Conn (“The Great Accreditation Farce”). With considerable righteous indignation, Conn insists that to grant accreditation to schools like Wheaton College makes a mockery of the academic ideal of “unfettered inquiry” that supposedly defines the secular academy. In my response (“Should Religious Colleges Be Denied Accreditation?”), I mainly pointed out that Conn’s diatribe failed to capture my own experience. Since leaving the University of Washington for Wheaton College I have enjoyed more, not less, academic freedom.

Then, prompted by a question from a reader, I decided to follow up with two broader posts on the worldview of the secular university as I experienced it in my twenty-two years as a faculty member at such an institution. In the first (“Thoughts on the Secular University–Pt. I”), I primarily wanted to stress my belief that, at the level of the institution, today’s state universities are influenced by a hefty helping of market-oriented pragmatism. State universities are frequently enormous economic concerns. They employ thousands of workers and have billion-dollar budgets. And although they are non-profit organizations, they have to attract customers and keep them smiling just as much as Walmart or McDonald’s.

We are tempted to think that state schools are shielded from market pressures because they receive state funds, and perhaps there was a time when that was largely true. State legislatures have slashed their support to higher education over the last generation, however, so much so that many state universities are “public” institutions in name only. Universities compete for students, they compete for wealthy private donors, and they compete for government and corporate grants. To a significant degree, they take the shape of what others are willing to pay for.

While this is true, I am not remotely suggesting that the secular university is an ideology-free zone. Far from it. There is a well-defined ideology that predominates in the secular university. Not every faculty member equally endorses it, but it is pervasive enough and dominant enough that it is reasonable to call it the secular university’s defining worldview.

So what does this ideology look like? It’s probably best to begin by defining terms. A political philosopher could come up with a much more precise (and convoluted?) definition, but I like the simple definition of “ideology” as essentially your ideas about the way the world is and the way the world should be. Let’s take these two components in turn. What is the prevailing view in the secular university of how the world is?

The answer is simple: it is material, period. More than anyplace else in America, today’s secular universities are strongholds of the materialist view (as opposed to the religious view) of the origins and nature of the universe. According to this notion, matter and space have always existed. Outside of the physical world there is only nothingness. Everything is immanent. Nothing is transcendent. As the late astronomer Carl Sagan used to put it in the opening of the popular PBS series Cosmos, the material universe is all there is, all there ever has been, all there ever will be.

When it comes to higher education, the dogma of materialism finds expression in a single, overarching, non-negotiable dictum: in the words of atheist Matthew Stewart, “there is nothing outside the world that may explain anything within it.” The label for this philosophy of knowledge is rationalism. Rationalism regards human reason as the only path to truth. It says that the only way to make sense of the world is to put autonomous humans at the figurative center of the universe and rely on human reason to explain whatever it can.

More to the point, rationalism dismisses the very possibility of divine revelation. This doesn’t mean that the university has to dismiss religion per se from the curriculum, as long as it’s studied as an odd cultural phenomenon that human reason explains away. Most universities of any size have departments of religious studies (often staffed by professors who are atheists or agnostics).  Sociologists, anthropologists, and historians often touch upon religion as well. They just can’t profess to believe any of the truth claims of the religions they study.

All of this makes sense within a materialist, rationalist framework. So does the university’s theoretical stance on moral values. Remember, matter is all that there is. Matter can be weighed, measured, and explained. Values, on the other hand, are immaterial. They are, by definition, subjective and beyond proof. In the moral philosophy of the university, whatever values predominate in a particular place and time are best understood as “social constructions.” They are invented, not discovered. Societies adopt them over time because they are useful or, more likely, because those who wield power over them find them useful. In sum, while there may be discernible patterns of human behavior and belief, these cannot reflect objectively true values that transcend space and time.  Why?  Because nothing transcends space and time.

This, in a nutshell, is how the world is in the eyes of the secular university. What is its vision for how the world should be? Well, it should certainly be more rational, which is another way of saying it should be more secular. For several generations scholars have been asserting that secularization is the natural path of human development and predicting that religion will soon be an embarrassing memory from humanity’s superstitious childhood. Billions of believers have failed to get this memo, however, and both Islamic and Christian revivals continue to sweep vast portions of the majority world.

The world should also be much more just than it is. If the secular university exhibits a fair amount of pragmatism, it also exudes more than its share of moral passion and righteous indignation. This was certainly the case at the University of Washington. Walking across campus on a sunny day meant running a gauntlet of leaflet-wielding student organizations, each bent on converting you to its “cause” of choice: AIDS awareness, homelessness, environmentalism, human trafficking, apartheid, gay/lesbian/bisexual/transgender/queer rights, etc. Both faculty and students spoke glibly of “social justice” and “human rights” and both took for granted that these concepts were far more than “social constructions” reflecting the “cultural hegemony” of the cultural elite. The campus was awash in moral claims.

In retrospect, this is the feature of the secular university that I find most striking: On the one hand, the university rests on a theoretical foundation that denies the very possibility of objective moral truth. On the other hand, it promotes an academic culture characterized by pervasive, passionate moralizing. Put the two together and you get the contradiction at the heart of the secular academy: Deny the possibility of moral Truth while crusading for moral truths.

The stereotypical embodiment of this contradiction is the  self-described relativist who denies that there is any transcendent meaning or purpose to human existence, and yet expresses great hope for the future of humanity and feels passionately about his own non-negotiable set of ethical values. Michael Novak has called this oxymoronic outlook “nihilism with a happy face.” It flourishes in the secular university.

The contradiction underlying “nihilism with a happy face” is glaring, but it’s only troubling if you hold to the quaint belief that your worldview should be internally consistent. But I found that, for all its exaltation of reason, when it comes to worldviews, the secular university is not that big on logical consistency. That, at least, was my experience at UW. While I regularly encountered students with strong moral convictions, I encountered few who felt obliged to reconcile their moral commitments with a companion set of beliefs about the origin, nature, and meaning of the universe. In other words, almost none of the students that I got to know thought it essential to develop a comprehensive and logically consistent philosophy of life.  It was not so much that they were opposed to the idea; they had never given it any thought.  Nor were they much challenged to do so during their time at the university, sadly, for the university had given up on that project long ago.

It was pretty much the same with the faculty and graduate students whom I engaged in “meaning-of-life” conversations.  Repeatedly I encountered scholars who condemned religion as irrational but were more than willing to jettison reason in order to cling to their own secular philosophies.  When I gently accused one of my graduate students of inconsistency, she left my office mildly troubled and then returned a few days later to say that she had concluded that I was right and that she was quite willing to live with a measure of irrationality. When I confronted a colleague (a senior professor) about an irrational inconsistency in his worldview, he forcefully objected at first and then—unconvinced by his own argument—shrugged and observed that “perhaps it isn’t all that important to be rational.” Another colleague, a brilliant scholar and religious skeptic, ended our conversation by declaring, “Logical consistency is not my god.”

In “The Great Accreditation Farce,” Peter Conn insists that the faculty at Christian colleges like Wheaton necessarily abandon “the primacy of reason.”  I haven’t encountered that yet, but thanks to my time in the secular university, I think I’ll be able to recognize it when I see it.

FROM MY COMMONPLACE BOOK: STEVEN GARBER ON VOCATION

I hope to be back in touch soon with part two of my thoughts on the world view of the secular university, but I’m going to interrupt that thread temporarily to share a few more quotes from my commonplace book. A commonplace book, you will recall, was essentially a quote journal or intellectual diary that students were often required to keep in the seventeenth and eighteenth centuries. I use mine to write down passages that I want to revisit regularly, quotes that challenge how I think through my calling as a Christian, historian, and teacher.

I recently finished a marvelous new book from InterVarsity Press that is a treasure trove of such passages. The book is Visions of Vocation: Common Grace for the Common Good, by Steven Garber. I heartily recommend it, whatever your age, occupation, or outlook on life. Garber heads up the Washington Institute for Faith, Vocation, and Culture in Washington, D.C. He writes from an explicitly Christian foundation, but graciously, winsomely, and non-dogmatically, and I would not hesitate to give this book to anyone wrestling with questions about the purpose and meaning of life.


The book hinges on one simple, haunting question: “What will you do with what you know?” Garber has spent much of his life studying the history and philosophy of science and thinking critically about the world view of the Enlightenment. For all of the advances in knowledge that the Enlightenment furthered, it erred—tragically—in promoting the belief that mind and heart could be separated. Knowledge always comes with moral responsibility, Garber insists. This is one of the key truths embedded in the account of the tree of the knowledge of good and evil in Genesis chapters 2-3. The questions “What do you know?” and “What will you do with what you know?” can never be divorced, as much as we might like to pretend otherwise.

From this initial premise, Garber observes that the hardest thing we are called to do in life is to know and still love. Modern educators often naively suggest otherwise. Indeed, one of the underlying premises of the modern multicultural education movement is the idea that increasing students’ understanding of diverse perspectives naturally increases their appreciation of diverse peoples. That may sometimes be the case, but I think that Nietzsche was right when he observed (I’m paraphrasing), “If I understood you better I might hate you more.” Knowledge makes us morally responsible, not morally right. How we respond to knowledge is the key to the latter.

Garber maintains that the more intimately we know the world the harder it becomes to love. Knowing and persevering in love is rare. To know those around us truly is to know the brokenness of the world and to share in its pain. To ease our pain, our natural response is to build a wall around our hearts made of stoicism or cynicism. The stoic trains her heart not to care about the world; the cynic convinces himself that all efforts to help are naïve or futile.

Visions of Vocation is filled with stories of men and women who have refused to give in to stoicism or cynicism. Garber describes his teaching philosophy as “come-and-see” pedagogy. “We learn the most important things over the shoulder, through the heart,” he writes, and so he doesn’t waste much time on abstract assertions. Because “words always have to be made flesh if we are going to understand them,” he spends most of his time introducing us to people he has walked with, individuals who have become “hints of hope” to a hurting world by choosing to know and still love.

Two convictions distinguish these men and women, Garber finds. First, they refuse to accept the delusion of individual autonomy that shapes the modern western world. They realize that “none of us are islands. . . . We are we, human beings together. Born into family histories, growing up into social histories, we live our lives among others, locally and globally, neighbors very near and neighbors very far.” Second, in acknowledging this relationship, they have accepted also that they are obligated to others and implicated in their suffering. In sum, in acknowledging relationship they have accepted responsibility, and after accepting responsibility they have chosen to take action.

Here are two final, more extended quotes from the book to whet your appetite. The first is a word of warning:

These are the truest truths in the universe: We do not flourish as human beings when we know no one and no one knows us; we do not flourish as human beings when we belong to no place and no place cares about us. When we have no sense of relationship to people or place, we have no responsibility to people or place. Perhaps the saddest face of the modern world is its anonymity.

The second is an exhortation to daily faithfulness and perseverance:

. . . But that is what matters most in life, for all of us. The long obedience in the same direction. Keeping at it. Finding honest happiness in living within the contours of our choices. To wake up another morning, beautifully bright as a summer day spreads its warmth across the grass, or awfully cold as winter blows its way over the high prairie, and stepping into the world again, taking up the work that is ours, with gladness and singleness of heart . . .

THOUGHTS ON THE SECULAR UNIVERSITY—PT. I

I responded over the weekend to a recent opinion piece in The Chronicle of Higher Education ridiculing the idea that any religious college could truly qualify as a legitimate institution of higher learning. (See “The Great Accreditation Farce,” by Peter Conn.) Offering a series of pronouncements rather than a chain of reasoning, the author, a professor of English and Education at the University of Pennsylvania, insists that to grant accreditation to schools like Wheaton College makes a mockery of the academic ideal of “unfettered inquiry.” Wheaton’s provost, Dr. Stan Jones, responded with a thoughtful essay that contests Conn’s twin assumptions, each equally naïve: the first, that scholars with religious convictions necessarily embrace irrationality and abandon reason; the second, that the secular university is devoid of dogma of its own. In my response I chose to offer a personal testimony of sorts, comparing my experiences on the faculty of the University of Washington, where I taught for over two decades, and at Wheaton College, where I have served since 2010. Although the UW has many strengths and numerous committed faculty, I have nevertheless felt much greater academic freedom since coming to Wheaton.

This was all that I originally intended to share, but a question from a reader has changed my mind. In a thoughtful comment to my original post, Daniel Davis asks whether, in my opinion, secular professors like Conn are aware of the holes or contradictions of their own worldviews. After some hesitation, I decided to offer a ridiculously broad reply to Davis’s focused question. In this post and the next one, I’d like to share my sense of the world view of today’s secular university. You should file this under the category of my thinking out loud with you about a question that’s way beyond my pay grade. I’d love to hear your perspectives, which may well differ from mine. As G. K. Chesterton warned, “Thinking in isolation and with pride ends in being an idiot.”

But first, two caveats:

To start with, I need to stress that I’m not an expert on the philosophy of higher education. I can only offer my individual perspective as someone who taught at a fairly typical research university for twenty-two years. This does not make me an authority on the subject (although it does give me twenty-two years’ more experience at a secular institution than Peter Conn has at any of the Christian institutions he sweepingly condemns).

Beyond that, it is imperative that I reiterate my appreciation for the many positive aspects of the secular institution where I taught, i.e., the University of Washington. Although much of what follows will be critical, I do not mean to single out the UW as having more problems than most universities. Nor do I mean to cast aspersions on the faculty there. Although I differed dramatically in worldview with almost all of my colleagues, I was nonetheless surrounded by men and women who pursued their vocations—as they understood them—with dedication and integrity.

So what kind of worldviews did I encounter there? The answer is, “It depends.” It’s important to address the question at two levels. Institutions take on lives of their own, and the philosophies that guide overall decision making don’t always bear much resemblance to the personal values motivating the individuals involved in them. Most of my colleagues at UW were at least relatively idealistic. They loved their subjects. They were passionate about teaching, or research, or both. They genuinely wanted to make the world a better place. And they were willing to make personal sacrifices to be a part of such a work. Generalizing broadly, almost everyone I met at UW could have pulled down a much higher salary by opting for a career outside the academy. As a rule, they had compiled impeccable undergraduate records, thrived in top-notch graduate programs at elite universities, and earned their jobs at UW by beating out hundreds of other applicants. In sum, they had the intellectual tools to earn handsome livings, but they freely chose the much more modest compensation that the academy typically offers.

It’s crucial to stress this because, when it comes to the institutional philosophy that guides UW and schools like it, much of this idealism vanishes. At the institutional level, these schools aren’t driven by an irreligious or specifically anti-Christian ideology, as conservative Christians often claim. Indeed, they’re not very ideological at all. They’re pragmatic. At the institutional level, the values of the university are pretty much the values of the marketplace. Universities are enormous economic concerns (the UW is the third largest employer in the state of Washington), and they are shaped first and foremost by economic forces. This may be the most important thing to know about higher education over the past half century.

As Mark Edmundson explains in his wonderful book Why Teach?, colleges and universities expanded dramatically during the fat years of the GI Bill and the baby boom. The baby boom had ended by the mid-1960s, however, and the rate of growth of the potential college population was slowing dramatically by the mid-to-late 1980s. Compounding this demographic problem was a political one. Just as supply-and-demand forces began to turn against higher education, state legislatures began to respond to straitened economic circumstances by slashing their appropriations to state universities. In 1975, state and local government appropriations accounted for 60% of total expenditures on higher education. By 2010 that proportion had fallen to 34%.

If anything, the trend has been more dismal at UW. When I joined the faculty in the late 1980s, state appropriations accounted for about 80% of the instructional budget, with tuition payments making up the balance. In the coming academic year those proportions will be almost exactly reversed. To call schools like the University of Washington or the University of Michigan or the University of Arizona “public” schools is more than a bit misleading. They are now, for all practical purposes, private institutions.

This demographic and political one-two punch has forced public colleges and universities to respond to market forces more than ever before. Many observers will think this is a good thing, and it’s possible that it has been—in some respects. Perhaps there is greater “efficiency”; maybe there is less fat in the budget than before (although trimming the fat typically involves cutting faculty rather than administrators).

But colleges and universities cannot stay afloat solely by cutting costs. To survive, they must do two other things: they have to be more competitive in attracting students, and they have to be more successful in attracting other sources of revenue beyond tuition payments and state allocations. The latter means courting wealthy donors and eliciting grants from the federal government and from large corporations. Both trends contribute to a growing “customer-is-always-right” mentality. I suspect that this has a lot to do with the grade inflation that is rampant in higher education, the widespread acceptance of A.P. courses for college credit (despite dubious evidence that they are comparable to college courses), and the reduction in required courses, which increasingly allows eighteen-year-olds to define their own programs of study.

When it comes to research, universities are more and more dependent on outside grants. This is much less the case in the humanities, where the cost of research is typically minimal, but it is the norm in the hard sciences, where the costs of equipping a laboratory can be enormous. In 2009, grants from federal agencies (most notably the Departments of Defense, Energy, and Agriculture; NASA; the National Institutes of Health; and the National Science Foundation) provided 59 cents of every dollar spent on university research in the fields of science and engineering. The total amount was just under $33 billion. (The University of Washington regularly leads all public universities in federal research dollars; in 2012, UW faculty received over 5,000 grants totaling nearly $1.5 billion.) Grants from private corporations are much smaller but growing. According to the National Science Foundation, in 2012 private corporations invested more than $3 billion in academic research.

None of this means that the objectivity of the research itself is compromised, although many have made that charge. What is undeniable is that the lion’s share of the research conducted at public universities is research that outside sources with deep pockets are willing to pay for. Outside funding may not determine the answers researchers arrive at, but it surely helps to determine the questions that get asked. You would never know that from Peter Conn’s characterization of the secular academy, however. In the secular university of Conn’s imagination, “unfettered inquiry is the hallmark” of research. In contrast to religious institutions, where blind submission to dogma is the order of the day, in the secular university scholars are committed only to the courageous pursuit of truth without respect to other considerations of any kind.

Call me skeptical.

Next time we’ll shift our focus from the institutional philosophy of the research university to the individual philosophies of its faculty. Thanks for reading.

SHOULD RELIGIOUS COLLEGES BE DENIED ACCREDITATION?

Hello!  I hope everyone had a wonderful Fourth of July yesterday.

I am going to interrupt my current focus on faith and the American founding for just a moment, as I can’t help addressing one current news item that’s generated considerable buzz in these parts.  The school where I teach, Wheaton College, got mentioned quite prominently in the most recent issue of The Chronicle of Higher Education.  If you’re not familiar with this publication, the Chronicle is sort of like the New York Times of the academic world.  Its target audience is primarily educators and administrators and public policy types, in addition to highbrow readers who want to follow the latest trends in higher education.  For a small school like Wheaton, getting a shout-out from the Chronicle is big.

But big isn’t necessarily good, and the attention Wheaton received is a case in point.   As it turned out, the college was exhibit A for the prosecution in a lengthy, inflammatory opinion piece titled “The Great Accreditation Farce.”  The author is Dr. Peter Conn, a professor of English and education at the University of Pennsylvania.  Conn is apparently an expert on accreditation because he participated in two accreditation reviews a decade or more ago.  In 2003 he helped his own institution prepare for an accreditation review, and the following year he was invited to be on the external accreditation committee charged with evaluating Johns Hopkins.

Conn’s title got my attention, as I confess I am more than a little skeptical of the accreditation process myself.  Any school interested in pursuing excellence should both seek and welcome outside feedback on a regular basis.  But because federal financial aid is denied to students attending non-accredited institutions, the accreditation process in its current form is, among other things, a Trojan horse for increased federal control of state and private institutions.  Considering all forms of aid, the U.S. government distributed nearly a quarter of a trillion dollars to college and university students during the 2012-13 academic year.  In today’s academic marketplace, rare is the college or university that can survive if its students are barred from federal aid.  This gives enormous leverage to the federal government, and I think it has more than enough of that already.

Conn has a very different set of concerns, however, as I soon learned.  The great “farce” at the heart of the accreditation process consists of the ridiculous practice of accrediting religious institutions.  “By awarding accreditation to religious colleges,” the author writes, “the process confers legitimacy on institutions that systematically undermine the most fundamental purposes of higher education.”  This is because “skeptical and unfettered inquiry is the hallmark of American teaching and research.”  It is “obvious” to Conn that “such inquiry cannot flourish” inside a religious institution.

Conn reserves his greatest scorn for Christian colleges–like Wheaton, which he singles out specifically–that require their faculty to sign statements of religious faith.  Such “intellectually compromised” schools are institutions for brain-washing the faithful, not pursuing the life of the mind.  “At Wheaton,” Conn explains, “the primacy of reason has been abandoned by the deliberate and repeated choices of both its administration and its faculty.”

Lest the reader misconstrue, Conn makes clear–with calculated condescension– that he has “no particular objection to like-minded adherents of one or another religion banding together, calling their association a college, and charging students for the privilege of having their religious beliefs affirmed.”  He does have “a profound objection,” however, “to legitimizing such an association through accreditation.”  Call it whatever you like, in other words, just don’t pretend that such a place is characterized by academic freedom and intellectual integrity.  In short, Conn concludes, “Providing accreditation to colleges like Wheaton makes a mockery of whatever academic and intellectual standards the process of accreditation is supposed to uphold.”

Wheaton’s provost, Dr. Stan Jones, has already published an eloquent reply to Conn’s polemic.  Consonant with his character, Jones’ comments are judicious, balanced, and unfailingly gracious, and I could not begin to improve on them, but I do want to share a personal testimony of sorts.  In his response, Jones notes that when Wheaton College “hires colleagues away from nonreligious institutions, we often hear they feel intellectually and academically free here for the first time in their professional careers.”  This was precisely my experience.

My professional life has been framed by two very different institutions.  For the first twenty-two years of my academic career, I taught at the University of Washington in Seattle.  In many ways, my time there was a blessing.  The UW is an elite academic institution with an extraordinary faculty and world-class resources.  During my time there it boasted five Nobel Prize winners, one of the largest libraries in North America, and was ranked by the Economist as one of the top twenty public universities in the world.

I also made several good friends at UW and benefited from a number of genuinely kind colleagues who took sincere interest in my well-being, both personal and professional.  Finally, I should acknowledge that I flourished there professionally–in certain respects.  I was awarded tenure, rose in rank from assistant to associate to full professor, won the university’s distinguished teaching award, and was accorded a prestigious endowed chair in U.S. history.

And yet while I was experiencing a certain measure of professional success, my soul was always deeply divided.  I can best describe the alienation I felt by quoting from Harry Blamires, one of the last students of C. S. Lewis.  In his book The Christian Mind, Blamires wrote hauntingly of “the loneliness of the thinking Christian.”  Though he was describing his own experience as a Christian in the secular academy, Blamires might as well have been describing my life at UW: it was akin to being “caught up, entangled, in the lumbering day-to-day operations of a machinery working in many respects in the service of ends that I rejected.”

That is eventually how I came to think of my time at UW.  For all of its discrete strengths, the university is less than the sum of its parts.  Like the secular academy overall, it is “hollow at its core,” to borrow the words of historian George Marsden.  There is no common foundation, no cohering vision, no basis for meaningful unity.  After twenty-two years of faculty meetings, I can attest to the truth that the faculty functioned best as a group when we avoided the question of why we existed as a group.  As long as we could each do our own thing we were fine.

When it came to matters of faith, the university’s unwritten policy was a variation of “don’t ask, don’t tell.”  It celebrated racial and ethnic diversity relentlessly but was never all that enthusiastic about a genuine diversity of worldviews, at least among the faculty and in the curriculum.  If you espoused a vague “spirituality” that made no demands on anyone–or better yet, seemed to reinforce the standard liberal positions of the political Left–all well and good.  Otherwise, it was best to remember that religious belief was a private matter that was irrelevant to our teaching and our scholarship.

For twenty-two years I accommodated my sense of calling to this secular dogma, bracketing my faith and limiting explicit Christian expressions and Christian reflections to private conversations with students who sought me out.   In his book Let Your Life Speak: Listening to the Voice of Vocation, Parker Palmer writes movingly about the costs of such segmentation.  Vocation is a calling to a way of life more than to a sphere of life.  “Divided no more!” is Palmer’s rallying cry.

If I were to characterize my experience since coming to Wheaton four years ago, these are the words that first come to mind–divided no more.  Wheaton is not a perfect place, nor did I expect it to be one when I came here.  But I can honestly say that I have experienced much greater academic freedom at Wheaton than I ever did at the secular university that I left.  Conn’s assertion that, in leaving UW for Wheaton,  I have necessarily abandoned reason for dogma also mystifies me.  That he assumes such a trade-off suggests that Dr. Conn is not entirely free of dogma himself.  I could tell Conn about the intellectual excitement that abounds at Wheaton, about the brilliant colleagues I am privileged to work with (trained at places like Harvard and Yale and Duke and UNC), and about the extraordinarily gifted and motivated students that fill my classes, but I doubt that such a reasoned argument would sway him.  Reason is rarely helpful in changing an opinion not grounded in reason to begin with.

A final comment, this one about the relationship between academic freedom and academic community.  In addition to finding greater academic freedom at Wheaton, I have also encountered a true intellectual community here, one that the sprawling postmodern multiversity cannot be expected to equal.  Countless times I have reflected on the words of the German minister Dietrich Bonhoeffer, who observed in his 1938 classic Life Together, “it is not simply to be taken for granted that the Christian has the privilege of living [and, I would add, of laboring] among other Christians.”   When we have that privilege, Bonhoeffer went on to observe, we should fall to our knees and thank God for his goodness, for “it is grace, nothing but grace, that we are allowed to live in community with Christian brethren.”

Bonhoeffer knew firsthand about the hostility of an aggressively secular ideology.  He penned the words above while teaching at a clandestine seminary watched closely by the Gestapo; they were translated into English a decade after his execution in a Nazi prison camp for his opposition to Adolf Hitler.  Peter Conn does not want to drive all religious colleges underground.  He just wants to declare them to be, by definition, academically illegitimate.  In my experience, the most zealous champions of “academic freedom” are often selective in applying it.  That’s certainly the case with Dr. Conn.

PREACHING LIBERTY TO THE COLONISTS

Earlier today I posted a link to my review for Christianity Today of a polemical book by philosopher Matthew Stewart that makes the untenable claim that the American Revolution was, at its most fundamental, a revolution against the tyranny of revealed religion.  Christian readers interested in the role of religion in the American founding will learn much more from a book that I reviewed for CT this time a year ago, Sacred Scripture, Sacred War, by James Byrd.  I  re-post below my review of Byrd’s fine work for those who may have missed it.


James P. Byrd, Sacred Scripture, Sacred War: The Bible and the American Revolution (New York: Oxford University Press, 2013).

The history of the American Revolution is, above all, a story about national beginnings, and stories about beginnings are stories that explain. How we understand our origins informs our sense of identity as a people. We look to the past not only to understand who we are but also to justify who we wish to become. And so, as a nation divided over the proper place of religious belief in the contemporary public square, we naturally debate the place of religious belief in the American founding.

Outside of the academy, much of that debate has focused on a simplistic, yes-or-no question: did religious belief play an important role in the American founding? This makes sense if the primary motive is to score points in the culture wars, mining the past for ammunition to use against secularists who deny that the United States was founded as a Christian country. There’s a problem with the history-as-ammunition approach, however. It’s good for bludgeoning opponents with, but it positively discourages sustained moral reflection, the kind of conversation with the past that can penetrate the heart and even change who we are.

In contrast, books like Sacred Scripture, Sacred War have the potential to challenge us deeply. Granted, author James Byrd inadvertently offers ammunition to readers cherry-picking evidence for a Christian founding. He matter-of-factly contends that sermons were more influential than political pamphlets in building popular support for independence, and he insists unequivocally that “preachers were the staunchest defenders of the cause of America.” And yet the question that really interests him is not whether religion played an important role in the American founding but how it did so. More specifically, he wants to understand how colonists used the Bible in responding to the American Revolution.

Toward that end, Byrd went in search of original colonial sources that addressed the topic of war while appealing to scripture. He ultimately identified 543 colonial writings (the vast majority of which were published sermons) and systematically analyzed the more than 17,000 biblical citations that they contained. The result is by far the most comprehensive analysis ever undertaken of “how revolutionary Americans defended their patriotic convictions through scripture, which texts they cited and how they used them.”

Byrd relates his conclusions in five thematic chapters, each of which highlights a common scriptural argument in support of the Revolution. Americans found in the scripture “a vast assemblage of war stories” relevant to their own struggle with England. From the Old Testament, ministers drew inspiration especially from the story of the Israelites’ exodus from Egypt (Exodus 14-15), from the Song of Deborah in Judges 5, and from the example of David, the man of war who was also the “man after God’s own heart.” Ministers read each of these stories analogically and drew lessons from them. The Israelites’ enslavement in Egypt resembled the colonists’ own bondage to British tyranny; ditto for the Israelites’ subjection centuries later to Jabin, king of Canaan. The contest between David and Goliath, in like manner, foreshadowed the colonists’ righteous struggle with a powerful but arrogant British empire. (That David went on to become a king was a fact that need not be emphasized.)

To the patriotic ministers who declared them from the pulpit, the lessons embedded in these stories were indisputable. God championed the cause of independence. A warrior who liberated his people by means of war, the Lord clearly sanctioned violence in the pursuit of freedom. Furthermore, he would intervene on their behalf, and with God on their side, the ill-trained and poorly equipped patriots would be victorious. This meant that loyalism was rebellion against God, and pacifism was “sinful cowardice.” Had not the angel of the Lord cursed the people of Meroz because they did not come “to the help of the Lord against the mighty” (Judges 5:23)? Had not the prophet Jeremiah thundered, “Cursed be he that keepeth back his sword from blood” (Jer. 48:10)?

If the biblical argument in support of the Revolution was to succeed, of course, patriot ministers knew that they must buttress these arguments with support from the New Testament. This was no simple task, inasmuch as the apostles Peter and Paul both seemed to condemn rebellion and teach submission to rulers as a Christian’s duty. Paul enjoined the church at Rome to “be subject to the governing authorities” (Romans 13:1); Peter commanded Christians to “honor the king” (I Peter 2:17b). Neither admonition seemed to leave much room for righteous resistance to civil authority.

Advocates of independence countered, however, that these passages only commanded obedience to rulers who were ministers of God “for good,” and since liberty was self-evidently good, the apostles could not possibly be calling for submission to tyrants. They reassured their flocks, furthermore, by repeatedly citing one of the few unambiguous endorsements of liberty in the New Testament. “Stand fast,” Paul had counseled the churches of Galatia, “in the liberty wherewith Christ hath made us free” (Gal. 5:1). The liberty Paul had in mind was civil as well as religious, ministers insisted, which meant that the refusal to “stand fast” with the patriot cause was nothing less than “a sin against the express command of God.”

Three overarching patterns emerge from Byrd’s study that should trouble Christian readers. First, the influence of political ideology and historical circumstance in shaping the colonists’ interpretation of scripture is striking. Traced to its roots, the colonists’ conviction that civil liberty is a God-given right owed more to the Enlightenment than to orthodox Christian teaching, and yet the belief strongly informed how colonists understood the Word of God. Reading the scripture through the lens of republican ideology, they discovered “a patriotic Bible” perfect for promoting “patriotic zeal.”

Second, the readiness with which Christian advocates of independence sanctified violence is disturbing. “Colonial preachers did not shy away from biblical violence,” Byrd finds. “They embraced it, almost celebrated it, even in its most graphic forms.”

Third, and most ominously, the evidence suggests that patriotic ministers’ portrayal of the military conflict with Britain morphed rapidly from that of a mere “just war”—a war undertaken for a morally defensible cause and fought according to moral criteria—into that of a “sacred” or “holy war”—a struggle “executed with divine vengeance upon the minions of Satan.” Patriotism and Christianity had become inseparable, almost indistinguishable.

Byrd writes with restraint and offers little commentary on his findings, but the implications for American Christians are sobering and the stakes are high. As Byrd acknowledges in his conclusion, over time the United States has come “to define itself and its destiny largely through the justice and sacredness of its wars.” American Christians have played a major role in that process of national self-definition, all too regularly sanctifying the nation’s military conflicts as sacred struggles.

Historian Mark Noll has lamented that by the time of the American Revolution “the thought and activity of the American churches tended to follow the thought and activity of the American nation,” not the other way around. With painstaking thoroughness, James Byrd reaffirms that conclusion, showing that the pattern even defined how revolutionary-era Christians read their Bibles and thought about war.

ONE NATION WITHOUT GOD??

I have a review just now posted at Christianity Today online of the latest volley in the debate over the religious dimensions of the American founding.  The book in question is Nature’s God: The Heretical Origins of the American Republic, by Matthew Stewart.  I will add some thoughts here sometime in the next few days, but I encourage you to check out my review at CT.  You can read it here.  As a work of history, the book is deeply flawed, but Stewart is a good writer, and his interpretation is one that many secularists want badly to believe, so look for the book to receive a considerable amount of positive attention.

JEFFERSON’S FAITH

Were our Founding Fathers devout Christians determined to create a Christian commonwealth grounded on biblical principles?  Or were they secular sons of the Enlightenment who hoped to banish orthodox Christianity from the public square?  This Fourth of July, combatants on both sides of the culture wars will gravitate to one or the other of these extremes as they remember our nation’s birth.  It’s a horrible dichotomy that demands that we choose between two equally untenable positions.

A more defensible position rejects both of these all-or-nothing claims.  As Matthew L. Harris and Thomas S. Kidd observe in their anthology The Founding Fathers and the Debate Over Religion in America, “None of the Founders were atheists . . . but none of the most famous Founders were ‘evangelical’ Christians of the sort produced by the Great Awakening, either.”  Many of the Founders were significantly influenced by the Enlightenment, most notably in their frequent willingness to let reason trump revelation when they seemed to be in conflict.  On the other hand, as Harris and Kidd note, “hardly anyone during the revolutionary era doubted that religion, and especially moral virtue, was important to the life of the new American republic.”  Citing such complexity, they conclude that any broad generalization of the Founders as either “secular” or “Christian” is problematic at best.

Thomas Jefferson was not necessarily a representative Founder in his religious views, but he did embody the complexity that Harris and Kidd point out.  Since in two days we’ll be celebrating the anniversary of his handiwork–the Declaration of Independence–it makes sense to revisit a few samples of his thinking.

First, Jefferson was no atheist.  In fact, he regularly made an argument for God that today we would call an appeal to “intelligent design.”  Here is how Jefferson put it in an 1823 letter to John Adams:

“When we take a view of the Universe, in its parts general or particular, it is impossible for the human mind not to perceive and feel a conviction of design, consummate skill, and indefinite power in every atom of its composition. . . . So irresistible are these evidences of an intelligent and powerful Agent that, of the infinite numbers of men who have existed thro’ all time, they have believed, in the proportion of a million at least to Unit, in the hypothesis of an eternal pre-existence of a creator, rather than in that of a self-existent Universe.”

Jefferson also welcomed the contribution that religious belief might make in promoting virtue among the American people.  Jefferson, like almost all of the Founders, took for granted that a free society could not survive without virtue, and that virtue was unlikely to thrive in the absence of religious conviction.  Or as Jefferson expressed the point in his book Notes on the State of Virginia:

“Can the liberties of a nation be thought secure when we have removed their only firm basis, a conviction in the minds of the people that these liberties are the gift of God?”

Thomas Jefferson sat for this portrait by Charles Willson Peale in 1791.

Jefferson praised the civic utility of religion publicly in his first inaugural address in 1801.  In a lengthy paragraph listing the country’s peculiar “blessings,” the new president described the American people as

“enlightened by a benign religion, professed, indeed, and practiced in various forms, yet all of them inculcating honesty, truth, temperance, gratitude, and the love of man.”

He went on to observe that his fellow countrymen “acknowledg[ed] and ador[ed] an overruling Providence, which by all its dispensations proves that it delights in the happiness of man here and his greater happiness hereafter.”

And yet there was another side to Jefferson’s perspective on religion.  While he admired a “rational” religion that promoted good works and civic virtue, he was contemptuous of much of orthodox Christianity as just so much superstition.  In private correspondence, he referred to evangelical religion with a sneer, as in this 1822 letter to Thomas Cooper, a Unitarian professor whom Jefferson was trying to lure to the newly founded University of Virginia:

“In our Richmond there is much fanaticism, but chiefly among the women: they have their night meetings, and praying-parties, where attended by their priests, and sometimes a hen-pecked husband, they pour forth the effusions of their love to Jesus in terms as amatory and carnal as their modesty would permit them to use to a more earthly lover.”

Jefferson’s skepticism of the Bible is also well established, notwithstanding David Barton’s tortured efforts to prove otherwise.  In The Jefferson Lies, Barton insisted that Jefferson wholly accepted the gospels while suspecting the reliability of Paul’s epistles, but in reality Jefferson believed that much of the gospels was invention.  As he summarized in an 1820 letter to William Short,

“We find in the writings of his [Jesus’] biographers matter of two distinct descriptions. first a ground work of vulgar ignorance, of things impossible, of superstitions, fanaticisms, & fabrications. intermixed with these again are sublime ideas of the supreme being, aphorisms and precepts of the purest morality & benevolence, sanctioned by a life of humility, innocence, and simplicity of manners, neglect of riches, absence of worldly ambition & honors, with an eloquence and persuasiveness which have not been surpassed.”

Jefferson could easily distinguish between these two categories by subjecting them to the test of reason.  “Your reason is the only oracle given you by heaven” for discerning truth, Jefferson famously counseled his teenaged nephew in 1787.  Much of the gospels was unreasonable (the virgin birth, miracles, and the resurrection, for example), so these accounts had to be discarded.  Perhaps the greatest irrationality of all, however, was the concept of the Trinity.  As he wrote to James Smith:

“[The] paradox that one is three, and three but one is so incomprehensible to the human mind that no candid man can say he has any idea of it, and how can he believe what presents no idea? He who thinks he does, deceives himself. He proves also that man, once surrendering his reason, has no remaining guard against absurdities the most monstrous, and like a ship without rudder is the sport of every wind. With such persons gullibility, which they call faith, takes the helm from the hand of reason and the mind becomes a wreck.”

In sum, the primary author of the Declaration of Independence was no atheist, nor was he committed to a wholly secular public sphere, but neither did he believe that Jesus was the Christ.  So where does this leave us?  Somewhere, I think, between comfortable but false extremes.