

              Think of your forefathers!  Think of your posterity!                      —John Quincy Adams—


As if things weren’t bad enough in this interminable election cycle, the recent death of Supreme Court justice Antonin Scalia has added to the partisan posturing plaguing the nation.  With righteous indignation, Senate Republican leaders avow never to consider a replacement until after the coming election on the grounds that, to quote majority leader Mitch McConnell, “the American people should have a say in the Court’s direction.”  With righteous indignation, White House spokesmen present President Obama as a strict constitutionalist who is bound by his sworn oath to the law of the land to act immediately to fill the vacancy.

Let me be honest: I’m skeptical of both claims.  They strike me as equally self-interested and disingenuous.  But I’ve never wanted to make this blog overtly political, and the point of this series on the 2016 election is not to take sides but to put it in historical perspective.  Americans in 2016 are engaged in an ongoing debate about the role of government in a free society that began long before any of us arrived on the scene.  Why wouldn’t we want to learn from that conversation?  With John Quincy Adams, I think that listening to our ancestors is one of the best things we can do to serve our posterity.

Let’s start with Democratic appeals to the Constitution arguing that President Obama was obligated to nominate a successor immediately.  It’s worth pointing out that the Constitution says almost nothing about the nomination process.  The Framers devoted a total of twenty words to the subject, and nearly half of them are and, of, by, or the.  Buried in a portion of a single sentence in Art. II, sect. 2, we read that the president “shall nominate, and by and with the advice and consent of the Senate, shall appoint . . . judges of the Supreme Court.”  Not a lot to go on there.

Imagine for a moment the following scenario: President Obama holds a press conference and announces, “Given that my party has lost control of both houses of Congress, and that Constitutionally any nomination that I might make requires the Senate’s approval, it makes sense to forgo a nomination at this time and allow my successor to act.”  Would that be unconstitutional?  I don’t think so.  It might be politically irresponsible—not to mention utterly inconceivable—but it would be hard to twist it into a palpable violation of the Constitution, even if it meant postponing Scalia’s replacement an entire year.  The Constitution only requires that the legislative branch meet once a year, for goodness’ sake, and through the end of the nineteenth century Congress was in adjournment for nine months in every odd-numbered year, so the possibility that a seat could remain vacant for an extended period was ever present.

But my objection runs deeper than this.  Democratic appeals to the Constitution on the issue ring hollow because most Democrats long ago embraced a role for the Supreme Court that the Framers of the Constitution could scarcely have imagined.  In their essays promoting ratification, both James Madison and Alexander Hamilton insisted that the proposed Supreme Court would have the right and the responsibility to rule on the constitutionality of federal and state laws.  At the same time, however, they went out of their way to assure critics that the Court’s powers, though important, were limited.

Alexander Hamilton

Of the eighty-five essays in the Federalist Papers, only six focus on the judiciary (nos. 78-83), all of them written by Hamilton.  The New Yorker wrote to refute Anti-Federalist charges that the Framers at Philadelphia had created a monster that would run roughshod over the prerogatives of the states and the liberties of the people.  In Federalist no. 78, for example, Hamilton led with the reminder that “the judiciary is beyond comparison the weakest of the three departments of power.”  Hear how he explains “the natural feebleness of the judiciary”:

 . . . the judiciary, from the nature of its functions, will always be the least dangerous to the political rights of the Constitution; because it will be least in a capacity to annoy or injure them.  The executive not only dispenses the honors but holds the sword of the community.  The legislature not only commands the purse but prescribes the rules by which the duties and rights of every citizen are to be regulated.  The judiciary, on the contrary, has no influence over either the sword or the purse; no direction either of the strength or of the wealth of the society, and can take no active resolution whatever.  It may truly be said to have neither FORCE nor WILL but merely judgment.

In Federalist no. 81 Hamilton again sought to reassure his readers.  He acknowledged the popular fear that the Court would abuse its prerogatives to usurp the power of the legislature.  According to critics, the justices would be free to evaluate laws according to “the spirit of the Constitution,” rather than the strict letter of the document.  This in turn would enable the Court to mold the laws “into whatever shape it may think proper; especially as its decision will not be in any manner subject to the revision or correction of the legislative body.”

This “supposed danger . . . is in reality a phantom,” Hamilton insisted.  And why was this?  Because “there is not a syllable in the plan under consideration which directly empowers the national courts to construe the laws according to the spirit of the Constitution.”

Bottom line: In 1787 the Framers of the Constitution believed that the Supreme Court would have “neither force nor will” and that “the general liberty of the people can never be endangered from that quarter.”  Does anyone believe this in 2016?

In my next post I’ll share what the Founders would have thought of the Republican claim that “the people” should influence the make-up of the Court.


(I’m teaching a course this semester on the American Civil War, and so I’m doing my best to immerse myself in that subject, reading works on the conflict as much as time allows. In the review below I share my opinion of a book that I purchased at the recent annual meeting of the American Historical Association. I didn’t like it. I might even detest it. Read on to find out why.)


A Just and Generous Nation: Abraham Lincoln and the Fight for American Opportunity, by Harold Holzer and Norton Garfinkle. New York: Basic Books, 2015.


I’ll start with a compliment. Overall, academic historians long ago abandoned any sense of social responsibility to the larger society. There are admirable exceptions, but for the most part, academic history is an inward-focused conversation that academic historians have with each other about the academic questions they find of academic interest. And if the public beyond the walls of the Academy equates academic with “arcane,” “elitist,” or “irrelevant”—a pretty logical inference—well, that’s the public’s problem, not ours. Our job, after all, is to advance the boundaries of knowledge, not to communicate with the masses.

To their credit, Harold Holzer and Norton Garfinkle have written A Just and Generous Nation with a broad audience in mind (as its publication by a trade press, Basic Books, underscores). The book tries to make the past relevant to the present, and I applaud that. It deals with big questions, and I applaud that also. It’s written in an engaging manner—always a plus—and the authors unabashedly point out lessons they think we should learn, a trait I admire.

In sum, I really like the conception of the historian’s task that underlies A Just and Generous Nation. It’s the authors’ execution of the task that drives me crazy.

The book’s thesis is clear, in part because the authors repeat it monotonously. Until his final breath, Abraham Lincoln was animated by the conviction that the United States had been uniquely founded on the “vision of a just and generous economic society.” The Founders, Lincoln believed, “brought forth a new nation” in which all would have an equal chance to rise into the middle class. This is why he sought to save the Union. This is why he acted to emancipate slaves. Neither was an end in itself. The Civil War, Holzer and Garfinkle contend, was always primarily a struggle over “what kind of economy the nation should have.” More precisely, it was a war for “the American Dream,” for the triumph of a society that gives “all a chance” and allows “the weaker to grow stronger.”

OK. This is a provocative thesis, but not beyond the realm of possibility. Historians have debated Lincoln’s motives for a century and a half. Some have suggested that Lincoln was propelled by an almost mystical veneration of the Union bequeathed by the generation of 1776. Some have pointed to his conviction that slavery was “a moral, social, and political wrong,” a stain on the national fabric. Others have stressed Lincoln’s conviction that the Civil War was the ultimate test of the viability of democracy, a bloody trial to determine whether common people could govern themselves. And some have portrayed the war as a monumental clash of economic systems, a conflict between agricultural and industrial societies for national dominance. The Columbia University historian Charles Beard made that argument nearly a century ago, and there are faint echoes of that claim in A Just and Generous Nation.

But the heart of the authors’ argument isn’t really about Lincoln’s conception of the “American Dream.” It’s about his purported vision for the role of the federal government in promoting it. And Lincoln’s vision, Holzer and Garfinkle insist with undisguised admiration, was breathtakingly expansive and modern. “Lincoln was the first president to use the federal government as an agent to support Americans in their effort to achieve and sustain a middle class life,” they gush. He “never changed his view that government should engage proactively to build, expand, and provide opportunities for working people to improve their economic status.”

This is what the Civil War was about. This is why nearly eight hundred thousand men died and more than a million more were maimed: to secure for future generations an activist role for the federal government as the guarantor of middle-class prosperity.

Almost all historians acknowledge that Lincoln advocated an active role for government in promoting economic development and economic opportunity. Probably the first political speech he ever gave called for state aid to dredge the Sangamon River in order to help local farmers get their crops to market. He quickly embraced the new Whig Party’s commitment to what Whig leader Henry Clay called “the American System.” This included a national bank to facilitate economic exchange, a high protective tariff to promote industrialization, and government aid to “internal improvements”—subsidies for the construction of railroads, canals, and the improvement of waterways—in order to accelerate the development of a market economy. Evaluated in the context of the mid-1800s—when the federal government was minuscule and the only federal employee that most Americans ever met was the mailman—Lincoln was undeniably a champion of active government.

But he wasn’t a modern-day big-government Democrat, although Holzer and Garfinkle do their best to convince us otherwise. They may be right when they claim that “Lincoln’s domestic policies provided the first clear example of the positive role that could be played by the federal government to encourage the economic growth of the nation.” The Republican-controlled Congress passed a series of landmark economic measures during the war. Through the Morrill Act, the Transcontinental Railroad Act, and the Homestead Act, the Congress used the nation’s vast untapped resources of western lands to promote higher education, railroad construction, and farm ownership.

But it’s at least debatable whether these measures all benefited, or at least primarily benefited, the middle class. The Morrill Act helped to create land-grant colleges across the West, but generations would pass before as much as five percent of the white male population would attend. The building of the first transcontinental railroad undoubtedly expanded the national economy and indirectly aided many, but it set a precedent of gargantuan subsidies to private corporations in the process. (Between 1862 and 1871, the federal government granted land subsidies to private railroad companies of nearly two hundred million acres—roughly the size of England, France, and Scotland combined.)

You can even question whether the Homestead Act was all that helpful in aiding upward mobility into the middle class. Although the act provided “free” farms to settlers who would improve the land for five years, economic historians have found that few working-class households had the resources to move west, erect buildings and fences on a homestead, and feed and clothe themselves for months while waiting for the first crops to come in.

If it’s debatable to characterize these measures as unalloyed victories for the middle class, it’s preposterous to describe the enormous military expenditures that the war demanded as “the federal government’s stimulus programs.” Yet Holzer and Garfinkle do so, in keeping with their determination to portray Lincoln as the founding father of twenty-first-century liberalism. To drive home their point, in the second half of the book they trace the decline and rebirth of Lincoln’s progressive vision in the century and a half since his death. As they tell the story, in the late nineteenth century the GOP turned its back on Lincoln’s dream for America and became the party of the one percent. Theodore Roosevelt tried to restore the GOP’s moral center, but it was the Democratic Party that eventually became, in vision if not in name, the true party of Lincoln.

The central agent in this transformation was Franklin Roosevelt. The authors pair FDR with Lincoln as the two most important promoters of the American Dream in U. S. history. Lyndon Johnson was also a worthy heir of Lincoln when he sought to use the federal government to build the “Great Society.”  So was Barack Obama, who during his second term finally began “girding his loins to follow in Lincoln’s footsteps and take new steps to use the power of the presidency to improve the status of middle-class and working-class members of the American community.”

In sum, Lincoln would have been an enthusiastic advocate of social security, welfare, affirmative action, and the Affordable Care Act. Apparently, he also would have looked for economic guidance to Sweden and Denmark, where public spending and tax revenue as a percentage of GDP are double what they are in the U. S. The authors conclude A Just and Generous Nation with an extended tribute to both nations, leaving us to conclude that, while Lincoln’s vision may be withering in the United States, it’s alive and well in Scandinavia.

You can draw your own conclusions about the authors’ policy proposals. There are arguments for and against them, with intelligent and decent people on both sides of the debate. As a historian, however, I have to say that A Just and Generous Nation is bad history, and I don’t say that lightly, given that Holzer is widely recognized as a leading Lincoln scholar. And yet the book is riddled with inaccuracies. I won’t bore you with the details, except to say that the authors misstate or misrepresent the facts concerning the Fugitive Slave Act of 1850, the meaning of Lincoln’s “house divided” metaphor, his vision for slavery’s “ultimate extinction,” the significance of Congressional compromise proposals in 1860-1861, Lincoln’s stance on the Second Confiscation Act of 1862, the implications of the Wade-Davis bill of 1864, the end of the Freedmen’s Bureau, and the relationship between postwar peonage and convict labor. The book, in short, is sloppy.

It is also relentlessly one-sided. The authors regularly ignore evidence that would weaken their argument. (In a masterpiece of understatement, a New York Times review notes that the book “flattens out a story that has some uncomfortable complexities.”) While praising Lincoln’s commitment to the working class, for example, the authors fail to mention that by the 1850s Lincoln was essentially a corporate lawyer who earned the lion’s share of his living representing wealthy commercial concerns: insurance companies, banks, and railroads. His single largest client was the Illinois Central Railroad, the longest railroad in the world at the time and one of the world’s largest corporations. (Lincoln’s law partner, William Herndon, would later joke, “Much as we deprecated the avarice of great corporations, we both thanked the Lord for letting the Illinois Central Railroad fall into our hands.”)

Nor do the authors find occasion to mention Lincoln’s well-documented response when his stepbrother, a subsistence farmer named John D. Johnston, wrote Lincoln in 1848 and asked to borrow $80 to pay off some pressing bills. “What I propose is,” Lincoln wrote Johnston, “that you shall go to work ‘tooth and nails’ for some body who will give you money [for] it.” “Follow my advice,” Lincoln lectured his stepbrother, “and you will find it worth more than eight times eighty dollars to you.”

Most problematic of all is the authors’ reading of Lincoln’s Gettysburg Address, which can most charitably be described as imaginative. After paying tribute to those “who here gave their lives that [the] nation might live,” Lincoln had challenged the assembled throng “to be dedicated here to the unfinished work which they who fought here have thus far so nobly advanced,” to “take increased devotion to that cause for which they gave the last full measure of devotion.” Although his audience didn’t know it (nor did the Union soldiers Lincoln was praising), the “cause” was not the preservation of the Union. It was not the eradication of slavery and a “new birth of freedom.” It was the promotion of the American Dream grounded in activist government. “Looking to the aftermath of the Civil War,” the authors explain, Lincoln “was defining the nation’s ‘unfinished work’ as the new task of providing all citizens a government committed to helping all citizens build a middle-class life.”

A Just and Generous Nation is a textbook example of what I call “history as ammunition,” an approach to the past as a storehouse of illustrations for proving predetermined points. When politically conservative amateur historians appeal to America’s Founders to promote a conservative contemporary agenda, academic historians are quick to protest. Only anti-intellectual populist yahoos—“historical fundamentalists,” to use Harvard historian Jill Lepore’s condescending phrase—would naively do such a thing. But it’s apparently fine for two prominent scholars to ask WWLD?—“What Would Lincoln Do?”—as long as the answer points in the direction that most academics are already headed.

Let me be clear: I’m not frustrated by this book because I disagree with the authors’ liberal politics. Their politics are irrelevant. Long-time readers of this blog will know that I have regularly called to account conservative Christians when they have done something similar. As a historian of the United States, my frustration is with those who distort our past while claiming to honor it. And as a historian of the American Civil War more specifically, I can only say that Holzer and Garfinkle have so contorted that crucial conflict that few of the men who fought in it would recognize it.


One of the reasons to study the past is to see the present more clearly.  By figuratively visiting other times and places, we become more aware of aspects of our place and time that we would otherwise take for granted.  Last night’s State of the Union address is a case in point.


When the framers of the Constitution crafted our blueprint of government in 1787, they stipulated that the president “shall from time to time give to the Congress information of the state of the Union and recommend to their consideration such measures as he shall judge necessary and expedient” (Art. II, sect. 3).  From this requirement the custom evolved that the executive would formally address the Congress at least once annually (perhaps in keeping with the Constitutional requirement in Art. I that the Congress “assemble at least once” annually).  For decades this address was typically called the president’s “annual message,” and now it is more commonly known as the “State of the Union Address.”

The president’s State of the Union Address (or SOTU by POTUS for those who think acronyms are cool) is now an enormously significant media event with huge political ramifications for both parties. As the nation watches (to the degree that we watch), the president and his party enjoy millions of dollars’ worth of national publicity.  The party’s leader pitches his policy proposals, while the camera pans to congressmen looking variously engaged or bored, gleeful or glum, enthusiastic or resentful.

From first to last, this is a media-driven event.  In advance of the spectacle you could tune in to any number of pre-game shows, not the least of which was sponsored by the president himself.  Virtual visitors to the White House website were first reminded that “Together, we can make change happen.”  You could then watch the SOTU “pre-show,” view video of everyday Americans as they received phone calls inviting them to sit with the First Lady in her box during the speech, and even read synopses of what the president planned to share regarding the economy, climate, health care, foreign policy and social progress.  After the hour-plus speech, a smorgasbord of talking heads told us what the president said, why he said it, and what they thought of it, while pollsters scurried to ask us (or at least a few hundred of us) if we thought what the talking heads thought we should think.

It has not always been this way.  The Constitution doesn’t require the president to give a speech to the Congress, only to give it information and make recommendations.  And for most of American history, U. S. presidents have opted to send a formal written report via messenger and skip the personal oration.  Overall, since 1789 that’s been the case for nearly two thirds of these messages–only 82 out of 226 (about 36%) have come as speeches.

Our first two presidents, Federalists George Washington and John Adams, appeared personally before Congress to satisfy their Constitutional duty.  But between 1801 and 1913, not a single U. S. president followed their example.  In 1801, Democratic-Republican Thomas Jefferson decided to send his message in writing to Congress on the grounds that the practice of lecturing Congress in person was undemocratic.  In England it was customary for the king to speak periodically “from on high” to the Parliament, and Jefferson–who hated public speaking anyway–insisted that a truly republican government should not be perpetuating the trappings of monarchy.

The precedent held for a long time.  Each of the next twenty presidents followed Jefferson’s lead.  Even Abraham Lincoln’s eloquent 1862 message in the midst of the Civil War–calling the North to preserve the United States as the “last best hope of earth”– was sent by a courier and read by a congressional clerk.  It was not until 1913 that Woodrow Wilson would defy what was by then a hallowed tradition and appear before Congress in person.  And when he did so, headlines in the New York Times declared “SENATORS FROWN ON WILSON’S VISIT: Reading is Compared to Speech from Throne.”

From this point, the pattern began to shift slowly but surely toward personal appearances.  In the process, what had once been a rather perfunctory summary of the work of the various executive departments gradually became a major political statement on behalf of the president and his party.  More important, the originally intended audience of the address–the U. S. Congress–was replaced by the American public.

The growing importance of radio and television was central to the latter transformation.  The first president to deliver his address to a national radio audience was, ironically, “Silent” Cal Coolidge, who belied his nickname with a 22-page speech in 1923.  In 1947 television got into the act, broadcasting Harry Truman’s address to the fraction of American households who had invested in that dubious technology.  The TV audience grew steadily thereafter, so that by the mid-1960s Lyndon Johnson decided to shift his speech from the traditional afternoon setting to the early evening in order to garner a much larger “prime-time” audience.

Which brings us, more or less, to the highly choreographed, vacuous public spectacle that the State of the Union address has long since become.  True to form, last night’s was a relentless rhythm of presidential statement and partisan response: Democratic ovations, Republican groans, Joe Biden repeatedly rising to his feet, Paul Ryan glued to his chair.  If the Washington Post transcript of the event is accurate, President Obama was interrupted by applause seventy-one times during his fifty-nine-minute speech.

In sum, the event is now much like our quadrennial party conventions.  Photo-ops, posturing, and platitudes abound, but almost no real work gets accomplished–at least not the kind of work that the framers of the Constitution envisioned.  The only real suspense in the event came when, for a moment, it looked as if a drowsy Justice Ruth Bader Ginsburg was going to fall from her chair.  I could sympathize.

Does anyone else find this tedious?


Well, it’s been a good two weeks since I last posted, and that might as well be two years from the perspective of perpetual connectivity that defines the blogosphere.  Sorry about that.  We’re running a job search for a new U. S. historian at Wheaton, and that has been tremendously time-consuming. We’re also addressing last-minute logistics for a distinguished visiting lecturer, Clemson University Professor Vernon Burton, who will be on campus next week delivering two public addresses pertaining to the 150th anniversary of the Emancipation Proclamation.  These are good tasks to have, but they have absorbed most of my extra time and energy.

I know that I promised in my last post that I was through with President Obama’s inaugural address, but it turns out I was wrong.  (My students will tell you that I have a habit of offering multiple “final” points in my lectures, repeatedly raising and then dashing their hopes of getting out of class early.)  One of the axioms that I bring to my teaching is that an effective way to stretch the mind is to challenge the heart.  This is because the kind of thinking that has the potential to be truly transformative comes most naturally when something we feel deeply about is called into question.  One of my students put it this way in a recent reflection: “In my own historical study I have found that the issues that carry an emotional component are the ones that I study the best. . . . I pour so much more time and energy and thought into a subject that pulls at my heartstrings.”

I mention this because I sense that in a previous post I struck a nerve with a few readers.  (See “The Rhetoric of the President’s Address—Digging More Deeply.”)  We are so deeply immersed in our contemporary “rights” culture that it is hard for us to imagine an alternative.  We take for granted that we have inalienable natural rights, and our main argument with the unbelieving culture around us involves the question of where those rights come from.  This puts us in the ironic position of citing the Deist son of the Enlightenment, Thomas Jefferson, to remind the culture that our rights are not suspended in a vacuum, but that they have been given us by our Creator.

One of the key principles in thinking historically is remembering the crucial importance of context.  We engage the past in search of wisdom for the present, but if we are to understand rightly what the past has to say to us, we first need to understand the ideas that we encounter in their historical context.

Jefferson’s reference to “inalienable rights” that include “life, liberty, and the pursuit of happiness” comes almost verbatim from the late-seventeenth-century political philosopher John Locke.  Scholars disagree about Locke’s private religious beliefs, but we can agree that much of Locke’s public argument contradicted orthodox Christian doctrine.  Most notably, Locke overtly denied Paul’s teaching in Romans 2 that God has inscribed His law in our hearts.  God has given us no conscience or innate sense of right and wrong, Locke argued.  Our primary gift from God at birth is the faculty of reason, and He intends for us to rely on reason in determining how we are to treat one another.

According to Locke, the process of discovering the “law of nature” and the inalienable rights that ensue is a process of applying reason to experience.  It is also, at its heart, a process of the rational pursuit of self-interest.  By nature, none of us wants to be killed, or enslaved, or have our property stolen, Locke theorized, and over time we logically conclude that one of the ways we protect ourselves from such a fate is to refrain from killing, enslaving, and stealing the property of others.  If most individuals exercise such self-control, the societies we form will be societies in which we are more likely to get what we want, which is actually a pretty good definition of “right” as Americans currently employ the term.  (I am reminded of historian Robert Wiebe’s definition of “right” in his book Self-Rule: that “delightful euphemism for ‘what I want.’”)

One of the commenters on my earlier post acknowledged that the language of “rights” is not prevalent in the Bible but asked if I was trying to argue that anything not expressly spelled out in scripture was, by definition, unchristian.  Not at all.  My point is simply that we have not thought very deeply about the concept of rights, and that because ideas come to us embedded in historical contexts, we ignore those contexts at our peril.  If we thought more deeply about the context of Jefferson’s assertion, we might understand more readily how it is that the principles of the Declaration have come to justify a radically individualistic vision in which the autonomous individual is the constituent element of society and all other social groups (family, church, community) must defer to the individual and the paternalistic state that protects him.

Thinking Christianly about the Declaration, we might conclude that Jefferson’s “self-evident” truth that “all men are created equal” is true in certain respects but not in all.  Here let me end by quoting at length from Christian political scientist James Stoner’s 2005 essay “Is there a Political Philosophy in the Declaration of Independence?”  According to Stoner, the “self-evident truths” in the Declaration

do not give an adequate account of the family, the fundamental institution of social life. . . . The family is built not around equality, but around the inequality of parent and child.  Precisely the most basic meaning of Jefferson’s statement of equality—that no man is the natural ruler or the natural subject of another—is not true of this relation, for the parents are surely the natural rulers of their dependent children.  [Beyond this,] the family is first and foremost not about rights, but about duties; even the right of children to care and education is abstract and vague compared to the duties of parents to provide and instruct and the duty of children to obey and learn. . . . [Furthermore,] the end of the family is only incidentally the security of rights; it is principally provision and nurture in an environment formed by love.

Much food for thought here.


President Obama’s second inaugural address is already rapidly fading from public memory, but I hope that you will indulge one more comment on the president’s rhetoric.  In previous posts, I have stressed the important symbolic role that presidential inaugurals play in our collective definition of what America stands for.  I have also argued that, as Christians seeking to “take every thought captive to the obedience of Christ” (2 Corinthians 10:5), we need to go far beyond merely counting the president’s references to God.  Thinking Christianly and historically about such important rituals involves far more than parsing the president’s prose to determine whether he has paid sufficient homage to our purported Christian heritage.

Regardless of the terminology employed, we need to be evaluating the president’s rhetoric in light of scriptural principles.  We shouldn’t just ask whether the president defines our nation as Christian.  We need to be asking the far more difficult question of whether the statements that he makes are consistent with Christian precepts.  This comes more naturally when scrutinizing specific policy proposals.  Although devout Christians can and do disagree about the government’s proper stance concerning homosexual rights, women in combat, or governmental obligations to the poor, to name three examples, many of us will think through the president’s positions on those issues by measuring them against our own understandings of biblical teaching.

We’re not nearly as careful to scrutinize the tributes that the president pays to America and Americans.  As I noted in my last post, we need to be asking of the president’s rhetoric—and of political rhetoric more generally—not only what it says about God, but also what it says about us.  A knowledge of American history can help in this process, not by showing us how to evaluate what the president says, but by helping us more fully to see what he says, to be sensitive to claims that are so familiar to us that we have come to take them for granted and to accept them as self-evident.

Let me explain what I mean.  One of life’s paradoxes is that many of the values that most shape our worldviews are often invisible to us.  They involve beliefs that are so widely agreed on that we seldom hear them debated.  Never hearing them debated, we come to see them as so obviously beyond question that there is little reason to think deeply about them.  With little reason to think deeply about them, we soon stop thinking about them at all.  And when we stop thinking about them, there is a sense in which we literally cease to see them.  They may be shaping us, but they are invisible to us.

Here is where the study of the past can be so powerfully illuminating.  In studying other times and places, we frequently come face to face with values that are very different from our own, held by people who would find our own views mystifying, illogical, or even repulsive.  By exploding our reassuring conception of our values as unquestionable and unquestioned, the study of the past can make the present seem strange to us, helping us to re-evaluate what we have long taken for granted.

As a historian, one of the aspects of the president’s rhetoric that stands out to me is the praise that he heaps on the American people.  Listen to what he tells us about ourselves: we are characterized by “our insistence on hard work and personal responsibility,” our “resolve” and “our resilience.”  The members of our armed forces “are unmatched in skill and courage.”  Our “possibilities are limitless, for we possess all the qualities that this world without boundaries demands: youth and drive; diversity and openness; an endless capacity for risk and a gift for reinvention.”  We are the “most powerful nation” in the world, and it is our responsibility to be “a source of hope to the poor, the sick, the marginalized, [and] the victims of prejudice.”

I notice these comforting claims because I have spent a lot of time studying a period in American history when statesmen did not invariably flatter the public.  To draw from just one body of evidence, consider the arguments contained in the Federalist, the famous compilation of essays authored primarily by Alexander Hamilton and James Madison in 1787-88 to support the ratification of the U. S. Constitution.  In the Federalist we read about “the folly and wickedness of mankind” and the “ordinary depravity of human nature.”  We are told that “men are ambitious, vindictive, and rapacious”; that “momentary passions and immediate interests” control human conduct more than “considerations of . . . justice”; that “the mild voice of reason . . . is but too often drowned . . . by the clamors of an impatient avidity for immediate and immoderate gain.”  Hamilton and Madison made no claim that Americans were exceptions to these generalizations.  Instead, the authors of the Federalist essays reminded their readers that, even in America, self-interest was the predominant drive in the human breast and virtue was as uncommon as it was precious and fragile.

A familiarity with American history, in other words, can help us to see as strange President Obama’s repeated tributes to his audience.  What we take for granted, the Federalist would have roundly denounced.  “Of those men who have overturned the liberties of republics,” Hamilton wrote in the opening essay, “the greatest number have begun their career by paying an obsequious court to the people, commencing demagogues and ending tyrants.”  We now routinely demand of our leaders such obsequious homage, however, and we have done so for more than a century and a half.  Writing in the 1830s, French visitor Alexis de Tocqueville concluded that no U. S. politician could long survive without paying a “tribute of adulation to his fellow citizens.”  As he noted so trenchantly in his classic Democracy in America, “the majority lives in the perpetual utterance of self-applause, and there are certain truths which the Americans can learn only from strangers or from experience.”

Or from history, I would add.  But if history can make us more aware of the “tribute of adulation” that we demand of those we put in public office, it cannot, by itself, tell us whether such demands are “Christian” or not.  We turn to scripture and to church teaching for that.  We can only scrutinize carefully the values that we can see.  History can help to make the invisible visible, but it rightfully wields no moral authority.  As Christians, we must turn elsewhere for our standard of judgment.


Presidential inaugural addresses serve an important function.  Although they may contain references to specific programs or initiatives, they are not primarily policy statements.  They are first and foremost civic rituals that reinforce our collective sense of what it means to be an American.  The recently elected president plays a crucial role in this by calling attention to those specific principles that are supposed to define us as a people.  We all are implicated in his rhetoric. 

In my last post I began to suggest ways that American Christians might think both historically and Christianly about President Obama’s inaugural address.  One of the most obvious is simply to scan the text for allusions to God.  When we do so, we find that President Obama referred to “God” in five instances.  As a Christian, I can affirm each of the statements containing these allusions.  Setting aside the perfunctory “God bless you” (did someone sneeze?) and the formulaic “may He forever bless these United States of America,” I can say “amen” to the president’s more substantive assertions that “freedom is a gift from God,” that we are all equal “in the eyes of God,” and that the earth has been “commanded to our care by God.” 

But I am pretty sure that I could also affirm them if I were Muslim or Jewish.  As I noted in the last post, none of these references to God is unambiguously Christian.  This does not surprise me as a historian, for as I shared earlier, no American president has ever made in an inaugural address an unambiguous, unequivocal reference to the triune God of traditional, orthodox Christian confession.  These addresses have made ambiguous references to God an art form, purveying what might be called “civil religion,” a nondescript faith defined by vague references to God shorn of specific truth claims that might offend or divide. 

This pattern is so deeply ingrained and unvarying in inaugural addresses that we can rightly call it an American tradition.  My point in stressing this is neither to condemn nor to defend civic pluralism.  I simply want to put the president’s speech in historical context.  If our goal is to think with Christian discernment about the American past, surely this is an important part of our national story. 

Having said this, I think there are more penetrating questions that we might ask of the president’s speech than how many times he alluded to God.  Our religious beliefs are revealed as much in our anthropology as in our theology.  Our religious worldview doesn’t consist solely of our understanding of God, in other words.  It is also defined by our understanding of human nature and the human condition.  As we strive to think Christianly, then, we need to be asking of the president’s rhetoric—and of political rhetoric more generally—not only what it says about God, but also what it says about us.

Let me give just one example of what I have in mind, and in my next post I will share one or two others.  At the very outset of his address, President Obama stated, “What makes us exceptional, what makes us America is our allegiance to an idea articulated in a declaration made more than two centuries ago,” referring to the Declaration of Independence.  “We hold these truths to be self-evident, that all men are created equal.  That they are endowed by their creator with certain unalienable rights, and among these are life, liberty, and the pursuit of happiness.”  Several minutes later the president returned to these “founding principles.”  “We are true” to these principles, the president proclaimed, “when a little girl born into the bleakest poverty knows that she has the same chance to succeed as anybody else because she is an American, she is free, and she is equal not just in the eyes of God but also in our own.”

This is inspiring oratory, but let’s think carefully about what Mr. Obama is really saying.  Following Abraham Lincoln, the president tells us that the essence of what it means to be an American is our faith in the Enlightenment principle of natural equality as the basis for political rights.  In his wonderful book God of Liberty: A Religious History of the American Revolution, Baylor historian Thomas Kidd shows how readily American Christians—who of course already believed that all peoples descended from a common creation—appropriated this largely secular principle and engrafted it into their worldview.  Even as late as the middle of the 1700s, Congregational pastor Jonathan Edwards, arguably the greatest American theologian of all time, distinguished between spiritual and social or political equality.  Because “all have sinned,” all humans—regardless of race, class, or nationality—stand on the same footing before Almighty God, equally in need of God’s grace.  This spiritual equality, Edwards believed, was not inconsistent with a hierarchical society in which “different members of society have all their appointed office, place and station, according to their several capacities and talents, and everyone keeps his place.” 

Let me be clear: I am not trying to make a case for inequality per se.  I do want us to realize that the language of “rights” that is so pervasive today is rooted more in secular thinking than in Scripture.  The Bible speaks primarily in terms of sacrifice, not self-assertion; it defines obligations far more than rights.

We must also be leery of the president’s suggestion of a future day in which the poor and powerless among us have hope not just because of their preciousness in the eyes of God but because, as a nation, we have also come to accept the equality of all people.  That day may come, but as Christians we should know better than to expect it without the gracious intervention of God.  God has created us from one blood, but as theologian John Howard Yoder pointed out, ever since the Fall mankind has naturally found innumerable bases for dividing and subdividing into “in” groups and “out” groups.  True social harmony will never come from our giving intellectual assent to an abstract principle about the implications of our common creation.  In Yoder’s words, “To make anyone believe in the equal dignity of all humans God must intervene.  It took the cross to break down the wall.”  And only in Christ Jesus is there “neither Jew nor Greek . . . neither slave nor free . . . neither male nor female” (Galatians 3:28).


In my latest post I noted that one of the most important functions that inaugural addresses serve is to reinforce our sense of identity as a nation.  Presidential inaugurations are important public rituals, and presidents have regularly used their addresses as opportunities not only to make a case for their agenda but also to remind Americans of their defining principles.  (It is coincidental, I am sure, that these are invariably presented as mutually reinforcing.)  In sum, inaugural addresses are symbolically important public efforts to define what the United States stands for, and this means that we all have something at stake in the undertaking.  The president’s rhetoric matters.

As Christians called to “take every thought captive to the obedience of Christ” (2 Corinthians 10:5), we want to do our best to “think Christianly” (as the late Harry Blamires put it) about all such pronouncements.  As a Christian historian, I am also convinced that it will enhance our insight to bring a historical perspective to bear.  Here’s what I mean in this instance:

One of the most obvious questions that Christians will likely ask about President Obama’s inaugural address today concerns his use of religious rhetoric.  In defining our nation’s “founding principles” and the “journey” we must still complete in order to fulfill them, did the president pay proper tribute to the place of religious faith—to Christian faith, specifically?  This is a huge, and hugely complex, question, but here are just a couple of preliminary thoughts.

Four years ago the newly elected president angered many Christians with his declaration that “we are a nation of Christians and Muslims, Jews and Hindus, and non-believers.”  Pundits will be parsing the president’s rhetoric for weeks, but my initial impression is that today’s speech was not quite as pointed as Obama’s 2009 address in linking American identity with an amalgam of world religions (not to mention atheism as well).  Those who simply want to count terms will note that the president referred to “God” in five instances.  He told us that “freedom is a gift from God,” that we are all equal “in the eyes of God,” and that the earth has been “commanded to our care by God,” before concluding with the obligatory “God bless you, and may He forever bless these United States of America.”

But what do such allusions to “God” really mean?  What purpose do they serve if the implication is that they carry no truth claims that would divide Christians, Muslims, Jews, and Hindus?  I think this is an important question, and I know that I am not equipped to answer it dogmatically.  As a historian, however, I would simply add this historical context: No American president, from George Washington onward, has ever made an unambiguous, unequivocal reference to the triune God of traditional, orthodox Christian confession (e.g., as summarized in the Apostles’ Creed or Nicene Creed).

When it comes to referring to God, American presidents have been masters of creative euphemism.  To cite but a few examples, George Washington referred to “that Almighty Being who rules over the universe,” to “the Invisible Hand which conducts the affairs of men,” and to “the benign Parent of the Human Race.”  John Adams alluded to that “Being” who is “the Patron of Order” and the “Fountain of Justice.”  James Monroe mentioned “the Divine Author,” Martin Van Buren and James Buchanan spoke of a “Divine Being,” and Zachary Taylor and Dwight Eisenhower referred to “Divine Providence.”  Thomas Jefferson and William Henry Harrison alluded to “the Creator”; Andrew Jackson referred to “that Power”; and Abraham Lincoln, Harry Truman, and Bill Clinton each made mention of “the Almighty.”  More recently, George W. Bush referred to the “Author of liberty” and “Maker of heaven and earth.”

What do all of these references to God have in common?  None of them is uniquely Christian; none of them is explicitly Trinitarian.  There have been fifty-eight inaugural addresses since George Washington was elected as the first President of the United States in 1789.  In addition to a host of euphemisms such as those mentioned above, the word “God” appears fifty-four times in those addresses.  The words “Jesus” and “Christ” have never appeared.  In sum, the rhetoric of American inaugural addresses has always been the language of what sociologist Robert Bellah long ago termed “civil religion”—a set of vague, least-common-denominator principles calculated to unify Americans with generalities rather than divide them over specifics.