
THE CONFEDERATE FLAG–A “SYMBOL OF LOVE”?

Confederados I

OK–I’ll give you three guesses: where was the picture above taken?

If you guessed Santa Bárbara d’Oeste, Brazil, you hit it on the nose.  One of the odder Civil War-related articles I’ve seen in a while was carried by USA Today over the weekend.  (You can read it here.)  The article, titled “Why These Brazilians Love the Confederate Flag,” was timed to coincide with an annual festival in this Brazilian city sponsored by the Fraternidade Descendencia Americana.

The F.D.A. is an organization of descendants of southern Confederates who emigrated to Brazil at the close of the Civil War.  The “Confederados,” as they are known locally, gather each spring to celebrate their Confederate heritage.  They dress up like rebel soldiers and southern belles, consume large quantities of fried chicken and watermelon, and proudly exhibit Confederate flags–lots of them.

The thrust of the article–written, for reasons that I can’t explain, by a British-born sportswriter–is to stress that the Confederate battle flag means something very different in Brazil than it does in the United States.  Here the flag has long been divisive, hailed by defenders as a reminder of a proud heritage, decried by critics as a symbol of hate.

This is not the case in Santa Bárbara d’Oeste, apparently.  Everyone interviewed–from the president of the Fraternidade Descendencia Americana to a local historian to nine-year-old Bruno Lucke–agrees that the flag carries no racist connotations whatever.  “To me,” little Bruno says, “the flag is a symbol of love.”


The Confederate battle flag is a “symbol of love” to this nine-year-old Brazilian.

I wouldn’t read this piece expecting to learn much about either the recent or the distant past of the United States.  The author alludes to Dylann Roof as a 21-year-old who “allegedly” gunned down black worshipers in Charleston last year after posing with the rebel flag.  (Why “allegedly”?  Does anyone doubt this?)  He cites unnamed “historians” to contend that Confederate General Nathan Bedford Forrest (the Ku Klux Klan’s first Grand Wizard) “in later life . . . fought racism.”  (This is a silly claim that professional historians don’t take seriously.)  Above all, he accepts uncritically the Confederados’ claim that the presence of slavery had nothing to do with their ancestors’ choice of Brazil as their new home.  In 1866 Brazil was the last remaining nation in the western hemisphere where slavery was legal, and historians agree that the desire to distance themselves from free blacks was “almost universal” among Confederate émigrés.

On the plus side, the story does remind us that contemporary context is hugely important in determining how historical symbols are remembered.  The Brazilian Confederados’ memory of their noble Confederate heritage is as flawed and fantastic as anything you could find in the U. S. South today (e.g., among groups like the Sons of Confederate Veterans or the League of the South).  The difference is that, four thousand miles farther south, no one in Santa Bárbara d’Oeste seems to care.

**********

For more on the Confederate battle flag in historical context, check out my previous posts on the topic here, here, here, and here.

Confederados II

TWO GREAT GIFTS FOR GRADUATES: WISDOM AND INSPIRATION

Graduation season is almost here—Wheaton College will be holding its 157th “commencement” in a little over two weeks—and it dawns on me that some of you may soon be shopping for a meaningful gift to give a new college graduate.  If so, I have a couple of recommendations to share.

Most graduates have a palpable sense of heading into the unknown, and the lifelong questions “What will I do?” and “Why will I do it?” will seem unusually relevant, even urgent.  This is why I often give graduates a book that will help them think Christianly about vocation.  There are two that I especially recommend.  Each is short, inexpensive, challenging, accessible, and wise.

The first book is Visions of Vocation: Common Grace for the Common Good, by Steven Garber.  The author heads up the Washington Institute for Faith, Vocation, and Culture in Washington, D. C.  He writes from an explicitly Christian foundation, but graciously, winsomely, and non-dogmatically, and I would not hesitate to give this book to anyone wrestling with questions about the purpose and meaning of life.

The book hinges on one simple, haunting question: “what will you do with what you know?”  Knowledge always comes with moral responsibility, Garber insists. This is one of the key truths embedded in the account of the tree of the knowledge of good and evil in Genesis chapters 2-3. The questions “What do you know?” and “What will you do with what you know?” can never be divorced, as much as we might like to pretend otherwise.

From this initial premise, Garber observes that the hardest thing we are called to do in life is to know and still love. Knowing and persevering in love is rare. To know those around us truly is to know the brokenness of the world and to share in its pain. To ease our pain, our natural response is to build a wall around our hearts made of stoicism or cynicism. The stoic trains her heart not to care about the world; the cynic convinces himself that all efforts to help are naïve or futile.

Visions of Vocation is filled with stories of men and women who have refused to give in to stoicism or cynicism. Garber describes his teaching philosophy as “come-and-see” pedagogy. “We learn the most important things over the shoulder, through the heart,” he writes, and so he doesn’t waste much time on abstract assertions. Because “words always have to be made flesh if we are going to understand them,” he spends most of his time introducing us to people he has walked with, individuals who have become “hints of hope” to a hurting world by choosing to know and still love.

Two convictions distinguish these men and women, Garber finds. First, they refuse to accept the delusion of individual autonomy that shapes the modern western world. They realize that “none of us are islands. . . . We are we, human beings together. Born into family histories, growing up into social histories, we live our lives among others, locally and globally, neighbors very near and neighbors very far.” Second, in acknowledging this relationship, they have accepted also that they are obligated to others and implicated in their suffering. In sum, in acknowledging relationship they have accepted responsibility, and after accepting responsibility they have chosen to take action.

The second book is Let Your Life Speak: Listening for the Voice of Vocation, by Parker J. Palmer.  A couple of years ago I led students in an informal book discussion centered on this book, and for a long while I kept a box of extra copies in my office to give away as opportune moments arose.  It’s a great book on many levels.

The author has long been one of my favorite writers.  Although I have not always agreed with him–and still do not–I find him wonderfully challenging and provocative in the very best way.  Palmer began his adult career on an academic track, earning a Ph.D. in sociology from U.C.-Berkeley.  Although he left the Academy after a few years, he has devoted most of the past four decades to writing and lecturing on the nature of education and the relationship between the intellectual and the spiritual.  I first encountered Palmer in the pages of his 1983 book To Know as We Are Known: Education as a Spiritual Journey, a work that still informs my approach to teaching and my views on how education shapes the heart.

Let Your Life Speak can sound a little “New Age-y” if you don’t understand where Palmer is coming from.  Like most of the great Christian writers who addressed the concept of vocation during the Reformation, Palmer believes that our talents and passions are valuable clues to our ideal vocations.  When he counsels the reader to listen to the voice within, he can sound like a secular humanist (or a script-writer for the Hallmark Channel), but he is absolutely not advising us to look within our own hearts for the ultimate guide to wise living.  Instead, he is urging us to take seriously the truth that God has designed us with specific abilities and desires, and that our life’s vocation should unfold at the intersection of those personal traits and the needs of a hurting world.

We must understand vocation, Palmer writes, “not as a goal to be achieved but as a gift to be received.” He goes on to explain,

Vocation does not come from a voice “out there” calling me to be something I am not.  It comes from a voice “in here” calling me to be the person I was born to be, to fulfill the original selfhood given me at birth by God.

In sum, “we are here on earth to be the gifts that God created.”

With refreshing candor, Palmer reminds us that, “despite the American myth,” we simply cannot do or be anything we desire.  “There are some roles and relationships in which we thrive and others in which we wither and die.”  One of our goals, then, should be to learn our limits, distinguishing between the limits that are a product of the nature that God has implanted in us, and the limits “that are imposed by people or political forces hell-bent on keeping us ‘in our place.’”

Finally, I would note that Palmer intersperses his observations with intimate reflections on the path that he personally has traveled.  These include hard-earned insights from two extended bouts with depression as an adult.  Refreshing in its honesty and transparency, Let Your Life Speak will be encouraging both to those seeking direction for the future and to readers trying to make sense of suffering.  I heartily recommend it.


“WHAT SORT OF DESPOTISM DEMOCRATIC NATIONS HAVE TO FEAR”

Last week I attended a wonderful presentation here at Wheaton by my friend and colleague Bryan McGraw.  In addition to being a connoisseur of southern barbecue, Dr. McGraw is a first-rate political philosopher.  In the course of his presentation, McGraw highlighted an extended passage from one of my favorite writers from the nineteenth century, Alexis de Tocqueville, and I was so struck by its relevance during this election season that I wanted to pass it along.


Tocqueville posed for this portrait around 1850, nearly two decades after his American odyssey.

As many of you will know, Alexis de Tocqueville was a French aristocrat who traveled to the United States at the height of Jacksonian democracy.  In 1831, at the age of twenty-six, Tocqueville was commissioned by the French government, in tandem with another young aristocratic Frenchman, Gustave de Beaumont, to travel to the U.S. to investigate and report on the American penitentiary system.  Tocqueville and Beaumont spent nine months exploring the country, traveling by stagecoach, steamboat, and horseback from the urban northeast to the edge of the western frontier and back again.

Upon returning to France, Tocqueville and Beaumont filed their report on penitentiaries, and then Tocqueville began to write a much broader set of reflections on American politics, American institutions, American culture, and the American people.  The result, Democracy in America, remains one of the most remarkable commentaries ever penned on the interrelationship of liberty, equality, religion, and popular government. I would be surprised to learn that any of this year’s leading presidential aspirants has ever read it.

A sympathetic critic of American democracy, Tocqueville wrote partly to praise but also partly to warn.  Quick to highlight the “benefits which democracy promises to mankind,” he also purposed to “point out the distant perils with which it threatens them.”  Chief among the latter was the potential for tyranny.  As Tocqueville observed, “I noticed during my stay in the United States that a democratic society similar to that found there could lay itself peculiarly open to the establishment of a despotism.”


Title Page of the first American edition of Tocqueville’s classic, published in 1838.

So how might this come about?  Tocqueville believed that there were certain attributes of the popular democratic mindset in the United States that would gradually facilitate the centralization of governmental power.  In the extended passage below (from volume II, part 4, chapter 6), Tocqueville shows how the individualist, materialistic ethos that he encountered among Americans might encourage the inexorable growth of government.  Read it and see what you think.

I am trying to imagine under what novel features despotism may appear in the world.  In the first place, I see an innumerable multitude of men, alike and equal, constantly circling around in pursuit of the petty and banal pleasures with which they glut their souls.  Each one of them, withdrawn into himself, is almost unaware of the fate of the rest.  Mankind, for him, consists in his children and his personal friends.  As for the rest of his fellow citizens, they are near enough, but he does not notice them.  He touches them but feels nothing.  He exists in and for himself, and though he still may have a family, one can at least say that he has not got a fatherland.

Over this kind of men stands an immense, protective power which is alone responsible for securing their enjoyment and watching over their fate. That power is absolute, thoughtful of detail, orderly, provident, and gentle. It would resemble paternal authority if, fatherlike, it tried to prepare its charges for a man’s life, but on the contrary, it only tries to keep them in perpetual childhood.  It likes to see the citizens enjoy themselves, provided that they think of nothing but enjoyment. It gladly works for their happiness but wants to be sole agent and judge of it.  It provides for their security, foresees and supplies their necessities, facilitates their pleasures, manages their principal concerns, directs their industry, makes rules for their testaments, and divides their inheritances.  Why should it not entirely relieve them from the trouble of thinking and all the cares of living? . . .

Having thus taken each citizen in turn in its powerful grasp and shaped him to its will, government then extends its embrace to include the whole of society.  It covers the whole of social life with a network of petty, complicated rules that are both minute and uniform, through which even men of the greatest originality and the most vigorous temperament cannot force their heads above the crowd.  It does not break men’s will, but softens, bends, and guides it; it seldom enjoins, but often inhibits, action; it does not destroy anything, but prevents much being born; it is not at all tyrannical, but it hinders, restrains, enervates, stifles, and stultifies so much that in the end each nation is no more than a flock of timid and hardworking animals with the government as its shepherd.

**********

If you’d like to read more on Tocqueville’s critique of American democracy,  check out these earlier posts by clicking here, here, here, and here.


George Caleb Bingham, “The County Election,” 1852

THERE’S NOTHING NEW ABOUT A CONTESTED CONVENTION

Think of your forefathers!  Think of your posterity!

—John Quincy Adams—


So what would you make of the following scenario?

In a highly charged election year, the Republican Party faces a showdown at its impending national convention.  The field of presidential contenders has been large, and no single candidate will come to the convention with a majority of the delegates behind him.  Candidate A of New York is the clear front runner, and for months his rank-and-file supporters have considered him the presumptive nominee.  But Republican elites are lukewarm about A.  His reputation as an extremist gives them pause, and despite the enthusiasm of A’s followers, they worry that A will fare poorly in the general election.  They fear that A is unelectable, and by nominating him they will not only sacrifice any chance at the presidency but harm Republican candidates for state and federal offices as well.  The future of the party hangs in the balance.

As the opposition to A becomes ever more outspoken, a “Stop A” movement works frantically behind the scenes to unite around a single alternative.  The number of potential nominees makes this difficult, however, and the divisions within the “Stop A” movement look to be crippling.  Candidate B is a southern conservative with tenuous links to party leaders.  Candidate C is an economic and social conservative who has risen to prominence in the Senate but made too many enemies along the way.  Candidate D is a northeasterner with a following in his own state but viewed elsewhere as a corrupt opportunist.  Candidate E has none of these liabilities, but as the convention approaches this Midwesterner is the first choice of only one state: his own.

Although candidate A commands a sizable plurality of delegates when the convention opens, candidate E’s campaign team goes to the convention determined to deny A a first-ballot nomination and open the door for E.  Unabashedly pragmatic, their message to delegate after delegate emphasizes expediency.  E is electable.  A is not.  E lacks A’s negative baggage and is widely respected.  He is a unifier who has been careful not to denigrate the other candidates.  E’s promoters encourage A’s delegates to consider E as a good second choice if it becomes clear that A cannot win a majority on the convention floor.  Where it promises to be helpful, E’s team makes thinly veiled offers of future political favors to delegations willing to switch their support to E after the initial ballot.  A significant number of wavering delegates are even willing to shift their allegiance before the balloting begins.

In the end, the strategy works.  On the first ballot, A takes 37% of the vote to E’s 22% (with candidates B, C, and D trailing even farther behind).  But as delegates are released from their first-ballot pledge to support A, the momentum shifts decidedly toward E on the second ballot, and by the third ballot E claims the nomination over A.  E’s margin of victory?  A razor-thin 50.5% to 49.5%.

So how would you evaluate the outcome of this contested convention?  Was it a miscarriage of justice?  An assault on democracy?  A “brokered” behind-the-scenes deal that bartered away the wishes of the people? Or was it a politically prudent compromise that secured the best outcome realistically available?

If you say that you don’t have enough information to answer the question, you would be right.  But in thinking through the scenario, it might be helpful to know that it isn’t hypothetical.  It’s my best attempt to summarize the nomination of Abraham Lincoln in 1860.  Candidates A, B, C, and D were Republicans William Seward, Edward Bates, Salmon Chase, and Simon Cameron.  We don’t know how this year’s Republican slugfest will play out, of course, but so far I’d say there are some pretty striking similarities to the 1860 Republican contest.  And although Donald Trump has modestly proclaimed that he is as “presidential” as Abraham Lincoln, right now the person best approximating that role is probably John Kasich.


Abraham Lincoln took 22% of the votes on the first ballot at the Republican National Convention in 1860.

So what does this analogy prove?  Can it help us to predict how the race for the Republican nomination will come out?  Can it teach us how it should come out?

Absolutely not.  The point of listening to the past is not to get easy answers to contemporary problems.  I cringe whenever I hear a public figure opining ponderously about what “history proves.”  We study the past not as a storehouse of simple lessons but as an aid to thinking more deeply, more self-consciously, and hopefully more wisely as we meet the future.  History promotes wisdom, when it does, by expanding the range of experiences we can draw from.  As C. S. Lewis put it figuratively in “Learning in Wartime,” the student of history has lived in many times and places, and that greater breadth of perspective aids us as we seek to think wisely and live faithfully in our own historical moment.

I suspect that much of the popular hyperventilating about the prospect of a contested Republican convention stems from the fact that the last multi-ballot nomination of a major-party candidate came in 1952, before the vast majority of Americans were born.  And because we have no memory from before we were born—only people with historical knowledge can have that—we are vulnerable to all kinds of nonsense from those who would prey on our ignorance.

The reality is that the presidential primary model that we take for granted today has been dominant for less than a half century.  The earliest presidential candidates were chosen without any popular involvement at all, hand-picked by party caucuses in Congress.  Beginning in the 1830s (following the lead of a bizarre coalition known as the Anti-Masonic Party), the major parties established the pattern of choosing candidates in party conventions.  And although some states began to hold presidential primaries as early as 1912, as late as the 1950s conventions still effectively made the final decision, and it was possible for a presidential candidate like Adlai Stevenson to win the nomination without running in a single state primary.

And unlike the conventions of the last half century—which are carefully choreographed, excruciatingly boring infomercials—the conventions between the 1830s and the 1950s were frequently contested.  It wasn’t just Abraham Lincoln who was nominated after multiple ballots.

Future president James K. Polk was nominated on the ninth ballot at the Democratic Convention in 1844.  In 1848 future Whig president Zachary Taylor was nominated on the fourth ballot.  Future Democratic president Franklin Pierce was nominated on the forty-ninth ballot in 1852 (and received no votes at all for the first thirty-five ballots).  Among other future presidents, James Buchanan was nominated on the seventeenth ballot in 1856, Rutherford Hayes on the seventh ballot in 1876, James Garfield on the thirty-sixth ballot in 1880, Benjamin Harrison on the eighth ballot in 1888, Woodrow Wilson on the forty-sixth ballot in 1912, and Warren G. Harding on the tenth ballot in 1920.  And although he lost in the general election, Democrat John W. Davis outdid them all, claiming his party’s nomination in 1924 on ballot number one hundred and three!

There was much that was broken about this system of selecting nominees.  Political bargains in proverbial “smoke-filled rooms” were the norm, and I’m not recommending that we return to them.  But these examples should give us pause and lead us to wrestle with some questions that might not otherwise occur to us about the current Republican contest.  Why, for one, would we assume that a candidate with a plurality of popular support has earned his party’s nomination?  Is it wrong to take “electability” into consideration in selecting a nominee?  Why do we think that a contested nominating convention is automatically disastrous for the party in question?  I have thoughts about all of these, but I’ll stop here and invite you to share what you think.

SHOULD THE AMERICAN PEOPLE HAVE A SAY IN THE SUPREME COURT’S DIRECTION?

“Think of your forefathers!  Think of your posterity!”—John Quincy Adams


I have a hard time taking seriously Democratic appeals to the Constitution as they insist that Senate Republicans act promptly on President Obama’s nomination of Merrick Garland to the Supreme Court.  Most Democrats long ago embraced a role for the Court that the Framers of the Constitution could have scarcely imagined.  It seems more than a little opportunistic to rediscover the authority of original intent now that it suits their purposes.  As I noted in a previous post, the Founders described the judicial branch in terms of its essential “feebleness.”  The Court could never be a threat to the rights of the people, Alexander Hamilton insisted in Federalist no. 78, because under the Constitution the Court possessed “neither FORCE nor WILL but merely judgment.”  The Framers would be stunned to see the power that the Court wields today.

And yet they would be equally mystified by Republicans who insist that public opinion should play an important role in shaping the composition of the Court.  In arguing to postpone consideration of Obama nominee Merrick Garland, for example, Senate majority leader Mitch McConnell observes that “of course the American people should have a say in the Court’s direction.”  That subterranean rumbling you may have noticed afterward came from the Founders collectively turning over in their graves.

The Framers’ vision for the proper place of public opinion in a free government is easy to caricature because it is complicated.  In their reading of human nature, it was just as fallacious to assume “universal venality” (i.e., moral corruption) as to assume “universal rectitude,” to use Hamilton’s terminology.  They believed that there was enough “honor and virtue among mankind” to justify an experiment in republican government grounded in the consent of the governed.  But they also knew that there was enough “folly and wickedness” in human nature to make such a government perpetually susceptible to tyranny and injustice.  And so they sought to ensure a degree of popular involvement in the new government while protecting it from undue popular pressure.

Remember how federal officeholders were originally to be selected: Only members of the House of Representatives would owe their appointment to the direct election of the people.  According to James Madison in Federalist no. 52, this meant that the House, unique among the components of the new government, would have “an immediate dependence on, and an intimate sympathy with, the people.”  But even then, “the people” who would so influence the Representatives comprised a severely truncated shell of today’s electorate.  The Framers did not specify who should be eligible to vote, except to stipulate that the states should apply the same criteria for federal elections as they did for the houses of representatives in their own state legislatures.  Based on state voting laws as they currently existed, the Framers could expect that members of the U. S. House of Representatives would be chosen by the votes of adult white male landowners.  (Depending on the state, the property requirement for voting regularly disqualified from one-third to two-thirds of adult white males.)

So much for the government’s “popular branch.”  The members of the more august Senate would be selected, not by the people directly, but by the state legislatures. To buffer the Senate further against popular pressures, only one-third of Senate seats would be open in any given election year.  These features would make the Senate less susceptible to “the impulse of sudden and violent passions,” as Madison put it.  More removed from “the people,” the upper chamber would function as an “anchor against popular fluctuations.”  “I shall not scruple to add,” Madison further noted, that the Senate “may be sometimes necessary as a defense to the people against their own temporary errors and delusions.”

Less “popular” still would be the executive under the new Constitution.  We sometimes forget that the Articles of Confederation didn’t include an executive branch at all, and so the Constitution’s framers were in uncharted territory as they sought to define the role of a “president” of the United States.  Andrew Jackson would later redefine the office of president as the most democratic element of the federal government.  This was true, he contended, because the president alone among all federal officeholders was chosen by the entire country.

But Jackson, keep in mind, made a practice of inverting the vision of the Framers while claiming to uphold it.  In reality, the Framers went to great lengths to isolate the executive from popular pressure.  They anticipated that, in most election years, the president would actually be chosen in a run-off in the House of Representatives where each state delegation would cast one vote.  The congressmen would be choosing from among the five individuals who had received the most votes by “electors” from each state.  (These are the members of our so-called electoral college.  The Constitution does nothing to tie the selection of electors to the vote of the people, but leaves the manner of choosing them to each state legislature.  For the next generation, at least, upwards of two-thirds were appointed by the state legislatures, not elected by “the people.”)

By protecting the executive from direct popular pressures, the Framers hoped that the executive would be able to perform his duties more effectively, always sensitive to the public welfare but never captive to the passions of the people.  Alexander Hamilton developed the point at length in Federalist no. 71:

There are some who would be inclined to regard the servile pliancy of the executive to a prevailing current, either in the community or in the legislature, as its best recommendation.  But such men entertain very crude notions, as well of the purposes for which government was instituted, as of the true means by which the public happiness may be promoted.  The republican principle demands that the deliberate sense of the community should govern the conduct of those to whom they entrust the management of their affairs; but it does not require an unqualified complaisance to every sudden breeze of passion, or to every transient impulse which the people may receive.

Which brings us to the judicial branch.  According to Mitch McConnell, it’s self-evident that “the American people should have a say in the Court’s direction.”  He may be right.  Given the inordinate power that the Court wields today, he probably is right.  I would just like to hear this prominent conservative leader tell his constituents that none of the Framers whom he claims to venerate would agree with him.

Just think for a minute about how the Framers of the Constitution expected the members of the Supreme Court to be chosen: Supreme Court justices were to be appointed for life by the president of the United States, who in turn was to be elected by the House of Representatives, who would choose the president from a group of finalists identified by electors, who in turn were appointed by the state legislatures, who for their part were elected by adult, white, male, property-holding voters.  Whatever you or I or Mitch McConnell may think, the Framers certainly didn’t believe that “the people” should “have a say in the Court’s direction.”

So what do we do with this information? I’m far from suggesting that we simply ask WWFD—What Would the Founders Do?—and then go and do likewise.  Figures from the past have no authority over us, and figuring out what the Founders would think about contemporary politics settles nothing.  But following John Quincy Adams, I think there’s more than a little wisdom in situating our contemporary debates within the larger conversations across time of which they are but a part.  While we shouldn’t be slavishly submissive to the values of the Framers, neither should we be cavalierly dismissive.  In Christian terms, the former is idolatry; the latter, arrogance.  We avoid both these extremes when we assess their views critically but respectfully, grappling with them as we seek to clarify and justify our own positions.

The Framers’ determination to shield federal officeholders from undue popular pressure stemmed logically from a skeptical view of human nature that few twenty-first century Americans share.  Nearly two centuries have passed since Alexis de Tocqueville noted wryly that the American people “live in the perpetual utterance of self-applause.”  Most Americans—including most American Christians—now reflexively view human nature as essentially good and the wishes of the majority as essentially just.  It might not be a bad thing if we reevaluated that popular prejudice in the light of Scripture.

But whatever your view of human nature, we might at least concede that there is a consistency and symmetry in the Framers’ efforts to shield the judiciary from popular opinion.  In Federalist no. 52, James Madison explained the overall structure of the proposed government with reference to the maxim that “the greater the power is, the shorter ought to be its duration.”  Because the House of Representatives would be constitutionally charged with the responsibility of initiating all revenue measures, it was right and proper that representatives have the shortest terms of office and be most immediately responsive to “the people.”  Terms of office would increase in length as the power of the office declined.  Congressmen would serve for two years, presidents for four, senators for six . . . and Supreme Court justices for life.

That the Framers would allow such dramatically longer terms for Supreme Court justices makes sense only in light of their belief that the Court would exercise dramatically less influence on the life of the nation than it now does.  If Mitch McConnell is right, and we now need to disregard the Framers’ goal of protecting the Court from public opinion, he is right because we have long since abandoned the Framers’ vision of a “feeble” judiciary with “neither FORCE nor WILL but merely judgment.”

COMING SOON TO A CABLE CHANNEL NEAR YOU

Fame has eluded me until relatively late in life, but that is about to change, and I wanted my loyal readers to be the first to know.

There have been some near misses, times when I suspected that popular acclaim was going to elevate me to stardom, despite my shy and humble nature.  There was the time, at age five, that I appeared on the children’s show “Fun Time with Miss Marsha.”  A decade later I was front man for my church youth ensemble as we performed for a March of Dimes telethon.  My mom bragged about that for years.

But the closest call was actually after I began teaching, back in 1991 when a retired humanities professor from the University of Florida popularized the theory that Zachary Taylor, not Abraham Lincoln, had been the first American president to be assassinated.  Taylor had died in July 1850, sixteen months into his presidency.  The cause, according to most historians, was acute gastroenteritis brought on when Taylor gorged himself on raw cherries and iced milk during a Fourth of July celebration in the nation’s capital.  Not so, said Professor Clara Rising, who speculated that the twelfth president had in fact been poisoned by one of his political enemies.  Although she had no real evidence to support her suspicions, Rising convinced Taylor’s descendants to agree to an exhumation of their ancestor’s remains, and for a week or so that June the nation breathlessly awaited the results of the partial autopsy.

Within hours of the announcement of the impending autopsy, a TV journalist from a popular Seattle news magazine program was calling to say that he would like to interview me to get my take on the story.  He wanted me to speak about the implications of Taylor’s alleged assassination, how it changed the course of history, etc.  I cleaned up my office (no small feat), put on a tie, and in a lengthy interview I shared a plethora of erudite insights about Zachary Taylor, antebellum American politics, and the coming of the Civil War.  I could tell that the reporter was deeply moved, although he was too professional to let on.

And then the results of the autopsy were announced the next morning, and unfortunately (at least for my television career), there was no evidence of foul play.  I never talked to the reporter again.  All I got was a telephone message left while I was in class.  One of the secretaries in the History Department office had summarized the message on one of those pink “while you were out” slips that functioned as voice mail before there was voice mail.  “Taylor wasn’t poisoned, so no story,” said the memo.  “Thanks anyway.”  Such is the fickleness of fame.

Now, twenty-five years later, the siren song of celebrity calls for me again.  This was the scene last December in my U. S. History to 1865 class at Wheaton.  C-SPAN was there to film a class session for later broadcast on their wildly popular “American History TV” (which probably all of us watch religiously on C-SPAN 3 every Saturday night).  Although the lights, cameras, microphones, and miles of cable were hardly unobtrusive, my students were real troopers.  They stayed awake at all times, looked variously intrigued and enthusiastic, and interjected with thoughtful, penetrating comments at the proper moments. It was a bravura performance.

A little behind-the-curtain confession: we had actually practiced all of this in advance.  Although I chickened out at the last moment, I had even scripted an “impromptu” comment from one of the students who would interrupt me at the beginning of the class session with the following heartfelt observation:

Professor McKenzie,

Before we get started, may I share something?

As I was walking across the beautiful grounds of Wheaton College this morning, I was reminded of how greatly I have been blessed by the opportunity to be a history major here.  I can say without hesitation that it has been a transformative experience.  Indeed, words cannot express the depth of my gratitude to the Wheaton History Department.  You and your colleagues have changed my life forever.  If I were a parent of a high-school senior, I know that I would be encouraging him or her to apply to Wheaton College and become a history major.

In conclusion, if it will not embarrass you unduly, Professor McKenzie, I must say that your brilliant lectures, your unparalleled sense of humor, your remarkable wisdom, and your gracious, winsome spirit have been the pinnacle of my experience here at Wheaton College.

Thank you, thank you, a thousand times thank you.

I’m pretty sure he could have made it sound unrehearsed.  My students make comments like this all the time.

At any rate, I have purposely refrained from telling you about the scheduled broadcast, for fear that C-SPAN would go bankrupt or that the producer would watch the tape and say “What was I thinking?!”  Neither has happened, however, and I can now report that our class session on “Emancipation and the Civil War” will air this Saturday night at 8:00 (eastern) on C-SPAN 3.  If your cable plan doesn’t include C-SPAN 3, you can watch the video after the fact from the “American History TV” website by clicking here.

PRELUDE TO CIVIL WAR: WILLIAM SEWARD’S “APRIL FOOLS’ MEMORANDUM”

As I was backing out of the driveway this morning, I was distressed to see a “For Sale” sign in the front yard of our very dear next-door neighbors. I backed down the street to where I could see the sign more clearly and discovered, to my relief, that it read as follows: “FOR SALE BY OWNER,” and then in much smaller, handwritten print, “One Day Only: April 1st, 2015.”

April Fools’ Day. I’ve hated this day all of my life.

At any rate, the prank got me to thinking about April Fools’ Days in American History, and my thoughts went to one of the most ominous April 1sts in our past. It was April 1st, 1861, and the United States was perched precariously on a precipice. (How’s that for alliteration?) Since the election of Abraham Lincoln as president the preceding November, seven southern states had issued resolutions purporting to sever their ties with the Union. A half dozen more were sorely tempted to follow suit, and would very likely do so if the Federal government took steps to restore the Union by force. As the country waited for the inauguration in March of its new Republican president, the seceding states took steps to constitute themselves the Confederate States of America and set to work confiscating all federal property—forts, arsenals, customs houses, and mints—within their borders. By the time Lincoln took the oath of office on March 4th, 1861, the Union was visibly collapsing and the authority and prestige of the U. S. government were at their nadir.


Lincoln in 1860

Lincoln and his cabinet—which included four of his rivals for the Republican presidential nomination—were deeply divided as to how to respond to the crisis. In his inaugural address, the president had tried to show both moderation and resolve. On the one hand, he had gone out of his way to try to reassure his southern critics that they need not fear a Republican presidency. On the other, he had insisted that “secession is the very essence of anarchy” and declared that the Union is “perpetual.” Drawing a line in the sand, he had pledged (somewhat redundantly) to “hold, occupy, and possess” all federal property within the rebellious states. Implicitly, this seemed to obligate the new president, at the very least, to do all within his power to maintain control of the federal forts in the lower South not yet in Confederate hands—most notably Fort Sumter at the mouth of Charleston harbor.


William Seward, Lincoln’s Secretary of State

The ranking member of Lincoln’s cabinet—Secretary of State William Seward—led a faction within the administration that sought to avoid a showdown if possible. Seward had much more experience in national politics than Lincoln and had been the odds-on favorite for the Republican presidential nomination in 1860, before losing out to Lincoln on the third ballot. It is quite possible that he had accepted the State Department post—historically the most prestigious and influential cabinet appointment—with the intention of serving as the de facto head of the administration, pulling the strings behind the scenes while the inexperienced Lincoln played the role of puppet and figurehead. Toward the end of March, Seward had met secretly with representatives of the Confederate government, assuring them that the government would not use force to uphold its authority and promising—without Lincoln’s knowledge or approval—that the Union troops assigned to Fort Sumter would soon evacuate the installation.

As March drew to a close, and as it became increasingly evident to Seward that Lincoln intended to uphold his inaugural pledge, the secretary drew up one of the most remarkable memoranda ever given to a sitting president by a high-ranking government official. Because Seward forwarded the memorandum—titled “Some Thoughts for the President’s Consideration”—on April 1st, historians have commonly referred to the document as Seward’s “April Fools’ Memorandum.” In truth, the proposals it contains are so outlandish that it is tempting to conclude that the secretary was pranking the president, but he wasn’t. He was in dead earnest.

After criticizing Lincoln for having failed to define a clear policy, “either foreign or domestic,” Seward went on to repeat his recommendation that Sumter be evacuated. In Seward’s mind, this sort of concession to the South was the best way both to keep the upper South in the Union and to avoid the tragedy of civil war.


Page 3 of Seward’s April 1, 1861 memorandum to President Lincoln

Then came the clincher. Under the heading “For Foreign Nations,” the Secretary of State recommended to the president that the administration “demand explanation from Spain and France, categorically, at once.” Spain had recently sent troops into Santo Domingo, while France was casting its eyes on Mexico, and Seward was proposing to challenge both on the grounds that they were in violation of the Monroe Doctrine. “If satisfactory explanations are not received from Spain and France,” Seward went on, the president should “convene Congress and declare war against them.” Although he didn’t spell out his rationale for the president, Seward clearly believed that the best way to unify the country was to provoke a war with one of the major powers of Europe.

But what if Lincoln was not prepared to take the lead on such a drastic policy? The Secretary of State concluded his memorandum with the following presumption:

Whatever policy we adopt, there must be an energetic prosecution of it. For this purpose, it must be somebody’s business to pursue and direct it incessantly. Either the President must do it himself and be all the while active in it, or devolve it on some member of his Cabinet. Once adopted, debates on it must end, and all agree and abide. It is not in my especial province but I neither seek to evade nor assume responsibility.

Lincoln replied in writing to Seward the same day, although it is not clear whether his brief note was ever actually delivered. What is clear is that Lincoln ignored Seward’s proposal of provoking a European war. He also effectively declined the Secretary of State’s polite offer to take over the management of his administration. If something “must be done,” the president wrote in his reply, “I must do it.”