Was America Founded As a Christian Nation?
The single most determining factor in whether an individual voted for Donald Trump in the 2016 presidential election wasn’t race, religion, Islamophobia, economic dissatisfaction, or even pro-life issues; it was Christian nationalism. And though this ideology is difficult to define, it begins with the idea of “Christian nation-ism,” the belief that the United States was founded as a Christian state and therefore must return to its Judeo-Christian roots. It seems evangelicals really did want to Make America Christian Again.
But was America ever a Christian country? If you or someone you love continues to labor under the illusion that America is a Christian nation, here are three realities that debunk the myth that the United States was, and should once again become, a Christian nation.
The Historical Lie: The Founding Fathers Myth
Patriotic pastors like Robert Jeffress and grifting evangelical politicians like Ted Cruz often invoke the language of exile and return, calling on America to restore her values and get back to her Judeo-Christian heritage. The idea stems from the belief that America was founded by “Christian men guided by Christian principles who set in place Christian institutions to create a Christian country,” to borrow the words of Wheaton College professor of history Dr. Tracy McKenzie. But there are several holes in this theory.
First, many of the founding fathers (John Adams, Benjamin Franklin, Thomas Paine, and Thomas Jefferson) were deists. Jefferson went so far as to cut every reference to Jesus’ divinity out of his personal Bible. Second, many of the founders owned slaves, defying their own “self-evident truths” that “all men are created equal.” Well, at least all property-owning white men. Third, the original crafters of both the Declaration of Independence and the Constitution of the United States were explicit about the separation of Church and state. Given the political history of the “Divine Right of Kings,” they were appropriately skeptical about fusing religion with public policy. And though George Washington was apparently quite pious, the Treaty of Tripoli, negotiated under his administration and signed by John Adams in 1797, declared, “The government of the United States is not in any sense founded on the Christian religion.”
The Moral Lie: The Myth of American Exceptionalism
Presidents ranging from Dwight D. Eisenhower to Bill Clinton were fond of quoting French historian and political scientist Alexis de Tocqueville, who is often credited with saying, “America is great because she is good, and if America ever ceases to be good, she will cease to be great.” The problem is not only that de Tocqueville never said this; more significantly, it isn’t true. America has never been uniquely virtuous, regardless of what standard of “goodness” we apply. The history of the United States is a narrative of conquest, displacement, genocide, bondage, and white supremacy.
Consider for a moment when exactly America was good. Was it in 1492, when, sailing under the flag of Christ, Christopher Columbus enslaved, murdered, and all but wiped out the Taíno people? Was it in 1619, when the first slave ships arrived in Virginia? Was it in 1637, when Puritan soldiers surrounded a Pequot village and slaughtered its inhabitants, mostly women and children, in their sleep? Or what about when former Methodist minister John Chivington led a band of Christian soldiers to Sand Creek, Colorado, where they murdered and mutilated over 150 Cheyenne women and children before carrying off their body parts as trophies? Maybe it was when the U.S. dropped two nuclear bombs on the civilian populations of Hiroshima and Nagasaki, killing over 200,000 people and decimating the largest Catholic community in Japan. Or during America’s secret war in Laos, when the United States military dropped more than two million tons of bombs, many of which are still maiming and killing civilians to this day. The truth is that America, like every other nation, has often failed to live up to her highest ideals. That doesn’t make her exceptional; it makes her very ordinary indeed.
The Theological Lie: The Myth of God and Country
The temptation for Americans with a proud religious heritage is to tell our national story as if it were also God’s story, conflating Christianity with American identity by merging, and thereby corrupting, the story of God’s Kingdom with the narrative of empire. Phrases like “God Bless America” or “God and Country” communicate an ignorant, if not intentional, amalgamation of God’s ultimate purposes for humanity with the goals and aims of the United States of America.
There is a commonly held belief among white evangelicals that the U.S. holds a unique, covenant relationship with God. In The Light and the Glory, Peter Marshall and David Manuel’s widely popular American history book, the authors claim that “God has chosen America as the New Israel, to be a light in the world.” But truth be told, no geopolitical entity enjoys divine preference over any other. God does not bless America at the expense of other nations. To claim so is an act of patriotic heresy.
The Kingdom of God can never be reduced to the imperialistic goals of any country, if for no other reason than that, by their very nature, nation-states are built for the benefit of some at the expense of others. Nations are defined by boundaries, and they unite around race, religion, and language, meaning that in almost every modern country there is a clear distinction between who belongs and who doesn’t. But the global Kingdom of God defies such man-made distinctions. Likewise, the kingdoms of this world operate with power over people, while Jesus commands his followers to exercise power under people. America in particular wages preemptive war against her enemies; Jesus calls us to love ours.
The Kingdom of God doesn’t look like America; it looks like Jesus with arms outstretched, dying for the very people who are killing him. There is indeed a blessed nation whose God is the Lord, but it is not the United States. She is the Bride of Christ, this beautiful community of resident aliens called out of every tribe, tongue, people, and nation to be God’s chosen people and special possession.