A new study showing that most Tea Partiers say America is a “Christian nation” has reignited an age-old argument about our nation’s roots. Traditionally, the debate has been polarized, with social conservatives like Fox News host Glenn Beck claiming our country was founded as a sacred Christian nation and left-leaning thinkers asserting that America was, and should remain, a non-religious country. Proponents of both views can drum up quotes from historical figures to support their positions.
Recently, however, a third way has emerged among a surprising demographic: younger Christians. In the last three years, I’ve conducted hundreds of interviews and focus groups with the next generation of Christian leaders and found a new view that threads the needle between the left and the right.
Rather than view America’s founding as either wholly secular or wholly sacred, many say they believe we are a country influenced by Christian ideas. On the one hand, they recognize that many early patriots and politicians were deeply influenced by their faith. Such influence can readily be seen in the many American icons and traditions where God is acknowledged.
On the other hand, they are quick to point out that being influenced by such ideas does not equal the establishment of a Christian state. They maintain that the founders did not intend to legislate or privilege any one religious viewpoint over others. As the Treaty of Tripoli, signed by John Adams and ratified unanimously by the U.S. Senate in 1797, states, “the Government of the United States is not, in any sense, founded on the Christian religion…”
In times past, religious Americans have been some of the most vocal opponents of this way of thinking. The late Moral Majority founder Jerry Falwell famously claimed that America “was founded by Christians as a Christian nation,” and evangelicals of his day followed him in droves. According to Barna Research, some 43 percent of Protestants today still believe it would be good to pass a constitutional amendment making Christianity the official religion of the United States.
But support for such views is waning among the general public. According to a 2009 Newsweek poll, the percentage of people who consider the United States a “Christian nation” has fallen nine points over the last five years, seven points in the last year alone. Public sentiment in this debate is shifting, and the next generation of Christians is shifting too—albeit to a new paradigm.
READ THE FULL ARTICLE at The Huffington Post
What do you think? Was America founded as “a Christian nation?”