Nearly half of Americans believe the United States should be a Christian nation, according to a new study from Pew Research.
The survey, conducted between September 13 and 18, 2022, found that 45% of Americans want the country to be a Christian nation. Six in ten respondents said America's founders originally intended the country to be a Christian nation, and 33% said America is currently a Christian nation.
Six in ten American adults say America’s founders originally intended the United States to be a Christian nation. 45% say they personally think the United States should be a Christian nation. https://t.co/RElFoQFX7V pic.twitter.com/1EQIrOwR0I
— Pew Research Religion (@PewReligion) October 27, 2022
“There are a lot of Americans – 45% – who tell us that they think the United States should be a Christian nation. That’s a lot of people,” Greg Smith, who helped co-write the survey, told Religion News Service. “(But) what people mean when they say they think the United States should be a Christian nation is really very nuanced.”
Additional findings revealed that 47% of Americans believe the Bible should have "a lot" or "some" influence over American law, and 27% believe the scriptures "should have more influence than the will of the people" whenever the law and the word of God contradict each other.
As Faithwire reports, a strong majority (78%) of Americans who say the United States should be a Christian nation said the word of God should have "a lot" or "some" influence on American law. Conversely, 21% said the scriptures should have "little or no influence" on the nation's laws.
The Pew Research study also examined the intersection of religion and politics. According to the survey, most American adults (77%) say churches and other houses of worship should not endorse candidates for political office. More than two-thirds of respondents (67%) believe religious institutions should stay out of political matters rather than share their views on social and political issues.
When asked what they meant by saying "America should be a Christian nation," respondents offered a variety of answers. Some defined it as the country having Christian leaders and principles rooted in Christian values. Others said America being a Christian nation has to do with the United States having a predominantly Christian population, while still others said that having good morals and respect for one another makes the country a Christian nation.
Photo credit: John Silliman/Unsplash
Milton Quintanilla is a freelance writer. He is also a co-host of the For Your Soul podcast, which seeks to equip the church with biblical truth and sound doctrine. Visit his blog, Blessed Are The Forgiven.