Is the United States of America a Christian nation?
America was not founded as a Christian nation; the Constitution makes no reference to Christianity. The more interesting question is whether America became a Christian nation. For the better part of its history, the United States and the most prominent form of Christianity in America, mainline Protestantism, were intertwined.
How did religion influence the founding of the United States?
The exhibition Religion and the Founding of the American Republic demonstrates that many of the colonies that in 1776 became the United States of America were settled by men and women of deep religious convictions who crossed the Atlantic Ocean in the seventeenth century to practice their faith freely.
When did Christianity become popular in the United States?
When deism and open ridicule of religion became popular among college students, physicians, and Western settlers in the 1790s, evangelical Christianity gained popularity as a reactive force against atheism and as a source of new constructions of American nationhood.
Why were the American colonies founded in the seventeenth century?
Many of the British North American colonies that eventually formed the United States of America were settled in the seventeenth century by men and women who, facing religious persecution in Europe, refused to compromise their passionately held convictions and fled across the Atlantic.