Posts Tagged ‘United States a Christian nation’
It’s Time, America: ‘In God We Trust’ Will Be Seen in All Louisiana Public Schools
One way to interpret the question “Is the United States a Christian nation?” is to ask if the U.S. has a Christian heritage. In other words, do the history, culture, language, and lifestyle of the nation reflect Christianity, and to what extent? This is, by far, the least controversial aspect of this issue, since the…