Not only was the United States founded as a Christian nation, but even before we were a nation the continent itself was explored and settled, again and again, with the main goal of spreading Christianity to people who did not know Jesus Christ. Contrary to the modern leftist narrative, the goal was not to rape, pillage, and plunder. Listen to the actual words of the documents and charters that began Western civilization in the Americas.
The American Soul Podcast
https://www.buzzsprout.com/1791934/subscribe