Seems there is debate over whether or not America started out as a Christian nation. My thought is that it was established on Judeo-Christian principles, but some of the Founding Fathers were far from Christian (George Washington, Benjamin Franklin, and Thomas Jefferson, who had his own mistress on the side and rewrote the Bible).
Did each time period have its struggles and issues to deal with? You bet!
I think if we elevate America as a Christian nation, we do a disservice to the truth that the Kingdom of God is bigger than any one nation and that our citizenship ultimately is not here on this earth. I'm all for patriotism, but I'm cautious about blending political agendas with Christianity.
I think God establishes the authorities He does for a time and that we are to respect them. Thoughts? Do you think America is a Christian nation, or that it once was? Some point to the Founding Fathers, others to the '50s (when the good ole hymns were sung, right?). And sometimes I wonder why the church gets caught up in all this. Interestingly enough, our non-instrumental brethren don't seem to struggle as much with this patriotism issue in their churches. (Some don't have the American or Christian flag in their assemblies.)