There is a popular teaching floating around in Evangelical circles.
It says that Christians are actually exiles in America, and exiles in the world.
But this is not yet true for Americans. We still have a voice.
We still have a right to make our voices heard, and we still have a duty to do so.
But do we still have the backbone to stand on the truth? Or will we simply go along with the political and spiritual flow of the age?