America - the New Christian State?
Has anyone else noticed how Christian-polarized the US has become since Bush came to power?
I'm not religious myself, but if the US really were to become outwardly Christian, what would that mean?
Personally, I don't like my politics and laws being influenced by religion.
My gut says it would be the end of the America we know now.
Thoughts?