In recent weeks there have been several articles about America as a post-Christian nation. The phrase could mean many things, and I wonder if this article represents what those who say they have no religion are feeling. They are still believers, but they don't want to be called “Christian” or identified with a denomination. Maybe they are like Geoff Surratt, who posted this:
You see, I am one of the many Americans who would no longer describe themselves as a professing Christian. I cannot in good faith associate any more with what the label Christian has come to represent in America. Christianity is now a set of political views, a way to distinguish different groups of people (Jews, Muslims, Christians, Hindus), a movement to impose a certain view of morality on others regardless of the condition of their hearts.
We must return to our foundations. We must be relevant and current, but true to the foundational principles.
I’m not talking about doctrinal issues. I’m talking about equipping the saints and releasing them into the fullness of their calling.
We must stand for righteousness, but it’s not enough to have an opinion on an issue; we must steward God’s people. The church must be a place where God’s people can thrive and serve in the fullness of their calling, not just come and listen to those who are paid to preach, teach, or sing.
Do you agree with Geoff?