Since I became a Christian in 1987, I've constantly heard the phrase "Culture Wars" thrown around. It usually refers to the clash of two perspectives in America: a more traditional worldview, grounded in a Biblical understanding of how society should function, and a more liberal view of culture, in which old perspectives are challenged and replaced with new ideas about everything from marriage to education to economics to the power of government.
Christians today find themselves in a tough spot. As the culture shifts further from the Christian ethos long assumed throughout American history, we as followers of Christ remain firmly planted in our convictions on essential values and beliefs. The result: the farther society moves from the way things used to be, the odder evangelicals look to outsiders.