Tsar wrote:
In the Western world, it seems Christianity has died.
Many people who call themselves Christians do not abide by the moral tenets of the faith.
The Catholic Church doesn't do enough to speak out against homosexuality or sexual immorality, primarily so as not to risk offending anyone. But how can a religion survive if it does not teach its young believers its true tenets and how to be true believers? Whitewashing the facts essentially undermines a religion.
Many Protestant churches have become much worse in these respects. Some will "marry" homosexual couples, profiteering while abandoning the moral tenets of Christianity. The Bible specifically condemns homosexuality.
Many people in North America, Australia, and Western Europe claim to be Christians but do not follow the major moral tenets or live up to Christianity's true meaning. They support homosexuality, promiscuity, and binge drinking, and they see no problem with it, which means they are secular rather than real believers.
It doesn't help that schools, government, the media, Hollywood, and some organizations that hope to remove religion all target Christianity in many Western countries. Newer religions in America, like Mormonism, seem to have a more devout following.
Does Western civilization need a newer religion that the media hasn't discredited to help reverse these trends?
In the West, it would seem that Christianity is fading, but the worldwide reality is quite different: the percentages of those who self-identify as Christian have shifted sharply, so that Africa and Latin America are now far more influential within global Christianity.
Good source here: http://www.pewforum.org/christian/globa ... -exec.aspx
Outwest