posted on Nov, 27 2005 @ 11:33 PM
Originally posted by apex
Anyone have any thoughts/reasons why it is happening?
My guess is that it's now legal in the West not to pretend to have faith (even though in some places you still have to officially align yourself with a
religion or pay religious taxes), combined with the information age.
It's difficult to keep people in the dark when they are informed, even if the information is dumbed down and sold as entertainment.
The revelations from the Dead Sea Scrolls and the Nag Hammadi library are only now making their way to laymen, showing that the common vision of
early Christianity is plain wrong. Many people who have never given much thought to their religion (which is most everyone) will simply abandon it
when confronted with this.
In the West (which is what you were really referring to), Christianity has been the dominant religion for close to 2000 years, and this recovered
information is finally taking its toll, after having been suppressed by physical force since the 3rd century.
I don't know if there is a similar rejection of religion happening in other parts of the world.