I’ve been thinking about religion lately, looking at it from a secular perspective. As we all know, the West has all but abandoned Christianity: what little of it remains is not very different from progressivism, and progressivism itself seems to have morphed into a new kind of religion, with its own processions, prayers and…
Was it bad for the West to abandon religion?
