Christianity, once the foundation of Western values, is losing its influence. Some see this decline as progress toward secularism, while others believe it leads to moral decay. Has the decline of Christianity weakened Western identity? Can a society function without religious moral guidance?