On the subject of war, you'd expect religion to provide solid ethical guidelines - that's what it does best, after all, say its adherents. There may not exist literal deities, messiahs, prophets - but at least the holy texts give us some firm basis by which to lead a good life and maintain order in society. War is an issue that has troubled thinkers since biblical times and before; it has enabled the greatest atrocities imaginable to take place practically everywhere on Earth. War is the most violent, unforgiving aspect of modern existence; surely we must turn to religion for guidance here?
Take Christianity. The first Christians were absolute pacifists, who held that violence was inherently wrong, completely unacceptable and wholly avoidable. Jesus instructed them explicitly: "if someone strikes you on the right cheek, turn to him the other also", "love your enemies, and pray for those who persecute you", "blessed are the peacemakers". They refused to join the Roman army on principle, and for this were persecuted.
The Emperor Constantine converted to Christianity in AD 312, and with him all of Rome. However, the Emperor desired to wage wars for the good of his people, and this conflicted with the teachings of Jesus. So, did he honour the fundamental basis of his religion and act accordingly? No. He waged war regardless.
Later, Christian theologians, most influentially Thomas Aquinas, attempted to justify violence using religion, producing Just War Theory, which has since been adopted by most developed countries and has shaped United Nations policy. Today, most Christians see no problem with sending their children to war, or with supporting and voting in favour of conflicts around the globe.
My conclusion: Christians are by definition followers of Christ, and if he taught ONE thing, it was that war, violence and conflict can NEVER be justified. Yet this message is ignored, subordinated to the cunning logic of warmongers.
So what is the goddamn point of Christianity?