A new book seeks to counter the trend in academia and popular literature of depicting American history as a relentless trampling of human rights by an intolerant Christianity. But does this counteroffensive succeed in proving America's essentially Christian, and liberal in the best sense, character?