War has long been a source of contention between nations, and the United States is no exception. Throughout its history, conflict has shaped not only the physical world but also the political landscape. From seismic shifts in alliances to the redefinition of national identity, wars have defined American society.