In the early 20th century, Americans tended to be isolationists when it came to foreign policy. For the most part, WWI looked like a European conflict into which America need not enter, and President Woodrow Wilson pledged to keep the country out of the fighting. However, after Germany continued to attack unarmed merchant and passenger ships, the U.S. severed diplomatic ties with Germany.
The Nation at War