America -- Never an Empire

The United States has never been an aggressive power. Only when the Germans insanely attacked American commercial shipping on the high seas did the United States enter World War I, just as Russia was defeated and left the war. The Americans provided the final margin of victory for the beleaguered French, British, and Italians, who between them had suffered some four million war dead and nearly seven million wounded. The Americans then turned their backs on Wilsonian internationalism and their president’s League of Nations, and emerged from isolation only once Franklin D. Roosevelt, who spoke German and French, knew Europe well, and whose family’s fortune had been earned in the Far East, concluded that the United States alone could keep the British Commonwealth in the war, ensure that Stalin did not make a separate peace with Hitler (as he had attempted to do with the Nazi-Soviet Pact in 1939), and prevent Japan from overrunning the entire Western Pacific and Far East.
