Page 3 - Final Project
What happened before the Second World War?
Aftermath of World War I
World War II is generally viewed as having its roots in the aftermath of World War
I, in which the German Empire under Wilhelm II, together with the other Central
Powers, was defeated chiefly by the United Kingdom, France, and the United States.
The victors blamed Germany entirely for the war and all resulting damages, since it
was Germany that had effectively opened the war with an attack on France through
Belgium. France, which had already suffered defeat in the Franco-Prussian War of
1871, demanded compensation for the financial devastation of the First World War.
This ensured that the peace treaties, in particular the Treaty of Versailles, imposed
tough financial war reparations and restrictions on Germany in the aftermath of
World War I.