How did the victorious countries make peace with Germany after World War 1 and World War 2?

1 answer

Answer

1169183

2026-03-20 18:25


After WWI, Germany signed the Treaty of Versailles, which was forced upon it by the victorious nations. The treaty ordered Germany to accept full responsibility for causing the war and to pay reparations for the damage done during it. Inevitably this humiliated Germany and severely damaged its economy, which is why Hitler dedicated many years of his life to making Germany 'great' again; he blamed Germany's problems on the Jews, which eventually brought about WWII. So really, peace was never fully made after WWII, only 'compromises' were reached. I'm not sure exactly how they made peace after WWII.


Copyright © 2026 eLLeNow.com All Rights Reserved.