After WWI, the Treaty of Versailles was imposed on Germany by the victorious nations. The treaty forced Germany to accept full responsibility for causing the war and to pay reparations for the damage done during it. So really, peace was never truly made after WWI; 'compromises' were made instead. Inevitably this humiliated Germany and severely damaged its economy, which is why Hitler dedicated many years of his life to making Germany 'great' again. He blamed Germany's problems on the Jews, which eventually brought about WWII. I'm not sure how they made peace after WWII.