Why do historians say the ending of World War 1 led to World War 2?

1 answer

Answer


2026-03-22 15:45


When World War 1 ended, the victorious Allies placed the main blame for the war on Germany. Under the Treaty of Versailles, they forced Germany to pay reparations for the damage and placed strict limits on Germany's military strength. This left many Germans resentful and humiliated, and they turned to a man who promised that Germany would return to power. That man was Adolf Hitler.

