World War I was officially ended with the signing of which treaty?

1 answer

Answer

2026-03-31 21:30

World War I was ended by the Treaty of Versailles, which placed the blame for the war on Germany. Resentment over this treaty, along with the harsh reparations it imposed, contributed to the rise of Adolf Hitler as Germany's dictator and, in turn, helped lead to World War II.
