What happened to the colonies once ruled by Germany after World War 1?

1 answer

Answer

The Treaty of Versailles (1919) stripped Germany of all its overseas colonies. They were redistributed as League of Nations mandates, with most going to Britain and France to administer, and others to Belgium, Japan, Australia, New Zealand, and South Africa.
