One of the deepest roots of WWI was the rivalry between France and Germany. When the German states unified into a single country in 1870–1871, France went to war to try to stop it (the Franco-Prussian War). Germany won, and France wanted revenge. The two sides began forming alliances, and many countries were drawn in, so that when war finally broke out in 1914 it spread across Europe. It was a tragic moment.
Copyright © 2026 eLLeNow.com All Rights Reserved.