After World War One, Germany was humiliated, forced to pay reparations, and made to accept blame for starting the war. People were suffering; it was like the Great Depression in America, but in Germany. Hitler came to power promising to restore Germany to its former power, and people were so desperate that they bought into it and didn't question a thing, even when he started telling people to persecute the Jews, blaming them for the fall of Germany's power. At the time, people did not realize they were not dealing with logical thinking; they believed him and questioned nothing he said as long as he kept his promise. And in a way, he did. He took over many countries while in power, and other countries mostly stayed out of it because of appeasement. The U.S. passed the Lend-Lease Act to supply Britain, which was basically fighting Germany alone. The immediate cause of WWII itself was the German invasion of Poland, though that did not get the U.S. involved. The immediate cause of U.S. involvement was the Japanese attack on Pearl Harbor. This is probably more information than you wanted, but ya know :)