Are Nazis German?

1 answer

Answer

1053215

2026-04-13 04:15


Since the founding of the Nazi Party, there have been members all over the world. The majority lived in Germany until the end of the Second World War, but others lived in countries such as England and the United States.

Today, people who call themselves Nazis are generally white males living in Western Europe and the United States.


Copyright © 2026 eLLeNow.com All Rights Reserved.