What did World War 2 teach Americans?

1 answer

Answer

1000423

2026-04-05 23:30


World War 2 taught (some) Americans that they needed to be more tolerant of other races. When they heard of the horrors of the Holocaust, they realised that Americans were as intolerant of African-Americans as the Nazis had been of the Jewish population.

