America is not imperialist. America has not claimed any land as its own since Hawaii and Alaska. Admittedly, that was in the mid-20th century, but America did not claim any sizable territory of its own for most of the 19th century either. While America was a "power," it was not imperialist. So the answer to your question is "no." Also recategorizing.