No, the colonies were not a continent. They were separate regions established by various European powers in North America, and the ones that eventually became the United States were part of the continent of North America.