No, Britain did not permanently hold both Florida and Canada. Britain gained control of Florida from Spain under the Treaty of Paris (1763), but returned it to Spain in 1783 in the peace settlement that ended the American Revolutionary War. Canada, by contrast, was ceded by France in the same 1763 treaty and remained British, becoming a significant part of Britain's colonial possessions in North America.