Did England end up getting Florida and Canada?

1 answer

Answer


2026-05-12 03:00


Not permanently, in both cases. England gained control of Florida from Spain under the Treaty of Paris (1763), but returned it to Spain in 1783 after the American Revolutionary War. Canada, on the other hand, was retained by Britain following the same treaty and became a significant part of its colonial possessions in North America.
