No, Florida did not become English territory through a Treaty of Paris of 1793. Florida was ceded to Britain by Spain in the Treaty of Paris of 1763, at the close of the Seven Years' War. Spain then regained Florida in 1783 under the Treaty of Paris that ended the American Revolutionary War. The date 1793 appears to be a confusion with 1763; no Treaty of Paris concerning Florida was concluded in that year.