What land did the United States gain in the Treaty of Paris?

Answer

In the Treaty of Paris, signed in 1783, the United States gained all territory east of the Mississippi River, north of Florida, and south of Canada, including the land that would later become the Northwest Territory. Britain also formally recognized American independence. In the broader peace settlement, Britain ceded Florida back to Spain under a separate treaty, which is why Florida was not part of the new nation's borders. This cession gave the United States a vast area for future growth and settlement.
