What did Black Americans do after the Civil War? Did they get their civil rights?

1 answer

Answer


2026-04-27 16:36


Yes, but only temporarily. Once the North stopped stationing troops in the South, prejudiced Southerners took matters into their own hands and stripped those rights away.

Black Americans began receiving rights in 1865, but with the rise of anti-freedmen organizations such as the Ku Klux Klan (founded 1865), those newfound rights were steadily taken away. The later Civil Rights Movement was Black Americans fighting to regain those rights and to achieve a more equal standing in America.


Copyright © 2026 eLLeNow.com All Rights Reserved.