The Civil War changed public opinion about the capabilities of women. While the men were away at war, women took over the farms and plantations and worked in factory jobs that had previously been thought inappropriate for women.