Freedom From Discrimination

Women's rights in the United States

Overview

Women face discrimination around the world. In the U.S., although women's rights have improved, some people mistakenly believe that women no longer face discrimination here. Yet women earn less than men, experience higher rates of gender-based violence, and hold fewer top positions in politics and business.