Feminism in America

Evana Flores and Augusta Tostrud

Respect Women

Women deserve the same rights as men in the workplace. Although the oppression of women is not as extreme in America as it is in some other countries, that does not mean it is acceptable to oppress them at work, deny them positions of authority, or pay them less than their male colleagues!