feminism

Feminism | ˈfem.ɪ.nɪ.zəm | noun | politics

  • Feminism is the belief that women should be allowed the same rights, power, and opportunities as men and be treated in the same way, or the set of activities intended to achieve this state.

Source: Cambridge Dictionary
