I feel my life has led me to wonder if some people will ever stop treating men better than they treat women.
I don't mean these people always like men better. They might "like" women more sometimes. But they still favor men in EVERYTHING.
Women with health issues deserve to be treated much better.
Women's healthcare has a long, long, long way to go.
Married moms being treated better than single moms is just sick, and it has always been a boring and unfun aspect of life under the patriarchy.
Last names. Who is always expected to change theirs when they get married?
Who gets treated worse when reproductive rights get restricted?
Treating a few women married to prominent men with more "fake respect" than you show the women who labor in awful conditions for the food you eat is not admirable or sustainable in an era of climate change.
Catering to men who make excuses for rape culture, constantly treating men better than women, and getting a portion of the female population to say they don't care as long as there's $$$ in it for them, and that's normal? Normal for whom? Acceptable for whom?