Let me clarify something before you delve into this post: I love women! I am a woman; women are great, and we can do amazing things; being able to push babies out of our bodies is one of them. I am absolutely not opposed to women working. But I am opposed to women resisting their natural femininity and "womanliness" while growing out their armpit hair, acting like sleazy men, and hating children.
Over the last three decades, women have become less and less happy. Interestingly enough, this decline began around the same time feminists started pushing the popular narrative that women need to enter the workforce. Women are naturally caring, nurturing, and sympathetic, but feminists completely disregard our biological differences and insist that we should live the same lives as men. I don't know when feminists decided that femininity was "socially constructed" and needed to be abolished, but I do know that this decision has negatively affected the lives of many women.
Research has found that stay-at-home mothers are happier than those who work outside the home. Now, if you knew you could be far happier at home cleaning, cooking, and caring for your family, would you still go out and work a stressful 9-to-5 office job? No? Neither would I.
Today, most women are taught from a young age that they don't need to rely on a man and that they can do anything a man can do, when in reality women rely on men and vice versa. The very basis of female nature is to care for and nurture your husband, and then the children you have with him. We should be teaching young girls to embrace this, not destroy it. Then maybe the forty-year-old, single women who reject womanhood altogether will quickly become a minority in our society.