Are Gender Roles In Relationships On Their Way Out?

Liza Mundy's new book "The Richer Sex" argues that women are increasingly out-earning men.

Stay-at-home dads and breadwinning moms may be the norm soon, predicts Liza Mundy in her new book, The Richer Sex. She points out that "almost forty percent of U.S. working wives now outearn their husbands," and that traditional gender roles are a thing of the past. It's not surprising, given that society's view of women has rapidly changed in the past century. Hello, right to vote and Samantha Jones! But what does this mean for relationships?

According to Mundy, a "Big Flip" in gender roles is inevitable, and the dynamics of male-female relationships will change drastically. Women will have increased power and will no longer see sex as a way to get a man to commit (and when was that the norm, anyway?). Mundy notes that younger women are having sex with more partners and "are becoming the gender that wants sex more than men do."


Well, I really don't know about all of that — if anything, I think men and women have always wanted sex in equal measure; women are just better at expressing it now. But I can speak to the change in gender roles and expectations in relationships, especially in contrast to my grandparents' relationships, and even my parents'.

Financially, the women in my family have, at this point, either out-earned their spouses or matched their incomes. Women in my family do it all — they have careers, make money, cook meals, run the household, shuttle kids to and from school, balance bank accounts and mortgages, and of course, navigate gracefully through complicated social politics. Sounds like most women today, right?

Interestingly, though, in my family the "power" has still always rested with the men, simply because they are the "man of the house." I wonder if that will still hold true for women of my generation and beyond — will men be expected to act a certain way (i.e., manly), just because they are men?

Plus, we're all still watching romantic comedies where men "save" women who either don't have any direction in life until their prince comes along or are career-hungry "bitches" who "need" a man to see that romance isn't dead after all. And Disney still churns out (or re-releases) their stories of princesses being saved by princes. The idea of what a male-female relationship "should be" is still deeply ingrained in our society. Will this change soon?

What do you think? Is a big male-female role reversal just around the corner? 
