Are Romance Novels Bad For Women?


A psychologist thinks romance novels give women unrealistic expectations about love.

That's not all Quilliam warns against. As a relationship counselor, she's aware of how susceptible some women are to romanticism. Becoming overly influenced by romance novels can lead to full-on panic when real-world dilemmas arise. Romance novels tell you that a baby strengthens a relationship, but real-life relationships can spiral downward when a couple becomes parents. Romance novels also tell you that your patience will be rewarded by the arrival of Prince Charming, but the fact is that some women will die alone.

So what's the takeaway here? While Quilliam points out that romance novels are less misogynistic, and their heroines less passive, than they were several decades ago, she argues that they still carry an inherent tendency to mislead the naive, just as romantic comedies often can. It's not as if romance novels should be demonized; after all, video games and heavy metal music have been accused of promoting violence among men for years now, so it only follows that women have their own stigmatized form of entertainment.

Do you read romance novels? Do you agree that they could give women the wrong idea about love?
