Are Romance Novels Bad For Women?

A psychologist thinks romance novels give women unrealistic expectations about love.
