Should You Just Let A Man Be A Man?
With so much going on about male/female roles, and how men are now happier than women according to the NY Times, should we just go back to old fashioned values?
Back then, relationships were clearly defined. They still had their share of problems, but there were no questions about who was paying the bills and who was doing the dishes. Or should men be more like women, and women more like men, in order to create balance and reciprocity?
What do you think? I prefer tradition, but that’s just me. I don’t have time to reinvent the wheel and figure things out along the way. I am way too impatient. Just let me know what I have to do! Take the poll below.