I was going to share a message I picked up earlier from the media about the roles of men and women, one that led me to make personality changes. Now I'm thinking, or feeling, that I was sold a bill of goods for someone else's advantage.
Instead, I would like to open it up. What have you heard about the role of men or women that made sense to you at first, but that you are now questioning, both the message and its impact?