I don't know how popular this really is. I live in a pretty liberal area, and I don't often see men wearing nail polish, and I can't remember the last time I saw a man (non-trans) wearing makeup unless he was goth or performing on a stage or something.
Most men in a society will do things that women find attractive, and will avoid things women find unattractive. Men are often told that women want a man who is sensitive like a woman and willing to step outside gender norms, but in most cases this really isn't true. Men who are successful with women can tell you this. Men who were traditionally masculine were also often more likely to succeed in life. Both are reasons fathers have wanted their sons to be masculine: they wanted them to be successful and to find a mate. (Though in my case, it was about my father being embarrassed to have a not-so-masculine son, not about concern for my future well-being.)
I think the bigger issue is not so much men indulging in traditionally feminine things, but the villainization of masculinity. The mass media drones on and on about "toxic masculinity," "manspreading," "mansplaining," and so forth, about how evil it is to be traditionally masculine. It's getting old, and even men like me who aren't particularly masculine are getting tired of it.