"Do you work?" It's either that, or "I don't know if you work, but..."
This is one of the more interesting and frankly surprising things I've noticed when meeting and socializing with other women. People don't automatically assume that I have a job, as they did before my transition.
It's 2017, but I guess it's still true that a greater proportion of men work than women? Or maybe it's because I live in the suburbs, where women staying home is more common?