Just about everyone looks better when they smile. It’s true regardless of gender. I don’t see where sexism enters the equation.
I feel pretty oblivious. What am I missing?
Tell us a story of the last time you witnessed someone telling a man to smile because he would look so much better if he did.
I can’t, either. That’s why.
Well, people have frequently mistaken me for a woman for most of my life, and this has happened to me a few times. It's intrusive and irritating to be told I should look a certain way, especially by a stranger, and I would consider it rude to say to anyone unsolicited.
That's not to say it's not worse for women, who have to deal with the objectification layer on top of it.
Nobody asked Luigi to smile; he just did…
Gigachad Sigma Grindset 😎
I agree with the sentiment, but I, a man, actually have customers tell me to smile weirdly often while working retail.
And they say it’s because “you’re prettier when you smile” or something like that?
That’s one exception that doesn’t surprise me. Do you have any sense of how often they are doing this with intentional irony compared to with genuine obliviousness?
It’s really only creepy old dudes I get it from. It seems pretty genuine most of the time. These comments are more frequent and more egregious with my women coworkers, though, as one might expect.
How interesting! That makes it even less surprising.