at what point did a woman — most likely one who made money with her body somehow — say, "geez, self, let's scrape away the hair under our arms that wicks away heat and moisture." or, "now that we've done that, self, our legs look awfully hairy when i'm up there on stage/lying on my back/posing for that nudie portrait."
i'm not knocking it, nor am i saying that i'm not a hair-remover-er myself. i just want to know at what point in our history it happened. who came up with it? how did it become mainstream — even expected in the civilized world?
and have you seen the new skin-whitening deodorant? for those of us with unsightly dark splotches in our 'pits. when will i have to buy that?