I don't think I'm using the right word here, but... follow along... What is it with society that makes us think the norm for, let's say, men shouldn't be the norm for women? For example, a lot of people would turn away in disgust from a woman who doesn't shave her armpits or legs daily, but with guys it's normal to have hairy underarms... Sorry, I'm terrible at trying to get a point across.