Repatriate wrote: I noticed that American women can be useful for certain occupations... for instance they are good at sales, lawyer, and PR type positions. Those are situations when you need aggressive and conniving people to do the job. American women are a natural fit.

gsjackson wrote: Speaking from the standpoint of having practiced law for several years, I would say that the qualities you speak of have been a plague on the American workplace for about the last 30 years. To take law as an example -- despite its reputation to the contrary, the law can't really function primarily in an adversarial mode. For the system to work, you need lots of negotiating and compromising. When the profession was mostly men, that's how things worked. But American women, tutored by feminist harpies, have got to show how tough they are, and consequently are much less inclined to deal. It's led to a demoralized and far less functional profession.

Yes, I agree. A woman who steps into a position where she has to deal with men will come into the situation with a chip on her shoulder and will feel that she needs to lay down the law, or get tough on everybody out of fear that she won't be taken seriously. Other women should not even be working at all because they are not fair or consistent like a man would be when laying down the law.
And I would disagree that this is something that comes naturally to women, even AW. Witnessing it up close and personal, you can see it as an unnatural thing, something that undermines their true natures.
Anyhow, I have worked in all male environments and it's much better. The work ethic is kept high and distractions are kept to a minimum.