
Have you noticed that most movies and TV shows made after 1995 tend to portray women as the stronger sex and men as the weaker sex? Why is that? It wasn't like that before.
In modern movies, when a man and a woman engage in physical combat, the woman ends up kicking butt most of the time. And every time they argue, the man ends up shamed into realizing that he was wrong. In action movies where a group of people is fighting for survival, the women always take charge.
Examples: In the recent V series remake, both the Resistance and the Visitors have a woman in charge. In the Resident Evil movie series, the female protagonist, Alice, is not only the strongest, seemingly invincible character who wins every fight, but whenever she finds other survivors of the zombie apocalypse, she takes charge and leads them. Even when she meets other groups of survivors (such as in Resident Evil: Extinction), those groups are also led by other "strong females".
And on sitcoms such as "Home Improvement" and "King of Queens", whenever the husband argues with the wife, he ends up being shamed, admitting that he was wrong and apologizing. Even on radio shows like Radio Delilah and talk shows like Dr. Phil, psychologists and relationship counselors are often brought in to say, "The key to a successful relationship is for the man to learn to listen to his wife."
However, all this contradicts reality and common sense. In real life, men tend to be physically stronger than women, and would therefore usually win in a real fight. And in arguments, women are often proven wrong and shamed, not just men. (For example, I've gotten the last word in many arguments with feminist-minded women, which pissed them off big time because they felt entitled to it.)
Most women also do not do well in leadership roles, as leadership requires the use of reason, rationality and logic. But women are not usually strong in these areas. Women by nature are nurturers. They tend to be conformists with no inner convictions. And they care more about how they are perceived than in doing what's right. (There are exceptions of course, but we are talking about most women here, not all) None of these qualities are good for leadership.
Just look at the wisest figures and great thinkers in history, people who were renowned for their wisdom, philosophy, and great ideas. The vast majority of them were men, not women. Thus, reality stands in stark contrast to how Hollywood and the media portray things.
So why do Hollywood and the modern media like to portray the opposite of reality? Why do they like to flip the natural gender roles - making women dominant/stronger and men submissive/weaker? Why do they want to make women more masculine and men more feminine? What does that accomplish, exactly? I don't get it. It doesn't make sense.
Are they trying to socially program people or brainwash them? If so, for what purpose? I can only imagine that the purpose must be something nefarious, because the effects on society, relationships, family, love and dating have been highly negative and dysfunctional.
Perhaps the conspiracy people are right when they say that the elite (Illuminati/Zionists/Powers that be) are trying to divide us with feminism and the "battle of the sexes" in order to make us weaker, destroy the family unit and lower the birthrate to have a more manageable population? What else could explain this illogical social engineering?
Consider the unnatural, dysfunctional effect that turning men into women and women into men has had on relationships and dating for both genders:
1) A woman by nature needs a man who can be her protector. In order for him to protect her, he obviously has to be stronger than her. Duh. If he is the weaker and more submissive sex, as Hollywood wants men to be, then how can he be her protector? Duh! This can only result in her becoming confused and dissatisfied that the men around her are too feminized and weak to be her "protector". As a consequence, she will have a hard time finding a guy or becoming attracted to one, since her own nature has been flipped upside down.
What's more, she won't understand why either. Instead, she will be left confused, disappointed and messed up due to her true feminine nature being stripped and suppressed.
All this, in addition to the off-the-charts entitlement mentality and pickiness of American women, makes dating in America hell on Earth for men who are not in the top 20 percent.
2) On the other hand, men by nature seek a woman who is feminine (not masculine, you stupid media) and can be a "princess to rescue". But how can a woman be a "princess to rescue" if she is the dominant/stronger one while he is the weaker/submissive one? Duh. Are the elites that run Hollywood and the media stupid, insane or evil? WTF?
Now, I'm not trying to stereotype or put people into boxes, but trying to turn men into women and women into men obviously CANNOT be a good thing. There's no question about that.
By turning both men and women against their inherent nature, the modern feminist culture in the US has left both men and women feeling confused, lonely and dissatisfied. All this has resulted in the degradation of love, dating, relationships and families, as well as of the gender identities of men and women, all of which is highly negative. WTF is going on?