Liberalism is not new or progressive, but a sign of decay
Posted: May 26th, 2014, 12:47 am
When I met fschmidt, he told me something interesting.
He told me that liberalism is nothing new and it is not progressive at all. It is not an advancement in society, nor does it make a culture more evolved. It is not even a positive change.
Historically, liberalism has always come about when a culture was in decline and decay. It has never made a culture or society stronger or more prosperous, but the exact opposite. For example, after Alexander the Great died, ancient Greece went into a state of moral decay. Liberalism erupted, bringing feminism, gay rights, promiscuity, rejection of traditional values, etc. This weakened Greece so much that when the Romans arrived, the Greeks were too weak to resist them.
Likewise, ancient Rome went into a state of decay in the 400s AD. The same liberalism took over and resulted in moral degradation. People became degenerate, and that weakened the empire to the point of internal collapse. Rome itself was sacked by Germanic tribes in 410 and again in 455, and by 476 AD the Western Roman Empire had fallen.
Now the same is happening to America.
Is that true? If so, then why do liberals and their "educated leftist intellectuals" think that liberalism is something new and progressive? Why don't American historians explain to liberals that liberalism has always been a sign of decay and cultural collapse throughout history?
I think this makes sense. Rejecting traditional values and promoting gay rights, promiscuity, and feminism logically leads to the destruction of the family, society, men, masculinity, femininity, and all that is wholesome and natural. So how can liberals expect such values to lead to the prosperity of America? Are they crazy and deluded? Or just evil? What motivates them?
How can making women independent so that they don't need men, which in turn leads to depressed, lonely, sexually deprived men as well as the breakdown of families and relationships, lead to a better and more prosperous society? Wtf are liberals smoking? When you destroy families and men, you destroy your own society and nation. Duh. What could be more obvious and logical?
I've never understood the appeal of liberalism. What good does it do? What's in it for me? If you aren't gay or a feminist, what's in it for you? It doesn't fulfill any of my needs at all. It's just pointless drivel with harmful, dangerous consequences for society. And why do young people tend to lean toward liberal values?
How does liberalism begin anyway? Fschmidt told me that it begins when a society is so prosperous that its citizens become arrogant and spoiled and start wanting more power and special rights. So they campaign for more power, and it escalates into liberalism and the breakdown of traditional values.
Fschmidt also told me something else that was interesting. He said he had researched books that proved without a doubt that the valuation of female chastity has ALWAYS correlated with prosperous societies and cultures at their apex. Historically, societies that valued female chastity more have always prospered.
But I'll let him elaborate on that if he wants.
He also told me that it's bad and wrong for women to dress in a sexy, slutty way and tempt men in public whom they have no interest in attracting. This is why Muslim countries don't allow them to do that, and why in the 1800s women were required to wear long dresses that covered their arms and legs. It's an insult to men to tempt them for no reason.
When I asked him why a famous critic of American policy like Noam Chomsky can expose US government crimes, get away with it, and keep his professorship at MIT, he told me that it's okay to be anti-American as long as you aren't anti-liberal. Being anti-liberal is the biggest taboo and will cause you to lose your job.
What do you all think?