Hey guys,
It seems that because America is currently the world power, it controls the narrative. Everything we see, and everything the rest of the world sees, is "filtered" through the lens of the American perspective. For example, "America" used to be the name of a continent, and it became the name of a country.
For instance, there are Americans who travel and will ask, "How dangerous is it in Colombia, Thailand, Mexico, etc.?" They sensationalize this view, and American media outlets like Vice amplify it, which leads to people being scared to travel abroad. Meanwhile, guys like Winston and others who actually travel to those countries have a more impartial view of them. I have heard from people who felt safer in Mexico City than in some parts of Chicago.
It seems that when an empire declines, what follows is propaganda against that empire, like the "French bashing" directed at French people, or the propaganda against the Spanish Empire.
If America stops being the World Power, do you think voices like ours will be amplified?
- ArchibaultNew