I keep reading articles that no longer refer to the US as the US or America.
Instead they refer to the US as "The West".
Why is that? And what does it mean? And why the terminology change now?
Even "America" is a terrible name for the US. North America is the continent, The Great Country in the south should be known as The United States, not America. 50 sovereign states United under the Constitution. This is why Canada is called Canada and has provinces because we are all just provinces under the Queen. The oldest United States money and documents doesnt even say America on them, no one even saw themselves as American, people were Bostonians, Texans, Georgians ect.
It's short for "western civilization". Here's one of many articles that gives their take on what "western civilization" is:
https://study.com/academy/lesson/what-is-western-civilization-definition-overview.html
This is correct. However, currently, it is used to refer to the Rothschild countries.
I prefer being called occidental....
It's a contraction of "Western Hemisphere", with "the West" including Western Europe. America was (re)discovered by ships sailing west from the Greenwich meridian. The current conflict is between the interests of Anglo-Saxon culture and Eastern Slavic culture.
The world is navigated using lines of latitude and longitude, measured west or east of the Greenwich meridian. The former British Empire founded the system ...to help it "conquer" the world (i.e. "the good old days"). 😁
America (the American Empire), as the world's superpower in the West (i.e. the Western Hemisphere), is generally the focus of references to "Western Civilization". For a long while, it was the land to escape to if you were east of Stalin's wall.
I feel like it's conditioning for the one world government. I know they mean Europe and others besides the US, but it seems like what they always try to do.
Or, you could be reading too much into it. There’s nothing new about referring to the US as “the west”.
There is nothing new about conditioning.
I use "The West" when it is not entirely the US of A. The Brits or the EU are often co-conspirators along with Canada, Australia and New Zealand.
Tells you who is in charge of the people writing the articles. The East.
America isn't the name of a country, either.
Otherwise it makes as much sense as calling China "the East", calling Egypt "Africa", or calling Australia "Oceania", etc. These are continent/region names, not country names. "West" is a region on a map; America is a whole-ass continent, of which the USA makes up less than 25%.