I keep reading articles that no longer refer to the US as the US or America.
Instead they refer to the US as "the West".
Why is that? What does it mean? And why the change in terminology now?
America isn't the name of a country, either.
By that logic, it makes as much sense as calling China "the East", Egypt "Africa", or Australia "Oceania". Those are continent or region names, not country names. "The West" is a region on a map. America is a whole continent, and the USA makes up less than 25% of it.