I keep reading articles that no longer refer to the US as the US or America.
Instead they refer to the US as "The West".
Why is that? What does it mean? And why the terminology change now?
It's short for "western civilization". Here's one of many articles that give their take on what "western civilization" is:
https://study.com/academy/lesson/what-is-western-civilization-definition-overview.html
This is correct. However, currently, it is used to refer to the Rothschild countries.