I keep reading articles that no longer refer to the US as the US or America.
Instead they refer to the US as "The West".
Why is that? What does it mean? And why the change in terminology now?
Tells you who is in charge of the people writing the articles. The East.