I keep reading articles that no longer refer to the US as the US or America.
Instead they refer to the US as "The West".
Why is that? What does it mean? And why the change in terminology now?
I feel like it's conditioning for the one world government. I know they mean Europe and other countries besides the US, but it seems like the kind of thing they always try to do.
Or, you could be reading too much into it. There's nothing new about referring to the US as "the West".
There is nothing new about conditioning.