I keep reading articles that no longer refer to the US as the US or America.
Instead they refer to the US as "The West".
Why is that? What does it mean? And why the change in terminology now?
Even "America" is a terrible name for the US. North America is the continent, The Great Country in the south should be known as The United States, not America. 50 sovereign states United under the Constitution. This is why Canada is called Canada and has provinces because we are all just provinces under the Queen. The oldest United States money and documents doesnt even say America on them, no one even saw themselves as American, people were Bostonians, Texans, Georgians ect.