I keep reading articles that no longer refer to the US as the US or America.
Instead they refer to the US as "The West".
Why is that? What does it mean? And why the terminology change now?
The world is navigated using lines of latitude and longitude measured west or east of the Greenwich meridian. The former British Empire established that system ...to help it "conquer" the world (i.e. "the good old days"). 😁
America (the American Empire), as the world's superpower in the West (i.e. the Western Hemisphere), is generally the focus of references to "Western Civilization". For a long while, it was the land to escape to if you were to the east of Stalin's wall.