I feel like the Allied leaders were globalist pieces of shit. They basically did their best to starve Germany to death. Maybe they fucked up the Middle East on purpose so they would have forever wars and weapons to sell.
It seems logical to me that the British Empire, if they were assholes, would want to take out Germany before it got too strong. Maybe they duped Germany into it.
Does anyone have some "alternate history" insights?
You would think that after WWI Britain's hold on the #1 empire spot would be stronger than ever: they crushed the rising Germany and gained a bunch more territory. But it seems like the world's main empire soon became the US without much of a fight.
Why did Britain lose its status so fast? Was the decision made by the people who really own/control things to switch the main HQ to the US? And why pick the US, a country with a history of unruly freedom fighters who've taken the fight to them before, instead of the UK, which has a culture of "loyalty"?
For whatever reason, the British Empire was systematically dismantled. I guess they were confident enough that they had captured the USA with the introduction of the Fed a few years earlier.
It's hard to critique their decisions earlier in the century from a strategy standpoint. Those decisions set them up to run the table and capture the western world. Nearly a century later, the world is only now starting to mount a defence.