A.I. is here to stay,
and A.G.I. seems to be close,
and we need to at least understand the concepts well enough to explain them to another person...
You need to learn AI in 2024! (And here is your roadmap)
45:20
https://www.youtube.com/watch?v=x1TqLcz_ug0
Sam Altman STUNS Everyone With GPT-5 Statement (GPT-5 Capabilities + ASI)
33:24
https://www.youtube.com/watch?v=-Mca6eN81Is
Sam Altmans SECRET Plan For AGI - "Extremely Powerful AI is close"
22:56
https://www.youtube.com/watch?v=v-grfwbZGc8
Raising $7T For Chips, AGI, GPT-5, Open-Source
27:08
It goes without saying that AI is probably here to stay in some form, like it or not. Anyone who can afford to develop models needs to do so out of necessity, or risk being squeezed out of the market in the case of companies, or left open to attack in the case of nation states.
And at the moment it’s a tool like any other, one that can be used for good or evil. When AI is brought up, energy is frequently spent worrying about clickbait AI doomsday scenarios, imagining some fantasy where the world gives up AI en masse, or picturing a utopian post-scarcity existence built on the backs of AI and robots. That energy would be better spent confronting and finding solutions to the very real problems its introduction could cause and is already causing, and honestly evaluating the benefits it can bring and is already bringing.
I won’t venture a guess about AGI, as that’s one of those things whose definition changes periodically and whose goalposts keep moving. By some of the earliest, barely defined notions of AGI, for instance, ChatGPT and its many assorted clones have already passed that benchmark.