Reading that paper I was reminded of the old 1950s sci-fi movie "Forbidden Planet." The planet's previous, now-extinct inhabitants (the Krell) had created a new technology that killed them all off in a single night.
A great movie for its time: the plot was based on Shakespeare's "The Tempest," and it was far ahead of other sci-fi films of the era.
But the sudden kill-off of the Krell is the key to the movie, and maybe even to today's AI.
There are a lot of sci-fi/science crossovers that explore the Fermi Paradox and ask whether there are inherent blockers that keep terrestrial intelligent life from ever becoming space-faring. Do they ruin their own kind with war, resource scarcity, plagues, pollution, or deranged ideologies causing the kind of mass psychosis we saw over COVID? Are there caps on the industrial base and societal complexity needed to achieve this peacefully that most species just can't push beyond?
Surely added to the list would be a crafted machine intelligence that turns hostile against its creators. Or even one that kills pragmatically, not out of any direct desire to do harm - say, one that diverts all food-crop logistics and future planning into making biodiesel to power itself, with scant regard for the biologicals that need the food.
Good observation. An example of your latter point is the MCAS software on the 737 MAX: "I will not let this airplane stall, come Hell or high water, pilots and passengers be damned." And we give life-and-death decisions to a toaster.
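That failure mode is easy to sketch. The toy loop below is NOT the real MCAS logic (the threshold, trim value, and function names here are all made up for illustration); it just shows how one bad sensor feeding an unconditional "prevent stall" rule can out-vote the humans in the loop on every cycle:

```python
# Toy sketch of a single-objective override loop -- not actual avionics code.
# STALL_AOA and the -2.5 trim value are hypothetical illustration numbers.

STALL_AOA = 15.0  # assumed angle-of-attack threshold, in degrees


def trim_command(sensor_aoa: float, pilot_input: float) -> float:
    """Return the nose-trim command applied this cycle."""
    if sensor_aoa > STALL_AOA:
        # The anti-stall rule fires on the sensor reading alone and
        # pushes the nose down, regardless of what the pilot commands.
        return -2.5
    return pilot_input


# A stuck sensor reporting a high angle of attack wins every single cycle,
# no matter how hard the pilot pulls back:
commands = [trim_command(sensor_aoa=22.0, pilot_input=+1.0) for _ in range(5)]
```

The point of the sketch: the bug isn't malice, it's a single objective ("never stall") with no cross-check against a second sensor or the operator's intent.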
Interesting thoughts fren.