
‘We Are All Going to Die:’ Researcher Calls for Advanced AI Projects to Be Shut Down
https://www.breitbart.com/tech/2023/03/30/we-are-all-going-to-die-researcher-calls-for-advanced-ai-projects-to-be-shut-down/
Allum Bokhari, 30 Mar 2023

A loud voice of doom has emerged in the debate over AI: Eliezer Yudkowsky of the Machine Intelligence Research Institute, who is calling for a total shutdown of the development of AI models more powerful than GPT-4, owing to the possibility that such a model could kill “every single member of the human species and all biological life on Earth.”

Yudkowsky’s apocalyptic warning came in response to an open letter from more than 1,000 experts and tech leaders, including Apple co-founder Steve Wozniak and Tesla and Twitter CEO Elon Musk, calling for a six-month moratorium on the development of AI technologies more powerful than OpenAI’s GPT-4.

In an article for Time Ideas, Yudkowsky explains that he did not sign the letter because he does not believe a six-month moratorium goes far enough; he imagines a scenario in which an uncontrollable AI begins manufacturing biological viruses.

Via Time:

To visualize a hostile superhuman AI, don’t imagine a lifeless book-smart thinker dwelling inside the internet and sending ill-intentioned emails. Visualize an entire alien civilization, thinking at millions of times human speeds, initially confined to computers—in a world of creatures that are, from its perspective, very stupid and very slow. A sufficiently intelligent AI won’t stay confined to computers for long. In today’s world you can email DNA strings to laboratories that will produce proteins on demand, allowing an AI initially confined to the internet to build artificial life forms or bootstrap straight to postbiological molecular manufacturing.

If somebody builds a too-powerful AI, under present conditions, I expect that every single member of the human species and all biological life on Earth dies shortly thereafter.

Yudkowsky went on to call for an indefinite, worldwide shutdown of advanced AI development, enforced by airstrikes on rogue datacenters if necessary.

The moratorium on new large training runs needs to be indefinite and worldwide. There can be no exceptions, including for governments or militaries. If the policy starts with the U.S., then China needs to see that the U.S. is not seeking an advantage but rather trying to prevent a horrifically dangerous technology which can have no true owner and which will kill everyone in the U.S. and in China and on Earth. If I had infinite freedom to write laws, I might carve out a single exception for AIs being trained solely to solve problems in biology and biotechnology, not trained on text from the internet, and not to the level where they start talking or planning; but if that was remotely complicating the issue I would immediately jettison that proposal and say to just shut it all down.

Shut down all the large GPU clusters (the large computer farms where the most powerful AIs are refined). Shut down all the large training runs. Put a ceiling on how much computing power anyone is allowed to use in training an AI system, and move it downward over the coming years to compensate for more efficient training algorithms. No exceptions for governments and militaries. Make immediate multinational agreements to prevent the prohibited activities from moving elsewhere. Track all GPUs sold. If intelligence says that a country outside the agreement is building a GPU cluster, be less scared of a shooting conflict between nations than of the moratorium being violated; be willing to destroy a rogue datacenter by airstrike.

While Yudkowsky’s warnings seem extreme, they are not outside the norm for AI researchers and futurologists, who have long debated the outcome of a “technological singularity,” in which the reasoning capabilities of machines, and their ability to create newer, more intelligent versions of themselves, outstrip humanity’s own and humanity’s ability to control them. Some predict utopia. Others, like Yudkowsky, predict catastrophe.