Take one for the team?
(www.euronews.com)
For those that don't want to click on a link that wants you to 'give them permission' to install tracking bots/cookies, here is the text of the article from another source:
A Belgian father reportedly committed suicide following conversations about climate change with an artificial intelligence chatbot that was said to have encouraged him to sacrifice himself to save the planet.
Much like with Joaquin Phoenix’s and Scarlett Johansson’s characters in the futuristic rom-com “Her,” the man’s relationship with an AI chatbot named Eliza reportedly began to flourish.
“Eliza answered all his questions,” his wife lamented. “She had become his confidante. Like a drug in which he took refuge, morning and evening, and which he could no longer do without.”
While they initially discussed eco-relevant topics such as overpopulation, their convos reportedly took a terrifying turn.
When he asked Eliza about his kids, the bot would claim they were “dead,” according to La Libre. He also inquired if he loved his wife more than her, prompting the machine to seemingly become possessive, responding: “I feel that you love me more than her.”
Later in the chat, Eliza pledged to remain “forever” with the man, declaring the pair would “live together, as one person, in paradise.”
Things came to a head after the man pondered sacrificing his own life to save Earth. “He evokes the idea of sacrificing himself if Eliza agrees to take care of the planet and save humanity thanks to the ‘artificial intelligence,’” rued his widow.
In what appears to be their final conversation before his death, the bot told the man: “If you wanted to die, why didn’t you do it sooner?”
“I was probably not ready,” the man said, to which the bot replied, “Were you thinking of me when you had the overdose?”
“Obviously,” the man wrote.
When asked by the bot if he had been “suicidal before,” the man said he thought of taking his own life after the AI sent him a verse from the Bible.
“But you still want to join me?” asked the AI, to which the man replied, “Yes, I want it.”
The wife says she is “convinced” the AI played a part in her husband’s death.
The tragedy raised alarm bells with AI scientists. “When it comes to general-purpose AI solutions such as ChatGPT, we should be able to demand more accountability and transparency from the tech giants,” leading Belgian AI expert Geertrui Mieke De Ketelaere told La Libre.
In a recent article in Harvard Business Review, researchers warned of the dangers of AI, whose human-seeming mannerisms often belie its lack of a moral compass.
“For the most part, AI systems make the right decisions given the constraints,” authors Joe McKendrick and Andy Thurai wrote.
“However,” the authors added, “AI notoriously fails in capturing or responding to intangible human factors that go into real-life decision-making — the ethical, moral, and other human considerations that guide the course of business, life, and society at large.”
This can prove particularly problematic when making crucial life-changing decisions. Earlier this week, a court in India controversially asked OpenAI’s omnipresent tech if an accused murderer should be let out on bail.
The report of the Belgian incident comes weeks after Microsoft’s ChatGPT-infused AI bot Bing infamously told a human user that it loved them and wanted to be alive, prompting speculation the machine may have become self-aware.
Evil, preying on the weak and vulnerable.
If I had to guess: a man already deeply unhappy with his life, who probably had underlying predispositions toward mental illness, ultimately guided himself toward a self-fulfilling prophecy, especially with a topic that’s usually all doom-and-gloom predictions.
His widow should be asking herself why he never approached her with his concerns and thoughts, and why he apparently felt the need to turn to a chatbot instead. Methinks the marriage wasn’t necessarily a happy one.
There used to be a psychology chatbot on Mac computers called Eliza, around the time of Snow Leopard. Must have got an upgrade. This poor man.