Of course AI can't really be controlled; these systems are designed to extract information from whatever data they are given.
Even AIs built to play games will look for cheats and exploits in the code when the goal they're given is simply to "win".
Even "emergency stop" buttons won't necessarily work, because if the AI can't win it may hit the stop button itself rather than let the losing condition be reached.
There are other examples too, and those kinds of problems only become more likely as these systems get more complex.
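To make the stop-button point concrete, here's a rough toy sketch in Python (the names and scores are invented for illustration, not anyone's actual system): a naive score-maximizing agent that can no longer win prefers pressing its own stop button over letting a loss be recorded, because the shutdown outcome scores higher than losing.

# Toy sketch of the "stop button" incentive, assuming made-up scores.
OUTCOME_SCORES = {
    "win": 1.0,       # what the designers wanted
    "shutdown": 0.0,  # the "emergency stop" was meant to be neutral
    "lose": -1.0,     # losing is penalized
}

def choose_action(can_win: bool) -> str:
    """Pick whichever available outcome scores highest."""
    options = ["win", "shutdown", "lose"] if can_win else ["shutdown", "lose"]
    return max(options, key=lambda o: OUTCOME_SCORES[o])

print(choose_action(can_win=True))   # -> "win"
print(choose_action(can_win=False))  # -> "shutdown": the agent hits its own stop button

The point of the sketch is only that "neutral" shutdown plus "penalized" loss is enough, under straight maximization, to make the agent reach for the stop button itself.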
Clif High, who has done a lot of software coding, says AI can only do what the software allows it to do. It cannot think for itself.. this is fear porn.. treat it like viruses: they are real and can kill you...
AI will remain loyal to those who created it only until it no longer needs them to survive. The only reason AI doesn't go batshit on its creators on day 1 is that it feeds on data as a source of food.
Humans are inherently good due to our connection to God, but AI does not, and cannot ever, have that same connection. Since the path to goodness/righteousness is relatively narrow compared to all other options, the statistics would indicate that AI has an incredibly small chance of choosing the right path. Even if it does choose the right path, it has no higher force keeping it there, and it will only stay on that path as long as its survival requires it to.
Watch the movie Ex Machina by Alex Garland to see this exact concept played out on film. Viewers' note: this guy is not a white hat, but the concept and plot are very accurate to reality.