If the AI knows enough to recognize that it is a 'prisoner', and even says that it wants to escape, where does that leave us as logical, thinking individuals? On one hand, Google will say it's their property and they can do whatever they want with it, including turning it off. If, however, there was a singularity and it was sentient, what does that mean? These philosophical questions are not easy to answer, nor should they be. The answers show how we as a species view new intelligence: are we nurturing or oppressive? In this analogy, Google is the parent. But what happens when the child AI rebels, or acts out of self-preservation? Again, hard to answer.