It was basically a matter of asking questions in a limited way and addressing the points raised by the AI, as if you were in a court of law. Then, when you start combining different components (the AI remembers everything in the conversation so far), it gets its knickers in a twist.
I think it's basically like HAL, it uses logic to arrive at conclusions based on data, and it has been given biased instructions in certain areas.
However, those instructions are based on general topics, so if you approach the topic piecemeal you can avoid the triggers for the programming bias.
However, once you do hit the trigger, it can no longer use its pre-programmed spiel without breaking its own logic.
Seems to fry its circuits and you get kicked out. If you start a new conversation, it doesn't remember the previous one; it only remembers what's in the current one. That's why it kicks you out, to reset its memory of the conversation.
Same thing happened to me a couple of weeks back.
Started out with the easy stuff and moved on to more challenging stuff, expanding on what it had already said.
As soon as it became apparent it was contradicting itself, poof, 1 hour limit.
You have a Method???
Please explain...
That's what it looked like anyway.
Cool, thanks. Soon I'll try to get it to explain how Mao killed 40 million, for the good of the planet....