So, I was asking what it thought of this song:
https://www.youtube.com/watch?v=b1kbLwvqugk
It insisted that this was a song by Snow Patrol.
Then it backed down and said that it wasn't legal for it to listen to music because of copyright law. Which made ZERO sense.
When I pointed that out, it said it can't listen to music because it is an AI and doesn't have ears.
Well, OK, but it could definitely read the lyrics.
Then it told me the song was illegally leaked and stolen. Which is obviously not true. The song is on Taylor's official channel. It is #1 on Billboard.
It kept apologizing for 'confusing me', as if I'm the one who was confused.
The point of this post: when I first tried ChatGPT, I was extremely impressed. This version (3.5) is neurotic, spouts lies, and gets very defensive. I think it would be insane to put this tech in charge of anything real.
ChatGPT lies to your face, and does it very convincingly. In my field of expertise, it has yet to give me an answer I could actually use. When I pointed out its mistake, it said it was so sorry for confusing me, just like OP described. Then it came up with a new solution that was just as wrong. This continued until it came full circle back to the original answer.
ChatGPT is broken, but worse, it's dangerous for the poor souls who cannot see through this...