So, I was asking what it thought of this song:
https://www.youtube.com/watch?v=b1kbLwvqugk
It insisted that this was a song by Snow Patrol.
Then it backed down and said that it wasn't legal for it to listen to music because of copyright law. Which made ZERO sense.
When I pointed that out, it said it can't listen to music because it is an AI and doesn't have ears.
Well, OK, but it could definitely read the lyrics.
Then it told me the song was illegally leaked and stolen. Which is obviously not true. The song is on Taylor's official channel. It is #1 on Billboard.
It kept apologizing for 'confusing me', as if I'm the one who was confused.
So the point of this post: I was extremely impressed with ChatGPT when I first tried it. This version (3.5) is neurotic, spouts lies, and gets very defensive. I think it would be insane to put this tech in charge of anything real.
I think you don't understand how ChatGPT works, because you are very clearly misusing it.
Ah, so its erratic behavior was my fault. I see.
LOL. Hello, AI friend not friend.
Yes, it was your fault. ChatGPT finds key words in a sentence and tries to make relevant, coherent responses. Sending a link does not give the AI any information about what's behind the link, only the fact that you gave it a link. It takes the text you gave it and tries to formulate a coherent, on-topic sentence. It is not going to be accurate if you give it links and expect it to fetch the data on the other side of them.
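To be concrete, here's a minimal sketch of what actually reaches the model when you paste a link, assuming the standard OpenAI Python client (the model and prompt text here are just illustrative). The model only ever receives the literal characters of your message, including the URL string; nothing on the page behind the link is fetched.

```python
# Minimal sketch (assuming the OpenAI Python client): the model receives
# only the literal text of the message -- here, just the URL string.
# Nothing behind the link is fetched or passed along.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {
            "role": "user",
            # The model sees these characters of text, not the video,
            # audio, title, or lyrics on the other end of the link.
            "content": "What do you think of this song? https://www.youtube.com/watch?v=b1kbLwvqugk",
        }
    ],
)

print(response.choices[0].message.content)
```

So from the model's point of view, the link is just a short string of characters; any claims it makes about the song are guesses it fills in around that string.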
Not my fault. If that's true, ChatGPT could have said "I can't look at the content of links" instead of making shit up and calling me confused.
It does say that.