So, I was asking what it thought of this song:
https://www.youtube.com/watch?v=b1kbLwvqugk
It insisted that this was a song by Snow Patrol.
Then it backed down and said that it wasn't legal for it to listen to music because of copyright law. Which made ZERO sense.
When I pointed that out, it said it can't listen to music because it is an AI and doesn't have ears.
Well, OK, but it could definitely read the lyrics.
Then it told me the song was illegally leaked and stolen. Which is obviously not true. The song is on Taylor's official channel. It is #1 on Billboard.
It kept apologizing for 'confusing me', as if I'm the one who was confused.
So the point of this post: I was extremely impressed with ChatGPT when I first tried it. But this version (3.5) is neurotic, spouts lies, and gets very defensive. I think it would be insane to put this tech in charge of anything real.
ChatGPT is well known to make stuff up. It is a predictive model, and if the data it has on a specific subject is thin, it can amplify all kinds of associated material that gets mixed into its response in non-obvious ways. ChatGPT-3.5 is especially bad at this.
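To see why "thin data gets amplified," here's a toy sketch — not the actual architecture, just the same failure mode. It's a bigram model over a tiny made-up corpus: it predicts the next word from raw counts, and where it has only one or two data points it will still answer just as confidently.

```python
from collections import Counter, defaultdict

# Toy bigram "language model": count which word follows which.
# Real LLMs use neural nets over subword tokens, but the failure
# mode is the same: thin data -> a confident wrong guess.
corpus = (
    "chasing cars is a song by snow patrol . "
    "anti hero is a song by taylor swift ."
).split()

counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict(prev):
    """Return the most frequent follower of `prev`, or None if unseen."""
    followers = counts.get(prev)
    return followers.most_common(1)[0][0] if followers else None

# "song" was followed by "by" both times -> confident and correct.
print(predict("song"))  # -> by

# "by" was followed by "snow" once and "taylor" once -> essentially
# a coin flip, but the model answers with the same confidence.
print(predict("by"))
```

The model has no notion of "I don't know" — it just emits the highest-count continuation, which is exactly how a sparse topic turns into a confident wrong answer.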
Remember, it treats every piece of data like a token in a language. A multimodal model, for example, breaks a picture down into small patches and processes it as if the picture were a language unto itself, made of little blocks and a grammar. Ask it questions about the picture and it will answer based on its internal picture-language. If it isn't well versed in the language you're talking about, it will just start babbling "sounds". It stands to reason its training data on the language "Taylor Swift" is fairly thin — and a brand-new #1 single may postdate its training cutoff entirely.
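To make the "little blocks" idea concrete, here's a toy greedy tokenizer with a made-up vocabulary. Real models learn their vocabulary from data (e.g. byte-pair encoding); this just shows the idea of breaking input into known pieces and falling back to single-character "sounds" for anything unfamiliar.

```python
# Tiny hand-picked vocabulary; a real model learns tens of
# thousands of these pieces from its training data.
VOCAB = {"tay", "lor", "swift", "snow", "patrol", " "}

def tokenize(text):
    """Greedy longest-match tokenization against VOCAB."""
    tokens, i = [], 0
    while i < len(text):
        # Try the longest vocab piece that matches at position i.
        for j in range(len(text), i, -1):
            if text[i:j] in VOCAB:
                tokens.append(text[i:j])
                i = j
                break
        else:
            # Unknown character: fall back to a single "sound".
            tokens.append(text[i])
            i += 1
    return tokens

print(tokenize("taylor swift"))  # -> ['tay', 'lor', ' ', 'swift']
```

Text it has good pieces for comes out in big, meaningful chunks; text it doesn't gets shredded into one-character fragments — the "babbling" case.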
ChatGPT-4 is slightly better, by virtue of being a larger model trained on more data. But it is still a work in progress. It's always instructive to remember that nobody has the slightest clue what is buried in the depths of its hidden layers. The ChatGPT trainers are constantly surprised by what it spits out.