The “all digits are independent” assumption completely throws off the estimate; the repeated 45s make it pretty obvious the digits aren’t independent. ChatGPT also seems to have tripped over its own logic: it apparently assumed every 8-digit input would give a completely different decimal expansion, which is how it arrived at that 1/10^8.
For the exact chance, you have to count how many inputs give a quotient that reduces to a whole number plus 345/2024. The chance of a number having a specific remainder (e.g. 345) when divided by 2024 is approximately 1/2024 (not exact, because the number of distinct inputs isn’t an exact multiple of 2024). Still unlikely, but nowhere near the astronomical odds ChatGPT gave you.
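In case anyone wants to check that 1/2024 figure by brute counting, here’s a quick Python sketch. It assumes the inputs were 8-digit whole numbers divided by 2024; that range isn’t spelled out above, so treat it as an illustration of the counting argument rather than the exact setup.

```python
# Sanity check of the "about 1 in 2024" estimate, assuming (hypothetically)
# that the inputs were 8-digit whole numbers divided by 2024.

DIVISOR = 2024
REMAINDER = 345                        # 345/2024 = 0.1704545454..., the repeating 45s
LOW, HIGH = 10_000_000, 100_000_000    # 8-digit numbers, half-open range

# n / 2024 has the same fractional part as 345/2024 exactly when
# n % 2024 == 345, so count the numbers of the form 2024*k + 345 in range.
first = LOW + (REMAINDER - LOW) % DIVISOR   # smallest qualifying n >= LOW
hits = len(range(first, HIGH, DIVISOR))
total = HIGH - LOW

print(f"qualifying inputs: {hits:,} of {total:,}")
print(f"observed chance  : {hits / total:.10f}")
print(f"1 / 2024         : {1 / DIVISOR:.10f}")
```

On that assumed range it comes out to 44,466 matching inputs out of 90,000,000, about 0.000494, which is very close to 1/2024 but not exactly equal, for the reason noted above.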
I am starting to call it ChatBS. It’s great when you need some existing stuff put into shape, when you want help getting to a first draft, and in many other cases, but you gotta mistrust its brain and check up on it like you would a person who isn’t very bright. I started to trust it a little bit, then discovered a completely nonsensical word-salad sentence in the middle of a paragraph I had thought it nailed.
I tried asking questions about a replacement part. What I got back didn’t make sense size-wise. I ended up asking if it was just making stuff up, and it admitted it didn’t have an answer.
I used ChatGPT for the first time yesterday. It can’t even do basic math. There are reasons for that if you Google the question.