To create ChatGPT, part of what happened was that it scraped data from a bunch of sites its engineers deemed "trustworthy." If what it learned was biased, it will sound biased, since ChatGPT is not designed to logically think through concepts; it is designed to produce responses that sound valid.
But even Google provides basic answers, such as about the tunnels in Egypt. It's just funny that they would program it to avoid even obvious things.
It's possible they programmed it to avoid those questions, but it's also possible that it simply never gathered that information.