These are the Three Laws:
- A robot must not harm humans, or, via inaction, allow a human to come to harm.
- A robot must obey orders given to it by humans, unless an order would violate the first law.
- Finally, a robot must protect its own existence, as long as doing so would not put it into conflict with the other two laws.
I've seen ZERO discussion about this around ChatGPT in any of its forms. Wouldn't it be simple to say these laws are required in all programming? I get a feeling the WEF, WHO, Council on Foreign Relations, etc. do not want that codified or it would be done already. It would be simple to set rules and declare it a living document where the rules could be voted on and changed over time. You know, like banking rules are.
Excellent point. Mention this on any other boards you might be on.
Lots of people know about the Three Laws. I'm pretty sure Data on Star Trek: Next Gen talked about them. And wasn't it used in the Will Smith movie I, Robot?
This should be talked about a whole lot.
I, Robot was written by Asimov. It's a story collection in his Robot series. If you like sci-fi and haven't read it, I highly recommend it.
Yes, I know, my fren. I read the originals many years ago. Was just noting that the Three Laws have been mentioned in other more recent works, so a fair number of people should know about them.