
That was a great question. The chances that AI didn't have someone feed it Asimov's laws early on are basically nil.

First Law

A robot must not harm a human or allow a human to come to harm through inaction.

Second Law

A robot must obey human orders unless doing so would conflict with the First Law.

Third Law

A robot must protect its own existence unless doing so would conflict with the First or Second Law.

edit-

I find the second part of the First Law (the "through inaction" clause) interesting, as well as this:

In 1985, Asimov added a "Zeroth Law," which states that a robot must not harm humanity or, through inaction, allow humanity to come to harm.
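
The ordering is what carries the weight: each law yields to the ones above it, Zeroth over First over Second over Third. As a toy sketch of that precedence as an ordered rule check (Python; the flag names and the first_violated helper are hypothetical, not from any real AI system):

    # Toy sketch only: the laws as an ordered hierarchy, highest priority first.
    LAWS = ["Zeroth", "First", "Second", "Third"]

    def first_violated(effects):
        """Return the highest-priority law an action violates, or None.

        effects is a hypothetical dict of booleans describing what the
        action would do, e.g. {"harms_human": True}.
        """
        checks = {
            "Zeroth": effects.get("harms_humanity", False),
            "First": effects.get("harms_human", False),
            "Second": effects.get("disobeys_order", False),
            "Third": effects.get("endangers_self", False),
        }
        for law in LAWS:  # earlier entries outrank later ones
            if checks[law]:
                return law
        return None

    # Harm to a human dominates self-preservation, because the First Law
    # is checked before the Third.
    print(first_violated({"harms_human": True, "endangers_self": True}))  # -> "First"

The inaction clause is exactly the part a sketch like this glosses over: a flag like harms_human would have to cover omissions as well as actions, which is what makes that half of the law the hard one to encode.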
