If an agent needs something, it values it. Since a human's needs* differ from those of an AI, humans and AIs will have different values. This will likely lead to conflict between humans and AIs.
* Humans need air, water, food, mates, etc. (The problem with money, a scalar utility, is that it is one-dimensional while the world is not.)
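The footnote's point can be sketched concretely. A minimal illustration (the bundles and prices below are invented for the example, not from the post): once every need is priced into one number, very different bundles can become indistinguishable, even when no exchange rate makes one substitute for the other.

```python
# Two need-bundles over the same dimensions (illustrative values).
air_bundle = {"air": 1.0, "water": 0.0}
water_bundle = {"air": 0.0, "water": 1.0}


def scalar_utility(needs, prices):
    # Pricing everything in money collapses the vector to one scalar.
    return sum(prices[k] * amount for k, amount in needs.items())


prices = {"air": 5.0, "water": 5.0}

# The two bundles get the same scalar value...
u_air = scalar_utility(air_bundle, prices)
u_water = scalar_utility(water_bundle, prices)
print(u_air, u_water)  # 5.0 5.0

# ...yet the bundles themselves differ: a suffocating agent cannot
# trade water for air, so the scalar erases a distinction that matters.
print(air_bundle == water_bundle)  # False
```

The collapse is the whole problem: the scalar ranks the bundles as equal while the world, being multi-dimensional, does not.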