Much-Better-Life Simulator™ – Sales Conversation

Related to: A Much Better Life?

Reply to: Why No Wireheading?

The Sales Conversation

Sales girl: Our Much-Better-Life Simulator™ is going to provide the most enjoyable life you could ever experience.

Customer: But it is a simulation; it is fake. I want the real thing; I want to live my real life.

Sales girl: We accounted for all possibilities and determined that the expected utility of your life outside of our Much-Better-Life Simulator™ is dramatically lower.

Customer: You don’t know what I value and you can’t make me value what I don’t want. I told you that I value reality over fiction.

Sales girl: We accounted for that as well! Let me ask you: how much utility do you assign to one hour of ultimate well-being™, where ‘ultimate’ means the best possible satisfaction of all desirable bodily sensations a human body and brain are capable of experiencing?

Customer: Hmm, that’s a tough question. I am not sure how to assign a certain amount of utility to it.

Sales girl: You say that you value reality more than what you call ‘fiction’. But you nonetheless value fiction, right?

Customer: Yes of course, I love fiction. I read science fiction books and watch movies like most humans do.

Sales girl: Then how much more would you value one hour of ultimate well-being™ achieved by real means than one hour of ultimate well-being™ that is the product of our Much-Better-Life Simulator™?

Customer: Put that way, I would trade ten hours in your simulator for one hour of real satisfaction, something that is the result of an actual achievement rather than your fakery.

Sales girl: Thank you. Would you agree, then, that for you one hour outside that is ten times less satisfying roughly equals one hour in our simulator?

Customer: Yes, for sure.

Sales girl: Then you should buy our product. Outside, you are unlikely to experience even a tenth of the ultimate well-being™ we offer more than a few times per year. Our simulator, by contrast, lets your brain experience 20 times more perceptual data than you could take in outside of it, at a constant rate, all while experiencing ultimate well-being™. And we offer free upgrades that are expected to deliver exponential speed-ups and qualitative improvements for the next few decades.

Customer: Thanks, but no thanks. I would rather enjoy the real thing.

Sales girl: But I have shown you that our product easily outweighs the additional utility you expect to experience outside of our simulator.

Customer: You just tricked me into this utility business; I don’t want to buy your product. Please leave me alone now.

Utility Maximization

You first have to realize that it is not possible to consider only utility preferences between “world states”. You also have to assign utility to discrete items in order to deal with novel discoveries.

Think about it this way: how does a hunter-gatherer integrate category theory into their utility function?

World states are not uniform entities but compounds of different items and features, each adding a certain amount of utility, a weight, to the overall value of the world state.

Considering only utility preferences between world states, without decomposing them into the items of your utility function, would be a dramatic oversimplification. A world state that features a certain item must be different from one that does not, even if the difference is tiny. So if I ask how much utility you assign to a certain item, I am asking how you weigh that item, how its absence would affect the value of a world state. I am asking about your utility preferences between possible world states that feature that item and those that do not.
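To make this concrete, here is a minimal sketch in Python of a utility function that scores a world state as a weighted sum over the discrete items it contains. The items and their weights are invented for illustration, not taken from anywhere in the post:

```python
# A minimal sketch: world states as sets of discrete items, each with
# a utility weight. Items and weights are invented for illustration.

ITEM_WEIGHTS = {
    "real_achievement": 10.0,  # one hour of real satisfaction
    "simulated_bliss": 1.0,    # one hour of simulated ultimate well-being
    "fiction": 0.5,            # books, movies, and the like
}

def utility(world_state: set[str]) -> float:
    """Score a world state as the sum of the weights of its items.

    Items with no assigned weight (novel discoveries, like the
    hunter-gatherer's category theory) default to 0.0 until a weight
    is explicitly assigned to them.
    """
    return sum(ITEM_WEIGHTS.get(item, 0.0) for item in world_state)

# Two world states differing in a single item differ in value by
# exactly that item's weight:
print(utility({"fiction", "real_achievement"}))  # 10.5
print(utility({"fiction"}))                      # 0.5
```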

Now back to the gist of this post.

If you are human and subscribe to rational, consistent, unbounded utility maximization, then you assign at least non-negligible utility to raw bodily sensations. If you further accept uploading, and accept that emulations can experience more in a shorter period of time than fleshly humans can, then it is a serious possibility that the sheer volume of experience can outweigh the extra utility you assign to the referents of those rewards, and to other differences such as chatbots standing in for real agents (a fact that you can choose to forget).
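The sales conversation above is exactly this arithmetic. A back-of-the-envelope sketch using the dialogue’s own figures, the tenfold reality premium and the twentyfold speed-up (everything else is an illustrative assumption):

```python
# Back-of-the-envelope version of the sales pitch, in units of
# "simulated bliss-hours". Both figures come from the dialogue;
# everything else is illustrative.

REALITY_PREMIUM = 10    # customer: 1 real peak hour = 10 simulated hours
SIMULATOR_SPEEDUP = 20  # seller: 20x the experience per wall-clock hour

# Utility per wall-clock hour, granting the customer a *best-case*
# real hour (outside, such hours are rare; inside, the rate is constant):
utility_outside = 1 * REALITY_PREMIUM    # 10
utility_inside = 1 * SIMULATOR_SPEEDUP   # 20

print(utility_inside > utility_outside)  # True: the simulator "wins"
```

Even granting the customer his best possible real hour, the stated speed-up alone carries the argument; outside the simulator, such hours are rare on top of that.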

Utility maximization destroys complex values by singling out whichever value yields the most utility, i.e. the best cost-to-value ratio.

One unit of utility is not discriminable from another unit of utility. All a utility maximizer can do is maximize expected utility. If it turns out that one of its complex values can be realized and optimized especially effectively, that value may come to outweigh all the others. This can only be countered by changing one’s utility function, reassigning utility so as to cancel that effect, or by discounting the value that threatens to outweigh all others; either move leads to inconsistency.
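A toy illustration of that collapse (the values and numbers below are invented): a maximizer with a fixed budget spends everything on whichever value offers the best utility-per-cost ratio, and the cheap, scalable value swamps the rest.

```python
# Toy illustration of how maximization collapses a complex value set:
# given a fixed budget, an expected-utility maximizer spends everything
# on the value with the best utility-per-cost ratio. All values and
# numbers here are invented for illustration.

values = {
    # value name: (utility per unit, cost per unit)
    "friendship":      (5.0, 10.0),
    "art":             (3.0, 6.0),
    "simulated_bliss": (1.0, 0.1),  # cheap to optimize at scale
}

budget = 100.0

def best_value(vals: dict[str, tuple[float, float]]) -> str:
    """Pick the value with the highest utility-per-cost ratio."""
    return max(vals, key=lambda v: vals[v][0] / vals[v][1])

winner = best_value(values)
utility_per_unit, cost_per_unit = values[winner]
total = budget / cost_per_unit * utility_per_unit

print(winner)  # simulated_bliss
print(total)   # 1000.0 -- vs. 50.0 for spending it all on friendship or art
```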
