Utopia is dull

Imagine that, after your death, you were cryogenically frozen and eventually resurrected in a benevolent utopia ruled by a godlike artificial intelligence.

Naturally, you desire to read up on what has happened since your death. It turns out that you do not have to read anything: you need merely desire to know something, and the knowledge will be integrated as if it had been learnt in the most ideal and unbiased manner. If certain cognitive improvements are necessary to understand certain facts, your computational architecture will be expanded appropriately.

You now perfectly understand everything that happened, and everything that was learnt, during and after the technological singularity that took place following your death. You understand the nature of reality, consciousness, and general intelligence.

Concepts such as creativity and fun are now perfectly understood mechanical procedures that you can easily implement and maximize, if desired. If you wanted to do mathematics, you could trivially integrate the resources of a specialized Matrioshka brain into your consciousness and implement and run an ideal mathematician.

But you also learnt that everything you could do has already been done, and that you could just integrate that knowledge as well, if you like. All that is left to be discovered is highly abstract mathematics that requires the resources of whole galaxy clusters.

So you instead consider exploring the galaxy. But you become instantly aware that the galaxy is nothing like the way it was depicted in old science fiction novels. It is a wasteland, devoid of any life. There are billions of barren planets, differing from one another only in the most uninteresting ways.

But surely, you wonder, there must be fantastic virtual environments to explore. And what about sex? Yes, sex! But you realize that you already thoroughly understand what it is that makes exploration and sex fun. You know how to implement the ideal adventure in which you save people of maximal sexual attractiveness. And you also know that you could trivially integrate the memory of such an adventure, or simulate it a billion times in a few nanoseconds, and that the same is true for all possible permutations that are less desirable.

You realize that the universe has understood itself.

The movie has been watched.

The game has been won.

The end.

A quote from the novel Ventus, by Karl Schroeder:

The view was breathtaking. From here, beyond the orbit of Neptune, Axel could see the evidence of humanity’s presence in the form of a faint rainbowed disk of light around the tiny sun. Scattered throughout it were delicate sparkles, each some world-sized Dyson engine or fusion starlette. Earth was just one of a hundred thousand pinpricks of light in that disk. Starlettes lit the coldest regions of the system, and all the planets were ringed with habitats and the conscious, fanatical engines of the solarforming civilization. This was the seat of power for the human race, and for many gods as well. It was ancient, implacably powerful, and in its trillions of inhabitants harbored more that was alien than the rest of the galaxy put together.

Axel hated the place.


If he shut his eyes he could open a link to the outer edge of the inscape, the near-infinite datanet that permeated the Archipelago. He chose not to do this.


“Isn’t it marvellous?” she said as she came to stand next to him. “I have never been here! Not physically, I mean.” She was dressed in her illusions again, today in a tiny whirlwind of strategically timed leaves: Eve in some medieval painter’s fantasy.

“You haven’t missed much,” he said.

Marya blinked. “How can you say that?” She went to lean on the window, her fingers indenting its resilient surface. “It is everything!”

“That’s what I hate about it.” He shrugged. “I don’t know how people can live here, permanently linked into inscape. All you can ever really learn is that everything you’ve ever done or thought has been done and thought before, only better. The richest billionaire has to realize that the gods next door take no more notice of him than he would a bug. And why go explore the galaxy when anything conceivable can be simulated inside your own head?”


  1. Matt Mahoney

    All finite goal-seeking agents have a state of maximum utility where any thought or perception would be unpleasant because it would result in a different mental state.

  2. Aris Katsaris

    The scenario you’re describing contains at least two contradictory ideas: (1) that the person in question is fully capable of making life fun for themselves, because they fully understand fun and have the means to achieve it; and (2) that this utopia is nonetheless supposedly bad, because the person wants to have fun but can’t find it.

    This contradiction makes the whole thought experiment utterly meaningless — it’s like saying “envision if people never needed food for sustenance. Then no farmer would have an incentive to grow food. Therefore everyone would starve to death.” It’s just a meaningless contradiction, not a deep problem.

    And the above doesn’t even address the further issue that you’re treating this one example of a “utopia” as if it were the only possible utopia.

  3. seahen

    Who’s to say it doesn’t have lots of states that tie for maximum utility?

  4. Alexander Gabriel

    This is not a high utility scenario.
