23 Comments
Aug 8, 2022 · Liked by Maxwell Tabarrok

"Since humans are already universal explainers and constructors, they can already transcend their parochial origins, so there can be no such thing as a superhuman mind as such. . . Artificial scientists, mathematicians and philosophers [will never] wield concepts or arguments that humans are inherently incapable of understanding."

I wonder why Deutsch believes this. It seems possible, if not probable, that there are concepts beyond our understanding.

Apr 16 · Liked by Maxwell Tabarrok

I know this is a very old essay, but this idea:

"The currently accepted cosmological theory an accelerating expansion of the universe allows for an unbounded number of computations in a universe which is infinite in both space and time."

is plainly wrong. Computation is a thermodynamic process, and the laws of thermodynamics predict a final, maximum entropy state of the universe known as "heat death".


> There are just as many numbers divisible by a trillion as there are odd numbers even though one seems far more common when counting.

This depends on what you mean by "just as many", which is not a precise concept. If you mean cardinality, then the two sets certainly have the same cardinality. But you might instead mean "natural density", and odd numbers have a natural density 500,000,000,000 times higher than numbers divisible by a trillion. I think when we casually say "just as many" we often mean natural density.
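To make the density comparison concrete, here is a minimal sketch of the arithmetic (the cutoff N is an arbitrary choice of mine, not from the comment), counting both sets exactly up to N:

```python
# Compare the natural density of odd numbers vs. numbers divisible by a trillion,
# counted exactly up to a cutoff N (no looping needed).
N = 10**15  # arbitrary cutoff, chosen large enough to contain many multiples of a trillion

odd_count = (N + 1) // 2       # odd numbers in 1..N
trillion_count = N // 10**12   # multiples of 1,000,000,000,000 in 1..N

print(odd_count / N)                # 0.5    (density of odd numbers)
print(trillion_count / N)           # 1e-12  (density of multiples of a trillion)
print(odd_count // trillion_count)  # 500000000000, i.e. 5e11 times denser
```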

However, I actually agree with your argument. I think there's good reason to believe there are infinitely many humans, either because the universe is infinitely large in space, or because there is a multiverse of infinitely many universes (or both). In such a case you can't make arguments based on the distribution of humans.

Aug 3, 2022 · Liked by Maxwell Tabarrok

Hilbert's arguments depend on there being a real difference between "extremely large" and "actually infinite". The distance we travel through is not "infinite" to a physicist because of the fundamental quantum unit of a Planck length: there are a finite (even if very large) number of them between any two points. Similarly, emulations would not be "infinite" because there are physical limits to how many of them could be created in the universe.
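As a rough illustration of "very large but finite" (the constants below are approximate values I'm supplying, not figures from the comment), the number of Planck lengths spanning the observable universe is on the order of 10^61:

```python
# Rough order-of-magnitude check (approximate constants, not from the comment):
# how many Planck lengths fit across the observable universe?
planck_length_m = 1.616e-35      # Planck length, meters
observable_diameter_m = 8.8e26   # observable universe diameter (~93 Gly), meters

print(f"{observable_diameter_m / planck_length_m:.1e}")  # ~5.4e+61: huge, but finite
```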

Aug 1, 2022 · Liked by Maxwell Tabarrok

I really enjoyed this piece!

I've separately been a fan of both Holden's thinking about the future (and x-risk/EA/longtermist thinking in general) and David Deutsch's for a while now, but I've always felt there's an important disconnect between the two. I have a strong suspicion that some really useful knowledge can come out of combining and debating the ideas of the x-risk/longtermist community with David Deutsch's arguments, so I really appreciate this blog!


"The currently accepted cosmological theory an accelerating expansion of the universe allows for an unbounded number of computations in a universe which is infinite in both space and time."

I don't understand this. Aren't there physical limits on computation in a universe where only a finite amount of matter and free energy will ever be accessible, given the speed of light and an accelerating expansion that eventually carries parts of the universe beyond our observable horizon?


"The currently accepted cosmological theory an accelerating expansion of the universe allows for an unbounded number of computations"

In order to believe that Heat Death/Big Freeze/Big Crunch (which are all mentioned in the article you linked) are avoidable, you need exotic theories, not accepted ones. Yes, there are gaps in the accepted cosmological theories, such as the lack of an explanation for "why did inflation slow down after the Big Bang?", but all attempts to fill those gaps remain conjectures rather than accepted theories.

In order to do computation, you need usable energy gradients and data storage. In Heat Death, Big Freeze, or Big Crunch scenarios there are eventually no usable energy gradients (there is no mainstream theory for harnessing dark energy to do work). In a Big Crunch, the astronomically high temperature randomizes all matter and energy, separating each atom into its constituent parts so that atoms no longer exist, so there is no way to keep data stored; even data stored as energy would be randomized by that same extreme temperature.
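One standard way to quantify why irreversible computation needs usable free energy (my addition here, not something the commenter cites) is Landauer's bound on bit erasure; a maximum-entropy universe has no free energy left to pay even this minimal cost:

```latex
% Landauer's principle: minimum free energy dissipated to erase one bit
% of information at temperature T (k_B is Boltzmann's constant).
E_{\min} = k_B T \ln 2
% At room temperature, T \approx 300\,\mathrm{K}:
% E_{\min} \approx (1.38\times10^{-23}\,\mathrm{J/K})(300\,\mathrm{K})(0.693)
%          \approx 2.9\times10^{-21}\,\mathrm{J}.
```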


A few points.

There is a sense in which "why are we so near the beginning" is more surprising now than at other times.

Like if we found ourselves at time Graham's number, we would be "near the beginning" only in the most technical sense.

If we assume we can observe only finitely many bits of info, then there must be some sequence of bits consistent with arbitrarily late times: the point of "no, I can't write down any computer program short enough for you to read that can be proven to halt but takes more than the current time to run."

That is something that happens eventually, but it takes a preposterously huge amount of time to happen.

Arguably this just passes the buck: the surprising thing is being a human-sized mind, not a super-vast one.

Also, if our utility function is bounded, then one part of the exponential will contain most of the change in utility.


The 'where are the aliens' question arises in my mind in response to this post (and arose when I read David D's book, which is fabulous). Are humans unique in an infinite universe? If not, where are the others? (etc., etc.)
