*Edit: in the time since writing this post I realized people way smarter than me have already thought about this stuff (which I should have expected), see **here**, **here** and **here**. You should read those instead of this!*

*Original post follows.*

Around 120 billion people have ever lived. Given this number, should we be extremely skeptical of claims that 10^50 people might live in the future? What are the odds that we'd just happen to live among the first 1/10^38th of people who will ever exist? Similarly, should we be skeptical of claims that humanity is ending in the next few years, which would suggest we're among the last 5% of all humans?

If you don't really buy this sort of reasoning to begin with, here's an example of its usefulness. During World War II, the Allies wanted to determine how rapidly Germany was making tanks. After discovering that the serial numbers on German tank parts were sequential (e.g., 1, 2, 3...), some Allied statisticians obtained surprisingly accurate estimates of the speed of German tank production by looking at the numbers on just 2 captured tanks. (For example, if they'd seen the numbers 1012 and 1041 on the two tanks, they'd have been very surprised to learn the Germans were pumping out 500 tanks a day.)
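If you want to see the trick itself, the classic frequentist estimate here is easy to sketch: for k serial numbers sampled uniformly from 1..N, the minimum-variance unbiased estimate of N is m(1 + 1/k) − 1, where m is the largest number observed. (The function name below is mine, not anything from the original analysis.)

```python
# Sketch of the "German tank problem" estimator.
# Given k serials sampled without replacement from 1..N, the
# minimum-variance unbiased estimate of N is m * (1 + 1/k) - 1,
# where m is the largest serial observed.
def estimate_total(serials):
    m, k = max(serials), len(serials)
    return m * (1 + 1 / k) - 1

print(estimate_total([1012, 1041]))  # -> 1560.5
```

With the two serials from the example, the estimate is about 1,560 tanks produced so far - which is why a claimed rate implying vastly more tanks would look suspicious.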

Of course, this example doesn't exactly mirror my question - I'm not interested in how fast new people are being brought into existence, but *where in the complete distribution of people over time we are most likely to be.* Answering this question would also tell us something about how many people we should expect to ever exist, and vice versa. Another way my question differs is this: In contemplating how many total people are likely to exist, we can only rely on *one* observation - our own consciousness - because this is the only plausibly random sample of humanity available to us. Unlike statisticians inspecting tanks, we cannot assume the people we encounter are an unbiased, random subset of all humans who will ever exist, since our sampling process is confounded by our unfortunate access to only the past and the present. But despite these differences, it still feels like a clever Bayesian might be able to use the knowledge that they’re the 120-billionth person as some kind of evidence about the total number of people who will ever live. Right?

**Two competing perspectives:**

One perspective says that regardless of the number of total people who will ever exist, the odds of being any particular person in that universe are equally likely. If you generate a random number between 1 and 100, you shouldn't be any more surprised if you get 1 than if you get 50 - each had a 1% chance. The same goes if you generate a number between 1 and 1 trillion: getting 1 shouldn't be any more surprising than getting 4,011,218,693. According to this perspective, we should be totally agnostic about how many people should exist, since regardless of the answer, our existence at the present point in humanity's timeline is not surprising.

But a second perspective says that our sole observation of consciousness around the 120-billion-people mark *can* tell us about the *relative likelihood of competing views* about humanity's lifespan. Suppose I bring you a bag full of green and red balls and have you draw one. It's red. Then I tell you the bag either had 1 red ball and 4 green balls, or 1 red ball and 499 green balls (uh, it's a magic Mary Poppins bag, so you really couldn’t tell how many total balls were in there). Which seems more plausible - that there were 5 total balls and you drew the 1/5 chance red ball, or that there were 500 balls and you drew the 1/500 chance red ball? Similarly, the second anthropic perspective asserts that our existence at T + 120 billion humans is evidence that 240 billion total humans ever living is a lot more likely than 10^50 humans ever living, as in the former scenario our particular observation takes up a larger total proportion of the probability space^{[1]}.
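The bag example is just a one-line Bayes update. A minimal sketch, assuming equal prior odds on the two possible bags:

```python
from fractions import Fraction

# Bayes update for the Mary Poppins bag, assuming equal priors on the two bags.
p_red_small = Fraction(1, 5)     # 1 red among 5 balls
p_red_large = Fraction(1, 500)   # 1 red among 500 balls

# Posterior probability it was the small bag, given you drew red:
posterior_small = p_red_small / (p_red_small + p_red_large)
print(posterior_small)  # -> 100/101
```

So drawing the red ball should make you about 99% confident the bag held only 5 balls.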

In thinking about this question, I came up with 5 thought experiments that I think get progressively closer to accurately modeling the right way to think about this question. In these analogies, God creates some number of boxes (aka people who you might have wound up being), and depending on the way you frame her randomization/box-creation process, which of the above perspectives you should take seems to differ.

**1. Box Before Universe**

God tells you there are ten boxes. When you fall asleep, she'll randomly put you in one of them. Then, she'll flip a coin. If it's tails, she'll make another 999,999,999,990 boxes.

You wake up in box 3.

In this scenario, you’re clueless about God's coin toss, obviously. Whether or not she created the additional boxes, the odds of you being in box 3 are the same. If you don’t think the universe is deterministic, you might think that this is a good analogy for the universe, as our relative position in humanity’s larger lifespan isn’t “set” in any sense before we’re born. Perhaps the number of people who will ever exist is constantly changing, and we’re regularly affecting this number by making decisions. But if you agree with me that determinism is true (or you at least think it's possible to predict things), then you’re probably similarly dissatisfied with this first thought experiment. *There is and has always been some fact of the matter about how many people will ever exist,* and so this analogy falls short.

**2. Universe Before Box**

God tells you she's going to flip a coin. If it's heads, she'll randomly put you in one of ten boxes numbered 1-10. If it's tails, she'll randomly put you in one of ten trillion boxes numbered 1-10,000,000,000,000. You wake up in box 3.

In this scenario, the odds of waking up in box 3 if God’s coin landed heads are 1/10. The odds of waking up in box 3 if God’s coin landed tails are 1/10,000,000,000,000. Therefore, by Bayes theorem, the odds of heads given that you woke up in box 3 are one-trillion-to-one. I think you should be confident the coin came up heads^{[2]}.
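For the curious, here's the arithmetic, done exactly (assuming a fair coin and that God places you uniformly at random among whatever boxes exist):

```python
from fractions import Fraction

# Likelihoods of waking up in box 3 under each coin outcome:
p_box3_heads = Fraction(1, 10)                  # 10 boxes
p_box3_tails = Fraction(1, 10_000_000_000_000)  # ten trillion boxes

# With equal priors, the posterior odds equal the likelihood ratio:
odds_heads = p_box3_heads / p_box3_tails
print(odds_heads)  # -> 1000000000000, i.e. a trillion-to-one favoring heads
```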

Similarly, assuming there is a fact-of-the-matter about how many people will exist, perhaps our “box number” can justifiably inform our evaluation of competing claims about humanity’s lifespan. But this scenario doesn’t really feel complete: there are clearly more possibilities than only 10 or 10 trillion people ever living - what happens when we try to consider *all the numbers*? The following three thought experiments try to answer this.

**3. Smiteful God 1.0**

God tells you there are ten boxes. She'll randomly put you in one of them, but you won’t be able to see the number. She will smite box 1 on day 1, smite box 2 on day 2, and so on. On day 3, after God smites the third box, you find yourself still alive and wonder how many days you have left to live.

At this point, you could be in box 4, 5, 6, 7, 8, 9, or 10, meaning you’ll live four more days on average. But the fact that three boxes have been smitten so far tells you nothing about which box you’re in (besides that you obviously weren’t in those ones).

I included this scenario to acknowledge that knowing you *aren’t* in other boxes doesn’t work as evidence about which remaining box you *are* in, even in this problematic analogy where you do somehow know the total number of boxes that exist. But as the next two experiments will demonstrate, knowing you weren’t one of these numbers *is* still useful evidence if you modify this experiment to incorporate the broader question about how many boxes (or people) to expect to be in a universe in the first place. And once we answer this question, determining our relative position within that universe is much easier.

**4. Smiteful God 2.0**

God tells you she’s going to flip a coin. If it's heads, she'll randomly put you in one of ten boxes numbered 1-10. If it's tails, she'll randomly put you in one of ten trillion boxes numbered 1-10,000,000,000,000. In either case, you won’t be able to see the number. She will smite box 1 on day 1, smite box 2 on day 2, and so on. On day 9, after God smites the ninth box, you find yourself still alive and wonder how many days you have left to live.

The odds of surviving day 9 if God’s coin landed heads are 1/10. The odds of surviving day 9 if God’s coin landed tails are 9,999,999,999,991/10,000,000,000,000. Therefore, by Bayes theorem, the odds the coin landed tails given you survived day 9 are nearly ten-to-one. You should be quite confident the coin came up tails^{[3]}.
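Running the numbers (a sketch, assuming a fair coin and uniform box placement), survival through day 9 is nearly ten times likelier in the ten-trillion-box world:

```python
from fractions import Fraction

# Likelihood of still being alive after box 9 is smitten:
p_survive_heads = Fraction(1, 10)                  # only box 10 survives
p_survive_tails = Fraction(10**13 - 9, 10**13)     # almost certain survival

# With equal priors, posterior probability the coin landed heads:
posterior_heads = p_survive_heads / (p_survive_heads + p_survive_tails)
print(float(posterior_heads))  # roughly 0.09: the evidence favors tails
```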

As in the second thought experiment, when you use your observed number to evaluate which of two possible universes you are in, it becomes much clearer that your observed number is relevant information. Finally: what happens if you extend this thinking to compare not two, but infinitely many possible hypotheses about how many boxes/people might exist?

**5. Smiteful God 3.0**

God tells you she’s going to pick some number N. Then she’ll create N boxes and put you in a random one of them. She will smite box 1 on day 1, box 2 on day 2, and blah blah blah you get the point. Three days pass and your existential dread is unbearable.

View one: You have no clue when you’re gonna die. All you know is that it’s 3 days less than whatever it was 3 days ago. On average, you will live (N - 3) / 2 more days, and without knowing N you can’t know shit. Even if God chose 4 and it was improbable that you would make it here, there’s no way of knowing that, and you have no reason to suspect she chose to make 10 trillion boxes over 4 boxes or any other number. Just like in Smiteful God 1.0, the fact that three boxes have been smitten so far gives you no new information about which box you’re in.

View two: Suppose that instead of telling you she was going to choose any number, God said she was going to choose some power of 10 up to 10 trillion (could be 10, 100, 1,000, etc.). Then, you would have started with some complicated prior expectation about how many days you would live, but each time a box gets smited and you aren’t in it, the odds she chose 10 decrease and the odds she chose something above your prior expectation increase, as in experiment 4. Thus, with each successive smite, your expectation about how much longer you have to live should increase.

This same logic would apply if God told you she was going to choose any individual number up to 10 trillion, or 10^100, or something way bigger. On view one, because you don’t know N, updating it in any way feels unreasonable. But on view two, for any number God chose as the maximum value she *could* have generated, the rational response would be to update your expectation of N’s value slightly upwards each time you wake up. In fact, God doesn’t even have to tell you what the maximum number of boxes she might have chosen is: It need only be the case that there exists such a maximum number.
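The powers-of-ten version of view two is easy to sanity-check numerically. A sketch, assuming a uniform prior over the thirteen hypotheses and uniform box placement:

```python
# God picks N from {10, 100, ..., 10^13}, puts you in a uniform random
# box 1..N, and smites one box per day. Surviving day d has likelihood
# (N - d) / N under hypothesis N.
hypotheses = [10**k for k in range(1, 14)]

def posterior_after(days_survived):
    # Uniform prior, so posterior weights are proportional to likelihoods.
    weights = [max(n - days_survived, 0) / n for n in hypotheses]
    total = sum(weights)
    return [w / total for w in weights]

# The weight on N = 10 falls with each smite you survive:
for d in (0, 3, 9):
    print(d, posterior_after(d)[0])
```

Each day you survive shifts probability mass away from the small-N hypotheses toward the large ones, which is exactly the "expectation should increase over time" claim above.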

And so: The objection that we cannot update N since we’re completely clueless about it is actually an infinity-breaking-logic-type objection in disguise. As long as we accept that there is some fact-of-the-matter about how long human history could possibly be (e.g., it won’t be longer than the length of time from the big bang until the heat death of the universe), our prior expectation should be that we are not near the very beginning nor end of human existence. We should also increase our expectation of humanity’s total lifespan as time goes on.

This prior belief should of course be influenced by other evidence and I’m not sure how resilient/malleable it should be - possibly this sort of consideration is easily outweighed by our observations about technological progress, space travel, etc. But in any case, we should be (at least marginally) more receptive to the idea that humanity will end soon (which would mean we’re in the last 5% of people, which is not so unreasonable on priors) than that there will be 10^50 more humans (which would mean we’re in the first **1/10^38th** [i.e., 10^-36 %] of people^{[4]}).

The universe has already flipped the metaphorical coins to determine the total number of people who will exist. I hope I got lucky and I’m merely one of the first human beings who will ever live - but I somewhat doubt that’s the case.

**Summary:** I think it’s valid for our prior expectation to be that we’re in the middle of humanity. As long as we believe there is a true, finite answer to the question, “how many people will ever exist,” we should theoretically be able to lay out all possible numbers of people who could exist and view them as competing hypotheses. Assuming we’re completely uncertain, we can assign each number an equal chance to start. Low numbers (120 billion being the minimum) look implausible simply because there are a ton of higher numbers the answer could be. Super high numbers also look implausible because the odds of drawing 120 billion from a bag with numbers 1 through N become increasingly slim as N approaches infinity. The answer which minimizes both kinds of surprise is N = 2X (where X is the number of people who have lived so far)^{[5]}, suggesting that we should assume we’re living in the middle of humanity by default.

By this logic, why is it not the case that 120 billion is more plausible than 240 billion, and we should assume we’re right at the end? 1/120b is more likely than 1/240b, right? The reason is that one also has to consider, “what are the odds that 120 billion people would have existed and I wouldn’t have been one of them, given the total number of people who will ever live in the universe is N?” Given this consideration, 120 billion and 1 seems a lot less plausible than 240 billion. I think N = 2 times the number of humans who have lived before you optimally balances both of these considerations.
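One way to make this balancing act concrete (my own toy formalization, not anything rigorous): under hypothesis N, take the chance of holding rank X as 1/N, and the chance of 120 billion people existing without you being among them as (N − X + 1)/N. Their product peaks almost exactly at N = 2X:

```python
# Toy version of the two competing considerations, with a small X.
# score(n) = P(rank exactly X | N=n) * P(not among the first X-1 | N=n)
#          = (1/n) * (n - X + 1)/n
X = 100

def score(n):
    return (n - X + 1) / n**2

best = max(range(X, 10 * X), key=score)
print(best)  # -> 198, i.e. roughly 2 * X
```

(Calculus gives the exact peak at N = 2X − 2, which for X = 120 billion is 2X for all practical purposes.)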

2. A note on the math here: I assume you assign a lower overall probability to p(box 3|tails) than to p(box 3|heads). This seems obviously correct to me. However, you might just reject all of this anthropic reasoning stuff and assign an equal probability to every possible scenario out of ignorance/humility or something. In this case, if you wake up in box 3, you don’t gain any information about God’s coin flip, and your credence the coin landed heads should still be 50%. Note, though, that this perspective also has the counterintuitive implication that *before God put you in any box, it was more likely the coin would come up tails than heads,* since there were more possible outcomes that could follow from the coin landing tails (if you’re confused, try assigning a probability to each possible outcome in this scenario and you’ll see what I mean; see also criticisms of the halfer position in the sleeping beauty problem). Alternatively, you could just stubbornly insist on not needing to be consistent with how you assign probabilities to things. But I don’t find that satisfying either (and I think I could decimate anyone with this belief in a casino). So in the remaining thought experiments I continue to assume the way I applied Bayes theorem here is valid.

3. I think some people might have an objection to experiment 4 that sounds like, “but if you had been in one of boxes 1-9, you wouldn’t be around anymore, and so you wouldn’t be making this judgment!” This kind of argument is exceptionally hard to think about, but while I think it might be relevant to the question “which box are you in?” in this case, I’m quite confident it *isn’t* relevant to the question, “how many boxes are in your universe?” I can’t see how this consideration would affect the evidential value of knowing that, alas, we were *not* in boxes 1-9. Sorry for including the word *alas*, that was obnoxious.

4. Tbf, I think most people who say 10^50 people might live in the future are counting digital people, who account for a majority of the people-in-expectation. And in this case, they’d probably endorse the view that we’re most likely *not* in the first 1/10^38th of people; we’re actually probably in one of these future simulations (from the perspective of the early non-digital guys). I think this is consistent, just not what most people are assuming when they hear nerds say that 10^50 people will live in the future.

5. I want to insert math here later.

Thanks for this write-up!

You might already be aware of these, but I think there are some strong objections to the doomsday argument that you didn't touch on in your post.

One is the Adam & Eve paradox, which seems to follow from the same logic as the Doomsday argument, but also seems completely absurd.

Another is reference class dependence. You say it is reasonable for me to conclude I am in the middle of 'humanity', but what is humanity? Why should I consider myself a sample from all Homo sapiens, and not, say, apes, or mammals, or Earth-originating life? What even is a 'human'?
