The tool linked in the article for estimating total earnings has an input labeled "Estimated annual raises & cost of living increases". I don't know what this represents. How do I calculate it for myself?
Changing this input by 10% results in a roughly 3% change in the result, or about $100,000 in total earnings, so the model's output is quite sensitive to it.
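To illustrate the sensitivity, here is a minimal sketch of one plausible interpretation of that input: a constant annual growth rate compounded onto a starting salary. This is my own toy model, not necessarily the one the linked tool uses, and the figures ($100k starting salary, 3% growth, 40 years) are illustrative assumptions.

```python
def total_earnings(start_salary, annual_growth, years):
    """Sum of each year's salary, with a constant annual raise
    (raises + cost-of-living increases) compounded each year."""
    return sum(start_salary * (1 + annual_growth) ** t for t in range(years))

base = total_earnings(100_000, 0.030, 40)
bumped = total_earnings(100_000, 0.033, 40)  # the growth input raised by 10%

relative_change = (bumped - base) / base
print(f"Base: ${base:,.0f}, bumped: ${bumped:,.0f}, change: {relative_change:.1%}")
```

With these particular assumptions the output shifts by several percent, which is in the same ballpark as the sensitivity described above; the exact figure depends on the starting salary, horizon, and baseline rate.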
Author here. Here are some of the novel aspects of this essay (to the best of my knowledge).
* Identified a new longtermist cause area, increasing humanity's Kardashev scale, and identified a Kardashev ~1.7 civilization as the scale that is particularly relevant to longtermism.
* Rigorously argued for a strategy that achieves existential risk lower than the natural background rate that prevailed before artificial extinction risks were invented.
* Argued for a strategy that ensures exponentially decreasing existential risk for all future time, up to the heat death of the universe.
* This was achieved by generalizing extinction events into cause and effect, an infohazard-free way of analyzing extinction risks. This appears to be the first EA literature to do so.
* Described the novel idea that the expansion of the universe can be used to defend against existential risks of all types.
* Identified another longtermism-relevant feature of higher-level Kardashev civilizations, in addition to simulating conscious experience, etc.
* Set the appropriate scale of longtermist thinking to the cosmic event horizon. Most EA literature on space settlement assumes humanity's maximum expansion is limited to this galaxy: there are 64 mentions of the word "galaxies" in longtermist EA Forum articles, and only one other longtermist article mentions the cosmic event horizon.
* Discussed the heat death of the universe as the ultimate existential threat. There are 15 other longtermist EA Forum articles that discuss this, so this point isn't particularly novel or neglected.
* The Kardashev scale is mentioned in only one previous longtermist EA Forum article, which, in its own words, says it "is hard to take seriously". Here I show it should be taken seriously.
* Introduced the concept of self-determination into the longtermist discussion. Self-determination is mentioned in a serious capacity only twice in longtermist EA Forum articles, so it is not completely novel but is quite neglected in the EA literature.
* Argued against the current longtermist majority view that minimizing existential risk now is all that matters. This is also not novel, but it does challenge the current paradigm.
* Discussed a novel mechanism by which we can trade off between a marginally bigger future and a marginally more robust future.
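On the exponentially decreasing risk point, here is a quick sketch (my own illustration, not a claim about the essay's exact model) of why the exponential decay matters: it makes long-run survival probability strictly positive, which constant risk never does. Assume per-period extinction risk $r_t = r_0 e^{-\lambda t}$ with $0 \le r_0 < 1$ and $\lambda > 0$.

```latex
P(\text{survive all periods}) = \prod_{t=0}^{\infty} (1 - r_t)
% Using \ln(1-x) \ge -x/(1-x) for 0 \le x < 1, and r_t \le r_0:
\ln P \;\ge\; -\frac{1}{1-r_0} \sum_{t=0}^{\infty} r_0 e^{-\lambda t}
      \;=\; -\frac{r_0}{(1-r_0)\,(1 - e^{-\lambda})} \;>\; -\infty
```

So $P > 0$: the cumulative risk sum converges, whereas under any constant per-period risk $r > 0$, survival probability $(1-r)^t \to 0$.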