
SaraAzubuike

315 karma

Comments (31)

A life saved in a rich country is generally valued more highly than one saved in a poor country, because the value of a statistical life (VSL) rises with wealth. Conversely, a dollar transferred to a rich country does less good than a dollar transferred to a poor country, because the marginal utility of money falls as wealth increases.

So [$ / lives saved] is the wrong metric; we should use [$ / (lives saved * VSL)] instead. On that measure, GiveDirectly may be undervalued compared to other programs that save lives. Can someone confirm whether this reasoning holds?
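The proposed adjustment can be sketched numerically. This is a minimal illustration with invented figures (the programs, costs, and relative VSL weights below are all hypothetical, not real charity data), showing how the two metrics can rank programs differently:

```python
# Sketch: plain $/life vs. VSL-weighted $/life, using made-up numbers.

def cost_per_weighted_life(cost_usd, lives_saved, vsl_weight):
    """Cost per life saved, with each life weighted by a relative VSL
    factor (1.0 = reference rich country, <1.0 = poorer country)."""
    return cost_usd / (lives_saved * vsl_weight)

# Hypothetical program A: saves many lives in a poorer country
# (relative VSL weight 0.2 — an assumption for illustration).
a_plain = 1_000_000 / 200                                   # $5,000 per life
a_weighted = cost_per_weighted_life(1_000_000, 200, 0.2)    # $25,000

# Hypothetical program B: saves fewer lives in a richer country
# (relative VSL weight 1.0).
b_plain = 1_000_000 / 50                                    # $20,000 per life
b_weighted = cost_per_weighted_life(1_000_000, 50, 1.0)     # $20,000

# Under plain $/life, A looks 4x better; under the VSL-weighted metric,
# B comes out ahead — which is the ranking reversal the comment describes.
print(a_plain, a_weighted, b_plain, b_weighted)
```

Whether the reversal happens depends entirely on how steeply VSL scales with income, which is the contested empirical question.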

Controversiality need not be strongly correlated with outrage; in fact, outrage can be very uncontroversial (consider a school shooting), and controversy is often productive (debate about X). My inclination is to trust the readership of this forum: promoting the visibility of controversial posts will help people discuss ideas they've neglected.

One reaction I've seen in several places, mostly outside EA, is something like, "this was obviously a fraud from the start, look at all the red flags, how could EAs have been so credulous?" I think this is mostly wrong: the red flags they cite (size of FTX's claimed profits, located in the Bahamas, involved in crypto, relatively young founders, etc.) are not actually strong indicators here. Cause for scrutiny, sure, but short of anything obviously wrong.


To make money, you not only have to be right; you have to be right at the right time. Imagine you predicted the COVID pandemic in 2018 and shorted the market starting then. By 2020, the losses on that short would have wiped out your cash before the crash you predicted ever arrived.

EA, on the other hand, is not trying to make money, so the community doesn't care about timing the way a trader does; it cares about preparation. If in 2018 we know a pandemic is coming, we start preparing in 2018, and when it arrives in 2020, we are ready.

Thus, for the EA community, what was really more salient were articles such as this piece by Paul Krugman:

stablecoins...resemble 19th-century banks,...when paper currency was issued by largely unregulated private institutions. Many of these banks failed, in some cases due to fraud but mostly due to bad investments.

[this is a repost from a comment elsewhere]

Thanks for taking the time to comment. The details of the interaction between Alameda and FTX were very hard to pinpoint, and the timing was such that it was very hard to profit from the collapse, even if you were deeply skeptical of cryptocurrencies to begin with. Hence the whole misplaced discussion on the forum along the lines of, "Institutional investors, who have a profit motive, didn't foresee this. How could we have?" Note, too, that other exchanges, such as Binance, have not experienced similar meltdowns.

But to make money, you not only have to be right; you have to be right at the right time. Imagine you saw the COVID pandemic coming in 2018 and shorted the market starting then. By 2020, the losses on that short would have wiped out your cash before the crash you predicted ever arrived.

EA, on the other hand, is not trying to make money, so the community doesn't care about timing the way a trader does; it cares about preparation. If in 2018 we know a pandemic is coming, we start preparing in 2018, and when it arrives in 2020, we are ready.

Thus, for the EA community, what was really more salient to prediction was this observation from Paul Krugman:

stablecoins...resemble 19th-century banks,...when paper currency was issued by largely unregulated private institutions. Many of these banks failed, in some cases due to fraud but mostly due to bad investments.

The important thing is to design a system in which it takes more work to (a) post a lie or (b) refute the truth, and in which there is an incentive to (a) post the truth, (b) refute a lie, and, importantly, (c) read and spread the truth. Whether that is best achieved through citations or a reputation-based voting system is beyond me, but it's something I've been mulling over for quite some time.

I like to think that the open exchange of ideas, conducted properly, converges on the correct answer. Of course, the forum in which this exchange occurs is crucial, especially its systems and software. Compare the amount of truth you obtain from the BBC, Wikipedia, Stack Overflow, Kialo, Facebook, Twitter, Reddit, and the EA Forum. Each has a different method of verifying truth, and the beauty of all of them is that, with the exception of the BBC, you can post whatever you want.

But an inconvenient truth will be penalized in different ways. On Wikipedia, it might get edited out in favor of something tamer, though often it is not. On Stack Overflow, it will be downvoted but remain available, and likely read. On Kialo, it will be refuted, although if it is the truth, it will be promoted. On Facebook and Twitter, many might even reshare it, though into their own echo chambers. On Reddit, it will get downvoted and then reposted to r/unpopularopinion.

I think a post on past frauds would be very welcome, although a list of reading recommendations would be equally helpful and would require less work for you. EA has a lot to learn from more diverse voices that are more experienced in management within large organizations.

But I'm concerned that they couldn't simply state why they believe AI is more important than climate change, rather than resorting to this over-complicated scheme.

Agree

Disagree here because I don't want to see an EA forum that values controversial posts.

Disagree. This is like saying, "Amazon shouldn't sort by 1 star, because otherwise it will get a bad reputation for selling bad products."

That's wrong. People still have the option of sorting by whatever they choose. But the forum should give more visibility to posts that break people out of their comfort zone, should they desire. 

Yes, I now think anonymity of the sort I proposed is the wrong way to go about this. Can you think of a better solution?
