
XelaP

0 karma · Joined

Comments: 3

As a Christian myself, I believe this is by design and we are not supposed to bear this weight by ourselves. We are not designed to (and hence not supposed to) be able to comprehend or FEEL the tremendous amount of suffering in the world.

That's God's weight and God's burden to carry.

To me: "not supposed to" is, I think, usually not something to care about. If I thought I'd get more of what I want from better comprehending/feeling the tremendous amount of suffering in the world, and if I thought I had a shot at doing so (relative to the cost), then I would. Since I think that doing so would suck and not actually lead to me helping more people, I don't.

This does mean that I think everyone should dwell on the suffering (with some concrete individual example cases to help it hit home) at least once in their life, potentially once every decade as a ritual... but not for more than a day, definitely not regularly. Do it once, decide what you want to do about it, and then put it aside.

If you could guesstimate the counterfactual, you could try giving rewards according to the Shapley value. It incentivizes contributing to the task (where the strength of the incentive is relative to your BATNA because you had the option of not participating - e.g. it's pointless to pay you a $100 reward if you could've spent the time earning $200 at your day job). As for actually evaluating how good the counterfactuals would be in each case... well, let's say I'm glad I'm not the one that has to do that work.
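As a sketch of what that would look like (toy numbers, not a claim about any real payout scheme), the exact Shapley value just averages each player's marginal contribution over every ordering of the players:

```python
from itertools import permutations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley value: average each player's marginal
    contribution over all orderings of the players."""
    totals = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = set()
        for p in order:
            before = value(frozenset(coalition))
            coalition.add(p)
            totals[p] += value(frozenset(coalition)) - before
    n_orders = factorial(len(players))
    return {p: t / n_orders for p, t in totals.items()}

# Toy numbers: alone, A produces 100 and B produces 200
# (roughly, their outside options); together they produce 400.
def v(coalition):
    return {frozenset(): 0,
            frozenset({"A"}): 100,
            frozenset({"B"}): 200,
            frozenset({"A", "B"}): 400}[coalition]

print(shapley_values(["A", "B"], v))  # A gets 150, B gets 250
```

In this toy game, B, who could produce more alone, ends up with the larger share of the joint 400.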

A central intuition of the Shapley value is that players who had better BATNAs should be paid more. Of course, there are reasons why you might fundamentally disagree that this is "fair" in a different sense (perhaps you think it is immoral to give a rich guy more money for contributing the same amount just because he could've done more by himself), but I do claim that at least when it comes to incentives this is a sensible thing to do.

The Lightcone fundraiser posts mentioned that when setting Lighthaven prices, they shoot for charging half of the surplus produced by having the event run at Lighthaven rather than elsewhere. That is quite literally shooting for the Shapley value. You could try asking the Lightcone team about the details. It looks like their strategy is just to nicely ask the other party how much better they think Lighthaven is than their BATNA (that's all the information you need from the other party, since you can estimate your own costs for running the event). Of course, this breaks down when you can't trust the other party to tell the truth, and becomes intractable when there are more than two parties.
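For two parties, the half-the-surplus rule is easy to write down. A minimal sketch with made-up dollar figures (the function name and all numbers are mine, not Lightcone's actual formula):

```python
def half_surplus_price(value_here, value_at_batna, venue_cost):
    """Charge cost plus half the surplus the venue creates, so both
    sides split the gains evenly (the two-player Shapley outcome).
    Purely illustrative -- not Lightcone's real pricing formula."""
    surplus = value_here - value_at_batna - venue_cost
    return venue_cost + surplus / 2

# Hypothetical: organizers value the event at $50k at this venue vs
# $30k at their backup venue, and hosting costs the venue $8k.
price = half_surplus_price(50_000, 30_000, 8_000)
print(price)  # 14000.0: venue nets $6k, organizers net $6k over BATNA
```

Both sides walk away with half of the $12k surplus, which is why asking the other party for their BATNA is the only extra information needed.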

The first real-world example that comes to mind... isn't about agents bargaining. It's statistical models. The idea is that you have some subparts that each contribute to the prediction, and you want to know which are the most important, so you calculate Shapley values ("how well does this model do if it only uses age and sex to predict life expectancy, but not race?", and so on for the other coalitions).
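For the model-attribution use, the same averaging runs over feature coalitions instead of players. A toy sketch with invented scores (the feature names and numbers are illustrative, not from any real model):

```python
from itertools import permutations
from math import factorial

# Hypothetical scores (say, R^2) for a model trained on each feature
# subset; all numbers here are made up for illustration.
scores = {
    frozenset(): 0.00,
    frozenset({"age"}): 0.40,
    frozenset({"sex"}): 0.10,
    frozenset({"age", "sex"}): 0.55,
}

features = ["age", "sex"]
attribution = {f: 0.0 for f in features}
for order in permutations(features):
    used = set()
    for f in order:
        gain = scores[frozenset(used | {f})] - scores[frozenset(used)]
        attribution[f] += gain / factorial(len(features))
        used.add(f)
print(attribution)  # age ~0.425, sex ~0.125 (sums to the full 0.55)
```

The attributions always add up to the full model's score, which is what makes this a popular way to split "credit" among features.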

Here's a microecon Stack Exchange question that asks something similar to yours. The only non-stats answer says that a bank used Shapley values to determine capital allocation across investments. It sounds like they didn't have a problem with needing a 'time machine', because they had the realized performance of the investments and so could simply evaluate what returns they would've gotten had they invested differently. But I haven't read it thoroughly, so for all I know they stopped using it soon after, or had some other way to evaluate counterfactuals.