Hey there~ I'm Austin, currently building https://manifold.markets. Always happy to meet people; reach out at akrolsmir@gmail.com, or find a time on https://calendly.com/austinchen/manifold !
Yeah - though in practice the charity payouts are transferred once a quarter anyways, so a month or two delay in rolling out payouts wouldn't change the results much.
In any case, I definitely think now is as good a time as any to do your charity allocations, given our general uncertainty about how all of this will look!
(I'm pretty bullish on sweepstakes payouts actually happening, I think like 80% chance this year. If they don't, then probably something like the charity program would make sense again)
Thanks for posting this, Henri! I'm happy to answer any questions you might have regarding the changes here, the donation program, the future of Manifold or anything else like that.
Very briefly:
Speaker there was me - I think there's like a ~70% chance we decide to end the charity program after this round of payments, tentatively as of May 15 or end of May.
The primary reason is that the real-money cash outs should supersede it, and running the charity program is operationally kind of annoying. The charity program is neither a core focus for Manifold nor for Manifund, so we might not want to keep it up. Will make a broader announcement if this ends up being the case.
For sure, I think a slightly more comprehensive comparison of grantmakers would include the stats for the number of grants, median check size, and amount of public info for each grant made.
Also, perhaps # of employees, or ratio of grants per employee? Like, OpenPhil is ~120 FTE while Manifund/EA Funds are ~2; this naturally leads to differences in writeup-producing capabilities.
So, as a self-professed mechanism geek, I feel like the Shapley Value stuff should be my cup of tea, but I must confess I've never wrapped my head around it. I've read Nuno's post and played with the calculator, but I still have little intuitive sense of how these things work even with toy examples, and definitely no idea of how they can be applied in real-world settings.
I think delineating impact assignment for shared projects is important, though I generally look to the business world for inspiration on the most battle-tested versions of impact assignment (equity, commissions, advertising fees, etc). Startup/tech company equity & compensation, for example, at least provide a clear answer to "how much does the employer value your work?". The answer is suboptimal in many ways (eg my guess is that startups by default assign too much equity to the founders), but at least it provides a simple starting point; better to make up numbers and all that.
Thanks for updating your post and for the endorsement! (FWIW, I think the LTFF remains an excellent giving opportunity, especially if you're in less of a position to evaluate specific regrantors or projects.)
Manifund is pretty small in comparison to these other grantmakers (we've moved ~$3m to date), but we do try to encourage transparency for all of our grant decisions; see for example here and here.
A lot of our transparency just comes from the fact that we have our applicants post their application in public -- the applications have like 70% of the context that the grantmaker has. This is a pretty cheap win; I think many other grantmakers could do the same if they just got permission from the grantees. (Obviously, not all applications are suited for public posting, but I would guess ~80%+ of EA apps would be.)
This is awesome! I've been a fan of Timothy's since his Full Stack Economics days, and it's great to see more collaborations between the forecasting world and journalism. AI journalism is an especially pivotal area, and so I'm glad for the additional rigor in the form of Metaculus question operationalizations.
Thanks, Nathan!
All of the development for the Manifund site has been done by our team (Rachel and me, with contributions from Lily and others). We do sometimes copy over snippets from the Manifold codebase, but haven't ever asked them for dev work.
Depends a bit on what you mean by non-standard funders? Our individual donors come from backgrounds in finance, crypto, ML, and tech; there's a lot of overlap with "person who might give to the LTFF". We would like to broaden our reach to include other kinds of funders, for sure.