This is a good discussion, but I think you're missing the strongest argument, in this context, against donation-splitting: we're not in a one-shot game. In a one-shot, donation-splitting makes much more sense. But there's a charity I donate to in part because I am convinced that they are very financially constrained (iteration impact). Furthermore, donors can respond to each other within a year. If most EAs give around December/January, OpenPhil and GWWC can distribute in February, after hearing from relevant orgs how much funding they received. If a charity credibly says they're under their expected funding, again, people can donate in response. So in practice I don't expect donation-splitting to have that positive an effect on charity financing uncertainty, particularly compared to something like multi-year commitments.
Nonlinear, thank you. Edited.
Looking at this comment after Nonlinear, I think it holds up. There exists a point at which an org loses the (moral, not legal) right to see questions / a writeup in advance, and Nonlinear was past it. Legal threats, contacting the people you spoke with, and contacting your employer are classic examples of this. I am also sympathetic to journalists covering industries that are known to react strongly, such as oil and tobacco. But the items in the list you provide do not come close to the bar of the org being untrustworthy, and that is the bar I think must be cleared.
I think that distinguishing between 1-8 hours (preferably paid), up to 40 hours, and 1-6 months is very important here. I am happiest about the shortest ones, particularly for people who would have to leave a job (part of why I think that OP is talking about the latter sort).
Correct: I'm vaguely aware of Kat Woods posting on FB, but hadn't investigated Nonlinear in any depth before; having an explicit definition of "what information I'm working with" seemed useful.
Yes, Nonlinear is smaller than expected.
I outlined a bad org with problems, even after adjusting for a hostile reporter and a vengeful ex-employee. I think the evidence is somewhat weaker than I expected (setting aside that I trust you personally), and the allegations are stronger/worse. Overall, this was a negative update about Nonlinear.
I think part of the disconnect, from my perspective, is that I have experience with small scrappy conventions that deliver good talks, an enjoyable time, and a large central room where people can mingle. The scrappier science-fiction conventions seem to charge in the range of $60-$120, usually on the lower side, and, while relying very heavily on volunteer labor and physical assets, about break even. The fancier ones might charge $250/person/weekend. That's not the true price, since it excludes what dealers pay for access, advertising, etc. But my sense of con budgets is that the ticket price is at least half of the true price.
Obviously a large chunk of that difference is the $240 on food that you're spending and they're not. Another chunk is location: said cons tend to be out in the boonies of their respective cities, passing the costs of travel or higher hotel prices along to attendees.
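To make the back-of-envelope above explicit: the $250 ticket and $240 food figures are from the comments, the "ticket is at least half of the true price" bound implies a ceiling on true per-attendee cost, and everything else is illustrative:

```python
# Back-of-envelope on convention costs, using figures from the comments.
# "Ticket price is at least half of the true price" implies
# true_cost <= 2 * ticket.
ticket = 250                 # $/person/weekend at a fancier sci-fi con
true_cost_max = 2 * ticket   # implied ceiling on true per-attendee cost
food = 240                   # food spending an EA event covers that the con doesn't

print(f"Implied true cost per attendee: at most ${true_cost_max}")
print(f"Of the difference vs. an EA event, ${food} is food alone")
```

Treat this as a sketch of the reasoning, not an accounting of any actual con's budget.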
The context that non-profit conventions tend to be $400+ is helpful: thank you. I really appreciate the transparency.
I don't think that this is a good state of affairs. I think that the points I raise range from "this should be completely unacceptable" (4, 6) to "if this is the worst credible information that can be shared, the org is probably doing very well" (3, 5). This is not a description of an org that I would support! But if a friend told me they were doing good work there and they felt the problems were blown out of proportion or taken out of context by a hostile critic and a vengeful ex-employee with an axe to grind, I would take them seriously and not say "you have to leave immediately. I can cover two months' salary for you while you find another job; that is how strongly I believe you should not work here."
As always, context is important: "the head of the org is a serial harasser with no effective checks" and "we fired someone when their subordinate came forward with a sexual harassment allegation that, after a one-week investigation, we validated and found credible: the victim is happily employed by us today" are very different states of affairs. If someone is sharing the worst credible information, then "we were slow to update on X" and "they knew X was false from A's report, but didn't change their marketing materials for another six months" can be hard to distinguish.
Running an org is complicated and hard, and I think many people underestimate how much negative spin a third party with access to full information can apply. I am deliberately not modelling "Ben Pace, who I have known for almost a decade" and instead modelling "hostile journalist looking for clicks", which I think is the appropriate frame of reference.
The worst credible information about a charity that I would expect, based on the following description (pulled from Google's generative AI summary; it may or may not be accurate, but it seemed like the best balance of engaging with some information quickly):
Nonlinear is an organization that funds and researches AI safety interventions. They also offer an incubation program that provides seed funding and mentorship. The Nonlinear Library is a podcast that uses text-to-speech software to convert the best writing from the rationalist and EA communities into audio.
The Nonlinear Fund is an organization that aims to research, fund, and seed AI safety interventions. Their incubation program provides seed funding and mentorship. The seed funding is for a year's salary, but you can also use it for other things, such as hiring other people.
The Nonlinear Library is a podcast that uses text-to-speech software to convert the best writing from the rationalist and EA communities into audio. You can listen to the podcast on Apple Podcasts and Spotify.
I am not describing a charity with ideal management practices, but envisioning one with 25 employees, active for 5 years, with governance that is poor but not shockingly or offensively bad by the standards of EA orgs. Someplace where I wouldn't be worried if a friend worked there, but where I would sympathetically listen to their complaints and would consider the org not the best use of my marginal dollar.
Maybe I am excessively cynical about what bad things happen at small charities, but this feels like a reasonable list to me. There may be other events of similar badness.
From what I can tell, Harris has impressively low name recognition and is fairly unpopular with voters. That doesn't mean that party elites won't object to an outside group sponsoring a candidate who doesn't have their blessing.
A few points.
With the same resources, it's probably easier and more effective to try to persuade candidates who are already more successful.