GWWC board member, software engineer in Boston, parent, musician. Switched from earning-to-give to direct work in pandemic mitigation. Married to Julia Wise. Speaking for myself unless I say otherwise. Full list of EA posts: jefftk.com/news/ea
I do think there are downsides to sharing draft reviews with organizations ahead of time, but they're mostly different from the ones listed here. The biggest risk I see is that the organization could use the time to take an adversarial approach:
Trying to keep the review from being published. This could look like accusations of libel and threats to sue, or other kinds of retaliation ("is publishing this really in the best interest of your career...?").
Preparing people to astroturf the comment section.
Preparing a refutation that is seriously flawed, but in a way that takes significant effort to investigate. This risks turning into the opposite of the situation people usually worry about: instead of seeing a negative review but never the org's follow-up with corrections, people might see a negative review and a seemingly thorough refutation come out at the same time, and then never see the reviewer's follow-up showing that the refutation is misleading.
I also think what you list as risk 2, "Unconscious biases from interacting with charity staff", is real. If people at an evaluator work with people at a charity, especially over long periods, they will naturally become more sympathetic. [1]
As for the other issues you list, however, I agree with the other commenters that they're avoidable:
There are many services for archiving web pages, and falsely claiming that archives have been tampered with would be a pretty terrible strategy for a charity. If you're especially concerned about this, however, you could publish your own archives of your evidence in advance (without checking with the org); see the sketch after this list. The analogy to police is not a good one, because police have the ability to get search warrants and learn additional things that are not already public.
If the charity says "VettedCauses' review is about problems we have already addressed" without acknowledging that they fixed the problems in response to your feedback, I think that would look quite bad for them. There is a risk of dispute over whether they made changes in response to your review or coincidentally, but if you give them a week to review the draft and they claim they just happened to make the changes in the short window between receiving it and your publishing, I think people would be quite skeptical.
On "It is not acceptable for charities to make public and important claims (such as claims intended to convince people to donate), but not provide sufficient and publicly stated evidence that justifies their important claims", I don't think you've weighed how difficult this is. When I read through the funding appeals of even pretty careful and thoughtful charities I basically always notice claims that are not fully backed up by publicly stated evidence. While this does sound bad, organizations have a bunch of competing priorities and justifying their work to this level is rarely worth it.
Overall, I don't think these considerations appreciably change my view that you should run reviews by the orgs they're about.
[1] Charities can also trade access (allowing a more comprehensive evaluation) for more favorable coverage, generally not in an explicit way. I think this is related to why GiveWell and ACE have ended up with a policy that they only release reviews if charities are willing to see them released. This is a lot like access journalism. But this isn't related to whether you share drafts for review.
To the extent that you update against an org, among currently existing orgs that would be 80k, not CEA. At the time this happened, current CEA and current 80k were both independently managed efforts under the umbrella organization then known as CEA and now known as EV (more).
Separately, I agree this editing was bad, but doing it in the context of a review would be much worse.
The motivation for focusing on global catastrophic risks is that these could dramatically limit humanity's potential. If, per your population ethics, such a limitation wouldn't be concerning, then it's not surprising that you wouldn't find work aiming to avert or mitigate such risks compelling.
I think the post would be clearer if it were explicit about this up front: the disagreement here isn't about the relative scale of biorisk vs factory farming, but instead about how much value there is in averting civilizational collapse and/or extinction.
Looking at the two comments, I see:
Your comment on a comment on a quick take, suggesting suing OpenAI for violating their charter and including an argument for why. Voted to +4.
Aaron's quick take, suggesting suing OpenAI for their for-profit conversion. No argument included. Voted to +173.
I don't see anything weird here. With the design of the site, a quick take is likely to get much more attention than a nested comment on a quick take, and once people start upvoting one, this snowballs because the site makes it more visible.
But even if you'd posted your comment as your own quick take I think it probably wouldn't have taken off: it doesn't give enough context for someone seeing it out of nowhere to figure out if they think it's worth paying attention to, or enough of an explanation for what a suit would look like. You can gloss this as packaging/rigor, I guess, but I think it's serving a useful purpose.
(I think neither posting is amazing: a few minutes with an LLM asking what the rules are for converting 501(c)(3)s into for-profits would have helped both a lot. I'd hold that against them if they were regular posts, but that's not a standard we do, or should, hold quick takes or comments to.)
I post a fair number of offbeat ideas like this, and they don't generally receive much attention, which leaves me feeling demoralized
In general, if you want ideas to receive attention, you should expect to put in some work preparing them for other people's attention: gather the information that will help others evaluate them, and make an argument for why the ideas are important. If you do that work and then post as a quick take or (better, but requiring more investment) a top-level post, I do think you'll get attention. That's no guarantee of a positive reaction (people may disagree that you've sufficiently made your case), but I don't think the process selects against weird ideas.
There's a reason people use "low-effort" as a negative term: you pay with your own effort in a bid for other people's attention.
I got downvoted/disagreevoted for asking if there's a better place to post offbeat ideas
Your comment starts with claims about what people want on the forum and a thesis about how to gain karma, and only gets to asking about where to post weird ideas in the last paragraph. I interpret the downvoting and disagree voting as being primarily about the first two paragraphs.
basically acknowledges that this is a hypothetical, and new ideas mostly don't get posted here
I wasn't trying to make a claim either way on this in my comment. Instead, I was adding a caveat that I was going by my impression of the site instead of taking the time to look for specific examples that would support or counter my claim, and so people should put less weight on my claim.
Thinking about it now, here are some examples of ideas that were new/weird in the sense of being pretty different from lines of thought I'd seen here before, but that still got attention (or at least comments and votes):
Top level post: Let’s think about slowing down AI
Quick take: EA Awards
Copying Chandler's response from the comments of the open thread:
Hi Arnold,
Thanks for your question! You are correct that our funds raised for metrics year 2023, $355 million, were below our 10th percentile estimate from our April 2023 blog post. We knew our forecasts were quite uncertain (80% confidence interval), and, looking back, we see two primary reasons that our forecasts were incorrect.
First, we were optimistic about the growth of non-Open Philanthropy funding. Our funds raised in 2023 from sources other than Open Philanthropy were $255 million, which is about at our 10th percentile estimate and similar to the $253 million we raised from non-Open Philanthropy sources in 2022 (see the bottom chart in the blog post). We've continued to expand our outreach team, with a focus on retaining our existing donors and bringing in new donors, and we believe these investments will produce results over the longer term.
Second, Open Philanthropy committed $300 million in October 2023 and gave us flexibility to spend it over three years. We chose to allocate $100 million to each of 2023, 2024, and 2025, which is less than the $250 million we had forecast for 2023.
We discuss our current funding situation in a recent blog post about our approach to grant deployment timelines. We remain funding constrained at our current cost-effectiveness bar. Raising more money remains our single most important lever for maximizing impact: if we have more funding, we'll be able to make more grants to cost-effective programs that save and improve lives.
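(A quick reconciliation of the figures above, my own arithmetic rather than GiveWell's: the two components account for the full 2023 total.)

$$\$255\,\text{M (non-OP sources)} + \$100\,\text{M (OP allocation to 2023)} = \$355\,\text{M raised for 2023}$$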
I don't think I gave any conclusion about CEA? I was pointing out that 80k's past actions are primarily evidence about what we should expect from 80k in the future.
I think your comment is still pretty misleading: "CEA released ..." would be much clearer as "80k released ..." or perhaps "80k, at the time a sibling project of CEA, released ...".
FYI I'm not getting into the separate incident because, as you point out, it involves my partner.