Nov 12 - 18: Marginal Funding Week. A week for organisations to explain what they would do with marginal funding.
Dec 3 - 16: Donation Election. A crowd-sourced pot of funds ($25,598 raised so far) will be distributed amongst three charities based on your votes. Continue donation election conversations here.
Intermission
Dec 16 - 22: Pledge Highlight. A week to post about your experience with pledging, and to discuss the value of pledging.
Dec 23 - 31: Donation Celebration. When the donation celebration starts, you'll be able to add a heart to the banner showing that you've done your annual donations.
Welcome to the EA Forum bot site. If you are trying to access the Forum programmatically (either by scraping or via the API), please use this site rather than forum.effectivealtruism.org.

This site has the same content as the main site, but is run in a separate environment to avoid bots overloading the main site and affecting performance for human users.
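For illustration, here is a minimal sketch of a programmatic query, assuming the bot site exposes the same public GraphQL endpoint at /graphql as the main Forum. The base URL is a placeholder for this site's address, and the query shape follows the Forum's published GraphQL examples, so it may need adjusting:

```python
# Sketch: fetch recent post titles from the bot site's GraphQL endpoint.
# Assumptions (not confirmed by this page): the bot site serves the same
# /graphql endpoint as the main Forum, and BOT_SITE_URL must be replaced
# with this site's actual address.
import requests

BOT_SITE_URL = "https://<this-bot-site>"  # placeholder: use this site, not forum.effectivealtruism.org

query = """
{
  posts(input: {terms: {view: "new", limit: 5}}) {
    results {
      title
      pageUrl
    }
  }
}
"""

response = requests.post(
    f"{BOT_SITE_URL}/graphql",
    json={"query": query},
    headers={"User-Agent": "example-research-bot/0.1 (contact: you@example.com)"},
    timeout=30,
)
response.raise_for_status()
for post in response.json()["data"]["posts"]["results"]:
    print(post["title"], post["pageUrl"])
```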

Quick takes

I wrote a post on “Charity” as a conflationary alliance term. You can read it on LessWrong, but I'm also happy to discuss it here.

If you're wondering why I didn't post it here: I originally posted it here with a LessWrong cross-post. It was immediately slapped with the "Community" tag, despite not being about the community but about the different ways people try to do good, how they talk about charity, and the ensuing confusions. It is about the space of ideas, not about actual people or orgs. With posts like OP announcements about details of EA group funding or the EAG admissions bar not being marked as community, I find it increasingly hard to believe the "Community" tag is driven by the stated principle of marking "Posts about the EA community and projects that focus on the EA community" rather than by other motives, e.g. forum mods expressing the view "we want people to think less about this / this may be controversial / we'd prefer someone new not to read this". My impression is that this moves substantive debates about ideas to the side, which is a state I don't want to cooperate with by just leaving it as it is, so I moved the post to LessWrong and replaced it with this comment.
[Promise this is not a scam] Sign up to receive a free $50 charity gift card from a rich person. Every year for the past few years, famous rich person Ray Dalio has given away 20,000 $50 gift cards, and he is doing it again this year. These can be given to any of over 1.8 million US-registered charities, which include plenty of EA charities. Here's an announcement post from Ray Dalio's Instagram for verification. Register here to receive a notification when the gift cards become available.
I just sent out the Forum digest, and I thought there was a higher number of underrated (and slightly unusual) posts this week than usual, so I'm re-sharing some of them here:

* What does starvation feel like? by @Elijah Whipple
* Detection of Asymptomatically Spreading Pathogens by @Jeff Kaufman (I think the slide-show format works really well)
* The anti-clickbait-titled Next-gen LLINs with chlorfenapyr by @Scott Smith, which analyses the effect of using chlorfenapyr rather than pyrethroid as an insecticide in bednets. This reduces the effect of insecticide resistance.
Introduction

In this post, I share some thoughts from this weekend about the scale of farmed animal suffering compared to the expected lives lost from engineered pandemics. I make the case that animal welfare as a cause has roughly 100x the scale of biorisk. I'd happily turn this into a full post if you have more you'd like to add, either for or against.

Scale Comparisons

Farmed Animal Suffering. I was thinking about the scale of farmed animal suffering, which is on the order of 10^11 lives per year. These animals endure what might be among the worst conditions on the planet, considering only land animals. My estimate for the moral weight of the average land animal is approximately 1% to 0.1% that of a human. At first glance, this suggests that farmed animal suffering is equivalent to the annual slaughter of between 100 million and 1 billion humans, without considering the quality of their lives before death. I want to make the case that the scale of this could be 100x or 1,000x that of engineered pandemics.

Engineered Pandemics. In The Precipice, Toby Ord lists engineered pandemics as a 1 in 30 extinction risk this century. Since The Precipice was published in 2020, this equates to a 1 in 30 chance over 80 years, or approximately a 1 in 2,360 risk of extinction from engineered pandemics in any given year. If that happens, 10^10 human lives would be lost, resulting in an expected loss of approximately four million human lives per year.

Reasons I might be wrong

Tractability & Neglectedness. If engineered pandemic preparedness is two orders of magnitude higher in neglectedness and/or tractability, that would outweigh the difference in scale. I'd be happy to hear someone more knowledgeable give some comparisons here.

Extinction is Terrible. Human extinction might not equate to just 10^10 lives lost, due to future lives lost. Further, The Precipice only discusses extinction-level pandemics, but as suggested by Rodriguez here, one in 100
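A rough sketch of the arithmetic in the quick take above, using the post's own figures as inputs. The variable names are mine, and the moral-weight range is the author's stated assumption rather than an established number:

```python
# Back-of-the-envelope comparison of farmed animal suffering vs. engineered pandemics.
# All inputs are the quick take's stated assumptions.

farmed_land_animals_per_year = 1e11       # lives per year (order of magnitude)
moral_weight_range = (0.001, 0.01)        # average land animal vs. a human (0.1% to 1%)

# Human-equivalent scale of farmed animal slaughter per year
animal_scale = [farmed_land_animals_per_year * w for w in moral_weight_range]
print(f"Farmed animals, human-equivalent: {animal_scale[0]:.0e} to {animal_scale[1]:.0e} per year")
# -> 1e8 (100 million) to 1e9 (1 billion)

# Engineered pandemics: 1-in-30 extinction risk over the remaining ~80 years of the century
century_risk = 1 / 30
annual_risk = 1 - (1 - century_risk) ** (1 / 80)   # ~1 in 2,360 per year
humans_lost_if_extinct = 1e10
expected_lives_lost_per_year = annual_risk * humans_lost_if_extinct
print(f"Annual extinction risk: 1 in {1 / annual_risk:,.0f}")
print(f"Expected human lives lost per year: {expected_lives_lost_per_year:.2e}")  # ~4 million
```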
What is your AI Capabilities Red Line Personal Statement? It should read something like: "When AI can do X in Y way, then I think we should be extremely worried / advocate for a Pause*." I think it would be valuable if people started doing this; we can't feel when we're on an exponential, so it's likely that powerful AI will creep up on us. @Greg_Colbourn just posted this, and I have an intuition that people are going to read it and say "while it can do Y it still can't do X".

*in case you think a Pause is ever optimal.