
David_Moss

Principal Research Director @ Rethink Priorities
7363 karma · Joined · Working (6-15 years)

Bio

I am the Principal Research Director at Rethink Priorities. I lead our Surveys and Data Analysis department and our Worldview Investigation Team. 

The Worldview Investigation Team previously completed the Moral Weight Project and the CURVE Sequence / Cross-Cause Model. We're currently working on tools to help EAs decide how to allocate resources within portfolios of different causes, and on how to use a moral parliament approach to allocate resources given metanormative uncertainty.

The Surveys and Data Analysis Team primarily works on private commissions for core EA movement and longtermist orgs, where we provide:

  • Private polling to assess public attitudes
  • Message testing / framing experiments, and testing online ads
  • Expert surveys
  • Private data analyses and survey / analysis consultation
  • Impact assessments of orgs/programs

Formerly, I also managed our Wild Animal Welfare department. I've previously worked for Charity Science and been a trustee at Charity Entrepreneurship and EA London.

My academic interests are in moral psychology and methodology at the intersection of psychology and philosophy.

How I can help others

Survey methodology and data analysis.

Sequences (3)

RP US Public AI Attitudes Surveys
EA Survey 2022
EA Survey 2020

Comments (516)

The only statistically significant results are that people who posted or commented on the Forum are more likely to be Center-left (41.2% vs 34.9% for non-Forumites), but less likely to be Left (27.8% vs 37.8%).
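For readers curious what a test behind "statistically significant" here might look like, below is a minimal two-proportion z-test sketch in Python. The group sizes are invented for illustration, and the use of statsmodels is an assumption about tooling, not a description of the actual analysis.

```python
# Illustrative two-proportion z-test of the Center-left difference above.
# The group sizes are invented; only the percentages come from the comment.
from statsmodels.stats.proportion import proportions_ztest

n_forum, n_other = 500, 1000                               # assumed group sizes
count = [round(0.412 * n_forum), round(0.349 * n_other)]   # Center-left counts
nobs = [n_forum, n_other]

z, p = proportions_ztest(count, nobs)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < .05 => "statistically significant"
```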

Thanks JWS, it certainly sounds like we agree more than we disagree.

even if it has been established I want to push back and un-establish 

That's definitely fair! 

For what it's worth, I think that the explanation for differences in support for these two different clusters of causes is more epistemic than it is to do with attitudes towards the long term or near term per se.[1] Ideally, I'd like the terms we use not to (be seen to) refer to the explanation for supporting the causes at all, since I think the reasons are heterogeneous.

In any case, we definitely agree that none of these terms are perfect, and I suspect no terms are going to be completely satisfactory, but I'm open to continued discussion about what better terms would be.

 

  1. ^

    Although, in terms of predicting "LT minus NT" cause prioritisation from our cause-related idea items, the "long term future" item and "low probability, high impact" items were about equally predictive. 

    Interestingly, this also holds true in unpublished work we have looking at the general public, for whom objections that influencing the far future is impractical or impossible are more consequential than their lack of concern for future generations.
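The footnote's "about equally predictive" point could in principle be checked by regressing the LT-minus-NT cause score on the two standardized idea items and comparing coefficients. Here's a minimal sketch under assumed file and column names (none of which are the survey's actual schema).

```python
# Minimal sketch of checking "about equally predictive": regress the
# LT-minus-NT cause score on the two z-scored idea items and compare the
# standardized coefficients. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ea_survey.csv")  # hypothetical file

# z-score the two predictors so their coefficients are comparable
for col in ["long_term_future", "low_prob_high_impact"]:
    df[col + "_z"] = (df[col] - df[col].mean()) / df[col].std()

model = smf.ols("lt_minus_nt ~ long_term_future_z + low_prob_high_impact_z",
                data=df).fit()
print(model.params)  # similar magnitudes => similarly predictive
```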

Many thanks Devon!

I agree that if you adjusted most of these results by a factor of 3-4x (for the LTF vs GHD/NT funding ratios above), you'd see GHD/NT ahead pretty much across the board. The biggest ratios in favour of longtermism in the results above are ~2x (though closer to 4x among the highly engaged specifically).
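As a back-of-the-envelope check of that adjustment (assuming a 3.5x midpoint for the 3-4x funding ratio; only the ~2x support ratio comes from the comment):

```python
# Back-of-the-envelope version of the adjustment described above.
lt_over_nt_support = 2.0   # biggest support ratio favouring longtermism
lt_over_nt_funding = 3.5   # assumed midpoint of the 3-4x funding ratio

adjusted = lt_over_nt_support / lt_over_nt_funding
print(round(adjusted, 2))  # ~0.57 < 1, i.e. GHD/NT comes out ahead
```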

That said, I think the relationship between funding and intra-EA influence is unclear. I would expect large diminishing returns, and for a lot of Meta resources to not be spent on intra-EA influence. My guess is also that a lot of the influence driving people from neartermist to longtermist causes comes from the support of EA elites in a way that is partially separable from funding.[1] So adjusting by the funding ratio would not be straightforward.

  1. ^

    Obviously funding levels and influence are causally related in both directions. That said, I imagine OP/80K/CEA advocating for cause areas would have significant influence independent of their funding levels.

This is not a bad summary overall, but has some errors/confusions:

Longtermist causes are prioritized by 63.6% of respondents vs. 46.8% for neartermist causes, a gap that widens among highly engaged EAs.

Both parts of this are technically true, but the statistic referred to in the first half of the sentence is different from the one that we reported to show the gap between low/high engagement EAs.

Comparable statistics would be:

  • Overall: 37.6% of respondents most prioritised only a longtermist cause, while 21.0% most prioritised only a neartermist cause.
  • Among the highly engaged, 47% most prioritised only a longtermist cause and just 13% most prioritised only a neartermist cause.
  • Among the less engaged, 26% most prioritised only a longtermist cause and 31% most prioritised only a neartermist cause (see the sketch below for how such a breakdown might be computed).
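For concreteness, here's a hypothetical sketch of producing a breakdown like the one above with pandas; the file, column names, and coding are assumptions, not the survey's actual schema.

```python
# Hypothetical sketch of the engagement-by-cause-type breakdown above.
import pandas as pd

df = pd.read_csv("ea_survey.csv")  # hypothetical file

# 'top_cause_type' in {"longtermist_only", "neartermist_only", "both", "other"};
# 'high_engagement' is the simple binary engagement measure
breakdown = (df.groupby("high_engagement")["top_cause_type"]
               .value_counts(normalize=True)
               .mul(100)
               .round(1))
print(breakdown)  # percentages by engagement level
```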

In an allocation task, respondents assign the most resources to Global health/poverty, then AI risk, then animal welfare

True looking at farmed animal welfare specifically, but a combined animal welfare category would be neck and neck with AI (slightly ahead, but not significantly so).
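A hedged sketch of what that comparison might look like, e.g. as a paired t-test on respondents' allocations (file and column names hypothetical):

```python
# Sketch of the "slightly ahead but not significant" comparison: a paired
# t-test on each respondent's allocation to a combined animal welfare
# category vs AI risk. Column names are hypothetical.
import pandas as pd
from scipy.stats import ttest_rel

df = pd.read_csv("allocation_task.csv")  # hypothetical file
animal = df["farmed_animal_welfare"] + df["wild_animal_welfare"]

t, p = ttest_rel(animal, df["ai_risk"])
print(f"t = {t:.2f}, p = {p:.3f}")  # p > .05 => difference not significant
```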

with actual allocations lower on Global poverty and animal welfare than the survey or an earlier survey of EA leaders.

Actual allocations to Global Poverty are higher than our survey allocations, and actual allocations to FAW (farmed animal welfare) are lower. I don't have statistics for actual allocations to WAW (wild animal welfare), but they are likely dramatically lower.

Thanks for the detailed comment!

I have to object to this. I don't think longtermism is best understood as a cause, or set of causes, but more as a justification for working on certain causes over others. e.g.:

Working on Nuclear Risk could be seen as near-termist.

We agree and say as much:

As we have noted previously, people may of course prioritise these causes for reasons other than longtermism or neartermism per se. Likewise, people might support the ‘Other’ causes here for neartermist or longtermist reasons. 

And here's what we say at the linked previous post:

For simplicity, we label support for these, non-longtermist, causes “neartermist”, as we have in previous years. However, it’s worth noting explicitly that there is little reason to suppose that neartermism specifically (e.g., attitudes or beliefs related to helping present vs. future generations or different time preference) explains support for these causes, rather than different epistemic beliefs (e.g., about appropriate kinds of evidence) or support for more traditional causes etc.

We also explicitly discuss your example of Climate Change making much the same point:

The classifications we used here were informed by our prior analyses of the factor structure of the cause prioritisation items and a priori theoretical considerations (namely Climate Change is clearly associated with the neartermist causes not the longtermist causes, but theoretically some might think of it as a longtermist cause, and we count it as ‘Other’ in this analysis)

I acknowledge though that your concern is that "this categorisation helps to reify and entrench those divisions". I think this is possible, but I think that:

  • There is something[1] unifying these causes
  • It is important to talk about that (I think we'd be more confused/less informed if we just considered all the causes as separate/independent). 
  • Referring to these clusters of causes and ideas in terms of "longtermism" and "neartermism" is established terminology. Crucially, I don't think there's an obviously better set of terms, because "existential risk" wouldn't capture some causes within this bucket (e.g. in previous years we had a "broad longtermism" item which was also part of this cluster)

I think it's important enough not to hide the details in footnotes

I think it's an entirely reasonable view to think discussion of this should be in the text, not a footnote. Though we had a lot of information in both the footnotes and appendix, so it's tricky.

 

  1. ^

    Though I don't claim it's a single thing, rather than a cluster of correlated things. And empirically, our longtermist-neartermist cause scores measure is strongly correlated with people's stated abstract beliefs. The single abstract item explicitly about longtermism is correlated with LT-NT at r=0.457, which is appreciably strong for a necessarily noisy cause prioritisation score and a single item, in a social science context.
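A minimal sketch of how such a correlation is computed; the data and column names are hypothetical, and only the r = 0.457 figure comes from the footnote.

```python
# Minimal sketch of the item-to-score correlation reported above.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("ea_survey.csv")  # hypothetical file
valid = df[["longtermism_item", "lt_minus_nt"]].dropna()

r, p = pearsonr(valid["longtermism_item"], valid["lt_minus_nt"])
print(f"r = {r:.3f}, p = {p:.4f}")  # the footnote reports r = 0.457
```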

Thanks Grayden!

  • I strongly agree that engagement =/= commitment or impact. 
  • That said, I'd note that the trend for higher engagement to be associated with stronger support for longtermist over neartermist causes is also observed across other proxies for engagement. For example, perhaps most surprisingly, having taken the GWWC pledge is (within our sample) significantly associated with stronger support for LT over NT.

Thanks for the question, Huw. This is in reference to the EA cause of mental health, rather than mental health for EAs (which would fall under movement building, assuming that the purpose was building the movement rather than just narrowly helping EAs). This was a requested addition from 2018.

Thanks!

do you see that any changes over time in cause prioritization could be explained by changing demographics? You mentioned e.g. engagement level predicting cause prioritization. I am then thinking that if the % of low to high engagement EAs have changed over time, perhaps that partially drives the trends in your 4th chart?

Yes, that's definitely right, and one of the reasons why we plotted the interaction between engagement and time in the appendix. 

The percentage of high engagement EAs (using the simple binary measure) has increased over time, from 44% in 2019 to 49% in 2020 and 55% in 2022. So you would expect this to increase support for causes which are shown to be more strongly supported by high engagement EAs across those years. That said, looking at the interaction plots, you can also see that support for Biosecurity and Nuclear and AI has increased among both high and low engagement EAs over that time period.
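A sketch of how that engagement-by-year interaction might be modelled, with assumed file and column names (not the actual analysis):

```python
# Sketch of an engagement-by-year interaction, as an OLS model with an
# interaction term. File and column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ea_surveys_2019_2022.csv")  # hypothetical pooled file

# 'support_ai' is a cause-support rating; C(year) treats year as categorical
model = smf.ols("support_ai ~ C(year) * high_engagement", data=df).fit()
print(model.summary())  # interaction terms test whether trends differ by group
```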

It seems possible that both of these are neglected for similar reasons. 

It seems surprising that funding would be the bottleneck (which would mean you can't just have more of both). But that has been my experience surprisingly often, i.e. core orgs are willing to devote many highly valuable staff hours to collaborating on survey projects, but balk at ~$10,000 survey costs.
