This is a linkpost for our new and improved public dashboard, masquerading as a mini midyear update
The headline numbers
- 4,336 conference attendees (2,695 EA Global, 1,641 EAGx)
- 133,041 hours of engagement on the Forum, including 60,507 hours of engagement with non-Community posts (60% of total engagement on posts)
- 26 university groups and 33 organizers in the University Group Accelerator Program (UGAP)
- 622 participants in Virtual Programs
There’s much more, including historical data and a wider range of metrics, in the dashboard!
The work of our Community Health & Special Projects and Communications teams lends itself less easily to stat-stuffing, but you can read recent updates from both:
- Community Health & Special Projects: Updates and Contacting Us
- How CEA’s communications team is thinking about EA communications at the moment
What else is new?
Our staff, like many others in the community (and beyond), have spent more time this year thinking about how we should respond to the rapidly evolving AI landscape. We expect more of the community’s attention and resources to be directed toward AI safety at the margin, and are asking ourselves how best to balance this with principles-first EA community building.
Any major changes to our strategy will have to wait until our new CEO is in place, but we have been looking for opportunities to improve our situational awareness and experiment with new products, including:
- Exploring and potentially organizing a large conference focussed on existential risk and/or AI safety
- Learning more about and potentially supporting some AI safety groups
- Supporting AI safety communications efforts
These projects are not yet announcements or commitments, but we thought it worth sharing them at a high level as a guide to the direction of our thinking. If they intersect with your projects or plans, please let us know and we’ll be happy to discuss more.
It’s worth reiterating that our priorities haven’t changed since we wrote about our work in 2022: helping people who have heard about EA to deeply understand the ideas, and to find opportunities for making an impact in important fields. We continue to think that top-of-funnel growth is likely already at or above healthy levels, so rather than aiming to increase the rate any further, we want to make that growth go well.
You can read more about our strategy here, including how we make some of the key decisions we are responsible for, and a list of things we are not focusing on. And it remains the case that we do not think of ourselves as having or wanting control over the EA community. We believe that a wide range of ideas and approaches are consistent with the core principles underpinning EA, and encourage others to identify and experiment with filling gaps left by our work.
And finally, it wouldn’t be a CEA update without a few #impact-stories:
- Training for Good posted about their EU Tech Policy Fellowship on the EA Forum. Of the 100+ applications they received, 12 came from the Forum, and 6 of those 12 successfully made it onto the program, out of 17 total program slots.
Community Health & Special Projects
- Following the TIME article about sexual misconduct, people have raised a higher-than-usual number of concerns from the past that they had noticed or experienced in the community but hadn't raised at the time. In many of these cases we’ve been able to act to reduce risk in the community, such as warning people about inappropriate behavior and removing people from CEA spaces when their past behavior has caused harm.
- The team’s work on AI safety communications has led to numerous examples of positive coverage in the mainstream media.
- An attendee at the Summit on Existential Security fundraised more than 20% of their organisation’s budget through a connection they made at the event.
- An attendee at EAG Bay Area subsequently changed their plans, applying to and being accepted by the Charity Entrepreneurship Incubation Program.
- EAGxIndia helped accelerate a project focused on AI safety movement-building in the region, which has now had funding approved.
- A co-team lead for a new-in-2023 EAGx conference credited Virtual Programs and their interactions with their program facilitator as the main reason they were involved in organising the conference and in EA more generally.
- A University Groups retreat led one attendee to work with an EA professional on her Honors program, and to intern with CEA.
- EA Sweden, a group supported by our Post-uni Groups team via the Community Building Grants program, founded The Mimir Institute for Long Term Futures Studies.
We’re now looking for a “CEO” rather than an “ED”, but the role scope remains unchanged
As of the end of Q2, 2023
Counting up until the May 2023 virtual programs cohort
We use impact stories (and our internal #impact-stories channel) to illustrate some of the ways we’ve helped people increase their impact: by providing high-quality discussion spaces in which to consider their ideas, values, and options for making an impact, and by connecting them to advisors, experts, and employers.