
It’s been four months since our last forum update post! Here are some things the dev team has been working on.

We launched V1 of a new Events page

It shows you upcoming events near you, as well as global events open to everyone! We think it’s now the most complete overview of EA events that exists anywhere.

Some improvements we’ve made over the last few months:

  1. Anyone can post an event on the new Events page by clicking on “New Event” in their username dropdown menu in the Forum. (We also have a contractor who cross-posts many events).
  2. You can easily add an event to your calendar, or be notified about events near you.
  3. Events now have images, which we think makes the page more engaging and easier to parse.
  4. We’ve improved the individual event pages to show key information more clearly and make it obvious how to attend.
  5. You can see upcoming events in the Forum sidebar.

If you think the new Events page is useful, please share it widely! :)

We also made a number of small improvements to the Community page, and we’re working on a significant redesign, to make it more visual and groups-focused.

Update: We launched the redesigned Community page! This will eventually replace the EA Hub groups list. (If you would like to be assigned as a group organizer to one of the groups on the Forum, or if you know of groups that are missing, please let me know.)

100+ karma users can add co-authors

It’s now possible for users to add co-authors to their posts. As a precaution against spamming, this is currently only available to users with 100+ karma. If you have less than 100 karma, feel free to contact us and we’ll add co-authors for you.

We updated the Sequences page

We renamed it to “Library” and highlighted some core sequences, like the Most Important Century series by Holden Karnofsky.

We merged our codebase with LessWrong

We now share a GitHub repo: ForumMagnum. Feel free to check out what we’re working on, and do let us know of any issues you see.

We ran the EA Decade Review

Thanks to everyone who participated! The Review ran from December 1 to February 1. Our team has been busy since then, but we should be posting about the results soon - I know I’m looking forward to reading them! :)

We started reworking tag subscriptions

Currently, “subscribing” to a tag on the Forum means you get notifications for new posts with that tag. However, we are moving more toward the YouTube model, where “subscribing” weights posts with that tag more heavily on the frontpage, and you can separately sign up for tag notifications via the bell icon. See more details here.

Right now this new version is behind the “experimental features” flag, so if you want to play with it you’ll need to enable that option in your account settings.
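For the curious, the weighting model described above can be sketched roughly like this. This is a hypothetical illustration, not the actual ForumMagnum scoring code - the function name, boost factor, and data shapes are all made up for the example:

```python
# Hypothetical sketch of tag-weighted frontpage ranking: posts carrying a
# tag the user has subscribed to get their base score multiplied by a boost.
# (Illustrative only; not the real ForumMagnum algorithm or parameters.)

def frontpage_score(base_score, post_tags, subscribed_tags, boost=1.5):
    """Return the post's frontpage score, boosted if it has a subscribed tag."""
    if post_tags & subscribed_tags:  # any overlap between tag sets
        return base_score * boost
    return base_score

posts = [
    ("Forum update", 10.0, {"meta"}),
    ("AI safety post", 10.0, {"ai-safety"}),
]
subscriptions = {"ai-safety"}

ranked = sorted(
    posts,
    key=lambda p: frontpage_score(p[1], p[2], subscriptions),
    reverse=True,
)
print([title for title, _, _ in ranked])  # → ['AI safety post', 'Forum update']
```

The point of the model is that two posts with the same raw karma can rank differently for different users, while notifications become an opt-in toggle (the bell icon) rather than an automatic consequence of subscribing.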

You can now change your username

You can now change your username (i.e. display name) yourself via your account settings page. However, you only get one change - after that, you’ll need to contact us to change it again. 

We can also hide your profile from search engines and change the URL associated with your profile. Please contact us if you’d like to do this.

We added footnotes support to our default text editor

Last but certainly not least, we deployed one of the most requested Forum features: footnotes! See the standalone post for more details.

Questions? Suggestions?

We welcome feedback! Feel free to comment on this post, leave a note on the EA Forum feature suggestion thread, or contact us directly.

Join our team! :)

We’ve built a lot these past few months, but there’s much more to be done. We’re currently hiring for software developers to join our team and help us make the EA Forum the best that it can be. If you’re interested, you can apply here.


Comments



> 100+ karma users can add co-authors

Much appreciated!

> Currently, “subscribing” to a tag on the Forum means you get notifications for new posts with that tag. However, we are moving more toward the YouTube model, where “subscribing” weights posts with that tag more heavily on the frontpage, and you can separately sign up for tag notifications via the bell icon.

This is great. It would also be valuable (though probably not high priority) to have a Wikipedia-style "watchlist" where users could see all the activity related to the entries they are subscribed to, including new edits to those articles.

Separately, in order to avoid needless jargon, I vote for calling the "sequences" collections.

Thanks for the suggestion! I've added it to our list for triage. Also I agree that "sequence" is unclear - personally I'm a fan of "series", since it still implies that there is an order, but I haven't put that much thought into it. :)

Also [very minor]: the "load more" button loads 10 additional posts, but because of the three-column layout, this means that two out of three times the final line of posts will be incomplete. I think the "load more" button should instead load nine posts, or a multiple of three.

Yeah, I agree "series" would be more appropriate if the collected posts are ordered, though it seems that some of the "sequences" in the library are not meant to be read in any particular order.
