TLDR: UK's ARIA (like DARPA) just launched a "Collective Flourishing" opportunity space looking for technical tools to improve humanity's long-term decision-making. They want feedback and research suggestions. Let's crowdsource relevant EA/longtermist work to inform their program (typical grants: £10k-£500k).
Hi all! For those who weren't aware, the UK has an advanced science research body called ARIA, which hopes to find new technological breakthroughs in the way the US's DARPA has in the past.
They run several opportunity spaces, in areas such as neurotechnology and materials research.
A new opportunity space, 'Collective Flourishing', has just been launched with a call for recommendations. [link]
What are they looking for?
They are trying to find technical tools to help make human flourishing more likely.
They want feedback on this idea. Especially:
- Positive/constructive feedback about this opportunity space
- Relevant academic literature
- Technological developments they may have missed
- Novel, forward-thinking research avenues
- Relevant R&D directions
Why is this relevant?
This seems like UK government funding for longtermist infrastructure - tools for collective deliberation, foresight, and navigating towards better futures.
This overlaps with:
- Better Futures/viatopia concepts (Should we aim for flourishing over mere survival? The Better Futures series.)
- Long Reflection/Deep Reflection work
- AI governance and collective decision-making
- Reducing risks from concentrated power
Opportunity
I was thinking we could use our hive mind to connect and inform all the organisations that are already working on this.
Impact: researchers outside EA, with funding, networks, and government backing, get pointed at existing projects from the wider EA community.
ARIA also typically grants £10k-£500k per project once a space launches. Getting the right research projects in front of them could help some existing orgs win funding.
Execution
I'll start brainstorming all the orgs that might have created relevant resources here. Please comment below with any others.
I'll collect all the results in a Google Doc to send the program leads.
Their research areas (help me fill in the gaps):
Each item below is a research direction they have highlighted for feedback, with example orgs/projects underneath.
- Platforms for organising, exploring, and questioning knowledge
  - EA Forum, LessWrong, Alignment Forum
- Evidence synthesis and world modelling
  - ?
- Knowledge, belief, and value elicitation
- Scenario analysis, simulation, and wargaming
  - ?
- Decision-making under deep uncertainty, bounded rationality, and negotiating complex trade-offs
  - ?
- Methods for enhancing cognitive autonomy, reflective reasoning, and informed citizenship
  - Workshop Labs, Ground News, Verity News, Metaculus, Cauldron Labs
Please comment below with any other orgs/projects these research directions bring to mind. I'll send ARIA the summary.
The ask: if we pool our collective knowledge of projects into a coherent list, we may be able to help shape and direct real funding in this space. This is a rare chance to connect existing EA/longtermist work with serious government backing.
