Since Longtermism as a concept doesn't seem widely appealing, I wonder how other time-focused ethical frameworks fare, such as Shorttermism (focusing on immediate consequences), Mediumtermism (focusing on the foreseeable future), or Atemporalism (ignoring time horizons in ethical considerations altogether).
I'd guess these concepts would also be unpopular, perhaps because ethical considerations centered on timeframes feel confusing, too abstract, or even uncomfortable to many people.
If true, it could mean that any theory framed in opposition, such as a critique of Shorttermism or Longtermism, might be more appealing than the time-focused theory itself. Criticising short-term thinking is an applause light in many circles.
We also encourage you to share this opportunity with others who may be a good fit. If we accept a fellow whom we contacted based on your recommendation, you'll receive $100 for each accepted candidate. The recommendation form is here.
Pivotal Research is looking for an Operations & Community Manager for our 2024 Research Fellowship.
Employment Period: As soon as possible (subject to availability) to September 2024
Location: London (preferred in-person)
Employment Work-Load: 0.7 – 1 FTE
Salary Range: GBP 4,000 – 5,000 per month for 1 FTE (depending on background and experience)
Deadline: April 17th, 23:59 (CET+1), Apply Here
For the 2024 Research Fellowship, Pivotal Research (previously known as CHERI) is looking for a dedicated Operations & Community Manager to join the team. This role presents a unique opportunity to make a substantial impact in the global catastrophic risk (GCR) field by providing crucial support to the fellowship. The Operations & Community Manager will enjoy significant autonomy and decision-making authority, enabling them to play a key role in ensuring the success of the research fellowship. This position is ideal for individuals passionate about operational excellence and community engagement, aiming to contribute meaningfully to the advancement of GCR research.
Hi Oscar, thanks for the question! To clarify, only the fellowship has moved to the UK, not our entire organisation.
We've thought a lot about the pros and cons of moving from Switzerland and largely agree with your points.[1] The main driver for our decision was Switzerland's comparatively small GCR network.
We see the fellowship as an opportunity to immerse fellows in a rich intellectual environment, which London’s – and especially LISA’s – GCR ecosystem offers. Our experience of running fellowships outside of established hubs suggests that fellowships alone are not a great vehicle to build a new GCR hub due to their seasonal nature and limited ability to retain people long-term. Nevertheless, we do see significant value in diversification and are considering future projects outside established GCR hubs for this reason.
Hope this explains our thinking, happy to answer more questions.
Mentor access isn't a huge concern for us, since we expect most mentor-mentee interactions to happen virtually either way.
"Profits for investors in this venture [ETA: OpenAI] were capped at 100 times their investment (though thanks to a rule change this cap will rise by 20% a year starting in 2025)."
I stumbled upon this quote in this recent Economist article [archived] about OpenAI. I couldn't find any additional source supporting the claim, so it might not be accurate. The earliest mention of the claim I could find is from January 17th 2023, although it only talks about OpenAI "proposing" the rule change.
If true, this would make the profit cap much less meaningful, especially under longer AI timelines. For example, after 15 years of 20% annual growth, a $1 billion investment made in 2023 would be capped at ~1,540 times the investment by 2040 (100 × 1.2^15 ≈ 1,541).
- Require that developers of the most powerful AI systems share their safety test results and other critical information with the U.S. government. In accordance with the Defense Production Act, the Order will require that companies developing any foundation model that poses a serious risk to national security, national economic security, or national public health and safety must notify the federal government when training the model, and must share the results of all red-team safety tests.
Would the information in this quote fall under any of the Freedom of Information Act (FOIA) exemptions, particularly those concerning national security or confidential commercial information/trade secrets? Or would there be other reasons why it wouldn't become public knowledge through FOIA requests?
Deadline extended to Tuesday, 26 November!
You can recommend others who may be a good fit. We'll give you $100 for each candidate we contact through your recommendation who is accepted.