Attendees at our recent PauseAI Unconference

TL;DR: 🥳

Last November we had only 4 months of runway remaining; today we have ~1.5 years.

I’m reminded of the saying ‘it takes a village to raise a child’, and I would be writing a very different update if not for the village that came together to support us since our fundraising appeal last winter.


We received several months of runway as a result of individual donations from alumni and supporters, which gave us the time to approach new funders, as well as encouragement to persevere.

Others volunteered their time and energy to support us. Guests at the hotel helped with day-to-day tasks so that we could focus on fundraising, and we received priceless advice from more experienced EAs on how to improve our offering and make the value of our project more apparent to funders.

As a result of all the above, with just over a month of runway remaining, we received an emergency grant from AISTOF[1] that ensured our continued operation until the end of the year.


And now we’ve been granted an additional full year of funding from EAIF. 🤯 🎤 🫳

MANY thank-yous are due:

  • To those who donated
  • To those who offered their time and advice
  • To those who advocated for us
  • To colleagues, past and present
  • To ML4G, Stampy and PauseAI for choosing us as your venue despite our uncertain future
  • To Wytham Abbey, for their generous donation of equipment
  • To the grant investigators who gave us a chance to explain what this strange little hotel in Blackpool is doing and why it’s worth supporting
  • Last but not least - to our grantees and alumni, for being so committed to having a positive impact on the world, and giving us the chance to play a role in your journey.

The Future of CEEALAR!

AI Winter is Coming to CEEALAR

CEEALAR has been hosting grantees working on AI Safety since it opened in 2018, and this winter we’re going all in - we’re going to be the AI Winter we want to see in the world.

From September until the end of the year we’re going to direct our outreach and programming toward AI Safety.[2]

Keep an eye out for a future update where we’ll go into more detail on what we have planned - which isn’t much right now, so if you’ve got ideas and would like to collaborate with us on AI Winter, get in touch!

If you’d like a reminder, or are interested in participating or collaborating in some fashion - please fill out this tiny form (<2 minutes).

If you don’t need any more convincing, it’s not too early to apply.

Workshops and Bootcamps and Hackathons, Oh My!

As Wytham Abbey have closed their doors, it’s a good job there’s still a swanky EA venue, right guys? If you’re running an event for up to 20[3] people, we can provide fully catered accommodation, venue space and operations support.

As part of CEEALAR, our venue is nonprofit and operates on a ‘pay what you can’ basis; this way we can enable high-impact events that might otherwise not happen due to financial constraints.

Please contact us if this sounds like you!

Renovations

We’re never more proud of our space than when our guests say they feel at home here, and we’re always on the lookout for ways to improve our offering so they never want to leave 🙂

On this note, our old home gym[4] was getting a bit long in the tooth, and we’re in the process of totally refitting the space. There is less gym right now than before, but by the time you arrive there will be much more of it, and better!

 

Less gym.

Upcoming Opportunities

We’ve yet to finalize the details, but we expect to have the following open positions in the future:

  • A salaried senior leadership position
  • An unpaid trustee position
  • A rolling volunteer ‘internship’ in Operations, for those looking to upskill and get some hands-on experience

Apply to stay at CEEALAR

As a Grantee

Our grantees come from a wide range of backgrounds, and most are at the early stages of their impact journeys – using their time at CEEALAR to upskill, make connections, look for impactful opportunities, or work on projects that they might not otherwise have the chance to pursue.

If you are worried about not having enough experience or qualifications, you might be an ideal CEEALAR Grantee - applicants with high potential, for whom staying at CEEALAR would have a large counterfactual impact, are exactly the people we aim to support.

As our funding is now secure through 2025, you’re welcome to apply in advance for any dates up to the end of 2025.

If in doubt, apply!

Taking a break from saving the world to win a local competition at Blackpool Library.

As a Patron

While it’s always been possible for people to stay at CEEALAR as a paying guest (rather than as a grantee), we’ve recently updated our application process to make this more visible.

Staying at CEEALAR as a Patron is similar to staying at a normal hotel: your application isn’t evaluated as a grant, and you’re not required to work in relevant cause areas or to participate in our impact measurement processes.

Unlike at a normal hotel, though, you get as much vegan food as you can eat, multiple coworking spaces and breakout areas, a home gym[5], and the best[6] EA community this side of the singularity to share it all with.

Book to stay as a patron here.

Volunteer

It feels cheeky to ask for more support after having received so much recently, but given the impact volunteers have had on CEEALAR, it would be more foolish to leave such offers of support on the table.

If you’d like to play a part in CEEALAR’s future, here are some examples of things we could use help with:

  • Contributing to our AI Winter event
    • Ideas, workshops, hackathons, outreach etc
  • Mentoring grantees, e.g. ARENA tutors coaching people who are upskilling in AI Safety
  • Offering Zoom (or in-person) talks/workshops
  • Impact measurement. Are you experienced with this, and want to help us improve through measuring what matters most?

If you can help with any of these (or something else not on this list!) please get in touch: contact@ceealar.org

Our Commitments

To conclude, we’d like to share our goals and commitments for 2024 H2 & 2025.

Improve and increase our outreach efforts

With the aim of receiving 5-12 applications[7] per month, by updating our website, enhancing our online presence, and actively reaching out to EA university societies and careers services across the UK and around the world.

Improve our impact measurement process

To better capture CEEALAR’s performance against its objectives, and to publish an annual impact report to be used as a basis for improvement and as supporting material for funding applications.

Continue to iterate and improve upon day-to-day life at CEEALAR

With the aim of increasing productivity and improving the overall experience of our guests. Our main priorities for 2024 H2 are renovating our home gym and courtyard.

Increase opportunities available to grantees and alumni

For upskilling, networking and participating in the community, by facilitating more connections and communication within our alumni network. We’ll begin sending our alumni a short quarterly newsletter with information on volunteering opportunities within EA and a summary of the cause areas and projects of recent grantees, so that alumni can connect with people and projects they may be interested in.

  1. ^

     AI Safety Tactical Opportunities Fund

  2. ^

     Though we’ll still be considering applicants from other cause areas.

  3. ^

     There are also many nearby hotels if you need additional accommodation

  4. ^

     That is to say, ‘room with weights and such in it’

  5. ^

     Soon

  6. ^

     As voted by an entirely impartial straw poll of guests present while writing this post

  7. ^

     From quality candidates, i.e. not ChatGPT spam.

Comments (4)



I love the warmth, energy and enthusiasm in this post. Although it's not everyone's style and not the norm on the forum, it's nice from time to time and was a mini pick-me-up this morning.

Especially appreciated the effusive thanks for your donors, guests and supporters, feels good.

Was uplifting to read this, thanks for sharing and congrats! :)

"Taking a break from saving the world to win a local competition at Blackpool Library " 😂, yeah and it was really fun.

Thank you CEEALAR for everything you've done.

Executive summary: CEEALAR, a nonprofit EA hotel in Blackpool, has secured funding through 2025 and is focusing on AI safety initiatives while continuing to offer accommodations and support for EA projects.

Key points:

  1. CEEALAR received emergency funding and grants, extending runway from 4 months to ~1.5 years
  2. Launching "AI Winter" initiative from September to December, focusing on AI safety programming
  3. Offering venue space for EA events (up to 20 people) on a "pay what you can" basis
  4. Renovating facilities, including home gym upgrade
  5. Seeking applications for grantees and paying guests (patrons) through 2025
  6. Goals include improving outreach, impact measurement, and opportunities for grantees/alumni

 

 

This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.
