Hi! I'm a twenty-three-year-old writer, researcher, and software engineer. I'm particularly interested in EA community building, raising public awareness, and how we navigate the transition to a post-AGI society.
I also know a bit about farm animal suffering, since I made a website about how to reduce animal cruelty in one's diet. (boulderhumanefood.org)
I'm looking for opportunities that make use of writing, research, or software engineering, especially in EA community building, public communication, and AI safety. If you have a project you think I'd find interesting, I may be able to help out for free.
I'm also looking for someone to exchange writing feedback with.
If you want to discuss anything I've written on the forum, don't be afraid to reach out.
Also, I've done a lot of research on EA's presence on YouTube. If you are making YouTube videos or are hoping to do so, feel free to reach out for feedback or suggestions.
That's an interesting approach to trying to improve EA's presence! I really wish more people knew about EA, so I'm glad to see you're helping with that!
I would like to be able to select multiple tags when searching the forum (such as "animal welfare" and "collections/resources"). I would also like to be able to filter the top posts for a given tag by year (for example, the top "existential risks" posts from 2025).
I would really like to see AI-generated text banned from the forum. Many posters use AI tools to organize, edit, or generate parts of their writing, which makes it difficult to tell which ideas are the author's own and which were merely generated by AI. AI-generated text also tends to be much lower quality: it is often vague, general, and repetitive, and it reuses the exact same structures over and over, which can be very boring to read. Lastly, requiring posters to write out their own thoughts would raise the barrier to entry for posting, which would improve the forum's content overall.
I see. Yeah, I expressed my thinking poorly here. What I meant was that our preferences are slightly abnormal compared to what you would expect, but not extraordinarily alien.
In our ancestral environment, we developed a taste for sugary things because berries, which are sugary, are good for us. Knowing this, you would expect that we would eat berries but not ice cream. The fact that we eat ice cream is somewhat abnormal. That said, it is not extraordinarily alien. If we, for instance, devoted our entire lives to eating as much sugar as is physiologically possible without dying, that would be very alien.
I should have said that their argument implies that AIs might have weird preferences but not extremely bizarre ones.
Thanks for catching that!