Epistemic status: around that of Descartes' (low)

I am not a native English speaker. Despite that, I've held my English skills in high regard for most of my life. English was the language of my university studies. Although I still make plenty of mistakes, I want to assure you I am capable of reading academic texts.

That being said: a whole lot of posts and comments here do feel like academic texts. For the most basic, heuristic check, I found a tool that measures linguistic complexity, https://textinspector.com/ , so you can play with it yourself if you'd like to. Now, I realize that AI Safety is a complicated, professional topic with a lot of jargon. Hence, let's take a discussion that, I believe, should be especially welcoming to non-professionals: https://forum.effectivealtruism.org/posts/kuqgJDPF6nfscSZsZ/thread-for-discussing-bostrom-s-email-and-apology

I could build a small Python project to analyse the linguistic complexity of a whole range of posts and produce graphs. That would be more fun and much better, but I am a lazy person and I just want to show you the idea (a rough sketch is below). I mean to sound extremely simple when I say the following.

There's a whole lot of syllables right there.
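For anyone who actually wants to run the less lazy version of that check, here is a rough sketch of what I mean. It uses the textstat package and placeholder post texts; both of those choices are mine, not anything the linked tool prescribes:

```python
# Rough sketch of the "Python project" version of this check:
# score a few posts for readability and compare the numbers.
# Assumes `pip install textstat`; the post texts below are placeholders.
import textstat

posts = {
    "bostrom_thread_comment": "Paste the text of a comment you want to score here...",
    "some_other_post": "...and another post here for comparison.",
}

for name, text in posts.items():
    ease = textstat.flesch_reading_ease(text)    # higher = easier to read
    grade = textstat.flesch_kincaid_grade(text)  # rough US school-grade level
    syllables = textstat.syllable_count(text)    # "a whole lot of syllables"
    print(f"{name}: reading ease {ease:.1f}, grade {grade:.1f}, {syllables} syllables")
```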

Most of the comments here do feel like academic papers. Reading them is a really taxing exercise. In fact, I usually just give up on it. Whether it's my shit attention span or the fact that, globally, most people are not proficient English speakers, it is my firm belief that ideas should be communicated in an understandable manner when possible. That is, most people should be able to understand them. If you want to increase diversity and be more inclusive, well, I think that's one really good way of attempting it.

This is also the reason for the exact title of the post, rather than "Linguistic preferences of some effective altruists seem to be impacted by a tendency to overly intellectualize."

Comments

I wanted to push back on this because most commenters seem to agree with you. I disagree that the writing style on the EA forum, on the whole, is bad. Of course, some people here are not the best writers and their writing isn't always that easy to parse. Some would definitely benefit from trying to make their writing easier to understand.

For context, I'm also a non-native English speaker and during high school, my performance in English (and other languages) was fairly mediocre.

But as a whole, I think there are few posts and comments that are overly complex. In fact, I personally really like the nuanced writing style of most content on the EA forum. Also, criticizing the tendency to "overly intellectualize" seems a bit dangerous to me. I'm afraid that if you go down this route you shut down discussions on complex issues and risk creating a more Twitter-like culture of shoehorning complex topics into simplistic tidbits. I'm sure this is not what you want but I worry that this will be an unintended side effect. (FWIW, in the example thread you give, no comment seemed overly complex to me.)

Of course, in the end, this is just my impression and different people have different preferences. It's probably not possible to satisfy everyone. 

I'm going to push back against this a very slight amount. It is good to write a thing as simply as possible while saying exactly what it's meant to say in exactly the way it's meant to be said - but not to write a thing more simply than that. 

I agree and will use this opportunity to re-share some tips for increasing readability. I used to manage teams of writers/editors and here are some ideas we found useful:

To remove fluff, imagine someone is paying you $1,000 for every word you remove.  Our writers typically could cut 20-50% with minimal loss of information.

Long sentences are hard to read, so try to change your commas into periods. 

Long paragraphs are hard to read, so try to break each paragraph into 2-3 sentences.

Most people just skim, and some of your ideas are much more important than others, so bold/italicize your important points.

This post has some additional helpful tips, in particular having a summary/putting key points up front.

This doesn't solve the problem OP complained of - that writers use unnecessarily complicated phrases and long jargon words to describe simple ideas.

Agreed that it doesn't solve that specific problem, but it serves the same end goal: making things easier for the reader.

I agree that academic language should be avoided in both forums and research papers.

It might be a good idea for forum writers to use a tool like ChatGPT to make their posts more readable before posting them. For example, they can ask ChatGPT to "improve the readability" of their text. This way, writers don't have to change their writing style too much and can avoid feeling uncomfortable while writing. Plus, it saves time by not having to go back and edit clunky sentences. Additionally, by asking ChatGPT to include more slang or colloquial language, the tool can better match the writer's preferred style. (Written with the aid of ChatGPT in exactly the way I proposed. :p)
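To make that concrete, here is a minimal sketch of how the same step could be automated with the OpenAI Python SDK; the model name and the prompt wording are my own assumptions, not something the comment above specifies:

```python
# Minimal sketch: ask a chat model to improve the readability of a draft
# before posting. Assumes `pip install openai` and an OPENAI_API_KEY in the
# environment; the model name below is an assumption, swap in whatever you use.
from openai import OpenAI

client = OpenAI()

def improve_readability(draft: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[
            {
                "role": "system",
                "content": "Improve the readability of the user's text: "
                           "shorter sentences, simpler words, same meaning and tone.",
            },
            {"role": "user", "content": draft},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    draft = "Notwithstanding the aforementioned epistemic considerations, I remain unpersuaded."
    print(improve_readability(draft))
```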

From my playing with it, ChatGPT uses complex language even when told not to. In Notion, there's an AI assistant (GPT-3 based) with a "simplify writing" feature. The outputs were still pretty verbose and had overly long sentences. Soon though, sure!

Most output I've seen from ChatGPT has been horrendously verbose.

As far as I can recall, my paragraphs are usually about half as long when I ask ChatGPT to simplify.

That said, I tend to write in an academic style.

+1 for using ChatGPT. I've also been using this. 

Similarly, I hope that GPT could later be used to customize text to whatever background the reader has, on demand.

Jargon is great for some people but terrible for others.

I dunno, encouraging people to use an AI tool rather than improve their writing seems a bit like a parent encouraging their child to just keep using training wheels, because it's easier.

Sure, if your goal is to be a good writer! But, I'm not worried about that. I just want people to understand me.

  1. I don't see how encouraging people to use AI tools really means discouraging them from trying to improve their writing.
  2. There are many cases where I find AI tools help me become a better writer. It can be like having a personalized tutor.

I disagree about 1. About 2, I agree but that doesn't seem to me to be what Jonas is aiming for.

I agree that (2) wasn't Jonas's aim. 

 Michał -- thanks for this reality check. 

If EA wants to be genuinely, globally inclusive, we need to remember that many of our members learned English as a second language, and that it's important for us all to write as clearly as possible. 

According to sources like this, about 400 million people worldwide are native English speakers, but over 1.2 billion have learned to read English as a second language. So that's about a 3:1 ratio of non-native to native speakers. This is worth bearing in mind when native speakers (like me) are writing on the EA forum and potentially being read by many non-native speakers.

It's also important to reign in our natural tendency to IQ-signal through displaying our vocabulary size, capacity for complex grammar, and subtlety of verbal reasoning. These can make us sound smart to people with similar levels of English fluency and domain expertise, but they inhibit our ability to communicate with wider audiences.

Well said, though I think your comment could use that advice :) Specific phrases/words I noticed: reign in, tendency, bearing in mind, inhibit, subtlety, IQ-signal (?).

I'm a non-native speaker and I do know these words, but I'm mostly at native level at this point (I've spent half my life in an English-speaking country). I think many non-native speakers won't be as familiar with them.

Ariel -- Fair point! I agree. My post was intended to be subtly self-satirizing, but I should have made that clearer.

Ah right, I had that thought but wasn't sure, makes sense!

All of the following are virtues in writing:

  1. Clarity
  2. Precision
  3. Accessibility

I think the EA forum writing tends to do okay on 1, well on 2, and okay-to-bad on 3. 

Obviously being better at all of them simultaneously is the best outcome, but sometimes there's a tradeoff. Personally, I think clarity and precision are more important than accessibility. That doesn't mean we shouldn't try to make our writing more accessible (I endorse Emerson Spartz's list of tips), but I think it is just more important to be clear and precise, and we should be clear about that and happy that we're doing well at those things. And therefore I don't think the writing style here is bad, although it could be improved.

(Or, in the maxim I got taught: "When looking at your writing, ask: 'Is it clear? Is it true? Is it necessary?'")

I feel like, if we write here to communicate, accessibility is pretty important, maybe more important than the other two (or at least, not clearly less important than them). Why do you think otherwise?

Sometimes it's more important to convey something with high fidelity to few people than it'd be to convey an oversimplified version to many. 

That's the reason why we bother having a forum at all - despite the average American reading at an eighth grade level - rather than standing on street corners shouting at the passers-by. 

Generally disagree with this. Overall, I think the EA forum norms are fairly good in terms of writing style and quality, but I might even be inclined to push in the other direction. 

After being bombarded with modern American writing advice since university, I've recently become disillusioned with the simplifying, homogenising trend of internationalised English, in favour of a language that borrows from the best of our linguistic traditions.

I find that the short-sentence, short-word, bullet point style of writing encourages you to skim, while more flowing and elegant language forces the reader to read aloud, and to follow the cadences of the speaker, which promotes a very different state of mind for reading and absorbing information. 

To quote from the opening passage of Chapter 2 of Utilitarianism by JS Mill:

“A being of higher faculties requires more to make him happy, is capable probably of more acute suffering, and certainly accessible to it at more points, than one of an inferior type; but in spite of these liabilities, he can never really wish to sink into what he feels to be a lower grade of existence. We may give what explanation we please of this unwillingness; we may attribute it to pride, a name which is given indiscriminately to some of the most and to some of the least estimable feelings of which mankind are capable; we may refer it to the love of liberty and personal independence, as appeal to which was with the Stoics one of the most effective means for the inculcation of it; to the love of power or to the love of excitement, both of which do really enter into and contribute to it; but its most appropriate appellation is a sense of dignity, which all human beings possess in one form or other, and in some, though by no means in exact, proportion to their higher faculties, and which is so essential a part of the happiness of those in whom it is strong that nothing which conflicts with it could be otherwise than momentarily an object of desire to them.”

Utterly impossible to skim, and what a joy to read! 

Just to give you a data point from a non-native speaker who likes literature and languages: this quote wasn't a joy to read for me, since it would have taken me a very long time to understand what it is about if I hadn't known the context. So I am not sure what you mean by the best linguistic traditions – I think simple language can be elegant too.

It is a more joyful sentence in the context, admittedly.

Simple language can be elegant, of course, and there are excellent writers with a range of different styles and levels of simplicity. I wouldn't dream of saying that everyone should be striving for 200-word sentences, nor that we should be imitating Victorian-era philosophy, but I do think that the trends of relentless simplifying and trimming that editors and style guides foist upon budding writers have diminished the English language.

"I find that the short-sentence, short-word, bullet point style of writing encourages you to skim, while more flowing and elegant language forces the reader to read aloud, and to follow the cadences of the speaker, which promotes a very different state of mind for reading and absorbing information."

But... the most common and advocated style here is exactly skimmable bullet points, while prose is often frowned upon. And the only richness of language used is jargon. This is the opposite of what you say you want.

Also, like Ada-Maaria, the long quote was hard for me to read as a non-native, and I skipped it. That's not to say that I think communication should be confined to short sentences and simplified language. Just that thought has to be put into clarity and accessibility as well.

Strongly upvoted
