This is a linkpost for a paper I wrote recently, “Endogenous Growth and Excess Variety”, along with a summary.
Two schools in growth theory
Roughly speaking:
In Romer’s (1990) growth model, output per person is interpreted as an economy’s level of “technology”, and the economic growth rate—the growth rate of “real GDP” per person—is proportional to the amount of R&D being done. As Jones (1995) pointed out, populations have grown greatly over the last century, and the proportion of people doing research (and the proportion of GDP spent on research) has grown even more quickly, yet the economic growth rate has not risen. Growth theorists have mainly taken two approaches to reconciling growth in the research population with a constant economic growth rate.
“Semi-endogenous” growth models (introduced by Jones (1995)) posit that, as the technological frontier advances, further advances get more difficult. Growth in the number of researchers, and ultimately (if research is not automated) population growth, is therefore necessary to sustain economic growth.
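The semi-endogenous mechanism can be sketched with a knowledge production function in the style of Jones (1995). This is a standard textbook formulation, not drawn from the paper itself, and the symbols ($A$, $L_A$, $\delta$, $\lambda$, $\phi$, $n$) are my notation:

```latex
% A = technology level, L_A = number of researchers
% \phi < 1 captures "further advances get more difficult"
\dot{A} = \delta L_A^{\lambda} A^{\phi}, \qquad 0 < \lambda \le 1, \quad \phi < 1
% Along a balanced growth path, with researcher population growing at rate n:
g_A \equiv \frac{\dot{A}}{A} = \frac{\lambda n}{1 - \phi}
```

With $\phi < 1$, a *constant* number of researchers yields a growth rate $g_A$ that falls toward zero; sustained exponential growth ($g_A > 0$) requires $n > 0$, i.e. ongoing growth in the research population.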
“Second-wave endogenous” (I’ll write “SWE”) growth models posit instead that technology grows exponentially whether the population is constant or growing. The idea is that process efficiency—the quantity of a given good producible with given labor and/or capital inputs—grows exponentially under constant research effort, as in a first-wave endogenous model; but when the population grows, we develop more goods, leaving research effort per good fixed. (We do this, in the model, because each innovator needs a monopoly on his or her invention in order to cover the costs of developing it.) Improvements in process efficiency are called “vertical innovations” and increases in the variety of goods are called “horizontal innovations”. Variety is desirable, so the one-off increase in variety produced by an increase in population size raises real GDP, but it does not raise the growth rate. Likewise, exponential population growth raises real GDP through exponential growth in variety, but it leaves the growth rate of process efficiency per good unchanged.
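The SWE mechanism—constant growth in process efficiency because research effort per good stays fixed—can be sketched as follows. This is a generic formulation in the spirit of the second-wave literature (e.g. Young-style models), not taken from the paper; $N$, $A_i$, $L_A$, $L$, and $\delta$ are my notation:

```latex
% N = number of goods (varieties), A_i = process efficiency of good i,
% L_A = total researchers, L = population
% Vertical innovation in each good depends on research effort per good:
\frac{\dot{A}_i}{A_i} = \delta \, \frac{L_A}{N}
% Horizontal innovation ties variety to population size: N \propto L.
% If L_A is a fixed share of L, then L_A / N is constant, so each good's
% efficiency growth rate \dot{A}_i / A_i stays constant as L grows.
```

Population growth thus expands the *range* of goods rather than concentrating more research effort on each existing good, which is why the per-good growth rate is insensitive to population size in these models.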
I'm basically an outsider to EA, but "from afar", I would guess that some of the values of EA are 1) against politicization, 2) for working and building rather than fighting and exposing ("exposing" being "saying the unhealthy truth for truth's sake", I guess), 3) for knowing and self-improvement (your point), 4) concern for effectiveness (Gordon's point). And of course, the value of altruism.
These seem like they are relatively safe to promote (unless I'm missing something).
Altruism is composed of 1) other-orientation / a relative lack of self-focus (curiosity is an intellectual version of this), 2) something like optimism, 3) openness to evidence (you could define "hope" as a certain combination of 2 and 3), 4) personal connection with reality (maybe a sense of moral obligation, a connection with other beings' subjective states, or a taste for a better world), 5) inclination to work, 6...) probably others. So if you value altruism, you have to value whatever subvalues it has.
These also seem fairly safe to promote.
Altruism is supported by 1) "some kind of ambition is good", 2) "humility is good but trying to maximize humility is bad" (being so humble you don't have any confidence in your knowledge prevents action), 3) "courage is good but not foolhardiness", 4) "will is good, if it stays in touch with reality", 5) "being 'real' is good" (following through on promises, really having intentions), 6) "personal sufficiency is good" (you have enough or are enough to dare reach into someone else's reality), 7...) probably others.
These are riskier. I think one thing to remember is that ideas are things in people's minds, and that culture is really embodied in people, not in words. A lot of culture is in interpersonal contact, which forms the context for ideas. So ideally, if you promote values, you shouldn't just say things, but should instruct people (or be in relationship with people) such that they really understand what you're saying. (Advice I've seen on this forum.)

Genes become phenotype through epigenetics, and concepts become emotions, attitudes, and behaviors through the "epiconceptual". The epiconceptual could be the cultural background that informs how people hear a message (like "yes, this is the moral truth, but we don't actually expect people to live up to it"), or it could be the subcultural background from a relationship or community that makes the message make sense: the practices and expectations of a culture or subculture.

So values are promoted not just by communicators but also by community-builders, and good communities help make risky but productive words safe to spread.