Looking to advance businesses with charities in the vast majority shareholder position. Check out my TEDx talk for why I believe Profit for Good businesses could be a profound force for good in the world.
I find it a bit surprising that your point is so well-taken and has met no disagreement so far, though I am inclined to agree with it.
Another way of framing "orgs that bring talent into the EA/impact-focused charity world" is orgs whose hiring is less focused on value alignment, insofar as involvement in the movement corresponds with EA value alignment. One might be concerned that a less aligned hire would do well on metrics that can be easily ascertained or credited by their immediate employer, but ignore other opportunities or considerations regarding impact because they are narrowly concerned with legible job performance and personal career capital. On this view, they could go on to use the career capital they develop to displace more aligned individuals. If funding is a larger constraint on impactful work than the supply of labor willing to work for pay, "re-using" people in the community may make sense, because the impact premium from value alignment is worth more than the marginal gain from a seemingly superior resume.
Of course, another view is that hiring someone into an EA org can create buy-in and "convert" someone into the community, or allow them to discover a community they already agree with.
Something that gives me pause about giving too much credit for bringing in additional talent is that, for many kinds of talent, there is already a lot of EA talent chasing limited paid opportunities. Expanding the labor pool in some areas is probably much less important because funding is more the limiting factor.
I agree with your post overall and think that EA can be very pedantic, professorial, and overly averse to persuasion. I am very glad that you wrote this post and believe that EAs should give more credit to the importance of persuasion (and probably be more receptive to positive persuasion, as opposed to criticism).
However, the title of your post suggested that the scout mindset is valuable only as a servant of persuasion. I think that it is important to note that scout mindset has other valuable applications.
On the subject of redirecting streams of money from less impactful causes to EA causes, I feel I need to beat my drum regarding the potential of Profit for Good businesses (businesses with charities in all or almost all of the shareholder position). In such cases, to the extent that EA PFGs' profits displace those of normal businesses, funds are diverted from the average shareholder to an effective charity.
So when a business like Humanitix (a PFG helping projects in the developing world, which has directed $4 million AUD to The Life You Can Save) displaces Ticketmaster's market share, funds are diverted not from other charities but from the business's competitors. This method of diversion seems less difficult because the operative actors (consumers, employees, business partners) are not deciding between an effective charity and a strong non-EA charity optimized for warm fuzzies and marketing; rather, they are choosing between products with similar value propositions, where engaging with one, in addition to the underlying value proposition, means helping fight malaria or something similar instead of enriching a random investor.
If you're interested in learning more about Profit for Good, here is a reading list on the subject.
Perhaps the most compelling reason for independent donors to contribute is that organizations like OP may have methodologies and assumptions that result in important opportunities being missed. Independent donors likely have a different set of methodologies and assumptions, as well as different ideas they are exposed to, that enable them to spot and support high-impact opportunities that OP overlooks or undervalues due to its particular perspectives, biases, or simple lack of awareness.
Given the vast landscape of potential research areas, decisions about which causes to investigate, even by large institutions, are often made using rough back-of-the-envelope calculations. And given the importance of finality and focus, promising ideas and/or cause areas can be dismissed rather cavalierly. Even if these calculations are approximately correct, categorically including or excluding entire areas means that promising interventions atypical of a category may be missed. Independent funders would not necessarily be burdened by having removed those areas from consideration (although this certainly trades off against OP's ability to zoom in and explore the areas it has positively categorized more fully).
By bringing diverse viewpoints to the table, independent donors can fund innovative projects that might otherwise be overlooked, enriching the philanthropic landscape beyond what a single major funder can achieve.
It seems to me that the proof is in the pudding. The content can be evaluated on what it brings to the discourse, and the tools used to produce it are relevant only insofar as they result in undesirable content. Rather than questioning whether the post was written by generative AI, I would give feedback on the specific aspects of the content you are criticizing.
You seem to indicate that "maximizing" for some value, such as the well-being of moral patients across spacetime, would lead to, or tend to lead to, poor mental health. I can understand how one might think this of "naïve maximization", where one depletes oneself by giving of one's effort, time, and resources at a rate that either causes burnout or leaves one barely able to function. But this is like suggesting that if you want to get the most out of a car, you should drive it as frequently and relentlessly as possible, without providing the vehicle needed upkeep and repairs.
But one who does not incorporate one's own needs, including mental health needs, into one's determination of how to maximize for a value is not operating optimally as a maximizer. I will note that others have said that viewing the satisfaction of their own needs or desires as primarily instrumental, rather than as terminal goals, somewhat diminishes them. In my personal experience, I strive to "maximize" (I want to live my life in the way best calculated to reduce suffering and increase the flourishing of conscious beings), but I recognize that taking care of my health is part of how to do so.
I would be curious whether other "maximizers" would say that they are capable of integrating their own health into their decisions such that they can maintain adequate health.
It's just that when I have seen efforts to improve community relations, they have typically been in the "Community Health" context, relating to complaints about people in the community or other conflicts. I haven't seen as much concerted effort to connect people working on different EA projects who might add value to each other.
A lot of what I have seen regarding "EA Community teams" seems to be about managing conflicts between different individuals.
It would be interesting to see an organization or individual that explicitly specialized in knowing the different individuals and organizations in the space and the projects they are working on, and that could connect people able to add value to each other's projects. It strikes me that there are a lot of opportunities for collaboration but not much organization around mapping the EA space at a more granular level.
Joey, thanks for this thought-provoking piece on addressing talent bottlenecks with on-ramps, especially through programs like Founding to Give. You rightly highlight that funding can be a limiting factor for scaling impactful initiatives. While Ambitious Impact's program addresses this by encouraging individuals to commit a portion of their earnings to philanthropy, I believe there is still significant untapped potential.
Profit for Good (PFG) businesses, companies that direct their profits to charitable causes, offer a way to overcome this funding bottleneck. PFGs can effectively compete in for-profit markets by capitalizing on a subtle yet powerful advantage: the preference of economic actors (such as consumers, employees, and business collaborators) for supporting charitable outcomes over simply enriching random shareholders. When people are given a choice between two equivalent options, they often favor the one that directs profits toward causes they care about, like saving kids from malaria, rather than increasing the wealth of investors. Even a modest preference for such socially beneficial outcomes can lead to advantages in consumer loyalty, attracting top talent, and forming strategic partnerships.
By not fully exploring how to harness this preference as a tool, and the contexts where it could offer the most significant advantages, I think there's money being left on the table. PFGs could strategically use this natural inclination to gain competitive advantages without compromising business performance. I'm curious whether Ambitious Impact has considered integrating this perspective into its programs, as it could align well with the goal of channeling more resources toward effective causes.
I think the sort of world that could be achieved by the massive funding of effective charities is a rather inspiring vision. Natalie Cargill, Longview Philanthropy's CEO, lays out in her TED Talk a rather amazing set of outcomes that could be achieved.
I think a realistic method of achieving these levels of funding is Profit for Good businesses, as I lay out in my TEDx Talk. I think it is realistic because most people don't want to give something up to fund charities, as donation would require, but if they could help solve world problems by buying products or services they want or need, of similar quality and at the same price, they would.