Stan - this is a legitimate and interesting question. I don't know of good, representative, quantitative data that's directly relevant.
However, I can share some experiences from teaching EA content that might be illuminating, and semi-relevant. I've taught my 'Psychology of Effective Altruism' course (syllabus here) four times at a large American state university where the students show a very broad range of cognitive ability. This is an upper-level undergraduate seminar restricted mostly to juniors and seniors. I'd estimate the IQ range of the students taking the course to be about 100-140, with a mean around 115.
In my experience, the vast majority of the students really struggle with central EA and rationality concepts like scope-sensitivity, neglectedness, tractability, steelmanning, recognizing and avoiding cognitive biases, and decoupling in general.
I try very hard to find readings and videos that explain all of these concepts as simply and clearly as possible. Many students kinda sorta get some glimpses into what it's like to see the world through EA eyes. But very few of them can really master EA thinking to a level that would allow them to contribute significantly to the EA mission.
I would estimate that out of the 80 or so students who have taken my EA classes, only about 3-5 of them (roughly 4-6%) would really be competitive for EA research jobs, or good at doing EA public outreach. Most of those students probably have IQs above about 135. So this is mostly a matter of raw general intelligence (IQ), partly a matter of personality traits such as Openness and Conscientiousness, and partly a matter of capacity for Aspy-style hyper-rationality and decoupling.
So, my impression from years of teaching EA to a wide distribution of students is that EA concepts are just intrinsically really, really difficult for ordinary human minds to understand, and that only a small percentage of people have the ability to really master them in an EA-useful way. So, cognitive elitism is mostly warranted for EA.
Having said that, I do think that EAs may underestimate how many really bright people are out there in non-elitist institutions, jobs, and cities. The really elite universities are incredibly tiny in terms of student numbers. There might be more really smart people at large, high-quality state universities like U. Texas Austin (41,000 undergrads) or U. Michigan (33,000 undergrads) than there are at Harvard (7,000 undergrads) or Columbia (9,000 undergrads). Similar reasoning might apply in other countries. So, it would seem reasonable for EAs to consider broadening our search for EA-capable talent beyond super-elite institutions, 'cool' cities, and tech careers, into other places where very smart people might be found.
I think a hypothesis or framework that you might want to try when examining the question is "What are the characteristics of the market for EA-based philanthropy?" or, in less elitist language: "Follow the money!"
A large fraction of the money for the EA movement comes from very wealthy people. Founders Pledge consists almost exclusively of wealthy entrepreneurs. Open Philanthropy is funded mostly by very wealthy people. Very wealthy people tend to take the approach of paying a higher price for a premium product or service, which is only logical if you have a lot more money than other people in the market.
According to the Federal Reserve, the top 0.1% of households own $18.6 trillion in assets, the next 0.9% own $27.2 trillion, and the next 9% own $54.8 trillion. The remaining 90% own $44.3 trillion. So the top 10% of households own $100.6 trillion, more than twice as much as everyone else combined. So naturally the EA movement preferentially serves elite donors, and as a result it has many of the characteristics of an elitist movement: focus on elite universities, lack of diversity, etc.
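As a quick sanity check on the arithmetic (using the figures as quoted above, not independently re-verified against the Fed's data):

```python
# Household wealth figures as quoted above, in trillions of USD.
top_0_1_pct = 18.6    # top 0.1% of households
next_0_9_pct = 27.2   # next 0.9%
next_9_pct = 54.8     # next 9%
bottom_90_pct = 44.3  # remaining 90%

top_10_pct = top_0_1_pct + next_0_9_pct + next_9_pct
print(f"Top 10% of households: ${top_10_pct:.1f} trillion")          # $100.6 trillion
print(f"Ratio to the bottom 90%: {top_10_pct / bottom_90_pct:.2f}x")  # ~2.27x
```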
What is the alternative? Does the EA movement want to bear the cost of resisting the incentives it has to preferentially serve elitist donors and to perhaps unconsciously take on elitist characteristics?
The easiest path for most of the EA movement is to simply say that the key focus is on maximizing the impact of each particular organization. In that case, each organization will maximize funding over the short-term by focusing on serving the morality-aligned donors who have the most to give and thus the greatest short-term donation potential. This will keep the EA movement dependent on richer donors.
But the resulting elitism will probably have an adverse PR impact over the long-term. Giving less attention to the 90% of donors who have less money than the elite donors will probably alienate a majority of the philanthropic public and prevent the movement from reaching its full growth potential over time.
Keeping this in mind, it might be useful for some of EA's current elite donors to invest in less elitist, grassroots EA outreach in order to minimize long-term PR damage to the movement. While this is probably not revenue-maximizing over the short-term, it may make greater inroads into the $44.3 trillion in assets (and some portion of income) held by the 90% majority of people in richer countries, who might eventually support EA and increase the movement's impact in ways that complement elite donations.
Interesting take, thanks for sharing!
My intuition is that what might be easier than "invest[ing] in less elitist grassroots EA outreach in order to minimize long-term EA movement PR damage" is simply projecting a different image of EA.
It seems to be standard practice for large organizations / movements to project an image that is substantially more diverse or more inclusive than the reality. It's dishonest, but it probably does broaden the base of people who engage with the brand. Eventually, reality starts to resemble the image.