Stan - this is a legitimate and interesting question. I don't know of good, representative, quantitative data that's directly relevant.
However, I can share some semi-relevant experiences from teaching EA content that might be illuminating. I've taught my 'Psychology of Effective Altruism' course (syllabus here) four times at a large American state university where the students show a very broad range of cognitive ability. It's an upper-level undergraduate seminar restricted mostly to juniors and seniors. I'd estimate the IQ range of the students taking the course at about 100-140, with a mean around 115.
In my experience, the vast majority of the students really struggle with core EA and rationality concepts: scope-sensitivity, neglectedness, tractability, steelmanning, recognizing and avoiding cognitive biases, and decoupling in general.
I try very hard to find readings and videos that explain all of these concepts as simply and clearly as possible. Many students kinda sorta get some glimpses into what it's like to see the world through EA eyes. But very few of them can really master EA thinking to a level that would allow them to contribute significantly to the EA mission.
I would estimate that out of the 80 or so students who have taken my EA classes, only about 3-5 would really be competitive for EA research jobs or good at doing EA public outreach. Most of those students probably have IQs above about 135. So this seems mostly a matter of raw general intelligence (IQ), partly a matter of personality traits such as Openness and Conscientiousness, and partly a matter of a capacity for Aspy-style hyper-rationality and decoupling.
So, my impression from years of teaching EA to a wide distribution of students is that EA concepts are just intrinsically really, really difficult for ordinary human minds to understand, and that only a small percentage of people have the ability to really master them in an EA-useful way. On that basis, cognitive elitism seems mostly warranted for EA.
Having said that, I do think that EAs may under-estimate how many really bright people are out there in non-elite institutions, jobs, and cities. The really elite universities are incredibly tiny in terms of student numbers. There might be more really smart people at large, high-quality state universities like U. Texas Austin (41,000 undergrads) or U. Michigan (33,000 undergrads) than there are at Harvard (7,000 undergrads) or Columbia (9,000 undergrads). Similar reasoning might apply in other countries. So it would seem reasonable for EAs to broaden the search for EA-capable talent beyond super-elite institutions, 'cool' cities, and tech careers, into other places where very smart people might be found.
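As a rough illustration of that tail-count reasoning, here's a minimal back-of-envelope sketch in Python. All the distribution parameters (the means, SDs, and the 135 threshold) are illustrative assumptions on my part, not data, and modest changes to them can flip the comparison:

```python
# Back-of-envelope: expected number of students above an IQ threshold,
# assuming (hypothetically) that IQ within each school is normally distributed.
from scipy.stats import norm

def count_above(n_students, mean_iq, sd_iq, threshold=135):
    """Expected count of students above `threshold`, given an assumed mean/SD."""
    return n_students * norm.sf((threshold - mean_iq) / sd_iq)

# Illustrative guesses, not measured values:
state = count_above(41_000, mean_iq=115, sd_iq=12)  # large state school
elite = count_above(7_000, mean_iq=128, sd_iq=9)    # small elite school

print(f"Large state school, IQ > 135: ~{state:,.0f}")  # ~1,960
print(f"Small elite school, IQ > 135: ~{elite:,.0f}")  # ~1,530
```

Under these (debatable) assumptions, the sheer size of the state school outweighs the elite school's higher mean; assume a higher elite mean or a tighter state-school SD and the comparison reverses, which is exactly why it seems worth checking rather than assuming.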
One way to think about it is that the aim of EA is to benefit the beneficiaries - the poorest people in the world, animals, and future beings.
We should choose strategies that help the beneficiaries the most, rather than strategies that help the people who happen to be interested in EA (unless that also helps the beneficiaries - e.g., avoiding burnout).
It makes sense to me that we should ask those who have had the most privilege to give back the most: if you have more money, you should give more of it away. If you have a stronger safety net and more access to influence, you should use more of those to help others rather than yourself.
On salaries: I think most people in EA roles could probably earn more in other sectors if they cared only about monetary gain rather than factoring impact into their career choice. Coming from the charity/public-service sector, EA salaries may seem high; coming from the private sector, they seem low.
TIL!
I think this strengthens my confidence in my original comment re: nearly all EA roles being paid under market rate.