I tried doing this a while back. Some things I think I worried about at the time:
(1) Disheartening people excessively by sending them scores that seem very low or brutal, especially if you use an unusual scoring methodology. (2) Costing yourself more time than it seems at first, because (a) you find yourself needing to add caveats or manually hide some info to make it less disheartening, and (b) people ask you follow-up questions. (3) Exposing yourself to some sort of unknown legal risk by saying something not legally defensible about the candidate or your decision-making.
(1) turned out to be pretty justified, I think; at least one person expressed upset/dissatisfaction at being told this info. (2) definitely happened too, although maybe it didn't cost all that many hours in the grand scheme of things. (3) We didn't get sued, but who knows how much we increased the risk.
Jamie, I've been contemplating writing up a couple of informal "case study"-type reports on different hiring practices. My intention would be to let EA orgs learn how several different orgs do hiring, to highlight some best practices, and generally to encourage organizations to improve their methods. How would you feel about writing up a summary, or having a call with me, so I can understand how you tried giving feedback and which specific aspects caused challenges?
Unfortunately this was quite a while ago at the last org I worked at; I don't have access to the relevant spreadsheets, email chains etc anymore and my memory is not the best, so I don't expect to be able to add much beyond what I wrote in the comment above.
Regarding "disheartening people": I once received feedback after a hiring round in which the organization shared the scores I got, and even shared scoring info for the other (anonymized) candidates. It was the best and most accurate feedback data I have ever been given.
I scored very low, much lower than I had expected. Of course I felt sad and frustrated. I wish I knew more details about their scoring methodology, and part of me says it was an unfair process because they weren't clear about what I would be evaluated on. But I draw an analogy to getting rejected from anything else (a school application, a romantic partner): it sucks, but you get over it eventually. I felt bad for a day or two, and then the feelings of frustration faded away.
(I run hiring rounds with ~100-1000 applicants.) I agree with Jamie here. However, if someone was close to a cutoff, I do specifically include "I encourage you to apply to future roles" in my rejection email. I also always respond when somebody proactively asks for feedback.
Is revealing scores useful to candidates for some other reason not covered by that? It seems to me the primary reason (since it sounds like you aren't asking for qualitative feedback to also be provided) would be to inform candidates as to whether applying for future similar roles is worth the effort.
"revealing scores useful to candidates for some other reason not covered by that"
Honestly, I hadn't even thought of encouraging them to apply for future roles. My main thought regarding feedback is to allow them to improve. If you assess my work and then tell me the ways in which it falls short, that allows me to improve: I know what to work on. An example would be something like "Although your project plan covered a lot of the areas we requested, you didn't explain your reasoning for the assumptions you made. You estimated that a [THING] would cost $[AMOUNT], but as the reader I don't know where you got that number. If you had been transparent about your reasoning, you would have scored a bit higher." or "We were looking for something more detailed, and your proposal was fairly vague. It lacked many of the specifics that we had requested in the prompt."
I suppose I'm skeptical that quantitative scores in an auto-sent email will actually give you a nuanced sense. But I do see how, e.g., if over time you realize it's always your interview, or always your quant question, that scores poorly, that's a good signal.
I do think being kind is an underrated part of hiring!
Have also tried this, although most of our applicants aren't EAs. People who reapply after receiving detailed feedback usually don't hit the bar.
We still do it, partly because we think it's good for the applicants, and partly because people who make a huge improvement on their second attempt usually make strong long-term hires.
That actually seems like a really strong signal of something important: whether people can improve, given a modest amount of guidance/support. I'd certainly be more interested in hiring someone who can than someone who can't.
But I'm also impressed that you provide feedback to candidates consistently. I've always assumed it would be fairly time-consuming, even with a system for providing feedback in a standardized way. Would you be willing to share a bit about how you and your team handle feedback for rejected job applicants?
I view our hiring process as a constant work in progress: after someone's time with us, we look back at their application, for the best and worst performers alike, and try to figure out how we could have told ahead of time. Part of that is writing up notes. We use ChatGPT to word the notes more sensitively, then send them to the applicant.
Caveat: We only do this for applicants who show some promise of being hired in a future round.