The Invisible Bias in Letters of Recommendation
Imagine two equally qualified medical school graduates applying for the same plastic surgery residency. Both have stellar records, but their letters of recommendation tell surprisingly different stories. One is described as “brilliant,” “driven,” and a “natural leader.” The other is called “compassionate,” “hardworking,” and a “team player.” The difference? Often, it’s not merit—it’s gender and race.
A groundbreaking study published in the Journal of Surgical Research reveals just how deeply unconscious bias seeps into the letters that can make or break a medical career. Researchers analyzed narrative letters of recommendation (NLORs) for plastic surgery residency applicants and found that female and non-white applicants were systematically described in less favorable terms than their male and white counterparts. Even more striking: white letter writers were more likely to use negative language when describing non-white applicants, while non-white writers highlighted accomplishments and drive more often for non-white candidates.
This isn’t just about fairness in medicine. It’s about who gets to become a surgeon, a professor, or a leader in healthcare—and how hidden biases shape the future of entire professions.
The Power of Namsor: Unmasking Bias with Data
One of the study’s most innovative tools was Namsor, an AI-powered software that predicts race and ethnicity based on names. Here’s why that matters:
- Filling the Data Gap: Many studies on bias rely on self-reported demographics, which are often incomplete or missing. Namsor allowed researchers to infer the race and gender of letter writers and applicants with validated accuracy, even when that information wasn’t explicitly provided.
- Revealing Patterns: By analyzing over 600 applicants, the study found that:
  - Female writers used more words describing social and emotional qualities, while male writers focused on achievement and power, but only for certain groups.
  - White writers were more likely to use negative language (e.g., "adequate," "struggled") for non-white applicants, while non-white writers emphasized accomplishments and drive for applicants of color.
  - Racial discordance (e.g., a white writer recommending a non-white applicant) was associated with lower-quality letters, potentially putting minority candidates at a disadvantage.
Without Namsor, these patterns might have gone unnoticed. The software didn't just confirm that bias exists; it showed how bias operates in practice, in the very words and phrases that shape careers.
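The lexical analysis described above can be sketched in a few lines of Python. Note that the word lists below are illustrative stand-ins chosen from examples quoted in this article, not the study's actual validated lexicon; a real audit would use an established linguistic-category tool.

```python
# Minimal sketch of lexical-category analysis of a recommendation letter.
# The word lists are illustrative stand-ins, NOT the study's validated lexicon.
import re
from collections import Counter

AGENTIC = {"brilliant", "driven", "leader", "ambitious", "outstanding"}
COMMUNAL = {"compassionate", "hardworking", "caring", "helpful", "team"}
NEGATIVE = {"adequate", "struggled", "average"}

def categorize_letter(text: str) -> Counter:
    """Count how many words in the letter fall into each category."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter()
    for w in words:
        if w in AGENTIC:
            counts["agentic"] += 1
        elif w in COMMUNAL:
            counts["communal"] += 1
        elif w in NEGATIVE:
            counts["negative"] += 1
    return counts

letter = ("She is a compassionate, hardworking team player "
          "whose performance was adequate.")
print(categorize_letter(letter))
# → Counter({'communal': 3, 'negative': 1})
```

Run over hundreds of letters and grouped by writer and applicant demographics, even counts this simple can surface the skews the study reports.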
Why This Matters Beyond Medicine
The implications of this research stretch far beyond residency programs. Here’s how data enrichment tools like Namsor could transform other fields:
1. Academic Hiring and Tenure
- Problem: Studies show that women and minorities are less likely to be described as “excellent” or “innovative” in academic letters of recommendation, even with identical qualifications.
- Solution: Universities could use Namsor to audit letters for bias before tenure or hiring decisions, ensuring fair evaluations. For example, a 2025 study in PLoS One found similar biases in PhD admissions, where language in recommendation letters influenced who was accepted.
- Impact: More diverse faculty could lead to more inclusive research, mentorship, and curricula.
2. Public Healthcare Leadership
- Problem: Women and people of color are underrepresented in hospital leadership, policy roles, and research funding. Biased letters may be one reason why.
- Solution: Healthcare systems could flag and revise letters that use stereotypical language, promoting equity in promotions and grants.
- Impact: Diverse leadership teams are linked to better patient outcomes and more equitable policies.
3. Corporate Hiring and Promotions
- Problem: Performance reviews and promotion letters often contain gendered and racialized language (e.g., women are “nurturing,” while men are “strategic”).
- Solution: HR departments could use tools like Namsor to analyze internal recommendations, identifying and correcting biases before they affect careers.
- Impact: Companies with diverse leadership are 35% more likely to financially outperform their industry peers, but only if hiring is fair.
4. Grant and Funding Allocations
- Problem: Researchers from marginalized groups receive less funding and are more likely to have their competence questioned in review letters.
- Solution: Funding agencies could screen letters for biased language, ensuring that groundbreaking research isn’t overlooked due to unconscious prejudice.
- Impact: More innovative and inclusive science, from clinical trials to public health initiatives.
5. K-12 and Higher Education
- Problem: Letters of recommendation for college admissions and scholarships often reflect racial and gender stereotypes, disadvantaging students of color and women in STEM.
- Solution: Schools could train counselors and teachers to recognize bias, using data tools to track progress.
- Impact: A more diverse pipeline of students entering STEM, medicine, and academia.
The Bigger Picture: Can AI Fix Human Bias?
Tools like Namsor aren’t a silver bullet, but they offer a powerful diagnostic. By revealing where bias hides, they create opportunities for change:
- Training Programs: Medical schools and residencies are already using the study's findings to educate faculty on writing objective, fair letters (pubmed.ncbi.nlm.nih.gov).
- Structured Evaluations: Some fields are shifting to standardized letters (like the SLOR in surgery), which reduce subjective language in favor of standardized scoring.
- Accountability: When institutions measure bias, they can set benchmarks and track improvement over time.
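Measuring bias over time can be as simple as tracking, per group and per review cycle, the share of letters that use standout language. The sketch below illustrates the bookkeeping; every number in it is hypothetical, invented purely for the example.

```python
# Sketch of a benchmark metric: share of letters per group that use
# "standout" language, tracked across review cycles.
# All counts below are hypothetical, purely to illustrate the bookkeeping.

# (group, cycle) -> (letters with standout language, total letters)
audit = {
    ("group_a", 2023): (45, 100),
    ("group_b", 2023): (28, 100),
    ("group_a", 2024): (46, 110),
    ("group_b", 2024): (39, 105),
}

def standout_rate(group: str, cycle: int) -> float:
    hits, total = audit[(group, cycle)]
    return hits / total

def gap(cycle: int) -> float:
    """Percentage-point gap between the two groups in a given cycle."""
    return standout_rate("group_a", cycle) - standout_rate("group_b", cycle)

for cycle in (2023, 2024):
    print(f"{cycle}: gap = {gap(cycle):+.1%}")
# prints:
# 2023: gap = +17.0%
# 2024: gap = +4.7%
```

A shrinking gap between cycles is the kind of benchmark an institution can commit to and report on.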
Yet, challenges remain:
- Ethical Concerns: Predicting race or gender from names isn’t perfect. Researchers must use these tools responsibly, acknowledging their limitations.
- Systemic Change: Uncovering bias is just the first step. The harder work is dismantling the structures that allow it to persist.
A Call to Action
The plastic surgery study is a wake-up call. Bias isn’t just a personal failing—it’s woven into the systems that shape our careers, our healthcare, and our society. But with the right tools and commitment, we can rewrite the narrative.
What You Can Do:
- If you’re in leadership: Advocate for bias audits in hiring, promotions, and admissions.
- If you’re a researcher: Explore how tools like Namsor could uncover hidden patterns in your field.
The goal isn’t just to see bias—it’s to stop it. And that starts with data.
Further Reading:
- Original Study: “Race and Gender Bias in Narrative Letters of Recommendation for Plastic Surgery Residency Applicants” (pubmed)
- How Standardized Letters Can Reduce Bias in Orthopedic Surgery (sciencedirect)
- Bias in Radiology Residency Letters (pubmed)
Credits: summarization and illustration by LeChat MistralAI (Pro Version)
About NamSor
NamSor™ Applied Onomastics is a European vendor of sociolinguistics software (NamSor sorts names). NamSor's mission is to help understand international flows of money, ideas, and people. We proudly support Gender Gap Grader.
