Opinion, Berkeley Blogs

Resumes, recruiting and recommendations: Data-driven guidelines for employers and applicants

As the year-end approaches, for many the winter break means not only holiday shopping and family time but also submitting applications for graduate school, searching for summer internships, or scanning job advertisements. Applicants prepare resumes, mentors draft recommendation letters, and employers (or their designated algorithms) assess both. Given the complicated chemistry of matching an ideal candidate to an open position, what do we know about the early stages of the process?

Online tools have made it easier than ever for job seekers to apply to multiple positions. Employers have met this increased, sometimes overwhelming, interest with machine-learning algorithms that learn the traits of successful employees and match those characteristics to applicants’ skills and experience. In addition to expediting resume review, AI tools promise to reduce the explicit or implicit bias found when supervisors review applications. However, machine-driven decisions are only as fair as the data and the humans that trained them. Indeed, Amazon recently suspended development of a recruiting tool that was screening out applications with references to women’s organizations and over-valuing aggressive language like “executed.” Bias in, bias out.
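The “bias in, bias out” dynamic can be shown with a toy sketch. The data and token choices below are invented for illustration (this is not Amazon’s system): a simple log-odds score trained on biased historical hiring outcomes ends up penalizing a token that merely proxies for gender.

```python
from collections import Counter
import math

# Hypothetical historical data encoding human bias: resumes mentioning a
# women's organization were disproportionately rejected in the past.
hired = [
    "led software project executed launch",
    "executed migration led team",
    "software engineer led deployment",
]
rejected = [
    "led software project captain womens chess club",
    "software engineer womens coding society",
    "executed launch womens outreach group",
]

def token_scores(pos_docs, neg_docs):
    """Log-odds of each token appearing in hired vs. rejected resumes."""
    pos = Counter(t for d in pos_docs for t in d.split())
    neg = Counter(t for d in neg_docs for t in d.split())
    vocab = set(pos) | set(neg)
    # Add-one smoothing so unseen tokens do not produce log(0).
    return {t: math.log((pos[t] + 1) / (neg[t] + 1)) for t in vocab}

scores = token_scores(hired, rejected)
# The model has "learned" to penalize "womens" -- a proxy for gender,
# not a signal of qualification -- and to reward aggressive language.
print(scores["womens"] < 0)    # True
print(scores["executed"] > 0)  # True
```

A model trained this way simply reproduces the historical pattern: retraining on fairer outcome labels, or removing proxy features, is required to break the loop.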

From the first stages of the hiring process, job seekers and employers alike should be aware of conditions—aside from relevant qualifications—that may affect applicants’ chances of moving to the interview stage. Several studies have shown that when resumes indicate a candidate’s gender or race, women and under-represented minorities are less likely to be selected for a final candidate pool than white men and are offered lower starting salaries, even when other hiring criteria are equal.

Additional attributes shown to induce discrimination include parenting status, sexual orientation, and cultural markers, all of which may be revealed in a resume. Job seekers who are mothers face the so-called “motherhood penalty.” In one famous study, subjects evaluated pairs of equally qualified candidates, one of whom was a mother. Researchers found that mothers were 79% less likely to be hired, half as likely to be promoted, offered an average of $11,000 less in salary, and held to higher performance and punctuality standards.

Bias regarding sexual orientation, especially for homosexual applicants seeking jobs in fields typically associated with the opposite sex, appears to be decreasing, at least in some sectors. Still, a 2011 study demonstrated that a gay male job applicant is roughly 40% less likely to be offered a job interview than his heterosexual counterpart.

Even indications of class background and cultural preferences can affect job seekers’ chances of being invited to interview. In a study of hiring practices at elite law firms, resumes for men of higher socio-economic status were four times more likely to receive a callback than all other applicants combined. One perhaps surprising result at the intersection of the categories (male/female crossed with high/low socio-economic status): higher-class women were less likely than higher-class men or lower-class women to be invited to interview. Evaluators considered them a “flight risk,” lacking commitment to a professional career compared with applicants in the other categories.

Bias arises not only from resumes but also from recommendation letters. Male and female candidates are often described differently. Letters for male candidates are longer and tend to use more superlatives or “standout” adjectives (e.g., “superb,” “exceptional”). Letters for women emphasize communal qualities (“caring,” “friendly”); women are also described with more qualifiers, hedging language, and faint praise. These patterns hold for both male and female letter-writers—and for how applicants describe themselves.


What can the people involved do? For applicants: Consider omitting extra-curricular activities and personal interests from the resume. The trade-off is real: leadership activities signal authority and decision-making experience, but they can also reveal affiliation with a protected group. Losing the nuance and texture lent by personal interests flattens a candidate’s profile, an unfortunate cost of creating a more competitive application. Still, leaving off such clues may reduce openings for bias: omit references to parenting activities (for women) or a fondness for country music (anyone!).

For mentors: Be consistent in length of letters provided to male and female students. Watch for differences in descriptions of women’s strengths (praising communal, social qualities; hedging on evaluation of leadership) versus men’s (more full-throated recommendations and superlative language).

For employers: Attend to gendered language in job advertisements and subtle differences in letters of recommendation. In addition to being alert to potential areas of unconscious bias, consider adding two words to the job description: “salary negotiable.” A recent study estimated that giving women implicit permission to negotiate decreased the gender wage gap by 45%.

For algorithms (and those who create them): Evaluate the outcomes of algorithmic decision-making to ensure that the results are fair, accountable and transparent. If we believe the demonstrated financial and social benefits of a diverse workforce, we cannot rely on models that simply perpetuate the status quo.
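One concrete, if simplified, way to evaluate outcomes is to compare selection rates across groups, in the spirit of the “four-fifths rule” from the U.S. Uniform Guidelines on Employee Selection Procedures. The group labels, counts, and helper names below are illustrative assumptions, not a real employer’s audit.

```python
# A minimal audit sketch: compare a screening model's pass rates by group
# against the four-fifths (80%) rule of thumb for adverse impact.

def selection_rates(outcomes):
    """outcomes: list of (group, passed_screen) pairs -> pass rate per group."""
    totals, passes = {}, {}
    for group, passed in outcomes:
        totals[group] = totals.get(group, 0) + 1
        passes[group] = passes.get(group, 0) + int(passed)
    return {g: passes[g] / totals[g] for g in totals}

def adverse_impact(rates, threshold=0.8):
    """Flag groups whose selection rate falls below 80% of the best group's rate."""
    best = max(rates.values())
    return {g: r / best < threshold for g, r in rates.items()}

# Illustrative outcomes: group A passes the screen at twice group B's rate.
outcomes = [("A", True)] * 60 + [("A", False)] * 40 \
         + [("B", True)] * 30 + [("B", False)] * 70
rates = selection_rates(outcomes)
print(rates)                  # {'A': 0.6, 'B': 0.3}
print(adverse_impact(rates))  # {'A': False, 'B': True}
```

A check like this only surfaces disparate outcomes; deciding whether a flagged disparity reflects bias in the model, the training data, or the applicant pool still requires human judgment.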

Of course, application materials are just the first part of the hiring funnel. In-person interviews and on-site visits present new circumstances where bias or discrimination can come into play. But getting the interview is an essential first step. Insight into the processes operating in full view or just out of sight can help level the playing field for all and yield the most qualified candidates.