FEATURE “NUMBERS TAKE US ONLY SO FAR”
the experiences shared by five or 10 employees—or look more carefully at the descriptive data, such as head counts for underrepresented groups and average job satisfaction scores cut by race and gender—to examine the impact of bias at a more granular level.

In addition, analysts should frequently provide confidence intervals—that is, guidance on how much managers can trust the data if the n’s are too small to prove statistical significance. When managers get that information, they’re more likely to make changes in their hiring and management practices, even if they believe—as most do—that they are already treating people fairly. Suppose, for example, that as Red Ventures began collecting data on self-assessments, analysts had a 75% confidence level that blacks and Latinos were underrating themselves. The analysts could then have advised managers to go to their minority direct reports, examine the results from that performance period, and determine together whether the self-reviews truly reflected their contributions. It’s a simple but collaborative way to address implicit bias or stereotyping that you’re reasonably sure is there while giving agency to each employee.
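The confidence-interval idea can be made concrete with a short sketch. The example below is not from the article: the scores, group sizes, and variable names are invented, and the 75% level simply mirrors the hypothetical above. It shows how an analyst might report an interval around a gap in mean self-assessment scores when the groups are too small for conventional significance testing.

```python
# Illustrative sketch only: quantify uncertainty in a small-sample group
# difference in self-assessment scores. Scores and group labels are made up.
import numpy as np
from scipy import stats

# Hypothetical self-assessment scores (1-5 scale) for two small groups.
group_a = np.array([3.0, 3.5, 2.5, 3.0, 3.5, 2.5, 3.0])       # underrepresented group
group_b = np.array([3.5, 4.0, 4.0, 3.5, 4.5, 3.5, 4.0, 4.0])  # comparison group

diff = group_b.mean() - group_a.mean()

# Welch-style standard error for the difference in means.
va = group_a.var(ddof=1) / len(group_a)
vb = group_b.var(ddof=1) / len(group_b)
se = np.sqrt(va + vb)

# Welch-Satterthwaite degrees of freedom.
df = (va + vb) ** 2 / (va ** 2 / (len(group_a) - 1) + vb ** 2 / (len(group_b) - 1))

# 75% confidence interval: wide enough to be honest about small n,
# but still actionable guidance for managers.
t_crit = stats.t.ppf(0.875, df)  # two-sided 75% -> 87.5th percentile
low, high = diff - t_crit * se, diff + t_crit * se

print(f"Mean gap: {diff:.2f} points, 75% CI: ({low:.2f}, {high:.2f})")
```

If the whole interval sits above zero, analysts can tell managers the gap is probably real even though it would not clear a 95% bar; if it straddles zero, the honest guidance is to gather more data before acting.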
[Pull quote: “Algorithms and statistics do not capture what it feels like to be the only black or Hispanic member of a team.”]

Second, companies also need to be more consistent and comprehensive in their qualitative analysis. Many already conduct interviews and focus groups to gain insights on the challenges of the underrepresented; some even do textual analysis of written performance reviews, exit interview notes, and hiring memos, looking for language that signals bias or negative stereotyping. But we have to go further. We need to find a viable way to create and process more-objective performance evaluations, given the internalized biases of both employees and managers, and to determine how those biases affect ratings.
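As a rough illustration of the textual analysis described above, the sketch below scans a handful of hypothetical written reviews for a watch list of terms often associated with biased feedback and compares flag rates across groups. The term list, field names, and data are assumptions for illustration, not a description of Facebook’s actual tooling.

```python
# Illustrative sketch: flag bias-signaling language in written reviews and
# compare how often it appears across demographic groups of the reviewee.
# The term list, field names, and data are hypothetical.
from collections import defaultdict

WATCH_TERMS = {"cultural fit", "abrasive", "bossy", "aggressive", "emotional"}

reviews = [  # stand-in for an export of performance reviews
    {"group": "women", "text": "Great results, but can come across as bossy in meetings."},
    {"group": "men",   "text": "Strong technical work and clear communication."},
    {"group": "women", "text": "Solid quarter; a good cultural fit for the team."},
    {"group": "men",   "text": "Sometimes aggressive in design reviews, but effective."},
]

flag_counts = defaultdict(int)
totals = defaultdict(int)
for review in reviews:
    totals[review["group"]] += 1
    text = review["text"].lower()
    if any(term in text for term in WATCH_TERMS):
        flag_counts[review["group"]] += 1

for group in totals:
    rate = flag_counts[group] / totals[group]
    print(f"{group}: {flag_counts[group]}/{totals[group]} reviews flagged ({rate:.0%})")
```

A production analysis would work on far larger samples and richer language features, but even a simple flag rate cut by group can indicate whether coded language clusters around particular populations and whether it tracks with lower ratings.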
This journey begins with educating all employees on the real-life impact of bias and negative stereotypes. At Facebook we offer a variety of training programs with an emphasis on spotting and counteracting bias, and we keep reinforcing key messages post-training, since we know these muscles take time to build. We issue reminders at critical points to shape decision making and behavior. For example, in our performance evaluation tool, we incorporate prompts for people to check word choice when writing reviews and self-assessments. We remind them, for instance, that terms like “cultural fit” can allow bias to creep in and that they should avoid describing women as “bossy” if they wouldn’t describe men who demonstrated the same behaviors that way. We don’t yet have data on how this is influencing the language used—it’s a new intervention—but we will be examining patterns over time.
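A write-time prompt of the kind described above could be as small as the following sketch, which checks a draft review and returns gentle reminders instead of blocking the author. The watched terms and reminder wording are hypothetical examples, not Facebook’s actual prompts.

```python
# Illustrative sketch of a write-time word-choice prompt for review authors.
# Terms and reminder text are hypothetical examples, not a real product's rules.

REMINDERS = {
    "cultural fit": "Consider naming the specific skills or behaviors you mean; "
                    "'cultural fit' can let unexamined bias creep in.",
    "bossy": "Would you use this word for a man showing the same behavior? "
             "Consider describing the behavior instead.",
}

def word_choice_prompts(draft: str) -> list[str]:
    """Return reminder messages for any watched phrases found in the draft."""
    lowered = draft.lower()
    return [message for term, message in REMINDERS.items() if term in lowered]

if __name__ == "__main__":
    draft = "She delivered the project early but was a bit bossy with partners."
    for prompt in word_choice_prompts(draft):
        print("Reminder:", prompt)
```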
Perhaps above all, HR and analytics departments must value both qualitative and quantitative expertise and apply mixed-method approaches wherever possible. At Facebook we’re building cross-functional teams with both types of specialists, because no single research method can fully capture the complex layers of bias that everyone brings to the workplace. We view all research methods as trying to solve the same problem from different angles. Sometimes we approach challenges from a quantitative perspective first, to uncover the “what” before looking to the qualitative experts to dive into the “why” and “how.” For instance, if the numbers showed that certain teams were losing or attracting minority employees at higher rates than others (the “what”), we might conduct interviews, run focus groups, or analyze text from company surveys to understand the “why,” and pull out themes or lessons for other parts of the company. In other scenarios we might reverse the order of those steps. For example, if we repeatedly heard from members of one social group that they weren’t seeing their peers getting recognized at the same rate as people in other groups, we could then investigate whether numerical trends confirmed those observations, or conduct statistical analyses to figure out which organizational circumstances were associated with employees’ being more or less likely to get recognized.
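To sketch the quantitative “what” step, the code below computes attrition rates by team and demographic group from a hypothetical HR extract and surfaces the teams with the largest gaps; the same shape of analysis would apply to recognition rates. All column names and figures are invented for illustration.

```python
# Illustrative sketch: surface teams where minority employees leave at
# noticeably higher rates, as a "what" to hand to qualitative researchers.
# The data frame, column names, and numbers are hypothetical.
import pandas as pd

hr = pd.DataFrame({
    "team":       ["Ads", "Ads", "Infra", "Infra", "Search", "Search"],
    "group":      ["underrepresented", "other"] * 3,
    "headcount":  [24, 180, 31, 210, 18, 160],
    "departures": [6, 18, 3, 22, 7, 15],
})

hr["attrition_rate"] = hr["departures"] / hr["headcount"]

# Pivot so each team shows both groups side by side, plus the gap between them.
by_team = hr.pivot(index="team", columns="group", values="attrition_rate")
by_team["gap"] = by_team["underrepresented"] - by_team["other"]

# Teams with the largest gaps are candidates for interviews and focus groups.
print(by_team.sort_values("gap", ascending=False).round(2))
```

With headcounts this small, the gaps are signals to investigate with interviews and focus groups, ideally reported with the confidence-interval framing described earlier, rather than conclusions.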
Cross-functional teams also help us reap the benefits of cognitive diversity. Working together stretches everyone, challenging team members’ own assumptions and biases. Getting to absolute “whys” and “hows” on any issue, from recruitment to engagement to performance, is always going to be tough. But we believe that with this approach, we stand the best chance of making improvements across the company. As we analyze the results of Facebook’s Pulse survey, given twice a year to employees, and review Performance Summary Cycle inputs, we’ll continue to look for signs of problems as well as progress.

EVIDENCE OF DISCRIMINATION or unfair outcomes may not be as certain or obvious in the workplace as it was for me when I was evicted from my apartment. But we can increase our certainty, and it’s essential that we do so. The underrepresented people at our companies are not crazy to perceive biases working against them, and they can get institutional support.

HBR Reprint R1706L