Can We Teach Robots to be Egalitarian?


Can we teach robots to be less biased than we are?  Probably yes.  But only if we do this right.  Bias is mostly the product of mental shortcuts in our reasoning, and machines can only think clearly if we teach them not to take the same shortcuts.

There is an interesting article about employers’ best attempts at reducing bias in hiring algorithms.  Paul Burley, the CEO at Predictive Hire, describes his company’s efforts to identify and eliminate bias in the recruitment and selection of job applicants.  This work goes beyond removing applicant names from a conventional recruitment process; it uses predictive analytics to identify the best candidate.

Burley is particularly keen on identifying interview questions that drive bias (either direct or adverse-effect discrimination) and then eliminating those questions entirely.  While the company does not use demographic information inside its algorithms, it does use demographic information outside the algorithm, to test after the fact whether any of its questions are causing bias.
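To make this concrete, here is a minimal sketch of what after-the-fact adverse-impact testing can look like.  Everything here is hypothetical: the data, the group labels, and the choice of the four-fifths (80%) rule as a flagging threshold are illustrations, not a description of Predictive Hire’s actual method.

```python
# Hypothetical sketch: demographic data stays outside the scoring algorithm
# and is only joined in afterwards, to check whether any interview question
# disadvantages a group.  The four-fifths (80%) rule of thumb is used here
# as the flagging threshold.

from collections import defaultdict

def pass_rates(records):
    """records: list of (group, passed) tuples for one question."""
    counts = defaultdict(lambda: [0, 0])  # group -> [passed, total]
    for group, passed in records:
        counts[group][0] += int(passed)
        counts[group][1] += 1
    return {g: p / t for g, (p, t) in counts.items()}

def flag_adverse_impact(records, threshold=0.8):
    """Flag groups whose pass rate is below 80% of the best group's rate."""
    rates = pass_rates(records)
    best = max(rates.values())
    return {g: r / best for g, r in rates.items() if r / best < threshold}

# Hypothetical results for one question: (demographic group, passed screening?)
question_7 = [("A", True)] * 60 + [("A", False)] * 40 \
           + [("B", True)] * 35 + [("B", False)] * 65

# Group A passes at 60%, group B at 35%; B's ratio is 0.35/0.60, well
# under the 0.8 threshold, so the question gets flagged for review.
print(flag_adverse_impact(question_7))
```

The point of keeping demographics out of the model itself but in the test harness is that the flag tells you *which question* to eliminate, without the model ever being able to condition on group membership.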

Using Workforce Analytics to Identify Invisible Bias

It sounds to me like his company is going about it the right way.  With bias, we don’t deliberately “choose” white males to be the boss.  Rather, we assess what traits would normally indicate strong leadership, accidentally carry forward historic stereotypes about strong leaders, and then inadvertently choose white males.  Plenty of people, including some women and visible minorities, accidentally advance this momentum.  That is because underlying thought patterns are driving things, rather than deliberate and malevolent racism and sexism.  You can take one step forward by not being a jerk, but take two steps backward on something called cognitive bias.  And everyone does cognitive bias, not just the man.

Over at Better Humans, they have created a Cognitive Bias Cheat Sheet.  Personally, I have been trying to stay on top of cognitive bias since it was revealed to be a major driver of the 2008 sub-prime mortgage fiasco and the subsequent Great Recession.  The sheer number of cognitive biases is overwhelming, and that in itself illustrates the real problem.  The world just gives us too much information to process, so we take shortcuts in our thinking, producing judgments that are only sometimes accurate.  In the language of behavioral economics, prejudice is largely the advancing of skewed thinking based on cognitive bias shortcuts.

Information Overload – Are Machines Better Equipped Than Humans?

The big deal with big data is that machines are supposed to help us overcome the over-abundance of information.  Sure, we can find patterns and dig up nuggets that are buried in a mountain of data.  But if we are also making judgment calls using cognitive shortcuts because the human brain can’t handle the volume, there is the opportunity to use the machine to allow us to make judgments using all of the information.  We can create algorithms that are larger and more complex, bypassing the constraints of cognitive bias, and produce recommendations that are far less biased than those produced by humans.

We don’t entirely have the option of just turning the machine off.  Going off-grid just sends us back to biased decisions made by humans on gut instinct.  Think of who you know, and consider that not all luddites are champions of equality.  Right now, we are just getting past the first wave of machines imitating our own sexism and racism.  We now have the option of telling the machines to stop doing that, and then building new algorithms that meet our own purported standards of neutrality.

But this will happen if and only if we choose to name our biases, talk openly about them, measure them, make decisions to reverse them, and keep improving the algorithms such that everyone has a fair shot at the good jobs.  And even then, we still can’t trust robots to decide where to seat people on the bus.  We must forever be vigilant, and stay human.

Loving Math, Caring About Peers

Nerd.  Photo courtesy of David Nichols

Some time has passed, so let’s calmly reflect on the anti-diversity manifesto that got a software engineer fired from Google in August of 2017.  James Damore, the author, has to be the unluckiest person on earth.  Not only did he lack the genetics and environmental upbringing to be compassionate about the emotions of others (he might be on the autism spectrum), but he also wrongly attributed his career difficulties to the ascent of workplace diversity initiatives.

He delivered his critique via the alt-right media one week prior to the deadly neo-Nazi rally in Charlottesville.  That incident provoked corporate executives, top-ranking generals, and mainstream Republicans to denounce the rally and distance themselves from Donald Trump’s muddled sympathies in the aftermath.  Damore and his fans have left no opening for a nuanced discussion about the effects of diversity initiatives on those with developmental disorders, a potentially meaningful topic of debate.

In this New York Times article, Claire Cain Miller proffers a critique of the role of emotional intelligence in the modern world of information technology.  It turns out that technology has a massive overlap with social and emotional context.

Emotional Intelligence in Workforce Analytics and Computer Programming

For deeper evidence, the article cites 2015 research from David Deming finding that job growth and wage growth are highest among roles that use both math skills and social skills.  The idea is that workers “trade tasks” with one another, allowing specialization of talents and improved efficiency when work duties are shuttled back and forth.  Those who trade tasks more effectively through the use of social skills are more productive; hence more jobs and higher pay.

This double-barreled skill set is abundantly obvious to those in workforce analytics.  We spend half our day figuring out cool formulas and novel discoveries.  But the other half of our day is spent interpreting client need, negotiating resource priorities, wordsmithing data definitions, developing interpretations that are suitable to context, and showing compassion while we advance disruption.  However, my field is new.

When computer programming was new, it was considered highly social work.  There was an abundance of women working in the field.  Through some office-culture twists and turns, things changed.  Boys and men who weren’t as clever at social skills self-selected into programming.  It worked out for a lot of people.  But there’s a problem: at some point in a career, the next promotion is contingent on social skills.  Those who are lacking in this area see their careers stall.

Examples abound of coding projects with male-dominated teams that lacked context and missed an important detail about women’s perspectives.  Apple’s original health app tracked everything except menstrual cycles, the most-tracked health data point amongst women.  Google Plus obliged users to specify their gender and provide a photo, exposing women to harassment.

The Times article also cites research from 2010 by Stanford sociologist Shelley Correll showing that gender stereotypes about skills and performance are a kind of self-fulfilling prophecy.  It’s not true that women are naturally bad at math, but it is abundantly true that women who are told they are bad at math will under-perform and rate themselves more harshly.  The struggle about this stereotype has played out in dramatic ways over the years.

How to Improve Workplace Culture to Ensure Equality for Women

In terms of what to do about this, Correll advises that we:  1) ensure there are no negative gendered beliefs operating in the organization, 2) ensure performance standards are unambiguous and communicated clearly so that sexism does not fill the vacuum, and 3) hold senior management accountable for gender disparities in hiring, retention, and promotion.  That third item is metrics-based accountability, which means that business performance, diversity, and workforce analytics are fundamentally entwined.

The Times article notes that “one way to develop empathy at companies is by hiring diverse teams, because people bring different perspectives and life experiences.”  While we might perceive that equity and inclusion efforts come from an activist base, there is also a corporate interest in fostering inclusion.  High-performance workplaces need an environment where tasks and diverse views are shuttled back and forth with ease and good manners.

As for white males who desperately struggle with emotional intelligence, their voice will have to wait another day.  And probably for another leader.

Bias Bad for Business

Hiding.  Photo courtesy of Wes Peck

Bias is bad for productivity.  Here’s an overview of a study that came out in July 2017.  The findings are that perceptions of bias have a negative impact on idea-sharing and job commitment.  “Of employees who experienced bias, 34% reported withholding ideas or solutions in the last six months and 48% said they looked for a new job while at their current job during the same time period.”

Perceptions of implicit bias are reduced by an inclusive environment. “Employees were 64% less likely to perceive bias at companies with diverse leaders, 87% less likely when they had inclusive leaders, and 90% less likely when they had sponsors.”

The methodology was to compare employees’ self-assessments of their potential to their estimates of how their manager would rate them.  Larger gaps were interpreted as an indicator of bias.  There’s room for debate about the methodology, but the findings ring true: managers who favour people like themselves discourage the productivity of a diverse workforce.  Bias is simply malfunctioning thinking, and leading an organization with malfunctioning thinking would presumably hinder workplace effectiveness.
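As I read that methodology, the core metric could be sketched like this.  The ratings, group labels, and scale below are hypothetical illustrations; the study’s actual instrument may differ.

```python
# Minimal sketch of a "perception gap" metric: each employee self-rates their
# potential and also estimates how their manager would rate them.  A large
# negative gap (expected manager rating below self-rating) is treated as a
# signal of perceived bias.  All data here is hypothetical.

def perception_gap(self_rating, expected_manager_rating):
    """Negative values mean the employee expects to be under-rated."""
    return expected_manager_rating - self_rating

employees = [
    {"group": "in-group",  "self": 8, "expected": 8},
    {"group": "in-group",  "self": 7, "expected": 7},
    {"group": "out-group", "self": 8, "expected": 5},
    {"group": "out-group", "self": 7, "expected": 5},
]

def mean_gap(rows, group):
    """Average perception gap for one demographic group."""
    gaps = [perception_gap(r["self"], r["expected"])
            for r in rows if r["group"] == group]
    return sum(gaps) / len(gaps)

print(mean_gap(employees, "in-group"))   # 0.0
print(mean_gap(employees, "out-group"))  # -2.5
```

A systematically more negative mean gap for one group than another is the kind of pattern the study reads as perceived bias, which is exactly where the debate about the methodology lives: it measures expectations of bias, not the manager’s actual ratings.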

I’ll Show You My Salary if You Show Me Yours

Secrets.  By Salvatore Barbera

Human resources departments and those who handle their data are expected to guard the best secrets.  But one of the biggest secrets is ironically an anti-secret.  Did you know you’re allowed to talk openly about your own pay?  Don’t tell HR.  It’s embarrassing (for them).

This article by Jonathan Timm from July 2014 draws attention to the dubious practice of pay secrecy.  I’m not talking about the employer’s obligation to keep your pay information confidential.  Rather, it’s an article about employees being obliged to keep their pay a secret from one another.  These obligations are referred to as “gag rules.”

For the uninitiated, there is no meaningful moral obligation for employees to refrain from talking about their salary with each other.  On the contrary, in the United States there are regulations that protect employees’ rights to discuss working conditions with one another.  This falls under the legislation that allows employees to collectively discuss their lot in life, bargain for improvements, and possibly unionize.

In that context the moral judgment should be obvious.  Those handling the file at human resources desks are not allowed to engage in anti-union behavior, and as professionals they should always advise against such policies.

The article describes personal experiences of people struggling with these fake rules.  What is notable is how people presume these gag rules are legitimate, employers and employees alike.  Gag rules create a sense of guilt about whether we should put ourselves ahead of the employer.  They make us self-conscious about whether we’re being greedy.  We’re embarrassed to talk about whether we’re losers for being the lowest-paid person.  Raising the topic with colleagues is “akin to asking about their sex life.”

These emotions are powerful stuff.  But then, that’s how bullying is done, isn’t it?

Above and beyond beef-and-taters union issues, gag rules are also wrapped up in discriminatory pay practices.  That is, it is easier to under-pay women and visible minorities, or play favorites, if employees don’t talk about their pay.  A woman named Lilly Ledbetter complied with the gag rule at Goodyear for nearly three decades and ultimately found out she was under-paid.  Ms. Ledbetter sued and lost because she did not complain about being under-paid within 180 days of the first discriminatory paycheck.

Ironically, employers share pay information with each other all the time.  They’re called compensation surveys.  They happen on an annual basis (if not monthly), and they are delivered through specialized consulting services.  The work is done under careful checks and balances that ensure data privacy and keep the whole process fair and legal.  Those who have worked on such surveys are proud of their work.  I used to do compensation surveys myself, and I was good at it.

One of the reasons why compensation professionals love doing this work is because it helps make pay fair and equitable.  Looking down from the ivory tower, human resources people know that perceived unfairness in pay creates discord.  So “good” employers put some work into getting it right, behind the scenes, in a kind of lab environment where social justice is organized by experts.  But really we’re just trying to stay one step ahead of the riff-raff.

Let’s face it, employees and the social justice movements they created are the rightful owners of this dialogue.  Gag rules and compensation surveys are just the cultural appropriation of working-class politics.

Beating Bias with Blindness

Blind Justice.  Photo courtesy of Tim Green.

Managers and human resource professionals are supposed to have non-discriminatory hiring practices.  Yet we are only in the early days of seeing job applicants neutrally.  There are several new (and not-so-new) methods for considering applicants fairly.  There is also the possibility of using good math to demonstrate bias, and to reduce it.

Canada’s federal public service announced on April 20, 2017 that it is starting a pilot project to recruit job applicants on a name-blind basis.  The minister responsible said “research has shown that English-speaking employers are 40 per cent more likely to pick candidates with an English or anglicized name…”  At the end of the pilot they will analyze the two sets of candidate shortlists, both name-blind and traditional-method.  The results of the experiment will be ready in October, for possible roll-out to the entire public service.
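A minimal sketch of how the two sets of shortlists might be compared, assuming the same applicant pool is screened both ways.  The applicants, outcomes, and the `anglicized` flag below are hypothetical illustrations, not the government’s actual analysis plan.

```python
# Hypothetical sketch: screen one applicant pool twice (traditional and
# name-blind) and compare shortlisting rates for applicants with anglicized
# vs. non-anglicized names.  A gap that shrinks under name-blind screening
# would be evidence that names were driving part of the selection.

def shortlist_rate(shortlist, pool, predicate):
    """Share of applicants matching `predicate` who made the shortlist."""
    matching = [a for a in pool if predicate(a)]
    return sum(a in shortlist for a in matching) / len(matching)

pool = [
    {"id": 1, "anglicized": True},  {"id": 2, "anglicized": True},
    {"id": 3, "anglicized": False}, {"id": 4, "anglicized": False},
]

# The same pool screened two ways (hypothetical outcomes):
traditional = [pool[0], pool[1]]   # only anglicized names advance
name_blind  = [pool[0], pool[2]]   # one applicant from each group advances

for label, shortlist in [("traditional", traditional), ("name-blind", name_blind)]:
    rate_a = shortlist_rate(shortlist, pool, lambda a: a["anglicized"])
    rate_n = shortlist_rate(shortlist, pool, lambda a: not a["anglicized"])
    print(f"{label}: anglicized {rate_a:.0%}, non-anglicized {rate_n:.0%}")
```

The virtue of the two-condition design is that each applicant is their own control: any difference between the shortlists is attributable to the name information, not to differences between applicant pools.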

What is worth noting is that the Canadian government is running a formal experiment for a limited time.  This raises hope that the eventual course of action will be determined by evidence, not speculation.  They will measure the discrimination before attempting to remedy it, which could bolster support.  The approach also implies the pilot has permission to fail; after all, they might find something totally different from what they expected.  But that’s the kind of thing that happens when you care about science.

Of course this pilot addresses only one part of the discrimination puzzle.  I would speculate that résumés that still indicate the year and city in which a degree was attained will tip off employers about age and ethnicity.  An obvious next phase is to block out the graduation date and the name of the university.  After all, you only need to know whether they finished the degree, the degree’s level and academic major, and a broad sense of the school’s ranking (e.g. top-100, top-400).

Job applications also reveal writing style, which should be a good thing.  But there are differences between the sexes in word use.  In the book The Secret Life of Pronouns, James W. Pennebaker reveals the findings of high-volume statistical analyses showing (amongst other things) that men make bold pronouncements without referring to themselves in the first person.  Women, by contrast, attribute their story to themselves, which is clearer, more social, and more modest.  I personally think that confidence, and the willingness to boast, are unreliable indicators of competence.

In classical music, blind auditions are now commonly used to select new hires for symphony orchestras, and have been for years.  Musicians submit recordings of their auditions and give live performances behind a physical screen.  I have heard that judges gossip that “you can tell” the candidate is a man, yet when the winner steps out from behind the screen it is often a woman.  In this not-so-new paper from 2000, Claudia Goldin and Cecilia Rouse analyzed 7,065 individuals and 588 audition-rounds to see what impact blind auditions had.  They found that blind auditions work.

When you’re fighting the man, words are important.  When you’re putting change into effect, math is importanter.

Barbie Provokes Equality (By Accident)

Teen Talk Barbie 1991.  Courtesy of Freddycat1.

It’s important in the modern workplace to know that there used to be a pervasive stereotype that women were bad at math.  It’s relevant to all of us trying to advance math in human resources.  We have the dual obstacle of getting good math across to clients, while also getting past unfair judgments directed at women who have perfectly good numbers in their hands.

This is a brief inter-generational memo that will be perceived differently depending on when you were born.  In 1992, Mattel produced the toy Teen Talk Barbie.  Each doll uttered a random selection from a pool of 270 possible phrases, and roughly 1.5% of dolls said the phrase “Math class is tough.”

The doll was decried by the National Council of Teachers of Mathematics for discouraging women from studying math and science.  It was also referenced when the American Association of University Women criticized the relatively poor education that women were getting in math.  Mattel apologized for the mistake and announced that new dolls would not utter the phrase, and anyone who owned such a doll would be offered an exchange.

I don’t know the full history of women in math, but I do know enough to assert that Teen Talk Barbie was a critical incident.  Mattel did us all a favor by screwing up in exactly the right way, obliging many people to snap out of it, encouraging more women to become great at math, and doubling our talent pool of qualified applicants for math-intensive positions.

What fascinates me most about this incident is that people born after 1980 show no outward assumption that women are bad at math.  Those of us who grew up with this assumption were repeatedly shown that the stereotype was wrong, often by living out an experience where women excelled.  The younger half of the workforce appears to be advancing their careers in blissful ignorance of this archaic stereotype.

The historic stereotype is important within human resources.  Human resources has historically been bad at math and is also a field with a large representation of women.  Quantitative work is becoming increasingly important within human resources, and human resources is obliged to influence business peers who take math very seriously.  As human resources becomes more sophisticated and makes its way to the big-kids table of decision makers, women who are good at math will speak their minds… as did Teen Talk Barbie.  Shortly after the debacle, the Barbie Liberation Organization swapped voice boxes between the Barbies and talking G.I. Joe action figures.  The liberated Barbies had access to the phrases “Eat lead, Cobra!” and “Vengeance is mine!”