
Computer Bias: What It Is and How to Avoid It

Professor Christo Wilson suggests the solution may not come from computer scientists.

By Bill Ibelle

Illustration by Josh Reese

As computer algorithms grow increasingly sophisticated, they’re being trusted to make decisions that could have a dramatic impact on your life—on everything from bank loans and hiring decisions, to the deployment of police officers.

But these algorithms aren’t as impartial as we’d like to think, says Christo Wilson, a Big Data expert and computer science professor at Northeastern. In fact, they can be even more biased than the humans who created them.

“WE’RE ACTING ON THE ERRONEOUS BELIEF THAT BECAUSE A COMPUTER IS A MACHINE, IT’S NEUTRAL.”

For instance, Wilson’s research uncovered a bias against black workers on two prominent gig economy websites. In both cases, contract employees were rated by clients, who consistently gave higher ratings to white workers with similar qualifications. Those ratings then drove the order in which workers were recommended for the next job, so black workers were systematically losing work opportunities.
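
To see how that feedback loop compounds, consider a minimal sketch of the mechanism. The data, names, and scoring below are invented for illustration; this is not either site’s actual code. The ranker orders workers by average rating and hands each new job to whoever is on top:

    # Hypothetical sketch: ratings drive rank, rank drives who gets the next job.
    from statistics import mean

    ratings = {
        "worker_white": [4.9, 4.8],  # similar qualifications,
        "worker_black": [4.6, 4.5],  # but clients rate slightly lower
    }

    for _ in range(10):
        # Recommend workers in order of average rating; the top worker is hired.
        top = max(ratings, key=lambda w: mean(ratings[w]))
        ratings[top].append(4.9)  # the winner collects yet another rating

    print({w: len(r) - 2 for w, r in ratings.items()})
    # {'worker_white': 10, 'worker_black': 0} -- a small rating gap locks in
    # all ten jobs for one worker, and no line of this code mentions race.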

The study found no indication of intentional prejudice. But that’s the point.

“We’re acting on the erroneous belief that because a computer is a machine, it’s neutral,” says Wilson. “That’s not true. Computers are working with code and data that are produced by humans and reflect our values, perceptions, and intent. If you train a machine based on biased data, you’re going to create a biased algorithm.”
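
A toy example of what Wilson is describing, using invented data and a deliberately trivial model: a “neutral” program fit to biased historical decisions simply replays them.

    # Train a trivial model on past human hiring decisions by counting outcomes.
    from collections import defaultdict

    historical = [  # (group, qualified, hired) -- invented training labels
        ("A", True, True), ("A", True, True), ("A", False, False),
        ("B", True, True), ("B", True, False), ("B", False, False),
    ]

    counts = defaultdict(lambda: [0, 0])  # (group, qualified) -> [hired, seen]
    for group, qualified, hired in historical:
        counts[(group, qualified)][0] += hired
        counts[(group, qualified)][1] += 1

    def predict(group, qualified):
        hired, seen = counts[(group, qualified)]
        return hired / seen > 0.5  # hire if most similar past cases were hired

    print(predict("A", True), predict("B", True))
    # True False -- identical qualifications, different verdicts, because
    # the bias was already in the training data.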

AI Gone Haywire
Wilson offers an extreme but valuable example of how quickly biased data can cause machine learning to go off the rails.

Microsoft’s ill-fated Tay (short for “Thinking About You”) chatbot was a marketing experiment designed to mimic the conversation patterns of a 19-year-old girl. The goal was for users to engage in Twitter conversations with the bot. To improve Tay’s conversational skills, scientists programmed it to mimic the language and attitudes of the tweets it received.

Within hours, Tay had sampled so much noxious Twitter traffic that it began spewing out a tirade of inflammatory messages—more than 96,000 tweets—a large percentage of them extolling the virtues of Hitler and demonizing feminists. Microsoft pulled the plug on Tay before it completed its first day of operation.
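
As a rough illustration of the failure mode, and emphatically not Tay’s actual implementation, imagine a bot that replies by sampling from the messages it has received. Whoever floods its inbox controls what it says:

    # Toy mimicry bot: every incoming message becomes future output.
    import random

    memory = ["hello there", "nice weather today"]  # benign seed phrases

    def receive(message):
        memory.append(message)  # incoming text is treated as training data

    def reply():
        return random.choice(memory)

    for _ in range(98):
        receive("inflammatory slogan")  # a coordinated flood of abuse

    hits = sum(reply() == "inflammatory slogan" for _ in range(1000))
    print(hits / 1000)  # ~0.98: the bot now mostly parrots the flood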

Computer science professor Christo Wilson. (Photo by Mariah Tauger)

How to Avoid Computer Bias
There are steps, however, that organizations can take to prevent machine bias from taking root and to mitigate it when it does surface. The first, says Wilson, is to perform routine bias audits on the algorithms you use, much as you would run security audits.

“This should be rolled into any algorithm’s deployment strategy,” says Wilson.

He also cautions that computer scientists typically aren’t the best people to perform that task. “It’s going to be far more effective to bring in a sociologist or a statistician,” he says. “These people are going to be far more attuned to what could go wrong and how to tease out the issues that can lead to bias.”
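
What might such an audit check? One simple statistical test, sketched below on invented decision data, compares positive-outcome rates across groups in the spirit of the “four-fifths rule” used in US employment analysis:

    # Minimal bias audit: compare selection rates between two groups.
    decisions = [  # (group, positive_outcome) -- invented audit log
        ("A", 1), ("A", 1), ("A", 0),
        ("B", 1), ("B", 0), ("B", 0),
    ]

    def selection_rate(group):
        outcomes = [y for g, y in decisions if g == group]
        return sum(outcomes) / len(outcomes)

    ratio = selection_rate("B") / selection_rate("A")
    print(f"disparate impact ratio: {ratio:.2f}")  # 0.50
    if ratio < 0.8:  # the four-fifths rule's conventional threshold
        print("flag: possible adverse impact; review the model and its data")

A real audit would go well beyond a single ratio, which is one reason Wilson recommends bringing in statisticians who know which tests fit which decisions.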

Meanwhile, computer scientists are looking for ways to get ahead of this issue—both in terms of practical solutions and training for coders. Wilson, for example, has received a CAREER grant from the National Science Foundation to explore algorithmic bias and develop an ethics training curriculum for code developers. The Workshop on Data and Algorithmic Transparency, co-chaired by Alan Mislove of Northeastern and Arvind Narayanan of Princeton, runs sessions around the country to promote unbiased programming.

“Computer bias wasn’t as big an issue when code was distributed on floppy disks among hundreds of users,” says Wilson. “But now people can develop an algorithm in their garage and have a million users in a matter of weeks. That comes with considerable responsibility. We need to think about how our algorithms are going to be used and how they can be abused in terms of social norms, bias, and ethics.”
