
Cathy O’Neil—Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy

October 4, 2016—Cathy O’Neil, data scientist and author of the new book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, discussed how some algorithms can have an invisible but important and destructive impact on people’s lives.

Decisions about employment, criminal sentencing, and many other areas are now influenced by algorithms and big data. This is a serious problem, O’Neil argues, because there is little transparency about how these systems are constructed or used. Some highlights from the conversation follow; the full audio is available on iTunes.

On her trajectory from Wall Street to Occupy Wall Street

People are afraid of math, and they trust math, and they stop asking questions when they see formulas.

O’Neil left her position at Barnard College to work at the hedge fund D.E. Shaw shortly before the 2008 financial crisis. “I started becoming more skeptical of what I was actually a part of…basically you’re taking ugly sausage meat and mixing it together and claiming to have very good sausage at the top of the heap. This sounds wrong, and it was wrong. It was a mathematical lie. As an idealist mathematician I had thought of mathematics as a refuge, where you don’t have to have political disagreements, you can all agree on math, because it’s true. To see it used as a shield for corrupt practices in finance was actually shameful to me.”

“[In 2009] I went to RiskMetrics for two years with the hope that I could help the risk model…I worked on the credit default swap model with the idea, ‘oh, if we understand risk better on a mathematical level, then we’d have a safer world.’ I came up with a better way of understanding risk…but then I found out very quickly that nobody cared…the way I was going to do it was going to show them that they had more risk on their books than they wanted to admit or that they cared about…then I realized that this is not a mathematical problem, this is essentially a political problem. We’re weaponizing math, because people are afraid of math, and they trust math, and they stop asking questions when they see formulas…I left, I started Mathbabe as a way to expose the corrupt practices that I had seen in finance, and when Occupy broke out about six months later I joined them.”

Data science shortcomings

After leaving finance, O’Neil also began working as a data scientist, but saw problems that paralleled those of her previous industry. “What I noticed was that everybody in data science was extremely drunk on the Kool-Aid of this idea that whatever we’re doing with big data is going to solve problems and make the world a better place, and I was like, ‘wait a second, why do we assume that?’”

Data science algorithms sorted the world into winners and losers, and the losers sometimes didn’t even know that they had been scored.

Although failures in the financial sector garnered public attention, other data science failures, such as teacher evaluation models, did not, said O’Neil. “Teachers were getting these scores that weren’t explained to them, and sometimes they were getting fired or at least denied tenure based on bad scores. They couldn’t appeal them because it was proprietary and secret. They had this extreme power but no transparency, which meant no accountability. And that meant that people that were fired or didn’t get tenure were silenced…people didn’t even notice because it happened at this individual level.”

“I started noticing that pattern—that data science algorithms sorted the world into winners and losers, and the losers sometimes didn’t even know that they had been scored. They just didn’t get opportunities that the winners got.”

Why even online ads are not innocuous

O’Neil decided to write her book while working at a startup, where she built models predicting purchases on Expedia in order to deliver targeted advertising. She thought of the practice as “not particularly pernicious,” until a visiting venture capitalist remarked that he never wanted to see another University of Phoenix ad because “that’s not for people like me.”

“Venture capitalists are influential in creating the world of our online experience. Their stated intention is to make that world a place where we are segregated and siloed and segmented by class, so the people who are the winners in that system…are given opportunities, and the people at the other end of the spectrum are preyed upon.”

O’Neil elaborated, using the example of for-profit colleges, some of which have come under fire in recent years for their poor graduation rates and high levels of debt among low-income students. “If you’re a poor person, who are you worth the most for?…In this case, they’re not actually getting money from that poor person, they’re getting money from the federal government because they convinced that poor person to sign up for online, for-profit college classes.”

How data and algorithms can affect employment and incarceration

“Sixty-six percent of job applicants per year have to take a personality test in order to get an interview for minimum wage work,” said O’Neil. This becomes problematic when the tests are used to discriminate, for example, against people with mental health issues, she said.

“I found out about recidivism risk algorithms that judges were using in order to decide the length of sentencing for people. And the risk scores very likely are racist because they’re based on all sorts of racist arrest data and questionnaires, the questions of which are proxies for race and class.”

“The people who were losers from the system were the least likely to have lawyers, and to even have the voice to complain about it in a way that mainstream media would pick up.”

Why greater transparency is crucial

If you think about the recidivism risk algorithms changing the sentences of people, that is tantamount to a law on the books, and we have the right to understand our laws.

“Algorithms are not inherently objective, they’re just decision making processes. Although there are sophisticated mathematical elements to them, there is no reason that anyone should be wowed by the math—it’s just a black box…people go ‘I’m not a mathematician, I can’t object to this.’ No, you can object to it. It’s not fair for someone to be assessed at a job, especially a civil servant, when they don’t even understand the process by which they’re being judged.”

“With predictive policing and recidivism risk algorithms, those are secret…If you think about the recidivism risk algorithms changing the sentences of people, that is tantamount to a law on the books, and we have the right to understand our laws.”

“We already have laws about fair hiring practices…we have these anti-discrimination laws, but when we have an algorithm involved, we somehow don’t interrogate it, we exempt it. Why? These companies are replacing their human resources departments with algorithms to save money, and they are not checking to see whether the algorithms they’ve replaced their HR with are legal.”
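Interrogating a hiring algorithm against existing anti-discrimination law is not a mystery; regulators already have a simple numerical test. Below is a minimal Python sketch, with hypothetical group labels and counts, of the EEOC’s “four-fifths rule,” which flags adverse impact when one group’s selection rate falls below 80 percent of the highest group’s rate.

```python
# A minimal sketch of the kind of check an employer could run on an
# automated screener's output. The EEOC's "four-fifths rule" flags
# adverse impact when a group's selection rate is under 80% of the
# highest group's rate. All group names and counts are hypothetical.

def adverse_impact_ratios(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group name -> (applicants, selected).
    Returns each group's selection rate divided by the highest rate."""
    rates = {g: selected / applicants for g, (applicants, selected) in outcomes.items()}
    top = max(rates.values())
    return {g: r / top for g, r in rates.items()}

# Hypothetical results from an algorithmic personality test.
results = {"group_a": (1000, 300), "group_b": (1000, 180)}

for group, ratio in adverse_impact_ratios(results).items():
    flag = "ADVERSE IMPACT" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} ({flag})")
```

A company replacing its human resources department with a screening algorithm could run exactly this kind of check on the screener’s decisions, which is the interrogation O’Neil says is being skipped.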

Can these algorithms be improved and used for the public good?

“When we build an algorithm, we define the data that goes into it, which is often biased…but more importantly we define the objective, we define what the definition of success is. With the for-profit colleges, Google made a ton of money, the for-profit colleges made a ton of money, so it was a win-win situation if you ignore what actually happened to society. It’s hard to imagine those algorithms suddenly being used not to make profit, but to make sure that everyone gets the best possible education. The role of government has to come in here.”
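Her point that the modeler’s choice of objective, rather than the data alone, determines whom an algorithm serves can be made concrete. The toy Python sketch below, with invented users and numbers, scores the same two people under a profit objective and a welfare objective and picks opposite “winners.”

```python
# A toy illustration (all names and numbers invented) of how the chosen
# definition of "success" decides whom an algorithm serves. The same
# ad-targeting data yields opposite decisions under two objectives.

# Each hypothetical user: (expected ad revenue, expected benefit to the user)
users = {"user_1": (9.0, -2.0),   # lucrative click, likely bad outcome
         "user_2": (1.0,  5.0)}   # little revenue, genuinely helped

def pick(objective):
    """Return the user the algorithm targets under the given objective."""
    return max(users, key=lambda u: objective(*users[u]))

print(pick(lambda revenue, benefit: revenue))   # profit objective -> user_1
print(pick(lambda revenue, benefit: benefit))   # welfare objective -> user_2
```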

“I’m a big believer in data, but we’ve just thrown these algorithms out there assuming that they were safe and fairer than our existing system, and there’s no reason to believe that. They’re just propagating our past systems.”

“I think what we need to do is monitor and audit the algorithms for fairness…a lot of recidivism risk algorithms are actually pretty accurate, but they’re accurate because our policing system is racist.”
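Her proposed remedy, auditing algorithms for fairness, also has a concrete form. The sketch below, using synthetic records and invented group labels, computes one standard audit metric: the risk score’s false positive rate in each group, that is, how often people who did not reoffend were nonetheless labeled high risk.

```python
# A minimal sketch of one fairness audit: comparing a risk score's
# false positive rate across groups. A model can look "accurate"
# overall while wrongly labeling one group high-risk far more often.
# All records below are synthetic.

from collections import defaultdict

# (group, predicted_high_risk, actually_reoffended) -- synthetic data
records = [
    ("group_a", True, False), ("group_a", False, False), ("group_a", True, True),
    ("group_b", True, False), ("group_b", True, False), ("group_b", True, True),
]

fp = defaultdict(int)   # labeled high risk but did not reoffend
neg = defaultdict(int)  # did not reoffend
for group, predicted, reoffended in records:
    if not reoffended:
        neg[group] += 1
        fp[group] += predicted  # True counts as 1

for group in neg:
    print(f"{group}: false positive rate {fp[group] / neg[group]:.2f}")
```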

The role of the media

“I do blame journalists a little bit, and I love journalists, they’re my favorite people—journalists are basically skeptics. But the problem is a lot of journalists are also really afraid of math. Tech journalists are often very guilty of just writing down whatever the PR person from a big data firm says…tech journalists have to start asking important questions: How do you know that what you’re doing is fair? They have to push back much harder than they do now.”

Article by Nilagia McCoy of the Shorenstein Center.