Math is racist: How data is driving inequality
By Aimee Rawlins
Taken From: CNN
It's no surprise that inequality in the U.S. is on the rise. But what you might not know is that math is partly to blame.
In a new book, "Weapons of Math Destruction," Cathy O'Neil details all the ways that math is essentially being used for evil (my word, not hers).
From targeted advertising and insurance to education and policing, O'Neil looks at how algorithms and big data are targeting the poor, reinforcing racism and amplifying inequality.
These "WMDs," as she calls them, have three key features: They are opaque, scalable and unfair.
Denied a job because of a personality test? Too bad -- the algorithm said you wouldn't be a good fit. Charged a higher rate for a loan? Well, people in your zip code tend to be riskier borrowers. Received a harsher prison sentence? Here's the thing: Your friends and family have criminal records too, so you're likely to be a repeat offender. (Spoiler: The people on the receiving end of these messages don't actually get an explanation.)
The models O'Neil writes about all use proxies for what they're actually trying to measure. The police analyze zip codes to deploy officers, employers use credit scores to gauge responsibility, payday lenders assess grammar to determine creditworthiness. But zip codes are also a stand-in for race, credit scores for wealth, and poor grammar for immigrants.
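To make the proxy problem concrete, here is a minimal sketch in Python. All the data and names below are invented for illustration (none of this comes from the book): a model that never sees a protected attribute can still reproduce it through a correlated feature like zip code.

```python
# Illustrative only: hypothetical data showing how a proxy feature
# (zip code) can encode a protected attribute the model never sees.
from collections import defaultdict

# Hypothetical loan applicants: (zip_code, protected_group, repaid)
applicants = [
    ("10001", "A", True), ("10001", "A", True), ("10001", "A", False),
    ("60621", "B", False), ("60621", "B", True), ("60621", "B", False),
]

# A "blind" model that rates risk purely by zip code...
defaults_by_zip = defaultdict(list)
for zip_code, _, repaid in applicants:
    defaults_by_zip[zip_code].append(0 if repaid else 1)

risk_by_zip = {z: sum(v) / len(v) for z, v in defaults_by_zip.items()}

# ...still assigns each group a different average risk, because the
# proxy (zip code) and the protected attribute move together.
for zip_code, group, _ in applicants[::3]:
    print(f"zip {zip_code} (mostly group {group}): risk {risk_by_zip[zip_code]:.2f}")
```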
O'Neil, who has a PhD in mathematics from Harvard, has done stints in academia, at a hedge fund during the financial crisis and as a data scientist at a startup. It was there -- in conjunction with work she was doing with Occupy Wall Street -- that she became disillusioned by how people were using data.
"I worried
about the separation between technical models and real people, and about the
moral repercussions of that separation," O'Neill writes.
She started blogging -- at mathbabe.org -- about her frustrations, which eventually turned into "Weapons of Math Destruction."
One of the book's most compelling sections is on "recidivism models." For years, criminal sentencing was inconsistent and biased against minorities. So some states started using recidivism models to guide sentencing. These take into account things like prior convictions, where you live, drug and alcohol use, previous police encounters, and criminal records of friends and family.
These scores are then used to determine sentencing.
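As a rough sketch of the mechanics (the features and weights below are hypothetical, invented for this example; real instruments keep theirs proprietary), a recidivism-style score is just a weighted sum, and the opacity problem is plain in the code itself: the defendant never sees the weights.

```python
# A toy recidivism-style score. All weights and features here are
# hypothetical, invented for illustration; they are not taken from
# any real scoring instrument.

FEATURE_WEIGHTS = {
    "prior_convictions": 2.0,        # about the defendant's own acts
    "police_encounters": 1.0,        # may just reflect where they live
    "high_crime_zip": 1.5,           # proxy for neighborhood, hence race
    "family_criminal_records": 1.5,  # punishes who they're related to
}

def risk_score(defendant: dict) -> float:
    """Weighted sum of features -- opaque to the person being scored."""
    return sum(w * defendant.get(f, 0) for f, w in FEATURE_WEIGHTS.items())

# Two defendants with identical conduct but different surroundings:
same_conduct = {"prior_convictions": 1}
poor_neighborhood = {**same_conduct, "high_crime_zip": 1,
                     "police_encounters": 2, "family_criminal_records": 1}

print(risk_score(same_conduct))       # 2.0
print(risk_score(poor_neighborhood))  # 7.0 -- same acts, harsher score
```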
"This is
unjust," O'Neil writes. "Indeed, if a prosecutor attempted to tar a
defendant by mentioning his brother's criminal record or the high crime rate in
his neighborhood, a decent defense attorney would roar, 'Objection, Your
Honor!'"
But in this case, the person is unlikely to know the mix of factors that influenced his or her sentencing -- and has absolutely no recourse to contest them.
Or consider the fact that nearly half of U.S. employers ask potential hires for their credit report, equating a good credit score with responsibility or trustworthiness.
This "creates
a dangerous poverty cycle," O'Neil writes. "If you can't get a job
because of your credit record, that record will likely get worse, making it
even harder to work."
This cycle falls along racial lines, she argues, given the wealth gap between black and white households. This means African Americans have less of a cushion to fall back on and are more likely to see their credit slip.
And yet employers see a credit report as data rich and superior to human judgment -- never questioning the assumptions that get baked in.
In a vacuum, these models are bad enough, but O'Neil emphasizes, "they're feeding on each other." Education, job prospects, debt and incarceration are all connected, and the way big data is used tends to keep them that way.
"Poor people
are more likely to have bad credit and live in high-crime neighborhoods,
surrounded by other poor people," she writes. "Once ... WMDs digest
that data, it showers them with subprime loans or for-profit schools. It sends
more police to arrest them and when they're convicted it sentences them to
longer terms."
In turn, a new set of WMDs uses this data to charge higher rates for mortgages, loans and insurance.
So, you see, it's easy to be discouraged.
And yet O'Neil is hopeful, because people are starting to pay attention. There's a growing community of lawyers, sociologists and statisticians committed to finding places where data is used for harm and figuring out how to fix it.
She's optimistic that laws like HIPAA and the Americans with Disabilities Act will be modernized to cover and protect more of your personal data, that regulators like the CFPB and FTC will increase their monitoring, and that there will be standardized transparency requirements.
And then there's the fact that these models actually have so much potential.
Imagine if you used recidivism models to provide at-risk inmates with counseling and job training while in prison. Or if police doubled down on foot patrols in high-crime zip codes -- working to build relationships with the community instead of arresting people for minor offenses.
You might notice there's a human element to these solutions. That, really, is the key. Algorithms can inform, illuminate and supplement our decisions and policies. But to get not-evil results, humans and data have to work together.
"Big Data
processes codify the past," O'Neil writes. "They do not invent the
future. Doing that requires moral imagination, and that's something only humans
can provide."