Algorithmic Bias: When Bigotry, Sexism, & Discrimination Become Code

by Morgan Stonelake

We are in a battle against an invisible assailant: algorithms. Algorithms are everywhere, judging and sorting us in a manner even more invasive and devastating to our lives than any other kind of surveillance technology, because of just how surreptitiously they operate. The scariest thing about all of this is how trusting we are of technology. “It’s really different from what you think most people think of algorithms. They think algorithms are objective and true and scientific. That’s a marketing trick. It’s also a marketing trick to intimidate you with algorithms, to make you trust and fear algorithms because you trust and fear mathematics.” (O’Neil, 2017) Yet as it stands, when you apply for a job, apply to a university, or seek almost any service or product, you typically pass through a layer of technology first. You apply online, or search through a website, or submit an application into a pool of other online applicants that is screened, sorted, and filtered by unseen algorithms. We are fighting an unseen enemy just to reach the end goal.

algorithms: the invisible enemy

“We’re being scored with secret formulas that we don’t understand. That doesn’t have a system of appeal.” (O’Neil, 2017) Companies can now write code to target their ideal demographics and completely ignore and discard the people who do not match their definition of success. This is well beyond troubling; it infringes on the promise of equality in our democracy.

What? I don’t see color. I only enforce my creators’ bias.

“Algorithms are everywhere. They sort and separate the winners from the losers. The winners get the job, or a good credit card offer. The losers don’t even get an interview, or they pay more for insurance.” (O’Neil, 2017) The issues are far-reaching and have the potential to devastate people’s entire lives, barring them from success based on factors as innocuous as their zip code, gender, or race; the toy screening sketch below shows how little code that takes.

the problem is, picking only ‘the winners’ could create an even greater wealth disparity
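To make this sorting mechanism less abstract, here is a minimal, hypothetical sketch in Python. The zip codes, weights, and cutoff are all invented for illustration and are not drawn from any real hiring system; the point is that two applicants with identical qualifications can get different outcomes because one of them lives in the “wrong” zip code.

```python
# Hypothetical applicant screen: a toy illustration of how a "neutral"
# scoring rule can quietly encode bias. Every zip code, weight, and
# cutoff below is invented for illustration only.

PREFERRED_ZIPS = {"97301", "97361"}  # zip codes of past "successful" hires

def screen(applicant: dict) -> bool:
    """Return True if the applicant advances to a human reviewer."""
    score = 0
    score += 2 if applicant["zip"] in PREFERRED_ZIPS else 0  # proxy for race/income
    score += 1 if applicant["employment_gap_months"] < 6 else 0
    score += 1 if applicant["has_degree"] else 0
    return score >= 3  # everyone below the cutoff is never seen by a person

applicants = [
    {"name": "A", "zip": "97301", "employment_gap_months": 2, "has_degree": True},
    {"name": "B", "zip": "97113", "employment_gap_months": 2, "has_degree": True},
]
for a in applicants:
    print(a["name"], "advances" if screen(a) else "rejected, with no appeal")
```

Nothing in the rule mentions race or income, yet the zip-code weight quietly rewards living where past “successful” hires lived, which is exactly how a proxy smuggles bias into an ostensibly neutral score.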

What’s really going on here? “Data laundering. It’s a process by which technologists hide ugly truths inside black box algorithms and call them objective; call them meritocratic.” (O’Neil, 2017) What is just or fair about abusing technology in this way? It’s as if institutions have chosen to use algorithms to interview us without our permission or even our knowledge, and unfortunately, many people are denied from the start because of factors beyond their control. “They call it their ‘secret sauce’ and that’s why they can’t tell us about it. It’s also private power. They are profiting for wielding the authority of the inscrutable.” (O’Neil, 2017) I find it morally and ethically bankrupt on the part of these institutions to use algorithms to enforce bigotry, sexism, and discrimination before a person even has the opportunity to speak for themselves.

algorithms operate in secret and are used to deny certain candidates from the outset, all without our knowing

In other words, algorithms are allowing people and institutions to enforce their own biases behind the mask of impartial technology. This is discrimination and bigotry at its finest. Who can we really blame when the algorithms are blindly trusted by our justice system, our schools, and our places of employment, yet remain hidden and unintelligible to the layperson? How does one demand justice? Whom do we point to? The scope of corruption is unimaginable, but it’s already here.

For example, The New York Post obtained and published the ratings a performance-rating algorithm had assigned to 665 teachers, printing the teachers’ names and scores in an act of public teacher-shaming, and 205 teachers were subsequently fired.

Gary Rubinstein and the complete dumpster fire that is the Value-Added Model for assessing teachers

Yet these ratings were later examined by an expert, Gary Rubinstein, and found to be entirely nonsensical, resulting in the loss of good teachers and public shaming that should never have occurred. This happened in 2011. This is the algorithmic dystopia we are already living in. “Algorithms can go wrong, even have deeply destructive effects with good intentions. And whereas an airplane that’s designed badly crashes to the earth and everyone sees it, an algorithm designed badly can go on for a long time, silently wreaking havoc.” (O’Neil, 2017)

The problems of algorithmic bias grow even more threatening and terrifying when we look at how radically they perpetuate racial profiling and longer sentences for BIPOC and other minorities in the justice system. “Across the US, police departments are starting to use facial recognition software in their crime-fighting arsenal. They are also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison.” (Buolamwini, 2017)

This blind faith in bias-reinforcing algorithms, however, is already affecting the outcomes of people’s lives, and without real cause. It is a corruption of justice and of the concept of innocent until proven guilty. “Algorithms don’t make things fair. They repeat our past practices, our patterns. They automate the status quo.” (O’Neil, 2017) If the past criminal data collected by US police disproportionately targets minorities, then algorithms trained on that skewed data will produce harsher sentences for minorities than for non-BIPOC defendants, as the sketch below illustrates.

garbage in, garbage out
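To see the “garbage in, garbage out” dynamic in code, here is a deliberately tiny sketch in which every number is invented: the two neighborhoods behave identically, but one is policed twice as heavily, and a naive risk score trained on the resulting arrest records concludes that neighborhood is twice as risky.

```python
# "Garbage in, garbage out": a toy sketch (every number invented) of how a
# risk model trained on skewed arrest records reproduces that skew.

# Suppose two neighborhoods have the SAME underlying offense rate, but
# neighborhood A is patrolled twice as heavily, so twice as many offenses
# there end up in the historical data the model learns from.
recorded_arrests = {"A": 200, "B": 100}   # what the data shows, not what happened
population = {"A": 1000, "B": 1000}       # identical populations

# A naive "risk score" learned purely from the recorded arrest rate.
risk_score = {hood: arrests / population[hood]
              for hood, arrests in recorded_arrests.items()}

print(risk_score)  # {'A': 0.2, 'B': 0.1} -- A now looks "twice as risky"

# Acting on the score sends more policing (and harsher sentences) to A,
# which produces even more recorded arrests there, so the next round of
# training data is more skewed still: a feedback loop, not a prediction.
```

The model never sees race or neighborhood history directly; it only sees the arrests we chose to record, and it faithfully automates that choice.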

“The news organization ProPublica recently looked into one of those “recidivism risk” algorithms, as they’re called, being used in Florida during sentencing by judges. Bernard, on the left, the black man, was scored a 10 out of 10. Dylan, on the right, 3 out of 10. 10 out of 10, high risk. 3 out of 10, low risk. They were both brought in for drug possession. They both had records, but Dylan had a felony but Bernard didn’t. This matters, because the higher score you are, the more likely you’re being given a longer sentence.” (O’Neil, 2017)

Bernard and Dylan, two lives, two very different futures, all thanks to biased algorithms

Because Bernard is a POC, a black man, the algorithm, working from skewed data, views him as a greater risk than the Caucasian male with a prior felony, simply because of racial factors. Our justice system failed Bernard because of a blind reliance on a bias-enforcing algorithm. I am appalled. Disgusted. I feel such tremendous resignation and powerlessness. It is already here and being used to destroy lives, and for what? I can’t even imagine how these judges rely on this technology more heavily than on their own common sense, judgment, and humanity. It’s as if people are outsourcing common sense to machines, and the machines are failing miserably because they have already absorbed our biases. I hate it here.

I have no moral compass, surely I know what is best for this human life

I think the most disturbing matter in all of this is the overall vacuum of ethical governance over algorithms and their widespread use. Legislation to audit the efficacy and ethics of algorithms is severely lacking in our society, and the results are already devastating lives. Algorithms must be audited, and the data scientists, the companies writing the code, and the government agencies deploying them en masse and largely in secret must face legislative reconstruction and judicial consequences for discrimination based on gender and race. “This is not a math test, this is a political fight. We need to demand accountability of our algorithmic overlords. The era of blind faith in big data must end.” (O’Neil, 2017) If the public could better audit these companies and demand ethical treatment, the companies would have to make reparations. I think of the work Joy Buolamwini is doing to give everyday people a space to point out the injustices of algorithmic bias. “On codedgaze.com, you can report bias, request audits, become a tester, and join the ongoing conversation. #codedgaze” (Buolamwini, 2017)

Joy Buolamwini & the Algorithmic Justice League

She inspires me to feel less hopeless and shows us that we can fight back by auditing the companies that use these dangerous algorithms; a minimal sketch of what one such audit might look like follows below. “To get the ‘incoding’ movement started I’ve launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze.” (Buolamwini, 2017) Although these algorithms are currently in use everywhere, I still want to remain hopeful that change will come and that we can make a difference.
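For a concrete sense of what “auditing” an algorithm can mean in practice, here is a minimal sketch of a disparate-impact check based on the widely used four-fifths rule. The groups and numbers are invented, and a real audit would examine far more than a single ratio, but it shows that the basic question, who gets selected and at what rate, is not hard to ask once the data is on the table.

```python
# A minimal sketch of one kind of algorithmic audit: a disparate-impact
# check using the "four-fifths rule." The outcome counts below are
# invented for illustration; a real audit would use a system's actual
# decision records.

outcomes = {
    # group: (applicants scored by the algorithm, applicants it approved)
    "group_x": (500, 300),
    "group_y": (500, 180),
}

selection_rates = {
    group: approved / scored for group, (scored, approved) in outcomes.items()
}
best_rate = max(selection_rates.values())

for group, rate in selection_rates.items():
    ratio = rate / best_rate
    flag = "OK" if ratio >= 0.8 else "POSSIBLE DISPARATE IMPACT"
    print(f"{group}: selection rate {rate:.0%}, ratio to best {ratio:.2f} -> {flag}")
```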

Works Cited

Buolamwini, Joy. “How I’m Fighting Bias in Algorithms.” YouTube, TEDtalksDirector, 29 Mar. 2017, www.youtube.com/watch?v=UG_X_7g63rY.

Hauser, Robin. “Can We Protect AI from Our Biases?” YouTube, TEDInstitute, 12 Feb. 2018, www.youtube.com/watch?v=eV_tx4ngVT0.

O’Neil, Cathy. “The Era of Blind Faith in Big Data Must End.” YouTube, TEDtalksDirector, 7 Sept. 2017, www.youtube.com/watch?v=_2u_eHHzRto.

Sharma, Kriti. “How to Keep Human Bias out of AI.” YouTube, TEDtalksDirector, 12 Apr. 2019, www.youtube.com/watch?v=BRRNeBKwvNM.

Wachter-Boettcher, Sara. Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech. W.W. Norton & Company, 2018.
