
Fast forward to today and my community is experiencing unprecedented levels of Islamophobia that are normalized in media, policy, and culture, not only in the U.S. but globally. (Photo: SocialJusticeSeeker812/cc/flickr)

Algorithmic Bias: How Algorithms Can Limit Economic Opportunity for Communities of Color

As a young person of color who relies heavily on technology, I worry about the ways inequality is becoming automated, normalized and worsened through computerized decision-making systems

Haleema Bharoocha

Racial and gender bias in algorithms impacts communities of color in disproportionate and frightening ways. The urgency of addressing this issue cannot be stressed enough. Algorithmic bias goes beyond privacy and surveillance issues that we have seen in the past. Biased algorithms can determine my access to healthcare, jobs, loans, and broadly, economic opportunity — and yours, too.

As a young person of color who relies heavily on technology, I worry about the ways inequality is becoming automated, normalized, and worsened through computerized decision-making systems. I have first-hand experience of what happens when we leave discrimination unchecked. As a Muslim raised post-9/11, I have seen Islamophobia continue to increase. I watched as Mohammed became Mo, as aunties took off their hijabs, as my community did all but shed their skin to hide their Muslim identity.

Fast forward to today and my community is experiencing unprecedented levels of Islamophobia that are normalized in media, policy, and culture, not only in the U.S. but globally. The silence around the genocide of Rohingya Muslims in Burma and the extermination of Uyghur Muslims in China speaks volumes about how normalized the dehumanization of Muslims has become, a dehumanization enabled by platforms like Facebook, which allowed hate-based propaganda to spread. So I of all people understand why addressing algorithmic bias is a matter of urgency. We must include this analysis in our fight for equity and justice before automated inequality becomes the new status quo.

Let’s focus on the hiring process as an example of what algorithmic bias looks like. To begin with, an algorithm is a process that helps solve a problem. Think of it as a formula: you plug in datasets and methods, and it produces results. The simplest algorithms are written based just on the intuitions of the programmer, but many in practice also rely on big data and artificial intelligence, which use datasets in combination with programmer instructions to shape algorithms into more finely tuned formulas. Though generally seen as objective, computerized decision-making systems contain both human bias and dataset bias. They are, after all, created by humans who bring their own biases, which can affect the way the program is structured and the information that’s fed into it.
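To make this concrete, here is a minimal sketch, in Python, of the kind of keyword "formula" described above. Everything in it is hypothetical: the keywords, resumes, and names are invented for illustration, and it is not any real vendor's software. The point is simply that the programmer's choices are the algorithm.

```python
# A minimal, hypothetical sketch of a rule-based screening "formula".
# The programmer's chosen keywords ARE the algorithm here, so any bias
# in those choices is baked directly into the ranking. All names and
# resumes are invented for illustration.

JOB_KEYWORDS = {"fearless", "enforcement", "ninja"}  # programmer-chosen

def score_resume(resume_text: str) -> int:
    """Count how many of the chosen keywords appear in a resume."""
    words = set(resume_text.lower().split())
    return len(JOB_KEYWORDS & words)

applicants = {
    "Applicant A": "fearless enforcement driven sales ninja",
    "Applicant B": "collaborative experienced community organizer",
}

# Rank applicants by keyword score; candidates who don't happen to use
# the programmer's vocabulary sink to the bottom, regardless of skill.
ranked = sorted(applicants, key=lambda name: score_resume(applicants[name]),
                reverse=True)
print(ranked)  # ['Applicant A', 'Applicant B']
```

Notice that nothing in this sketch checks qualifications; whoever happens to echo the chosen vocabulary wins.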

Hiring software contains algorithmic bias, limiting economic opportunity for people of color and marginalized genders. When you have 1,000 applicants for a job, using an algorithm to pick out your top 20 candidates to interview solves a problem of capacity and time for an organization. For this algorithm to pick the best applicants, you plug in resumes of successful hires at the organization, past hiring history, and keywords that match the job description. Here is where the “isms” start to show up.

Let’s use Uber as an example. In 2017, Uber’s technical leadership was entirely White and Asian, and 88.7 percent male. This means successful hires were White or Asian men, and the hiring history dataset was made up only of this group. Keywords may also carry bias: Textio, a company that helps create gender-neutral language in job descriptions, has found that words like “enforcement” or “fearless” tend to attract male applicants. Based on this data, the hiring algorithm will likely pick White and Asian men as the top 20 candidates, taking economic opportunity away from qualified diverse candidates. These are just two examples of how an algorithm can contain bias in the hiring process. The bias that led the hiring process to select only White and Asian men is now embedded in the algorithm, automating this cycle of discrimination.
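The feedback loop is easy to see in miniature. Below is another hypothetical sketch, again in Python and again with invented data, of what "learning" from a homogeneous set of past hires can do: whatever vocabulary the past group happened to share becomes the working definition of a good candidate.

```python
from collections import Counter

# Hypothetical sketch of a hiring algorithm "trained" on past hires.
# The resumes below are invented; if the historical hires are a
# homogeneous group, the traits that group shares become the model.

past_hires = [
    "fearless rockstar engineer fraternity chess",
    "fearless engineer lacrosse fraternity",
    "rockstar engineer fearless chess",
]

# Weight each word by how often it appears among past successful hires.
weights = Counter(word for resume in past_hires for word in resume.split())

def score(resume_text: str) -> int:
    """Score a new applicant by how closely they echo past hires."""
    return sum(weights[word] for word in resume_text.split())

candidates = {
    "Candidate A": "fearless rockstar engineer fraternity",    # mirrors past hires
    "Candidate B": "dedicated engineer community mentorship",  # equally qualified
}

for name, resume in candidates.items():
    print(name, score(resume))
# Candidate A far outscores Candidate B: the historical hiring
# pattern, not merit, drives the ranking. The bias is now automated.
```

A real system is far more complex than this, but the dynamic is the same: the model optimizes for resemblance to whoever was hired before.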

As someone who is currently applying for jobs, I worry that I may not even get an interview, despite my qualifications, because of this bias. I found there are hacks you can use to improve your chances of getting past resume-reading software, through tools like Bloc and Job Scan. To learn more about algorithmic bias in the hiring process, read this report by Upturn.

To address this issue at its root, community organizers, policy advocates, nonprofit professionals, and youth need to understand the impact of algorithmic bias based on race, gender, or other factors. Our communities must mobilize to create solutions that are by us and for us — soon. Companies like Google still lag on creating long-term solutions that address the root causes of these issues. We need a grassroots #PeoplePowered movement to bring #TechEquity, before these supposedly “objective” systems normalize and worsen discrimination.

Follow us on Twitter for more updates and stay tuned for our official report on #AlgorithmicBias. Have ideas or want training for how your organization can advance #TechEquity? Email haleemab@greenlining.org.


Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.

Haleema Bharoocha

Haleema Bharoocha is Technology Equity Fellow at The Greenlining Institute.
