
Fast forward to today and my community is experiencing unprecedented levels of Islamophobia that are normalized in media, policy, and culture, not only in the U.S. but globally. (Photo: SocialJusticeSeeker812/cc/flickr)

Algorithmic Bias: How Algorithms Can Limit Economic Opportunity for Communities of Color

As a young person of color who relies heavily on technology, I worry about the ways inequality is becoming automated, normalized, and worsened through computerized decision-making systems

Haleema Bharoocha

Racial and gender bias in algorithms impacts communities of color in disproportionate and frightening ways. The urgency of addressing this issue cannot be stressed enough. Algorithmic bias goes beyond the privacy and surveillance issues we have seen in the past. Biased algorithms can determine my access to healthcare, jobs, loans, and, more broadly, economic opportunity. Yours, too.

As a young person of color who relies heavily on technology, I worry about the ways inequality is becoming automated, normalized, and worsened through computerized decision-making systems. I have first-hand experience of what happens when we leave discrimination unchecked. As a Muslim raised post-9/11, I have seen Islamophobia continue to increase. I watched as Mohammed became Mo, as aunties took off their hijabs, as my community did all but shed its skin to hide its Muslim identity.

My community is experiencing unprecedented levels of Islamophobia that are normalized in media, policy, and culture, not only in the U.S. but globally.

Fast forward to today, and my community is experiencing unprecedented levels of Islamophobia that are normalized in media, policy, and culture, not only in the U.S. but globally. The silence around the genocide of Rohingya Muslims in Burma and the extermination of Uyghur Muslims in China speaks volumes about how normalized the dehumanization of Muslims has become, a dehumanization abetted by platforms like Facebook, which enabled hate-based propaganda to spread. So I, of all people, understand why addressing algorithmic bias is a matter of urgency. We must include this analysis in our fight for equity and justice before automated inequality becomes the new status quo.

Let’s focus on the hiring process as an example of what algorithmic bias looks like. To begin with, an algorithm is a process that helps solve a problem. Think of it as a formula into which you plug datasets and methods to produce results. The simplest algorithms are written based solely on the intuitions of the programmer, but many in practice also rely on big data and artificial intelligence, which combine datasets with programmer instructions to shape algorithms into more finely tuned formulas. Though generally seen as objective, computerized decision-making systems contain human and dataset bias. They are, after all, created by humans who bring their own biases, and those biases can shape both the way the program is structured and the information that’s fed into it.
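To make this concrete, here is a minimal sketch, in Python, of the simplest kind of screening formula: rank resumes by keyword overlap with a job description. The keyword list, applicant names, and resume text are all invented for illustration; this is not any real hiring product’s code.

```python
# A minimal, hypothetical keyword-based resume screener.
# All keywords, names, and resume text are invented for illustration.

JOB_KEYWORDS = {"python", "sql", "leadership", "fearless", "enforcement"}

def score_resume(resume_text: str) -> int:
    """Count how many job-description keywords appear in a resume."""
    return len(JOB_KEYWORDS & set(resume_text.lower().split()))

resumes = {
    "applicant_a": "fearless leadership in enforcement of sql standards",
    "applicant_b": "collaborative python developer with sql experience",
}

# Rank applicants by keyword overlap. The choice of keywords is a human
# decision: words like "fearless" or "enforcement" can quietly encode
# gendered bias into an apparently neutral formula.
for name in sorted(resumes, key=lambda n: score_resume(resumes[n]), reverse=True):
    print(name, score_resume(resumes[name]))
```

Nothing in the code “intends” to discriminate; the bias lives entirely in the inputs a human chose.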

Hiring software contains algorithmic bias, limiting economic opportunity for people of color and marginalized genders. When you have 1,000 applicants for a job, using an algorithm to pick out your top 20 candidates to interview solves a problem of capacity and time for an organization. For this algorithm to pick the best applicants, you plug in resumes of successful hires at the organization, past hiring history, and keywords that match the job description. Here is where the “isms” start to show up.

In 2017, Uber’s technical leadership was entirely White and Asian, and 88.7 percent male. This means successful hires were White or Asian male and the hiring history dataset was only made up of this group.

Let’s use Uber as an example. In 2017, Uber’s technical leadership was entirely White and Asian, and 88.7 percent male. This means successful hires were White or Asian men, and the hiring-history dataset was made up only of this group. Keywords may also include bias. Textio, a company that helps create gender-neutral language in job descriptions, shows that words like “enforcement” or “fearless” tend to attract male applicants. Based on this data, the hiring algorithm will likely pick White and Asian men as the top 20 candidates, taking economic opportunity away from qualified diverse candidates. These are just two examples of how an algorithm can contain bias in the hiring process. The bias that led the hiring process to select only White and Asian men for the job is now embedded into the algorithm, automating this cycle of discrimination.
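Here is a toy sketch of that feedback loop, again with entirely invented data: if every resume in the “successful hire” training set comes from one homogeneous group, a similarity-based screener will rank a candidate who resembles that group above an equally qualified outsider, even on features that have nothing to do with the job.

```python
# Hypothetical illustration of how training on a homogeneous hiring
# history reproduces that homogeneity. All data is invented.

# Feature sets drawn from resumes of past "successful hires" -- note
# they share proxies (school, hobby) unrelated to job performance.
past_hires = [
    {"python", "stanford", "lacrosse"},
    {"python", "stanford", "fraternity"},
    {"java", "stanford", "lacrosse"},
]

def similarity(candidate: set) -> float:
    """Average feature overlap with past hires (a crude nearest-neighbor stand-in)."""
    return sum(
        len(candidate & hire) / len(candidate | hire) for hire in past_hires
    ) / len(past_hires)

candidates = {
    "resembles_past_hires": {"python", "stanford", "lacrosse"},
    "equally_qualified_outsider": {"python", "state_school", "debate_club"},
}

# Both candidates know Python; the "outsider" scores lower only because
# their school and hobbies don't match the historical hires.
for name, features in sorted(candidates.items(), key=lambda kv: -similarity(kv[1])):
    print(f"{name}: {similarity(features):.2f}")
```

The model never sees race or gender directly; it only needs proxies correlated with the people an organization already hired.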

As someone who is currently applying for jobs, I worry that I may not even get an interview, despite my qualifications, because of this bias. I found there are hacks you can use to improve your chances of getting past resume-reading software, through tools like Bloc and Job Scan. To learn more about algorithmic bias in the hiring process, read this report by Upturn.

In order to address this issue at its root, community organizers, policy advocates, nonprofit professionals, and youth need to understand the impact of algorithmic bias based on race, gender, or other factors. Our communities must mobilize to create solutions that are by us and for us — soon. Companies like Google still lag on creating long-term solutions that address the root cause of these issues. We need a grassroots #PeoplePowered movement to bring #TechEquity, before these supposedly “objective” systems normalize and worsen discrimination.

Follow us on Twitter for more updates and stay tuned for our official report on #AlgorithmicBias. Have ideas, or want training on how your organization can advance #TechEquity? Email haleemab@greenlining.org.


Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.

Haleema Bharoocha

Haleema Bharoocha is Technology Equity Fellow at The Greenlining Institute.
