
Fast forward to today and my community is experiencing unprecedented levels of Islamophobia that are normalized in media, policy, and culture, not only in the U.S. but globally. (Photo: SocialJusticeSeeker812/cc/flickr)
Racial and gender bias in algorithms impacts communities of color in disproportionate and frightening ways. The urgency of addressing this issue cannot be stressed enough. Algorithmic bias goes beyond privacy and surveillance issues that we have seen in the past. Biased algorithms can determine my access to healthcare, jobs, loans, and broadly, economic opportunity -- and yours, too.
As a young person of color who relies heavily on technology, I worry about the ways inequality is becoming automated, normalized and worsened through computerized decision-making systems. I have first-hand experience of what happens when we leave discrimination unchecked. As a Muslim raised post 9/11, I have seen Islamophobia continue to increase. I watched as Mohammed became Mo, as aunties took off their hijabs, as my community did all but shed their skin to hide their Muslim identity.
My community is experiencing unprecedented levels of Islamophobia that are normalized in media, policy, and culture, not only in the U.S. but globally.
Fast forward to today and my community is experiencing unprecedented levels of Islamophobia that are normalized in media, policy, and culture, not only in the U.S. but globally. The silence around the genocide of Rohingya Muslims in Burma and the extermination of Uyghur Muslims in China speaks volumes about how normalized the dehumanization of Muslims has become, a dehumanization amplified by platforms like Facebook, which enabled hate-based propaganda to spread. So I of all people understand why addressing algorithmic bias is a matter of urgency. We must include this analysis in our fight for equity and justice before automated inequality becomes the new status quo.
Let's focus on the hiring process as an example to expand on what algorithmic bias looks like. To begin with, an algorithm is a process that helps solve a problem. Think of it as a formula where you plug in datasets and methods to show results. The simplest algorithms are written based just on the intuitions of the programmer, but many in practice also rely on big data and artificial intelligence. Big data and artificial intelligence use datasets in combination with programmer instructions to shape algorithms into more finely tuned formulas. Though generally seen as objective, computerized decision-making systems contain human and dataset bias. They are, after all, created by humans who bring their own biases, which can affect both the way the program is structured and the information that's fed into it.
Hiring software contains algorithmic bias, limiting economic opportunity for people of color and marginalized genders. When you have 1,000 applicants for a job, using an algorithm to pick out your top 20 candidates to interview solves an issue of capacity and time for an organization. For this algorithm to pick the best applicants, you plug in resumes of successful hires at the organization, past hiring history, and keywords that match the job description. Here is where the "isms" start to show up.
In 2017, Uber's technical leadership was entirely White and Asian, and 88.7 percent male. This means successful hires were White or Asian male and the hiring history dataset was only made up of this group.
Let's use Uber as an example. In 2017, Uber's technical leadership was entirely White and Asian, and 88.7 percent male. This means successful hires were White or Asian men, and the hiring history dataset was made up only of this group. Keywords may also include bias. Textio, a company that helps create gender-neutral language in job descriptions, shows that words like "enforcement" or "fearless" tend to attract male applicants. Based on this data, the hiring algorithm will likely pick White and Asian men as the top 20 candidates, taking away economic opportunity from qualified diverse candidates. These are just two examples of how an algorithm can contain bias in the hiring process. The bias that led the hiring process to select only White and Asian men for the job is now embedded into the algorithm, automating this cycle of discrimination.
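To make this concrete, here is a minimal, hypothetical sketch of how a naive resume screener can inherit bias from historical hiring data. All names, resumes, and keywords below are illustrative assumptions, not real data or any company's actual system; the point is only to show the mechanism.

```python
# Hypothetical sketch: a keyword-matching resume screener trained on a
# homogeneous hiring history. All data here is invented for illustration.
from collections import Counter

# Past "successful hires" whose resumes share certain language
# (e.g., Textio found that words like "fearless" skew male).
past_hire_resumes = [
    "fearless engineer enforcement ninja rockstar",
    "fearless developer enforcement dominant driven",
]

# Build a keyword profile from the historical data.
profile = Counter()
for resume in past_hire_resumes:
    profile.update(resume.split())

def score(resume: str) -> int:
    """Score a candidate by overlap with the past-hire keyword profile."""
    return sum(profile[word] for word in resume.split())

applicants = {
    "candidate_a": "fearless enforcement engineer",      # mirrors past hires
    "candidate_b": "collaborative empathetic engineer",  # equally qualified
}

# Rank applicants: candidate_a wins purely on inherited language,
# not on qualifications -- the historical bias is now automated.
ranked = sorted(applicants, key=lambda name: score(applicants[name]), reverse=True)
print(ranked)  # ['candidate_a', 'candidate_b']
```

Real screening systems are far more complex, but the failure mode is the same: when "best candidate" is defined by similarity to past hires, the demographics of past hires become the hidden selection criterion.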
As someone who is currently applying for jobs, I worry that, despite my qualifications, I may not even get an interview due to this bias. I found there are hacks you can use to improve your chances of passing resume-reading software through tools like Bloc and Job Scan. To learn more about algorithmic bias in the hiring process, read this report by Upturn.
In order to address this issue at its root, community organizers, policy advocates, nonprofit professionals, and youth need to understand the impact of algorithmic bias based on race, gender, or other factors. Our communities must mobilize to create solutions that are by us and for us -- soon. Companies like Google still lag on creating long-term solutions that address the root cause of these issues. We need a grassroots #PeoplePowered movement to bring #TechEquity, before these supposedly "objective" systems normalize and worsen discrimination.
Follow us on Twitter for more updates and stay tuned for our official report on #AlgorithmicBias. Have ideas or want training for how your organization can advance #TechEquity? Email haleemab@greenlining.org.