Using AI to Promote Diversity in the Tech Industry

The U.S. technology industry has historically been dominated by men, specifically white and Asian men. Despite growing interest from women and Black, Indigenous, and People of Color (BIPOC), these diversity gaps haven’t closed much, and they may actually be getting worse.

Workplace diversity isn’t just a matter of altruism, though that’s hardly a bad motivation, and on its own it can make for a more engaged workforce with a sense of mission. Diversity is also good business. Companies with diverse management see about 19 percent more revenue than companies with homogeneous management teams. And the effect extends below management too: diverse employee teams produce more innovative and creative solutions through the exchange of ideas. It quite literally pays for companies to prioritize diversity.

There’s been a recent push from leaders in the tech industry to get broader representation in open roles, so why isn’t that reflected in hiring trends? In this article, we’ll lay out some of the issues contributing to workplace segregation and a few ways artificial intelligence (AI) can help combat them.


Workplace segregation and AI’s influence


    Major causes of workplace segregation


    Most of the time, workplace segregation isn’t intentional. It’s a product of subconscious biases and a tendency for organizations to hire people who look and act like their current employees. Here are a few of the major causes of workplace segregation.

    More new businesses

    With the recent emphasis on supporting small and local businesses, the number of new businesses is on the rise. While this is a positive trend, small businesses tend to be the biggest perpetrators of workplace segregation. They often rely heavily on referrals to staff their organizations, which leads to little diversity in the workplace. Additionally, small businesses usually aren’t subject to the same workplace discrimination laws as large firms, making diversity less of a priority.

    Giving referred applicants preferential treatment

    Most companies prefer to hire people their employees have referred because referred hires are easier to onboard and tend to stay with the company longer. However, this means referred applicants usually get preferential treatment, even when someone applying from a job board may be a better fit. Employees have limited networks, and those networks may not be very diverse. If that’s the case, you’re severely limiting your chances of hiring someone with a different background or outlook.

    Biased requirements

    It’s possible that your job descriptions contain biased requirements that are driving away great candidates. If you’re adding lifting requirements to job descriptions for roles that won’t actually require lifting heavy objects, you may be driving away people with disabilities. Additionally, job descriptions, especially in male-dominated fields, often contain gendered language like “he”, “guys”, and “brotherhood”, which could deter women from applying or make them feel as if they wouldn’t be welcome if they did. And requiring a college degree or a personal vehicle when it’s not absolutely necessary can exclude people below a certain socioeconomic status. Unintentionally biased requirements like these can lead to homogenous workplaces.

    Even the application process itself can contain bias. For most jobs, applicants need access to a computer or smartphone to be able to apply, but not everyone has that. Text-to-apply applications are growing in popularity and cater to previously ignored segments of the population. If you’re looking to improve the diversity of your organization, these are some of the pitfalls to avoid.


    Subjective vs. objective hiring processes


    When looking to combat workplace segregation, it’s important to understand the difference between subjective and objective hiring processes.


    Subjective hiring processes

    The interview and resume analysis phases of hiring are inherently subjective. It’s nearly impossible to have a conversation with an applicant without letting some of your personal feelings and opinions color the decision. These are the stages where unconscious bias is most likely to creep in, and you have to be extra aware to ensure you’re deciding based on who will be the best fit for the job. This isn’t to say that subjective processes are bad; they just need to be scrutinized for potential bias.


    Objective hiring processes

    Standardized pre-employment testing tends to be more objective because it is free of gendered language and assesses whether candidates possess certain skills rather than judging personal characteristics. Additionally, hiring managers have little opportunity to inject their own biases: the test is administered automatically when a candidate applies, and the platform provides the hiring manager with a report based on the outcome. To remain truly objective, a testing platform might also assign each candidate an ID number and give the organization that number rather than a name, avoiding gender or racial bias.
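As a rough illustration of that kind of blind screening, here is a minimal Python sketch that strips identifying details from a candidate record and substitutes a random ID before a report reaches the hiring manager. The field names and ID scheme are hypothetical, not any particular platform's:

```python
import uuid

def anonymize_candidate(record):
    """Return a copy of a candidate record with identifying fields
    removed and a random candidate ID added, so reviewers see only scores."""
    anonymized = {k: v for k, v in record.items()
                  if k not in ("name", "email", "gender")}
    anonymized["candidate_id"] = uuid.uuid4().hex[:8]
    return anonymized

report = anonymize_candidate({
    "name": "Jordan Smith",
    "email": "jordan@example.com",
    "gender": "nonbinary",
    "skills_score": 87,
})
print(report)
```

A real platform would keep the ID-to-identity mapping in a separate, access-controlled store so a hire can be linked back to a person only after scoring is complete.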


    Bias within AI software itself


    AI can help organizations avoid some of the biases they face when hiring, but it can also carry implicit biases of its own that companies need to be aware of. Organizations have to train AI models before the models can provide useful insights.

    XOR Marketing VP Birch Faber explains, “Biases can manifest themselves in AI when the data that machines learn from is itself filled with bias.”

    For example, he notes that if a company uses AI to predict the performance of potential employees using data from its current workforce, it’s likely to run into problems. “The reason is that the composition of the company’s current workforce is likely due to some previous hiring bias,” he said. “Just because one type of person has been successful at a company doesn’t mean hiring that same type of person again and again will lead to continued success.”

    In fact, hiring the same type of people over and over can cause a company to stagnate due to a lack of new ideas. Workplace diversity is one of the best ways to ensure long-term success because it helps make companies more agile. Companies with a diverse team make better business decisions 87 percent of the time, according to a study by Cloverpop. And as population growth continues to slow in the U.S., diversity will be critical for a company’s future.


    Using AI to improve hiring diversity


    Despite some well-publicized issues with bias, AI can actually help take some of the bias out of the hiring process. “When used properly, AI can help to reduce hiring bias by creating a more accessible job application process and a completely blind screening and scheduling process,” says Faber. Companies can use these models to ensure they’re hiring the best employees, not just the ones who look like their current workforce.

    Removing non-inclusive language

    Artificial intelligence can help you remove gendered or non-inclusive language from your job descriptions to ensure you aren’t driving away quality candidates before they ever apply. Text analysis tools examine your job descriptions for gender bias, weak wording, and choppy language. They can even suggest alternate wording to help you attract more diverse candidates. One example of a text analysis tool is Gender Decoder, a free tool that examines text you input for gendered language. It’s easy to use and can help you quickly improve your job descriptions.
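To make the idea concrete, here is a minimal sketch of how such a tool might scan a posting. The word lists below are illustrative stand-ins, not Gender Decoder's actual lists:

```python
# Illustrative word lists only -- not Gender Decoder's actual lists.
MASCULINE_CODED = {"ninja", "rockstar", "dominant", "competitive",
                   "guys", "brotherhood", "aggressive"}
FEMININE_CODED = {"supportive", "collaborative", "nurturing", "interpersonal"}

def flag_gendered_language(job_description):
    """Return the masculine- and feminine-coded words found in a posting."""
    words = {w.strip(".,!?;:\"'").lower() for w in job_description.split()}
    return sorted(words & MASCULINE_CODED), sorted(words & FEMININE_CODED)

masc, fem = flag_gendered_language(
    "We need a competitive rockstar to join our brotherhood of engineers."
)
print(masc)  # ['brotherhood', 'competitive', 'rockstar']
```

A production tool would also suggest neutral replacements (for example, "team" instead of "brotherhood") rather than just flagging words.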

    Highlighting trends in each hiring stage

    Companies can also use AI to analyze their hiring process to see where bias might be creeping in. If a company attracts a large number of diverse candidates initially but those candidates quickly fall away after the first interview, it stands to reason that some of the interview questions may be biased. Recruiting tools with AI, like HireVue and XOR, can highlight these trends and help organizations determine which stages of the hiring process they need to focus on improving.
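The underlying analysis can be sketched simply: compute the pass-through rate for each group at each stage and look for stages where the rates diverge sharply. The stage names and numbers below are made up for illustration:

```python
def stage_pass_rates(funnel):
    """funnel: {stage: {group: (entered, advanced)}}.
    Returns the pass-through rate per group per stage, to spot stages
    where one group drops off much faster than another."""
    return {stage: {group: advanced / entered
                    for group, (entered, advanced) in groups.items()}
            for stage, groups in funnel.items()}

rates = stage_pass_rates({
    "resume_screen": {"group_a": (200, 100), "group_b": (200, 96)},
    "first_interview": {"group_a": (100, 60), "group_b": (96, 24)},
})
# The first interview shows a much larger gap (0.60 vs 0.25) than the
# resume screen, so that is the stage to scrutinize first.
print(rates["first_interview"])
```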

    Over the last few years, Deloitte has made a large effort to remove bias stemming from socioeconomic factors in its hiring process. Knowing that people in higher economic brackets typically get better opportunities early on, which usually leads to them attending better universities, Deloitte’s hiring team has hidden the names of colleges and universities on applications. Victoria Lawes, Deloitte’s former Head of Resourcing in the U.K., says they want to “ensure that whoever is recruiting isn’t consciously or unconsciously favoring a person who attended a certain school or university.”

    In addition to the school-blind testing, Deloitte has also partnered with a recruitment consulting company called Rare to provide context to applicants’ achievements. For example, if a candidate is the first in their family to attend college, they might get more “points” in the recruiting process for graduating with honors than someone who grew up wealthy. By combatting this socioeconomic bias, Deloitte has created a more diverse workforce and provided social mobility for a variety of people.
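Rare's actual methodology isn't public, but the idea of contextual credit can be sketched with a hypothetical multiplier: the same achievement earns more points for a candidate who faced greater barriers. All names and numbers here are invented for illustration:

```python
# Hypothetical contextual weighting: the same achievement earns more
# credit when it was accomplished against greater odds.
def contextual_score(base_points, context):
    """Boost an achievement's points for candidates facing extra barriers."""
    multiplier = 1.0
    if context.get("first_generation_student"):
        multiplier += 0.25
    if context.get("low_income_background"):
        multiplier += 0.25
    return base_points * multiplier

honors = 40  # base points for graduating with honors
print(contextual_score(honors, {"first_generation_student": True}))  # 50.0
```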

    Evaluating a larger talent pool

    Because AI doesn’t rely on employee referrals, it opens up a larger pool of talent for companies to pull from. And rather than slowing the hiring process down, it can automate the initial resume analysis to actually make the process easier.

    Explaining more about the blind screening process, Faber says, “Companies can also create an automated screening and scheduling process using AI so that factors like a candidate’s name, accent, or origin don’t disqualify a candidate early in the hiring process. Every candidate gets asked the same questions and is scored by the AI in the same exact fashion.”

    There is a caveat to this, however. “That said, it’s important to write your screening questions and state your required job qualifications carefully to avoid any bias,” he notes.
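Scoring every candidate in the same fashion can be as simple as applying one fixed rubric to everyone's answers. Here is a minimal sketch with a hypothetical keyword rubric; real platforms like XOR use far more sophisticated scoring:

```python
# Hypothetical rubric: every candidate answers the same questions and is
# scored by the same rules, regardless of who is reviewing.
RUBRIC = {
    "Describe your forklift experience": {"certified": 2, "years": 1},
    "What shifts can you work?": {"weekends": 1, "nights": 1},
}

def score_answers(answers):
    """Apply the identical keyword rubric to a candidate's answers."""
    total = 0
    for question, keywords in RUBRIC.items():
        answer = answers.get(question, "").lower()
        total += sum(pts for kw, pts in keywords.items() if kw in answer)
    return total

score = score_answers({
    "Describe your forklift experience": "Certified, with 3 years on the job.",
    "What shifts can you work?": "Nights and weekends.",
})
print(score)  # 5
```

As Faber's caveat suggests, a rubric like this is only as fair as the questions and keywords chosen to build it.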

    With AI-powered hiring platforms like XOR, companies have the ability to remove bias from the early stages of the hiring process. This means they’re going to interview more diverse candidates and will likely find that they make more diverse hires as a direct result.

    Where you look for talent also matters. Local colleges and universities are great for creating internships that you can eventually turn into long-term roles, but many of them yield candidates similar to the ones you already have. Consider partnering with historically Black colleges and universities (HBCUs) or multicultural career centers and professional groups to find more diverse talent.

    Additionally, Pink Jobs is an LGBT-friendly job board where you can post open roles, and Youth Villages can help you connect with young adults who have aged out of foster care. You might also consider partnering with the Transition Assistance Program (TAP) to connect with veterans looking to transition into civilian roles. Then use your AI-powered testing to evaluate for potential rather than experience.


    What you can do to complement AI’s efforts


    AI can’t fix workplace segregation all on its own. Your company will need to put in some work to complement AI’s efforts, and you can do this by making diversity a priority in your organization. Get your managers on board with the initiative by explaining the benefits that come with a more diverse workforce. Make discrimination and harassment training a requirement for every employee to ensure your entire team feels welcome and safe. Achieving a diverse workforce isn’t enough; you need to keep it, too.

    Read next: Investing in AI to Enhance Remote Work