As detailed in "Women in Tech: Their Current Status, What They Have Achieved and What They Want," the lip service many companies pay to diversity and equal opportunity in hiring does not translate into equal representation of women and minorities in the tech industry.
A number of people and organizations have directed their attention toward overcoming stereotypical biases and bridging that hiring gap.
CodeSignal brings AI to hiring
Now a company called CodeSignal is offering its AI-powered hiring solution as a tool to overcome bias in hiring. Sophia Baik, co-founder and VP of Operations at CodeSignal, answered my questions about how the product, which she describes as the equivalent of the SAT exam for jobs, can work toward a more level and equitable playing field in hiring.
Is the number of female hires proportional to the number of applicants, or do women in certain fields represent a much lower percentage of applicants overall? What about minorities?
Yes, there is certainly a pipeline issue in the software engineering field. Statistically, fewer women than men earn engineering degrees each year.
Only about 22% of US college graduates in engineering are women, so compared to other professions, engineering has a long way to go. For example, 2017 was the first year that more women than men enrolled in US medical schools.
You don’t have to get a computer science or engineering college degree to become a software engineer. But this imbalance is representative of the composition of talent entering the industry.
According to the US Bureau of Labor Statistics 2018, only 26% of computer scientists are women.
Negative stereotyping can influence people's career decisions, and this type of unconscious bias is one area CodeSignal is working to counteract.
Does bias programmed into AI exacerbate the problem of bias in hiring?
AI, like any mathematical model, relies heavily on the quality of its input to produce its output. In layman's terms, it is subject to GIGO: garbage in, garbage out.
When we train an AI on historical records of biased hiring decisions, it will certainly produce outcomes with the same bias, because that bias is built into the model. AI is great at helping humans make the same decisions faster and at a larger scale, so it could potentially amplify the bias to a much larger scale as well.
The danger is that if we blindly accept the results coming out of a black box, there is no mechanism to correct for the bias. Recognizing that bias can exist in the inputs to an AI or machine learning model, and interpreting its results with a critical eye, are both important to reducing bias in hiring overall.
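To make the GIGO point concrete, here is a minimal sketch with synthetic data and hypothetical hiring thresholds (none of this reflects CodeSignal's models). If historical decisions held one group to a higher bar, anything that learns from those labels reproduces the gap, even for equally skilled candidates:

```python
import random

random.seed(0)

# Synthetic "historical" hiring data. Skill is the only legitimate
# signal, but past decisions (hypothetically) held group B to a
# higher bar: group A was hired above skill 0.5, group B above 0.7.
records = []
for _ in range(10_000):
    skill = random.random()
    group = random.choice("AB")
    hired = skill > (0.5 if group == "A" else 0.7)
    records.append((skill, group, hired))

# A naive "model" that simply learns the historical hire rate per
# group among equally skilled candidates (skill between 0.5 and 0.7).
def learned_hire_rate(group):
    outcomes = [hired for skill, g, hired in records
                if g == group and 0.5 < skill <= 0.7]
    return sum(outcomes) / len(outcomes)

# Equally skilled candidates, very different learned outcomes:
print(f"group A: {learned_hire_rate('A'):.2f}")  # 1.00
print(f"group B: {learned_hire_rate('B'):.2f}")  # 0.00
```

The "model" here is just per-group frequency counting, but the mechanism is the same for any learner: the bias in the labels becomes part of the fitted model.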
Could adjustments and transparency turn AI into a tool for good in achieving more equality?
Absolutely. If we set out to improve equality and use AI with that intention, we can make it work for us instead of letting it accidentally amplify the problem. For example, CodeSignal uses AI to assess technical skills at scale and calculate a Coding Score that is strongly correlated with both a candidate's performance in the job interview and their performance on the job.
When you have objective data and measurement of skills, a hiring decision can be made more easily. It also significantly reduces the room for hiring managers' and recruiters' unconscious bias to kick in: they no longer have to rely on a subjective impression of a candidate when making a hiring decision.
Moreover, it empowers women and minority candidates to self-promote during the hiring process with evidence to support their skills.
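Baik's claim that the Coding Score tracks later performance is, in essence, a criterion-validity check: compute the correlation between assessment scores and a downstream outcome. A minimal sketch using Pearson correlation (the scores and ratings below are made-up illustrative numbers, not CodeSignal data):

```python
import math

# Hypothetical candidates: an assessment score and a later
# interview-performance rating on a 1-5 scale (illustrative only).
scores  = [612, 702, 745, 798, 810, 833, 851]
ratings = [2.1, 2.8, 3.0, 3.9, 3.7, 4.4, 4.6]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(f"score-rating correlation: {pearson(scores, ratings):.2f}")  # 0.97
```

A coefficient near 1.0 on real outcome data is what would justify calling a score "strongly correlated" with job performance; a vendor's claim can be audited this way against a company's own hiring records.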
In the video below, Baik talks about the Top 3 Benefits of Framework Based Assessments.
How does your solution work?
We have a suite of assessment solutions to help companies #GoBeyondResumes in tech recruiting: Test, Interview, and Certify. Certify, currently in beta, is our latest offering. It is the first technical assessment product that can be used at the very top of the hiring funnel, allowing talent acquisition professionals and engineering managers to request and compare unbiased, easy-to-understand assessment results at scale and make data-driven hiring decisions.
Certify is intended to do for technical hiring professionals what the SAT exam does for college admissions. It is a trustworthy test of skill proficiency that also allows companies to conduct a top-notch skill assessment without having to create it themselves.
With such a ready-made solution, HR professionals can focus their time and energy on using the data to make hiring decisions instead of building out a professional skill-assessment operation with all its rigorous test creation and maintenance work.
What inspired your company to develop this solution?
The precursor of CodeSignal was a website people could visit to solve interesting coding challenges and improve their skills. With over one million experienced engineers solving a large volume of coding tasks, we were able to chart their programming skills.
It was striking to learn that those at the very top of the skill spectrum were having a hard time finding jobs because they didn't look good on paper, even as many organizations struggled to find and hire talented software engineers from varied backgrounds.
That realization inspired us to bridge the gap by making an objective, credible technical skill assessment readily available for hiring, and to help the industry go beyond resumes.
The video above shows how Greenhouse uses CodeSignal in their technical recruiting process.
Do you have any case studies that prove more women or minorities were hired as a result of implementing your solution?
Many of our customers who clearly understand the value of objective skill assessments already have diversity and inclusion initiatives at work. Consequently, it is difficult to isolate the effect of our solution alone on their success in improving the diversity of their teams.
What we can say with confidence is that many candidates who would not have been considered without CodeSignal's assessment results have been interviewed for software engineering jobs and gone on to be hired at those companies.
It makes sense that an objective interview process assures candidates of a company's dedication to fair, unbiased hiring decisions, and that it would attract more women and minority candidates who place a high value on a diverse work environment.
We use our own assessment solution internally to hire software engineers. When candidates express interest, we ask them to share their CodeSignal Coding Score as the first step and do not look at their resumes, which could introduce unconscious bias.
We invite them to start the interview process on the basis of their Coding Scores. This practice has been very well received by our candidates and has allowed us to hire the best candidates based on ability.