How To Embrace Diversity in Hiring Through AI

May 31, 2021

By Sarah Evans

Embracing Diversity in Hiring

In conversations surrounding workplace diversity and inclusion, the practical benefits of a diverse workforce may sometimes get lost in the chatter. When that happens, it’s worth remembering: the 20 most diverse companies in the S&P 500 achieve higher long-term profitability than their less diverse counterparts. Companies in the top quartile for diversity in leadership are more likely to outperform their peers on profitability and value creation. By 2025, advancing gender equality in the workplace alone could add $12 trillion to global GDP. Despite these clear benefits, 48% of businesses either aren’t on track to meet their diversity goals or have no goals at all.

How can companies that are struggling improve their diversity outcomes? The first step is to acknowledge human biases in recruiting. On average, recruiters spend 7 seconds reviewing an individual resume. In so short a time, recruiters often rely on snap judgments that can be colored by similarity bias, the contrast effect, and more. While human recruiters may not have the time to review each application in more depth, they should be aware of the subjectivity that human screeners bring to the hiring process.

Is software a solution? 90% of enterprises and 68% of small businesses use applicant tracking systems, but recruiting software is not immune to bias either. Software that screens resumes by keyword may reward a candidate’s ability to write a keyword-heavy resume rather than their actual qualifications. Synonym searching resolves part of the issue, though not all of it. A case in point of recruiting software gone wrong is Amazon. In 2018, Amazon scrapped its state-of-the-art recruiting AI after it taught itself to penalize resumes that included the word “female” or mentioned all-female colleges. AIs don’t come programmed with human biases; they learn them by processing data and identifying patterns. If the data an AI is given is biased, its results will reflect those biases back. Amazon’s AI was trained on a decade of resumes and hiring decisions, and as a result it picked up and exaggerated the existing biases in that screening process.
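To make the keyword problem concrete, here is a minimal, hypothetical sketch of the difference between literal keyword screening and a synonym-aware check. The synonym table, field values, and function names are invented for illustration and do not reflect any particular applicant tracking system.

```python
import re

# Hypothetical synonym table, invented for illustration only.
SYNONYMS = {
    "javascript": {"javascript", "js", "ecmascript"},
    "sql": {"sql", "postgres", "postgresql", "mysql"},
}

def tokens(text: str) -> set[str]:
    """Lowercase word tokens pulled from free-form resume text."""
    return set(re.findall(r"[a-z0-9+#]+", text.lower()))

def keyword_match(resume_text: str, required: list[str]) -> bool:
    """Literal check: every required keyword must appear verbatim."""
    words = tokens(resume_text)
    return all(kw in words for kw in required)

def synonym_match(resume_text: str, required: list[str]) -> bool:
    """Looser check: any known synonym of a required keyword also counts."""
    words = tokens(resume_text)
    return all(words & SYNONYMS.get(kw, {kw}) for kw in required)

resume = "Five years building JS front ends backed by Postgres."
print(keyword_match(resume, ["javascript", "sql"]))  # False: exact terms absent
print(synonym_match(resume, ["javascript", "sql"]))  # True: "js" and "postgres" count
```

Even the synonym-aware version only measures vocabulary overlap, which is why the wording of a resume can still matter more than the qualifications behind it.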

What can be done to improve the situation? 81% of HR professionals admit their current practices are average or worse in the area of diversity. Many are unsure how to train an AI without bias. Some ways to un-bias the data an AI examines include: collecting data from varied industries, jobs, and candidates; removing factors like age, gender, and names from initial screening; and considering how well a candidate and company fit from both perspectives. Important steps to take on the human side are: setting clear, trackable improvement targets; making diversity training standard and an opportunity to learn from fellow employees; partnering strategically with outside organizations, schools, and colleges; and ensuring management and policies support diversity and inclusion at every level.
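As a rough illustration of the “remove factors like age, gender, and names” step, the sketch below blinds a candidate record before it reaches an initial screen. The field names are hypothetical, and a real pipeline would also need to handle indirect proxies (graduation year, address, club memberships) that can reveal the same attributes.

```python
from typing import Any

# Fields treated as demographically loaded; the names are hypothetical,
# not drawn from any specific applicant tracking system.
BLINDED_FIELDS = {"name", "age", "gender", "date_of_birth", "photo_url"}

def blind_candidate(record: dict[str, Any]) -> dict[str, Any]:
    """Return a copy of the candidate record with blinded fields removed."""
    return {key: value for key, value in record.items() if key not in BLINDED_FIELDS}

candidate = {
    "name": "Jane Doe",
    "age": 29,
    "gender": "female",
    "skills": ["python", "sql"],
    "years_experience": 6,
}

print(blind_candidate(candidate))
# {'skills': ['python', 'sql'], 'years_experience': 6}
```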

Overcoming recruiting bias can help a company thrive.

Source: Social Media Explorer

