
AI’s Impact On Human Resources: Harnessing Potential And Addressing Bias At Your Nonprofit

Created By: Jessica Shatzel
July 27, 2023

Artificial Intelligence promises to make our lives better. It will eventually impact almost every aspect of our lives and livelihoods. I’m sure you’ve heard a version of both of those statements.

AI’s potential for good is already well known—and well marketed.

When it comes to business, it’s said that AI can enhance efficiency, improve productivity, and increase revenue. AI can perform routine and monotonous tasks such as data collection, data entry, and email responses, giving employees more time to spend on tasks that require human abilities.

One area in which the use of AI is growing rapidly is recruitment and hiring. The technology has proven valuable for reviewing and sorting resumes and identifying the most suitable candidates, helping to expedite the hiring process.

“AI can use predictive analytics to analyze candidate data, including resumes, social media profiles, and online behavior, to predict which candidates are most likely to be successful in the role,” writes Jack Kelly in Forbes.

However, as HR departments explore AI’s capabilities more fully, employers are discovering that a resume-screening tool fed biased data can lead to the same discrimination, and the same omission of talented, diverse candidates, as the traditional recruitment and hiring process.

The use of AI does not eliminate human error and subjectivity in recruitment and hiring.

In theory, AI can be helpful in bringing an unbiased view to the recruitment process by significantly reducing unconscious bias and focusing on skills, qualifications, and experience rather than personal characteristics. However, in reality, we’re learning that AI is only as good—and impartial—as its programming and the algorithms that flow from it.

“Many of us have probably heard about AI’s potential for repeating and propagating human biases. All of them come down to the same issue: biased data sets,” a recent Fast Company article on artificial intelligence and discrimination explains. Part of the blame for the bias that has crept into AI algorithms stems from the absence of vital voices in the development process. Black and Latinx professionals have long been underrepresented in the information technology sector, including among the engineers, coders, and data and computer scientists who program AI.

If fed incomplete and biased data, AI could potentially move us backward in our quest for more equitable and inclusive workplaces. It could erode many of the gains that have been made to reduce recruiting and hiring inequities that negatively impact women, people of color, and the LGBTQ+ community.

To maximize the potential of AI, users must be vigilant in ensuring that the information and content driving AI outputs are free of bias and properly input by the humans controlling the system.

How to mitigate bias in your use of AI

  1. Establish processes and procedures that require strict review practices and set metrics for identifying bias in AI-generated results (a minimal sketch of one such metric follows this list).
    • e.g.: Create and document AI usage guidelines for your organization.
  2. Test algorithms in ways that mirror how you would use them in real-life scenarios to ensure outcomes appropriately reflect the situation and setting.
    • e.g.: Ensure your applicant-screening algorithm applies to the job and industry candidates are being considered for.
  3. Consider human-in-the-loop systems that allow human intervention, input, and feedback, correcting errors and filling in the specific knowledge gaps prevalent in AI.
    • e.g.: Include human review and input when developing communications to employees to deliver messages with inclusive language and tone.
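As one concrete starting point for the first item, the sketch below computes per-group selection rates from AI screening decisions and applies the widely used four-fifths (80%) rule of thumb for adverse impact, routing flagged batches to human review in the spirit of the third item. The record structure, group labels, and threshold handling are illustrative assumptions, not a prescribed implementation; your organization’s AI usage guidelines should define the metrics and review workflow that fit your context.

```python
from collections import defaultdict

# Rule-of-thumb threshold: a group's selection rate below 80% of the
# highest group's rate is commonly treated as a signal of adverse impact.
FOUR_FIFTHS_THRESHOLD = 0.8


def selection_rates(records):
    """Return the share of candidates in each group that the screener advanced."""
    advanced = defaultdict(int)
    totals = defaultdict(int)
    for record in records:
        totals[record["group"]] += 1
        if record["advanced"]:
            advanced[record["group"]] += 1
    return {group: advanced[group] / totals[group] for group in totals}


def adverse_impact_flags(records):
    """Flag any group whose selection rate falls below 80% of the highest rate."""
    rates = selection_rates(records)
    highest = max(rates.values())
    if highest == 0:  # no one advanced; nothing to compare against
        return {group: False for group in rates}
    return {
        group: rate / highest < FOUR_FIFTHS_THRESHOLD
        for group, rate in rates.items()
    }


if __name__ == "__main__":
    # Hypothetical screening results, for illustration only.
    sample = [
        {"group": "A", "advanced": True},
        {"group": "A", "advanced": True},
        {"group": "A", "advanced": False},
        {"group": "B", "advanced": True},
        {"group": "B", "advanced": False},
        {"group": "B", "advanced": False},
    ]
    for group, flagged in adverse_impact_flags(sample).items():
        if flagged:
            print(f"Group {group}: route this screening batch to human review.")
        else:
            print(f"Group {group}: no adverse-impact flag on this batch.")
```

A check like this is only a tripwire, not a verdict: it tells you when to pause the automated pipeline and bring humans back into the loop, which is exactly where documented review procedures earn their keep.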

Talent, people and culture, and diversity, equity, and inclusion are human-centric functions within an organization. Used properly, AI can augment such initiatives, for instance by improving performance management processes or helping to create more equitable compensation practices. AI can also support more inclusive climates by acting as assistive technology that removes barriers for employees working in a non-native language or helps neurodivergent employees with communication, for example.

AI is here to stay, and with a sound understanding of its limitations, its innovation can certainly help advance healthier workplaces. It can help us create a better, more equitable future, but only if humans are there to analyze its outputs and help shape its direction.

Trust Orr Group With Nonprofit AI Implementation

Orr Group offers solutions that combine AI and human expertise to help your organization mitigate risks and implement new technology that supports your recruitment, retention, and DEI goals. Get in touch with us today to learn more.


Jessica Shatzel is a Senior Director, Talent Management at Orr Group, specializing in executive search and recruitment and providing a variety of human resources support for our nonprofit partners.
