Over the past few years, the use of AI has permeated many HR functions, from recruitment to onboarding to routine day-to-day tasks. While it has been praised for streamlining some processes, there are concerns, particularly when it is used to manage employee data.
When it comes to disabled and neurodivergent workers, there are additional challenges to overcome. Much has been written about AI bias in recruitment, and about whether the use of these technologies puts those with disabilities at a disadvantage, as they may demonstrate their suitability for roles in different ways from other candidates. AI is only as good as the data it is trained on, so if that data is not inclusive, the results won’t be inclusive either.
To ensure that diversity is considered, human intervention is still needed to interpret and act upon AI HR recommendations, particularly when it comes to recognising if any bias has taken place, or if any sensitive data is at risk of being leaked. For many HR teams, learning to work with AI’s limitations as well as its strengths has become essential, as the use of AI becomes more widespread.
Using AI if you have a disability
If you are an HR manager with a disability or neurodivergent condition yourself, AI’s ability to summarise the huge amount of information in the recruitment process, or to act as a data management aid, shows great promise in helping you do your job.
AI is rapidly becoming ‘the assistive technology that everyone, including those without disabilities, uses’. So, it is particularly useful if you have chosen not to disclose your disability to your employer, which, of course, you are under no obligation to do.
However, if you are working in HR and have a disability, then you need to be aware of the best ways to use AI to avoid any costly errors.
How AI can be a force for good
Firstly, I would like to look at the positives of how AI can help you with your day-to-day work.
AI can improve the accessibility of your content. For example, it can generate automatic audio descriptions or alt-text for video and images on career sites. There are tools that can automatically create transcripts from meetings and trim out filler words to avoid the need for so much notetaking.
Summarisation tools can also be particularly useful for those with neurodivergent conditions such as dyslexia, who may struggle to read large volumes of text such as CVs or interview transcripts, or for those with sight difficulties, who then don’t have to focus for long periods to read content. Tools can also simplify the reading level, change the tone, and define or explain complex language, where these are helpful.
Data privacy can be a real concern
However, as mentioned, there are challenges. If you do use summarisation, it is essential to note that summarisation tools can produce different results depending on the data sets they have been trained on and the prompts you use.
In a role like HR, where accuracy and protecting the privacy of sensitive information is paramount, this can lead to issues.
HR is responsible for handling a considerable amount of people data, and employees are becoming increasingly concerned about the security of their personal information as AI use by HR departments grows.
If you are using AI for summarising or data management, do so with caution, as many AI technologies are not clear about the privacy of the data you enter into them, how it is processed, or how private the output is. AI is currently unable to recognise which data should be private, so it’s up to the AI user to make that judgement before using the tool. Privacy-first needs to be the watchword.
Additionally, human intervention is a necessity to check the accuracy of the summaries delivered. A dyslexic HR manager may struggle to do these checks, which could lead to decisions being made on inaccurate data.
This is why it is paramount that an employer has rigorous data protection measures in place to ensure the confidentiality of employee information, that only the necessary data is collected, and that summaries are reliably accurate. And, if you, as an HR manager, have a condition that could impact how you check whether AI is being used in the most appropriate way, then it is important to disclose this to your employer.
Use AI, but use it in the right way
As with all AI, the key thing is to know what it is good at and what it isn’t. AI isn’t, and can never be, human. So, while some tools can have a positive impact on HR functions, they cannot eliminate the need for human involvement, particularly when it comes to data privacy and accuracy.
So, if you are an HR manager with a disability, or if someone in your team has a condition that may affect how they use and interpret AI-based information, it is important to recognise AI’s current limitations and only use it in a way that protects your integrity, the integrity of your organisation, and that of your employees.