NSW government agencies warned against using AI for hiring decisions

Applications of AI


The NSW Office of the Public Service Commissioner has published its Using AI in recruitment guidance, which says AI should not be used to “make recruitment decisions.”

“Agencies that choose to use AI during the recruitment process should ensure that its use is lawful, clearly and transparently documented, and that its risks are properly identified and mitigated.


“AI should not be used to rule candidates in or out, or to make other recruitment decisions.”

The guidance also highlighted significant risks arising from unreliable AI when it is used to assess job applicants.

Bias, whether introduced through a tool's training data or by its developers, can result in the wrong candidates being selected, and can affect both automated decisions and AI-generated output.

Furthermore, AI can evaluate candidates in unreliable or invalid ways, for example basing decisions on video inputs such as skin tone, facial hair, or head coverings, and can fabricate information, a failure known as hallucination.

To guard against bias, AI tools need to be developed inclusively, taking into account people with “diverse attributes” such as culturally diverse backgrounds and disability. AI output should also be screened for bias and periodically audited for patterns of exclusion.
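As a rough illustration of what such an audit might look like (this sketch is not from the guidance, and the group labels and threshold are hypothetical), an agency could compare per-group selection rates in an AI tool's screening output and flag groups whose rate falls well below the best-performing group's, for instance using the common “four-fifths” rule of thumb:

```python
from collections import Counter

def selection_rates(records):
    """Compute the selection rate per group from (group, selected) pairs."""
    totals, selected = Counter(), Counter()
    for group, was_selected in records:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def adverse_impact_flags(records, threshold=0.8):
    """Flag groups whose selection rate is below `threshold` times the
    highest group's rate (the 'four-fifths' rule of thumb)."""
    rates = selection_rates(records)
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}

# Hypothetical screening output: (applicant group, advanced past AI screen?)
records = [
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", True), ("B", False), ("B", False), ("B", False),
]
print(adverse_impact_flags(records))  # group B's rate (0.25) is under 80% of A's (0.75)
```

A real audit would use properly collected demographic data and a defensible statistical test; this only shows the shape of a periodic exclusion-pattern check.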

AI training datasets should also be constructed to be inclusive, incorporating “names, experiences, qualifications and linguistic patterns” common to Aboriginal and Torres Strait Islander applicants, culturally diverse communities, and rural and regional people.

The Public Service Commission said human oversight is necessary, and that candidates need to know how AI is being used in the recruitment process.

“Agencies should also be able to clearly identify who made a decision, under what authority, and how that decision was made.”

Additionally, candidates should be able to provide feedback to flag issues they encounter with AI tools.

The guidance raises additional concerns because not all agencies build their own AI tools; many instead use third-party “off-the-shelf” AI.

“A product may work well in one agency but carry unexpected risks in another,” the guidance added.

“When procuring AI products, carefully check marketing and contract materials against [NSW AI Assessment Framework] requirements.

“This is especially important as AI vendors may be reluctant to provide information about the decision-making of their algorithms, meaning AI products may only be testable in a limited way.”

Particularly with third-party AI, there are security concerns about feeding confidential candidate data into AI tools: data collected to support the hiring process could also be used to train the vendor's models. Agencies considering off-the-shelf AI products should ensure that the data AI collects for recruitment is kept separate from any data collected for training.

That being said, the NSW office of the Public Service Commissioner acknowledges that AI can be a productive tool in the hiring process.

This includes using AI to “support the development of recruitment documents, evaluation materials and communications,” or to “identify key themes from candidate responses” and interviewer notes.


