As employers explore how to use artificial intelligence (AI) within existing and new laws and guidelines, and as federal agencies, states, and local governments seek to regulate AI in hiring and other areas, job seekers' own use of AI is proceeding largely under the radar. Candidates are increasingly using AI to create and review resumes, cover letters, and writing samples, to complete job applications, and even to prepare for and participate in interviews. This use often goes unnoticed by employers, which can lead to problematic outcomes.
It’s important for employers to understand how AI can help candidates in the recruitment process and learn how to effectively resolve any issues that may arise without negatively impacting the pool of qualified applicants or violating employment laws.
How are job seekers using AI?
Job seekers are leveraging AI in a variety of ways. A 2023 survey by an online and app-based resource provider for job seekers found that nearly half of job seekers already use ChatGPT to write their resumes and cover letters, and 70% of applicants reported that using ChatGPT to write or revise their applications increased their response rates from employers.1 In fact, a 2023 Harvard Business Review article states that “[u]sing a tool like ChatGPT [to write a] resume may become the new standard in a few years.”2
Not only are candidates using AI to write and revise applications, resumes, cover letters, and other documents, but they are also using it for interviews. For example, in 2023, multiple news outlets covered a TikTok video (with over 2 million views) that showed how candidates could use AI to prepare for interviews by using a tool to generate interview questions based on the job description. In fact, a recent survey found that 41% of college students believe it is acceptable to use AI to prepare for an interview.3
Even more problematic, some applicants may use AI to answer questions during text, pre-recorded, or video interviews. This controversial use of AI was highlighted in a 2023 TikTok video in which a woman was seen using a smartphone app to generate answers to questions as they were asked during a video interview.4 While some believe the post was an advertisement for the app the woman was using rather than a real-life scenario, the video shows another way AI is infiltrating the hiring process.
Employment Law Considerations
Although Congress, federal agencies, and state and local governments have not addressed applicants’ use of AI, existing federal employment guidelines and laws regarding employers’ use of AI provide insight into how employers can regulate applicants’ use of AI.
For example, in May 2022, the U.S. Equal Employment Opportunity Commission (EEOC) issued “The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Evaluate Job Applicants and Employees” (EEOC ADA Guidance), which includes guidance on how the Americans with Disabilities Act (ADA) can limit employers' use of AI in screening job applicants. Although it focuses on employers' use of AI, the guidance helps employers understand how they may limit job applicants' use of AI. Among other things, the EEOC ADA Guidance notes that one of the most common ways employers violate the ADA is by failing to provide “reasonable accommodations” to ensure that job applicants are considered fairly. Relatedly, the EEOC ADA Guidance explains that employers cannot use AI to “screen out” job applicants with disabilities under the ADA.
With these concerns in mind, if an employer has a general policy banning AI based on a legitimate, non-discriminatory business reason (such as preventing plagiarism or misrepresentation about skills or experience), it may need to make an exception if a job seeker with a disability can clearly explain why they need AI assistance in the application process. If the underlying objective of an employer's workplace policy banning job seekers from using AI can be achieved by alternative means (such as using AI tools and human screeners to detect possible AI plagiarism in job applications), the employer may need to tailor its policy for job seekers with disabilities. Additionally, if a job seeker with a disability can use AI to create a resume, for example, and still perform the essential functions of an offered position, employers should be careful not to automatically exclude applicants due to a blanket ban on the use of AI.
Overall, employers should take some comfort from the EEOC ADA Guidance’s instruction that under the ADA, “reasonable accommodation does not require lowering productivity or performance standards or reducing essential job functions,” but they should proceed carefully and intentionally when regulating job applicants’ use of AI, including by following the recommended practices outlined below.
In May 2023, the EEOC issued “Select Issues: Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964,” which contained guidance regarding Title VII of the Civil Rights Act of 1964.5 While the guidance focuses on how employers can use AI within the scope of Title VII, it also provides useful insights for employers as they develop policies governing job applicants’ use of AI.6 At the forefront of this guidance is the EEOC's clear message that employers may be held liable if they use AI-influenced “selection procedures” (i.e., any means used to make employment decisions) in a way that disproportionately impacts job seekers based on protected characteristics (e.g., race, sex, religion, etc.). With this in mind, if public data or research is released showing that job seekers with protected characteristics disproportionately use AI when applying to certain industries or occupations, affected employers should proceed with caution when determining how to limit candidates' use of AI and should not automatically reject candidates because they use some form of AI in their application process.
Existing and proposed laws are also instructive: for example, Illinois’ Artificial Intelligence Video Interview Act, effective in 2020, requires employers to disclose their use of AI in the hiring process, suggesting that employers could require applicants to do the same.
Finally, employers who restrict job applicants’ use of AI should draft their policies with the Biden Administration’s Blueprint for an AI Bill of Rights (AI Blueprint) in mind to mitigate potential legal liability and reputational damage.7 Because the AI Blueprint focuses on the potential for AI to exacerbate existing biases in hiring, employers who seek to restrict or prohibit job seekers’ use of AI should ensure that their rules do not inadvertently or disproportionately impact certain groups of job seekers.
Employer Recommendations
Employers need to be prepared for the different ways that job seekers will use AI during the job search process and should develop rules and procedures that accommodate the various ways candidates may use AI and take into account existing federal, state, and local laws and guidance. Employers and their legal counsel need to stay current on the ever-evolving legal landscape in this area and new AI tools entering the market.
With this general advice in mind, employers should consider the following steps to address job seekers’ use of AI:
- Make sure your business reasons for prohibiting or restricting the use of AI at different stages of the application process are not discriminatory. For example, some uses, such as using AI-generated questions to prepare for interviews or using AI to edit resumes and cover letters, may be acceptable, but other uses, such as using AI to write resumes or cover letters from scratch, create or edit writing samples, or answer interview questions, may be problematic.
- Post a notice about the AI rules and a link to them on your job portal or third-party job advertisements.
- If a particular AI-related rule applies only to a specific job type, post a tailored notice on your job portal or in third-party job postings for that job type.
- Depending on your AI rules, you may either require applicants to certify that they are not using AI in the application process or require applicants to disclose their use of AI.
- Include in the notice and rules information regarding requesting reasonable accommodations in the application process, including the use of AI.
- Train recruiters, HR professionals, and interviewers to detect job applications that may have been prepared with AI assistance.
- AI “signatures” in written materials include: (1) repetition of words or phrases; (2) lack of personalization, such as failure to provide detail about skills or experience, failure to mention the specific job for which the applicant is applying, or repetition of job-advertisement language without detail or context; (3) inconsistencies in formatting; (4) shifts in tone or style within a single document or across an applicant's documents; (5) overly complex or redundant language; and (6) similarities in the written materials of multiple applicants.
- Indicators that a candidate is using AI in real time during an interview include: (1) an extended period of silence from the candidate following a question, (2) the candidate looking away from the camera before responding, and (3) the candidate providing scripted responses.
- Additionally, as candidates move through the hiring process, recruiters, HR professionals and interview staff should be on the lookout for inconsistencies between what candidates say about their experience and skills in written and video interviews and what they or their references say later in the hiring process.
- Include in-person, non-digital interactive steps during the job search process (such as a face-to-face interview without electronic devices) and carefully review references to better assess the applicant's skills and experience (such as comparing the skills and experience communicated in the in-person interview or with references to what was communicated in the applicant's written materials or during a remote interview).
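Two of the textual “signatures” listed above — repeated phrases within one document, and near-identical materials submitted by multiple applicants — lend themselves to simple automated pre-screening before a human reviewer takes a closer look. The sketch below is a minimal, hypothetical illustration of that idea (the function names and thresholds are our own, not from any vendor tool), and its results should only ever prompt human review, never automatic rejection, given the ADA and Title VII considerations discussed above.

```python
from collections import Counter

def repeated_phrases(text, n=3, min_count=2):
    """Signature (1): return n-word phrases that recur within one document."""
    words = text.lower().split()
    grams = Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))
    return {" ".join(g) for g, count in grams.items() if count >= min_count}

def shingle_similarity(doc_a, doc_b, n=3):
    """Signature (6): Jaccard similarity of n-word shingles between two
    documents; values near 1.0 suggest near-identical materials."""
    def shingles(text):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}
    sa, sb = shingles(doc_a), shingles(doc_b)
    union = sa | sb
    return len(sa & sb) / len(union) if union else 0.0

# Hypothetical usage: flag applicant pairs for human review, not rejection.
letter_a = "I am a results driven team player with proven leadership skills"
letter_b = "I am a results driven team player with strong communication skills"
if shingle_similarity(letter_a, letter_b) > 0.5:
    print("Similar materials detected; route to a human reviewer.")
```

A design note: phrase repetition and cross-applicant similarity also occur naturally (boilerplate salutations, industry jargon), so any such screen should use generous thresholds and always preserve the in-person verification steps described above.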
Conclusion
Employers should be mindful that a job seeker's use of AI is likely to increase over time. Additionally, as AI evolves and there are fewer visible AI “signs” or “hallucinations” in job seeker materials, it may become more difficult for employers to accurately track a job seeker's use of AI. As employers seek to navigate this evolving landscape, they should proceed carefully when imposing limitations on a job seeker's use of AI to ensure they remain compliant with existing federal, state, and local employment and labor law obligations.
Our labor, employment and workplace safety lawyers regularly advise clients on a range of emerging issues in labor, employment and workplace safety law and are well positioned to provide guidance and assistance to clients regarding AI developments.
