Study finds app barriers persist despite AI accessibility tools

Applause has released research showing that more than half of assistive technology users have been blocked by inaccessible apps this year, even though most organizations surveyed said they were using artificial intelligence to improve accessibility.

The study leveraged responses from more than 500 development and quality assurance professionals and more than 1,000 people who use assistive technologies such as screen readers, captions, font enlargement, and alternative navigation tools.

The survey revealed a gap between how companies use AI tools and the experience of people who rely on accessibility features. While 78% of organizations said they are using AI to improve the digital accessibility of their websites and applications, 56% of assistive technology users said they have regularly encountered inaccessible apps since the start of the year.

These barriers often prevent users from completing basic tasks. According to the data, 28% of users encountered such problems monthly and 17% weekly.

Poor accessibility also appears to have commercial costs, with 44% of assistive technology users saying they are more likely to abandon an app if it is less accessible.

At the same time, accessible services may enhance customer retention. According to the report, 97% of assistive technology users are loyal to brands that provide an accessible experience, and 62% describe themselves as very loyal.

AI adoption

Organizations reported using AI across several stages of digital development. The survey found that 60% use AI coding tools to address accessibility issues and 58% use coding agents to generate accessible code for new features.

Another 56% offer AI-based features, 47% use AI to scan a site or app for accessibility issues, and 45% use AI to generate audio or video captions or subtitles.

Confidence in these tools remains mixed. Only 22% of respondents said AI-driven audit tools accurately identify 75% or more of accessibility issues.

More than half of people using AI accessibility scanning tools expressed concerns about accuracy. Of this group, 24% said the tool reported incorrect issues, and 13% said the tool missed an issue entirely.

Human review still plays an important role in testing. Only 10% of organizations say they rely solely on AI accessibility tools, and 90% validate automated results with some form of manual testing.

Human oversight

The report noted that direct input from people with disabilities is still needed, rather than reliance on automation alone. Without participation from the disability community in research, the report said, some problems will remain undetected.

This reflects a broader issue in accessibility testing. Automated checks can identify coding errors and missing labels, but they may not provide insight into how an app actually behaves for people using assistive technology. Real-world interactions, especially across different devices and use cases, can reveal issues that automated scans miss.
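To make the distinction concrete, here is a minimal sketch (not Applause's tooling, and far simpler than production scanners such as axe-core) of the kind of check an automated accessibility scanner performs, using Python's standard-library HTML parser to flag images with no alt text. A check like this catches a missing label but says nothing about how the page behaves with a screen reader.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects <img> tags that lack an alt attribute (a common WCAG failure)."""

    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            # Record the offending image by its src, if any
            self.issues.append(f"<img src={attr_map.get('src', '?')!r}> has no alt text")

checker = MissingAltChecker()
checker.feed('<p><img src="logo.png" alt="Company logo"><img src="chart.png"></p>')
print(checker.issues)  # only the second image is flagged
```

The mechanical nature of the rule is exactly the limitation the report describes: the checker cannot tell whether an alt text that is present is actually meaningful, or whether the surrounding page is navigable by keyboard or screen reader.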

Bob Farrell, vice president of solution delivery and accessibility at Applause, said manual testing remains essential alongside AI tools.

“More teams are incorporating AI-powered accessibility testing tools into their development processes, even at the coding stage,” Farrell said. “However, these tools can miss up to 80% of critical accessibility issues, which cannot be discovered by machines alone. The majority of organizations incorporate some form of manual testing to complement their AI-powered accessibility checks. These checks can be made more effective by involving users with disabilities, and more generally testers with expertise in accessibility and inclusive design. That expertise includes knowledge of the latest WCAG and EAA requirements.”

The findings suggest that accessibility extends beyond compliance to product design, customer experience, and customer retention. While companies are investing in AI tools to address issues early in development, data shows that these systems do not eliminate the need for expert testing and user feedback.

ShareFile, which collaborates with Applause on accessibility testing, said this approach resulted in fewer defects and commercial benefits.

“By working with the Applause team and our global community of real users, including people with disabilities, we were able to reduce accessibility defects by more than 60% year over year,” said Ivan Ereiz, Senior Director of Product Design and Research at ShareFile.

A second executive tied accessibility efforts to customer retention and new business.

“Regulatory compliance is an important factor in our procurement process, so demonstrating that we prioritize accessibility has helped us both retain existing customers and attract new ones,” said John McCartney, senior manager of user experience at ShareFile. “As we continue to work with Applause to shift left and move towards a culture of inclusive design, we expect to see an even greater impact on our business.”
