
According to a new survey from Checkmarx, 99% of companies are using AI code generation tools, but only 29% have established any form of governance.
The survey, of 900 CISOs and application security professionals worldwide, also found that 15% of respondents explicitly prohibit the use of AI tools for code generation within their organizations.
“Enterprise CISOs are under increasing pressure to understand and manage the new risks associated with generative AI without stifling innovation or becoming a roadblock within their organizations,” said Sandeep Johri, CEO of Checkmarx. “GenAI helps time-strapped development teams scale and produce more code faster, but emerging issues like AI hallucinations are ushering in a new era of hard-to-quantify risks. Checkmarx has correctly predicted the issues that can arise with AI-generated code, and today we are proud to deliver our next-level solution within the Checkmarx One platform.”
Other survey findings include: 70% say there is no centralized strategy for generative AI, with purchasing decisions made on an ad-hoc basis by individual departments; 60% are concerned about GenAI attacks such as AI hallucinations (where GenAI produces inaccurate or odd results); and 80% are concerned about security threats arising from developers using AI.
Forty-seven percent of respondents said they would be interested in allowing AI to make changes to code without oversight, while only six percent said they wouldn't trust AI to take part in security actions within vendor tools.
“These responses from global CISOs expose the reality that while developers are using AI to build applications, AI can't reliably create secure code, meaning security teams are inundated with new vulnerable code to manage,” said Kobi Tzruya, chief product officer at Checkmarx. “This shows that security teams need to have their own productivity tools to manage, correlate and prioritize vulnerabilities. Checkmarx One is designed to help them do just that.”
The full report is available on Checkmarx's site.
Image credit: BiancoBlue/depositphotos.com