AI Opportunities: Experts as Tool Makers for Compliance

Applications of AI


We can all see that artificial intelligence has fundamentally changed client expectations about the speed of our work. So what does it take for the broader professional community to use code generated by large language model tools like Claude Code and ChatGPT to build applications and accelerate their work while maintaining quality?

Not everything requires a sophisticated user interface. With LLMs, you can build highly targeted applications that perform important compliance tasks more efficiently, and open source software practices can be used to share and reuse them across the community. Some difficulties remain, especially around secure coding: cybersecurity still requires engineering expertise and time, particularly where sensitive data is accessed. We also need to be mindful of the regulatory obligations that apply to our own apps. But the real challenge lies in building the right mindset for a future in which we define our own compliance tools.

Core discussion

Millions of software engineers use AI-powered coding tools like Claude Code and Cursor on a daily basis. In my experience running a startup specializing in governance, risk, and compliance automation and compliance advisory, these tools have democratized coding. They allow people with limited coding skills to create Python applications that automate tedious tasks, and those applications can then be rapidly iterated and improved with engineering support.

Why did I break out of my product manager and compliance professional mindset to do this? Because this new technology is changing our profession. Boundaries are collapsing, and client expectations about the speed of our work are changing irrevocably with AI. We must respond as professionals.

We need more and better tools to accelerate our work. And thanks to this technology, we can now build them ourselves.

There are many narrowly defined tasks within privacy, AI, and cybersecurity governance processes that are currently performed with tools that are either blunt or inefficient, such as spreadsheets and expensive enterprise GRC software. These tasks call for automation.

Need to identify, in real time, new cookies that load before the consent wall on websites in your client’s portfolio? Let’s code! Want to automatically check for vendor or processing-location changes among the processors and subprocessors in your client’s application stack? Let’s hack it together! The tools we need are now in our hands.
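The pre-consent cookie check above can be sketched in a few lines. This is a minimal, hypothetical example: the baseline set and the observed list stand in for data you would capture with a headless browser before any consent interaction.

```python
# Minimal sketch of a "new cookies before consent" check.
# BASELINE and observed are hypothetical example data; in practice you
# would capture cookies with a headless browser before interacting with
# the consent banner.

def find_new_cookies(observed, baseline):
    """Return cookie names seen on the site that are not in the
    approved pre-consent baseline."""
    return sorted(set(observed) - set(baseline))

# Cookies approved to load before consent (hypothetical baseline).
BASELINE = {"session_id", "csrf_token"}

# Cookies observed on the latest crawl (hypothetical observation).
observed = ["session_id", "csrf_token", "_ga", "_fbp"]

new = find_new_cookies(observed, BASELINE)
if new:
    print(f"ALERT: new pre-consent cookies detected: {new}")
```

Because the check is a pure comparison against a baseline, it can run on a schedule and only alert when the observed set drifts.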

The number of users of the app may be small, perhaps just one. That’s okay; just getting the job done is enough.

More importantly, as a community of peers within the IAPP, we share similar challenges, even though each client’s situation has unique factors. There are strong legal mechanisms to limit legal risk, such as the Apache 2.0 open source software license and a series of other model agreements. There are free OSS tools such as VS Code, Pytest, and Docker. For the more adventurous, workflow automation tools like n8n offer a cutting-edge option. Taken together, these commons reduce the burden of repetitive and time-consuming development tasks.
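As a small illustration of how one of these commons fits in, Pytest lets a compliance rule live in one plain function with a test that pins down its behavior. The retention-limit rule below is a hypothetical example, not something from a specific framework.

```python
# Sketch of the Pytest workflow: the rule is one function, the test
# documents its boundary behavior. The 365-day limit is hypothetical.

def retention_compliant(record_age_days: int, limit_days: int = 365) -> bool:
    """Return True if a record is within the (hypothetical) retention limit."""
    return record_age_days <= limit_days

# Pytest automatically discovers and runs functions named test_*.
def test_retention_boundary():
    assert retention_compliant(365)
    assert not retention_compliant(366)
```

Saved to a file, this would be run with `pytest <filename>`; the test suite then becomes the living specification of the rule as it evolves.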

Significant challenges remain. One important skill set, though one many in the GRC community are less familiar with, is secure coding. Even granting that, there are obvious mitigations. You can limit your use cases to those that only require information available on your organization’s website or in the public domain, such as cookies, tracking pixels, and consent strings from consent management platforms. Alternatively, you can build automated searches over already published documents, such as privacy and cookie policies and, often, data processing agreements.
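A public-information use case of this kind can be sketched with only the standard library. The tracker domain list and the sample HTML below are hypothetical examples; a real check would use a maintained blocklist.

```python
# Minimal sketch of scanning public page HTML for likely tracking
# pixels. TRACKER_DOMAINS and the sample page are hypothetical.
from html.parser import HTMLParser

TRACKER_DOMAINS = ("facebook.com", "doubleclick.net")  # hypothetical list

class PixelFinder(HTMLParser):
    """Collect <img> sources pointing at known tracker domains."""
    def __init__(self):
        super().__init__()
        self.hits = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        src = dict(attrs).get("src", "")
        if any(domain in src for domain in TRACKER_DOMAINS):
            self.hits.append(src)

# Hypothetical page HTML; in practice this comes from a crawl of the
# organization's public website.
page = '<html><body><img src="https://www.facebook.com/tr?id=123" width="1" height="1"></body></html>'
finder = PixelFinder()
finder.feed(page)
print(finder.hits)
```

Because it only touches publicly served HTML, this kind of check avoids the sensitive-data concerns discussed above.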

For tasks that examine client application code, sensitive data, or documentation, you can foster collaboration with your organization’s engineering community. These configuration tasks may be foreign to many people, but they are core competencies for application engineers. You can easily restrict access to read-only to adhere to the principle of least privilege. No ports or other routes for unauthorized access should be opened within the application. And of course, you should scan your application for malware before deploying it. Implementing these and other measures requires significant engineering time.
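The read-only, least-privilege principle is easy to demonstrate concretely. The sketch below uses SQLite's URI mode as a stand-in for a client data store; the file name and table are hypothetical, but the same idea applies to read-only database roles or filesystem permissions.

```python
# Sketch of least privilege via a read-only connection. The database
# file and vendors table are hypothetical stand-ins for client data.
import os
import sqlite3
import tempfile

# Set up a throwaway database standing in for client data.
path = os.path.join(tempfile.mkdtemp(), "client.db")
with sqlite3.connect(path) as conn:
    conn.execute("CREATE TABLE vendors (name TEXT)")
    conn.execute("INSERT INTO vendors VALUES ('ExampleProcessor')")

# Open it read-only: reads succeed, writes raise an error.
ro = sqlite3.connect(f"file:{path}?mode=ro", uri=True)
print(ro.execute("SELECT name FROM vendors").fetchall())
try:
    ro.execute("INSERT INTO vendors VALUES ('x')")
except sqlite3.OperationalError as exc:
    print("write blocked:", exc)
```

Enforcing read-only at the connection level means even a buggy compliance script cannot modify the data it inspects.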

For motivated engineers, time spent improving applications is time not spent manually filling out questionnaires that quickly become outdated. One piece of advice, however: avoid situations where engineers are “volunteered” by management. It breeds resentment.

Once a partnership is established, you will be able to transparently see and change every line of code in your own apps. The code may not be pretty, but it will deliver the results you want. There is also a maintenance burden over time, but this is usually manageable if the application is small and narrowly focused.

There is still a role for GRC vendors. Organizations may be unable to dedicate their own people to building and maintaining compliance automation, for many valid reasons. Some stakeholder teams may require an application with a graphical user interface rather than a command-line interface. Vendors guarantee maintenance, though paid support may be required. As you scale beyond a small number of users, these become real issues for organizations to consider.

There is also a small but growing number of open source vendors in the GRC space, and these coding tools enable organizations to add the bespoke functionality and analytics they specifically need, without waiting in vendor queues or paying bespoke development consultants. Examples include OpenGRC, OpenVAS, and GLPI.

What’s stopping people? I can’t speak for everyone, but I’ve asked this question many times of my fellow IAPP community members. Some are eager to try it out. Others are wary of stepping out of their lane, or of being seen as stepping on other stakeholders’ toes.

The reality is that AI will inevitably break down the barriers to accessing the skills and knowledge that formed the basis of the traditional division of labor in GRC teams. It is already happening. Privacy UX designers are doing more product management work, and vice versa. Cybersecurity engineers are doing more product design and management work, and vice versa. Compliance analysts are doing more work mapping and assessing risks against legal frameworks.

They all seek to solve organizational problems faster, often in the public interest. They are professionally trained and confident that, as responsible practitioners, they can uphold quality.

This also applies to lawyers. They, too, should do more prototyping and build lightweight applications, preferably in collaboration with surrounding stakeholders. Lane boundaries are disappearing, and tool innovation is being unleashed in the GRC space.

Conclusion

AI coding, open source GRC tools, and new ways of thinking are emerging and changing the way we build, maintain, and extend work tools in our field. Building applications is no longer limited to vendors or in-house engineering teams. It is also in our hands as a community. Enablers are readily available. As a community of IAPP practitioners, where will we take this new possibility? I can’t wait to see.


