GitLab has expanded its partnership with Google as part of its effort to bring more generative artificial intelligence (AI) capabilities to DevOps workflows.
GitLab's suite of software-as-a-service (SaaS) applications already runs on Google Cloud, which gives GitLab the data foundation it needs to train its AI models. Over the past two months, GitLab has added a number of features that rely on multiple types of AI technologies.
For example, an experimental Explain This Vulnerability feature provides a natural-language summary of a detected issue that both developers and cybersecurity teams can easily understand.
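To make that concrete, the sketch below shows roughly how such a feature might be wired up against a Google Vertex AI text model: a scanner finding goes in, a plain-language explanation comes out. This is not GitLab's actual implementation; the project ID, model name, prompt wording and finding are all illustrative placeholders.

```python
# Illustrative sketch only: summarizing a vulnerability finding with a
# Google Vertex AI text model. Project, model choice, prompt and finding
# are placeholders, not details of GitLab's implementation.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-gcp-project", location="us-central1")  # placeholder project

model = TextGenerationModel.from_pretrained("text-bison")

finding = (
    "CWE-89: SQL injection in app/models/user.rb line 42 - the user-supplied "
    "'email' parameter is interpolated directly into a raw SQL query."
)

prompt = (
    "Explain the following vulnerability in plain language for a developer: "
    "what it is, why it matters, and how it is typically exploited.\n\n" + finding
)

# A low temperature keeps the explanation factual rather than creative.
response = model.predict(prompt, temperature=0.2, max_output_tokens=512)
print(response.text)
```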
Taylor McCaslin, product group manager for data science and AI/machine learning at GitLab, says most of the company's future AI focus will be on generative AI capabilities. Those capabilities are enabled by large language models (LLMs) that Google provides and that GitLab tunes for DevOps workflows. That approach allows GitLab to surface more accurate recommendations, grounded in vetted data, than a general-purpose LLM such as the one used to create the ChatGPT service.
In addition, GitLab can continually update the AI models running on Google's Vertex AI cloud service using data from its closely monitored and frequently updated SaaS application environment, McCaslin said.
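In broad strokes, that refresh loop could look something like the sketch below, in which vetted prompt/response examples exported from the SaaS environment are used to tune a Vertex AI base model. The bucket path, data format and step count are assumptions rather than disclosed details, and the tuning call should be checked against the current Vertex AI SDK documentation.

```python
# Hedged sketch of a periodic model refresh on Vertex AI. The training-data
# URI and tuning parameters are assumptions; verify the tune_model() call
# against the current google-cloud-aiplatform SDK before relying on it.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-gcp-project", location="us-central1")  # placeholder project

# JSONL of {"input_text": ..., "output_text": ...} pairs exported from vetted
# DevOps data (placeholder bucket; producing the file is out of scope here).
TRAINING_DATA_URI = "gs://my-bucket/devops-tuning-examples.jsonl"

base_model = TextGenerationModel.from_pretrained("text-bison")

# Kick off a supervised tuning job; once it completes, the tuned model can be
# served in place of the generic base model for DevOps-specific prompts.
base_model.tune_model(
    training_data=TRAINING_DATA_URI,
    train_steps=200,
    tuning_job_location="europe-west4",
    tuned_model_location="us-central1",
)
```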
It’s not yet clear how much AI will improve DevOps workflows, but GitLab is predicting a 10x gain. That improvement would come, for example, from surfacing code that can be used to remediate vulnerabilities; today, many vulnerabilities go unaddressed simply because developers lack the time to create patches.
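A remediation suggestion could follow the same pattern as the explanation sketch above, with the model asked to propose a patch rather than a summary; the vulnerable snippet and prompt below are, again, purely illustrative.

```python
# Illustrative sketch only: asking a Vertex AI text model to propose a fix
# for a vulnerable snippet. Snippet, prompt and project are placeholders.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-gcp-project", location="us-central1")  # placeholder project
model = TextGenerationModel.from_pretrained("text-bison")

vulnerable_snippet = """
def find_user(db, email):
    # User input interpolated into raw SQL -- classic SQL injection risk
    return db.execute(f"SELECT * FROM users WHERE email = '{email}'")
"""

prompt = (
    "The following Python function is vulnerable to SQL injection. "
    "Rewrite it to use a parameterized query and briefly explain the change:\n"
    + vulnerable_snippet
)

response = model.predict(prompt, temperature=0.2, max_output_tokens=512)
print(response.text)  # a suggested patch for a developer to review, not to auto-merge
```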
However, a recent GitLab survey found that developers are already adopting AI to boost their productivity: 62% of developers use AI and machine learning to check their code, and more than a third (36%) rely on it for code reviews.
What seems certain for now is that AI and related technologies will make developers more productive. Less clear is how the increased volume of code moving through DevOps pipelines at the same time will affect the software engineers who manage those processes. The hope is that similar AI advances will allow more code to flow through those pipelines without exacerbating existing bottlenecks.
In the meantime, it’s clear the AI genie is out of the bottle, and LLMs tuned for all kinds of tasks are coming fast. DevOps teams should plan today on the premise that many of the manual tasks that make software engineering tedious are going away. As that happens, DevOps team roles will change and evolve. The assumption those teams should make is that the changes will be for the better; after all, the reason organizations adopted DevOps in the first place was to relentlessly automate IT processes. AI is simply the latest iteration of that commitment.