AI is already pervasive across business, and its carbon impact is likely to be profound

AI didn’t suddenly enter organizations, but its role is rapidly changing.

What was once applied in a targeted manner is now becoming central to how products are built and work is done.

At Culture Amp, we’ve been using machine learning for over 10 years. Features such as sentiment analysis and topic clustering in employee engagement surveys relied on models we built ourselves and were applied in a restrained manner that added clear value.

What has changed over the past year is how central AI has become.

According to Deloitte’s 2026 State of AI in the Enterprise, employee access to AI has expanded by 50% in just one year.

It has gone from being used in a specific part of a product to being at the core of how the product is built and experienced. Features like AI Coach no longer require admins to navigate rigidly structured workflows; instead, they meet admins where they are and adapt to what they need. Beyond product development, AI is being used in nearly every role as a productivity tool.

It's not just how AI is being adopted that has changed, but the scale it has reached. At this point, AI stops acting like a feature and starts acting like a flexible foundation that creates value beyond individual use cases.


The next challenge in AI is responsibility, not competency

Our instinct is still to treat AI as software: something you deploy, test, and layer into your workflow. But the more it is used, the more it behaves like a system running underneath the business.

It runs on energy-intensive shared infrastructure, and its costs, carbon included, scale with usage in ways that aren't always obvious.

The carbon impact of AI will vary depending on how widely AI is used across the business and how effectively it is deployed.

This change is occurring in a broader context. The global energy transition to renewables remains incomplete, and the growing demand for AI is increasing pressure on energy-intensive data infrastructures.

The International Energy Agency’s latest analysis predicts that electricity demand from data centres, AI and digital infrastructure will grow rapidly in the coming years, placing further strain on energy systems already under pressure.

How organizations use AI, and how quickly they scale it, therefore becomes a more consequential set of decisions.

Measurement lags behind adoption

For most organizations, this is nothing new. The majority of emissions in technology businesses already reside in cloud infrastructure and data centers, outside of direct control and often with limited transparency. AI is increasing the demands on these systems without making its impact easy to understand.

At Culture Amp, this is something we’ve been working on as part of our broader sustainability efforts. As a certified B Corp, we approach this through a broader lens of responsibility, balancing innovation with accountability for our employees, customers, and the broader environment.

One of the most encouraging things we’ve seen is that efficiency gains, cost savings, and emissions reductions often point in the same direction.

By making targeted changes to our cloud architecture and usage patterns, we reduced downstream data center emissions by 49% while lowering operational costs.

That experience doesn’t map directly onto AI, but it shapes how we think about AI.

Measurement remains rare in most enterprises, but it is a prerequisite for responsible deployment. You can't optimize what you can't see.

Where your system runs is more important than you think

There are still gaps. There is no standard way to attribute emissions to different AI use cases. Visibility into vendor infrastructure is improving, but still limited. And most organizations are just beginning to understand how usage patterns, from model selection to real-time versus asynchronous workloads, can lead to large-scale impact.

What this suggests is that we need to treat AI as a system that must be actively managed.

At scale, these decisions are not made solely within individual organizations. Collectively, they shape demand across shared infrastructure and, over time, across the energy systems that support it.

What this requires for organizations

Organizations don’t need perfect data to make better decisions, but they should think carefully about how they use and scale AI.

  1. Treat carbon like a management cost

Think about carbon the same way you think about profit and loss. At Culture Amp, we have worked to understand our footprint over the life of the company and to treat it as something to be actively managed over time: we minimize the growth of that carbon debt through operational improvements, then repay it through the purchase of carbon removal credits (not just carbon offset credits).

For most companies, the first step is simply to start measuring impact and accountability.
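The "carbon as P&L" framing above can be sketched as a simple ledger: emissions accrue as debt, and removal purchases pay it down. This is a purely illustrative model with made-up figures, not Culture Amp's actual accounting.

```python
from dataclasses import dataclass, field

@dataclass
class CarbonLedger:
    """Illustrative carbon 'P&L': emissions accrue as debt, removals repay it.
    All figures are hypothetical tonnes of CO2e."""
    debt_tonnes: float = 0.0
    history: list = field(default_factory=list)

    def record_emissions(self, tonnes: float, source: str) -> None:
        # Emissions grow the outstanding carbon debt.
        self.debt_tonnes += tonnes
        self.history.append(("emit", source, tonnes))

    def purchase_removal(self, tonnes: float, provider: str) -> None:
        # Carbon *removal* credits (not avoidance offsets) pay the debt down.
        self.debt_tonnes -= tonnes
        self.history.append(("remove", provider, tonnes))

ledger = CarbonLedger()
ledger.record_emissions(120.0, "cloud-infrastructure")
ledger.record_emissions(35.0, "ai-inference")
ledger.purchase_removal(50.0, "direct-air-capture")
print(f"outstanding carbon debt: {ledger.debt_tonnes} tCO2e")  # 105.0
```

The point of the ledger shape is that carbon, like financial debt, has a running balance that operational improvements slow and removal purchases reduce.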

  2. Gain visibility into AI usage and costs

There are real cost signals in the use of AI: a close relationship between how AI is used, the number of tokens consumed, and the cost of running those systems.

That cost is not just in dollars but also in carbon, and in the absence of a global carbon price, it falls to responsible businesses to ensure their carbon impacts are priced in.

Gaining visibility into where AI is being used and holding teams accountable for usage and infrastructure costs is one of the most practical ways to manage both.
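Because token counts drive both dollar cost and energy use, a rough estimator can make the relationship visible. Every rate in the sketch below is a placeholder; substitute your vendor's published pricing and whatever energy and grid-intensity figures you trust.

```python
def estimate_usage(tokens_in: int, tokens_out: int,
                   usd_per_1k_in: float, usd_per_1k_out: float,
                   kwh_per_1k_tokens: float, grid_kg_co2_per_kwh: float):
    """Rough dollar and carbon estimate for an LLM workload.
    All rate parameters are hypothetical placeholders."""
    cost_usd = (tokens_in / 1000 * usd_per_1k_in
                + tokens_out / 1000 * usd_per_1k_out)
    energy_kwh = (tokens_in + tokens_out) / 1000 * kwh_per_1k_tokens
    carbon_kg = energy_kwh * grid_kg_co2_per_kwh
    return cost_usd, carbon_kg

cost, carbon = estimate_usage(
    tokens_in=2_000_000, tokens_out=500_000,
    usd_per_1k_in=0.001, usd_per_1k_out=0.003,  # hypothetical pricing
    kwh_per_1k_tokens=0.0003,                   # hypothetical energy intensity
    grid_kg_co2_per_kwh=0.4,                    # hypothetical grid mix
)
print(f"${cost:.2f}, {carbon:.2f} kg CO2e")  # $3.50, 0.30 kg CO2e
```

Even crude per-team estimates like this give you a number to hold teams accountable against, which is the visibility the section argues for.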

  3. Fit the model to the problem

Not all tasks require state-of-the-art models, and some tasks don’t require AI at all. More complex inference models consume more resources, and smaller models can often achieve the same results.

Careful model selection can significantly reduce costs without compromising business value.
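Fitting the model to the problem often amounts to a simple routing rule: send each task to the cheapest model that meets its requirements. The model tiers and threshold below are hypothetical, stand-ins for whatever models and criteria your own evaluation supports.

```python
def pick_model(task: str, needs_reasoning: bool) -> str:
    """Route a task to the cheapest model tier that meets its needs.
    Tier names and the length threshold are hypothetical placeholders."""
    if not needs_reasoning:
        # Classification, extraction, tagging: a small model usually suffices.
        return "small-fast-model"
    if len(task) < 2_000:
        # Routine generation over short context: a mid-size model will do.
        return "mid-size-model"
    # Reserve the largest (most resource-hungry) model for genuinely hard work.
    return "frontier-model"

print(pick_model("tag this support ticket", needs_reasoning=False))
print(pick_model("draft a summary of this survey", needs_reasoning=True))
```

The routing logic itself is cheap to run; the savings come from the large share of everyday tasks that never touch the biggest model.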

  4. Design for efficiency at the architectural level

How you build your system is important.

Engineering decisions, from how you structure your workload to how you scale your system, have a direct impact on both cost and emissions. Time invested early in efficient architecture compounds.
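One concrete example of such a workload-structuring decision is moving non-urgent work from one call per item to batched asynchronous jobs. The toy simulation below uses made-up overhead numbers purely to show why per-call overhead dominates at scale.

```python
def process_individually(items, per_call_overhead_s=0.5, per_item_s=0.01):
    """Simulated total cost when every item is its own API round-trip.
    Overhead figures are hypothetical."""
    return len(items) * (per_call_overhead_s + per_item_s)

def process_batched(items, batch_size=50,
                    per_call_overhead_s=0.5, per_item_s=0.01):
    """Simulated total cost when items are grouped into batched calls."""
    n_calls = -(-len(items) // batch_size)  # ceiling division
    return n_calls * per_call_overhead_s + len(items) * per_item_s

items = list(range(500))
print(process_individually(items))  # 255.0 simulated seconds
print(process_batched(items))       # 10.0 simulated seconds
```

The same logic applies to compute and therefore emissions: amortizing fixed overhead across batches is an architectural choice, and it is far cheaper to make early than to retrofit.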

  5. Choose partners with clear commitments

For most organizations, infrastructure is external. Therefore, vendor selection is important.

Cloud providers are ramping up their efforts around water positivity, renewable energy, and net zero goals. These efforts should become part of how organizations think about where they run their systems.

Responsible AI is part of building a better company

As AI becomes embedded in different industries, the question is not just how it will be used, but how it will be managed, measured, and maintained over time.

AI is no longer something organizations experiment with. It’s becoming part of the way we work, and increasingly part of the systems that support it.

It's no longer just about what these systems can do, but how they are run, measured, and maintained over time. And in many organizations, adoption is happening faster than the ability to manage what is being built.

At scale, it doesn’t just impact individual businesses. It shapes the overall demand for shared infrastructure and the systems that support it.

At Culture Amp, we always take a human-centered approach to the way we work. AI should enhance the role of people, helping them make more informed, more timely, and better decisions; it should not replace them, but enable them to perform their roles more effectively.

Organizations that get this right will be the ones that establish visibility early on, make thoughtful decisions about where and how they use AI, and treat it as something that needs to be actively managed over time.

Performance and cost are not the only measures that matter. What will ultimately define these systems is whether they improve people's jobs and remain sustainable at the scale at which they are being built.

  • Doug English is the Chief Technology Officer and Co-Founder of Culture Amp.


