OPINION: Before Alaska becomes an AI data farm, be sure to read the fine print



Stargate artificial intelligence data center complex in Abilene, Texas. (AP)

Artificial intelligence is reshaping the economy and culture of the United States and other countries. Alaska is being touted as the next frontier for data centers, one of the most energy-intensive industries and one devoted primarily to advancing AI, a technology whose social disruption is still impossible to measure.

Gov. Mike Dunleavy, the state's biggest promoter, has invited more than a dozen tech companies, including affiliates of Microsoft, Facebook and Amazon, to set up “data farms” in Alaska. He has personally given executives tours of potential sites in the Anchorage and Fairbanks areas. The Alaska Legislature has taken a more cautious stance, stating in House Concurrent Resolution 3 (HCR 3) that “the development and use of artificial intelligence and the establishment of data centers in the state have the potential to stimulate economic growth, create job opportunities, and position the state as a leader in technological innovation.” However, this resolution does not mention the shortcomings caused by data center development.

The Fairbanks-based Northern Alaska Environmental Center (NAEC) studies the known and potential benefits, costs and risks of data center growth in the state. It calls for an informed, unhurried, transparent and measured approach.

First, what is a data center? It is a facility that houses the servers, storage, networking, and other computing infrastructure needed to support AI and other digital services, along with the associated electrical and cooling systems.

Generally, there are two categories of data centers. The first is the large hyperscale facility, which typically operates at multi-megawatt capacity and is designed to grow much larger. One example is the proposed Far North Digital (FND) data center at Prudhoe Bay. It would start with a capacity of 120 megawatts, with "potential for significant expansion," and would be powered by natural gas.

The second category is the micro, or microgrid, data center. A good example is the GreenSpark Corp./Cordova Electric Cooperative's 150-kilowatt facility in Cordova, powered entirely by renewable energy from a nearby hydroelectric plant. We agree with analysis from the Alaska Center for Energy and Power (ACEP) at the University of Alaska Fairbanks, which argues that such small, sustainable data centers, integrated into existing microgrids, are the more viable model for Alaska, especially in underserved and remote communities.

The main problem with data centers is their enormous energy demand. This is especially true of hyperscale facilities, which can consume as much power as 100,000 homes. Depending on conditions, cooling can account for roughly 40% of a facility's energy use. Although Alaska's cold climate is an advantage, reducing the need for energy-intensive mechanical cooling, cooling still requires large amounts of water. NAEC holds that new data centers should be required to minimize water use and thermal pollution and to reuse waste heat for local heating.

The Railbelt grid already faces capacity constraints and the need for expensive upgrades. NAEC believes that before new data centers are developed, regulatory safeguards must be in place to prevent them from worsening grid shortages and driving up household electricity costs.

Even as operators sign renewable contracts and add clean generation, most of the electricity powering data centers still comes from fossil fuels. Building fossil-fueled data centers would lock in high-emissions infrastructure for decades, contradicting global decarbonization efforts. NAEC proposes that new data centers be required to build or contract enough clean generation (wind, solar, hydro or geothermal) to match their consumption.

There are many other concerns to address when considering data center and AI development. One is electronic waste, or e-waste. When data centers are upgraded, they generate e-waste containing hazardous materials. Given the remoteness of Alaska's potential sites and the state's limited recycling infrastructure, the cost of properly disposing of e-waste must be factored into data center decisions.

In the rush to embrace data centers, some states are offering significant tax breaks and subsidies, many with limited public benefit. Alaska must learn from mistakes made elsewhere. Before we consider approving new data centers, we need to enact legislation that ensures profit-making companies do not receive power rate discounts or tax breaks and do not pass on additional costs, including the cost of required upgrades, to ratepayers.

Yes, data centers offer some needed diversification to Alaska's economy, but not much. The industry is highly capital-intensive, employing many people during construction but very few during operations. Companies should be required, as a practical condition, to train and employ local workers.

Additionally, there are serious but little-recognized issues beyond energy, economics and the environment. Data centers expand the computing power available to increasingly capable AI systems. Some researchers and industry leaders say this could accelerate progress toward AI that matches or surpasses human capabilities, bringing new risks with it. In the end, the greatest cost of data centers and AI may be the changes they bring to our humanity and society, for which we are woefully unprepared.

Roger Kaye is a Fairbanks-based freelance writer and author of "Last Great Wilderness: The Campaign to Establish the Arctic National Wildlife Refuge." He serves on the Northern Alaska Environmental Center's Issues Committee.

• • •

The Anchorage Daily News welcomes a wide range of perspectives. To submit a piece for consideration, email commentary(at)adn.com. Submissions shorter than 200 words should be sent to letters@adn.com. Guidelines for letters and commentary are available on the ADN website.




