Gordon Campbell On Why Data Centres Are Not Your (Or The Planet’s) Friend

It sounds too good to be true. A giant multinational mothership – Amazon Web Services aka AWS – is supposedly intent on spending $7.5 billion here on building and operating a cluster of state-of-the-art data centres likely to create 1,000 fulltime jobs and train 100,000 locals. All while, supposedly, doing little or no harm to the environment or to our electricity grid, thanks to a supply agreement for renewable energy that AWS has signed with Mercury. Is this all too good to be true? Yes, it is.

There is no way of verifying the above claims. AWS is not a transparent company. “For security reasons” we aren’t even being told where these AWS data centres are located on New Zealand soil, beyond AWS country manager Manuel Bohnet saying they’re “in and around Auckland” in zones at reduced risk from natural disasters, but close enough to facilitate high-speed operations. There has been no disclosure of the privacy protocols under which these centres are protecting and/or using the data they manage.

AWS is being especially coy about its energy needs and its water requirements (Google is equally secretive about such matters). Also, the fabled 1,000 jobs that AWS will create in New Zealand apparently include all of the engineers, builders, suppliers, telcos (and, presumably, baristas) involved in constructing the centres and servicing their operational needs.

As Bohnet also told RNZ, only 150 staff are involved in providing the range of AWS data centre services. These services include analytics, networking, computing, cloud storage, generative AI and machine learning. There is no breakdown on how many of the staff doing these tasks are locals. What constitutes “training” and how progress to the 100,000 claim is being monitored are also opaque.

All that, however, may be the least of the concerns. Globally, Microsoft, Google, Amazon and Meta are locked in competition to provide infrastructure for the AI race. Data centres exist to provide the computing muscle for artificial intelligence operations, and they consume vast amounts of two resources – electricity and water – that are in increasingly short supply around the world, as demand for computing power ramps up, and as climate change bites.

When Google announced its environmental impact [in 2024], it revealed that its own greenhouse gas emissions had risen 48% in the last five years, and 13% in the last 12 months, largely driven by increased data centre demand to service its AI needs.

In this country, the impacts of the AWS (and Microsoft) data centres (a) on New Zealand’s already stressed electricity grid and (b) on household electricity prices, remain swathed in secrecy. They seem likely to be sizeable. By 2030, data centres are on track to rival the Tiwai Point aluminium smelter as New Zealand’s biggest users of electricity. (And Tiwai Point isn’t going away anytime soon.)

True, some data centres have engaged in expanding grid capacity. Earlier this year for instance, the CDC data centre company – in which the infrastructure investor Infratil has a 49% stake – was reported to be planning an extra 200MW of construction, on top of the 400MW it already has in train in Australia and New Zealand.

At present though, there is no indication that data centres are being charged a premium rate for the power they are drawing from the existing grid. (If anything the reverse is likely: they’re probably being offered sweetheart electricity deals to entice them here. See below.) Ideally, such a premium fee would be earmarked to finance (a) the grid upgrades and (b) the additional renewables resources that the activities of data centres will require. As things stand, these major costs are likely to be borne by taxpayers/consumers.

In future, there has to be a question mark over whether grid capacity can affordably be built to a standard that serves all the needs of all the customers at all times. Once again though, there doesn’t seem to be any suggestion that the increasing pressure that data centres will place on capacity may need to be managed – in part at least – by “flexible use” contracts that will require the data centres to trade electricity with other users during periods of peak use.

Yes, AWS does have a contract with Mercury for renewable energy. Yet given New Zealand’s lack of significant electricity storage (the coalition government’s kneejerk cancellation of the Lake Onslow battery storage project looks more short-sighted by the day), supply will inevitably revert to fossil fuels when lake levels are low, the sun isn’t shining and the wind isn’t blowing. (The fluctuations in electricity usage inherent in data centre AI training operations may also conceivably need fossil fuel backups.) Ultimately, these renewables contracts will also carry an opportunity cost, given that once the commitment to data centres has been met, there is likely to be far less renewable energy available to other users.

Within the data centres, it is unclear what percentage of the electricity demand is being generated by the controversial web crawling and model training involved in building the AI chatbots, as opposed to what percentage is being consumed by the generative AI searches being initiated by the public. Both are significant drains on supply, and the remainder devoted to other functions (such as cloud storage) is, by comparison, pretty minimal.

Footnote: It is estimated that by 2050, data centres will be consuming nearly 9% of the world’s entire energy supply. Given that trend, small nuclear reactors located next to data centres are being touted as one answer to the hunger that data centres have for more and more electricity. This isn’t just a hypothetical. In June of this year, AWS signed a supply deal with Talen Energy for almost 2,000 megawatts of electricity from Talen’s Susquehanna reactor, for use in the adjacent AWS data centre complex in Pennsylvania. Microsoft is reportedly recommissioning the Three Mile Island nuclear site, and last October, Google announced that it is buying “six or seven” small nuclear reactors to help satisfy its energy needs.

Water, water

As for water… our cooler climate (when compared to other data centre sites in the likes of Mexico, Saudi Arabia, Chile and Spain) does lessen the amount of water that needs to be used for cooling purposes. Some progress is also being made by the data centres themselves in devising more efficient closed-circuit water cooling systems. Even so, the volume of water they need for cooling remains substantial.

A massive investigative article published last month by the Spanish newspaper El Pais into data centres in Chile, Mexico and Spain laid out the evidence of the environmental threats posed by data centres, and of the extent of community resistance in each of those countries to their activities and to their operational secrecy. In some cases, this has included evidence of data centres and their enablers draining the underground aquifers (on which local communities depend) in order to meet their insatiable demands for water.

To supply water to their data centres, El Pais reported, these companies rely on wells whose permits are sometimes being obtained under other names and for other ostensible purposes, including the supposed irrigation of parklands. “That’s why it’s complicated to track them down,” explained Lorena Garcis Estrada, professor and environmental geographer at the University of Queretaro, in central Mexico. Between May 2024 and January 2025, Microsoft, Google, and AWS all set up shop in Queretaro. As El Pais also says:

Amazon, Microsoft and Google… are now extracting basic resources such as water, in areas facing severe droughts. They are also seizing large tracts of land to build their centres and the power grids that feed them. All under a veil of secrecy that, at times, includes the use of shell companies and, in almost all cases, the complicity and collusion of local authorities.

Complicity and collusion of local authorities? Surely, not in New Zealand.

Feeding the AI furnace

Reportedly, a ChatGPT search consumes ten times the energy of an ordinary Google search. (Other reports say five times.) This is very bad news for the environment, given the evidence of the declining use of the ordinary Google search engine, and the increasing tendency, among young people in particular, to use generative AI for search purposes.
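
For a sense of what that multiplier implies at scale, here is a minimal back-of-the-envelope sketch. Every figure in it is an illustrative assumption rather than a measurement; the “ten times” factor is simply the estimate quoted above.

    # Rough arithmetic only: all figures are assumptions for illustration.
    conventional_wh = 0.3                    # assumed energy per ordinary search, in watt-hours
    generative_wh = 10 * conventional_wh     # the "ten times" estimate quoted above
    queries_per_day = 1_000_000              # a hypothetical million searches a day

    extra_kwh_per_day = queries_per_day * (generative_wh - conventional_wh) / 1000
    print(f"Extra energy if those searches shift to generative AI: {extra_kwh_per_day:,.0f} kWh per day")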

The voracious energy needs of these centres are a direct reflection of the vast computing power they contain. Floating point operations per second (FLOPS) is the common metric for measuring computing power and speed. (Greater speed reduces the “latency” period between asking a chatbot a question and getting an answer.) A high-end smartphone will probably have about a two to three teraFLOPS capability, which equates to two to three trillion floating point operations per second.

Data centres, however, are operating in the realm of petaFLOPS, a thousand times more powerful than a teraFLOPS. A petaFLOP is a 1 with 15 noughts after it, aka a quadrillion. In 2020, there were reportedly only 10 large-scale computer models capable of 10 petaFLOPS performance levels. By 2024, there were 81, and that pace is accelerating.
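
To make those scales concrete, here is a minimal sketch of the arithmetic; the 2.5 teraFLOPS smartphone figure is an assumption drawn from the range quoted above, not a benchmark.

    # Back-of-the-envelope comparison of the FLOPS scales mentioned above.
    TERA = 10**12                      # one teraFLOPS: a trillion operations per second
    PETA = 10**15                      # one petaFLOPS: a quadrillion (a 1 with 15 noughts)

    smartphone = 2.5 * TERA            # assumed high-end phone capability
    data_centre_model = 10 * PETA      # the "10 petaFLOPS" class of model mentioned above

    print(f"One petaFLOPS is {PETA // TERA}x one teraFLOPS")
    print(f"A 10-petaFLOPS system does the work of roughly {data_centre_model / smartphone:,.0f} high-end phones")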

As with bitcoin mining, the heat generated by the high-performance semiconductor chips essential to these hyperspeed computing processes is what requires the lavish inputs of water, in order to cool down the circuitry.

Footnote: The Global Data Centre Trends survey for 2025 can be found here. This survey measures current data centre growth, the extent of over-capacity, and pricing and availability trends worldwide. New Zealand does not yet rate a mention. (In late May however, Bloomberg News did report that AWS was treating Mexico, Chile, New Zealand, Saudi Arabia and Taiwan as sites in which it planned to aggressively expand its operations.) The Global Trends commentary on Sydney – which is clearly a mature data centre market close to saturation point – is highly revealing.

Despite its many attractions to Big Data, Sydney (“one of the prime hyperscale hubs”) is now enacting “planning constraints” that are (not yet) evident in Melbourne, which – whoopee – “offers attractive land and power availability with multiple campus-styled developments geared towards hyperscale and AI customers.” Plainly, in the de-regulatory striptease to attract data centres, New Zealand and Melbourne are both seeking to woo Big Data to come up and see their etchings, whatever it takes.

The Chilean Connection

The secretive way that AWS began building its data centre complex in Huechuraba, a municipality in the north of the Chilean capital Santiago, does not inspire much confidence in the company’s good neighbourliness. The 24 high voltage power pylons servicing the project were initially vested in the name of a Chilean business group that withdrew shortly before AWS stepped out of the shadows and announced its plans for a three-part data centre complex.

Locals had to connect the dots to realise that the pylon project being routed from a substation located in their low-income La Pincoya neighbourhood (and running through their only local green area to the 10-hectare AWS data centre project) was part and parcel of the same initiative. The pylons were always to be the essential power conduits for the second and third phases, a common feature of data centres as they scale up their capacity – with this evolution being dependent (as AWS belatedly informed La Pincoya residents) “on the existence of a concessionary electrical company that can provide the required supply.”

This is instructive. Clearly, the New Zealand public needs to be told what “concessionary” electricity rates have been offered to AWS by Mercury (and endorsed by our government) as enticements to proceed with their data centres in New Zealand. (BTW, the common term “hyperscale” refers to how data centres routinely consist of modular infrastructure that can then be scaled up as more powerful chips are deployed, or as more buildings are added or subtracted, in line with demand.)

The ongoing AWS build in Santiago – and its cautious engagement with the protests being made by local residents – has been in part a recognition of the stern rebuff meted out to Google by a Chilean environmental court over the data centre project that Google had intended to build in Cerrillos, on Santiago’s southern outskirts. As El Pais reported:

The judge who halted Google’s project [in 2024] ruled that neither the company nor the authorities took into account, in their initial assessment, the likely impact of the data centre on Santiago’s central aquifer, amid climate change and a severe water crisis.

Reportedly, this legal setback has made Google and its fellow Big Data titans far more sensitive and willing to engage with any forms of local opposition. That belated sensitivity should be taken as an incentive by the New Zealand public who – earlier this week – watched their Prime Minister repeating company spin as if it were gospel truth, and generally acting like a front-of-house greeter for the data centre industry.

For starters, we need to know where these data centre buildings are located on our soil. We also need to know what concessions on electricity and water have been offered to them. We need to know what impact their activities are estimated to have, directly or indirectly, on household electricity prices. We need to know how their activities will be monitored, and by whom. We need to know what the privacy protocols governing this industry are, and what guarantees we can trust that the personal data of New Zealanders held by these centres will not be shifted offshore, or monetised here or abroad, without prior and explicit approval.

Moving right along… we also need an independent agency to assess the increasing impact that data centres will have on our electricity grid and on renewables generation, and what premium fees we are demanding from the likes of AWS and Microsoft as their contribution to the maintenance and upgrade of our grid, and to the expansion of our renewables capability. We also need a Prime Minister willing and able to distance the government from the sales pitches being offered to him by the data centre industry. That’s not the entire list, but it would be a good start.

Footnote: One virtue of the recent black comedy Eddington was that the movie set itself in the historical moment – May 2020 – when the West began to lose its collective mind. That was when pandemic lockdowns began to send people online and down the conspiracy rabbit holes in which many of us still reside. The main catalysts for how this angry, anarchic process plays out in the small New Mexico town of Eddington are (a) the Covid masking and social distancing rules and (b) the imminent arrival of a data centre that’s being touted by the local mayor (and by his business backers) as the town’s economic saviour.

Eddington is good at showing how the digital giants focus on amplifying fear and prejudice around the issues that divide a community – conflict means clicks – even as they de-prioritise content relating to the points on which the community might agree. After the town has torn itself apart, the main winner is seen to be the data centre (called SolidGoldMagikarp) that has so successfully pursued social polarisation as a conscious business tactic.

There’s a good interview available here with Eddington director Ari Aster, as he discusses the film’s themes at length with host Jason Di Rosso, on the always excellent ABC Screen Show. Di Rosso is a model of how state radio can talk intelligently about film, and trust its audience. There is no equivalent in this country.