> “The data center itself pays an enormous amount of property taxes, and in the state of Illinois, our school system is funded primarily by property taxes,” he says. He adds that a $33 million elementary school was recently constructed in one of DeKalb’s most disenfranchised neighbourhoods due to funding he attributes to[…]

They should be careful about this. ~20 years ago AMD used to proudly proclaim how much of the funding for Del Valle ISD came from the property taxes they paid… until they decided to move from Ben White Blvd to Southwest Pkwy (no longer in Del Valle ISD). And that wasn’t even associated with some collapse of an investment bubble.
> Data centers have already emerged as a significant driver of economic expansion in the US, accounting for 80 percent of private sector growth in the first half of 2025, according to S&P Global.

Well, yes, but no.
My county, and some towns in the next county over in central NC, have halted data center construction and cancelled existing plans.
Developers are not happy.
From the article:

> Chipmaker Nvidia says it has also developed more energy-efficient chips that require even less cooling.
This seems like a (deliberate?) confusion of metrics. Everybody is trying to build more efficient chips that produce more computation per unit of energy consumption, but the main thing they are doing is producing bigger chips that produce more computation from more energy.
For example, Nvidia is moving in the direction of co-packaging multiple full-reticle chips tightly interconnected with silicon interposer bridges. This means that a single 4-reticle package can consume ~3 kW. That's effectively putting four GPUs into a single package, so in one sense it compares favorably to four separate GPUs, which spend more power on chip-to-chip communication than the lower-power, higher-bandwidth silicon-to-silicon interconnections do. So it's more FLOPs per Watt, but it's also more Watts per package and more Watts per rack, increasing both the compute density and the power density of an AI datacenter. You could perhaps think of this as compressing an existing datacenter, and its power needs, into a smaller floor footprint.
That's the paradox of AI compute. The more FLOPs you can produce per Watt, the more Watts people want to deploy.
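For concreteness, here's a quick Python sketch of that tradeoff. Every number in it (per-GPU wattage, PFLOP/s, packages per rack) is an illustrative assumption, not a vendor spec:

```python
# Back-of-envelope: FLOPs per Watt vs. Watts per rack.
# Every number here is an illustrative assumption, not a vendor spec.

DISCRETE_GPU_W = 800        # assumed power draw of one standalone GPU
DISCRETE_GPU_PFLOPS = 2.0   # assumed peak PFLOP/s of that GPU

PACKAGE_W = 3000            # assumed draw of a 4-reticle co-packaged part
PACKAGE_PFLOPS = 4 * 2.2    # four dies, each slightly more effective because
                            # interposer links cost less energy than
                            # board-level chip-to-chip traffic

def pflops_per_watt(pflops: float, watts: float) -> float:
    """Efficiency metric: peak PFLOP/s delivered per Watt consumed."""
    return pflops / watts

print(f"4 discrete GPUs: "
      f"{pflops_per_watt(4 * DISCRETE_GPU_PFLOPS, 4 * DISCRETE_GPU_W):.5f} PFLOP/s/W")
print(f"1 big package:   "
      f"{pflops_per_watt(PACKAGE_PFLOPS, PACKAGE_W):.5f} PFLOP/s/W")

# Efficiency improves, yet the Watts concentrate: density goes up, not down.
PACKAGES_PER_RACK = 18      # assumed
print(f"Rack power: {PACKAGES_PER_RACK * PACKAGE_W / 1000:.0f} kW per rack")
```

Under these assumptions the big package wins on PFLOP/s per Watt while still pushing total rack power up, which is the whole point.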
First it was crypto driving electricity prices up, now it is AI. I shudder to think what bullshit they will try next.
Cloud storage and video streaming?
The county in which I currently reside rejected a proposed data center development a few months ago. But the developers will probably be back in a year or so with a different plan, or one that targets primarily land within the county seat's city limits, which the county government really has no say over.

But this is a county government that about a decade ago rejected a proposed wind power development, then a few years later approved a hideous solar farm just north of the county seat. So who knows what they will do next?

Where can I find pictures of the “hideous” solar farm? The picture in the link looks just like my sheep pasture, but with a bunch of solar panels in it and no sheep (yet).
Well, it is a source of tension. Raising taxes is political suicide. But people want and demand schools, roads, police, fire, EMS, sewer/water, and power infrastructure, all of which incur lasting and growing maintenance obligations--obligations you need taxes to pay for, taxes that people invariably don't want to pay. But they want the services.
And so counties and municipalities are stuck using "growth", any "growth" no matter how toxic, to fill in the budget gap and pay the bills citizens don't want to pay for the things they want. So when a data center offers a quick infusion of property taxes and economic activity, it is very hard to resist the siren call. Hence that town in MO that took the deal--and half the city council members who signed it were removed from office in return.
How much of this rural rejection of data centers is NIMBYism and how much is actually about the data centers themselves, IDK. I suspect it is a lot of the former.
Classic case of Jevons paradox.
Rural America mainly cares about living decently, making reasonable money, and going about it without being too crunched in. A datacenter can potentially offer that, same as a factory, a mine, a warehouse, or a hospital. But if you don't hire and train locally, you won't get local support.
> In Tazewell County, Illinois, Michael Deppert depends on a natural pool of water beneath the sandy soils of his farm to irrigate the pumpkins, corn and soybeans growing in his fields. So when a data center was proposed about eight miles away, he feared it would tap the same aquifer, potentially eroding crop yields and profits.

Yeah, but think about it, deep down: what is really more important, food or a datacenter?
If only there were as much demand to build housing as there is to build data centers.
Aside from the obvious matter of AI executives with a vested interest being prone to just lying, I'm highly skeptical of the 'nah, we fixed water use, bro' argument just on basic theoretical grounds.
There's obviously some fiddly engineering to be done around the edges of 'closed loop' to drive down the number of leaks, and how often you have to swap out coolant that is just too full of random ions (probably galvanically corroding something), or of either the biocides or the biofilms, whichever is winning at the moment; but that's not where the big use is.
If you have heat and wish to be rid of it, you face a fairly fundamental tradeoff: because water has a decently high enthalpy of vaporization for such a common and well-behaved material, you can use evaporative cooling to good effect if you want to reduce electricity costs and space/volume requirements.
If you want to conserve water, you can instead blow air over heat sinks connected to your closed loop, and face the mix of energy costs and volume requirements for all those fin stacks.
It's (mostly) true that you can run a low-water data center if you wish; in the worst case you can fill the closed loop with some other working fluid. But when everyone is talking about the eleventy-zillion gigawatts they can't source for their precious chatbots, do we really think people are choosing the electricity-heavy cooling option over the 'get water for basically zero at agricultural rates, or because well drilling isn't regulated there' option that uses less electricity?
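A minimal Python sketch of that tradeoff. Water's latent heat of vaporization is a real constant; the facility size and dry-cooling overhead are assumptions picked for illustration:

```python
# Evaporative vs. dry cooling for an assumed 100 MW facility.
# The latent heat of water is a real constant; everything else is assumed.

LATENT_HEAT_J_PER_KG = 2.26e6   # enthalpy of vaporization of water, ~2.26 MJ/kg

heat_to_reject_w = 100e6        # assumed 100 MW of IT load, nearly all -> heat

# Option 1: evaporative cooling -- dump heat by boiling off water.
kg_per_s = heat_to_reject_w / LATENT_HEAT_J_PER_KG
liters_per_day = kg_per_s * 86_400          # 1 kg of water is about 1 liter
print(f"Evaporative: ~{kg_per_s:.0f} L/s, "
      f"~{liters_per_day / 1e6:.1f} million L/day")

# Option 2: dry cooling -- fans over fin stacks on the closed loop.
# The overhead fraction is an assumption, not a measured figure.
DRY_COOLING_OVERHEAD = 0.05     # assumed 5% of load spent on fans/pumps
extra_mw = heat_to_reject_w * DRY_COOLING_OVERHEAD / 1e6
print(f"Dry: ~zero water, but ~{extra_mw:.0f} MW of extra fan/pump load")
```

Millions of liters a day of nearly free water versus megawatts of extra grid draw: not hard to guess which way the incentives point.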
There's a simple reason here. These data centers are being hawked because the entire industrial production side of building them sees profit. Counties and municipalities agree to them because they're desperate for more tax revenue they cannot get without relying on infinite growth to dig their way out of their voter-inflicted hole (which of course doesn't work).
The amount of water datacenters use directly is actually pretty small compared to the amount of water they consume indirectly via the water needed for power generation.
The direct water use is a weak argument, since the power generation is where the majority of the water use is.
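A rough Python comparison of the two. The per-kWh consumption factors below are ballpark assumptions for illustration, not measurements:

```python
# Direct (on-site cooling) vs. indirect (power generation) water use.
# All per-kWh consumption factors are ballpark assumptions for illustration.

DIRECT_L_PER_KWH = 0.5        # assumed on-site cooling water per kWh of IT load
THERMO_L_PER_KWH = 1.8        # assumed consumptive use per kWh at a
                              # thermoelectric (coal/gas/nuclear) plant
RENEWABLE_L_PER_KWH = 0.02    # assumed near-zero for wind/solar PV

facility_mw = 100             # assumed facility size
annual_kwh = facility_mw * 1000 * 8760    # kWh per year at full load

for grid, factor in [("thermoelectric grid", THERMO_L_PER_KWH),
                     ("wind/solar grid", RENEWABLE_L_PER_KWH)]:
    direct = DIRECT_L_PER_KWH * annual_kwh
    indirect = factor * annual_kwh
    print(f"{grid}: direct {direct / 1e9:.2f} GL/yr, "
          f"indirect {indirect / 1e9:.2f} GL/yr")
```

Under these assumptions the indirect use dominates on a thermoelectric grid and nearly vanishes on wind/solar.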
Especially in Illinois, which has among the highest percentage of nuclear power, the most water-intensive power source. Direct water usage is only really a concern if you are using potable water in a desert. But tapping an aquifer in Illinois for data center cooling is just not a big deal compared to the agricultural uses or the electricity demand of the datacenter.

And a lot of the water in Illinois is used for irrigating corn, which is possibly the only thing dumber and more destructive than AI datacenters.

I wonder if you realized how much of a point in favour of Solar Panels you just made... Replace Illinois corn farms specifically meant for Ethanol production with Solar Farms and Battery Storage. You likely just generated enough power to power all of Illinois.

Yea, you'd have to eminent domain the Big AG producers to get that to happen. Here in Nebraska, the sandhills got loaded with wind turbines (great windage out there for them)... but that is because the land owners could farm and ranch around them. IDK if there's any kind of ag activity where you can do a similar thing under a solar field.
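As a sanity check on the corn-ethanol-to-solar claim above, a quick Python sketch; every figure is a rough public ballpark chosen for illustration, not a measured value:

```python
# Sanity check of the corn-ethanol -> solar claim. Every number is a
# rough public ballpark chosen for illustration, not a measured figure.

ETHANOL_CORN_ACRES = 3.5e6    # assumed IL corn acreage grown for ethanol
ACRES_PER_MW = 6              # assumed land use of utility-scale PV
CAPACITY_FACTOR = 0.22        # assumed Midwest PV capacity factor
IL_ANNUAL_TWH = 140           # assumed Illinois electricity consumption

pv_mw = ETHANOL_CORN_ACRES / ACRES_PER_MW
pv_twh = pv_mw * CAPACITY_FACTOR * 8760 / 1e6
print(f"~{pv_mw / 1000:.0f} GW of PV producing ~{pv_twh:.0f} TWh/yr, "
      f"vs ~{IL_ANNUAL_TWH} TWh/yr consumed statewide")
```

With these assumptions the acreage yields several times the state's annual consumption, so even if the ballparks are off by a factor of a few, the claim holds up.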