Rural America is resisting the surge in data center construction

In the town I used to live in, in western NC, we got a Google data center. They didn't hire locally that much. It didn't really affect the place one way or the other, but that town and the region around it are pretty much all industry and old, empty furniture factories. There are tons of empty giant buildings and not a lot of farmland. It was kind of the perfect place for a data center.

I guess the security guards used to ride around on Specialized Rockhopper Elites. I bought one at a pawn shop in town, and they told me some Google guys had sold four of them there. Also, if I said anything slightly nerdy, people asked me if I worked for Google... so that was fun.
 
Upvote
61 (61 / 0)

CTho9305

Seniorius Lurkius
23
Subscriptor
“The data center itself pays an enormous amount of property taxes, and in the state of Illinois, our school system is funded primarily by property taxes,” he says. He adds that a $33 million elementary school was recently constructed in one of DeKalb’s most disenfranchised neighbourhoods due to funding he attributes to[…]
They should be careful about this. ~20 years ago AMD used to proudly proclaim how much of the funding for Del Valle ISD came from the property taxes they paid…until they decided to move from Ben White Blvd to Southwest Pkwy (no longer in Del Valle ISD). And that wasn’t even associated with some collapse of an investment bubble.
 
Upvote
70 (70 / 0)
If holding back data centers means the spread of AI into our lives has to slow down...well, that's a sacrifice I'm willing to make. :)

Although electricity prices are not too bad where I live, thanks to hydroelectric sources, other areas are not so lucky, and for them electricity is already too expensive. If more data centers will lead to higher electricity prices (and higher water usage in places where rights over fresh water are already fought over), I am in favor of slowing their growth: keeping electricity affordable for homes and for the growth of electric vehicles, and maintaining our fresh water supply, should take priority.
 
Upvote
92 (96 / -4)
Data centers have already emerged as a significant driver of economic expansion in the US, accounting for 80 percent of private sector growth in the first half of 2025, according to S&P Global.
Well, yes, but no.

That "economic expansion" is the GDP of construction: a sugar high that ends once the site is operational, since data centers employ basically no one and only pay utility bills. Traditionally, GDP from construction is an investment that increases productivity, like a bridge that connects two communities and permits commerce. Data centers don't do that.
 
Upvote
134 (136 / -2)
I think, beyond any of the underlying technical details, data centers and AI are simply the most publicly visible representation of the oligarchic concentration of wealth in America, and thus act as a proxy for our much larger social and economic issues.

Anyway let’s all vote for more billionaires again, as they so dearly care about the common man.
 
Upvote
93 (95 / -2)

DDopson

Ars Tribunus Militum
2,965
Subscriptor++
From the article:
Chipmaker Nvidia says it has also developed more energy-efficient chips that require even less cooling.

This seems like a (deliberate?) confusion of metrics. Everybody is trying to build more efficient chips that produce more computation per unit of energy consumption, but the main thing they are doing is producing bigger chips that produce more computation from more energy.

For example, Nvidia is moving in the direction of co-packaging multiple full-reticle chips, tightly interconnected with silicon interposer bridges. This means a single 4-reticle package can consume ~3 kW. That's effectively four GPUs in a single package, so in one sense it compares favorably to four separate GPUs, which spend more power on chip-to-chip communication than the lower-power, higher-bandwidth silicon-to-silicon interconnects do. So it's more FLOPs per Watt, but it's also more Watts per package and more Watts per rack, increasing both the compute and power density of an AI datacenter. You could perhaps think of this as compressing an existing datacenter, and its power needs, into a smaller floor footprint.

That's the paradox of AI compute. The more FLOPs you can produce per Watt, the more Watts people want to deploy.
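The package-level arithmetic above can be sketched with made-up but plausible numbers (the TFLOPs, wattages, and overhead figures below are illustrative assumptions, not published Nvidia specs):

```python
def flops_per_watt(tflops: float, watts: float) -> float:
    """Throughput per unit power, in TFLOPs per Watt."""
    return tflops / watts

# Four discrete GPUs: assume 750 W and 1000 TFLOPs each, with ~10% of
# throughput lost to board-level chip-to-chip communication.
discrete_tflops = 4 * 1000 * 0.90
discrete_watts = 4 * 750

# One 4-reticle co-packaged part: assume ~3 kW total, with the on-package
# silicon bridges recovering most of that communication overhead.
copack_tflops = 4 * 1000 * 0.98
copack_watts = 3000

print(round(flops_per_watt(discrete_tflops, discrete_watts), 2))  # 1.2
print(round(flops_per_watt(copack_tflops, copack_watts), 2))      # 1.31
```

Efficiency improves a little; absolute power per package stays at 3 kW. Both things are true at once, which is why "more efficient chips" says nothing about total site power.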
 
Upvote
89 (89 / 0)
Aside from the obvious matter of AI executives with a vested interest being prone to just lying, I'm highly skeptical of the 'nah, we fixed water use, bro' argument on basic theoretical grounds.

There's obviously some fiddly engineering to be done around the edges of 'closed loop': driving down the number of leaks, and how often you have to swap out coolant that has become too full of stray ions (probably galvanically corroding something) or of the biocides or the biofilms, whichever is winning at the moment. But that's not where the big use is.

If you have heat and wish to be rid of it, you face a fairly fundamental tradeoff. Because water has a decently high enthalpy of vaporization for such a common and well-behaved material, you can use evaporative cooling to good effect if you want to reduce electricity costs and space/volume requirements.

If you want to conserve water, you can instead blow air over heat sinks connected to your closed loop, and accept the mix of energy costs and volume requirements for all those fin stacks.

It's (mostly) true that you can run a low-water data center if you wish; in the worst case you can fill the closed loop with some other working fluid. But when everyone is talking about the eleventy-zillion gigawatts they can't source for their precious chatbots, do we really think people are choosing the electricity-heavy cooling option over the 'get water for basically nothing at agricultural rates, or because well drilling isn't regulated there' option that uses less electricity?
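The evaporative side of that tradeoff is easy to put numbers on. A rough sketch, assuming a 100 MW facility (an assumed load, not a figure from any real site) and the textbook enthalpy of vaporization of water:

```python
# Mass of water evaporated to reject a given heat load.
H_VAP = 2.26e6          # J/kg, enthalpy of vaporization of water at ~100 C
                        # (somewhat higher near ambient, same ballpark)
FACILITY_HEAT_MW = 100  # assumed total heat load to reject, in MW

heat_watts = FACILITY_HEAT_MW * 1e6
kg_per_s = heat_watts / H_VAP          # water boiled off per second
liters_per_day = kg_per_s * 86_400     # 1 kg of water is ~1 liter

print(round(kg_per_s, 1))              # 44.2 kg/s
print(round(liters_per_day / 1e6, 1))  # 3.8 million liters/day
```

Dry cooling avoids that draw entirely, paying in fan power and fin-stack volume instead, which is exactly the tradeoff described above.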
 
Upvote
67 (68 / -1)
My county, and some towns in the next county over in central NC, have halted data center construction and cancelled existing plans.

Developers are not happy.
Well, it is a source of tension. Raising taxes is political suicide, but people want and demand schools, roads, police, fire, EMS, sewer/water, and power infrastructure, all of which incur lasting and growing maintenance obligations that you need taxes to pay for, taxes people invariably don't want to pay. But they want the services.

And so counties and municipalities are stuck using "growth," any "growth," no matter how toxic, to fill the budget gap and pay the bills citizens don't want to pay for the things they want. So when a data center offers a quick infusion of property taxes and economic activity, it is very hard to resist the siren call. Hence the town in MO that took the deal, and the half of the city council that signed it being removed from office in return.

How much of this rural rejection of data centers is NIMBYism and how much is actually about the data centers themselves, IDK. I suspect it's a lot of the former.
 
Upvote
25 (32 / -7)
From the article:
Chipmaker Nvidia says it has also developed more energy-efficient chips that require even less cooling.

This seems like a (deliberate?) confusion of metrics. Everybody is trying to build more efficient chips that produce more computation per unit of energy consumption, but the main thing they are doing is producing bigger chips that produce more computation from more energy.

For example, Nvidia is moving in the direction of co-packaging multiple full-reticle chips, tightly interconnected with silicon interposer bridges. This means a single 4-reticle package can consume ~3 kW. That's effectively four GPUs in a single package, so in one sense it compares favorably to four separate GPUs, which spend more power on chip-to-chip communication than the lower-power, higher-bandwidth silicon-to-silicon interconnects do. So it's more FLOPs per Watt, but it's also more Watts per package and more Watts per rack, increasing both the compute and power density of an AI datacenter. You could perhaps think of this as compressing an existing datacenter, and its power needs, into a smaller floor footprint.

That's the paradox of AI compute. The more FLOPs you can produce per Watt, the more Watts people want to deploy.

It's definitely deliberate. They used exactly the same argument back when the concern was crypto burning unlimited energy for no obvious benefit. The various proof-of-work systems literally had mechanisms built into their designs to demand more work over time, precisely so that increased efficiency couldn't enable cheaper majority (51%) attacks or otherwise perturb things; and they were all "but it's getting more efficient!", as though that matters when there's an unbounded amount of compute that people wish to deploy.
 
Upvote
42 (42 / 0)
I doubt it, given how many other blatant counterfactuals cling on decade after decade; but is there any chance that the current administration's enthusiasm for getting its broligarchs' toys built will help budge the "big-city liberal elitists spit on your real American small-town values" talking point?

Maybe I don't understand how honor culture works, but "I want to turn your community into a sacrifice zone to power my posthuman workforce" seems like slightly more intense elite contempt than "I used the phrase 'flyover country' without shame."
 
Upvote
34 (35 / -1)
The county in which I currently reside rejected a proposed data center development a few months ago. But the developers will probably be back in a year or so with a different plan, or one that targets primarily land within the county seat's city limits, which the county government has really no say over.

But this is a county government that, about a decade ago, rejected a proposed wind power development, then a few years later approved a hideous solar farm just north of the county seat. So who knows what they will do next?
Where can I find pictures of the "hideous" solar farm? The picture in the link looks just like my sheep pasture, but with a bunch of solar panels in it and no sheep (yet).
 
Upvote
85 (85 / 0)
Well, it is a source of tension. Raising taxes is political suicide, but people want and demand schools, roads, police, fire, EMS, sewer/water, and power infrastructure, all of which incur lasting and growing maintenance obligations that you need taxes to pay for, taxes people invariably don't want to pay. But they want the services.

And so counties and municipalities are stuck using "growth," any "growth," no matter how toxic, to fill the budget gap and pay the bills citizens don't want to pay for the things they want. So when a data center offers a quick infusion of property taxes and economic activity, it is very hard to resist the siren call. Hence the town in MO that took the deal, and the half of the city council that signed it being removed from office in return.

How much of this rural rejection of data centers is NIMBYism and how much is actually about the data centers themselves, IDK. I suspect it's a lot of the former.

i suspect there's also a bit of "Fuck the tech bros" sentiment in my county.
 
Upvote
28 (28 / 0)
From the article:
Chipmaker Nvidia says it has also developed more energy-efficient chips that require even less cooling.

This seems like a (deliberate?) confusion of metrics. Everybody is trying to build more efficient chips that produce more computation per unit of energy consumption, but the main thing they are doing is producing bigger chips that produce more computation from more energy.

For example, Nvidia is moving in the direction of co-packaging multiple full-reticle chips, tightly interconnected with silicon interposer bridges. This means a single 4-reticle package can consume ~3 kW. That's effectively four GPUs in a single package, so in one sense it compares favorably to four separate GPUs, which spend more power on chip-to-chip communication than the lower-power, higher-bandwidth silicon-to-silicon interconnects do. So it's more FLOPs per Watt, but it's also more Watts per package and more Watts per rack, increasing both the compute and power density of an AI datacenter. You could perhaps think of this as compressing an existing datacenter, and its power needs, into a smaller floor footprint.

That's the paradox of AI compute. The more FLOPs you can produce per Watt, the more Watts people want to deploy.
Classic case of Jevons paradox.
 
Upvote
9 (9 / 0)

DarthSlack

Ars Legatus Legionis
23,285
Subscriptor++
Rural America mainly cares about living decently, making reasonable money, and going about it without being too crunched in. A datacenter can potentially offer that, same as a factory, a mine, a warehouse, or a hospital. But if you don't hire and train locally, you won't get local support.

Hire and train for what? Data centers famously don't need many people once they're built: security to keep the locals out, and a few people to replace the occasional failed part. Anything and everything that can be done remotely will be. The only benefit these things bring the local community is the property taxes, and those can be waived as an "incentive" by local politicians.

Republicans like to accuse Democrats of abandoning rural America, but it turns out as soon as the billionaires came knocking, Republicans turned their back on rural America. Maybe rural America needs to stop being beholden to a single party.
 
Upvote
105 (106 / -1)
In Tazewell County, Illinois, Michael Deppert depends on a natural pool of water beneath the sandy soils of his farm to irrigate the pumpkins, corn and soybeans growing in his fields.

So when a data center was proposed about eight miles away, he feared it would tap the same aquifer, potentially eroding crop yields and profits.
Yeah, but think about it, deep down, what is really more important: food or a datacenter?

You picked correctly! The data center. Food is good, can eat it sometimes, not sure what for.

But a datacenter will waste (sorry: use) the water for profit.

And again, really, at the end of the day, what's more important? Food? Water? No, what's really important is datacenter number 314156741342069, which maybe will allow "AI" to turn a profit someday... any day now...
 
Upvote
35 (37 / -2)

Castellum Excors

Ars Scholae Palatinae
742
Subscriptor++
Upvote
12 (15 / -3)

DarthSlack

Ars Legatus Legionis
23,285
Subscriptor++
If only there were as much demand to build housing as there is to build data centers.

Come on, data centers benefit billionaires, whereas housing would benefit only the poors. It's no secret why data centers are springing up like weeds while housing just plods along.
 
Upvote
36 (36 / 0)

Doug DigDag

Smack-Fu Master, in training
85
All this waste and pollution so fifty million Braysons can cheat on their homework until their completely smooth brains dribble out of their ear canals and a hundred million Carls can pump out fifty billion slide decks that a million Trevor Chadlington, Jrs. will never read, in the hopes that someday eight billion minus two hundred people can be cut off from any conceivable livelihood.
 
Upvote
43 (44 / -1)
Former DC Tech here… Kenosha DC Layoffs

One thing people shouldn't underestimate is how automated and how efficient (relatively, for a DC) these things can get. There's a lot of tech focused specifically on water usage, even PITA spill-mitigation procedures. They seem to have sacrificed electrical usage for low water usage: lots of active cooling, less passive cooling, and higher electrical costs for everyone.

To me the biggest things to watch out for are the DIRECT environmental costs. Unused (grown-in) farmland is not "wetlands." "Fixing the drainage" in an area with retainment ponds is not "restoring wetlands." I wonder what will happen to my old DC in 15-20 years when AI is blasé. I can tell you this: it won't be farmland.

After that it's people. For our location, our contractor encouraged us to move to the area (I was already in Milwaukee), and Kenosha is not very tech-savvy. Most of the techs lived in either Milwaukee or Chicago, with a bunch commuting back home to Indiana on weekends. VERY, VERY few people lived in Kenosha.

So not only do you have a hyper-specialized building on "virgin" land, but the only sources of tax revenue for the area are property/utility taxes and the Kwik Trip. None of the techs are going to spend any money locally. Well, gas was really cheap, so maybe the biggest winner is Kwik Trip?
 
Upvote
38 (38 / 0)

quamquam quid loquor

Ars Tribunus Militum
2,894
Subscriptor++
Aside from the obvious matter of AI executives with a vested interest being prone to just lying, I'm highly skeptical of the 'nah, we fixed water use, bro' argument on basic theoretical grounds.

There's obviously some fiddly engineering to be done around the edges of 'closed loop': driving down the number of leaks, and how often you have to swap out coolant that has become too full of stray ions (probably galvanically corroding something) or of the biocides or the biofilms, whichever is winning at the moment. But that's not where the big use is.

If you have heat and wish to be rid of it, you face a fairly fundamental tradeoff. Because water has a decently high enthalpy of vaporization for such a common and well-behaved material, you can use evaporative cooling to good effect if you want to reduce electricity costs and space/volume requirements.

If you want to conserve water, you can instead blow air over heat sinks connected to your closed loop, and accept the mix of energy costs and volume requirements for all those fin stacks.

It's (mostly) true that you can run a low-water data center if you wish; in the worst case you can fill the closed loop with some other working fluid. But when everyone is talking about the eleventy-zillion gigawatts they can't source for their precious chatbots, do we really think people are choosing the electricity-heavy cooling option over the 'get water for basically nothing at agricultural rates, or because well drilling isn't regulated there' option that uses less electricity?
The amount of water datacenters use directly is actually pretty small compared to the amount of water they consume indirectly via the water needed for power generation.

The direct water use is a weak argument, since the power generation is where the majority of the water use is.
 
Upvote
4 (9 / -5)
If only there were as much demand to build housing as there is to build data centers.
There's a simple reason here. These data centers are being hawked because the entire industrial production side of building them sees profit. Counties and municipalities agree to them because they're desperate for tax revenue they cannot get without relying on infinite growth to dig their way out of their voter-inflicted hole (which of course doesn't work).

So why not housing? Because housing is a for-profit enterprise.

As supply catches up to demand, prices go down, and so does profit. It is in housing developers' (and real estate brokers') vested interest to throttle supply, the same way Nvidia throttles GPU supply to maintain high ASPs. You see it every time. Last year, as home prices stagnated and dipped, large homebuilders were losing their minds, and their quarterly earnings statements were filled with variations of "we don't see making as much profit in the current market, so we are cancelling or idling XXXX number of projects." There's also the knock-on effect: bringing down prices would put people currently on a mortgage underwater, and they'd scream.

I wish it weren't so. But that is the economic system our parents saddled us with. And changing it is the only way to actually address housing.

It is the same reason "drill baby drill" doesn't work to bring down oil prices. Of course it doesn't: oil production is a for-profit enterprise, and increasing supply brings down price and therefore profit. Which is why Trump v1 unlocked ANWR for oil exploration, and no one drilled a single well up there.
 
Upvote
27 (29 / -2)

norton_I

Ars Praefectus
5,836
Subscriptor++
The amount of water datacenters use directly is actually pretty small compared to the amount of water they consume indirectly via the water needed for power generation.

The direct water use is a weak argument, since the power generation is where the majority of the water use is.

Especially in Illinois, which has among the highest percentages of nuclear power, the most water-intensive power source.

Direct water usage is only really a concern if you are using potable water in a desert. Tapping an aquifer in Illinois for data center cooling is just not a big deal compared to the agricultural uses, or to the electricity demand of the datacenter itself.

And a lot of the water in Illinois is used for irrigating corn, which is possibly the only thing dumber and more destructive than AI datacenters.
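A hedged back-of-envelope comparison of direct versus generation-side water use, with all per-kWh figures being rough assumed values that vary widely by cooling design and plant type:

```python
# Annual water footprint of an assumed 100 MW data center, split into
# direct (on-site cooling) and indirect (power generation) components.
it_load_mw = 100
kwh_per_year = it_load_mw * 1_000 * 8_760   # MW -> kW, times hours/year

direct_l_per_kwh = 0.5    # assumed on-site evaporative use; closed-loop
                          # designs can push this near zero
indirect_l_per_kwh = 2.0  # assumed consumption by thermoelectric generation
                          # (e.g. nuclear with cooling towers)

direct = kwh_per_year * direct_l_per_kwh
indirect = kwh_per_year * indirect_l_per_kwh

print(direct / 1e9, indirect / 1e9)  # 0.438 1.752 (billions of liters/year)
```

Under these assumptions the generation side dominates roughly 4:1; the exact ratio moves a lot with the cooling design on each end, but the ordering is the point being made above.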
 
Upvote
19 (19 / 0)
Especially in Illinois, which has among the highest percentages of nuclear power, the most water-intensive power source.

Direct water usage is only really a concern if you are using potable water in a desert. Tapping an aquifer in Illinois for data center cooling is just not a big deal compared to the agricultural uses, or to the electricity demand of the datacenter itself.

And a lot of the water in Illinois is used for irrigating corn, which is possibly the only thing dumber and more destructive than AI datacenters.
I wonder if you realize how much of a point in favour of solar panels you just made...

Replace the Illinois corn farms specifically meant for ethanol production with solar farms and battery storage, and you'd likely generate enough power to run all of Illinois.
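As a sanity check on that claim, here's a rough sketch with loudly assumed round numbers (acreage, solar yield, and state demand are all approximate figures, not measurements):

```python
# All inputs are assumed round numbers for a feasibility check only.
ethanol_corn_acres = 4e6     # assumed IL corn acreage grown for ethanol
mwh_per_acre_year = 290      # assumed utility-solar yield at an
                             # Illinois-like capacity factor
illinois_demand_twh = 136    # assumed annual IL electricity consumption

solar_twh = ethanol_corn_acres * mwh_per_acre_year / 1e6

print(round(solar_twh))                           # 1160 TWh
print(round(solar_twh / illinois_demand_twh, 1))  # ~8.5x state demand
```

Even if these assumptions are off by several-fold, acreage isn't the binding constraint; storage, transmission, and land ownership are.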
 
Upvote
44 (45 / -1)
I wonder if you realize how much of a point in favour of solar panels you just made...

Replace the Illinois corn farms specifically meant for ethanol production with solar farms and battery storage, and you'd likely generate enough power to run all of Illinois.
Yeah, you'd have to eminent-domain the big ag producers to get that to happen. Here in Nebraska, the Sandhills got loaded with wind turbines (great wind out there for them), but that's because the landowners could still farm and ranch around them. IDK if there's any kind of ag activity where you can do a similar thing under a solar field.
 
Upvote
5 (7 / -2)