Bender 3 days ago

As someone who used to help manage data-centers, I think this is fine. I had to load test the generators quarterly. It's good to ensure the transfer switches and generators are working, and nothing beats a real-world load test. The data-centers have contracts with diesel fuel providers to keep the main tanks full. They can run on diesel any time the infrastructure load is high and write off the cost on their taxes (check with your tax lawyer). There may even be other tax advantages, since the need for generators would be compelled by the state; perhaps a tax lawyer could find ways to get a bigger write-down on a generator tech refresh, or to write off better noise-abatement walls since usage will increase. If a company was running with scissors and did not buy enough generator capacity to run everything, then it will have to get its act together now rather than later.

Shank 3 days ago | parent | next [-]

This is true, but how long will firm load shed events last, and how many of them will happen? In California, when CAISO has events that lead to firm load shedding, they're predictable rolling blackouts: everyone knows how long they'll last, and they're assigned on a customer-specific basis. You know your power will be cut, you know roughly how long it will last, and you know roughly where you are in the schedule.

I could see operators of datacenters in Texas wondering about this. Also, it's underrated how much critical infrastructure depends on datacenters running. Like, are you going to pull down someone's EHR system that serves a local hospital, while keeping the hospital itself on a critical circuit?

Bender 3 days ago | parent | next [-]

> You know your power will be cut, you know roughly how long it will last, and you know roughly where you are in the schedule.

Knowing when the power will be cut will not help, unless I am misunderstanding you. If the data-center loses power for even a minute, the generators will all fire up and then every ATS will count down and transfer within 30 seconds. Battery backup only lasts just long enough to do a quick return to service on a failed generator, and even that is sketchy at best. A properly engineered data-center can run on reduced generator capacity.
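
To sketch the timing budget that implies (the 30-second ATS count-down is from the sequence above; the generator start time and the retry allowance are illustrative assumptions, not figures from my sites):

```python
# Rough bridge-time budget the UPS has to cover during a transfer.
# Only the 30 s ATS count-down comes from the description above; the other
# numbers are assumptions for illustration.

GEN_START_S = 10          # assumed time for a standby diesel to reach speed and voltage
ATS_DELAY_S = 30          # ATS count-down before transferring to generator power
RETRY_ALLOWANCE_S = 60    # assumed slack for one failed start and a quick return to service

minimum_bridge_s = GEN_START_S + ATS_DELAY_S + RETRY_ALLOWANCE_S
print(f"UPS must carry the load for at least ~{minimum_bridge_s} s "
      f"({minimum_bridge_s / 60:.1f} min) before generator power is available.")
```

A few minutes of battery comfortably covers that budget, which is why the batteries are sized for bridging rather than for riding out an outage.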

Some data-centers are indeed on circuits deemed critical, but I could see regulations changing this so that they are "business critical" vs. "life support critical," and changes could be made at substations so that data-centers could participate in shedding. I think you are right that they will be thinking about this and, on top of that, probably filing preemptive lawsuits to protect their business. Such changes can violate the SLA contracts businesses have with power companies, and Texas is very pro-business, so I cannot compare it to California.

imglorp 3 days ago | parent | next [-]

Texas has shown no interest in life support critical. They prioritized operator profits over uptime. Hundreds died as a result.

https://en.wikipedia.org/wiki/2021_Texas_power_crisis

thegreatpeter 3 days ago | parent | next [-]

I lived in San Antonio during the winter storm in 2021; my power never went out, and the hospital in my area didn't lose power either.

grepfru_it 3 days ago | parent [-]

H-town here. Our house at the time never lost power. We also shared the block with emergency communications, so we figured that was why our neighborhood didn't lose power. Hospitals (and their neighborhoods) did not lose power either. Where I live now lost power, as did a lot of suburbs.

doodlebugging 3 days ago | parent | prev [-]

>Texas has shown no interest in life support critical.

I'm not sure that this is correct. I was initially worried about how Mom would fare since she lives alone and is over 80. During the entire one-week period of power problems in Feb. 2021, my Mom never lost power, not even a quick brownout. Her home is within a half mile of a local hospital, which also never lost power. The area around the hospital did not lose power, so businesses and homes close by had no issues with heating, cooking, bathing, etc. during the cold blast. That allowed me to stay here at my place a couple hours away and manage my own situation, which was fairly easy compared to many others in the state.

Your other statements are quite true, and to date no one who played a part in the mismanagement of utility power in Texas has been held accountable, nor will they ever be in a libertarian state where regulations exist only to guarantee a profitable situation for a commercial entity. In fact, most electricity customers in Texas ended up paying for the huge cost increases that occurred as those in charge tweaked the system in real time to maximize their own profits.

Texas needs regulation more than most other states. Grifters, fraudsters, and thieves have filled too many critical positions for too long.

opo 2 days ago | parent | next [-]

>...in a libertarian state

I don't think any organization that considers itself libertarian has ever called Texas a "libertarian state". For example:

>...Texas’ institutions and policies continue to bear something of an old statist legacy. In the Cato Institute’s Freedom in the 50 States study, Texas scores a mere 17th, behind even the southern states of Florida (#2), Tennessee (#6), Missouri (#8), Georgia (#9), and Virginia (#12).

https://www.cato.org/commentary/texas-really-future-freedom

Are there any Texas national or state politicians who are members of the Libertarian Party or even refer to themselves as Libertarian?

grepfru_it 3 days ago | parent | prev | next [-]

Heating your house, cooking, bathing, etc. during this time put extraordinary strain on the grid. A big reason why others did not have power is because those that did did not reduce their consumption by much. So many of my neighbors/friends/colleagues made comments like "we didn't lose power, so we kept the heat cranking at 75". So it would make sense that load shedding primarily affected neighborhoods, but my recollection of the events from people who lived near emergency centers was "use it up before it goes away."

bsder 2 days ago | parent | next [-]

> A big reason why others did not have power is because those that did did not reduce their consumption by much.

First, that was the big manufacturers. ERCOT couldn't force big companies off the grid, and they didn't go off grid until the press noticed and started complaining.

Second, the Texas grid has insufficient granularity to actually shed enough non-critical load to do rolling blackouts. There are too many "critical" things connected to the same circuits as non-critical ones, and it would cost money to split those loads (something Texas just ain't gonna do).

Third, base production got hit because fundamental natural gas infrastructure wasn't winterized; it froze and exacerbated the whole situation. It would cost money to fix (aka something Texas just ain't gonna do).

Finally, when you don't have big industrial consumers defining your power grid (aka massive overprovisioning), you can't "shed load" your way out of trouble.

The fundamental problem is that, like so many things in the US economy, personal consumption is so low that it doesn't help when the problem is systemic. We've optimized houses with insulation, LED lighting, high-efficiency appliances, etc. Consequently, the difference in consumption between "minimal to not die" and "fuck it, who cares" isn't large enough to matter when a crisis hits.

doodlebugging 2 days ago | parent | prev [-]

You must live near and work with some selfish people.

I have more family up there where Mom lives, and they lost power for all or most of the week, so they shuffled operations to the homes that had the most reliable power and pooled resources so no one had to be hungry or cold.

const_cast 2 days ago | parent | prev | next [-]

My house had zero power for 3 days straight. No cooktop either, because that's electric, and no water heating. It got to be ~30 degrees inside.

buerkle 2 days ago | parent | prev [-]

I'm in the Austin area and lost power for 2 days. Some friends of mine lost power for almost a week.

doodlebugging 2 days ago | parent [-]

We're near DFW. Mom is north of us by a bit. We lost power too, for days. Towards the end we had rolling outages that were predictable, so we prepped anything that needed heat or power, and as soon as the lights came on we made fresh coffee, tea, and water for oatmeal or whatever, and recharged the water supply since we are on a private well. Our power bricks handled most of the phone/laptop power delivery, so we basically topped off the charge on the bricks whenever we had power. My greenhouse is solar/battery powered, though I did use 1 lb propane cylinders for the coldest periods since the heater in the greenhouse was way too small to manage temps that went below 10F. I lost some things, but I learned some things too. We are much more resilient today.

toast0 3 days ago | parent | prev [-]

> Knowing when the power will be cut will not help, unless I am misunderstanding you. If the data-center loses power for even a minute, the generators will all fire up and then every ATS will count down and transfer within 30 seconds. Battery backup only lasts just long enough to do a quick return to service on a failed generator, and even that is sketchy at best. A properly engineered data-center can run on reduced generator capacity.

If you know when the power will be cut, you can start the generators before the cut, and depending on your equipment, you may be able to synchronize the generator(s) with the grid and switch over without hitting the batteries. I assume big datacenters are on three-phase power; can you switch over each phase as it crosses zero, or do you need to do them all at once?

Bender 3 days ago | parent | next [-]

At least in the data-centers I helped manage, the inverters were in-line, running at 100% duty cycle, meaning frequency sync is not required since there is no bypass. The servers never see the raw commercial power. Data-centers in the US are indeed 3-phase. FWIW, the big Cats did have controllers that would maintain sync even when commercial power was gone, but we did not need that. There wasn't even a way to physically merge commercial and generator power; ATS inputs and outputs were a binary choice.

I know what you mean, though. The generators I worked with in the military had a manual frequency sync that required slowly turning a dial and watching light bulbs that got brighter with frequency offset. Very old equipment for Mystic Star, post-WWII era gear used from the '50s to the '90s.

dylan604 3 days ago | parent | next [-]

In the facilities I have been in (not managed), they were all in-line as you describe as well. Mains power is dirty. Having a data center without line conditioning on mains would be insane.

Bender 3 days ago | parent [-]

> Mains power is dirty. Having a data center without line conditioning on mains would be insane.

Agreed. Even my home computer and networking equipment is 100% in-line with inverters and never sees commercial power. PG&E in California got me into this habit with all the Public Safety Power Shutoffs, wildfires, surges from really old transformers, and unplanned outages. Now each of my tiny indoor rings of power has 200 to 800 amp-hours of capacity and an over-sized inverter. I put the whole-house inverter plans on hold for now.

dylan604 2 days ago | parent [-]

Way back when, I worked for a VHS dubbing facility where we had a voltage meter with an alarm set to warn when voltage dropped below a certain level, though I don't remember the exact value. At that point, the VCRs would glitch and the recordings would be bad, but the dip would be momentary and not enough to force the machines to stop like a full outage. When the alarm sounded, we would stop all of the decks, re-rack the room, and restart all of them. Without the alarm, it was impossible to catch these without 100% QC of a tape. That is when I grokked how much worse a dip can be than a spike. Some equipment will pull harder when the voltage drops, which kills more power supplies than spikes do. Surge protectors are great for the spikes, but line conditioners or battery backups are the only protection from the dips. Management decided that the expense of full-time battery conditioning was not worth it, so we were constantly running with some set of equipment down because of a dead power supply.

jimmygrapes 3 days ago | parent | prev [-]

Manually syncing several of the MEP012 generators was always far more stressful to me than any physical dangers!

Bender 2 days ago | parent [-]

I bet. I never messed with the trailer or skid mounted generators. It sounds like you were also USAF. At least modern day noise cancelling headphones are much better. Guessing you probably have tinnitus from working on them. At least I think that is partially where mine came from.

bluGill 3 days ago | parent | prev [-]

Again, that doesn't matter, because everyone knows the grid isn't perfectly reliable. They just pull the big power switch to the whole building and watch all the backup systems work. If anything fails, it was broken already and they fix or replace it. Because this happens often, they have confidence that most systems will work. Even where one doesn't, computers fail unexpectedly anyway, so they have redundant computers (and, if it is really important, a redundant data center in a different state/country).

Synchronizing generators is a thing, but it isn't useful for this situation, since they need to be able to handle sudden, without-warning power loss, where generators cannot be synchronized in advance anyway.

stevetron 3 days ago | parent [-]

How often does that inverter burn out a transistor? Is there a backup inverter? Do you keep replacement transistors on-site?

Bender 3 days ago | parent | next [-]

Commercial inverters are massive and highly redundant. They do fail, but it is very rare, and there are contractors who can be on site to fix things very quickly. A properly engineered system can run in a degraded state for a prolonged period of time.

hdgvhicv 2 days ago | parent | prev [-]

My data centres have two separate supplies through two separate UPSes with two separate generators; kit is striped across each one.

Of course that doesn’t help for fire/flood etc., which is why we have critical workloads in two DCs.

bluGill 2 days ago | parent [-]

I once worked in a building that had two separate connections to the grid, which went to different substations. The servers had more than one power supply as well, so either grid connection could go down. While somewhat of an outlier, it is an option if you care about this.

hdgvhicv 20 hours ago | parent [-]

That goes without saying; you still need at least one feed on a UPS and generator, though. Dual-power-supply servers and dual supplies are a given for any important hosting environment.

tw04 3 days ago | parent | prev | next [-]

If power is out everywhere for an extended period of time, they aren’t doing anything but life-saving surgery. Pulling up an EMR will be near the bottom of the list of concerns.

vineyardmike 2 days ago | parent [-]

Yea, that’s not how medicine works. There are patients who are in hospital beds for a variety of reasons, and they don’t just go home during a storm. Those people still need some level of care, even if they’re not getting X-rays and basic preventive care.

EMRs contain a record of when patients last took some critical but dangerous drug, what their allergies and reactions are, and many other important bits of information. When one of the patients starts to exhibit some new symptom or reaction (very stressful situation!), doctors and nurses look at the EMR to understand the best course of treatment or intervention.

When the EMR goes down, doctors and nurses revert to pen and paper. It’s very slow, and requires a lot of human handoff - which, critically, they’re less practiced in.

tw04 2 days ago | parent [-]

I literally help design IT resiliency for hospitals. This is absolutely how they work and part of their disaster planning. When there is an extended power outage they stop anything but vital surgery and work off pen and paper.

Which you got to after spending 3 paragraphs talking about what an EMR is for.

vel0city 3 days ago | parent | prev | next [-]

If they cannot handle the grid power being pulled due to load shedding, they have no business handling critical applications.

bluGill 3 days ago | parent [-]

In particular, there is no way you can predict when a backhoe will take out your power line. (Even if they call to get lines located, sometimes the mark is in the wrong spot, though in most cases they didn't call in the first place.) There are lots of other ways the power can go down suddenly, and nothing you can do about it except have some other redundancy.

nradov 3 days ago | parent | prev [-]

All of the major cloud EHRs run in multiple availability zones.

mschuster91 3 days ago | parent [-]

The data center might be... but are all fiber routers, amplifiers, PoPs etc. along the line from the datacenter to the hospitals and other healthcare providers backed up as well?

Particularly the "last mile" is often equipped only with batteries to bridge small brownouts or outages, not with full-fledged diesel generators.

And while hospitals, at least those that operate on patients, are on battery banks and huge-ass diesel generators... small private practices usually are not. If you're lucky, the main server has a half-broken UPS whose "BATTERY FAULT" red light no one has looked at for a year. But the desktop computers, VPN nodes, card readers, or medical equipment? If it's not something that a power outage could ruin (such as an MRI), it's probably not even battery backed.

There's a saying that "the emperor has no clothes". When it comes to the resilience of our healthcare infrastructure, the emperor isn't just naked; the emperor's skin is rotting away.

marcosdumay 2 days ago | parent | next [-]

Just pointing out that none of those non-data-center buildings are data centers.

I really doubt the classification of "data centers and other large, non-critical power consumers" extends to telecom infrastructure.

mschuster91 2 days ago | parent [-]

> I really doubt the classification of "data centers and other large, non-critical power consumers" extends to telecom infrastructure.

That's the point. Okay, cool, the datacenter is highly available, multiple power and data feeds, 24/7/365. But that highly available datacenter is useless when it cannot be reached because its data feed elements or the clients don't have power.

nradov 3 days ago | parent | prev [-]

Starlink

lokar 3 days ago | parent | prev | next [-]

For any big facility there will be pretty strict EPA limits on how long you can run the generators each year.

dylan604 3 days ago | parent | next [-]

The EPA? Are they still a thing? I doubt anyone is concerned about the EPA under current management.

bilbo0s 3 days ago | parent | next [-]

Which is great.

Until there's new management.

You can't run a business by seesaw.

Best to just count on that rule being enforced and put the necessary battery backups and wind or solar in place to backstop the diesel. Then make any users who need those data centers eat the extra cost. There's no problem with us-east costing less than us-west, and us-texas costing most of all. That's how markets work.

dylan604 2 days ago | parent [-]

But the seesaw is what the future is going to look like. If some bit of mass voting breaks out for the next election and moves to the other party, there will be a swing back the other direction. Then the following election the masses will get upset about something and swing it back. The country is too polarized to expect anything other than seesaw policies. Unless we go full revolution and just prevent the other party from ever taking charge.

xxpor 3 days ago | parent | prev [-]

The state regulators can also get you.

dylan604 2 days ago | parent | next [-]

The state regulators of Texas? Unless you're trying to manage your own health, the state is not concerned about you. If you're a gas/power company, they only want to know what regulations you want removed/enacted. They definitely aren't "getting you" for being part of Big Energy.

xxpor 2 days ago | parent [-]

Texas, I could agree with. I'm just saying that Virginia has fined DC operators specifically for running their generators too much.

2 days ago | parent | prev [-]
[deleted]
Bender 3 days ago | parent | prev | next [-]

Indeed. I have faith that Texas will find a way around such rules, especially if regulation is what forces them to run the generators. A Texas company I worked for was highly proficient in maximum shrugs.

more_corn 3 days ago | parent | prev [-]

The EPA doesn’t really have the resources to enforce that. And it certainly won’t have that capability under the Trump administration.

abeppu 3 days ago | parent | prev | next [-]

I wasn't involved in the specific details, but I remember being told that during the power outage from Hurricane Sandy, even datacenters that had sufficient generators had trouble getting the diesel to keep them running, because everyone wanted diesel at the same time and both the supply and distribution were bottlenecked.

How long can most DCs run on just the fuel on hand? Have standards around that changed over time?

jabart 3 days ago | parent | next [-]

At a call center that had a whole datacenter in the basement, they kept two weeks of fuel on hand at all times. Being on a state border, they also had a second grid connection in case one failed.

The whole area lost power for weeks, but the gym was open 24/7 and became very busy during that time.

Bender 2 days ago | parent | prev | next [-]

> How long can most DCs run on just the fuel on hand?

There really is not a universal answer for this. Every generator will have what is called a "day tank" that, as you might guess, lasts for about one day under a nominal load.

The day tanks are fed in pods from large diesel fuel tanks; every {n} generators get a main tank. Those tanks vary in size depending on how much the company wishes to spend and how resilient they need to make their data-center, excluding fuel trucks. Cities have regulations about how much fuel can be above or below ground at each location. My main tanks were 10K gallons. Each generator used over a gallon per minute under load.

And you are right: during a regional or global disaster, fuel trucks will be limited. Those who bribe the most get the last fuel, but that too will run out. Ramping up production and distribution takes weeks, and that assumes roads are still viable and the internet outside of the data-center is still functional.
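
To put rough numbers on that, here is a minimal back-of-the-envelope sketch using the figures above (a 10K gallon main tank and roughly a gallon per minute per generator under load); the pod size of four generators is an assumed example, not a universal number:

```python
# Back-of-the-envelope main-tank runtime from the figures above: a 10,000 gallon
# main tank and roughly 1 gallon per minute per generator under load.
# The pod size of 4 generators sharing the tank is an assumption for illustration.

def runtime_hours(tank_gallons: float, generators: int, gal_per_min_each: float) -> float:
    """Hours until the shared main tank runs dry, ignoring day-tank contents and refueling."""
    burn_rate_gal_per_min = generators * gal_per_min_each
    return tank_gallons / burn_rate_gal_per_min / 60

print(round(runtime_hours(10_000, 4, 1.0), 1))  # ~41.7 hours, under two days without a fuel truck
```

Under those assumptions, even a well-provisioned site measures its autonomy in days rather than weeks, which is why the fuel contracts matter so much.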

stevetron 3 days ago | parent | prev | next [-]

It seems to me that Hurricane Sandy caused an issue with fueling backup generators at a New York City datacenter, where a bucket brigade of people carried fuel in pails up several stories of stairs.

abeppu 2 days ago | parent [-]

I remember hearing this story at the time but have forgotten all the details. I think it involved a pretty well known company? I also remember hearing that some DC in NJ got special priority in getting diesel in the following days because some federal government services were hosted there and so it was treated as a national security issue to keep them supplied.

kalleboo 2 days ago | parent [-]

It reminds me of the blog during Hurricane Katrina, with the guy carrying barrels of fuel up the stairs to the 10th floor https://en.wikipedia.org/wiki/Interdictor_(blog)

stogot 3 days ago | parent | prev [-]

The ranges I’ve heard from operators run from 24 hours to upwards of 72 (the latter being rare).

duxup 2 days ago | parent | prev | next [-]

In a different career I worked with big banks and did disaster recovery tests (we didn't run the tests; we just had equipment that the tests sort of centered around). We'd basically cut off a data center for a weekend and they would run from another data center that was supposed to have all the data too. We'd even move their check processing to a backup site, and they'd truck full truckloads of paper checks there.

They were legit tests at the offline site too: they'd power down equipment and power it back up, and we'd fix what didn't come back up. Even the data centers would be fully powered off to test.

At least at that time, those banks did not skimp on those big tests; it was a big effort and pretty dang well run and complete.

stronglikedan 2 days ago | parent | prev | next [-]

> did not buy enough generator capacity to run everything

Maybe they can cap it for most cases - we'll turn it off for n days max - so that companies have a target to prepare for (or maybe that was already mentioned). Of course, no one can prepare for everything, but if it's longer than that, then most other things are probably also affected anyway and no one is worried about their favorite data center.

f1shy 2 days ago | parent | prev | next [-]

I get your point, but isn’t saying “it’s OK to cut power, they have backup” a bit of a normalization of deviance? For example, in a hospital this kind of thinking would be totally unacceptable, IMHO.

doubled112 2 days ago | parent | next [-]

Are data centers as life or death as a hospital?

Also, the data center I use has run on generator power for days after storms with NO quality-of-service loss. Nothing like some real-world testing to remind me they have this figured out.

Is a hospital in the same situation? Or is only part of the hospital on those generators?

f1shy 2 days ago | parent [-]

Of course not all; it depends. But I worked in a telco, and pretty much yes: without the datacenter, no cell phones (or trunking, in the case I know), so no ambulance, no police, no firefighters (or a very delayed version of them).

We had batteries and two generators, and once we were minutes from blackout: the primary generator failed, and the secondary was not sized to cope with the air-conditioning load of a 45-Celsius day.

quickthrowman 2 days ago | parent | prev [-]

Hospitals are obligated by law and building code to have backup generators with multiple (at least 3) separate backup power feeds for critical branch, life safety branch, and equipment branch. These are defined as ‘essential loads’ by the National Electrical Code. They can all be fed from the same generator but must use separate overcurrent protection and automatic transfer switches.

Critical branch is defined as loads that are used for direct patient care, plus ‘additional task lighting, receptacles, and circuits needed for effective hospital operation.’

Life safety branch is the fire alarm system and emergency lighting, plus elevator lights and controls, medgas alarms, PA/notification systems used during building evacuation, and some generator accessories.

Equipment branch has some required items, including OR cooling, patient room heating, and data rooms. Some hospitals will add MRIs, non-patient care HVAC, and chillers (for air conditioning) to the generator backup system as well.

There’s typically a fourth system for everything else (‘normal’ power) that is not backed by a generator. Non-emergency lighting, convenience receptacles, and other non-essential loads are on this system.

chaz6 3 days ago | parent | prev | next [-]

Are they clean enough to stay within the limits of regulations around pollution when run for long periods of time?

Bender 3 days ago | parent [-]

The generators I worked with were just massive diesel engines, each the size of a tractor trailer. This was before DEF and catalyst requirements. When I would switch on multiple load banks at once, I could get a small puff of soot as the engines revved up. Provided the load was not varying heavily, they would at least run clean enough that one could not see or smell any exhaust. This was in the 2000 time frame. Mind you, the load tests are not real-world realistic at all, and this can be further mitigated using step-start programmable PDUs in the data-center. Companies that own data-centers usually have a lot of political pull with the cities due to all the assorted taxes they pay and the direct and indirect revenue and employees they bring in.

Cat has since added options for hydrogen [1] but I have no idea how many people have bought them.

[1] - https://www.cat.com/en_US/by-industry/electric-power/electri...

lowwave 3 days ago | parent | prev [-]

Nice to know, but are data-centers Carrington Event proof? I've always wondered that, but I never worked in a data-center.