Shank 3 days ago

This is true, but how long will firm load shed events last, and how many of them will happen? In California, when CAISO has events that lead to firm load shedding, they're predictable, rolling blackouts and everyone knows how long they'll last, and they're assigned on a customer-specific basis. You know your power will be cut, and you know it will be a certain amount of time, and you know roughly where you are in the schedule.

I could see operators of datacenters in Texas wondering about this. Also, it's underrated how much critical infrastructure is dependent on datacenters running. Like, are you going to pull someone's EHR system down that serves a local hospital, while keeping the local hospital on a critical circuit?

Bender 3 days ago | parent | next [-]

> You know your power will be cut, and you know it will be a certain amount of time, and you know roughly where you are in the schedule.

Knowing when the power will be cut will not help, unless I am misunderstanding you. If the data-center loses power for even a minute, the generators will all fire up and every ATS will count down and transfer in 30 seconds. Battery backup lasts just long enough for a quick return to service on a failed generator, and even that is sketchy at best. A properly engineered data-center can run on reduced generator capacity.
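The sequence described can be sketched roughly as follows. This is an illustrative simulation, not vendor firmware; the timing constants are typical ballpark figures, not from any specific site.

```python
# Rough sketch of the sequence described above: generators fire up on
# utility loss, the UPS carries the load, and the ATS counts down and
# transfers. Timings are illustrative assumptions, not a spec.
TRANSFER_DELAY_S = 30     # ATS count-down before transferring to generator
BATTERY_RUNTIME_S = 300   # UPS bridge: minutes at best, not hours

def source_after_outage(generator_ready_s):
    """Which source carries the load once utility power is lost."""
    # The UPS must bridge until the generator is ready AND the ATS
    # has finished its count-down.
    if generator_ready_s + TRANSFER_DELAY_S <= BATTERY_RUNTIME_S:
        return "generator"
    return "load dropped"   # batteries exhausted before transfer

# A generator ready in 10 s transfers comfortably within battery runtime;
# one that takes 400 s (e.g. a failed start and retry) does not.
```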

Some data-centers are indeed on circuits deemed critical, but I could see regulations changing this so that they are "business critical" vs. "life support critical", and some changes could be made at substations so that data-centers could participate in shedding. I think you are right that they will be thinking about this and, adding to this, probably filing preemptive lawsuits to protect their business. Such changes can violate the SLA contracts businesses have with power companies, and Texas is very pro-business, so I cannot compare it to California.

imglorp 3 days ago | parent | next [-]

Texas has shown no interest in life support critical. They prioritized operator profits over uptime. Hundreds died as a result.

https://en.wikipedia.org/wiki/2021_Texas_power_crisis

thegreatpeter 3 days ago | parent | next [-]

I lived in San Antonio during the winter storm in 2021 and no power went out and the hospital didn't lose power in my area either.

grepfru_it 3 days ago | parent [-]

H-town here. Our house at the time never lost power. We also shared the block with emergency communications, so we figured that was why our neighborhood didn't lose power. Hospitals (and their neighborhoods) did not lose power either. Where I live now lost power, as did a lot of suburbs.

doodlebugging 3 days ago | parent | prev [-]

>Texas has shown no interest in life support critical.

I'm not sure that this is correct. I was initially worried about how Mom would fare since she lives alone and is over 80. During the entire one week period of power problems in Feb. 2021 my Mom never lost power, not even a quick brown-out. Her home is within a half mile of a local hospital which also never lost power. The area around the hospital did not lose power so businesses and homes close by had no issues with heating, cooking, bathing, etc during the cold blast. That fact allowed me to stay here at my place a couple hours away and manage my own situation which was fairly easy compared to many others in the state.

Your other statements are quite true and to date no one who played a part in mismanagement of utility power in Texas has been held accountable nor will they ever be in a libertarian state where regulations exist only to guarantee a profitable situation for a commercial entity. In fact, most electricity customers in Texas ended up paying for the huge cost increases that occurred as those in charge tweaked the system in real time to maximize their own profits.

Texas needs regulations worse than most other states. Grifters, fraudsters, and thieves have filled too many critical positions for too long.

opo 2 days ago | parent | next [-]

>...in a libertarian state

I don't think any organization that considers itself libertarian has ever called Texas a "libertarian state". For example:

>...Texas’ institutions and policies continue to bear something of an old statist legacy. In the Cato Institute’s Freedom in the 50 States study, Texas scores a mere 17th, behind even the southern states of Florida (#2), Tennessee (#6), Missouri (#8), Georgia (#9), and Virginia (#12).

https://www.cato.org/commentary/texas-really-future-freedom

Are there any Texas national or state politicians who are members of the Libertarian Party or even refer to themselves as Libertarian?

grepfru_it 2 days ago | parent | prev | next [-]

Heating your house/cooking/bathing etc. during this time put extraordinary strain on the grid. A big reason why others did not have power is that those who did have it did not reduce their consumption by much. So many of my neighbors/friends/colleagues made comments like "we didn't lose power, so we kept the heat cranking at 75". So it would make sense that load shedding primarily affected neighborhoods, but my recollection of the events from people who lived near emergency centers was "use it up before it goes away".

bsder 2 days ago | parent | next [-]

> A big reason why others did not have power is because those that did did not reduce their consumption by much.

First, that was the big manufacturers. ERCOT couldn't force big companies off the grid, and they didn't go off grid until the press noticed and started complaining.

Second, the Texas grid has insufficient granularity to actually shed enough non-critical load to do rolling blackouts. There are too many "critical" things connected to the same circuits as non-critical ones, and it would cost money to split those loads (something Texas just ain't gonna do).

Third, the base production got hit because fundamental natural gas infrastructure wasn't winterized, froze and exacerbated the whole situation. It would cost money to fix. (aka: something Texas just ain't gonna do)

Finally, when you don't have big industrial consumers defining your power grid (aka massive overprovisioning), you can't "shed load" your way out of trouble.

The fundamental problem is that, like so many things in the US economy, personal consumption is so low that cutting it doesn't help when the problem is systemic. We've optimized houses with insulation, LED lighting, high-efficiency appliances, etc. Consequently, the difference in consumption between "minimal to not die" and "fuck it, who cares" isn't large enough to matter when a crisis hits.

doodlebugging 2 days ago | parent | prev [-]

You must live near and work with some selfish people.

I have more family up there where Mom lives and they lost power for all or most of the week so they all shuffled operations to the homes that had the most reliable power and pooled resources so no one had to be hungry or cold.

const_cast 2 days ago | parent | prev | next [-]

My house had zero power for 3 days straight. No cooktop either, because that's electric, and no water heating. It got to be ~30 degrees inside.

buerkle 2 days ago | parent | prev [-]

I'm in the Austin area and lost power for 2 days. Some friends of mine lost power for almost a week.

doodlebugging 2 days ago | parent [-]

We're near DFW. Mom is north of us by a bit. We lost power too, for days. Towards the end we had rolling outages that were predictable so we prepped anything that needed heat or power and as soon as the lights came on we made fresh coffee and tea and water for oatmeal or whatever and recharged the water supply since we are on a private well. Our power bricks handled most of the phone/laptop power delivery so we basically topped off the charge on the bricks whenever we had power. My greenhouse is solar/battery powered though I did use 1 lb propane cylinders for the coldest periods since the heater in the greenhouse was way too small to manage temps that went below 10F. I lost some things but I learned some things too. We are much more resilient today.

toast0 3 days ago | parent | prev [-]

> Knowing when the power will be cut will not help unless I am misunderstanding you. If the data-center loses power for even a minute the generators will all fire up and then every ATS will count-down and transfer in 30 seconds. Battery backup only lasts just long enough to do a quick return to service on a failed generator and even that is sketchy at best. A properly engineered data-center can run on reduced generator capacity.

If you know when the power will be cut, you can start the generators before the cut, and depending on your equipment, you may be able to synchronize the generator(s) with the grid and switch over without hitting the batteries. I assume big datacenters are on three phase, can you switch over each phase as it crosses zero, or do you need to do them all at once?
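The check a closed-transition ("make-before-break") transfer performs before paralleling two sources can be sketched as below. On a balanced three-phase system the phases are locked 120° apart, so matching any one phase implies matching all three; there is no per-phase switchover. The tolerance values here are illustrative assumptions loosely in the spirit of common interconnection limits, not a vendor spec.

```python
# Hedged sketch of a synchronization check before paralleling a
# generator with the utility. Tolerances are illustrative guesses.
FREQ_TOL_HZ = 0.1
VOLT_TOL_PCT = 5.0
PHASE_TOL_DEG = 10.0

def safe_to_parallel(f_gen, f_util, v_gen, v_util, phase_deg):
    """True when generator and utility are close enough to connect."""
    volt_diff_pct = abs(v_gen - v_util) / v_util * 100.0
    # Phase angle wraps at 360; compare the shortest angular distance.
    phase_err = min(phase_deg % 360.0, 360.0 - phase_deg % 360.0)
    return (abs(f_gen - f_util) <= FREQ_TOL_HZ
            and volt_diff_pct <= VOLT_TOL_PCT
            and phase_err <= PHASE_TOL_DEG)
```

A controller would loop on this check and close the tie breaker only while all three conditions hold, which is what the "controllers that would maintain sync" mentioned below automate.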

Bender 3 days ago | parent | next [-]

At least in the data-centers I helped manage the inverters were in-line running 100% duty-cycle, meaning frequency sync is not required as there is no bypass. The servers never see the raw commercial power. Data-centers in the US are indeed 3-phase. FWIW the big Cats did have controllers that would maintain sync even when commercial power was gone but we did not need it. There wasn't even a way to physically merge commercial and generator power. ATS inputs and outputs were a binary choice.

I know what you mean though; the generators I worked with in the military had a manual frequency sync that required slowly turning a dial and watching light bulbs that got brighter with frequency offset. Very old post-WWII era equipment for Mystic Star, in service from the '50s to the '90s.

dylan604 3 days ago | parent | next [-]

In the facilities I have been in (not managed), they were all in-line as you describe as well. Mains power is dirty. Having a data center without line conditioning on mains would be insane.

Bender 3 days ago | parent [-]

> Mains power is dirty. Having a data center without line conditioning on mains would be insane.

Agreed. Even my home computer and networking equipment is 100% in-line behind inverters and never sees commercial power. PG&E in California got me into this habit with all the Planned Safety Power Shutoffs, wildfires, surges from really old transformers, and unplanned outages. Now each of my tiny indoor rings of power has 200 to 800 amp-hours of capacity and an over-sized inverter. I put the whole-house inverter plans on hold for now.
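For a rough sense of what a bank that size buys you, here is a back-of-envelope runtime estimate. The bank voltage, inverter efficiency, depth of discharge, and load are all assumptions for illustration; the commenter doesn't state theirs.

```python
# Back-of-envelope runtime for a battery-backed ring like the one
# described. All parameters are illustrative assumptions: a 12 V bank,
# 85% inverter efficiency, 80% usable depth of discharge.
def runtime_hours(amp_hours, load_watts, volts=12.0,
                  inverter_eff=0.85, usable_fraction=0.80):
    usable_wh = amp_hours * volts * usable_fraction
    return usable_wh * inverter_eff / load_watts

# A 400 Ah bank feeding a hypothetical 200 W network/computer load:
# 400 * 12 * 0.8 = 3840 Wh usable; * 0.85 / 200 ≈ 16.3 hours
```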

dylan604 2 days ago | parent [-]

Way back when, I worked for a VHS dubbing facility where we had a voltage meter with an alarm set to warn when voltage dropped below a certain level, though I don't remember the exact value. At that point the VCRs would glitch and the recordings would be bad, but the dip would be momentary and not enough to force the machines to stop like a full outage would. When the alarm sounded, we'd stop all of the decks, re-rack the room, and restart all of them. Without the alarm, it was impossible to catch these without 100% QC of a tape.

That is when I grokked how much worse a dip can be than a spike. Some equipment will start to pull harder when the voltage drops, which kills more power supplies than spikes do. Surge protectors are great for the spikes, but line conditioners or battery backups are the only protection from the dips. Management decided that the full-time battery-conditioning expense was not worth it, so we were constantly running with some set of equipment down because of a dead power supply.
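The alarm described amounts to a simple threshold check on sampled RMS voltage. A minimal sketch, with a hypothetical 108 V threshold (about 10% below a nominal 120 V) since the commenter doesn't recall the actual value:

```python
# Sketch of the sag alarm described above. The 108 V threshold is an
# assumption, roughly 10% under nominal 120 V.
NOMINAL_V = 120.0
SAG_THRESHOLD_V = 108.0

def check_sample(rms_volts, sag_log):
    """Record a sag event when RMS voltage dips below threshold."""
    if rms_volts < SAG_THRESHOLD_V:
        sag_log.append(rms_volts)
        return True   # sound the alarm: stop, re-rack, restart the decks
    return False

events = []
for v in (119.8, 121.2, 104.5, 118.9):
    check_sample(v, events)
# the momentary 104.5 V dip is the only one captured
```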

jimmygrapes 3 days ago | parent | prev [-]

Manually syncing several of the MEP012 generators was always far more stressful to me than any physical dangers!

Bender 2 days ago | parent [-]

I bet. I never messed with the trailer or skid mounted generators. It sounds like you were also USAF. At least modern day noise cancelling headphones are much better. Guessing you probably have tinnitus from working on them. At least I think that is partially where mine came from.

bluGill 3 days ago | parent | prev [-]

Again, that doesn't matter, because everyone knows the grid isn't 100% reliable. They just pull the big power switch to the whole building and watch all the backup systems work. If anything fails, it was broken already and they fix/replace it. Because this happens often, they have confidence that most systems will work. Even where they don't, computers fail unexpectedly anyway, so they have redundancy of computers regardless (and, if it is really important, redundancy of a data center in a different state/country).

Synchronizing generators is a thing, but it isn't useful for this situation, since they need to be able to handle sudden, without-warning power loss, where generators cannot be synchronized anyway.

stevetron 3 days ago | parent [-]

How often does that inverter burn-out a transistor? Is there a backup inverter? Do you keep replacement transistors on-site?

Bender 3 days ago | parent | next [-]

Commercial inverters are massive and highly redundant. They do fail but it is very rare and there are contractors that can be on site to fix things very quickly. A properly engineered system can run in a degraded state for a prolonged period of time.
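The "degraded state" claim is just an N+1 capacity argument: the plant is sized so that losing a unit still leaves enough capacity for the load. A minimal sketch with made-up illustrative numbers:

```python
# Hedged sketch of why an N+1 (or N+2) plant keeps running degraded:
# remaining capacity must still cover the load. Unit sizes and load
# are made-up illustrative numbers, not from any real facility.
def can_carry_load(unit_kw, units_total, units_failed, load_kw):
    available_kw = unit_kw * (units_total - units_failed)
    return available_kw >= load_kw

# Four 500 kW modules carrying a 1400 kW load is N+1 in practice:
# one failure leaves 1500 kW (fine); two failures leave 1000 kW (shed).
```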

hdgvhicv 2 days ago | parent | prev [-]

My data centres have two separate supplies through two separate UPSes with two separate generators; kit is striped across each one.

Of course that doesn’t help for fire/flood etc which is why we have critical workloads in two dcs.

bluGill 2 days ago | parent [-]

I once worked in a building that had two separate connections to the grid which went to different substations. Their servers had more than one power supply as well, so either grid connection could go down. While somewhat of an outlier, it is an option if you care about this.

hdgvhicv 20 hours ago | parent [-]

That goes without saying; you still need at least one supply on a UPS and generator though. Dual-power-supply servers and dual supplies are a given for any important hosting environment.

tw04 3 days ago | parent | prev | next [-]

If power is out everywhere for an extended period of time, they aren’t doing anything but life saving surgery. Pulling up an EMR will be near the bottom of the list of concerns.

vineyardmike 2 days ago | parent [-]

Yea, that’s not how medicine works. There are patients in hospital beds for a variety of reasons, and they don’t just go home during a storm. Those people still need some level of care, even if they’re not getting X-rays and basic preventative care.

EMRs contain a record of when patients last took some critical but dangerous drug, what their allergies and reactions are, and many other important bits of information. When one of the patients starts to exhibit some new symptom or reaction (very stressful situation!), doctors and nurses look at the EMR to understand the best course of treatment or intervention.

When the EMR goes down, doctors and nurses revert to pen and paper. It’s very slow, and requires a lot of human handoff - which, critically, they’re less practiced in.

tw04 2 days ago | parent [-]

I literally help design IT resiliency for hospitals. This is absolutely how they work and part of their disaster planning. When there is an extended power outage they stop anything but vital surgery and work off pen and paper.

Which you got to after spending 3 paragraphs talking about what an EMR is for.

vel0city 3 days ago | parent | prev | next [-]

If they cannot handle the grid power being pulled due to load shedding, they have no business handling critical applications.

bluGill 3 days ago | parent [-]

In particular, there is no way you can predict when a backhoe will take out your power line. (Even if they call to get lines located, sometimes the mark is in the wrong spot, though mostly they didn't call in the first place.) There are lots of other ways the power can go down suddenly, and nothing you can do about it except have some other redundancy.

nradov 3 days ago | parent | prev [-]

All of the major cloud EHRs run in multiple availability zones.

mschuster91 3 days ago | parent [-]

The data center might be... but are all fiber routers, amplifiers, PoPs etc. along the line from the datacenter to the hospitals and other healthcare providers backed up as well?

Particularly the "last mile" is often enough only equipped with batteries to bridge small brownouts or outages, but not with full-fledged diesel generators.

And while hospitals, at least those that operate on patients, are on battery banks and huge-ass diesel generators... private small practices usually are not; if you're lucky, the main server has a half-broken UPS whose "BATTERY FAULT" red light no one has looked at for a year. But the desktop computers, VPN nodes, card readers, or medical equipment? If it's not something a power outage could ruin (such as an MRI), it's probably not even battery backed.

There's a saying: "the emperor is naked, he has no clothes". When it comes to the resilience of our healthcare infrastructure, the emperor isn't just naked, the emperor's skin is rotting away.

marcosdumay 2 days ago | parent | next [-]

Just pointing out that none of those non-data-center buildings are data centers.

I really doubt the classification of "data centers and other large, non-critical power consumers" extends to telecom infrastructure.

mschuster91 2 days ago | parent [-]

> I really doubt the classification of "data centers and other large, non-critical power consumers" extends to telecom infrastructure.

That's the point. Okay, cool, the datacenter is highly available: multiple power and data feeds, 24/7/365. But that highly available datacenter is useless when it cannot be reached, because its data-feed elements or the clients don't have power.

nradov 3 days ago | parent | prev [-]

Starlink