> The substations or feeders that are tripped off are not currently determined by real-time metering - instead they are pre-allocated based on their typical demand. This means that the system operator does not really know how much demand will be disconnected at any given time.
This is wild. From an amateur technical perspective, it would only take a cheap Hall sensor inside the transformer to get a pretty good guess of how much current is flowing to the load.
Hell, put the Hall sensor on a board with a microcontroller and a LoRa transmitter and stick it to the outside of the feed line. Seems like an incredibly cheap upgrade to get real-time load data from every substation.
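For illustration, the sensing side really is that simple. A minimal sketch of the firmware math, assuming a ratiometric Hall-effect sensor read over an ADC; the resolution, reference voltage, and sensitivity constants here are made-up placeholders, not real calibration values:

```python
# Hypothetical sketch: estimate feeder current from a ratiometric
# Hall-effect sensor sampled by a microcontroller ADC.
# All constants are illustrative assumptions.

ADC_BITS = 12                # assumed ADC resolution
VREF = 3.3                   # assumed ADC reference voltage (V)
ZERO_CURRENT_V = VREF / 2    # ratiometric sensor idles at mid-rail
SENSITIVITY_V_PER_A = 0.066  # assumed sensor sensitivity (V/A)

def adc_to_current(adc_count: int) -> float:
    """Convert a raw ADC reading to an estimated current in amperes."""
    volts = adc_count / (2 ** ADC_BITS - 1) * VREF
    return (volts - ZERO_CURRENT_V) / SENSITIVITY_V_PER_A
```

A reading at mid-scale maps to roughly zero amps; full-scale maps to the sensor's maximum rated current. Everything past this conversion is just periodically pushing the number out over the LoRa link.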
The nice thing about frequency-based regulation is that it's an inherent property of the system, so as long as you're connected to the grid you've got the info you need to decide when to turn on or off.
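That's exactly how under-frequency load shedding relays work: each one acts purely on a local frequency measurement, no communication required. A rough sketch, with illustrative thresholds loosely modeled on staged 50 Hz schemes (the exact stages vary by grid):

```python
# Sketch of a local under-frequency load shedding (UFLS) decision.
# Each relay needs only its own frequency measurement.
# Stage thresholds and shed fractions below are assumptions.

UFLS_STAGES = [
    (48.8, 0.05),  # (frequency threshold in Hz, fraction of load to shed)
    (48.6, 0.10),
    (48.4, 0.15),
]

def shed_fraction(measured_hz: float) -> float:
    """Cumulative fraction of local load to disconnect at this frequency."""
    return sum(frac for threshold, frac in UFLS_STAGES
               if measured_hz <= threshold)
```

At nominal frequency nothing trips; as frequency sags through each threshold, another block of load drops, all without any relay knowing what the others are doing.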
If you're monitoring real-time power consumption, you then need a whole extra infrastructure to communicate this info back and forth. Of course you then have to consider how you're going to keep that extra infra online in the event of power issues.
Frequency-based regulation only tells you that something is wrong, not what went wrong or how to fix it.
If you find yourself in the middle of a black swan event, and 15 GW have tripped offline, you have milliseconds to dump pretty much exactly 15 GW of load, otherwise more generating capacity is going to trip offline very quickly.
If you only dump 14 GW because you used historical data (which happens to be imprecise, because today's cloud cover reduced rooftop solar output), you're still going to be in trouble. A scheme with sensors at every substation would let you dump exactly the load you need to.
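To make the contrast concrete: with live per-feeder readings you can pick the combination of feeders that actually covers the shed target, instead of trusting pre-allocated typical demands. A hedged sketch of one obvious approach (greedy, largest feeders first); the feeder names and loads are made up:

```python
# Illustrative sketch: choose which metered feeders to trip so their
# combined live load covers the required shed amount.
# Feeder names and MW values are invented for the example.

def pick_feeders(loads: dict[str, float], target_mw: float) -> list[str]:
    """Greedy selection: largest feeders first until the target is met."""
    chosen, total = [], 0.0
    for name, mw in sorted(loads.items(), key=lambda kv: -kv[1]):
        if total >= target_mw:
            break
        chosen.append(name)
        total += mw
    return chosen

live = {"feeder_a": 6000.0, "feeder_b": 5000.0,
        "feeder_c": 4500.0, "feeder_d": 1200.0}
print(pick_feeders(live, 15000.0))
# → ['feeder_a', 'feeder_b', 'feeder_c']
```

With stale typical-demand numbers instead of `live`, the same selection can land a gigawatt short, which is precisely the 14-vs-15 GW problem above.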
The board is not the expensive part. It's the getting reliability qualified and then having staff fit it to every substation, arrange the data links, and construct the dashboard.
I also wonder what the real-time requirement is. Data from a minute ago is fine, except in exactly this kind of situation, when things are changing very quickly.
Doing that at scale is tricky and requires a lot of people to participate in the mechanism, whereas the law only forces producers above a given size to participate.
The estimates we get from seasonal studies are usually close enough, especially since load shedding isn't a finesse exercise.
The situations that require load shedding usually give operators only a few minutes to react, of which analyzing the issue and determining a course of action takes the lion's share. Once you're there, you want the actual action to be as simple as possible, not to factor in many details.