Yeah I always heard that the phone lines carried their own power, and in Florida the phones did keep working when the power went out, but I never knew why.
So the grid was always charging up the lead acid batteries, and the phone lines were always draining them? Or was there some kind of power switching going on where when the grid was available the batteries would just get "topped off" occasionally and were only drained when the power went out?
The phone grid predated the electrical grid. There was no other choice for power.
Actually, there was one. Even earlier phones had their own power. A dry-cell battery in each phone, and every 6 months, the phone company would come around with a cart and replace everyone's battery. Central battery was found to be more convenient, since phone company employees didn't have to go around to everyone's site. Central offices could economize scale and have actual generators feeding rechargeable batteries.
It's a pretty decent chunk of power down a POTS cable too, as it was designed to ring multiple big chunky metal bells in the days of yore.
I was wiring in a phone extension for my grandma once as a boy and grabbed the live cable instead of the extension and stripped the wire with my teeth (as you do). I've been shocked a great number of times by mains AC, but getting hit by that juicy DC was the best one yet. Jumped me 6ft across the room :D
The teeth. Yikes! But yeah, I remember having the rotary phone disassembled and touching the wires adjusting something when a ring came. Gave me enough of a jolt to remember.
The batteries, the grid/generator-supplied power supplies, and the telephone switch equipment are all connected in parallel -- as if the entire DC power infrastructure consists of only two wires, and everything involved with it connects only to those two wires.
1. In normal operation, the batteries are kept at a constant state of charge. The switches are powered from the same DC bus that keeps the batteries charged.
2. When the power grid goes down, the batteries slowly discharge and keep things running like nothing ever happened (for hours/days/weeks). There is no switchover for this; it's just the normal state, minus the ability to juice-up the batteries. (Remember: It's just one DC bus.)
3. When the grid comes back up (or the generators kick in), the batteries get recharged. There is no switchover for this either; nothing important even notices. (Still just one DC bus.)
4. If the grid stays up long enough, go to 1. Repeat as the external environment dictates. (And as you might guess, it's still one DC bus and there's also no switchover here. Things just continue to work.)
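The no-switchover behavior in steps 1-4 can be sketched as a toy simulation. All the numbers here are illustrative, not real central-office specs; the point is just that one function models every state, because it's one bus:

```python
# Toy model of a single DC bus: rectifier, battery, and load all in parallel.
# Numbers are made up for illustration, not real telco specs.

def simulate(grid_up_hours, battery_ah=100.0, load_a=5.0, charge_a=10.0):
    """Return battery state of charge (Ah), hour by hour."""
    soc = battery_ah
    history = []
    for grid_up in grid_up_hours:
        if grid_up:
            # Rectifier feeds the load AND tops off the battery -- same bus.
            soc = min(battery_ah, soc + charge_a)
        else:
            # Grid down: the load simply drains the battery. No switchover.
            soc = max(0.0, soc - load_a)
        history.append(soc)
    return history

# 3 hours grid up, a 4-hour outage, then grid back up.
hours = [True] * 3 + [False] * 4 + [True] * 5
print(simulate(hours))
```

Note there's no "if outage then switch to battery" branch anywhere; the battery charges or discharges purely as a consequence of what the bus is doing.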
--
You can play with this at home with a capacitor (which loosely acts like a battery does), an LED+resistor combo (which acts as a load), and a small power supply that is appropriate for LED+resistor you've chosen (which acts as the AC-DC converting grid input).
Wire all 3 parts up in parallel and the light comes on.
Disconnect the power supply, and the light stays on for a bit -- it successfully runs from power stored in the capacitor.
Reconnect the power supply, and the light comes on and the capacitor ("battery") recharges -- concurrently.
Improve staying power by adding more parallel capacitance. Reduce or eliminate it by reducing or eliminating capacitance. Goof around with it; it's fun. (Just don't wire the capacitor backwards. That's less fun.)
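You can also estimate the "staying power" before building anything, using the RC time constant. This is a rough sketch with made-up part values (a 1000 µF cap, 330 Ω resistor, 5 V supply), and it crudely treats the LED+resistor as a plain resistive load, which it isn't quite:

```python
import math

# Rough estimate: how long does the LED stay visibly lit after unplugging?
# Hypothetical parts: 1000 uF cap charged to 5 V, LED (~2 V forward drop)
# behind a 330 ohm resistor, modeled crudely as a resistive discharge.
C = 1000e-6           # capacitance, farads
R = 330.0             # ohms
V0, V_led = 5.0, 2.0  # starting voltage and approximate LED cutoff

# Exponential decay: V(t) = V0 * exp(-t / (R*C)).
# Solve for t where the cap voltage sags to the LED's cutoff.
t = -R * C * math.log(V_led / V0)
print(f"LED visibly lit for roughly {t * 1000:.0f} ms")
```

Doubling the capacitance doubles that time, which matches the "add more parallel capacitance" advice above.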
The batteries and phone lines were one system at -48V, with power supplies converting AC to DC while the grid/generator was up.
The batteries floated at the line voltage; nothing was really charging or discharging, and there was no switchover.
This is similar to your car's 12V DC system: when the car is running, the alternator provides the DC power and the battery floats, doing nothing except buffering large fluctuations and stabilizing the voltage.
Grid charging batteries, phones draining them, as I understand it. Of course there were switches all over the US, so I can't make blanket claims, but from what I hear that was normal.
> Not most of the developers but whoever put in place a system where so much time is spent on overhead/retrograde activities.
Dude that's everybody in charge. You're young, you build a system, you mold it to shifting customer needs, you strike gold, you assume greater responsibility.
You hire lots of people. A few years go by. Why can't these people get shit done? When I was in their shoes, I was flying, I did what was needed.
Maybe we hired second rate talent. Maybe they're slacking. Maybe we should lay them off, or squeeze them harder, or pray AI will magically improve the outcome.
> The cold reality, in my opinion, is that the things we value about ourselves are generally not that valuable to others. I love my own personality and humanity, my soul if you will, but nobody's paying me for it, and so I have to value it accordingly.
How many GB? I see "bluray rip" mp4 files on torrent index sites, which I assume have been aggressively recompressed, but there are three size tiers in the "1080p" category: 2-3GB, 7-10GB, and 15+GB.
You want to search for BDMV for full disc images, or for remuxes which are uncompressed video and audio streams, if you want to get a sense for the size on disc. Typical Blu-ray images will be from 20-40ish GB.
Unmodified Blu-ray disc images are the BDMV folders I mentioned. Any BDMV will be unmodified almost all the time, though I've very occasionally run into modified ones originating from the Chinese piracy scene that had custom subs added.
A "good" remux is actually the highest quality movie release available, usually, if you don't care about file size. A good remux will combine all the best parts of every possible release into one super-file. For one movie, you could have the best video quality be on a French UHD Blu-ray, the best audio quality from a different source, subtitles aggregated from various international releases and streaming platforms (and filtered/deduped for quality), chapter titles taken from an old DVD, and all available commentary tracks collected. Rarely you might even see a hybrid release where multiple streams are spliced together to fix some problem or another in one of them. You can look for releases by the CINEPHILES p2p group for gold standard examples, they get distributed fairly widely so you can probably find some.
To answer what you asked about extra audio tracks specifically (outside of full disc images)--usually non-English dubs are considered bloat and aren't distributed. Commentary tracks are kept. Audio description is a mixed bag, good groups will keep it.
On private trackers where people care about that stuff it's easier. The NFO usually has a pretty comprehensive description of the contents and all the tracks etc so you can decide which version you want before downloading.
It really depends on your hard drive space and your tolerance for compression. Two hours of decently compressed video is a few gigs, but if you want 10-bit HDR with 5.1 audio, then choose the 15 gig torrent.
Also, the studio paid a professional to peep at all the inter-frame pixels and turn the knobs right when they encoded the bluray. I might be able to get a perceptually lossless rip that's 25-50% smaller than the original, but it's just not worth my time.
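The size tiers mentioned above fall out of simple bitrate arithmetic. A sketch, with illustrative bitrates (my guesses, not measurements of any particular release):

```python
# Back-of-envelope: file size for a 2-hour movie at various total bitrates.
# The bitrate figures are illustrative guesses, not from any real release.

def size_gb(hours, mbps):
    """Total stream size in decimal gigabytes for a given average bitrate."""
    bits = mbps * 1e6 * hours * 3600
    return bits / 8 / 1e9

for label, mbps in [("aggressive 1080p re-encode", 3),
                    ("decent 1080p re-encode", 10),
                    ("Blu-ray-ish video bitrate", 25)]:
    print(f"{label}: ~{size_gb(2, mbps):.1f} GB")
```

Roughly 3/10/25 Mbps lands you near the 2-3 GB, 7-10 GB, and 15+ GB tiers respectively, which is why the same movie shows up at such different sizes.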
amazon sells a kindle version for ~$6. looks like deadtree copies retail under $10 online (or ~$17 from PRH). IA has it but it's covered under their stupid pseudo-library false scarcity bargain so you might have to get in line. if you're okay with physical, i bet your local library has it.
otherwise... can't check from work, but perhaps anna's archive/slsk has you covered?
Yep! But you are also a mouse who has limited venues in which to complain.
I wonder if the vaccine causes inflammatory and other unpleasant responses when administered. If so, I wonder if those responses go away after the last dose, when the three months of protection begin.
Here are the two paragraphs that I found interesting:
> The new vaccine, for now known as GLA-3M-052-LS+OVA, mimics the T cell signals that directly stimulate innate immune cells in the lungs. It also contains a harmless antigen, an egg protein called ovalbumin or OVA, which recruits T cells into the lungs to maintain the innate response for weeks to months.
> In the study, mice were given a drop of the vaccine in their noses. Some received multiple doses, given a week apart. Each mouse was then exposed to one type of respiratory virus. With three doses of the vaccine, mice were protected against SARS-CoV-2 and other coronaviruses for at least three months.
> It also contains a harmless antigen, an egg protein called ovalbumin or OVA
Here's hoping the final product doesn't have a side-effect of inducing an allergy to the main component of egg-whites.
Although even if that happened... Would it only apply to the raw materials, as opposed to cooked products where the ovalbumin was denatured by heat?
Edit: No, wait! What about "safe to eat" cookie-dough, which uses heat-treated flour and pasteurized eggs as ingredients!? They might still have intact ovalbumin, and obviously I can't give it up.
And what about people who eat actual raw egg? I routinely eat freshly-made cake batter (made with raw eggs; I just clean the bowl, I don't actually gobble tons of raw cake batter), for instance. It's perfectly safe because I live in a country where they actually check eggs for salmonella before selling them and people routinely eat raw eggs on top of things.
AFAIK people with egg white allergy also have to avoid cooked foods.
My understanding (not a chemist nor doctor) is that it's specific bits of the protein that trigger the allergic reaction, so even if the whole protein breaks down, parts of it will survive and will cause trouble.
I suppose this is similar to how we use broken down bits of virus to trigger immune reactions with vaccines.
The title of this post changed as I was reading it. "It looks like the 'JVG algorithm' only wins on tiny numbers" is a charitable description. The article is Scott Aaronson lambasting the paper and shaming its authors as intellectual hooligans.
Agree. Scott is exactly correct when he just straight calls it crap.
It's inaccurate to say it wins on small numbers because on small numbers you would use classical computers. By the time you get to numbers that take more than a minute to factor classically, and start dreaming of quantum computers, you're well beyond the size where you could tractably do the proposed state preparation.
That slide deck is complaining that correct work on quantum attacks should be seen as negligible priority or as distractions. TFA is complaining that JVG isn't even correct. They are pretty different concerns.
To be clear, I think that slide deck will be looked back upon as naive. In particular, it makes the classic mistake of assuming the size of number factored should be growing smoothly. That's naive because 15 is such a huge cost outlier and because quantum error correction has frontloaded costs. See [1] and [2] for details.
What do you mean? The original 2019 supremacy experiment was eventually simulated, as better classical methods were found, but the followups are still holding strong (for example [4] and [5]).
There was recently a series of blog posts by Dominik Hangleiter summarizing the situation: [1][2][3].
the reason people pay attention to him is that he does a good job publicizing both positive and negative results, and accurately categorizing which are bullshit