What's with giving error rates as a count of bits, when it's not clear how much DRAM was tested? There's a comment that the error was around 50% but the graphs should be error percentage, not some (meaningless) absolute number!
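To make the complaint concrete: a raw error count only becomes comparable across modules once you divide by the number of bits tested. A minimal sketch, with entirely made-up numbers (not from the article):

```python
# Illustrative only: turning an absolute bit-error count into a per-bit rate.
# The module size and error count below are hypothetical.
module_bits = 8 * 8 * 2**30     # an assumed 8 GiB module = 2^36 bits
errors_observed = 1_000_000     # hypothetical raw count from one test run

error_rate = errors_observed / module_bits
print(f"{error_rate:.2e} errors per bit")  # ≈ 1.46e-05
```

Without that normalisation, a bar chart of absolute counts says nothing unless every bar covers the same amount of DRAM.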
And there was an interesting feature - error rates didn't change linearly with time; instead (strongly for DDR4, less so for DDR5) the error rate stepped at 8-second intervals. That's very unexpected, so it either needs a good explanation or points to a likely error in their procedure.
I agree that absolute numbers are a bit strange, but the article states exactly which memory modules were used: a W-NM56S508G SODIMM for DDR5 and a KF432C16BB/4 DIMM for DDR4. Besides, the most important part is comparing their performance across generations.
The error rate is given per bit, not per second, i.e. every few bars represents a distinct DRAM chip. That makes some sense, and the article explains quite well why DRAM would behave like that... but I agree that I had to read the article at least twice to figure out that the x-axis on the graph represents the lower bit of the address line!
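If I'm reading that right, "bucketing by the lower bits of the address line" just means grouping fault addresses by address modulo a small power of two. A hypothetical sketch (the addresses and bit count are invented, not from the article):

```python
# Hypothetical: group fault addresses by their low address bits,
# which is what an x-axis labelled with the lower address bits implies.
from collections import Counter

LOW_BITS = 3  # assume the plot buckets on the 3 lowest address bits

error_addresses = [0x1000, 0x1001, 0x1009, 0x2001, 0x2002]  # made-up faults
buckets = Counter(addr & ((1 << LOW_BITS) - 1) for addr in error_addresses)
print(dict(buckets))  # {0: 1, 1: 3, 2: 1}
```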
I think it might be flash memory (i.e. non-volatile, unlike RAM), but this must be so obvious to readers of this publication that it's never said explicitly.
Yup - as they say, "the cloud" is just someone else's computer.
It's not so hard to genuinely self-host. You just need a reasonable ISP who is willing to open up your connection to inbound traffic, and to be sensible about securing your systems.
Aarrggghhh. Why do we insist that projects/systems/languages must be continually changing and evolving?
Shouldn't there be room for a system that just does the thing(s) it does well? Why do we need to keep tweaking, adding increasingly obscure "features" and new bugs?
Unlike a human language, a computer language isn't "dead" when it stops changing. It is dead when nobody is using it. These are very different criteria.
> Unlike a human language, a computer language isn't "dead" when it stops changing. It is dead when nobody is using it. These are very different criteria.
A human language is considered dead if it no longer has any first-language speakers, but does have second-language speakers or is used fluently in written form, such as Latin. [0]
I broadly agree with your point though. Many languages would do well to slow their rate of change. There are very few slow-changing languages, like C, Forth, and Scheme. This 4 year old comment of mine on this topic is still applicable. [1]
Agreed - languages that keep on changing result in a lot of churn in the ecosystem, as older libraries quickly feel dated when they do things "the old way".
New features often interact poorly with some of the existing features, as those features weren't designed with the new feature in mind.
Be careful what you wish for. One such niche is COBOL development. COBOL hasn't changed much, and it's anything but dead in the sense of not being used - but there are more than a few developers who wish it were.
>> The existing ones are all deficient in rather serious ways
But most of the new ones look and behave just like the old ones - but with slightly more awkward syntax to allow for some special "feature" dear to the authors ;)
I routinely check out the various new languages mentioned on HN and Lobsters. While plenty of languages are new to me, I've yet to see any feature that is new to me :(
I appreciate the work that goes into any language, but the stream of "new" languages feels like a stream of tweaks and rearrangement. It's disappointing. Most new languages come across as "like language X but with feature Y".
Reminds me of all those "Airbnb for X" proposals from a few years ago:)
> they almost always culminate in the pilot failing to control the airplane
"Failing to control the airplane" is a little like saying everyone dies of heart failure. Yes, their heart stopped - but why!?
The real cause is somewhat earlier. Why was a normally competent pilot in a situation where they no longer adequately controlled the aircraft?
Improve avionics - great. Improve situational awareness - really helpful. Handling should be way down the list - it becomes quite intuitive very quickly.
Minimal traffic? An ATPL pilot here. Fly into any major airport and you'll see that it can be REALLY BUSY. The system is based around separation down to about 1 minute. Major airfields are all traffic limited (hence "slots").
1 minute might seem like a lot when you're walking or driving, but when coming in at 200mph in something that weighs hundreds of tons ... I want that guy 1 minute ahead of me off the runway ;)
And that's considering that existing traffic is of similar speed and ability. Adding slow, small, low-performance aircraft to the mix makes things very interesting ;( Then add low-experience pilots who have to stop and think about procedures, actions, radio calls ...
> permission to back their car out of their driveway

It might be different if those cars cost $100M-$300M. A tiny bingle is very expensive.
True, I suppose - and it's not like we're on the verge of any new energy-density tech that would make personal VTOLs viable, so runways might remain the bottleneck that keeps throughput constrained for the foreseeable future. Although you can always build more of them :)