In an unrelated Z80 testing situation, my son was asked, while in college, to verify a gate-array implementation of a Z80 for a NASA project. He wrote an opcode-by-opcode comparison of a real Z80 against the DUT, capturing the internal state of both before and after each instruction. The DUT failed to set flags consistently on some opcodes! It was a gate-array implementation for custom hardware (a radar unit with a built-in FFT processor). He went on to other projects, but still tells the story of getting a part stepped for NASA.
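For anyone curious what that kind of harness looks like, here's a minimal Python sketch of the idea - run each opcode, snapshot the state of both parts before and after, and diff the results. The CPU objects and their read_register()/reset()/load()/single_step() methods are made up for illustration; his actual rig obviously looked different.

    # Hypothetical opcode-by-opcode comparison harness. 'reference' and 'dut'
    # are assumed to expose the same interface; all method names are invented.

    REGISTERS = ["A", "F", "B", "C", "D", "E", "H", "L", "IX", "IY", "SP", "PC"]

    def capture_state(cpu):
        # Snapshot every register we care about (F holds the flags).
        return {reg: cpu.read_register(reg) for reg in REGISTERS}

    def compare_opcode(reference, dut, opcode, operands=b""):
        for cpu in (reference, dut):
            cpu.reset()
            cpu.load(0x0000, bytes([opcode]) + operands)

        before = (capture_state(reference), capture_state(dut))
        reference.single_step()
        dut.single_step()
        after = (capture_state(reference), capture_state(dut))

        # Report any register that ended up different between the two parts.
        mismatches = {reg: (after[0][reg], after[1][reg])
                      for reg in REGISTERS if after[0][reg] != after[1][reg]}
        return before, after, mismatches

    # for opcode in range(0x00, 0x100):
    #     _, _, mismatches = compare_opcode(real_z80, gate_array_z80, opcode)
    #     if mismatches:
    #         print(f"opcode {opcode:#04x}: {mismatches}")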
Anyway, years later I'm gratified to see a short animation posted here on HN of the Juno space probe flying by Jupiter. Some of his code (unrelated to testing) was flying on that!
If you get into a similar situation again: thanks to the visual6502 project there's now a transistor-level simulation of the Z80 (there has been for several years actually, the URL was just never as popular as the 6502 simulation):
One thing to be aware of, though: the various Z80 clones had slightly different undocumented behaviour, so some differences are to be expected where that behaviour leaks into the outside world (like the "X" and "Y" flag bits at bit positions 3 and 5).
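If you're diffing against a clone or a simulation, a common trick is to mask those bits out of F before comparing. A tiny sketch (names invented for illustration):

    # Bits 3 and 5 of F are the undocumented X/Y flags, which vary between
    # Z80 implementations - mask them out unless you specifically care.
    XY_MASK = ~((1 << 3) | (1 << 5)) & 0xFF   # == 0xD7

    def flags_match(f_reference, f_dut, ignore_undocumented=True):
        if ignore_undocumented:
            return (f_reference & XY_MASK) == (f_dut & XY_MASK)
        return f_reference == f_dut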
In his case, he had to test the DUT he was given, and not a simulator.
If he'd used a simulator for the reference design, he'd have had to verify it against an actual Z80 first - which would have been the same effort he went through to test his DUT anyway!
The author's relationship with the machine reminds me of Scotty in Star Trek. In the age of "the cloud", that kind of engineering intimacy feels like something lost in days of yore.
The cloud is just a rebranding of timesharing; anyone who got their scars on mainframes, microcomputers and UNIX can see lots of similarities with "Cloud Native".
As Sun used to put it, The Network is the Computer.
From the point of view of the user, I wish this was more true than it is. Cloud users have a specific thing they want to get done, and ideally, the cloud should do that and get out of the way. And yet, there's still room for specialists to "get close" to the cloud and have an expertise in glueing cloud services together to make a product.
From the point of view of cloud providers, standing up a cloud service is enormously complex, filled with interesting distributed computing problems, especially at scale, with diverse tenancy (and the predators that attracts), diverse workloads, and creative consumers finding ways to use it that you hadn't considered.
> here were two red and two green lamps. These are a go-no/go set for parametric failures such as continuity and leakage, and a second pair of Red/Green lamps for functional failures.
Interesting idea to have separate parametric and functional test indicators. I've made a few custom production test setups in recent years (nothing nearly as complicated as the one described here) and always went with the greatest simplicity possible - a start button and a single red/green go/no-go indicator - exactly to avoid the kind of problem described later in the article.
I can see why they would do that, though. When reviewing test runs, it's always the parametric tests that give the most false positives. If a parametric test fails, the part still functions as intended, but some measured parameter is outside the designed range. Parametric failures are often not failures in the parts themselves, but failures in the test procedure (e.g. bad connections to test points due to dirt or worn-out contacts in the fixture).
A separate indicator for parametric failures gives the operator an early hint that excessive failures might be false positives, prompting them to re-test the parts or to clean or exchange the test fixture.
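The logic is easy enough to sketch if you log both results per part - something like this (thresholds and names invented for illustration):

    # Track parametric and functional failures separately so a run of
    # parametric-only failures can flag a likely fixture problem.
    from collections import deque

    RECENT = deque(maxlen=20)       # rolling window of recent results
    PARAMETRIC_ALARM_RATE = 0.3     # arbitrary threshold for this sketch

    def record_result(parametric_ok, functional_ok):
        RECENT.append((parametric_ok, functional_ok))
        # Parametric-only failures: parameter out of range, part still functional.
        parametric_fails = sum(1 for p, f in RECENT if not p and f)
        if len(RECENT) == RECENT.maxlen and parametric_fails / len(RECENT) > PARAMETRIC_ALARM_RATE:
            print("Many parametric-only failures: check fixture contacts before scrapping parts")
        return parametric_ok and functional_ok   # single go/no-go verdict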
Even today, testers cost millions of dollars, so chip designers spend ages optimising their test vectors to minimise how long each chip spends on that expensive tester (or testers - often you do a die sort to weed out obviously bad dies before packaging).
Testers are fascinating. I worked on firmware for a part under test and was amazed that tests are often literal vectors of bits and that interfaces (UART, JTAG, etc.) are bit-banged. The test team was super focused on time, with the mindset that every additional second of test time was a million dollars less profit on the part.
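For those who haven't seen it: "vectors of bits" really does mean the stimulus is a flat list of pin states clocked out one bit period at a time. A toy Python sketch of what one bit-banged UART byte (8N1) turns into - set_tx_pin()/wait_bit() are placeholders for whatever the tester actually drives:

    def uart_byte_to_vector(byte):
        bits = [0]                                    # start bit (line low)
        bits += [(byte >> i) & 1 for i in range(8)]   # data bits, LSB first
        bits += [1]                                    # stop bit (line high)
        return bits

    def drive_vector(bits, set_tx_pin, wait_bit):
        for level in bits:
            set_tx_pin(level)
            wait_bit()   # one bit period, e.g. 1/115200 s

    # Example: uart_byte_to_vector(0x55) -> [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]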