It's not 'impossible', it's economically unviable. There's a difference. We really should be mandating that companies that don't pay fair market prices for the data they use to train their models must open source everything as reparation to humanity.
I have a (completely unscientific) theory that my tinnitus is a result of early exposure to CRT TVs and my brain trying to compensate for the noise. The reason I say that is that it's roughly in the same frequency band as the PAL horizontal refresh (around 15.6 kHz). It's been with me basically my whole life, since long before any real hearing damage would have set in anyway - I remember asking friends when I was 9 or 10 whether they could hear it too. It wouldn't surprise me if there was a window of opportunity, while the brain is still plastic, for these kinds of "adaptations" to set in place.
I can't help but wonder whether the solution here is something like building a multi-resolution understanding of the codebase: from an architectural perspective that includes business context, down to code structure and layout, and further down to what's happening in specific files and functions.
As a human, I don't need to remember the content of every file I work on to be effective, but I do need to understand how to navigate my way around, and enough of how the codebase hangs together to make good decisions about where new code belongs, when and how to refactor, etc. I'm pretty sure I don't have the memory or reading comprehension to match a computer, but I do have the ability to form context maps at different scales and switch 'resolution' depending on what I'm hoping to achieve.
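(Just to make the idea concrete - this isn't anything the comment above proposed - here's a toy sketch of a multi-resolution "context map" in Python. All the names, summaries, and the example repo are hypothetical.)

    from dataclasses import dataclass, field

    @dataclass
    class Node:
        name: str
        summary: str                         # one-line description at this level
        children: list = field(default_factory=list)

    def view(node, depth, level=0):
        """Collect summaries only down to the requested 'resolution' (depth)."""
        lines = ["  " * level + f"{node.name}: {node.summary}"]
        if level < depth:
            for child in node.children:
                lines += view(child, depth, level + 1)
        return lines

    repo = Node("shop", "e-commerce backend", [
        Node("billing", "invoicing and payment flows", [
            Node("invoice.py", "builds and persists invoices"),
        ]),
        Node("catalog", "product listings and search"),
    ])

    print("\n".join(view(repo, depth=1)))   # architectural view
    print("\n".join(view(repo, depth=2)))   # zoomed in to the file level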
If the tariffs are able to act as a sustained source of revenue then they're not working to bring manufacturing back to the US. If they work to bring manufacturing back then they're not going to raise much revenue (long term). Either way, the bulk of the pain is going to be borne by average people.
- a lot of DSP stuff is pretty magical in its various applications - digital filtering, modulation/demodulation, recovery of weak signals in noisy environments, beam-forming, etc.
There's a lot of really amazing, largely mathematical, foundation work that we now tend to take for granted and without which we'd be back in the relative dark ages.
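(To make "digital filtering" concrete, here's a minimal sketch - not from the comment, just the textbook moving-average FIR low-pass in Python. The window length and the test signal are arbitrary choices for illustration.)

    import numpy as np

    def moving_average(signal, taps=8):
        """Smooth a signal with an equal-weight (rectangular) FIR kernel."""
        kernel = np.ones(taps) / taps        # FIR coefficients, each 1/taps
        return np.convolve(signal, kernel, mode="same")

    # Noisy 5 Hz sine in, visibly smoother sine out.
    t = np.linspace(0.0, 1.0, 500)
    noisy = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(t.size)
    smoothed = moving_average(noisy, taps=16)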
I think it's probably about the timing. I'm not an expert, but I believe that opening the side borders requires cycle-exact timing at both edges of the screen, and when you have characters enabled on the screen, the VIC-II graphics chip ends up 'stealing' memory cycles from the CPU every 8 scanlines to fetch character and font data from RAM. These lines are known as 'badlines'.
Across an entire scanline I think you normally get something like 63 6510 CPU cycles to do 'work' in, but only 23 if you hit a badline - keeping in mind that some instructions take multiple cycles to execute. This probably makes the timing difficult or impossible to manage with the characters turned on.
It's not just that the 'badlines' steal 40 cycles. They steal a solid block of 40 cycles that covers most of the screen, from the end of hblank until just before the right border starts. This blog post [1] has a nice interactive demo showing badline timings.
During a badline it's simply impossible to write to the VIC-II's registers during the left border. That said, the post seems to indicate it's still possible to open the right border during a badline, but it's a 1 cycle window (maybe 2).
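(A rough back-of-the-envelope in Python, using the approximate cycle counts from the comments above, just to show how much of the frame the badlines eat.)

    # Approximate figures from the comments above: ~63 CPU cycles per PAL
    # raster line, ~23 left over on a badline, one badline per character row
    # (25 visible rows), 312 raster lines per PAL frame.
    CYCLES_PER_LINE = 63
    CYCLES_LEFT_ON_BADLINE = 23
    LINES_PER_FRAME = 312
    BADLINES_PER_FRAME = 25

    with_screen_on = ((LINES_PER_FRAME - BADLINES_PER_FRAME) * CYCLES_PER_LINE
                      + BADLINES_PER_FRAME * CYCLES_LEFT_ON_BADLINE)
    screen_blanked = LINES_PER_FRAME * CYCLES_PER_LINE

    print("cycles/frame with characters on :", with_screen_on)   # ~18656
    print("cycles/frame with badlines off  :", screen_blanked)   # ~19656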
> The amount of effort to do a single 30 minute video of this sort when scaled out to a half or full year math class is significant.
This is true if Grant is the only person doing the work; however, having a well-educated and scientifically engaged populace seems important enough that we (the human race) should devote a few more resources to creating high-quality (and freely available) courseware for all curricula/year levels.
Start paying attention to the things that bog you down when working on code, and the things that your users (ought to) expect to be easy but that are inscrutably difficult to achieve with your existing codebase. Find high-quality open source projects and study them. Read books (e.g. Domain-Driven Design [Distilled]). Stay somewhere long enough to feel the impact of your own bad design decisions. Take some time occasionally to pause and reflect on what isn't working and what could have been done better in hindsight. Consider whether you could have done anything differently earlier in the process to avoid needing hindsight at all.
Kung Fu Flash is probably all you need, with a few caveats (e.g. with KFF the drive is "emulated" by intercepting KERNAL vectors rather than acting as a 1541 on the serial bus, so some software that e.g. uses fast loaders or relies on the disk drive for offloading computation won't work).
If you want to get fancy you could go for something like an Ultimate II+ and a USB key, which will get you a bunch of extra functionality like network connectivity, extra SID support, pretty solid compatibility, REU emulation, etc. (but the UII+ will also cost a lot more).
Given you've got a real 1541, maybe you could just copy files/disk images across to the real thing if KFF doesn't work for a particular program, I guess?
I credit the C64 that I had as a kid and magazines like COMPUTE! / Compute's Gazette for my career in software. I taught myself 6510 assembler and started writing some simple demo-like things on that machine, and got hooked on the feeling of creativity that it unlocked.
Funnily enough I'd been thinking that it's about time I tried (again, as an older person) to write a game or a demo for the old 64.
It's absolutely amazing what people are able to get out of these 40+ year old machines now, and I love that there's still a vibrant scene.
In addition to the tools specified in the article, I would also recommend "Retro Debugger". It's an amazing tool for single-stepping through code and seeing what's going on, even letting you follow the raster down the screen to see what code is executing on given scanlines.
Also, there are some really good YouTubers out there helping to demystify how various games/demos work. Martin Piper comes to mind as a good example.
I credit the BASIC and machine language byte code type-in programs for reinforcing my attention to detail and being able to track down software problems.
Kids these days[0] will never know the "pleasure" of spending hours typing in some cheesy BASIC game only to have to track down any number of syntax errors!
It's amazing that I still remember some opcodes like the ones you posted (and others, such as 0xAD for absolute LDA, 0x78 for SEI, 0x58 for CLI) after all these years. Brains are weird.
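(Purely illustrative - a hypothetical few bytes hand-assembled from those same opcodes, dumped the way the old type-in listings presented machine code.)

    # 0x78 = SEI, 0xAD = LDA absolute, 0x58 = CLI, 0x60 = RTS;
    # $D012 is the VIC-II raster register. The routine itself is made up.
    routine = bytes([
        0x78,               # SEI        ; disable interrupts
        0xAD, 0x12, 0xD0,   # LDA $D012  ; read the current raster line
        0x58,               # CLI        ; re-enable interrupts
        0x60,               # RTS
    ])
    print(", ".join(str(b) for b in routine))   # decimal, DATA-statement style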
I had a C64 as well. My school had a programming class and we all shared a TRS80 (I think). I remember writing a program to find prime numbers and thinking about various optimizations. Mine was fastest, and I was proud. Then the boy that wrote directly in assembly ran his... That was the moment I decided to get good. :-)
I have similar experiences and sentiments myself. One difference is I was into Apple and Atari computers, but that does not seem to matter all that much.
As a younger person, I did demos and explored the tech plenty without actually building finished applications and/or games.
Learned a ton! And had major league fun. Great times filled with bits of understanding I draw on all the time.
And YES! Good grief, the pixels are dancing in ways nobody would have predicted back then.
When I hop on the machines today I find them simpler than I remember and fun to program.
My experience as well, but using the ZX-Spectrum. Trying to figure out why the machine code I hand-translated from Z80 assembler crashed the computer taught me a lot. No internet to ask for help. Just a book explaining how to program the ZX-Spectrum using machine code. I was 11 at the time.