I spent a lot of time reading Knuth's books, especially Seminumerical Algorithms, when I was implementing multi-precision arithmetic for various architectures. What amazing pieces of work they are. That was back in the time when you could look at assembler (or even C code) and get a good idea of how many cycles it would take.
I love the Art of the Algorithm, but I have to say I'm very happy to use off the shelf libraries now. I've implemented binary search dozens of times and each time spent ages fixing all the corner cases. Instructive, but a waste of time when you can import a library which is battle tested and faster than anything you'll ever write.
Algorithms are dead - long live the (commoditized) Algorithm!
Incidentally, I think there is at least one important area where Knuth is actually still ahead of off-the-shelf libraries as of 2020: "broadword computing". When skimming volume 4A for semi-recreational purposes a few years ago, I was surprised to find multiple "bit-hacks" which were better than anything I had found in other sources, and proceeded to use them in my own open-sourced work. Every time I've encountered a situation where bitvectors are useful, I have benefited from rolling my own implementation.
Came here to say this. It's rare that you want to consume these things through an abstraction layer: the layer is often heavier than the work underneath. Sucks for readability, though, so I usually spend as much time documenting as implementing.