Yes, I would have written a test to confirm that input_a generates output_b. The first half of that function is nothing but a string builder and easily testable. If they were copy-and-pasting the actual output to get the expected output, then yes: they screwed that part up.
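For what it's worth, here's a minimal sketch of that kind of test, assuming a hypothetical build_payload function standing in for the string-building half (every name here is illustrative, not taken from their code):

    import unittest

    def build_payload(host: str, path: str) -> str:
        # Stand-in for the string-building half of the function under discussion.
        return f"GET {path} HTTP/1.1\r\nHost: {host}\r\n\r\n"

    class BuildPayloadTest(unittest.TestCase):
        def test_known_input_produces_known_output(self):
            # The expected string is written out by hand from the spec,
            # not copy-pasted back from the program's own output.
            expected = "GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n"
            self.assertEqual(build_payload("example.com", "/index.html"), expected)

    if __name__ == "__main__":
        unittest.main()

The point being that the expected value comes from the spec (or at least from a human reading it), so the test can actually catch the builder being wrong.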
I'm far from a TDD purist, but it's clear they weren't sufficiently validating their code. If they had been, this would not have happened. I'm not saying this as an attack on their skills as programmers, but as a caution to others reading the story: you have to - have to - test your stuff.
It's one thing to lean on third-party libraries and expect them to mostly Do The Right Thing, especially if they're popular and come from a culture that values test coverage. If you're writing a Rails app, for instance, you might be forgiven for not writing your own independent validations of the Ruby methods you call. But writing string-building code to implement RFC-defined network protocols? You should have some confidence that your program is generating the output the other party will be expecting. Especially with something as commonly proxied as unencrypted HTTP; you just have to assume that your data will be traversing, and being analyzed by, systems 100% outside of your control.
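One cheap way to get that confidence, as a sketch: run whatever bytes you build through an independent parser and make sure it can make sense of them. Here I'm using Python's stdlib header parser purely as an example of "something outside my own code"; build_request is a placeholder name for whatever assembles the request:

    import io
    import http.client

    def assert_wire_readable(raw: bytes) -> None:
        buf = io.BytesIO(raw)
        # Request line must be exactly "METHOD SP request-target SP HTTP-version".
        request_line = buf.readline().rstrip(b"\r\n").split(b" ")
        assert len(request_line) == 3, "malformed request line"
        assert request_line[2].startswith(b"HTTP/"), "missing HTTP version"
        # Let the stdlib parse the header block the way a server or proxy would.
        headers = http.client.parse_headers(buf)
        assert headers["Host"], "HTTP/1.1 requests must carry a Host header"

    # e.g. assert_wire_readable(build_request(...)) inside a unit test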
At first I was thinking that suggesting you check the exact output of a request might be a bit much, especially since it could be entirely variable and cause your tests to break at any point during refactoring. If that output were produced by a third-party framework, as you point out, you might not get a whole lot of value from testing it. However, if you're constructing your own HTTP requests, as they seem to be, then yeah, you probably need to explicitly check that they're being built up correctly. Or, since this appears to be a one-off build-up rather than common/shared functionality, it could probably be abstracted into a common function/utility that does it for you. That should be easily unit-testable. Fair enough.
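Something like this is what I had in mind, as an illustrative sketch only (names and signature are mine, not the original code's): pull the one-off assembly into a single helper, and then the exact-output test only has to be written once.

    def build_request(method: str, target: str, host: str,
                      headers: dict | None = None, body: bytes = b"") -> bytes:
        # Request line plus the mandatory Host header.
        lines = [f"{method} {target} HTTP/1.1", f"Host: {host}"]
        for name, value in (headers or {}).items():
            lines.append(f"{name}: {value}")
        if body:
            lines.append(f"Content-Length: {len(body)}")
        # Header block ends with a blank line; every line break is CRLF.
        return ("\r\n".join(lines) + "\r\n\r\n").encode("ascii") + body

Every call site gets the CRLFs and the Host header for free, and the unit test pins down the one place the bytes are assembled.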