
I think there's a sort of horseshoe effect where both beginners and some experienced programmers tend to use print statements a lot, only differently.

When you're extremely "fluent" in code, good at mentally modelling program state and understanding exactly what the code does just by reading it, stepping through it in a debugger typically doesn't add all that much.

While I do use a debugger sometimes, I'll more often form a hypothesis by just looking at the code, and test it with a print statement. Using a debugger is much too slow.




> Using a debugger is much too slow.

This varies, but in a lot of environments, using a debugger is much _faster_ than adding a print statement and recompiling (then removing the print statement). Especially when you're trying to look at a complex object/structure/etc where you can't easily print everything.
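In Python, for example, dropping into the interactive debugger can beat writing a separate print for every field of a nested structure. A minimal sketch (the data and function names here are purely illustrative):

```python
# Hypothetical nested structure that would be tedious to print field by field.
order = {
    "id": 7,
    "customer": {"name": "Ada", "tier": "gold"},
    "items": [{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 1}],
}

def total_quantity(order):
    # Uncommenting the next line drops into pdb right here; at the (Pdb)
    # prompt you can explore the whole structure interactively:
    #   p order["customer"]    inspect one field
    #   pp order               pretty-print everything
    # breakpoint()
    return sum(item["qty"] for item in order["items"])

print(total_quantity(order))
```

One `breakpoint()` gives access to everything in scope, where prints would have to be added, recompiled or rerun, and cleaned up one at a time.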


I think there is a bit of a paradox here. Firing up the debugger can feel heavyweight, but by the time you've added enough print statements, you've often spent more time, and created more cleanup work, than if you had just debugged from the start. The catch is that you don't know that in advance. The same pattern shows up in the "it would have been faster not to automate this" situation: you only learn the cheaper path was wrong after you've gone down it.


> using a debugger is much _faster_ than adding a print statement and recompiling (then removing the print statement)

Don't remove the print statement. Leave it in, printing conditionally on the log level you set.
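With Python's standard `logging` module, that looks something like the sketch below (the function and field names are made up for illustration). The debug line stays in the code permanently and only fires when the log level is lowered:

```python
import logging

# Module-level logger; silent by default because the root level is WARNING.
logger = logging.getLogger(__name__)

def process(order):
    # A permanent replacement for a temporary print(): this line costs
    # almost nothing when DEBUG is disabled, and needs no cleanup.
    logger.debug("processing order id=%s total=%s", order["id"], order["total"])
    return order["total"] * 2

# Flip the level on when investigating; remove or raise it when done.
logging.basicConfig(level=logging.DEBUG)
process({"id": 42, "total": 100})
```

Using `%s`-style lazy formatting (rather than an f-string) also means the message is never even built when the level is disabled.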


In what way is using a debugger "slow"? I find that it speeds up iteration time because if my print statements aren't illustrative I have to add new ones and restart, whereas if I'm already sitting in the debugger when my hypothesis is wrong, I can just keep looking elsewhere.



