
Hi! I'm that person! Senior engineer, decade of experience. I've used debuggers in the past, both for running code and looking at core dumps, but I really don't find them to be cost-effective for the vast majority of problems. Just write a print statement! So when I switched from C to Python and Go a couple of jobs ago, I never bothered learning how to use the debuggers for those languages. I don't miss them.



I am also this person. I'm a systems programmer (kernel and systems software, often in C, C++, Go, a bit of Rust, etc.).

What I find is that if my code isn't working, I stop what I'm doing. I look at it. I think really hard, I add some print statements and asserts to verify some assumptions, and I iterate a small handful of times to find my faulty assumption and fix it. Many, many times, during the 'think hard and look at the code' part, I can fix the bug without any iterations.
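
A minimal sketch of that "verify the assumption" step, in Python for brevity; the function and numbers are hypothetical, just to show the shape of it:

    # Hypothetical: confirm an assumption with an assert plus one print.
    def apply_discount(total, rate):
        # Assumption being checked: rate is a fraction in [0, 1], not a percent.
        assert 0.0 <= rate <= 1.0, f"unexpected rate: {rate!r}"
        discounted = total * (1 - rate)
        print(f"apply_discount: total={total!r} rate={rate!r} -> {discounted!r}")
        return discounted

    apply_discount(100.0, 15)  # trips the assert: 15 was meant to be 0.15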

This almost always works if I really understand what I'm doing and I'm being thoughtful.

Sometimes though, I don't know what the hell is going on and I'm in deep waters. In those cases I might use a debugger, but I often feel like I've failed. I almost never use them. When I helped undergrads with debuggers it often felt like their time would be more productively spent reasoning about their code instead of watching it.


Your list of programming languages excluded Java. Please ignore this reply if Java is included.

Are you aware of the amazing Java debugger feature of "Drop to Frame"? Combined with "hot-injection" (compile new code, then inject it into the currently debugged JVM), it is crazy and amazing. (I love C#, but its hot-injection feature is much worse than Java's -- more than 50% of the time the C# compiler rejects my hot-injection, but about 80% of the time the JVM accepts it.) When working on source code where it is very difficult to acquire data for the algorithm, having the ability to inspect in a debugger, make minor changes to the algorithm, re-compile, inject the new class/method defs, then drop to frame and re-exec the same code in the same debug session is incredibly powerful.


Yes, that sounds pretty cool, and it doesn't take a lot of imagination to see the utility in this. I've done a lot of work on lower-level software, often enough on platforms where debuggers are tough to get working well anyway.

The plus side of less capable tooling is that it tends to limit how complex software can be--the pain is just too noticeable. I haven't liked Java in the past because it seems very difficult without the tooling, and I never had to do enough Java to learn that stuff. Java's tooling does seem quite excellent once it is mastered.


This part: "I haven't liked Java in the past because it seems very difficult without the tooling"

If you are a low-level programmer, I understand your sentiment. A piece of advice: when you need to use Java (or other JVM languages), just submit to all the bloat -- use an IDE, like IntelliJ, that needs 4GB+ of RAM. The increase in programmer productivity is a wild ride coming from embedded and kernel programming. (The same can be said for C#.)


I think there's a sort of horseshoe effect where both beginners and some experienced programmers tend to use print statements a lot, only differently.

When you're extremely "fluent" in code and good at mentally modelling program state, understanding exactly what the code does just by looking at it, stepping through it doesn't typically add all that much.

While I do use a debugger sometimes, I'll more often form a hypothesis by just looking at the code, and test it with a print statement. Using a debugger is much too slow.


> Using a debugger is much too slow.

This varies, but in a lot of environments, using a debugger is much _faster_ than adding a print statement and recompiling (then removing the print statement). Especially when you're trying to look at a complex object/structure/etc where you can't easily print everything.
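
A small illustration of the "can't easily print everything" case, in Python with a made-up nested structure; pdb's `pp` command gives the same readable dump interactively, with no rebuild:

    from pprint import pprint

    # Hypothetical nested structure; a bare print() is one long, hard-to-scan line.
    response = {
        "user": {"id": 42, "roles": ["admin", "ops"]},
        "items": [{"sku": "A-1", "qty": 3}, {"sku": "B-7", "qty": 1}],
        "meta": {"trace_id": "abc123", "retries": 0},
    }

    print(response)   # single line, hard to read once it gets big
    pprint(response)  # indented, one key per line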


I think there is a bit of a paradox: debugging can seem heavyweight, but by the time you've added enough print statements you've spent more time, and added more things to clean up, than if you had just taken the time to debug -- well, then you should have debugged. But you don't know until you know. The same thing shows up with "it would've been faster to not try to automate / code a fuller solution" than to just address whatever you were doing.


> using a debugger is much _faster_ than adding a print statement and recompiling (then removing the print statement)

Don't remove the print statement. Leave it in, printing conditionally on the log level you set.
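
A minimal sketch of that pattern with Python's logging module (the function and data are made up):

    import logging

    log = logging.getLogger(__name__)

    def reconcile(batch):
        # Left in permanently; only emitted when the level is DEBUG.
        log.debug("reconcile: size=%d first=%r", len(batch), batch[:1])
        return sorted(batch)

    # Set level=logging.WARNING in production and the debug line goes quiet.
    logging.basicConfig(level=logging.DEBUG)
    reconcile([3, 1, 2])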


In what way is using a debugger "slow"? I find that it speeds up iteration time because if my print statements aren't illustrative I have to add new ones and restart, whereas if I'm already sitting in the debugger when my hypothesis is wrong, I can just keep looking elsewhere.


I find I use the stepping debugger less and less as I get more experienced.

Early on it was a godsend. Start program, hit breakpoint, look at values, step a few lines, see values, draw conclusions.

Now I rely on print statements. Most of all though, I just don't write code that requires stepping. If it panics it tells me where and looking at it will remind me I forgot some obvious thing. If it gives the wrong answer I place some print statements or asserts to verify assumptions.

Over time I've also created less and less state in my programs. I don't have a zillion variables anymore, intricately dependent on each other. Less spaghetti, more just a bunch of straight tubes or an assembly line.

I think it's possible that over the years I hit problems that couldn't easily be stepped through. They got so complicated that even stepping through the code didn't help much; it would take ages to really understand. So later programs got simpler, somehow.


I find I use the stepping debugger more and more as I get more experienced. Watching the live control flow and state changes allows me to notice latent defects and fix them before they ever cause actual problems. Developers ought to step through every line of code that they write, of course.


It’s because we moved to request-response, and allow for deep linking / saved states. So the state of the session is more easily reproduced.


> I never bothered learning how to use the debuggers for those languages. I don't miss them.

This could be causal.


You'd have to assume that the Python and Go debuggers do something that C debuggers don't do.


Or assume that Python debuggers aren't as nice to use, or that Python does not lend itself to inspecting weird memory and pointer dereferences, or a bunch of other possibilities.


Most of the code I worked with growing up was either very niche or set up in such a way that debuggers weren't an option, so I never really used them much either. I don't understand their appeal when print statements can give you more context to debug with anyway. I'm definitely no senior, but I'm used to solving things the "hard way", as one developer told me. He wondered how I could even work because of how "bad" my tools were, but I didn't know any better, being self-taught, and some software just isn't compatible with the tools he mentioned.


It depends what you're doing. Sometimes inserting a print and capturing state works. Sometimes you're not sure what you need to capture, or it's going to take a few iterations. That's where pdb / breakpoint() / more interactive debuggers can be very helpful.
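
A minimal sketch of that with breakpoint() (available since Python 3.7); the parser and its input here are made up:

    def parse_record(line):
        fields = line.split(",")
        if len(fields) != 3:
            # Not sure yet what to capture, so drop into pdb right here
            # and poke around: `p line`, `pp fields`, `u`/`d`, `c` to continue.
            breakpoint()
        return fields

    parse_record("id,name")  # malformed on purpose, so the breakpoint fires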


What makes you talented?



