marssaxman | 5 days ago
I imagine that it depends on the kind of software you are working on. I found debuggers helpful back when I was working on interactive GUI programs which had a lot of complex state, built in classic OOP style with lots of objects pointing at each other, but I have not done that sort of thing in a long time. In part, that's because I got seriously into functional programming, which left me with a much more rigorous approach to state transitions.

Then I shifted from conventional software into embedded firmware, where you can sometimes stop the microcontroller and use a debugger via JTAG, but it's usually easier to just stream data out over a serial port, or hook an LED up to a spare pin and make it blink (there's a rough sketch of what I mean at the end of this comment). The world doesn't stop even if you want the software to stop, so the utility of a debugger is limited.

After that I went to work for Google, building distributed software running across many machines in a datacenter, where I have no idea how you would even hook up a debugger if you wanted to. It's all logs, all the time, there. By the time that was over, I was thoroughly accustomed to logging, and attaching a debugger had come to seem like a nuisance.

Since then I've mostly worked on compilers, or ML pipelines, or both: pure data-processing engines, with no interactivity. If I'm fixing a bug, I'm certainly also writing a regression test for it, which lends itself to a logging-based workflow. I don't mind popping into gdb if that's what would most directly answer my question, but that only happens a couple of times a year.
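Roughly what I mean by the serial-port / blinking-LED approach, as a minimal sketch; led_toggle, uart_putc, and delay_ms here are hypothetical stand-ins for whatever your board support code actually provides:

    /* A rough sketch of the "poor man's debugger" described above.
     * led_toggle(), uart_putc(), and delay_ms() are hypothetical
     * board-support calls -- substitute whatever your HAL or
     * register map actually exposes. */
    #include <stdint.h>

    extern void led_toggle(void);          /* flip a spare GPIO pin */
    extern void uart_putc(char c);         /* write one byte to the serial port */
    extern void delay_ms(uint32_t ms);     /* busy-wait or timer delay */

    /* Blink N times to signal "the code reached checkpoint N". */
    static void debug_blink(uint32_t n)
    {
        for (uint32_t i = 0; i < n; i++) {
            led_toggle();
            delay_ms(100);
            led_toggle();
            delay_ms(100);
        }
        delay_ms(500);                     /* gap so consecutive codes stay readable */
    }

    /* Or stream a log line out over the serial port. */
    static void debug_puts(const char *s)
    {
        while (*s)
            uart_putc(*s++);
        uart_putc('\n');
    }

Crude, but it doesn't require stopping the world, which is the whole point.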
TheRoque | 5 days ago
Thanks for the detailed answer. Indeed, I also worked on embedded a long time ago, and the debugger was often the last resort because the debug pins weren't always accessible.