| ▲ | lpapez 6 days ago |
| This article goes completely against my experience so far. I teach at an internship program, and the main problem with interns since 2023 has been their over-reliance on AI tools. I feel like I have to teach them to stop using AI for everything and to think through the problem so that they don't get stuck. Meanwhile many of the seniors around me are stuck in their ways, refusing to adopt interactive debuggers to replace their printf() debug habits, let alone AI tooling... |
|
| ▲ | lordnacho 6 days ago | parent | next [-] |
| > Meanwhile many of the seniors around me are stuck in their ways, refusing to adopt interactive debuggers to replace their printf() debug habits, let alone AI tooling... When I was new to the business, I used interactive debugging a lot. The more experienced I got, the less I used it. printf() is surprisingly useful, especially if you upgrade it a little bit to a log-level-aware framework. Then you can leave your debugging lines in the code and switch them on or off with loglevel = TRACE or INFO, something like that. |
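Something like this minimal C sketch (the LOG macro and level names are purely illustrative, not from any particular library):

    #include <stdio.h>

    enum { LOG_ERROR, LOG_INFO, LOG_TRACE };

    static int loglevel = LOG_INFO;   /* bump to LOG_TRACE to switch the debug lines on */

    #define LOG(level, ...) \
        do { if ((level) <= loglevel) { fprintf(stderr, __VA_ARGS__); fputc('\n', stderr); } } while (0)

    int main(void) {
        LOG(LOG_INFO, "starting up");
        LOG(LOG_TRACE, "intermediate value: %d", 42);   /* stays in the code, silent at INFO */
        return 0;
    }

The debugging lines never have to be deleted; they just sit below the current log level until you need them again.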
| |
| ▲ | shmerl 6 days ago | parent | next [-] | | I kind of had the opposite experience. I used to rely mostly on printfs and the like, but I've started using the debugger more. printf doesn't replace walking up and down the call stack in the debugger to analyze the chain of calls (you'd have to spam debug printfs all around wherever you expect that chain to happen, which would waste time). The debugger is really powerful if you use it more than superficially. | | |
| ▲ | boredtofears 5 days ago | parent [-] | | > you'd have to spam debug printfs all around you expect this chain to happen to replace the debugger which would waste time It's not wasting time, it's narrowing in on the things you know you need to look for and hiding everything else. With a debugger you have to do this step mentally every time you look at the debugger output. | | |
| ▲ | shmerl 5 days ago | parent [-] | | Trying to guess what that chain is and putting printfs all along that path feels like a poor simulation of what the debugger does out of the box - and, unlike us, it does it precisely. So I'd say it's exactly the opposite. If you only care about some specific spot, then sure - printf is enough, but you also have to recompile every time you add a new one or change a debug-related detail, while the debugger lets you just re-run without recompiling. So if anything, the printf method can take more time. Also, in the debugger you can reproduce printf using its REPL. |
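Concretely, the workflow being described looks roughly like this in gdb (the function and variable names here are made up):

    (gdb) break parse_record          # stop where the symptom shows up
    (gdb) run
    (gdb) backtrace                   # see the whole call chain at once
    (gdb) frame 3                     # hop to a caller three levels up
    (gdb) print rec->len              # ad-hoc "printf", no recompile needed
    (gdb) call dump_record(rec)       # you can even call your own helpers interactively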
|
| |
| ▲ | ambicapter 6 days ago | parent | prev | next [-] | | > printf() is surprisingly useful, especially if you upgrade it a little bit to a log-level aware framework. What do you mean by this? Do you mean using a logging framework instead of printf()? | |
| ▲ | cbanek 6 days ago | parent | prev | next [-] | | This is absolutely true. If anything, interactive debuggers are a crutch and actual logging is the real way of debugging. There are all sorts of things you really can't debug in an interactive debugger - timing issues, thread problems - and you certainly can't find the actual hard bugs that live in running services in production, you know, where the bugs actually happen and are found. Or on other people's machines, where you can't just attach a debugger. You need good logging, with a good logging library that doesn't affect performance too much when it's turned off, and those messages also provide very useful context for what's going on - often as good as, if not better than, a comment, because at least the log messages are compiled in and type checked, as opposed to comments, which can easily go stale. | | |
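For the timing/thread case in particular, what makes logs workable after the fact is that each line can carry a timestamp and a thread id. A rough POSIX-flavoured C sketch (the TLOG macro is invented; it assumes clock_gettime and pthreads are available):

    #include <pthread.h>
    #include <stdio.h>
    #include <time.h>

    /* each log line gets a monotonic timestamp and the calling thread's id,
       so interleavings can be reconstructed from the log afterwards */
    #define TLOG(...) do { \
        struct timespec ts; clock_gettime(CLOCK_MONOTONIC, &ts); \
        fprintf(stderr, "[%ld.%09ld tid=%lu] ", (long)ts.tv_sec, ts.tv_nsec, \
                (unsigned long)pthread_self()); \
        fprintf(stderr, __VA_ARGS__); \
        fputc('\n', stderr); \
    } while (0)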
| ▲ | TheRoque 6 days ago | parent | next [-] | | Both are valid. If your code is even slightly complex, it's invaluable to run it at least once with a debugger to verify that your logic is all good, and using logs for this is highly inefficient - e.g. if you have huge data structures that are a pain to print, or if after starting the program you notice that you forgot to add a print somewhere it was needed. And obviously, when you can't hook up the debugger, logs are mandatory. It doesn't have to be one or the other. | | | |
| ▲ | brongondwana 5 days ago | parent | prev | next [-] | | This is why, when the conditions aren't what I want, I trigger a segfault which dumps core at the spot where I had the printf, so I can then open the debugger on the core. (Obviously not if I have a copy of the input which can recreate it - in that case a debugger with a conditional breakpoint at the same spot is even better.) |
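Roughly, with invented names (abort() here instead of a literal segfault; either way you get a core file to open with `gdb ./prog core`, assuming core dumps are enabled via `ulimit -c unlimited`):

    #include <stdlib.h>

    static void check_state(int count, int max_expected) {
        /* where the printf used to be: if the state is ever wrong, dump core right
           here and inspect the full stack later from the core file */
        if (count > max_expected)
            abort();
    }

    int main(void) {
        check_state(3, 10);    /* fine */
        check_state(99, 10);   /* dumps core here */
        return 0;
    }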
| ▲ | lenkite 5 days ago | parent | prev | next [-] | | Timing and concurrency issues are actually easier to discover in a debugger than using printf logging. | |
| ▲ | JustExAWS 5 days ago | parent | prev [-] | | Really? I’ve been using interactive debuggers since the days of Turbo C/Turbo Pascal in the mid 1990s. Yes you need good logging also. |
| |
| ▲ | evertedsphere 5 days ago | parent | prev [-] | | it can get even better https://rustc-dev-guide.rust-lang.org/tracing.html#i-dont-wa... |
|
|
| ▲ | VectorLock 6 days ago | parent | prev | next [-] |
| Interactive debuggers and printf() are both completely valid and have separate use-cases with some overlap. If you're trying to use, or trying to get people to use, exclusively one, you've got some things to think about. |
| |
| ▲ | jennyholzer 5 days ago | parent [-] | | printf() users in this thread are very proud of their own ignorance | | |
| ▲ | marssaxman 5 days ago | parent [-] | | I almost exclusively debug via printf / logging, and I am so stupendously ignorant that I have even written and published a multi-platform interactive debugger, to go with the compiler I also wrote (one of several, but this one was the most successful). Make of that what you will, I suppose. |
|
|
|
| ▲ | marssaxman 6 days ago | parent | prev | next [-] |
| That's funny. I remember using interactive debuggers all the time back in the '90s, but it's been a long time since I've bothered. Logging, reading, and thinking is just... easier. |
| |
| ▲ | TheRoque 6 days ago | parent | next [-] | | Really? I find myself thinking the opposite. My program always runs in debug mode, and when there's some issue I put a breakpoint, trigger it, and boom, I can check what is wrong. I don't need to stop the program, insert a new line to print what I _guess_ is wrong, restart the program from scratch, etc. Properly debugging my stack is probably one of the first things I set up because I find it way less tedious. Like, for example, if you have an issue in a huge Object or Array, will you actually print all the content, paste it somewhere else and search through the logs? And by the way, most debuggers also have the ability to set up log points anyway, without having to restart your program. Genuinely curious to know how writing extra lines and having to restart makes things easier. Of course I'm not saying that I never debug with logs, sometimes it's required or even more efficient, but it's often my second choice. | |
| ▲ | marssaxman 5 days ago | parent | next [-] | | I imagine that it depends on the kind of software you are working on. I found debuggers helpful back when I was working on interactive GUI programs which had a lot of complex state, built in classic OOP style with lots of objects pointing at each other, but I have not done that sort of thing in a long time. In part, that's because I got seriously into functional programming, which left me with a much more rigorous approach to state transitions; but then I shifted from conventional software into embedded firmware, where you can sometimes stop the microcontroller and use a debugger via JTAG, but it's usually easier to just stream data out over a serial port, or hook an LED up to a spare pin and make it blink. The world doesn't stop even if you want the software to stop, so the utility of a debugger is limited. After that I went to work for Google, building distributed software running across many machines in a datacenter, and I have no idea how you would hook up a debugger even if you wanted to. It's all logs all the time, there. By the time that was over, I was thoroughly accustomed to logging, and attaching a debugger had come to seem like a nuisance. Since then I've mostly worked on compilers, or ML pipelines, or both: pure data-processing engines, with no interactivity. If I'm fixing a bug, I'm certainly also writing a regression test about it, which lends itself to a logging-based workflow. I don't mind popping into gdb if that's what would most directly answer my question, but that only happens a couple of times a year. | | |
| ▲ | TheRoque 5 days ago | parent [-] | | Thanks for the detailed answer. Indeed I also worked on embedded a long time ago, and the debugger was often the last resort because the debug pins weren't always accessible. |
| |
| ▲ | LandR 5 days ago | parent | prev [-] | | Also, conditional breakpoints - i.e. break on this line if foo==5. I couldn't imagine going back to print-statement-based debugging. It would be a massive waste of time. |
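In gdb terms (file name and variable made up):

    (gdb) break worker.c:120 if foo == 5    # only stop when the condition holds
    (gdb) condition 1 foo == 5              # or attach a condition to breakpoint 1 later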
| |
| ▲ | globular-toast 5 days ago | parent | prev [-] | | Yeah, I remember learning to use gdb when I was starting out in the early 2000s. I totally thought I was "levelling up" as a programmer and, to be honest, felt kinda badass with all those windows open in Emacs. But I've found that the number of times I actually resorted to using the debugger has been so small that I don't remain fluent in its use. What am I supposed to do? Write more bugs? On the other hand, I'm always ready to read, think and put in some print/log statements. |
|
|
| ▲ | jacquesm 6 days ago | parent | prev | next [-] |
| The right tool for the right job. If someone gets the job done with printf() then that would be good enough for me. Interactive debuggers are a great way to waste a ton of time and get absolutely nowhere. They do have their uses, but those are not all that common. The biggest use case of GDB for me has been inspecting stack traces; having a good mental model of the software you are working on is usually enough to tell you exactly what went wrong if you know where it went wrong. Lots of people spend way too much time debugging code instead of thinking about it before writing. Oh, and testing >> debugging. |
|
| ▲ | another_twist 6 days ago | parent | prev | next [-] |
| Nitpicking a bit here, but there's nothing wrong with printf debugging. It's immensely helpful for debugging concurrent programs, where stopping one part would mess up the state and maybe even avoid the bug you were trying to reproduce. As for tooling, I really love AI coding. My workflow is pasting interfaces into ChatGPT and then just copy-pasting stuff back. I usually write the glue code by hand. I also define the test cases and have AI take over those laborious bits. I love solving problems and I genuinely hate typing :) |
|
| ▲ | Gigachad 6 days ago | parent | prev | next [-] |
| I've tried the interactive debuggers but I've yet to find a situation where they worked better than just printing. I use an interactive console to test what stuff does, but inline in the app I've never hit a case where printing wasn't the straightforward, fast solution. |
| |
| ▲ | gdubs 6 days ago | parent | next [-] | | I'm not above the old print here or there, but the value of an interactive debugger is being able to step through and inspect the state of variables at all the different call sites, for instance. | |
| ▲ | jennyholzer 5 days ago | parent [-] | | it's genuinely embarrassing that the printf() diehards in this thread are not clear on this detail. |
| |
| ▲ | davemp 6 days ago | parent | prev [-] | | I've only found them to be useful in gargantuan OOP piles where the context is really hard to keep in your head and getting to any given point in execution can take minutes. In those cases interactive debugging has been invaluable. | |
| ▲ | Gigachad 6 days ago | parent [-] | | I guess that’s the difference. I do rails dev mostly and it’s just put a print statement in, then run the unit test. It’s a fast feedback loop. |
|
|
|
| ▲ | quantiq 5 days ago | parent | prev | next [-] |
| Yeah, I'm not putting stock in this at all. The methodology of this survey seems dubious. |
| |
| ▲ | jennyholzer 5 days ago | parent [-] | | 30% of the articles on this forum are covert LLM advertising | | |
| ▲ | wulfstan 5 days ago | parent | next [-] | | Good, it's not just me. I guess it's not a massive surprise given it's an HN forum and a reasonable percentage of HN candidates are doing LLM/AI stuff in recent cohorts, but it still means I have to apply a very big filter every time I open an article and people wax lyrical about how amazing Claude-GPT-super-codez is and how it has made them twice the engineer they were yesterday at the bargain price of $200 a month... May it all die in a fire very soon. Butlerian jihad now. | | |
| ▲ | jennyholzer 5 days ago | parent [-] | | I agree. Burn the machines-that-think. To anyone reading, if you work on LLM software, I hope that your business fails. |
| |
| ▲ | fragmede 5 days ago | parent | prev [-] | | Every single last link here is some form of advertisement, so why single out LLM-related ads? | |
| ▲ | jennyholzer 5 days ago | parent [-] | | because the underlying technology is so egregiously overhyped in relation to the quality of the services offered | | |
| ▲ | fragmede 4 days ago | parent [-] | | Sure, but why this technology? It's not like this is the first time that's happened, and it won't be the last. |
|
|
|
|
|
| ▲ | unconed 6 days ago | parent | prev [-] |
| The old fogeys don't rely on printf because they can't use a debugger, but because a debugger stops the entire program and requires you to go step by step. Printf gives you an entire trace or log you can glance at, giving you a bird's eye view of entire processes. |
| |
| ▲ | oblio 6 days ago | parent [-] | | Most decent debuggers have conditional breakpoints. | |
| ▲ | jcparkyn 5 days ago | parent | next [-] | | Not to mention tracepoints (logging breakpoints), which are functionally the same as printf but don't require compiling/restarting. | |
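In gdb that's the dprintf command, which injects a printf-style message at a location without touching the source (the location and variables here are made up):

    (gdb) dprintf parser.c:88, "state=%d buf=%s\n", state, buf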
| ▲ | scarface_74 5 days ago | parent | prev [-] | | And have since the mid 1990s at least… |
|
|