| crystal_revenge | a day ago |
> What are you building?

I think AI really pushes this higher up the abstraction layer:

> What problem are you solving?

I've spent a good amount of my career using engineering and math to solve specific problems, usually adjacent to software teams. What I've seen happen with agentic coding is that traditional software engineers keep focusing on using it to build software, while ignoring the problem they're trying to solve. Meanwhile I've seen junior data analysts start interfacing with applications and tools they never dreamed of before, and delivering results to stakeholders in record time. Things that were previously blocked by engineering no longer are.

But many engineers today are not really problem solvers, they're software builders. The idea that the goal is solving the end user's problem, not building them software, is incomprehensible. And so they continue to struggle to use AI effectively, because they're trying to build software with it. Which it's not terrible at, but it's really the wrong tool for that job.

Sometimes software is necessary to solve a problem. A few years ago, software was necessary for a fairly large problem surface area (though, to your point, even then a lot of software was not really built to solve those problems). Today that surface area is shrinking, and as economic constraints loom on the horizon, I believe it will increasingly be the people who solve problems (with or without AI) who survive.
| Panzer04 | 11 hours ago |
The kinds of jobs analysts are doing are probably the most amenable of all to LLM assistance: small, bounded, etc. The bigger the problem set and context, the less helpful an LLM gets.