There's no question they have capabilities that no other tool has. But a good tool goes beyond just doing something: there are generally agreed-upon principles of tool design that separate a good tool from a merely useful one.
For example, I think a hammer is a good tool because every time I swing it at a nail, it delivers the force needed to drive that nail into the wood. It's reliable. Sure, sometimes a hammer breaks, but my baseline expectation is that every time I swing it, it will behave the same way.
Something more complicated, like the Rust compiler, is also a good tool in the same way. It's vastly more intricate than the hammer, yet it has the same good-tool property of being reliable: every time I compile, if the program is wrong, the compiler tells me so, and if it's right, compilation succeeds every time. It doesn't lie, it doesn't guess, it doesn't rate limit, it doesn't charge a subscription, it doesn't silently update and break my code, it tells me when changes are breaking and what they are, and it lets me pin my version instead of silently deprecating it out from under me.
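To make that concrete, here's a trivial made-up example: the program below is wrong, and rustc rejects it with the same mismatched-types error (E0308) on every single run. It never compiles it anyway, never guesses at what I probably meant, never gives a different answer on Tuesdays.

```rust
fn main() {
    // Wrong on purpose: a string literal assigned to an integer.
    let x: u32 = "five"; // rustc: error[E0308]: mismatched types
    println!("{}", x);
}
```

Delete the bad line and it compiles, deterministically, every time. That determinism is the whole point.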
There are of course ecosystems out there where building a project is more like a delicate dance, or a courtship ritual, and those ecosystems are a pain in the ass to deal with. I'm talking XKCD #1987, or NodeJS circa 2014, or the entire rationale behind Docker. People exit their careers to avoid dealing with such technology, because it's like working at the DMV or living in one of Kafka's nightmares. LLMs are more in that direction, and no one is going to like where we end up if we make them our entire stack, as the powers that be seem to intend.
There's a difference between what LLMs are and what they're being sold as. For what they are, they can be useful, and they may one day be turned into good tools if some of their major flaws are fixed.
On the other hand, we are in the process of totally upending the way our industry works on the basis of what these things supposedly will be, which is being sold as essentially an oracle: "the smartest person you know in your pocket", "a simultaneous expert PhD, MD, JD", "smarter than all humans combined". But there's a giant gulf between what they're selling and what LLMs actually are, and that gulf is what makes them a poor tool.