roncesvalles 7 days ago
As a dev, I find that the personal utility of LLMs is still very limited. Analyze it this way: are LLMs enabling something that was impossible before? My answer would be no. Whatever I'm asking of the LLM, I'd have figured it out by googling and RTFMing anyway, and probably done a better job of it. And guess what: after letting the LLM do it, I probably still need to google and RTFM anyway.

You might say "it's enabling the impossible because you can now do things in less time," to which I would say: I don't really think you can do it in less time. It's more like cruise control, where it takes the same time to reach your destination but you expend less mental effort.

Other elephants in the room:

- Where is the missing explosion of (non-AI) software startups that should have been enabled by LLM dev-efficiency improvements?

- Why is adoption among big tech SWEs near zero despite an intense push from management? You'd think that, of all people, you wouldn't have to ask them twice.

The emperor has no clothes.
ryao 7 days ago | parent | next
> Are LLMs enabling something that was impossible before?

I would say yes, when the LLM is combined with function calling that lets it do web searches and read web pages. It was previously impossible for me to research a subject within 5 minutes when it required doing several searches and reviewing dozens of search results (not just reading the list entries, but reading the actual HTML pages). I simply cannot read that fast. An LLM with function calling can.

The other day, I asked it to check the Linux kernel sources to tell me which TCP connection states for a closing connection would not return an error from send() with MSG_NOSIGNAL. It not only gave me the answer but provided citations I could use to verify it, all in less than 2 minutes. Very few developers could find the answer that fast unless they happened to already know it, and I doubt very many know it offhand.

Beyond that, I am better informed than I have ever been, since I have been offloading previously manual research to LLMs. That lets me ask questions I previously would not have asked because of the amount of time the background research took. What used to be a rabbit hole lasting hours can now be done in minutes with minimal mental effort on my part.

Note that I am careful to ask for citations so I can verify what the LLM says. Most of the time the citations vouch for what the LLM said, but in some instances the LLM provides citations that do not.
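The "LLM combined with function calling" setup described above can be sketched as a simple dispatch loop. Everything here is a hypothetical stand-in (fake_llm, web_search, the message shapes) rather than any vendor's real API; real clients expose a similar tools/function-calling interface.

```python
# Minimal sketch of an LLM tool-call loop. All names are hypothetical
# stand-ins; fake_llm simulates a model that first requests a search,
# then answers once it sees the tool result.
import json

def web_search(query: str) -> str:
    """Hypothetical tool: a real one would hit a search API; stubbed here."""
    return json.dumps({"results": [f"stub result for {query!r}"]})

TOOLS = {"web_search": web_search}

def fake_llm(messages):
    """Stand-in for a model call."""
    if not any(m["role"] == "tool" for m in messages):
        # No tool output yet: ask the runtime to run a search.
        return {"tool_call": {"name": "web_search",
                              "arguments": {"query": "TCP close states"}}}
    return {"content": "Answer synthesized from the tool results."}

def run(question: str) -> str:
    messages = [{"role": "user", "content": question}]
    while True:
        reply = fake_llm(messages)
        call = reply.get("tool_call")
        if call is None:
            return reply["content"]
        # Dispatch the requested tool and feed its output back to the model.
        result = TOOLS[call["name"]](**call["arguments"])
        messages.append({"role": "tool", "content": result})

print(run("Which TCP states make send() fail?"))
```

The point of the loop is that the model, not the user, decides when to search and what to read, which is what compresses hours of manual browsing into minutes.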
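For the send()/MSG_NOSIGNAL question, a minimal self-contained sketch of the failure mode looks like this. It demonstrates the general EPIPE-on-closed-peer behavior only, not the per-TCP-state enumeration from the kernel sources; the AF_UNIX socketpair is my substitution to keep the demo dependency-free.

```python
import errno
import socket

def send_to_closed_peer() -> int:
    """Send on a stream socket whose peer has closed; return the errno.

    Uses an AF_UNIX socketpair for a self-contained demo. This shows the
    general EPIPE-on-closed-peer behavior, not the state-by-state TCP
    answer the kernel sources would give.
    """
    a, b = socket.socketpair()
    b.close()  # peer closes its end of the connection
    # MSG_NOSIGNAL (Linux) suppresses SIGPIPE so send() fails with EPIPE
    # instead of killing the process; fall back to 0 where it is absent.
    flags = getattr(socket, "MSG_NOSIGNAL", 0)
    try:
        a.send(b"hello", flags)
        return 0  # no error
    except OSError as e:
        return e.errno
    finally:
        a.close()

print(send_to_closed_peer())  # prints errno.EPIPE (32 on Linux)
```

On real TCP the picture is subtler, which is why the question was worth asking at all: a first send() after the peer's FIN can still succeed, and the error only surfaces once the connection state rules it out.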
knowitnone2 7 days ago | parent | prev
Do cars enable something that was impossible before? Bikes? Shoes? Clothing? Your answer would be no.