ryao 7 days ago
> Are LLMs enabling something that was impossible before?

I would say yes when the LLM is combined with function calling to allow it to do web searches and read web pages. It was previously impossible for me to research a subject within 5 minutes when it required doing several searches and reviewing dozens of search results (not just reading the list entries, but reading the actual HTML pages). I simply cannot read that fast. An LLM with function calling can.

The other day, I asked it to check the Linux kernel sources to tell me which TCP connection states for a closing connection would not return an error to send() with MSG_NOSIGNAL. It not only gave me the answer, but made citations that I could use to verify the answer. This happened in less than 2 minutes. Very few developers could find the answer that fast, unless they happen to already know it, and I doubt very many know it offhand.

Beyond that, I am better informed than I have ever been, since I have been offloading previously manual research to LLMs, allowing me to ask questions that I previously would not have asked due to the amount of time the background research took. What previously would have been a rabbit hole that took hours can be done in minutes with minimal mental effort on my part. Note that I am careful to ask for citations so I can verify what the LLM says. Most of the time, the citations vouch for what the LLM said, but there are some instances where the LLM provides citations that do not.