system2 | 8 hours ago
There are millions of websites, and a local LLM cannot scrape them all to make sense of them. Think about it. OpenAI can do it because they spend millions training their systems. Many sites also have hidden sitemaps that can't be discovered unless the owner submits them to Google directly (most of the time they aren't even listed in robots.txt). There is no way a local LLM can keep up with the up-to-date internet.
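To see why "hidden" sitemaps are a real obstacle: a crawler can only find a sitemap if it is declared in robots.txt or sits at a guessable path, while a sitemap submitted straight to Google Search Console is invisible to everyone else. A minimal sketch (helper names are hypothetical, not any specific crawler's API):

```python
# Paths a local crawler might blindly probe when robots.txt is silent.
COMMON_SITEMAP_PATHS = [
    "/sitemap.xml",
    "/sitemap_index.xml",
    "/sitemap-index.xml",
]

def sitemaps_from_robots(robots_txt: str) -> list:
    """Collect the Sitemap: declarations a crawler can actually see."""
    urls = []
    for line in robots_txt.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap" and value.strip():
            urls.append(value.strip())
    return urls

robots = """User-agent: *
Disallow: /private/
"""
# No Sitemap: line here, so a local crawler is reduced to guessing
# COMMON_SITEMAP_PATHS, while the owner may have submitted the real
# sitemap only to Google.
print(sitemaps_from_robots(robots))  # []
```

If that list comes back empty, the crawler has no reliable way to enumerate the site's pages, which is exactly the asymmetry the comment describes.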