andai 7 hours ago
I added search to my LLMs years ago with the Python DuckDuckGo package. However, I found that Google gives better results, so I switched to that. (I forget the details, but I had to set something up in the Google dev console for it.) I think the DDG one is unofficial, and the Google one has rate limits (so it probably wouldn't work well for deep-research type stuff).

I mostly just pipe it into LLM APIs. I found that "shove the first few Google results into GPT, followed by my question" gave me very good results most of the time. It also works with Ollama, of course, but I don't have a very good GPU, so it gets really slow for me on long contexts.
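For anyone curious, here's a minimal sketch of that pattern (first few search results, then the question), assuming the duckduckgo_search package and the official openai client; the model name and the answer_with_search helper are just placeholders, not what the parent commenter actually uses:

    # Minimal sketch of the "first few search results, then my question" prompt.
    # Assumes duckduckgo_search and openai are installed; model name is a placeholder.
    from duckduckgo_search import DDGS
    from openai import OpenAI

    def answer_with_search(question: str, num_results: int = 5) -> str:
        # Pull the top few results (title, URL, snippet) for the question.
        hits = DDGS().text(question, max_results=num_results)
        context = "\n\n".join(f"{h['title']}\n{h['href']}\n{h['body']}" for h in hits)

        # Put the search results into the prompt, followed by the question.
        client = OpenAI()  # reads OPENAI_API_KEY from the environment
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder; any chat model works
            messages=[
                {"role": "system", "content": "Answer using the search results provided."},
                {"role": "user", "content": f"Search results:\n{context}\n\nQuestion: {question}"},
            ],
        )
        return resp.choices[0].message.content

Swapping the search step for Google's Custom Search JSON API (or pointing the client at a local Ollama server) is the same idea, just a different results source and endpoint.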
ivape 4 hours ago | parent
How do you meaningfully use it without scraping APIs? Aren't the official APIs severely limited?