giancarlostoro 4 days ago
Him using different ones is why I use Perplexity: I get to try different models, and honestly it's pretty darn decent. It gives me everything in an organized way, I can see all the different links, and all the files it outputs can be downloaded as a simple zip file. It has everything from GPT-5 to DeepSeek R1 and even Grok. There are other sites similar to Perplexity that host multiple models as well; I haven't tried the plethora of others, but I feel like Perplexity does the most to make sure whatever model you pick works right for you, and all its output is usefully catalogued.
mark_l_watson 4 days ago | parent
I also use Perplexity's APIs, specifically their combined web search tool plus decent models. It's a useful combination and easier than what I used to do: calling a search API like Brave and rolling my own code to combine the LLM and the search results. That said, I have been having too much fun running Meilisearch to build a local search index for many web sites I use for reference, combined with a small Python app that also uses local models running on Ollama. I will probably wrap this into an example to add to one of my books: not that practical, but fun.
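Roughly, the glue code for the local setup looks like the sketch below. Index name, master key, and the model are placeholders rather than anything from my actual app, and it assumes Meilisearch is running locally on its default port with documents already ingested.

    # Minimal sketch: query a local Meilisearch index, then hand the hits
    # to a local model served by Ollama. Names below are placeholders.
    import meilisearch
    import ollama

    # Meilisearch on its default port; "reference_sites" is a hypothetical
    # index populated by a separate crawl/ingest script.
    search_client = meilisearch.Client("http://localhost:7700", "masterKey")
    index = search_client.index("reference_sites")

    def answer(question: str, k: int = 5) -> str:
        # Pull the top-k matching documents from the local index.
        hits = index.search(question, {"limit": k})["hits"]
        context = "\n\n".join(h.get("content", "") for h in hits)

        # Ask a small local model (llama3.2 here, just as an example)
        # to answer using only the retrieved context.
        response = ollama.chat(
            model="llama3.2",
            messages=[
                {"role": "system",
                 "content": "Answer using only the provided context."},
                {"role": "user",
                 "content": f"Context:\n{context}\n\nQuestion: {question}"},
            ],
        )
        return response["message"]["content"]

    print(answer("How do I configure typo tolerance in Meilisearch?"))

Nothing fancy, but having both the index and the model on the same machine keeps the whole loop offline, which is the fun part.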