mfkhalil 2 days ago
MatterRank is still pretty slow, since it runs LLM evaluations on each webpage as markdown content. I wouldn't really consider it a Kagi alternative (which I haven't used but have heard great things about!), as that's more of a search engine in the traditional sense. I think where MatterRank shines right now is finding results where you wouldn't mind waiting an extra 20-30 seconds for an added layer of vetting, as opposed to just wanting a quick answer. Having said that, we are definitely working on making it faster and more useful for everyday queries.
danpalmer 2 days ago | parent
> I think where MatterRank shines right now is finding results where you wouldn't mind waiting an extra 20-30 seconds for an added layer of vetting, as opposed to just wanting a quick answer.

I've not used it, but anecdotally, I can refine my own search query to get what I want, or conclude it doesn't exist, within 20-30s. Assuming ~5s per search to write, search, read, and decide, that's 4-6 searches. Do you think you're getting more value than 4 iterations on the initial search term? Are you always getting it in one search, or do you end up still needing to refine the search term, extending it beyond that 20-30s?