barapa 3 days ago

They are training on your queries. So they may have some exposure to them going forward.

franktankbank 3 days ago | parent | next [-]

Even if your queries are hidden behind a locally running model, you should have some humility that your queries are not actually unique. For this reason I have a very difficult time believing that a basic LLM will be able to properly reason about complex topics; it can regurgitate to whatever level it's been trained. That doesn't make it less useful, though. But in the edge cases, how do we know the query it's ingesting gets trained with a suitable answer? Wouldn't this constitute overfitting in those cases and be terribly self-reinforcing?

keysdev 3 days ago | parent | prev [-]

Not if one does an `ollama pull` to your own machine.