heavyset_go 19 hours ago
At least with LLM providers, they have your prompts and outputs, and if they wanted to, they could identify which code was AI generated and which wasn't. Maybe they can be subpoenaed, maybe they can sell the data to parties who care, like legal teams, maybe they can make it a service anyone can plug a GitHub repo into, etc.
BoredomIsFun 7 hours ago | parent | next
Joke's on you - I run LLMs only locally. And besides, the most widely deployed code-generating tool, AFAIR, is JetBrains' tiny ~200M-parameter LLM, built into their IDE.
ivan_gammel 13 hours ago | parent | prev
Do you really think anyone is ready to spend money on lawyers to prove that some piece of code is public domain or has no author? That's an expensive bet with an uncertain outcome. And of course you can recover that information only if logs exist, which might not be the case, especially if local inference was used.