Terr_ 5 hours ago
> to see LLM re-discover

I imagine someone probably wrote very specifically about it in the training data, which then underwent lossy compression, and the LLM is decompressing that how-to. So I'd say it's more like "surfacing" or "retrieving" than "re-discovering".
seanp2k2 4 hours ago | parent
They scraped everything on Stack Overflow, likely IRC logs from Freenode, and every book written in the modern era courtesy of Sci-Hub / Library Genesis / Anna's Archive / Z-Library. RIP Aaron Swartz; they're generating trillions in shareholder value from the spiritual successors to the work they were going to imprison you for.
ebonnafoux an hour ago | parent
Indeed, I checked, and the solution was already on Stack Exchange: https://askubuntu.com/a/1483248