ramraj07 · 3 hours ago
The fundamental idea that modern LLMs can only ever remix, even if it's technically true (doubt), only says to me that all knowledge is only ever a remix, perhaps even mathematically so. Anyone who still keeps implying these are statistical parrots or whatever is just going to regret that position in the future.
pseudosavant · 2 hours ago
But all of my great ideas are purely from my own original inspiration, and not learning or pattern matching. Nothing derivative or remixed. /sarcasm
mrbungie · 2 hours ago
> Anyone who still keeps implying these are statistical parrots or whatever is just going to regret these decisions in the future.

You know this is a false dichotomy, right? You can consider LLMs statistical parrots and at the same time take advantage of them.
heavyset_go · 2 hours ago
Yeah, Yann LeCun is just some luddite lol