simianwords 10 hours ago

absolutely they can learn. you are being emotional and the original point is correct.

i give the LLM my codebase and it indeed learns about it and can answer questions.

RichardLake 10 hours ago | parent | next [-]

That isn't learning. It can read things in its context and generate material to help answer further prompts, but that doesn't change the model weights. It is just updating the context.

Unless you are actually fine tuning models, in which case sure, learning is taking place.
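To make the distinction concrete, here's a minimal toy sketch (not a real LLM; `ToyModel` and everything in it is hypothetical): "learning" via context only grows the prompt and leaves the parameters untouched, while fine-tuning actually mutates the weights.

```python
# Toy illustration of context vs. weights. Hypothetical names throughout;
# a real LLM replaces the dict with billions of learned parameters.

class ToyModel:
    def __init__(self):
        # frozen parameters -- fixed at training time
        self.weights = {"greeting": "hello"}

    def answer(self, context: str, question: str) -> str:
        # inference: weights are read-only; only the context varies
        if question in context:
            return "answered from context"
        return "answered from weights"

model = ToyModel()
weights_before = dict(model.weights)

# "learning" a codebase = stuffing it into the context window
context = "def parse(): ...\nhow does parse work?"
assert model.answer(context, "how does parse work?") == "answered from context"
assert model.weights == weights_before  # parameters unchanged

# fine-tuning, by contrast, changes the parameters themselves
model.weights["greeting"] = "hi"
assert model.weights != weights_before
```

The point of the sketch: in the first case nothing persists once the context is gone, which is why some people resist calling it "learning".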

simianwords 10 hours ago | parent | prev [-]

i don't know why you think it matters how it works internally. whether it changes its weights or not is not important. does it behave like a person who learns a thing? yes.

if i showed a human a codebase and they answered my questions about it well - yes, i would say the human learned it. the analogy breaks down at some point because of the limited context window, but learning is a good enough word.

RichardLake 8 hours ago | parent [-]

Maybe because I work on a legacy programming language with far less material in the training data? For me it makes a difference, because the model partly needs to "learn" the language itself and keep that in the context, along with the codebase-specific stuff. For a language the model already knows, where only the codebase-specific stuff is needed, it might feel different.

simianwords 6 hours ago | parent [-]

But my codebase isn’t in the training set, yet it learns it and I can ask questions about it
