drob518 | 5 days ago:
It’s splitting a hair, but a pretty important hair. Does anyone think that models won’t need continuous retraining? Does anyone think labs won’t keep trying to scale models up? Personally, I think we’re reaching diminishing returns on scaling, which is probably good because we’ve basically run out of content to train on, and so perhaps that stops or at least slows down drastically. But I don’t see a scenario where constant retraining isn’t the norm, even if the amount of content we use for it grows only slightly.
gomox | 4 days ago | parent:
Well, models are definitely good enough for some things in their current state, without needing to be retrained. Machine translation, for example, was a solved problem as of GPT-3.