tbrownaw | 6 days ago
> Also, there is no training data, which would be the "preferred form" of modification. Isn't fine-tuning a heck of a lot cheaper?
Nevermark | 6 days ago | parent
Fine-tuning on the original data plus the new fine-tuning data gives more predictable results. Training only on new data moves a model away from its previous behavior to an unpredictable degree. You can't even reliably test for the change without the original data.
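The mixing Nevermark describes is essentially a replay buffer: keep a share of original-distribution examples in the fine-tuning set to anchor prior behavior. A minimal sketch (function name and the 30% ratio are illustrative, not from any library):

```python
import random

def mix_datasets(original, new, new_fraction=0.3, seed=0):
    """Build a fine-tuning set that replays original data alongside new data.

    Sampling original examples into each epoch keeps the model anchored to
    its prior distribution, making the behavioral shift more predictable.
    (Illustrative sketch; names and ratio are assumptions.)
    """
    rng = random.Random(seed)
    # Sample enough original examples so `new` makes up `new_fraction` of the mix.
    n_orig = int(len(new) * (1 - new_fraction) / new_fraction)
    replay = [rng.choice(original) for _ in range(n_orig)]
    mixed = list(new) + replay
    rng.shuffle(mixed)
    return mixed

original = [("orig", i) for i in range(100)]
new = [("new", i) for i in range(30)]
mixed = mix_datasets(original, new, new_fraction=0.3)
```

With 30 new examples at a 30% target ratio, the mix adds 70 replayed originals, so every epoch still rehearses the old distribution and the original data doubles as a regression test set.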