dlcarrier 11 hours ago
Not the whole thing, at least with current technology, but LoRAs are really good for fine-tuning, and can be trained in a few hours on a high-end gaming computer. So as long as the base model is in your language, you likely have enough spare computing power, in whatever electronics you own, to train a few LoRAs a month. In the future, when regular home computers have the capabilities of modern servers, we'll be able to train an entire LLM at home.
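A rough sketch of the arithmetic behind that claim: a LoRA trains two small low-rank factors instead of the full weight matrix, which is why it fits on a gaming GPU. The layer size (4096x4096) and rank (8) below are illustrative, not tied to any particular model.

```python
# Why LoRA fits on consumer hardware, in back-of-envelope form.
# Instead of updating a d_out x d_in weight W directly, LoRA trains
# two factors B (d_out x r) and A (r x d_in), so only
# r * (d_in + d_out) parameters need gradients and optimizer state.

def full_params(d_in: int, d_out: int) -> int:
    """Parameters updated by a full fine-tune of one weight matrix."""
    return d_in * d_out

def lora_trainable_params(d_in: int, d_out: int, rank: int) -> int:
    """Trainable parameters for one LoRA-adapted weight matrix."""
    return rank * (d_in + d_out)

# Illustrative numbers: one 4096x4096 projection, rank 8
full = full_params(4096, 4096)               # 16,777,216
lora = lora_trainable_params(4096, 4096, 8)  # 65,536
print(f"full: {full:,}  lora: {lora:,}  ratio: {lora / full:.2%}")
# -> full: 16,777,216  lora: 65,536  ratio: 0.39%
```

Roughly 0.4% of the parameters per adapted layer, which is the gap between "needs a server farm" and "runs overnight on a spare GPU".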