Fine-Tuning TranslateGemma-4B for Better Welsh Translations on an H200 GPU (metalglot.com)
1 point by metalglot 4 hours ago | 1 comment
metalglot 4 hours ago:
Open source repo: https://github.com/grctest/finetuned-gemmatranslate-cy

5% of the fine-tuning run took 40 minutes and cost a couple of dollars, which was enough to prove the process works.

Looking forward to Flash Attention v4 leaving beta so I can test fine-tuning performance on a B200 in the cloud; probably a few months away, it seems?

What languages would you train TranslateGemma to translate? I was originally thinking about Klingon, but the available datasets seemed a bit lacking.
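For scale, here's a rough linear extrapolation from the figures above (assumptions: "a couple dollars" means $2, and both time and cost scale linearly with the fraction of the run completed, which ignores any fixed setup overhead):

```python
# Back-of-the-envelope estimate for the full fine-tuning run,
# extrapolated from the partial run described in the comment.
fraction_done = 0.05   # 5% of the fine-tuning was completed
minutes_spent = 40     # that partial run took 40 minutes
dollars_spent = 2.0    # assumption: "a couple dollars" taken as $2

# Linear scaling: full run = partial run / fraction completed.
full_run_minutes = minutes_spent / fraction_done   # 800 minutes
full_run_dollars = dollars_spent / fraction_done   # $40

print(f"Estimated full run: {full_run_minutes / 60:.1f} h, ~${full_run_dollars:.0f}")
```

So a complete run on the same H200 would land around 13 hours and tens of dollars, assuming throughput stays constant for the remaining 95%.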