▲ irishcoffee 10 hours ago
I own two 5070 Ti cards in a rig I would gladly donate time to for a distributed training effort. The kicker is the training data: I would want to gate it to anything before 2022. I don't know how to coordinate that, but I would really like to be involved in something like this. SETI, for LLMs.
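On the gating point, the mechanics are just a timestamp filter over the corpus. A minimal sketch, assuming each document carries an ISO-8601 `created_at` field (the field name and the sample corpus are made up for illustration):

    from datetime import datetime, timezone

    CUTOFF = datetime(2022, 1, 1, tzinfo=timezone.utc)

    def keep(doc):
        """Keep only documents dated before the cutoff; drop undated ones."""
        try:
            created = datetime.fromisoformat(doc["created_at"])
        except (KeyError, ValueError):
            return False  # no parseable date -> exclude, to be safe
        if created.tzinfo is None:
            created = created.replace(tzinfo=timezone.utc)
        return created < CUTOFF

    corpus = [
        {"text": "pre-cutoff doc",  "created_at": "2021-06-01T00:00:00+00:00"},
        {"text": "post-cutoff doc", "created_at": "2023-03-15T00:00:00+00:00"},
        {"text": "undated doc"},
    ]
    pre_2022 = [doc for doc in corpus if keep(doc)]  # keeps only the first

The filter itself is trivial; the hard coordination problem is provenance, i.e. trusting that the date metadata reflects when the text was actually written.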
▲ AlexCoventry 9 hours ago | parent
Bandwidth is the killer in distributed LLM training. Naive data parallelism has every worker exchange gradients roughly the size of the model on every step, and a consumer uplink is orders of magnitude slower than the datacenter interconnects that schedule was designed for.
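A back-of-the-envelope sketch of the gap (all numbers are illustrative assumptions: a 7B model, fp16 gradients, a 100 Mbit/s home uplink, per-step all-reduce with no gradient compression):

    # Rough cost of syncing gradients once per step, naive data parallelism.
    params = 7e9               # assumed 7B-parameter model
    grad_bytes = params * 2    # fp16 gradients -> ~14 GB per step

    home_uplink = 100e6 / 8    # assumed 100 Mbit/s uplink = 12.5 MB/s
    nvlink = 900e9             # ~900 GB/s NVLink-class interconnect, for contrast

    print(f"gradient payload per step: {grad_bytes / 1e9:.0f} GB")
    print(f"sync over home uplink:     {grad_bytes / home_uplink / 60:.0f} min")
    print(f"sync over NVLink:          {grad_bytes / nvlink * 1e3:.1f} ms")

That works out to roughly twenty minutes of network time per optimizer step versus milliseconds, which is why volunteer-compute training efforts tend to lean on local-update schemes that synchronize every few hundred steps rather than every step.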