Iolaum 6 hours ago

I don't know their strategy, but I wish they would double down on the open-source ecosystem by sharing their innovations like the Chinese labs do, and by building on the ones already shared. I think models in the 50B–250B parameter range still have a lot of room to grow, and the compute to train them should be far more accessible than for multi-trillion-parameter models.

This would also pressure other labs to stay engaged with the open-source ecosystem, since a rug pull is not a small danger IMO.