▲ drob518 | 2 days ago
I’m really curious how this scales up. Bonsai delivers an 8B model in 1.15 GB. How large would a 27B or 35B model be? Would it still retain the accuracy of those larger models? If the scaling holds, we could see 100+B models in 64 GB of RAM.
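A back-of-the-envelope sketch of that projection, assuming the quantized format’s per-weight cost stays constant with model size (an assumption, not something stated in the thread), and treating GB as GiB:

```python
# Back-of-the-envelope: if 8B params fit in 1.15 GB, what do larger
# models cost at the same bits-per-weight? (Assumes constant cost per
# weight across sizes -- an assumption, not a documented property.)

GB = 1024**3  # bytes; treating "GB" as GiB here

def bits_per_weight(params_b: float, size_gb: float) -> float:
    """Effective storage cost per parameter, in bits."""
    return size_gb * GB * 8 / (params_b * 1e9)

def projected_size_gb(params_b: float, bpw: float) -> float:
    """Projected model size at the same bits-per-weight."""
    return params_b * 1e9 * bpw / 8 / GB

bpw = bits_per_weight(8, 1.15)  # roughly 1.2 bits per weight
for n in (27, 35, 100):
    print(f"{n}B params -> ~{projected_size_gb(n, bpw):.1f} GB")
```

Under that assumption even a 100B model would land well under 64 GB; the open question is whether accuracy survives at those sizes.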
▲ cubefox 2 days ago | parent | next [-]

It also depends on how expensive these models are to train. It's probably at least as expensive as training full-precision models, otherwise they would have mentioned it.