adgjlsfhk1 9 hours ago:
Those gross profit margins aren't that useful, since training a model of fixed capability is continually getting cheaper. There's a treadmill effect: staying in business requires constantly training new models just to keep from falling behind. If the big companies stop training, they have maybe a year before someone else catches up with far less debt and puts them out of business.
HDThoreaun 7 hours ago (reply):
Only if training new models leads to better models. If the newly trained models are just a bit cheaper but not better, most users won't switch. Then the entrenched labs can stop training so much and focus on profitable inference.