wyre | 2 days ago
> this stuff is expensive to run

What's expensive is innovating on new models and building the infrastructure. My understanding is that inference itself is cheap and profitable. Most open-source models cost less than a dollar per million tokens, which makes me think SotA models have a similar price point, just with a higher profit margin.
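To put the "less than a dollar per million tokens" figure in perspective, here's a rough back-of-envelope sketch; the GPU rental price and throughput numbers are illustrative assumptions, not measurements of any particular model or provider:

```python
# Rough per-token inference cost sketch. All numbers are
# illustrative assumptions, not measured values.

gpu_hourly_cost = 2.00      # assumed cloud GPU rental, USD/hour
tokens_per_second = 1_000   # assumed aggregate throughput across batched requests

tokens_per_hour = tokens_per_second * 3600
cost_per_million_tokens = gpu_hourly_cost / tokens_per_hour * 1_000_000

print(f"~${cost_per_million_tokens:.2f} per 1M tokens under these assumptions")
# 2.00 / 3,600,000 * 1,000,000 ≈ $0.56 per 1M tokens, which lands
# in the "under a dollar" ballpark mentioned above.
```

The real economics hinge on the assumptions: batching efficiency, model size, and hardware utilization can move the per-token cost by an order of magnitude in either direction.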
aeon_ai | 2 days ago | parent
I can assure you that inference is not profitable if the user is paying nothing.
| ||||||||