camillomiller 19 hours ago
This is a very biased example. Also, it is possible only because right now the tools you've used are heavily subsidised by investors' money. A LOT of it. Nobody questions the utility of what you just mentioned, but nobody stops to ask whether this would be viable if you had to pay the actual cost of these models, nor what it means for 99.9% of all the other jobs that AI companies claim can be automated but in reality are not even close to being displaced by their technology.
aurareturn 19 hours ago | parent
Why is it biased? So what if it's subsidized and companies are in a market-share grab? Is it going to cost $40 instead of the $20 I paid? Big deal. It still beats the hell out of the $2k–$3k it would have cost before, plus weeks of waiting time. 100x cheaper, 1000x faster delivery. Furthermore, v0 and ChatGPT together certainly did much better than the average web designer and copywriter. Lastly, OpenAI has already stated a few times that they are "very profitable" on inference. There was an analysis posted on HN showing that inference for open-source models like DeepSeek is also profitable on a per-token basis.