wongarsu 3 hours ago

I guess the argument is that a tool built by a company with actual insight into and focus on financial services, with Anthropic as the inference provider, would lead to more adoption and more use of Anthropic models. Anthropic could achieve that either by just leaving things alone and having the best models, or alternatively by starting some kind of incubator. AWS might be a good model.

The issue with that is obviously that most of the generated value would be captured by that company in the middle, while Anthropic would be stuck in the cost-conscious inference market.

noitpmeder 3 hours ago | parent [-]

Why would Anthropic prefer this approach at all, when that middleman can switch and cost-arbitrage between countless other model providers?

We're not talking about what is best for the consumer (e.g. more competition to force iteration and improvement), but about what Anthropic thinks is best for Anthropic.

wongarsu 3 hours ago | parent [-]

Make up for the lower margins with larger volume, because you get much better market penetration. But you are right that this only works if you know the middlemen won't go to other model providers. That's where some kind of incubation program that provides capital or credits or whatever in return for long-term commitments might work.

But I doubt staying a pure model provider is a winning move. It's a market nobody will win long-term. Almost all of the value to be captured isn't in inference APIs but in how to use them to generate business value. Claude Code was already the right approach; they "just" need to show they can repeat this for other kinds of tasks.

khuey 3 hours ago | parent [-]

> Almost all of the value to be captured isn't in inference APIs but in how to use them to generate business value.

If the business value can be generated with a few thousand words in a SKILL.md on top of a commoditized model, it doesn't sound like that's a market anyone can win long-term either, and the business value is ultimately going to accrue elsewhere (the customer, the inference hardware provider, etc.).