| ▲ | tyre 5 days ago |
| What I really want from Anthropic, Gemini, and ChatGPT is for users to be able to log in with them, using their tokens. Then you can have open/free apps that don’t require the developer to track usage or burn through tons of tokens to demonstrate value. Most users aren’t going to manage API keys, know what that even means, or accept the friction. |
|
| ▲ | mentos 5 days ago | parent | next [-] |
| https://x.com/steph_palazzolo/status/1978835849379725350 |
| |
| ▲ | MillionOClock 5 days ago | parent [-] | | It’s unclear to me whether that would give apps access to a token quota or if it would just be like any other "Sign in with …". In any case, I am currently developing an app that would greatly benefit from letting my users connect their ChatGPT account and use some of its token quota. |
|
|
| ▲ | rahimnathwani 4 days ago | parent | prev | next [-] |
| When you share an app you created in Google AI Studio, it will use quota from the logged-in user instead of your own quota. |
| |
| ▲ | robbomacrae 4 days ago | parent [-] | | As someone who has been waiting for the same thing tyre described, I went to investigate this claim, and it seems this might be true, but only when running apps within Google AI Studio itself; i.e. if you were to ship an app built with Google AI Studio on something like the App Store, it would be back to an API key whose costs the developer bears. The problem with the current model is that it is hard to justify the user paying what is essentially a second or third subscription for ultimately the same AI intelligence layer. So you cannot currently make an economically successful small-use-case app based on AI without somehow restricting users' use of AI. I don't think AI companies are incentivized to fix this. |
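For reference, the developer-keyed path described here (where every call is billed to the developer's project rather than to the end user's own quota) looks roughly like the minimal sketch below. It assumes the google-genai Python SDK; the model id and prompt are purely illustrative.

```python
# Minimal sketch of the developer-keyed path: the API key belongs to the
# developer's project, so all usage below is billed to the developer,
# not to the end user's own Gemini quota.
# Assumes the `google-genai` package; model id and prompt are illustrative.
import os

from google import genai

# Developer's key from the environment, not anything tied to the user's account.
client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])

response = client.models.generate_content(
    model="gemini-2.0-flash",  # illustrative model id
    contents="Summarize this listing for a first-time home buyer.",
)
print(response.text)
```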
|
|
| ▲ | numlocked 5 days ago | parent | prev | next [-] |
| We do this at OpenRouter, and many apps use exactly that pattern! |
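OpenRouter's version of this is an OAuth PKCE flow: the user authorizes the app and the app receives a key scoped to the user's own OpenRouter account, so the user's credits (not the developer's) are spent. The sketch below follows that flow as I recall it from OpenRouter's documentation; the endpoint paths, response shape, callback URL, and model slug are best-effort assumptions and should be checked against the current docs.

```python
# Hedged sketch of the "connect with OpenRouter" pattern: OAuth PKCE yields a
# key tied to the *user's* OpenRouter account, so their credits are spent.
# Endpoints and response fields are assumptions based on OpenRouter's
# documented PKCE flow; verify against the current documentation.
import base64
import hashlib
import secrets

import requests

# 1) Build a PKCE verifier/challenge pair and send the user to the auth page.
code_verifier = secrets.token_urlsafe(64)
code_challenge = (
    base64.urlsafe_b64encode(hashlib.sha256(code_verifier.encode()).digest())
    .rstrip(b"=")
    .decode()
)
auth_url = (
    "https://openrouter.ai/auth"
    "?callback_url=https://example.app/callback"  # hypothetical callback URL
    f"&code_challenge={code_challenge}&code_challenge_method=S256"
)
print("Send the user to:", auth_url)


# 2) After the redirect back with ?code=..., exchange it for a user-scoped key.
def exchange_code(code: str) -> str:
    resp = requests.post(
        "https://openrouter.ai/api/v1/auth/keys",
        json={
            "code": code,
            "code_verifier": code_verifier,
            "code_challenge_method": "S256",
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["key"]  # assumed response shape


# 3) Calls made with that key draw on the user's own OpenRouter credits.
def ask(user_key: str, prompt: str) -> str:
    resp = requests.post(
        "https://openrouter.ai/api/v1/chat/completions",
        headers={"Authorization": f"Bearer {user_key}"},
        json={
            "model": "openai/gpt-4o-mini",  # illustrative model slug
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]
```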
| |
| ▲ | chrisshroba 4 days ago | parent [-] | | Do you have any repository of apps that support that? I’d love to browse them! |
|
|
| ▲ | wahnfrieden 5 days ago | parent | prev | next [-] |
| Foundation Models on iOS/macOS was found to contain dormant code for doing this via OpenAI. So they are experimenting with it and may make it available next year. |
|
| ▲ | abrbhat 4 days ago | parent | prev | next [-] |
| At some point the model providers will realize they don't need to provide apps, just enterprise-grade intelligence at scale through a pipe, much like utility companies providing electricity/water. Right now, they have to provide the apps to kick off adoption. |
| |
| ▲ | kgwgk 4 days ago | parent | next [-] | | > much like utility companies providing electricity/water A capital intensive, low margin business. The dream of every company. | | |
| ▲ | baq 4 days ago | parent | next [-] | | A natural monopoly in which you can't really lose. A retirement fund manager's dream. | | |
| ▲ | dns_snek 4 days ago | parent [-] | | Except AI companies are not a monopoly, never mind a natural monopoly. When ChatGPT was first released, it was popular to predict the death of Google because they were "so far behind". |
| |
| ▲ | almostgotcaught 4 days ago | parent | prev [-] | | You can always depend on "brilliant" hn users to contribute the most braindead business hot-takes (not you but the person you're responding to). | | |
| ▲ | array_key_first 4 days ago | parent [-] | | Well, after a certain point people have to smell the roses, so to speak. You don't get to control your business 100%; the market tells you what to do a lot of the time. I think the reality is that as models become more competitive, they are becoming commodities. There's really no reason an app has to be built on GPT or Gemini. It makes much more sense for apps to be "model agnostic" and let their customers choose which models to use. I think if OpenAI sticks to just trying to make their own apps for everything, they will be outrun. People will make apps outside of their ecosystem and will just use them as a dumb API pipe, regardless of whether OpenAI wants that. And if they don't want that and restrict it, then their models will fall by the wayside as more competitive models which DO allow that take their place. They're in a bind here, which is probably why we are seeing this announcement. OpenAI can see the writing on the wall for them. |
|
| |
| ▲ | TeMPOraL 4 days ago | parent | prev [-] | | The problem is that "enterprise-grade intelligence", by its very nature, doesn't want to be trapped in a pipe feeding apps - it subsumes apps, reducing them to mere background tool calls. The perfect "killer app" for AI would kill most software products and SaaS as we know them. The code doing the useful part would still be there, but stripped of branding, customer funnels and other traps, upsell channels, etc. As a user, I'd be more than happy to see it (at least as long as the AI frontend part was well-developed for power users); obviously, product owners hate this. | | |
| ▲ | abrbhat 4 days ago | parent [-] | | (Good) Apps take the user's context and use case out of their head and turn it into something the user can see and interact with. An app might or might not be the 'product'. Unfortunately, it seems there is always going to be some 'product', so dark patterns might be here to stay. | | |
| ▲ | TeMPOraL 4 days ago | parent [-] | | Right. Problem is, the user interface is also the perfect marketing channel, because it stands between the user and some outcome they want. Due to technical and social limitations, most apps are also limited in what they can do; this naturally shapes and bounds them and their UIs, forming user-facing software products. Intelligence of the kind supplied by SOTA LLMs is able both to subsume the UI, by taking much broader context of the user and the use case into account and distilling it down to minimal interaction patterns for a specific user and situation, and to blur the boundaries of products, by connecting and chaining them on the fly. This kills the marketing channel (the UI) and trims the organizational structure itself (the product), by turning a large SaaS into a bunch of API endpoints for an AI runtime to call. Of course, this is the ideal. I doubt it'll materialize, or if it does, that it'll survive for long, because there's half a software industry's worth of middlemen at risk of being cut out, and thus with a reason to fight it. |
|
|
|
|
| ▲ | czhu12 4 days ago | parent | prev | next [-] |
| In some ways, that’s what MCP interfaces are kind of for. It just takes one extra step to add the MCP URL and go through OAuth. I assume the drop-off there will be 99% of users though, the way it works today. But this theoretically allows multiple applications to plug into ChatGPT/Claude/Gemini and work together. If someone adds Zillow and… Vanguard, your LLM can call both through MCP and help you plan a home purchase. |
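To make the MCP idea concrete, here is a minimal sketch of a server exposing two tools that a ChatGPT/Claude/Gemini client could discover and call together. It uses the FastMCP helper from the official Python MCP SDK (the `mcp` package); the tool names and the stubbed figures are invented for illustration and are not real Zillow or Vanguard integrations.

```python
# Minimal sketch of the MCP pattern: a server exposes tools that an
# MCP-capable LLM client can call on the user's behalf.
# Uses FastMCP from the official Python MCP SDK (`mcp` package); the tools
# and numbers below are hypothetical placeholders, not real integrations.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("home-buying-helper")


@mcp.tool()
def estimate_home_price(zip_code: str) -> float:
    """Return a rough median listing price for a zip code (stubbed data)."""
    return 550_000.0  # placeholder; a real server would query a listings API


@mcp.tool()
def available_down_payment(account_id: str) -> float:
    """Return funds the user could put toward a down payment (stubbed data)."""
    return 120_000.0  # placeholder; a real server would query a brokerage API


if __name__ == "__main__":
    # Runs over stdio by default so a local MCP-capable client can attach.
    mcp.run()
```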
|
| ▲ | redorb 5 days ago | parent | prev | next [-] |
| Won't they just eventually have a 'log in with OpenAI' button, similar to a 'log in with Google' button? Maybe a 'connect with OpenAI' button so the service can charge a fee while allowing a bring-your-own-token hybrid. |
|
| ▲ | xnx 5 days ago | parent | prev | next [-] |
| This is close to how it works with shared apps in Google AI Studio. |
|
| ▲ | stingraycharles 4 days ago | parent | prev [-] |
| So basically OAuth-style app connections. Makes sense. |