al_borland 2 days ago

I like the idea of a single chat with many models. Pre-AI-everything, I was already a Kagi user, so I was already paying for that. I've started using the Kagi Assistant[0] to solve this for myself. I pay $10/month, as I always did, and that's my credit limit across the various LLMs. They added an AI cost tracker to create transparency into those costs. So far this month I've used $0.88 of my $10.00 allotment, so I don't feel like I'm in any danger of going over. If I wasn't already paying for this, I'd be pretty interested in an option that was pay-as-you-go.

Looking at your pricing, I find the credit model a bit confusing. It feels like credit card points, and I don't really have a concept of what that will get me. Tokens are a bit abstract, but they're the currency of AI, so it is what it is. Adding credits as an intermediary between tokens and dollars may have been done with the goal of simplifying things, but in my head it makes things harder to understand and leaves more places for hidden fees to hide.

Giving some idea of how much usage someone could expect to get out of 1,000 tokens, 100 credits, or $1 would be useful. I can do the math and see I can do 20 web searches for $1, but does that include follow-up questions? Is every question a web search? Kagi shows that I've used 15 searches so far today, and they've cost me less than 2¢ for almost 19k tokens. So I'm a bit confused.
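As a rough sanity check, here's the back-of-envelope math: a minimal sketch using only the figures quoted above. The per-token and per-search rates are implied by those figures, not published prices.

```python
# Back-of-envelope comparison of the two pricing data points in this thread.
# All inputs are the numbers quoted above, not official rates.

kagi_cost_usd = 0.02        # "less than 2 cents" so far today
kagi_tokens = 19_000        # "almost 19k tokens"
kagi_searches = 15          # searches used today

# Implied Kagi rates
per_million_tokens = kagi_cost_usd / kagi_tokens * 1_000_000
per_search = kagi_cost_usd / kagi_searches

# The credit model's implied rate: "20 web searches for $1"
credit_search_usd = 1 / 20

print(f"Kagi: ~${per_million_tokens:.2f}/M tokens, ~${per_search:.4f}/search")
print(f"Credit model: ${credit_search_usd:.2f}/search")
```

On these numbers, the credit model's implied per-search price is over 30x what Kagi is charging me, which is exactly why I'd like the pricing page to spell out what a credit buys.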

More generally, on the chat-only umbrella tools: I do miss some of the nice-to-have features of going directly to the big players (interactive code editors, better image generation, etc.), but not enough to pay $20+/month/service.

[0] https://kagi.com/assistant

metrix 2 days ago | parent [-]

I've been using openrouter.ai to access "all LLMs". No subscription, and it can be tied to your editor of choice.

pyman 2 days ago | parent [-]

For free? How's that possible when one AI prompt uses 10x more energy than a Google search [1]?

[1] Source: https://kanoppi.co/search-engines-vs-ai-energy-consumption-c...

symboltoproc 2 days ago | parent [-]

10 Google searches are also free

pyman 2 days ago | parent [-]

You didn't click on the link I shared. I'm talking about the cost of producing the response, not making the request. One AI prompt uses around 10 times more compute and energy than a Google search.

If ChatGPT handles 1 billion queries a day, that's like the energy cost of 10 billion Google searches every single day.
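To put that in concrete units, here's a minimal sketch. It assumes the often-cited ~0.3 Wh per Google search (an old Google estimate) and takes the 10x ratio at face value; both are order-of-magnitude assumptions, not measurements.

```python
# Rough scale of the claim above. 0.3 Wh/search is an old, widely cited
# Google estimate; 10x is the ratio claimed in the linked article.
# Treat both as order-of-magnitude assumptions.

wh_per_search = 0.3                  # Wh per Google search (assumed)
ai_multiplier = 10                   # AI prompt vs. search (claimed ratio)
queries_per_day = 1_000_000_000      # "1 billion queries a day"

wh_per_prompt = wh_per_search * ai_multiplier           # ~3 Wh per prompt
daily_mwh = wh_per_prompt * queries_per_day / 1_000_000 # Wh -> MWh

print(f"~{daily_mwh:,.0f} MWh/day")
```

That works out to roughly 3,000 MWh a day under these assumptions, on the order of a large power plant running for hours, every day.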

Someone has to pay the electricity bill. We all know it's not free like you claim.

112233 a day ago | parent [-]

You also didn't click on the link the poster you replied to shared...

Seconding OpenRouter and fal. Having to muck around with the idiosyncrasies of each vendor just to try their "bestest model", only to find out it doesn't satisfy your requirements, is a chore.

pyman a day ago | parent [-]

I'd stick with Google Search until Microsoft figures out how to handle a billion OpenAI requests a day without draining the water supply of entire cities. In Chile, for example, people are already struggling.

ruthvik947 18 hours ago | parent [-]

https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...

pyman 16 hours ago | parent [-]

Sorry, but I'm not interested in blog posts from a lobbyist in Washington, the same place that's pushing to build mega datacentres with Nvidia servers in developing countries.

Also, Andy's blog post doesn't mention infrastructure-scale impacts. Even small actions add up, and as AI scales exponentially, so does the demand on energy and water. That part gets left out.

I'll stick with the research papers published by AI researchers [1] and investigative journalists [2], but thanks for sharing your link; it gives me a good idea of what lobbyists in Washington aren't saying.

[1] https://eng.ox.ac.uk/case-studies/the-true-cost-of-water-guz...

[2] https://www.bloomberg.com/graphics/2025-ai-impacts-data-cent...