|
| ▲ | simonw 4 days ago | parent | next [-] |
| I have been using a Tailscale VPN to make LM Studio and Ollama running on my Mac available to my iPhone when I leave the house. |
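For anyone curious what this looks like in practice: once the Mac and the phone are on the same tailnet, Ollama’s HTTP API is reachable over the Tailscale address like any other host. A minimal sketch using only the stdlib (the MagicDNS hostname is a hypothetical placeholder; substitute your own machine’s tailnet name):

```python
import json
import urllib.request

# Hypothetical MagicDNS name for the Mac on the tailnet; substitute your own.
OLLAMA_URL = "http://my-mac.tailnet-1234.ts.net:11434/api/generate"

def build_request(prompt: str, model: str = "llama3.1") -> urllib.request.Request:
    """Build a POST against Ollama's native /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    # Only works while both devices are connected to the tailnet.
    req = build_request("Why is the sky blue?")
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])
```

Since Tailscale handles the WireGuard tunnel and NAT traversal, no port needs to be exposed to the public internet.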
|
| ▲ | brokencode 4 days ago | parent | prev | next [-] |
| Perhaps said software could even form an end-to-end encrypted tunnel from your phone to your local LLM server anywhere over the internet via a simple server intermediary. The amount of data transferred is tiny, and the latency costs are typically going to be dominated by the LLM inference anyway. There’s not much advantage to doing LAN-only except that you don’t need a server. Though the amount of people who care enough to buy a $3k - $10k server and set this up, compared to just using ChatGPT, is probably very small. |
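The “simple server intermediary” part is the key property: the relay only forwards opaque frames and never holds a key. A toy sketch of that shape, using an HMAC for integrity only (a real tunnel would also encrypt, e.g. TLS or an AEAD; this is illustrative, not production crypto):

```python
import hashlib
import hmac
import secrets

# Provisioned on the phone and the home server; the relay never sees it.
SHARED_KEY = secrets.token_bytes(32)

def seal(key: bytes, payload: bytes) -> bytes:
    """Prefix the payload with an HMAC tag so tampering in transit is detectable."""
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return tag + payload

def open_sealed(key: bytes, frame: bytes) -> bytes:
    """Verify the tag and return the payload; reject modified frames."""
    tag, payload = frame[:32], frame[32:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("tampered frame")
    return payload

def relay(frame: bytes) -> bytes:
    # The intermediary just forwards bytes. It holds no key, so it can
    # neither read a (properly encrypted) payload nor forge a valid one.
    return frame
```

A round trip is just `open_sealed(SHARED_KEY, relay(seal(SHARED_KEY, b"prompt")))`, which is why the relay can be cheap and dumb: it adds almost nothing to the trust model.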
| |
| ▲ | JumpCrisscross 4 days ago | parent [-] | | > amount of people who care enough to buy a $3k - $10k server and set this up compared to just using ChatGPT is probably very small So the $10k end of that range is maxed out, and it’s with Apple’s margins; I suspect you could do it for $5k. I’d also note that for heavy ChatGPT users, the difference between the energy costs of a home setup and the price of ChatGPT tokens may make this financially compelling. | | |
| ▲ | brokencode 4 days ago | parent [-] | | True, it may be profitable for pro users. At $200 a month for ChatGPT Pro, it may only take a few years to recoup the initial costs. Not sure about energy costs, though. And of course you’d be getting a worse model, since no open-source model is currently as good as the best proprietary ones. Though that gap should narrow as the open models improve and the proprietary ones seemingly plateau. |
|
|
|
| ▲ | dghlsakjg 4 days ago | parent | prev [-] |
| That software is an HTTP request, no? Any number of AI apps allow you to specify a custom endpoint. As long as your AI server accepts connections from the internet, you’re gravy. |
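Concretely: LM Studio, Ollama, and vLLM all serve an OpenAI-compatible /v1/chat/completions route, so “that software” really is just a POST to a custom base URL. A stdlib-only sketch (the hostname and model name are hypothetical placeholders):

```python
import json
import urllib.request

# Hypothetical self-hosted endpoint; any OpenAI-compatible server works here.
BASE_URL = "http://my-home-server.example:11434/v1"

def chat_request(messages: list[dict], model: str = "llama3.1") -> urllib.request.Request:
    """Build a POST against the OpenAI-compatible chat completions route."""
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = chat_request([{"role": "user", "content": "hello"}])
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

This is the same request shape the hosted APIs use, which is exactly why so many existing apps can be repointed with nothing more than a base-URL setting.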
| |
| ▲ | JumpCrisscross 4 days ago | parent [-] | | > That software is an HTTP request, no? You and I could write it. Most folks couldn’t. If AI plateaus, this would be a good hill to have occupied. | | |
| ▲ | dghlsakjg 4 days ago | parent [-] | | My point is, what is there to build? The person who is willing to buy that appliance likely overlaps heavily with the person who is more than capable of pointing one of the dozens of existing apps at a custom domain. Everyone else will continue to just use app-based subscriptions. Streaming platforms have plateaued (at best), but self-hosted media appliances are still vanishingly rare. Why would AI buck the trend that every other computing service has followed? | | |
| ▲ | itsn0tm3 4 days ago | parent | next [-] | | You don’t tell your media player company secrets ;) I think there is a market here, based solely on actual data privacy. Not sure how big it is, but I can see quite a few companies having a use for it. | |
| ▲ | dghlsakjg 4 days ago | parent [-] | | > You don’t tell your media player company secrets ;) No, but my email provider is a de facto repository of incredibly sensitive documents. When you put convenience and cost up against privacy, the market has proven over and over that no one gives a shit. |
| |
| ▲ | JumpCrisscross 4 days ago | parent | prev [-] | | > what is there to build? Integrated solution. You buy the box. You download the app. It works like the ChatGPT app, except it's tunneling to the box you have at home which has been preconfigured to work with the app. Maybe you have a subscription to keep everything up to date. Maybe you have an open-source model 'store'. |
|
|
|