scriptsmith 10 hours ago
I've got some demos of what the new Prompt API in Chrome, which uses a local model, can do: https://adsm.dev/posts/prompt-api/#what-could-you-build-with... As OP says, it shines in constrained environments where the model is transforming user-owned data. It's definitely less useful for anything more open-ended.
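For context, the shape of a constrained "transform user-owned data" call looks roughly like this. This is a sketch based on the Prompt API explainer's `LanguageModel` global (`availability()`, `create()`, `prompt()`, `destroy()`); the API is experimental and these names may change between Chrome releases, so treat everything here as an assumption. The `summarizeSelection` function name and the system prompt are made up for illustration:

```javascript
// Sketch of a constrained Prompt API call that rewrites user-selected text.
// Assumes Chrome's experimental `LanguageModel` global from the Prompt API
// explainer; falls back to null anywhere the API isn't available.
async function summarizeSelection(text) {
  if (typeof LanguageModel === "undefined") {
    // Not a supporting browser (Node, Firefox, older Chrome): no local model.
    return null;
  }
  // "unavailable" | "downloadable" | "downloading" | "available"
  if ((await LanguageModel.availability()) === "unavailable") {
    return null;
  }
  const session = await LanguageModel.create({
    initialPrompts: [
      { role: "system", content: "Rewrite the user's text as one short sentence." },
    ],
  });
  try {
    // The model only ever sees the user-owned text passed in here.
    return await session.prompt(text);
  } finally {
    session.destroy(); // free the on-device model session
  }
}
```

The feature-detection guard is the important part: callers get a `null` they can handle with a deterministic fallback, rather than a hard dependency on the local model being present.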
2ndorderthought 10 hours ago
Yeah, I don't recommend treating Chrome's Prompt API as a good example of local LLMs. It's fine, but it's really weak: 8B models from a year ago are better in some ways, and a lot of the recent model releases are meaningfully better.
robot-wrangler 8 hours ago
> I've got some demos of what the new Prompt API can do:
> Use surrounding context to rewrite your ad copy

Yup, that's the plan. Forget the local model and the webpage; this is more, better, and cheaper adtech extortion/surveillance for vendors, while everyone else pays for the electricity and the hardware degradation.
dakolli 10 hours ago
So you're running an LLM to do data transformations that deterministic processes would be much better suited for, and spinning up a 1,000-watt power supply to do it. Wild.