roschdal 4 hours ago

I yearn for the days when I can program on my PC with a programming llm running on the CPU locally.

yazaddaruvala 4 hours ago | parent | next [-]

I’ve been using Google AI Edge Gallery on my M1 MacBook with Gemma 4B, with very good results for random Python scripts.

Unfortunately I still need to copy-paste the code into a file and run a terminal command, which is annoying but works.
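That last copy-paste step is easy to script away. A minimal sketch in Python (the filename and the first-fenced-block convention are my own assumptions, not anything Gallery provides):

```python
import re
import subprocess
import sys


def extract_code(reply: str) -> str:
    """Pull the first fenced ```python block out of a model reply."""
    m = re.search(r"```(?:python)?\n(.*?)```", reply, re.DOTALL)
    if m is None:
        raise ValueError("no fenced code block in reply")
    return m.group(1)


def run_snippet(reply: str, path: str = "snippet.py") -> None:
    """Write the extracted code to a file and execute it."""
    with open(path, "w") as f:
        f.write(extract_code(reply))
    subprocess.run([sys.executable, path], check=True)
```

Pipe the model's reply into `run_snippet` and the file-plus-terminal dance happens automatically.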

fredmendoza 4 hours ago | parent | prev | next [-]

You're honestly not that far off. The coding block on this model scored 8.44 with zero help. It caught a None-init TypeError on a code-review question that most people would miss, and when one question asked for O(n), it went ahead and shipped O(log(min(m,n))) on its own. It's not Copilot, but it's free, it's offline, and it runs on whatever you have. There's a 30-line chat.py in the article you can copy and run tonight.
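For context, the textbook problem with that exact O(log(min(m,n))) bound is finding the median of two sorted arrays; I'm guessing that's the question referenced, since the comment doesn't say. A sketch of the standard partition-based approach:

```python
def median_two_sorted(a, b):
    """Median of two sorted lists in O(log(min(m, n))) time, by
    binary-searching the partition point of the shorter list."""
    if len(a) > len(b):
        a, b = b, a  # always binary-search over the shorter list
    m, n = len(a), len(b)
    lo, hi = 0, m
    while lo <= hi:
        i = (lo + hi) // 2          # elements taken from a's left half
        j = (m + n + 1) // 2 - i    # elements taken from b's left half
        a_left = a[i - 1] if i > 0 else float("-inf")
        a_right = a[i] if i < m else float("inf")
        b_left = b[j - 1] if j > 0 else float("-inf")
        b_right = b[j] if j < n else float("inf")
        if a_left <= b_right and b_left <= a_right:
            # valid partition: everything on the left <= everything on the right
            if (m + n) % 2:
                return max(a_left, b_left)
            return (max(a_left, b_left) + min(a_right, b_right)) / 2
        if a_left > b_right:
            hi = i - 1              # took too many from a
        else:
            lo = i + 1              # took too few from a
    raise ValueError("inputs must be sorted")
```

The naive merge-and-index solution is O(m + n), which is why "shipping" the logarithmic version unprompted is notable.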

glitchc 4 hours ago | parent | prev | next [-]

You can do that now. Qwen-coder3.5 and gpt-oss-20b are pretty good for local coding help.

luxuryballs 4 hours ago | parent | prev | next [-]

You can do it on a laptop today, faster with a GPU/NPU. It's not going to one-shot something complex, but you can definitely pump out models/functions/services, scaffold projects, and write bash/PowerShell scripts in seconds.

trgn 4 hours ago | parent | prev [-]

We need SQLite for LLMs.

philipkglass 4 hours ago | parent [-]

I think that we're getting there. I put together a workstation in early 2023 with a single 4090 GPU. I did it to run classifiers like BERT and YOLO. At that point the only "open weights" LLM was the original Llama from Meta, and even that was open-weights only because it was leaked. It was a very weak model by today's standards.

With the same hardware I now get genuine utility out of models like Qwen 3.5 for categorizing and extracting unstructured data sources. I don't use small local models for coding since frontier models are so much stronger, but if I had to go back to small models for coding too they would be more useful than anything commercially available as recently as 4 years ago.
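For the categorize-and-extract use case, the usual pattern is: prompt the local model for strict JSON, then validate what comes back before trusting it. A minimal sketch (the schema fields are invented for illustration, and the actual model call, e.g. via a llama.cpp server, is deliberately left out):

```python
import json

# Hypothetical extraction schema for, say, scanned invoices.
SCHEMA_FIELDS = {"vendor", "date", "total"}


def extraction_prompt(text: str) -> str:
    """Build a prompt asking the model for strict JSON output."""
    return (
        "Extract vendor, date, and total from the text below. "
        "Reply with a single JSON object and nothing else.\n\n" + text
    )


def parse_reply(reply: str) -> dict:
    """Find the JSON object in the reply and check the expected fields."""
    start, end = reply.find("{"), reply.rfind("}")
    if start == -1 or end == -1:
        raise ValueError("no JSON object in reply")
    record = json.loads(reply[start : end + 1])
    missing = SCHEMA_FIELDS - record.keys()
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return record
```

Small local models drift into chatty preambles more than frontier ones, which is why the parser tolerates text around the JSON instead of calling `json.loads` on the raw reply.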