
Posted by mpweiher 1 day ago

A guide to local coding models (www.aiforswes.com)
588 points | 345 comments
flowinghorse 1 day ago|
Local models under 2B parameters are good enough for code autocompletion, even if you don't have 128G of memory.
dfischer96 1 day ago||
Nice guide! I want to point out opencode CLI, which is far superior to Qwen CLI in my opinion.
avhception 1 day ago||
I tried local models for general-purpose LLM tasks on my Radeon 7800 XT (20GB VRAM), and was disappointed.

But I keep thinking: It should be possible to run some kind of supercharged tab completion on there, no? I'm spending most of my time writing Ansible or in the shell, and I have a feeling that even a small local model should give me vastly more useful completion options...
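To illustrate the idea above: small code models (e.g. the sub-2B Qwen2.5-Coder variants) are trained for fill-in-the-middle (FIM) completion, so a local tab-completion setup mostly amounts to sending the text before and after the cursor to a local server. A minimal sketch, assuming a llama.cpp-style `/completion` endpoint on localhost and Qwen-style FIM tokens (both are assumptions, not something from the thread):

```python
# Sketch: fill-in-the-middle (FIM) completion against a local model server.
# The FIM tokens below follow the Qwen2.5-Coder convention; the server URL
# and request shape assume a llama.cpp-style /completion endpoint.
import json
import urllib.request

FIM_PREFIX = "<|fim_prefix|>"
FIM_SUFFIX = "<|fim_suffix|>"
FIM_MIDDLE = "<|fim_middle|>"


def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a FIM prompt: the model fills in text between prefix and suffix."""
    return f"{FIM_PREFIX}{prefix}{FIM_SUFFIX}{suffix}{FIM_MIDDLE}"


def complete(prefix: str, suffix: str,
             url: str = "http://localhost:8080/completion") -> str:
    """Ask a local server for a short completion (hypothetical endpoint/params)."""
    body = json.dumps({
        "prompt": build_fim_prompt(prefix, suffix),
        "n_predict": 64,       # keep completions short for tab-complete latency
        "temperature": 0.2,    # low temperature for predictable completions
    }).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["content"]


# Example: completing an Ansible task mid-file (prompt assembly only).
prompt = build_fim_prompt(
    "- name: Install ",
    "\n  ansible.builtin.apt:\n    name: nginx",
)
```

An editor plugin would call something like `complete()` on each keystroke pause; the heavy lifting is entirely in the server and the model's FIM training, not the client.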

freeone3000 1 day ago||
What are you doing with these models that you’re going above free tier on copilot?
satvikpendem 1 day ago|
Some just like privacy and working without internet. I, for example, travel regularly by train and like to be able to use my laptop when there isn't always good WiFi.
BoredPositron 1 day ago||
Not worth it yet. I run a 6000 black for image and video generation, but local coding models just aren't on the same level as the closed ones.

I grabbed Gemini for $10/month during Black Friday, GPT for $15, and Claude for $20. Comes out to $45 total, and I never hit the limits since I toggle between the different models. Plus it has the benefit of not dumping too much money into one provider or hyper focusing on one model.

That said, as soon as an open weight model gets to the level of the closed ones we have now, I'll switch to local inference in a heartbeat.

holyknight 1 day ago||
Your premise would've been right if memory prices hadn't skyrocketed like 400% in the space of two weeks.
SpaceManNabs 1 day ago||
I love that this article added a correction and took ownership in it. This encourages more people to blog stuff and then get more input for parts they missed.

The best way to get the correct answer on something is posting the wrong thing. Not sure where I got this from, but I remember it was in the context of stackoverflow questions getting the correct answer in the comments of a reply :)

Props to the author for their honesty and having the impetus to blog about this in the first place.

ikidd 1 day ago||
So I can't see bothering with this when I pumped 260M tokens through running in Auto mode on a $20/mo Cursor plan. It was my first month of a paid subscription, if that means anything. Maybe someone can explain how this works for them?

Frankly, I don't understand it at all, and I'm waiting for the other shoe to drop.

lelanthran 1 day ago|
> So I can't see bothering with this when I pumped 260M tokens through running in Auto mode on a $20/mo Cursor plan. It was my first month of a paid subscription, if that means anything. Maybe someone can explain how this works for them?

They're running at a loss and covering up the losses using VC?

> Frankly, I don't understand it at all, and I'm waiting for the other shoe to drop.

I think that the providers are going to wait until there are a significant number of users that simply cannot function in any way without the subscription, and then jack up the prices.

After all, I can all but guarantee that even the senior devs at most places now wouldn't be able to function if every single tool or IDE provided by a corporation (like VSCode) was yanked from them.

Myself, you could scrub my main dev desktop of every corporate offering and I might not even notice: Emacs or Neovim with plugins like SLIME, LSP plugins, etc. is what I use daily, along with programming languages.

dackdel 1 day ago||
no one using exo?
redrove 1 day ago|
https://github.com/exo-explore/exo

I keep hearing about it, but unfortunately I only have one Mac plus NVIDIA GPUs, and those can't cluster together :/

lucideng 22 hours ago|
A Mac dev type using a 5-year-old machine? I'll believe it when I see it. I know a few older Macs still kicking around, but their owners use them for basic stuff, not actual work. Mac people jump to new models faster than Taco Bell leaves my body.