Posted by mpweiher 1 day ago
But I keep thinking: It should be possible to run some kind of supercharged tab completion on there, no? I'm spending most of my time writing Ansible or in the shell, and I have a feeling that even a small local model should give me vastly more useful completion options...
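Something along those lines seems doable with even a small local model. A minimal sketch, assuming a llama.cpp llama-server listening on localhost:8080 (the port, prompt wording, and candidate parsing here are placeholders, not anything battle-tested):

    #!/usr/bin/env python3
    """Suggest shell-command completions from a small local model.

    Assumes llama-server (from llama.cpp) is running on localhost:8080;
    hook the printed candidates into your shell's completion however you like.
    """
    import json
    import sys
    import urllib.request

    def suggest(partial_command: str, n: int = 3) -> list[str]:
        # Ask the model to finish the command; keep the prompt short and literal.
        prompt = (
            "Complete this shell command. Reply with the full command only.\n"
            f"Partial: {partial_command}\nCompleted:"
        )
        body = json.dumps(
            {"prompt": prompt, "n_predict": 48, "temperature": 0.2}
        ).encode()
        req = urllib.request.Request(
            "http://localhost:8080/completion",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req, timeout=5) as resp:
            content = json.loads(resp.read())["content"]
        # Treat each non-empty line of the reply as a candidate completion.
        return [line.strip() for line in content.splitlines() if line.strip()][:n]

    if __name__ == "__main__":
        for candidate in suggest(" ".join(sys.argv[1:])):
            print(candidate)

The model call is the easy part; wiring the candidates into zsh or bash completion cleanly is where the real work would be.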
I grabbed Gemini for $10/month during Black Friday, GPT for $15, and Claude for $20. That comes out to $45 total, and I never hit the limits since I toggle between the different models. It also has the benefit of not dumping too much money into one provider or hyper-focusing on one model.
That said, as soon as an open-weight model reaches the level of the closed ones we have now, I'll switch to local inference in a heartbeat.
The best way to get the correct answer to something is to post the wrong thing. Not sure where I got this from, but I remember it being in the context of Stack Overflow questions, where the correct answer turns up in the comments on a wrong reply :)
Props to the author for their honesty and for having the initiative to blog about this in the first place.
Frankly, I don't understand it at all, and I'm waiting for the other shoe to drop.
They're running at a loss and covering up the losses using VC?
> Frankly, I don't understand it at all, and I'm waiting for the other shoe to drop.
I think the providers are going to wait until a significant number of users simply cannot function without the subscription, and then jack up the prices.
After all, I can all but guarantee that even the senior devs at most places now wouldn't be able to function if every single tool or IDE provided by a corporation (like VSCode) were yanked from them.
Myself, you could scrub my main dev desktop of every corporate offering and I might not even notice; Emacs or Neovim with plugins like SLIME and LSP clients are what I use daily, along with the programming languages themselves.
I keep hearing about it, but unfortunately I only have one Mac plus NVIDIA GPUs, and those can't cluster together :/