Posted by ishanz 10 hours ago
> The thing is that even if I was wrong (I'm not) and AI was somehow helpful for software engineering (it isn't), I still wouldn't want to use it.
So even if you were wrong on the facts (you are) you still wouldn't change your mind? In other words, you're unreasonable and know you're unreasonable and think that's totally fine?
Well, cool. Next time, lead with that.
This line was good and lines up well with why I use minimal AI. But if that was really the point, the rest of the article shouldn't have been needed.
You can certainly disagree, but that sentence isn't unreasonable - you cut out context.
At one point the author writes
> AI is a tool that can only produce software liabilities
which I would argue is entirely a product of misusing AI. Sure, you can have AI write a ton of code that often comes with subtle bugs. But using AI doesn't mean it has to write any code for you at all. I've often been using LLMs for security analysis, and the results are quite good: they surfaced vulnerabilities that we had collectively missed, and we could fix them ourselves.
In this case, instead of creating liabilities, we were able to use an LLM to get more information about our code. It's entirely possible we could have deduced this information on our own, but we didn't, and an LLM can do it much more quickly than humans can.
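A minimal sketch of what that analysis-only workflow can look like: the model is asked to find problems, not to write code. The prompt builder, file name, and snippet below are all hypothetical illustrations, not anything from the thread, and the resulting prompt could be fed to whatever model client you use.

```python
def build_security_review_prompt(filename: str, code: str) -> str:
    """Assemble an analysis-only prompt: the model is asked to find
    vulnerabilities, not to write or rewrite any code."""
    return (
        "You are performing a security review. Do NOT rewrite the code.\n"
        "List concrete vulnerabilities (e.g. injection, auth gaps, unsafe\n"
        "deserialization) with the relevant lines and a severity rating.\n\n"
        f"File: {filename}\n"
        f"{code}\n"
    )

# Hypothetical snippet with an obvious SQL-injection risk.
snippet = "query = \"SELECT * FROM users WHERE name = '\" + name + \"'\""
prompt = build_security_review_prompt("db.py", snippet)
# `prompt` would then be sent to the LLM; the review comes back as text.
```

The key design point is the constraint baked into the prompt: the model reports findings and you fix them yourself, so it never produces code that could become a liability.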
Then people (myself included) started to actually plan out the tasks for the bot: good specs, good acceptance criteria, file context, better self-review, better "agentic practices" (e.g. asking it to review its own work can sometimes help). Suddenly I noticed you really can use agents in a real-world 1M-LOC project, if you do it well and responsibly (which also means you retain some sense of ownership and actually review the shit).
It’s about the same for AI coding; I just get better results.
Another analog is using power tools to make jigs for hand tools. I’m constantly rigging up test or data wrangling harnesses to improve my ability to verify and refine solutions. It’s so ridiculously useful for improving outputs, even if it isn’t writing the code that makes it to production.
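A tiny example of the kind of verification harness described above: a trusted reference implementation serves as an oracle, and random inputs are thrown at a candidate (say, AI-written) version to flag any disagreement. The slug functions here are made-up stand-ins to show the shape of the jig, not anything from the thread.

```python
import random
import re

def reference_slug(s: str) -> str:
    """Trusted, deliberately obvious implementation used as the oracle."""
    slug = "".join(ch if ch.isalnum() else "-" for ch in s.lower())
    while "--" in slug:          # collapse runs of dashes
        slug = slug.replace("--", "-")
    return slug.strip("-")

def candidate_slug(s: str) -> str:
    """Stand-in for an AI-produced implementation under review."""
    return re.sub(r"[^a-z0-9]+", "-", s.lower()).strip("-")

def fuzz_compare(trials: int = 1000, seed: int = 0) -> list:
    """Throw random strings at both implementations; return any inputs
    where the candidate disagrees with the oracle."""
    rng = random.Random(seed)
    alphabet = "abcXYZ 123-_!?"
    failures = []
    for _ in range(trials):
        s = "".join(rng.choice(alphabet)
                    for _ in range(rng.randrange(0, 20)))
        if reference_slug(s) != candidate_slug(s):
            failures.append(s)
    return failures
```

The harness itself never ships; like a woodworking jig, it only exists to make checking the real output fast and repeatable.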
I have no idea what the frontier will look like in a few years but I don’t doubt local models like qwen will still be a staple of my workflows.
And for what it’s worth, there are people out there who lose the use of their saw because a safety brake totals the blade and it needs to be replaced for something like $100. Sometimes we pay extra for features we value. We can always pull out the hand tools if we have to. In the meantime, make hay, I guess.
With AI coding we're talking about people producing abstract artifacts that most people do not understand and do not know how to test. These aren't just strips of board. They are little machines. So you shouldn't be asking whether you'd trust a table saw to cut your boards, you should be asking whether you'd trust someone who has never cut boards to build your table saw.
People like you are an anomaly, not the norm. "I wrote an entire production-quality SaaS without knowing what a function is" is the norm.
Because that's what every AI usage I've experienced has been.
Faster, yes. Useful, yes. Not better "finish".
I only commit code that is roughly the same as I would have written anyway.
It feels as good for developer ergonomics as the move away from CRT monitors.
I want to be better month after month, I want to be able to discover new areas.
Using AI tools makes sense to me. It’s important that you don’t believe everything the hype men are telling on Twitter, but it would also be a mistake to believe there is nothing valuable in this technology.
I kind of think CRT monitors were much better for developer ergonomics than LCDs, because of the tendency to set modern monitors much deeper into the desk and lean forward to see them. CRTs forced you to sit with better posture.
You want a delivery service that takes 2 days instead of 30 minutes to bring you pizza, so that you don't forget how to ride your horse?
- Vendors get to know everything about you
- Chips are becoming more politicized; I fear artificial scarcity, as with housing, will be imposed on chips, driving up prices.
- It causes a lot of centralisation. No, I cannot run DeepSeek at home. I don't have $100,000+ lying around; 1TB of VRAM is not chump change.
- It can be a threat to the flourishing of open source. There is no longer a reason for me to work with other devs to build something in public together. I just have the LLM write what I need. It isolates.
These are the only drawbacks. Everything else is clearly the artisan's ego getting in the way. That being said, if a piece of code is critical infrastructure on which many other things hinge, I will still hand-code it.
EDIT: I think software will centralize heavily eventually; all the individual software devs we have now, and all the little custom shops, will coalesce into a few megacorps per state. Clothing used to be made by families (micro scale) for the village, not produced centrally. It's not unthinkable the same will happen with software. The vendors have unprecedented access to all software being made; not just the code, but all the reasoning and iteration behind it. Plus, they can use their own model for development, allowing them to undercut any software house they want. The software world will be completely unrecognizable in about two decades, I estimate.
It's the same with woodcraft as a hobby. On one end of the spectrum is a CNC router, which would somehow defeat the purpose (for me). I use an electric drill/screwdriver because it would be tedious to do everything manually. On the other hand, I like to saw with a Japanese saw because it is so good that I can work fast. Your mileage may vary.
Thinking about it, we might reconsider this whole philosophy of "software as a craft".
Code completion and snippets work well for this use case without AI.
Not true. I love coding. Much like I love walking and bicycling. I still own and use a car and an electric bike.
I also like having the option of getting to where I need to be fast.
Right now, I couldn't declare "I'm right" about any part of what's going on in this space, and I'm surprised when others do.
But yes this is a very extreme position.