
Posted by ishanz 10 hours ago

I Will Never Use AI to Code(antman-does-software.com)
55 points | 70 comments
sho 9 hours ago|
The final sentence says it all:

> The thing is that even if I was wrong (I'm not) and AI was somehow helpful for software engineering (it isn't), I still wouldn't want to use it.

So even if you were wrong on the facts (you are) you still wouldn't change your mind? In other words, you're unreasonable and know you're unreasonable and think that's totally fine?

Well, cool. Next time, lead with that.

pjjpo 8 hours ago||
> I actually like writing code. Why would I want to give up something I enjoy?

This line was good and lines up well with why I use minimal AI. But if that was the point, the rest of the article really shouldn't have been needed.

happytoexplain 3 hours ago|||
The bar for insulting people who have anything negative to say about AI is getting scarily low.

You can certainly disagree, but that sentence isn't unreasonable; you cut out the context.

laughingcurve 9 hours ago||
Thanks, helped me save some time
grasbergerm 9 hours ago||
It's fine if people don't want to use AI for anything, and honestly I don't even believe you need to justify it. The justification given here is interesting and I think shows misunderstanding.

At one point the author writes

> AI is a tool that can only produce software liabilities

which I would argue is entirely caused by misuse of AI. Sure, you can have AI write a ton of code that often comes with subtle bugs. But using AI doesn't mean it has to write any code for you at all. I've been using LLMs often for security analysis, and the results are quite good. Vulnerabilities that we had collectively missed were surfaced, and we could fix them ourselves.

In this case, instead of creating liabilities, we were able to use an LLM to get more information about our code. It's entirely possible we could have deduced this information on our own, but we didn't, and an LLM can do it much more quickly than humans.

anon22981 7 hours ago|
I suspect people with opinions like the author's haven't been on a project where people use LLMs responsibly. We had a senior dev who would basically just prompt and push, with very little oversight and minimal instructions, causing many bad PRs and even prod bugs. That made me a sceptic about agents for many months.

Then we started to have people (myself included) actually plan out the tasks for the bot: give good specs, good acceptance criteria, file context, better self-review, better "agentic practices" (e.g. asking it to review its own work can sometimes help), and suddenly I noticed you really can use agents in a real-world 1M LOC project. If you do it well and responsibly (which also means you still retain some sense of ownership and actually review the shit).

Anthony261 5 hours ago||
Nope. My point is not about the quality of the generated code; it's the fact that it was generated. No matter how good it is, it will always be a liability without the accompanying asset, which is the understanding produced by undertaking the effort of writing the code. Generated code is exclusively cognitive debt. It is also, by definition, legacy code, since no one wrote it.
dd04any 13 minutes ago||
what is the main reason you think this? i have literally 0 experience with coding, but i've used AI to build me stuff i need. what would you say the major concerns are?
ok_dad 9 hours ago||
I was a hand tool woodworker, but the first time I had to rip 56 six-foot boards into 7 strips, I immediately purchased a table saw. Now I use hand tools rarely, because I find the speed and quality of my cuts are better with the saw. I still use hand tools for things that require certain standards, but electric tools almost always produce better quality results.

It’s about the same for AI coding, I just get better results.

xigoi 1 hour ago||
Does your saw require you to pay for each use?
steve_adams_86 9 hours ago|||
Similar to woodworking, sometimes I use the LLM to rough out the concept quickly, then refine it. The initial roughing looks awful, and this seems to bother some people a lot. It's fine for me because I still have the correct tools to pull it all together. It saves me immense amounts of time.

Another analog is using power tools to make jigs for hand tools. I’m constantly rigging up test or data wrangling harnesses to improve my ability to verify and refine solutions. It’s so ridiculously useful for improving outputs, even if it isn’t writing the code that makes it to production.

officialchicken 9 hours ago|||
Your power tools run out of tokens and you have to open yet another online account to get around the daily sawing limits in order to finish the task today?
ok_dad 1 hour ago|||
I’m a professional so I don’t mind paying for tools.
steve_adams_86 9 hours ago||||
You can use Qwen 3.5 for genuinely useful stuff without worrying about subscriptions and tokens. The 35B model works well on my Mac Studio and does all kinds of menial tasks, so I can save my subscriptions for more important or complex things. I don't think it'll be long before models comparable to today's Sonnet run on my machine.

I have no idea what the frontier will look like in a few years, but I don't doubt local models like Qwen will still be a staple of my workflows.

And for what it's worth, there are people out there who lose their sawing ability because a safety brake fires, totaling their blade, and the brake needs to be replaced for something like $100. Sometimes we pay extra for features we value. We can always pull out the hand tools if we have to. In the meantime, make hay, I guess.

CaptainFever 9 hours ago|||
local models exist
globular-toast 9 hours ago|||
I think we have to be careful with such analogies. One does not have to have sweated for years with hand tools to understand what an accurate rip cut through ply looks like. On the other hand, if you just gave someone some rough cut wood and an electric sander, how would they even understand what that wood could look like having never used a good, sharp hand plane?

With AI coding we're talking about people producing abstract artifacts that most people do not understand and do not know how to test. These aren't just strips of board. They are little machines. So you shouldn't be asking whether you'd trust a table saw to cut your boards, you should be asking whether you'd trust someone who has never cut boards to build your table saw.

ok_dad 1 hour ago||
Everyone is talking about AI coding like only brainless idiots are using it. I'm a professional; I can judge and fix the clanker's output. I don't give a shit if some other idiot is using their tools right.
krapp 1 hour ago||
The vast majority of people using AI to code, even in production, are brainless idiots. Not knowing anything about the process and not needing to care is the entire draw of AI for most people regardless of the medium, and particularly for employers. Processes are moving to eliminate humans from the loop of AI production, not to require them.

People like you are an anomaly, not the norm. "I wrote an entire production quality SaaS without knowing what a function is" is the norm.

archagon 8 hours ago|||
A table saw does not make decisions for you.
b112 9 hours ago||
Is it? Isn't it the inverse? The speed of your cuts is improved with AI a bit, but aren't the cuts all rough and in need of additional work? Isn't the quality less than what you would do by hand?

Because that's what every AI usage I've experienced has been.

Faster, yes. Useful, yes. Not better "finish".

okeuro49 9 hours ago||
I love using AI to code, as it saves me a lot of boring and repetitive typing.

I only commit code that is roughly the same as I would have written anyway.

It feels as good for developer ergonomics as the move away from CRT monitors.

ok_dad 9 hours ago||
I think I’m lucky that I never enjoyed programming, I enjoyed thinking about problems. That makes AI coding great, because I’m good enough at programming that I can describe what I want easily to an LLM, and I can judge the results very well for myself. I read and understand each line so I know I’m not committing crap.
serial_dev 9 hours ago||
I feel similarly. I wanted to develop software, I didn’t want to “program”. I want my code to fix problems, I want the end result to feel great to use, I want it to be able to fix problems and feel great a year from now, too.

I want to be better month after month, I want to be able to discover new areas.

Using AI tools makes sense to me. It's important that you don't believe everything the hype men are saying on Twitter, but it would also be a mistake to believe there is nothing valuable in this technology.

brailsafe 9 hours ago|||
> It feels as good for developer ergonomics as the move away from CRT monitors.

I kind of think CRT monitors were much better for developer ergonomics than LCDs, because of the tendency to set modern monitors much deeper on the desk, forcing you to lean forward to see them. CRTs forced you to sit with better posture.

xantronix 9 hours ago||
Just how much boilerplate have people been putting up with for this to be an oft cited advantage of LLM usage? I know boilerplate has to exist somewhere, but I've been labouring these past couple decades under the assumption that boilerplate should be rare and to be avoided.
altern8 9 hours ago||
We still know how to ride horses but we also drive cars, now.

You want a delivery service that takes 2 days instead of 30 minutes to bring you pizza, so that you don't forget how to ride your horse..?

kakacik 9 hours ago||
Most people would not be able to ride a horse properly; that would end up in catastrophe (or in nothing, just standing around or wandering in random directions). So your analogy is good, but not in the way you probably intended.
altern8 9 hours ago||
My point was: if you still want to know how to code, you can, without having to lose all your customers.
ishanz 8 hours ago||
[dead]
edg5000 9 hours ago||
What's bad about AI:

- Vendors get to know everything about you

- Chips are becoming more politicized; I fear artificial scarcity, as with housing, will be imposed on chips, driving up prices.

- It causes a lot of centralisation. No, I cannot run DeepSeek at home. I don't have $100,000+ lying around. 1TB of VRAM is not chump change.

- It can be a threat to the flourishing of open source. There is no longer a reason for me to work with other devs to build something in public together. I just have the LLM write what I need. It isolates.

These are the only drawbacks. Everything else is clearly the artisans' ego getting in the way. That being said, if a piece of code is critical infra on which many other things hinge, I will still hand-code it.

EDIT: I think software will centralize heavily eventually; all the individual software devs we have now, and all the little custom shops, will coalesce into a few megacorps per state. Clothing used to be made by families (micro scale), for the village, not produced centrally. It's not unthinkable the same will happen with software. The vendors have unprecedented access to all software being made; not just the code, but all the reasoning and iteration behind it. Plus, they can use their own model for development, allowing them to undercut any software house they want. The software world will be completely unrecognizable in about two decades, I estimate.

ofrzeta 5 hours ago||
It doesn't have to be only extremes. How about using a little bit of AI? I also like coding but on the other hand writing down the 1000th loop to iterate over some array is not exactly fulfilling either.

It's the same with woodcraft as a hobby. On one end of the spectrum is a CNC router; that's something that would somehow defeat the purpose (for me). I use an electric drill/screwdriver because it would be tedious to do everything manually. On the other hand, I like to saw with a Japanese saw because it is so good that I can work fast. Your mileage may vary.

Thinking about it, we might want to reconsider this whole philosophy of "software as a craft".

cassianoleal 2 hours ago|
> writing down the 1000th loop to iterate over some array is not exactly fulfilling either.

Code completion and snippets work well for this use case without AI.

xigoi 1 hour ago||
Or just using a language that doesn’t make it hard to iterate over arrays.
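For illustration, a minimal sketch (in Python, with made-up data) of the kind of loop boilerplate being complained about versus what an expressive language offers:

```python
values = [3, 1, 4, 1, 5]

# The "1000th loop over some array": index-based boilerplate.
squares = []
for i in range(len(values)):
    squares.append(values[i] ** 2)

# The same thing as a one-liner; nothing left worth generating.
squares_short = [v ** 2 for v in values]

assert squares == squares_short  # both are [9, 1, 16, 1, 25]
```

Editor snippets, and comprehensions or map/filter-style constructs, cover this case without involving an LLM at all.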
cassianoleal 45 seconds ago||
TBF I haven't met one that does yet.
sshine 5 hours ago||
> Maybe the only useful thing about AI coding is making it easy to identify engineers that don't enjoy writing code.

Not true. I love coding. Much like I love walking and bicycling. I still own and use a car and an electric bike.

I also like having the option of getting to where I need to be fast.

padjo 9 hours ago|
AI discourse perfectly illustrates why you should just ignore people with viewpoints at the extreme ends of the spectrum.
schmookeeg 8 hours ago||
It's not the extremism I mind, it's the absolutism.

Right now, I couldn't declare "I'm right" about any part of what's going on in this space, and I'm surprised when others do so.

padjo 7 hours ago||
I think they're basically the same. "AI is useless for everything, therefore I will never use it", or "AI will solve everything, so I'm never going to even look at the code it produces". Both are extreme/absolutist positions, and both impractical/foolish.
ishanz 8 hours ago|||
I don’t love that take. Extreme positions are often where the interesting ideas are, even if you don’t agree with them.
padjo 7 hours ago||
The ideas are alluring because they're extreme. However the relatively boring stuff in the middle is far more likely to reflect reality or be actually useful.
rvz 9 hours ago||
Exactly. I find these proclamations pointless and sanctimonious. We don't know if he is using these AI tools privately.

But yes this is a very extreme position.
