
Posted by ColinWright 8 hours ago

We mourn our craft (nolanlawson.com)
396 points | 525 comments
sosomoxie 7 hours ago|
I started programming over 40 years ago because it felt like computers were magic. They feel more magic today than ever before. We're literally living in the 1980s fantasy where you could talk to your computer and it had a personality. I can't believe it's actually happening, and I've never had more fun computing.

I can't empathize with the complaint that we've "lost something" at all. We're on the precipice of something incredible. That's not to say there aren't downsides (WOPR almost killed everyone after all), but we're definitely in a golden age of computing.

hnlmorg 6 hours ago||
The golden age for me is any period where you have fully documented systems.

Hardware that ships with documentation about what instructions it supports. With example code. Like my 8-bit micros did.

And software that’s open and can be modified.

Instead what we have is:

- AI models, which are little black boxes beyond our ability to fully reason about.

- perpetual subscription services for the same software we used to “own”.

- hardware that is completely undocumented to all but a small few who are granted an NDA beforehand

- operating systems that are trying harder and harder to prevent us from running any software they haven’t approved because “security”

- and distributed systems becoming centralised, such as GitHub, CloudFlare, AWS, and so on.

The only thing special about right now is that we have added yet another abstraction on top of an already overly complex software stack, allowing us to use natural language as pseudocode. And that is a genuinely special breakthrough, but it’s not enough by itself to overlook all the other problems with modern computing.

davidhyde 4 hours ago|||
My take on the difference between now and then is “effort”. All the things mentioned above are now effortless, but the door to “effort” remains open as it always has been. Take the first point, for example: those little black boxes of AI can be significantly demystified by, say, watching a series of videos (https://karpathy.ai/zero-to-hero.html) and spending at least 40 hours of hard cognitive effort learning about it yourself. We used to purchase software or write it ourselves before it became effortless to get it for free in exchange for ads, and then for a subscription once we grew tired of ads or were tricked by a bait and switch. You can also argue that it has never been easier to write your own software than it is today.

Hostile operating systems. Take the effort to switch to Linux.

Undocumented hardware: well, there is far more open source hardware out there today, and back in the day it was fun to reverse engineer hardware. Now we just expect it to be open because we can’t be bothered to put in the effort anymore.

Effort gives me agency. I really like learning new things and so agentic LLMs don’t make me feel hopeless.

hnlmorg 4 hours ago||
I’ve worked in the AI space and I understand how LLMs work in principle. But we don’t know the magic contained within a model after it’s been trained. We understand how to design a model, and how models work at a theoretical level, but we cannot know how well one will perform at inference until we test it. So much of AI research is just trial and error, with different dials repeatedly tweaked until we get something desirable. So no, we don’t understand these models in the same way we might understand how a hashing algorithm works. Or a compression routine. Or an encryption cipher. Or any other hand-programmed algorithm.

I also run Linux. But that doesn’t change how the two major platforms behave and that, as software developers, we have to support those platforms.

Open source hardware is great, but it’s not in the same league of price and performance as proprietary hardware.

Agentic AI doesn’t make me feel hopeless either. I’m just describing what I’d personally define as a “golden age of computing”.

bhadass 14 minutes ago||
but isn't this like a lot of other CS-related "gradient descent"?

when someone invents a new scheduling algorithm or a new concurrent data structure, it's usually based on hunches and empirical results (benchmarks) too. nobody sits down and mathematically proves their new linux scheduler is optimal before shipping it. they test it against representative workloads and see if there is uplift.

we understand transformer architectures at the same theoretical level we understand most complex systems. we know the principles, we have solid intuitions about why certain things work, but the emergent behavior of any sufficiently complex system isn't fully predictable from first principles.

that's true of operating systems, distributed databases, and most software above a certain complexity threshold.

zzo38computer 2 hours ago||||
> The golden age for me is any period where you have the fully documented systems. Hardware that ships with documentation about what instructions it supports. With example code. Like my 8-bit micros did. And software that’s open and can be modified.

I agree that it would be good. (It is one reason why I wanted to design a better computer, which would include full documentation about the hardware and the software (hopefully enough to make a compatible computer), as well as full source code (which can help if some parts of the documentation are unclear, and can also be used to make your own modifications if needed).) (In some cases we have some of this already, but not entirely. Not all hardware and software has the problems you list, although they are too common now. Making a better computer will not prevent such problematic things on other computers, and will not entirely prevent them on the new computer design either, but it would help a bit, especially if it is actually designed well rather than badly.)

HoldOnAMinute 5 hours ago||||
Have you tried using GenAI to write documentation? You can literally point it to a folder and say, analyze everything in this folder and write a document about it. And it will do it. It's more thorough than anything a human could do, especially in the time frame we're talking about.

If GenAI could only write documentation it would still be a game changer.

orwin 5 hours ago|||
But it writes mostly useless documentation, which takes time to read and decipher.

And worse, if you are using it for public documentation, sometimes it hallucinates endpoints (I don't want to say too much here, but it happened recently to a fairly widely used B2B SaaS).

throwup238 5 hours ago||
Loop it. Use another agent (from a different company helps) to review the code and documentation and call out any inconsistencies.

I run a bunch of jobs weekly to review docs for inconsistencies and write a plan to fix them. It still needs humans in the loop when the agents don’t converge after a few turns, but it’s largely automatic (I babysat it for a few months, validating each change).
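The review loop described above can be sketched roughly as follows. This is a hypothetical illustration, not the commenter's actual setup: `review_docs` and `apply_fixes` are toy stand-ins for calls to two different vendors' agent APIs.

```python
# Toy sketch of the "loop it" workflow: one agent reviews code against its
# docs, another patches the docs, and a human is flagged if they never
# converge. review_docs() and apply_fixes() are placeholder stand-ins, not
# real library calls.

MAX_TURNS = 3

def review_docs(code: str, docs: str) -> list[str]:
    """Toy reviewer: flag one kind of code/doc inconsistency."""
    issues = []
    if "retries" in code and "retries" not in docs:
        issues.append("docs do not mention the retry behaviour")
    return issues

def apply_fixes(docs: str, issues: list[str]) -> str:
    """Toy fixer: patch the docs for each reported issue."""
    for issue in issues:
        if "retry" in issue:
            docs += "\nThis function retries failed calls (see `retries`)."
    return docs

def reconcile(code: str, docs: str) -> tuple[str, bool]:
    """Loop reviewer and fixer; return (docs, converged?)."""
    for _ in range(MAX_TURNS):
        issues = review_docs(code, docs)
        if not issues:
            return docs, True   # agents converged, no human needed
        docs = apply_fixes(docs, issues)
    return docs, False          # still inconsistent: human in the loop
```

The bounded turn count is the key design point: the loop either converges quickly or escalates to a person rather than burning tokens forever.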

hnlmorg 4 hours ago|||
The problem with documentation I described wasn’t about the effort of writing it. It was that modern chipsets are trade secrets.

When you bought a computer in the 80s, you’d get a technical manual about the internal workings of the hardware. In some cases even going as far as detailing what the registers did on their graphics chipset or CPU.

GenAI wouldn’t help here for modern hardware because GenAI doesn’t have access to those specifications. And if it did, then it would already be documented, so we wouldn’t need GenAI to write it ;)

XenophileJKO 4 hours ago||||
Actually this makes me think of an interesting point. We DO have too many layers of software, and rebuilding is always so cost prohibitive.

Maybe an interesting route is using LLMs to flatten/simplify, so we can dig out from some of the complexity.

hnlmorg 4 hours ago||
I’ve heard this argument made before and it’s the only side of AI software development that excites me.

Using AI to write yet another run-of-the-mill web service in the same bloated frameworks and programming languages designed for the lowest common denominator of developers really doesn’t feel like it’s taking advantage of the leap in capabilities that AI brings.

But using AI to write native applications in low-level languages, built for performance and memory utilisation, does at least feel like we are getting some actual quality-of-life savings in exchange for all those fossil fuels burnt crunching LLM tokens.

fragmede 5 hours ago|||
> perpetual subscription services for the same software we used to “own”.

In another thread, people were looking for things to build. If there's a subscription service that you think shouldn't be a subscription (because they're not actually doing anything new for that subscription), disrupt the fuck out of it. Rent seekers about to lose their shirts. I pay for eg Spotify because there's new music that has to happen, but Dropbox?

If you're not adding new whatever (features/content) in order to justify a subscription, then you're only worth the electricity and hardware costs or else I'm gonna build and host my own.

hnlmorg 4 hours ago||
People have been building alternatives to MS Office, Adobe Creative Suite, and so on for literally decades, and yet they’re still the de facto standard.

Turns out it’s a lot harder to disrupt than it sounds.

apitman 7 hours ago|||
In some ways, I'd say we're in a software dark age. In 40 years we'll still have C, bash, grep, and Mario ROMs, but practically none of the software written today will still be around. That's by design: SaaS is a rent-seeking business model. But I think it also applies to most code written in JS, Python, C#, Go, Rust, etc. There are too many dependencies. There's no way you'll be able to take a repo from 2026 and spin it up in 2050 without major work.

One question is how AI will factor into this. Will it completely remove the problem? Will local models be capable of finding or fixing every dependency in your 20-year-old project? Or will they exacerbate things by writing terrible code with black-hole dependency trees? We're gonna find out.

zzo38computer 1 hour ago|||
> That's by design. SaaS is a rent seeking business model.

Not all software now is SaaS, but unfortunately it is too common now.

> But I think it also applies to most code written in JS, Python, C#, Go, Rust, etc. There are too many dependencies.

Some people (including myself) prefer to write programs without too many dependencies, in order to avoid that problem. Other things also help: some people write programs for older systems which can be emulated, or use simpler, portable C code, etc. There are things that can be done to avoid too many dependencies.

There is uxn, which has a simple enough instruction set that people can probably implement it without too much difficulty. Although some programs might need some extensions, and some might use file names, etc., many programs will work, because it is designed simply enough that it will.
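To give a sense of how small such an interpreter can be, here is a toy stack machine in that spirit. Note this is NOT the actual uxn opcode set, just an illustration of the general shape of a minimal stack-based VM:

```python
# A toy stack-machine interpreter, illustrating how little code a minimal
# instruction set needs. Opcodes here are invented, not uxn's.

def run(program):
    """Execute a list of (op, arg) pairs on a single data stack."""
    stack = []
    for op, arg in program:
        if op == "LIT":          # push a literal value
            stack.append(arg)
        elif op == "ADD":        # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":        # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "DUP":        # duplicate the top of the stack
            stack.append(stack[-1])
        else:
            raise ValueError(f"unknown opcode {op}")
    return stack

# (2 + 3) * (2 + 3) = 25
print(run([("LIT", 2), ("LIT", 3), ("ADD", None),
           ("DUP", None), ("MUL", None)]))  # [25]
```

A real uxn implementation adds a fixed-size memory, a return stack, and device I/O, but the core dispatch loop stays about this simple.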

ctmnt 5 hours ago|||
I’m not sure Go belongs on that list. Otherwise I hear what you’re saying.
apitman 5 hours ago||
A large percentage of the code I've written the last 10 years is Go. I think it does somewhat better than the others in some areas, such as relative simplicity and having a robust stdlib, but a lot of this is false security. The simplicity is surface level. The runtime and GC are very complex. And the stdlib being robust means that if you ever have to implement a compiler from scratch, you have to implement all of std.

All in all I think the end result will be the same. I don't think any of my Go code will survive long term.

hnlmorg 4 hours ago||
I’ve got 8 year old Go code that still compiles fine on the latest Go compiler.

Go has its warts but backwards compatibility isn’t one of them. The language is almost as durable as Perl.

massysett 7 hours ago|||
We have what I've dreamed of for years: the reverse dictionary.

Put in a word and see what it means? That's been easy for at least a century. Have a meaning in mind and get the word? The only way to get this before was to read a ton of books and be knowledgeable, or talk to someone who was. Now it's always available.
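The reverse-dictionary idea can be sketched crudely in code: score each word by how much its definition overlaps the query. The three-entry dictionary here is toy data; a real tool (or an LLM) works over a far larger corpus with a much better similarity measure.

```python
# A crude "reverse dictionary": given a meaning, rank words by how many
# non-trivial tokens their definition shares with the query. TOY_DICT is
# invented sample data for illustration only.

TOY_DICT = {
    "versatile": "capable of many different uses",
    "ephemeral": "lasting for a very short time",
    "gregarious": "fond of the company of others",
}

def tokens(text: str) -> set[str]:
    """Lowercase tokens, with very short (stopword-ish) words filtered out."""
    return {t for t in text.lower().split() if len(t) > 3}

def reverse_lookup(meaning: str) -> str:
    """Return the word whose definition best overlaps the query meaning."""
    query = tokens(meaning)
    return max(TOY_DICT, key=lambda w: len(query & tokens(TOY_DICT[w])))

print(reverse_lookup("a word for something with many uses"))  # versatile
```

Token overlap is obviously brittle; the point is only that "meaning in, word out" is a ranking problem, which is why embedding-based tools and LLMs do it so much better.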

astrashe2 7 hours ago|||
This is a great description of how I use Claude.
terminalbraid 7 hours ago||||
> Now it's always available.

And often incorrect! (and occasionally refuses to answer)

NeutralCrane 5 hours ago|||
Is it? I’ve seen AI hallucinations, but they seem to be increasingly rare these days.

Much of the AI antipathy reminds me of Wikipedia in the early-mid 2000s. I remember feeling amazed with it, but also remember a lot of ranting by skeptics about how anyone could put anything on there, and therefore it was unreliable, not to be used, and doomed to fail.

20 years later and everyone understands that Wikipedia may have its shortcomings, and yet it is still the most impressive, useful advancement in human knowledge transfer in a generation.

advael 5 hours ago||
I think robust crowdsourcing is probably the biggest capital-A Advancement in humanity's capabilities that came out of the internet, and there's a huge disparity in results that comes from how that capability is structured and used. Wikipedia designed protocols, laws, and institutions that leverage crowdsourcing to be the most reliable de facto aggregator of human knowledge. Social media designed protocols, laws, and institutions to rot people's brains, surveil their every move, and enable mass-disinformation to take over the public imagination on a regular basis.

I think LLMs as a technology are pretty cool, much like crowdsourcing is. We finally have pretty good automatic natural language processing that scales to large corpora. That's big. Also, I think the state of the software industry that is mostly driving the development, deployment, and ownership of this technology is mostly doing uninspired and shitty things with it. I have some hope that better orgs and distributed communities will accomplish some cool and maybe even monumental things with them over time. But right now the field is bleak, not because the technology isn't impressive (although somehow, despite how impressive it is, it's still being oversold) but because Silicon Valley is full of rotten institutions with broken incentives, the same ones that brought us social media and subscriptions to software. My hope for the new world a technology will bring about will never rest with corporate aristocracy, but with the more thoughtful institutions and the distributed open source communities that actually build good shit for humanity, time and time again.

dgacmu 7 hours ago||||
It is! But you can then verify it via a correct, conventional forward dictionary.

The scary applications are the ones where it's not so easy to check correctness...

terminalbraid 7 hours ago||
Right. Except the dictionary analogy only goes so far and we reach the true problem.
0x696C6961 5 hours ago||
It's not an analogy.
0x696C6961 7 hours ago|||
Sure, but it's easy to check if it's incorrect and try again.
terminalbraid 7 hours ago|||
Forgive me if "just dig your way out of the hole" doesn't sound appealing.
0x696C6961 6 hours ago|||
You're free to use whatever tools you like.
chasd00 5 hours ago||
> You're free to use whatever tools you like.

this is important, i feel like a lot of people are falling into the "stop liking what i don't like" way of thinking. Further, there's a million different ways to apply an AI helper in software development. You can adjust your workflow in whatever way works best for you... or leave it as is.

shepherdjerred 7 hours ago||||
Surely you, a programmer, can imagine a way to automate this process
terminalbraid 7 hours ago||
No, I actually haven't made, nor desire to make, a way to automate "thinking about, researching, and solving a problem".
Jensson 5 hours ago|||
When you use it to look up a single word, yeah, but people here use it to look up a thousand words at once and then can't check them all.
wizzwizz4 7 hours ago||||
The "reverse dictionary" is called a "thesaurus". Wikipedia quotes Peter Mark Roget (1852):

> ...to find the word, or words, by which [an] idea may be most fitly and aptly expressed

Digital reverse dictionaries / thesauri like https://www.onelook.com/thesaurus/ can take natural language input, and afaict are strictly better at this task than LLMs. (I didn't know these tools existed when I wrote the rest of this comment.)

I briefly investigated LLMs for this purpose, back when I didn't know how to use a thesaurus; but I find thesauruses a lot more useful. (Actually, I'm usually too lazy to crack out a proper thesaurus, so I spend 5 seconds poking around Wiktionary first: that's usually Good Enough™ to find me an answer, when I find an answer I can trust it, and I get the answer faster than waiting for an LLM to finish generating a response.)

There's definitely room to improve upon the traditional "big book of synonyms with double-indirect pointers" thesaurus, but LLMs are an extremely crude solution that I don't think is actually an improvement.

yunwal 7 hours ago|||
A thesaurus is not a reverse dictionary
dgacmu 7 hours ago|||
Really?

"What's a word that means admitting a large number of uses?"

That seems hard to find in a thesaurus without either versatile or multifarious as a starting point (but those are the end points).

wizzwizz4 7 hours ago||
I plugged "admitting a large number of uses" into OneLook Thesaurus (https://www.onelook.com/thesaurus/?s=admitting%20a%20large%2...), and it returned:

> Best match is versatile which usually means: Capable of many different uses

with "multi-purpose", "adaptable", "flexible" and "multi-use" as the runner-up candidates.

---

Like you, I had no idea that tools like OneLook Thesaurus existed (despite how easy it would be to make one), so here's my attempt to look this up manually.

"Admitting a large number of uses" -> manually abbreviated to "very useful" -> https://en.wiktionary.org/wiki/useful -> dead end. Give up, use a thesaurus.

https://www.wordhippo.com/what-is/another-word-for/very_usef..., sense 2 "Usable in multiple ways", lists:

> useful multipurpose versatile flexible multifunction adaptable all-around all-purpose all-round multiuse multifaceted extremely useful one-size-fits-all universal protean general general-purpose […]

Taking advantage of the fact my passive vocabulary is greater than my active vocabulary: no, no, yes. (I've spuriously rejected "multipurpose" – a decent synonym of "versatile [tool]" – but that doesn't matter.) I'm pretty sure WordHippo is machine-generated from some corpus, and a lot of these words don't mean "very useful", but they're good at playing the SEO game, and I'm lazy. Once we have versatile, we can put that into an actual thesaurus: https://dictionary.cambridge.org/thesaurus/versatile. But none of those really have the same sense as "versatile" in the context I'm thinking of (except perhaps "adaptable"), so if I were writing something, I'd go with "versatile".

Total time taken: 15 seconds. And I'm confident that the answer is correct.

By the way, I'm not finding "multifarious" anywhere. It's not a word I'm familiar with, but that doesn't actually seem to be a proper synonym (according to Wiktionary, at least: https://en.wiktionary.org/wiki/Thesaurus:heterogeneous). There are certainly contexts where you could use this word in place of "versatile" (e.g. "versatile skill-set" → "multifarious skill-set"), but I criticise WordHippo for far less dubious synonym suggestions.

dgacmu 6 hours ago||
'multifarious uses' -> the implication would be having not just many but also a wide diversity of uses

M-W gives an example use of "Today’s Thermomix has become a beast of multifarious functionality. — Matthew Korfhage, Wired News, 21 Nov. 2025 "

wordhippo strikes me as having gone beyond the traditional paper thesaurus, but I can accept that things change and that we can make a much larger thesaurus than we could when we had to collect and print. thesaurus.com does not offer these results, though, as a reflection of a more traditional one, nor does the M-W thesaurus.

cess11 4 hours ago|||
"The only way to get this before was to read a ton of books and be knowledgable or talk to someone who was"

Did you have trouble with this part?

xeromal 2 hours ago||
This seems like a hostile question.
crawshaw 7 hours ago|||
Glad to see this already expressed here because I wholly agree. Programming has not brought me this much joy in decades. What a wonderful time to be alive.
socalgal2 4 hours ago||
I wish I could have you sit by my side for a week or two and pair program on what I'm working on, because most of the time I'm not getting great results.
chkaloon 3 hours ago|||
Depends on the project. For web-based functionality it seems great, because of all the prior work that is out there. For more obscure things like Obsidian extensions or Home Assistant help, it's more hit and miss.
crawshaw 2 hours ago|||
You in SF? My schedule is a bit busy since we launched but I could find an hour in the city.
hintymad 3 hours ago|||
One thing I realized is that a lot of our so-called "craft" is converged know-how. Take the recent news that Anthropic used Claude Code to write a C compiler, for example. Writing a compiler is hard (and fun) for us humans because we need to spend years deeply understanding compiler theory and learning every minute detail of implementation. That kind of learning is not easily transferable: most students take the compiler class and never learn enough, and only a handful each year continue to grow into true compiler engineers. Yet to our AI models it doesn't matter much. They have already learned the well-established patterns of compiler writing from excellent open-source implementations, and now they can churn out millions of lines of code easily. If not perfect, they will get better in the future.

So, in a sense, our "craft" no longer matters, but what has really happened is that the repetitive know-how has become commoditized. We still need people to do creative work, but what is not clear is how many such people we will need. After all, at least in the short term, most people build their careers by perfecting procedural work, because transferring the know-how and the underlying whys is very expensive for humans. For the long term, though, I'm optimistic that engineers have just gotten an amazing tool and will use it to create more opportunities that demand more people.

blibble 2 hours ago||
writing a C compiler is a 1st year undergrad project

C was explicitly designed to make it simple to write a compiler

hintymad 1 hour ago||
Which university offers a compiler course for freshmen? Can you provide a link to the course?
HendrikHensen 7 hours ago|||
Good for you. But there are already so, so many posts and threads celebrating all of this. Everyone is different. Some of us enjoy the activity of programming by hand. This thread is for those of us, to mourn.
nradov 5 hours ago||
You're still allowed to program by hand. Even in assembly language if you like.
fupaskys 3 hours ago||
[dead]
anonnon 7 hours ago|||
> I can't empathize with the complaint that we've "lost something" at all.

We could easily approach a state of affairs where most of what you see online is AI and almost every "person" you interact with is fake. It's hard to see how someone who supposedly remembers computing in the 80s, when the power of USENET and BBSs to facilitate long-distance, or even international, communication and foster personal relationships (often IRL) was enthralling, could not think we've lost something.

icedchai 5 hours ago|||
I grew up on 80's and 90's BBSes. The transition from BBSes to Usenet and the early Internet was a magical period, a time I still look back upon fondly and will never forget.

Some of my best friends IRL today were people I first met "online" in those days... but I haven't met anyone new in a longggg time. Yeah, I'm also much older, but the environment is also very different. The community aspect is long gone.

munksbeer 4 hours ago||||
I'm from the early 90s era. I know exactly what you're saying. I entered the internet on muds, irc and usenet. There were just far fewer people online in those communities in those days, and in my country, it was mostly only us university students.

But, those days disappeared a long time ago. Probably at least 20-30 years ago.

Gud 4 hours ago||
IRC is still around, that old internet is still there.

You just have to get off the commercial crap and you’ll find it.

chasd00 3 hours ago|||
even in the 90s there was the phrase "the Internet, where the men are men, the women are men, and the teen girls are FBI agents". It was always the case you never really knew who/what you were dealing with on the Internet.
croes 7 hours ago|||
> We're on the precipice of something incredible.

Total dependence on a service?

_aavaa_ 6 hours ago|||
The quality of local models has increased significantly since this time last year. As have the options for running larger local models.
icedchai 5 hours ago|||
The quality of local models is still abysmal compared to commercial SOTA models. You're not going to run something like Gemini or Claude locally. I have some "serious" hardware with 128G of VRAM and the results are still laughable. If I moved up to 512G, it still wouldn't be enough. You need serious hardware to get both quality and speed. If I can get "quality" at a couple tokens a second, it's not worth bothering.

They are getting better, but that doesn't mean they're good.

_aavaa_ 5 hours ago||
Good by what standard? Compared to SOTA today? No they're not. But they are better than the SOTA in 2020, and likely 2023.

We have a magical pseudo-thinking machine that we can run locally, completely under our control, and instead the goal posts have moved to "but it's not as fast as the proprietary cloud".

icedchai 4 hours ago||
My comparison was today's local AI to today's SOTA commercial AI. Both have improved, no argument.

It's more cost effective for someone to pay $20 to $100 month for a Claude subscription compared to buying a 512 gig Mac Studio for $10K. We won't discuss the cost of the NVidia rig.

I mess around with local AI all the time. It's a fun hobby, but the quality is still night and day.

IhateAI_2 5 hours ago|||
These takes are terrible.

1. It costs $100k in hardware to run Kimi 2.5 in a single session at decent tok/s, and it's still not capable enough for anything serious.

2. I want whatever you're smoking if you think anyone is going to spend billions training models that are capable of outcompeting them and affordable to run, and then open source them.

gordonhart 5 hours ago||||
On a scale that would make big tobacco blush.
kaffekaka 6 hours ago||||
Yes this is the issue. We truly have something incredible now. Something that could benefit all of humanity. Unfortunately it comes at $200/month from Sam Altman & co.
puchatek 5 hours ago||
If that were the final price, with no strings attached and perfect, reliable privacy, then I might consider it. Maybe not for the current iteration, but for what will be on offer in a year or two.

But as it stands right now, the most useful LLMs are hosted by companies that are legally obligated to hand over your data if the US government decides it wants it. It's unacceptable.

lovich 3 hours ago||
That $200/month price isn’t sustainable either. Eventually they’re going to have to jack it up substantially.
NeutralCrane 5 hours ago||||
Between the internet, or more generally computers, or even more generally electricity, are we not already?
hafley66 6 hours ago|||
prefrontal cortex as a service
slekker 5 hours ago||
yup, all these folks claiming AI is the bees knees are delegating their thinking to a roulette that may or may not give proper answers. the world will become more and more like the movie idiocracy
uejfiweun 7 hours ago|||
I agree with you with the caveat that all the "ease of building" benefits, for me, could potentially be dwarfed by job losses and pay decreases. If SWE really becomes obsolete, or even if the number of roles decrease a lot and/or the pay decreases a lot (or even fails to increase with inflation), I am suddenly in the unenviable position of not being financially secure and being stuck in my 30s with an increasingly useless degree. A life disaster, in other words. In that scenario the unhappiness of worrying about money and retraining far outweighs the happiness I get from being able to build stuff really fast.

Fundamentally this is the only point I really have on the 'anti-AI' side, but it's a really important one.

busterarm 5 hours ago|||
We definitely have lost something. I got into computers because they're deterministic. Way less complicated than people.

Now the determinism is gone and computers are gaining the worst qualities of people.

My only sanctuary in life is slipping away from me. And I have to hear people tell me I'm wrong who aren't even sympathetic to how this affects me.

Gud 4 hours ago||
But no one is forcing you to use this software?
renegade-otter 3 hours ago|||
Nothing meaningful happened in almost 20 years. After the iPhone, what happened that truly changed our lives? The dumpster fire of social media? Background Netflix TV?

In fact, I remember when I could actually shop on Amazon or browse for restaurants on Yelp while trusting the reviews. None of that is possible today.

We have been going through a decade of enshittification.

jauntywundrkind 6 hours ago|||
I really am very thankful for @simonw posting a TikTok from Chris Ashworth, a Baltimore theater software developer, who recently picked up LLMs for building a voxel display software controller. And who was just blown away. https://simonwillison.net/2026/Jan/30/a-programming-tool-for...

Simon doesn't touch on my favorite part of Chris's video though, which is Chris citing his friend Jesse Kriss. This stuck out at me so hard, and is so close to what you are talking about:

> The interesting thing about this is that it's not taking away something that was human and making it a robot. We've been forced to talk to computers in computer language. And this is turning that around.

I don't see (as you say) a personality. But I do see the ability to talk. The esoterica is still here underneath, but computer programmers having this lock on the thing that has eaten the world, being the only machine whisperers around, is over. That depth of knowledge is still there and not going away! But notably too, the LLM will help you wade in, help those not of the esoteric priesthood of programmers to dive in & explore.

Levitating 7 hours ago|||
> golden age of computing

I feel like we've reached the worst age of computing. Where our platforms are controlled by power hungry megacorporations and our software is over-engineered garbage.

The same company that develops our browsers and our web standards is also actively destroying the internet with AI scrapers. Hobbyists lost the internet to companies and all software got worse for it.

Our most popular desktop operating system doesn't even have an easy way to package and update software for it.

davidw 5 hours ago|||
Yes, this is where it's at for me. LLMs are cool and I can see them as progress, but I really dislike that they're controlled by huge corporations and cost a significant amount of money to use.
seanmcdirmid 5 hours ago|||
Use local OSS models then? They aren’t as good and you need beefy hardware (either Apple silicon or nvidia GPUs). But they are totally workable, and you avoid your dislikes directly.
davidw 2 hours ago||
"Not as good and costs a lot in hardware" still sounds like I'm at a disadvantage.
seanmcdirmid 1 hour ago||
$3000 is not that much for hardware (like a refurbished MBP Max with decent amount of RAM), and you'd be surprised how much more useful a thing that is slightly worse than the expensive thing is when you don't have anxiety about token usage.
bdangubic 5 hours ago|||
> they're controlled by huge corporations and cost a significant amount of money to use.

is there anything you use that isn't? Like the laptop you work on, the software you use to browse the internet or read email... I've heard comments like yours before and I'm not sure I understand them given everything else - why does this matter for LLMs and not the phone you use, etc.?

Gud 4 hours ago||
I’ve used FreeBSD since I was 15 years old - Linux before that.

My computer was never controlled by any corporation, until now.

davidw 2 hours ago||
Yeah I've always run Linux on my computers for the past 30 years. I'm pretty used to being in control.
bdangubic 1 hour ago||
what phone do you use?
sosomoxie 7 hours ago||||
Dystopian cyberpunk was always part of the fantasy. Yes, scale has enabled terrible things.

There are more alternatives than ever though. People are still making C64 games today, cheap chips are everywhere. Documentation is abundant... When you layer in AI, it takes away labor costs, meaning that you don't need to make economically viable things, you can make fun things.

I have at least a dozen projects going now that I would have never had time or energy for. Any itch, no matter how geeky and idiosyncratic, is getting scratched by AI.

AndrewKemendo 7 hours ago|||
It’s never been easier for you to make a competitor

So what is stopping you other than yourself?

linguae 7 hours ago|||
I’m not the OP, but my answer is that there’s a big difference between building products and building businesses.

I’ve been programming since 1998 when I was in elementary school. I have the technical skills to write almost anything I want, from productivity applications to operating systems and compilers. The vast availability of free, open source software tools helps a lot, and despite this year’s RAM and SSD prices, hardware is far more capable today at comparatively lower prices than a decade ago and especially when I started programming in 1998. My desktop computer is more capable than Google’s original cluster from 1998.

However, building businesses that can compete against Big Tech is an entirely different matter. Competing against Big Tech means fighting moats, network effects, and intellectual property laws. I can build an awesome mobile app, but when it’s time for me to distribute it, I have to deal with app stores unless I build for a niche platform.

Yes, I agree that it’s never been easier to build competing products due to the tools we have today. However, Big Tech is even bigger today than it was in the past.

bunderbunder 6 hours ago||
Yes. I have seen the better product lose out to network effects far too many times to believe that a real mass market competitor can happen nowadays.

Look at how even the Posix ecosystem - once a vibrant cluster of a dozen different commercial and open source operating systems built around a shared open standard - has more or less collapsed into an ironclad monopoly because LXC became a killer app in every sense of the term. It’s even starting to encroach on the last standing non-POSIX operating system, Windows, which now needs the ability to run Linux in a tightly integrated virtual machine to be viable for many commercial uses.

icedchai 5 hours ago||
Oracle Solaris and IBM AIX are still going. Outside of enterprises that are die hard Sun/Oracle or IBM shops, I haven't seen a job requiring either in decades. I used to work with both and don't miss them in the least.
tock 7 hours ago|||
Billions of dollars?
AnthonyMouse 7 hours ago||
You don't need billions of dollars to write an app. You need billions of dollars to create an independent platform that doesn't give the incumbent a veto over your app if you're trying to compete with them. And that's the problem.
plagiarist 7 hours ago|||
I didn't imagine I would be sending all my source code directly to a corporation for access to an irritatingly chipper personality that is confidently incorrect the way these things are.

There have been wild technological developments but we've lost privacy and autonomy across basically all devices (excepting the people who deliberately choose to forego the most capable devices, and even then there are firmware blobs). We've got the facial recognition and tracking so many sci-fi dystopias have warned us to avoid.

I'm having an easier time accomplishing more difficult technological tasks. But I lament what we have come to. I don't think we are in the Star Trek future and I imagined doing more drugs in a Neuromancer future. It's like a Snow Crash / 1984 corporate government collab out here, it kinda sucks.

jmclnx 7 hours ago|||
I retired a few years ago, so I have no idea what AI programming is.

But I mourned when CRTs came out; I had just started programming. But I quickly learned CRTs were far better.

I mourned when we moved to GUIs, I never liked the move and still do not like dealing with GUIs, but I got used to it.

Went through all kinds of programming methods, too many to remember, but those were easy to ignore and workaround. I view this new AI thing in a similar way. I expect it will blow over and a new bright shiny programming methodology will become a thing to stress over. In the long run, I doubt anything will really change.

querez 7 hours ago|||
I think you're underestimating what AI can do in the coding space. It is an extreme paradigm shift. It's not like "we wrote C, but now we switch to C++, so now we think in objects and templates". It's closer to the shift from assembly to a higher level language. Your goal is still the same. But suddenly you're working in a completely newer level of abstraction where a lot of the manual work that used to be your main concern is suddenly automated away.

If you've never tried Claude Code, give it a try. It's very easy to get into. And you'll soon see how powerful it is.

imiric 6 hours ago||
> But suddenly you're working in a completely newer level of abstraction where a lot of the manual work that used to be your main concern is suddenly automated away.

It's remarkable that people who think like this don't have the foresight to see that this technology is not a higher level of abstraction, but a replacement of human intellect. You may be working with it today, but whatever you're doing will eventually be done better by the same technology. This is just a transition period.

Assuming, of course, that the people producing these tools can actually deliver what they're selling, which is very much uncertain. It doesn't change their end goal, however. Nor the fact that working with this new "abstraction" is the most mind numbing activity a person can do.

Revanche1367 5 hours ago||
I agree with this. At a higher level of abstraction, you’re still doing the fundamental problem solving. Whether it’s low-level machine language or high-level Java, C++ or even Python, the fundamental algorithm design is still entirely done by the programmer. With LLMs, that’s only true if the user directs how each line, or at least each function, is written; often you can just describe the problem and the model solves it most of the way, if not entirely. Only for really long and complex tasks do the better models require hand-holding, and they are improving on that front rapidly.

That’s not a higher level of abstraction, it’s having someone do the work for you while doing less and less of the thinking as well. Someone might resist that urge and consistently guide the model closely, but that’s probably not what most SWEs who use these models are doing, and the ease of using these models plus our natural reluctance to take on mental strain will likely ensure that eventually everyone lets LLMs do most or all of the thinking for them. If things really go in that direction and spread, I foresee a collective dumbing down of the general population.

apitman 7 hours ago|||
OT but I see your account was created in 2015, so I'm assuming very late in your career. Curious what brought you to HN at that time and not before?
jmclnx 5 hours ago||
I did not know it existed before 2015 :)
AndrewKemendo 7 hours ago|||
Same.

I was born in 84 and have been doing software since 97

there’s never been an easier, better, or more accessible time to make literally anything - by far.

Also if you prefer to code by hand literally nobody is stopping you AND even that is easier.

Cause if you wanted to code console games in the 90s, you literally couldn’t without a $100k specialized dev machine.

It’s not even close.

This “I’m a victim because my software engineering hobby isn’t profitable anymore” take is honestly baffling.

zeroonetwothree 7 hours ago|||
I'm not going to code by hand if it's 4x slower than having Claude do it. Yes, I can do that, but it just feels bad.

The analogy I like is it's like driving vs. walking. We were healthier when we walked everywhere, but it's very hard to quit driving and go back even if it's going to be better for you.

NeutralCrane 5 hours ago|||
I actually like the analogy but for the opposite reason. Cars have become the most efficient way to travel for most industrial purposes. And yet enormous numbers of people still walk, run, ride bikes, or even horses, often for reasons entirely separate from financial gain.
AndrewKemendo 7 hours ago|||
I walk all the time

During the summer I’ll walk 30-50 miles a week

However I’m not going to walk to work ever, and I’m damn sure not going to walk in the rain or snow if I can avoid it

chasd00 5 hours ago|||
it's an exciting time, things are changing, and changing beyond "here's my new javascript framework". It's definitely an industry-shakeup kind of deal and no one knows what lies 6 months, 1 year, 5 years from now. It makes me anxious seeing as I have a wife + 2 kids to care for and my income is tied to this industry, but it's exciting too.
AndrewKemendo 5 hours ago||
Well you need to learn to adapt quickly if you have that much infrastructure to maintain
fupaskys 3 hours ago|||
[dead]
IhateAI 7 hours ago||
[flagged]
sosomoxie 7 hours ago|||
I'm actually extremely good at programming. My point is I love computers and computing. You can use technology to achieve amazing things (even having fun). Now I can do much more of that than when I was limited to what I can personally code. In the end, it's what computers can do that's amazing, beautiful, terrifying... That thrill and to be on the bleeding edge is always what I was after.
samiv 6 hours ago|||
The downside is that whatever you (Claude) can do, so can anyone else.

So you're welcome to make the 100000000th copy of the same thing that nobody cares about anymore.

sosomoxie 2 hours ago||
It's so easy to build things that I don't need anyone to care about it; I just need the computer to do what I want it to do.
IhateAI 6 hours ago|||
[flagged]
tines 6 hours ago||
Thank you. I don't understand how people don't see that this is the universe's most perfect gift to corporations, and what a disaster it is for labor. There won't be a middle class. Future generations will be intellectual invalids. Baffling to see people celebrating.
blibble 5 hours ago|||
it is a very, very strange thing to witness

even if you can be a prompt engineer (or whatever it's called this week) today

well, with the feedback you're providing: you're training it to do that too

you are LITERALLY training the newly hired outsourced personnel to do your job

but this time you won't be able to get a job anywhere else, because your fellow class traitors are doing exactly the same thing at every other company in the world

samiv 6 hours ago|||
They are the useful idiots buying into the hype, thinking that by some magic they get to keep their jobs and their incomes.

This thing is going to erase careers and render skill sets and knowledge cultivated over decades worthless.

Anyone can prompt the same fucking shit now and call it a day.

freigeist79 6 hours ago|||
If you were confident in your own skills, you wouldn’t need to invent a whole backstory just to discredit someone.
Nextgrid 7 hours ago||
LLMs are only a threat if you see your job as a code monkey. In that case you're likely already obsoleted by outsourced staff who can do your job much cheaper.

If you see your job as a "thinking about what code to write (or not)" monkey, then you're safe. I expect most seniors and above to be in this position, and LLMs are absolutely not replacing you here - they can augment you in certain situations.

The perks of a senior is also knowing when not to use an LLM and how they can fail; at this point I feel like I have a pretty good idea of what is safe to outsource to an LLM and what to keep for a human. Offloading the LLM-safe stuff frees up your time to focus on the LLM-unsafe stuff (or just chill and enjoy the free time).

zeroonetwothree 7 hours ago||
I see my job as having many aspects. One of those aspects is coding. It is the aspect that gives me the most joy even if it's not the one I spend the most time on. And if you take that away then the remaining part of the job is just not very appealing anymore.

It used to be I didn't mind going through all the meetings, design discussions, debates with PMs, and such because I got to actually code something cool in the end. Now I get to... prompt the AI to code something cool. And that just doesn't feel very satisfying. It's the same reason I didn't want to be a "lead" or "manager", I want to actually be the one doing the thing.

Nextgrid 7 hours ago||
You won't be prompting AI for the fun stuff (unless laying out boring boilerplate is what you consider "fun"). You'll still be writing the fun part - but you will be able to prompt beforehand to get all the boilerplate in place.
archagon 3 hours ago||
If you’re writing that much boilerplate as part of your day to day work, I daresay you’re Doing Coding Wrong. (Virtue number one of programming: laziness. https://thethreevirtues.com)

Any drudgework you repeat two or three times should be encapsulated or scripted away, deterministically.
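The "scripted away, deterministically" point can be made concrete with a tiny generator — a sketch only, with every name (the repository stub, the template shape) invented for illustration:

```python
# A minimal sketch of scripting boilerplate away: a deterministic
# generator that stamps out a repeated stub from one template.
# All names here are made up for illustration, not from any real project.
from string import Template

STUB = Template('''class ${name}Repository:
    """CRUD access for ${name} records."""

    def __init__(self, db):
        self.db = db

    def get(self, item_id):
        return self.db.fetch("${table}", item_id)
''')

def render_stub(name: str, table: str) -> str:
    """Render one stub; the same input always yields the same output."""
    return STUB.substitute(name=name, table=table)

if __name__ == "__main__":
    print(render_stub("User", "users"))
```

The point of doing it this way rather than via a prompt: the output is reproducible, reviewable once, and never drifts between runs.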

AstroBen 6 hours ago|||
There are many tens (hundreds?) of billions of dollars being poured into the smartest minds in the world to push this thing forward

I'm not so confident that it'll only be code monkeys for too long

Nextgrid 6 hours ago|||
Until they can magically increase context length to such a size that can conveniently fit the whole codebase, we're safe.

It seems like the billions so far mostly go to talk of LLMs replacing every office worker, rather than any action to that effect. LLMs still have major (and dangerous) limitations that make this unlikely.

esafak 5 hours ago||
Models do not need to hold the whole code base in memory, and neither do you. You both search for what you need. Models can already memorize more than you!
Jensson 5 hours ago|||
> Models do not need to hold the whole code base in memory, and neither do you

Humans rewire their minds to optimize for the codebase; that is why new programmers take a while to get up to speed on it. LLMs don't do that, and until they do they need the entire thing in context.

And the reason we can't do that today is that there isn't enough data in a single codebase to train an LLM to be smart about it, so first we need to solve the problem that LLMs need billions of examples to do a good job. That isn't on the horizon, so we are probably safe for a while.

esafak 5 hours ago||
Getting up to speed is a human problem. Computers are so fast they can 'get up to speed' from scratch for every session, and we help them with AGENTS files and newer things like memories; e.g., https://code.claude.com/docs/en/memory

It is not perfect yet but the tooling here is improving. I do not see a ceiling here. LSPs + memory solve this problem. I run into issues but this is not a big one for me.
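For concreteness, the memory/AGENTS files being described are just plain text files checked into the repo — a sketch, with all paths and rules invented for illustration:

```markdown
# AGENTS.md — project map for coding agents (illustrative example)

## Where things live
- HTTP handlers: src/api/
- Shared validation helpers: src/lib/validate.ts
- DB migrations: migrations/ (never edit ones already applied)

## Conventions
- Reuse helpers in src/lib/ before writing new ones.
- Run the test suite before declaring a task done.
```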

Nextgrid 5 hours ago|||
I’ll believe it when coding agents can actually make concise & reusable code instead of reimplementing 10 slightly-different versions of the same basic thing on every run (this is not a rant, I would love for agents to stop doing that, and I know how to make them - with proper AGENTS.md that serves as a table of contents for where stuff is - but my point is that as a human I don’t need this and yet they still do for now).
Revanche1367 5 hours ago||
In my experience they can definitely write concise and reusable code. You just need to say to them “write concise and reusable code.” Works well for Codex, Claude, etc.
Nextgrid 5 hours ago||
Writing reusable code is of no use if the next iteration doesn’t know where it is and rewrites the same (reusable) code again.
munksbeer 4 hours ago||
I guide the AI. If I see it produce stuff that I think can be done better, I either just do it myself or point it in the right direction.

It definitely doesn't do a good job of spotting areas ripe for abstraction, but that is our job. This thing does the boring parts, and I get to use my creativity thinking about how to make the code more elegant, which is the part I love.

As far as I can tell, what's not to love about that?

Nextgrid 4 hours ago||
If you’re repeatedly prompting, I will defer to my usual retort when it comes to LLM coding: programming is about translating unclear requirements in a verbose (English) language into a terse (programming) language. It’s generally much faster for me to write the terse language directly than play a game of telephone with an intermediary in the verbose language for it to (maybe) translate my intentions into the terse language.

In your example, you mention that you prompt the AI and if it outputs sub-par results you rewrite it yourself. That’s my point: over time, you learn what an LLM is good at and what it isn’t, and just don’t bother with the LLM for the stuff it’s not good at. Thing is, as a senior engineer, most of the stuff you do shouldn’t be stuff that an LLM is good at to begin with. That’s not the LLM replacing you, that’s the LLM augmenting you.

Enjoy your sensible use of LLMs! But LLMs are not the silver bullet the billion dollars of investment desperately want us to believe.

AstroBen 4 hours ago||
> programming is about translating unclear requirements in a verbose (English) language into a terse (programming) language

Why are we uniquely capable of doing that, but an LLM isn't? In plan mode I've been seeing them ask for clarifications and gather further requirements

Important business context can be provided to them, also

ozozozd 2 hours ago|||
We are uniquely capable of doing that because we invented that :) It’s a self-serving definition, a job description.

This isn’t an argument against LLMs capability. But the burden of proof is on the LLMs’ side.

AstroBen 2 hours ago||
True. That capability might be reserved for AGI. The current implementation does feel like a party trick and I don't enjoy working with it
Nextgrid 3 hours ago|||
An LLM isn’t (yet?) capable of remembering a long-term representation of the codebase. Neither is it capable of remembering a long-term representation of the business domain. AGENTS.md can help somewhat but even those still need to be maintained by a human.

But don’t take it from me - go compete with me! Can you do my job (which is 90% talking to people to flesh out their unclear business requirements, and only 10% actually writing code)? It so, go right ahead! But since the phone has yet to stop ringing, I assume LLMs are nowhere there yet. Btw, I’m helping people who already use LLM-assisted programming, and reach out to me because they’ve reached their limitations and need an actual human to sanity-check.

philipwhiuk 5 hours ago|||
> the smartest minds in the world

Dunning–Kruger is everywhere in the AI grift: people who don't know a field deploying some AI bot that solves the easy 10% of the problem so it looks good on the surface, assuming that just throwing money (which mostly just buys hardware) will solve the rest.

They aren't "the smartest minds in the world". They are slick salesmen.

ozozozd 2 hours ago||
The other day someone referred to Claude Code as “the most complex terminal app” they’ve seen.

Meanwhile folks are rendering videos in the terminal.

notnullorvoid 5 hours ago|||
Agreed. Programming languages are not ambiguous. Human language is very ambiguous, so if I'm writing something with a moderate level of complexity, it's going to take longer to describe what I want to the AI vs writing it myself. Reviewing what an AI writes also takes much longer than reviewing my own code.

AI is getting better at picking up some important context from other code or documentation in a project, but it's still miles away from what it needs to be, and the needed context isn't always present.

JeremyNT 3 hours ago|||
I see what these can do and I'm already thinking, why would I ever hire a junior developer? I can fire up opencode and tell it to work multiple issues at once myself.

The bottleneck becomes how fast you can write the spec or figure out what the product should actually be, not how quickly you can implement it.

So the future of our profession looks grim indeed. There will be far fewer of us employed.

I also miss writing code. It was fun. Wrangling the robots is interesting in its own way, but it's not the same. Something has been lost.

Nextgrid 3 hours ago||
You hire the junior developer because you can get them to learn your codebase and business domain at a discount, and then reap their productivity as they turn senior. You don’t get that with an LLM since it only operates on whatever is in its context.

(If you prefer to hire seniors that’s fine too - my rates are triple that of a junior and you’re paying full price for the time it takes me learning your codebase, and from experience it takes me at least 3 months to reach full productivity.)

jauntywundrkind 4 hours ago||
Yes. And I'm excited as hell.

But I also have no idea how people are going to think about what code to write when they don't write code. Maybe this is all fine, is ok, but it does make me quite nervous!

Nextgrid 4 hours ago||
That is definitely a problem, but I would say it’s a problem of hiring, and of the billions of dollars’ worth of potential market cap resting on performative bullshit that encourages companies not to hire juniors - to send a signal to capture some of those billions, regardless of the actual impact on productivity.

LLMs benefit juniors, they do not replace them. Juniors can learn from LLMs just fine and will actually be more productive with them.

When I was a junior my “LLM” was StackOverflow and the senior guy next to me (who no doubt was tired of my antics), but I would’ve loved to have an actual LLM - it would’ve handled all my stupid questions just fine and freed up senior time for the more architectural questions or those where I wasn’t convinced by the LLM response. Also, at least in my case, I learnt a lot more from reading existing production code than writing it - LLMs don’t change anything there.

iambateman 7 hours ago||
I do not mourn.

For my whole life I’ve been trying to make things—beautiful elegant things.

When I was a child, I found a cracked version of Photoshop and made images which seemed like magic.

When I was in college, I learned to make websites through careful, painstaking effort.

When I was a young professional, I used those skills and others to make websites for hospitals and summer camps and conferences.

Then I learned software development and practiced the slow, methodical process of writing and debugging software.

Now, I get to make beautiful things by speaking, guiding, and directing a system which is capable of handling the drudgery while I think about how to make the system wonderful and functional and beautiful.

It was, for me, never about the code. It was always about making something useful for myself and others. And that has never been easier.

eranation 7 hours ago||
I like coding, I really do. But like you, I like building things more than I like the way I build them. I do not find myself missing writing code by hand as much.

I do find that the developers who focused on "build the right things" mourn less than those who focused on "build things right".

But I do worry. The main question is this - will there be a day when AI knows "the right things to build" and has the "agency" (or the illusion of it) to do it better than an AI+human (assuming AI gets faster at the "build things right" phase, which it is not yet)?

My main hope is this - AI has been able to beat humans at chess for a while now, yet we still play chess, people still earn money playing chess and teaching chess, chess players are still celebrated, and YouTube influencers still get monetized for analyzing celebrity chess players' games, even though the top human player would likely lose to a Stockfish engine running on my iPhone. So maybe there is hope.

bayarearefugee 6 hours ago|||
> will there be a day that AI will know what are "the right things to build" and have the "agency" (or illusion of) to do it better than an AI+human (assuming AI will get faster to the "build things right" phase, which is not there yet)

Of course, and if LLMs keep improving at current rates it will happen much faster than people think.

Arguably you don't need junior software engineers anymore. When you also don't need senior software engineers anymore it isn't that much of a jump to not needing project managers, managers in general or even software companies at all anymore.

Most people, in order to protect their own ego, will assume *their* job is safe until the job one rung down from them disappears and then the justified worrying will begin.

People on the "right things to build" track love to point out how bad people are at describing requirements, so assume their job as a subject matter expert and/or customer-facing liaison will be safe, but does it matter how bad people are at describing requirements if iteration is lightning fast with the human element removed?

Yes, maybe someone who needs software and who isn't historically some sort of software designer is going to have to prompt the LLM 250 times to reach what they really want, but that'll eventually still be faster than involving any humans in a single meeting or phone call. And a lot of people just won't really need software as we currently think about it at all, they'll just be passing one-off tasks to the AI.

The real question is what happens when the labor market for non-physical work completely implodes as AI eats it all. Based on current trends I'm going to predict in terms of economics and politics we handle it as poorly as possible leading to violent revolution and possible societal collapse, but I'd love to be wrong.

RivieraKid 7 hours ago||||
> I do find that the developers who focused on "build the right things" mourn less than those who focused on "build things right".

I've always been strongly in the first category, but... the issue is that 10x more people will be able to build the right things. And if I build the right thing, it will be easy to copy. The market will get crowded, so distribution will become even harder than it is today. Success will be determined by personal brand, social media presence, social connections.

epicureanideal 6 hours ago||
> Success will be determined by personal brand, social media presence, social connections.

Always has been. (Meme)

dynamite-ready 6 hours ago||||
For me, photography is the metaphor - https://raskie.com/post/we-have-ai-at-home - We've had the technology to produce a perfect 2D likeness of a subject for close to two centuries now, and people are still painting.

Video didn't kill the radio star either. In fact the radio star has become more popular than ever in this, the era of the podcast.

bayarearefugee 6 hours ago||
While what you're saying is true, I think it is important to recognize that painting in a way that generates a livable income is mostly a marketing gig.

Likewise, being a podcaster, or "influencer" in general, is all about charisma and marketing.

So with value destruction for knowledge workers (and perhaps physical workers too once you factor in robotics) we may in fact be moving into a real "attention economy" where all value is related to being a charismatic marketer, which will be good for some people for a while, terrible for the majority, but even for the winners it seems like a limited reprieve. Historically speaking charismatic marketers can only really exist through the patronage of people who mostly aren't themselves charismatic marketers. Without patrons (who have disposable income to share) the charismatic marketers are eventually just as fucked as everyone else.

rubenflamshep 6 hours ago||||
> will there be a day that AI will know what are "the right things to build" and have the "agency" (or illusion of) to do it better than an AI+human

I share this sentiment. It's really cool that these systems can do 80% of the work. But given what this 80% entails, I don't see a moat around that remaining 20%.

eranation 6 hours ago||
I can't explain it in a way that will make sense or base it on any data, but I do see that moat all the time. For example, Microsoft / GitHub somehow took a 3-year lead (with Copilot), lost it to Cursor in months, and are still catching up.

Microsoft / GitHub have no real limitation preventing them from doing better/faster; maybe it's the big-company mentality, moving slower, fear of taking risks when you have a lot to lose, or the fact that the personal incentive for a product manager at GitHub is much, much lower than that of a co-founder of a seed-stage startup. Copilot was a microscopic line item for Microsoft as a whole, and probably marginal for GitHub too. But for Cursor, this was everything.

This is why we have innovation: if mega-corps didn't promote people to their level of incompetence, if bureaucracy and politics didn't ruin every good thing, if private equity didn't bleed every beloved product to the last penny, we would have no chance for any innovation or entrepreneurship, because these companies have practically unlimited resources.

So my only conclusion from this is - the moat is sometimes just the will to do better, to dare to say, I don't care if someone has a billion dollars to compete with me, I'll still do better.

In other words, don't underestimate the power of big companies to make colossal mistakes and build crappy products. My only worry is that AI would not make the same mistakes, and we'll basically have a handful of companies in the world (the makers of models, owners of tokens, e.g. OpenAI, Anthropic, Google, Amazon, Meta, xAI). If AI-led product teams can avoid the modern corporations' habit of ruining everything good they get their hands on, then maybe software-related entrepreneurship will be dead.

notnullorvoid 6 hours ago||||
> The main question is this - will there be a day that AI will know what are "the right things to build"

What makes you think AI isn't already at the same level of quality or higher for "build the right things" as it is for "building things right"?

blargthorwars 6 hours ago||||
Computers are better at chess. Humans invented chess and enjoy it.

I think humans have the advantage.

godelski 6 hours ago|||

  >  "build the right things" [vs] "build things right"
I think this (frequent) comparison is incorrect. There are times when quality doesn't matter and times that it does. Without that context these discussions are meaningless.

If I build my own table no one really gives a shit about the quality besides me and maybe my friends judging me.

But if I sell it, well then people certainly care[0] and they have every right to.

If I build my own deck at my house people do also care and there's a reason I need to get permits for this, because the danger it can cause to others. It's not a crazy thing to get your deck inspected and that's really all there is to it.

So I don't get these conversations because people are just talking past one another. Look, no one gives a fuck if you poorly vibe code your personal website, or at least it is gonna be the same level as building your own table. But if Ikea starts shipping tables with missing legs (even if it is just 1%) then I sure give a fuck and all the customers have a right to be upset.

I really think a major part of this concern with vibe coding is about something bigger. It is about slop in general. In the software industry we've been getting sloppier and sloppier, and LLMs significantly amplify that. It really doesn't matter if you can vibe code something with no mistakes; what matters is what the businesses do. Let's be honest, they're rushing and don't care about quality because they have markets cornered and consumers are unable to accurately evaluate products prior to purchase. Those are the textbook conditions for a lemon market. I mean, the companies outsource tech support, so you call and someone picks up whose accent makes you suspicious of their real name being "Steve". After all, it is the fourth "Steve" you've talked to as you get passed around from support person to support person. The same companies contract out coders from poor countries, and you find random comments in another language. That's the way things have been going. More vaporware. More half-baked products.

So yeah, when you have no cake, a half-baked cake is probably better than nothing. At home it also doesn't matter if you're eating a half-baked cake or one that competes with the best bakers in the world. But for everyday people who can't bake their own cakes, what do they do? All they see is a box with a cake in it: one is $1, another is $10, and another is $100. They look the same, but they can't know until they take a bite. You try enough of the $1 cakes, and by the time you give up, the $10 cakes are all gone. By the time you get so frustrated you'll buy the $100 cake, they're gone too.

I don't dislike vibe coding because it is "building things the wrong way" or any of that pretentious notion. I, and I believe most people with a similar opinion, care because "the right things" aren't being built. Most people don't care how things were built, but they sure do care about the result. Really people only start caring about how the sausage is made when they find out that something distasteful is being served and concealed from them. It's why everyone is saying "slop".

So when people make this false dichotomy, it just feels like they aren't listening to what's actually being said.

[0] Mind you, it is much easier for an inexperienced person to judge the quality of a table than software. You don't need to be a carpenter to know a table's leg is missing or that it is wobbly but that doesn't always hold true for more sophisticated things like software or even cars. If you haven't guessed already, I'm referencing lemon markets: https://en.wikipedia.org/wiki/The_Market_for_Lemons

eranation 5 hours ago||
Of course. I mean, my view is that it needs to be "build the right things right", vs "build things right and then discover if they are the right things". It's a stab at premature optimisation, focusing on code elegance more than delivering working software. Code simplicity, good design, scalability, are super important for maintainability, even in the age of AI (maybe even more so).

But considering that AI will more and more "build things right" by default, it's up to us humans to decide what are the "right things to build".

Once AI knows what are the "right things to build" better than humans, this is AGI in my book, and also the end of classical capitalism as we know it. Yes, there will still be room for "human generated" market, like we have today (photography didn't kill painting, but it made it a much less of a main employment option)

In a way, AI is the great equalizer. In the past the strongest men prevailed; then, when muscle was no longer the main means of asserting force, it was intellect; now it's just sheer want. You want to do something, now you can. You have no excuses; you just need to believe it's possible, and do it.

As someone else said, agency is eating the world. For now.

godelski 4 hours ago||

  >  it needs to be "build the right things right", vs "build things right and then discover if they are the right things"
I still think this is a bad comparison and I hoped my prior comment would handle this. Frankly, you're always going to end up in the second situation[0] simply because of 2 hard truths. 1) you're not omniscient and 2) even if you were, the environment isn't static.

  > But considering that AI will more and more "build things right" by default
And this is something I don't believe. I say a lot more here[1] but you can skip my entire comment and just read what Dijkstra has to say himself. I dislike that we often pigeonhole this LLM coding conversation into one about a deterministic vs probabilistic language. Really the reason I'm not in favor of LLMs is because I'm not in favor of natural language programming[2]. The reason I'm not in favor of natural language programming has nothing to do with its probabilistic nature and everything to do with its lack of precision[3].

I'm with Dijkstra because, like him, I believe we invented symbolic formalism for a reason. Like him, I believe that abstraction is incredibly useful and powerful, but it is about the right abstraction for the job.

[0] https://news.ycombinator.com/item?id=46911268

[1] https://news.ycombinator.com/item?id=46928421

[2] At the end of the day, that's what they are. Even if they produce code you're still treating it as a transpiler: turning natural language into code.

[3] Okay, technically it does but that's because probability has to do with this[4] and I'm trying to communicate better and most people aren't going to connect the dots (pun intended) between function mapping and probabilities. The lack of precision is inherently representable through the language of probability but most people aren't familiar with terms like "image" and "pre-image" nor "push-forward" and "pull-back". The pedantic nature of this note is precisely illustrative of my point.

[4] https://www.mathsisfun.com/sets/injective-surjective-bijecti...

fastasucan 7 hours ago|||
>It was, for me, never about the code.

Then it wasn't your craft.

stevejb 7 hours ago|||
Isn't this like saying that if better woodworking tools come out, and you like woodworking, that woodworking somehow 'isn't your craft'. They said that their craft is about making things.

There are woodworkers on YouTube who use CNC, some who use the best Festool stuff but nothing that moves on its own, and some who only use handtools. Where is the line at which woodworking is not their craft?

terminalbraid 7 hours ago|||
The better analogy is you're now a shop manager or even just QA. You don't need to touch, look at, or think about the production process past asking for something and seeing if the final result fits the bill.

You get something that looks like a cabinet because you asked for a cabinet. I don't consider that "woodworking craft", power tools or otherwise.

Xenoamorphous 6 hours ago|||
If it looks like a cabinet, works as a cabinet, and doesn't fall apart, for all intents and purposes it's a cabinet. 99% of people out there won't care whether a "craftsman" or a robot built it. Just like most people buy furniture at Ikea.
Revanche1367 3 hours ago||
The difference is that the person who was a woodworker is no longer needed. Why can't the customer just walk up to a kiosk and ask the machine to start building? The machine, or another one specialized for QA, can then assess whether it meets all the technical requirements the customer doesn't necessarily understand. This is what most people here are worried about: eventually the professional human being will no longer be needed by businesses, which will be able to produce everything without either the customer or the business owner needing the specialized knowledge they previously had to acquire by hiring professionals.
sally_glance 6 hours ago||||
I'm pretty sure at least the better woodworking shop managers and QA people all have experience with woodworking and probably would also consider this their craft if asked.
jstanley 7 hours ago||||
The only confusion is in the use of the term "woodworking".

For the power tool user, "woodworking with hand tools" isn't their craft.

For the CNC user, "woodworking with manual machines" isn't their craft.

ori_b 6 hours ago||||
The analogy you're making is that hiring a taskrabbit to assemble Ikea furniture is woodworking.

There's a market for Ikea. It's put woodworkers out of business, effectively. The only woodworkers that make reasonable wages from their craft are influencers. Their money comes from YouTube ads.

There's no shame in just wanting things without going to the effort of making them.

datsci_est_2015 2 hours ago||
Love this extension of the analogy. Especially because, like a woodworker inspecting IKEA furniture assembled by a taskrabbit, the craftsmanship of a finished product becomes less and less impressive the longer you inspect it.
AstroBen 7 hours ago||||
It's feeling much closer to hiring a woodworker to make you something, not woodworking tools
chongli 6 hours ago||||
I think a better comparison is painting and photography. Prior to the invention of photography, painting portraits of individuals and families was a real profession. Today it’s practically unheard of outside of heads of state and the like. Sure, there are plenty of people who could afford to commission a painted portrait but few do when a quick session in a photographer’s studio is so much cheaper and more convenient.
gugagore 7 hours ago||||
Woodworking is, like, the quintessential craft. I think it is very useful to bring it in when discussing "craft"!

I am not myself a woodworker, however I have understood that part of what makes it "crafty" is that the woodworker reads grain, adjusts cuts, and accepts that each board is different.

We can try to contrast that to whatever Ikea does with wood and mass production of furniture. I would bet that variation in materials is "noise" that the mass production process is made to "reject" (be insensitive to / be robust to).

But could we imagine an automated woodworking system that takes material variation, like wood grain, into account, not in an aggregate sense (like what I'm painting Ikea as doing), but in an individual sense? That system would be making judgements that are woodworker-like.

The craft lives on. The system is informed by the judgement of the woodworker, and the craftsperson enters an apprenticeship role for the automation... perhaps...

Until you can do RL on the outcome of the furniture. But you still need craft in designing the reward function.

Perhaps.

simonwsucks 7 hours ago|||
[dead]
Ronsenshi 7 hours ago||||
Yeah, seems like too many went into this field for money or status, not because they like the process. Which is not an issue by itself, but now these people talk about how their AI assistant of choice made them some custom tool in two hours that would have taken them three weeks. And it's getting exhausting.
logicprog 7 hours ago|||
That is an insane assumption to make based on the grandparent's post. What part of them talking about how much they care about systems thinking, software architecture, and the usefulness and meaningfulness of software to other people, over the day-to-day drudgery of APIs and bugs and typing in syntax, indicates to you that they only care about money and status? They just care about a different part of the process.
shepherdjerred 7 hours ago||||
I went into this field because I love programming. I didn't even know how well these jobs paid until my junior year of college when I got an internship at AWS. I constantly programmed and read programming texts in my spare time growing up, in college, and after work.

I love AI tools. I can have AI do the boring parts. I can even have it write polished, usable apps in languages that I don't know.

I miss being able to think so much about architecture, best practices, frameworks/languages, how to improve, etc.

twelve40 7 hours ago|||
> many went into this field for money

I went into this field for both! what do i do now, i'm screwed

seanmcdirmid 7 hours ago||||
It is a different kind of code. Just a lot of programmers can't grok it as such.

I guess I started out as a programmer, then went to grad school and learned how to write and communicate my ideas, it has a lot in common with programming, but at a deeper level. Now I’m doing both with AI and it’s a lot of fun. It is just programming at a higher level.

iambateman 6 hours ago||||
I’m going to be thinking about this comment for a while—-and I think you’re basically right.

Almost none of the code I wrote in 2015 is still in use today. Probably some percentage of people can point to code that lasted 20 years or longer, but it can’t be a big percentage. When I think of the work of a craft, I think of doing work which is capable of standing up for a long time. A great builder can make a house that can last for a thousand years and a potter can make a bowl that lasts just as long.

I’ve thought of myself as a craftsman of code for a long time but maybe that was just wrong.

light_hue_1 7 hours ago||||
That's just gatekeeping.

It was and is my craft. I've been doing it since grade 5. Like 30 years now.

Writing tight assembly for robot controllers all the way to AI on MRI machines to security for the DoD and now the biggest AI on the planet.

But my craft was not typing. It's coding.

If you're a typist, you're going to mourn the printer. But if you're a writer, you're going to see how it improves your life.

kelnos 6 hours ago|||
A big component of coding is typing. If you aren't doing the typing then, unless you are dictating code to someone else who mechanically types it out verbatim for you, you are not coding.

I do believe directing an LLM to write code, and then reviewing and refining that code with the LLM, is a skill that has value -- a ton of value! -- but I do not think it is coding.

It's more like super-technical product management, or like a tech lead pair programming with a junior, but in a sort of mentorship way where they direct and nudge the junior and stay as hands-off as possible.

It's not coding, and once that's the sum total of what you do, you are no longer a coder.

You can get defensive and call this gatekeeping, but I think it's just the new reality. There's no shame in admitting that you've moved to a stage of your life where you build software but your role in it isn't as a coder anymore. Just as there's no shame in moving into management, if that's what you enjoy and are effective at it.

(If presenting credentials is important to you, as you've done, I've been doing this since 1989, when I was 8 years old. I've gone down to embedded devices, up through desktop software, up to large distributed systems. Coding is my passion, and has been for most of my life.)

light_hue_1 6 hours ago||
Assembling code doesn't require typing. Linking doesn't require typing.

Even though once upon a time both did.

Claiming that this isn't coding is as absurd as saying that coding is only what you do when you hook up the wires between some vacuum tubes.

The LLM is a very smart compiler. That's all.

Some people want to sit and write assembly. Good for them. But asserting that unless I assemble my own code I'm not a coder is just silly.

recursivegirth 7 hours ago|||
Right, there is a non-zero overlap between the Vim Andys and the AI naysayers.
arduanika 7 hours ago||||
No true programmer is excited for the future.
boxedemp 7 hours ago|||
And no true scotsman puts sugar in his porridge
arduanika 7 hours ago|||
Yes, that was the reference!

Possibly too obscure. I can't tell whether I'm being downvoted by optimists who missed the joke, or by pessimists who got it.

pseudalopex 6 hours ago|||
Most sarcasm worsens discussion. And the comment guidelines say Don't be snarky.[1]

[1] https://news.ycombinator.com/newsguidelines.html

senordevnyc 6 hours ago|||
Haha, I downvoted you from the first category (until I read this comment).
simonwsucks 7 hours ago|||
[dead]
heliumtera 7 hours ago|||
this is so true.

never once in my life have i seen anything get better. except for metal gear solid psx and gears of war

bojan 7 hours ago||
As someone who started with Borland DOS-era IDEs I can tell you that IDEs did get a lot better over the years. I'm still fascinated every day by JetBrains IDEs.
blibble 7 hours ago||
> I'm still fascinated every day by JetBrains IDEs.

have you used them recently?

terrible, is the word I would use

(as a customer since the 2010s)

fullstackchris 6 hours ago|||
so much garbage ego in statements like this. if you really knew about software, you'd recognize there are about a million ways to be successful in this field
Fraterkes 7 hours ago|||
I've seen a hundred ai-generated things, and they are rarely interesting.

Not because the tools are insufficient; it's just that the kind of person who can't even stomach the charmed life of being a programmer will rarely be able to stomach the dull and hard work of actually being creative.

Why should someone be interested in your creations? In what part of your new frictionless life would you have picked up something that sets you apart from a million other vibe-coders?

embedding-shape 7 hours ago|||
> stomach the dull and hard work of actually being creative

This strikes me as the opposite of what I experience. When I say I'm "feeling creative", everything comes easily, at least in the context of programming, making music, doing 3D animation, and some other areas. If it's "dull and hard work", it's because I'm not feeling "creative" at all; when "creative mode" is on in my brain, nothing feels dull or hard. Maybe it works differently for others.

nubg 7 hours ago||||
What sets you apart from millions of manual programmers?
dukeyukey 7 hours ago||||
I've been a professional programmer for 8+ years now. I've stomached that life. I've made things people used and paid for.

If I can do that typing one line at a time, I can do it _way_ faster with AI.

boxedemp 7 hours ago|||
You may be mistaking some AI-assisted dev work for non-AI work, because it doesn't have telltales
kelnos 7 hours ago|||
I love building things too, but for me, the journey is a big part of what brings me joy. Herding an LLM doesn't give me joy like writing code does. And the finished project doesn't feel the same when my involvement is limited to prompting an LLM and reviewing its output.

If I had an LLM generate a piece of artwork for me, I wouldn't call myself an artist, no matter how many hours I spent conversing with the LLM in order to refine the image. So I wouldn't call myself a coder if my process was to get an LLM to write most/all the code for me. Not saying the output of either doesn't have value, but I am absolutely fine gatekeeping in this way: you are not an artist/coder if this is how you build your product. You're an artistic director, a technical product manager, something of that nature.

That said, I never derived joy from every single second of coding; there were and are plenty of parts to it that I find tedious or frustrating. I do appreciate being able to let an LLM loose on some of those parts.

But sparing use is starting to work only for hobby projects. I'm not sure I could get away with taking the time to write most of it manually when LLMs might make coworkers more "productive". Even if I can convince myself my code is still "better" than theirs, that's not what companies value.

bnchrch 7 hours ago|||
Well said. This sums up my own feeling. I joined this craft and love this craft for the simple ability to build beautiful and useful things.

This new world makes me more effective at it.

And this new world doesn’t prevent me from crafting elegant architectures either.

croes 7 hours ago||
Wait 5 years and your skills are down
Ronsenshi 7 hours ago|||
I don't think 5 years is necessary. I think after two years of this agentic orchestration, if you rarely touch code yourself, your skills will degrade to the point where you won't be able to write anything non-trivial without assistance.
SoftTalker 7 hours ago|||
Depends how long you've done it, and how much the landscape has changed since then. I can still hop back into SQL and it all comes back to me though I haven't done it regularly at all for nearly 10 years.

In the web front-end world I'd be pretty much a newbie. I don't know any of the modern frameworks, everything I've used is legacy and obsolete today. I'd ramp up quicker than a new junior because I understand all the concepts of HTTP and how the web works, but I don't know any of the modern tooling.

AstroBen 7 hours ago|||
How much do you think Linus Torvalds has coded over the last decade? Why is he still able to do his job?
Xenoamorphous 5 hours ago|||
https://github.com/torvalds
esafak 5 hours ago|||
His job is reviewing.
dham 7 hours ago||||
The churn infrastructure has gone through over the last 15 years would like a word.

Half the people I work with can't do imperative jQuery interfaces. So what, I guess. I can't code assembly.

croes 7 hours ago||
A programming language is still an additional language with all the benefits of being multilingual.

AI will kill that.

Xenoamorphous 5 hours ago|||
In 5 years coding skills will matter as much as being able to operate an elevator. (sadly)
GeorgeTirebiter 7 hours ago|||
I want to be in your camp, and am trying hard. But the OP's blog entry should at least give us a moment to "respect the dead". That's all he's asking, I think.
rdedev 7 hours ago|||
Adam Neely has a video on GenAI and its impact on the music industry. There is a section in the video about beauty and taste, and it's pretty different from your conclusions. One example I remember: would an AI find beauty in a record-scratch sound?

https://youtu.be/U8dcFhF0Dlk

andyjohnson0 5 hours ago|||
> Now, I get to make beautiful things by speaking, guiding, and directing a system which is capable of handling the drudgery while I think about how to make the system wonderful and functional and beautiful.

For how long do you think this is sustainable? In the sense of you, or me, or all these other people here being able to earn a living. Six months? A couple of years? The time until the next-but-one Claude release drops?

Does everyone have to just keep re-making themselves for whatever the next new paradigm turns out to be? How many times can a person do that? How many times can you do that?

ori_b 6 hours ago|||
> For my whole life I’ve been trying to make things—beautiful elegant things.

Why did you stop? Because, you realize, using LLMs means giving up the process of creating for the immediacy of having. It's paying someone to make things for you.

Things are more convenient if you live the dream of the LLM, and hire a taskrabbit to run your wood shop. But it's not you that's making.

RivieraKid 7 hours ago|||
> For my whole life I’ve been trying to make things—beautiful elegant things.

Me too, but... The ability to code was a filter. With AI, the pool of people who can build beautiful elegant software products expands significantly. Good for the society, bad for me.

RGamma 7 hours ago|||
AI agents seem to be a powerful shortcut past the drudgery. But let's not forget that powerful software rests on substance. My hope is that the substance will increase, after all.
altmanaltman 7 hours ago|||
So when you "learned software development and practiced the slow, methodical process of writing and debugging software", it wasn't about code? I don't get it. Yes, building useful things is the ultimate goal, but code is the medium through which you do it, and I don't understand how that cannot be an important part of the process.

It's like a woodworker saying, "Even though I built all those tables using precise craft and practice, it was NEVER ABOUT THE CRAFT OR PRACTICE! It was about building useful things." Or a surgeon talking about saving lives and doing brain surgery, but "it was never about learning surgery, it was about making people get better!"

I mean sure yeah but also not really.

mikepurvis 7 hours ago||
Not the GP I feel some of that energy. The parts I most enjoy are the interfaces, the abstractions, the state machines, the definitions. The code I enjoy too, and I would be sad to lose all contact with it, but I've really appreciated AI especially for helping me get over the initial hump on things like:

- infrastructure bs, like scaffold me a JS GitHub action that does x and y.

- porting, like take these kernel patches and adjust them from 6.14 to 6.17.

- tools stuff, like here's a workplace shell script that fetches a bunch of tokens for different services, rewrite this from bash to Python.

- fiddly things like dealing with systemd or kubernetes or ansible

- fault analysis, like here's a massive syslog dump or build failure, what's the "real" issue here?

In all these cases I'm very capable of assessing, tweaking, and owning the end result, but having the bot help me with a first draft saves a bunch of drudgery on the front end, which can be especially valuable for the ADHD types where that kind of thing can be a real barrier to getting off the ground.

voidhorse 7 hours ago|||
In my opinion, the beauty of the result is proportional to the level of detailed care put in. Can you get the same level without getting your hands dirty? Sure, maybe, but I doubt a painter or novelist could really produce beautiful work without being intimately familiar with that work. The distance that heavy use of AI tools creates between you and the output does not really lend itself to beauty. Could you do it? Sure, but at that point it's probably more efficient to just do things yourself and have complete, intimate control.

To me, you sound more utilitarian. The philosophy you are presenting is a kind of Ikea philosophy. Utility, mass production, and unique beauty are properties that generally do not cohere, and there's a reason for this. I think the use of LLMs in the production of digital goods is very close to the use of assembly lines in the production of physical goods. No matter how you try, some of the human charm, and thus beauty, will inevitably be lost; the number of goods will increase, but they'll all be barely differentiable, soulless replications of more or less the same shallow ideas repeated ad infinitum.

bloomca 6 hours ago|||
I agree, LLMs definitely sand off a lot of personality, and you can see it in writing the most. At this point I'm sure tons of people are subconsciously trained to lower their trust in anything where they recognize the typical patterns.

With the code, especially interfaces, the results will be similar -- more standardized palettes, predictable things.

To be fair, this converging force has been at work pretty much forever, e.g. radio/TV led to lots of local accents disappearing; our world is heavily globalized.

mversic 6 hours ago|||
only the true artist will survive the advent of LLMs
throwawa14223 5 hours ago|||
But why would someone pay you for that?
libraryofbabel 6 hours ago|||
So many people responding to you with snarky comments or questioning your programming ability. It makes me sad. You shared a personal take (in response to TFA which was also a personal take). There is so much hostility and pessimism directed at engineers who simply say that AI makes them more productive and allows them to accomplish their goals faster.

To the skeptics: by all means, don't use AI if you don't want to; it's your choice, your career, your life. But I am not sure that hitching your identity to hating AI is altogether a good idea. It will make you increasingly bitter as these tools improve further and our industry and the wider world slowly shifts to incorporate them.

Frankly, I consider the mourning of The Craft of Software to be just a little myopic. If there are things to worry about with AI they are bigger things, like widespread shifts in the labor force and economic disruption 10 or 20 years from now, or even the consequences of the current investment bubble popping. And there are bigger potential gains in view as well. I want AI to help us advance the frontiers of science and help us get to cures for more diseases and ameliorate human suffering. If a particular way of working in a particular late-20th and early-21st century profession that I happen to be in goes away but we get to those things, so be it. I enjoy coding. I still do it without AI sometimes. It's a pleasant activity to be good at. But I don't kid myself that my feelings about it are all that important in the grand scheme of things.

Johnny_Bonk 7 hours ago|||
I couldn't agree more.
gamblor956 6 hours ago|||
If AI can do the coding, those of us who aren't programmers don't need you anymore. We can just tell the AI what we want.

Luckily for real programmers, AI's not actually very good at generating quality code. It generates the equivalent of Alibaba code: it lasts for one week and then breaks.

This is going to be the future of programming: low-paid AI clerks to generate the initial software, and then the highly paid programmers who fix all the broken parts.

icedchai 4 hours ago||
Yes. The problem is there is a huge invisible gap between "looks like it works" and "actually works", and everything that entails, like security and scaling beyond a couple users. Non-programmers and inexperienced ones will have trouble with those gaps. Welcome to our slop filled future.
stoneforger 3 hours ago||
It's been here since its inception. The more the market grew and the more widespread computing became, the more ravenous sociopaths with other people's money got involved. This is just another version of Rome burning. This is the terminal acceleration phase. The lunatics are running everything into the ground, and the few sane people left are labeled the same way they always have been. The useful idiots just think they're so much smarter. No one asked whether it is wise; they think they can outrun the treadmill. Just add water and LLM; it's so much fun for the whole family.
croes 7 hours ago|||
But you don’t make.

You order it.

Ronsenshi 7 hours ago|||
Because such people are not sincere either to themselves about who they are or to others. It's really hard for me to take seriously phrases like "I joined this industry to make things, not to write code".

Do painters paint because they just like to see the final picture? Or do they like the process? Yes, painting is an artistic process, not exactly a crafting one. But the point stands.

Woodworkers making nice custom furniture generally enjoy the process.

FeteCommuniste 7 hours ago||||
Right.

It's like learning to cook and regularly making your own meals, then shifting to a "new paradigm" of hiring a personal chef to cook for you. Food's getting made either way, but it's not really the same deal.

crazygringo 7 hours ago||
No, it's more like moving from line cook, to head chef in charge of 30 cooks.

Food's getting made, but you focus on the truly creative part -- the menu, the concept, the customer experience. You're not boiling pasta or cutting chives for the thousandth time. The same way you're now focusing on architecture and design instead of writing your 10,000th list comprehension.

tatjam 7 hours ago||
Except the cooks don't exist anymore as they all have become head chefs (or changed careers) and the food is being cooked by magical cooking black boxes
crazygringo 7 hours ago||
Sure, but the point is you're now doing the most creative and satisfying part. Not the drudgery.

It's not that you've stopped doing anything at all, like the other commenter claimed in their personal chef analogy.

tatjam 7 hours ago||
Would you consider drudgery the in-depth thinking that's required to actually go and write that algorithm, think out all the data ownership relationships, name the variables, think the edge cases for the tests?

For me, the act of sitting down and writing the code is what actually leads to true understanding of the logic, in a similar way to how the only way to understand a mathematical proof is to go through it. Sure, I'm not doing anything useful by showing that the square root of 2 is irrational, but by doing that I gain insights that are otherwise impossible to transfer between two minds.
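(For reference, the proof being alluded to here is the standard contradiction argument:)

```latex
\begin{proof}
Suppose $\sqrt{2} = p/q$ with $p, q$ integers, $q \neq 0$, and $\gcd(p, q) = 1$.
Then $p^2 = 2q^2$, so $p^2$ is even, hence $p$ is even; write $p = 2k$.
Substituting gives $4k^2 = 2q^2$, i.e.\ $q^2 = 2k^2$, so $q$ is also even,
contradicting $\gcd(p, q) = 1$. Hence $\sqrt{2}$ is irrational.
\end{proof}
```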

I believe that coding was one of the few things (among, for example, writing math proofs, or that weird process of crafting something with your hands where the object you are building becomes intimately evident) that get our brains to a higher level of abstraction than normal mammal "survival" thinking. And it makes me very sad to see it thrown out of the window in the name of a productivity that may not even be real.

crazygringo 6 hours ago|||
> Would you consider drudgery the in-depth thinking that's required to actually go and write that algorithm, think out all the data ownership relationships, name the variables, think the edge cases for the tests?

For 99% of the functions I've written in my life? Absolutely drudgery. They're barely algorithms. Just bog-standard data transformation. This is what I love having AI replace.

For the other 1% that actually requires original thought, truly clever optimization, and smart naming to make it literate? Yes, I'll still be doing that by hand, although I'll probably be getting the LLM to help scaffold all the unit tests and check for any subtle bugs or edge cases I may have missed.

The point is, LLMs let me spend more time at the higher level of abstraction that is more productive. It's not taking it away!

tatjam 5 hours ago||
I do agree with this, and in fact I do often use LLMs for these tasks! I guess my message is more intended towards vibe-only coders (and, I guess, the non-technical higher ups drooling at the idea of never having to hire another developer).
icedchai 4 hours ago||
I see junior PM types glowing about being able to lead teams of agents, doing their bidding without putting up a fuss or argument. Short term, developers are in for a world of hurt. Long term, we're going to need a lot more developers to clean this crap up.
stoneforger 3 hours ago||
No one will clean it up; it's a societal problem. The kool-aid is "produce more," like we need another app for X. We are celebrating owning nothing as a liberating act. People hate mental load, yes, so this is the perfect drug. You don't need to think or challenge anything. If the model says it's okay, it's okay. Local models will never be able to democratise this. People will do as they are told, and another generation of consumers will follow. The matrix won't be a prison, it will be a prompt from birth to death. And y'all are clapping 'cause you can have X number of agents running around burning tokens like kids looking at the firecracker in their hand about to blow up, giggling. The world was always mad, and this is proof it will always be mad while people are still around.
icedchai 5 hours ago|||
I think there is room for a hybrid approach. You can delegate most of the "drudgery" to AI, but keep the parts that require creative solutions for yourself. There is undoubtedly a lot of crappy work we have to do as engineers. This is stuff that needs to be done but has also been done many times before.
logicprog 7 hours ago||||
I think unless you're vibe coding, it's pretty clear that they're still making it. Just because you aren't literally typing 100% of the characters that make up the syntax of the programming language you're using doesn't mean you're not making the final product, in any meaningful sense, if you're designing the architecture, the algorithms, the data structures, the state machines, the interfaces, etc., and thinking about how they interact and whether they'll do something that's useful for the people you're making it for.
derektank 7 hours ago|||
The transition is from author to editor/publisher. Both play an important role in bringing something new into the world.
zeroonetwothree 7 hours ago||
It's true, but ask an author and 99% of them will say they don't want to be an editor.
popopopopoopop 7 hours ago|||
[dead]
AllegedAlec 7 hours ago||
While I'm on the fence about LLMs there's something funny about seeing an industry of technologists tear their own hair out about how technology is destroying their jobs. We're the industry of "we'll automate your job away". Why are we so indignant when we do it to ourselves...
zamalek 7 hours ago||
This article isn't really about losing a job. Coding is a passion for some of us. It's similar to artists and diffusion, the only difference being that many people can appreciate human art - but who (outside of us) cares that a human wrote the code?
andai 7 hours ago|||
I love programming, but most of that joy doesn't come from the type of programming I get paid to do. I now have more time and energy for the fun type, and I can go do things that were previously inconceivable!

Last night "I" "made" 3D boids swarm with directional color and perlin noise turbulence. "I" "did" this without knowing how to do the math for any of those things. (My total involvement at the source level was fiddling with the neighbor distance.)

https://jsbin.com/ququzoxete/edit?html,output

Then I turned them into weird proteins

https://jsbin.com/hayominica/edit?html,output

(As a side note, the loss of meaning of "self" and "doing" overlaps weirdly with my meditation practice...)
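(For the curious: the classic boids update behind demos like these — and the neighbor-distance knob mentioned above — can be sketched roughly like this. This is a hypothetical minimal 2D version, not the code from the linked pages; the rule weights and the `neighborDist` default are arbitrary illustration values.)

```javascript
// One step of the classic boids rules: cohesion, alignment, separation.
// neighborDist is the single parameter the comment describes fiddling with.
function stepBoids(boids, neighborDist = 50, dt = 1) {
  return boids.map(b => {
    // neighbors within the interaction radius
    const near = boids.filter(o =>
      o !== b && Math.hypot(o.x - b.x, o.y - b.y) < neighborDist);
    let vx = b.vx, vy = b.vy;
    if (near.length > 0) {
      const n = near.length;
      // cohesion: steer toward the neighbors' center of mass
      vx += 0.01 * (near.reduce((s, o) => s + o.x, 0) / n - b.x);
      vy += 0.01 * (near.reduce((s, o) => s + o.y, 0) / n - b.y);
      // alignment: match the neighbors' average velocity
      vx += 0.05 * (near.reduce((s, o) => s + o.vx, 0) / n - b.vx);
      vy += 0.05 * (near.reduce((s, o) => s + o.vy, 0) / n - b.vy);
      // separation: push away from boids that get too close
      for (const o of near) {
        const d = Math.hypot(o.x - b.x, o.y - b.y);
        if (d > 0 && d < neighborDist / 4) {
          vx += 0.05 * (b.x - o.x) / d;
          vy += 0.05 * (b.y - o.y) / d;
        }
      }
    }
    return { x: b.x + vx * dt, y: b.y + vy * dt, vx, vy };
  });
}
```

Enlarging `neighborDist` makes the flock cohere into fewer, bigger clumps; shrinking it fragments the swarm — which is why it's a satisfying knob to fiddle with.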

mirsadm 7 hours ago||
Yes but did you learn anything?
apitman 7 hours ago||
Obviously that matters, but how much does it matter? Does it matter if you don't learn anything about computer architecture because you only code in JS all day? Very situational.
dolebirchwood 7 hours ago||
There's a subset of people whose identity is grounded in the fact that they put in the hard work to learn things that most people are unable or unwilling to do. It's a badge of honor, and they resent anyone taking "shortcuts" to achieve their level of output. Kind of reminds me of lawyers who get bent out of shape when they lose a case to a pro se party. All those years of law school and studying for the bar exam, only to be bested by someone who got by with copying sample briefs and skimming Westlaw headnotes at a public library. :)
geetee 6 hours ago||
It's not that our identity is grounded in being competent, it's that we're tired of cleaning up messes left by people taking shortcuts.
eslaught 5 hours ago||
It's that, but it's also that the incentives are misaligned.

How many supposed "10x" coders actually produced unreadable code that no one else could maintain? But then the effort to produce that code is lauded while the nightmare maintenance of said code is somehow regarded as unimpressive, despite being massively more difficult?

I worry that we're creating a world where it is becoming easy, even trivial, to be that dysfunctional "10x" coder, and dramatically harder to be the competent maintainer. And the existence of AI tools will reinforce the culture gap rather than reducing it.

stoneforger 3 hours ago||
It's a societal problem; we are just seeing the effects in computing now. People have given up, everything is too much, the sociopaths won, they can do what they want with my body, mind, and soul. Give me convenience or give me death.
janderland 7 hours ago||||
The people outside of us didn’t care about your beautiful code before. Now we can quickly build their boring applications and spend more time building beautiful things for our community’s sake. Yes, there are economic concerns, but as far as “craft” goes, nothing is stopping us from continuing to enjoy it.
terminalbraid 6 hours ago|||
I'd add that part of the craft is enjoying those minutiae, sharing lessons and stories with others. The number of people you can do that with is going to dwindle (and has been dwindling for a long time, thanks to the tech sphere's co-opting of all of it). That's the part I mourn.
zrail 6 hours ago|||
Except that's not really true, because the work expands to fill the time allotted. Now we can build more boring applications with fewer people.
janderland 4 hours ago||
Yes, it is true that companies are always hungry for more. But once again, those same companies never cared about beautiful code. They wanted us to build something that works as quickly as possible. In my experience, the beauty of programming was often enjoyed outside of work for this very reason, and we can still enjoy it outside of work for its own sake.
jhickok 7 hours ago||||
I disagree a bit. Coding can remain an artistic passion for you indefinitely; it's just that your ability to demand that everyone craft each line of code artisanally won't be subsidized by your employer for much longer. There will probably always be some demand for handcrafted code, just a heavily diminished one.
crvdgc 5 hours ago||||
At least for this article it's more about the job, or to be precise, the past where job and passion coincided:

> Ultimately if you have a mortgage and a car payment and a family you love, you’re going to make your decision.

Nothing is preventing the author from continuing to write code by hand and enjoy it. The difference is that people won't necessarily pay for it.

The old way was really incredible (and worth mourning), considering in other industries, how many people can only enjoy what they do outside of work.

SoftTalker 7 hours ago||||
I think this is really it. Being a musician was never a very reliable way to earn a living, but it was a passion. A genuine expression of talent and feeling through the instrument. And if you were good enough you could pay the bills doing work for studios, commercials, movies, theater. If you were really good you could perform as a headliner.

Now, AI can generate any kind of music anyone wants, eliminating almost all the anonymous studio, commercial, and soundtrack work. If you're really good you can still perform as a headliner, but (this is a guess) 80% of the work for musicians is just gone.

terminalbraid 6 hours ago||
> AI can generate any kind of music anyone wants

It only sounds like music.

kaffekaka 6 hours ago||||
Is coding a passion only because other people appreciate it?

Is painting a passion because others appreciate it? No, it is a passion in itself.

There will always be people appreciating coding by hand as a passion.

My passions - drawing, writing, coding - are worthwhile in themselves, not because other people care about them. Almost no one does.

ALoverOfLats 7 hours ago|||
Huge tangent but curiosity is killing me: By any chance is your username based on the Egyptian football club Zamalek?
bobro 7 hours ago|||
How do you read this article and hear indignation? It’s clearly someone grieving something personal about their own relationship with the technology.
NeutralCrane 4 hours ago||
It may not be a reaction to the article itself but to the many comments in this thread and others that fall under that category.
boobsbr 7 hours ago|||
I never thought of myself or my work as someone or something that "will automate your job away".
EvanAnderson 7 hours ago||
Agreed. I've always thought the purpose of all automation was to remove needless toil. I want computers to free people. I guess I subscribe to the theory of creative destruction.
thwarted 7 hours ago|||
Maybe it comes down to the definition of "toil". Some people find typing to be toiling, so they latch on to not having to type as much when using LLMs. Other people see "chores" as toiling, and so dream of household robots to take on the burden of that toil. Some people hate driving and consider that to be needless toil, so self-driving cars answer that—and the ads for Waymo latch onto this.

Personally, I am not stymied by typing nor chores nor driving. For me, typing is like playing a musical instrument: at some point you stop needing to think about how to play and you just play. The interaction and control of the instrument just comes out of your body. At some point in my life, all the "need to do things around the house" just became the things I do, and I'm not bothered by doing them, such that I barely notice doing them. But it's complex: the concept of "chores" is front and center when you're trying to get a teenager to be responsible for taking care of themselves (like having clean clothes, or how the bathroom is safer if it's not a complete mess) and participating in family/household responsibilities (like learning that if you don't make a mess, there's nothing to clean up). Can you really be effective at directing someone/something else without knowing how to do it yourself? Probably for some things, but not all.

EvanAnderson 7 hours ago||
> Maybe it comes down to the definition of "toil".

For sure.

I idealize a future where people can spend more time doing things they want to do, whatever those avocations might be. Freedom from servitude. I guess some kind of Star Trek / The Culture hybrid dream.

The world we have is so far from that imaginary ideal. Implicit in that ideal would be elimination of inequality, and I'm certain there are massive forces that would oppose that elimination.

thwarted 6 hours ago||
And not just the definition, but the assumption that a specific toil is necessarily universal. I've had more than one conversation that started with someone else saying "using the LLM saves me soooo much time typing, think of how much time typing you'd save by using an LLM". But when I examine my processes and where I'm spending my time, typing isn't even on my list, so this claim is talking right past me and I can't see it at all. Even when I was a hunt-and-peck typer on the c64 I didn't consider the typing to be a/the major factor in how long something took to program, so much so that I continued with two-finger typing until I was forced to take a touch-typing class in high school (back when that was still a thing, and we split the exercises between typewriters and computers).

"I'm able to put my shirt on so much faster with this shirt-buttoning machine, and I don't spend time tediously buttoning shirts and maybe having to rebutton when I misalign the buttons and buttonholes. You should get one to button your shirts, you're wasting time by not using a buttoning machine".

"I wear t-shirts."

(Obviously a contrived and simplistic example for fun)

shepherdjerred 7 hours ago|||
Computers are definitely on the path to freeing programmers from programming
AstroBen 7 hours ago|||
I'm very confident in saying the majority of developers didn't get into it saying "we'll automate your job away"
jsrcout 6 hours ago|||
"We" might be such an industry, but I'm not. My focus has always been on creating new capabilities, particularly for specialists in whatever field. I want to make individuals more powerful, not turn them into surplus.
nilespotter 6 hours ago|||
Per the "About Me" picture, this particular technologist does not have any hair to tear out.
ares623 7 hours ago||
For me it's because the same tech is doing it to everyone else in a more effective way (i.e. artists especially). I'm an "art enjoyer" since I was a child and to see it decimated by people who I once looked up to is heartbreaking. Also, if it only affected software, I would've been happy to switch to a more artistic career, but welp there goes that plan.
Melonai 7 hours ago||
I feel very similarly, I always thought of software engineering as being my future career. I'm young, I just really got my foot into the industry in my early twenties. It feels like the thing I wanted to do died right when I was allowed to start. I also always felt that if I didn't get to do development, I would try to get into arts which has always been a dream of mine, and now it feels that that died, too. I wish I was born just a little bit earlier, so that I had a bit more time. :(
Revanche1367 4 hours ago||
Yeah, the thing that’s different about this technical revolution compared to the previous ones is that it’s not only trying to take out multiple industries, but the creative process as a whole.
Ronsenshi 7 hours ago||
Agree with the author. I like the process of writing code, typing method names and class definitions while at the same time thinking ahead about overall architecture, structure, how much time a given function would run for, and what kind of tests are necessary.

I find it unsettling how many people in the comments say that they don't like writing code. Feels alien to me. We went into this field for seemingly very different reasons.

I do use LLMs, and even these past two days I was doing a vibe coding project which was noticeably faster to set up and get to its current state than if I wrote it myself. However, I feel almost dirty about how little I understand the project. Sure, I know the overall structure, decisions, and plan. But I didn't write any of it, and I don't have the deep understanding of the codebase which I usually have when working on a codebase myself.

dham 7 hours ago|
It's not so much the writing of the code (which I did like), it's the aesthetic of the code. It's solving a problem with the right code and the right amount of code (for now). That's still the case, even with AI writing most of the code. You have to steer it constantly because it has very bad instincts; it has bad training data because most people in the profession aren't good at it, mainly thanks to the "learn to code" movement and people getting into this profession just for the money and not the love. Those people are probably screwed.
grahar64 7 hours ago||
"Wait 6 months" has been the call for 3-4 years now. You can't eulogize a profession that hasn't been killed, that's just mean.
Banditoz 7 hours ago||
This is what I don't really understand. It's a bit difficult to take "wait x months" at face value because I've been hearing it for so long. Wait x months for what? Why hasn't it happened yet?

Things seem to be getting better from December 2022 (chatgpt launch), sure, but is there a ceiling we don't see?

wlindley 7 hours ago|||
"Self-driving cars" and fusion power also come to mind. With the advent of photography, it was widely believed that drawing and painting would vanish as art forms. Radio would obsolete newspapers, then become obsolete itself with television, and so on. Don't believe the hype.
dham 7 hours ago|||
My car has driven me back and forth with no issues for 6 months now. But yes it's been a long time coming.
XenophileJKO 7 hours ago|||
And yet.. my car was surrounded by 5 self-driving cars with no people in them on the way to work on Thursday.
Brian_K_White 6 hours ago|||
And your ability to go your own way is only temporary and due to inertia. Today, for a while, you can still buy a vehicle that requires a driver and doesn't look and perform exactly like every other Waymo.

But that's only because self driving cars are still new and incomplete. It's still the transition period.

I already can't buy the car I want with a manual transmission. There are still a few cars that I could get with one, but the number is both already small and getting smaller every year. And none of those few are the one I want, even though it was available previously.

I already can't buy any (new) car that doesn't have a permanent internet connection with data collection and remote control by people that don't own the car even though I pay full cash without even financing, let alone the particular one I want. (I can, for now, at least break the on board internet connection after I buy the car without disabling the whole car, but that is just a trivial software change away, in software I don't get to see or edit.)

It's hardly unreasonable to suggest that in some time you won't be able to avoid having a car that drives itself, and even be legally compelled to let the car drive itself because you can't afford the insurance or legal risk or straight up fines.

And forget customizing or personalizing. That's right out.

Teknoman117 7 hours ago||||
Waymos require a highly mapped environment to function safely in. Not to take away from what Waymo has accomplished, but it's a far more bounded problem than what the "self driving" promise has been.
emp17344 3 hours ago||
And they still rely on human operators for some maneuvers, as we learned this week.
SquibblesRedux 7 hours ago|||
Just like in "I, Robot?"
Levitating 7 hours ago||||
[dead]
XenophileJKO 7 hours ago|||
Um.. Claude Code has been out less than a YEAR.. and the lift in capability in the last year has been dramatic.

It does seem probable, based on progress, that in 1-2 more model generations there will be little need to hand code in almost any domain. Personally I already don't hand code AT ALL, but there are certainly domains/languages that are underperforming right now.

Right now with the changes this week (Opus 4.6 and "teams mode") it already is another step function up in capability.

Teams mode is probably only good for greenfield or "green module" development but I'm watching a team of 5 AI's collaborating and building out an application module by module. This is net new capability for the tool THIS WEEK (Yes I am aware of earlier examples).

I don't understand how people can look at this and then be dismissive of future progress, but human psychology is a rich and non-logical landscape.

stoneforger 3 hours ago||
Because then you won't be important; the model will be important. And then everyone will have to use their model — that's their dream. Why isn't that your nightmare too? Why will you be special if it can just code whatever it needs to code? Then Anthropic can just employ all the programmers that will ever be needed, just to review new skills and modules of code. It was predicted early on that there would be a need for about six big computers worldwide. Well, now we'll just need six AI shepherds. And then literally everyone else will forget how anything works, because it will be a solved problem. People already treat computers like magic; it will literally become a dark art. And I guess it's fine, what can we do, right? Go with the flow, I guess. "If I don't, someone else will. Maybe I can be one of those six real people at Anthropic."
bopbopbop7 7 hours ago|||
Just a couple more trillion dollars, we are so close!
Kiro 6 hours ago|||
Things have progressed much faster than even the most optimistic predictions, so every "wait 6 months" has come true. Just look at how the discourse has changed on HN. No one is using the arguments from 6 months ago, and any argument today will probably be equally moot in 6 months.
bopbopbop7 5 hours ago||
Maybe we should look at output like quality of software being produced instead of discourse on forums where AI companies are spending billions to market?

Where is all this new software and increased software quality from all this progression?

stoneforger 3 hours ago||
Quality shmuality. Get good, bro; my app already uses best patterns, and you can do all the things, and it has enterprise SSO and runs on Vercel and needs 39 services and costs a few million to run to show you AI-generated Excel sheets, because you can't be bothered to think for a hot minute. We can't have you thinking — you might get wrong ideas about ownership. I'm afraid open source was a mistake in the end, because it enabled enterprises to iterate faster than they ever could on their own.
chrysoprace 5 hours ago|||
Humans are notoriously bad at predicting the future. We can't even reliably predict the weather a week from now.
stoneforger 3 hours ago||
If we were as smart as the smartest guys throwing trillions at LLMs, we wouldn't be predicting anything; we would be creating it, like the gods we were always meant to be ever since someone hurt our feelings irrevocably. Hitler could have been a painter; these guys could be slinging dope for a living, but here we are.
RivieraKid 7 hours ago||
But the sentiment has changed significantly over the last 6 months. I think this is the biggest step change in sentiment since ChatGPT 3.5. Someone who said "wait 6 months" 6 months ago would have been "right".
jryio 7 hours ago||
These comments are comical. How hard is it to understand that human beings are experiential creatures? Our experiences matter — to survival, to culture, and to identity.

I mourn the horse masters and stable boys of a century past because of their craft. Years of intuition and experience.

Why do you watch a chess master play, or a live concert, or any form of human creation?

Should we automate parts of our profession? Yes.

Should we mourn the loss of our craft? Also yes.

kaffekaka 6 hours ago|
Very well put.

Two things are true at the same time, this makes people uneasy.

jauntywundrkind 4 hours ago||
In fact, contrary things are so very often both true at the same time, in different ways.

Figuring out how to live in the uncomfortableness of non-absolutes, how to live in a world filled with dualisms, is IMO one of the primary and necessary maturities for surviving and thriving in this reality.

kaffekaka 4 hours ago||
Yes. Unwillingness to accept contradicting data points is holding many people back. They have an unconscious need to always pick one or the other, and that puts them at a disadvantage. "I know what I think." But no, you do not.
leecommamichael 7 hours ago||
> Now is the time to mourn the passing of our craft.

Your craft is not my craft.

It's entirely possible that, as of now, writing JavaScript and Java frontends (what the author does) can largely be automated with LLMs. I don't know who the author is writing to, but I do not mistake the audience to be "programmers" in general...

If you are making something that exists, or something that is very similar to something that exists, odds are that an LLM can be made to generate code which approximates that thing. The LLM encoding is lossy. How will you adjust the output to recover the loss? What process will you go through mentally to bridge the gap? When does the gap appear? How do you recognize it? In the absolute best case you are given a highly visible error. Perhaps you've even shipped it, and need to provide context about the platform and circumstances to further elucidate. Better hope that platform and circumstance is old-hat.

localghost3000 7 hours ago||
This perspective was mine 6 months ago. And god damn, I do miss the feeling of crafting something truly beautiful in code sometimes. But then, as I've been pushed into this new world we're living in, I've come to realize a couple things:

Nothing I've ever built has lasted more than a few years. Either the company went under, or I left and someone else showed up and rewrote it to suit their ideals. Most of us are doing sand art. The tide comes in and it's gone.

Code in and of itself should never have been the goal. I realized that I was thinking of the things I build and the problems I selected to work on from the angle of code quality nearly always. Code quality is important! But so is solving actual problems with it. I personally realized that I was motivated more by the shape of the code I was writing than the actual problems it was written to solve.

Basically the entire way I think about things has changed now. I'm building systems to build systems. That's really fun. Do I sometimes miss the feeling of looking at a piece of code and feeling a sense of satisfaction at how well made it is? Sure. That era of software is done now, sadly. We've exited the craftsman era and entered the Ikea era of software development.

blibble 6 hours ago||
> Nothing I've ever built has lasted more than a few years.

maybe this says something more about your career decisions than anything else?

localghost3000 5 hours ago||
Maybe? I wasn't just speaking of myself, however.
zeroonetwothree 7 hours ago|||
Interesting, I still have code I wrote 20 years ago being used in production.
localghost3000 5 hours ago||
There are always exceptions. Congrats!
codazoda 7 hours ago||
“Most of us are doing sand art. The tide comes in and it’s gone.”

I’m putting that on my wall.

gob_blob 5 hours ago|
"They can write code better than you or I can, and if you don’t believe me, wait six months." They've been saying that for years. Stop believing it.
fiala__ 5 hours ago|
Also, it’s always six months from now, because otherwise you could just point at the hundred ways they’re wrong right now. It’s nothing but the ol’ dotcom “trust me, bro” kind of marketing.
More comments...