
Posted by jakelsaunders94 5 hours ago

Is anybody else bored of talking about AI? (blog.jakesaunders.dev)
518 points | 363 comments
_doctor_love 4 hours ago|
This might sound like snark, but I truly don’t mean it that way.

I think what’s interesting about AI, and why there’s so much conversation, is that in order to be a good user of AI, you have to really understand software development. All the people I work with who are getting the most value out of using AI to deliver software are people who are already very high-skilled engineers, and the more years of real experience they have, the better.

I know some guys who were road warriors for many years: everything from racking and cabling servers, setting up infrastructure, and getting huge cloud deployments going, all the way to embedded software, video game backends, etc. These guys were already really good at automation, seeing the whole life cycle of software, and understanding all the pressure points. For them, AI is the ultimate power tool. They're just flying with it right now. (All of them are also aware that the AI vampire is very real.)

There’s still a lot to learn, and the tools are still very, very early on, but the value is clear.

I think for quite a few people, engaging with AI is maybe the first time in their entire career that they are having to engage with systems thinking in a concrete and directed way. This is why so many software engineers are having an identity crisis: they've spent most of their career focusing on one very small section of the overall SDLC, believing that was most of what they needed to know.

So I think we’re going to keep talking for quite a while, and the conversation will continue to be very unevenly distributed. Paradoxically, I’m not bored of it, because I’m learning so much listening to intelligent people share their learnings.

jakelsaunders94 4 hours ago||
Hey, I don't think this sounded like snark at all. Super grounded take.

> I think what’s interesting about AI, and why there’s so much conversation, is that in order to be a good user of AI, you have to really understand software development.

This I agree with completely. You can see it in the difference between a prompt where you know exactly what you want and one where things are a little woolly. A tool is always better used in the hands of a well-trained craftsperson.

> So I think we’re going to keep talking for quite a while

Me neither, and to be clear I'm okay with that. This was mostly a rant at the lack of diversity of discourse.

_doctor_love 3 hours ago||
Thanks friend! Appreciate it.

Agree, the diversity of the discourse is not great. There's a lot of "omg I just got started waaauw" articles out there along with "we're all gonna die!" stuff. And then a few seams of very excellent insight.

Deep research at least helps with dowsing for the knowledge...

artur_makly 3 hours ago||
your HN handle is one of my top 10 fav tracks: https://www.youtube.com/watch?v=q2RSniyYNSc

{heart}

mindcrime 1 hour ago||
And here I was expecting this...

https://www.youtube.com/watch?v=k6Rl8TpGIP4

llmthrow0827 56 minutes ago|||
This would be a more compelling argument if the conversations weren't so extremely dull and derivative, with most of the articles written in LLMspeak. I see a lot of discussion and not a lot of substance; articles and discussions about AI have a much smaller chance of being compelling compared to any other technical subject posted on HN.
bengale 4 hours ago|||
Spot on take. The people I’ve noticed saying things like “it’s not useful” are the ones who are doing so little they can’t see the value.

This isn’t to say there’s not hype. Just that if you’re not seeing big productivity gains you need to make sure you really are an outlier and not just surplus to requirements.

imiric 2 hours ago||
I rarely come across people who flat out say "it's not useful". They exist, but IME they're the minority.

Rather, I hear a lot of nuanced opinions about how the tech is useful in some scenarios, but that the net benefit is not clear. I.e., the tech has many drawbacks, so extracting actual value from it takes a lot of effort. This is an opinion I personally share.

In most cases, those "big productivity gains" are vastly blown out of proportion. In the context of software development specifically, sure, you can now generate thousands of lines of code in an instant, but writing code was never the bottleneck. It was always the effort to carefully design and implement correct solutions to real-world problems. These new tools can approximate this to an extent, when given relevant context and expert guidance, but the output is always unreliable, and very difficult to verify.

So anyone who claims "big productivity gains" is likely not bothering to verify the output, which in most cases will eventually come back to haunt them and/or anyone who depends on their work. And this should concern everyone.

hparadiz 25 minutes ago|||
"productivity" is a misnomer. Sort of. The things I'm building are all things I've had on the back burner for years. Most of which I never would have bothered to do. But AI lets me ignore that excuse and just do it.
SpaceNoodled 2 hours ago||||
That's only because we're trying to not be too condescending.
AbanoubRodolf 4 minutes ago|||
[dead]
strangattractor 46 minutes ago|||
Agreed - another tool in the old tool pouch. I find it fascinating in that it provides insight into the role of language in intelligence. Certainly not AGI, but it makes ELIZA seem neolithic ;)

I am amazed at the incredible things it can do, only to turn around and not be able to do a simple task a child can do. Just like people.

systemsweird 22 minutes ago|||
Completely agree. It’s very telling that the majority of write-ups on effective agentic coding are essentially summaries of software engineering best practices.
amelius 4 hours ago|||
This is really not true. There are stories of people who had no background in software engineering who now write entire applications using AI. And I have personally seen this happen.
strken 1 hour ago|||
Before AI, there were also stories of people who had no background in software engineering who wrote entire applications using their fingers. This was called "learning to be a software engineer".

I don't mean to snipe at AI, because it really does seem to have set more people on the path of learning, but I was writing VB5 apps when I was 14 by copying poorly understood bits and pieces from books. Now people are doing basically the same but with less typing and everyone thinks it's a revolution.

mikkupikku 4 hours ago||||
Smart people can hit the ground running if they're freed from the need to first learn the intricacies of a new language. We're going to see an explosion in the number of people writing software as clever people who invested their time in something other than learning to program are now able to write software for themselves.
switchbak 3 hours ago||||
What is not true, that "so many software engineers are having an identity crisis"?

I don't believe they said that folks new to AI can't make impressive use of it. They did however say that senior folks with lots of scrappy and holistic knowledge can do amazing things with it. Both can be true.

leptons 1 hour ago||||
I've seen people generate a lot of vibeslop with AI, but they didn't actually "Write entire applications using AI".

They still have absolutely no clue how it works, so how could they "write entire applications"? They vibed it, but they certainly didn't write any of it, not one bit of it, and they're clueless as to how to extend it, upgrade it, and maintain it so that the AI doesn't make it a bloated monstrosity of AI patches and fixes and workarounds that they simply could never begin to understand.

They were also following a dozen youtube tutorials step by step, so even that part was someone else doing the thinking.

Yeah, these are the same guys constantly bugging me to help them figure something out.

pojzon 4 hours ago|||
It's silly to say this, but one such person is "pewdiepie".
sigbottle 3 hours ago|||
The "AI Vampire", huh. Unironically, I've been feeling that way.

Well, there was also a lot of unrelated things that happened as well around last November for me, but yes, getting into vibecoding for real was one of them, and man I feel physically drained coming back from work and going to use more AI.

Not sure what it is. I'm using AI personally to learn and bootstrap a lot of domain knowledge I never would have learned otherwise (I even got into philosophy!), but man is it exhausting keeping up with AI. I would burn through a week's worth of credits in a day, and now I haven't vibe coded in a week.

I think I will chill. One day at a time.

_doctor_love 3 hours ago||
AI Vampire is from Steve Yegge, credit where it's due.

My take is that it's similar to what Amber Case described in Calm Technology - with AI you are not steering one car, you're really steering three cars at the same time. The human mind isn't really designed for that.

I am finding that really structuring my time helps in terms of fighting back. And adopting an hours restriction: even if I could rage for 4 more hours, I don't. Instead I stop and go outside.

d675 4 hours ago|||
absolutely. As an early/mid-level SDET/SRE, I can move so fast on prototyping full, good apps now. That style of thinking is serving me well; knowing about queues, Docker, basic infra, and good coding practices is plenty to produce decent code. Interesting time to be laid off.

AI makes a ton of bad decisions too and it's up to you to work with it. If I had the knowledge of the dangers hidden in things I'm developing, I'd move even faster

Was able to make a great full web app, which I think is hardened for prod but it had to be refactored to do so. Which it happily did.

It's really about asking the right questions, breaking down tasks, and planning now. I'm going to tackle a huge project, hoping to share it here.

username135 2 hours ago|||
AI Vampire is so perfect. I've never thought of it that way, but it's right there.
QuantumGood 4 hours ago|||
> I’m learning so much listening to intelligent people share their learnings.

Me too. A key purpose of HN, and a bright time for that.

deadbabe 2 hours ago|||
If you have to really understand software development to be a good user of AI, we’re screwed. All the best users of AI we’ll ever have already exist I think.
throwawaytea 52 minutes ago||
That's a good point. I'm a novice self-taught developer that somehow pushed through and made a decent PM tool for the construction industry. It works, if your users aren't malicious or too demanding.

Now I'm working on a second project, all with AI. I haven't written a single line. It works better than a non programmer would make because I knew what to ask for. But I'll admit I'm not learning anything.

hparadiz 5 minutes ago||
Can't say the same. I've been super hands on with a C project. Really getting into the details of the event bus and how to make things performant. The AI is still writing 99% of the code but I'm being super strict about what I consider acceptable.
keybored 3 hours ago|||
A post supposedly about being bored of talking about AI. But psych, it’s the same AI talking points. And psych, the top comment is the same sentiment about how the truly skilled will finally have their time to shine.

I don’t know if it’s the Universe delivering this farce or it’s the emergent LLM Singularity.

_doctor_love 3 hours ago||
> how the truly skilled will finally have their time to shine.

That's not what I said. I said that those who are already shining are now shining even brighter. Give a great craftsman a new tool and he will find a way to apply it. If it is valueless, he will throw it away.

For what it's worth, your comment is also an HN trope, the disaffected low-effort armchair keyboard warrior.

keybored 3 hours ago||
Expressing a negative sentiment is a trope now?
Rapzid 2 hours ago||
Keybored is a trending vibe, yeah.
gAI 4 hours ago|||
Agreed, though I prefer "Fae Folk" to vampires.
Terr_ 2 hours ago||
If LLMs were vampires, they'd be better at counting; if they were fae, they'd be better at legalistic logic. :p
djeastm 3 hours ago|||
Any thoughts on what the next generation of software devs is going to look like without as much manual experience?
eloisant 3 hours ago|||
When C arrived, programmers wondered what software devs would look like once they didn't have assembly experience.

Then the same happened with languages that managed memory.

And with IDEs that could refactor your code in a click and autocomplete API calls.

And with Stack Overflow where people copy/pasted code they didn't understand.

bGl2YW5j 2 hours ago||||
I reckon there's a limit to how long this abstraction can go on before not understanding underlying mechanisms will seriously hamstring you.
AnimalMuppet 39 minutes ago||||
It started before that. When assemblers came out, (some) programmers worried about losing touch with the machine if they didn't have to know the instructions in octal.
calvinmorrison 2 hours ago|||
And over and over, time proves that when you need it, ASM or C or general systems knowledge is handy. One example: I am not a "Windows" or "NT" guy, having mostly worked in various Unixes and Linux in my professional career. I had a client who had exhausted every resource trying to fix some horrible freeze/timeout in their application. So I rolled up my sleeves, first searched "is there dtrace on windows", found some profiling tools, and found the process was stuck in some dumb blocking-call loop because a resource was unavailable. The rest was history.

So yeah i mean - who cares how it works - but also if you have experience in how things _do_ work you can solve problems other people cannot.

_doctor_love 3 hours ago|||
Honestly, I think it will look pretty much like this one. There’s a lot of manual experience that the current generation doesn’t have.

For example, I haven’t racked and cabled a server in over 15 years. That used to be a valuable skill.

I also used to know how to operate Cisco switches and routers (on the original IOS!). I haven't thought about CIDR and the difference between a /24 and a /30 since the year 2008. Class A IP addresses, how do those work? What subnet am I on? Is this thing running on a different VLAN? Irrelevant to me these days. Some people still know it! But not as many as in the past.
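For anyone who never touched this stuff, the /24 vs /30 distinction is easy to illustrate with Python's stdlib `ipaddress` module; a quick sketch (the 192.0.2.0 prefix is just a documentation example range):

```python
import ipaddress

# A /24 leaves 8 host bits, a /30 only 2; two addresses in every
# subnet (the network and broadcast addresses) are not usable by hosts.
for prefix in ("192.0.2.0/24", "192.0.2.0/30"):
    net = ipaddress.ip_network(prefix)
    print(prefix, "->", net.num_addresses - 2, "usable hosts")
```

A /24 yields 254 usable hosts; a /30 yields just 2, which is why it was the classic choice for point-to-point links.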

The late Dr. Richard Hamming observed that once upon a time, "a good man knew how to implement square root in machine code." If you didn't know how to do that, you weren't legit. These days nobody would make such a claim.
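The sort of routine Hamming had in mind can be sketched in a few lines today; here is an integer square root via Newton's method (illustrative Python, obviously not the historical machine code):

```python
def isqrt(n: int) -> int:
    """Integer square root by Newton's method, no hardware sqrt required."""
    if n < 2:
        return n
    x = n
    y = (x + 1) // 2
    while y < x:               # iterate until the estimate stops shrinking
        x = y
        y = (x + n // x) // 2  # Newton step, kept in integer arithmetic
    return x
```

(Modern Python ships this as `math.isqrt`, which is rather the point: the once-essential skill is now a library call.)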

So some skills fade and others rise. And also, software has moved in predictable cycles for many decades at this point. We are still a very young field but we do have some history at this point.

So on that front, the more things change, the more they will stay the same.

calvinmorrison 2 hours ago||
> So some skills fade and others rise. And also, software has moved in predictable cycles for many decades at this point. We are still a very young field but we do have some history at this point.

And there'll be a split too... like there's a giant divide between mechanics who used to work on carburetors and the new gen with microcontrollers, injection systems, etc. People who think cars are 'too complicated' aren't wrong, but as someone who grew up in the injected era, I vastly prefer debugging issues over the CAN bus rather than snaking my ass around a hot exhaust to check something.

cyanydeez 4 hours ago|||
Isn't that scary though? A bunch of people are going to be forced to use a tool that keeps them ignorant, and they absolutely won't know if it's doing correct things, to the point that as you retire, the next crop is going to be much less involved in knowing what's going on.

It's what happened with the internet and computer usage. As Apple made it easier to get online with zero computer knowledge, suddenly we're electing people like Donald Trump.

scorpioxy 3 hours ago|||
To me, it is very scary. I know people who have sort of "outsourced" their critical thinking to ChatGPT. So to me it's extra scary when I see it outside technical circles. They'll just believe whatever that generation of LLM tells them, because it says it so confidently, and never question or check the information. Maybe I'm naive, but I thought easier access to knowledge was supposed to make us more intelligent, not less.
vparseval 8 minutes ago|||
I don't remember exactly in which book's introduction Hannah Arendt mentioned this, but she pointed out that every time humanity learned a new skill that improved its efficiency in some capacity, that skill as well as adjacent skills diminished irrevocably.

AI is the thing that for the first time can think better than us (or so at least some people believe) and is seen as an efficiency booster in the world of cognition and ideas. I'd think Hannah Arendt would be worried by what we are currently seeing and where we might be headed.

heavyset_go 2 hours ago|||
> Maybe I'm naive but I thought easier access to knowledge was supposed to make us more intelligent, not less.

Turns out Lowtax was right and ahead of his time

_doctor_love 3 hours ago|||
Serious reply to this one: I truly don’t find it any more scary than what’s already taken place many times in human history.

We have hundreds and thousands of years of history showing humans committing atrocities against each other well before the advent of computers, or even the introduction of electricity. So while the tool may become so ubiquitous that there’s no option not to engage with it, I don’t think it really fundamentally alters the dynamics of human behavior.

Some people are motivated by greed. Others are motivated by nobility. It really just comes down to which wolf they're feeding.

In terms of the tool keeping people ignorant, there’s a part I agree with and a part that I don’t. I think, in terms of information dissemination, AI is probably the autocrat’s wet dream in terms of finally being able to achieve real-time redefinition of reality. That’s pretty scary, and I’m not sure what to do about it.

On the other hand, people have always been free to not really learn their craft and to just sort of get by and make a living. That was true a thousand years ago, and it's true today. There's always somebody who can do a really high-quality job, but they're very expensive, and then there's a vast population who will do a medium-to-terrible job for less money. You get what you pay for. There's a reason history is primarily written about people with power and wealth: they were the only ones with the means to do anything.

I don’t agree with the assertion about the internet and the election of someone like Donald Trump. Well before the internet existed, politicians were using communication mediums to influence things and get elected—whether it was the telegraph, the telephone, or the TV. JFK famously was the first TV president (notably, he didn't wear a hat).

These technologies simply give politicians more reach, and they may change the dynamics of how voters are persuaded. But what’s true today was true three hundred years ago: there’s the face of power that you see publicly, and then there’s what really happens behind the scenes.

bluefirebrand 3 hours ago||
> Serious reply to this one: I truly don’t find it any more scary than what’s already taken place many times in human history

Spoken like someone who thinks they are going to be insulated from the fallout

solenoid0937 3 hours ago||
Many of us are fine with the fallout because we understand the net benefit to humanity is going to be similar to the previous waves of automation.

Sure, it might hurt me personally. I'm not selfish enough to put that over what will be an incredibly empowering development for our species.

bluefirebrand 1 hour ago||
I don't believe for even a second that the net benefit to humanity is going to be positive

This will be good for a handful of elites and no one else

heliumtera 3 hours ago|||
>They’re just flying with it right now.

Where are they flying to, and why has software gone to shit?

Maybe these superstar programmers have to keep their reality-breaking technology secret, but everything has not only degraded, but turned to absolute trash.

AbanoubRodolf 4 minutes ago|||
[dead]
LogicFailsMe 3 hours ago|||
Spot on, I am having the time of my life with AI, more fun than I've had in decades. But I was in the top 10% of engineering, and top 1% of the bits of engineering I do best, so it's easy for me to use AI to explore more ideas than I could have possibly explored by hand. And if I get replaced, cool bro, my investments are in compute, and compute's just getting started IMO.
hbarka 3 hours ago||
> For them, AI is the ultimate power tool.

Yup

SpaceNoodled 2 hours ago||
When all you've got is AI, every problem looks like... uh, whatever hole an LLM's output goes into. A garbage can, ideally.

AI seems great when you have no way of truly validating its output.

lukev 5 hours ago||
This is bad in tech. But at least we are (relatively) well equipped to deal with it.

My partner teaches at a small college. These people are absolutely lost, with administration totally sold on the idea that "AI is the future" while lacking any kind of coherent theory about how to apply it to pedagogy.

Administrators are typically uncritically buying into the hype, professors are a mix of compliant and (understandably) completely belligerent to the idea.

Students are being told conflicting information -- in one class that "ChatGPT is cheating" and in the very next class that using AI is mandatory for a good grade.

It's an absolute disaster.

Terr_ 2 hours ago||
I've been telling my curious/adrift relatives that it's a machine that takes a document and guesses what "usually" comes next, based on other documents. You're not "chatting with it" so much as helping it construct a chat document.

The closer they can map their real problems to make-document-bigger, the better their results will be.

Alas, that alignment is nearly 100% when it comes to academic cheating.
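That "guess what usually comes next" framing can be demoed with a toy bigram model; a hypothetical sketch (real LLMs use neural networks over tokens, not word counts, but the make-the-document-bigger loop is the same):

```python
from collections import Counter, defaultdict

def train_bigrams(text: str) -> dict:
    """Count which word 'usually' follows each word in the corpus."""
    words = text.split()
    follows = defaultdict(Counter)
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1
    return follows

def continue_doc(follows: dict, word: str, n: int = 3) -> list:
    """Greedily extend a document with each word's most common successor."""
    out = [word]
    for _ in range(n):
        if word not in follows:  # never seen this word on the left: stop
            break
        word = follows[word].most_common(1)[0][0]
        out.append(word)
    return out
```

Train it on "the cat sat on the mat the cat ran" and `continue_doc(follows, "the", 1)` extends the document with "cat", because that's what most often followed "the" in the training text.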

chatmasta 4 hours ago|||
The wild part is they’re having this reaction while using the most rigid and limited interfaces to the LLMs. Imagine when the capabilities of coding agents surface up to these professions. It’s already starting to happen with Claude Cowork. I swear if I see another presentation with that default theme…
iugtmkbdfil834 4 hours ago||
This. As annoying as all sorts of 'safety features' are, the sheer amount of effort that goes into further restricting things on the corporate wrapper side makes the LLM nigh unusable. How can those kids even begin to get an idea of what it can do, when it's severely locked down?
pjc50 4 hours ago||
Could you provide an example of such a thing that is prevented?
iugtmkbdfil834 1 hour ago||
Sure. In the instance I am aware of, SQL (and XML and a few other) files are explicitly verboten, but you can upload them as text and reference them that way; references to personal information like DOB immediately stop the inference with no clear error as to why, but referencing the same info any other way allows it to go on.

It is all small things, but none of those small things are captured anywhere so whoever is on the other end has to 'discover' through trial and error.

metalliqaz 3 hours ago|||
By my understanding, the administrators at small colleges are among the least capable professionals one might find anywhere in the economy.
throwawaysleep 36 minutes ago||
A friend and I have a contract with a local university here in Canada.

They paid for custom on-prem software, and in over a year they have not fully provided the access and infrastructure to install it.

We have been paid already, but they paid for a tool they can’t get their shit together enough to let us install.

whattheheckheck 4 hours ago|||
When industrialization was taking root, yes indeed, the factory jobs sucked AND they were the future. Two things can be true.
ares623 35 minutes ago||
You left out the part that the non-factory jobs sucked more (or were just non-existent).

This is the opposite.

webdood90 4 hours ago|||
> These people are absolutely lost, with administration totally sold on the idea that "AI is the future" ...

Doesn't sound that different from my tech job

jakelsaunders94 4 hours ago|||
This is really interesting. I've been out of education for a long time, but I was wondering how they were dealing with the advent of AI. Are exams still a thing? Do people do coursework now that you can spew out competent sounding stuff in seconds?
Al-Khwarizmi 1 hour ago||
I teach CS at a university in Spain. Most people here are in denial. It is obvious to me that we need to go back to grading based on in-person exams, but in our last university reform (which tried to copy the US/UK in many aspects) there was so much political posturing and indoctrination about exams being evil and coursework having to take the fore that now most people just can't admit the truth before their own eyes. And for those of us that do admit it, we have a limited range of maneuver because grading coursework is often a requirement that emanates from above and we can't fundamentally change it.

So in most courses nothing has changed in the way we grade. Suddenly, coursework grades have gone up sharply. Anyone with working neurons knows why, but in the best case, nothing of consequence is done. In the worst case (fortunately uncommon), there are people trusting snake-oil detectors and probably unfairly failing some students. Oh, and I forgot: there are also some people who are increasing the difficulty of the coursework in line with LLMs. Which I guess more or less makes sense... except that if a student wants to learn without using them, they will suddenly find the assignments to be out of their league.

So yeah, it's a mess.

technothrasher 1 hour ago||
> Except that if a student wants to learn without using them

My son, a freshman at a major university in NYC, told his freshman English professor that he wanted to write his papers without using AI. He was told that this was "too advanced for a freshman English class" and that using AI was a requirement.

senordevnyc 1 hour ago||
Now colleges will have to try and detect if you didn't use AI!
ares623 37 minutes ago|||
Meh, today I opened twenty PRs and felt great. That's worth it to me. (/s)

https://twentyprsaday.github.io/

delbronski 4 hours ago||
AI is starting to look like a net negative for humanity. I remember the early days of OpenAI. I was super excited about it. There was a new space to uncover and learn about. I was hopeful.

Now I have this love/hate relationship with it. Claude Code is amazing. I use it everyday because it makes me so much more efficient at my job. But I also know that by using it I’m contributing to making my job redundant one day.

At the same time I see how much resources we are wasting on AI. And to what end? Does anybody really buy the BS that this will all make the world a better place one day? So many people we could shelter and feed, but instead we are spending it on trying to make your computer check and answer your emails for you. At what point do we just look up and ask… what is the damn purpose of all of this? I guess money.

wrs 4 hours ago||
Well, on the other hand, software isn’t all about checking emails.

I know someone who worked for a nonprofit that made pregnancy health software that worked over text messaging. Its clients were women in Africa who didn’t have much, but they had a cell phone, so they could get reminders, track vitals, and so forth.

They had to find enough funding to pay several software engineers to build and maintain that system. If AI allows a single person to do it, at much lower cost, is that bad?

bGl2YW5j 2 hours ago||
This is awesome. It's sad that examples like this are few and far between.
remich 16 minutes ago||
Are they? Or do you just mean that it's few and far between that we hear about them? If it's the former, I think there's a much bigger universe of this kind of stuff than most people realize. OTOH, if you're just commenting on the lack of coverage, then yeah, I agree; I wish more attention was paid to small software like this. Maybe we need a catchy term: "organic software"? "Locally grown software"?
rimbo789 36 minutes ago|||
Starting? It was pretty clearly a net negative from the get-go.
xvector 4 hours ago|||
> But I also know that by using it I’m contributing to making my job redundant one day.

I don't see how this is the case if you're anything more than a junior engineer... it unlocks so many possibilities. You can do so much more now. We are more limited by our ideas at this point than anything else.

Why is the reaction of so many people, once their menial work gets automated, "oh no, my menial work is automated." Why is it not "sweet, now I can do bigger/better/more ambitious things?"

(You can go on about corporate culture as the cause, but I've worked at regular corporations and most of FAANG. Initiative is rewarded almost everywhere.)

> Does anybody really buy the BS that this will all make the world a better place one day?

Why is it BS? I'm shocked that anyone with a love and passion for technology can feel this way. Have you not seen the long history of automation and what it has brought humanity?

There is a reason that we aren't dying of dysentery at the ripe age of 45 on some peasant field after a hard winter day's worth of hard labor. The march of automation and technology has already "made the world a better place."

RivieraKid 2 hours ago|||
> I don't see how this is the case if you're anything more than a junior engineer... it unlocks so many possibilities.

I really don't understand this way of thinking. Don't you think that AI could replace senior engineers? Sure, companies will be able to do bigger / better / more ambitious stuff - but without any software engineers.

> Why is it BS? I'm shocked that anyone with a love and passion for technology can feel this way. Have you not seen the long history of automation and what it has brought humanity?

I definitely think that AI will be a net benefit for society but it could easily end up being be bad for me.

szatkus 1 hour ago|||
So far AI doesn't seem even close to replacing senior engineers. Hell, it can't even replace junior engineers entirely.

I use AI agents every day at work and I'm happy with that, but it took over two years and billions of dollars in investment to deliver anything useful (Claude Code et al). The current models are amazing, but they still randomly make mistakes that even a junior wouldn't make.

There's another paradigm shift to be made, certainly, because currently it feels like we scaled up a bug brain to spit out code. It works great for some problems, but it's not what software developers usually do at work.

ej88 2 hours ago|||
There doesn't seem to be a limit to what companies can do with software; it's probably the most elastic demand of any industry ever.

The SWE role is going to change, but problem-solving systems thinkers with initiative won't go away.

GeoAtreides 2 hours ago||||
>Why is the reaction of so many people, once their menial work gets automated, "oh no, my menial work is automated." Why is it not "sweet, now I can do bigger/better/more ambitious things?"

Because I have rent to pay? Old age to prepare for?

Why is it so hard to understand that most people are not rich, that the cost of living is high, and that most people are VERY afraid their jobs will be automated away? Why is it so hard to understand that most people haven't worked at FAANG, they don't have stocks or savings, and are squeezed harder with every new day and every new war?

what world, what reality are you guys living in?!

xvector 1 hour ago||
Because there is always work to do. It is true that demand will drop for those that don't take initiative and aren't sure what to do now that AI can do their repetitive tasks. However, demand will surge for those that can think critically about how to utilize AI to empower businesses.

"Software engineer" as a profession is rapidly getting automated at my company, and yet our SWEs are delivering more value than ever before. The layer of abstraction has changed, that is all.

> what world, what reality are you guys living in?!

One that has seen immense benefits from the Industrial Revolution and previous waves of automation.

GeoAtreides 1 hour ago||
You might want to brush up on the short- and medium-term consequences of the Industrial Revolution and the dark satanic mills, where children were maimed and people worked 12-hour days in horrendous conditions.

Do you think that because 2 devs are now super productive with AI, the company will keep the other 30 average devs? No, of course not; they will fire them and pocket the difference. Same for other industries, where AI will slowly diffuse like a poisonous gas, displacing jobs and people and leaving behind a crippled white-collar class. The profits will not trickle down, and the increased productivity will be a hatchet, not a plough.

senordevnyc 59 minutes ago||
> where children were maimed or where people worked for 12h a day in horrendous conditions

Such things were super uncommon before the industrial revolution, I'm sure.

delbronski 3 hours ago||||
And I’m shocked that anyone into tech can be so blind to the adverse effects the current tech industry is having on our world and our society.

We owe it to the world, as the experts, to be critical. The march of automation and technology has made the world a better place in some ways. I sure love modern medicine, but those drones flying over Ukraine and Russia sure don't seem like they're making the world a better place. Nuclear bombs are not making the world a better place. Misinformation on social media is not making the world a better place.

Any belief you swallow blindly will eventually find a way to harm you.

xvector 3 hours ago||
[flagged]
delbronski 3 hours ago||
Oh yeah, no you are right. Sorry for focusing on that little part of space and time where I and everyone I know and love is alive and being affected by our decisions. How dumb of me!
solenoid0937 3 hours ago||
It actually is genuinely wrong to prioritize your little bit of space and time over the needs of the species as a whole and the benefit of untold future billions.

If everyone thought like you we'd be stuck in the pre-Industrial phase. How miserable that would be!

miltonlost 4 hours ago||||
Keep marching that automation and technology to an acidified ocean. But hey, at least now we can code faster than we can review!
solenoid0937 3 hours ago||
AI won't be what acidifies our ocean, but AGI might save us from it.

Strangely enough, I don't see you calling to end the consumption of meat which would have a far larger environmental impact while not slowing global progress at all.

palata 3 hours ago||
> AI won't be what acidifies our ocean

Tech is what got us where we are. AI allows us to use more energy to produce more of what is currently measurably killing us.

> but AGI might save us from it.

This is just faith. Some believe that prayers may save us.

solenoid0937 3 hours ago||
"AI energy usage" is a convenient scapegoat not backed by data.

Many things that bring far less value contribute orders of magnitude more than AI to the energy usage problem.

remich 14 minutes ago||
I'm getting to the point where I'll only listen to AI energy-use critiques if the commenter tells me up front that they abstain from all forms of social media, especially video-based social media.
palata 3 hours ago|||
> There is a reason that we aren't dying of dysentery at the ripe age of 45 on some peasant field after a hard winter day's worth of hard labor.

Tell that to the people who will die before 45 because of global instability and global warming, I guess?

jesterson 1 hour ago|||
I am having similar thoughts.

To add to the list of questions: it's undeniable that AI is making humans dumber by doing mental work previously done by humans. So why do we spend so much energy making AI smarter and fellow humans dumber?

Shouldn't we be moving in the opposite direction and invest in people, instead of in some software and the greedy psychopaths at the helm of the large companies behind it?

boxingdog 11 minutes ago||
[dead]
jvanderbot 5 hours ago||
How do I answer this without spamming: Yes, very much.

Everyone is in their own place adapting (or not) to AI. The disconnect between even folks on the same team is just crazy. At least it's gotten more concrete ("here's what works for me, what do you do?") vs catastrophizing about a jobpocalypse or "teh singularity", at least in day-to-day conversations.

peruvian 3 hours ago||
Yeah, maybe some workplaces are starting to get more organized, but in general there are still teams with anti-LLM engineers and teams that have Claude Code running all day.
scorpioxy 3 hours ago||
Yes, extremes, which seem to fit the general sentiment of the world right now.

For a while, it felt like I was in a minority for saying that it can be a useful tool for certain things but isn't the magic the sales guys say it is. Instead, all the hype and the "get rid of your programmers" messaging turned it into this provocative issue.

HN was not immune to this phenomenon, with certain HN accounts playing an active part in it. LLMs are/were supposed to be an iteration of machine learning/AI tools in general; instead, they became a religion.

zer00eyz 4 hours ago||
I'm sure as hell bored of the current conversations people are having about ai.

> here's what works for me, what do you do

This is at least progress... but many want to remain in denial and can't even contemplate this portion of the conversation.

We're also ignoring the light AI shines on our industry, and how (badly) we have been practicing our craft. As an example, there is a lot of gnashing of teeth right now about the VOLUME of code generated and how to deal with it... but how were you dealing with code reviews before? How were you reviewing the dependencies in your package manager? (Another supply-chain attack happened today, so someone is looking, but maybe not you.) Do you look at your DB or OS? Do two decades of leetcode, brain-teaser, FAANG-style interviews qualify candidates who are skilled at reading code? What is good code? Because after close to 30 years working in the industry, let me tell you, the sins of the LLM have nothing on what I have seen people do...

chatmasta 4 hours ago||
What I miss is people showing off their hand-crafted libraries or frameworks. That’s become way less common now that everyone is building a layer up the stack. I fear we’ll be stuck in a permanent state of using Tailwind and React and all the LLM-favored libraries as they were frozen in time at the beginning of 2025. Then again, that’ll be the agent’s problem, not mine…

All that said, it’s extremely exciting. I’ve been in tech, in one way or another, for 25 years. This is the most energizing (and simultaneously exhausting) atmosphere I’ve ever felt. The 2006-2011 years of early Facebook, Uber, etc. were exciting but nothing like this. The future is developing faster than we can process it.

kehvyn 4 hours ago||
If it helps, I've mostly been using AI to implement things in the craziest languages I can justify.

I write Typescript and SQL by day, my last two personal projects were Rust and Perl.

I do worry that I'm not learning them as deeply, but I am learning them and without AI as an accelerant I probably wouldn't be trying them at all.

mattgreenrocks 4 hours ago|||
Perhaps we're in an AI summer and a tech winter. Winter is always the time when people hole up, dream, and work on whatever big thing is next.

We're about due for some new computing abstractions to shake things up I think. Those won't be conceived by LLMs, though they may aid in implementing them.

zer00eyz 4 hours ago||
We have 2 decades of abstraction.

The stacks of turtles that we use to run everything are starting to show their bloat.

The other day someone was lamenting dealing with an onslaught of bot traffic and having to block it. Maybe we need to get back to good old-fashioned engineering and optimization. There was a thread on here the other day about PC Gamer recommending RSS readers while serving a 36 GB webpage ( https://news.ycombinator.com/item?id=47480507 )

jakelsaunders94 4 hours ago||
> What I miss is people showing off their hand-crafted libraries or frameworks.

Same. I wonder if the use of AI will lead to less invention and adoption of new ideas, in favour of ideas with lots of training data.

arkt8 12 minutes ago||
Most of the boredom comes from people thinking AI is really intelligent, that it is magic. And with magic come ghosts instead of bugs. Whoever was lazy will become even lazier. Engineers will keep building bridges... and also software.

As shown in "Normal Accidents", a complex system's strengths come paired with equally large weaknesses, and the more complex the system, the bigger the problem. A catastrophic event involving AI is still to happen, as it has happened in basically every complex system. Those accidents occurred with trained people who didn't believe in magic and weren't lazy... so the scenario is even worse for AI.

Yes, I'm bored of people who believe in magic, and of the ghosts that are emerging and are yet to be seen.

mindcrime 2 hours ago||
OK, if you take "talking about AI" to mean just talking about "three different people’s (almost identical) Claude code workflow and yet another post about how you got OpenClaw to stroke your cat and play video games" then sure, that would be pretty boring.

But I don't see it that way. I've been fascinated by AI since I was a little kid (watching Max Headroom, Knight Rider, Whiz Kids, Wargames, Tron, Short Circuit, etc in the 80's) up through college in the 1990's when I first read about the 1956 Dartmouth AI workshop that kicked the field off, and up to today where we have the most powerful AI systems we've had. Every single bit of this stuff is wildly fascinating to me, but that's at least in part because I recognize (or "believe" if you will) that there's a lot more to "AI" than just "LLM's" or "Generative AI".

I still believe there are plenty of neural network architectures that haven't been explored yet, plenty more meat on the bone of metaheuristics, all sorts of angles on neuro-symbolic AI to work on, etc. And even "Agents" are pretty exciting when you go back and read the 90's-era literature on Agents and realize that the things passing for "Agents" right now are a pretty thin reflection of what Agents can be. Really understanding multi-agent systems (MAS) involves economics, game theory, computer science, maybe even a hint of sociology.

As such, I still find AI fascinating and love talking about it... at least in the right context and with the right people. :-)

And besides... as they[1] say: "Swarm mode is sick fun".

[1]: https://static0.srcdn.com/wordpress/wp-content/uploads/2022/...

nancyminusone 5 hours ago||
Among non-programmers, you always hear about some fool who fell in love with an AI girlfriend or whatever, but you never hear about the people who opened ChatGPT once, tried some things with it, said to themselves "huh, that's kind of neat", and then lost interest a day or two later, having conceived of no further tasks AI could help with.
slfnflctd 3 hours ago||
> having conceived of no further items to which AI could provide assistance

For me, the issue isn't that I can't conceive of work AI could help with. It's that most of the work I currently need to be doing involves things AI is useless for.

I look forward to using it when I have an appropriate task. However, I don't actually have a lot of those, especially in my personal life. I suspect this is a fairly common experience.

olivia-banks 4 hours ago||
I actually hear about this fairly often. In quite a few of my college classes, there's a large focus on AI (even outside the computer science department). I find it surprising how many non-technical people don't even think to use it, or otherwise haven't interacted with it except when required.
JSR_FDED 5 hours ago||
I’m sad that it’s crowded out all the interesting stuff I used to love learning about on HN.
JoshTriplett 4 hours ago||
I'm sad that it's crowding some of those things out of existence, not just out of being talked about.
4k93n2 2 hours ago|||
You can at least block some of it out with uBlock Origin:

  news.ycombinator.com##td.title:has-text(/\bLLM\b|\bAI\b/)
dandrew5 1 hour ago||
I use a bookmarklet for this: https://github.com/dan-lovelace/hn-blocklist
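For illustration, the core of such a bookmarklet can be sketched like this (a hypothetical sketch, not the code of the linked hn-blocklist project; the pattern and selectors are assumptions you'd adjust to taste):

```javascript
// Word boundaries (\b) avoid hiding titles like "Paint" or "maintain"
// that merely contain the letters "ai".
const BLOCKED = /\bAI\b|\bLLM\b|\bGPT\b/i;

// Pure matching logic, kept separate so it's easy to test and tweak.
function shouldHide(title) {
  return BLOCKED.test(title);
}

// In the browser, walk HN's title cells and hide the rows that match.
// (Guarded so the matching logic above also runs outside a browser.)
if (typeof document !== "undefined") {
  for (const cell of document.querySelectorAll("td.title .titleline")) {
    if (shouldHide(cell.textContent)) {
      cell.closest("tr").style.display = "none";
    }
  }
}
```

Wrapped in `javascript:(() => { ... })();`, that becomes a one-click bookmarklet; the trade-off versus a uBlock filter is that you run it on demand instead of having every page load filtered automatically.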
mtndew4brkfst 4 hours ago|||
Not limited to here, of course. Net-new publications to arXiv in some (most?) CS subcategories are >=90% about models, transformers, training, quantization, or some other directly related field, or about how to apply these to a different specialty.
Kye 3 hours ago||
This is completely normal when a new thing is in the third section of the technology adoption curve.[0] AI will either go away (unlikely) or become a footnote in posts about what people are doing with it in the next stage.

[0] https://en.wikipedia.org/wiki/Technology_adoption_life_cycle

Alternately: the trough of disillusionment.

https://en.wikipedia.org/wiki/Gartner_hype_cycle

pvorb 1 hour ago|
I really like this paragraph about management caring about AI:

> What makes this worse, is our bosses have bought into it this time too. My managers never cared much about database technologies, IDE’s or javascript frameworks; they just wanted the feature so they could sell it. Management seems to have stepped firmly and somewhat haphazardly into the implementation detail now. I reckon most of us have got some sort of company initiative to ‘use more AI’ in our objectives this year.

More comments...