Top
Best
New

Posted by NotAnOtter 7/4/2025

Ask HN: Worth leaving position over push to adopt vibe coding?

My company is increasingly pushing prompt engineering as the single way we "should" be coding. The CEO & CTO are both obsessed with it and promote things like "delete entire unit test file & have claude generate a new one" rather than manually address test failures.

I'm a 'senior engineer' with ~5 years of industry experience and am considering moving on from this company because I don't want

1. Be pushed into a workflow that will cause my technical growth to stall or degrade 2. Be overseeing a bunch of AI-generated spaghetti 2-3 years from now

Feel free to address my specific situation but I'm interested in more general opinions.

82 points | 94 comments
Ancapistani 7/4/2025|
I’ve been doing this for 20 years. I see it as you having two primary options.

You should stay there, learn the new tech, and see what happens.

If it works better than you expected, then your mind will be changed and you’ll be well positioned for the new economy.

If it turns out how you expect, now you have experience working with this tooling to inform your positions at your next company.

Either way, a few months in that environment will help your career.

andrei_says_ 7/5/2025||
I have the same recommendation.

Learn the strengths and weaknesses of the new technology and add it to your resume.

Become the AI advisor who can help an organization adopt the tech where appropriate and avoid the traps associated with top-down hype- and fomo-driven adoption.

Also who knows where the AI cycle will be in 2-3 years. My sense is by then we will see the cost of tech debt caused by LLM generated code, the cost of the ignorance and naïveté of vibe coding and the cost of VC money wanting its ROI on a subsidized tech.

GianFabien 7/5/2025|||
All of the above +

Start looking for a new role that is better aligned with your expectations. You may find it harder than you expect. In which case, you might be glad you didn't burn your bridges in a pique over AI mandates by the CEO & CTO.

SftwrSvior81 7/5/2025|||
Agree with this wholeheartedly. Stay there for now while looking for a new job. If things work out ok at the current place, end your search and enjoy your current job. If they don't work out, hopefully you'll have some options to move into. Best of luck to you.
swithek 7/7/2025||
[dead]
muzani 7/4/2025||
It's the same elsewhere. Some places are actually using it as a way to get rid of people 'resistant to change'. It also remains to be seen what technical skills we need 5 years from now. I did memory management and pointers 15 years ago and I can still do them now.

What I'd suggest is adapt to it, find ways to push back. Obviously things like "delete entire unit test file & have claude generate a new one" is a bad idea. I've seen claude "monkey patching" a system so that it returns true to the tests.

This issue is going to pop up in the future. Experiment with it on the company's dime even if you've checked out emotionally. You are still doing your job - improving code quality and making sure things run.

The new approach seems to be doing TDD. One, as an engineer, you'll know when AI is bullshitting you with mocks. Even when mocks are BS, you can still test the thing they're meant to represent. 2) AI spits more code than anyone can review. The red, green, refactor approach is one way to keep them on the rails.

TheNewsIsHere 7/5/2025|
> I've seen claude "monkey patching" a system so that it returns true to the tests.

I’ve watched Github Copilot do the same thing. I’ve also seen it doubling down on ridiculous things and just spewing crash-laden messes. There seems to be a low upper ceiling on how “competent” it is, which makes sense.

al_borland 7/12/2025||
In my own use of Copilot, I found Gemini gives me better results than ChatGPT and Claude. To the point where ChatGPT and Claude will flounder on a problem for hours of back and forth, where Gemini will one-shot the same thing.
biglyburrito 7/5/2025||
Stay, but actively look for other opportunities. It’s a miserable tech job market right now; hunkering down and continuing to collect a paycheck is something you shouldn’t take for granted.

As others have said, use this opportunity to learn about what works w/ AI based on all the crap that doesn’t. Your C-level execs have given you carte blanche to fuck around, without worrying too much about the immediate-term consequences. If they literally said to delete existing test files & generate new ones using AI, do it! And when shit inevitably goes sideways, you’ll be able to spend more of your salaried time rewriting those tests to probably look & work more like the original tests that you deleted.

And while you’re learning to use AI, you’ll be burning the company’s funny-money on AI usage fees. At some point, they’re gonna realize they don’t have a lot to show for all the money that was thrown away in service of doing what they told employees to do. At which point, they’ll take a more measured & pragmatic approach towards using AI. Do everyone a favor & help them get to that point sooner than later.

HenryBemis 7/5/2025|
I fear that when the CFO and CEO will walk in holding a hammer and a saw, the CTO will start blaming the Directors for not "maximizing value" by "misusing" AI.. because why blame the tech when you can fire 20 more devs/engineers and hire 20 more eager and cheaper ones...?

As for the long term consequences.. I suggest "in front of your nose" by Orwell.

sircastor 7/4/2025||
I'm a senior engineer with 20+ (oof) years of industry experience. I appreciate that this sucks and you don't want to do it. I wouldn't either. That said, it's a hirer's market out there right now. There will be plenty of people who will be happy to take your position while you're looking for something you prefer.

My opinion is that we're going to have about 5 years of this. Managers and C-suite folks are going to do their absolute darnedest to replace and supplement people with AI tools before they figure out it's not going to work. While I appreciate the differences, I remember seeing this ~6-7 years ago with blockchain at my last role. It'll work itself out. In the mean time, you get to contribute to the situation, instead of simply not being present. It's not going to be fun of course.

I don't think we're ever going back from this. There's an entire generation of new coders, and new managers who are growing up with this stuff. It's part of their experience, and suggesting they not use it is going to be akin to asking if you can use a typewriter instead of a computer with a word processor. Some companies will take longer to adopt, but it's coming...

noduerme 7/5/2025|
I feel I'm sort of stuck in the opposite situation of OP. I manage a few massive codebases that I simply cannot trust an AI to go mucking around with. The only type of serious AI coding experience I could get at this point would be to branch one of these and start experimenting on my own dime to see how good or bad the actual experience is. And that doesn't really seem worth it, because I know what I want to do with them (what's on the feature list that I'm being paid to develop)... and it feels like it would take more time to talk to an LLM and get it perfectly dialed in on any given feature, and ensure it was correct, than it would take to write it myself. And I'm not getting paid for it.

I feel like I'd never use Claude seriously unless someone demanded I used it from day one on a greenfield project. And so while I get to keep evolving my coding skills, I'm a little worried that my "AI skills" will lag behind.

sircastor 7/5/2025|||
I do a lot of non-work AI stuff on my own, from pair programming with AI, asking it to generate whole things, to just asking it to clarify a general approach to a problem.

FWIW, in a work environment (and I have not been given the go-ahead to start this at my work) I would start by supplementing my codebase. Add a new feature via AI coding, or maybe reworking some existing function. Start small.

slau 7/5/2025|||
With all due respect, and I’m particularly anti-LLM, you sound exactly like someone who has never tried the tech.

You can use LLMs without letting them run wild on the entire codebase. You have git, so you can see every minute change it makes. You can limit what files it’s allowed to change and how much context you give it.

You don’t have to give it root on your machine to make it useful. You don’t have to “Jesus, Take the Wheel”. It is possible to try it out at a smaller scale, even on critical code.

UncleOxidant 7/5/2025||
I'm basically retired now and I'm really glad about the timing - I would not want to be in this field if I were in my 30s, 40s or 50s the way things are going. I think what's happening at your company is happening in lots of companies right now so I don't think you'll be able to jump ship and end up somewhere else where it's not happening. You can hope for a backlash - and it might come. In the meantime, go ahead and vibecode being careful about the areas you do it in - they seem pretty good at coming up with testcases, for example. Maybe don't let your coding agent have full editing permissions. Have it give you suggestions for what it would do in the code and evaluate them closely before letting the edits happen (pushing back when needed).
wrs 7/5/2025||
I know with only 5 years experience this may not be obvious, but this is only the first of many “revolutionary” technologies making everyone around you lose their minds that you’ll have to deal with in your career. Like every other such technology, I recommend that you engage with it, understand it, relate that experience to what your employer does, and be the voice of knowledgeable pragmatism about where to use it. In other words, be an engineer.

If that can’t be done where you are, or isn’t valued, you’re in the wrong place.

I’ve been through this with (including but not limited to) PCs, OOP, client-server, SOA, XML, NoSQL, blockchain, “big data”, and indeed, multiple definitions of “AI”. Turns out all but one of those were actually somewhat useful in the end, when applied properly, but they didn’t eliminate the industry. Just roll with it.

xtracto 7/5/2025||
Reminds me when Rational Rose and UML were briefly famous in the late 90s. What an absolute piece of crap that the suits pushed to use.
alfiedotwtf 7/5/2025||
I remember at the time that Rational Rose was going to allow non-programmers to make apps…

History doesn’t repeat, but it rhymes

CamperBob2 7/5/2025||
This time is different.

No, really. This time is different.

RaftPeople 7/5/2025|||
> I’ve been through this with (including but not limited to) PCs, OOP, client-server, SOA, XML, NoSQL, blockchain, “big data”, and indeed, multiple definitions of “AI”. Turns out all but one of those were actually somewhat useful in the end, when applied properly, but they didn’t eliminate the industry. Just roll with it.

My non-technical boss was excited about some silver-bullet tech and I had to walk him through how these things play out.

I talked through most of the industry's overhyped "trends" since the 70's and describe the promise and the reality (very similar to your list).

I did mention that there were 2 trends that had more lasting general application (compared to things that were just another tool in the toolbox used for some situations):

1-Relational databases

2-Internet

edanm 7/5/2025||
> I know with only 5 years experience this may not be obvious, but this is only the first of many “revolutionary” technologies making everyone around you lose their minds that you’ll have to deal with in your career.

While this has some truth, the size of the current "revolution" makes all the others look tiny, especially in terms of how it affects a programmer's day job. Nor did most of those "revolutions" affect every field of programming at once, like this one does. The percent of programmers actually impacted by blockchain is probably in the low single-digits. The percent of programmers using some version of AI tooling 3 years into this is probably >50%, and the more impactful tools will be used more very soon is my gues.

wrs 7/5/2025||
Were you around for the OOP craze? It definitely affected a lot of peoples’ day job. I mean, quite a few people use C++ and Java, no?

In my list I didn’t even mention the internet, the web, smartphones, and the cloud, all of which had a very broad effect on programming and programmers, and had similar top-down edicts from the C-suite, e.g., declaring you must be “all-in on cloud”. Turns out those things were indeed quite transformative, but now that the hype has dissipated somewhat, we’ve absorbed them into the toolkit and just proceed with the engineering.

edanm 7/5/2025|||
I've been programming professionally since around 2003. I'd say I caught the late-stages OOP craze (or maybe after-craze) though not really the origin or rise of OOP. My first job was in a C++ codebase, and I spent a lot of time learning the OOP design patterns (or really, the "how do we make C++ behave well" patterns).

That said, the rise of OOP is probably measured in a decade or two. It eventually "spawned" whole new programming languages, that eventually got a lot of popularity. But this is over a much longer time-frame than how quickly we went from no such thing as AI coding, to (now) coding agents. It also didn't affect the entire industry in the same way - hell, some people were still writing assembly in the 80s as the OOP craze was winding up. I don't have actual stats, but I imagine coding agents are far more ubiquitious across far more industries/languages/stacks, for more quickly.

> In my list I didn’t even mention the internet, the web, smartphones, and the cloud, all of which had a very broad effect on programming and programmers,

The internet (or maybe the web) I'd say was probably the more transformative thing. Cloud affected a lot of things too but not quite as much and didn't make quite as big a difference to the day-to-day work. I deployed things pre-cloud and post-cloud (though honestly mostly during the early-stage cloud), and there wasn't such a big difference.

Look, at the end of the day you can't just compare AI to other technologies blindly and say "well they were big hypes, this is the same". By actually looking at what it's doing and how it's affecting things, it's fairly clear it has a much bigger impact on the day-to-day work of programming, as opposed to anything else you mentioned.

I'm not saying this is the end of development, for all I know this will mean more developers! But I think software development will look fundamentally different in 5 years in a way that is far more widespread than in any other of these changes.

wrs 7/5/2025||
I’m not comparing anything blindly; that is in fact the exact opposite of my advice. Nor am I saying all big hypes are equivalent, just that there’s always a big hype about something, and you need a strategy to stay levelheaded about them despite the irrational polarized yelling you hear.

I use LLMs every day in my work (both to help write code and as a component of the thing I’m coding). They’re pretty cool. But as an engineer you need to make decisions based on what they actually do, in your empirical observation, not what people tell you they will do, eventually, in their fantasies. Speculating about that is just noise. The engineer’s job is to find the signal.

In my observation, I can trust an LLM to write code way more than last year, but I still have to keep it on a very short leash. Will it be better next year? I don’t know. Nobody does.

edanm 7/6/2025||
Yes, agreed on basically everything.
RaftPeople 7/5/2025|||
> Were you around for the OOP craze?

In the early 90's I was working for a large ERP company that went all in on OOP and distributed objects.

I was talking to one of the guys from the new team they created to re-write the entire system and had an entertaining conversation:

Tech guy that had drunk the kool-aid (TGTHDTKA): "...and the objects can just automatically interact with each other, like I can drop this person object on the phone object and it just automatically makes the phone call to that person"

Me: "Uh, but you still had to write specific code that causes that interaction to occur, you can't just do that with objects that haven't agreed on how to communicate"

TGTHDTKA: "No, it's all automatic because a person has a phone number and a phone uses a phone number to make calls, so you don't have to code anything special"

etc.

wrs 7/5/2025||
Now the MCP kool-aid is about how we’re finally going to make that work, because the computer on either end can literally read the field descriptions and intuit the interaction. MCP uses LLMs to supply the magic that the WSDL folks never admitted was necessary.
qualeed 7/4/2025||
Is it worth leaving? Hard to say for your specific situation, there's thousands of variables that no one here will ever know. Unless you are a superstar or independently wealthy, it's typically a bad idea to leave a job before you have something else lined up.

Is it worth looking? Absolutely! It will be much easier to make a decision when you're comparing your current position to a job offer, rather than comparing your current position to an unknown. I would also add, no matter what you feel about your current job, it's always a good idea to keep feelers out there for new positions. The fastest way up the rank and salary ladders is moving to new positions. It will always outpace internal promotions.

kazinator 7/4/2025|
Assume OP is talking about grabbing a new rope before letting go of another one; otherwise we are mixing generalities about career moves not specific to the issue in the topic.
qualeed 7/5/2025||
>generalities about career moves not specific to the issue in the topic.

They explicitly asked for general opinions, and provided almost no context which would let me be more specific.

"Is it worth leaving position over push to adopt X" is not exclusive to AI, nor is it a new question, so I addressed the general case.

juandsc 7/4/2025||
Three years ago I left my job with VERY high salary because I was starting to burn out and took two months off.

From my experience, if you're burnt out or starting to burn out then leave, otherwise I recommend staying until you secure another job.

Regarding the situation, they want to delete the tests? Fine, you have git right? Replace it, and let everything set on fire, quietly enjoy the chaos and at some point revert the changes. Or don't, you're leaving anyway.

specialist 7/4/2025||
Nah.

There's always pointless fads and food fights. Just tough it out. (Until a better gig comes along.)

I wish I could advise my young self "this too shall pass". The savvy play is to be a "team player". All those dumb hills I choose to die on... For dumb crap which eventually self-mooted all by themselves.

There was a comment (or a story?) some time back about how to survive as a software developer when projects are managed by Pointy Haired Bosses (PHBs). From memory:

Always be positive, optimistic.

Never say no or hedge or doubt.

Proactively manage upwards with enthusiastic status reports.

Instead of owning up to failures (due to fantasy estimates, ridiculous deadlines, scope creep, misc chaos, etc), list in detail all the great things and awesome progress you and your fantastic team have miraculously accomplished.

Like "reproducible builds which reduced failures by 1000% FTW, saving 30 hours per week" and "implemented boss' recommended pub/sub heptagonal meta architecture event sourced distributed pseudo sharded hybrid cloud something something, attaining sustained P95 latency of sub 3 picoseconds for 2 days"

Sadly, I was never able to keep up the act for more than 12 months. I'm just too contrarian, sarcastic, jaded. (I used to tell myself that I was "results oriented". I now see I was just an asshole. Everyone lies, needs to suspend disbelief, have a reason to crawl out of bed every morning. Who am I to piddle in their Cheerios?)

I'd like to think that if someone had clubbed young(est) me with the clue stick, I could have adapted.

YMMV. Happy Hunting.

y0eswddl 7/7/2025|
> Everyone lies, needs to suspend disbelief...

It's wild to me that somehow we're wrong for not wanting to have to do this every day to get by...

specialist 7/11/2025||
Yup. I've long been semi-curious about deceit, lying, etc. From studies of how our primate cousins lie to each other in the never-ending game of Get The Banana to every day Machiavellianism.

Does the person lying to me know they're lying? Surely they know that I know, right?

I'm also terrified that I'm lying to myself. Am I just in denial? Is some level of wishful thinking necessary? What are my blind spots?

I'm no smarter about any of this stuff (The Human Condition) than anyone else.

ICYMI, I enjoyed the book Everybody Lies by Seth Stephens-Davidowitz:

https://www.goodreads.com/book/show/28512671-everybody-lies

Please share any books, links, observations you have. TIA.

Uptrenda 7/4/2025|
Don't leave your job. Unless you've looked for jobs recently you have no idea how bad the current job market is. You also need to adapt to AI. Obviously, clueless people are going to misuse it. But let them learn the hard way. I've found that in organizations where the higher ups are incompetent if you try to signal the problem you become the problem. Then you're viewed as "being hard to work with" rather than trying to prevent a tragedy. Keep your job lad.

If you do get another offer remember that there's always a risk when you change jobs. I.E. how stable is that companies funding? Will they want to do layoffs, too? Are their investors pressuring them to make cuts? Because if you're a new hire you can say good bye to that job. We don't have formal tenure in tech but there's still a human cost to firing people who have been long-time with a company. The decision makers have less attachment to a new hire so its easier to fire them in that respect (and how many decisions with fires are just arbitrary, number-based, bad luck.)

More comments...