Posted by theletterf 8 hours ago

A letter to those who fired tech writers because of AI(passo.uno)
211 points | 130 comments | page 2
prakashn27 3 hours ago|
I have not fired a technical writer, but writing documentation that understands and maintains users' focus is hard, even with an LLM. I am trying to write documentation for my startup, and it is harder than I expected.

Kudos to all the technical writers who made my job as a software engineer easier.

ChrisMarshallNY 3 hours ago||
Good points.

I suspect a lot of folks are asking ChatGPT to summarize it…

I can’t imagine just letting an LLM write an app, server, or documentation package wholesale and unsupervised, but I have found them extremely helpful in editing and writing portions of a whole.

The one thing that could be a light in the darkness is that publishers have already fired all their editors (nothing to do with AI), and the writing out there shows it. This means there’s the possibility that AI could bring back editing.

groovy2shoes 2 hours ago|
as a writer, i have found AI editing tools to be woefully unhelpful. they tend to focus on specific usage guidelines (think Strunk & White) and have little to offer for other, far more important aspects of writing.

i wrote a 5-page essay in November. the AI editor had sixty-something recommendations, and i accepted exactly one of them. it was a suggestion to hyphenate the adjectival phrase "25-year-old". i doubt that it had any measurable impact on the effectiveness of the essay.

thing is, i know all the elements of style. i know proper grammar and accepted orthographic conventions. i have read and followed many different style guides. i could best any English teacher at that game. when i violate the principles (and i do it often), i do so deliberately. i spent a lot of time going through suggestions that would only genericize my writing. it was a huge waste of my time.

i asked a friend to read it and got some excellent suggestions: remove a digressive paragraph, rephrase a few things for persuasive effect, and clarify a sentence. i took all of these suggestions, and the essay was markedly improved. i'm skeptical that an LLM will ever have such a grasp of the emotional and persuasive strength of a text as to make recommendations like that.

ChrisMarshallNY 1 hour ago||
Thanks!

That makes a lot of sense, but right now, the editing seems to be completely absent, and, I suspect, most writers aren’t at your level (I am sure that I’m not).

It may be better than nothing.

InMice 4 hours ago||
Is it expected that LLMs will continue to improve over time? All the recent articles like this one seem to treat this technology's faults as fixed and permanent, basically saying "turn around and go no further". Honestly asking, because their arguments seem to depend on improvement never happening and the faults never being overcome. It feels shortsighted.
marcosdumay 2 hours ago||
> Is it expected that LLMs will continue to improve over time?

By whom?

Your expectations aren't the same everybody has.

InMice 47 minutes ago||
I don't have any expectations. That's why I was asking.
LtWorf 1 hour ago||
The LLM can't actually use the product and realise that the description is wrong.
EagnaIonat 3 hours ago||
Nice read after the earlier post saying fire all your tech writers. Good post.

One thing to add is that the LLM doesn't know what it can't see; it just amplifies what is there. Assumed knowledge is quite common with developers and their own code, as is the classic "it works on my machine", where something is set outside of the code environment.
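
A contrived sketch of the kind of thing I mean (all names hypothetical): the code below runs fine on the developer's machine, but nothing in the repo records what the variable should be set to, so docs generated from the code alone will miss it.

    # "works on my machine": PAYMENTS_URL is set in the developer's shell
    # profile, not anywhere in the repo, so code-derived docs can't tell
    # a reader what value to set or where it comes from.
    import os

    PAYMENTS_URL = os.environ["PAYMENTS_URL"]  # KeyError on a fresh machine

    def charge(amount_cents: int) -> None:
        # the endpoint's expected format and auth live outside this file
        print(f"POST {PAYMENTS_URL}/charge amount={amount_cents}")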

Sadly other fields are experiencing the same issue of someone outside their field saying AI can straight up replace them.

theletterf 2 hours ago|
> after the earlier post saying fire all your tech writers

What post was that?

motbus3 3 hours ago||
I like the post but we can learn from insurance companies.

They have AI finding reasons to reject totally valid claims.

They are arguing in court that this is a software bug and that they should not be liable.

That will be the standard excuse. I hope it does not work.

ninalanyon 2 hours ago||
I remember the days when every large concern employed technical writers and didn't expect us programmers and engineers to write for the end users. But that stopped decades ago in most places, at least as far as in-house applications are concerned, long before AI could be used as an excuse for firing technical writers.
aurareturn 5 hours ago||
But you might not need 5 tech writers anymore. Just 1 who controls an LLM.
theletterf 5 hours ago|
Perhaps. Could the same be said for engineers?
aurareturn 5 hours ago|||
Yes. That could be said for engineers as well.

If the business can no longer justify 5 engineers, then they might only have 1.

I've always said that we won't need fewer software developers with AI. It's just that each company will require fewer developers but there will be more companies.

I.e.:

2022: 100 companies employ 10,000 engineers

2026: 1000 companies employ 10,000 engineers

The net result is the same for employment. But because AI makes each company that much more efficient, many businesses that weren't financially viable when they needed 100 engineers might become viable with 10 engineers + AI.

SturgeonsLaw 3 hours ago||
There's another scenario... 100 companies employ 1000 engineers
gjm11 3 hours ago|||
The person you're replying to is obviously and explicitly aware that that is another scenario, and the whole point of their comment was to argue against it and explain why they think something else is more likely. Merely restating the thing they were already arguing against adds nothing to the discussion.
aurareturn 2 hours ago|||
Why do you think this outcome is more likely?
oldjim798 2 hours ago||
Because this is what capital has told us. Capital always wants to reduce the labour cost to $0.
aurareturn 2 hours ago|||
If labor cost is close to $0, even more businesses that weren’t viable before would become viable.

Do you not see the logic?

menaerus 33 minutes ago|||
Demand is the driver, not only cost.
squeefers 1 hour ago|||
ai is bad because it's automated his job. luddites in tech. a real contradiction
DangitBobby 1 hour ago||
Not really a contradiction, since the entire point of jobs and the economy at all is to serve the specific needs of humanity, not to maximize paper clip production. If we should be learning anything from the modern era, it's something that should have always been obvious: the Luddites were not the bad guys. The truth is you've fallen for centuries-old propaganda. Hopefully someday you'll evolve into someone who doesn't carry water for paperclip maximizers.
9rx 1 hour ago|||
Zero labor cost should see the number of engineers trend towards infinity. The earlier comment suggested the opposite — that it would fall to just 1000 engineers. That would indicate that the cost of labor has skyrocketed.
DangitBobby 1 hour ago||
That doesn't make sense. Demand isn't entirely dictated by cost. There is only so much productivity the world is equipped to consume.
9rx 1 hour ago||
What difference does that make? If the cost of an engineer is zero, they can work on all kinds of nonsensical things that will never be used/consumed. It doesn't really matter as it doesn't cost anything.
DangitBobby 29 minutes ago||
I'm kinda baffled by your suggestion. That's just not how people, or organizations run by people, operate. Cost is not the only driver of demand.
ap99 5 hours ago||||
Yes and no.

Five engineers could be turned into maybe two, but probably not fewer.

It's the 'bus factor' at play. If you still want human approvals on pull requests, then when one of those engineers goes on vacation or leaves the company, you're stuck with one engineer for a while.

If both leave then you're screwed.

If you're a small startup, then sure there are no rules and it's the wild west. One dev can run the world.

marginalia_nu 5 hours ago||
This was true even before LLMs. Development has always scaled very poorly with team size. A team of 20 heads is at most about twice as productive as a team of 5, and a team of 5 is marginally more productive than a team of 3.

Peak productivity has always been somewhere between 1-3 people, though if any one of those people can't or won't continue working for one reason or another, it's generally game over for the project. So you hire more.

This is why small software startups time and time again manage to run circles around organizations with much larger budgets. A 10-person game studio like Team Cherry can release smash hit after smash hit, while Ubisoft, with 170,000% the personnel count, visibly flounders. Imagine doing that in hardware, like if you could just grab some buddies and start a business successfully competing with TSMC out of your garage. That's clearly not possible. But in software, it actually is.

matwood 5 hours ago||||
That assumes your backlog is finite.

Is the tech writers' backlog also seemingly infinite, like every tech backlog I've ever seen?

imtringued 4 hours ago|||
The tech writer backlog is probably worse, because writing good documentation requires extensive experience with the software you're writing documentation about, and there are four types of documentation you need to produce.
DeborahWrites 5 hours ago|||
Yes. Yes it is.
ekidd 4 hours ago||||
Yes. I have been building software and acting as tech lead for close to 30 years.

I am not even quite sure I know how to manage a team of more than two programmers right now. Opus 4.5, in the hands of someone who knows what they are doing, can develop software almost as fast as I can write specs and review code. And it's just plain better at writing code than 60% of my graduating class was back in the day. I have banned at least one person from ever writing a commit message or pull request again, because Claude will explain it better.

Now, most people don't know how to squeeze that much productivity out of it, most corporate procurement would take 9 months to buy a bucket if it was raining money outside, and it's possible to turn your code into unmaintainable slop at warp speed. And Claude is better at writing code than it is at almost anything else, so the rest of y'all are safe for a while.

But if you think that tech writers, or translators, or software developers are the only people who are going to get hit by waves of downsizing, then you're not paying attention.

Even if the underlying AI tech stalls out hard and permanently in 2026, there's a wave of change coming, and we are not ready. Nothing in our society, economy or politics is ready to deal with what's coming. And that scares me a bit these days.

aniou 3 hours ago||
"And it's just plain better at writing code than 60% of my graduating class was back in the day".

Only because it has access to a vast amount of sample code from which to draw and recombine parts. Have you ever considered emerging technologies, like new languages or frameworks that might be much better suited to your area, but are so new that there is no codebase for the LLM to draw from?

I'm starting to see a risk of technological stagnation in many areas.

raincole 5 hours ago||||
We have been seeing this happen in real time in the past two years, no?
amelius 5 hours ago|||
Yes. But they are now called managers.
aniou 4 hours ago||
First, we've fallen into a nomenclature trap: so-called "AI" has nothing to do with "intelligence." Even its creators admit this, hence the name "AGI," since the more fitting acronym was already taken.

But when we use the "AI" acronym, our brains still register the "intelligence" attribute and tend to perceive LLMs as more powerful than they actually are.

Current models are like trained parrots that can draw colored blocks and insert them into the appropriate slots. Sure, much faster and with incomparably more data. But they're still parrots.

This story and the discussions remind me of reports and articles about the first computers. People were so impressed by the speed of their mathematical calculations that they called them "electronic brains" and considered, even feared, "robot intelligence."

Now we're so impressed by the speed of pattern matching that we call it "artificial intelligence," and we're back in the same place.

theshrike79 2 hours ago||
Documentation needs to be tested.

Someone has to turn off their brain completely and just follow the instructions as-is. Then log the locations where the documentation wasn't clear enough or assumed some knowledge that wasn't given in the docs.

throwaw12 3 hours ago|
I will share my experience; hopefully it answers some questions for tech writers.

I was a terrible writer, but we had to write good docs and make it easy for our customers to integrate with our products. So I prepared the context for our tech writers, and they created nice documentation pages.

The cycle (which reasonably took about a week, depending on tech writer workload) was:

    1. prepare context
    2. create a ticket for the tech writers, wait until they respond
    3. discuss messaging over a call
    4. a couple of days later, I get the first draft
    5. iterate on the draft, then finally publish it
Today it's different:

    1. I prepare all the context and the style guide, then feed them into the LLM.
    1.1. the context is extracted directly from code by coding agents
    2. I proofread it and in 97% of cases accept it, because it follows the style guide and mostly transforms my context correctly into customer-consumable content.
    3. Done. Less than 20 minutes.
Tech writers were doing an amazing job, of course, but I can get 90-95% quality in 1% of the time spent on that work.
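
Roughly, steps 1-2 can look like the sketch below. This is a minimal sketch assuming an OpenAI-style API; the file paths, model name, and prompt wording are placeholders, not my exact setup:

    # sketch: feed extracted context + style guide to an LLM, get a draft back
    # (paths, model name, and prompts are illustrative placeholders)
    from pathlib import Path
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    style_guide = Path("docs/style-guide.md").read_text()
    context = Path("out/agent-extracted-context.md").read_text()

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "You are a technical writer. Follow this style guide:\n"
                        + style_guide},
            {"role": "user",
             "content": "Turn this engineering context into a customer-facing "
                        "integration guide:\n" + context},
        ],
    )

    Path("docs/draft.md").write_text(response.choices[0].message.content)

The proofreading step stays human; the script only produces the draft.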
arionmiles 2 hours ago||
If you're getting such value out of LLMs, I'm intrigued to learn more about what exactly it is that you're feeding them.

People boast about the gains with LLMs all the damn time and I'm sceptical of it all unless I see their inputs.

anonymous_sorry 1 hour ago||
Your docs are probably read many more times than they are written. It might be cheaper and quicker to produce them at 90% quality, but surely the important metric is how much time it saves or costs your readers?
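
Back-of-the-envelope version, with all numbers hypothetical:

    # hypothetical numbers: author time saved vs. reader time lost
    author_hours_saved = 39            # roughly a week's cycle vs. 20 minutes
    readers = 1000
    extra_minutes_per_reader = 5       # friction from 90%-quality docs
    reader_hours_lost = readers * extra_minutes_per_reader / 60
    print(reader_hours_lost)           # 83.3 hours, more than the saving

If the draft really is close to full quality the trade may still favor speed, but reader time belongs in the equation.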