Posted by zdw 10 hours ago

Our newsroom AI policy (arstechnica.com)
94 points | 64 comments
JumpCrisscross 7 hours ago|
Context:

"An AI agent of unknown ownership autonomously wrote and published a personalized hit piece about me after I rejected its code, attempting to damage my reputation and shame me into accepting its changes into a mainstream python library.

...

I’ve talked to several reporters, and quite a few news outlets have covered the story. Ars Technica wasn’t one of the ones that reached out to me, but I especially thought this piece from them was interesting (since taken down – here’s the archive link). They had some nice quotes from my blog post explaining what was going on. The problem is that these quotes were not written by me, never existed, and appear to be AI hallucinations themselves.

This blog you’re on right now is set up to block AI agents from scraping it (I actually spent some time yesterday trying to disable that but couldn’t figure out how). My guess is that the authors asked ChatGPT or similar to either go grab quotes or write the article wholesale. When it couldn’t access the page it generated these plausible quotes instead, and no fact check was performed.

...

Update: Ars Technica issued a brief statement admitting that AI was used to fabricate these quotes" [1].

[1] https://theshamblog.com/an-ai-agent-published-a-hit-piece-on...

Discussion: https://news.ycombinator.com/item?id=47009949

lproven 3 hours ago||
From TFA:

> Our approach comes from two convictions:

Uhuh.

> that AI cannot replace human insight, creativity, and ingenuity

Sure, agreed, no dispute.

> and that these tools, used well, can help professionals do better work.

[[citation needed]]

Prove it. I don't believe you.

bombcar 2 hours ago|
If AI slop-generation helps your work, then your work was at least partially generating slop.

Slop may be useful, helpful, any number of things, but if you're using slop you're using slop.

ares623 8 hours ago||
Trust, reputation, and credibility will become (even more of) a premium.
tomalbrc 6 hours ago||
Good to know which companies/news to avoid.
gnabgib 9 hours ago||
Doesn't need Ars Technica added to the title
npodbielski 7 hours ago||
It is nice to see, but I fear it will go the same way as newspapers did with the internet. I could buy a paper and read it, but why would I?

The same will most likely happen with human-written news and cheap AI slop news. Why would anyone pay more for higher quality when you can have a low-quality, cheap product?

Look at food, for example. Price is the most important factor in the choice of what you are going to buy. It will probably not happen now, in a few months, or even in a few years, but it will happen if models keep advancing.

nemomarx 3 hours ago|
Food would actually be a pretty good example - people pay extra for higher quality food, local farm food, whatever all the time. They go out to expensive restaurants that talk up their techniques and sourcing. There's a lot of defined space for refined food like this.

If AI-generated and human-written content ended up like that, you would have a pretty decent shot at a fully human-authored blog or Substack that people paid for specifically, or specially curated human-written books.

npodbielski 1 hour ago||
You could say the same about horses: people still ride them, keep stables, buy expensive ones, or even breed them themselves. That doesn't change the fact that common people usually drive cheap cars.

Of course, everything can be argued via analogy that way, but I think the outcome of cheap, mostly correct but often completely wrong news is the more probable one.

Just like today's social media.

Philex 7 hours ago||
[dead]
vintagedave 8 hours ago|
> Anyone who uses AI tools in our editorial workflow is responsible for the accuracy and integrity of the resulting work. This responsibility cannot be transferred to colleagues, editors...

This sounds like a direct callout to the incident earlier this year where an apparently sick staff member relied on an AI to reproduce quotes, and it did not. Ars retracted the article and the staff member was fired.

I have felt very ethically uneasy about this because the person was ill, and I emailed the Ars editorial team directly to express concern about labour conditions, and to note that it is the editorial team's responsibility to do things like check quotes.

Of course it is the journalist's responsibility: when you have a job, you do your job by policy (I wonder if this policy existed in writing at the time of the firing?), and accuracy is part of the job. But I am also a firm believer that responsibility grows at higher levels. This sounds like a direct abrogation of journalistic standards by the Ars editorial team.

cubefox 6 hours ago||
> and to note that it is the editorial team's responsibility to do things like check quotes.

Publishing things online for free (as Ars does) is a difficult business. I doubt they can realistically afford an "editorial team" which checks quotes. Paying the journalists is expensive enough.

lynx97 7 hours ago|||
> apparently sick staff member relied on an AI to reproduce quotes

"Apparently sick", you couldn't phrase it more accurately.

Kudos for firing them, the only valid course of action for a publisher.

vintagedave 1 hour ago||
That's harsh. I feel any situation where someone is ill and required to work (the appearance, which is a labour issue if true), and makes mistakes while sick, should be treated with a little kindness. I worry they were made an example of.
intended 5 hours ago||
>This sounds a direct abrogation of journalistic standards by the Ars editorial team.

We depended on an ecosystem of news and journalism to keep our polities informed.

However, if that ecosystem is starving, it will increasingly fail to live up to its standards, and we can expect these failures to affect us more and more.

I am not defending bad journalists, nor creating an excuse to tolerate such behavior in the future.

I am describing the macro trend we are facing, the failure state we can expect, and asking what happens if nothing grows to replace it.

The NYT earns more revenue from games than from journalism and ads. Wikipedia is seeing reduced visitors due to AI summaries, and this leads to lower donations. A review site I used went fully behind a paywall.

I don't really see how Ars or most other sites will be able to earn revenue and pay salaries in this bot-first environment.

bombcar 2 hours ago|||
>We depended on an ecosystem of news and journalism to keep our polities informed.

If this is true and necessary we might as well skip the middleman and have the news and journalists run the polities.

nemomarx 3 hours ago||||
Ars has a decently pricey direct subscription, doesn't it? With a lot of tech focused features included. Their strategy is probably the best you could set up in this ecosystem.
bombcar 2 hours ago||
If it isn't clear from this policy that Ars is run by the advertisers and not the subscribers, I don't know what would make it clear.

Advertisers only care about eyeballs and really bad press; AI increases the first and rarely causes the second.

nemomarx 2 hours ago||
My more cynical take is that this might be as subscriber driven as it's possible for a news outlet now. Keep an eye on 404 and see if they can resist the gravity of ads, I guess?
vintagedave 1 hour ago|||
I agree with you; what I am noting is that traditional journalistic ethics (editors are responsible for fact-checking) are explicitly refused by this policy.

They can simultaneously set standards for their staff -- as they should -- and retain professional standards for the more senior staff as well.

To remove responsibility from those more senior and make those more junior the only ones responsible is in any company a serious professional issue. Here it is also specifically contrary to the professional standards in their business area.

I see my parent comment is downvoted. Yet, this is firmly the ethical and professional and traditional stance. I don't believe AI or any random upcoming technology should change this.