Posted by walterbell 4/3/2025
It is, but it adds a disingenuous apologetic.
Not wishing to pick on this particular author, or even this particular topic, but it follows a clear pattern that you can find everywhere in tech journalism:
Some really bad thing X is happening. Everyone knows X is happening. There is evidence X is happening, but I am *not* arguing against X, because that would brand me a Luddite/outsider/naysayer... and we all know a LOT of money and influence (including my own salary) rests on nobody talking about X.
Practically every article on the negative effects of smartphones or social media printed in the past 20 years starts with the same chirpy disavowal of the author's actual message. Something like: "Smartphones and social media are an essential part of modern life today... but"
That always sounds like those people who say "I'm not a racist, but..."
Sure, we get it, there's a lot of money and powerful people riding on "AI". Why water down your message of genuine concern?
Maybe what I'm getting at is this poem [0] by Taylor Mali. Somehow we all lost our nerve to challenge really, really bad things, wrapping up our messages in tentative language. Sometimes that's a genuine attempt at balance, or honesty. But often these days I feel an author is trying too hard to distance themself from ... from themself.
It's a silly bugbear, I know.
[0] https://taylormali.com/poems/totally-like-whatever-you-know/
It’s not. It’s a rant against people and their laziness and gullibility.
I doubt anyone can do it perfectly every time; it requires a posthuman level of objectivity and a level of information quality that hardly ever exists.
The "Pray Mr. Babbage..." anecdote comes to mind: https://www.azquotes.com/quote/14183
People also seem to be losing their ability to detect satire.
I'm concerned GenAI will lower creative standards too, that people will be fine with the sound of Suno or the look of DALL-E. How then would the arts evolve?
The kids will be alright.
OSINT only exists because of internet capabilities and Google search - i.e. someone had to learn how to use those new tools just a few years ago and apply critical thinking.
AI tools and models are rapidly evolving, with deeper capabilities appearing in the models all the time. That means the tools are hardly set in stone and the workflows will evolve with them - it's still up to human oversight to evolve with the tools, and the skill of humans overseeing AI is something that will develop too.
She was interested in the then-new concept of "open source," so she went to the talk, only to find it had nothing to do with software development.