Stephen Hawking is the first example that comes to mind.
He developed a remarkable ability to perform complex calculations and visualize intricate mathematical concepts entirely in his mind. He once mentioned that his ALS diagnosis, which limited his physical abilities, led him to focus intensely on theoretical physics, as it required more intellectual than physical effort.
But sure, writing (and drawing) is a great tool to aid in deep thinking. So are AI tools.
You're correct here.
> Stephen Hawking is the first example that comes to mind.
The post is obviously speaking of the general population, or at best the average professional, and in my opinion choosing one of the most brilliant and exceptional scientific minds of our lifetimes is not a good counterargument to a piece about a potential problem with society at large.
Stephen Hawking's thinking and imagination wouldn't have meant much had he not finally penned them down for others to read, and neither would his ideas have been taken seriously had he chosen to make tiktoks or podcasts to explain them instead.
PG is obviously talking about the mental process of writing, i.e. of organizing a complex network of thoughts in a linear hierarchy that others can grasp, not the physical one.
Most of us have neither the intellect of Hawking nor his situation.
Sure, some will thoughtlessly copy and paste, but for many, AI helps structure their thoughts, and they think more clearly as a result.
Just look at the quality of presidential debates and political discourse we've been having for the past decade. Not just in the US, but all over the world. The situation is perilous.
Lagniappe: https://www.youtube.com/watch?v=EwPnJXXX5Ic
As someone said, it's no wonder The Matrix chose the 90s as the peak of human civilization.
IQ (from "intelligence quotient") is a number used to express the relative intelligence of a person.
So for the whole population it is constant by definition :)
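That constancy follows from how IQ is defined: raw test scores are rescaled so that the population mean is 100 and the standard deviation is 15. A minimal sketch of that renormalization in Python (the `iq_scores` helper and the sample cohort are made up for illustration):

```python
import statistics

def iq_scores(raw_scores, mean=100, sd=15):
    """Rescale raw test scores to the usual IQ convention:
    population mean `mean`, standard deviation `sd`."""
    mu = statistics.mean(raw_scores)
    sigma = statistics.pstdev(raw_scores)
    return [mean + sd * (x - mu) / sigma for x in raw_scores]

# Whatever the raw scores are, the rescaled mean is always 100:
cohort = [12, 30, 45, 51, 77]
print(round(statistics.mean(iq_scores(cohort))))  # 100
```

So even if everyone got smarter (or dumber), the average IQ would still come out to 100 after renormalization.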
And heading in the direction of Dick and Jane doesn't mean that we ever reached it; according to https://community.jmp.com/t5/image/serverpage/image-id/8926i... recent SOTUs should be comprehensible to high school freshmen.
(full discussion at https://community.jmp.com/t5/JMPer-Cable/Regression-to-model... ; through 2018 — anyone have 2022?)
EDIT: keep in mind that the expected "general" audience for the SOTU has also expanded dramatically due to technological change between 1790 and 2018...
* Less than 1% of your writing will be life-changing.
* 3% will be trivial to write.
* 4% will strongly resonate with others in a way you didn’t expect.
* 5% will be quite good.
* 15% probably should’ve never been published.
* 26% will elicit a reaction you did not expect. Positive or negative.
* 28% will become vastly better because you chose to edit.
* 30% will start as one piece but finish as another.
* 40% will be good solid writing.
* 45% will do much worse than you expect when published.
* 60% of your writing will never be finished. Be ok with that.
* 100% of your writing is worth your time.
Just curious, but do you think my collective 20 years of random posts across 10+ social media platforms, often with thousands of posts per platform, has been worth my time?
Unlike younger generations, who are growing up surrounded by AI-generated content, many of us older folks have had the experience of engaging directly with people and evaluating their competence. We developed a knack for quickly determining someone's skill level through just a few minutes of face-to-face conversation—a skill that was essential for navigating various life situations.
Now that anyone can use AI to generate seemingly competent text, videos, and more, and as in-person interactions decline, the conditions that once allowed us to gauge competence are fading. I worry that in the future, no one—including AI trained on our outputs—will be adept at making these assessments.
Those of us who take time to carefully compose arguments and revise them, as Paul suggests, will have a better handle on this, so that's a helpful consideration.
I worry strongly about a future like the one in Idiocracy[1], where nobody has a clue how to actually judge competence, and instead goes with the best sound bites.
The one path out that I can see, and it's unlikely, is to teach the skill of explicitly tracking history, and reviewing how well someone predicted the future, over time.
The explicit generation and curation of a reputation is part of that priceless nexus that they'll all be seeking in future generations, and yet it'll pale in comparison with the ability to size someone up in a few minutes of interaction.
I'm seeing that happen today with corporate documents (there's always that one enthusiast in each team who says "oh, let me improve that with [LLM]", and it's a slog to go through pages and pages of things that could be a bullet point). Quality has been trumped by mediocre quantity, and the cluelessness of the people who do this willingly baffles me.
As someone who's been writing pretty much constantly for over 30 years--someone who uses AI code completion to speed up first drafts of code, but switches off everything except macOS's (pretty good) word completion for prose, and absolutely refuses to use AI to generate work documents or even my blog post drafts--this post was a bit of an "oh, so this would be the ultimate consequence of keeping all of it on" moment.
Accelerating writing (with simple word completion) is fine. But letting the AI generate entire sentences or paragraphs (or even expand your draft's bullet points) will either have you stray from your original intent in writing or generate overly verbose banalities that only waste people's time.
I use iA Writer to draft a lot of my stuff, and its style checks have helped a lot to remove redundancies, clichés and filler, making my prose a bit more cohesive and to the point. That feature has been around for ages and isn't LLM-style AI (it's more of a traditional grammar checker), but that sort of assistance seems to be missing from pretty much every AI "writing aid"--they just generate verbose slop, and until that is fixed in a way that truly helps a writer, LLMs are just a party trick.
Edit: I just realised that I wrote about this at length in February - https://taoofmac.com/space/blog/2024/02/24/1600#the-enshitti...
During the typewriter era, anyone's ability to produce pages and pages of text was limited by their ability to type. Nowadays, you can copy/paste large blocks of text and thus inflate documents to enormous sizes, which acts like sand in the gearbox of the decision-making process.
I wonder if the future of ESG, DEI and the like is that one AI will produce endless reports and another AI will check whether the correct buzzwords appear at the correct frequency. And instead of yearly reports, they could easily become daily or hourly reports...
It would be a way to tout "allyship" on the social networks without actually doing anything substantial.
I recently published a [major philosophical work][1] that is the result of decades of thinking and three months of writing. I’m not a native English speaker, and although I know what I want to say, I often don’t know how to write it. I may not know, or may not be able to find, the right terms or phrasing, and I might make grammar mistakes. Sometimes I can only describe my ideas clumsily, and I need help refining my sentences.
So, I use AI. I think, write my thoughts in my own way, and then work with AI to bring them closer to what I want. It’s hard work. Although AI can be an amazingly good writing partner, it often alters my text in ways that change the meaning completely. Even replacing a single word with a synonym, or adding a comma, can turn a sentence into something totally unintended. It can take a lot of back-and-forth to get the paragraphs right. Still, AI is a tremendous help, and my work would have been more immature and unpolished without it, even if it sometimes feels a little artificial.
Of course, it would be ideal to master English fully and to practice writing until it feels natural. But AI helps with that, too.
To begin with, the following assumption is false:
>To write well you have to think clearly, and thinking clearly is hard.
For most people, most life situations which require clear thinking have nothing to do with writing.
>This is why eminent professors often turn out to have resorted to plagiarism.
What's the percentage of such professors? At the university where I studied, there has been no case of plagiarism to this day. And plagiarism is not committed because professors can't write, but for other professional reasons.
>If you're thinking without writing, you only think you're thinking.
As if writing is the only way to think well/correctly/effectively. My father never wrote a word: still, some of the most thoughtful statements I ever heard in my life were told to me by him during our conversations.
When you face a dangerous situation, such as a wolf running towards you: will you start writing down your thoughts about what you should do, or will you run right away and decide on the safest path to follow while you are escaping?
I'd disagree with "half" here because I can't imagine it being anywhere close to 50/50. I expect a power law distribution: most won't be able to write well. The ones who do will have a massive advantage, in the same way that those who can concentrate in our age of distractions have a significant advantage over those who can't.
We're already in a poverty of quality communicators. All that nonsense with Bitcoin was fast-talking that sounded plausible. This is what happens when real communication breaks down: fraudulent technical products surrounded by a word salad of abused language, and people afraid of looking stupid, so they never ask for clarification of the gobbledygook.
I am, for example, very good at math and reasoning. But when I write something I tend to construct long, complicated sentences (probably because I think that way^^), and the result would often be considered badly written.
Now of course you can feel superior about your better writing style. If it makes you happy ;)
All of those are related to language, because our thinking (and also math and logic) is based on language.
But just because we think in language does not imply that writing is the only form of reasoning. It is fine if it is your preference, and it certainly has value--like other things.