
Posted by koshyjohn 15 hours ago

AI should elevate your thinking, not replace it (www.koshyjohn.com)
486 points | 353 comments
srcreigh 15 hours ago|
Is it wise to understand everything that AI does for you?

Let’s say a person has 10 units of learning per week. Is the author actually claiming that that person must not deliver any results beyond their 10 units?

It makes some sense to have, say, 20 units of results and prioritize which ones to fully comprehend.

I suspect APIs / libraries / languages / platforms will have more churn due to AI. Every new platform or system means more to learn. Once every 5 years might become every year or even more frequent. That would be a sort of inflation of knowledge and skills, and it would affect the decision about how to spend one's 10 units per week.

addaon 14 hours ago||
> Let’s say a person has 10 units of learning per week.

This is… not how humans work? If you have the time and energy to learn ten things, and then spend time babysitting a random number generator to produce evidence of 10 more units of work, you’re paying an opportunity cost compared to someone who spends the time learning an eleventh thing. You can argue who has more short term value to a company… but who is the wiser person after a thirty year career?

koshyjohn 11 hours ago|||
> Is the author actually claiming that that person must not deliver any results beyond their 10 units?

No, I'm claiming that if someone or something else produced your 10 units of work, you better be able to verify that those 10 units of work are of at least the same quality as if you had produced them yourself. This is the bare minimum and not something to shift onto other people reviewing your work.

Beyond that, if that's all you do, you are basically proving you're replaceable. If you're smart, you'll reallocate intellectual capacity that was freed up by A.I. onto something A.I. can't do today.

SkyPuncher 10 hours ago||
It's really no different than managing people.

Managers simply cannot know all of the details of what their reports write. They have to build abstractions.

e1ghtSpace 4 hours ago||
What if it seems AI has literally replaced your thinking? Is there a way to unreplace it? I'm talking literally.
archfrog 12 hours ago||
Very apt headline, IMHO.

I have been an ardent opponent of AI since it came up a few years back. I refuse to vibe code and I refuse to let AI think for me. I won't be an AI controller.

However, two days ago I found a nice, personal use case for AI: Advanced writing checks (grammar checks, mostly, and some rewordings) in Word using a rather expensive app.

I write a lot of US English, despite it not being my native language, and AI is now helping me write much better than I did before. I also discovered that I am much worse at writing Danish than I believed. In fact, I think I am better at writing US English than Danish, which is a bit surprising since I am a Dane.

No AI was used during the writing of this entry, but I dearly love the writing tool already! I have heard similar stories from friends who say that AI is very good at summarizing long documents and stuff like that.

So, I personally think that AI CAN elevate one's thinking. I am learning more about Danish and US English grammar every day now than I did in the decade before. Writing is suddenly fun because it involves growing my skills.

conqrr 15 hours ago||
This is a huge concern and I fully agree with the post. Even if you think you aren't fully giving in to AI, or that this was always the case, it still affects YOU and everyone else.

1. Software often isn't built in a vacuum. Lots of companies are shoving AI down throats whether you like it or not. Most Big Tech is heavily using metrics to push toward 100% AI-generated code. Reviewing is a nightmare.

2. New entrants (new grads etc.) are largely AI-first and are losing out on the safety and reliability habits that are enforced automatically when you learn coding without AI.

IMO, teams need to agree on a set of principles for AI usage, with concrete examples of where and how to use it. Perhaps it's much more useful in parts of your system that are faster-evolving and don't have much core logic, like testing frameworks.

Simply discarding it as 'yet another tool' is part of the problem.

krishna3145 10 hours ago||
https://news.ycombinator.com/item?id=47916430 Check this out on LLM security.
alecco 13 hours ago||
CoRecursive had a really good episode about this last August:

"Coding in the Red-Queen Era" https://corecursive.com/red-queen-coding/

cvanelteren 12 hours ago||
Wrote a similar take on it here: https://thefriendlyghost.nl/chinese-room-ai/
oxag3n 12 hours ago||
> split people into two nebulous groups

Yet it shows both groups using AI, just differently. It's hard to continue reading an article that excludes your group entirely.

smj-edison 14 hours ago|
On the point of avoiding the struggle of learning, I think it's easy to swing too far in the other direction and go back to not using modern development tools. It does a new learner a disservice to say something like "don't use GDB/REPL/AI tools to learn, since you'll never learn the fundamentals". All of these tools allow for learning, if that's how the learner engages with them. So I hope AI becomes integrated into the learning process, insofar as it accelerates rather than replaces understanding.