Posted by robotswantdata 6/30/2025
I’ll usually spend a few minutes going back and forth before making a request.
For some reason, it just feels like this doesn't work as well with ChatGPT or Gemini. Maybe it's my overuse of o3? The latency can wreck the vibe of a conversation.
This new stillpointlab Hacker News account is based on the company name I chose to pursue my Context as a Service idea. My belief is that context is going to be the key differentiator in the future. The shortest description I can give to explain Context as a Service (CaaS) is "ETL for AI".
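To make the "ETL for AI" analogy concrete, here is a minimal sketch of what such a pipeline might look like (all names and the keyword-overlap scoring are hypothetical illustrations, not the actual stillpointlab design): extract raw documents into chunks, transform the chunks into scored candidates, and load the best ones into a prompt under a budget.

    # Hypothetical "ETL for AI" sketch: extract -> transform -> load,
    # where "load" means assembling a prompt context under a budget.

    def extract(documents: list[str]) -> list[str]:
        """Extract: split raw documents into paragraph-sized chunks."""
        return [p.strip() for doc in documents for p in doc.split("\n\n") if p.strip()]

    def transform(chunks: list[str], query: str) -> list[tuple[float, str]]:
        """Transform: score chunks by crude keyword overlap with the query.
        (A real pipeline would likely use embeddings; overlap keeps this dependency-free.)"""
        terms = set(query.lower().split())
        scored = [(len(terms & set(c.lower().split())) / len(terms), c) for c in chunks]
        return sorted(scored, reverse=True)

    def load(scored: list[tuple[float, str]], budget_words: int = 200) -> str:
        """Load: pack the highest-scoring chunks into the context until the budget runs out."""
        context, used = [], 0
        for score, chunk in scored:
            words = len(chunk.split())
            if score > 0 and used + words <= budget_words:
                context.append(chunk)
                used += words
        return "\n---\n".join(context)

    docs = ["Billing FAQ.\n\nRefunds are processed within 5 days.",
            "Shipping info.\n\nOrders ship in 2 days."]
    print(load(transform(extract(docs), "how do refunds work")))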
We are entering a new era of gamification of programming, where the power users force their imaginary strategies on innocent people by selling them to the equally clueless and gaming-addicted management.
This really does sound like Computer Science since its very beginnings.
The only difference is that now it's a much more popular field, and not restricted to a few nerds sharing tips over e-mail or BBSes.
Except in actual computer science you can prove that your strategies, discovered by trial and error, are actually good. Even though Dijkstra invented his eponymous algorithm by writing on a napkin, it's phrased in the language of mathematics; one can quantitatively analyze its effectiveness and trade-offs, and one can prove whether it's optimal (as was done recently).
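For contrast with vibes-driven strategies, here is the algorithm itself as a standard textbook sketch in Python (not tied to any particular source): its correctness, and its O((V + E) log V) running time with a binary heap, are provable properties rather than anecdotes.

    import heapq

    def dijkstra(graph: dict[str, list[tuple[str, float]]], source: str) -> dict[str, float]:
        """Single-source shortest paths on non-negative edge weights.
        Invariant (provable): when a node is popped, its distance is final."""
        dist = {source: 0.0}
        heap = [(0.0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue  # stale heap entry
            for v, w in graph.get(u, []):
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
        return dist

    graph = {"a": [("b", 1.0), ("c", 4.0)], "b": [("c", 2.0)], "c": []}
    print(dijkstra(graph, "a"))  # {'a': 0.0, 'b': 1.0, 'c': 3.0}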
Maybe it's true for computer science, but most people on here aren't doing computer science. They're doing software engineering. And it sure as heck isn't true for software engineering. If it were, I wouldn't have spent years hearing arguments about programming languages, or static vs dynamic typing, or functional vs OOP...
So what you're saying about AI isn't exactly anything new to software development.
Even more problematic is that too many "researchers" are laymen lacking a proper scientific background, often just playing around with third-party services while delivering mostly noise to the community.
So in general, AI also has something like its own replication crisis. On the other hand, the latest wave of AI is only a few years old (3 years now?), which is not much by the standards of real scientific progress.
where the authors fail to explain how the prompts were obtained and how they establish that the prompts are valid and not a hallucination.
the move from "software engineering" to "AI engineering" is basically a switch from a hard science to a soft science.
rather than being chemists and physicists making very precise theory-driven predictions that are verified by experiment, we're sociologists and psychologists randomly changing variables and then doing a t-test afterward and asking "did that change anything?"
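That loop is easy to caricature in code. A minimal sketch, assuming hypothetical benchmark scores for two prompt variants and using SciPy's Welch's t-test:

    from scipy.stats import ttest_ind

    # Hypothetical benchmark scores (0-1) for one eval set under two prompt variants.
    prompt_a = [0.71, 0.68, 0.74, 0.69, 0.72, 0.70]  # baseline prompt
    prompt_b = [0.74, 0.73, 0.76, 0.71, 0.75, 0.74]  # "improved" prompt

    # Welch's t-test: no theory predicting the effect, just "did that change anything?"
    stat, p_value = ttest_ind(prompt_b, prompt_a, equal_var=False)
    print(f"t = {stat:.2f}, p = {p_value:.3f}")

A p-value below 0.05 here says the variants differ; it says nothing about why, which is exactly the soft-science predicament being described.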
the hard sciences have theories, and the soft sciences have models.
computer science is built on theory (Turing machines / lambda calculus / logic).
AI models are, well, "models": we don't know why they work, but they seem to. that's how models are.
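To make the "built on theory" half concrete: a Turing machine is nothing but a finite transition table, small enough to verify exhaustively, unlike a model whose behavior we can only probe empirically. A minimal sketch in Python (the machine below, a binary incrementer, is a hypothetical example):

    # A Turing machine for binary increment, as a bare transition table.
    # (state, symbol) -> (next_state, symbol_to_write, head_move)
    TRANSITIONS = {
        ("right", "0"): ("right", "0", +1),
        ("right", "1"): ("right", "1", +1),
        ("right", "_"): ("carry", "_", -1),  # ran off the end; walk back
        ("carry", "1"): ("carry", "0", -1),  # 1 + carry = 0, carry propagates
        ("carry", "0"): ("halt",  "1",  0),  # 0 + carry = 1, done
        ("carry", "_"): ("halt",  "1",  0),  # carried past the left edge: grow tape
    }

    def run(bits: str) -> str:
        tape = dict(enumerate(bits))          # sparse tape, "_" is blank
        state, head = "right", 0
        while state != "halt":
            state, write, move = TRANSITIONS[(state, tape.get(head, "_"))]
            tape[head] = write
            head += move
        cells = [tape.get(i, "_") for i in range(min(tape), max(tape) + 1)]
        return "".join(cells).strip("_")

    print(run("1011"))  # 11 + 1 = 12 -> "1100"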
Dijkstra is rolling in his grave. Computer Science was a rigorous sub-field of mathematics before the tech bros showed up and started moving fast and breaking things. The endless supply of VC money has destroyed this field.
that applies to basically any domain-specific terminology, from WoW raids through cancer research to computer science and, say, OpenStreetMap
People are putting on their thinking caps and modelling data.
25 years ago it was object-oriented programming.
He doesn't even take responsibility for it, but claims the board told him to do that.
pg said a few months ago on Twitter that AI coding is perhaps just proof that we need better abstract interfaces, not necessarily that AI coding is the future. The "conversation is shifting from blah blah to bloo bloo" line makes me suspicious that people are just trying to salvage things. The provided examples are neither convincing nor enlightening to me at all. If anything, they just provide more evidence that "doing it yourself is easier."
i.e., the new skill in AI is complex software development