Posted by surprisetalk 5 days ago
> Are our tools just worse now? Was early 2000s PHP actually good?
Not sure how rhetorical that was, but of course? PHP is a super efficient language that is tailor-made for writing dynamic web sites, unlike Go. The author mentions a couple of the features that made the original version easier to write and easier to maintain; they exist precisely for this use case, like $_GET.
And if something like a template engine is needed, as it will be once the project gets a little bigger, then PHP supports that just fine.
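To make that concrete, here's a minimal sketch of the kind of page being described (hypothetical file and parameter names): the query string lands straight in $_GET, and the surrounding HTML is the template.

    <?php
    // greet.php -- hypothetical example. Drop it anywhere under the docroot
    // and a request to /greet.php?name=Max puts "Max" into $_GET['name'].
    $name = isset($_GET['name']) ? $_GET['name'] : 'stranger';
    ?>
    <html>
      <body>
        <h1>Hello, <?php echo htmlspecialchars($name); ?>!</h1>
      </body>
    </html>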
> Max didn't need a request router, he just put his PHP file at the right place on the disk.
The tendency to abstract code away leads to complexity, while a really useful abstraction is about minimizing complexity. Here, the placement of PHP files makes stuff easier -> it's a good abstraction.
And that's why the original code is so much better.
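For anyone who never worked this way: the directory layout is the routing table. A hypothetical docroot might look like this, with no router or framework anywhere:

    /var/www/html/index.php    ->  https://example.com/
    /var/www/html/upload.php   ->  https://example.com/upload.php
    /var/www/html/view.php     ->  https://example.com/view.php?id=42

Moving or renaming a file changes the URL; that's the whole routing story.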
This also elides a bit of complexity; if I assume I already have the Nginx and gunicorn processes, then my Python web server isn't much worse. (Back in the day, the LAMP stack used Apache.)
I’ll for sure grant the templating and web serving language features though.
> To be perfectly honest, as a teenager I never thought Max was all that great at programming. I thought his style was overly-simplistic. I thought he just didn't know any better. But 15 years on, I now see that the simplicity that I dismissed as naive was actually what made his code great.
They’ve all been solved 100x over by founders who’ve been funded on this site. It used to make sense to have a directory or cgi-bin of helpful scripts. Now it only makes sense as a bit of nostalgia.
I miss the days when we had less, could get less done in a day… but felt more ownership over it. Those days are gone.
I’m kind of getting tired of software made by “founders,” who are just looking to monetize me and get their exit, as opposed to software written by normal users just wanting to be useful. I know I’m on the wrong website for this, but the world could use fewer “founders” trying to turn my eyeballs and attention into revenue.
> They’ve all been solved 100x over by founders who’ve been funded on this site. It used to make sense to have a directory or cgi-bin of helpful scripts. Now it only makes sense as a bit of nostalgia.
Why does it make more sense to learn the syntax for someone else's helper scripts than to roll my own, if the latter is as easy or easier, and afterwards I know how to solve the problem myself?
That's true, but it was also true before. To the extent that solving a problem yourself in order to learn the details was ever worthwhile (and I think it was, and still is, quite worthwhile), it's still true now, even though there are lots of almost-but-not-quite solutions out there. That doesn't mean you should solve every problem on your own, but I also don't think you should always use someone else's solution.
But they're personal itches, not productizable itches. The joy is still there, though.
I don't know for sure what the problem was (I have my theories), or why we never got to a point where most people build their own custom products.
User interfaces became more user-friendly [0], while developer experience - though simpler in many ways - also became more complex, to handle the complex demands of modern software while maintaining a smooth user experience. In isolation both of these things make sense. But taken together, it means that instead of developer and user experience converging into a middle place where tools are a bit easier to learn and interfaces a bit more involved, they've diverged further, to where all the cognitive load is placed on the development side and the user expects an entirely frictionless experience.
Specialization is at the core of our big interconnected society, so it's not a surprising outcome if you look at the past century or two of civilization. But at the same time I think there's something lost when roles become too segregated. In the same way homesteading has its own niche popularity, I believe there's a latent demand for digital homesteading too; we see its fringes in the slow rise of things like Neocities, the indie web, and open source software over the past few years.
Personally I think we just have yet to see the 'killer app' for digital homesteading, some sort of central pillar or set of principles to grow around. The (small) web is the closest we have at the moment, but it carries a lot of technical baggage with it, too much to be able to walk the fine line needed between approachability and flexibility.
Anyway, that's enough rambling for now. I'll save the rest for a blog post.
[0] user-friendly as in being able to use it without learning anything first; not that that's necessarily in the user's best interest
I certainly consider it a good idea, now that it has come to mind.
And it will work very well.
It's not like this person was ever going to pay someone to make a cartoon drawing, so nobody lost their livelihood over it. Seems like a harmless visual identifier (that helps you remember whether you've read the article if you stumble across it again later).
Is it really such a bad thing when people use generative AI for fun or for their hobbies? This isn't the New York Times.
When the project becomes more complex, things change for the worse.
Also, you need to protect modules not only from errors, but also from the other programmers on your team.
Re: hardening - I guess I deployed a lot of "insecure" LAMP-style boxes. My experience, mainly w/ Fedora Core and then CentOS, was to turn off all unnecessary services, apply security updates, limit inbound and outbound connectivity to only the bare minimum necessary w/ iptables, make sure only public key auth was configured for SSH, and make sure no default passwords or accounts were enabled. Depending on the application, grubbing through SELinux logs and adjusting labels might be necessary. I don't recall what tweaks there were to the default Apache or PHP configs, but I'm sure there were some (not allowing overrides through .htaccess files in user-writeable directories, making sure PHP error messages weren't returned to clients, not allowing directory listings in directories without a default document, etc.).
Everything else was in the application and whatever stupidity it had (world-writeable directories in shitty PHP apps, etc). That was always case-by-case.
It didn't strike me as a horribly difficult thing to be better-than-average in security posture. I'm sure I was missing a lot of obvious stuff, in retrospect, but I think I had the basics covered.
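For what it's worth, the Apache/PHP tweaks mentioned above mostly come down to a few config lines; a rough sketch from memory (paths and exact defaults varied by distro):

    ; php.ini -- don't leak errors or version info to clients
    display_errors = Off
    log_errors = On
    expose_php = Off

    # Apache config -- no .htaccess overrides, no directory listings
    <Directory /var/www/html>
        AllowOverride None
        Options -Indexes
    </Directory>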
People soon found out that it was not very good at complex web apps, though.
These days, there's almost no demand for very simple web apps, partially because common use cases are covered by SaaS providers, and those with a need and the money for custom web apps have seen all the fancy stuff that's possible and want it.
So it's no surprise that today's languages and frameworks are more concerned with making complex web apps manageable, and don't optimize much (or at all) for the "very simple" case.
I dunno about that.
In 2000, one needed a cluster of backends to handle, say, a webapp built for 5000 concurrent requests.
In 2025, a single monolith running on a single VM, using a single DB on another instance, can vertically scale to handle 100k concurrent users. Put a load balancer in front of 10 instances of that monolith and use read-only DB followers for read-only queries, and you can easily handle 10x that load.
> So it's no surprise that today's languages and frameworks are more concerned with making complex web apps manageable, and don't optimize much (or at all) for the "very simple" case.
Maybe the goal is to make complex web apps manageable, but in practice what I see is even very simple web apps being made with those frameworks.
I see the current state of web development as a spiral of complexity with a lot of performance pitfalls. Over-engineering seems to be the default.
Definitely not. PHP lost far more market share to Java, C#, and Ruby on Rails than to Node.js.
> PHP is still very usable for server-side rendering and APIs.
Not "is still", but "has become". It has changed a lot since the PHP 3 days.
> You say "very simple" as if you can't have complex systems with PHP.
With early 2000s PHP, you really couldn't, not without suffering constantly from the language's inadequacies.
> I see the current state of web development as a spiral of complexity with a lot of performance pitfalls. Over-engineering seems to be the default.
I don't disagree, but that seems to happen most of all in the frontend space.
They eventually made it fit for purpose with Laravel ;-)
I think for a kid, Max's code was great, but ultimately you do need to learn to think about things like error handling, especially if your code is intended to go into "production" (i.e., someone besides yourself will use/host it).
The example with the image sharing is pretty good, because it only needs to share images. In, shall we say, more commercial settings, it would grow to handle metadata, scaling, comments, video sharing, account management, and everything in between. When that happens, Max's approach breaks down.
If you keep your systems as "image sharing", "comments" and "blog" and just string them together via convention or simply hard-coded links, you can keep the simple solutions. This comes at the cost of integration, but for many uses that's perfectly fine.
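A sketch of what "string them together via hard-coded links" can look like in practice (hypothetical file names): each piece stays a standalone script, and the integration is literally just a URL.

    <?php
    // view.php -- the stand-alone image-sharing page (hypothetical example).
    $id = isset($_GET['id']) ? (int) $_GET['id'] : 0;
    ?>
    <img src="/images/<?php echo $id; ?>.jpg">
    <!-- "Integration" with the separate comments app is a hard-coded link: -->
    <a href="/comments/thread.php?topic=image-<?php echo $id; ?>">Comments</a>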
Edit: Oh, that Mel.
Also, the image kinda looks like me. It's not me though. I don't think.