Posted by claxo 9 hours ago

I'm done making desktop applications (2009)(www.kalzumeus.com)
141 points | 171 comments
ryandrake 9 hours ago|
Almost all of Patrick's points are great if your software development goal is to make a buck. They don't seem to matter if you're writing open source, and I'd argue that desktop apps are still relevant and wonderful in the open source world. I just started a new hobby project, and am doing it as a cross-platform, non-Electron, desktop app because that's what I like to develop.

The onboarding funnel: Only a concern if you're trying to grow your user base and make sales.

Conversion: Only a concern if you're charging money.

Adwords: Only a concern if, in his words, you're trying to "trounce my competitors".

Support: If you're selling your software, you kind of have to support it. Minor concern for free and open source.

Piracy: Commercial software concern only.

Analytics and Per-user behavior: Again, only commercial software seems to feel the need to spy on users and use them as A/B testing guinea pigs.

The only point I can agree with him that makes web development better is the shorter development cycles. But I would argue that this is only a "developer convenience" and doesn't really matter to users (in fact, shorter development cycles can be worse for users as their software changes rapidly like quicksand out from under them.) To me, in my open source projects, my "development cycle" ends when I push to git, and that can be done as often as I want.

hn_acc1 5 hours ago||
To me, I prefer desktop apps because I KNOW when I've upgraded - it either said "upgrade now?" and did it, or, in the olden days, I had to track it down, or I installed an updated version of a distro, which included updated apps, so I expected some updates.

There are some things that NATURALLY lend themselves to a website - like doctor's appointments, bank balances, etc. - but it's still a pain when, on logging in to "quickly check that one thing" I finally got the muscle memory down for (because I don't do it that often), I get a "take a quick tour of our great new overhauled features", and now that one thing I wanted is buried 7 levels deep, or just plain unfindable.

For something like Audacity (the audio program), how the heck does it make sense to put that on a website (I'm just giving a random example, I don't think they've actually done this), where you first have to upload your source file (privacy issues), manipulate it in a graphically/widget-limited browser - do they have a powerful enough machine on the backend for your big project? - then download the result? It's WAY, WAY better to be able to run the code on your own machine, etc. AND to be stable, so that once you start a project, it won't break halfway through because they changed/removed that one feature you relied upon (no, not thinking of AI at all, why do you ask? :-)

SoftTalker 4 hours ago|||
You touched on the one thing I hate most about infrequently used websites. The inevitable popup to "explore our new features." Hell no, I don't want to do that. I haven't logged on in six months, so I'm obviously here now with a purpose in mind and I want to do that as quickly as possible and then close the tab.
hn_acc1 4 hours ago||||
Of course, I'm also an old-school hacker (typed my first BASIC program ~45 years ago), so I have a desktop mentality. None of this newfangled 17-pound-portable stuff for me :-) And phones are at best a tertiary computing mechanism: first, desktop, then laptop, then phone. So yes, I'm clearly biased. Not trying to hide that.
d3Xt3r 3 hours ago|||
> For something like Audacity (the audio program), how the heck does it make sense to put that on a website (I'm just giving a random example, I don't think they've actually done this), where you first have to upload your source file (privacy issues), manipulate it in a graphically/widget-limited browser

I understand it was just an example, but you'd be surprised how far browsers have come with technologies like WebAssembly and WebGL. Forget audio editing, you can even do video editing - without uploading any files to a remote server[1]. All the processing is done locally, within your browser.

And if you thought that was impressive, wait till you find out that you can even boot the whole Linux kernel in your browser using a VM written in WASM[2]!

But I do agree with your points about the lack of feature stability. For the record, I too prefer native apps (though for me, the main selling points are low RAM/CPU/disk requirements and keyboard friendliness).

[1] https://news.ycombinator.com/item?id=47847558

[2] https://joelseverin.github.io/linux-wasm/

jasomill 58 minutes ago||
Sure, but taking your video editor example, what advantages does an in-browser app provide over a native application like DaVinci Resolve, other than portability and not needing to install the application, in exchange for reduced performance, a clunkier interface, and reduced integration with the rest of the desktop platform?

And if this is such a compelling value proposition for full-featured desktop productivity applications, why didn't Java Web Start set the world on fire?

cyberax 1 minute ago||
> Sure, but taking your video editor example, what advantages does an in-browser app provide over a native application like DaVinci Resolve

It's the issue of friction. Also, good webapps are often _better_ than native apps, as they can support tabs.

> And if this is such a compelling value proposition for full-featured desktop productivity applications, why didn't Java Web Start set the world on fire?

Because it relied on Java and Swing, which were a disaster for desktop apps.

Aurornis 4 hours ago|||
This blog post benefits a lot from understanding where the author was at that point in their career: they had become well known for their writing about their Bingo Card Creator software, but were moving on. After this they went on to build Appointment Reminder, a webapp that grew to a nice MRR before being sold off. Both were nice little indie developer success stories.

I grew up reading his writings and learned pretty quickly to read them as "this is what I'm thinking right now in my life" even though they're written more as authoritative and decisive writings from an expert. Over time he's gone from SEO expert to $30K/week consulting expert to desktop app expert to indie SaaS expert to recruiting industry expert to working for Stripe Atlas. It was fun to read his writings at each point, but after so many changes I realized it was better to read it as a blog of ongoing learnings and opinions, not necessarily as retrospective wisdom shared from years of experience on the topic even if that's what the writing style conveys.

So I agree that the advice in the post should be taken entirely in context of pursuing the specific goals he was pursuing at the time. The less your goals happen to align, the less relevant the advice becomes.

analog31 8 hours ago|||
Going further, if you're a hobbyist, you're probably instinctively prioritizing the aspects of the hobby that you enjoy. My first app was a shareware offering in the 1980s, written in Turbo Pascal, that was easy to package and only had to run on one platform. Because expectations were low, my app looked just as good as commercial apps.

Today, even the minimal steps of creating a desktop app have lost their appeal, but I like showing how I solved a problem, so my "apps" are Jupyter notebooks.

Noumenon72 7 hours ago||
My coworker showed a Jupyter notebook with ipywidgets and it looked just like an app. A good CLI using Typer (from the FastAPI author, though a separate project) looks a lot like an app too.
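For reference, a minimal sketch of what that looks like with Typer (the `greet` command and its option here are hypothetical, just to show the shape; the auto-generated `--help` is what gives it the polished, app-like feel):

```python
# Sketch of an app-like CLI built with Typer (hypothetical example command).
import typer

app = typer.Typer(help="Tiny demo of an app-like CLI.")

@app.command()
def greet(name: str,
          shout: bool = typer.Option(False, help="Uppercase the greeting.")):
    """Greet NAME, optionally shouting."""
    msg = f"Hello, {name}!"
    if shout:
        msg = msg.upper()
    typer.echo(msg)

if __name__ == "__main__":
    app()  # e.g. `python greet.py World --shout`
```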
itsgabriel 1 hour ago|||
I generally agree. If you're not doing it for money you don't technically need most of these things. But if you see open source as more than “here's the code” some of them matter. Support will find you, via GitHub issues, emails, or DMs. Analytics is really important because it shows whether the software works for people besides you. Without money you usually do not have playtesters or a UX designer, so you get fewer useful bug reports. Frustrated users rarely take the time to write a detailed issue.
the__alchemist 7 hours ago|||
They're also ubiquitous for creative work, i.e. the sort of tools that a small set of people spend a great deal of time in, but that most people never use. Examples:

  - CAD / ECAD
  - Artist/photos
  - Musician software. Composing, DAW etc
  - Scientific software of all domains, drug design etc
leonidasrup 5 hours ago|||
Adobe Photoshop, the most used tool for professional digital art, especially in raster graphics editing, was the first example of a perfectly fine commercial desktop application converted to a cloud application with a single purpose: increased profit for Adobe.
rustcleaner 5 hours ago||
Master Collection CS6 still works excellently, and is now (relatively) small enough to live comfortably in virtuo. Newer file formats can be handled with ffmpeg and a bit of terminal-fu.
TacticalCoder 2 hours ago|||
Slicers for people doing 3D printing too (don't know if webapp slicers are more common than desktop app slicers though).

Desktop publishing.

Brokerage apps (some are webapps but many ship an actual desktop app).

And yet, to me, something changed: I still "install apps locally", but "locally" as in "only on my LAN", and they can be webapps too. I run them in containers (and the containers are in VMs).

I don't care much as to whether something is a desktop app, a GUI or a TUI, a webapp or not...

But what I do care about is being in control.

Say I'm using "I'm Mich" (Immich) to view family pictures: it ships as open source, and I run it locally. It'll never be "less good" than it is today: for if it is, I can simply keep running the version I have now.

It's not open to the outside world: it's to use on our LAN only.

So it's a "local" app, even if the interface is through a webapp.

In a way this entire "desktop app vs webapp" is a false dichotomy, especially when you can have a "webapp (really in a browser) that you can self-host on a LAN" and then a "desktop app that's really a webapp (say wrapped in Electron) that only works if there's an Internet connection".

theK 7 hours ago|||
I see a lot of this sentiment amongst developer friends but I never could relate. It's not that I'm against it or anything; it just doesn't move me personally.

Most things I create in my free time are for my and my family's consumption and typically benefit immensely from the write-once-run-everywhere nature of the web.

You can launch a small toy app on your intranet and run it from everywhere instantly. And typically these things are also much easier to interconnect.

tmtvl 8 hours ago|||
> Analytics and Per-user behavior: Again, only commercial software seems to feel the need to spy on users and use them as A/B testing guinea pigs.

KDE has analytics, they're just disabled by default (and I always turn them on in the hopes of convincing KDE to switch the defaults to the ones I like).

sevenzero 5 hours ago|||
I quit all social media, cancelled Spotify and whatnot, and I am hella thankful for the Strawberry media player as a desktop app, as it allows me to play all the music I actually own. I love desktop apps.
famouswaffles 3 hours ago|||
>To me, in my open source projects, my "development cycle" ends when I push to git, and that can be done as often as I want.

If development ends at a git push and users are left to build and fend for themselves (granted, this describes a lot of open source), then yeah, there's not much difference. But if you're building and packaging releases for users (which you're more likely to be doing if your project is specifically an app), then the difference is massive.

satvikpendem 6 hours ago|||
Agreed, desktop frameworks have been getting really good these days: Flutter, Rust's GPUI (which powers Zed, a popular editor and, notably, a non-webview competitor to Electron-based apps), egui, Slint, and so on. Some even let you render your desktop app to the web via WASM if you still want to share a link.

Times have changed quite a bit from nearly 20 years ago.

rustcleaner 5 hours ago|||
I generally despise "web tech" as it is today. Browsers are not application platforms!
nonethewiser 8 hours ago|||
its just waaaaaay easier to distribute a web app

For some things a desktop app is required (more system access) or offers some competitive UX advantage (although this reason is shrinking all the time). Short of that, users are going to choose web 95% of the time.

mohamedkoubaa 7 hours ago|||
This points to our failure as an industry to design a universal app engine that isn't a browser.
fbrchps 7 hours ago|||
Counterpoint: is the web browser not already fulfilling the "universal app engine" need? It can already run on most end user devices where people do most other things. IoT/edge devices don't count here, but these days most of their data is just being sent back to a server which is accessible via some web interface.

Ignoring the fragmentation of course; although that seems to be getting less and less each year (so long as you ignore Safari).

archagon 1 minute ago|||
I think a browser is an inverted universal engine. The underlying tech is solid, but on top of it sits the DOM and scripting, and then apps have to build on top of that mess. It would be much better for web apps and the DOM to be sibling implementations on top of the engine, not hierarchically related.
rustcleaner 4 hours ago||||
>Counterpoint: is the web browser not already fulfilling the "universal app engine" need?

Counter-counterpoint: Maybe it's time to require professional engineer certification before a software product can be shipped in a way that can be monetized. It's to filter devs from the industry who look at browsers today and go "Yeah, this is a good universal app engine."

mohamedkoubaa 55 minutes ago||
This was cathartic to read thank you
abdullahkhalids 6 hours ago|||
Yes. But it consumes at least 10x-100x more resources to run a web app than to run a comparable desktop app (written in a sufficiently low level language).

The impact on people's time, money, and the environment is proportional.

mpyne 4 hours ago||
> But it consumes at least 10x-100x more resources to run a web app than to run a comparable desktop app (written in a sufficiently low level language)

Does it? Have you compared a web app written in a sufficiently low level language with a desktop app?

Pannoniae 2 hours ago|||
Yes. I can run entire 3D games... ten of them, in the memory footprint of your average browser. Even fairly decent-looking ones, not just your Doom or Quake!

And if we're talking about simple GUI apps, you can run them in 10 megabytes or maybe even less. It's cheating a bit as the OS libraries are already loaded - but they're loaded anyway if you use the browser too, so it's not like you can shave off of that.

skydhash 1 hour ago|||
I believe Firefox uses separate processes per tab, and most of them are over 100MB per page. And that's understandable when you know that each page is the equivalent of a game engine with its own attached editor.

A desktop app may consume more, but it's heavily focused on one thing, so a photo editor doesn't need to bring in a whole sound subsystem and a live programming system.

2ndorderthought 58 minutes ago||||
Steam is pretty close.
Cheese48923846 7 hours ago||||
Remember Flash? The big tech companies saw it as a threat to their walled gardens. They formed an unholy alliance to stamp out Flash, with a sprinkle of fake news labeling it a security threat.

Remember LiveScript and early web browsers? It was almost cancelled by big tech because Java was supposed to be the cross-platform system. The web and JavaScript just BARELY escaped a big tech smackdown. They stroked the ego of big tech by renaming it to JavaScript to honor Java, licked some boots, and promised a very mediocre, non-threatening UI experience in the browser, and big tech allowed it to exist. Then the whole world started using the web and JavaScript. It caught fire before big tech could extinguish it. Java itself got labeled a security threat by Apple/Microsoft for threatening the walled gardens, but that's another story.

You may not like browsers but they are the ONLY thing big tech can't extinguish due to ubiquity. Achieving ubiquity is not easy, not even possible for new contenders. Pray to GOD everyday and thank her for giving us the web browser as a feasible cross platform GUI.

Web browser UI available on all devices is not a failure, it's a miracle.

To top it all off, HTML/CSS/Javascript is a pretty good system. The box model of CSS is great for a cross platform design. Things need to work on a massive TV or small screen phone. The open text-based nature is great for catering to screen readers to help the visually impaired.

The latest whizbang GPU-powered UI framework probably forgot about the blind. It's probably stuck in the days of absolute positioning and non-declarative layouts, with x,y(,z) coords. That may be great for the next-gen 4-D video game, but it sucks for general-purpose use.

MrDrMcCoy 4 hours ago|||
As I recall, Flash and Java weren't so much security issues themselves; rather, the poorly designed gaping hole they used to enter the browser sandbox was impossible to lock down. If something like WASM had existed at the time to let them run fully inside the sandbox, I bet they'd still be around today. People really did like Macromedia/Adobe tools for web dev, and killing Flash despite its popularity was only possible because of just how bad those security holes were. I miss Flash, but I really don't miss drive-by toolbar and adware installation, which went away when those holes were closed.
tolciho 7 hours ago|||
Flash had quite a lot of quite severe CVEs; how many of those do you suppose were "fake news" connived by conspiracy (paranoid style in politics, much?), as opposed to Flash being a pile of rusted dongs as far as security goes? A lot of software from that era was a pile of rusted dongs, bloated browsers included. Flash was also the first broken website I ever came across, for some restaurant I never ended up going to. If they can't show their menu in text, oh well.
jimbokun 7 hours ago||||
We have failed to design a universal app engine…except for the one that dwarfs every other kind of software development for every kind of device in the world.
jcelerier 6 hours ago||
Can a single webpage address & use more than 4gb of ram nowadays? I was filling 16gb of ram with a single Ableton live session in 2011.
Gigachad 1 hour ago|||
Via Electron I'm sure it could. In the main browser it's probably best to cap usage to avoid having buggy pages consume everything. Anything heavy like a video editor you'd rather install as an Electron app anyway, for deeper system access and such.
rustcleaner 4 hours ago|||
How about a webpage shouldn't ever address & use even 4GB of RAM! :O
theK 7 hours ago||||
No. We did, it is the browser.
ryandrake 7 hours ago||
"The Browser" has turned out to be a pretty terrible application API, IMO. First, which browser? They are all (and have been) slightly different in infuriating ways going all the way back to IE6 and prior. Also, a lot of compromises were made while organically evolving what was supposed to be "a system for displaying and linking between text pages" into a cross-platform application and system API. The web's HTML/CSS roots are a heavy ball and chain for applications to carry around.

It would have been great if browsers remained lightweight html/image/hyperlink displayers, and something separate emerged as an actual cross-platform API, but history is what it is.

Gigachad 1 hour ago|||
Look at caniuse: if you see green boxes on all the current-version browsers, then you are good to go. If not, wait until the feature is more widely supported.
jstanley 6 hours ago|||
They're not that different, and it's a pretty good platform and pretty easy to program for. That's why it won.
irishcoffee 6 hours ago||
It didn't win. It just survived long enough. The web is a terrible platform. I haven't ever shipped a line of "web code" for money and I plan to keep it that way until I retire. What a miserable way to make a living.
2ndorderthought 54 minutes ago|||
I envy your pure soul. I am one of many who has, at times, been coerced through financial strain into writing some front-end code. All I ask is that, when the time comes, you try to remember me for who I was and not the thing I became.
jstanley 6 hours ago|||
Perhaps you're taking the npm/React/Vercel world to be the entire web? I agree that that stuff is a scourge. But you can still just write HTML and JavaScript and serve it from a static site. I wrote an outline in https://incoherency.co.uk/blog/stories/web-programs.html which I frequently link coding agents to when they are going astray.
mohamedkoubaa 45 minutes ago|||
I wouldn't say that react is what's wrong with the web. I would say that the web is what's wrong with react.
irishcoffee 4 hours ago|||
When I was a kid I was running websites with active forums and a real domain name, and I did it with vBulletin and my brain. Someone bought the domain name and website off of me, haven't touched web tech since. I did use Wt at an old job once, but the "website" was local to 1 machine and there were no security concerns.
criddell 5 hours ago|||
> design a universal app engine

You've reminded me of the XKCD comic about standards: https://xkcd.com/927/

Do you really want a universal app engine? If you don't have a good reason for ignoring platform guidelines (as many games do), then don't. The best applications on any platform are the ones that embrace the platform's conventions and quirks.

I get why businesses will settle for mediocre, but for personal projects why would you? Pick the platform you use and make the best application you can. If you must have cross-platform support, then decouple your UI and pick the right language and libraries for each platform (SwiftUI on Mac, GTK for Linux, etc...).

mohamedkoubaa 43 minutes ago|||
Platforms and app engines are orthogonal concerns. I agree that platform guidelines are worth preserving, and the web as a platform solves it by hijacking the rectangle that the native platform yields to it. Any app engine could do the same thing.
MrDrMcCoy 4 hours ago|||
Please, for the love of all that is holy, not GTK.
rustcleaner 4 hours ago||||
>or offers some competitive UX advantage (although this reason is shrinking all the time).

As a user, a properly implemented desktop interface will always beat the web. By properly, I mean obeying the shortcut keys and conventions of the desktop world: alt+letter assignments for boxes and functions, Tab moving between elements, PageUp/PageDown in a chat window's text entry area scrolling the chat history above it rather than the text entry area itself (looking at you, SimpleX), etc.

Sorry, not sorry. Web interface is interface-smell, and I avoid it as much as possible. Give me a TUI before a webpage.

foresto 5 hours ago|||
> its just waaaaaay easier to distribute a web app

Let's also remember that it's infinitely easier to keep a native app operational, since there's no web server to set up or maintain.

2ndorderthought 51 minutes ago||
No DNS, no DDoS, no network plane, no Kubernetes, no required data egress, no cryptographic vulnerabilities, no surveillance of activity... It's almost as if the push for everything to go through the web was a psyop so that everything we did, and when, was logged somewhere. No, no, that's not right.
zephen 7 hours ago|||
Agreed.

And his point about randomly moving buttons to see if people like it better?

No fucking thanks. The last thing I need is an app made of quicksand.

rustcleaner 4 hours ago||
God damn that drives me up a wall! Mozilla is a terrible offender in this regard, but there are myriad others too!

The user interface is your contract with your users: don't break muscle memory! I would ditch FF-derivatives, but I'm held hostage by them because the good privacy browsers are based on FF.

Stop following fads! Be like craigslist: never change, or if you do then think long and hard about not moving things around! Also if you're a web/mobile developer, learn desktopisms! Things don't need to be spaced out like everything is a touch interface. Be dense like IRC and Briar, don't be sparse like default Discord or SimpleX! Also treat your interfaces like a language for interaction, or a sandbox with tools; don't make interfaces that only corral and guide idiots, because a non-idiot may want to use it someday.

I really wish Stallman could be technology czar, with the power to [massively] tax noncompliance to his computing philosophy.

otterley 7 hours ago|||
To be fair, probably most of us here on HN write software to put food on the table. Don’t pooh-pooh our careers.
kylemaxwell 3 hours ago|||
Both can be true: we can have different preferences about what we're doing to put food on the table and what we're doing when we build something on our own for other reasons.
sfpotter 7 hours ago|||
He didn't pooh-pooh anyone's careers.
otterley 7 hours ago||
The way it's worded comes across that way.
foresto 5 hours ago||
I have spent a good deal of my life writing software to put food on the table. I didn't interpret any of what he wrote in the way you describe. Perhaps you could explain why you did.
miki123211 7 hours ago||
Attitudes like these are why non-developers don't want to use open source software.

These concerns may not matter to you, the developer, but they absolutely matter to end-users.

If your prospective user can't find the setup.exe they just downloaded, they won't be able to use your software. If your conversion and onboarding sucks, they'll get confused and try the commercial offering instead. If you don't gather analytics and A/B test, you won't even know this is happening. If you're not the first result on Google, they'll try the commercial app first.

Users want apps that work consistently on all their devices and look the same on both desktop and mobile, keep their data when they spill coffee on the laptop, and let them share content on Slack with people who don't have the app installed. Open source doesn't have good answers to these problems, so let's not shoot ourselves in the foot even further.

2ndorderthought 49 minutes ago|||
If my user cannot install software on their own computer then I do not want their money. They have issues they need to work out on their own, and they might be better off saving their money.
satvikpendem 6 hours ago||||
This presupposes that the OSS creator even wants users in the first place, which might not always be the case as it could be personal software; and that these users actually want these features, as many do not want analytics, ads, and A/B tests in your app.
janalsncm 6 hours ago||
I guess in the same way that one might presuppose a boat wants water?

If a piece of software doesn’t have users and the developers don’t care about the papercuts they are delivering, I would argue what they have created is more of an art project than a utility.

notarobot123 5 hours ago|||
Science research without obvious practical application can still be important and valuable.

Art works without popular appeal can become highly treasured by some.

Open source software doesn't have to be ambitious to be worthwhile and useful. It can be artful, utilitarian, or an artifact of play. Commercial standards shouldn't be the only measure of good software.

satvikpendem 5 hours ago|||
It's more like building your own boat and then someone else coming along and saying it'll never compete with a cruise ship because it doesn't have a water slide and an endless buffet; sometimes things in the same category can serve wholly different purposes.
rustcleaner 4 hours ago||||
>Attitudes like these is why non-developers don't want to use open source software.

Good! It's not for them! They can stay paypigs on subscription because they can't git gud!

MarcelOlsz 7 hours ago|||
I'm a seasoned developer and I frequently come across OSS projects where I spend half an hour or more in "how the fuck do I actually use this"-land. A lot of developers need to take the mindset of writing the documentation for their non-tech grandma from the ground up.
sharts 6 hours ago||
LLMs to the rescue
MarcelOlsz 6 hours ago||
It's the principle.
CMay 1 hour ago||
In practice this was about a product that is not targeted at desktop-savvy people. At the same time, you also have people who know "this isn't a hard problem, why do I need to go through all this? Let me go look for someone who did it better." Not to mention all of their younger tech-savvy family telling them, "don't download anything!"

If your product targets a segment that expects a desktop app, do that. Web app, do that. Phone app, do that.

Something like this would have worked back when there was still a Walmart bargain software shelf, where people could impulse-buy a CD, put it into their computer, have it automatically start and install, then show up on the desktop. Despite that being less common now, it was in a way more streamlined for many users.

Many of those people probably aren't logged into Steam or Windows Store either, so you have to do your own thing. It makes sense that web is the least friction for those people.

franga2000 8 hours ago||
Not off to a great start... The "look how many steps it takes to convert shareware users" is insanely overblown.

  1-4. Google, find, read... this is the same for web apps.
  2. Click download and wait a few seconds. Not enough time to give up, because native apps are small. Heavy JS web apps might load for longer than that.
  3. Click on the executable that the browser pops up in front of you. No closing the browser or looking for your downloads folder. It's right there!
  3.5. You probably don't need an installer, and it definitely doesn't need a multi-step wizard. Maybe a big "install" button with a smaller "advanced options".
  3.6. Your installer (if you even have it) autostarts the program after finishing.
  4. The user uses it and is happy.
  5. Some time later, the program prompts the user to pay, potentially taking them directly onto the payment form, either in-app or by opening it in a browser.
  6. They enter their details and pay.

That's one step more than a web app, but also a much bigger chance the user will come back to pay (you can literally send them a popup, you're a native app!).

monooso 8 hours ago|
If my failing memory serves, those were valid concerns in 2009, when this was written.
neilv 8 hours ago||
> However, the existence of pirates is a stitch in my craw, particularly when any schoolmarm typing the name of my software into Google is prompted to try stealing it instead:

I wonder whether Google, in its Don't Be Evil era, ever considered what they should do about software piracy, and what they decided.

I'd guess they would've decided to either discourage piracy, or at least not encourage it.

In the screenshot, the Google search query doesn't say anything about wanting to pirate, yet Google is suggesting piracy, a la entrapment.

(Though other history about that user may suggest a software piracy tendency, but still, Google knows what piracy seeking looks like, and they special-case all sorts of other topics.)

Is the ethics practice to wait to be sued or told by a regulator to stop doing something?

Or maybe they anticipate costs and competition for how they operate, and lobby for the regulation they want, so all they have to do is be compliant with it, and be let off the hook for lawsuits?

rustcleaner 4 hours ago||
"Piracy" today is not stealing IP. It's not even what it used to mean, when it was originally used to describe rogue publishers who violated copyright. IP laws as used today against private downloaders and users are the legalization of plundering of people who do the equivalent of hear a fact/idea and act on it or use it. IP cannot be stolen, an "immunity from plundering" fee is what's being paid (license). The whole justification for it with software, namely copying from disc/internet to local storage, and then copying from local storage into RAM, is a legal formality to facilitate this plundering.

It is plundering those who didn't pay you for legal immunity.

steve1977 8 hours ago||
Did Google ever have a real Don't be Evil era?
sowbug 8 hours ago|||
The original expression came out of an internal company discussion that someone summarized (paraphrased) as "when there's a tough choice to make, one is usually less evil. Make that choice."

In the early days of Google in the public consciousness, this turned into "you can make money without being evil." (From the 2004 S-1.)

Over time, it got shortened to "don't be evil." But this phrase became an obligatory catchphrase for anyone's gripes against Google The Megacorp. Hey, Google, how come there's no dark mode on this page? Whatever happened to "don't be evil"? It didn't serve its purpose anymore, so it was dropped.

Answering your question really depends on your priors. I could see someone honestly believing Google was never in that era, or that it has always been from the start. I strongly believe that the original (and today admittedly stale) sentiment has never changed.

ux266478 7 hours ago||
Making a loud affair out of its retirement, rather than quietly letting it collect dust and be forgotten over time, was most definitely not a good idea.

The public had already demonstrated that they adopted, misused, and weaponized the maxim. Its retirement just sharpened the edge of that weapon. Now instead of "What happened to don't be evil?" it's "Of course Google is being evil," and everything is viewed through that lens.

sowbug 7 hours ago||
A similar dynamic is playing out with Anthropic, whose founders left OpenAI in part over a philosophical split that could be described, if you'll grant a little literary license appropriate to this thread, as Anthropic choosing the "don't be evil" path. No surprise that we now see HN commentary skewering Anthropic for not living up to it.
neilv 8 hours ago||||
They had to at least nominally have it, early on, to be able to hire the best Internet-savvy people.

Tech industry culture today is pretty much finance bro culture, plus a couple decades of domain-specific conditioning for abuse.

But at the time Google started, even the newly-arrived gold rush people didn't think like that.

And the more experienced people often had been brought up in altruistic Internet culture: they wanted to bring the goodness to everyone, and were aware of some abuse threats by extrapolating from non-Internet society.

Minor49er 8 hours ago||||
If you need to sloganize a reminder to yourself to not be evil, that's not a promising sign
neilv 7 hours ago||
Early in Google's history, I took that sentiment as saying that they were one of us (Internet people), and weren't going to act like Microsoft (at the time, regarded by Internet people as an underhanded and ignorant company). Even though Google had a very nice IR function and general cluefulness, and seemed destined to be big and powerful.

And if it were the altruistic Internet people they hired, the slogan/mantra could be seen as a reminder to check your ego/ambition/enthusiasm, as well as a shorthand for communicating when you were doing that, and that would be respected by everyone because it had been blessed from the top as a Prime Directive.

Today, if a tech company says they aspire not to be evil: (1) they almost certainly don't mean it, in the current culture and investment environment, or they wouldn't have gotten money from VCs (who invest in people motivated like themselves); (2) most of their hires won't believe it, except perhaps new grads who probably haven't thought much about it; and (3) nobody will follow through on it (e.g., witness how almost all OpenAI employees literally signed to enable the big-money finance-bro coup of supposedly a public interest non-profit).

traderj0e 6 hours ago|||
I took it to mean, prioritize long-term growth over short-term income. But the slogan was silly even back then, like obviously an evil company would claim to not be evil.
neilv 6 hours ago||
If it was silly, a lot of altruistic people nevertheless fell for it.

For example, my impression at the time was that people thought that Google would be a responsible steward of Usenet archives:

https://en.wikipedia.org/wiki/Henry_Spencer#Preserving_Usene...

FWIW, it absolutely was believable to me at the time that another Internet person would do a company consistent with what I saw as the dominant (pre-gold-rush) Internet culture.

For example of a personality familiar to more people on HN, one might have trusted that Aaron Swartz was being genuine, if he said he wanted to do a company that wouldn't be evil.

(I had actually proposed a similar corporate rule to a prospective co-founder, at a time when Google might've still been hosted at Stanford. Though the co-founder was new to Internet, and didn't have the same thinking.)

1718627440 7 hours ago|||
In other words, the company made a bet on people's naivety, and it worked.
fragmede 8 hours ago|||
'99 to 2004. You had to have been there, maaaan...
steve1977 8 hours ago||
I've been there when Google was altavista.digital.com ;)
mwkaufma 8 hours ago||
Over a decade of circular "web apps are better for the subset of problems webapps are good at" tautologies.
traderj0e 5 hours ago|
Web apps weren't so easy to make back then, so standalone apps were the norm. Shortly before 2009 a lot of the web apps were Java or Adobe Flash, and 2009 was part of the transition period where platforms were at war with that stuff but open-web alternatives weren't mature yet.
mwkaufma 5 hours ago||
Yeah, I know, I was there. Also, today's wasteful "open-web alternatives" wouldn't have flown anyway; I recall min-specs during the XP era of, like, 800 MHz / 512 MB.
traderj0e 3 hours ago||
Yeah, they were like "Flash is too slow" then the replacement was 5x slower
yen223 1 hour ago||
No one argued Flash was too slow, they argued (correctly) that Flash was closed source, proprietary, and had a lot of security issues
traderj0e 1 hour ago||
The most famous criticism of Flash was "Thoughts on Flash" by Steve Jobs, which said among other things that it's too inefficient. He did cite inconsistent hardware acceleration for H.264 that was a real performance drawback of Flash for video in particular, and was also complaining about the power usage for interactive Flash content in general. Jobs was right at the time from what I can tell, but somehow the end result was even slower stuff. People did keep repeating the line that Flash is slow.

I also remember people citing performance as a reason YouTube switched from Flash to HTML5. Searching those blogs now is giving a lot of 404s. Like I said this should've helped since it's video, but somehow YouTube immediately got slower anyway back then. Back then I installed an extension to force it to use QuickTime Player for that reason.

The proprietary and insecure parts were real problems too. I'm fine with the decisions that were made, but this was a drawback.

sudb 9 hours ago||
I wonder what the numbers say about desktop applications now, and how much the arrival of Electron changed things up here.

Nowadays, it seems to be that mobile apps have the "best metrics" for b2c software. I'd be interested to read a contemporary version of this article.

xp84 8 hours ago||
“Metrics”

This reminds me of a past job working for an e-commerce company. This wasn't a store like Amazon that "everyone" uses weekly; it was a specific pricey fashion brand. They had put out a shitty iOS app, which was just a very bare-bones wrapper around the website. But they raved about how much better the conversion rates were there. Nobody would listen to me about how the customers who bother downloading a specific retailer's shopping app are obviously just superfans, so of course that self-selected group converts well.

So many people who should be smart based on their job titles and salaries, got the causation completely backwards!

drBonkers 7 hours ago|||
Hey, I notice this kind of thing all the time. People use "data" to tell the story they want to, similar to how humans seem to make a decision subconsciously and then weave a rationalization to back it up afterwards.

Do you have principles on how to tackle this? I feel stuck between the irrationality of anecdata and the irrationality of lying with numbers. As if the only useful statistic is one I collect and calculate myself. And, even then, I could be lying to myself.

gridder 5 hours ago||||
Survivorship bias
zephen 7 hours ago|||
This stupidity might go a long way towards explaining the relentless push towards apps.
hermitcrab 9 hours ago|||
Some of us are still making a living from desktop apps, 17 years later.
xantronix 6 hours ago||
Please tell your tales. We beseech thee of thine humble wisdom.
hermitcrab 4 hours ago||
I've written about it lots at:

https://www.successfulsoftware.net

yshamrei 9 hours ago|||
In 2026, the number of mobile applications in the App Store and Google Play increased by 60% year over year, largely because entry into the market has become much easier thanks to AI.
stackghost 9 hours ago|||
Electron is the worst of both worlds. I have never paid for an Electron app, and never will. Horrid UX.
bigyabai 9 hours ago||
> I have never paid for an Electron app

Your employer most likely has.

stackghost 9 hours ago||
Sure, and so has my government. But I can only control what I personally pay for.
hermitcrab 9 hours ago||
What 'best metrics'?
joenot443 7 hours ago|||
I think in this case it can be approximated as 'largest market'

I'd wager there are more people paying for software for their smart phone than any other platform they use.

bee_rider 6 hours ago|||
Having my credit card already is an overwhelming advantage for the Apple App store and for Steam. I won’t say it is impossible to overcome, but I think I could count on my fingers the number of instances where I, like, typed my card into a website to buy anything, in the last decade.
hermitcrab 6 hours ago|||
Yes, but they are mostly paying little or nothing. How much did you spend on phone apps this year? And ads pay a pittance, unless you have massive scale.
sudb 8 hours ago|||
Anecdotally, conversion - from free to trial, trial to paid, one-off purchases, etc.
onionisafruit 9 hours ago||
This is from 2009, and the title should say so.
trueno 1 hour ago|
Yeah, perfect. I came in here to say:

I'm done making web apps (2026).

Seriously, desktop apps kinda own. I just desktop-app'd a PWA, made it do SSO auth at my org, and now it's just part of the self-serve application download kiosk, and we're laughing at all the pain we've endured over so many years writing up proposals and billing to scale up web-app infra for internal tooling and stuff.

I'm kinda enjoying coming back to earth right now with my team, and we're hmmmmmmm'ing at a lot of things like this. We've had devops chasing 23498234892% availability with k8s and load balancers and all this stuff, and we're now assessing how much of that cruft was completely unnecessary, made everything some amorphous blob of complexity and unpredictable billing, and really gave devops a moat to just say "no" to so many things that came through the pipeline. There are so many things that can just be dragged back to an actual on-premise machine and served up through the internal network. We are... amused at how self-important we made ourselves out to be this past decade.

We're probably days' worth of goofing away from buying a few Mac minis, plugging them into some uninterruptible power supplies, and seeing how un-serious we can get with so much of the tooling we've built over the years. And for everything else: desktop apps. Seriously, desktop apps are like free infrastructure if you build them right.

arikrahman 1 hour ago||
I came to the same conclusion, but because the web tooling is much better with Clojure. The Clojure toolchains for developing desktop apps, like ClojureDart, are too brittle in my experience.
autonomousErwin 9 hours ago||
Grass always looks greener on the other side, mainly because it's been fertilised.
binaryturtle 8 hours ago|
No, "the grass always looks greener on the other side" is a perspective thing. If you stand on your own grass, you look down onto it and see the dirt, but if you look over to the other side, you see the grass from the side, which makes it look denser and hides the dirt. But it's the same boring grass everywhere. :)
em-bee 5 hours ago|||
hah, i have been arguing this for years. first time to see someone else making the same argument. nice!
CobrastanJorji 5 hours ago||||
At first, I thought "this is missing the point of the phrase" and moved on, but now I'm back to say it's stuck in my head and an intuitive, pretty neat way to think about it.
nothrabannosir 8 hours ago|||
I preferred GP's poop-joke version, but to each their own.
HeyLaughingBoy 7 hours ago|
It's hard to believe not only that this is 17 years old, but that I remember when he posted it!
projektfu 6 hours ago|
Lol, it's less than 10 years old. The 80s were 20 years ago.