Posted by marbartolome 2 days ago
I should be able to run a crypto wallet I downloaded from a Kim Jong Un fan site while high and it shouldn’t be able to do anything I don’t give it permission to do.
It’s totally possible. Tabs in a web browser are basically this.
I can do it with VMs but that’s lots of extra steps.
The only place it seems to fall flat is network I/O - LAN access requires permission, but dialing out to the wider Internet does not.
Compare Windows, which has jack (except for bloated anti-malware hooks in NTFS).
Linux is _trying_ to replicate macOS with Flatpak/XDG portals, but those still need more time in the oven.
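For anyone who wants to poke at this today, a rough sketch of the Flatpak side (org.example.App is a placeholder ID, not a real package):

    # list what the sandbox currently grants the app
    flatpak info --show-permissions org.example.App

    # revoke network access for this app only
    flatpak override --user --unshare=network org.example.App

The catch, as far as I can tell, is that Flatpak's network permission is all-or-nothing: there is no separate "LAN yes, wider Internet no" split at all.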
Source: I use both a MacBook and a Linux desktop daily.
No it isn't, and no it doesn't.
95% of people don't know what "Run your own software" means, because to them, the app store lets them choose what apps to install. And they don't get viruses and malware like their 2008 laptop did.
That being said, there absolutely needs to be a mechanism for "lowering the gates" if the user wants full control of the device they own.
What would you include?
I care about the free-ness and open-ness of my computer, because that's where I do all my work, my E-mail, my finances, and all my "serious computing." I feel that a different standard applies on a Real Computer because they are totally different devices, used for totally different purposes. So what I accept on phones, cars, and gaming consoles, I don't accept on my computer.
I believe the likelihood of a smartphone being someone's only form of computing (and their only access to the internet in particular) grows as income and cultural means diminish.
This is based on anecdotal observation, does anybody here know of relevant survey data?
Based on a cursory look, keywords can include "smartphone-only internet users" and "large-screen computer ownership".
The American Community Survey asks questions related to that (income, computing devices). Comparing states, the poorer a state's residents, the lower the percentage of households with regular computers ("large-screen computer ownership"), per "Computer Ownership and the Digital Divide" (Mihaylova and Whitacre, 2025) [0, 1, 2].
Also, Pew runs surveys on income and device usage ("smartphone-only"). Again, the lower the income, the higher the proportion that is smartphone-only [3, 4].
[0] Chart: https://files.catbox.moe/emdada.png
[1] Paper, "Census Data with Brian Whitacre.pdf": https://files.catbox.moe/1ttgee.pdf
[2] Web: https://www.benton.org/blog/computer-ownership-and-digital-d...
[3] Pew chart: https://files.catbox.moe/fs62tf.png
[4] Pew web: https://www.pewresearch.org/internet/fact-sheet/mobile/
The idea that smartphones aren't computers and their users aren't deserving of software freedom is frustratingly entitled.
We all now live with the blowback from that decision. Most people don't even realize that actually secure computing is a possibility now, even here on HN.
This general insecurity means that anything exposed to the raw internet will be compromised, and therefore significant resources must be expended to manage it and to recover after any incidents.
It's no wonder that most people don't want to actually run their own servers. Thus we give up control, and this... situation... is the result.
It's like trying to set up a warehousing system so perfect that the shrinkage rate is 0.
This is historically inaccurate. All console games were originally produced in-house by the console manufacturers, but then four Atari programmers got wind that the games they wrote made tens of millions of dollars for Atari while the programmers were paid only a relatively small salary. When Atari management refused to give the programmers a cut, they left and formed Activision. Thus Activision became the original third-party console game development company. Atari sued Activision for theft of trade secrets, because the Activision founders were all former Atari programmers. The case was settled, with Atari getting a cut of Activision's revenue but otherwise allowing Activision to continue developing console games. I suspect this was because the four programmers were considered irreplaceable to Atari (a realization that came too late, after they had already quit).
The licensing fee business model was a product of this unique set of circumstances. The article author's narrative makes it sound like consoles switched from open to closed, but that's not true. The consoles (like the iPhone) switched from totally closed to having a third-party platform, after the value of third-party developers was shown.
> Consumers loved having access to a library of clean and functional apps, built right into the device.
How can you say they're "built right into the device" when you have to download them? Moreover, you were originally able to buy iPhone apps in iTunes for Mac, and manage your iPhone via USB.
> Meanwhile, they didn’t really care that they couldn’t run whatever kooky app some random on the Internet had dreamed up.
I'm not sure how you can say consumers didn't really care. Some people have always cared. It's a tradeoff, though: you would have to care enough to not buy an iPhone altogether. That's not the same as not caring at all. Also, remember that for the first year, iPhone didn't even have third-party apps.
> At the time, this approach largely stayed within the console gaming world. It didn’t spread to actual computers because computers were tools. You didn’t buy a PC to consume content someone else curated for you.
I would say this was largely due to Steve Wozniak, who insisted that the Apple II be an open platform. If Steve Jobs, who always expressed contempt for third-party developers, had originally had his way, the whole computing industry might have been very different. Jobs always considered them "freeloaders", which is ridiculous of course (VisiCalc, for example, was responsible for much of the success of the Apple II), but that was his view.
None of what is written in the rest of the article has any bearing on this statement. Sure, they mentioned the "Microsoft Store", but aside from that, you still have the freedom to run whatever software you want on your own desktop, laptop, or server (Linux, Windows, or Macintosh) ... nothing has changed about this. I, for one, like the increased security on mobile devices. As far as gaming goes, I am not a gamer, so I just do not care.
I'm not sure how many Macs you've used lately, but this isn't entirely true: out of the box, Macs will only run software that has been signed by an identified developer and notarised by Apple.
You can still disable this, but the methods of doing so are getting more obscure, and it's not a given that they will remain available.
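For the record, the current escape hatches, as a sketch (exact behaviour varies by macOS release, and /Applications/SomeApp.app is a placeholder path):

    # per-app: strip the quarantine attribute Gatekeeper keys off of
    xattr -dr com.apple.quarantine /Applications/SomeApp.app

    # global: allow apps from anywhere (deprecated, and reportedly
    # no longer honoured on the newest releases)
    sudo spctl --master-disable

The trend backs the point above: Sequoia reportedly removed the right-click > Open shortcut for unsigned apps, pushing everything through System Settings instead.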
Which is why after Snow Leopard, I switched to Linux 100%.
Computers nowadays are so weird.
I remember seeing that KDE and GNOME already have their own "stores"; we need to keep a close eye on Linux.