Posted by marbartolome 2 days ago

What happened to running what you wanted on your own machine? (hackaday.com)
404 points | 282 comments
Gigachad 2 days ago|
What happened was people ended up putting a lot of money and sensitive data on their computers and desired a system which wouldn’t expose that just because they ran the wrong software.
Dilettante_ 2 days ago||
"Wash me but don't get me wet." (Is this a saying in english?)
baxtr 2 days ago|||
I guess you are trying to say: "You can’t have your cake and eat it too." ?!
lupire 2 days ago||
Also, "want the milk without buying the cow", but I like "don't get me wet" because it highlights not wanting the result without the unpleasant step of the process. Then again, we have "dry cleaning" and Ozempic... https://english.stackexchange.com/questions/429316/wash-me-b...
bitwize 1 day ago|||
I'm reminded of a meme involving a dog with a ball: "Please throw? No take. Only throw."
api 2 days ago|||
The better answer is to build better OSes with better security models.

I should be able to run a crypto wallet I downloaded from a Kim Jong Un fan site while high and it shouldn’t be able to do anything I don’t give it permission to do.

It’s totally possible. Tabs in a web browser are basically this.

I can do it with VMs but that’s lots of extra steps.
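
Roughly the kind of model I mean, sketched in TypeScript against the standard browser Clipboard and camera APIs (the function names here are just illustrative): the page gets nothing until the user explicitly grants it.

    async function readClipboardIfAllowed(): Promise<string | null> {
      try {
        // The browser prompts the user (or consults a prior grant) before
        // handing over clipboard contents; the page cannot skip the prompt.
        return await navigator.clipboard.readText();
      } catch {
        // Permission denied or API unavailable: the page simply gets nothing.
        return null;
      }
    }

    async function startCameraIfAllowed(): Promise<MediaStream | null> {
      try {
        // Same pattern: a user-visible prompt gates access to the hardware.
        return await navigator.mediaDevices.getUserMedia({ video: true });
      } catch {
        return null;
      }
    }

An OS with the same defaults would treat every app like one of these tabs: no filesystem, camera, or network access until the user says yes, and a denial just means the app doesn't get the data.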

colonial 1 day ago|||
macOS kinda gets there. I've (grudgingly) come to admit that it has by far the best security story of any desktop operating system. Apps require explicit user consent to access the filesystem, peripherals, and other sensitive data (e.g. Discord requests "Input Monitoring" access to determine if you're "actively online" even when unfocused).

The only place it seems to fall flat is network I/O - LAN access requires permission, but dialing out to the wider Internet does not.

Compare Windows, which has jack (except for bloated anti-malware hooks in NTFS).

Linux is _trying_ to replicate macOS with Flatpak/XDG portals, but those still need more time in the oven.

Source: I use both a MacBook and a Linux desktop daily.

netdevphoenix 2 days ago||||
Web pages have a lot of restrictions, even if you consider the gradual adoption of the Project Fugu APIs.
fuzzehchat 2 days ago|||
Isn't that what Qubes is all about?
api 2 days ago||
Yes, but IMHO that approach is a hack. “Fix our 1970s OS by putting it in a box in our 1970s OS.”
immibis 2 days ago|||
And by "people" we mean Hollywood. A great deal of this was created to enable DRM, then exploited for other purposes. For instance, it's illegal (by contract) to let a device without Secure Boot play a 4K stream from any mainstream studio. This is why Windows requires Secure Boot.
ranger_danger 1 day ago||
> This is why Windows requires Secure Boot.

No it isn't, and no it doesn't.

immibis 5 hours ago||
Elaborate?
Workaccount2 1 day ago|||
This is the real answer, and it's rather banal and boring compared to conspiracy theories about nefarious money harvesting.

95% of people don't know what "run your own software" means, because to them, the app store lets them choose which apps to install. And they don't get viruses and malware like their 2008 laptop did.

That being said, there absolutely needs to be a mechanism for "lowering the gates" if the user wants full control of the device they own.

matheusmoreira 2 days ago||
Ah yes, the good old freedom for security tradeoff. Of course, in this case it's the security of trillion dollar corporations at the cost of our freedoms...
cadamsdotcom 1 day ago||
Time for a Digital Bill of Rights.

What would you include?

buyucu 1 day ago||
Answer: companies realized that they can milk you for more money by restricting your options and alternatives.
amelius 1 day ago|
Yes, this is the main idea behind iOS and the App Store. I don't get why smart people are falling for this.
ryandrake 1 day ago|||
Let me try to strawman a little: I personally accept this on my phone because I honestly don't consider my phone to be a computer, and I don't really care about "computing" on it. My phone is not really that important to me. It is a toy/appliance that I goof around with. What it's running and how "free" and "open" it is, is about as important to me as how free the firmware in my car is, or the software on my gaming console.

I care about the free-ness and open-ness of my computer, because that's where I do all my work, my E-mail, my finances, and all my "serious computing." I feel that a different standard applies on a Real Computer because they are totally different devices, used for totally different purposes. So what I accept on phones, cars, and gaming consoles, I don't accept on my computer.

lejalv 1 day ago|||
While this is fine for you, I worry about a sociocultural divide.

I believe the likelihood of a smartphone being the only form of computing (and access to the internet in particular) grows with diminishing income / cultural means.

This is based on anecdotal observation, does anybody here know of relevant survey data?

realityfactchex 1 day ago||
> relevant survey data

Based on a cursory look, keywords can include "smartphone-only internet users" and "large-screen computer ownership".

The American Community Survey asks questions related to that (income, computing devices). Comparing states, the poorer the residents of a state, the smaller the percent of households with regular computers ("large-screen computer ownership"), per "Computer Ownership and the Digital Divide" (Mihaylova and Whitacre, 2025) [0, 1, 2].

Also, Pew runs surveys on income and device usage ("smartphone-only"). Again, the lower the income, the higher the proportion that is smartphone-only [3, 4].

[0] Chart: https://files.catbox.moe/emdada.png

[1] Paper, "Census Data with Brian Whitacre.pdf": https://files.catbox.moe/1ttgee.pdf

[2] Web: https://www.benton.org/blog/computer-ownership-and-digital-d...

[3] Pew chart: https://files.catbox.moe/fs62tf.png

[4] Pew web: https://www.pewresearch.org/internet/fact-sheet/mobile/

EvanAnderson 1 day ago||
It sounds like lower income people aren't Real People and don't need Real Computers.

The idea that smartphones aren't computers and their users aren't deserving of software freedom is frustratingly entitled.

amelius 1 day ago||||
I suppose the reason is that this is how it has always been with mobile computing. People don't even bother to think of their smartphone as a computer anymore.
buyucu 1 day ago|||
You have nothing to fear, if you have nothing to hide. Right?
everyone 2 days ago||
Part of the cycle .. https://www.goodreads.com/book/show/8201080-the-master-switc...
mikewarot 1 day ago||
I believe that in the depths of the Cold War, when personal computers were just showing up, it was decided, deep within the National Security Agency, that it was more advantageous to let them continue to proliferate without fostering secure operating systems, though they were available.

We all now live with the blowback from that decision. Most people don't even realize that actually secure computing is a possibility now, even here on HN.

This general insecurity means that anything exposed to raw internet will be compromised and therefore significant resources must be expended to manage it, and recover after any incidents.

It's no wonder that most people don't want to actually run their own servers. Thus we give up control, and this... situation... is the result.

SpicyLemonZest 1 day ago|
I affirmatively argue that actually secure computing is not a possibility. It's fun to build toy models where every process has exactly the permissions it needs and no more, sure. In the real world, your users are going to grant superuser/admin permissions to random installers, and they're not going to perform the complex verification rituals you told them to do beforehand.

It's like trying to set up a warehousing system so perfect that the shrinkage rate is 0.

bigbuppo 1 day ago||
It doesn't increase shareholder revenue. That is the second highest calling. The only thing more important is marketing and advertising, and this also helps that, so hey, two birds one stone.
lapcat 2 days ago||
> The moment gaming became genuinely profitable, console manufacturers realized they could control their entire ecosystem. Proprietary formats, region systems, and lockout chips were all valid ways to ensure companies could levy hefty licensing fees from developers.

This is historically inaccurate. All console games were originally produced in-house by the console manufacturers, but then four Atari programmers got wind that the games they wrote were making tens of millions of dollars for Atari while the programmers were paid only a relatively small salary. When Atari management refused to give the programmers a cut, they left and formed Activision. Thus Activision became the original third-party console game development company. Atari sued Activision for theft of trade secrets, because the Activision founders were all former Atari programmers. The case was settled, with Atari getting a cut of Activision's revenue but otherwise allowing Activision to continue developing console games. I suspect this was because the four programmers were considered irreplaceable to Atari (albeit too late, after they had already quit).

The licensing fee business model was a product of this unique set of circumstances. The article author's narrative makes it sound like consoles switched from open to closed, but that's not true. The consoles (like the iPhone) switched from totally closed to having a third-party platform, after the value of third-party developers was shown.

> Consumers loved having access to a library of clean and functional apps, built right into the device.

How can you say they're "built right into the device" when you have to download them? Moreover, you were originally able to buy iPhone apps in iTunes for Mac, and manage your iPhone via USB.

> Meanwhile, they didn’t really care that they couldn’t run whatever kooky app some random on the Internet had dreamed up.

I'm not sure how you can say consumers didn't really care. Some people have always cared. It's a tradeoff, though: you would have to care enough to not buy an iPhone altogether. That's not the same as not caring at all. Also, remember that for the first year, iPhone didn't even have third-party apps.

> At the time, this approach largely stayed within the console gaming world. It didn’t spread to actual computers because computers were tools. You didn’t buy a PC to consume content someone else curated for you.

I would say this was largely due to Steve Wozniak, who insisted that the Apple II be an open platform. If Steve Jobs—who always expressed contempt for third-party developers—originally had his way, the whole computing industry might have been very different. Jobs always considered them "freeloaders", which is ridiculous of course (for example, VisiCalc is responsible for much of the success of the Apple II), but that was his ridiculous view.

NoSalt 1 day ago||
> "When the microcomputer first landed in homes some forty years ago, it came with a simple freedom—you could run whatever software you could get your hands on. Floppy disk from a friend? Pop it in. Shareware demo downloaded from a BBS? Go ahead! Dodgy code you wrote yourself at 2 AM? Absolutely. The computer you bought was yours. It would run whatever you told it to run, and ask no questions."

Nothing written in the rest of the article after this statement has any bearing on it. Sure, they mention the "Microsoft Store", but aside from that, you still have the freedom to run whatever software you want on your own desktop computer, laptop computer, or server (Linux, Windows, or Macintosh)... nothing has changed about this. I, for one, like the increased security on mobile devices. As far as gaming goes, I am not a gamer, so I just do not care.

swiftcoder 1 day ago|
> or Macintosh

I'm not sure how many Macs you've used lately, but this isn't entirely true: out-of-the-box, Macs only run software that has been signed and notarised by Apple.

You can still disable this, but the methods of disabling are getting more obscure, and it's not a given that they will remain available.

NoSalt 1 day ago||
> "You can still disable this, but the methods of disabling are getting more obscure"

Which is why after Snow Leopard, I switched to Linux 100%.

Yeul 1 day ago||
Windows 11 gives me a giant warning if I actually want to run something.

Computers nowadays are so weird.

jmclnx 1 day ago|
My fear is that, with IBM and AI, Linux could go down this path.

I remember seeing that KDE and GNOME already have their "stores"; we need to keep a close eye on Linux.
