Posted by em-bee 10/26/2024
If anything, I expect those existing VMs to slowly be replaced by WebAssembly due to how crucial and complicated that very specific sandbox requirement is - and how useful that is once you have it working reliably.
Personally I never want to run untrusted code on any of my computers outside of a robust sandbox. I look forward to a future where every application I might install runs in a sandbox that I can trust.
https://news.ycombinator.com/item?id=9732827
The Web is an evolving system too large and long-lived for any single company, stable consortium, or standards body to pull off that kind of coup, so none of Java, Flash (AVM), .NET/CLR, NaCl/PNaCl, Dart, and the others I have forgotten about ever had a chance to take over.
JS got out first and evolved through several jumps into https://asmjs.org/, a statically typed subset suitable, via AOT+JIT techniques, for hosting near-native-speed code such as Unreal Engine 3. https://brendaneich.com/2013/03/the-web-is-the-game-platform...
Java was mismanaged as a plugin (and only ever a plugin -- no deep or even shallow browser integration worth talking about) by Sun, who tried getting it into Windows while Microsoft was killing Netscape (Microsoft then killed Java on Windows and pulled the trigger on .NET; Oracle later bought Sun).
Flash had its day but fell to HTML5 and fast JS; Adobe threw in the towel well before the Wasm announcement, and even salted the earth re: good Flash tools instead of retargeting them at the Web.
Google was a house divided all along but had absolutely no plan for getting PNaCl supported by Apple, never mind Mozilla or Microsoft. I told them so, and still get blame and delicious tears to drink as I sit on my Throne of Skulls, having caused all of this by Giant-Fivehead mind control (testimony from one of my favorite minions at https://news.ycombinator.com/item?id=9555028).
https://news.ycombinator.com/item?id=1905155
https://kripken.github.io/talks/2020/universal.html#/ (from Alon Zakai in 2020)
The more important thing to consider, however, is that the CLR, the JVM, etc. provide internal memory safety, whereas Wasm runtimes don't.
e.g. on a native target, a C program that indexes far enough out of bounds on an array will typically hit an unmapped page and segfault, but that runtime error does not necessarily occur on a wasm target: the out-of-bounds access can land on other data inside the module's linear memory and silently succeed. That is to say, the program in the sandbox can have totally strange runtime behavior -- still defined behavior according to wasm -- even though the program has undefined behavior in the source language. In JVM languages this can't really happen, because every array access is bounds-checked.
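To make that concrete, here is a minimal sketch (the variable names and values are illustrative, and exactly which bytes the bad read returns depends on how the toolchain, e.g. clang targeting wasm32, lays out globals):

    #include <stdio.h>

    /* small[6] is undefined behavior in C. Compiled to wasm32, though,
     * the load is a well-defined read from the module's linear memory:
     * it only traps if the address falls outside the memory bound,
     * which is page-granular (64 KiB) -- far coarser than the array.
     * So the program prints whatever bytes the toolchain happened to
     * place after `small`, with no runtime error at all. */
    int small[4] = {1, 2, 3, 4};
    int nearby   = 42; /* may or may not be the value that leaks */

    int main(void) {
        printf("%d\n", small[6]); /* UB in C; defined (garbage) in wasm */
        return 0;
    }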
> As told in JavaScript: The First Twenty Years, Brendan Eich joined Netscape in April 1995.
> [..]
> However, Eich didn’t think he’d have to write a new language from scratch. There were existing options available — such as the research language, Scheme, or a Unix-based language like Perl or Python. So when he joined, Eich “was expecting to implement Scheme in the browser.” But the increasingly fractious politics of the software companies of the day (it was, basically, everyone against Microsoft) soon saw the project take a more creative turn.
> On 23 May 1995, Sun Microsystems launched a new programming language into the world: Java. As part of the launch, Netscape announced that it would license Java for use in the browser. This was all well and good, but Java didn’t really fit the bill for the web. Java is a general-purpose programming language that promised Write Once, Run Anywhere (WORA) functionality, but it was too complicated for web designers and other non-programmers to use. So Netscape decided it needed a scripting language, which was a trendy term at the time for a smaller, easier to learn programming language.
There's a whole lot more interesting stuff but I think that part directly answers most of what you're wondering.
Look at Java bytecode and you'll see it features things such as a goto with an arbitrary offset: https://en.m.wikipedia.org/wiki/List_of_Java_bytecode_instru...
They had to build a verifier that attempts to ensure the bytecode isn't doing anything bad. That proved fairly difficult, and it comes at a considerable cost.
I'm not a huge fan of WASM, but it's easy to see that the authors did not want to leave control in the hands of Microsoft or Oracle (and as a result all of us are hostages to Google instead, because of the evil that is Chromium).
https://ecma-international.org/publications-and-standards/st...
The JVM and CLR are poor compilation targets for C and C++, because those languages weren't designed to target those runtimes and those runtimes weren't designed to run those languages. (C++/CLI isn't C++.) It's possible to get something working, and a few people have tried, but you run into a lot of impedance mismatches and compatibility issues. I think you would see people run into a lot more problems trying to get their code running on the JVM or CLR than they in fact run into trying to get it running on WebAssembly. (Though I think the CLR is less bad about this than the JVM.)
As for the idea of using LLVM bitcode as an interchange format, we don't have to guess how that would have gone, because it was actually tried! Google implemented this in Chrome and called it PNaCl, and some sites and extensions relied on it for a while. They ultimately withdrew it in favor of WebAssembly. I don't understand all the reasons why it failed, but I think part of the problem is that it ran into a bunch of "the spec is whatever LLVM happens to do" type issues that were real problems for would-be toolchain authors and made the other browser vendors (including Apple, LLVM's de facto primary corporate sponsor) reluctant to support it. WebAssembly has a relatively short and simple standard that you can actually read; writing a WebAssembly interpreter is an undergraduate exercise, though of course writing a highly performant one is much more work.
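To give a flavor of that exercise, here is a hedged sketch of the core of such an interpreter: a loop over bytecode driving an operand stack. It handles just three real wasm opcodes and skips everything that makes a production engine hard (validation, full LEB128 decoding, control flow, linear memory):

    #include <stdint.h>
    #include <stdio.h>

    /* Real wasm opcode values for the three instructions handled. */
    enum { OP_I32_CONST = 0x41, OP_I32_ADD = 0x6A, OP_END = 0x0B };

    int32_t run(const uint8_t *code) {
        int32_t stack[64];
        int sp = 0;
        for (;;) {
            switch (*code++) {
            case OP_I32_CONST:
                /* wasm uses signed LEB128; one byte suffices here */
                stack[sp++] = (int8_t)*code++;
                break;
            case OP_I32_ADD:
                sp--;
                stack[sp - 1] += stack[sp];
                break;
            case OP_END:
                return stack[sp - 1];
            }
        }
    }

    int main(void) {
        /* equivalent of: (i32.const 2) (i32.const 3) i32.add */
        const uint8_t prog[] = { OP_I32_CONST, 2, OP_I32_CONST, 3,
                                 OP_I32_ADD, OP_END };
        printf("%d\n", run(prog)); /* prints 5 */
        return 0;
    }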
Also, as far as I can tell, LLVM has never been optimized for the use case of runtime code generation, where the speed of the compiler matters about as much as the speed of the generated code. The biggest dynamic language I know of that uses LLVM is Julia, which is a decently big deal, but the overwhelming majority of LLVM usage is ahead-of-time compilation of languages like C, C++, Swift, and Rust.
On a bigger-picture note, I'm not sure I understand why adopting an existing bytecode language would have made things easier. Yes, it would have been much easier to reuse existing Java code if the JVM had been adopted, or existing C# code if the CLR had been adopted, but those options are mutually exclusive; the goal was something that would work at least okay for all languages. Python doesn't have a stable bytecode format, and Rust and Haskell compile to LLVM bitcode (which LLVM has no problem lowering to WebAssembly, since WebAssembly was designed to make that straightforward), so I don't see how those languages are in any way disadvantaged by the choice of WebAssembly as the target bytecode language instead of some alternative.
Or are your concerns about I/O? That's a bigger can of worms, and you'd need to explain how you imagine this would work, but the short version is that reusing the interfaces that existing OSes provided would not have worked well, because the browser has a different (and in many ways better) security model.
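To give a concrete taste of that model, here is a sketch assuming a WASI toolchain (e.g. wasi-sdk) and a runtime such as wasmtime; the filename notes.txt is hypothetical. The C source looks like ordinary file I/O, but the runtime only resolves paths inside directories the host explicitly grants, e.g. with wasmtime --dir=. prog.wasm -- there is no ambient filesystem authority.

    #include <stdio.h>

    /* Ordinary-looking C. Under WASI, fopen succeeds only if the host
     * preopened a directory containing notes.txt; otherwise it fails
     * cleanly instead of the program having free run of the disk. */
    int main(void) {
        FILE *f = fopen("notes.txt", "r");
        if (!f) {
            perror("fopen"); /* expected when nothing was preopened */
            return 1;
        }
        char buf[128];
        if (fgets(buf, sizeof buf, f))
            fputs(buf, stdout);
        fclose(f);
        return 0;
    }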
And it was used by some browsers; there was just no consensus between the different vendors, due to politics. The problem largely solved itself by... only one vendor remaining, Chromium.
Also, "the proposed solution is not to backtrack on existing features" makes very little sense. If you're going to split something into core and "compiles down to core", then a LOT of features can be moved out of core because they're just (definitely worth keeping, but not necessary in core if that split were made) convenience APIs.
Incorrect: https://ecma-international.org/about-ecma/history/
While it is admittedly confusing to have all these different flavors of JS, I don’t think this proposal is actually as radical as it seems.
It is a disaster the moment you try to run the same code on different platforms.
So now huge swaths of use cases are going to be killed by this change, e.g. AdBlock, NewPipe, and yt-dlp -- how is that better? All of them (except maybe AdBlock) rely on parsing the incoming JS from YouTube, and a WebAssembly blob renders that approach obsolete.
https://www.amazon.science/blog/how-prime-video-updates-its-...
There is already too much exhaustion around switching frameworks and paradigms in the JS world, but I guess everyone likes getting jerked around by corpos and evangelists these days.
On the backend there are very few issues (outside of FFI still being unstable in Deno, I suppose), but you could frankly be running the same old Express API you did a decade ago and be perfectly fine.
If you’re burnt out on changes and keeping up with things I think the issue is mostly a “you” issue. You don’t have to chase down the latest hypes or fads. In fact I think you almost never should.
This is an incredibly disingenuous response. You maybe like the world this way. It doesn’t mean there isn’t room for change or improvement away from Javascript.
What's wrong with VanillaJS?
It’s not just the flavor of the week frameworks, it’s libraries and best practices. Want to work with dates? Do you use moment? Nope that’s deprecated, what do you use? Which moment successor? How do you write react? Classes or functions? You can’t use hooks with classes, so you better update to functions. On and on you run into a decision tree because of the shifting target of javascript. It causes a lot of churn to be migrating and updating to new systems, especially when the new hire can’t help because they don’t understand prototypal inheritance.
I can tell you such stories about any language, it’s not unique to JS. Welcome to working with people.
I had always imagined that if the DoJ took any action it would be to cleave the ad business away from Google. Although if they went so far as to take action against GCP I bet Amazon, Amazon Marketplace, and AWS would start to get sweaty palms