cl yourprogram.c /link user32.lib advapi32.lib ... etc etc ...
I've built a load of utilities that do that just fine. I use vim as an editor. The Visual Studio toolchain does have LTSC and stable releases - no one seems to know about them, though. See: https://learn.microsoft.com/en-gb/visualstudio/releases/2022... - you should use these if you are not a single developer and have to collaborate with people, like back in the old days when we had pinned toolchain versions across the whole company.
[1] https://download.visualstudio.microsoft.com/download/pr/5d23...
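To make the cl line above concrete, here is a minimal sketch of the kind of standalone utility it can build (the file name and message text are just placeholders):

    #include <windows.h>

    /* yourprogram.c - build with: cl yourprogram.c /link user32.lib */
    int main(void)
    {
        /* MessageBoxW lives in user32.lib, hence the /link argument */
        MessageBoxW(NULL, L"Built with nothing but cl.exe and the SDK headers.",
                    L"Hello", MB_OK);
        return 0;
    }

No project files, no IDE - just the compiler and whatever import libraries the program touches.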
You only get access to the LTSC channel if you have a license for at least Visual Studio Professional (Community won't do it); so a lot of hobbyist programmers and students are not aware of it.
On the other hand, its existence is in my experience very well-known among people who use Visual Studio for work at some company.
There are licensing constraints. IANAL, but essentially you need a Pro-or-higher license on the account if you're going to use it to build commercial software or in a business environment.
How strict Microsoft is with enforcement of this license is another story.
> Previously, if the application you were developing was not OSS, installing VSBT was permitted only if you had a valid Visual Studio license (e.g., Visual Studio Community or higher).
From (https://devblogs.microsoft.com/cppblog/updates-to-visual-stu...). For OSS, you do not even need a Community License anymore.
> if you and your team need to compile and develop proprietary C++ code with Visual Studio, a Visual Studio license will still be required.
I don't need Visual Studio to write, read, compile, or link any code using the toolchain.
The GPL was made in response to restrictive commercial licensing. Yes, it uses the same legal instrument (a license), but it was made in response!
So if proprietary licensing ceases to exist, it's not a problem if the GPL also ceases to exist.
Also: it's quite obvious to me that IP law nowadays goes much too far. It may have been a good idea at first, but now it's a monster (and people seem to die because of it: Aaron Swartz and Suchir Balaji come to mind).
https://www.stacksocial.com/sales/microsoft-visual-studio-pr...
At least in the EU, this is legal.
An article about a 2012 court decision by the EuGH (the EU Court of Justice):
https://www.heise.de/hintergrund/EuGH-Gebrauchte-Softwareliz...
Another court decision, by the BGH (the highest German civil court) in 2014, that builds on this EuGH decision:
https://www.heise.de/news/BGH-begruendet-Rechtmaessigkeit-de...
Visual Studio is a dog, but at least it's one dog - the real hell on Windows is .NET Framework. The sheer incongruity of which version of Windows has which version of .NET Framework installed, and which version of .NET your app will actually run in when launched... The real solution at scale for universal Windows compatibility in a .NET app is to build a C++ shim that checks the installed .NET versions beforehand and launches the app against the correct one when multiple versions conflict - you can literally have 5 fully distinct runtimes sharing the same .NET target.
If you somehow experience an actual dependency issue that involves glibc itself, I'd like to hear about it. Because I don't think you ever will. The glibc people are so serious about backward and forward compatibility, you can in fact easily look up the last time they broke it: https://lwn.net/Articles/605607/
Now, if you're saying it's a dependency issue resulting from people specifying wrong glibc version constraints in their build… yeah, sure. I'm gonna say that happens because people are getting used to pinning dependency versions, which is so much the wrong thing to do with glibc it's not even funny anymore. Just remove the glibc pins if there are any.
As far as the toolchain as a whole is concerned… GCC broke compatibility a few times, mostly in C++ due to having to rework things to support newer C++ standards, but I vaguely remember there was a C ABI break somewhere on some architecture too.
Only the latest .NET Framework 4.8 is shipped with Windows at this point.
.NET 10 supports a Windows 10 build from 10 years ago.
We had only just deprecated support for XP in 2020 - this was for a relatively large app publisher with ~10M daily active users on Windows. The installer was a C++ stub which checked the system's installed .NET versions and manually wrote the app.config before starting the .NET wrapper (or tried to run the portable .NET Framework installer if none was found at all).
The app originally supported .NET 3.5* (2.0 base) and 4, and the issue was that a ".NET Framework Client Profile" install was present on a surprising number of Windows PCs out there, and that version was incompatible with the app. If you just have a naked .NET exe, when you launch it (without an app.config next to it) the CLR decides which version to run your app in - usually the "highest" version if several are detected... which in this case would start the app in the lightweight Client Profile and error out. Also, in the app.config you can't tell it to avoid certain versions; you basically just say "use 4, then 2" and you're at the mercy of the CLR as to which runtime it starts you in.
Hence the static/native C++ stub, which did some more intelligent verification first before writing a tailored app.config and starting the .NET app.
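For the curious, a rough sketch of what such a native stub can look like - purely illustrative, not the actual installer code: "MyApp.exe" is a placeholder, the registry keys are the ones Microsoft documents for detecting 4.x (the Client Profile only registers the \v4\Client subkey, so checking \v4\Full filters it out), and it needs advapi32.lib for the registry API:

    #include <windows.h>
    #include <stdio.h>

    /* Build roughly with: cl stub.c /link advapi32.lib */
    static BOOL HasNet4Full(void)
    {
        DWORD release = 0, size = sizeof(release);
        /* The full 4.x runtime registers under ...\NDP\v4\Full; the
           Client Profile does not, which is exactly what matters here. */
        return RegGetValueW(HKEY_LOCAL_MACHINE,
                            L"SOFTWARE\\Microsoft\\NET Framework Setup\\NDP\\v4\\Full",
                            L"Release", RRF_RT_REG_DWORD, NULL, &release, &size)
               == ERROR_SUCCESS;
    }

    int main(void)
    {
        /* Write an app.config next to the managed exe so the CLR is told
           which runtime to use instead of guessing. */
        FILE *cfg = fopen("MyApp.exe.config", "w");
        if (!cfg) return 1;
        fprintf(cfg,
            "<configuration><startup>\n"
            "%s"
            "  <supportedRuntime version=\"v2.0.50727\"/>\n"
            "</startup></configuration>\n",
            HasNet4Full()
              ? "  <supportedRuntime version=\"v4.0\" sku=\".NETFramework,Version=v4.0\"/>\n"
              : "");
        fclose(cfg);

        /* Launch the managed app; a real stub would also handle the
           "no .NET at all" case by running the redistributable installer. */
        STARTUPINFOW si = {0};
        PROCESS_INFORMATION pi = {0};
        wchar_t cmd[] = L"MyApp.exe";
        si.cb = sizeof(si);
        if (!CreateProcessW(NULL, cmd, NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi))
            return 1;
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
        return 0;
    }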
I feel for those who have to support an OS no longer supported by the vendor. That's a tough position to be in: not only can a customer come across a bug that is due to the OS, it also keeps you from moving your desktop application forward.
Why is it OK that you have to invest twice the hours per app just because MS has such a short life cycle for its .NET versions?
.NET Framework should only be used for legacy applications.
Unfortunately there are still many around that depend on .NET Framework.
Microsoft sadly doesn't prioritize this so this might still be the case for a couple of years.
One thing I credit MS for is that they make it very easy to use modern C# features in .NET Framework. You can easily write new Framework assemblies with a lot of C# 14 features. You can also add a few interfaces and get most of it working (although not optimized by the CLR, e.g. Span). For an example see this project: https://www.nuget.org/packages/PolySharp/
It's also easy to target multiple frameworks with the same code, so you can write libraries that work in both .NET and .NET Framework programs.
The current solution is to use the CLI tools just like C++.
However have you looked into ComWrappers introduced in .NET 8, with later improvements?
I still see VB 6 and Delphi as the best development experience for COM; in .NET it was never that great - there are entire books about doing COM in .NET.
Because that's pretty much any freaking thing - oh Python, oh PHP, oh driving a forklift, oh driving a car.
Once you invest time in learning and using it, it's a non-issue.
I do get pissed off when I want to use some Python lib but it just doesn't work out of the box, but there is nothing that works out of the box without investing some time.
Just like a car: put a teenager behind the wheel and he will drive into the first tree.
Posting BS on Facebook shouldn't be the benchmark for how easy things should be.
Thus this should be less of a problem.
However, there were version problems: some Linux distributions had only stable packages and therefore lacked the latest updates, and some had problems with multiple versions of the same library. This gave rise to the language-specific package managers. It solved one problem but created a ton of new ones.
Sometimes I wish we could just go back to system package managers, because at times, language-specific package managers do not even solve the version problem, which is their raison d'être.
I want to focus on the project itself; not jump through hoops in the build process. It feels hostile.
For cross-compiling to ARM from a PC, in Rust in particular, you run one CLI command to add the target. Then cargo run, and it compiles, flashes, and gives you debug output.
This is anecdotal - I am probably doing something wrong - but it is my experience so far.
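Roughly, what that Rust flow looks like, assuming a probe-rs based embedded setup (the target triple and chip name are only examples):

    rustup target add thumbv7em-none-eabihf    # one-time: install the ARM target
    cargo run --release                        # build; the configured runner then flashes and streams logs

    # with .cargo/config.toml containing something like:
    #   [target.thumbv7em-none-eabihf]
    #   runner = "probe-rs run --chip STM32F411CEUx"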
That seems more like a property of npm dependency management than of Linux dependency management.
To play devil's advocate: the reason npm dependency management is so much worse than kernel/OS package management is that its scope is much bigger - 100x more packages, each package smaller, super deep dependency chains. OS package managers like apt/yum prioritize stability more and have a different process.
I had fewer issues on EndeavourOS (Arch) than on Fedora overall, though... I will stay on Arch from now on.
uv has more or less solved this (thank god). Night-and-day difference from pip (or any of the other attempts to fix it, honestly).
At this point they should just deprecate Pip.
Continuing to use Pip because Astral might stop maintaining uv in future is stupidly masochistic.
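For anyone who hasn't tried it, the day-to-day commands look roughly like this (just the common surface area, not a full tour):

    uv venv                                # create a virtualenv, much faster than python -m venv
    uv pip install -r requirements.txt     # drop-in replacement for the pip workflow
    uv add requests                        # or go project-based: pyproject.toml plus a lockfile
    uv run pytest                          # run tools inside the managed environment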
That's where I stopped.
Toolchains on linux distributions with adults running packaging are just fine.
Toolchains for $hotlanguage where the project leaders insist on reinventing the packaging game, are not fine.
I once again state these languages need to give up the NIH and pay someone mature and responsible to maintain packaging.
And when it inevitably leads to all kinds of weird issues the packagers of course can't be reached for support, so users end up harassing the upstream maintainer about their "shitty broken application" and demanding they fix it.
Sure, the various language toolchains suck, but so do those of Linux distros. There's a reason all-in-one packaging solutions like Docker, AppImage, Flatpak, and Snap have gotten so popular, you know?
This is only the case for Debian and derivatives, lol. Rolling-release distributions do not have this problem, which is why most of the new distributions coming out are Arch-based.
I am so fed up with this! Please if you're writing an article using LLMs stop writing like this!
“This isn’t just [what the thing literally is]; it’s [hyperbole on what the thing isn’t].”
In the UK, Marks and Spencer have a long-running ad campaign built around it (“it’s not just food, it’s...”)
Em dashes are fine too.
Er, sorry. I meant: the purpose isn't just drama—it's a declaration of values, a commitment to the cause of a higher purpose, the first strike in a civilizational war of independence standing strong against commercialism, corporatism, and conformity. What starts with a single sentence in an LLM-rewritten blog post ends with changing the world.
See? And I didn't even need an LLM to write that. My own brain can produce slop with an em dash just as well. :)
You can then build a script/documentation that isolates your specific requirements and workloads:
https://learn.microsoft.com/en-us/visualstudio/install/use-c...
I had to do this back in 2018 because I worked with a client with no direct internet access on its DEV/build machines (and even when there was connectivity, it was over traditional slow, high-latency satellite links), so part of the process was also to build an offline install package.
(And - on a shared machine it is better to have everything installed "machine-wide" rather than "per-user", same as with PowerShell modules - I had another client recently with a small C: drive provisioned on the primary geo-fenced VM used by their "cloud admin" team, and every single user was gobbling up space with a multitude of user-profile-specific PowerShell modules...)
But - yes, even with a highly trimmed workload it resulted in an 80GB+ offline installer... and as a server admin, I also had physical data-center access to load that installer package directly onto the VM host server via an external drive.
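For reference, the offline-layout flow looks roughly like this (the workload ID and paths are illustrative; the full component list is on the page linked above):

    rem on a machine with internet access, create a trimmed offline layout
    vs_buildtools.exe --layout D:\VsOffline --add Microsoft.VisualStudio.Workload.VCTools --includeRecommended --lang en-US

    rem on the target machine, install from the layout without going online
    D:\VsOffline\vs_buildtools.exe --noWeb --quiet --wait --norestart --add Microsoft.VisualStudio.Workload.VCTools --includeRecommended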
> curl -L -o msvcup.zip https://github.com/marler8997/msvcup/releases/download/v2026...
No thanks. I’m not going to install executables downloaded from an unknown GitHub account named marler8997 without even a simple hash check.
As others have explained the Windows situation is not as bad as this blog post suggests, but even if it was this doesn’t look like a solution. It’s just one other installation script that has sketchy sources.
Just like those complaining about curl|sh on Linux, you are confusing install instructions with source code availability. Just download the script and read it if you want. The curl|sh workflow is no more dangerous than downloading an executable off the internet, which is very common (if stupid) and attracts no vitriol. In no way does it imply that you cannot actually download and read the script - something that actually can't be done with downloaded executables.
[0] https://github.com/marlersoft/zigwin32 [1] https://github.com/microsoft/win32metadata
But if this is LLM content then it does seem like the LLMs are still improving. (I suppose the AI flavour could be from Grammarly's new features or something.)
This was either written by Claude or someone who uses Claude too much.
I wish they could be upfront about it.
It's hated by everyone, why would people imitate it? You're inventing a rationale that either doesn't exist or would be stupider than the alternative. The obvious answer here is they just used an LLM.
> and clearly it serves some benefit to readers.
What?
It could be involuntary. People often adopt the verbal tics of the content they read and the people they talk with.
We need a dictionary like this :D
Know what's more annoying than AI posts? Seeing accusations of AI slop for every. last. god. damned. thing.
So if you see LinkedInglish on LinkedIn, it may or may not be an LLM. Outside of LinkedIn... probably an LLM.
It is curious why LLMs love talking in LinkedInglish so much. I have no idea what the answer to that is but they do.
The actual mechanism, I have no clue.
I came back around 2017*, expecting the same nice experience I had with VB3 to 6.
What a punch in the face it was...
I honestly cannot fathom anyone developing natively for Windows (or even OSX) in this day and age.
Anything will be a webapp or a Rust+egui multi-platform app developed on Linux, or nothing. The amount of self-hate required for Android/iOS is already enough.
* Not sure of the exact date. It was right in the middle of the WPF crap being forced as "the new default".
What if it was?
What if it wasn't?
What if you never find out definitively?
Do you wonder that about all content?
If so, doesn't that get exhausting?
I completely agree with your parent that it's tedious seeing this "fake and gay" problem everywhere, and I wonder what an unwinnable struggle it must be for the people who feel they have to work out whether everything they read was AI-written or not.
I hardly ever go through a post fisking it for AI tells, they leap out at me now whether I want them to or not. As the density of them increases my odds of closing the tab approach one.
It's not a pleasant time to read Show HNs but it just seems to be what's happening now.
Exactly!
I personally like the content and the style of the article. I never managed to accept going through the pain of installing and using Visual Studio and all the absurd procedures they impose on their users.
[1] https://www.pangram.com/history/300b4af2-cd58-4767-aced-c4d2...
Alternatively, there's this:
Install Visual Studio Build Tools into a container to support a consistent build system | Microsoft Learn
https://learn.microsoft.com/en-us/visualstudio/install/build...
`winget install --id Microsoft.VisualStudio.2022.BuildTools`.
If you need the Windows(/App) SDK too for the WinRT-features, you can add `winget install --id Microsoft.WindowsSDK.10.0.18362` and/or `winget install --id Microsoft.WindowsAppRuntime.1.8`
I used to just install the desktop development workload and then work through the build errors until I got it to work; it was somewhat painful. (Yes, .vsconfig makes this easier, but it still didn't catch everything when I was last into Windows dev.)
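If it helps anyone: a .vsconfig is just a small JSON file next to the solution, and the installer offers to add any components listed in it that are missing (the IDs below are only examples):

    {
      "version": "1.0",
      "components": [
        "Microsoft.VisualStudio.Workload.NativeDesktop",
        "Microsoft.VisualStudio.Component.VC.Tools.x86.x64",
        "Microsoft.VisualStudio.Component.Windows11SDK.22621"
      ]
    }

You can also export one from the Visual Studio Installer instead of writing it by hand.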
I don't understand how open source projects can insist on requiring a proprietary compiler.
To answer your question, the headers.
Just use Clang + MSVC STL + WinSDK. Very simple.
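Concretely, once the MSVC libraries and a Windows SDK are on disk, something like this works from a developer prompt (minimal sketch, file name is a placeholder):

    clang-cl /EHsc yourprogram.cpp /link user32.lib

    rem or with the GCC-style driver:
    clang --target=x86_64-pc-windows-msvc -fuse-ld=lld yourprogram.cpp -luser32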