Posted by stickynotememo 1 day ago
The real protocol in action here is symbol resolution and the hardware calling ABI.
You could always directly call Rust functions, but you'd have to know where to look for their symbols and how to craft their parameters, for example.
If this is well defined, then it's possible. If it's poorly defined or implementation-specific (C++), then yeah, it's a shit show that is not solvable.
That's exactly my case. For my programming language I wrote a tool for converting C headers using libclang. Even with the help of this library it wasn't that easy; I found a lot of caveats trying to convert headers like <windows.h>.
With C++ it's the same. The C++ within the Haiku code is half understandable; the whole spec is enough to drive you mad in days.
E.g. here, from memory:
> ...you want to read 32 bits from a file but OH NOOES long is 64 bits! The language! The impossibility!
But when you read something or deserialize some format, you just need to know the width from the format schema or domain knowledge. Simple and straightforward as that! You do not do "reflection" on what the language standard provides and then expect someone to send you exactly that!!
So that anti-C "movement" is mostly based on brainless examples.
Not saying C is perfect.
But it is very good, and I bet IBM and other big corps will keep selling things written and actively developed in C/C++, plus hefty consulting fees.
In the meantime the proles have been advised to move to CPU-cycle-eating inferior languages and layers upon layers of cycle-burning infra in zero-privacy, guaranteed-data-leak clouds.
Oh, btw, that famous Java "bean" is just an object built from the language's usual "basic types"... How should that poor programmer from the article know what to read from disk when all he has are the types Java provides?? How? Or maybe he should use some domain knowledge, or a schema for the problem he is trying to solve??
And in a "scripting language" with automatic ints, how do you even know how many bits the runtime/VM actually uses? Maybe some reflection to check the type? But again, how does that even help if there is no knowledge in your brain of how many bits should be read?? Calling some cycle-burning reflection, or virtual and as-indirect-as-possible things, is what fat tigers love the most :)
We didn't do it to annoy you or to foist bad APIs on you. We did it because it was the best language for writing machine code at the time. By miles. Not understanding why this is true will lead you to make all the same mistakes the languages "bested" by C made.
There are some people who still develop in C, for sure, but it's limited to FOSS and embedded at this point, with low-level proprietary systems having mostly migrated to C++ or Rust.
I agree with the main thesis that C isn't a language like the others, something that we practice; it's mostly an ancient but highly influential language, and it's an API/ABI.
What I disagree with is the idea that 'critiquing' C is productive, in the same way that critiquing Roman Law or Latin or Plato is productive. The horse is dead; one might think one is being clever or novel for finding flaws in the dead horse, but it's more often a defense mechanism to justify having a hard time learning the decades of backwards compatibility, edge cases, and warts that have accumulated.
It's easier to think of the previous generation as dumb and as having made mistakes that could have been avoided, and that it all could be simpler, rather than recognize that engineering is super complex and that we could dedicate our full lives to learning this craft and still not make a dent.
I applaud the new generation for taking on this challenge and giving the revolution their best shot, but I'm personally thinking of bridging the next-next generation and the previous generation of devs. The historical complexity of the field will increase linearly with time, and I think if we pace ourselves we can keep that complexity down; the more times we hop onto a revolution that dismisses the previous generation as dumb, the bigger the complexity is going to be.
I fully agree with your last point. The proposed solutions to some of the deficiencies of C are sometimes worse than the disease, while their benefits are often exaggerated, all while adding unnecessary layers of complexity that will haunt us for decades. In contrast, my hope would be to carefully revise the things we have, but this takes time and patience.
Do you mean that there's still code being developed, or that it still exists? Because I meant the former; the latter is true of a lot of dead languages like Fortran and COBOL, and it would cement the idea of C being a dead language.
Being upfront about it by authors dispels a lot of the potential tension and substantially changes the way we interact. I understand there may be a conflict and not everyone will want to advertise their diagnosis, but in my experience once it becomes clear that's what's going on, it helps all the parties involved.