Then they turn around and claim that choosing a programming language is the most important thing you can do, and that you'll need to Like and Subscribe to learn more about it...
I've been through dozens of rewrite projects, successful and unsuccessful, and seen projects and products at almost every scale, and I cannot agree that programming language choice is a primary driver of a product's success or failure. Even extending this thesis from language to framework and ecosystem, where there's perhaps a _tiny_ bit of signal, still doesn't really lead to a meaningful conversation. A project's success is almost always driven by two things: the composition of the employees working on the project, and the competence of the people architecting it. Don't get me wrong - to an extent, some languages (especially more niche ones) drive hiring and what kind of employee you get, but this effect is dwarfed by who works on the project and how well it's managed.
Like, my team doesn't know anything about Java, but we COULD ship in Java if forced to. We can't ship if the feedback loop is a 30-minute CI pipeline because there is no way to have a local dev environment.
"Rock bottom is deeper than you think!"
I think a lot of the problem of switching isn't so much the language, but relearning all the undocumented lessons that were learned the hard way the first time around.
- Systems languages with manual memory management, like C or Zig, where real-time/low-latency performance is important.
- Rust and its borrow checker, as an alternative to manual memory management.
- A strongly FP-influenced language, such as a Lisp or Haskell (especially Lisp macros). Exceptionally good for working with structured data in cases where purity is more important than performance.
- The BEAM/OTP architecture for distributed systems (Erlang/Elixir).
- Languages with good CUDA/PTX/Vulkan support, for programs that need the GPU.
- Assembly. This was a much bigger deal in the past, before compilers got good. Today it's mostly useful for educational purposes.
When Java came along its garbage collection blew my mind. I agree with the 4x factor.
I've rewritten a very large and complex macro assembler program into C. The original developers were gone. Nobody would touch that assembler code. I volunteered. The result was a program that could be maintained and could be ported to multiple diverse platforms needed by the company.
I tried to port Optlink (a linker for Win32) from assembler into C. It failed because the market for Win32 programming died, and so I abandoned the project.
My Empire game started out in BASIC. Then it was converted to FORTRAN, then PDP-11 assembler, then C on the PC, then D.
For example, migrating a web app from a language that predates Unicode to something that won't require a bunch of scaffolding around every user input is sometimes worth it. Moving from LabVIEW to a real programming language that integrated with remotely modern development tooling was worth it. Switching from C++ to Rust? Probably not.
Counterexample:
Weird. LabVIEW is a real programming language. And both LabVIEW and Rust make entire classes of bugs that you often hit in C++ go away, especially for concurrent programs.
> remotely modern development tooling
I would argue that it is the tooling, i.e., Git and diff, that is ancient and not remotely modern. Continually demanding that anything worth source-controlling be text-based, and even line-based, is as antiquated as it gets.
Do you have examples of working version control for non-text programs, including the ability to combine work from two parallel workstreams (aka, merge)? I worked for a long time at probably one of the largest companies using non-text based programming and version control was never even close to solved, and my subsequent forays into the literature when I worked in the "no code workflow" space still came up empty handed for working solutions here. We always fell back to "try to represent the structure of the change in some text-based way."
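That text-based fallback can at least be made concrete. Below is a minimal, hypothetical sketch, not from any real tool (the Node struct, field names, and node-graph format are invented for illustration): serialize the non-text program deterministically, one node per line, so ordinary line-based diff and merge can at least localize changes.

```go
// A minimal sketch of the fallback described above: serializing a
// non-text program (here, a hypothetical node graph) into deterministic,
// line-oriented text so that ordinary git diff/merge has a chance of working.
package main

import (
	"encoding/json"
	"fmt"
	"os"
	"sort"
)

type Node struct {
	ID     string            `json:"id"`
	Kind   string            `json:"kind"`
	Inputs map[string]string `json:"inputs"` // input name -> source node ID
}

func main() {
	graph := []Node{
		{ID: "add1", Kind: "Add", Inputs: map[string]string{"a": "const1", "b": "const2"}},
		{ID: "const1", Kind: "Const", Inputs: map[string]string{}},
		{ID: "const2", Kind: "Const", Inputs: map[string]string{}},
	}

	// Deterministic ordering: sort nodes by ID. encoding/json already sorts
	// map keys, so re-saving an unchanged graph yields byte-identical output.
	sort.Slice(graph, func(i, j int) bool { return graph[i].ID < graph[j].ID })

	// One node per line keeps diffs local to the node that actually changed.
	for _, n := range graph {
		line, err := json.Marshal(n)
		if err != nil {
			fmt.Fprintln(os.Stderr, err)
			os.Exit(1)
		}
		fmt.Println(string(line))
	}
}
```

Of course this only localizes diffs; it doesn't make merges semantically safe, which is the part that stayed unsolved.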
https://discord.com/blog/why-discord-is-switching-from-go-to...
This and similar ideas are common among people who never see the real, whole world of programming, and who maybe have the fortune of being in the "startup" circles.
I see the opposite: the programming language AND the main DB engine are a very good predictor of how bad a product or a team is. But that is because I live in the world of "enterprise" code, where, for example:
* I'm called to do a rewrite
* I see a screenshot of the main app
* I correctly guess it was made with VB (first big alarm). (How do I know? In my circles I have never seen anybody make anything resembling a sane UI in VB, PHP, C, or C++. BTW, just the use of colors was enough to guess.)
* I worry, then confirm, that it uses Access as the main DB
* I discover that part of the data is ALSO in an Excel file, used with the equivalent of "joins", and I am not surprised to see things like this
Even without knowing anything more about the people who built it, those are more than enough signals to guess a lot.
BTW, there are very good predictors here. If it uses MySQL, MongoDB, PHP, JS (add almost whatever you want here in terms of frameworks), VB, Perl, or Android (i.e., Java Android, and Android by itself without an iOS app alongside), it is likely terrible. Then Java and C# take turns in how much worse they are, but they're not as bad as the ones before. I sweat if somebody says it uses C or C++; that is probably enough to refuse the project outright.
Any use of non-obscure tech in this sector is a good predictor that the project is more or less not that bad.
BTW: complex infra and related boilerplate are now probably a stronger predictor, after languages like Python, Go, TypeScript, and more modern Java/Kotlin/C# have spread (and also more Postgres and much less NoSQL, but too much "cloud").
> the CTO couldn't be a first rate hacker, because to become an eminent [Windows] NT developer he would have had to use NT voluntarily, multiple times, and I couldn't imagine a great hacker doing that
Yes, you may not enjoy turning a legacy system that works into a nicely architected system, but the ones that get to that phase are clear successes IMO.
Contrast this with systems which had nice, clean, maintainable architecture from day one but bit the dust two years later.
The original article is a silly shill; engineering managers have weighed the economic cost of choosing a language and other technology since... forever.
And some of that "invisible" discussion happens visibly too (I've done that a number of times: "how do we keep our engineers motivated who want to explore a new hyped tech stack vs the cost of them being slower or leaving the company").
And because time to first release is often the difference between prospering and failing:
It is not enough to be the best. You need to be the first.
There are of course a few scenarios where changing the programming language is a more defensible, less "always wrong" kind of thing. An extreme case would be something like a COBOL system that needs maintenance and you have trouble finding people who can do it.
I've seen it. There are definitely incorrect language choices for certain projects.
It would be fair to say that these cases are themselves often exceptions. Many projects can be equally well accomplished by teams skilled in any language. But there is definitely a set of problems for which you can make incorrect language decisions.
I'm going to exaggerate to make the point, in an attempt to avoid too much argument about whether or not the language would be suitable, but: You do not sit down to write an industry-leading, high-performance database whose top-level implementation language is Python. If your project spec involves running code provided at runtime by users, Go is a fairly poor choice. You can make things a lot harder for yourself by being too insistent about what language you'll do your mobile development in, rather than just accepting that there's a very dominant choice in those spaces.
I've also seen projects that I couldn't prove to you beyond a shadow of a doubt failed due to language selection, but I am fairly certain the project I saw that chose Scala failed primarily because of that choice: Scala was a bad fit, both technically and for the skill sets of the engineers involved.
I've also seen projects nearly fail because they chose databases incorrectly, which I would submit is a fairly similar thing. Mostly because of choosing a NoSQL database "because fast" when they should have used a relational DB. The projects in question didn't fail because they were able to switch in time, but it was a close thing.
Part of "the composition of the employees of a project" being responsible for its success is that good engineers pick at least a decent solution to a problem from day one. The aforementioned DB problem, for instance, should have been obvious from the very beginning that it was not the correct choice in their case. There are absolutely wrong choices, that can crash projects both quickly and slowly.
I guess we can all agree that writing your web application using a Fortran framework to generate JS code is a bad idea.
But if you pick TFA's second example, picking Go vs. Rust for a new project, the language choice is secondary. Both languages were likely fine unless the project had a specific library requirement.
The main criterion for the choice was likely whether the team had developers with some experience in that language, and whether using that language would make them feel dead inside in the morning when they check in; and I'm pretty sure developers can be found who would make either choice a great choice.
The point TFA is making, that picking a language defines culture, the hiring pipeline, etc., fits neither the first example (the team is already there, and a rewrite is almost always a bad choice) nor the second (the team, and with it the culture, is also already there; the pipeline is therefore irrelevant).
You mean they're writing their own database? Why? That's a huge job and available databases are pretty good. There are multiple open-source choices, all of which work.
If they think they're going to compete with Oracle, they need to read the history of Oracle.
Of course, almost no one should attempt this. The number of people with the technical expertise to pull it off successfully is much, much smaller than the number of companies with workloads that would benefit from this.
It doesn't have to be an exotic workload. Sometimes the market is just full of weak implementations e.g. graph databases.
They're just not competing with Oracle.
That being said, some programs can only be written in one of those. Browser code is JS exclusive, low-level needs C++, secure code needs not C++. Machine Learning needs Python and high performance can't use Python. Some Windows things need C#. Those cases are the obvious ones where there is basically no choice. Beyond those, it is mostly about the team.
Don't go running around telling people that they can dig the Panama Canal with three toothpicks and a spare weekend, and if they fail, well by golly they just didn't have enough grit and gumption like us awesome folks who could have done it with only two. Tool choice matters. In fact I can hardly process how anyone can be an engineer and think that it doesn't, let alone how they can think it's some sort of engineering wisdom to claim that it doesn't matter what tools you use to do a project.
Of course, picking the tool is only the moment the project may fail. It is not the moment the project succeeds; there's still a lot of using it correctly that will be necessary, and plenty of further opportunities to fail even with the correct tool. But at least success is within the range of possibilities. You can forestall that possibility entirely on day one with incorrect tool choices.
But is anyone REASONABLY competent going to do that? They might pick C/C++, Go, Rust, Java, etc. Those aren't a choice between "bulldozers and toothpicks"; they are more akin to choosing between Caterpillar, Volvo, or Hitachi as the vendor of choice for construction equipment. They may have some gaps in their specific list of equipment, they may charge too much for one specific tool, your workers may have experience with one but not the other, etc.
Your example should be the textbook definition of a straw-man argument.
Just to be clear, I wasn't trying to claim this; tooling certainly matters, at the very least for the happiness and welfare of an engineering team! But the article tries to claim things like "choosing a programming language is the single most expensive economic decision your company will make", and outside of a few extreme edge cases, I just can't agree with that particular thesis. Even the examples of bad decision-making you pose in your sibling comments, like writing a database in Go or "almost failing" by using sketchy niche datastores, are actually examples of this exact thing: these projects made huge engineering mistakes and still achieved some level of success as a business. Would they have been more successful if they had made better engineering decisions? Possibly, but again, language and framework were just not the most important decisions or factors driving the outcome.
I'm not saying that means we shouldn't care about making good engineering choices; there are easy ways to do things and hard ways to do things, and certainly I'm going to advocate for and work with people and at companies that favor the easy ways to do things. But when it comes to overall outcomes, I'll stand by having seen far more projects sacrificed to analysis paralysis, rewrites, rewrite-related hand wringing, and language/tooling hubris than sabotaged by poor language and framework choices.
it is (almost) always people and (almost) never language/framework/…
One argument I would like to add to this original debate is that I have observed two types of developers. One type tries to stick to one programming language (e.g. Java) and write everything in it. They are specialists, and they may accrue deep knowledge of the language's versions, APIs, and IDE(s). The other type, in contrast, maintains active knowledge of a dozen programming languages (C/C++, Java/Kotlin, Python, bash, Rust, Go, PHP, JavaScript, ...) and is capable of delivering projects in each. They'll pick a language suitable for the task and stick with it for a project. They won't know any single language as intimately as the first type, but they benefit by virtue of their choice of language being more appropriate to the given project.
Programming language alone should almost never be a big enough issue to force a rewrite, but if you already have serious other issues that force huge changes you might as well look at it at the same time.
This is my experience too. I’d go a bit further and say the leads are the primary driver of success. Because ultimately, if the composition of the people on a project is incorrect, it’s the lead’s responsibility to realize and change it.
With that POV, I don't see any contradiction in saying the language is an important decision, and rewriting your project in a new language is probably a bad idea.
To some extent! There are also cases where any decision is better than no decision, and all the options are good enough that it's not worth the delay to argue about them.
A possible reason for this is that our current languages are way too similar to make a difference.
Even most of the ones we think of as radically different.
> Don't get me wrong - to an extent, some languages (especially more niche ones) drive hiring and what kind of employee you get
In my experience, the community around a given language significantly influences the sort of typical applicant you get for a job working in that language. Those profiles vary a surprising amount, especially for, as you say, niche languages, but also for "beginner" languages.
I have seen businesses significantly harmed because they hired what I would term language-specific technicians instead of engineers. That's a failure of leadership, certainly, but that failure is a lot more likely with certain languages.
I have seen this too, and I really like the way you phrased it - I think I'll use that in the future!
I do think it's an easier trap to fall into with some languages, but I still don't think the language really drives it.
A while ago I worked on a large-scale Rust project that could probably have been a Go project. Language Technicians were a big hiring hazard, but after we got one or two, we both learned how to manage them and stopped hiring that type of employee (since they weren't what our project needed). Things evened out, and we were successful in Rust.
Yeah, in the end poor hiring practices drive it. The language you choose just makes the probability of that failure possibility higher or lower.
> Language Technicians were a big hiring hazard, but after we got one or two, we both learned how to manage them and stopped hiring that type of employee
That tracks with my experience, for sure. Once you learn to spot it, you can mitigate it.
That said, I agree that language/platform is a "non-technical requirement" in 90% of real-world cases. You pick what you know, or in more industrious scenarios, what's available on the market or what's most cost-effective.
But people are indeed irrational about programming languages. There's tribalism, stereotypes, and preconceptions. Most notable is probably PHP, the language for human failures and shit projects. As if the exact same project written in Java were supposed to be of higher philosophical value.
But it's something you can't deny. I've been asked, in a non-ironic way, by a (non-technical) founder-investor whether I could recommend Rails programmers, because he'd read about it and it's supposed to be great. I asked him about the specifics of his new project and he said he didn't have an idea yet, but it had to be in Rails. Go figure.
Personally, my bias is towards the languages I'm most comfortable with. I recognize this and will make other suggestions and if I'm not responsible for the code, I'm more than flexible.
All the fad chasing and top down declarations that we're all going to use Cucumber, GraphQL, Microservices or anything else is often a bad move.
First learn the problem you are trying to solve, and empathize on behalf of the user... then empathize on behalf of support... on behalf of the maintenance developers... on behalf of yourself in a decade. Is there a boxed solution? Buy it. Is something custom really needed? What can you outsource as part of it? Integrate it. Do the simplest, easiest thing you can to get the job done.
I fully agree. The challenge is that some will want to use the latest languages and technologies because they want to learn them (personal development, meaning: the next job). Sometimes the "new thing" can be limited to (non-critical) testing and utilities. But having many languages and technologies just increases friction, complicates things, and prevents refactoring. Even mixing scripts with regular languages is a problem; calling one language from another is similar. The same goes for unnecessary remote APIs. Fewer technologies are often better, even if they are not individually the best (e.g. using PostgreSQL for features like full-text search, event processing, etc.; a sketch of this follows below).
This is a bit related to external dependencies vs build yourself (AKA reinvent the wheel). Quite often the external library, long term, causes more issues than building it yourself (assuming you _can_ build a competent implementation).
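To make the PostgreSQL full-text example concrete, here is a minimal sketch of the "fewer technologies" idea: Postgres doing search duty instead of a separate search engine. This is only an illustration, not anyone's production code; the connection string and the docs/body table and column names are hypothetical, and it assumes the common lib/pq driver.

```go
// Full-text search in the database you already have, via PostgreSQL's
// built-in tsvector/tsquery machinery, rather than adding a search engine.
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/lib/pq" // Postgres driver; registers itself with database/sql
)

func main() {
	db, err := sql.Open("postgres", "postgres://localhost/app?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// plainto_tsquery parses free-form user input; to_tsvector indexes the column.
	rows, err := db.Query(
		`SELECT id, body FROM docs
		 WHERE to_tsvector('english', body) @@ plainto_tsquery('english', $1)`,
		"language choice")
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		var id int
		var body string
		if err := rows.Scan(&id, &body); err != nil {
			log.Fatal(err)
		}
		fmt.Println(id, body)
	}
}
```

In a real schema you would back this with a GIN index on the tsvector expression; the point is just that one datastore can cover the feature.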
I'm sorry but I disagree. Languages are tools, pick the best tool for the job. The idea that languages are all good at everything is not true. And when I see takes like this, I tend to think that that person just doesn't understand how to assess a language's strengths and weaknesses.
If you want to write ML, it's probably best to use a language with functions as first-class values (i.e., an FP language). You might say most people doing ML use Python; that came to be because the language was picked based on what people knew. But the big companies doing ML (successfully) don't use Python anymore and haven't for over a decade. ML researchers kept FP alive for several decades when nobody else cared, because FP is the best tool for ML (or for writing a compiler). Where the FP folks get into trouble is pushing FP where it doesn't make sense. I see this pattern repeating over and over again. Languages are pitched as silver bullets when they are just screwdrivers and hammers.
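For what "functions as first-class values" buys you in ML-flavored code, here is a tiny sketch (written in Go, to keep one language across this thread's examples; the grad and loss names are invented for illustration): a higher-order function takes a loss function and returns its approximate derivative, which a gradient-descent loop then consumes.

```go
// Functions as values: grad takes a function and returns a new function
// approximating its derivative by central differences.
package main

import "fmt"

func grad(f func(float64) float64) func(float64) float64 {
	const h = 1e-6
	return func(x float64) float64 {
		return (f(x+h) - f(x-h)) / (2 * h)
	}
}

func main() {
	loss := func(w float64) float64 { return (w - 3) * (w - 3) } // toy quadratic loss
	dLoss := grad(loss)                                          // derivative, itself a function

	// Plain gradient descent on the toy loss.
	w := 0.0
	for i := 0; i < 100; i++ {
		w -= 0.1 * dLoss(w)
	}
	fmt.Printf("w ~= %.4f\n", w) // converges toward the minimum at 3
}
```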
Right tool for the job; ignoring this leads to "religious wars", because that is what we call disputes that are matters of taste.
Also, you are massively overvaluing expertise in a given language. A more talented engineer who doesn't know your favorite language will, after a couple of months, be better than you in that language too, despite your greater experience with it.
I strongly disagree. The top AI companies are using a lot of Python (although sure they also use other languages too, but they're definitely using Python!). Even if by ML you mean old-school ML techniques, a ton of big companies also use Python for this (some might use MATLAB or R).
On the other hand I don't know of a single large company using an FP language for ML, unless you count something like Spark (which I would push back on: the Scala API of Spark is not really FP and almost all users of Spark that I know of program mainly in the more OO part of Scala rather than its FP part).
Even die-hard FP companies such as Jane Street use Python for their ML (see e.g. https://www.janestreet.com/join-jane-street/position/4276720... which notably mentions Python and does not mention OCaml).
Do you know of any company with a team of over 50 ML researchers (either old-school ML or modern AI) using an FP language as their primary workhorse for that team? Because I can't think of a single one.
More to the point, do you know of any ML researcher (and who is acknowledged as primarily an ML researcher by other ML researchers) who primarily programs in an FP language?
Even in the golden era of symbolic AI from the 70s and 80s they still weren't using FP languages (they basically didn't exist yet). The closest you could say is that they were using Lisps, but Lisps aren't FP languages by default. Some are (e.g. Clojure), but many aren't or at least aren't any more FP than any other multiparadigmatic language (e.g. Common Lisp). And again I don't know of any significant ML work being done in Lisps at this point (there's some scattered small teams and individuals doing work there, but nothing that I think could rise to the level of "big ML company").
> ... or there are underlying reasons for a language shift.
As to "best language" that is just as dogmatic as anything else... just look at the C/C++ vs. Rust divide in the Linux community.
I think you are overestimating the value of a best-fit language for any given task, especially where there are a half dozen popular languages, known well by more people, that can do the job well enough. Don't build for a skyscraper when all you need is a birdhouse.
Also, MOST engineers aren't particularly talented. If you're fortunate enough to be working for an organization where everyone is a rockstar, that's great... but for those doing bog-standard CRUD apps for business, you don't get rockstar money, and you aren't finding rockstar talent. You get what you get and make the best of it.
In nearly three decades, I've once, only once, worked on a project where I didn't have to explain a relatively simple concept to someone, where everyone delivered their pieces on time, and all were talented. It was wonderful. Then new management got stacked on top, all the job roles were reclassified as mid-level developer, and everyone rolled out of that group.
A lot of the actual experience is literally explaining public/private key usage to other developers who manage to (re)use the same keys from dev through all the production deployments. Or a pissing match with the "security expert" who doesn't understand that your app's use case is different from the out-of-the-box security script that is failing, because your /login route is a different app from / and the bogus query params don't matter.
Python: Python is almost a hard-compiled language. Most of the dynamic stuff that's really hard to compile isn't all that useful. But Guido and his enablers love the dynamism, and the CPython implementation. So instead of PyPy taking over, we have CPython with hacks to call C.
Go: The "share by communicating" thing in Go works out about as well as it does in other languages, that is, it's useful but not central. Early on, there were tortured examples of implementing locks with queues. Nobody does that any more. People pretty much write Go like they do other languages, with shared state and locks. Queues are used when queues do something useful. The real strength of Go is that the libraries needed for webcrap are maintained and used by Google, so they're all well-tested and exercised. Also, goroutines/green threads eliminate the sync/async distinction. Garbage collection takes care of most ownership problems. Simple. (I recently wrote a web back end in Rust. Big mistake. Should have used Go.)
Rust: The "traits" system is an overreaction to Objects Are Bad. Rust probably would have been better off with single inheritance, which is well understood. (Multiple inheritance has too many dark corners.) People keep trying to do OOP with traits, which is like pounding a screw. Rust still doesn't have a good solution to the back reference problem, as I point out occasionally. The macro language sucks, but then almost all macro languages suck. "Async" is a nightmare but necessary to keep the Javascript crowd happy, since that's all they know. If you really need complex multiprocessor concurrency, Rust is currently the best game in town. Most people don't.
C++: They can't take anything out, and the cruft is too deep. "Modern C++" is not all that bad, but all of bad old C/C++ is still in there. So the safety situation remains awful. The cumulative complexity is now so high that even long-time language lawyers are giving up following it.
Javascript: Who thought that would rule the world? It's awful, but everywhere. Heroic efforts have made an inherently slow language go fast. It's kind of impressive, actually.
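Here is the sketch promised above: a minimal, hypothetical example (the jobs/counter names are invented) of how most Go actually gets written. A channel is used where a queue is genuinely useful, while plain shared state is guarded by a mutex rather than "shared by communicating".

```go
// Idiomatic everyday Go: a channel as an actual work queue, and a mutex
// guarding shared state, rather than channels for everything.
package main

import (
	"fmt"
	"sync"
)

func main() {
	var mu sync.Mutex
	counter := 0 // shared state, protected by mu

	jobs := make(chan int) // a channel used as a real queue of work

	var wg sync.WaitGroup
	for w := 0; w < 4; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for job := range jobs {
				mu.Lock()
				counter += job // guarded shared state, not "share by communicating"
				mu.Unlock()
			}
		}()
	}

	for i := 1; i <= 100; i++ {
		jobs <- i
	}
	close(jobs)
	wg.Wait()

	fmt.Println(counter) // 5050
}
```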
Go: Green threading is a big deal in backends. Arguably the main reason for Kotlin is that Java didn't have it, but now there are vthreads. Rust has chipped away at the Go systems use cases, so it's mainly for backends and CLIs now, but I wish it had better error handling.
Rust: There's a very good presentation from the Rust team about how they arrived at async/await, and also about how every other language does concurrency. Green threading was considered; the main problem is that you need a runtime for it.
C++: Torvalds was right about it all along.
JS: Honestly the best high-level language, made better choices than Python, never made huge breaking changes, somehow had a decent answer to cooperative multitasking before most other langs, deserves its popularity.
> Rust: The "traits" system is an overreaction to Objects Are Bad.
It's interesting. Interfaces, traits, and mixins are all OOP concepts.
> "Async" is a nightmare
In what way? I do agree though, that it's annoying to yet again have a new language not have concurrency baked into the language. In some ways, Rust has excuses, because they want their concurrency primitives to support microcontroller, real-time Linux, and general purpose programming.
I'd have to disagree, it really isn't, and I even think that's kind of the point the article makes.
Rewriting existing services from scratch in another language can often be a bad decision, because it assumes the choice of programming language is an important one. And any rewrite is costly, no matter the reason.
But starting a new project in a new language I don't think has much impact generally. And if it motivates the team, because they're excited about it, it can even help.
Deciding later to rewrite this once it's been built because it's not in the same language as what is common at the company, that's likely the mistake that will happen.
NOT because a good leader will save $$$$$$; because a bad leader can single-handedly sink a ship.
- Donella Meadows, Leverage Points: Places to Intervene in a System https://donellameadows.org/archives/leverage-points-places-t...
If you're a Python Programmer, and you've made that your identity, you've trapped yourself into the mindset that you are a Python Programmer. Same with any other identity you ascribe to yourself (or allow others to ascribe to you). Separating yourself from your tools allows you to evaluate the tools independently of your identity and you will find yourself unaffected (or at least less affected) by reaching a conclusion that the thing you know well may not be appropriate to the job. You may not be appropriate for the job, and that's fine too. Getting past paradigms (or identities with how I'm extending it) gives you much greater freedom to explore and participate in the world.
It's not me who has problems getting past paradigms, or anyone else I know for that matter; it's the recruiters and HR people who screen resumes by ticking boxes on buzzwords.
If you find yourself searching for a replacement language more frequently, you should stop, take a long look in the mirror and ask yourself:
1. Something has clearly gone wrong last time, since you're looking for a replacement so soon; are you confident of your language-picking ability?
2. Are you sure your goal is to do what's best for the software and its long-term maintenance, or is there some other consideration here?
Worse, this also falls into the trap of thinking you have to make a choice, and that an informed choice is better than chance. The reality is that you are often best off trying to do both, and failing that, some sort of coin flip to disambiguate choices works far better than we'd like to admit. In large part for the simple reason that simply doing something is more than a lot of your competitors will be doing.
Generalizing briefly, the same phenomena of identity underlies a lot of our religious wars. Be it language, braces, indentation, or a variety of other programming choices. What's fundamentally going on is that programming expertise is fragile. (I think I first saw that idea in Code Complete?) A new language / style / technology / whatever very often will leave us less competent. Ideally we would respond with, "I guess I need to get back on that learning curve." But often it is easy to instead blame the external factor. "I'm a good programmer. I tried it. The result didn't work well. It must be bad."
Among the many attempts to try to fix the problem, I can recommend https://blog.codinghorror.com/the-ten-commandments-of-egoles.... To the extent that you manage to apply its advice, you really will do better.
Of course your improvement won't directly help those around you...
This may be a big factor in rejecting unfamiliar languages. Over time the brain trains itself to grok a specific syntax, and understanding becomes partly automatic: we look at a Java program and our brain injects meaning into our consciousness. If we then look at APL or Lisp, however, that training on Java doesn't apply and the automatic injection doesn't function. We're left having to read the symbols directly, and it's unsettling not having the auto-assist. It makes us feel we "can't" understand the language, when it would likely take a couple weeks of immersion to change that impression.
I wrote a compiler/language, and I was expecting something different from the article after my experiences.
Like many of the other commenters, I didn't like the article.
By far no. Now I don't know if I even should read beyond that.
If a company chose brainfuck as a main programming language, it's doubtful they'd come back from that choice.
Based on what? Economics defines where you go, not language. If a company using Perl or VB has steady cash flow and their bottleneck is the language, they'll just rewrite when it makes sense. No amount of writing in C# or Python from scratch will save you if your product is garbage.
Same thing with the fundamental architecture of programs, and especially the data model & database solution.
Still I don't entirely agree with the article. He makes it sound like there isn't any difference between programming languages and any preference is purely about developer identity. But that identity doesn't come from nowhere. Rust is popular because a load of C++ developers finally found something technically better. Not because they all woke up one morning and decided to be "a Rust developer".
Where the story falls apart is that Perl is arguably even worse than PHP. One deluded Perl programmer does not prove a principle.
On Rust vs Go, he's absolutely right that Go has a slightly better "build & deploy" story (though not by much). But reading between the lines, I think he's misrepresenting that: it sounds like a) that was just one point for choosing Rust, and b) that was a point in comparison to typical alternatives, e.g. C++ or Java. Go being slightly easier doesn't make it untrue that Rust is easy to build & deploy.