- error handling is neglected in the basic design: <https://github.com/nushell/nushell/issues/10633>, <https://github.com/nushell/nushell/issues/10856>, <https://github.com/nushell/nushell/issues/8615>, <https://github.com/nushell/nushell/issues/6617>
- control-C interrupts its internals with an obviously wrong error: <https://github.com/nushell/nushell/issues/8828>, and is mishandled in other ways: <https://github.com/nushell/nushell/issues/8206>
These bugs have existed for so many years I've 100% given up on Nushell. I recommend you don't subject yourself to software that is this level of unreliable and unbothered about being so unreliable.
(There's a lot of missing/misimplemented features, but the above is so severe they're not even worth mentioning.)
First, at least some of the issues linked have been addressed. Second, they don't strike me as severe. So, all in all, the above take seems like an exaggeration from my POV.
To what degree do these affect the commenter above? Other people? I've not noticed them myself. Maybe there is more to the story?
> I recommend you don't subject yourself to software that is this level of unreliable and unbothered about being so unreliable.
Can the above comment say more about "this level of unreliable"?
Think of all the people using Nushell. Out of say N hours of usage of Nushell, what percentage correspond to a user feeling something like "Nushell is unreliable and/or buggy in a way that notably affects me"? Of course I don't have data, but I would guess it would be very small, probably 0.5% or less.
My feeling would be that people who "leave" or "bounce off" Nushell mostly do so for other reasons, such as (a) it's not POSIX compatible; (b) it feels weird to some due to its design decisions (pipelining, for example); (c) it has a learning curve because it is different; (d) things change quickly and there are breaking changes. But I would be surprised if anything close to ~0.1% of people felt that Nushell is anywhere in the ballpark of being even one of these three: (i) deeply flawed; (ii) bug-riddled; or (iii) led by people unbothered by unreliability.
Nushell is amazing and a total pleasure to use. I cannot yet discern any limit to how much thought has been put into it.
I’m stunned in a good way. I’m writing shell scripts without any pain; actually, I much prefer them to Python! With the language-server integration – I use Zed, but I'm sure it exists for VS Code and others too – I can see errors in scripts as I write them! (Including typos in pathnames! Amazing.)
I have to admit Nushell didn’t become my daily driver right away; it took a few years for me to switch fully. I don't know exactly why, but it probably had to do with its lack of POSIX compatibility. I now see this as a necessary break for Nushell to pursue its vision.
About expectations: some people will be delighted immediately and get hooked, but not all. We're all busy and adopting a new tool can feel like a leap of faith. For me, it has been worth it. Nushell has felt like planting a garden that gives back way more than you put into it.
I wrote up a quick gist [1] that shows how nice the experience is to write a new command (i.e. function) in Nushell.
[1]: "Building Argsort with Nushell" https://news.ycombinator.com/item?id=46528644
Nushell is good on the Unixes as well, but the defaults there are less annoying. I regularly revert to bash because there are just some things I've memorized in bash, and bash doesn't make me want to scream.
Note that this is just my perspective on it as an interactive shell. I've never used it for scripting.
I love PowerShell and I wish MSFT would put a concerted effort into optimizing its performance.
The stock Windows terminal experience is awful. Windows has improved markedly recently with `winget`, so maybe they'll get around to fixing the speed sometime, too.
I've never experienced startup times anything close to a minute. Is your computer very old?
Just as easily as Aero gave way to Metro, PowerShell's syntax will do whatever Microsoft wants, regardless of the impact on your legacy scripting.
The POSIX shell, on the other hand, is a standard controlled by the Austin Group. The classic adaptation is the Debian Dash shell, which is both tiny and fast, and changes to it are very, very slow.
Dash can be linked with libedit and used as an interactive shell. Everyone should do so, before learning non-standard extensions in Korn, Bash, Zsh, et al.
Shells are a matter of taste to a great extent. These are different envelopes of features, stability, and portability.
May you enjoy the trip, my friend. You deserve it.
POSIX tools are _not_ discoverable!
Get-ADObject -Filter * -Properties * | ConvertTo-JSON > ADObjects.json
And you have access to ALL of the .NET library.
' "Here is an ampersand... &" '
To clarify, in PowerShell there is a difference between text between single quotes (e.g. '$test') and double quotes (e.g. "$test"). Single quote strings are literal strings, so whatever text is contained within them is reproduced as written. Double quote strings are expandable strings, which means that certain text inside the string is evaluated before it is returned. If you have double quotes in a literal string, you'll see double quotes within that string, and the same should be true for ampersands.
As for it being a "masterpiece of design": it has its quirks, but compared to common Unix shells (aside from Nushell) it's far better. It doesn't need to have a perfect design in order to be a step above the competition.
I hope to continue to see Nushell grow; I can see it becoming the best shell one day.
Shell languages make sense if you believe in the Unix philosophy. For me, the problem with the Unix philosophy is the endless serialization and deserialization, as well as the lack of proper error handling. So Nushell, for me, is a good answer to an ill-posed question.
The approach I have been taking is a single binary with a billion subcommands and aliasing these. I currently have two of these in Rust, one in Swift. I tried going all Rust but integrating with certain aspects of macOS is just so much simpler in Swift.
The recent push to make CLI development suck less (argument parsing, for example, used to be painful but is basically solved now) has made building these tools a lot more pleasant.
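For what it's worth, the serialization/deserialization complaint above is exactly what Nushell's structured pipelines try to avoid. A rough sketch using only built-in commands (the threshold and columns are arbitrary):

# each stage passes typed rows down the pipeline, so nothing is re-parsed from text
ps | where cpu > 10 | sort-by mem --reverse | select name pid mem | first 5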
Each release breaks something, usually, and it's been like that for a few years (e.g. the generated default config file is no longer parsed after an upgrade, or a function was renamed, etc.).
I guess they are trying to go this way with their standard lib somehow.
On my workstation I can probably handle (minor) breaking changes, but I need stable runtimes on servers, which is why Bash is still my default.
If I want to do real scripting/programming I use Python or another dedicated programming language. I don't really know what the value of Nushell is for me. Maybe the plugin system is amazing, but at the moment I miss nothing in my zsh.
However, I need to know sh/bash well because they're the tools installed by default; in any "well-established" organization, getting approval to install a new shell will range from "12 to 24 months" to "impossible". And without that, I'm not going to put in the effort to learn a new tool that is only useful some of the time and requires massive context switching.
I’d probably rather use xonsh as I do more complex scripts in Python anyways.
I freely admit this is how I felt about – to talk about a different technology – the Zig language for at least a year. Until one day I spent about an hour trying it out. I am not a "convert", but it yielded useful personalized insight.
Speaking personally for a moment, when I reflect on my higher-level philosophy, it tells me: "If you notice yourself only «generating a list of reasons why NewThing doesn't fit your existing needs», you might be in a cognitive rut. Current needs matter, but so does ongoing capability development. And, psychologically, our awareness of tools tends to shape what we perceive as needs.§"
So, it continues, "Instead, do a timebox and try the thing.* Attempt to see it for what it is rather than relative to what you already know! Doing this is really hard, so you'll probably need to seek out techniques to get better at it over time."
* Assuming it doesn't involve drugs, chemicals, or software supply chains of unknown origin.
§ When you only have a hammer...
I remember when I first learned how to work with files in Python and thought "Damn, y'all live like this?"
Is

for i in 1..10 {
print $i
}

really all that more readable than

for i in {1..10}; do
print $i
done

Like, am I taking crazy pills? They're basically exactly the same!
for i in `seq 10`
do
print $i
done
Which is pretty much readable, though. The only issue is Pascal vs. C syntax. As a fan of the former, I admit that the latter is more advanced: it stacks better. E.g. consider

if (test -f junk) rm junk

against

if test -f junk; then rm junk; fi

The former “avoids semicolons that Bourne requires in odd places, and the syntax characters better set off the active parts of the command.” [1]

[1]: Rc—The Plan 9 Shell, https://9p.io/sys/doc/rc.html
sleep 60; do_the_thing_that_needs_a_minute_wait
It's not necessarily required in the for loop either; I tend to prefer the more compact method of putting the "do" on the same line as the "for", but it can also be written as
for i in {1..10}
do
print $i
done

Having "done" be the signifier of "the for loop context ends here" is 3 characters more than "}" or ")" or whatever else. "done" is more color coming off the screen with syntax highlighting, and it can be typed in two keypresses with a "d" and a tab in any editor from the last 30 years. It just seems like a very, very inconsequential nitpick. At least Nushell doesn't pull a Python and have "invisible space" be the end of the for loop.
One-line conditionals are doable in the shell as well:
test -f junk && rm junk
or
[[ -f junk ]] && rm junk
You can even just use [ -f junk ] if the double brackets give you the yuck.
for (f in *.c) if (test -f $f) cc $f
I can write the whole thing in a single line (which is essential in GUI and interactive contexts) and it reads well.
Speaking for myself, I agree, but to a Windows-native power user, the latter looks confusing.
Source: Windows native colleagues.
Something I came to appreciate is that the 50MB binary really is batteries included. You will be able to deal with JSON, XML, SQLite DBs, HTTP requests, and more. I personally do not need anything else.
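For instance, here is a rough sketch of the kind of thing that works with no extra tooling (the file names, URL, and fields are made up):

# JSON files open straight into structured data
open config.json | get server.port

# HTTP responses with a JSON body come back already parsed
http get https://api.example.com/users | select name email | to csv

# XML, CSV, TOML, YAML, SQLite, etc. go through the same open/from/to commands
open data.xml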
Now, no matter how much you get excited about it, do not write too much production code with it, as it still changes and breaks often between releases.
for i in {1..10}; do
echo $i
done
(Though I prefer using printf over echo, as it's more capable and POSIX-compliant.)
I write far too much stuff in Bash, but for me it's just not worth moving to a different shell, given Bash's ubiquity. There's also the question of "will this still run easily in 20 years?". Of course, Bash is a nightmare for bugs and foot-guns unless you make a point of defensive coding (e.g. surrounding variables with double quotes and using ShellCheck to point out common errors).
By the way, the article goes on to mention the large number of options to "ls". Don't try to parse the output of "ls" in scripts, as there are better ways to do things: https://mywiki.wooledge.org/ParsingLs