I was just thinking "could this be it? Another Microsoft product that I start voluntarily using? (besides VSCode)" and then I go to the comments and the first comment I see reveals the Microsoft stink.
Oh yeah, I know, but I expect that from a big electron app and it asked me if I wanted to opt out on first launch. This is the same way Firefox and many other respectable programs do it and I see no problem with that.
But putting it in a shell with no prompt and only an unset environment variable to disable it, that's more like the Microsoft we all know.
From the Microsoft website: “The telemetry reporting can be disabled by setting the environment variable POWERSHELL_TELEMETRY_OPTOUT to true, yes, or 1. This should not be done in your profile, as PowerShell reads this value from your system before executing your profile.”
I spent some time with Powershell on Windows recently, more time than I am used to. Powershell is a clear winner over CMD.EXE by a mile, but…
- As an interactive shell, it suffers from an ecosystem which is much poorer than the bash/zsh ecosystem,
- As a scripting language, I’m going to use Python or Go the moment anything becomes larger than a few lines,
- Installing Posh Git is just a really mediocre experience (indicative of various, more specific problems)
Some of these things can be fixed up, like the bad experience installing Posh Git: you have to update the package manager to a non-release version, but the signature is occasionally broken (it seems to happen again and again), so you have to disable signature verification, and then you can use the updated package manager to download the Posh Git package.
Through the journey I encountered so many stupid random problems that it’s not something I can just go out and recommend to anyone. I’ll continue using it as my primary shell on Windows, but for various reasons (I forget the exact details) I had to give up on it for working with Git. There was a lot of “action at a distance”, like running ssh-agent breaking the “ls” alias for Get-ChildItem, which leaves me high and dry without my muscle memory.
Meanwhile, Microsoft is also spending time and money on WSL. Again, I’ll still be using PowerShell on Windows but even there it is a damn rough experience. The command-line has been paradoxically a second-class citizen as well as a necessary tool for development on Windows for as long as I can remember, and it will take a lot more improvement to PowerShell before that feeling disappears.
Also note that you can get in-console help for most commands without having to go out to a web browser. On the ones I've tried, examples are also included, which helps a lot too.
Help will also list available commands; eg:
help json
Shows me the three commands available from my shell: ConvertFrom-Json, ConvertTo-Json and Test-Json.
There is also Get-Command, which you can filter for command discovery, e.g.:
Get-Command | where Name -Like "*Json*"
The advantage of PowerShell being object-based here is that if you press <TAB> after typing where, the completions you're offered include the properties of the objects that Get-Command returns.
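For instance (a rough sketch, not from the parent comment), because the output is objects you can keep filtering and projecting on their properties:

```powershell
# Get-Command emits CommandInfo objects, so any of their properties
# can be used for filtering or display
Get-Command -Noun *Json* | Select-Object Name, CommandType, Source
```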
I also use Cmder for a much nicer console than cmd.exe. I haven't tried the latest version of the new Windows Terminal but I hear it's also nice.
> My biggest issue with powershell is how undiscoverable it is
This surprises me as I feel exactly the opposite.
I may not know how to manage X via PowerShell, but I know those pre-approved verbs like Get/Set/New/Test/Convert etc. Then I know commands from the same module usually start with some abbreviation like Get-Net.... Then I can tab through commands.
Or just Get-Module (to see loaded/available modules) and then Get-Command -Module x, or just help tcp. And if "help Get-NetTCPSetting -Full" doesn't help, yeah, fall back to Google.
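Roughly, that loop looks something like this (a sketch; the module name is just an example):

```powershell
Get-Module -ListAvailable Net*     # which networking modules are installed?
Get-Command -Module NetTCPIP       # what commands does one of them expose?
help Get-NetTCPSetting -Full       # full in-console help for a single command
```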
Example.
Now how would I set an IP address for an interface, without having any idea where to start?
1. help *ip* (ohh, too many results)
2. help *set-ip* - alright, Set-NetIPAddress
3. help Set-NetIPAddress -Examples (seems I need to specify some interface. It is probably at Get-NetIPInter<TAB> - ahh, gives me Get-NetIPInterface, here you are.)
4. Get-NetIPInterface
5. Set-NetIPInterface -<TAB><TAB><TAB> (autocompletes arguments and I choose which I would like to set or not)
The worst (and easiest to fix) discoverability problem with Microsoft prompts is that they show you a single completion on <TAB> and cycle through the rest on every subsequent <TAB>, which is a lot more confusing than being shown all the options at once, in my opinion.
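For what it's worth, PSReadLine (bundled with recent PowerShell) can be switched to a bash-style menu completion; something like this in your profile should do it:

```powershell
# Show all completions in a menu instead of cycling through them one per <TAB>
Set-PSReadLineKeyHandler -Key Tab -Function MenuComplete
```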
I would argue that the strict POSIX shell (with all its limitations) can be learnt in an afternoon or two, and a lot of that knowledge is also useful elsewhere in the OS, like utilities and their inputs and outputs. It's mostly reusing and duct-taping existing programs together, so it's not a vacuum like a DSL. It does need excellent man pages to be usable, though; the BSDs have those...
I'd say it's not any less discoverable than the typical Unix shell, plus readability is guaranteed when coming across scripts written by someone else. Crypticism is culturally out of place in Powershell, and I think this is its best feature.
PS: the ISE (Integrated Scripting Environment) helps when writing scripts.
Hey, I use PowerShell on Windows a lot. Some advice that I hope helps for installing posh-git specifically: “git clone” the posh-git repo, then in your $PROFILE.CurrentUserAllHosts do an Import-Module of the module file.
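Something along these lines (a sketch; the clone location is just an example and the module path may differ between posh-git versions):

```powershell
# One-time setup: clone posh-git somewhere under your home directory
git clone https://github.com/dahlbyk/posh-git.git $HOME/src/posh-git

# Then add this to the file at $PROFILE.CurrentUserAllHosts:
Import-Module $HOME/src/posh-git/src/posh-git.psd1
```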
I know all this is personal and subjective. I think using git to manage this is pretty easy and safe. If and when I want to update it’s as simple as doing a git pull in my local repo of posh-git and reloading the shell.
If the new version is broken (has never happened yet but ya know.. in case) just checkout the previous one.
I have found using git, the powershell user profile and modifying my user path when necessary is the best way of managing dev dependencies on Windows.
I also understand why people want this all abstracted away from them in a package manager.
I've done quite a bit of powershell scripting, and it has some warts but overall it's actually pretty good.
* You have access to the entire .Net standard library
* Optional typing
* Writing functions and using pipes allows a functional style
* You can even inline some C# code, or read it from a separate file and compile it on the fly. I've used that to make Windows API calls (P/Invoke); see the rough sketch below.
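A hedged illustration of that last point (not the poster's actual code), compiling a P/Invoke signature on the fly:

```powershell
# Compile a C# P/Invoke declaration in-process and call a Windows API from it
$signature = @'
[DllImport("user32.dll", CharSet = CharSet.Unicode)]
public static extern int MessageBox(IntPtr hWnd, string text, string caption, uint type);
'@
$native = Add-Type -MemberDefinition $signature -Name 'NativeMethods' -Namespace 'Demo' -PassThru
$native::MessageBox([IntPtr]::Zero, 'Hello from P/Invoke', 'Demo', 0) | Out-Null
```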
Overall, I'd choose pwsh scripting over bash any day, and I'd say it doesn't even make me miss python all that much (for simple scripts, couple hundred lines max).
The main feature lacking for larger scripts/tools is classes, I think. Unless it has changed in newer versions, powershell is object-oriented yet does not allow you to define custom classes, which is annoying to say the least. The workaround is loading c# on the fly.
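A minimal sketch of that workaround, assuming you really do need a class and are on a version without a native class syntax:

```powershell
# Define a C# class on the fly and use it from PowerShell
Add-Type -TypeDefinition @'
public class Point
{
    public int X;
    public int Y;
    public Point(int x, int y) { X = x; Y = y; }
}
'@
$p = New-Object Point -ArgumentList 3, 4
$p.X
```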
Yeah, I use it to automate a decent amount of stuff for bigger client deployments, as it's easier to get through a security audit than an executable.
I always feel like things in PowerShell are harder than they need to be; with other scripting languages I don't have that problem. One thing I do like is being able to leverage .NET locally and also import DLLs. That helps me get around a bunch of gaps.
I’m not sure I could cope with that. I have enough problems with it on windows. Literally every corner you turn something punches you in the balls or turns to dust in your hands. It’s one of the most frustrating things I’ve ever used and I avoid it where I can.
I would not invite it onto a *nix machine voluntarily where there are established patterns and solutions which have matured over the space of 40 odd years now.
In my experience with PowerShell on Windows, you have to approach it more like interactive python with shortcuts rather than a regular shell, and once you’re accustomed it works fine. The main issue is that it doesn’t work like a classic shell, it just has some affordances to look like a classic shell in common use cases.
Well it’s not the concept but the implementation that is at fault. From persistent random WinRM failures, sudden things breaking and the whole thing wrapping rancid bits of WMI and COM which leak through regularly, it’s just pain. On top of that the interface is somewhat flakey and slow. Plus some of the semantics feel like going back to circa 1995 Perl. Oh and don’t get me started on the nightmare that is script signing and security which really does nothing useful other than get in the way.
It really feels immature.
The biggest travesty is when it is integrated into something else such as ansible, packer, chocolatey etc where it will cost you a couple of days occasionally due to some obscure issue hidden behind even more layers.
I use powershell on linux occasionally. Primarily with the azure and powercli modules. For doing vmware and azure stuff I can't think of anything that hasn't worked.
If you want to do anything that would be windows only then you're right, it probably won't work.
That said, I usually have a tmux window ssh'd into a Windows machine for powershell. It would be nice to be able to manage AD and Exchange from a Linux machine with powershell.
IWR and ConvertFrom-Json are what got me hooked on powershell. No more learning complex switches for curl or crazy JQ syntax. It's all built in and makes sense.
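A rough example of what that workflow looks like (endpoint chosen just for illustration):

```powershell
# The JSON comes back already parsed into objects; no jq needed
$repo = Invoke-RestMethod 'https://api.github.com/repos/PowerShell/PowerShell'
$repo.full_name
$repo.stargazers_count
```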
FWIW, I already have a command line tool that can do that installed in regular Unix: https://github.com/ericchiang/pup The docs even use the same example of scraping HN.
How does Invoke-WebRequest decide if it’s looking at HTML or something else?
Well, you use Invoke-RestMethod for JSON/XML stuff.
Or, with Invoke-WebRequest, decide yourself and pipe the content to whatever matches the data: ConvertFrom-{JSON,XML,CSV}.
With Invoke-WebRequest you get back headers, the raw response, etc - and you get to decide what's next. With Invoke-RestMethod - just the content, as an object.
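In other words, something like this (a sketch with an example URL):

```powershell
# Invoke-WebRequest: the whole response; parse the body yourself
$resp = Invoke-WebRequest 'https://api.github.com/zen'
$resp.StatusCode
$resp.Headers['Content-Type']
$resp.Content

# Invoke-RestMethod: just the (already parsed) content
Invoke-RestMethod 'https://api.github.com/zen'
```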
There is a param to declare the response type if you want different parsing. Older versions of powershell used IE for parsing, and Invoke-WebRequest would randomly barf. It really sucked. You can also use Invoke-RestMethod, which I believe uses Json.NET to deserialize.
Creating secure strings isn't exactly fixed; the type exists but they aren't encrypted in memory on Linux because it doesn't have the DPAPI (Data Protection API) from Windows:
Same here. I love using pwsh 7 on windows, but on Linux it was a whole different experience. Granted I was trying to use it on an Arch instance, which isn't technically supported, but I couldn't even get PSGet or PsReadLine to work.
However, it was pretty great to run the same scripts cross-OS with basic commands like Invoke-WebRequest/Invoke-RestMethod.
>For example, if we wanted to get the first three files from ls I’d do something like `ls | head -n 3`. In Powershell, it’s `$(dir)[0..2]` since the dir command is returning an array which I can index into.
You could do that, but the way to write it that keeps the bash feel would be `dir | select -first 3`
>a quick Google told me to use `echo | openssl s_client -showcerts -servername joejag.com -connect joejag.com:443 2>/dev/null | openssl x509 -inform pem -noout -text` which works fine in Bash or Zsh. Still, in Powershell, it throws an error for some reason.
It won't throw an error (unless you're running it without a tty), but it will ask you to enter the parameter of `echo`. That's because `echo`, aka `Write-Output`, has a required parameter, unlike bash's `echo`. `echo |` in bash is the same as `< /dev/null`, but that also doesn't work in pwsh because it doesn't support `<`. The pwsh way is to pipe `$null`, thus:
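Presumably the completion of that thought looks something like this (reconstructed sketch, not the commenter's exact words):

```powershell
# Pipe $null instead of using bash's `echo |` / `< /dev/null` trick
$null | openssl s_client -showcerts -servername joejag.com -connect joejag.com:443 2>$null |
    openssl x509 -inform pem -noout -text
```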
The echo problem is reflective of a larger, longer-term issue with PowerShell, that being that it sometimes aliases PowerShell commands to the names of equivalent Unix commands. That’s done to ease the transition into PS-land for Unix folks, of course. But unless the PowerShell command works exactly the same as the Unix command it’s aliased to — takes all the same arguments, returns all the same data structures and error codes, etc. — you inevitably run into cases where things break because echo or curl or whatever aren’t behaving the way your code expects them to.
True. I used PS only on Windows for many years and got into the habit of writing `cat`. When I started using PS Core on Linux I had to unlearn that and start using `gc` instead, because `cat` there is the native `cat`
A) this is all good stuff, I want to get better with powershell in 2020 and just learned some things here
B) I think the author's point was that a lot of their googling for how to do things in a shell assumes a POSIX shell. So copy-pasting from the web doesn't work as well for PowerShell, which becomes frustrating for newcomers.
It's because it's PowerShell syntax.
What the parent meant was "here's how to write it in PowerShell so that it still has a 'unix' feel to it (ie: composing commands with pipes and option switches)".
Yes, this is what the parent meant: it is valid PowerShell.
It wouldn't work by default in Bash unless you had `dir` and `select` defined there somehow.
While I really would like UNIX shells to get the ability to pass structured data, one problem is that AFAIK most shells out there can't pass structured data natively.
For these structured shells(like Powershell, elvish, Nushell) to succeed, they should...
* have a spec on how to pass structured data from a native executable (without function calls like 'from-json')
* implement some structured data functionality in all GNU coreutils at the very least
* and push the 'structured shell evangelists' to implement these schemes on other programs.
That is unlikely to happen in PowerShell because the structured data is .Net objects. If you are writing a C# cmdlet, it runs in the PowerShell executable process. That is, the output is not structured text it's live state, open file handles, methods, in-memory structures, whatever you want. You aren't going to do that from a native command running in a separate process returning data through stdout without serializing it - and then you're back to convertfrom-json as far as it matters.
> For these structured shells(like Powershell, elvish, Nushell) to succeed, they should...
For PowerShell to succeed, Microsoft should rewrite all the GNU coreutils? Did you see the months of outcry when Microsoft used the four characters "curl" in PowerShell? You reckon anyone is going to want unilaterally rewritten coreutils? From Microsoft?
Microsoft should halt development of a completely new tool and hobble themselves to the constraints of the stuff they're trying to replace, controlled by people they're in competition with? Unlikely fantasy world.
Firstly, I wasn't writing the comment specifically about PowerShell - it was just a rant about the structured shells.
> That is unlikely to happen in PowerShell because the structured data is .Net objects.
Isn't that an implementation detail?
> You aren't going to do that from a native command running in a separate process returning data through stdout without serializing it - and then you're back to convertfrom-json as far as it matters.
Not having to write 'convertfrom-json' IMHO is a big improvement. 'ls[0..3]' is much more convenient than 'ls | from-json[0..3]'. (Imaginary syntax, BTW.)
> For PowerShell to succeed, Microsoft should rewrite all the GNU coreutils?
I wasn't really talking about PowerShell either, but well, yes I do think that for PowerShell to succeed there should be support for native PowerShell in 'ls', 'find', 'grep', etc...
> You reckon anyone is going to want unilaterally rewritten coreutils? From Microsoft?
It doesn't have to be 'rewritten', it just has to gain support. I don't think that a set of patches that adds a new output format would be that problematic. And one doesn't have to use the PowerShell support if one isn't using PowerShell.
> Microsoft should stop on their development of a completely new tool and hobble themselves to the constraints of the stuff they're trying to replace, controlled by people they're in competition with? Unlikely fantasy world.
Er... you know - nobody said that one should replace your bash with Powershell. If you decide that the 30-year-old-unable-to-handle-spaces-in-filenames-by-default script language is fine for you, then you can keep using bash. They're not EEEing...
BTW, check out the Elvish shell, since it's not from MS and it has some good ideas. You might like it if you're just hating PowerShell because it's from Microsoft.
> > That is unlikely to happen in PowerShell because the structured data is .Net objects.
> Isn't that an implementation detail?
Arguably, but I don't think so. They stopped writing a language spec at version 3, but the spec which exists[1] says on page 1 "Windows PowerShell is built on top of the .NET Framework common language runtime (CLR) and the .NET Framework, and accepts and returns .NET Framework objects." so that's in the spec. Either way, there is only one PowerShell implementation to speak of so if you want GNU utilities to emit structured objects to PowerShell without going through the serialization to JSON/CSV step that you want to avoid, you will practically have to add interprocess communication into the running process of a managed language and emit .Net objects, and that will be a difficult push.
If you built an implicit decode-from-JSON step into running native cli commands, as Invoke-WebRequest has for talking to REST APIs, you could get more convenient syntax but they will forever be second-class citizens. They won't follow the Get-/Set- verb naming scheme, they won't be built to take pipeline input or handle command line parameters in the same way, they will be limited to only output which can be serialized to text.
By the time you've updated `ls` to read [System.IO.DirectoryInfo] through stdin, detect if it's running inside PowerShell and return serialized [System.IO.FileInfo] output to stdout, and rewritten PowerShell to auto-convert output like this to objects, you may as well have used Get-ChildItem instead.
> I don't think that a set of patches that adds a new output format will be that problematic.
The problem with powershell on Unix/Linux is that there is an implicit assumption everywhere that there is a shell available. If you look at simple git StackOverflow questions like "how do I get the oldest common ancestor commit for two branches", you'll happily see people answer with combinations of shell commands that assumes that there is a sh (or some times even bash or zsh!). Absolute madness that git can't do basic things without relying on external applications that may or may not exist.
It's not madness, it's the price of flexibility, and one that I am personally glad to pay. I am forced to use Windows and pre-approved tooling daily at work; you can't imagine what a relief it is to use Linux at home, where you can chain together the exact programs that you want and make them work the way you want.
I don’t mind that programs provide output that can be chained and so on, but what annoys me is when portable (cross-platform) apps, like e.g. git, don’t provide core functionality without the assumption of a specific environment. It's fine to let the user sort the outputs using an external program. But if the sorting is so commonly required that it's a "core feature", then perhaps the program itself should do it.
Since most developers use Windows, most Windows users use cmd as their shell, and git is the most common VCS around, it wouldn’t surprise me if the most common environment for git is ... Windows cmd. Where there is no sed/grep/awk etc!
I'd suppose most Windows users using git either use it through a UI (e.g. IntelliJ's), or else in the bash shell that comes with the git package for Windows. Starting that bash shell is easier than mucking with the path, and the old cmd.exe is awful in so many ways anyway.
I am dead serious. Powershell should replace (bash) shell scripting under any Linux.
I find shell scripts terrible, I've learned to switch to Python quickly, despite the overhead.
Powershell is very command-line friendly, with easy tab-completion of commands. I've worked with it extensively on Windows, creating large workflows with it, and it's very nice.
Powershell even has a unit test tool called 'pester' which is funny to me because it means 'bully' in my native language.
I don't know what your native language is, but in English "pester" means to annoy. A negative connotation no matter what.
I don't get the hate on Bash though. It's a natural extension of the command line, and works quite nicely to get a lot done without the overhead. Deployment is a lot easier in my environment (web hosting) because all of the servers are homogeneous, and I don't need to worry about any dependencies because I rely mostly on coreutils + grep and awk.
Why can't the two coexist? On Linux you can currently choose rather easily between bash, zsh, fish, ash, dash, sh, ksh, csh, tcsh and more. Every script can use the one it wants easily with `#!/usr/bin/env <shell>`
If you're arguing that powershell should be the default that's a different argument. sh syntax is still in widespread use because of how portable it is and how widely known it is. It truly is the lingua franca of unixey OSes.
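Indeed, with PowerShell Core installed a script can opt in the same way (a minimal sketch, assuming `pwsh` is on the PATH and the file is marked executable):

```powershell
#!/usr/bin/env pwsh
# Toy example: list anything over 1 MB in the current directory
Get-ChildItem | Where-Object Length -gt 1mb | Select-Object Name, Length
```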
Powershell is kind of half baked. In Powershell you mostly interact with it in a functional manner (chaining, filtering, etc), but Powershell for whatever reason provides nearly zero functional programming features, which makes it painful to script in. IMO anyways.
Honestly my biggest complaint is just that it's more verbose than bash which is admittedly pretty cryptic at times. Now that I've learned bash though, I have little need to move to PowerShell.
Bash is tough to learn at first because its syntax is alien but I've learned enough to get things done with it.
>Powershell is very command-line friendly, with easy tab-completion of commands
I really couldn't agree less. Title-Case with hyphens and verbose command names make it horrible just to type. The only reason I see for doing that is so that it would fit in with the rest of the ecosystem.
Maybe it's a nice scripting language: we already have quite a lot of nice scripting languages.
The Title-Case is unnecessary; the commands aren't case sensitive. It's just considered good practice for maintainability. Plus, most common commands are aliased to shorter ones.
> I am dead serious. Powershell should replace (bash) shell scripting under any Linux.
This probably means you don't have a CS background and/or very little experience with linux/scripting/etc, and dare I say powershell itself. Scripting in linux/unix/etc is primarily text based whereas powershell is object based (which is why powershell tends to be slow in comparison). Also, the only thing going for powershell is its tie-in with the .Net Framework along with Microsoft-based servers (such as SQL Server). Like most Microsoft products, it's meant for their environment. There really isn't any real reason for powershell on linux.
> I find shell scripts terrible, I've learned to switch to Python quickly, despite the overhead. Powershell is very command-line friendly, with easy tab-completion of commands. I've worked with it extensively on Windows, creating large worklflows with it, it's very nice.
Sounds about right. I can't believe that's what you wrote in support of powershell. Linux, bash, etc. were built from the ground up to be command-line friendly. Windows was built to be GUI friendly. But there is a reason why Windows administration is viewed as a joke in the tech industry. I guess powershell may seem amazing when going from GUI-based administration to actual script-based automation/administration.
> Powershell even has a unit test tool called 'pester' which is funny to me because it means 'bully' in my native language.
Yeah, it means bother/annoy in english, but you probably knew that already.
Note that this article is slightly out-of-date. It looks like the experiment began in January with PowerShell 6, and there are notes about caveats that were scheduled to be fixed with PowerShell 7. PowerShell 7 has since been released.
"so they do these cool things, which of necessity break a model some of you will have been familiar with for 50 years in some cases, but its amazing"
(in my case, 40 years.)
There is considerable merit in new things. But there are also costs. And, there is the proplist version of this story in OSX. So, I know this is a "thing"
There are often more costs in sticking with the same tech for forty years...I would still be coding for DOS...I hope with new generations of devs will come better tools...as well as a greater willingness to move on...
One of the missing ingredients in PowerShell when I tried to use it was network transparency - which I suspect will be hard to solve since it passes .net objects around.
It is not often that I pipe things into or out of ssh, a Unix domain socket or a pipe, but when I do it’s usually significantly easier and more efficient than other ways.
I also often save outputs rather than pipe, to inspect/debug, or avoid re-computation.
I last played with PowerShell on Win7 ages ago so things might have improved dramatically (e.g. if every output is guaranteed to round trip serialize with a network stream).
The last few weeks I've been going back to bash, mainly to try out WSL/Ubuntu's defaults, but I will probably move to pwsh there too. I like the consistent naming (which means I can guess commands) and avoiding the scraping that's inherent in text-only shells.
nushell looks like one to watch too. It's a lot further behind than pwsh but has a faster growing community.
Hey, thanks for this, nushell looks really interesting! I really like the idea of an object-based shell but don't like the verbosity of PowerShell inputs. I think if you go through my post history I actually describe wanting something like Nushell, not knowing it existed!
Tbf most common PowerShell commands have short aliases for interactive use. I like that I can use the long forms in scripting and aliases on command line.
For example, in my PS (unsure who created the aliases, because I think some aren't default) I have both "ls" and "dir" for "Get-ChildItem", "wget" for "Invoke-WebRequest", and so on.
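If you're not sure which aliases you actually have and where they come from, something like this will tell you (sketch):

```powershell
# Which aliases point at Get-ChildItem, and what do the common ones resolve to?
Get-Alias -Definition Get-ChildItem
Get-Alias ls, dir, wget -ErrorAction SilentlyContinue | Format-Table Name, Definition
```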
> For example, if we wanted to get the first three files from ls I’d do something like `ls | head -n 3`. In Powershell, it’s `$(dir)[0..2]` since the dir command is returning an array which I can index into.
Hmm, yes, but it's not a lazy array/list, so it's not online, is it. If you have a few million files in a directory, that's not going to work. To be fair, neither is `ls | head -n3` -- you have to use `ls -f | ...` to avoid ls(1) sorting the listing first, but if you do, this will be online. For ls(1) it's not really an issue, but in general you want your shell to be online.
>Hmm, yes, but it's not a lazy array/list, so it's not online, is it. If you have a few million files in a directory, that's not going to work.
It is lazy. The author slightly misunderstood / oversimplified what's going on. The commandlet does not build an entire array in memory and then write it out; it writes out each item one at a time. Evaluating it as `$(dir)` just forces all its output to be collected into a single value, ie an array. It's the equivalent of writing `output="$(ls)"; <<< $output head -n3` in bash.
I posted an alternative way to write this in https://news.ycombinator.com/item?id=22963842 which does not need to build the output as an array first and is also more natural to write.
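For reference, the gist of that alternative (as also shown elsewhere in the thread) is to let the pipeline stream and stop early, roughly:

```powershell
# Collects the whole listing into an array, then indexes it:
$(Get-ChildItem)[0..2]

# Streams item by item and stops the upstream cmdlet after three (PowerShell 3.0+):
Get-ChildItem | Select-Object -First 3
```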
On a related tangent, I think the Powershell project is the only place to get certain DotNet libraries that pertain to remote management. It's a bizarre situation.
Another shell that has a similar philosophy is Nu[1]. I think the idea of structured shells (and typed shells?) is a very interesting path to improve UX and prevent certain kinds of common error.