It also buffers the downloaded data completely in memory, last time I checked. So downloading a file bigger than the available RAM just doesn't work, and you have to use WebClient instead.
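For what it's worth, the WebClient route streams straight to disk; a rough sketch ($url and $path are placeholders):

    $wc = New-Object System.Net.WebClient
    $wc.DownloadFile($url, $path)   # writes to the file as data arrives, no full in-memory buffer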
Another fun one is Expand-Archive, which is painfully slow, while using the System.IO.Compression.ZipFile CLR type directly is reasonably fast. PowerShell is really a head-scratcher sometimes.
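For comparison, a minimal sketch of the direct CLR call, assuming Windows PowerShell where the assembly isn't loaded by default (paths are placeholders):

    Add-Type -AssemblyName System.IO.Compression.FileSystem
    [System.IO.Compression.ZipFile]::ExtractToDirectory('C:\temp\archive.zip', 'C:\temp\extracted')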
The download being cached in RAM kind of makes sense; curl will do the same (up to a point) if the output stream is slower than the download itself. For a scripting language, I think that's a reasonable trade-off. Microsoft deciding to alias wget to Invoke-WebRequest does make for a rather annoying side effect, but perhaps it was to be expected, as all of their aliases for GNU tools are poor replacements.
I tried to look into the whole Expand-Archive thing, but as of https://github.com/PowerShell/Microsoft.PowerShell.Archive/c... I can't even find the Expand-Archive cmdlet source code anymore. The source files in the module seem to leave "expand" unimplemented. Unless they moved the expand command to another repo for some reason, it looks like the entire command will disappear at some point?
Still, it does look like Expand-Archive was using the plain old System.IO.Compression library for its file I/O, although there is a bit of pre-processing to validate that paths exist and such, which may take a while.
> curl will do the same (up to a point) if the output stream is slower than the download itself
That "up to a point" is crucial. Storing chunks in memory up to some max size as you wait for them to be written to disk makes complete sense. Buffering the entire download in memory before writing to disk at the end doesn't make sense at all.
curl's approach will lead to partial and failed downloads. When a client stops accepting new data, servers tend to close the connection after a while.
There are smoother ways to deal with this (e.g. throttling the download rate to match the output speed by faking dropped packets), but if you just want a simple download command, I think both simple solutions are fine.
If the download doesn't fit in RAM, it'll end up swapped out and effectively cached to disk anyway.
The standard solution to this is to write the download to a temporary hidden file on the same volume and then rename it into place once the download succeeds (or delete it on failure).
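Roughly, in PowerShell terms (a sketch; $url and $dest are made-up names):

    $tmp = "$dest.partial"                        # temp file on the same volume
    try {
        Invoke-WebRequest -Uri $url -OutFile $tmp
        Move-Item -Path $tmp -Destination $dest -Force         # rename into place on success
    } catch {
        Remove-Item -Path $tmp -ErrorAction SilentlyContinue   # delete on failure
        throw
    }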
That's true when downloading to a file, but Invoke-WebRequest is more curl-like than wget-like. It's designed to return an object/struct rather than simply download a file.
If you want to download many/large files, you're probably better off with Start-BitsTransfer.
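e.g., something like this (URL and destination are placeholders):

    Import-Module BitsTransfer
    Start-BitsTransfer -Source 'https://example.com/big.iso' -Destination 'C:\temp\big.iso'

BITS queues the transfer in the background and can resume it after interruptions.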
Yep. And 'wget' is often an alias for Invoke-WebRequest in PowerShell. The number of footguns I ran into while trying to get a simple Windows Container CI job running, oh man.
I do still find Invoke-WebRequest useful for testing, because it is magically able to reuse TCP connections whereas curl always opens a new connection per request.
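A crude way to see that in a session, where later iterations should come back faster once the connection is reused (URL is a placeholder):

    1..10 | ForEach-Object {
        (Measure-Command {
            Invoke-WebRequest 'https://example.com/health' -UseBasicParsing | Out-Null
        }).TotalMilliseconds
    }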
It's a completely new shell, with new commands for everything and no familiar affordances for common tasks, so they added user-configurable, user-removable aliases from DOS/macOS/Linux so that people could have some on-ramp, something to type that would do something. That's not a dick move at all, that's a helpful move.
Harassing the creator/team for years because a thing you don't use doesn't work the way you want it to work? That is.
They removed it in PowerShell Core 9 years ago! 9 years! And you're still fixated on it!
It is still present in PowerShell on my up-to-date Windows 11 machine today, so it is disingenuous for you to claim the alias was removed 9 years ago. It is 100% still being shipped today.
The alias confuses people that are expecting to run curl when they type "curl" (duh) and also causes headaches for the actual curl developers, especially when curl is legitimately installed!
Why the hostile tone? Pretty rude of you to claim I'm fixated on the issue for years and harassing the powershell development team with zero evidence.
When you open PowerShell it says something like “Install the latest PowerShell for new features and improvements! https://aka.ms/PSWindows”
Isn’t it disingenuous to claim it is “up to date” when you know there’s a new version and aren’t using it?
> “The alias confuses people that are expecting to run curl when they type "curl" (duh)”
Yes, once, until you learn what to do about it. Which is … just like any other software annoyance. One you'd think people would have gotten over decades ago.
> “and also causes headaches for the actual curl developers.”
Linux users can’t comprehend that the cURL developer doesn’t own those four letters.
> “It has very little compatibility with the actual curl command.”
It’s not supposed to. As I said in another comment, the aliases were added to be an on-ramp to PS.
Why aren’t you also infuriated that PowerShell’s “ls” isn’t compatible with GNU “ls”? Because you use the full command name in scripts? Do that with Invoke-WebRequest. Because you expect commands to behave differently in PS? Do the same with curl.
>Linux users can’t comprehend that the cURL developer doesn’t own those four letters.
probably they can comprehend that MS has a history of making things slightly incompatible so as to achieve lock-in and eradicate competing systems.
Also, if any program has been the standard for doing this kind of thing for a long time, it's curl. It's pretty much a dick move that someone can't just send a one-liner and expect it to work on your system, because that is often how you tell someone working on another system "yes, it works, just use this curl script", and then they can see "wow, it must be something with my code that is messed up".
> "it's pretty much a dick move that someone can't just send a one liner and expect it to work on your system"
No, it isn't. This is what I'm objecting to - this frames the situation in terms of Linux being "the one correct way" to do everything computing, and that all companies, people, tools, operating systems, should do everything the way Linux does - and are dicks if they don't. Not just dicks, dicks to you personally.
Including Linux's 'competitors': they are dicks for including things which help their paying customers in a way that isn't the Linux-approved way, and they shouldn't do that because of the demands of Linux users.
This is collectively domineering (everything should be my way!), entitled (I have a say how a tool I don't use, am not developing, don't want, and am not paying for, should work), self-centred (everything which exists should be for my convenience), and anti-progress (nobody can try to change anything in computing for any reason - not even other people improving their system for other people).
That is a framing change which should not go unnoticed, uncommented. It's also common in programming languages where people complain if a language looks a bit like C but doesn't behave exactly like C in every way.
Your arbitrary one liner won't work because Python isn't there. Perl isn't there. `ls` is different. Line endings and character encodings are different. xargs isn't there. OpenSSL, OpenSSH aren't there. `find` isn't there. `awk` isn't there. `sed` isn't there. `/` and `/sys` and `/etc` aren't there. It's a completely different shell! On a different OS!
It's not reasonable to expect that a shell that was designed to not be a *nix shell - because the underlying OS is not *nix - will work exactly like a *nix shell and you will be able to copypaste a one liner over.
It is unreasonable to see some developer trying to create a thing in the world which isn't Unix and take that as them being dicks to you personally. It's also bad to be like "I tried one command in this 'new shell' of yours and without understanding anything it didn't do exactly what I wanted and that's you being mean to me. and I'm still going to be hurt about this in unrelated posts decades later on the internet".
Pretty sure you edited that in afterwards, but here you come into a thread about Copy-Item and instead start talking about Invoke-WebRequest, and when I say "start talking" I mean mic-drop calling the developers dicks with no other content. After you've successfully triggered someone into a flamewar (me), you try to take the high road saying I'm the one being rude? Calling that out as well.
> "my impersonal complaint"
There's a person behind the move whom you are calling a dick. That's not impersonal. And it is rude. I suspect it's Jeffrey Snover, but possibly Bruce Payette or James Truher.
This is atrocious. I get it, some things are less trivial than they seem - but I would be ashamed for shipping something like this, and even more for not fixing it.
Not exactly the point of this article, but it would be cool if APIs like this could return the expected signed string for debugging. It would have to be properly limited for security, but if the API is expecting non-standard signatures, it could help developers build better debugging tools.
Given that you can't infer the error from simply looking at the signature string, I don't see how having the expected string rather than a simple "OK" or "mismatched signature" (as you get now) would make a difference?
You can save the expected string to a file, save your string to a file, and run diff on a hexdump of both. Even without hexdump, you should see the difference between "\n" and "\\n" in properly escaped output.
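In PowerShell terms the comparison could look something like this (file names are made up):

    Format-Hex -Path .\expected.txt > expected.hex
    Format-Hex -Path .\actual.txt > actual.hex
    Compare-Object (Get-Content expected.hex) (Get-Content actual.hex)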
But the returned signed string will be an HMAC-SHA256 hash, won't it? Then there's not going to be any '\n' or '\\n's in there. Only thing you'll be able to tell is if it matches your hash or not, in which case 'OK' or 'not OK' will work just as well.
But neither does the actual server. HMAC only verifies that the message is from whoever it claims to be from and that it is intact. It won't know what you intended the body of the request to look like.
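To illustrate: the client signs whatever canonical string it built, and only the digest travels, so the server can compare digests but not reconstruct the intended input. A hypothetical sketch:

    # hypothetical key and canonical string, for illustration only
    $key  = [Text.Encoding]::UTF8.GetBytes('secret-key')
    $msg  = [Text.Encoding]::UTF8.GetBytes("GET`n/path`n2024-01-01")
    $hmac = [System.Security.Cryptography.HMACSHA256]::new($key)
    $sig  = [Convert]::ToBase64String($hmac.ComputeHash($msg))   # only this digest is sent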
Sure, but only that team gets to put “designed and implemented new build system” on their resume. See how many Meet/Hangouts/Allo variants came out of Google. In companies of that size, the “here” in NIH is a lot more localized to smaller units.
> The largest net seller in January was Canada. The U.K. was the largest buyer in January, after having been the largest net seller in December. Norway and Japan were the second and third largest net buyers in January, respectively, Goldman said in a note.
This was an issue without LLMs too, and it sucks. GH has a tag for "good first issue" which always gets snatched by someone who only cares about the contribution line. Sometimes they just let it sit for weeks because they forgot that they now have to actually do the work.
The problem with Google Takeout is that if you have any photos, let's say 300 GB like myself, you end up having to manually download 50 or so files. This could be easily avoided if photos were synced to a PC like before, using Google Drive.
You can ask for larger zip files. The maximum is 50 GB. It uses zip64, which most modern systems should support. So still 6 files in your case, but at least not 50.
They claimed the actual extortion took place over the phone. The closest he gets to acknowledging it in those texts is mentioning %. I would be curious if that is enough evidence. Pretty stupid behavior, either way.
WCUS took place in Oregon, where both sides of that phone call would have been located. As far as I know, while Oregon requires two-party consent in most cases, only one party needs to consent to record a phone call.
Given the apparent escalation, I would be surprised if the WP Engine people hadn't consulted with legal and started recording calls during the event if it was legal to do so.
Yes. Which is epoxied shut and requires you to virtually destroy the shell to get to the cells inside, as I recently discovered trying to refurbish my Dewalt batteries.
PowerShell has some "interesting" design choices...