Many times a day, both in scripts and interactively, I use a small program I refer to as "yy030" that filters URLs from stdin. It's a bit like "urlview" but uses a less complicated regex and is faster. None of the third-party software I use is distributed via "curl|bash", and in practice I do not use curl or bash; however, if I did, I might use yy030 to extract the URLs from install.sh, something like this:
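A minimal sketch of what that could look like; the exact invocation is an assumption, since only yy030's behavior (text in on stdin, URLs out) is described here:

  yy030 < install.sh > 1.htm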
I can then open 1.htm in an HTML reader and select any file for download or processing by any program according to any file associations I choose, somewhat like "urlview".
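For example, with a text-mode browser (lynx here is only an illustration; no particular reader is implied):

  lynx 1.htm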
I do not use "fzf" or anything like that. yy030 and yy073 are small static binaries under 50k that compile in about 1 second.
I also have a tiny script that downloads a URL received on stdin. For example, to download the third URL from install.sh to 1.tgz:
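A sketch under assumptions: the note at the end suggests the script's name combines "ftp" (the tnftp client) and "0" (stdin), so it is called ftp0 here; the sed step selecting the third URL is likewise an illustration:

  yy030 < install.sh | sed -n 3p | ftp0 1.tgz

The script itself might be no more than:

  #!/bin/sh
  # ftp0: read a URL from stdin (fd 0) and fetch it with tnftp,
  # saving to the filename given as the first argument
  exec ftp -o "$1" "$(cat)"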
I do not use "fzf" or anything like that. yy030 and yy073 are small static binaries under 50k that compile in about 1 second.
I also have a tiny script that downloads a URL received on stdin. For example, to download the third URL from install.sh to 1.tgz
"ftp" means the client is tnftp"0" means stdin