Everything being a string is not that bad in a world of shell pipelines, where all the commands' arguments and results are strings anyway. And if you want, you can also use data structures like lists and dictionaries. Their interface is a bit idiosyncratic, to support the everything-is-a-string metaphor, but under the hood they are still implemented as actual lists and dictionaries.
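For instance, a quick sketch of both (plain tclsh, nothing else assumed):

    set langs [list Tcl Python Bash]
    lappend langs Fish
    puts [lindex $langs 1]     ;# -> Python

    set ages [dict create alice 30 bob 25]
    dict set ages carol 41
    puts [dict get $ages bob]  ;# -> 25
    puts $ages                 ;# still readable as a plain string: alice 30 bob 25 carol 41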
The main thing I dislike about migrating my bash scripts to Python is that the syntax is more verbose. subprocess.call can be a bit of a mouthful, and you "need" "to" "quote" "everything", "like" "this". If the only fancy thing your script is doing is control flow, then Tcl might be a good fit.
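For a rough comparison: where Python wants something like subprocess.run(["uname", "-sr"], capture_output=True), the Tcl version is one shell-like line (a sketch, assuming a Unix system):

    set kernel [exec uname -sr]
    if {[string match "Linux*" $kernel]} {
        puts "running on $kernel"
    }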
The quoting thing is definitely a benefit. Bash's lack of quoting usually leads to bugs (god help you if you have a space in your path).
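For what it's worth, Tcl sidesteps this because substitution happens after word splitting, so a substituted value is never re-split. A small sketch (the path is made up):

    set path "/tmp/my docs/notes.txt"
    file mkdir [file dirname $path]
    exec touch $path
    puts [exec wc -c $path]   ;# wc sees exactly one file argument, spaces and all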
I agree subprocess.call is quite verbose, but I find you have to do it less in Python scripts anyway, because you can do things properly with libraries rather than hackily calling out to other programs (e.g. curl).
I work in a very polyglot environment, and doing "everything properly with libraries" would mean duplicating code from language 1 to language 2, and possibly languages 3 and 4, over and over again for zero benefit. Far from it being "hacky" to call out to other programs: let's write each program only once, and have easy ways to glue the results of different programs together regardless of what language they were written in.
I am just learning about Tcl today and am seriously considering replacing bash with it. Quoting is handled very elegantly in Tcl, because of the way it treats {}: it's not a scope, it's a single string argument. You can manipulate that argument as a string, and you always know whether you have space-separated arguments or a single string, because the grouping is explicit.
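A tiny illustration of that (plain tclsh):

    set msg {several words, still one value}
    puts [string length $msg]          ;# manipulable as an ordinary string
    proc takes_one {x} { puts "got: $x" }
    takes_one $msg                     ;# still passed as a single argument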
Tcl does not automatically expand glob patterns unless you explicitly request it with `glob`. Where bash might dynamically split a space-containing value into two arguments as it gets passed around, Tcl raises a "wrong # args" error in the equivalent scenario, because the arguments no longer fit the command being called.
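A sketch of both behaviors ({*} is Tcl 8.5's expansion operator; the proc is made up):

    # globbing happens only on request; {*} splices the matches into separate words
    puts [exec ls -l {*}[glob -nocomplain *.tcl]]

    # a value with a space stays one argument, so the mismatch is caught
    proc copy {src dst} { file copy $src $dst }
    catch {copy "a b"} err
    puts $err    ;# wrong # args: should be "copy src dst"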
The ease of metaprogramming feels incredibly LISPy, but without all the verbosity that turns me off from LISP, and with easy access to any command in the host shell. That means LISPy control over a polyglot language context.
Consider:
puts "Hello world!"
# Hello world!
proc say {word msg} {puts "$word $msg!"}
say Hello world
# Hello world!
set greeting "Hello"
proc say$greeting {msg} {
say [uplevel {set greeting}] $msg
}
sayHello world
# Hello world!
set pyScript {
print "Python"
}
sayHello [exec python << $pyScript]
# Hello Python!
set nodeScript {
console.log("Node")
}
sayHello [exec node << $nodeScript]
# Hello Node!
I know this doesn't look that special; it basically just looks like doing normal bash stuff. But:
- there's no special handling needed for scripts that run under an outside interpreter;
- it's super easy to build these up interactively in tclsh, and to apply macro-like actions to any of it;
- you can pass functions "from whatever language" around as arguments (and augment them) totally painlessly, without worrying about syntax issues outside the normal language context. You're either "inside" a UTF-8-encoded {} or you're not (in which case you're in the Tcl context). See the sketch after this list.
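Here's a tiny sketch of that last point, assuming a python executable on PATH as in the example above (the snippet contents are made up):

    # a braced block is just a value: pass it around, append to it, run it
    set snippet {print("hello from Python")}
    append snippet "\n" {print(2 ** 10)}   ;# augment the "function" as plain text
    proc runPy {src} { exec python << $src }
    puts [runPy $snippet]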
I agree. Python shines when you can use real libraries such as "requests", and when you can do proper string/regex manipulation instead of being forced to live with sed/cut/awk. However, sometimes I really just want to invoke a bunch of shell commands, including pipelines. In those cases I gravitate towards Tcl or Fish.
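For example, exec understands | and the usual redirections, so a pipeline stays a one-liner (numbers.txt is a made-up file):

    set top [exec sort -n numbers.txt | uniq | tail -3]
    puts "three largest values:\n$top"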