If the tools are coherently designed, it should be easier to debug, because you can just run with `bash -x` (or `set -x`) to log the commands being run, then paste a failing command back into a terminal to see what went wrong. It's debugging with a REPL.
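A minimal sketch of what that looks like (the file and its contents are hypothetical, just stand-ins for real deploy steps):

```shell
#!/bin/sh
# set -x makes the shell print each command (prefixed with "+") to
# stderr before running it, so the exact failing invocation can be
# copied out of the log and replayed by hand.
set -x
echo "app_port=8080" > /tmp/demo.conf   # hypothetical deploy step
grep app_port /tmp/demo.conf            # the trace shows this exact line
set +x                                  # turn tracing back off
```

Running it, stderr shows each command as `+ echo ...`, `+ grep ...`, and any one of those lines can be re-run verbatim.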
The biggest mistake I see people making is to hard-code paths, ports, usernames, configuration, etc. inside Python/Perl scripts. All that stuff belongs in shell. All the logic and control flow goes in the scripts. If it's factored this way, then you have a very natural way of testing the scripts with test parameters (i.e. not production parameters).
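A sketch of that factoring, with hypothetical names throughout (the wrapper owns the site-specific values; the worker script takes everything as parameters and holds only logic):

```shell
#!/bin/sh
# deploy.sh (hypothetical) -- ALL environment-specific configuration
# lives here, in shell, in one obvious place.
HOST=db.example.com      # swap these three lines for a staging or
PORT=5432                # test wrapper and nothing else changes
DATA_DIR=/var/lib/myapp

# The worker script never hard-codes any of this; it receives every
# parameter explicitly, so running it against a test database is just
# a matter of calling it with different arguments:
echo "would run: sync.py --host $HOST --port $PORT --data-dir $DATA_DIR"
```

The `echo` stands in for the real invocation; the point is that `sync.py` (hypothetical) could be tested in isolation with throwaway parameters because it contains no baked-in production values.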
I don't doubt that there are many multi-language shell scripts that are spaghetti. Factoring into processes is definitely a skill that takes thought and effort, and I don't think anyone really teaches it or writes about it. The only books I can think of are The Art of Unix Programming by ESR and The Unix Programming Environment.
But it's definitely made me way more productive once I started thinking like this. People talk about the "Unix philosophy" for a reason. It's a real thing :) It's not an accident that Unix is wildly popular and has unprecedented longevity.
Yup, until someone can show me a bash editor that understands all the processes that are invoked and gives me at least context-aware code completion, parameter documentation, "go to definition", and "find all references", I'd rather keep my code in a language where I can get those features.
I'm glad someone mentioned debugging. Typically Bash scripts evolve into programs, and one of the first things I always notice is how much effort I'm having to put into debugging. Indeed, since I started working with [plumbum](http://plumbum.readthedocs.org/en/latest/) I now typically reach for Python in favour of Bash, even for small jobs.
I work with some deployment systems that chain together small scripts in a bunch of different languages. They are a nightmare to troubleshoot.
I'd much rather the spaghetti was all on one plate than follow it from table to table...