It's been a while since I wanted something in Go that wasn't available, other than the obvious big-ticket items like NumPy which are ecosystems unto themselves. (Or, to put it another way, yeah, Go doesn't have NumPy, but neither does anybody else other than Python at this point.) YMMV, of course, but it's certainly not a routine occurrence.
The other packages I've missed are all Rust or C packages that manage to create absurdly compact or performant implementations of low-level structures and building blocks like tries, probabilistic filters, encryption primitives, or custom binary storage formats.
All things that Java, Go, C#, and the like just can't be the best at because of runtime overhead or GC.
If I understand correctly, NumPy, at least originally, was a wrapper around some Fortran libraries. So you could get the same functionality and performance in Fortran.
Am I missing something? Is there some way in which NumPy is superior to, say, LINPACK (other than not having to use Fortran to call it)?
Is there anything unique about Python that enables this approach? Couldn't Go, say, do the same thing? Or C++?
"Is there anything unique about Python that enables this approach? Couldn't Go, say, do the same thing? Or C++?"
Network effect. I don't think anything stops most languages from doing most of what NumPy does (although IMHO even post-generics Go is actually a bad choice), but you have to compete with the existing NumPy. Competitors exist, but, well, the fact you don't know about them and haven't heard of them kinda makes my point.
In fact, in my personal opinion, Python is an unfortunate choice for NumPy. NumPy is so big it is basically its own thing, and data scientists could have learned to half-program in almost any modern language. (Rust might have given them fits, and C++ would cause some issues, but most languages would have worked for them.)

Unfortunately, they settled on a language with weak types, which makes the documentation really annoying to use because it's very hard to tell what will work with what. I've been dipping into NumPy and pandas over the past few weeks, and the docs are infuriating: they make it obvious there's something that will do what you want, but it's quite difficult to reverse-engineer what that "something" is if you don't already know, because nothing has types anywhere. I can already tell you just sort of "get used to it", but it would be easier and faster if there were type indications somewhere.

And perhaps even worse: as you scale up the size of what you're doing in Python, anything that isn't accelerated becomes more and more mind-bogglingly slow, because in Python a terribly slow single-CPU language is banged together with some of the highest-performance multi-core (if not GPU-based) code in the world, and the gap between those two things just keeps opening larger and larger. Knowing when you're in slow land and when you're in fast land takes expert-level knowledge and, at times, source-code reading. I'm glad I'm just visiting; I think living there would drive me insane.
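The slow-land/fast-land gap is easy to demonstrate even without NumPy. Here's a small stdlib-only sketch that uses the C-implemented built-in `sum` as a stand-in for a vectorized NumPy call (in real NumPy this would be `np.sum` over an `ndarray`); the variable names are mine, not from any library:

```python
import time

data = list(range(1_000_000))

# "Slow land": the interpreter executes bytecode for every element.
def python_loop_sum(xs):
    total = 0
    for x in xs:
        total += x
    return total

start = time.perf_counter()
slow = python_loop_sum(data)
loop_secs = time.perf_counter() - start

# "Fast land": one call that runs the whole loop in C, outside the interpreter.
start = time.perf_counter()
fast = sum(data)
builtin_secs = time.perf_counter() - start

assert slow == fast == 499_999_500_000
print(f"loop: {loop_secs:.4f}s  builtin: {builtin_secs:.4f}s")
```

On a typical CPython the explicit loop is many times slower, and the ratio only grows as the "fast" side moves from C to multi-core BLAS to GPUs while the interpreted side stays put.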
Python has a GC based around reference counting, which is vastly easier to integrate with native code than languages with heavier runtimes such as Go, Java, C#.
C++ could give you equal or better performance, but it's not anywhere near as friendly to beginners as Python. Plus, the value of REPL-based programming for prototypes and experimentation is hard to overstate.
The choice was a personal preference, justified after the fact. That kind of justification happens way too often in a lot of shops, especially if the shop is run by someone with strong opinions and preferences.