
The big difference is that Julia can handle user-defined structs and higher-order functions: e.g., you can pass a Julia function to your GPU kernel and that function will get compiled for the GPU without you having to declare it GPU-compatible.
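To sketch what that looks like (a minimal example assuming the CUDA.jl package and a CUDA-capable GPU; the function name and sizes are invented for illustration):

    using CUDA  # assumes CUDA.jl and a CUDA-capable GPU

    # An ordinary Julia function -- nothing GPU-specific declared about it.
    f(x) = 2x + 1

    # A generic kernel that applies any function `g` elementwise.
    function apply!(g, out, a)
        i = threadIdx().x + (blockIdx().x - 1) * blockDim().x
        if i <= length(out)
            @inbounds out[i] = g(a[i])
        end
        return nothing
    end

    a   = CUDA.rand(1024)
    out = similar(a)
    @cuda threads=256 blocks=4 apply!(f, out, a)  # `f` gets compiled for the GPU here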



The key difference here is that, while Python and R have a lot of their standard library written in other languages (C), Julia's is mostly written in Julia. The same goes for Julia's packages. This means you can throw a lot of library functions at it and they will compile for the GPU just fine, because the whole stack is Julia all the way down (in many cases; there are of course exceptions).
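As a small illustration of what "Julia all the way down" buys you (again assuming CUDA.jl; the anonymous function is invented for the example):

    using CUDA  # assumes CUDA.jl and a CUDA-capable GPU

    A = CUDA.rand(1024)
    # `sum` here is the generic Base reduction, written in Julia; both it and
    # the user-supplied anonymous function get compiled for the GPU.
    s = sum(x -> x^2, A)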


I keep hearing this, but each time I look at the links on HN, I see that the high-performance libraries being cited are still written in C, C++, or some other low-level language. For example, even in this link, the code is tying into things like cuBLAS, which is definitely not Julia code. For me, high-performance linear algebra routines are important, and I just checked here:

https://docs.julialang.org/en/latest/stdlib/linalg/

It looks like Julia uses a combination of LAPACK and SuiteSparse. These are good choices, but they're not Julia code, and these routines are callable from all sorts of other languages like Python, MATLAB, and Octave. As such, it still appears as though Julia is operating more like a glue language than a "write all of your numerical libraries in Julia" language, which is fine, but I don't feel like that's what it's being sold as.


We use BLAS, LAPACK and SuiteSparse because they are incredibly high-quality libraries. For example, if you translate LAPACK or SuiteSparse into Julia, you will get the same performance. BLAS is a different story (and while it's not impossible to have a Julia one, the effort to build one would be better deployed elsewhere for now).
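From the user's side the wrapping is invisible; roughly (a sketch, with invented matrices):

    using LinearAlgebra, SparseArrays

    b = rand(100)

    Ad = rand(100, 100) + 100I           # dense: `\` dispatches to LAPACK
    xd = Ad \ b

    As = sprand(100, 100, 0.05) + 100I   # sparse: `\` dispatches to SuiteSparse (UMFPACK)
    xs = As \ b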

The benefit comes from user code, which in many dynamic languages is interpreted and is much slower than the built-in C libraries. For example, look at Julia's `sum`: it is written in Julia. Or the fact that we are in the process of replacing openlibm (based on FreeBSD's libm) with a pure Julia implementation. Or any of the fused array kernels (arithmetic, indexing, etc.). Our entire sparse matrix implementation (except for the solvers) is in pure Julia.
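A pared-down sketch of what a reduction like `sum` looks like in pure Julia (this is not Base's actual implementation, which uses a pairwise scheme for accuracy):

    function mysum(A::AbstractArray{T}) where {T}
        s = zero(T)
        @inbounds @simd for i in eachindex(A)
            s += A[i]
        end
        return s
    end

    mysum(rand(10^6))  # compiles to a tight native loop, no C involved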


To be sure, I agree and think it's the right thing to do to hook into external libraries when they provide the functionality we need. That's just an extension of the "right tool for the right job" philosophy.

Alright, so I write numerical codes professionally. Though it's not quite fair, I tend to lump things into glue languages and computation languages. In a glue language, we combine all of our numerical drivers and produce an application. For example, optimization solvers don't really need to be written in a low-level language since their parallelism and computation are primarily governed by the function evaluations, derivatives, and linear system solvers. As long as those are fast, we can use something like Python to code the solver and it runs at about the same speed, and in parallel, as a C or C++ code. On the other hand, we have the computation languages, where we code the low-level and parallel routines like linear algebra solvers. Typically, this is done in C/C++/Fortran, but I'm curious to see how Rust can fit in with these languages. For me, the primary requirements for a computation language are, one, that it's fast and, two, that it's really, really easy to hook into glue languages. Since just about every language has a C API, that's our pathway forward.
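To make the C-API pathway concrete, here's what it looks like from Julia's side (a trivial sketch that just calls libm's cos directly):

    # Julia's `ccall` invokes a C routine with no wrapper layer in between.
    x = ccall(:cos, Cdouble, (Cdouble,), 1.0)  # C's cos(1.0), about 0.5403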

Alright, so now we have Julia. Is it a glue language? Is it a computation language? Maybe it's designed to be both. However, at the end of the day, most of the examples I see of Julia on HN use it as a glue language. To me, we already have lots of glue languages that hook into whatever other stuff we care about, be it plotting tools or database readers or whatever. If Julia is designed to be a computation language, great. However, that means we should be seeing people writing the next generation of things like parallel factorizations and then hooking them into a more popular glue language like Python or MATLAB or whatever. Maybe these examples exist and I haven't seen them. However, until this is clearer, I personally stay away from Julia and I advise my clients to as well.

And, to be clear, Julia may be wonderfully suited for these things. Mostly, I wanted to express my frustration with what I see as an ambiguity in the marketing.


I think the biggest reason that Julia might not satisfy your definition of "computation language" is that Julia, as a garbage-collected language, has a significant runtime. So it's not really suited to writing something as a library and then using it from glue languages as you're proposing for "computation languages", at least currently. I think that would remain true even if it had the speed, flexibility, and developer resources to not need to call out to native libraries for its own purposes.

Which reminds me a bit of Java, where the speed is either there or getting there for tight loops, but it just doesn't play well with others at all when they want to do the driving.


That's fair. And, certainly, there's nothing wrong with a glue language geared toward computation. Then the question for me becomes whether Julia provides good resources for the end application: stuff like good plotting, reading from databases and diverse file formats, easy-to-generate GUIs, etc. Honestly, that's part of why I think Python became popular in the computation world. Personally, I dislike the language, but I support it because there's code floating around to do just about anything for the end application, and that's hugely useful.

There's one other domain where, depending on the situation, Julia may fit well. At the moment, I prototype everything in MATLAB/Octave because the debugger drops us into a REPL where we can perform arbitrary computations on terms easily. Technically, this is possible in something like Python, but it's moderately hateful compared to MATLAB/Octave, where factorization, spectral analysis, and plotting can be done extremely easily. That said, I tend not to keep my codes there since MATLAB/Octave are not good, in my opinion, for developing large, deliverable applications. As such, in my business, where I quickly develop one-off prototype codes on a tight deadline, maybe Julia would be a reasonable choice.
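For what it's worth, that REPL workflow translates roughly like this in Julia (a sketch with an invented matrix; plotting would go through a package such as Plots.jl):

    using LinearAlgebra

    A    = randn(100, 100)
    F    = lu(A)          # factorize
    vals = eigvals(A)     # spectral analysis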

Though, thinking about it, there may be licensing problems. The value in MATLAB is that they provide the appropriate commercial license for codes like FFTW and the good routines out of SuiteSparse, rather than the default GPL. I'm looking now, and it's not clear to me that Julia provides the same kind of cover. This complicates the prototyping angle.


It's a difficult trade-off: if you wait until all the basic functionality is written, debugged, and optimized in Julia, then nobody can use it for "real" work for ages. On the other hand, they've pretty clearly been making design decisions that allow for efficient native implementations (unlike, say, Python).

I haven't been following very closely recently, but there has been some active native-implementation work, such as: https://github.com/JuliaDiffEq/DifferentialEquations.jl


ChrisRackauckas (above) is the author of DifferentialEquations.jl!


Of course, the linear algebra parts were replaced with cuBLAS calls, partly because on the CPU Julia implements them via BLAS (OpenBLAS/MKL) rather than in Julia itself, but also because numerical linear algebra needs to be architecture-specific (though there are minor efforts toward a JuliaBLAS). But most of those other functions, like sum, findfirst, etc. (all of those higher-level functions that you use Julia/Python/MATLAB for), will be available through this mechanism.
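A small sketch of that split in practice (again assuming CUDA.jl; sizes invented):

    using CUDA  # assumes CUDA.jl and a CUDA-capable GPU

    A = CUDA.rand(Float32, 256, 256)
    B = CUDA.rand(Float32, 256, 256)

    C = A * B    # linear algebra: dispatches to cuBLAS (gemm)
    s = sum(C)   # generic higher-level function: native Julia code compiled for the GPU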





