
I find that Wolfram Mathematica is incredibly painful to use except for symbolic math. And even then... the kernel often crashes, the syntax (and the way it has to be entered) is weird, and things like dynamic graphics would often just stop working altogether. Mind you, I was basically just using symbolic functions and Manipulate.

I cannot imagine actually using it with data and more complicated programs. That must be... really tedious.

It's great for solving equations and such, but I seriously wonder why anyone would do "data science" with it.



On the contrary, I don't have a problem with the syntax at all. You can almost treat it like a symbolic functional language. All the usual functional programming constructs are there, like map/fold/lambdas (Mathematica calls them pure functions).
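For anyone coming from a mainstream language, a rough sketch of those analogues in Python (the Wolfram equivalents are Map, Fold, and `#... &` pure functions; the correspondence here is my own illustration, not official documentation):

```python
from functools import reduce

# Map[#^2 &, {1, 2, 3}] -- apply a pure function over a list
squares = list(map(lambda x: x**2, [1, 2, 3]))

# Fold[Plus, 0, {1, 2, 3}] -- accumulate left-to-right from a seed
total = reduce(lambda acc, x: acc + x, [1, 2, 3], 0)

print(squares)  # [1, 4, 9]
print(total)    # 6
```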

I agree the front end struggles if you have a large dynamic object with a lot of data; it has been 32-bit for a long time.


Was watching one of Wolfram's Twitch streams, and the front end is supposedly going to be 64-bit in the newest version (12, I think).


You are not alone. I think it is a powerful environment, but the wall you have to climb to get in is steep.

I found this just now http://www.wolfram.com/language/fast-introduction-for-progra...

And that page in particular is a good example of how not to write documentation for a language/environment.

Table[x^2, {x, 10}]

The page before introduced lists, and there was no mention of lists being able to do magic things like spanning values. I think that line up there makes a table, and somehow that magic list goes from 1 to 10...
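For what it's worth, my reading of that line, sketched as plain Python (the `{x, 10}` part is an iterator specification meaning "x from 1 to 10", which is exactly the hidden magic being complained about):

```python
# Table[x^2, {x, 10}] iterates x over 1..10 and collects x^2,
# roughly this list comprehension:
table = [x**2 for x in range(1, 11)]

print(table)  # [1, 4, 9, 16, 25, 36, 49, 64, 81, 100]
```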

There is just too much hidden there. It is a poor introduction.

I have read this guide before ... and each time I shake my head and wonder why anyone would bother trying to get through the opaque/hidden syntax when there are way better choices of languages.


The learning curve is pretty insane, but once you get past that it's a really fun language to work in, especially if you want to build or test something really fast just to see if it will work.


Somewhat off-topic, but it is strange that special-purpose languages for scientific computing even exist. It seems far saner to have these tools available as a set of libraries for a general-purpose programming language, and indeed the world seems to be heading that way, with Python as the chosen language.

Maybe when MATLAB and Mathematica were first created, the existing dynamic languages were not very good?


It is rare for general purpose languages to make matrices as easy to manipulate as MATLAB does. Maybe Julia will fill this need, once it becomes more popular. Right now, for fast coding involving matrices (not necessarily fast running time, mind you), MATLAB takes the cake.
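As a point of comparison, here is how close a general-purpose language can get with a library: a sketch of two everyday MATLAB matrix idioms (`A*A` and `A\b`) in NumPy. The MATLAB-to-NumPy correspondence shown in the comments is my own gloss:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

product = A @ A            # matrix multiplication, like A*A in MATLAB
x = np.linalg.solve(A, b)  # solve A x = b, like A\b in MATLAB

print(product)  # [[ 5.  5.] [ 5. 10.]]
print(x)        # [0.8 1.4]
```

Still noticeably more ceremony than MATLAB's `A\b`, which is arguably the parent's point.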


Fortran was released in 1957 as the first compiled language. Guess what it was for... scientific computing.


I don't think that Fortran was the first compiled language. Wikipedia says the first language to be compiled was Autocode for the Mark 1.

https://en.wikipedia.org/wiki/Autocode


> Maybe when MATLAB and Mathematica were first created, the existing dynamic languages were not very good?

It had less to do with the languages and more to do with the libraries. Would you use Python over MATLAB if NumPy and scikit-learn did not exist?
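To make the point concrete, a sketch of the kind of thing that makes Python viable at all here: an ordinary least-squares line fit in a few lines using nothing but NumPy (the data is made up for illustration):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0  # noiseless line, so the fit recovers it exactly

# Design matrix [x, 1] so lstsq finds slope and intercept
X = np.column_stack([x, np.ones_like(x)])
slope, intercept = np.linalg.lstsq(X, y, rcond=None)[0]

print(slope, intercept)  # 2.0 1.0 (up to floating point)
```

Without the library, none of this is one-liner territory in Python either, which is the parent's point about libraries versus languages.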


And even today, if you have to do a lot of heavy-duty or specialized statistics, does Python match the R, MATLAB, or SPSS libraries?


Why does it seem more sane to have libraries instead of specialized languages?

Even with Python's and R's mathematical ecosystems, they don't replicate the sheer breadth and depth of specialized tools like Mathematica and MATLAB.



