It's always interesting to me how many "IT-first" educated people think that Fortran is a dead language. If you learn physics first, you quickly see how very much alive it still is (the newest standard is "Fortran 2018", btw).
Every weather forecast you see was computed using Fortran, for example :P
Why is that, though? Is it merely because it's been institutionalized, or is there an inherent advantage that Fortran has (over say C or Rust) for things like weather forecasting?
By no means am I suggesting that it wouldn't have an advantage; I'm just a young whippersnapper who has never had the chance to write any Fortran.
It's designed first and foremost to be a language to help compilers turn large matrix math operations into automagically vectorized and parallelized operations. If Rust is the language for close-to-metal memory-safe systems programming instead of C, modern (F90 and later) Fortran is the language for close-to-metal high-performance computational programming instead of C.
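To give a flavour: a whole-array expression carries no loop-ordering baggage at all, which is exactly what gives the compiler a free hand. A made-up sketch:

    program vec_demo
      implicit none
      integer, parameter :: n = 100000
      real :: a(n), b(n), y(n)
      call random_number(a)
      call random_number(b)
      ! Whole-array expression: no explicit loop, so the compiler is
      ! free to vectorize/parallelize it however the hardware likes.
      y = 2.0*a + b
      print *, sum(y)
    end program vec_demo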
Thinking about it, I'd say the comparison to Rust for mastery of its particular domain is actually quite apt.
Rust is actually the first serious contender in this domain, given that it has native support for parallel, concurrent and distributed programming and also deals elegantly with the aliasing issue that hampers optimization in C/C++. It's nice to see that LLVM is getting an official Fortran frontend, it will ease interop (including with Rust) and make it easier to compare code generation across projects and languages.
It's been my understanding that Fortran is loved by physicists due to its procedural approach: it doesn't require learning objects and inheritance, etc.
There's a lot to learn in higher level physics, so having the simplest language, without tons of features to fiddle with, while also being very, very performant is preferable.
I have a feeling that much of this work will end up in Julia, however.
No. Fortran is loved by physicists because it has an excellent 'impedance match' for the problems they are trying to solve. It has a multi-decade track record of expressing physics problems, so it's the computing lingua franca there. And it has an extensive, high-quality and (perhaps most of all) tested ecosystem they can work in. It's not rocket science (unless, of course, it is); right tool for the job and all that.
FWIW...Fortran has had explicit OOP features since at least the 2003 standard (e.g. "type extends"). You aren't "required" to learn those features to use Fortran, but that's true of any number of ostensibly OOP languages.
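For reference, a minimal sketch of what that looks like (toy types, just for illustration):

    module shapes
      implicit none

      type :: shape
      contains
        procedure :: area => shape_area
      end type shape

      ! Single inheritance via "extends", standard since Fortran 2003.
      type, extends(shape) :: circle
        real :: r = 1.0
      contains
        procedure :: area => circle_area  ! overrides the parent binding
      end type circle

    contains

      real function shape_area(this)
        class(shape), intent(in) :: this
        shape_area = 0.0
      end function shape_area

      real function circle_area(this)
        class(circle), intent(in) :: this
        circle_area = 3.14159*this%r**2
      end function circle_area

    end module shapes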
Lastly...endless people (virtually always non-physicists who wrote a couple of lines of Fortran-77 in college) are continuously popping into the conversation with "well of course dump musty old Fortran for the NewHotness language because ew Fortran". Hasn't happened yet, and the Fortran folks have evolved from -77 to -90, -95, -2003, -2008 and lately -2018, so why would it?
I think that at least part of it is APL's dependence on specialized hardware (at least back when I last encountered it in the 80s). At the Claremont Colleges, most computing access was on VAXes running VMS, although there was a single Unix machine at Harvey Mudd, and Pomona College had an IBM minicomputer with 3270 terminals (plus a couple of the IBM graphic terminals). Pomona's system was the only one that had APL, because the 3270s could use the specialized character set while Mudd (the science/engineering school) could not. The big language push at Mudd back then was towards Fortran, perhaps because a lot of aerospace companies sponsored clinic projects (all engineering students and some other majors did a sponsored real-world project in their senior year).
I remember writing some data analysis code in Fortran for my freshman physics lab and the TA was surprised to see my choice of language.
Of C, C++, and Fortran, Fortran has the lowest barrier to entry for doing fast linear algebra from scratch.
Python and Matlab, at their "low barrier to entry" level, are simply too slow for this anyway.
I’m not going to comment on julia because I’m still drinking coffee. In theory… blah blah maybe it should fill this role.
Sit down, write stupid simple Fortran code to solve a large linear algebra problem (a discretized partial differential equation system) and things like matrix multiplication and numpy-style mat(:) access are built in. (NumPy borrowed the slice syntax from Fortran, actually.)
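Something like this, say, for one explicit step of a 1-D heat equation (sizes and the diffusion number are made up):

    program heat1d
      implicit none
      integer, parameter :: n = 1000
      real, parameter :: alpha = 0.25   ! dt*k/dx**2, made up
      real :: u(n), unew(n)
      call random_number(u)
      ! One finite-difference step written with array sections instead
      ! of an index loop; numpy's u[1:-1] slicing is the same idea.
      unew(2:n-1) = u(2:n-1) + alpha*(u(1:n-2) - 2.0*u(2:n-1) + u(3:n))
      unew(1) = u(1); unew(n) = u(n)    ! hold the boundaries fixed
      print *, maxval(abs(unew - u))
    end program heat1d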
For a scientist, this is way easier than wading into C++ and getting it right, or sticking with C and dodging the footguns.
Fortran gives you maximum serial speed with minimum effort… it is easy to write efficient serial code while knowing almost nothing of programming. Memory management, sure, but even there, out of the box there are no pointers to fumble and your allocations are going to be contiguous. Optimal array memory access comes down to knowing that Fortran is column-major.
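In loop terms, that just means keeping the first index innermost (made-up example):

    program col_major
      implicit none
      integer, parameter :: n = 200
      real :: a(n,n), b(n,n), c(n,n)
      integer :: i, j
      call random_number(a); call random_number(b)
      ! Column-major: a(i,j) and a(i+1,j) are adjacent in memory, so
      ! the INNER loop runs over the first index.
      do j = 1, n
        do i = 1, n
          c(i,j) = a(i,j) + b(i,j)
        end do
      end do
      print *, c(n,n)
    end program col_major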
Ah, but then you need to go parallel, where the documentation is more C-oriented these days.
But you're already comfy with Fortran now, so, scientist that you are, you dig and experiment until you get it running with OpenMP, MPI, CUDA, etc.
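And the OpenMP on-ramp really is gentle; a sketch (compile with something like gfortran -fopenmp):

    program omp_axpy
      implicit none
      integer, parameter :: n = 1000000
      real, allocatable :: x(:), y(:)
      integer :: i
      allocate(x(n), y(n))
      call random_number(x); call random_number(y)
      ! One directive parallelizes the loop across threads.
      !$omp parallel do
      do i = 1, n
        y(i) = 2.0*x(i) + y(i)
      end do
      !$omp end parallel do
      print *, sum(y)
    end program omp_axpy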
There's a long explanation, but the summary is that FORTRAN was one of the first languages to target the highest computing demands, so there was functional and financial interest in making it highly performant. Hardware was often designed around FORTRAN, and over the decades a tight relationship formed in the large high-performance/supercomputing world between those who wrote FORTRAN, those who wrote compilers and tooling for it, and those who sold hardware for it to run on; together they made things run stupidly efficiently, and they had decades to do so.
C has a similar legacy but came a little later to the game and wasn't as intuitive to the people driving the money in computing as FORTRAN was. It's also heavily optimizable from a performance perspective, but it often missed that initial buy-in and momentum, though it developed plenty of its own.
So FORTRAN has a lot of investment and strategic advantages for things like weather modeling, at least in specific underlying libraries. Modern work tends not to start greenfield in FORTRAN; it often starts in C, or really in a high-level language like Python, to provide a theoretical proof of concept, and just writes/uses wrappers around highly optimized codebases like the FORTRAN ones for the numeric fundamentals of the model. Lots of scientific glue code these days, with not a lot of new code reaching the optimization levels you see in BLAS, LAPACK, ScaLAPACK, etc.
It is faster than C for numerical computing. One of the reasons is that aliasing is not allowed by the language, so the compiler can make some optimizations that C compilers can't, because e.g. the C compiler cannot assume that two pointer arguments to a function do not point to the same memory location.
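A sketch of why (made-up subroutine):

    subroutine axpy(a, x, y, n)
      implicit none
      integer, intent(in) :: n
      real, intent(in)    :: a, x(n)
      real, intent(inout) :: y(n)
      integer :: i
      ! The standard forbids the caller from passing overlapping x and
      ! y here (a dummy argument that gets written may not alias), so
      ! the compiler can vectorize with no runtime overlap checks. The
      ! C equivalent needs "restrict" to make the same promise.
      do i = 1, n
        y(i) = y(i) + a*x(i)
      end do
    end subroutine axpy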
This was the historical reason for Fortran’s speed advantage over C. But more recent C compilers have a flag to turn off checking for aliasing, so you can write numerical C code with Fortran performance. It’s more verbose and annoying to write, though, because it lacks Fortran’s array syntax: loops for everything.
Flags to turn off aliasing assumptions do exist in C compilers, but they result in slower, less optimized code. They're intended for the common case where types are reinterpreted in a way that isn't expressly allowed by the C/C++ standards, in which case the compiler can't use type information to infer that two pointers don't alias. FORTRAN can assume lack of aliasing in many other cases.
Oh, interesting. Maybe I was thinking of the C `restrict` keyword rather than a compiler flag. It's a promise to the compiler, made in a pointer declaration, that there is no aliasing through that pointer. It's supposed to let C reach Fortran speeds in functions where it's used.
It could allow C to close the gap in the future, but it's not there yet. Fortran is fundamentally and pervasively noalias, and its compilers have decades of experience operating under that assumption. C/C++ doesn't use it nearly as much, and as a result the parts of the compilers that use it aren't very mature.
Rust is also pervasively noalias, but it's taken years for it to actually enable "noalias" on LLVM. Rust code uses that feature more than it's ever been used before, which keeps exposing codegen bugs that no one noticed before. Once those get flushed out, I still expect it will take a while for the resulting optimizations to reach the quality & maturity that Fortran has had.
That's exactly why it matters that LLVM is officially getting a Fortran frontend. This will make it possible to leverage these optimization strategies and bug fixes across languages and projects.
Think programmer hours are expensive? Go look at the price for Physics and Engineering PhD programming hours.
Pretty much everything you would ever want to do related to partial differential equations or computational fluid dynamics has already been written in FORTRAN, and is damn good code.
It helps that the language itself is more than capable of great performance and that NASA has poured massive amounts of cash into making sure it stays performant on newer hardware.
Back in college I had an internship at an aerospace company. I remember having the exact same thought, that "Fortran is dead". I could not have been more wrong; Fortran, at least at engineering shops, is alive and well.
It's the toolboxes and also SIMULINK. SIMULINK + Signal Processing Toolbox + Control Systems Toolbox is incredibly powerful for physics, and I don't think free implementations really touch that.
NASA switched from SIMULINK to ModelingToolkit (Julia) for modeling spacecraft dynamics [0]. They found that the latter was much easier and gave a 15,000x (!!!) performance improvement.
I teach an undergraduate control course that uses Matlab. I'd love to use Julia (I use Julia for all my research) but last time I checked, it was not straightforward to interface with hardware using Julia (our labs use Matlab to control motors). There is a control systems package for Julia but it is nowhere near as complete or polished as the Matlab control systems toolbox (or even the control systems package in Octave).
No clue. Octave is here for us… so at least we can run their code and ask any pertinent questions to expedite the port. ;)
(Yes, I know it's Simulink for many, but I've met some pure computational folk who swear by Matlab as their prototyping tool; to them, Python is a bug-ridden rat's nest of footguns.)
Octave does exist, but it doesn't cover the full set of things Matlab does. If your work is focused on the computational math side it's fine, but if you have any interest in the simulation stuff, Octave hadn't caught up and wasn't even trying to, at least when I last looked a few years ago.
It's good to know that it did; I switched over to general software engineering a few years ago (I worked in aerospace before). My college taught Matlab as a class, and required you to buy the student version.
Well, Fortran was intended for use in scientific computation. In the 1950s and 1960s, the percentage of computing time spent on scientific computation was much larger than it is today. So, naturally, Fortran had a much more prominent position back then.
These days, scientific computing is a relatively minor niche, because people have found so many other things to do with computers that science is just one of many use cases.
Fortran was aimed at a specific audience, scientists and engineers, and that audience seems - AFAICT - happy to keep using it. One may consider it an impressive success story.
Personally, I have never used Fortran for anything beyond the hello-world-level, so I have no strong opinion on the language as such; I do know it has evolved a lot, probably more so than C.
Granted, this was about 20 years ago, when Fortran was already being labelled as dead. Part of my undergraduate Physics degree involved programming in BASIC; the reasoning was that it was both a readily available language and that it would make for a smoother transition to Fortran, provided that we followed their stylistic guidelines. Sure enough, I was dealing with Fortran code a couple of years later. (In that case it was legacy code, but it didn't take much poking around to discover that modern versions were used for HPC.)
Fortran _is_ dead. I work with it professionally. The tooling is next to non-existent, the compilers are trash, and the standards committee is stuck in some weird backwards-compatible-with-FORTRAN-66 circle jerk, to the point where you can't have a sane modern revision (think the complete opposite of the Python 2 -> Python 3 problem).
The only reason Fortran is still around is institutional inertia.
I was talking to a maths prof a year ago or so. It seemed that there was some Fortran stuff, although C++ seems the favourite.
Personally, back in the days, I never really got into the NAG libraries. I found it easy enough to roll my own. Maybe some of the stuff can save you time, but I ended up preferring hand-coding my algorithms.
The greatest thing to happen to Fortran was Fortran 90. None of that column malarkey. Modules were kinda OK, but if I wanted to write well-structured code I probably wouldn't be starting with Fortran anyway. I don't think ANYONE I knew at the time used the new features other than free-form source.
I tried poking around with allocatable arrays at one point, but didn't like it. I guess it was Fortran's way of trying to be like C.
One thing that is rarely discussed is that a programming language isn't just a specification; it is a culture and philosophy shaped by the programmers themselves. One guy made a reference to Cobol and how object-orientation was an unused feature there. He said that what the designers failed to consider is that Cobol programmers just don't "do" object-orientation.
Fortran - even Fortran 77 (IIRC) - has some nice little features, like being able to specify parameters in a separate file and read them in one line. I doubt most of the other guys in the faculty were aware of that feature at the time.
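That feature is NAMELIST - standardized in Fortran 90, and a common compiler extension before that. A made-up example:

    ! params.nml contains:
    !   &run_params  dt = 0.001, tmax = 10.0, nsteps = 10000  /
    program read_params
      implicit none
      real :: dt = 0.01, tmax = 1.0
      integer :: nsteps = 100
      namelist /run_params/ dt, tmax, nsteps
      open(unit=10, file='params.nml', status='old')
      read(10, nml=run_params)  ! every parameter read in one statement
      close(10)
      print *, dt, tmax, nsteps
    end program read_params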
And my standard anecdote ... when I went to a new job, they actually had a little bit of programming in Fortran. We had Visual Fortran. I had to set up the environment to fiddle with something for a client. Setup was straightforward: I opened the project file, and the thing opened and compiled without a single hitch. I was shocked - shocked, I tell you - at how simple the setup was. Normally one would expect endless futzing around to get the programming environment and libraries in place.
But that's not the real anecdote. One day I was passing by a meeting room and overheard an outside consultant in discussion with some of our guys about a replacement for Fortran. He was going on about how flexible the system would be, and hey, if you needed to set up extra parameters you could always add them to an XML configuration file he was proposing.
And I thought to myself ... my God, that's complicated. In Fortran, you can read an array from a file in one statement. Bam! His method would have involved external libraries and some serious effort to get going.
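Concretely, the one-statement version is just (filename made up):

    program read_array
      implicit none
      real :: a(100)
      open(unit=10, file='data.txt', status='old')
      read(10, *) a   ! list-directed read fills all 100 elements
      close(10)
      print *, sum(a)
    end program read_array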
I always joke that people should be forced to write Fortran for at least a year. For those that want to be JavaScript developers, two years. That should make people think much more simply about what it is you're trying to do.
A year ago I added programming microcontrollers to my list of things programmers should be forced to do. A resource-constrained environment should focus their minds a little more.
Another point: this thinking cheapens learning. If you only learn "what isn't dead", you stifle your curiosity and don't learn the whole picture of a thing.
Nobody is suggesting you only learn "what isn't dead". Experimenting with new tech is good. But you should also know _something_ about real, relevant tech and not just the bleeding edge, which will all disappear in a year.