I wonder if SQL will be able to catch up to C#. Maybe they’ll add the English language next year so I can see how popular it is compared to Javascript.
JSON and YAML aren't programming languages. You can build a programming language on top of them, just as Lisp is built on top of S-expressions, but the underlying data representation isn't a programming language on its own.
I am interested in reading more about their methodology. They seek to rank by “popularity”, but that's a very fuzzy term. I expected to see JavaScript at #1 based on how common it is, which owes to its being the only option (outside of the WASM niche) for front-end web development, in addition to being fairly general purpose (Node, Electron, mobile). So I'm surprised to see it at #5, with less than half the score of Python.
For example, search Indeed for software engineer jobs and filter by programming languages. Last I checked JavaScript had far and away the most results. And it becomes clear why - basically any web development position requires it. Python + JS, C# + JS, Ruby + JS, JS (node, ts, etc) + JS, etc.
I dunno, never underestimate the amount of legacy that exists in the vast majority of corporations. I've come across far more VB developers in non-tech companies than Swift developers.
TIL that Prolog is only slightly less popular than ABAP. I mean, I'm a big big Prolog fan, but as a niche language for constraint solving and similar stuff; there's no way that Prolog is used as much as ABAP (SAP's language), so usage really can't be the criterion for popularity here.
Job postings are an inaccurate way to measure, too. Companies like Amazon will often have one posting and hire K people (everyone who passed the bar), and typically all of them will be programming in Java.
> Veteran languages can also turn up in places you might not expect. Ladder Logic, created for industrial-control applications, is often associated with old-fashioned tech.
Ladder logic is not "old fashioned tech." It is by far the most commonly used language in American manufacturing. Nothing beats it for allowing the onsite techs to easily debug the machine at 2am.
There is a hidden economy of programmers who often don't realize they're programmers. They're hidden inside every industrial facility, maintaining machines that use ladder logic to control their functionality in PLCs. And the ones I've met are generally brilliant at it.
I watched as one such person modified the program running a machine. "How do you deploy the change?" I asked. "Deploy? No, this is live. Every change I make is happening immediately."
I very often wish I'd never gotten into coding and either done (ladder logic as opposed to custom embedded hardware and software) industrial controls or journalism...
Java is king. Take away all the AI batch stuff and Python would shrink more. Putting python in production for online applications has many issues (trust me, I know). Java is very robust, performs well, tons of mindshare, pretty good frameworks (Spring Boot, Quarkus), and very good IDE support.
- A decent type system, which makes it better than Python, JavaScript, R
- Runtime errors with good stack traces instead of UB for things like invalid casts or null dereference, which makes it better than C++
- Automatic memory management without complex borrowing rules, which makes it (arguably, for some situations) better than Rust
- The JVM targets many different platforms and Java programs often “just work”, which makes it better than Swift
- Decent abstraction, which makes it better than Go
- Large ecosystem, which makes it better than Dart or Nim or Zig
- C#? I’m not sure; C# is a lot like Java. IMO what Java has over C# is Kotlin and better JetBrains support
Now Java isn’t perfect. In fact I prefer Kotlin (always), Rust (despite the increased difficulty), and Swift (if the platform allows) as general-purpose languages. But unlike those, Java is one of the ancient big languages everyone knows about, and compared to the others (JavaScript, Python, C++) it’s the best general-purpose language by far.
Lol, so people are getting PhDs in hype? I think not. It's also been like 10-15 years at this point. Methinks Spring Boot will disappear before PyTorch.
Oracle does not own the Java programming language itself. Java is an open-source language, and Oracle primarily maintains the Java SE and Java EE platform implementations. But there are other implementations and open-source versions of Java, e.g. OpenJDK, which are not owned by Oracle.
SQL is old (was popular before I started programming).
Python is new (I remember when it became popular).
SQL is also the second oldest language in the top 20, after C. Hell, the only languages older than SQL on that list overall are C, Fortran, Lisp, and Assembly.
Top "programming" languages should not include SQL. No one uses it to do programming, we use it to read/write to a data store. If it's included, Excel/google-sheet should be included ahead of it and would be the #1 programming language. Typescript is a programming language, but should be grouped with JS in lists like this.
Reading and writing a data store is programming, but, no, that’s not the only thing SQL (including its standard, and various proprietary, procedural modules/extensions) is used for.
The top 15 languages are Python, Java, C++, C, JavaScript, C#, SQL, Go, TypeScript, HTML, R, Shell, PHP, Ruby, and SAS.
Python doesn’t just remain No. 1; it has managed to widen its lead in 2023 at the expense of smaller, more specialized languages. It has become the jack-of-all-trades language and has found its niche in AI, where powerful and extensive libraries make it very popular.
In the jobs ranking, SQL is the No. 1 skill, but to get a proper job it must be paired with Java or C++.
Java and the various C-like languages, however, outweigh Python in their combined popularity for high-performance or resource-sensitive tasks where Python is slow.
Very specialized languages like R and Fortran are still very popular, perhaps due to their hard-to-replace role in legacy systems.
Python and SQL work great together (especially with SQLite, which is built into Python).
So it’s not one or the other.
I have taught SQL to ML PhD students who mostly used pandas in Python, and they were surprised by the similarities (and the additional power of combining them).
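A minimal sketch of what that combination looks like, with made-up example data (the table and column names here are purely illustrative): the same question can be answered either with a SQL query through Python's built-in sqlite3 module or with equivalent pandas operations.

    import sqlite3
    import pandas as pd

    # In-memory SQLite database; the sqlite3 module ships with Python.
    conn = sqlite3.connect(":memory:")

    # Hypothetical example data.
    df = pd.DataFrame({"lang": ["Python", "Java", "SQL"], "score": [1.0, 0.59, 0.41]})
    df.to_sql("rankings", conn, index=False)

    # The same question, once in SQL and once in pandas.
    via_sql = pd.read_sql_query(
        "SELECT lang FROM rankings WHERE score > 0.5 ORDER BY score DESC", conn)
    via_pandas = df[df["score"] > 0.5].sort_values("score", ascending=False)[["lang"]]

    print(via_sql)
    print(via_pandas)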
Of late, I have been googling a lot of SAS and may have contributed to its rise in ranking! Not coding in SAS but moving SAS to Python. Speaking of enterprise Java, there's a ton of enterprise SAS and it's moving to Python.
This doesn’t pass my smell test. Ok, python is taught in colleges and widely used for data science. Fine.
Java… yeah, lots of legacy code and large corps. Fine.
But C++? I can see why C might be up there due to the embedded world, but is C++ really more popular than JavaScript? I’d expect JS to be at the top of the list, and meanwhile it’s only about twice as high as Go? I like Golang, but ask ten software engineers and maybe one has used it, whereas it’s unlikely you haven’t used at least some JS.
My gut tells me that this list isn’t very meaningful.
> But C++? I can see why C might be up there due to the embedded world, but is C++ really more popular than JavaScript?
If it's anything like TIOBE, they give bonuses for vendor offerings, which gives languages like C/C++/Fortran a leg up in most domains that aren't web development: robotics, embedded, scientific computing, etc.
Or at least I've never seen a JS vendor offering that wasn't for web development.
Sure, but it all boils down to what exactly is ranked, right? "Percentage of programmers that have used at least some of this programming language" is probably not a criterion.
Which category is supposed to relate to their previous reports? They used to have 4 languages (C, C++, C#, Java) between 50 and 99, now there are none. It appears their algorithm has changed. They mention change in the article but all the change appears to have happened in the past two years. They describe Trending as zeitgeist but what does that mean?
With this kind of ranking, there is a ton of noise, so the top few likely reflect the methodology correctly, but the ordering of the lower ones is meaningless.
Those numbers are quite interesting and a good wake-up call for all those mentioning how relevant Kotlin supposedly is in the JVM space, across all three criteria.
When I search Kotlin jobs on Indeed I qualify with -Android -mobile to get the truth about Kotlin's failure to make a dent in Java's entrenchment outside mobile.
SQL is Turing complete. Standard SQL may not have been before SQL:1999 (though proprietary query and procedural extensions made many concrete implementations Turing complete), but both CTEs and procedural code (SQL/PSM) were added as part of the SQL standard in SQL:1999, and either one alone would make standard SQL Turing complete.
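To make that concrete, here is a toy sketch of the recursive-CTE part, run through SQLite via Python's built-in sqlite3 module (the query itself is trivial, but WITH RECURSIVE is what gives standard SQL unbounded iteration):

    import sqlite3

    conn = sqlite3.connect(":memory:")

    # A recursive CTE expresses iteration entirely in SQL:
    # here, summing the integers 1..10 with no procedural code.
    rows = conn.execute("""
        WITH RECURSIVE counter(n, total) AS (
            SELECT 1, 1
            UNION ALL
            SELECT n + 1, total + n + 1 FROM counter WHERE n < 10
        )
        SELECT n, total FROM counter
    """).fetchall()

    print(rows[-1])  # (10, 55)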
Otherwise, it’s a language that has picked up pace in the last few years and is not lying still (Java 17 and now 21, for example, but also AOT compilation with GraalVM, etc.).
From the end of the '90s to the mid-2000s, Java was the go-to language for everything, from browser applications (applets), to coffee machines, to enterprise applications. It is cross-platform, and was free before Oracle bought Sun Microsystems.
It's also the language of Android development.
Same question for HTML. In my mind, HTML is a markup language. Maybe I am unaware of more advanced features that fall into the programming language category.
For me HTML is a declarative programming language with a very small feature set. Many people disagree with me and I’m ok with that.
The “it’s a markup language” argument tends not to hold weight for me: when I ask people why, they say “because it’s in the name”, which, for me, misses that HTML is considerably more developed now than when it was named and should be measured on its current feature set.
It’s not a hill I’d want to die on though so really, it doesn’t matter.
HTML isn't Turing complete, which, along with other limits, makes it very limited as a programming language, but it does support programming behavior to some extent without resorting to another language, so it certainly makes sense in that regard to call it a programming language.
Yes, SQL, HTML, CSS (not on this list apparently), XML, JSON, YAML and a couple others are all declarative 'programming' languages. I don't know if there's a formal definition of what is and isn't a programming language tbh, but either way I wouldn't consider them lesser for whichever reason.
That said, I think SQL, CSS and maybe HTML are the only ones that are in a league of their own; that is, you could get a job as a pure CSS or SQL programmer.
That used a sometimes conventional "sarcasm font" to refer to a widespread belief that now that we have Rust, nobody would start a project of any size in C++.
> Python’s increased dominance appears to be largely at the expense of smaller, more specialized, languages. It has become the jack-of-all-trades language—and the master of some, such as AI, where powerful and extensive libraries make it ubiquitous.
Python being described as a jack-of-all-trades and master of some is a really succinct way of explaining its dominance. One of my kids is about old enough to start learning to program. At this point, I can't see a reason to teach her anything but Python.
As someone who expounds often on the merits of statically typed languages for production code, I couldn’t disagree more with the sibling comments discouraging teaching children programming with Python. I think it’s the best choice.
I remember when I was a TA teaching CS 101 with C++. People constantly stumbled with types and compilation. Just the concept of storing a value in a variable and retrieving it again was too much for some people.
That is where Python will shine for youngsters. Variables, control structures, functions. I bet they will get to the point on their own where they’re like “what’s the deal with numbers and letters being different?” and then you can take their hand and say “grasshopper, you are ready to learn about C/integers/ASCII/floating point”.
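For what it's worth, a made-up sketch of what that first stretch can look like in Python (not from any particular curriculum): variables, a loop, and a function with no type declarations anywhere, after which the numbers-versus-letters question tends to surface on its own.

    # Hypothetical first program: variables, control flow, a function.
    def greet(name):
        return "Hello, " + name

    for i in range(3):
        print(greet("world"), i)

    # ...and then the question arrives naturally:
    print("3" + "4")   # '34' -- letters stuck together
    print(3 + 4)       # 7   -- actual arithmetic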
Yeah, but you need an online IDE to have sandboxing if you try to build something fancier, as global deps by default bite so hard in Python.
I like JS for its node_modules solution.
In Python you end up with library conflicts and segfaults that you cannot debug unless you are deliberate about deps management.
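For what it's worth, the "deliberate about deps management" part can be done with nothing but the standard library's venv module; a minimal sketch (the environment name is made up):

    # Create an isolated environment so project dependencies don't
    # collide with (or pollute) the global site-packages.
    import venv

    venv.create("example-env", with_pip=True)
    # Install into it explicitly, e.g.:
    #   example-env/bin/pip install requests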
Python is not a master of anything... just popular garbage.
The reason not to teach Python is the same reason you'd give for preferring a home-cooked meal to McDonald's take-out. And I'm saying this as a parent who has to compete with that food enterprise... I think I'd be a horrible parent if I didn't allow any McDonald's food ever, but I certainly don't encourage it.
Similarly with Python: if you want to give your child something good, there are plenty of options that are better than Python. But Python will be the inevitable garbage they will have to face due to its popularity. It would be counter-productive not to prepare the next generation to deal with it.
I'm not in a position to advise anybody on how to bring up their children, but I would focus on ideas, not languages. So for me Racket is the evident choice for teaching how to program.
Seems like a good way to make someone not want to program. That's jumping into the deep end a little too fast, especially for kids. Early learning requires "easy wins", in my experience. code.org, sparkfun stuff, maybe even Factorio allow kids to experiment with learning programming constructs in a fun sandbox.
One of the most important aspects of learning anything that has a lasting effect, if not the most important, is honesty between the teacher and the student. I couldn't with a straight face advise anyone to learn Python. It's just the absolute trash of a language in any aspect you choose.
Playing stupid games with someone's ego and their ability to self-assess is a sort of insidious evil. Just to give you an example: the story of one Polish naive artist who was led to believe he was a big deal by some newspaper publishing an article about him. It was in the 80s, I think, when the world went through a phase of infatuation with naive art (again). Excited about the newly earned fame, the guy applied to Krakow's art academy, and... was basically laughed out of it for not being a "conventional" good craftsman. He committed suicide shortly afterwards.
In a less dramatic way, the other thing that comes to mind is the scene from Kung Pow! Enter The Fist, where the protagonist was "taught wrong, on purpose, as a joke". I cannot imagine telling someone who knows nothing about programming that Python is a good way to do anything or to learn anything. It's just a pile of trash made popular by the network effect. If anything, anyone with a drop of common sense should work hard to prevent the next generation from using it.
I started my son on Python but he switched to Java because that's what you need to write Minecraft mods. That lasted for two or three years. Now, thanks to Advent of Code, he's coming back to Python again, as well as assembly language (he did the Nand to Tetris course and is very interested in designing his own CPU on FPGA). Eventually I'm going to show him Scheme and SICP.
I've always felt like it was a bad first language because there's all of this type stuff going on under the hood that isn't really hidden from the user. I think Go/Java are probably better first languages because they're more explicit with the typing.
Java in particular, because the fact that it's so verbose and has so much boilerplate is, I think, actually good for a new programmer.
Couldn't disagree more. I don't know why programmers think there's merit in making things harder than they have to be. There's nothing wrong with learning about strong typing later on. Python is so much more readable and easier to understand than any strongly typed language.
This is not about being "hard". Python is actually very hard... except you don't know it well, and mistakenly think it's easy. You have probably never seen more than half of the functionality available in the "standard" library, i.e. you don't even know it exists, for example. Now, add to this that there isn't really one Python. There's a succession of similar languages, all called Python. But, can you tell in which version objects of the "module" type gained all the attributes of their alleged superclass -- "object"? How many facts like this one do you think exist?
There are also different ways things can be hard. Some things are hard to understand once -- and then once you understand, no extra effort is required. You just go around knowing it. Some things could be hard because nobody can explain them to you, and you will be running up against the wall trying to figure them out. Some other things are hard because they are always in flux. You think you've understood them, but the next moment someone is pulling the carpet from under you, and you have to start understanding them all over again.
I still remember my disappointment, for example, when after the second semester of automata theory we ended up constructing "mini machines" -- short sequences of instructions that could be combined to build bigger programs. And then we went straight to Java. There was no continuity. Nobody cared to, or even could, explain how, if you wanted to build from first principles, you could get to Java. The step between "mini machines" and Java was never explained. There was no explanation for why Java should have such and such a feature, or what would have happened if it hadn't, etc.
People are masters of reconciling inconsistencies, and not in a good way. When something doesn't make sense, people will masterfully invent some cooked-up nonsense to replace the real explanation. And this is the case for most CS majors. When you ask them basic questions about their trade, what you hear back is some laughable excuse of an answer. We have several generations of programmers today who grew up with this nonsense, internalized it, and made it into a cult. Python owes its popularity, in great part, to these generations of programmers with missing foundations and a lacking ability for critical thinking, which has led to a bizarrely distorted view of the world.
> But, can you tell in which version objects of the "module" type gained all the attributes of their alleged superclass -- "object"?
As someone whose day job is mostly Python, not off the top of my head, but I could probably find out if it was ever relevant to my work (finding information about Python is quite easy, which is one of its social strengths), which it would almost certainly not be, because in normal use of Python modules, it doesn't matter.
> How many facts like this one do you think exist?
Trivia that's irrelevant to most use of the language? Lots. So what? That doesn't make actual use hard.
> I still remember my disappointment, for example, when after the second semester of automata theory we ended up constructing "mini machines" -- short sequences of instructions that could be combined to build bigger programs. And then we went straight to Java. There was no continuity. Nobody cared to, or even could, explain how, if you wanted to build from first principles, you could get to Java. The step between "mini machines" and Java was never explained.
I can see why that might, to some tastes, make that course of study intellectually unsatisfying, especially if one both had an obsessive need to see the whole picture and not the drive to do the obvious work laid out, implicitly, for the student, but it seems like a non-sequitur when discussing whether or not Python is hard.
> But, can you tell in which version objects of the "module" type gained all the attributes of their alleged superclass -- "object"? How many facts like this one do you think exist?
None of those things matter to a beginner; what's important is minimizing friction, so they can actually learn programming without having their curiosity rewarded with a flood of compiler errors. Python is fantastic in that regard. No need to define a main function, no need to care about signed/unsigned int, printing out the contents of an array or dictionary directly is a single statement that won't just output a heap address, etc.
And when they learn about more advanced stuff they will understand why many languages do things differently, and Python will still be very useful for small scripts and prototypes.
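To make the friction point concrete, a tiny made-up example (names are invented): this is a complete Python program, with no main function, no type declarations, and a dictionary that prints as its contents rather than as an address.

    # A complete program: nothing stands between the beginner and the point.
    scores = {"alice": 3, "bob": 5}
    scores["carol"] = 4
    print(scores)                # {'alice': 3, 'bob': 5, 'carol': 4}
    print(max(scores.values()))  # 5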
That's where you are wrong. Beginners don't know that these things matter to them. But they very much matter.
> so they can actually learn programming
But they won't. They will become the "water from the toilet" idiots. Whatever they learn how to do will have no value if they don't know how to do it well. I happened to see an article today about how the US military lowered IQ requirements for draftees during the Vietnam war, and how people with low IQ were five times as likely to die. You are suggesting doing things wrong in the hope that if you do it fast, somehow nobody will notice? -- hell no. We already have armies of programmers who believe in strict static typing or in object-oriented programming. This will just add more idiots to their ranks.
> That's where you are wrong. Beginners don't know that these things matter to them. But they very much matter.
Could you specify why they matter for a beginner?
Because I've been using Python for quite a while and I can't answer your previous question about which specific Python version made that subclass attribute change. I think by the time such a detail comes up, a "beginner" is already comfortable enough with the basics to learn about it. But in Python those basics tend to be more straightforward and involve less boilerplate and redundant syntax than in many other languages, which is nice, because getting slowed down by those things when learning is frustrating.
> You are suggesting doing things wrong in the hope that if you do it fast, somehow nobody will notice?
I'm suggesting that reducing friction in the first phases of learning programming is a worthwhile improvement and that the choice of language matters in this regard, nothing else. I didn't say a word about learning an entire language or learning programming up to the point which you call "knowing how to do it well".
> We already have armies of programmers who believe in strict static typing or in object-oriented programming. This will just add more idiots to their ranks.
Am I reading this correctly that you think relying on strict static typing is for idiots? Because in my experience it's very helpful for maintaining code, in addition to other best practices, of course.
They're not redundant and frustrating when you're a beginner. The repetition is good for you, and things still feel novel rather than repetitive and boring.
> But they won't. They will become the "water from the toilet" idiots.
This is, IME, not a function of what language people learn first (or second, or...), but of their personality and desires. Most people who learn programming learn it as a means to an end (often the end is a paycheck) and will learn only as much as feels like a good payoff for the effort in that context. This isn't the fault of the language they start with; it's just human nature.
You can't fix the fact that things you think everyone should know about programming, and should want to know, don't actually make sense to bother learning given most people's actual utility function, by using a different first language.
I think that for teaching, you can go too far and eliminate too much friction. In Python in particular, there's all this type stuff going on under the hood that's absolutely critical in understanding what the output of your program is going to be, but nowhere in that program is it explicit what that type information is.
It's easy enough to keep track of that in your head once you've mastered the concept, but for a true beginner, I think
Scanner console = new Scanner(System.in);  // java.util.Scanner reading stdin
int x = console.nextInt();
String operation = console.next();
int y = console.nextInt();
is probably better suited for helping you get your bearings than having to figure out why you have to cast x and y after doing the equivalent with Python's input().
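For reference, a rough guess at the Python equivalent being alluded to (hypothetical beginner code): input() always returns a string, so the int() conversions are something the beginner has to discover before any arithmetic works.

    # input() returns str, so x and y must be converted explicitly.
    x = int(input())
    operation = input()
    y = int(input())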
Of course it's hard to completely understand Python inside and out. There are probably only a few dozen people in the world who do, or a few hundred max. And that's because for the vast majority of people it's completely unnecessary. If you want to go deeper then you're welcome to, but a new programmer is just going to get bogged down in unnecessary details if you try to introduce new concepts when he's just starting out. It's like saying the average PC user should understand the inner workings of GUIs. Makes no sense.
Why not C then? Java has more "just do this because it's the way it is". I think it depends on why someone is learning to program, generally I think python is fine for say a mechanical engineer that has to do some scripting, C is good for, as you say, understanding what's happening under the hood, exposing pointers in particular.
The array-pointer decaying magic in C requires too much mind-bending. Most of the standard library for string processing is insecure. No boolean types. No name-spacing. One-line if bugs (ridiculously common for beginners and even experts). I could go on and on. Frankly - I would have preferred to learn assembly as a first language rather than C in my college days.
I think with C, learning how the PDP-11 worked gets in the way of learning how to program. I think most people should eventually learn C, but control flow should come first, and Java provides more guard rails. No `char *` and no segfaults.
Boilerplate gets in the way of solving the problem you're writing code for. IMHO it's too much detail to force on someone new. There are reasons not to use Python, but it's better to let people discover them than to try to explain it all up front.
If you're just trying to hack something together and make it work, I'd agree that python is the best, even for somebody who's never written a line of code in their life.
If you're trying to train somebody to be a software engineer, rather than just automating a particular data entry task as quickly as possible, I think boilerplate is good. It's better to force the details on the person at first than to have them be hidden, automatic, and mysterious forever.
In practice I assume JavaScript is the leader language for onboarding new people. The speed of going from learning-the-language to useful results has to be best in JavaScript.
Every time I need to work in JavaScript (which happens only a few times a year), just getting build scaffolding to work is so very hard. The gulf from "you understand the language's syntax" to "you can build something useful following commonly-agreed conventions" is vast. Not to forget that these conventions significantly change every few months.
Use Deno. You don't need to deal with `package.json` and `node_modules`. You can have full programs in a single script that import their dependencies within the script. The build scaffolding can range from 0 to whatever you need.
I actually returned to JS scripting after a long hiatus thanks to Deno. And now like the language a lot more.
Lua is also a good choice as a first language because it's a super small, well-defined language that lets you transition easily to JavaScript (which has a similar object model but is a dumpster fire) or Python.
Lua also integrates with C extremely well, so it can help with that transition.