I know only part of the picture. But I suspect that the following factors are at least sometimes also in play:
* Many companies start software projects as greenfield projects, written by young, inexperienced, and cheap developers.
* These projects, when successful, grow and become complex legacy projects.
* Some companies do not realize that developing such a legacy project further requires far more experience and knowledge, and expect to be able to hire young, inexperienced, and cheap developers who pick up the work exactly where the former developers left off.
* They may or may not realize that it takes far more experience and skill to read, understand, and maintain such a project than to start a small new project from scratch. Yet the work required to understand and maintain a legacy project can be a significant part of all the work that went into writing it over the years.
* The companies might also be in denial about how much skill they need, or they might not be transparent about the amount of work involved, in an effort to hire new developers cheaply.
* And in my experience, this often comes with a general lack of appreciation for what the developers have actually created.
What comes to mind is an extreme case of this that I experienced many years ago, when I was a young, poor student. I found a leaflet at a university announcing a job opening with a company that sold a kind of embedded product. In the interview, they asked me whether I knew how to write C and whether I had any experience with 8-bit assembly (which I had). When I came to my desk on the first day, I found an 11-inch stack of printed listings: 500,000 lines of Z80 code (at that time, code was printed out on fan-fold paper). It was mostly generated from C code cross-compiled into assembly and modified from there. It was commented in Japanese (a beautiful language, but I don't speak it). There was no documentation. There was not even a Makefile.

In the following weeks and months, I learned that a much more knowledgeable fellow student had offered to do the work for four times my salary. He was furious at me. The company was apparently not aware that he, being an assembly and embedded expert, was far more capable of doing this job than I was. The company made tons of money selling a specific kind of hardware, and the software, which had initially been reverse engineered from the hardware product, was a necessary companion to their main offering. I also learned that the previous developer had worked as a contractor and had left the business relationship after negotiations over his compensation failed. He had died shortly after that, and his widow was unwilling to provide any further information or notes he might have left.

In the end, I could only achieve so much at that job. It was not that I wasn't smart; the work was simply way above my level of experience and knowledge (which was actually not bad), and it would have taken years anyway. The company would have been far, far better off paying ten or even twenty times my salary to an experienced professional rather than to a student. When I left, they gave me a shabby certificate of employment which even contained spelling errors. I think that sums up the relationship they had with their developers.
Another thing I have learned over the years is that, while there are of course good, competent developers and less competent ones, what more often than not makes a huge difference is how the work is organized. That alone can easily increase or decrease productivity by a factor of ten, and it largely determines how much value good developers can create with their work. So if a company cannot pay that much, it might also be because its processes simply cannot generate good value.
So, in short, part of the mismatch which seems to crop up here could be that companies sometimes do not know what they really need, and also do not appreciate what they have in terms of people.