Perhaps I misunderstand the point of the essay, but I find it difficult to reconcile:
"A developer from 1975 could look at modern Javascript and feel pretty comfortable. Garbage collection, runtime typing, closures, and object inheritance all existed in some form 40 years ago."
with
"I argue that almost all of the improvements have come from four areas: test driven development, version control, continuous builds, and library ecosystems.
I don't think that many developers in 1975 would have been comfortable with garbage collection, etc. Certainly there are some developers who would have had no problem with them, but think of the many COBOL and Fortran developers who would have had no idea what those terms meant.
On the flip side, there are also people in 1975 who would have been familiar with version control "in some form". SCCS dates from 1972 and the 1975 paper shows it was used for several projects at AT&T.
There was also an IBM user's group, SHARE, which started in 1955. Quoting Wikipedia, "A major resource of SHARE from the beginning was the SHARE library". DECUS was a similar user group for DEC, founded in 1961. The same developer from 1975 who knew about GC, closures, etc. would certainly have known about library ecosystems. So would many of the COBOL and Fortran coders.
(Kent Beck famously says that he "rediscovered" TDD, http://www.quora.com/Why-does-Kent-Beck-refer-to-the-redisco... , but that is not an aspect of 1970s programming practices that I know about.)
My feeling is that the author's understanding is heavily biased by personal experiences. That's clear from phrases like "I think I first used CVS in the late 90s and never worked on an un-versioned project again.", which is 20 years after SCCS shipped as part of AT&T Unix. Already by 1990 you see projects like Python begin under version control, as https://hg.python.org/cpython/rev/3cd033e6b530 shows, and Perl 1.0 at http://history.perl.org/src/perl-1.0.tar.gz was developed using RCS and contains an imported package that used SCCS.
Gibson wrote "The future is already here — it's just not very evenly distributed"; it took about 20-30 years for the usefulness of version control systems to make it's way to the majority of programmers. I suspect the issue is that it's much easier to understand isolated technical details about programming languages from 30 years previous that it is to understand the more nebulous topic of what was considered best practices.
I also find this statement hard to understand: "I feel like Java really kicked off the trend of wide scale reusability with it’s built in standard library". C has a standard library. Python has a standard library and is older than Java. (The expression was, "Python comes with batteries included".) What makes Java's standard library more special than, say, the standard library from Borland's Object Pascal/Delphi with all of its support for rapid application development? Or than the vast library ecosystem built around VB and COM?
Lastly, there's "The magic part is simply that open source licenses like Apache and MIT lower the friction to trying out a new library." Historically there was no copyright on source code prior to 1980. Someone from 1975 would not have had to worry about these licenses because they weren't relevant.
> I don't think that many developers in 1975 would have been comfortable with garbage collection, etc. Certainly there are some developers who would have had no problem with them, but think of the many COBOL and Fortran developers who would have had no idea what those terms meant.
I think it's a mistake to assume they wouldn't know what those terms meant, just as much as it's a mistake to assume modern C developers don't know what they mean, whether they have experience in languages that support them or not. Possibly more so. Part of a CS degree is exposure to concepts, usually embodied in languages that feature them. The fact that the job market was more constrained and there were only jobs available for developing in a small set of languages could account for why they weren't used much.
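To make those terms concrete, here is a tiny, purely illustrative sketch. Python is just a convenient stand-in; the point under discussion is that these ideas already existed in some form by the mid-1970s (garbage collection and closures in the Lisp lineage, class inheritance in Simula 67).

    # Purely illustrative: a closure, runtime typing, and object inheritance
    # in a few lines. None of this is new to modern languages.

    def make_counter():
        count = 0                 # captured by the inner function: a closure
        def increment():
            nonlocal count
            count += 1
            return count
        return increment

    class Shape:                  # class inheritance, as in Simula 67
        def area(self):
            raise NotImplementedError

    class Square(Shape):
        def __init__(self, side):
            self.side = side      # runtime typing: no declared type needed
        def area(self):
            return self.side * self.side

    counter = make_counter()
    print(counter(), counter())   # 1 2
    print(Square(3).area())       # 9
    # The counter and Square objects above are reclaimed by garbage collection.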
> I also find this statement hard to understand: "I feel like Java really kicked off the trend of wide scale reusability with it’s built in standard library" ...
Python isn't suitable, or at least is arguably a poor choice, for some tasks. C had libraries, but its standard library is very small compared to what Java and Python include.
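As a rough illustration of that difference in scope (a sketch, not a rigorous comparison), everything below uses only modules that ship with Python, while in C each of these tasks would typically mean pulling in a third-party library:

    # Only standard-library modules are imported here: JSON parsing, an
    # embedded SQL database, cryptographic hashing, and ZIP archives.
    import hashlib
    import io
    import json
    import sqlite3
    import zipfile

    record = json.loads('{"name": "example", "version": 1}')

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE pkg (name TEXT, version INTEGER)")
    db.execute("INSERT INTO pkg VALUES (?, ?)", (record["name"], record["version"]))

    digest = hashlib.sha256(json.dumps(record).encode()).hexdigest()

    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        zf.writestr("record.json", json.dumps(record))

    print(digest[:16], db.execute("SELECT name, version FROM pkg").fetchone())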
> Historically there was no copyright on source code prior to 1980.
There also wasn't a good way to distribute, much less collaborate on, projects like we do today. I think both the licensing and the new methods of distribution and collaboration played a part.
Neither you nor I nor the original author have hard numbers about who knew what, so all of this is conjectural. But I was addressing a different point, which was my difficulty in reconciling the author's position that "a developer from 1975 could ... feel pretty comfortable" with modern programming languages because the underlying techniques existed already "in some form", while also implying there would be no developers from 1975 who would feel comfortable with version control and library ecosystems "in some form."
It's surely true that not all programmers now know about closures, just like it's unlikely that all programmers in the 1970s knew about them, so this isn't a universal claim. Nor did I take it as one. Rather, I think the author's point is that those with an interest and experience in programming languages would have known those topics. That is undeniably true.
My response was to point out that those with an interest in software engineering techniques would have known about version control systems and library ecosystems "in some form", because I can point to people who were using SCCS, and people who were involved in library ecosystems.
"Python isn't suitable, or at least is arguable a poor choice, for some tasks."
That's inarguable; Java isn't suitable for some tasks either. Nor do I dispute that the C standard library is smaller than Java's. But why is it that Java's standard library "really kicked off the trend of wide scale reusability"? What was the magic size threshold? Why did previous languages with a large set of standard libraries (I suggested Object Pascal, e.g. from Turbo Pascal 5.5) not kick off that trend?
"There also wasn't a good way to distribute"
Again, I don't think you are discussing the same point I am. The author wrote "The magic part is simply that open source licenses like Apache and MIT lower the friction to trying out a new library."
My point was that in the 1970s that magic part wasn't relevant. Instead, and I agree with you here, the "magic part" for a 1970s developer would be instant network access, not open source licenses.
I think licensing is much less magical... and even in the 1970s there were already ARPAnet, email, and UUCP, so again it's more a diffusion process than something "magical" or unknown in 1975.
We might ask: has humanity gotten better or worse? Given this impossible query, I will try to answer anyway. A caveat being I can only comment on direct observation of about 37 of those years.
The overall answer is software systems are solving multiple orders of magnitude more problems than 40 years ago. So much better from that perspective.
I suspect the questions behind the title question, however, are qualitative. Are we solving these problems in better ways? Is our code more maintainable? Easier to reason about? Faster to develop? Maybe. Probably. Yes. Certainly not by the same orders of magnitude.
Frameworks and languages continue to pile on Rube Goldberg-esque construct upon construct. Long-lived systems must ultimately be scrapped or the businesses relying upon such antiquated systems replaced. The very few notable exceptions applying engineering effort to eliminate cruft are often overlooked in favor of whichever latest marketing spiel shouts the loudest.
Ah, so we are talking about humanity after all aren't we? :)
We are indeed talking about humanity. Apparently the human condition also includes taking the easy way out at the cost of long term pain.
I was inspired to write this post after reading about people trying to design better programming languages. I realized that PL design hasn't fixed our problems in the past, and it probably won't fix them in the future. We muddle through by improving our tools and processes.
I'd like to see more emphasis on improving tooling and less on creating new languages too.
I feel like the battery of concepts from different languages is "good enough" for concisely and elegantly expressing software solutions (if you mix and match them), but it annoys me that programming language authors seem to leave the implementation of debuggers, language analysis tools, and editor support to the final stages of the project.
E.g. in a lot of languages, getting a working AST is no easy matter and often relies on user-supplied libraries. Same for REPLs and debuggers.
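For contrast, a quick example of what it looks like when the AST is part of the language's standard distribution rather than a user-supplied library (Python's ast module, used here purely as an illustration):

    # The parser and AST are shipped with the language itself; no
    # third-party tooling is needed to inspect program structure.
    import ast

    source = "def add(a, b):\n    return a + b\n"
    tree = ast.parse(source)

    # Walk the tree and report each function definition and its arguments.
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            print(node.name, [arg.arg for arg in node.args.args])  # add ['a', 'b']

    print(ast.dump(tree))  # the full, structured representation of the parse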
"A developer from 1975 could look at modern Javascript and feel pretty comfortable. Garbage collection, runtime typing, closures, and object inheritance all existed in some form 40 years ago."
with
"I argue that almost all of the improvements have come from four areas: test driven development, version control, continuous builds, and library ecosystems.
I don't think that many developers in 1975 would have been comfortable with garbage collection, etc.. Certainly there are some developers would would have had no problem with them, but think of the many COBOL and Fortran developers would would have had no idea of what those terms meant.
On the flip side, there are also people in 1975 who would have been familiar with version control "in some form". SCCS dates from 1972 and the 1975 paper shows it was used for several projects at AT&T.
There was also an IBM user's group, SHARE, which started in 1955. Quoting Wikipedia, "A major resource of SHARE from the beginning was the SHARE library". DECUS was a similar user group for DEC, founded in 1961. The same developer from 1975 who knew about GC, closures, etc. would certainly have known about library ecosystems. So would many of the COBOL and Fortran coders.
(Kent Beck famously says that he "rediscovered" TDD, http://www.quora.com/Why-does-Kent-Beck-refer-to-the-redisco... , but that is not an aspect of 1970s programming practices that I know about.)
My feeling is that the author's understanding is heavily biased by personal experiences. That's clear from phrases like "I think I first used CVS in the late 90s and never worked on an un-versioned project again.", which is 20 years after SCCS was shipped as part of AT&T Unix. Already by 1990 you see projects like Python begin under version control, as https://hg.python.org/cpython/rev/3cd033e6b530 , and Perl 1.0 at http://history.perl.org/src/perl-1.0.tar.gz was developed using RCS, based on
and contains an imported package that used SCCS: Gibson wrote "The future is already here — it's just not very evenly distributed"; it took about 20-30 years for the usefulness of version control systems to make it's way to the majority of programmers. I suspect the issue is that it's much easier to understand isolated technical details about programming languages from 30 years previous that it is to understand the more nebulous topic of what was considered best practices.I also find this statement hard to understand: "I feel like Java really kicked off the trend of wide scale reusability with it’s built in standard library". C has a standard library. Python has a standard library and is older than Java. (The expression was, "Python comes with batteries included".) What makes Java's standard library more special than, say, the standard library from Borland's Object Pascal/Delphi with all of its support for rapid application development? Or of the vast library ecosystem built around VB and COM?
Lastly, there's "The magic part is simply that open source licenses like Apache and MIT lower the friction to trying out a new library." Historically there were no copyrights to source code prior to 1980. Someone from 1975 would not have had to worry about these licenses because they weren't relevant.