I feel like I've been reading this article every year, with some variation but the same yet-unrealized potential every time.
I'm not an expert in "human" coding or AI, but I get the same vibe from both of them -- until you try it yourself, it seems pretty reasonable that someone will solve this problem fairly soon. And it has seemed that way for many years now.
Simple CRUD apps can already be built with simple, friendly interfaces. The simple apps will continue to get easier for non-programmers to assemble without "coding".
But real-life requirements get complicated soon -- your simple CRUD app now needs to auto-charge customers every time they foo 100 bars, unless it's been less than a month (in which case you charge a prorated total at month end), and early subscribers are grandfathered in to a simple fixed monthly charge. How will you click-and-drag that? Or explain it clearly in natural language, without multiple possible interpretations?
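Just to make that concrete, here's a minimal sketch of one possible reading of that rule in Python -- the field names, prices, and 30-day cutoff are all invented, and the fact that other readings are just as plausible is exactly the problem:

    # One possible reading of the billing rule; names and numbers are invented.
    from datetime import date

    FIXED_MONTHLY = 20.00   # assumed grandfathered flat rate
    PER_100_BARS = 5.00     # assumed price per 100 bars

    def month_end_charge(customer: dict, today: date) -> float:
        if customer["grandfathered"]:
            return FIXED_MONTHLY
        charge = (customer["bars_this_month"] // 100) * PER_100_BARS
        days_subscribed = (today - customer["signup_date"]).days
        if days_subscribed < 30:
            # first partial month: prorate and bill the total at month end
            charge *= days_subscribed / 30
        return round(charge, 2)

    print(month_end_charge(
        {"grandfathered": False, "bars_this_month": 250, "signup_date": date(2013, 6, 1)},
        today=date(2013, 6, 16),
    ))  # 5.0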
People make complicated stuff in Excel spreadsheets, and while it starts off fairly vanilla, it quickly becomes something as complicated as (if not more so than) writing a standard computer program in code.
Natural language serves a purpose -- mostly around navigating complicated social situations, including expressing general desires to others (who have motivations of their own, and will interpret according to their own internal rules). It's fuzzy and generally non-specific because that's how we are.
Why do we imagine our normal ways of interacting with each other can be applied to instructing a process to execute complex & strict logic?
I'm pretty sure we can make progress -- clean up the leakiest abstractions from software to APIs to hardware... -- and sort out the best way to build coding languages so they are as intuitive as possible for humans -- but I don't think "code" generally is going to disappear anytime soon.
Yup, it's kind of tedious seeing these types of article / blog post. As anyone that works as a programmer knows, it's not the grammar of programming that is hard. It's understanding the hardware and what it can do that is hard. It's encoding nebulous and sometimes conflicting business requirements into something a computer can understand that is hard. It's taking into account all of the different error states that a process can get into that is hard. Learning the syntax of a programming language? Not hard.
We as programmers have weird brains, compared to most of the population. You give a programmer the list of road rules, and they will probably immediately identify, instinctively, several rules that are mutually exclusive / in conflict with each other. You're able to understand how a home entertainment system needs to be set up, even though that's really not what most programmers do. You identify loopholes in legal frameworks / contracts, because the lawyers that wrote them aren't used to thinking about what has to happen for edge cases.
I agree with all the things you say are hard: understanding computation, encoding complex requirements, handling edge cases; but typing code into an editor and then re-running your application doesn't provide much support for making them easier. Better tools could help programmers and non-programmers alike work with that complexity much more effectively.
Tools are designed for certain requirements. As the commenter, antimagic, noted, these requirements can conflict. A better tool would definitely have to sort out those conflicts, but at this point we're talking about something like AI.
To talk down the scope of a lot of projects, I've often educated clients by explaining that if certain types of thinking were automated, we'd effectively be saying we were creating AI, and the cost of the project would grow exponentially.
> You identify loopholes in legal frameworks / contracts, because the lawyers that wrote them aren't used to thinking about what has to happen for edge cases.
The lawyers do understand, and did it on purpose. Programmers just think similarly enough to lawyers that they can sometimes find things that were meant to be hidden from the general public (and from the junior lawyers who work for the competition). :)
I don't find these articles tedious. I think it's awesome that people want to solve more problems with computers. The more people that can solve problems, the more people that will solve problems, the more problems that get solved. People want to use the power of computers to solve problems, and that is hard to do. Until it is easy (never), these articles will continue.
I honestly don't think you can make it much simpler. That is not to say it is simple, just that it is by nature not simple. You can dress the concepts up, make them look nice with a graphical front end, but really the complexity comes from the abstract notions implicit in the functions to be performed. You can tailor things by building libraries of functions to perform common tasks, but these abstractions are never going to be sufficient.

What is required is a method of translating from the world we are most familiar with, described by our natural language, into the world described by the language of the computer. The translator needs to understand the context of the natural language requirement and find its correct representation in binary. This is the job of the programmer. An automated solution to this may well have to be able to pass the Turing test.

That said, I often think about how to leverage machine learning to help bridge the language gap. Evolving an application is an interesting idea, where the requirements are drip-fed to the system through the language of the UI the program should present. Translations from some physical representation -- cogs, roads, queues, etc. -- might be useful, since such representations are instinctive to many people yet more precise in their descriptive power than spoken language.
It's certainly at least 30 years old. Before my voice changed I went to my first computer conference. People were selling tools there that would let business types do "programming" via just writing what they want in English. For minicomputers.
Of course, they had to write in a minimal, highly structured pseudo-English. And the problem domain was highly restricted. So it was basically programming, except less effective.
A fine example of the phenomenon that if enough people want something, somebody will come along and sell it to them, whether or not it actually works. I used to think this was malice, but as far as I can tell, greed, narcissism, and the Dunning-Kruger effect can explain it perfectly well.
Trying is excellent. Trying and failing is also excellent, in that it's how we learn. Trying and failing and then selling the product as if it's a success? Less excellent. :-)
Exactly. I remember a friend once wanted to create a tool for designers to create HTML+CSS visually, without having to know actual CSS properties and such.
Then I explained that the hard part of laying out webpages is not the basic layout -- it's how the layout reflows when text overflows, when the browser window is too narrow, when image sizes change, which things maintain alignment and which don't -- and that by the time you've added all these options to your visual tool, the visual tool is as complicated as CSS, so you've defeated the whole purpose.
At its heart, it's the fact that the specification of the program is the program. The devil is always in the details. (Which, incidentally, is why it's so hard to estimate development time accurately.)
But we have some such better tools! E.g. Chrome's developer tools let you edit CSS properties and the document structure live, seeing results instantly. It's fantastically more efficient than coding and refreshing the page. Tools that better match the problem domain matter so much precisely because of the domain complexities like overflowing and arbitrary data.
It's a small step. I can easily imagine bigger ones.
The idea is to build a tool that better fits the problem domain. In this example a visual tool lifts the burden of remembering specific CSS syntax, HEX codes or browser inconsistencies. In turn it's easier for a designer to worry about more interesting stuff, like reflows or responsive layouts. I have great hopes for visual web tools, such as Macaw. I think it will really take off when we get reusable HTML/JS components.
But that doesn't mean we need to reinvent the wheel every single time. Now look at something like Macaw and tell me that doesn't look like a useful tool: http://macaw.co/peek/
It's a good thing you talked your friend down from creating something. The world has too many creators as it is. Don't they know that everything good has been created already? If it hasn't been done then it isn't possible!
Did you have a point here? Do you think he should have just said "sure man, sounds great, go for it!" Did you know that a lot of non-programmers have a lot of great ideas, and that educating them about the difficulties of implementing those ideas isn't actually a bad thing?
The particular app in question has been tried a million times and countless hours have been invested into it, perhaps learning from that body of work would be helpful? But hey, maybe that guy is The One who will come out of nowhere and solve the problem in a completely new manner (half-serious - it's technically possible, but I don't think it's useful to put our eggs in that basket).
Yes! Those exact words. "Sure man, sounds great, go for it!"
General rule of thumb, if your friend wants to achieve something, help them do it. Don't sit there telling them why they're going to fail. It almost doesn't matter what it is. They want to start a business, be a brain surgeon, jump the grand canyon on a motorcycle, whatever, believe in them and help them succeed. If you don't believe in them, what kind of friend are you really?
I do believe that there is an amazing HTML/CSS design tool waiting to be created. I don't know what it is, and I don't know if that guy would have been the one to create it, but I love to see people try. He probably would have learned a lot even if he failed.
That negative attitude just really gets my goat in real life. There's one guy at work on whom I can count to shit all over any idea I bring him. There's another guy who always gets excited about trying new things. Guess which one I would happily give a recommendation, and which one I wouldn't care if he fell out a third story window?
Learning from your own mistakes is effective, but not efficient. Sometimes it's the best you can do, but when you can learn from other people's mistakes instead, that's a lot better.
It is funny to me that people treat "code" as just our best current way of representing the computational instructions we want done, as if we could equally express them in natural language. I agree with you that sometimes this isn't true. Sometimes I read code and I can't express what it does in English. I could roughly describe what it does, but I can't communicate all of its subtlety. The limitation of "no code" isn't the theoretical limits of AI -- it's humans!
Poorly-written code (especially if the reader doesn't have a good grasp of the coding language) can be very confusing, but it will still almost never be ambiguous.
If you try taking some well-written code and converting it to natural language with utterly no ambiguity, you'll quickly find that:
- the natural language version is far, far longer and harder to read
- it takes a lot of work to remove all ambiguities, and you will find yourself wanting to write a dictionary with strict rules and extensions -- totally unlike a REAL dictionary.
=> Word "allow" will always mean generally the 1st (verb) definition in Mirriam-Webster online, and will further imply that if the subsequent test (see below for definition) succeeds (definition below), then execution (see below) will continue; if the test fails (see below) then standard error (see below) handling procedure (see below) will be followed, in which...
It's ridiculous, isn't it?
You'd be inventing a new (but probably horrible) coding language, not using natural language at all any more.
The thing about natural language code -- it does exist, sort of. Think about the checklists that airline pilots run through before takeoff, for a good example.
That's sort of natural language process execution... but it's the same every time, the context is known (it's being executed by an airline pilot sitting in the cockpit of a specific airplane), and the pilot is trained such that s/he has practiced running the checklist many, many times before doing it "for real". And: it's a friggin' checklist. Is there any logic in there whatsoever, beyond "if this is true, continue; if not, abort"?
The limitations on it are incredible, and we still have human beings executing that "code"; they are not automated. The answer to one of the questions might unexpectedly be "sort of", or "yes, value in the safe range, but why is it steadily dropping?"
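To see how thin that checklist "code" really is, here's a toy checklist runner in Python -- the items and the observe() stand-in are invented for illustration:

    # Each step is just "if this holds, continue; otherwise abort".
    checklist = [
        ("Flaps set for takeoff", True),
        ("Fuel pressure in safe range", True),
        ("Cabin doors closed and locked", True),
    ]

    def run_checklist(steps, observe):
        for name, expected in steps:
            if observe(name) != expected:
                print("ABORT at:", name)
                return False
        print("Checklist complete")
        return True

    run_checklist(checklist, observe=lambda name: True)  # stand-in observations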
A race condition is a set of (possibly very many) very well defined states - or at least defined as well as everything else in the program. Someone should discover some algebra for working with concurrent processes and the race conditions would go away!
A simple data race in C11/C++11 is undefined behavior, it doesn't have to result in any well-defined state that you might get by interleaving the execution of threads.
Essentially it has to be that way, or on some architectures every access to a potentially shared variable would have to be wrapped in fences by the compiler.
Ambiguity is quite frequently between several well-defined states. A significant difference between race conditions and natural language ambiguity is that in the latter we typically expect resolution to happen favorably when the interaction is cooperative - which is not a feature of race conditions in the general or typical case. I think it's still fair to say that this is not an essential feature of ambiguity, though, which is why I included it as an example. In the case of a race condition, the relevant pieces of the language do not uniquely define a single result - additional context may.
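For readers who haven't been bitten by one yet: a classic check-then-act race, sketched in Python. The interleavings here are well defined, which is the point -- the sleep just forces the unlucky ordering, and the numbers are made up:

    import threading, time

    balance = 100

    def withdraw(amount):
        global balance
        if balance >= amount:        # check
            time.sleep(0.1)          # force a thread switch between check and act
            balance -= amount        # act

    t1 = threading.Thread(target=withdraw, args=(80,))
    t2 = threading.Thread(target=withdraw, args=(80,))
    t1.start(); t2.start(); t1.join(); t2.join()

    print(balance)  # typically -60: both threads passed the check before either withdrew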
Right. By the time this problem is truly solved with software sophisticated enough to eliminate the need for programmers, most people will be fairly useless in any endeavor.
The question that these sorts of articles always raise is ultimately: should we treat computation as something like mechanical expertise, which is important but can be easily outsourced, or is it more like literacy, which is critical to just about every profession and is often used as a proxy for intelligence?
It strikes me that the real difficulty is drawing a line between which tasks should be left to technical professionals and which should be within reach of the layman.
This is much easier to define in the case of mechanical devices which tend to be fairly fixed function rather than computers which are applicable in an almost infinite number of tasks.
For example, you wouldn't expect a car driver to be able to modify their own vehicle, but you would expect a lay computer user to be able to do tasks such as changing their wallpaper or installing software on their computer.
I wonder if this is partly what is driving the popularity of computer hardware with artificially imposed constraints? Is it simply easier to understand when you use separate devices for reading books and playing games?
It also raises the question of whether simply changing the means with which to do something changes the nature of the task. For example, is somebody designing a complex logical system not "programming" because they are doing it inside of a spreadsheet program instead of an IDE?
That's a good question. I don't have a personal answer yet, though I do like your mechanic example.
But to your last point: I chose the word "coding" so as to refer specifically to the linear textual formal representation we use today. People will keep designing complex logical systems - indeed, I hope more people will become more capable of doing so, and I think better tools will help.
Call me skeptical. Virtually everyone who has spent more than a few years programming can attest that it has, at the very least, opened a new dimension of logical thinking and problem-solving that is not easily found in other fields. I think the challenge of creating an interface to this kind of thinking that is friendly to non-coders is difficult enough...but the idea that once these tools are made, they'll grok what to do? Hmmm...We're much closer to a future where machine readers can read all manner of text out loud, but does that mean there's little use for teaching literacy?
edit: Here's an example - I recently got done teaching a data basics class, and inevitably, my students ran into the problem of data typing in Excel...you type in something like "9/14" into a field, and it treats it like a date, and moreover, displays it as "Sep-14". But when you try to concatenate the field, or perform some other kind of operation on it that is not date-related, Excel treats it as an integer (number of days since 1904, or something like that).
So Microsoft has arguably done something user-friendly here...virtually anyone who types in '9/14' probably means September 14, 2013. However, this isn't always the case, and if it isn't, you have to know how to format columns in Excel as text. Moreover, the operations upon date types are not at all intuitive. If you've done a minimal level of programming, you get the idea of data types, including dates and strings. Let's pretend Microsoft comes up with a way to perfectly guess what you mean when you've typed in a value...is it possible this will eliminate the need for the average user to have an understanding of data types in order to do non-trivial work?
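To spell out the ambiguity outside of Excel, here's the same "9/14" read three ways in Python -- the 1899-12-30 epoch is the one the Windows 1900 date system effectively uses (Mac Excel historically used 1904), and the year 2013 is simply assumed:

    from datetime import date

    raw = "9/14"

    # Reading 1: it's just text (a fraction, a score, a part number...)
    as_text = raw

    # Reading 2: it's a date in the current year, the way Excel guesses
    month, day = (int(part) for part in raw.split("/"))
    as_date = date(2013, month, day)

    # Reading 3: the Excel-style serial number behind that date
    serial = (as_date - date(1899, 12, 30)).days

    print(as_text, as_date, serial)  # 9/14 2013-09-14 41531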
There are many ways to mitigate these issues. The one I like to believe will be the most used would be an entirely new way of using computers. Just like Google uses complex algorithms in order to try to give you better search results, I see no reason why Excel can't try to "learn" your usage. I'm quite saddened that it's 2013 and few applications try to learn more about the user's usage patterns.
Overaggressive learning algorithms can in some cases be more annoying than well thought out defaults. Especially since they won't be consistent (at least at first). It's not an easy thing to do right. For things like spreadsheet formatting, you want to have consistent behavior that you can rely on.
We humans are insanely proficient at adapting to weirdness and suboptimal environments, and plain hopeless when it comes to coping when others mess with what we're used to. Classic example: the QWERTY keyboard.
If you want to do a fun experiment, try and watch the pain in two subjects' faces: one who has never used predictive text trying to use it for the first time, and one who has used it for years and is trying to use an old phone without it. Now imagine a scenario where every predictive text implementation had tailored itself to the user.
Build a system with what seems to be a good, logical set of usage assumptions and you will find that once it's out in the wild, you will get people trying all kinds of different ways of using your system. It defies your own logic, but humans aren't that logical (usually). Being too clever with your assumptions usually backfires.
I guess the difference is that what google does is already somewhat fuzzy, when you search for something you're not necessarily even sure what you want the search engine to return. We also don't expect the results to necessarily be perfect which is why the search engine returns more than one result.
On the other hand when you're trying to do some specific task you already know in most cases exactly what you want to happen, so if it does something else then that is very frustrating.
Well, that's the trade off I'm hoping people are willing to make. Just like we did it for search back when Google was new, for example: we had to learn how to use Google compared to whichever search engine we were using before. It's just like trying to learn a new application. I do believe unless the whole computing system/environment works in this new paradigm, the effort wouldn't have as much impact as I'm picturing in my mind.
Google is good for searching immensely vast troves of information simply because there is no other way you could really do it. A categorical index of the entire internet would be fairly unfeasible.
OTOH when we have smaller and more focused collections of information such as programming documentation or catalogues of similar products we tend to find it easier to navigate with categorical listings or simpler search algorithms.
I've heard this idea encapsulated in a neat phrase: 'Control interfaces should not be intelligent'. I think I first read that after Wolfram Alpha was launched.
Every time I see an article like this I think back to my first programming class in high school. The teacher asked us a simple homework question: describe all of the steps it takes to make a sandwich.
This seems like a simple task. Get all the ingredients and put them together. This is simple to explain to a human because they can make choices on their own and generally understand how food is constructed. But when you are trying to teach a robot how to make a sandwich, you have to explain every little detail, because it doesn't understand that bread comes in a plastic bag or how to open it, or where the fridge is, or not to make the sandwich on the floor.
Everyone in the class got the question wrong because they were all missing details.
I believe that code is going to be around for a very long time because it is currently the only way to truly express 'exactly' what you mean.
As the author of Trapeze (a spreadsheet-like application from way back then), it was nice to be mentioned. People have been trying to make coding go away since before I started coding, and so far it's still moving more in the opposite direction.
I was about to respond to his claims by pointing out that spreadsheets are a gateway drug for coding and that everyone who tries to do things without coding ends up coding.
Code is remarkably universal. The article claims spreadsheets are an example of a post-coding tool, but they, and his other examples, really aren't. Notably, "visual programming" has been a failure. I expect the end of coding to be presaged by things that break the linear symbolic language model. There are no such successful things.
I agree that spreadsheets count as programming, since formulas are code. But spreadsheets also add a memory model (cell references) and a UI (rows and columns), and these three things are all smushed together, or rather overlap perfectly, while in general-purpose programming they are independent and there is no such overlap. That's perhaps what makes spreadsheets simpler and more accessible.
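A toy illustration of that overlap -- everything here is made up, just to show the three pieces side by side:

    # Toy "spreadsheet": cell references are the memory model, formulas are the code,
    # and the row/column grid would be the UI layered over the same references.
    cells = {"A1": 10, "A2": 32}
    formulas = {"A3": lambda c: c["A1"] + c["A2"]}

    def recalc(cells, formulas):
        for ref, formula in formulas.items():
            cells[ref] = formula(cells)
        return cells

    print(recalc(cells, formulas)["A3"])  # 42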
Spreadsheets are accessible programming for people motivated enough to understand what happens when you have more than a simple column sum in a spreadsheet -- what could be called "programming in constraints." And that beautiful code-less, flow-less, condition-less paradise lasts only until you need a macro.
I feel like what companies mean when they say that children should learn to program is that they should learn problem solving and critical thinking skills. In my experience these things are reserved for only the children who are tested and marked as "gifted", but the average Joe would also benefit greatly from a healthy dose of learning these skills.
We don't program the computer as much as we program ourselves.
We may be able to build tools that reduce the burden but making a computer debug itself without either re-stating the expected outcomes and/or understanding the "intent" of the running code seems pretty much out of reach by definition.
In the short term it's more about automation/reduction of work than the end of coding. For example, in the past I paid designers hundreds or a few thousand dollars for creating a new version of my web page; now I use Twitter Bootstrap or a theme and pay a few hundred for customization.
Now, millions of software developers are working on the same problems at the same time; they are asking the same questions and following the same Q&A paths on sites like SO. There is a lot of space for automation.
Regarding new tools for ending coding, nobody has come up with a solution yet, and there are a lot of smart people who are/were thinking about it. Surely it will be a breakthrough, but it will work for solving domain-specific problems. It's impossible to solve it for the "general coding problem".
I've been working on http://loggur.com for a (very) long time now to address exactly what the author describes. (Note: The home page is very outdated and vague; I'm planning on redoing it upon release; see my other comments about the details here:
https://news.ycombinator.com/item?id=5811801)
And I should point out that coding will never fully become unnecessary. Of course many of the repetitive parts of coding can be automated, but with all kinds of new ideas and unknown technologies that the future has in store for us, custom code will almost certainly always be required.
Your site looks pretty good to me (and I just viewed it on my phone). Unless by "outdated" you mean, "it doesn't contain the most up-to-date information about our product(s)." In that case, well, I look forward to seeing it! Already added to my Google Calendar.
BTW: I know you're probably trying to collect emails of interested parties with your "get notified" form but you may wish to consider a direct "Add to Google Calendar" link or button. Here's an example of how to create such a link:
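(For reference, Google's event-template link looks roughly like this -- the event text and dates below are placeholders, not real details:)

    from urllib.parse import urlencode

    # Placeholder event details; dates use UTC in YYYYMMDDTHHMMSSZ format.
    params = {
        "action": "TEMPLATE",
        "text": "Loggur launch",
        "dates": "20140101T170000Z/20140101T180000Z",
        "details": "Placeholder description",
    }
    print("https://www.google.com/calendar/render?" + urlencode(params))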
Alex North writes: "For a long time, GUI builders were crap. GUI builders are still crap: they often provide a poor representation of what the rendered interface will look like, are not powerful enough for developers to achieve exactly what they want, and are too complicated and laden with programming concepts for non-programmers to use them."
Beginning about 1995, Borland's Delphi (now CodeGear / Embarcadero) seemed to me to have a pretty nice IDE for making GUIs on Microsoft Windows.
Any opinions on Delphi or information on anything similar for Linux or Mac?
About "GUI builders were crap. GUI builders are still crap": that's just BS. Not usable by non-programmers? Of course. But that hand-wavy dismissal is terribly lazy. Identifying what's exactly wrong and specially how to improve is the difficult part.
There is no GUI builder that enables building app-specific direct manipulation. GUI-builders really should not be called GUI builders unless you think a parade of dialog boxes is a GUI.
Every time I see a UI-to-code translator or converter[1], I find that it produces terribly crufty output -- they tend to take that high-level design and regurgitate The Code that would produce "correct" results. Maybe the answer is applying compiler-type optimizations to the output.
On the other hand, maybe the answer is learning logic ... nay, code.
[1] The one currently on my mind is the web publisher part of Microsoft Access from about 2000. Certainly not today's state-of-the-art, but representative of the typical tool.
The tools will only improve, but the use of the tools will continue to be restricted to those who are currently coders until we solve AI. The end of coding can only be accomplished by true AI as most humans are not capable of expressing requirements precisely enough for a software program to be generated without a bunch of follow up questions. The primary value of a good software developer is to identify and ask these questions as soon as possible. Writing the code is relatively easy.
Successful software needs constant maintenance. As a software product becomes harder and more costly to maintain (as most do) the boss who doesn't quite trust his team will be tempted to look for alternatives that appear more trustworthy.
This basic lack of trust becomes a self-fulfilling prophecy as communication breaks down and the team and the boss begin to work in opposition. The boss is now a prime target for being sold the idea that software development can be very nearly automated. It’s not a logical process, it’s an emotional one.
In other cases the boss simply lacks the necessary understanding. They might think that programming is actually an unskilled or semi-skilled profession, where the programmer simply memorises a quirky list of words and symbols and then types them in according to a specification written by someone in marketing.
All this raises the question of what it means to program a computer.
> If you’re a programmer and this offends you, consider how much more value you could create if you didn’t spend half your time as a glorified PSD->HTML translator.
As the author states, the idea of abstracting away the complexity of creating user interfaces isn't new, but I think it's wrong to think we could one day create tools that let you do that without having to write code and still keep all the flexibility of going sufficiently low-level. An application platform usually gives you low-level building blocks (e.g. Core Animation) and higher-level ones (e.g. UIControl and family). You can construct visual tools to manipulate the latter, but you can't do the same with the former, since they're much more than just position and parameters. Reducing this difference in complexity by blaming it on the tools is misguided IMHO.
There's an inherent complexity in coding that isn't its weakness but rather its advantage. You can't abstract away coding just like you can't abstract away human language or writing. Could you substitute an essay with a very detailed painting?
I was just about to write something like this. Just because you've got a pretty GUI that grandma could use to assemble a sophisticated "app" doesn't mean you have to take away the low-level functionality.
What's wrong with "right-click -> open component in editor"?
I'd also like to add that a lot of people think "coding" means "typing" and their idea of the ultimate abstraction involves a lot of drag & drop. That would be just fine for starting the development of an app but would get really annoying really fast as you start to add more features and functionality.
At the very least give us the option to perform every single task with the keyboard in an efficient manner.
It vaguely feels like he's talking about http://www.ni.com/labview/ which is just graphical programming (and is far less able to meet the level of specifications most tasks truly have than a real programming language, IMO).
Nah. People have been saying that since the 60s, and it will never happen.
Honestly, I think it's a bad idea. Programming skills turn over too quickly. If people in my age bracket had learned computer programming in middle school or high school, they would have been dragged through one sort of BASIC or another and then promptly forgotten about it. Which is probably better than them remembering it and being frustrated that nothing would let them use those skills.
If they were to learn something technical, I'd much rather it be a simple robotics class. A little bit of programming, a little bit of electronics, a little bit of Mech E, and a lot of problem-solving, self-teaching, and making of things that work. Those are skills that last, as does the attitude that they can figure things out if they want.
I don't think this is going to happen, even if someone invents direct mind-computer interface. Our thoughts are not precise enough. Everyone who has transcribed a mathematical equation realizes that context free grammar is just the most efficient way to express exactly what we want.
I agree. I think we will continue to abstract our communication with machines until the act of having to type code into a computer is a task that is largely only performed by a small subset of society (programmers).
But even when the day of having to do very minimal physical coding to direct computers arrives, I think there are still many benefits for the majority of people to learning this crude and dated form of coding. Perhaps not necessarily to become programmers, but to at least have some understanding of the guts of machines that power society.
In fact, I think the further removed we get from the physical code in directing machines, the stronger the imperative we have to expect future generations to understand not just what something is, but how and why it works.
I wholeheartedly agree. I was talking to my professors the other day about this; I think writing code will become similar to capturing photographs: before, only professionals could take a picture: they had to operate a complicated machine, and then they had to develop the film in a dark room by impregnating it with chemicals.
Nowadays, most of the population has an immensely capable camera in their pocket. I foresee that something similar will happen with programming. Most people will be able to command their computer and teach it to do new things, without having to know the first thing about how it actually works.
There will still be people that fiddle with data structures and pathfinding algorithms, but they will be doing it for research purposes or because they need a highly specialised application.
Yeah, I think photography is a good analogy for capturing (no pun intended) this trend. And maybe I'm just being overly sentimental, but I feel like because it's so easy to do things with our devices now without knowing how they work, that they're often under-appreciated.
For example, there's this popular interview that Louis C.K. did on Conan, where he said "Everything's awesome, and nobody's happy". Sure, most of that frustration can be attributed to the human condition, but ignorance also plays a big role. If people knew (and I don't) how planes actually worked (at a basic level, of course), it would be harder to get frustrated about spotty WiFi.
I know that everyone will disagree, but for the most part, programmers shouldn't be writing code either. There are a lot of good tools out there. The problem is that programming by definition is writing code. And tools that don't involve code are for non-programmers.
The problem is the definition of programming. And the fact that people are afraid they won't be considered a 'real' programmer if they don't write cryptic code.
I believe that right around the time that programmers start to figure that out, general AIs will come out about 1.5 times as smart as the smartest human, and at that point ordinary humans' childish perspectives and problems won't be very relevant.
This is so ethereal. What do you really mean by this? What is it that programmers have to figure out? That they have been bad boys and girls using cryptic code? ...
Ok, go make a video editing package without any of this "coding" nonsense and then explain it to us childish programmers. :)
No but seriously I don't think there is a lot here ...
Coding is simply telling a computer what to do. Even to a certain degree, entering a term into Google's search engine is a type of coding. And maybe that will be the future, parsers and compilers that can turn plain language into computer instructions. We're not terribly far off with transpilers/compilers that turn Python and other high level languages into C or native code...
Until a computer can program itself (which may never happen), there's going to be a person telling it what to do - and no matter whether it's a formula in a spreadsheet, a programming language, or something higher level, it'll still be code...
I'm still waiting for the hard AI we were promised was 10 years away 40 years ago.
These posts, aside from some hand waving, never actually attack the meat of the issue. Don't predict the demise of something unless you know what is going to kill it.
Spreadsheets work better than code in some instances. Visual data-flow (boxes and noodles) in some instances. Graphing calculators in others. Hell, even Morphic and VB have their place!
Each of these tools creates a 'function' in some way or another. The main problem I see is that it is too hard to make a function in a spreadsheet, combine it with a function made in a data-flow graph editor, and all tied together with a function made in a text coding language.
We need to make the publishing and consumption of 'functions' easier. I've started the work on the underlying framework. Please stay tuned!
There are already tools like you describe and many people use them. The issue is that these tools create vanilla, cookie-cutter solutions that rarely fit the situation precisely. If you want a truly original or custom workflow that will fit the situation precisely, then you're going to have to code to some extent. Here is an example: Artisteer is a program that bypasses the need for hand-writing markup. Sure, you made a template, but it looks like every other shitty Artisteer site. Coding will always be an advantage for those that are willing. The rest can be OK with mediocrity.
Please, this is like saying no new works of Poetry, Philosophy, Fiction, Drama, Religion will be written. Reminds me of Fukuyama's ridiculous "End of History" argument.
Regular people have a hard enough time doing every day tasks on the computer with GUIs. If we can't make regular work easy, how the hell do you expect to make programming easy?
To handle the complexity of the issues I face on a daily basis in code, I would hate to have to build things in a GUI.
I can type faster than I can search for the right button in Photoshop. Ask anyone who lives in their terminal - it will increase (rather than decrease) your productivity and agility. This isn't because the gui isn't well made.
I think a better prediction is, writing code and building constructs will be easier. Cognitive overhead will be reduced. That's the design problem - not coding.
So far visual coding has just been useful for minting more CS degrees, not actually getting stuff done. Can't really see things going this guy's direction.
This is hand-waving month or something. Oh, this is just so cute, but honestly, when you are ready to be serious and deal with complex things and not a GUI for "Grandma", then you will see this is little more than a vague performance of daydreaming. The only way for this to come to fruition is some sort of strong AI, and even then... Show some non-trivial examples, not this HTML GUI stuff.
For decades we've done coding the only way we can given the hardware available. But nowadays program sizes are approaching the amount of information stored in human DNA, which is a blueprint for a self-fueled, self-healing, self-reproducing nanotech computing machine that puts any modern software to shame. I'm beginning to think we're doing it wrong.
We still have a lot of work to do in tooling to make things simpler. Take for example Xcode. Xcode is harder to use than the old Interface Builder / Project Builder combo. We continue to add complexity that works for experts, but is a pain for onboarding people. It is amazing how many people got HyperCard, but would not be able to touch Xcode.
Why does nobody wonder that it is still required to know math to do engineering design? Or that it is required to know chemistry to create dishwashing products?
Why, then, do so many people find it strange that to write a program you must... program?
Can I see certain aspects of coding becoming automated? Certainly. And I can think of some visual ways of presenting code that are very attractive in the extra information they give to the programmer.
But something like, 'give me the set of all things that....'
How'd you do that in Excel? Probably something like SUMIF - (I don't really use Excel much so I don't know if there's something that returns a list of things matching a predicate.)
Yes, doubtless that can be expressed more concisely than the code. Heck you could wrap that up and just call it fetchset and have it go.
(set1 (fetchset (things-that) from-lst))
But there you're putting more load on the person learning the arcana of certain macros or functions - and it's honestly not that much more concise in terms of headspace, you still have to think about sets for it to make any sense at all. It's not clear how that would translate into a visual interface of any significant power or how doing so would simplify the concepts that one would have to learn.
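To make the trade-off concrete, here's roughly what that hypothetical fetchset looks like in Python, next to the plain code it wraps (all names invented):

    # Hypothetical helper: "give me the set of all things that <predicate>".
    def fetchset(predicate, items):
        return {x for x in items if predicate(x)}

    things = [3, 7, 12, 19, 25]

    print(fetchset(lambda x: x > 10, things))   # via the wrapper: {12, 19, 25}
    print({x for x in things if x > 10})        # the plain comprehension it hides

Either way, you still have to think in terms of sets and predicates.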
You know? Will programming get more efficient? Count on it. Will we automate tasks that are just following a pattern? Yeah. But will we stop coding?
Well, if you think about coding as being typing, then maybe we will. I don't think that visual interfaces are efficient enough to let you stick together all the concepts you need in a reasonable space of time without actually typing - but it's possible.
But will we stop thinking computationally? (Which is the real essence of coding, I feel.) I don't think so. It seems to me that you have to know what you want a computer to do, or you have to have the computer guess at it. And the problem with guessing at it, other than the computer being wrong, is that if you're not a precise thinker -- if you don't know fairly precisely what you want to do -- then you can't even mean certain things. Assuming quicksort hasn't been invented yet, how do you tell the computer that you want quicksort if it's guessing? What cues does the computer take from a general expression that you want a faster kind of sort?
Now whether that will reduce coding to mathematics I don't know. I'd tend to think not, if for no better reason than that programming includes a kind of coevolutionary mastery that can be a more fitting route for some people into thinking about things like transformations and sets and so on. I'm deeply indebted to programming for helping me think about maths in a way other than my teacher just reading black-box formula at the class.
I'm not an expert in "human" coding or AI, but I get the same vibe from both of them -- until you try it yourself, it seems pretty reasonable that someone will solve this problem fairly soon. And it has seemed that way for many years now.
Simple CRUD apps can already be built with simple, friendly interfaces. The simple apps will continue to get easier for non-programmers to assemble without "coding".
But real life requirements get comlicated soon -- your simple CRUD app now needs to auto-charge customers every time they foo 100 bars, unless it's been less than a month (in which case you charge a prorated total at month end), and early subscribers are grandfathered in to a simple fixed monthly charge. How will you click-and-drag that? Or explain it clearly in natural language, without multiple possible interpretations?
People make complicated stuff in Excel spreadsheets, and while it starts off as fairly vanilla, it quickly becomes something as complicated (if not more so) than writing a standard computer program in code.
Natural language serves a purpose -- mostly around navigating complicated social situations, including expressing general desires to others (who have motivations of their own, and will interpret according to their own internal rules). It's fuzzy and generally non-specific because that's how we are.
Why do we imagine our normal ways of interacting with each other can be applied to instructing a process to execute complex & strict logic?
I'm pretty sure we can make progress -- clean up the leakiest abstractions from software to APIs to hardware... -- and sort out the best way to build coding languages so they are as intuitive as possible for humans -- but I don't think "code" generally is going to disappear anytime soon.