Sorry, but what about this is "designed for humans"?
What do the keywords mean? What's the language paradigm? Why do I want this when it's essentially coalescing a lot of APIs into a language that you've provided no spec for?
Why would I want my language to work with slack?!
I'm not impressed. It just looks like another functional language with a bunch of addons tacked on to make things "easier" or "for humans".
There's plenty of meat in the blog post, on the GitHub repo, etc. I suspect it's "for humans" because it's literate, functional programming with a REPL and a natural approach to how data is managed. It's a lot more than just a functional programming language. When did it become cool to make uninformed claims about how worthless other people's work is?
Sorry, we're not trying to call you an alien, quite the contrary! We appreciate that you are a human, and we feel a lot of our tools aren't designed for people like you and me. Our argument is that even our best tools today are built with constraints imposed by the earliest computers (isolated, single core). We've moved past these limitations technically, but our ability to program to the best of our ability on these new, more capable machines (connected, multi-core) is hamstrung by restrictions still imposed by even the newest languages. Look at Swift, a brand new language that can only offer time-travel debugging in a special playground context.
What we're saying is that it's time to move past writing code for the consumption of the machine. If we want the power of computing (not necessarily programming as it exists today) to be more collaborative, and more inclusive; and to reach a wider, more diverse audience, then our tools need to start being designed for humans. Right now Eve is focusing on a table of contents, readable and relevant error messages, and clean, consistent, simple semantics. It's a modest start, but that's where we are. :)
So again, I'm very sorry we offended you, and I hope this explanation helps make our view clearer.
I think "for humans" is just intended to convey that the language and environment are based on research into human cognitive traits: how we think and learn.
As a counterpoint: the example of the document containing blocks of code inside what looked more like a Word document explains visually, to me, exactly what "designed for humans" means in this context, and I think it's well put.
That's not revolutionary at all. You can do that already with tons of environments. Here's one for Javascript, based on Markdown: https://github.com/jostylr/litpro
While I like the idea, the problem with the demo is that it has about 10 files, with a bunch of small code blocks. What does it look like in a real-world application with thousands of lines across hundreds of files?
I'm sorry, but that doesn't give the connotation that it's "designed for humans". Perhaps if they could give a clear indication of what that means other than "it's in a word document", I'd be more inclined to look at it a little harder.
Watching the video is probably the best way to get an understanding of what it's about. It's hard to describe an "experience" with text. Their video made more sense to me than most of the written explanation.
That's kind of the thing that literate programming is trying to solve[0]. But if the experts aren't able to do it for their own product, what chance does a random programmer have with their own code?
"When we communicate to one another face-to-face, we use gestures, expressions, intonation, etc. to articulate an idea. On the internet, when we communicate with just text, much of my meaning is lost forever and never apparent to anyone who reads this.
Programming is much the same way."
No, he's absolutely right; it's not clear at all how this is supposed to be human-friendly. I think it's incredibly hard to read and looks like it's going to be an unmaintainable mess by throwing random APIs like Slack into the standard library. I wouldn't say I'm salty, but I'm certainly confused as to why this is what it claims to be, rather than just a very specific beginner web dev language.
>I'm sorry, but that doesn't give the connotation that it's "designed for humans". Perhaps if they could give a clear indication of what that means other than "it's in a word document", I'd be more inclined to look at it a little harder.
To be fair, this can easily be read as bitter. "I'm sorry, but", "Perhaps if they could ... I'd be more inclined to look" both come off that way.
There are also instructions on that page you can follow (installation via pip) if you already have a reasonably modern version of Python installed, and you have an appropriate C compiler available. This is a pain to configure if you are using a Windows machine.
Assuming the 'jupyter notebook' command succeeds, a browser window should pop up, displaying a UI for manipulating individual notebooks.
If you have already successfully completed the installation, and are instead looking for guidance on using Jupyter Notebook itself, then your best bet would be to look at some of the examples: https://try.jupyter.org/
The basic premise is that we humans are cognitively built to remember and follow stories. Eve supports literate programming, so you can write code like a story.
The meat is that most programming languages are not designed for humans. Many weren't designed at all so much as hacked together for the context they were originally used in, with terrible consequences for learning or effective use by the average person. C and early PHP probably led the pack there. Many others were shaped by committee thinking, backward compatibility, the preferences of their designer, or the biases of people who previously used hacked-together languages. There are few languages where someone or a group sat down to carefully engineer them to meet specific objectives, with proven success in the field at achieving those objectives. Examples include ALGOL, COBOL, Ada, ML for theorem proving, BASIC/Pascal for learning imperative programming, and some Lisps.
So, if we're designing for humans, what does that mean? It means the language should map easily to how humans solve problems, so they can think like humans instead of like bit movers (aka machines). BASIC, some 4GLs, and COBOL were early attempts at this that largely succeeded because they looked like the pseudocode users started with. Eve appears to take it further in a few ways.
They noticed people write descriptions of the problem and the solution, then must code them up; their coding style follows the same pattern. They noticed people have a hard time managing formal specs, the specifics of computation, error handling, etc., so they changed the style to eliminate as much of that as possible while keeping the rest high-level. They noticed that declarative, data-oriented tools like SQL or systems for business records were simple enough that laypeople picked them up and used them daily without huge problems, so they built language primitives that are similarly declarative, simple, and composable.

They also noticed people like What You See Is What You Get, so they did that. Getting the code/data for something onscreen, being able to instantly go to it, modifying it with the computer doing the bookkeeping of effects throughout the program, and seeing the results onscreen is something developers consistently love, and it makes iteration and maintenance easier. Their video illustrated that changes to simple apps are painless and intuitive compared to text-based environments with limited GUI support. Their last piece of evidence was writing their toolchain in Eve in a few thousand lines of code, a great size-vs-functionality ratio, with the quip about JavaScript libraries being bigger illustrating it further.
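The SQL-like declarative point can be made concrete with a small sketch. This is plain Python, not Eve, and the data and field names are invented for illustration; it just shows the difference between spelling out bookkeeping step by step and stating what you want:

```python
# The same question asked two ways: "which orders are overdue?"
orders = [
    {"id": 1, "status": "open", "days_late": 3},
    {"id": 2, "status": "closed", "days_late": 0},
    {"id": 3, "status": "open", "days_late": 12},
]

# Imperative: spell out the bookkeeping step by step.
overdue_imperative = []
for order in orders:
    if order["status"] == "open" and order["days_late"] > 0:
        overdue_imperative.append(order["id"])

# Declarative: state what you want, like a SQL WHERE clause.
overdue_declarative = [
    o["id"] for o in orders if o["status"] == "open" and o["days_late"] > 0
]

print(overdue_imperative == overdue_declarative)  # True
```

Laypeople tend to read the second form the way they'd read a sentence, which is roughly the bet Eve is making with its declarative primitives.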
So, there's your meat. Years of doing it the common way, which wasn't empirically justified to begin with, showed that people just don't get it without too much effort, and then they keep spending too much effort maintaining stuff. Languages like COBOL, BASIC, and Python showed that changes to syntax and structure could dramatically improve learning or maintenance, even for laypersons. Visual programming systems, even ones for kids like Scratch, did even better at rapidly getting people productive. The closer the tool matches the human mind, the easier it is for humans to work with. (Shocker.) So, they're just doing similar stuff with a mix of old and new techniques, aiming for effective use of the tool with the least mental effort possible by as many people as possible, plus better results in the maintenance phase thanks to literate programming.
That's my take as a skeptic about their method who just read the article, watched the video, and saw the comment here by one of the project's people. I may be off, but it all at least looks sensible after studying the field for over a decade.
I did Smalltalk in school, and once you peel away a few layers it doesn't feel like it was designed for humans at all. Perl is also said to be designed for humans but takes a very different approach.
I guess if you approach it as a "programming language" it's odd. I understood nothing of the syntax nor the semantics. But you have to admit that the way it relates different dimensions of a "program" is much more approachable than any IDE out there, in a way that is more "humane" I guess.
Here, I think you are wrong. I strongly suspect that while the high-level features of programming languages are chosen for human consumption, the implementation details and tooling are often chosen arbitrarily, or for machine-convenience. For example, I don't generally consider language environments where the leading white space on a line matters, or languages where trailing white space matters to be "designed for humans." Computers might be good at counting spaces, but we're mediocre at visually estimating the width of a blank area in text. Such an environment asks the user to develop more machine-like skills rather than attempting to accommodate their weaknesses.
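A two-function Python sketch makes the point concrete (names are made up): moving one statement in or out by a single indentation level silently changes the program's meaning, and the reader has to count blank space to tell the versions apart.

```python
def per_iteration():
    # The append is indented inside the loop, so it runs every pass.
    results, total = [], 0
    for n in [1, 2, 3]:
        total += n
        results.append(total)
    return results  # [1, 3, 6]

def after_loop():
    # Identical code except the append is dedented one level,
    # so it runs once, after the loop finishes.
    results, total = [], 0
    for n in [1, 2, 3]:
        total += n
    results.append(total)
    return results  # [6]

print(per_iteration(), after_loop())
```

No compiler error separates the two; only the width of the whitespace does, which is exactly the kind of visual estimation humans are mediocre at.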
Or it asks the user to use tabs? Or an editor that inserts spaces when you press tab? Indentation has never been a stumbling block for anyone but the most junior of programmers.
Sure, there are all sorts of ways to lessen the pain of syntactically significant indentation. Indeed, I'd say that the annoyance can be made sufficiently small that by the time someone is an experienced enough coder to recognize that the pain has always been communally self-inflicted, they're too used to it to care.
Do also note that I'm not saying that indentation is by itself bad.
It does more of the heavy lifting automatically. For example, rather than having to explicitly build a data structure to keep track of events that have happened, or build some message bus to receive and react to them, Eve allows you to express the fact that you want to react to them, and its runtime takes care of the rest.
How well this scales and remains available, well, that's an implementation challenge, but the user interface looks very convenient. It is potentially a higher level of abstraction over current "high level programming", just as high level programming was over assembly.
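As a loose analogy (plain Python, not Eve; the `on`/`commit` names are invented), the difference is between hand-wiring producers to consumers and just declaring a reaction while a runtime does the dispatch, which is roughly what Eve's search/commit blocks appear to do:

```python
# A toy "runtime": blocks declare which records they react to,
# and committing a record dispatches to them automatically.
handlers = []

def on(tag):
    """Register a block that reacts to records carrying a given tag."""
    def register(fn):
        handlers.append((tag, fn))
        return fn
    return register

def commit(record):
    """Produce a record; the runtime finds every interested block."""
    for tag, fn in handlers:
        if record.get("tag") == tag:
            fn(record)

log = []

@on("click")
def track_clicks(record):
    # This block never references the producer directly.
    log.append(f"clicked {record['element']}")

commit({"tag": "click", "element": "submit-button"})
print(log)  # ['clicked submit-button']
```

The producer and the reacting block know nothing about each other; in Eve the language itself plays the role of this registry, so neither the data structure nor the dispatch loop appears in user code.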
What you describe is normally handled by any number of perfectly good libraries. The selling point here seems to be "we've thrown a bunch of random libraries together in the standard library", which isn't a super compelling argument to me.
Could you share an example library or framework that allows a programming model similar to what they demonstrate in the video? I'd like to learn more.
I am aware of frameworks that can pass information around in similar ways, but only with a lot more ceremony of the sort that's not necessarily a benefit. Observe specifically the code blocks with "search" that react to events without needing to be connected in any direct way with event production; and event production doesn't need to involve data structures or storage.
Maybe this model won't work in applications of meaningful scale, but maybe it will.
The way I see it, languages are only one projection of what a program is. The missing parts are what confuse "normal" people (unlike those of us who spent long hours learning how to map things in our heads). This project reminds me of AOP and IBM's old multi-dimensional separation of concerns work. Being able to go back and forth between different views of the program helps tremendously.
Drop the buzzwords and get to the meat please.