Could everyone on HN just take a course in language theory so we can all stop with these stupid trolls about the best language that have been cropping up for a week now?
Hopefully it would allow everyone to realize that a language is just some syntax and semantics and that a compiler is just a program like any other. Nothing sacred here. Hell, you can even do imperative programming in Haskell if you wish. Coding an interpreter to do so is not particularly difficult.
With a bit of luck, everyone would understand at the same time that the compiler is building what is executed, therefore the expression "speed of a language" is utter nonsense. You can only speak of the speed of an implementation, which, yes, varies widely amongst compilers for the same language.
So, yes, Haskell semantics encourage functional programming, C semantics imperative programming, both are Turing complete, and yes, GCC is currently better at optimising C code than any Haskell compiler. Nothing new under the sun. Can we all go back to a normal activity now?
> So, yes, Haskell semantics encourage functional programming, C semantics imperative programming, both are Turing complete and yes GCC is currently better at optimising C code than any Haskell compiler
No. That is not why C is winning. C is winning because of fundamental differences between its semantics and Haskell's that make it possible to implement C with less overhead on actual CPUs.
If the lesson you took from your language theory course is that all Turing-complete languages are equivalent, you should ask for your money back. While it is true that they are all capable of expressing any algorithm, there are fundamental differences that deeply affect the efficiency of implementing them: http://news.ycombinator.com/item?id=5082609
I'm trying to translate what you are saying. All languages are Turing complete, so it doesn't matter? Do you program in assembly language then? I mean, if it doesn't matter....
> All languages are Turing complete, so it doesn't matter?
Precisely. The whole "my language is better" argument is completely void. Syntax is mostly a matter of preference. Semantics will make the structure of your program different, but in the end there is no actual difference in what you can do, only in how you will do it. Once again, it mostly boils down to preference.
If you really need to argue about something, go for "my compiler/optimisation pass is better"; that at least makes sense. Note that there is no theoretical bound preventing a Haskell compiler from generating code as fast as that of a C compiler for any program.
> Do you program in assembly language then?
I did when I had to (quite a long time ago). While languages don't matter, compilers do when you need efficiency. While I enjoy ML-based languages more, I also did some C. As I stated before, C compilers are better. I even used my own little syntax extension of C for a while (if you don't have to share the code, you can do whatever you want with it).
And to paraphrase Feynman, mathematics is mostly the invention of better syntax. One thing I'd say which distinguishes Haskellers is that they see their programming language as a piece of mathematics. A Haskell program is a bunch of equations, and the semantics are given by domain theory.
> Note that there is no theoretical bound preventing a Haskell compiler from generating code as fast as that of a C compiler for any program.
And there's no theoretical bound saying that C is faster than Brainfuck. But no Brainfuck implementation will ever catch C for, say, parsing, and I'd be willing to put money on that.
This "all Turing-complete languages are on equal performance footing" is nonsense. Just because you can't write a proof that shows a result mathematically does not mean that a difference does not exist.
>>"Precisely. The whole "my language is better" argument is completely void."
You say later that the difference is "only in how you will do it". If you stop to think for a second, you come to realise that syntax, abstractions, and the big elephant in the room (how "tuned" a language is for the myriad of computing platforms/operating systems/domains: scientific, business, web), among many other factors, play a HUGE part in how you can develop software and how performant it can/will be. So languages do matter.
>>"Note that there is no theoretical bound preventing an Haskell compiler to generate code equally fast than the one of a C compiler for any program."
But it won't, because at the end of the day a machine has to execute the programs written in a language, and C was designed with a machine of certain characteristics in mind, while Haskell/Lisp and other higher-level languages are an exploration in abstractions.
When Abelson & Sussman say "Programs must be written for people to read, and only incidentally for machines to execute.", there is an implicit understanding that trade-offs are being made, and it's not the same as with C and other lower-level programming languages. Until a machine appears that isn't limited by the constraints of the present, making trade-offs is inevitable, and that too does matter.
@jacquesm: Great write-up. Enjoyed every line of it.
P.S: I make no distinction between a language and its implementation here.
While I halfway agree here, syntax follows semantics.
If a language gives you syntactic sugar for something that can only be expressed through many lines of code in another, then it makes a great deal of difference.
If one language cannot be better than another, why is it so hard to name examples of BF programs that paid their own development costs and decades of research costs? I can think of a language for which there is such an example, by the way:
>Precisely. The whole "my language is better" argument is completely void. Syntax is mostly a matter of preference. Semantics will make the structure of your program different, but in the end there is no actual difference in what you can do, only in how you will do it.
That is extraordinarily wrong.
Syntax:
1) affects the structure of a program,
2) affects how we think about it,
3) affects what is easy to do and what is not,
4) adds or removes mental strain on reasoning about the program,
5) forbids or enables certain kinds of errors.
And much more.
The idea that syntax doesn't matter and is "a matter of preference" is completely bogus. The human mind has certain limitations and capabilities. Some syntaxes cater to that, some do not.
No one using Brainfuck will ever be as productive as someone using Python, to take an extreme case.
> No one using Brainfuck will ever be as productive as someone using Python, to take an extreme case.
Isn't it a good thing then, that no one prefers using Brainfuck to using Python at work? Leaving things up to preference doesn't mean that all the choices are the same. It means that the languages and the use cases are so varied that it isn't a good idea to restrict oneself blindly to a language before evaluating the use case. It doesn't seem like you are really in disagreement.
>It doesn't seem like you are really in disagreement.
Oh, I'm very, very much in disagreement.
It's not just about "preference" as if preference is arbitrary.
Superficial syntactic preference IS arbitrary.
Substantial syntax issues are not.
Brainfuck is not left alone just because people don't "prefer" its syntax. That's reading it in reverse. People don't prefer Brainfuck's syntax because it is objectively bad, for reasons related to human psychology, cognition, etc.
E.g. it makes it measurably difficult to discern different program states.
It amazes me how badly prepared developers seem to be nowadays in compiler design and language theory.
Back in the day we were able to understand that language and implementation are separate concepts. As well as why certain languages had a specific implementation as the default one.
Now young developers seem to think language and implementation are the same thing.
Many languages go so far as to specify a virtual machine to run the compiled program, so your statement doesn't make much sense in that context. Languages, machines, libraries and implementation are all deeply interconnected. Try writing C without a stack. Try writing Java without java.lang.
Thank you for brilliantly proving his point. What you wrote just proves that you completely mixed up what a language is and what its implementation is.
Of course C's semantics use a call stack in their definition. Does that mean you need a physical stack to implement it? Not at all; you can perfectly well implement C on anything, as long as you can simulate a stack. Just look at the JVM and Dalvik: Dalvik is register-based while the JVM is stack-based, and they both run Java.
java.lang is a standard library. It's only part of the language definition in a really broad sense. It's actually written in Java. Any complete Java compiler can and will compile it (with some modification for the purely JVM-oriented functions). There are a lot of projects out there trying to compile Java for a target which is not the JVM.
Of course languages define virtual machines, or, in languages like C, an abstract machine model, but that is only how the developer sees the machine through the language's semantics.
This does not change in any way whatsoever how the language is actually implemented.
This discussion is derailing. From the top-level comment:
Hopefully it would allow everyone to realize that a language is just some syntax and semantics and that a compiler is just a program like any other.
Which is true. However, this glosses over the fact that it is prohibitively hard to write efficient compilers for some languages. For instance, Prolog's semantics are so far removed from our machines that it is terribly hard to make it efficient for general purpose programs.
Semantics do make some languages harder to compile to efficient machine code than others.
I think it is usually easy to determine by context when someone is referring to a language and when someone is referring to the typical implementation, libraries, etc that come along with a language. In the context of performance benchmarking it is clear (to me at least) that the benchmark would be of a particular implementation, of which there is usually an obvious choice. I don't think there is really any misunderstanding of that.
This is totally wrong. Different languages have different levels of power. Sure, you could always write an interpreter, but that's exactly the difference between a powerful language and a weak one: to do X, will I need to write a whole interpreter here, or can the language just do it?
Lisp is so powerful because you can do things like delay the evaluation of arguments, via macros. With Haskell you don't even need this, due to the lazy evaluation.
Some people like bickering. But I think the point of the article is that C is, by nature of the language, easier to optimize than Haskell, since it is lower level.
> C is, by nature of the language, easier to optimize than Haskell, since it is lower level.
Depends what level you mean: if you want to do all the optimisations yourself, C is better; but if you want to write code and let the compiler handle the details, I would think higher-level is better (you can just write doWhatIMean() and the compiler will automatically choose the optimal implementation for your current problem and platform, whereas if you'd specified the implementation details yourself, you'd be sub-optimal in many cases).
> "(you can just write doWhatIMean() and the compiler will automatically choose the optimal implementation for your current problem and platform - where if you'd specified the implementation details yourself, you'd be sub-optimal in many cases)"
How far are we with declarative programming, in the sense of being expressive and expecting the compiler to understand what we mean? I'd believe that without strong AI, a human will always be better off with imperative programming than a compiler with declarative, for anything beyond relatively simple, isolated pieces of code/algorithms. In general we have so much more experience and knowledge about the target system and environment than the compiler does; this is less true for JIT compilers, but they come with their own overhead.
I'd really love to see superoptimization[1] done as a service for compilers. Say you have a function with certain semantics the compiler is fully certain about. The compiler fingerprints this function and uploads the fingerprint plus a behavioural analysis to <somewhere in the cloud>, where it is exhaustively optimized by brute-forcing all the meaningful instruction sequences which conform to the semantics of the function. After the most optimal sequence of instructions is found, it's associated with the fingerprint, stored in a database, and returned to the compiler. Now, whenever someone else writes the exact same function (or code with the exact same semantics), the compiler queries the <some cloud service> for the optimal piece of code. Of course, a system like this would need more information about the actual target of the code (CPU architecture, cost of things like memory access, cache misses, branch misprediction, etc.).
I disagree. Have you read any articles on stream fusion in Haskell? You can tell the compiler that certain actions are equivalent, and suddenly code that makes two passes down a tree makes just one pass. Doing the same thing in C would be a lot more work, precisely because you're working at such a low level.