Hacker News | d-us-vb's comments

Because CodePens can run JavaScript. And if a page has 50 of them, it might make the page load time much longer. I know that all these examples are pure CSS, and maybe there is a setting in CodePen to disable the "Run" button and automatically run it. Still, getting to decide is generally a better pattern than presuming that that's what the user wants, especially when the fact that the code is inside a CodePen makes it explicitly not an integral function of the page. "I thought this was just a blog, and now you want me to run all this javascript??" -- some JS hater, probably.

I appreciate getting to choose as much as possible when code runs.


Somewhat ironically, CodePen ended up introducing the JS execution requirement to view the content.

It's easy enough to interpret Peanuts as being that. But Charles Schulz was not trying to present that. He was presenting the world as it is, and how one person can still maintain his optimism in spite of all that. This is made abundantly clear in some of the other strips, like the Father's Day strip where he explains to Violet that no matter what, his dad will always love him, and he doesn't care that Violet's dad can buy her all the things.

Schulz was a relatively devout Presbyterian (though still very much a free thinker who criticized the direction American Christianity was going and its attitude toward the various wars of the '60s-'80s). He was incredibly optimistic about humanity, but he showed in Peanuts the reality of our "default" state, especially among kids.

Keep in mind that these are all 2nd and 3rd graders in the story.


Calvin is such an interesting character. He never "learns", similar to Charlie Brown, but his outlook is that of a scientist who just wants to "see what'll happen". Anything to occupy his hyperactive mind, whether it be Spaceman Spiff or a trip to the Triassic, or, closer to reality, pranking Susie Derkins or trying to get the better of Moe (or Hobbes, for that matter). He's not optimistic, but cynical. But his cynicism is irrelevant because he's driven by his avoidance of boredom.

To this day, the C&H strip I remember most is https://cl.pinterest.com/pin/313633561533127275/

Today I learned that I would make a terrible detective!

When I watched Broadchurch with my family, I thought he was doing a fine job at getting to the bottom of the case. Goes to show how much crime drama I watch.

I see now that Tennant's character's actions are a plot device to reveal the drama amongst the other characters, not the workings of a good detective.


> No, the best thing you can do for simplicity is to not conflate concepts.

This presumes the framework in which one is working. The type of a map is and always will be the same as the type of a function. This is a simple fact of type theory, so it is worthwhile to ponder the value of providing a language mechanism to coerce one into another.

> This is cleverness over craftsmanship. Keeping data and execution as separate as possible is what leads to simplicity and modularity.

No, this is research and experimentation. Why are you so negative about someone’s thoughtful blog post about the implications of formal type theory?


> This presumes the framework in which one is working.

One doesn't have to presume anything, there are general principles that people eventually find are true after plenty of experience.

> The type of a map is and always will be the same as the type of a function. This is a simple fact of type theory, so it is worthwhile to ponder the value of providing a language mechanism to coerce one into another.

It isn't worthwhile to ponder because this doesn't contradict or even confront what I'm saying.

> No, this is research and experimentation.

It might be personal research, but people have been programming for decades and this stuff has been tried over and over. There is a constant cycle where someone thinks of mixing and conflating concepts together, eventually gets burned by it and goes back to something simple and straightforward. What are you saying 'no' to here? You didn't address what I said.

You're mentioning things that you expect to be self evident, but I don't see an explanation of why this simplifies programs at all.


> One doesn't have to presume anything, there are general principles that people eventually find are true after plenty of experience.

I guess I just disagree with you here. Plenty of programmers with decades of experience have found no such general principle. There is a time and place for everything and dogmatic notions about "never conflate X and Y" because they're "fundamentally different" will always fall flat due to the lack of proof that they are in fact fundamentally different. It depends on the framework in which you're analyzing it.

> It isn't worthwhile to ponder because this doesn't contradict or even confront what I'm saying.

This is a non sequitur. What is worthwhile to ponder has no bearing on what you say. How arrogant can one person be?

> It might be personal research, but people have been programming for decades and this stuff has been tried over and over.

Decades? You think that decades is long enough to get down to the fundamentals of a domain? People have been doing physics for 3 centuries and they're still discovering more. People have been doing mathematics for 3 millennia and they're still discovering more. Let the cycle happen. Don't discourage it. What's it to you?

> You're mentioning things that you expect to be self evident, but I don't see an explanation of why this simplifies programs at all.

It may not simplify programs, but it allows for other avenues of formal verification and proof of correctness.

----

Do you have other examples of where concepts were conflated that ended up "burning" the programmer?


> What is worthwhile to ponder has no bearing on what you say.

Ponder all you want, but what you said wasn't a reply to what I said.

> Decades? You think that decades is long enough to get down to the fundamentals of a domain?

It is enough for this because people have been going around in circles constantly the entire time. It isn't the same people, it is new people coming in, thinking up something 'clever' like conflating execution and data, then eventually getting burned by it when it all turns into a quagmire. Some people never realize why their projects turned into a mess that can't move forward quickly without breaking, or can't be learned without huge effort because of all the edge cases.

> It depends on the framework in which you're analyzing it.

No it doesn't. There are a bunch of fundamentals that are already universal that apply.

First is edge cases. If you make something like an array start acting like a function, you are creating an edge case where the same thing acts differently depending on context. That context is complexity and a dependency you have to remember. This increases the mental load you need to get something correct.

Second is dependencies. Instead of two separate things you now have two things that can't work right because they depend on each other. This increases complexity and mental load while decreasing modularity.

Third is that execution is always more complicated than data. Now instead of something simple like data (which is simple because it is static and self-evident), you have it mixed with something complicated that can't be observed unless it runs and all of the states at each line or fragment are observed. Execution is largely a black box, data is clear. Mixing them makes the data opaque again.
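For what it's worth, here is a small Haskell sketch of the contrast being drawn (my own illustration, assuming only the standard Data.Array module that ships with GHC): the array is data that describes itself, while a function with the "same" contents can only be observed by running it over a domain you already have to know.

    import Data.Array (Array, listArray, bounds, elems)

    -- Data: a concrete array that can be printed, measured, serialized.
    arr :: Array Int Int
    arr = listArray (0, 4) [1, 4, 9, 16, 25]

    -- Execution: a function with the "same" contents, but no bounds to
    -- query and no way to enumerate it without supplying the domain.
    g :: Int -> Int
    g i = (i + 1) * (i + 1)

    main :: IO ()
    main = do
      print (bounds arr)      -- (0,4): the data describes itself
      print (elems arr)       -- [1,4,9,16,25]
      print (map g [0 .. 4])  -- same values, but only by running g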


It is clear now that you don't understand the claim. It sounds like you're conflating pure and impure functions. It's obvious from context (did you even read the blog post?) that the title of the blog post is referring to pure functions.

You obviously can't treat an impure function as an array and no one would ever claim that. The blog itself isn't claiming that either given that the author is commenting on a nugget from Haskell documentation, and the author is explaining design choices in his own pure functional language.

Your three points only make sense if your definition of "function" allows side effects. If we're talking about pure functions, then due to referential transparency, arrays are in fact equivalent to functions from contiguous subsets of the integers to another type, as the Haskell documentation indicates.
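As a minimal sketch of that equivalence (my own illustration using the standard Data.Array module, not anything from the blog post itself): an immutable array and its partially applied indexing operator are interchangeable at call sites.

    import Data.Array (Array, listArray, bounds, (!))

    -- A pure array over the contiguous index range [0..2].
    xs :: Array Int Char
    xs = listArray (0, 2) "abc"

    -- The same array viewed as a function from that index range.
    f :: Int -> Char
    f = (xs !)

    main :: IO ()
    main = print (map f [0 .. snd (bounds xs)])  -- prints "abc"

Referential transparency is what makes the two views interchangeable: f 1 can be replaced by 'b' anywhere without changing the program's behavior.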


> It sounds like you're conflating pure and impure functions.

No I'm not, this applies to both in different ways.

> If we're talking about pure functions, then due to referential transparency, arrays are in fact equivalent to functions

Never ever. You're talking about a function that generates data from an index, which is trivial to make. Just because it is possible to disguise it as an array in Haskell, C++ or anything else doesn't mean it is a good idea. An array will have vastly different properties fundamentally and can be examined at any time because the data already exists.

Confusing the two is again conflating two things for no reason. Making a function that takes an index and returns something is a trivial interface, there is no value in trying to mix up the two.

Evidence of this can be found in the fact that you haven't tried to explain why this is a good idea, only that it works under Haskell's semantics. Being clever with semantics is always possible; that doesn't mean conflating two things and hiding how things actually work is a good idea.


Notice that I never once in this discussion claimed that it was a "good idea". This is about research and experimentation. The blog post has the title because of the Haskell documentation. Also, claiming that the lack of an argument is evidence to the contrary is an argument from silence, a logical fallacy.

The good idea surrounding this isn't about treating functions as data, but maintaining the purity of the type system and allowing the implications of the type system to run their course. You seem to be a very pragmatic programmer, and so the way Haskell builds up its own universe in terms of its own type system and Hask (the pseudo-category of Haskell objects) probably seems pointless to you. I can't say for certain, though.

I completely reject most of your claims because they appear to be incoherent within the framework of type theory and functional programming. It looks like you're using reasoning that applies only to procedural programming to attempt to prove why an idea in functional programming is bad.


> This is about research and experimentation.

Haskell is 36 years old. It isn't research and experimentation any more, it is ideas that everyone with experience has had a look at. Some people might be learning haskell for the first time, but that doesn't mean it's still research, that all happened decades ago.

> The good idea surrounding this isn't about treating functions as data, but maintaining the purity of the type system and allowing the implications of the type system to run their course.

And what are the benefits here? How do they help the things I talked about? How do they avoid the pitfalls?

> Also, claiming that the lack of an argument is evidence to the contrary is an argument from silence, a logical fallacy.

Not really, because if you could contradict what I've said you would have.

> within the framework of type theory and functional programming.

People can gather in a room and tell each other how great they are and how they have all the answers, but in 36 years there is a single email filtering program that was made with Haskell.

> you're using reasoning that applies only to procedural programming to attempt to prove why an idea in functional programming is bad.

I explained why this is a bad idea from fundamental and universally agreed upon ideas about the best ways to write software.

Functional programming languages had lots of ideas and features that turned out to work well. That doesn't mean that conflating two completely separate concepts is a good idea, no matter what 'type theory' someone comes up with to support it.

Saying something is good because Haskell says it is good is circular religious thinking. These two things aren't the same, they are literally two different things, and trying to unify them doesn't make life easier for anyone. It's just programming cleverness, like the 'goes to' operator '-->'.


As a young Linux user I always hated the experimentation aspect because usually it meant just straight up getting the command wrong 5 times before trying to read the man page, thinking I understood what the man page meant, trying again another 5 times and then giving up.

This idea of experimenting and getting instant feedback is just survivorship bias for a certain type of person, not “the way we ought to teach Unix shell”

This view is corroborated by the research summarized and presented in The Programmer's Brain by Felienne Hermans.


> usually it meant just straight up getting the command wrong 5 times before trying to read the man page, thinking I understood what the man page meant, trying again another 5 times

I think that is a developer's superpower. The poncy term for it is grit. I tell others that the secret to learning computers is frustration and persistence.

> and then giving up.

Knowing when to stop or change direction is hard.

I've definitely wasted years of work failing to solve something that I eventually had to give up on (most memorably depending on nasty Microsoft products).

But I've also been paid very nicely because I've solved problems that others struggled with.

And I was paid for the failures too.


I've been a sysadmin for a quarter century and I've always said my only real superpower is that I read error messages when they appear, something none of my non-admin coworkers can do, for some reason.

A smart, and by no means technically incompetent, friend of mine struggled to get a software product to run. He would reinstall it, try different configurations, uninstall plugins, yet every time he tried running it, the program gave the same error message. I asked him to read the error, and it very simply stated that it was failing to create symlinks because the disk was formatted with ExFAT. Had he just piped this message into ChatGPT, he would have avoided hours of debugging.

Does copy-pasting them into Google count?

Sure, about 3/4 of the time the answer is on the first page of results

I consider myself a fairly good developer, and I think that's in large part due to knowing, "doing this should be possible, and the reason it's not working right now is just due to stupidity (my own or the developer of whatever I'm using's)". But yes, in a few (thankfully rare) cases it just plain isn't practically possible. Even then, I've given up on problems just to have it nagging in the back of my mind and then randomly coming up with a beautifully simple solution weeks later. That's sort of the essence of what I like about programming (and math too).

> the secret to learning computers is frustration and persistence

And if persistence fails, bring out the big gun: belligerence.


Grit is something you gain once you already have an intrinsic motivation, such as already having a belief you can do this. Something has to spark in people that they’re capable in the first place.

People forget that struggling is part of the learning process. It's great that people want to make things easier to learn, but struggling is essential. You want to ensure that people don't struggle so much they get stuck (one extreme), but you don't want to make it so easy people don't struggle at all (the other extreme). There's balance, but that balance requires struggling.

I think this is one of those statements that sounds reasonable on the surface but if you read it over a few times it doesn’t say anything concrete enough to pin down anything that could be refuted, even though I think the vibe is off.

So in return I’ll share my vibe, which is that my point was a small amount of struggle can be good once people are already determined to learn something, perhaps because they have found a spark for it. But before they’ve found their spark, all it does is turn people off. And in general, I don’t think struggle is essential at all. In fact I’ve learned a great many things successfully without struggling.


You must have a different brain than mine. I'm finding (and increasingly as I age), that the act of learning feels inherently uncomfortable. Like my brain really wants to use its existing toolset instead of learning something new, and is saying so quite loudly. A similar discomfort happens with exercise. If there's zero struggle to lift the weight, then your muscles aren't really developing. I think this is pretty well-known and well-documented.

So how do you learn without struggle? Are you being spoonfed the material on a learning happy path so you happen to never make a mistake and thereby have to redo your work? Do you not experience effort and mistakes and frustration as 'struggle'? After you're "done" learning a particular skill without struggle, what happens when you have to apply that skill at a higher level than you learned it at? Is it just joy and rainbows all the time?


When you read a novel, you’re learning the plot, the intricate relationships between the characters, forming opinions about the current and future state of the fictional world, that sort of thing. Does this feel like a struggle to you? It’s still learning, but I don’t find it a struggle, mostly because I’m enjoying the process and the experience.

A great many things can be learned in this way. Not everything I’ll grant you, I haven’t found a way to learn a language or complex mathematical topics without struggle yet. But even in my software development work if something is a joy to learn I still learn it, no struggle required.

Kids learn a lot through play, and play isn’t a struggle for kids.

Lots of examples.


I don't think it's just age, and I think the comparison to working out is apt. To gain muscle you need to get tired. To restate my previous comment in this framework: to gain muscle you need to struggle; you can go too far and injure yourself but neither is there exercise that is effortless.

I'm not sure how the gp even reasons like they do. What does effortlessly learning a new skill look like? You're just instantly good at it? The logical conclusion here is that either: 1) they're so galaxy brain that nothing is hard or 2) they're incrementing so minutely that the progress is so smooth sailing that they are able to fool themselves into not believing there ever was a struggle.

If it's the first option, we have to consider their morals, as they could save countless lives and thrust humanity generations ahead technologically, due to their ability to solve problems us mere mortals struggle with.

Personally I'm much more believing of option 2 as it makes the most sense if we consider the computational requirements for increasing precision along with our current understanding of human psychology to create these types of mental defenses as remembering the struggle can deter us from doing it again. But mostly I'm sold on option 2 because if they were so galaxy brained they'd be cognizant of the fact that the rest of us aren't and we wouldn't be having this conversation.

But hey, maybe I'm the one fooling myself here. Maybe the gp is just god. They could just be omnipotent and not omniscient. In which case we've answered the AGI superintelligence problem.


Effortless and without struggle are two very different things. Walking a kilometre or two to the shops isn't a struggle, but it's not effortless either. It takes effort but it's pleasant and not a struggle at all. Running would be a struggle. Effortless would be having someone drive you there. See what I mean?

  > I think that is a developer's superpower.
I do too, but only because we can do both.

I think comparing math education to programming education is quite apt here. After all, programming is math[0]. Both are extremely abstract subjects that require high amounts of precision. In fact, that's why we use those languages![1]

One of the absolute most difficult parts of math is that you don't have feedback. Proving you did things correctly is not automated. You can't run it and have an independent (not you) mechanism tell you that the output is what you expect it to be. This leads to lots of frustration as you sit there thinking very hard about what you've done wrong. It is frustrating because you're often blind to the mistakes as that's why you've made them in the first place! But the upside is that you quickly become attentive to details and learn those pitfalls very well. This also means you can abstract very well (the entire point of math) as you learn to abuse things on purpose. The struggle is real, but the struggle is important to the learning process. You learn very little if there's no struggle. Your mind is made to remember things that are hard better than things that are easy.

In programming we typically have the opposite problem. You get instant feedback. This makes iteration and solving your specific problem much faster. You can experiment and learn faster as you poke and prod, seeing how the output changes. BUT there is a strong tendency to leverage this too much and use it to outsource your thinking and analysis. Iterating your way to success rather than struggling and analyzing. This doesn't result in as strong neural pathways, so you don't remember as well and you don't generalize as well. Having taught programming I can tell you that countless students graduate from university[2] thinking that because the output of their program is correct, the program itself is correct. This is a massive failure in logic. It's much easier to see in math: just because 3+3=6 and 5+1=6 doesn't mean that the processes are equivalent[3]. The correctness of the program is the correctness of the process, not the correctness of the output.
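A tiny, purely hypothetical sketch of that output-versus-process point: a function whose output is right for the one input a student happens to test, even though the process is wrong.

    -- Intended: double a number. The process is wrong (it adds 2 instead
    -- of multiplying by 2), yet the single tested case still "passes".
    double :: Int -> Int
    double x = x + 2

    main :: IO ()
    main = do
      print (double 2 == 4)   -- True: correct output, wrong process
      print (double 5 == 10)  -- False: the wrong process shows elsewhere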

While that's the typical outcome of learning programming, it isn't a necessary outcome and there's nothing stopping anyone from also using the same approach we use in math. Math is only that way because we're forced to[4]! Both have their strengths and weaknesses, but neither is strictly better. The strictly better learning path is the combination and that is the superpower we have. It's just easy to abdicate this power and only do the easy thing.

[0] We can really say this from Church-Turing but if you really have concerns with this statement you'll need to read more up on the theory of computer science and I suggest starting with lambda calculus.

[1] https://www.cs.utexas.edu/~EWD/transcriptions/EWD06xx/EWD667...

[2] and even graduate degrees. You'll also see this idea prolific on HN and I'm sure someone will respond to this comment countering the point

[3] You can abstract this, I'm not going to do some long calculation here but I'm sure most people have done a long calculation where they got the right answer but the process was wrong and they had a professor still mark them wrong (and correctly so).

[4] If you're going to lean on me I'll concede


Maybe I am wrong about this but I think a lot of recent research has shown that trial and error is a great way to learn almost everything. Even just making an educated guess, even if it is completely wrong, before learning something makes it much more likely that you remember and understand the thing that you learn. It’s a painful and time-consuming way to learn. But very effective.

Maybe Linux commands is a little different but I kinda doubt it. Errors and feedback are the way to learn, as long as you can endure the pain of getting to the correct result.


Needs qualification. Research shows trial and error learning is very durable, but it’s not the most time efficient (in fact it’s relatively poor, usually, on that front). The two concepts are a bit different. Yes, trial and error engages more of the brain and provides a degree of difficulty that can sometimes be helpful in making the concepts sticky, but well designed teaching coupled with meaningful and appropriately difficult retrieval and practice is better on most axes. When possible… good teaching often needs refinement. And you’d be surprised how many educators know very little about the neuroscience of learning!

> And you’d be surprised how many educators know very little about the neuroscience of learning!

I'm (pleasantly) surprised every time I see evidence of one of them knowing anything about it.


At the university level in the US, few faculty get any kind of training before they are expected to start teaching. And the teaching requirement is more or less “do no harm.” If you’re at a research university, which includes many publicly funded universities, then your career trajectory is based almost exclusively on your research output. I could go on, but it suffices to say that it’s not surprising that the teaching could be better.

That said, most institutions have teacher training resources for faculty. I was fortunate to be able to work intensely with a mentor for a summer, and it improved my teaching dramatically. Still, teaching is hard. Students sometimes know—but often don’t know—what is best for their learning. It’s easy to conflate student satisfaction with teaching effectiveness. The former is definitely an important ingredient, but there’s a lot more to it, and a really effective teacher knows when to employ tools (eg quizzes) that students really do not like.

I am frequently amused by the thought that here we have a bunch of people who have paid tons of money, set aside a significant fraction of their time, and nominally want to learn a subject that they signed up for; and yet, they still won't sit down and actually do the reading unless they are going to be quizzed on it.


> the thought that here we have a bunch of people who have paid tons of money, set aside a significant fraction of their time, and nominally want to learn a subject that they signed up for; and yet, they still won't sit down and actually do the reading unless they are going to be quizzed on it.

How often have they put down the money, as opposed to their parents?

How often do they actually care about learning the subject, as opposed to be able to credibly represent (e.g. to employers) that they have learned the subject?

How often is the nominally set-aside time actually an inconvenience? (Generally, they would either be at leisure or at the kind of unskilled work their parents would be disappointed by, right?) My recollection of university is that there was hardly any actual obligation to spend the time on anything specific aside from exams and midterms, as long as you were figuring out some way or other to do well enough on those.


I suppose I should have said "nominally want to learn" etc., but I think you are right: most students simply want the credential. I maintain that this is still a strange attitude, since at some point, some employer is going to ask you to do some skilled work in exchange for money. If you can't do the work, you are not worth the money, credentials be damned. On the other hand, I routinely see unqualified people making a hash out of things and nobody really seems to care. Maybe the trick is not to be noticeably bad at your job. Still, this all strikes me as a bad way to live when learning and doing good work is both interesting and enjoyable.

Trial and error is necessary and beneficial, but not after the student becomes frustrated or anxious/bewildered by the complexity. The research shows that striking a balance between teacher intervention and trial and error is the optimal approach. If a teacher notices that a student is way off course but they keep persisting in one branch of the trial-and-error search space, it’ll be best if they intervene and put the student on the right branch. The student can still use the knowledge of what wasn’t working to find the solution on the right branch, but just persisting would be ineffective.

Gaining true understanding/insight is necessarily trial and error. Teachers cannot teach insight. But they can present the optimal path to gain insight.


Trial and error was the root of what became my IT career. I became curious about what each executable did from DOS and with that did my first tweaking of autoexec.bat and config.sys to maximise memory. Years later I was the only one who could investigate network (and some other) problems in Windows via the command line while I was the junior of the team. Ended up being the driver of several new ways of working for the department and company.

Ditto. I found that people whose attitude was “let’s just try it” tended to be a lot more capable and effective. Nevertheless the prevailing wisdom when I was in IT was that if you had a problem that didn’t have an obvious solution, you had to purchase the solution.

Sounds very profitable for whoever is selling solutions, I wonder if perhaps they also provide wisdom as a loss leader.

I'd add nuance to Hermans' work. It's not all experimenting blind, but also not feedback-less. They advocate for "direct instruction", not just rote learning.

> As that is not a surprise, since research keeps showing that direct instruction—explanation followed by a lot of focused practice—works well.

Note the "lot of focused practice".

[0] https://www.felienne.com/archives/6150


There’s a pretty rich literature around this style of pedagogy going back for decades and it is certainly not a new idea. My preferred formulation is Vygotsky’s “zone of proximal development” [1], which is the set of activities that a student can do with assistance from a teacher but not on their own. Keeping a student in the ZPD is pretty easy in a one-on-one setting, and can be done informally, but it is much harder when teaching a group of students (like a class). The latter requires a lot more planning, and often leans on tricks like “scaffolded” assignments that let the more advanced students zoom ahead while still providing support to students with a more rudimentary understanding.

Direct instruction sounds similar but in my reading I think the emphasis is more on small, clearly defined tasks. Clarity is always good, but I am not sure that I agree that smallness is. There are times, particularly when students are confused, that little steps are important. But it is also easy for students to lose sight of the goals when they are asked to do countless little steps. I largely tuned out during my elementary school years because class seemed to be entirely about pointless minutiae.

By contrast, project work is often highly motivational for students, especially when projects align with student interests. A good project keeps a student directly in their ZPD, because when they need your help, they ask. Lessons that normally need a lot of motivation to keep students interested just arise naturally.

[1] https://en.wikipedia.org/wiki/Zone_of_proximal_development


I'd like to add that, while anything will have some learning friction, learning the Unix CLI is rather unnecessarily painful.

I actually feel like the Unix/GNU CLI is quite nice (yes, I'm used to it already). I feel like it provides a lot of consistency through community standardization and standardization through POSIX and libraries. For example, it's quite difficult to find a program that breaks the "-o / --option" short/long option convention, and if you do, the "man command" or "info command" pages will tell you how to use the program. In my experience this is quite different on, for example, Windows.

Learning it is a step but once you've learned the basics you can read 90% of the commands.


I’m curious: what do you see as unnecessary about the CLI? Or, to put it another way, in what way should the CLI be changed so that the only remaining difficulties are the necessary ones?

I'm not qualified to give a complete answer, but I think two main issues are the proliferation of flags in standard tools (e.g. ls has a lot of flags for sorting behavior) and the extreme preference for plain text. Text is very useful, but a lot of semantic information gets discarded. Representing structured data is painful, stdin/stdout/stderr are all in one place, window resizing makes a mess sometimes (even "write at end of line" isn't given), and so on. I'm definitely not qualified to describe just how to fix these issues, though.

I think you hit the nail on the head. Plaintext is universal in a way that nothing else really is. Outputting structured data means that consumers would have to process structured data. That definitely raises the difficulty of the programming. It’s not an easy problem, but I also do not have any good ideas.

I'm trying to remember being a young Unix user but it was four decades ago, so the details become hazy. Nevertheless the proper go-to after the manpage fails to clarify matters is the same as it ever was, that is, one reads the source code, if you have it, and this is easier today than ever.

Getting a pipeline wrong 5 times is common. The normal process is: write a couple of steps, right output! Add a grep, bad output! Fix it, again, again, right output! Add cut -bM-N, adjust the boundaries a few times. Sort the output. Oops, sort -n

The curse of knowledge really. Or perhaps more accurately “awareness”.

You know about things that are to others unknown unknowns. Since ignorance is bliss, it definitely feels like a curse to you, and since what one doesn't know can hurt them, others would see it as a blessing.

Funny world we live in.


I think about this often as an org mode user who uses it exclusively for journaling with none of the GTD features. Org mode was released before markdown by over a year, but never saw the uptake like markdown did, despite being a more featureful syntax. I think that's because org mode was originally a GTD framework for emacs, and the syntax of org files was incidental to doing GTD in plaintext. It didn't get popularized as an alternative to other markup languages until long after markdown became popular.

I don't really know. I wasn't around back then to watch it unfold. But I still much prefer org mode due to better emacs support and (IMO) more intuitive syntax for things.


When you say "better Emacs support," you're kind of understating things: Org Mode was -- and to a large degree, still is -- tied intimately to Emacs. It was only available in Emacs for years, and if you didn't use Emacs, you probably didn't hear about it for years.

As someone who now uses both, I think the syntax between the two is really kind of a wash. I know Org Mode folks who insist that its syntax for links is more intuitive than Markdown's, for instance, whereas I used to insist that Markdown's was. Now I think neither is really intuitive -- the one that feels more natural to you is, very likely, the first one you learned and got comfortable with. Beyond that, most of the differences in syntax are kind of academic. (I would genuinely argue that Markdown's block quote formatting, which is the way that plain text email tended to quote messages, is more intuitive, at least to anyone who remembers writing email in plain text.) Org Mode partisans also correctly point out that you never have to worry about differences in syntax parsing the way you technically do with different flavors of Markdown, but I'd argue that's because there's effectively only one Org Mode parser out there, e.g., Org Mode in Emacs. There is no formal syntax specification for Org Mode any more than there is for Markdown, and if Org Mode had become as popular and had as many different implementations in as many different programming languages, it would absolutely have the same issue. (In fact, the few non-Emacs Org Mode parsers that I've seen are, to a one, at significant variance with Original Flavor Org Mode once you get past the basics.)

Org Mode's real strength isn't the syntax, it's everything else. I don't use it for GTD, either, but I use it as a task manager and an agenda system for work, and as a personal journal and fiction outliner. None of the power it gets for any of those things comes from using asterisks instead of hash marks for headlines, or slashes instead of underlines for italics. :)


The key bindings out of the box with something like Doom Emacs are a big selling point too.

I have not been able to get Markdown to work in Vim anywhere near as well.


I don't remember Vim's Markdown support to be anything special, either; I do a lot of Markdown work, and tended to use Markdown-specific editors on the Mac like Ulysses and iA Writer, while doing my technical writing in BBEdit. (I never found Vim to fit me particularly well for prose of any kind, even though I was pretty experienced with it. Apparently my writing brain is not modal.)

Semi-ironically given the Org mode discussion, the markdown-mode package for Emacs makes it one of the best Markdown editors I've used!


Same team! Also an orgmode user who uses it for all things longform... A decade+ and counting. Only recently do I see need to start adopting TODOs because work and life tasks are threatening to go beyond the capacity of my normie calendar and paper lists coping mechanisms.

Orgmode text is fairly well supported now, across a plethora of non-Emacs apps and editors. I've enumerated several in my post [0].

Quoting oneself...

> But seriously, Emacs winkwink, amirite?

> Utility is contextual, remember?

> So here are ways to use org-mode without Emacs, for useful-to-you purposes, without even caring it is orgmode text underneath.

> Mobile, Web, and Desktop apps:

     mobile: Orgro, a mobile Org Mode file viewer for iOS and Android
     mobile: Plain Org, org text view and editor for iOS
     mobile: Orgzly, org text viewer and editor for Android (I use this on my phone, and sync notes to my PC with Dropbox).
     mobile: beorg for iOS (tasks, projects, notes)
     mobile: flathabits, inspired by Atomic Habits, with all your data stored in org files
     web+desktop: logseq, a privacy-first, open-source knowledge base
     web: organise, web-based org text editor and viewer
     web: braintool.org, a Chrome plugin "to easily capture and categorize all the information and knowledge you want to keep track of, right at the point you discover it or create it"
> Text Editors (apart from Emacs):

> You can type org markup text (syntax) in any text editor, even Notepad.

> Vim: https://github.com/nvim-orgmode/orgmode

> Atom: https://atom.io/packages/org-mode

> VSCode: https://github.com/vscode-org-mode/vscode-org-mode

> A variety of utilities to:

     Publish, Import, Export, Parse
     More community-enumerated tools for the same
     Even Github, Gitlab etc. support org markup these days!
 
> I'm sure more people are making and releasing tools backed by org-mode text.

> The future is bright!

[0] Why and How I use "Org Mode" for my writing and more

discussed: https://news.ycombinator.com/item?id=43157672


He's talking about "wonder bread" and other factory breads that have had many of their nutrients stripped out and only some put back, to the detriment of their absorption. Some are also concerned with artificially added preservatives and the unknown unknowns of putting them in foods where they don't naturally occur (even if there's a common natural source in another food).

Homemade bread is certainly not ultraprocessed (especially if made with unbleached flour or, even better, whole wheat flour), but factory bread most certainly is considered ultraprocessed.


I appreciate when "Woe is Me" style comments are knocked down a notch when they conveniently ignore half of the world. The activity surrounding the discussion is indeed using networked applications, of which the web is only one.

So I don't think they were being needlessly pedantic, nor do I think they didn't understand what the parent meant by internet in the colloquial.

Lots of different ways one could take this: maybe the person they were responding to is just being lazy, and the good parts of the internet are there for them to explore, but they are beholden to their web browser and their favorite loathed platforms that 'make the internet suck'.

Or maybe the person they were responding to really has gone the rounds and really has considered all the options, and bemoans how difficult the non-web internet services are to use, how inelegant they can be at times, and what a pain they are to maintain if it isn't your full-time job.

There can be so many ways to take written material on the internet; more often than not, even pedantic comments at least let us ensure we aren't simply reaffirming our own biases.


Thanks for your input. I'm still curious to hear from them.

