hoping1's comments | Hacker News

This and the other comment under this seem to be talking about the work the computer is doing at runtime. I believe the point is about the developer's work in implementing this. (For example, renaming potentially conflicting variables is clearly _less_ work than renaming only the actually conflicting variables, in this framing.) This kind of work is important for (1) compilers whose bugs are potentially very financially expensive, and (2) compilers with very intentionally-portable implementations (including but not limited to educational implementations).


See cvoss's comment in another thread:

" What happens in the evaluator when you have

(\ a . a a) (\ x . \ y . x y)

Variables are uniquely named at parse time, but after one beta step, you have two distinct instances each of x and y, albeit in different scopes, and after a second beta step, you have two distinct variables named y in the same scope. "

This doesn't come up in most mainstream languages, but Zig and Jai are getting people excited about comptime evaluation, and dependent types need comptime evaluation as well. Uniqueness is fine until you're doing higher-order evaluation in your compilation pipeline like this, which will be more and more desirable in the future, at least if the current trends are any indication.
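As a sketch of cvoss's point (a hypothetical Python encoding, not from any of the posts discussed): parse-time uniqueness breaks as soon as a beta step duplicates a term, so the substitution has to mint globally fresh names for each copy it makes.

```python
# Minimal named-term evaluator sketch. Terms are tuples:
# ('var', n) | ('lam', n, body) | ('app', f, a), with integer names.
import itertools

fresh = itertools.count(100)  # global supply of fresh variable names

def refresh(t, env=None):
    """Rename every bound variable in t to a globally fresh name."""
    env = env or {}
    tag = t[0]
    if tag == 'var':
        return ('var', env.get(t[1], t[1]))
    if tag == 'lam':
        n = next(fresh)
        return ('lam', n, refresh(t[2], {**env, t[1]: n}))
    return ('app', refresh(t[1], env), refresh(t[2], env))

def subst(t, x, a):
    """t[x := a], refreshing a at each occurrence so copies stay distinct."""
    tag = t[0]
    if tag == 'var':
        return refresh(a) if t[1] == x else t
    if tag == 'lam':
        return ('lam', t[1], subst(t[2], x, a))
    return ('app', subst(t[1], x, a), subst(t[2], x, a))

def step(t):
    """One leftmost-outermost beta step, or None if no redex is found."""
    if t[0] == 'app' and t[1][0] == 'lam':
        return subst(t[1][2], t[1][1], t[2])
    return None

# (\ a . a a) (\ x . \ y . x y), with parse-time unique names 0, 1, 2
term = ('app',
        ('lam', 0, ('app', ('var', 0), ('var', 0))),
        ('lam', 1, ('lam', 2, ('app', ('var', 1), ('var', 2)))))

t = step(term)  # duplicates the argument: two copies, freshly renamed
assert t[0] == 'app'
assert t[1][1] != t[2][1]  # the two copies of "x" got distinct names
```

Without the `refresh` call inside `subst`, both copies would still be named 1 and 2, and the second beta step would put two variables named `y` in the same scope, exactly as described above.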


(\ a#0 . a#0 a#0) (\ x#1 . \ y#2 . x#1 y#2)

-->

(\ x#3 . \ y#4 . x#3 y#4) (\ x#5 . \ y#6 . x#5 y#6)


This implies that applying a function is no longer the same as a plain substitution. Moreover, this can be pretty expensive computationally.


The semantics are the same for the user and debugging works, which is what matters. If you're using an object-oriented language for implementation, then this is already the computational cost. If you're using a functional language, then yes, there is some space overhead because it breaks sharing, but that's the price. Was there something else you were thinking of?


It's not just space overhead: it requires you to recursively walk the argument of the application and replace all its bound variables with fresh ones, once for every occurrence of the bound variable in the function body. This means that function application is no longer O(size of the function) but rather O(size of the function + occurrences of the variable * size of the argument). If the size of the argument is big, this makes function application much, much slower.
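A back-of-the-envelope on that cost claim, with entirely made-up sizes (every number here is hypothetical):

```python
# All numbers hypothetical, just to illustrate the asymptotics above.
body_size = 1_000    # nodes in the function body
occurrences = 50     # occurrences of the bound variable in the body
arg_size = 10_000    # nodes in the argument

with_sharing = body_size + arg_size                     # one shared copy
with_fresh_copies = body_size + occurrences * arg_size  # re-walk arg per occurrence

assert with_sharing == 11_000
assert with_fresh_copies == 501_000  # ~45x more work in this example
```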


It's not that expensive: just use De Bruijn notation under the hood. (I've seen a system that does this, and this is how it works.)
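For concreteness, a minimal De Bruijn sketch (a hypothetical Python encoding, not the system mentioned): variables are indices counting enclosing binders, so the duplicated copies of an argument stay literally identical terms and no renaming pass is needed at all.

```python
# De Bruijn terms: int (index) | ('lam', body) | ('app', f, a).

def shift(t, d, cutoff=0):
    """Add d to every free index in t (those >= cutoff)."""
    if isinstance(t, int):
        return t + d if t >= cutoff else t
    if t[0] == 'lam':
        return ('lam', shift(t[1], d, cutoff + 1))
    return ('app', shift(t[1], d, cutoff), shift(t[2], d, cutoff))

def subst(t, j, a):
    """t[j := a], adjusting indices as we go under binders."""
    if isinstance(t, int):
        return a if t == j else t
    if t[0] == 'lam':
        return ('lam', subst(t[1], j + 1, shift(a, 1)))
    return ('app', subst(t[1], j, a), subst(t[2], j, a))

def beta(f, a):
    """(\\. body) a  -->  body[0 := a], with the usual index fix-up."""
    return shift(subst(f[1], 0, shift(a, 1)), -1)

# (\ a . a a) (\ x . \ y . x y)  ==  (\. 0 0) (\. \. 1 0)
arg = ('lam', ('lam', ('app', 1, 0)))
result = beta(('lam', ('app', 0, 0)), arg)

# The two copies are the same term; alpha-equivalence is syntactic equality.
assert result == ('app', arg, arg)
```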


But then the point of giving unique names to avoid using De Bruijn indices is moot.


An accessible introduction to the infamous Par operator, with a focus on intuition. Notably, this is on the broader concept of multiplicative disjunction, which appears even outside of linear logic!


Alternative title: Par and Constructive Classical Logic.

I've finally rounded out the Par trilogy, on sequent calculus, linear logic, and continuations! This post was the point of the whole series, and the most likely to contain things you haven't already heard. I really love par and the theory around it; the elegance is incredibly satisfying. I hope I can share that with you!


A new guide on linear logic! Much deeper than the usual "imagine a vending machine" guide, though I do mention the connection to that at the end lol. This is great for getting an intuition for substructural logics in academic papers, as well as the notorious multiple-conclusion sequents used in, say, type systems for safe parallelism. The post prior gives the necessary background on sequent calculus, if needed. This series is on much of the most beautiful theory ever, and I hope you enjoy!!


What are your plans for future posts in this series? You’ve done some nice work here. Where were you introduced to this subject? Looking forward to more, especially because your presentation is so clean.


I'm glad you liked it! I have one more post planned for this series, on par and using continuations for classical logic proof terms. I have two other posts in the works but they aren't part of this series, which will only have that one more post. Much of what I learned was from blog posts like this one, or at least they got me to the point where I could understand the papers on my own. Blog posts, YouTube, reddit, discord, all helped me a ton. Originally I just wanted to make a cool programming language, but the more research I learned the more my interest shifted to the research, because it's just so unbelievably gorgeous and elegant!


Nice work :)


Thanks!


Fair, I think of this as advanced logic, and those concepts (and that notation) as prerequisite.


Ah heck, I should have added a section on PTSs, maybe I still will or maybe that will be standalone later. It really is gorgeous stuff!!


Extensive and patiently-paced, with many examples, and therefore unfortunately pretty long lol


You probably mean set theory instead of graph theory, since set theory and category theory are kind of seen as two foundations for math.

Both category theory and set theory use sets. But set theory tries to make absolutely everything into a set. It takes on a little complexity in this quest, because of Russell's paradox. Category theory can be seen as studying set theory, among other things, so it is "bigger" or more all-encompassing than set theory. Many things studied in category theory cannot be sets.

Set theory is more immediately intuitive, but category theory organizes things in ways that are ultimately more insightful, I think. Meaning that once you can get into the category-theory headspace and learn to navigate it, it becomes a much better environment for thinking without mistakes. In my personal opinion.


Also, morphisms from A to B form a set.


Only in locally-small categories. But yes, category theory often makes use of sets.

