Not a lisp apologist haha, but I think Clojure tries to use fewer round parens "()" in general. Square brackets "[]" and curly braces "{}" are used to help with things like binding forms and data-structure literals.
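A tiny sketch of what I mean (illustrative names only):

```clojure
;; brackets signal structure, not just function calls
(defn greet [name]                   ; [] = parameter vector
  (let [greeting "Hello"             ; [] = binding pairs
        user     {:name name :id 1}] ; {} = map literal
    (str greeting ", " (:name user))))
```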
I recently started using Clojure and I've used languages like C#, JavaScript, and Python a lot. My two cents is that a Clojure-like language should try to embrace the aesthetics of a whitespace-sensitive language like Python, but use the parens as cues for scopes or blocks. So much could be done with formatting rules that make parens easier to scan, without needing extra IDE highlighting or anything.
The best part of parens is that you can try to pick a consistent format, but ya know that sometimes doesn’t happen because everybody likes to use parens differently lol.
I wonder if building a custom language is a simpler way to optimize stuff later on?
For example, Unity uses C#, but then you need either all of Mono at runtime or something like IL2CPP to compile to C++. And eventually that compiler constantly needs to keep up with language releases and new features. Or, in .NET's case, new languages like F#.
I'm sure it is; no doubt it's easier to pick an existing language than to build a whole new one.
For me, however, the benefits of an existing language outweigh the costs. If Godot were to use Python, for example, you'd gain the full benefit of pip alongside it. If you use C# with Godot, I know you can use NuGet. I'd also personally rather have a more performant language than one that's easier to write, but that's personal preference.
For a moment I thought this was my fault because I had just asked for a free academic license, and somehow that would be the final straw :(
I was very excited to start reading the C code and was hoping a Nim port would be possible. Sad to hear I’m this late to the party because now I can’t even enjoy the blog posts.
You dodged a bullet. Me too. I just started my 3D journey, dabbling with Godot, Three.js, and Unity. If I had stumbled upon Machinery and decided to use it, I would be annoyed now.
Awesome resources!
On a side note have you tried the Rescript [0] or Reason [1] programming languages? They both are based on OCaml and compile to JavaScript.
Hey I was wondering what maths are at the foundation of computer science. Could you explain which ones you think are at the foundation?
For context, I’m not sure something like calculus would be a foundational math (though it could be), but something like Boolean logic would be (right?)
Specifically: Logic (Propositional and Predicate), Boolean Algebra, Proofs (Inductive, Deductive, Transformational), Set Theory, Relations, Functions, Lambda Calculus, Formal Systems.
They are all interrelated, but some are foundational. For example, Set Theory and Logic are foundational while the others are built on top: Set Theory leads to Relations and Functions, Proofs are the process of applying Logic, Lambda Calculus is function abstraction taken to the extreme, and Formal Systems teach you how to build symbolic systems (models) without reference to any specific domain of discourse.
A background in "pure maths" is extremely useful. You don't necessarily have to remember many (or perhaps any) of the particular definitions and theorems, but the ability to formulate definitions and prove results about them is valuable. I've seen software documentation that actually assumes it. This is what mathematicians call "mathematical sophistication".
For example, proof by induction is the same basic idea as terminating recursion.
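A minimal sketch of the parallel, in Python (function name made up):

```python
def sum_to(n: int) -> int:
    """Sum 0..n; the shape mirrors a proof by induction."""
    if n == 0:                # base case: the claim holds for n = 0
        return 0
    return n + sum_to(n - 1)  # inductive step: assume it holds for n - 1
```

The base case of the recursion is the base case of the proof, and the recursive call is the inductive hypothesis.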
Other things in computing were designed by mathematicians and kind of assume you see things a bit that way. The pointer arithmetic and "inside out" declarations of C are perfectly sensible if you think mathematically – Dennis Ritchie was a mathematician. Likewise, Stroustrup's use of 0 for the nil pointer in C++ would have been understood without explanation by the people down the corridor from him at Bell Labs. Mathematicians often use 0 to represent some distinguished element of a set that is not necessarily a set of numbers.
An example of a specific use of maths in computing is relational databases. Relations are kinds of sets.
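A sketch of that idea in Python (relations as literal sets of tuples; all names invented):

```python
# each relation is just a set of tuples
employees = {("alice", "eng"), ("bob", "sales")}
headcount = {("eng", 4), ("sales", 2)}

def join(emps, counts):
    """A natural join: set comprehension over the shared column."""
    return {(name, dept, n)
            for (name, dept) in emps
            for (d, n) in counts
            if dept == d}
```

SQL's JOIN is essentially this set operation with indexes behind it.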
Some results in cooperating processes and distributed computing that software engineers should know are proven mathematically. If you can't read a proof, you can't read the paper/textbook and are reduced to "remembering" the result rather than internalising it.
We are in the age of machine learning: whenever you search for something online, take a photo with your phone, drive your (driver-assisted) car, filter out spam, or get a product or video recommendation, you are using a system that is founded on calculus.
Just because computer science started with file systems and regexes (discrete math) doesn't mean that the systems of today or tomorrow don't have very different foundations.
I'm pretty certain that the systems of today have the same foundations as those of yesterday, we've just abstracted them away so we don't directly see it as often.
That doesn't make a lot of sense given the novel hardware that ships in a lot of chips these days: TPUs, neural accelerators, perceptron branch predictors, flash-based neural networks.
These are not abstractions over existing POSIX APIs or von Neumann architectures; they are novel ways to do computation from a different branch of math.
Heck, we don't even know whether some of these break the Church-Turing thesis by computing with true real numbers rather than the computable subset that Turing machines can handle.
It really depends on how quantized our universe is.
If you have papers that show full quantization, including tunneling probabilities etc., please share them! The physical existence of the reals has been a white whale of mine for some time ^^'.
Yeah these are good points. I guess we could also add AI to the list (AI before ML), since it also has use of calculus I think.
I’m now wondering whether to consider both of these subjects foundational to CS or a CS degree. My gut says yes since so much relies on ML and AI is used a bunch (e.g. games and stuff).
Boolean math (building the computing device), then discrete maths (combinatorics for algorithms and data structures), followed by probability and statistics (for algorithms as well as ML/AI).
I use it for my work in program analysis, code generation, and other languages-research projects. I’ve also used it to a lesser degree for modeling and simulation problems.
You would probably be better served by F# or Ocaml for a web-oriented ML stack. SML has less of an ecosystem in that area.
I used to work on a hardware compiler written in SML, which compiled a language called Handel to logic circuits for FPGAs.
Mostly I worked on fancy optimisation passes, and a fast gate-level simulator.
Like Haskell and other languages that emphasise first-class algebraic datatypes (tagged unions) and pattern matching in a concise but clear syntax, it's good for writing things that manipulate tree-like, symbol-heavy data structures such as program syntax trees and circuit graphs, and for quickly trying out new ideas with them.
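A toy sketch of the style (my own example, not from that compiler): an expression type and a constant-folding pass, which is the shape most of our optimisation passes had.

```sml
(* a tiny expression language *)
datatype expr
  = Num of int
  | Add of expr * expr
  | Mul of expr * expr

(* fold constants bottom-up; pattern matching does the case analysis *)
fun fold e =
  case e of
      Num n => Num n
    | Add (a, b) =>
        (case (fold a, fold b) of
             (Num x, Num y) => Num (x + y)
           | (a', b')       => Add (a', b'))
    | Mul (a, b) =>
        (case (fold a, fold b) of
             (Num x, Num y) => Num (x * y)
           | (a', b')       => Mul (a', b'))
```

The compiler checks that every constructor is handled, which matters a lot when the datatype grows.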
It is good at parsing/other compiler-related tasks. Basically anything where you are working with tree structures I think can be a good fit. It is also good as a way to learn about functional programming.
I keep seeing comments about static html vs generated html. So I have a question (please respond):
Why can’t we just run the example Deno program to generate snapshots of html?
It seems like some of us think pure static html is a good goal for some things, so why not use this Deno program to create the same html responses in generated files?
It's probably the same amount of code, because instead of writing an HTTP response you write a file.
Of course you lose some functionality this way, but your app, your rules imo
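To make the question concrete, here's a hedged sketch (all names invented; the Deno calls are in comments): the render function is identical either way, and only the sink changes.

```typescript
// A pure render function: same code whether we serve or snapshot.
function renderPage(title: string, body: string): string {
  return `<!doctype html><html><head><title>${title}</title></head>` +
    `<body>${body}</body></html>`;
}

// Serving it dynamically (Deno-style, sketched):
//   Deno.serve(() =>
//     new Response(renderPage("Home", "<h1>Hi</h1>"),
//       { headers: { "content-type": "text/html" } }));

// Snapshotting it to a static file is the one-line difference:
//   await Deno.writeTextFile("dist/index.html",
//     renderPage("Home", "<h1>Hi</h1>"));
```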
I’m in the same boat, I’m learning game engines and so many libraries are in C++.
It feels impossible to use Rust or Zig, but I have hope I can use Nim. It can compile to C++ and has a simple-ish way of wrapping C++ code. Maybe it would be interesting for you too?
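Roughly what wrapping C++ from Nim looks like, from memory (treat the pragma details as assumptions and check the Nim manual's `importcpp` section):

```nim
# hedged sketch: binding std::vector<int> via Nim's importcpp pragma;
# build with `nim cpp` so Nim emits C++ instead of C
type IntVec {.importcpp: "std::vector<int>", header: "<vector>".} = object

proc newIntVec(): IntVec {.importcpp: "std::vector<int>()", constructor.}
proc add(v: var IntVec, x: cint) {.importcpp: "#.push_back(@)".}
proc len(v: IntVec): csize_t {.importcpp: "#.size()".}
```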
Please, someone mention Tup and how it can be a reasonable build system. I've heard the FUSE dependency is not ideal, though I felt it had a nice UX with the Lua config.
Plus it’s worth considering that you can potentially use Fennel to configure the builds (since Fennel compiles to Lua).
I have encountered Tup in the past and could not figure out how to define a "generator". As in: a function that defines how some output can be generated from a certain input by running multiple commands. I don't want to copy those commands for every input I need to process.
Edit: Generator is a term typically used in CMake and Meson for this, in Make I'd use a pattern rule mostly.
Not sure exactly how to build a "generator" either, but it seems it would be a build rule that generates multiple outputs from multiple inputs, right? If that's the case, there's a `foreach` function for convenience, but it doesn't seem to have something for multiple commands. Though there's a Lua example on their website that leaves me wondering if what you want is possible; here's the code (since the documentation seems offline atm):
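(Reproducing it roughly from memory since the docs are down, so treat the exact function names as assumptions and check `man tup`:)

```lua
-- one rule per .c file, generated by iterating in Lua
for _, src in ipairs(tup.glob("*.c")) do
    tup.rule(src, "gcc -c %f -o %o", src:gsub("%.c$", ".o"))
end

-- or the shorthand that does the iteration for you
tup.foreach_rule("*.c", "gcc -c %f -o %o", "%B.o")
```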
If you want to have a way to specify rules that use the same command, but still specify the rules manually, look up "!macros" in `man tup`. If the issue is that the commands need to write and read some temp files, then note that you can write temp files in a tup command (they need to be placed in $TMPDIR iirc). Note that the tup command can be calling a script that you place alongside, in case this makes specifying your sequences of commands easier. You can define the macro in top-level Tuprules.tup and use include_rules in all the Tupfiles to get it included.
Thanks, the `!macro` thing seems to be what I was looking for. Having intermediary temporary files is of course fine.
The use-case at that time was to convert a LaTeX file containing a TikZ image (hence the .tikz file extension) to an SVG, which required compiling the LaTeX code to PDF, cropping the PDF, and converting it to SVG. Since there were multiple occurrences of this across the project, I wanted to have that as a reusable "function" in the build system. I did not want to write a dedicated script for this because of portability issues (Shell vs. Batch).
How does !macro rid you of that portability issue? One way or another, you'd need to have a command that does that whole sequence of operations. If it's inline, I'd expect same portability problem (`&&` in Bash corresponds to something else in `.bat`).
What you could do is use Tup's Lua support (you can write Tupfiles and Tuprules in Lua, and you can define and use Lua functions in them): have each rule handle just one step of your process, generate the whole necessary DAG of rules, encapsulate all of that in a Lua function defined in tuprules.lua, and call that function in Tupfile.lua for every diagram you want to generate. One thing you lose is the ability to use `foreach` with this construction (other than by creating a variant that embeds the foreach behaviour, or by using tup.glob and iterating).
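A hedged sketch of that structure (command strings abbreviated and names invented; `pdflatex-wrapper` stands in for however you drive LaTeX on a .tikz file):

```lua
-- tuprules.lua: one Tup rule per pipeline step, wrapped in a function
function tikz_to_svg(name)
    tup.rule(name .. ".tikz",     "pdflatex-wrapper %f %o", name .. ".pdf")
    tup.rule(name .. ".pdf",      "pdfcrop %f %o",          name .. "-crop.pdf")
    tup.rule(name .. "-crop.pdf", "pdf2svg %f %o",          name .. ".svg")
end

-- Tupfile.lua: call it once per diagram
tikz_to_svg("figure1")
tikz_to_svg("figure2")
```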