Building an efficient and portable programming language with Zig (fastly.com)
143 points by DyslexicAtheist on May 13, 2021 | 113 comments


> There is no big company behind Zig, and the non-profit foundation stands on its own legs thanks to a balanced mix of corporate and individual donations. We don't have any big tech company on our board of directors, and frankly, we like it this way.

I hope it stays this way. And maybe it can, given the way the foundation has been bootstrapped and the fact that the project is driven largely by a single creator. Having the funding come largely from small donations by actual developers is a healthy incentive structure.

Zig, next to Rust, is one of the more interesting language projects going on these days. Since Rust was orphaned by Mozilla, I am a bit ambivalent about the amount of investment in the project, which is currently driven by tech giants.


Is there a popular language without investment from tech giants?


Python developed despite tech giants.

It's only recently, after it became very popular, that all the tech giants started investing in it.

The same goes for many other languages, like Ruby, Haskell, etc.

Rust was developed within the corporate world of Mozilla, like the Go language at Google, so it's natural that it will embrace the corporate world and companies like Microsoft and Amazon, which will in time decide its future direction.


Zope was the main reason to use Python in 2000; it slowly replaced Perl among UNIX admins, and then there was Django, among other Python-based tooling.

Guido also always worked for quite well-known companies after he left academia.


I mean, at some point, if the tech is good, tech giants will use it. And then you'll have a flurry of people exclaiming in shock, "Wow, Google depends so much on [tech] and they don't even give a single penny to the devs!"

Programming languages getting a bunch of investment from their users is open source working at possibly its best.


At some point, I think this is inevitable.

And to be honest, I don't care that much about who is behind a tech, as long as they are making the right choices. (except if it is Apple)


Nim, Pony, and maybe Dlang?


I am rarely impressed by new languages; I tend to think we already have too many, and a lot of them are slow and/or redundant.

But I like to see the emergence of a few sane proposals to replace C/C++, and I've been extremely impressed so far by the work done by the team behind Zig.

The best feature, from my perspective, is that Zig does not intend to be a better C++ but a better C.


My only beef with Zig/Jai is that they both have strong opinions on things like operator overloading, and I am afraid they will end up just like Java.

I have nothing against having strong opinions (I am the biggest offender), but completely disallowing some features, some of the best features static typing can offer, is just nonsense.

Every language designer should study Common Lisp, starting with the operators : and ::. These two operators alone tell you that the language was designed for practical use.

For me, overloading the operator + and overloading a function 'add' are exactly the same; by that logic you should forbid function overloading as well. You can argue about the weaknesses of operator overloading, but disabling it completely, without even a workaround?

Why not just enable operator overloading and offer safe alternatives instead, e.g. safe(a + b) or a s+ b?
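
For context, a minimal sketch of what the non-overloaded alternative looks like in Zig today; Vec2 and add are hypothetical names, not anything from the standard library:

    const std = @import("std");

    // Without operator overloading, the operation is spelled out as an
    // ordinary function call. Vec2 and add are illustrative names only.
    const Vec2 = struct {
        x: f32,
        y: f32,

        fn add(a: Vec2, b: Vec2) Vec2 {
            return .{ .x = a.x + b.x, .y = a.y + b.y };
        }
    };

    pub fn main() void {
        const p = Vec2{ .x = 1, .y = 2 };
        const q = Vec2{ .x = 3, .y = 4 };
        const r = p.add(q); // instead of p + q
        std.debug.print("({d}, {d})\n", .{ r.x, r.y });
    }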


quoting myself [1]:

"It's really easy to make fun of C++, but everyone has that one feature of C++ that they like. And they want to put it in Zig. If everyone had their favorite feature from C++ in Zig, Zig would just be C++. But I'm not going to let that happen."

https://youtu.be/Gv2I7qTux7g?t=454


Andrew,

A language with first-class cross-compilation alone is enough for me to respect you as a language designer, and Zig has so many features like this. I am a huge fan.

I have only one favourite feature in C++, and it is templates/CTFE. Without function/operator overloading, templates are incomplete, and I would then use C instead. I am still using C++ with all its failures, all its complexity, and all its ugliness/inelegance, only because C++ got one thing right: you can at least somehow modify the language.

Edit: I have to point out that I have never worked in a big team (more than 10 people), and you can (probably should) safely ignore my ideas/opinions.


My concern about Zig/Jai and others is meta-programming.

Meta-programming can be super powerful and practical, and the current trend of reusing the language itself is clearly better than previous attempts.

But meta-programming in itself is not a novel idea; C macros and C++ templates are well-studied past examples, and both produced awful results.

I am not sure they can prevent a new cycle of this nightmare.


Maybe I am just a huge dummy, but I have yet to find examples of metaprogramming in the wild that aren't just mind-meltingly hard to grok. (Most of what I have seen is Python and Rust).

I have no doubt about how powerful metaprogramming is, but it makes me feel that understanding and contributing to libraries that use it is out of my reach.


I think what's novel about Zig's approach is that the metaprogramming is just normal code which happens to be executed at compile-time.

I have found that when any project gets to a certain size, it's almost inevitable that metaprogramming will be required, unless you want to make everything super dynamic and sacrifice performance. The idea of being able to do metaprogramming in the language I used to write the program itself is an interesting one.


I don't know if Kelley would agree with my characterization, but I don't see comptime as metaprogramming. Instead it opens the very interesting possibility of having types as values, as long as those values are resolvable at compile time. This lets you do things that feel like metaprogramming (e.g. making a generic container structure) but it seems a better conceptual fit to me that you're programming with types as values rather than generating code from a template or macro.
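
A minimal sketch of that idea, assuming nothing beyond the core language: an ordinary function that takes a type and returns a freshly constructed type at compile time (Pair is an illustrative name).

    const std = @import("std");

    // Pair is a plain function evaluated at compile time: types are values
    // here, and the "generic" is just the function's return value.
    fn Pair(comptime T: type) type {
        return struct {
            first: T,
            second: T,

            fn swapped(self: @This()) @This() {
                return .{ .first = self.second, .second = self.first };
            }
        };
    }

    pub fn main() void {
        const p = Pair(u32){ .first = 1, .second = 2 };
        const q = p.swapped();
        std.debug.print("{d} {d}\n", .{ q.first, q.second });
    }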


This is nothing novel. Lisp macros are exactly that.


In fact, I believe templates are the greatest idea in static typing. They look awful because the design and the implementation are awful, not the idea. C++ failed because they didn't know better and it was too late to turn back, and Rust just looks the same; you should see D templates.


I think that templates can be practical and readable if used with moderation, just like macros.


> C++ templates are past well studied examples, and both produced awful results.

What's wrong with templates? They give you a compile-time functional language that operates on types; isn't that great?


Imagine being able to use C++ to operate on types at compile time instead of using the verbose template syntax.


In my opinion, operator overloading could be safely used if it were explicit.

With something like that:

var x = a [+] b;

There is no possible mistake.


Proposed and rejected. I went to a lang meeting and tried to steelman the issue, without having a strong investment either way (I had one use case where I would slightly prefer the infix operation).

https://github.com/ziglang/zig/issues/8204


Good to see it was discussed :)

It is probably better to have native support for small vector operations, as this can also help to streamline SIMD optimization.

In practice (as a video game dev) I only use operator overloading to have infix notation for vector maths.
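
For what it's worth, Zig already exposes SIMD-style vectors with infix arithmetic through the @Vector builtin; a minimal sketch (the formatting call is incidental):

    const std = @import("std");

    pub fn main() void {
        // @Vector types support element-wise infix operators directly,
        // so the common game-math case needs no user-defined overloading.
        const Vec3 = @Vector(3, f32);
        const a = Vec3{ 1.0, 2.0, 3.0 };
        const b = Vec3{ 4.0, 5.0, 6.0 };
        const c = a + b;
        std.debug.print("{d} {d} {d}\n", .{ c[0], c[1], c[2] });
    }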


Yeah gamedev/simulation is the use-case where I think operator overloading is really a value-add. It's really nice to be able to write linear algebra code which reads like math.


Operator overloading is something that has often been misused, and even if I tend to like syntactic sugar, I understand that being explicit is more important.

When trying to understand a codebase, it is much easier to not worry about hidden indirections.


But you can misuse anything; + vs. add was an example of this.

Being able to misuse a PL, I believe, is not a bug but a feature. We really suck at predicting the future.


It is not the role of language designers to prevent bad programmers from writing bad code.

But it is better to avoid giving them too many ways to shoot themselves in the foot.

Forcing the code to be explicit is good design, in my opinion. Simply because it is much easier to write code than to read it, we have to put more weight on the clarity side.


Pretty sure Jai has operator overloading, as it is intended for games, and being able to add two vectors is something you want to do all the time.


Is your point with the ':' and '::' operators that by allowing anybody to break package encapsulation at will the language pragmatically allows for situations that the original package designer did not anticipate? I could see that.


That is exactly my point. It has everything: elegance, vision, practicality... It doesn't treat programmers like drones.


The title made me think it was going to be about creating a new language using Zig to build the compiler. I'm glad that wasn't the case.


That's what I thought it would be about as well - not a great title. I kinda wanted to read about creating a new language using Zig.


> Hannah: How is it possible that Zig applications run faster than C code while also having far better security guarantees?

In a marketing interview with biased questions, anything is possible.

TBH, I have a positive impression of zig, but this kind of interviewing doesn't help it.


I like to think Zig is what we would have if we knew back then (when designing C) what we know now.

(note: "we" excludes me personally, but I mean it as "us who work and study computing at large")


We already knew in 1981.

"Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to--they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980, language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law."

C.A.R. Hoare in his Turing Award speech in 1981.


The difficulty is that in the days when C was being designed, computers were much more irregular than they are today. There were one's complement machines, machines that didn't have power-of-two word sizes, no standardization of character sets, IEEE floating point hadn't been invented yet. The irregular machines weren't fringe stuff, they were the dominant architectures (IBM 360/370, DEC PDP-10, Pr1me, just a big collection of weird stuff). And compiler technology was much less advanced. So C was a messy compromise.


High level system programming languages are about 10 years older than C, which in its early days only cared to target the PDP-11 model used for the first UNIX rewrite.

The authors just chose to ignore what was already out there and do their own thing instead.

UNIX's success in the universities did the rest.


Well, I guess it is a lesson for both: those who think a technology is better because it is so successful, and those who think a technology will be successful because it is so much better than other things.


C was not "high level" system language at the beginning. It was almost pretty skin for assembler.


Indeed, and the big mistake was that many decided to use it for anything other than writing kernels.


IIRC, C was also designed by committee, and a lot of industry players got their grubby hands on the spec... (I could be wrong) but I believe the utter mess that is short/long/char sizes arose from hardware manufacturers wanting their code to be "trivially portable" across platforms with different machine words.


No, the different sizes were done by the original creators, not by some committee. They had to: a PDP-11 is a 16-bit machine, a PDP-10 is 36-bit, etc.


thanks, I think I was crossing stories between the C committee and the IEEE 754 committee.


Indeed, Zig still doesn't have a story for use-after-free issues.


The question is whether this should be done at the language level or the library level. With sufficient metaprogramming capabilities, an enterprising programmer could write safe resource management abstractions for Zig. However, this wouldn't make the language "safe by default," which seems to be what Rust programmers are getting at with this criticism.

Even if Zig doesn't bring safety by default to the language, however, I think you could get it in practice by making the standard library enforce safety. Then, unless someone goes out of their way to write their own libraries from scratch and eschew all of the safety mechanisms, they would likely live in this safety bubble.


Can you write a borrow checker that enforces alias-xor-mut using metaprogramming? I'm skeptical, because of the flow sensitivity you really want to make it practical.


You could with some changes to the type capabilities, but I don't think that's the right direction. The main question is: do we want soundness or not? This is a general question for various correctness properties, and, at least in the formal methods space, the answer seems to be "not always." Soundness has a cost, and stopping 100% of UAF bugs at the cost of making the language more complex and even adding a few bugs of other kinds might not be worth it if you can stop 99% of them for a fraction of the cost. I think the goal should be to not have soundness -- IMO it has more downsides than upsides in this case -- and instead rely on good runtime detection joined with a fast compile/test cycle and even automatic test generation.


I mean, you're essentially saying here that memory safety is not worth having, which is a position increasingly at odds with the evidence.


Not at all. I'm saying that sound guarantees are not the only way to achieve memory safety. The goal isn't to use a language that makes sound guarantees, but to write correct programs (even Rust programs don't give you sound guarantees for memory safety as many of them depend on unsafe code that isn't soundly proven). That the cheapest way to write such programs is to have sound guarantees for everything is an interesting hypothesis, which would get you a language like, say, Idris, but it is not the consensus. There are many paths to safety and correctness, and not all of them go through soundness. I'd venture to say that most properties your life depends on in safety-critical systems are not soundly guaranteed.


When I say "memory safety" I mean "language-enforced memory safety". You're saying that having the language enforce memory safety isn't worthwhile.

> even Rust programs don't give you sound guarantees for memory safety as many of them depend on unsafe code that isn't soundly proven

There's no such thing as 100% memory safety; at the extreme end there are bit flips caused by cosmic rays. But the evidence suggests that practical language-enforced memory safety is actually worthwhile, despite the fact that no theoretical absolute memory safety is possible. All memory-safe languages have runtimes and FFIs, which in Rust is the unsafe blocks. This doesn't change the fact that, empirically, memory safety, even the imperfect memory safety we have to live with in the real world, is a meaningful improvement in stability and security.

> That the cheapest way to write such programs is to have sound guarantees for everything is an interesting hypothesis, which would get you a language like, say, Idris, but it is not the consensus.

Straw man. Nobody is talking about proving all code correct. What is reasonably describable as consensus nowadays is that having the language enforce memory safety is worthwhile. The idea that enforced memory safety has, in your words, "more downsides than upsides", is increasingly at odds with the consensus.

To get concrete, I see no reason to believe that quarantine (Zig's current solution to UAF) is a meaningful solution to use-after-free problems, given that quarantine has been deployed for a long time in production allocators in other languages and has failed to eliminate this bug class in the wild.


> You're saying that having the language enforce memory safety isn't worthwhile.

I am saying that it may come at a cost, and overall it may not be worthwhile; but, of course, there are different kinds of safety bugs, different kinds of soundness, different costs, and different alternatives.

> But the evidence suggests that practical language-enforced memory safety is actually worthwhile, despite the fact that no theoretical absolute memory safety is possible

What is it that the evidence suggests exactly? There are so many variables. For example, Zig's runtime checking could be much more effective than similar systems for C or C++, because it is much easier to track all pointers in Zig, just as its overflow protection is far more effective than sanitisers in C, because there's no pointer arithmetic (unless using unsafe operations) and all buffer sizes are always known. So you can't compare what Zig can do to what C can do. Then there's the question of what the safety mechanism is. Then, for each point in this coordinate space, what is the fitness you're looking at? Reducing a particular bug or improved overall correctness?
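
As a concrete illustration of the runtime checking referred to here, a small sketch; the behaviour shown applies to Zig's Debug and ReleaseSafe build modes, and the function name is made up:

    const std = @import("std");

    // Slices carry their length, so out-of-range access is caught at
    // runtime in safe build modes rather than reading past the buffer.
    fn get(slice: []const u8, i: usize) u8 {
        return slice[i];
    }

    pub fn main() void {
        const buf = [_]u8{ 0, 1, 2, 3 };
        std.debug.print("{d}\n", .{get(&buf, 3)});
        // get(&buf, 4) would panic with "index out of bounds" in
        // Debug/ReleaseSafe; integer overflow is likewise checked.
    }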

> What is reasonably describable as consensus nowadays is that having the language enforce memory safety is worthwhile.

No, this is not true, or at least, that depends on what is the fitness function and what exactly it is compared against. To put it bluntly, there is absolutely no consensus that Rust would more cheaply produce programs that are more correct overall than Zig. It's possible this is the case, as is the opposite or that the two are about the same, but we just don't know yet.

> To get concrete, I see no reason to believe that quarantine (Zig's current solution to UAF) is a meaningful solution to use-after-free problems, given that quarantine has been deployed for a long time in production allocators in other languages and has failed to eliminate this bug class in the wild.

But that's not how we look at correctness. The goal isn't to eliminate all UAF bugs. Of course completely (or close to it) eliminating a specific bug is more effective at reducing it than not completely eliminating it. If my only goal was to eliminate UAF, then I'd choose Rust's approach over Zig's. But usually our goal is to write a program that meets our correctness level for all bugs combined in the cheapest way possible. Investing a huge language complexity budget in completely eliminating UAF, as opposed to, say, reducing it by 99%, has not been shown to be the best way to achieve what we actually want.

I am not saying that it's not reasonable to believe that soundly eliminating this particular bug with something like Rust's ownership and lifetime system ends up being better overall, but it is equally reasonable, based on what we know, to believe that Zig's way is more effective overall, or that the two end up the same. There's just so much we don't know about correctness.


On the other hand, one can just go the Apple way and create a safe C dialect instead, while keeping the same tooling, if 99% is good enough.

See "Memory safe iBoot implementation" section,

https://manuals.info.apple.com/MANUALS/1000/MA1902/en_US/app...


Sure, but the goal of new languages like Rust or Zig isn't to only add safety.


True in regards to Rust.

For Zig I really don't get what it offers over Modula-2 other than a more C like syntax and compile time metaprogramming.

Maybe I will be proven wrong.


Not sure, but I became optimistic of there being some solution after I saw someone implement affine typing using stateful template metaprogramming: https://godbolt.org/z/PnPPrnPjY.


That's neat, but also flow-insensitive (and wow, generating a definition for every single use of a variable can't be good for compile-time memory usage).


Yeah, I'm unaware of any push to bring flow-sensitive typing, or a mechanism that affords flow-sensitive analysis, to Zig.


Not sure what 'story' is intended to mean here. Is use-after-free undefined behaviour in Zig?


There isn't really anything in the language that prevents use-after-free, partially because allocation isn't even part of the language and exists entirely in userland code. However, IIRC the standard library does include an allocator that tries to detect use-after-free.
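
For reference, a minimal sketch of that debug-oriented allocator as it exists in recent standard library versions (API details may differ between Zig releases):

    const std = @import("std");

    pub fn main() !void {
        // GeneralPurposeAllocator is a safety-oriented allocator: on deinit
        // it reports leaked allocations, and depending on its configuration
        // it can also flag double-frees and some use-after-free cases.
        var gpa = std.heap.GeneralPurposeAllocator(.{}){};
        defer _ = gpa.deinit();

        const allocator = gpa.allocator();
        const buf = try allocator.alloc(u8, 32);
        defer allocator.free(buf);
        buf[0] = 1;
    }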


Isn't that an intentional design decision in Zig? As far as I understood, Zig isn't trying to "solve" memory management, it's trying to provide manual memory management with better tools than C to help you do it right, but it's explicitly not trying to save you from yourself in this respect.


Except that C and C++ already provide similar tools, one just needs to actually enable them.


I don't follow. You recently pointed out to me that most of the serious security bugs in the Chromium codebase (C++) are rooted in memory-management. [0] It's no simple thing to fix these bugs. Not even Google can do so in their most treasured code.

[0] https://news.ycombinator.com/item?id=26862330


The point being that Zig improves very little regarding that, because it is based on the same kind of tooling.


> allocation isn't even part of the language

It's not supposed to be. In most cases, memory allocation is an operating system concept. As a systems programming language, zig must not abstract this away from the programmer.


You don't need allocations to be part of the language, you just need destructors (or linear types). But the Zig authors intentionally omitted destructors because they want all control flow to be explicit. There's more at [0].

Side note: the proposal at [0] seems very close to [1].

[0]: https://github.com/ziglang/zig/issues/782

[1]: https://www.haskellforall.com/2013/06/the-resource-applicati...
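
To make that trade-off concrete, here is a minimal sketch of the defer-based cleanup Zig uses instead of destructors; readConfig and the size limit are made-up for illustration:

    const std = @import("std");

    // No destructors: cleanup is written out explicitly at the call site,
    // and defer runs it on every exit path, including error returns.
    fn readConfig(allocator: std.mem.Allocator, path: []const u8) ![]u8 {
        const file = try std.fs.cwd().openFile(path, .{});
        defer file.close();

        return file.readToEndAlloc(allocator, 1024 * 1024);
    }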


Too late to edit, but: I'm an idiot, I was thinking of memory leaks, not use-after-free.


Allocation isn't part of the core language in Rust either, yet it prevents use-after-free generically via RAII.


Currently I'm using Rust in a project without RAII. RAII is neither necessary nor sufficient to prevent use-after-free. In Rust use-after-free is prevented by the borrow-checker. In C++ with RAII use-after-free is still possible, because normal references aren't tracked with regard to the lifetime of pointees.


Microsoft and Google are trying to improve it via lifetime analysis, but after 3 years it is still pretty much WIP due to C++ semantics.


Is use-after-free ever really defined behavior?


Yes. Undefined behavior exists only in standards. It's perfectly possible for implementations to define everything.

https://www.joelonsoftware.com/2004/06/13/how-microsoft-lost...

> I first heard about this from one of the developers of the hit game SimCity, who told me that there was a critical bug in his application:

> it used memory right after freeing it, a major no-no that happened to work OK on DOS but would not work under Windows where memory that is freed is likely to be snatched up by another running application right away.

> The testers on the Windows team were going through various popular applications, testing them to make sure they worked OK, but SimCity kept crashing.

> They reported this to the Windows developers, who disassembled SimCity, stepped through it in a debugger, found the bug, and added special code that checked if SimCity was running, and if it did, ran the memory allocator in a special mode in which you could still use memory after freeing it.


That would depend on the language and the libraries.

You could roll your own 'safe' memory pool in C such that use-after-free doesn't produce undefined behaviour. You'd still have a bug, but it wouldn't have to invoke UB at the level of the C programming language.


Does anyone have a link to a good zig book/tutorial? I keep seeing it and want to try it out.


As far as I can tell, no really good ones exist yet...

The documentation is still scattered. The official website [1] is a good starting point, and I wish I'd seen this [2] earlier (I hope it gets incorporated into the official resources).

[1] https://ziglang.org/learn/

[2] https://www.lagerdata.com/articles/an-intro-to-zigs-integer-...

This is probably the most comprehensive resource: https://ziglang.org/documentation/master/



Start with Ziglings: https://github.com/ratfactor/ziglings and then after working through all the exercises read chapter 2 of ziglearn.org.


https://ziglearn.org is a nice way to start!


The language has not reached 1.0 yet; it would be silly to write a book at this stage.

You can find documentation on the language website; there is also an active community, and you can ask questions directly on Discord.


There are many erroneous claims made regarding Zig allocators. Contrary to what is claimed, the fact that you pass an explicit allocator does not affect:

    - Whether an out-of-memory condition is handled elegantly or not.
    - Whether an out-of-memory condition is properly reported to the caller.
    - Whether WebAssembly support is simplified; whatever you put in the allocator you could put into malloc.
    - Whether arena allocators are easy to use; you can only use an arena allocator if you intimately know every single use of the passed-in allocator, all the implementation details of the libraries you call, and those of *all* their dependencies. If a dependency, for example, implements a cache, the arena allocator will corrupt it when deallocated.
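
For readers unfamiliar with the pattern being criticized, a minimal sketch of arena usage (and of the caveat about hidden caches) might look like this:

    const std = @import("std");

    pub fn main() !void {
        // Everything allocated through the arena is freed in one shot by
        // deinit(). As noted above, this is only safe if nothing reached
        // through this allocator stashes pointers (e.g. a library-internal
        // cache) that are meant to outlive the arena.
        var arena = std.heap.ArenaAllocator.init(std.heap.page_allocator);
        defer arena.deinit();

        const allocator = arena.allocator();
        const numbers = try allocator.alloc(u32, 100);
        numbers[0] = 42;
        std.debug.print("{d}\n", .{numbers[0]});
    }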


As a C programmer, I didn't like that Zig just gets rid of macros. I prefer the Rust approach to make better macros.


There are many ways to approach macros. Zig's approach with compile-time evaluation handles many of them, especially since types can be created on the fly. Another approach, which is strongly inspired by the ASTEC project for C, is what I use for my C-like "C3": http://www.c3-lang.org/macros/

Zig is clearly about a more homogeneous approach, whereas I am happy to have multiple pieces of syntax to cover different use cases.


It's always possible to use the C preprocessor on non-C files; I saw people do it in Java, and I heard it's somewhat common in C#. It's also used by various Unix utilities (xrdb in X11, for instance). So technically every language can have C macros :-)


I think they also have better macros, with compile time code evaluation.
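
A small sketch of what that looks like in practice: where C might use a macro or an external generator, a table can be computed with ordinary code at compile time (the names here are illustrative).

    const std = @import("std");

    // squares is filled in at compile time by a normal loop; no macro
    // language or separate code-generation step is involved.
    const squares = blk: {
        var table: [16]u32 = undefined;
        var i: u32 = 0;
        while (i < table.len) : (i += 1) {
            table[i] = i * i;
        }
        break :blk table;
    };

    pub fn main() void {
        std.debug.print("{d}\n", .{squares[7]}); // 49, baked in at build time
    }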


One big benefit I see is that it increases design pressure on other features of the language.


[flagged]


Curious why someone wouldn't like Kelley. Is it because he shoots down feature additions a lot?


I don't know much about Kelley, but if the goal is to create a "C replacement", aggressively keeping the language surface area small seems like a reasonable way to go.


And even if you don't think that's right, we already have the large-surface-area systems programming language experiment in Rust. Aren't we better off with Rust and Zig exploring clearly different paths for a C replacement than with both of them trying the same things?


Exactly, this is why he sometimes comes off as a jerk, because he tells people stuff like "No, you can't add your virtual function table feature to Zig, OOP programming is a bad idea in a systems language."


Indeed. In this video he explains the reason why he made those choices:

https://youtu.be/Gv2I7qTux7g?t=120

All these programming language features require a lot of machinery, complicating the binary interfaces of the resulting software and preventing code reuse.


Simple, small languages get complicated when used in large applications, while complex languages become simpler and easier to work with in large applications.


That mostly seems correct, but Zig will be an interesting one to watch due to the approach of having powerful compiletime execution. I.e. Zig doesn't have generics in the traditional sense, but you can get them by writing normal code which creates types for you at compiletime. The promise would be that you can get more advanced features without adding complexity to the language, i.e. like in Rust where you have to basically understand the AST in depth to be able to write macros. We'll see if it pans out.

Also for instance Go has been able to stay relevant while staying very small. Complex languages can help with complex projects if there is a good program structure, but one advantage of simple languages is there is only so much of a mess which can be made if developers go a bit off the reservation.


Yeah, this is not my experience. Enterprise distributed systems are much easier in Erlang and Elixir, which are simple languages, than in Java, which is not.


They are functional languages. There is a difference.


He is childish. He had a record-breaking flame war on HN.


What was it about? Link?


I'm assuming that he's referring to this thread, https://news.ycombinator.com/item?id=20229632


A key feature of a "flame war" is long chains of responses. By not responding to amedvednikov, AndyKelley did not participate in a flame war on that post. It was a call-out. His criticism was quite relevant at the time, because the article was a false release announcement, and the project was garnering a lot of attention (and taking people's money) by making outlandish claims, with (at the time) nothing to show for it.


There were many posts both on HN and outside HN, including GitHub, Reddit, and Twitter.

His criticism was not constructive. He made Linus-style comments.


There is no such thing as "constructive criticism" of suspected fraud -- there were a bunch of red flags, and AK was not remotely alone in sounding the alarm. Do you have constructive criticism to offer, is AK still badgering AN, or are you just airing grievances about 2 year-old drama?


For the record this was my last public statement about V, which I made nearly 2 years ago: https://twitter.com/andy_kelley/status/1142503808901308418

ends with "I'm genuinely glad it turned out this way. Good luck on your endeavor and welcome to the programming languages club."


Thank you for building Zig. It's made a huge impression on me, especially the Road to Zig presentation.


> There is no such thing as "constructive criticism" of suspected fraud

You can criticize anyone constructively. What they are today is not what they will be in the future. People can change, even Andrew, but if something is repeated over and over, it is a sign of their personality.

I see his act as harassment. I know that Vlang had red flags, but I sense Russophobia or jealousy in his actions. I looked through everything, but the guy didn't let the man talk and explain himself. So the other side of the story is hushed up. It's like asking someone to prove their worth, otherwise they'll be killed. Naive Bayes can uncover many cases of fraud, but bias can turn things upside down.

I'd hope that Mr. Andrew is not an anarchist.


> You can criticize anyone constructively

Please, show us how that's done. Otherwise, please stop. AK's last word on the matter was rather cordial, and what you're doing here is toxic.


Care to elaborate on why?


vLang

He has gone to great lengths to tarnish this language's creator. It shows me he is up to something, and I don't like that.

I pretty much don't use Vlang, but I dislike the way Kelley behaved.


Yeah, he might have come on kind of strong against the V guy, but as I understand it there are some vaporware-like aspects to VLang (or there were at the time he pointed them out).


He’s a great guy!


what or who is Kelly?


Andrew Kelley, the initiator of Zig

Edit: name spelling


Kelley btw


I'm leaving HN and its echo chamber altogether. HN has no basis in reality. Bye!


Your account has only been registered for 30 days...


And this is the second departure announcement...

https://news.ycombinator.com/item?id=26967555



