cloudhead's comments

I’m working on Radiant Computer: https://radiant.computer — a new from-scratch personal computer and OS.

The title is misleading


Misleading in what way? This is the linker part of a series of posts about understanding the Go compiler. I don't think there's much room for it to be misleading.


Can you elaborate? Zig has a lot of traction already.


Except for TigerBeetle customers and the few using Bun, what traction?



Another one for the list, though it hardly sounds like a killer application.


It's been my daily driver for close to a year now. It might not be a killer application, but it's certainly enough to prove Zig isn't vapourware.


If that is enough, there are plenty of languages around that fit the bill.


I've read that Bun is just a wrapper, not an actual Zig implementation, anyway. Also, building a financial database in a beta language that constantly changes and breaks is "really smart".


You can actually go and read the source yourself [1]. If Bun is "just a wrapper", then surely Node.js and Deno are too?

[1] https://github.com/oven-sh/bun/tree/main/src


A wrapper over what?? Bun includes the JavaScriptCore engine for JS evaluation, but it's so much more.

As for financial database concerns, if you're serious about including a project like that in your system, you have thorough correctness and performance testing stages before you commit to it. And once it passes the hurdles, at that point what difference does it make if it's written in a beta language, or a bunch of shell scripts in a trench coat, or whatever.


Ghostty.


ZML


What is even that? Not bothering to Google for it, which shows how irrelevant it is.


Good call, for bad reasons.

> At ZML, we are creating exciting AI produ[...]


Uhm yeah, a touch screen is not a keyboard. It will never be one.


So 5000 IU is the recommended amount?


This was linked on here a couple of months ago: [The Big Vitamin D Mistake [2017]](https://pmc.ncbi.nlm.nih.gov/articles/PMC5541280/)

> A statistical error in the estimation of the recommended dietary allowance (RDA) for vitamin D was recently discovered; in a correct analysis of the data used by the Institute of Medicine, it was found that 8895 IU/d was needed for 97.5% of individuals to achieve values ≥50 nmol/L. Another study confirmed that 6201 IU/d was needed to achieve 75 nmol/L and 9122 IU/d was needed to reach 100 nmol/L.

> This could lead to a recommendation of 1000 IU for children <1 year on enriched formula and 1500 IU for breastfed children older than 6 months, 3000 IU for children >1 year of age, and around 8000 IU for young adults and thereafter. Actions are urgently needed to protect the global population from vitamin D deficiency.

> ...

> Since 10 000 IU/d is needed to achieve 100 nmol/L [9], except for individuals with vitamin D hypersensitivity, and since there is no evidence of adverse effects associated with serum 25(OH)D levels <140 nmol/L, leaving a considerable margin of safety for efforts to raise the population-wide concentration to around 100 nmol/L, the doses we propose could be used to reach the level of 75 nmol/L or preferably 100 nmol/L.


Multiple previous discussions:

https://hn.algolia.com/?q=vitamin+d+mistake

Vitamin D is a favorite topic around here:

https://hn.algolia.com/?q=vitamin+d


It depends. I have MS and I take 10k IU. My cousin who also has MS takes 20k but gets regular blood tests for it.


According to what I read in a newspaper article, the recommended dose is much lower, at 800.


According to the internet, it is way higher, probably over 9000.

Edit, because the comment might be too shallow for HN: I sympathize with the struggle against depression and, after first-hand experience, share the skepticism toward the widespread prescription of antidepressants and the evidence presented for it.

Very serious and important topic.

Regarding vitamin D, I also supplement in the winter, but I have not read the article, which lists an estimated reading time > 10 min. I take at most one 1000 IU (0.025 mg, according to the package) tablet a day.

I'll bookmark this discussion page to read TFA later maybe.


It’s important to take Vitamin D, as a fat soluble vitamin, with dietary fat during a meal. Something about bile production and absorption.

Also important to take it with Vitamin K.


Yes, I remember that and have Vitamin D+K combo tablets with calcium.

Seems like it would be best to increase time spent outdoors though.


There's likely significant individual variation in bioavailability. I would start with 2-5K/day, then measure and iterate.


I was taking 2x2000 IU with almost no sun exposure and then did bloodwork. My level was 77.8 ng/mL. The lab's reference ranges listed 30-50 ng/mL as optimal, 50-100 as high, over 100 as potentially toxic, and over 200 as toxic.


I don't know why this is downvoted; I had a very similar experience a while back. I took 4000 IU/day for about 4 months, with insignificant sun exposure, and ended up at 60 ng/mL (the lab listed the normal range as 30-40).

My starting levels were unknown but I assumed they were low given my usual sun exposure and some low-energy symptoms (which resolved a couple of weeks after I started taking it). I discontinued VitD then and now I only take 1000 IU/day in the winter.


With K3! Otherwise you're fucking yourself up.


Oh dear, here we go again.

IU, not mg.

K2, not K3.


5000 IU is very high. It might be beneficial during the winter for folks with very fair skin, but most people probably shouldn't take that much every day.


You mean very dark skin?

It's my understanding that northern Europeans evolved fair skin in order to cope with the lack of vitamin D in their diet.


yes, i had it backwards - thanks for the correction.


You got it backwards: it would be more beneficial in areas with few hours of sun for darker skin folks, since they do not absorb as much vitamin D as fair skin folks do.


absorb or create?

i understand it as: absorbing is in the intestine, generating D happens in the skin when exposed to the sun


Correct, what I meant was absorbing UV light, that then as you state creates D.


That's equivalent to about 10 minutes of sun exposure. Not very much when you look at it that way.


That comparison doesn't work. Only 10-20% of the vitamin D we take in comes from food, and the body cannot process more than that from this source; taking more orally does not scale the benefit indefinitely. The skin is much better at producing it.


The skin is definitely much better, but a higher-than-"recommended" dose is definitely (anecdata) effective at bringing up and maintaining the measurable vitamin D3 level in your blood if you are under the recommended range. It's an important metric to track in your regular blood tests.


I think you mean for those with very dark skin, not fair?


Hyperion, anyone?


What's MC/DC?


Modified Condition/Decision Coverage (MC/DC) is a test coverage approach that considers a chunk of code covered if:

- Every branch was "visited". Plain coverage already ensures that. I would actually advocate for 100% branch coverage before 100% line coverage.

- Every part (condition) of a branch clause has taken all possible values. If you have `if (enabled && limit > 0)`, MC/DC requires you to test with enabled, !enabled, limit > 0, and limit <= 0.

- Every condition was shown to independently change the outcome. `(false && limit > 0)` would not pass this: a change to limit never affects the outcome, since the decision is always false. But @zweifuss has a better example.

- And, of course, every possible decision (the outcome of the entire 'enabled && limit > 0') needs to be tested. This is what ensures that every branch is taken for if statements, but also for switch statements that they are exhaustive etc.

MC/DC is usually required for all safety-critical code as per NASA, ESA, automotive (ISO 26262) and industrial (IEC 61508).
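A minimal sketch of the MC/DC test set for the `enabled && limit > 0` example above (the `guard` function name is made up for illustration):

```rust
// The branch clause under test: `enabled && limit > 0`.
fn guard(enabled: bool, limit: i32) -> bool {
    enabled && limit > 0
}

fn main() {
    // Vector 1: both conditions true — decision true (baseline).
    assert!(guard(true, 1));
    // Vector 2: flip only `enabled` — the decision flips,
    // showing `enabled` independently affects the outcome.
    assert!(!guard(false, 1));
    // Vector 3: flip only `limit > 0` — the decision flips,
    // showing `limit > 0` independently affects the outcome.
    assert!(!guard(true, 0));
}
```

Note that three vectors cover two conditions; in general MC/DC needs only n + 1 tests for n conditions, versus 2^n for exhaustive condition testing.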


Limit <=0 appears to include every number between 0 and INT_MIN.

I hope you don't have any string inputs, or your test is gonna take a while to run!


To test 'limit > 0' according to MC/DC, you need only two values, e.g. -1 and 1. There may be other code inside the branch using limit in some other ways, prompting more test cases and more values of limit but this one only needs two.

But yes, exhaustively testing your code is a bit exhausting ;)


Modified Condition/Decision Coverage

It's mandated by DO-178C for the highest-level (Level A) avionics software.

Example: if (A && B || C) { ... } else { ... } needs individual tests for A, B, and C.

  Test #  A      B      A && B  Outcome taken  Shows independence for
  1       True   True   True    if branch      (baseline true)
  2       False  True   False   else branch    A (A flips outcome while B fixed at True)
  3       True   False  False   else branch    B (B flips outcome while A fixed at True)


I made a mistake:

  Test #  A      B      C      Result
  1       True   True   False  True
  2       False  True   False  False
  3       True   False  False  False
  4       False  True   True   True


Basically branch coverage but also all variations of the predicates, e.g. testing both true || true, and true || false



How does this work? Files need to reference other files, e.g. for calling functions from other modules, which means semantic analysis needs both files in memory to check the types. This is especially complicated with mutual recursion across modules (separate compilation doesn't apply here). If you're building a language like C where everything requires forward declarations, then maybe, but anything more modern seems difficult.


The fact that you need either a third-party dependency or a large amount of boilerplate just to get decent error reporting points to an issue in the language or std library design.

I've also started dropping `thiserror` when building libraries, as I don't want downstream users of my libraries to incur this additional dependency, but it's a pain.
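For context, here is roughly the boilerplate being traded against the dependency — a hand-written error type (the `FetchError` name and its variants are hypothetical) doing what a `thiserror` derive would otherwise generate:

```rust
use std::fmt;

// A hypothetical library error type, written out by hand
// instead of deriving Display/Error/From with `thiserror`.
#[derive(Debug)]
pub enum FetchError {
    Io(std::io::Error),
    InvalidHeader { name: String },
}

impl fmt::Display for FetchError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            FetchError::Io(e) => write!(f, "i/o error: {e}"),
            FetchError::InvalidHeader { name } => write!(f, "invalid header: {name}"),
        }
    }
}

impl std::error::Error for FetchError {
    fn source(&self) -> Option<&(dyn std::error::Error + 'static)> {
        match self {
            FetchError::Io(e) => Some(e),
            FetchError::InvalidHeader { .. } => None,
        }
    }
}

// The `?`-operator conversion that `#[from]` would have generated.
impl From<std::io::Error> for FetchError {
    fn from(e: std::io::Error) -> Self {
        FetchError::Io(e)
    }
}

fn main() {
    let err: FetchError =
        std::io::Error::new(std::io::ErrorKind::Other, "boom").into();
    assert_eq!(err.to_string(), "i/o error: boom");
}
```

With `thiserror` the same type is a handful of attribute lines; the tradeoff is exactly the dependency-vs-boilerplate choice described above.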


> points to an issue in the language or std library design.

Rust has a too-small stdlib because they want to avoid a calcified stdlib like C++ and Python, which both have too-big stdlibs.

This is a law of nature: your stdlib can either be too small or too big. It cannot be the right size. At least it isn't C.


I think you're right with regard to the intention, but I've personally not experienced a std lib being too big. Good examples of "the right size" would be Go or Zig.


Why are people disagreeing with this? This is absolutely a problem that most other languages don't have. If you want to claim that Rust's error system is "better" than anything else (as the author did), you should have a good argument about why this exact problem the parent commenter described, which to me is a major problem, does not (maybe) cancel out all the other purported benefits of Rust's error system!


This isn't a problem in other languages because most other languages don't have strong, statically typed errors that need to compose across libraries. And those that do have the same problem.

The general argument against adding something to `std` is that once the API is stabilized, it's stabilized forever (or at least for an edition, but practically I don't think many APIs have been changed or broken across editions in std).

The aversion to dependencies is just something you have to get over in Rust imo. std is purposefully kept small and that's a good thing (although it's still bigger and better than C++, which is the chief language to compare against).


False dichotomy: if reliability matters, you have to invest in both. Fault tolerance is not a replacement for correctness.

