
Unlike when God slaughtered several million bison, right? https://allthatsinteresting.com/buffalo-slaughter


That’s the second clause. Anything after 50 years ago interferes with the environment. Even returning things to before that time interferes with the environment. That’s the cutoff date, I’m afraid.


Ah, right, the 50-year cut-off makes a lot of sense, circa Exxon Mobil deciding to engage in the largest terraforming experiment done on Earth for profit despite its own knowledge of the consequences. That makes total sense; God created Exxon Mobil, and anything we do to challenge it is interfering with God's fine handiwork.

Let me prepare the school curricula for next calendar year.


NEPA was passed about then and most people would prefer it stay, so yeah, it makes sense. It’s why we don’t have environmental review for I-5 but we do for new bike lanes. I-5 was already decided in the 1960s so the construction was okay. But bike lanes are today and need extensive environmental review, as any environmentalist will tell you.


Superluminal is a sampling profiler for the most part. It works great for what it does, sure. But in the author's own words:

> So far, we’ve only used perf to record stack traces at a regular time interval. This is useful, but only scratching the surface.

For cache hits and other counters, you're gonna have to go deeper than just sampling.


How would you go about measuring cache hits instead of using a sampling profiler? Would you use eBPF / BCC?


The article describes using strace and perf in the following paragraph.


You have earned your Martini.


Missing from this post: string_of_int, int_of_string, +, +., etc. That alone is a massive turn-off for me; I'd rather write C at that point. Any modern language needs some kind of polymorphism and should make user-defined types feel like first-class citizens.


> string_of_int, int_of_string

That didn't bother me so much because I speak Spanish and can read French. OCaml is of French origin. `string_of_int` is a bad English translation; it should have been `string_from_int`.

I like F# where I can use the `int` or `string` functions:

    let myString = "2024"
    let myInt = int myString
    let myStringAgain = string myInt


The OCaml standard library added an Int module with a to_string function (so Int.to_string) and a generic printer ages ago. There is also a (+) operator for floats in the Float module, which you can open in any scope if you so wish.

OCaml obviously supports polymorphism and has an extremely expressive type system. The fact that operators are not polymorphic is purely a choice (and a good one).
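
For instance, a minimal sketch of the Int part (assuming OCaml >= 4.08, where the Int module landed in the stdlib):

    let s : string = Int.to_string 42        (* "42", no string_of_int needed *)
    let n : int = int_of_string "2024"       (* the old-style conversion still works *)
    let () = Printf.printf "%s %d\n" s n     (* prints "42 2024" *)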


As the other commenter mentioned, it's a mistranslation. I read string_of_int as "string out of int", which makes it read better.


Interestingly enough, OCaml has a great polymorphism story in its OO system. Because it is structurally typed with nearly automatic inference, you can in fact write completely generic code like `x#y(z)`, which magically "just works" for any x that has a method y that accepts z - all inferred and statically type-checked.
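
A minimal sketch of that (the `area` method and the helper functions are made-up names, just for illustration):

    (* describe is inferred as < area : float; .. > -> unit: any object with a
       compatible #area method is accepted, all checked statically. *)
    let describe shape = Printf.printf "area = %f\n" shape#area

    let circle r = object method area = 3.14159 *. r *. r end
    let square s = object method area = s *. s end

    let () = describe (circle 1.0); describe (square 2.0)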


> Because it is structurally typed with nearly automatic inference, you can in fact write completely generic code like `x#y(z)`, which magically "just works"

aka let's mix two jars of jam and shit via this funnel and see what happens.


On the contrary, static and structural typing are a match made in heaven.


Interesting. Why doesn't the standard lib use that for the examples I listed?


Because those types are not object types, so they don't have methods associated with them at all. This is unlike, say, CLR languages in which all types are object types.

There's been research on modular implicits for OCaml to solve this more generally, but that's not landing upstream anytime soon.


Compiler messages:

The big difference here is that the OCaml compiler has a lot less work to do. It's not that the Haskell error messages are inadequate (they are actually pretty good), but the sheer number of compiler features and the type gymnastics make the errors deeper and more complex. For example, if you get the parens wrong around a >> or >>=, you'll get some rather cryptic error that only hits home once you've seen it a few times, as opposed to "did you mean to put parens over there?"


Engineer vs mathematician. Haskell is the schizophrenic product.

> If I come to an existing OCaml project, the worst thing previous developers could do to it is have poor variable names, minimal documentation, and 200+ LOC functions. That’s fine, nothing extraordinary, I can handle that.

> If I come to an existing Haskell project, the worst thing previous developers could do… Well, my previous 8 years of Haskell experience can’t prepare me for that.

This is kind of like Go vs C++, or <anything> vs Common Lisp. The former is a rather unsophisticated and limited language, not particularly educational or enlightening, but good when you need N developers churning out code while onboarding M new ones. The latter is like tripping on LSD: it's one hell of a trip and an education, but unless you adopt specific guidelines, it's going to be harder to get your friends on board. See, for example: https://www.parsonsmatt.org/2019/12/26/write_junior_code.htm...


Go is good for onboarding people onto a project, but not much else.

There's a reason Google is migrating Go services to Rust:

https://www.theregister.com/2024/03/31/rust_google_c/

> "When we've rewritten systems from Go into Rust, we've found that it takes about the same size team about the same amount of time to build it," said Bergstrom. "That is, there's no loss in productivity when moving from Go to Rust. And the interesting thing is we do see some benefits from it.

> "So we see reduced memory usage in the services that we've moved from Go ... and we see a decreased defect rate over time in those services that have been rewritten in Rust – so increasing correctness."

That matches my experience: Go services tend to be tire fires, and they churn developers on and off teams pretty fast.


You'd expect a rewrite to take less time than development of the original system from scratch. So I'm not sure this is actually as favorable a result for Rust as it's presented.


Isn't Go's concurrency model an advantage over other approaches?


When it exactly fits your problem, yes. But it's not like you can't express that model in Rust (in a more cumbersome way) when you need to.


OCaml is not an unsophisticated language. It inherits the features of ML and has first-class modules, which are not present in Haskell by default (Backpack provides something similar). Not having first-class modules leads to a lot of issues.

Also, there is a better story for compilation to the web.
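
For readers unfamiliar with first-class modules, a minimal sketch (ORDERED and max_of are hypothetical names used only for illustration):

    module type ORDERED = sig
      type t
      val compare : t -> t -> int
    end

    (* The module is packed into an ordinary value and unpacked in the pattern. *)
    let max_of (type a) (module O : ORDERED with type t = a) (x : a) (y : a) =
      if O.compare x y >= 0 then x else y

    (* Stdlib.Int already matches ORDERED, so it can be passed as a value. *)
    let seven = max_of (module Int : ORDERED with type t = int) 3 7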


OCaml's type system is quite janky and simplistic compared to Haskell's. The first class module system is fairly nice, although it leads to an annoying problem where now you kind of have two "levels" to the language (module level and normal level). This is arguably analogous to Haskell having a "term level language" and a "type level language", where the type system is more prolog-y than the term language. Also, Haskell's type system is powerful enough to do most of the things you'd want the OCaml module system for, and more. I do occasionally miss the OCaml module system, but not most of the time.
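
To make the "two levels" concrete, here is a minimal functor sketch (MakeInterval is a hypothetical name); the application happens at the module level, not the term level:

    module MakeInterval (O : sig type t val compare : t -> t -> int end) = struct
      type t = { lo : O.t; hi : O.t }
      let make lo hi =
        if O.compare lo hi <= 0 then { lo; hi } else { lo = hi; hi = lo }
    end

    module IntInterval = MakeInterval (Int)   (* module-level application *)
    let i = IntInterval.make 3 1              (* { lo = 1; hi = 3 } *)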


Conversely, the OCaml module system is powerful enough to do all the things you'd want to do with Haskell, except the OCaml module system is nice to use.

Anyway, the issue has nothing to do with relative power. The issue is that the Haskell community encourages practices which lead to unreadable code: lots of new operators, point-free style, fancy abstractions. Meanwhile, the OCaml community has always been very different, with a general dislike of fancy things unless they are unavoidable.


> except the Ocaml module system is nice to use

This comment doesn't lead me to believe you've ever worked in an OCaml shop. It's only "nice to use" for trivial use cases, but it quickly devolves into a "functorial" mess in practice.

> the Ocaml community was always very different with a general dislike of overly fancy things when they were not unavoidable

This is the exact thing that people always say when they are coping about their language being underpowered.


If by "encourages" you mean "has features", then yes. The typical haskell shop doesn't really encourage complex feature use, it's the people learning/online who don't actually need to work within their solutions, do. That's what seems to draw (some) people to haskell.


Learning a “pure” language is a lot like tripping on LSD.

The people who do it can’t stop talking about how great it was, but also can’t really explain why it was so great, and when they try it just sounds ridiculous, maybe even to them. And then they finish by saying that you should drop acid too and then you’ll understand.


The reality is that people want what you produce when you're sober, not your fantasy hallucinations.


> also can’t really explain why it was so great

I like it when

  assertTrue (f x)  -- passes in test
means that

  assertTrue (f x)  -- passes in prod


Is there a language where that isn’t the case?


Approximately all of them. The property is "referential transparency", and it's such a sensible thing to have that people assume they already have it (per your question).

The "test/prod" was an unnecessary detail - there's really nothing saying that f(x) will equal f(x) in most languages in most circumstances! It can return different things on repeated calls to it, it can behave differently if two threads call into it at once.

It's a major part of the reason people don't see the appeal of Haskell. They think they already have "type-safety" and "functional stuff" and "generics" and "null-safety" - but it's really not the same.
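
To make that concrete in the thread's own terms, a minimal OCaml sketch of a referentially opaque `f` (the function is made up for illustration):

    (* f hides mutable state, so calling it twice with the same argument differs. *)
    let f =
      let calls = ref 0 in
      fun x -> incr calls; x + !calls

    let a = f 1   (* 2 *)
    let b = f 1   (* 3: same input, different output *)

In Haskell, hidden state like this would have to show up as IO in the type (barring a sneaky unsafePerformIO), which is the guarantee being described.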


Haskell isn't all that pure.


What do you mean by that? All functions in Haskell are pure unless you explicitly use unsafePerformIO or similar (which you rarely ever have to do).


They can still have side-effects like non-termination.

But I didn't mean purity in that formal sense. I meant that Haskell is plenty pragmatic in its design.


To me, "pure" means referential transparency: same input, same output. So an `Int -> Int` function will return same result on same argument. So, similar to `Int -> IO Int`, the function (action) will return an Int after interacting with outside world, `IO` tracking the fact that this is the case.


Lambda calculus is as pure as can be, and also has terms that don't normalize. That is not considered a side effect.

A better example of impurity in Haskell for pragmatism's sake is the trace function, which can be used to print debugging information from pure functions.


> Lambda calculus is as pure as can be, and also has terms that don't normalize. That is not considered a side effect.

Many typed lambda calculi do normalise. You can also have a look at https://dhall-lang.org/ for something pragmatic that normalises.

> A better example of impurity in Haskell for pragmatic's sake is the trace function, that can be used to print debugging information from pure functions.

Well, but that's just unsafePerformIO (or unsafePerformIO-like) stuff under the hood; that was already mentioned.


> They can still have side-effects like non-termination.

You can still have total functions that don't finish in a humanly/business-reasonable amount of time.


Yes?

Just like pure functions can use more memory than your system has. Or computing them can cause your CPU to heat up, which is surely a side-effect.


It doesn't have great support for Dependent Types


what does that have to do with purity?


Nothing, but arguably a language with dependent types is more Haskell than Haskell


"You mean you're going to make a copy of that every time?"


Haha, can't tell if you're joking or not.

For anyone else reading - you don't need to make a copy if you know your data isn't going to change under your feet.

https://dev.to/kylec32/effective-java-make-defensive-copies-...
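
And for contrast, a minimal sketch of the immutable-data case where no defensive copy is needed, in OCaml (the list names are just for illustration):

    let base = [1; 2; 3]        (* an immutable list *)
    let extended = 0 :: base    (* no copy: the new cell just points at base *)
    let () = assert (List.tl extended == base)   (* physical equality: the tail is shared *)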


I was half-joking. I wasn't aware Java was promoting "defensive copies" :D


Exactly. Plus you get all the discounts scavenging on excess inventory.


excessive inventory of items you don't really need but buy anyway


Well, some of us buy exactly what we planned to buy, only at cheaper prices, and only after checking price comparison sites.

Being unrich[1] for years has given me this skill. I don't frivolously spend money.

[1]: I wouldn't say poor, as I have had a car most of my adult life, but definitely not rich; maybe "not always knowing in advance how to pay for food towards the end of the month" and "being very happy with hand-me-downs" is a good explanation?


And Hollywood.


That's Chinese censorship. Movies leave out or segregate gay relationships because China (and a few other countries) won't allow them.


Unlike in the US?


It's like self-help but flipped on its head. They're going to need a new section for it at the bookstore.

