instig007's comments | Hacker News

> "Who cares?" was glib on my part, I admit. It was obviously stated from the perspective of an American relying on that power, towards other Americans.

> there is no "teacher" to stop the "bullying" here.

It's funny how the same person can appeal to "realism" and then proceed to "leverage" within the same line of thought about the present-day US. Just wait until three or four (insignificantly) smaller powers collude, target, and act against you like hyenas do, then try applying your leverage of ... what exactly?


"Realism" is not being used in the sense of the colloquial word, but as in "political realism," the framework that governs international relations between most superpowers today and in which "leverage" through hard or soft power is the core concept.

> being gratuitously abhorrent and should have known better.

This is an incredibly stupid take, and I would vote for legislation to penalise incredibly stupid takes before gratuitously abhorrent ones, and more harshly so. It would be gloriously wonderful, too.


Cool beans


> pyrsistent is super slow, though

Since when is Python about speed?

> Just ran a quick benchmark

Where's the code? Have you observed the bottleneck call?

> Except at 10k+ items, batch updates on 100K+ items or inserting 100 keys.

> This is rarely the case in practice

Where are the stats on the actual practice?

> You'd better have an incredible ROI to justify that.

The ROI being: fearless API design where 1) multiple instances of high level components are truly independent and could easily parallelize, 2) calling sites know that they keep the original data intact and that callees behave within the immutability constraints, 3) default func inputs and global scope objects are immutable without having to implement another PEP, 4) collections are hashable in general.
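A stdlib-only sketch of points 2 and 4 (not pyrsistent itself; the `Config`/`with_host` names are illustrative): callers know the original is kept intact, and frozen values are hashable.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Config:
    hosts: tuple  # immutable sequence; safe to share across components

def with_host(cfg: Config, host: str) -> Config:
    # call sites know cfg is left intact; a new value is returned
    return Config(hosts=cfg.hosts + (host,))

base = Config(hosts=("a",))
extended = with_host(base, "b")
assert base.hosts == ("a",)          # original untouched
assert extended.hosts == ("a", "b")
cache = {base: 1, extended: 2}       # frozen values are hashable dict keys
```

Libraries like pyrsistent add structural sharing on top of this, so "copies" stay cheap at scale.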


Clearly the ROI is perfect for you.

I won't waste more of your time.


It's perfect for most Python developers actually, not just for me, contrary to your "in practice" claim.


The free IDEA in December 2025: no JS code completion.


> software engineers have egos that will not let them accept that they are not designing critical stuff

> Don't write bad code, but also sometimes just getting something out the door is much better than perfect quality (bird in the hand and all that).

Your bank account can be represented as a CR app (two letters short of CRUD), but that doesn't make it simple or simpler in any sense of the word.

Now the question: how tolerant are you of bugs in your bank account? How often can they happen before you complain?


Banking software is critical, but guess what, most software engineers are not writing banking software. I never said no software engineers write critical code. Heck, I'd argue most will, at some point in their careers, write something that needs to be as bug-free as possible.

My point is that for most software engineering, getting a product out is more important than a super high quality bar that slows everything down.

If you are writing banking software or flight control systems, please do it with care; if you are making some React-based recipe website or something, I don't really care (99% of software engineering falls into this latter category, in my opinion).

Software engineers need to get over themselves a bit, AI really exposed how many were just getting by making repetitive junk and thinking they were special.


> most software engineers are not writing banking software

Many software engineers write software for people who won't like the idea that their request/case can be ignored/failed/lost, when expressed openly on the front page of your business offering. Are bookings important enough? Are gifts for significant events important? Maybe you're okay with losing my code commits every once in a while, I don't know. And I'm not sure why you think it's okay to spread this bad management idea of "not valuable or critical enough" among engineers who should know better and who should keep sources of bad ideas at bay when it comes to software quality in general.


> and a decade of tinkering has just fatigued everyone and destroyed any momentum the language once had.

It's hard to buy that, considering that many of those "fatigued" moved on to Kotlin, led by their managers' bs talking points.


Many of the Scala projects got people fired, something the Scala devs largely ignore. Plus, Scala support is truly awful even by the low standards of an open-source project. Then there is the fact that the Scala-specific libraries are largely dead.

Scala had/has a lot of promise. But how the language is marketed/managed/maintained really let a lot of people down and caused a lot of saltiness about it. And that is before we talk about the church of type-safety.

Scala is a more powerful language than Kotlin. But which do you want: a language with decent support that all your devs can use, or a more powerful language with terrible support that only your very best devs can really take advantage of? And I say this as someone writing a compiler in Scala right now. Scala has its uses. But trying to get physicists used to Python to use it isn't one of them, although that probably says more about the data science folks than about Scala.

PS The GP is right, they should have focused on support and fixing the problems with the Scala compiler instead of changing the language. The original language spec is the best thing the Scala devs ever made.


Kotlin has become a pretty big and complex language on its own so I'm not sure this is a good counterexample.

The fundamental issue is that fixing Scala 2 warts warranted an entirely new compiler, TASTy, revamped macros... There was no way around most of the migration pains that we've witnessed. And at least the standard library got frozen for 6+ years.

However I agree that the syntax is a textbook case of trying to fix what ain't broke. Scala 3's syntax improvements should have stuck to the new given/using keywords, quiet if/then/else, and no more overloaded underscore abuse.
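For readers who haven't met the new keywords, a minimal sketch of `given`/`using` and the quiet `if/then/else` (the `Show` type class here is illustrative, not from any library):

```scala
// Scala 3's `given`/`using` replace Scala 2's `implicit` keyword.
trait Show[A]:
  def show(a: A): String

given Show[Int] with
  def show(a: Int): String = s"Int($a)"

// `using` marks the contextual parameter resolved at the call site
def render[A](a: A)(using s: Show[A]): String = s.show(a)

// quiet syntax: if/then/else without parentheses
def sign(n: Int): String =
  if n > 0 then "positive" else "non-positive"
```

Here `render(42)` yields `"Int(42)"` because the `given` instance is summoned automatically at the call site.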


> However I agree that the syntax is a textbook case of trying to fix what ain't broke.

The great new syntax is the very reason I don't want to even touch Scala 2 any more.

The syntax change is the absolute highlight in Scala 3. It makes the language so much better!

The only real problem was that it happened so late; at least a decade too late.


One impressive thing for us is that the changes to macros were hardly an issue. We'd been trending off macro-heavy libraries for a while, and our Scala 3 adoption has not really been harmed by the new macro system.


> The original language spec is the best thing the Scala devs ever made.

The overwhelming majority thinks that Scala 3 is objectively much better than Scala 2 ever was. That's at least what you hear everywhere, besides the occasional outlier from some Scala 2 die-hards.


> Scala had/has a lot of promise. But how the language is marketed/managed/maintained really let a lot of people down and caused a lot of saltiness about it. And that is before we talk about the church of type-safety.

On the contrary, there was nothing wrong with Scala's marketing. What's damaged it is a decade of FUD and outright lies from the people marketing Kotlin.


> And I bet those green threads still need an IO type of some sort to encode anything non-pure, plus usually do-syntax.

There's no need for the do-syntax at all. The (IO a) is no different from other generic types that can be fmap-ed, composed point-free with other expressions, and joined/folded when required. The only difference is that its values represent actions that affect the real world, so the ordering of things suddenly becomes important.
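A minimal sketch of that point (function names are illustrative): sequencing IO with plain fmap, composition, and the monadic operators, no do-notation in sight.

```haskell
import Data.Char (toUpper)

-- (IO a) composes like any other Functor/Monad; `do` is only sugar.
greet :: String -> IO ()
greet = putStrLn . map toUpper   -- ordinary composition into an IO action

main :: IO ()
main = pure "hello" >>= greet >> greet "world"
-- (>>=) and (>>) make the ordering of the two effects explicit
```

Running `main` prints HELLO followed by WORLD; the sequencing comes entirely from the operators, not from any syntax.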


Right, and there's also no need for await syntax at all: futures can be then-ed, ContinueWith-ed, or whatever the language calls it. But people keep bringing syntax into a semantics battle, so I had to mention it.
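For illustration (the `fetchTotal` helper and its Promise-returning inputs are hypothetical), the same sequencing that `await` would express, written with explicit `.then` chaining:

```javascript
// `await` is sugar: explicit continuation chaining does the same job.
function fetchTotal(getPrice, getTax) {
  return getPrice().then(price =>
    getTax().then(tax => price + tax));  // nesting fixes the ordering
}

fetchTotal(() => Promise.resolve(10), () => Promise.resolve(2))
  .then(total => console.log(total));    // prints 12
```

The semantics (a value available later, with explicit sequencing) are identical either way; only the surface syntax differs.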


If you ever need this kind of stuff, you'll be better off building your own distributed interface by using plain regular GHC Haskell and https://haskell-distributed.github.io/


`haskell-distributed` is awesome, but the reason Unison is a new language is exactly to avoid the limitations of such frameworks, namely that they _can't_ send arbitrary code and data around like Unison can.

You should totally write up a tutorial demoing the development of the same application using each. Folks would love that.


> but the reason Unison is a new language is exactly to avoid the limitations of such frameworks

by introducing another set of limitations that Unison's docs don't state upfront, because it would affect their progression as a business.

> namely that they _can't_ send arbitrary code and data around like Unison can.

> You should totally write up a tutorial demoing the development of the same application using each. Folks would love that.

You should add "out of the box", and that "like Unison can" isn't the only way to implement a distributed runtime. This leads to another logical question: why should I "totally write up" the Unison way of things? At this point I'm not even sure that 99.999% of workloads need distributed runtimes of any kind, let alone the Unison one.


> The range of problems where C++ is unequivocally the superior solution is getting smaller.

The range of issues where the superior solutions offer language features superior to the features of modern C++ is getting smaller too.


The C++ features that get bolted on to replicate those in other languages tend never to reach parity because of all the legacy baggage they need to design around. Modules are not nearly as useful as one would hope. std::variant and std::optional are not nearly as ergonomic or safe to use as the Rust equivalents. Coroutines are not exactly what anyone really wanted. If you're simply looking for checkboxes on features, then I suppose you have a point.

To be clear, I like and continue to use modern C++ daily, but I also use Rust daily, and you cannot really make a straight-faced argument that C++ is catching up. I do think both languages offer a lot that higher-level languages like Go and Python don't, which is why I never venture into those languages, regardless of performance needs.


> std::variant and std::optional are not nearly as ergonomic or safe to use as the Rust equivalents.

> but I also use Rust daily, and you cannot really make a straight-faced argument that C++ is catching up.

I mostly use std::ranges, lambdas, and concepts, and I see them catching up, as an evolutionary process rather than a fixed implementation in the current standard. Nowadays I can do familiar folds and parallel traversals that I couldn't do in the past without pulling in third-party libraries. My optionals are empty vectors: it suits my algorithms and interfaces a lot, and I never liked `Maybe a` anyway (`Either errorOrDefault a` is so much better). I also use Haskell a lot, and I'm used to the idea that outside of my functional garden the industry's support for unions is hardly different from its support for 2-tuples of (<label>, <T>), so I don't mind the current state of std::variant either.


Everyone is welcome to their own opinions, and there is definitely movement in the right direction. However, it's a far cry from catching up. std::variant doesn't force you to check the tag and doesn't have any easy way to exhaustively match on all the types it stores. I'm not sure I understand what you're comparing it to in terms of tuples. Forcing everything to be "nullable" or to have a default state can be painful to deal with and introduces states I often wish were impossible to represent. Ranges are definitely nice compared to what we had before. I'm probably just not used to them, but they aren't always intuitive for me, and spelling them out is very verbose. It's not nearly as simple as calling map and filter on a collection or iterator.


There are definitely holes, but I'm wondering what you are referring to here.


> Once you get to that point, you might as well create and learn a different language.

Nope, it's still incredibly valuable to be able to compile two different translation units as C++14 and C++26 and then later link them together (all without leaving the familiar toolchains and ecosystems). That's how big legacy projects can evolve towards better safety incrementally.


If the Standard has anything to say about compatibility between different language versions, I doubt many developers know those details. This is a breeding ground for ODR violations, as you're likely using compilers with different output (as they were built in different eras of the language's lifetime), especially at higher optimization settings.

This flies in the face of modern principles like building all your C++, from source, at the same time, with the same settings.

Languages like Rust include these settings in symbol names as a hash to prevent these kinds of issues by design. Unless your whole team is a moderate-level language lawyer, you must enforce this by some other means or risk some really gnarly issues.


> Languages like Rust include these settings in symbol names as a hash to prevent these kinds of issues by design.

Historically, C++ compilers' name-mangling schemes did precisely the same thing. The 2000-2008 period for gcc was particularly painful, since the compiler developers used this very frequently to "prevent these kinds of issues by design". The only reason most C++ developers don't think about this much any more is that most C++ compilers haven't needed to change their mangling scheme for a decade or more.


C++’s name mangling scheme handles some things like namespaces and overloading, but it does not account for other settings that can affect the ABI layer of the routine, like compile time switches or optimization level.


The name mangling scheme was changed to reflect things other than namespaces and overloading; it was modified to reflect fundamental compiler version incompatibilities (i.e. the ABI).

Optimization level should never cause link time or run time issues; if it does I'd consider that a compiler/linker bug, not an issue with the language.


Many languages have ways to call into old code. C++ allows calling C; so does Objective C.

