
T-shirts from Muji

Is this based on your experience, or is it just an assumption? I only have anecdotes, but they do not reflect your claims; rather, the exact opposite. A lot of the boilerplate code doesn't need to be type annotated, but annotating the main business logic doesn't take more time and is not more complicated; instead, type annotations help write code that is clearer and more obvious, and they add a kind of documentation.


It really depends on how you tackle gradual typing on a project level. The easiest way to sabotage it is an "any new code must be fully type checked" requirement, because it often means you also need to add type hints to any code you call, which leads to Optional[Union[Any]] nonsense if you let juniors (or coding assistants) go wild.
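
To make the failure mode concrete, here is a hypothetical Python sketch (the function names are made up) of the annotation churn that requirement tends to produce, next to the narrower hint you would actually want:

  from typing import Any, Optional, Union

  # What "must be fully type checked" often yields when the callee was
  # never annotated: a hint that silences the checker but says nothing.
  def load_config(path: Optional[Union[str, Any]] = None) -> Any:
      ...

  # A narrower hint documents intent and lets the checker catch mistakes.
  def load_config_typed(path: Optional[str] = None) -> dict:
      ...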

As always, no fancy new tech is a substitute for competent project management.


The HQ of Microsoft Germany is in Munich.

Edit: Compare to LiMux[1] and the rumors about some deal with MS

[1] https://en.wikipedia.org/wiki/LiMux


The HQ was moved to Munich some years ago to prevent LiMux from spreading. It got killed, and this is the next logical step for M$.


I never understood why AOT never took off for Java. The "write once, run anywhere" argument quickly faded; the number of platforms that a software package needs to support is rather small.


Because developers don't like to pay for tools.

https://en.wikipedia.org/wiki/Excelsior_JET

https://www.ptc.com/en/products/developer-tools/perc

https://www.aicas.com/products-services/jamaicavm/

It is now getting adopted because GraalVM and OpenJ9 are available for free.

Also, while not being proper Java, Android has done AOT since version 5 and mixed JIT/AOT since version 7.

EDIT: Fixed the sentence regarding Android versions.


Developers pay for tools gladly when the pricing model isn’t based on how much money you’re making.

I’m happy to drop a fixed 200e/mo on Claude but I’d never sign paperwork that required us to track user installs and deliver $0.02 per install to someone


Especially not if those kinds of contracts don't survive an acquisition, because then your acquisition is most likely dead in the water. The acquirer would have to re-negotiate the license, and with a little luck they'd be screwed over because they have nowhere else to go.


I have seen worse, where people updated the EULA 6 months after being paid $14k/seat.

Now it is FOSS all the way... lesson learned... =3

https://www.youtube.com/watch?v=WpE_xMRiCLE


That is something I never understood: how that's even legal. You enter into an agreement (let's call it a contract, because that's how the other side treats it) and then, retroactively, they get to pull the rug right out from under you.

I made the 'FOSS all the way' decision somewhere in '96 or so but unfortunately our bookkeeping system and our own software package only worked on Windows (this was an end-user thing) so we had to keep one windows machine around. I was pretty happy when we finally switched it off.

The funny thing is that I wouldn't even know where to start to develop on/for mac or windows, Linux just feels so much more powerful in that sense. Yes, it has some sharp edges but for the most part it is the best thing that could have happened to the world of software development.


I have done native cross-platform projects in https://wxwidgets.org/ and https://quasar.dev/ . Fine for basic interfaces, but static linking on Win64 gets dicey with LGPL libraries etc. YMMV. For iOS targets, one must use a macOS environment with a non-free Apple developer account.

Personally, I like Apache 2.0 and standard quality-of-life *nix build tools. Everything Windows now runs off a frozen VM backing image (a KVM COW file), as even Microsoft can no longer resist the urge to break things. =3


Depends on the use case; anyone who has seen the commercial host-scaling cost of options like MATLAB has usually ported to another language. Lesson learned...

Commercial licensing is simply a variable cost, and if there is another FOSS option most people will make the right call. Some commercial licenses are just Faustian bargains that can cost serious money to escape. =3


I think what they do is correct. We also need to get paid this way.


You could do AOT Java using gcj; it didn't need commercial tools.


If we ignore that gcj was never production ready, and that basically the only good use case, which Red Hat sponsored, was compiling Eclipse, which was usually slower than using the JIT anyway.

And around 2009 most of the team left the project, some going to OpenJDK, others elsewhere, while GCC kept gcj around because its unit tests stressed parts of GCC that weren't tested by other frontends, until the decision came to remove it completely.

As a side note, I expect a similar outcome for gccgo, abandoned since Go added generics support.


You don't have to pay for dotnet AOT.


Actually you do, indirectly: via Windows licenses, Office, Azure, Visual Studio Professional and Ultimate licenses, and the C# DevKit.

Also, you are forgetting that AOT first came with NGEN and .NET Native, both commercial, and that on the Mono side, Xamarin had some price points for AOT optimizations, if I recall correctly.

However this is a moot point, you also don't pay for GraalVM, OpenJ9, or Android.


You don’t have to pay for Java AOT either. Graal is free.


> I never understood why AOT never took off for Java.

GraalVM native images certainly are being adopted, the creation of native binaries via GraalVM is seamlessly integrated into stacks like Quarkus or Spring Boot. One small example would be kcctl, a CLI client for Kafka Connect (https://github.com/kcctl/kcctl/). I guess it boils down to the question of what constitutes "taking off" for you?

But it's also not that native images are unambiguously superior to running on the JVM. Build times definitely leave something to be desired, not all third-party libraries can easily be used, not all GCs are supported, the closed-world assumption is not always practical, and peak performance may be better with JIT. So the way I see it, AOT-compiled apps are currently seen as a tactical tool by the Java community, utilized when their advantages (e.g. fast start-up) matter.

That said, interesting work is happening in OpenJDK's Project Leyden, which aims to move more work to AOT while being less disruptive to the development experience than GraalVM native binaries. Arguably, if you're using CDS (class data sharing), you are already using AOT.


Well, one aspect is how dynamic the platform is.

It simply defaults to an open world where you could just load a class from any source at any time to subclass something, or straight up apply some transformation to classes as they load via instrumentation. And defaults matter, so AOT compilation is not completely trivial, though it's not too bad either with GraalVM's native image, given that the framework you use, if any, supports it.

Meanwhile most "AOT-first" languages assume a closed world where everything "that could ever exist" is already fully known.
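
The same flavor of openness exists in Python, which makes for a compact illustration (the names here are made up): no ahead-of-time analysis can enumerate the subclasses below, because one of them is only created at runtime.

  class Base:
      def greet(self) -> str:
          return "base"

  # In Java this could be a class file fetched over the network at runtime;
  # a closed-world AOT compiler cannot know this subtype exists.
  RuntimeSub = type("RuntimeSub", (Base,), {"greet": lambda self: "made at runtime"})

  print(RuntimeSub().greet())  # => made at runtime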


Except when they support dynamic linking they pay the indirect call cost that JITs can remove.


I'm not sure how much HotSpot can do this, but JIT means you can target different CPUs, taking advantage of specific extensions or CPU quirks. It can also mean better cache performance: because you don't need branches to handle different chips, the branch is gone and the code is smaller.


Dynamic class loading is a major issue, and it's an integral feature. Realistically, there are very few cases where AOT and Java make sense.


For me, “The Problem with Software: Why Smart Engineers Write Bad Code” is the prequel. Not as technical, but it explains a big problem.


I've never seen it that way, but I think you are absolutely right. My favorite always has been https://www.reddit.com/r/TheFarSide/comments/179ikr2/its_a_f...


But then Ruby only goes halfway, not unlike the "watered-down form" in your terms. Why is `#times` a method of Integer, but `#if` (or `#ifTrue`) not a method of booleans like in Smalltalk? Ruby does the same cherry-picking from Smalltalk as everybody else, just different cherries. Looking at Ruby, the simple examples are all nice and clean, but then the weird details start to appear and the language feels more hacky than others (like Ned Flanders's house in The Simpsons S08E08).


#if and #ifTrue are yours if you want them:

  class TrueClass
    def if = true
    def ifTrue = true
  end

  class FalseClass
    def if = false
    def ifTrue = false
  end

  true.if
  # => true
  false.if
  # => false


In Smalltalk those methods don't return `true`. They take a block and evaluate it if the boolean receiving the message is true:

    (a > b) ifTrue: [ "do something" ]
EDIT: to clarify what's happening there, `>` is a message sent to `a` that will result in a boolean. The True class and False class both understand the ifTrue: message and `True>>ifTrue:` executes the block whereas `False>>ifTrue:` just throws it away.

There's no `if` keyword in the language. Control flow is done purely through polymorphism.


I apologize for my lack of Smalltalk knowledge. As you can imagine, you can do something similar in Ruby by defining ifTrue to accept a block, even adding ifTrue to all other objects:

  class TrueClass
    def ifTrue(&block) = block.call
  end

  class FalseClass
    def ifTrue(&block) = nil
  end

  class Object
    def ifTrue(&block) = block.call
  end
      
  class NilClass
    def ifTrue(&block) = nil
  end
If ck45's core complaint was that this is not baked into the language, I will agree that it is less convenient for lack of a default.


Certainly possible: add ifTrue as a method to TrueClass and FalseClass.

It just isn't very fast.


The problem is not with ifTrue, and not with its performance; that's easy to do. It is with "ifTrue:ifFalse:".

Also, it is common to do assignments in the "if", and with an actual method and blocks, the scope of the introduced variable would be different, and everyone would be tripping over it all the time.


Basically it's because of "else" and "elsif". While ".each" works the same as "for .. in ...; end", it's harder to do "if else" as a method that will also return the value of the block in the taken branch. Smalltalk can do it because "ifTrue:ifFalse:" is _one_ message; Ruby didn't go that way syntactically.
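
For illustration, here is that single-message shape approximated in Python rather than Ruby (the class and method names are made up): both branches travel in one call, and the call returns the chosen branch's value.

  class STrue:
      def if_true_if_false(self, then, otherwise):
          return then()       # "true" evaluates the first block

  class SFalse:
      def if_true_if_false(self, then, otherwise):
          return otherwise()  # "false" evaluates the second block

  print(STrue().if_true_if_false(lambda: "yes", lambda: "no"))  # => yes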


This reminds me of Stephen Baxter's novel "Flood"


One argument that I'm missing in the article is that with an enumeration, states are mutually exclusive, while with several booleans there could be some limbo state where several bool columns have the value true, e.g. is_guest and is_admin, which is an invalid state.
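
For example, a minimal Python sketch (Role is a made-up name) where the invalid combination simply cannot be represented:

  from enum import Enum

  class Role(Enum):
      GUEST = "guest"
      ADMIN = "admin"

  # A row stores exactly one Role value; being both GUEST and ADMIN at
  # once is unrepresentable, unlike two independent boolean columns.
  role = Role.GUEST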


In that case, you set the enumeration up to use separate bit flags for each boolean, e.g., is_guest is the least significant bit, is_admin is the second least significant bit, etc. Of course, then you've still got a bunch of booleans that you need to test individually, but at least they're in the same column.
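
In Python that packing corresponds roughly to enum.Flag (a sketch; the name Permission is made up), which keeps all the bits in one value but, as noted, still allows the combined states:

  from enum import Flag, auto

  class Permission(Flag):
      GUEST = auto()  # least significant bit
      ADMIN = auto()  # second least significant bit

  value = Permission.GUEST | Permission.ADMIN  # the limbo state is still representable
  print(Permission.ADMIN in value)  # => True; each bit is still tested individually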


Look up the typestate pattern.
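
Roughly, typestate encodes each state as its own type, so invalid operations don't exist on the wrong state. A loose Python approximation (class names made up; languages like Rust enforce this statically by consuming the old state):

  class OpenFile:
      def read(self) -> bytes:
          return b"data"

      def close(self) -> "ClosedFile":
          return ClosedFile()  # the transition hands back a new state type

  class ClosedFile:
      pass  # no read() here: reading a closed file is unrepresentable

  f = OpenFile().close()
  # f.read() would now fail, and a static type checker flags it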


Or overloading division like https://scapy.net/
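
The trick there is overriding `__truediv__` so `/` stacks protocol layers. A toy sketch of the idea (Layer is a made-up class, not Scapy's actual implementation):

  class Layer:
      def __init__(self, name: str):
          self.name = name
          self.payload = None

      def __truediv__(self, other: "Layer") -> "Layer":
          # Scapy-style stacking: IP() / TCP() nests the right operand
          # as the payload of the left one
          self.payload = other
          return self

      def __repr__(self) -> str:
          return self.name + (" / " + repr(self.payload) if self.payload else "")

  print(Layer("IP") / Layer("TCP"))  # => IP / TCP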

