
Hard to know what OP meant, but I took it as an oblique reference to China.


Well, is China a nation-state, a multi-national state, or essentially just a country (state)? English is my third language, so I wonder whether I'm missing some nuance here.


Not sure on the exact take of the OP, but:

Package maintainers often think in terms of constraints like "I need 1.0.0 <= pkg1 < 2.0.0 and 2.5.0 <= pkg2 < 3.0.0". This tends to make total sense in the micro context of a single package, but IMO it always falls apart in the macro context. The problem is:

- constraints are not always right (say pkg1==1.9.0 actually breaks things)

- the combined constraints of all dependencies end up leaving very few degrees of freedom in constraint solving, so you can’t in fact just take any pkg1 and use it

- even if you can use a given version, your package may have a hidden dependency on one of pkg1’s dependencies, which only becomes apparent once you start changing pkg1’s version

Constraint solving is really difficult and while it’s a cool idea, I think Nixpkgs takes the right approach in mostly avoiding it. If you want a given version of a package, you are forced to take the whole package set with you. So while you can’t, say, take a version of pkg1 from 2015 and use it with a version of pkg2 from 2025, you can just take the whole 2015 Nixpkgs and get pkg1 & pkg2 from 2015.
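To make the "few degrees of freedom" point concrete, here's a minimal Python sketch (the package names and ranges are made up) that intersects the version ranges several dependents place on the same pkg1:

```python
# Each constraint is a half-open interval [lo, hi) over (major, minor, patch)
# tuples, which compare lexicographically in Python.
def intersect(a, b):
    lo = max(a[0], b[0])
    hi = min(a[1], b[1])
    return (lo, hi) if lo < hi else None  # None means "no version satisfies both"

# Hypothetical: three dependents each constrain pkg1 differently.
c1 = ((1, 0, 0), (2, 0, 0))   # your app:  1.0.0 <= pkg1 < 2.0.0
c2 = ((1, 4, 0), (1, 6, 0))   # dep A:     1.4.0 <= pkg1 < 1.6.0
c3 = ((1, 5, 0), (3, 0, 0))   # dep B:     1.5.0 <= pkg1 < 3.0.0

combined = intersect(intersect(c1, c2), c3)
print(combined)  # only 1.5.x survives -- and 1.5.x may still be broken in practice
```

Three individually reasonable ranges already shrink the solution space to a single minor version; one more dependent can empty it entirely.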


There’s no clear definition of major/minor/patch versioning in most language ecosystems. Amazon did this reasonably well when I was there, though the patch version was implicitly assigned and the major and minor required humans to follow the convention:

You could not depend on a patch version directly in source. You could force a patch version other ways, but each package would depend on a specific major/minor and the patch version was decided at build time. It was expected that differences in the patch version were binary compatible.

Minor version changes were typically source compatible, but not necessarily binary compatible. You couldn’t just arbitrarily choose a new minor version for deployment (well, you could, but you shouldn’t expect it to go well).

Major versions were reserved for source or logic breaking changes. Together the major and minor versions were considered the interface version.
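A rough sketch of that build-time patch resolution in Python, with hypothetical package data (this is not Amazon's actual tooling, just the convention described above):

```python
# Published versions of a hypothetical package, keyed by interface version
# (major, minor). Patches within one interface version are binary compatible.
PUBLISHED = {
    ("pkg1", 1, 3): [0, 1, 4],   # pkg1 1.3.0, 1.3.1, 1.3.4 all interchangeable
    ("pkg1", 1, 4): [0, 2],
}

def resolve(name, major, minor):
    """Source only declares major/minor; the patch is chosen at build time,
    here by simply taking the newest one published."""
    patches = PUBLISHED[(name, major, minor)]
    return (major, minor, max(patches))

print(resolve("pkg1", 1, 3))  # (1, 3, 4)
```

The key property is that a rebuild can silently pick up 1.3.4 over 1.3.1 precisely because patch-level binary compatibility is a contract, not a hope.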

There was none of this pinning to arbitrary versions or hashes (though, you could absolutely lock that in at build time).

Any concept of package (version) set was managed by metadata at a higher level. For something like your last example, we would “import” pkg2 from 2025, bringing in its dependency graph. The 2025 graph is known to work, so only packages that declare dependencies on any of those versions would be rebuilt. At the end of the operation you’d have a hybrid graph of 2015, 2025, and whatever new unique versions were created during the merge, and no individual package dependencies were ever touched.

The rules were also clear. There were no arbitrary expressions describing version ranges.


For the record, Amazon's Builder Tools org (or ASBX or whatever) built a replacement system years ago, because this absolutely doesn't work for a lot of projects and is unsustainable. They have been struggling for years to figure out how to move people off it.

Speaking at an even higher level, their system has been a blocker to innovation, and introduces unique challenges to solving software supply chain issues.

Not saying there aren't good things about the system (I like cascading builds, reproducibility, buffering from 3p volatility) but I wouldn't hype this up too much.


> Constraint solving is really difficult and while it’s a cool idea, I think Nixpkgs takes the right approach in mostly avoiding it. If you want a given version of a package, you are forced to take the whole package set with you.

Thank you, I was looking for an explanation of exactly why I hate Nix so much. It takes a complicated use case, and tries to "solve" it by making your use case invalid.

It's like the Soylent of software. "It's hard to cook, and I don't want to take time to eat. I'll just slurp down a bland milkshake. Now I don't have to deal with the complexities of food. I've solved the problem!"


It’s not an invalid use case in nixpkgs. It’s kind of the point of package overlays.

It removes the “magic” constraint solving that seemingly never works and pushes it to the user to make it work.


> I was looking for an explanation of exactly why I hate Nix so much

Note that the parent said "I think Nixpkgs takes the right approach in mostly avoiding it". As others have already said, Nix != Nixpkgs.

If you want to go down the "solving dependency version ranges" route, then Nix won't stop you. The usual approach is to use your normal language/ecosystem tooling (cabal, npm, cargo, maven, etc.) to create a "lock file"; then convert that into something Nix can import (if it's JSON that might just be a Nixlang function; if it's more complicated then there's probably a tool to convert it, like cabal2nix, npm2nix, cargo2nix, etc.). I personally prefer to run the latter within a Nix derivation, and use it via "import from derivation"; but others don't like importing from derivations, since it breaks the separation between evaluation and building. Either way, this is a very common way to use Nix.

(If you want to be even more hardcore, you could have Nix run the language tooling too; but that tends to require a bunch of workarounds, since language tooling tends to be wildly unreproducible! e.g. see http://www.chriswarbo.net/projects/nixos/nix_dependencies.ht... )


I mean you can do it in Nix using overlays and overrides. But it won’t be cached for you and there’s a lot of extra fiddling required. I think it’s pretty much the same as how Bazel and Buck work. This is the future, like it or not.


What? Isn't wasm just a bytecode? How could you write rules in wasm?


It used to be that you could buy fractions of a mutual fund, but not of ETFs. Recently, though, brokerages have started allowing you to buy fractional ETF shares as well.


Very cool! It looks like it assumes everything is flat, but I bet you could pull in elevation data from OSM as well.


It does use elevation data, but does not exaggerate it, I guess. Back when I was working with 3D maps, we noticed that many people liked exaggerated terrain heights better, especially when the terrain is viewed from above and realistic heights looked “flat”. Near where I live, it looks fairly close to what it looks like in real life: https://streets.gl/#48.50063,8.99766,7.25,312.50,135.56 (granted, having added building and roof colors for almost all buildings also helps).


The 4-story building I live in is rendered like a basement-only dwelling, which is actually growing on me the more I look at it ...


You could add a building:levels value to the object in OpenStreetMap to record how many stories your building has.

https://wiki.openstreetmap.org/wiki/Key:building:levels


While a good idea in general (the StreetComplete app makes this very easy, by the way), this won't help for this app, as the data is from September 2023. Otherwise I'd love to use it more to validate how renderers handle different buildings. F4Map should show the change fairly quickly, though.


I had never heard of the “Mayaguez” until now. It actually happened under President Ford, not Carter.


I have no idea either, but it seems like some kind of art project. There is no city named Springfield in Kansas.


Clicking the sentence at the bottom about the reopening of the pool opens this:

https://unworld.neocities.org/pages/clarkstpool

There appear to be other links scattered around. I imagine it's a sort of mystery story / adventure game organised as a set of old internet pages?


I’m on mobile, but has anyone checked the source or inspected the resources as to where they’re being hosted?

Is there a neocities profile tool?

edit: there is. Site create date is March 2021. https://neocities.org/site/unworld

It drives me crazy when simulated image file corruption is nowhere near what a corrupted file would actually look like. Chromatic aberration?? What the hell is this crap?


I guess there are differing definitions. One was deciding you're going to quit but not putting in any notice, and basically trying to delay getting fired as long as possible to collect a paycheck. Potentially even taking another job on top (overemployed), which sounds like it could be fraud.


In my experience, a higher level of pay doesn't mean there are fewer coasters. The $150k job could just as well have a lot more expectations than the $300k one. Size of company seems like a much bigger factor: you can't hide your low productivity when there are only a few other engineers. And ironically, bigger companies can pay a lot more and get a lot less from their engineers for it.


You can just show the user the transliteration & have them confirm it makes sense. Always store the original version since you can't reverse the process. But you can compare the transliterated version to make sure it matches.

Debit cards are a pretty common example of this. I believe you can only have ASCII in the cardholder name field.
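A minimal Python sketch of the confirm-the-transliteration flow. This NFKD-and-strip-marks trick only handles Latin scripts with diacritics (not Cyrillic, CJK, etc.), real card networks use stricter rules, and the example name is made up:

```python
import unicodedata

def to_ascii(name):
    """Lossy transliteration: decompose accented letters into base letter plus
    combining mark, then keep only printable ASCII characters."""
    decomposed = unicodedata.normalize("NFKD", name)
    return "".join(c for c in decomposed if c.isascii() and c.isprintable())

original = "Müller-Àlvarez"       # what the user typed; always store this
suggestion = to_ascii(original)   # show this and ask the user to confirm
print(suggestion)                 # Muller-Alvarez
```

The point is that the ASCII form is only a suggestion the user can accept or correct, never a replacement for the stored original.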


>But you can compare the transliterated version to make sure it matches

No you can't.

Add: Okay, you need to know why. I'm right here, a living, breathing person with a government id that has the same name written in two scripts side by side.

There is an algorithm (blessed by the same government that issued said id) which defines how to transliterate names from one script to the other, published on the parliament web site and implemented in all the places that are involved in the id-issuing business.

The algorithm will however not produce the outcome you will see on my id, because I, a living, breathing person who has a name, asked nicely to spell it the way I like. The next time I visit the id-issuing place, I could forget to ask nicely, and then I will have two valid ids (no, the old one will not be marked as void!) with three names that don't exactly match. It's all perfectly fine, because the name as a legal concept is defined in a character set you probably can't read anyway.

Please, don't try to be smart with names.


Your example fails to explain any problem with the GP's proposal. They would show you a transliteration of your name and ask you to confirm it. You would confirm it or not. It might match one or the other of your IDs (in which case you would presumably say yes) or not (in which case you would presumably say no). What's the issue?


You will compare the transliterated version I provided with the one you have already, it will not match, and then what? Either you tell me I have an invalid name or you just ignore it.


I think they were suggesting the opposite order - do an automatic transliteration and offer you the choice to approve or correct it.

But even if the user is entering both, warning them that the transliteration doesn't match and letting them continue if they want is something that pays for itself in support costs.


I have an ID that transliterated my name, and included the original, but the original contained an obvious typo. I immediately notified the government official, but they refused to fix it. They assured me that only the transliterated name would be used.

Human systems aren't always interested in avoiding or fixing defects.

