Hacker News | indolering's comments

I really like that project! Why don't y'all hand it over to someone willing to do maintenance or at least archive it?

It looks like it was done to not delay the 4.0 release. Since they follow semver, that means it won't get the axe until 5.0 [1]. Pretty wild considering that 3.0 was released 10 years ago.
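The semver reasoning above (a deprecated API rides out its major version; removal has to wait for the next major) boils down to a one-line rule. A trivial sketch, purely illustrative, not jQuery's actual tooling:

```python
# Under semver, removing a deprecated API is a breaking change, so it is
# only allowed when the major version number increases.
def removable_at(deprecated_in: str, candidate: str) -> bool:
    """True if `candidate` may drop an API deprecated in `deprecated_in`."""
    dep_major = int(deprecated_in.split(".")[0])
    cand_major = int(candidate.split(".")[0])
    return cand_major > dep_major

assert not removable_at("4.0.0", "4.9.3")  # still within major 4: keep it
assert removable_at("4.0.0", "5.0.0")      # next major: the axe is allowed
print("ok")
```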

But maybe they will scope this one better: they were talking about getting 4.0 released in 2020 back in 2019!

[1]: https://github.com/jquery/jquery/pull/5077

[2]: https://github.com/jquery/jquery/issues/4299


I love that they support ES6 modules, Trusted Types, and CSP! The clearing out of old APIs that have platform replacements is nice to see too!

It's hard to rule out intentional side channels without access to source.

Do you mean a no-internet app (like this) could write data locally in a way that another internet-enabled app (in cahoots) could locally receive? Like a non-sandboxed storage area? Seems plausible.

Meta literally got caught doing this.

Writing to a local server, and then uploading from the browser to bypass consent mechanisms.

https://wire.com/en/blog/metas-stealth-tracking-another-eu-w...
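A minimal, purely illustrative sketch of the localhost-relay technique (hypothetical names, not Meta's actual code): a "no-internet" app deposits data into a server bound to 127.0.0.1, and any networked process on the same machine, including a browser page doing a fetch, can read it back:

```python
# Toy localhost relay: cross-app channel via a loopback HTTP server.
import http.server
import threading
import urllib.request

STORE = {"payload": b""}

class Relay(http.server.BaseHTTPRequestHandler):
    def do_POST(self):  # the "offline" app deposits data here
        n = int(self.headers.get("Content-Length", 0))
        STORE["payload"] = self.rfile.read(n)
        self.send_response(204)
        self.end_headers()

    def do_GET(self):   # the networked app (or browser JS) picks it up
        self.send_response(200)
        self.end_headers()
        self.wfile.write(STORE["payload"])

    def log_message(self, *args):  # keep the demo quiet
        pass

srv = http.server.HTTPServer(("127.0.0.1", 0), Relay)
threading.Thread(target=srv.serve_forever, daemon=True).start()
port = srv.server_address[1]

# "offline" side writes, "online" side reads:
urllib.request.urlopen(f"http://127.0.0.1:{port}/", data=b"tracking-id-123")
leaked = urllib.request.urlopen(f"http://127.0.0.1:{port}/").read()
print(leaked)  # b'tracking-id-123'
srv.shutdown()
```

No internet permission is needed on the writing side, which is exactly why it slips past consent mechanisms.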


Yes, that, and the internet permission can be added later and pushed with an update. Unless you are checking permissions after every update, you will not know.

CHERI is undeniably on the rise. Adapting existing code generally requires rewriting less than 1% of the codebase. It offers speedups for existing languages as well as new ones designed with the hardware in mind. I expect to see it everywhere in about a decade.


There's a big 0->1 jump required for it to actually be used by 99% of consumers -- x86 and ARM have to both make a pretty fundamental shift. Do you see that happening? I don't, really.


Tbh I can imagine this catching on if one of the big cloud providers endorses it: hardware support in a future version of AWS Graviton, say, or Azure, with a bunch of foundational software already developed to work with it. If one of those hyperscalers puts in the work, it could get to the point where you can launch a simple container running Postgres or whatever, with the full stack adapted to work with CHERI.


CHERI on its own does not fix many of the side channels; that would need something like "BLACKOUT: Data-Oblivious Computation with Blinded Capabilities". And as I understand it, there is no consensus/infrastructure on how to do efficient capability revocation (potentially in hardware); see https://lwn.net/Articles/1039395/.

On top of that, as I understand it, CHERI has no widespread concept of how to allow disabling/separation of workloads for ultra-low-latency/high-throughput applications in mixed-criticality systems. The only system I'm aware of with practical timing guarantees that allows virtualization is seL4, but again there are no practical guides with trade-offs in numbers yet.


I see this happening in the ARM world; that is why ARM is working alongside the CHERI folks:

https://www.arm.com/architecture/cpu/morello

x86? Well, Intel has already messed up hardware memory tagging multiple times.


We’re all using the pointer math functions in Rust and testing it with miri, right? Right?


Interesting, what causes the speedup?


You can skip some bounds checks and then get 50% slower because the hardware is not very powerful.

SpinLaunch is also promising drastically reduced cost per launch. The payload size for their first launcher is pretty small, and they appear to be struggling to get the kinetic launcher online.


SpinLaunch is an outright scam. Their main product is taxing people for inadequate knowledge of basic physics. And engineering. Also… common sense.


Please elaborate!


The proposal is to introduce a whole slew of syntax to JS that, according to the proposal, will have no meaning. This is a paradox: you have only created a language if you can use it to convey meaning.


That's not entirely true. Ideally, there would be a follow-up with a reflection API.

Also, comments are syntax and they're mostly meaningless. By your reasoning, programming languages should have no comments.

So, it's not really a qualitative issue (presence of meaningless syntax) but a quantitative one (presence of lots of parsing complexity).


If you say that whatever data is put there doesn't matter at all, the one thing you definitely cannot ever do later is give it meaning.


Unless I say its meaning is to be optionally reflected upon during runtime!

Look, I understand the purism and, mostly, I agree. But this is not a clean-slate language; it will never be perfect, and it's going to become more and more idiosyncratic as time goes by.
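For what it's worth, Python's type annotations are an existing example of exactly this pattern: syntax that evaluation ignores entirely, but that a reflection API can later give meaning to. A minimal sketch:

```python
# Annotations have no effect on evaluation, but can be reflected upon.
import typing

def greet(name: str, times: int) -> str:
    return name * times

# Evaluation ignores the annotations completely...
assert greet(3, 2) == 6   # "wrong" types, runs anyway

# ...but a reflection API can still give them meaning later.
hints = typing.get_type_hints(greet)
print(hints)  # {'name': <class 'str'>, 'times': <class 'int'>, 'return': <class 'str'>}
```

So "meaningless now, reflectable later" is not a paradox in practice; it just front-loads the parsing cost.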


I don't see how it's optional.

Comments are a kind of freedom in code. You're completely free to use them precisely because (in a plain execution environment) they cannot influence the result of evaluation.

If comments /can/ change the result of evaluation, then you simply are not (completely) free to use them. (And yes, I know this is a simplification in JS, where you can already get the source code of a function with toString... Ugh)


Makes sense. I'm excited for your solution, despite not having seen it. If you can solve that, it would be awesome.


The healthcare industry is basically locked into 365 due to a lack of alternatives supporting HIPAA.

Google Workspace can theoretically be configured for it, but it doesn't cover basic stuff like information in contacts. So if ANYONE in your organization (like an outreach coordinator) adds a patient and puts notes into the contact field, it's a HIPAA violation. There is no way to effectively police that.

I wish the regulations were written such that messaging apps, office suites, etc. above a certain revenue threshold had to qualify for HIPAA by default. It's absurd how many small shops just do everything over WhatsApp/iMessage/Gmail/iCloud, etc.


Yes, it's technically possible. But what you are suggesting is basically a dynamic filter. The problem is that codecs are designed for end delivery and have very specific practical constraints.

For example, we could GREATLY improve compression ratios if we could reference key frames anywhere in the file. But devices only have so much memory bandwidth and users need to be able to seek while streaming on a 4g connection on a commuter train. I would really like to see memes make use of SVG filters and the like, but basically everyone flattens them into a bitmap and does OCR to extract metadata.
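A toy sketch of the reference-window trade-off described above (nothing like a real codec, purely illustrative): widening the window lets more frames be stored as cheap back-references, but the decoder then has to keep that whole window in memory while seeking.

```python
# "Encode" a sequence of frames (ints here), storing a frame raw unless
# an identical frame exists within the allowed backward reference window.
def delta_encode(frames, max_ref_distance):
    out = []
    for i, f in enumerate(frames):
        window = frames[max(0, i - max_ref_distance):i]
        if f in window:
            out.append(("ref", window[::-1].index(f) + 1))  # back-reference
        else:
            out.append(("raw", f))                          # full frame
    return out

frames = [7, 7, 3, 7, 3, 7, 9, 7]
near = delta_encode(frames, max_ref_distance=1)  # tight window: cheap decoder
far  = delta_encode(frames, max_ref_distance=8)  # wide window: better ratio
raw_near = sum(1 for kind, _ in near if kind == "raw")
raw_far  = sum(1 for kind, _ in far  if kind == "raw")
print(raw_near, raw_far)  # 7 3
```

Same data, same algorithm; the only knob is how far back a reference may reach, and that knob is exactly what memory bandwidth and seek latency constrain.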

It's also really depressing how little effort is put into encoding, even by the hyperscalers. Resolution (SD, HD, 4K, and 8K) is basically the ONLY knob used for bitrate and quality management. I would much prefer 10-bit color over an 8K stream, yet every talking-head documentary with colored gradient backgrounds has banding.

Finally, there is the horror that is decoders. There are reference files, built with formal verification, that exercise every part of a codec's spec. But Hollywood studios have dedicated movie theaters with all of the major projectors, and they pay people to prescreen movies just to try to catch encoding/decoding glitches. And even that fails sometimes.

So sure, anything is possible. Flash was very popular in the 56k days because it rendered everything on the end device. But that entails other tradeoffs like inconsistent rendering and variable performance requirements. Codecs today do something very similar: describe bitmap data using increasingly sophisticated mathematical representations. But they are more consistent and simplify the entire stack by (for example) eliminating a VM. Just run PDF torture tests through your printer if you want an idea of how little end devices care about rendering intent.


They were going to shut it down due to upstream Fedora considering ending 32-bit support. Sticking to upstream wouldn't have helped you avoid that issue.


Why do you say that? If they drop 32-bit support, maybe I won't be able to play games for a time - at least until somebody rigs up a fix - but at least my operating system will still be supported.

If Bazzite goes poof overnight, though, that's a major problem. At least Fedora's official spins will continue to receive necessary updates.


The Steam client is 32-bit, the majority of games on Steam are 32-bit, and very popular titles like Left 4 Dead 2 are 32-bit.

The last time a distro tried to do this (Ubuntu), it caved and continued supporting 32-bit with an extra repo. Fedora has no chance of winning that argument.

The good news is the incident you're talking about was a change proposal from a single person and was never even voted on. It did not survive the comment stage.

