
> Build a standard (or extend the AMP standard) that works just like this, but is enforced by the browser.

uMatrix and NoScript do a rather coarse-grained version of this, as you can configure them to block JS by default and whitelist some domains (optionally default-allowing JS served from the same domain as the page, or from one of its subdomains).
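For reference, a block-by-default setup expressed in uMatrix's rule syntax looks roughly like this (written from memory, so treat the exact rules as approximate; example.com and cdn.example.net are placeholders for a site you visit and a third-party host you choose to trust):

    * * * block
    * * css allow
    * * image allow
    * 1st-party script allow
    example.com cdn.example.net script allow

The first rule blocks everything everywhere, the next three relax that for styling, images and first-party scripts, and the last one whitelists a single third-party script host on a single site.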

But when you tell someone their website doesn't work without JS, most of the time they either ignore it or tell you it's 2019 and you shouldn't disable JS.



I spent a month with uMatrix misconfigured so that it was disabled by default on websites. The internet turned into a horrific, bloated mess.

How do these sites survive when it takes 30 seconds to load the page (which freezes your browser) and then another 20 seconds to find the content hidden between creepy ads from e-commerce sites you recently visited?

The last time I remember the internet being this horrible was the search engines back in 1999, and that was quickly fixed when Google came along.


>> Build a standard (or extend the AMP standard) that works just like this, but is enforced by the browser.

>uMatrix and NoScript do a rather coarse-grained version of this

What about something like: window.MAX_ALLOWED_JS_STATEMENTS == 1 million? Users could set this value in their browser settings for each domain/subdomain, and developers could decide based on that value what gets executed and what doesn't.
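To make that concrete, here's a rough sketch of how a page might branch on such a budget. Everything in it is hypothetical: no browser exposes MAX_ALLOWED_JS_STATEMENTS, and the script paths and helper name are placeholders.

    // Hypothetical: a user-configured, per-domain JS budget exposed by the browser.
    // Neither the property nor the script paths exist anywhere; illustration only.
    const budget = window.MAX_ALLOWED_JS_STATEMENTS || Infinity;

    if (budget > 1e6) {
      // The user allows a generous budget: load the heavy extras.
      loadScript('/js/comments-widget.js');
      loadScript('/js/analytics.js');
    }
    // Otherwise the page falls back to whatever the server-rendered HTML provides.

    function loadScript(src) {
      const s = document.createElement('script');
      s.src = src;
      document.head.appendChild(s);
    }

The point would be that the site degrades gracefully based on a machine-readable signal, instead of just breaking when an extension blocks its scripts.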


I'm not a frontend dev, but isn't this kind of variability across environments exactly what frontend devs loathe? It may be even harder to test properly than checking against multiple browsers.



