It's quite shocking to think how we used to get by with so little RAM and CPU power. Firefox is currently consuming 518.4 MB of RAM (7 tabs) and using 0.6% of a 12-core 3.4 GHz CPU. But I guess this is progress...
I don't really do anything with it professionally, but just for "feels", and to not completely lose touch with the hard reality (of registers and ports), I took the UT Austin course on embedded systems (https://www.edx.org/course/embedded-systems-shape-world-utau...), which required the purchase of some real hardware for the automatically graded labs. I also signed up for the follow-up to this excellent course (https://www.edx.org/course/real-time-bluetooth-networks-shap...), starting in September, which will also be done on real hardware.
It's just a refresher for me; I did a lot of low-level and assembler work in the 8-bit days, so it's nice to see what's current. It's a wonderful counterweight to otherwise doing very high-level programming in dynamic languages and learning (more) FP (Scala). And it's nice to be able to find plenty of use cases for 32 kB of RAM on a tiny board (http://www.ti.com/tool/ek-tm4c123gxl).
I think some low-level embedded programming (directly in C on the chip, not on a high-level board that runs a full Linux OS) is ideal to keep me grounded and to remind me how wasteful those many abstraction layers actually are. Yes, I know what they do and appreciate their service, but when I compare what I get with how much more I put in (in gigahertz and gigabytes), I'm not convinced there isn't a lot of waste going on that cannot, and should not, be justified and sold as the "price of progress".
We didn't need them, since websites rendered just fine without all the shit they have today.
I used a site that rendered a grid of images. Literally, that was its only user-visible function. It took 5s to render on an i5 throttled to 1 GHz; unthrottled, the same i5 still needed 2.5s.
It is lunacy. It's an artifact of having a terrible language (JS) and a terrible layout system (HTML) mixed up into a modern "standard", plus naive developers abstracting so far away that they have no idea what's going on, and partly don't care.
Wasn't too long ago someone asked (I think unironically) why HN was so "fast". As if such a page shouldn't load basically instantly.
Can you honestly say info browsing (not apps like GDocs) is, overall, better than it was 10-20 years ago? Ignore the increase in content. Images load faster, sure, but that's offset by all the other junk. Twitter, for instance, is actually sluggish on a nice ThinkPad! WTF.
I mostly agree. But I don't think that JS and HTML (and CSS) are terrible for what they were meant to do; the problem is that what they were meant to do was lunacy in context. The core architecture of web pages got corrupted somewhere in their development.
What we wanted was a way to describe interactive, hyperlinked documents backed by a distributed collection of servers. What we got was a virtual machine running on a virtual network.

I would call the web the third generation of application layer, and for some fundamental promises of computing (composition, interchangeability, others) it's an even larger step back than window managers (the second generation) were from shells (the first). Window managers gave us a better way to interact, but we lost things like piping applications together, easy-to-use environment variables, and sharing components between programs (most applications I install these days just ship with their own copies of everything; I blame Microsoft). The web gives us cross-platform, access-anywhere features, but it has terribly degraded any promise of applications working together unless they are owned by the same company. Not at all the shared, interactive, interlinked document layer we wanted.
And yeah, of course a grid of images is going to be fast, especially if they aren't even scaled. It's literally just a blit. What gets expensive is when you add bilinear filtering, alpha compositing, text, shadows with blur, path filling and/or tessellation...
All of those are things people now expect from apps, native or otherwise.
There weren't any fancy shadows, compositing, etc. The CPU time was all spent calculating layouts and doing "stuff" before the images even rendered. My guess is some suboptimal code buried under a kilometre's worth of abstractions. It's not that FF is doing something slow (that I know of); it's the mess people have built on top of HTML/JS/CSS that ends up with non-junior developers creating monstrosities.
The expectation and dynamics of ad-driven publishing have created a huge proliferation of sites that have no reason to exist. Taboola, NewsMax, Outbrain, and a slew of others, pimping nothing but crap. It's anti-information. You're dumber for having read or even seen it.
I've been maintaining a 60k+ entry blocklist (sorted for dupes, with entire domains handled via DNSMasq), which helps some. But even that just chips away gently.
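For the curious, here's a minimal sketch of how the domain-blocking half of that could be generated. This isn't my actual setup; the file names and the 0.0.0.0 target are just placeholders, though the address=/domain/ip form is standard dnsmasq syntax:

    // Hypothetical sketch (TypeScript/Node): dedupe a plain list of unwanted
    // domains and emit dnsmasq "address=" rules that cover each domain and
    // all of its subdomains. File names are invented for the example.
    import { readFileSync, writeFileSync } from "fs";

    const raw = readFileSync("blocklist.txt", "utf8");

    // One domain per line; drop blanks and comments, lower-case, dedupe, sort.
    const domains = Array.from(
      new Set(
        raw
          .split("\n")
          .map((line) => line.trim().toLowerCase())
          .filter((line) => line.length > 0 && !line.startsWith("#"))
      )
    ).sort();

    // dnsmasq: address=/example.com/0.0.0.0 answers example.com and its
    // subdomains with 0.0.0.0, effectively blackholing the whole domain.
    const rules = domains.map((d) => `address=/${d}/0.0.0.0`).join("\n");

    writeFileSync("blocklist.dnsmasq.conf", rules + "\n");
    console.log(`wrote ${domains.length} rules`);

Point dnsmasq's conf-file (or conf-dir) at the generated file and the blocked domains never resolve to anything useful in the first place.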
I've got another tab open with a Guardian article, "How Technology Disrupted the Truth". It dives deep into a space I've been pointing at for a while -- that advertising isn't merely bad for having created bad UI/UX and malware distribution networks, but that it's actively screwing with media's primary role of communicating true facts.
Until and unless authors and publishers take the heat for dumping crap on people -- I'm increasingly a fan of the block -- that's not going to turn around.
(And yes, figuring out how to finance quality information also matters -- admitting it's a public good, and should probably be financed like one, would be a good first step.)
It could be argued that media's primary role is not to communicate true facts, but rather to communicate information their owner wants communicated. Regardless of whether the media is a news outlet or, I don't know, your own mouth.
I'll invoke another HN post -- Donella Meadows, speaking of systems, highlighted the importance of accurate feedback.
You can go further and note that all models are wrong, but some are useful. And note the research suggesting that perceptual systems evolve for fitness rather than accuracy, a crucial distinction.
But the perception still needs to provide useful predictive or explanatory power over the range of experienced conditions. And if you're deliberately violating that condition, you're going to end up with some less-than-beneficial behaviors.
Contemporary politics in various parts of the world, and its interactions with media, demonstrate this well.
And I'm talking with an ad blocker enabled. I can only imagine how bad it is without one. Twitter is still sluggish as hell, especially considering it's basically one big table of text and images.
And that's mainly due to what, though? AJAX? (Ignoring other enhancements.) And I'm excluding a lot of "apps", because the browser has gotten more powerful, sure (and software dev has progressed). But I mostly mean content sites that aren't heavy on any real interaction, in other words, sites I "surf". Things just feel ... sluggish at a UI level (not the old sluggish of watching progressive JPEGs slowly refine).
I agreed with the top half of your post, but the bottom half is crazy talk. JavaScript and HTML by themselves aren't the problem. You mentioned the problem, but only as an addendum: shitty programmers will write shitty programs regardless of the language they write in. People like to blame JavaScript, but if the web ran on Haskell, you would be complaining about how awful Haskell (the language) is. It's not the tool, it's the person using the tool.
I don't think that's quite true. A huge, huge amount of effort has to go into JS engines to make them competitive with any sane language.
I'm also implying that the attitude of JS and HTML (ignore errors) seems to carry over into the users of those things. Needing wrappers like jQuery to accomplish basic tasks that should have been part of the standard makes it worse. Look at all the "shadow dom" and other hacks around terrible performance due to HTML's model.
Even poorly written desktop apps don't seem to be as bad as common web dev. I'm not sure that most desktop devs are better trained or care more. It's just harder to screw up.
> A huge, huge amount of effort has to go into JS engines to make them competitive with any sane language.
Most people consider Python and Ruby "sane languages", and JS has been running circles around them for years and years. Even the most trivial JS JIT beats CPython and MRI/YARV hands down.
> I'm also implying that the attitude of JS and HTML - ignore errors - seems to carry over into users of those things.
JS doesn't ignore errors. It throws exceptions for all kinds of illegal operations.
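A couple of trivial examples of what I mean; JS raises errors for genuinely illegal operations, it just doesn't take the rest of the page down with them:

    // Trivial TypeScript/JS illustration of the point above.
    const obj: any = null;

    try {
      obj.foo;                  // throws TypeError: cannot read properties of null
    } catch (e) {
      console.error("caught:", (e as Error).message);
    }

    try {
      JSON.parse("{not json");  // throws SyntaxError from the JSON parser
    } catch (e) {
      console.error("caught:", (e as Error).message);
    }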
And error recovery is not a reason for CSS's performance problems. Of all the wrong reasons I've heard suggested for CSS's performance issues, that is one of the weirdest.
> Look at all the "shadow dom" and other hacks around terrible performance due to HTML's model.
Shadow DOM isn't really about performance. Are you thinking of virtual DOMs as implemented by React, etc.?
Not my idea of sane (Ruby has a command-line option for how it handles string printing...); I was thinking of more strictly typed, compiled languages.
By error handling I mean scripts don't break the page. Same as HTML - the browser tries to ignore source errors. This sloppiness spills over, I think, into the attitudes of developers. But maybe this is an invalid personal projection.
I suppose I mean virtual DOM. In other GUI platforms, can't you usually just provide a command to temporarily stop painting, then resume after you've made changes that need recalculating?
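For what it's worth, the closest browser-side analogue I can think of is batching your own DOM writes: build the new nodes off-document and attach them in one go, so the engine only has to do one layout/paint pass. That's roughly what the virtual-DOM libraries automate. A rough sketch (the element id is made up for the example):

    // Build 1000 rows in a detached DocumentFragment, then attach once.
    // While the fragment is detached, none of this triggers layout or paint.
    const list = document.getElementById("results")!;  // hypothetical element
    const fragment = document.createDocumentFragment();

    for (let i = 0; i < 1000; i++) {
      const item = document.createElement("li");
      item.textContent = `row ${i}`;
      fragment.appendChild(item);
    }

    list.appendChild(fragment);  // one insertion, one reflow/repaint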
But perhaps web dev isn't special; sure, there's plenty of terrible server software written, and maybe I'm extrapolating poorly from too little experience.
The end result is that, day to day, I can count on basically every desktop app being fairly responsive, whereas loading any given content-oriented (low-interaction) website has a high chance of feeling sluggish to operate. When the actual functionality of said websites is fairly unremarkable, it makes me think the platform bears some of the responsibility, somehow.
Edit: I'll admit I'm armchair engineering. I haven't done serious web dev in over 10 years. Just extrapolating based on some observations and how the JS community seems to be, overall. I might be way off here.
I remember the days of Firefox 2, and it consumed even a bit more than that, which was a disaster compared to the amount of RAM available back then (and I'm not even talking about speed here).