"There is a good chance that upgrading fails and you would need to give it a second or 3rd try."
Sad that this has almost become the norm when developing in the modern javascript ecosystem. I dread touching those projects, and creating one even more, because stuff just rots away and your app might break in days, weeks or, if you are lucky, months. I'm sure there are better developers out there who can handle all of this and know how to avoid it, but a bozo like me does not. This is actually causing me stress irl.
I was recently convinced to start using a packer (webpack, parcel) and was blown away by the hoops people have to jump through to make stuff work. I've been spending an hour or so a day coming up to speed for the past few weeks, and every yarn/npm package I installed had some kind of error/warning that required a hack or version contortion. I naively thought I could webpack with electron+vue+pug+ts and was soundly smacked down repeatedly, even after trying the boilerplate from each component's docs.
I appreciate what the developers are trying to do, I really do. Only, it is too many cooks + death by a thousand cuts. As I read through pages of closed-but-not-really GitHub issues for each package, sometimes going back years and years, it feels like a constant stream of hacking that erodes the well-intentioned first versions.
> but a bozo like me does not. This is actually causing me stress irl.
Me too! I dread package upgrades because it can instantly turn into an all-hands-on-deck emergency, and these are just the stand-alone packages, not all the ones I mentioned above.
To me what’s interesting is that ESBuild (which is a Go-based JavaScript bundler) has come out of nowhere in a relatively short period of time, and it’s basically already better than any of the existing JS bundlers. Not just faster: it also has a radically simpler UI.
I have written JS since 1997 and kept up with ES6 and so on over the years. I've used it on the frontend up until now.
The JavaScript ecosystem just isn't very productive [1], for multiple reasons. The language is untyped. You have no multi-threading. It's hard to debug. You have to spend many brain cycles learning browser problems, which muddles your understanding of the language. And the web way of "we can fix it and everyone will reload the browser" leads to a don't-care attitude.
For web/backend this is approachable, though still not good.
For build-systems this is terrible. Why? Because if a bug shows up in a build-system you stop a million developers from being productive. The bug won't get fixed right away, most will just work around it, use different plugins, copy/paste yet another webpack config etc.
Plus, JavaScript tends to make people realize what FP is, and they all start putting functions in their functions and making everything modular and plugin-able (see higher-order components in React). While this can be nice, it leads to even more confusing bugs.
Writing plugin-architectures is hard in any language. Writing it in JavaScript is a recipe for disaster.
Hence the modern JS-ecosystem.
[1]
Google: nullpointerexception -> 4.360.000 results
Google: undefined is not a function -> 800.000.000 results
> Google: undefined is not a function -> 800.000.000 results
You didn't put quotation marks around the second, and the first has no spaces, so this is not a fair comparison. "undefined is not a function" with the quotation marks has 449,000 results.
I don't doubt that there are many results for "undefined is not a function", but I feel quite confident saying there aren't anywhere near 449,000. If you ask google for just 111 of them, you'll get "In order to show you the most relevant results, we have omitted some entries very similar to the 110 already displayed."
Please, please, please do not use google's result counts as an indication of really anything at all. Don't take my word for it; do this test: search something, like the name of someone you know who's not famous. You'll find that google will report tens of thousands (or millions! lol) of results. But click through 10 or 15 pages. You'll soon find that many of the pages have almost nothing at all to do with what you searched (and may not even contain any word from the phrase!), but that if you get far enough, maybe 300 results or so in, you'll see that when google runs out of results it will magically just update to show the real number, which is often just a few dozen for a search like the one I described.
That makes perfect sense though. Older projects have a ton of baggage that they carry through all of the revisions. Newcomers to the space can learn from the mistakes of previous approaches and also utilize the latest and greatest paradigms to solve the problem without worrying about backwards compatibility.
This is categorically not true. It does exactly the thing it's supposed to do. Yes, it is best that potential users wait for now, but that isn't because it "doesn't do anything".
What it _doesn't_ do currently is provide an API which would allow it to be plugged in in a similar way to Webpack (not exactly the same, as the deliberate aim is not to cover every use case the way Webpack does). This is a requirement for wider adoption as a competitor (rather than just as the module bundling step). Plus the tree shaking needs work, though I think that is less important than has sometimes been made out.
Parcel may have a simple API for simple things, but it's extremely slow and generally suffers from exactly the same issues outlined by OP.
> Me too! I dread package upgrades because it can instantly turn into an all-hands-on-deck emergency, and these are just the stand-alone packages, not all the ones I mentioned above.
To all package manager developers: make sure you have a rollback function, and make sure it is flawless.
Committing the package-lock.json file should do this for you: I always upgrade one dependency per commit, and I put code changes in separate commits from upgrades. Also, you just have to learn which packages to avoid: react-router, for example, likes incompatible major version updates. Something simple like routedux that’s stable is almost always a better choice.
With all the different package managers out there, it's just not much fun to keep track of all the rules and special cases for each tool. Better to just have a "--rollback" option.
Isn’t this solved more generically by version control? As long as your package manager uses files to control the packages searched for, then you commit those files between upgrades and revert them to roll back. nix handles this at the system level by making your system configuration match a declarative specification.
Would be interested to hear what issues you ran into with Parcel. I've used it on dozens of projects without doing any sort of deep dive into their pipeline - just using the default no config compilation and it's worked pretty much perfect every time.
Sorry, my post was misleading, I was convinced to use -A- packer, so WebPack, Parcel and FuseBox in that order. WebPack had the best documentation hands-down because 4 weeks ago I didn't know what a packer did, I had been purposefully avoiding them for years. Now that I know way more about the concept/theory, I'll go down the list. Although I'm reluctant to waste too much time because my experience trying to bring all of the frameworks I already use into webpacker was so painful. But I really am pulled in by hot module swapping...
Strange, because parcel in particular is zero configuration and I have used it to build client projects running in production. No complicated webpack configuration needed.
Parcel is a rare thing in the JS world: a tool I’ve actually found useful and efficient. As a bundler, it does indeed do a lot of things correctly out of the box, and as a direct result the experience of using it is far more pleasant than most such tools.
However, it is also a tool for one job: bundling. If you also want to do things like unit testing or running any sort of linter or style checker over your code, you probably need to figure out how to get those tools to read your TS or ES20xx code as well. You might still need a handful of additional libraries to implement your test suite. You might still need to configure your preferred style and which rules you want to enforce in your static analyser(s).
So even if Parcel was flawless about installing what you needed for your dev and production builds automatically, you’d still end up needing to install a bunch of other tools and write a bunch of other configuration files. The overall efficiency of setting up a realistic development environment, even using Parcel, is still limited by a kind of Amdahl’s Law problem.
So far what I've been doing is keeping my webpack config as simple as possible (< 40 lines) and just relying on default behavior. I know I'm losing out by not bothering with some of the more intricate configuration properties like caching, etc.
But one of my big pet peeves is when you clone a project and the installation says all you need to do is run make install or npm i, but in reality requires 20 google searches and an hour of banging your head to get the project running and even then you end up with 50 cryptic warning messages in your terminal so you don't even know if you did it correctly.
With a very simple webpack config you might lose out on optimizations but at least you can get just about anyone running a project locally and if things go awry you can generally pinpoint the problem to a specific line of configuration.
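For reference, a config at that level of simplicity is basically just this (a sketch; the entry/output paths are made up):

    // webpack.config.js - minimal sketch, paths are hypothetical
    const path = require('path');

    module.exports = {
      mode: process.env.NODE_ENV === 'production' ? 'production' : 'development',
      entry: './src/index.js',
      output: {
        path: path.resolve(__dirname, 'dist'),
        filename: 'bundle.js',
      },
    };

Everything else (loaders, splitting, minification) stays on webpack's defaults, which is exactly what keeps it debuggable.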
I've found that a slightly longer (~100 lines) webpack config allows me to use all the caching and fancy tricks I want and is still just as debuggable. I think the main thing is setting it up yourself so you know what all the plugins/settings do.
Caching was a major disappointment for me. DllPlugin is a nightmare to set up, and other caching plugins I tried don't really speed things up. What eventually worked for me is HardSourceWebpackPlugin [1] - two lines added and went from 2min to 15s (with a tradeoff of a slightly longer initial build).
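For anyone curious, the "two lines" are roughly this, per the plugin's README (a sketch; your existing config goes where the comment is):

    // webpack.config.js (excerpt)
    const HardSourceWebpackPlugin = require('hard-source-webpack-plugin');

    module.exports = {
      // ...your existing entry/output/loaders...
      plugins: [new HardSourceWebpackPlugin()],
    };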
I got the same message. Still, getting faster builds from one additional plugin, with no configuration needed, on projects locked on 4 or waiting out the transition, is a plus in my book :).
Webpack is kind of more "build your own bundler" than a bundler. The configurability and extensibility is immense and with that level of complexity, it's guaranteed things won't work smoothly for everyone (incompatible plugins, unexpected configurations etc.). Anyway, every big project always has hiccups with new major X.0.0 release.
Think about your own code for a second. It's typically well-defined code that doesn't have to accommodate a combinatorial explosion of configurations. Is it bug free?
Keep in mind that 90% of code is written by one guy. It's not Webpack Corp. with a legion of QA testers.
"Run it three times", however, is not part of such bugginess. It means that they're randomly encountering different executions/run-states on what should be the same config/setup each time they run it. And they have no idea why.
Or they're running things in parallel, and encountering data races and such, and randomly working as a result.
But the real concern is that they don't really know what a correct installation looks like, or they'd just check-and-retry in a loop until it succeeded (the easiest hack around such problems).
> "Run it three times", however, is not part of such bugginess. It means that they're randomly encountering different executions/run-states on what should be the same config/setup each time they run it. And they have no idea why.
Maybe we’ve interpreted that message in different ways, but I took it to mean you may need to try it again at a later date once the community have fixed those issues/bugs. I don’t think they meant literally 3 runs of the same setup.
They’re still working through stuff so there may be edge cases for some where 5 won’t work right now.
I think it’s more that if 5.0.0 doesn’t work the first time, you could simply let the community find those issues for you, and by 5.0.xx it will most likely cover your specific setup (except for the documented api changes).
Configuring the modern JS toolkit (webpack, Babel, your framework of choice, Jest, etc) is such a pain. I’ve been doing front-end for 10+ years. You’re probably not a bozo.
Deno does away with most of the tooling configuration. You get a linter, bundler, docs generator, formatter, watcher, version manager, std library, built-in tests, and many other things that you would otherwise source from third parties in the node ecosystem.
Support for webgpu and local storage incoming. Makes it a delight to write scripts. You can also scope them by permission.
I wrote a CLI tool to try it out. It monitors CPU/GPU temps then generates an HTML chart on SIGINT.
I like top-level await, the standard library and first-class typescript support.
I do wish the sandboxing was more granular. My small CLI tool requires: --allow-read, --allow-write, --allow-run and --unstable. I only need read/write in a single directory and to run a single binary. Unstable is required for signal handling, but that shouldn't be the case forever.
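To illustrate the granularity point, a trimmed-down sketch of such a tool (the ./data paths and the "sensors" binary are made up, and the SIGINT handling is omitted since that API was still behind --unstable):

    // temps.js
    // run with: deno run --unstable --allow-read=./data --allow-write=./data --allow-run temps.js
    // --allow-read/--allow-write can already be scoped to a directory, but --allow-run can't be
    // narrowed to a single binary here, which is the granularity I'd like.

    const p = Deno.run({ cmd: ["sensors"], stdout: "piped" }); // hypothetical binary
    const reading = new TextDecoder().decode(await p.output());
    p.close();

    await Deno.writeTextFile("./data/report.html", `<pre>${reading}</pre>`);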
I'm glad someone is re-imagining JS/TS on the back end. A robust and stable standard library could well improve the dependency hell and broken projects issues.
And it will likely have similar issues in five years to the ones node has now.
Node was once thought to be the cleaner alternative that had a lot of these features built in, it was the supposed savior of Javascript, and now look at where we are.
I do believe constant iteration is better, but Javascript has many problems at its very core, and many Node developers transitioning to Deno are going to create and write solutions that are more akin to their comfort in Node.
This will lend itself to reproducing similar issues as they currently have.
I am primarily a javascript developer, I've written both Node and Deno projects, but I don't see the issues being solved by just rewrapping the source output.
I have found that over time, I have become more sceptical about new shiny things in the programming world. Every now and then, I wonder if I’m becoming the dreaded veteran dinosaur type, the developer who has more demands on their time now and who just hasn’t kept up with the tech.
But then I look at all the troubles the kids have with their fragile, short-lived tools. I recognise short-sighted design decisions they’re making because they’re being pushed into them by some shiny new library. I notice that many of those tools and libraries have big names behind them, and that the Zeitgeist people think you’re weird if you’re not using them, and remind myself that these things are only ever means to an end, no matter how popular they become.
Of course it’s useful to keep generally aware of what’s happening in the industry and from time to time a genuine improvement does come along. However, there is definitely a difference between not keeping up and not wasting your time repeating past mistakes or following blind alleys that often end up at dead ends.
Yeah, not-giving-a-damn-driven-development is also the only way to stay sane and not be overcome with paralysis from being surrounded by a thousand broken windows in need of fixing.
It's specifically the frontend ecosystem. A problem caused by the numerous languages, assets, and formats that have to come together, the lack of a JS standard library, a poor module system, and the disaster of browser compatibility mixed with transpilers and polyfills.
Webpack already requires zero/very little config by default. That's what most should use, and that's if you even need to use Webpack directly. Otherwise I recommend sticking to the major site frameworks like Next/Nuxt/Gatsby and letting those do all the config wiring for you.
Once I had the freedom to decide, I gave up on using most of the standard Javascript ecosystem stuff in my front-end projects.
No node, no node_modules, no build chain. I'm lucky to be able to get away with that. It is incredibly efficient, not least because nothing gets in the way of what I'm trying to build anymore. On the previous team I was on, we spent roughly 15% of our time tending to these things.
I know the feeling. It's not just modern stuff, I have an old ExtJS app that I dread to update. The best solution I have found is a dev / build environment in a dedicated VM that I can spin up once or twice a year to make updates. Bringing libraries up to date is a fucking nightmare and often a waste of time. For continuously updated apps I have moved away from complicated build processes. Most web apps are fine without a build process, I think there is a lot of premature optimization going on.
I too maintain an old and complex ExtJS 4.1 app and I never dared upgrade the framework. I carry a big zip around and it will stay that way until the end of its life, or until I finally decide to rewrite it using React. For such an old framework it’s surprisingly bug free though.
The way you avoid problems with a .0.0 release is by not upgrading to a .0.0 release. You would have to do that by hand; npm and yarn have been pinning major versions for a long time now, so that upgrade only happens when you do it by hand. So if you don't want to deal with breaking changes, all you have to do is nothing. Meantime webpack 4 is not going anywhere.
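Concretely, the difference lives in package.json: the default caret range pins the major but floats minors/patches, while a bare version pins exactly (the names and versions below are just examples):

    {
      "dependencies": {
        "lodash": "^4.17.20",
        "left-pad": "1.3.0"
      }
    }

The first entry can silently become any 4.x >= 4.17.20 on the next install; the second always installs exactly 1.3.0. If you want exact pins by default, `npm install --save-exact` (or `save-exact=true` in .npmrc) does that.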
I'm reminded of when npm released 5.7.0 (which some got upgraded to automatically because it wasn't tagged as pre-release) and had a critical bug that deleted your system files.
It was a new minor version, of a totally different package, which was version-tagged in such a way that some distro packagers picked it up when they shouldn't have, and which happened to contain a bug affecting a use case that's been deprecated for at least half a decade now.
Look, I get that there's a lot of hate for modern JS, and it's hardly as if there is any lack of basis for criticism, just as there is with every other highly active, heavily adopted, and fast-evolving software ecosystem. But mistaking this kind of uninformed slagging for meaningful commentary says more about the person who does it than about the subject of their ire.
I've been using ASDF (https://github.com/asdf-vm/asdf) for a while now to manage each of the language (and sometimes build tool) versions for a folder/repo in a consistent manner. I'd definitely recommend taking a look at it.
It helps to pin dependencies so that only the exact specified version is downloaded/installed/used.
I recently updated the dependencies of a frontend project. `vue-apollo` from version 3.0.3 to 3.0.4 had a breaking change. The discussion in the commit on GitHub proposed to release it as v4. But in the dev’s mind it was a fix so he released it as a patch version.
Started my first React Native app some time ago and I can totally confirm that. It feels like a jungle of very fragile dependencies and mechanisms between code and output.
Yes, RN is certainly the worst for that for me. I can't count how many hours I've had to put in to wrangle the RN packager into a monorepo with TypeScript and code-sharing across packages, and it's still unbelievably fragile and I can't actually share components between them. At this point I've made so many little alterations to the config from GitHub issues and StackOverflow answers that I literally have no idea how it works or how to replicate it in a new project. It's an absolute nightmare.
I frequently have to rerun `npm i` when it fails for seemingly spurious reasons. We used to laugh at Windows years ago, when the solution was so often "turn it off and on again", but it's basically what you need to constantly do with tools like npm and webpack.
Yarn is much better in this regard. I'd thoroughly recommend it. It's drop-in compatible (except it will generate its own lock file), so it's pretty easy to try it.
Yarn also gives much cleaner output. NPM is extremely verbose, but it seems that "verbosity" doesn't really confer any benefit. And yarn is also generally much faster.
Hey, Sean from webpack here. This was really just about our third-party plugin ecosystem needing to catch up to v5. If you are feeling IRL stress, then please don't update yet. Sometimes you have to break things to create progress, ship major versions and bring new features. You don't have to use them; you can just use webpack's zero config out of the box.
This is why I’m such a fan of the meta-bundlers/frameworks, like create-react-app or Next.js
I like the features that webpack gives me, but knowing webpack and spending the effort on configuring it just seems like a waste. I'm incredibly grateful to the CRA maintainers for handling that for me.
Webpack itself is quite low level. `react-native init` or `create-react-app` gives you a ready-to-use development environment, and updating that is usually a breeze.
Depends on what packages you use and how well they do semver. Chances are you will get major version bumps for most of your dependencies after 1-2 years of development.
What works well then is to do upgrades partially. Each major library separately, fix issues, go with the next one. Otherwise it's hard(er) to track what breaks your app.
My current project has 100 dependencies and 150 devDependencies. Upgrading takes a week if not longer for one experienced dev. We tend to do it every 3-4 months.
Reading through the comments here has been predictable. A lot of people complaining about the complexity of the front-end ecosystem. Too many tools, too much configuration, etc.
I’d like to say that you don’t need that complexity. If you just want to write a dumb front-end you don’t need typescript, you don’t need babel, you don’t need pug, you don’t need webpack, etc. If these things bother you, just skip it.
I always start my websites with a simple index.html file and run a `python -m http.server`. That is it. Modern JavaScript has a really good module system that works in all major browsers (even Edge), and node. If you want to write type-safe JavaScript you can write your types as doc comments and have typescript check it without bundling or compiling. You literally need only two tools to write a website: a browser and a text editor.
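For example, something like this runs directly in the browser via a `<script type="module">` tag and can still be type-checked with `tsc --allowJs --checkJs --noEmit` (a sketch; file names are made up):

    // greet.js
    /**
     * @param {string} name
     * @returns {string}
     */
    export function greet(name) {
      return `Hello, ${name}!`;
    }

    // main.js (referenced from index.html with <script type="module" src="./main.js">)
    // @ts-check
    import { greet } from './greet.js';
    document.body.textContent = greet('world');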
I’m not here to tell you that people shouldn’t use these complex tools. Many people configure their front-end environment just fine with webpack, babel, typescript, etc. If they work better that way, that is fine. But if that bothers you, simply don’t do it. There is no reason for you to complain about it.
I like your advice, but I think there is a reason for the complaints. An indefensible complexity has become the new normal. And it's what many people are forced to suffer through at work. If having a chorus of complaining dissenters on HN threads sways even a small percentage of people to take simpler approaches, it has done some good.
When people are complaining about this kind of stuff at work, it raises the question of what the alternative is. The person you're replying to describes a great methodology for smaller projects, but in many cases there are great reasons to take on dependencies. Not taking a dependency means you pay in person-hours to reinvent it, and sometimes that cost is much higher than a couple extra seconds of build time.
And, once you have a lot of dependencies, ES modules won't save you. People expect web apps to load fast, and code is big. Things like Webpack help make it a lot smaller.
My point isn't that this is good, it's that it's not "indefensible". There are very real practical reasons that big companies deal with the debt of dependencies and bundlers.
Many people (including myself at work) handle webpack configs just fine. I started a vue project at work the other day and it was as simple as:
    npx @vue/cli@latest create my-project
This scaffolded a fully configured project with webpack, babel, typescript, eslint, a custom dev server, etc. I think react developers have a similar tool called create-react-app. I personally (or professionally, since I only use this at work; personally I just have an `index.html` and a python server) just let vue-cli handle the webpack config and forget about it. I guess rust developers use cargo in the same manner.
I suspect much of the complaining about the complexity of the front-end ecosystem comes from people who either a) don’t do front-end regularly and are trying to build a website for themselves but get bad advice on the tools needed, or b) are backend developers at work and think they know better about the front-end than their front-end counterparts. In the case of (a), they simply need better advice on tooling (and which tools are overkill); the tone of my original comment was more directed at the other group (b; although the advice I gave was for the former group—whoops).
Although, I do think that Vue handles the webpack config in a great way. I've been using vue-cli to manage my webpack config for years now, and never had a minor upgrade of any dependency break the build. It also provides a nice way to hook into the webpack config, so you can still add your own plugins and loaders without having to opt out of it managing it for you.
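For example, adding your own plugin on top of what vue-cli generates is just this (a sketch; the plugin name is a placeholder):

    // vue.config.js
    const SomePlugin = require('some-webpack-plugin'); // placeholder

    module.exports = {
      configureWebpack: {
        // merged into the webpack config vue-cli manages for you
        plugins: [new SomePlugin()],
      },
    };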
Dissenters are almost always people who haven't spent enough time in the JS ecosystem to understand why we need the tools we need to be successful. If you want to build a web app, you must target javascript and the browser. There are so many backend programming languages that enable developers to figure out their preferences. I get it: if I didn't have a choice but to build every web API using rails, I think I would stop doing web development as well.
I learned to embrace the JS ecosystem and all of its quirks a long time ago. I think the ecosystem is first class and better than most other ecosystems I've been part of. None of the dissenters ever have good enough arguments to sway me. To each their own, but JS is a fantastic, vibrant ecosystem with exceptional engineers solving problems that are wholly unique to the browser.
> Dissenters are almost always people who haven't spent enough time in the JS ecosystem to understand why we need the tools
No. Dissenters are almost always people who clearly understand how overly complex and at the same time extremely brittle frontend toolchains are. Understanding why you need these tools doesn't change the reality of what these tools are and how they work.
If one is in the majority and is suffering, then there is an opportunity to examine what one is doing, not to mention the environment.
After working at small shops managing frontends with 1-2 other people and then moving to frontends managed by larger numbers and divisions, especially in a culture of moving quickly, things can get hairy quite fast.
Even so, it helps to make a habit of revisiting things from first principles and to also assume the beginner’s mind, collectively, to work towards more refined, elegant, simpler systems/modules based on lessons of the past.
Speaking of “normals”... it used to be normal for most people to be undernourished.
Now, the normal is that the majority of people in many developed countries are overfed.
And overstimulated.
Same with the code we write.
Overfed. Overstimulated.
The dangers of prosperity and excess!
But at least we have the opportunity to focus and sharpen our realities.
... Anyway, back to Webpack.
Webpack is great. It is one such system that went through a lot of growing pains and complexities to get used to, not to mention subpar documentation in the early days, but it has evolved gracefully and is quite elegant, and has been for a few years now!
Read the docs, people!
And modern refined tools like Vue CLI, create-react-app, Next.js, and now Vite can take care of most of the “boring” frontend configuration!
No one really needs to struggle with Webpack anymore.
>I always start my websites with a simple index.html file and run a `python -m http.server`. That is it
I didn't know that was a thing, so thanks for mentioning it - tips like these are one of the main reasons I read hn - and it'll save me hours of now needless file transfers.
"npx serve" is another good way to serve static files quickly without needing python or any npm installs even. npx is packaged with npm for a while now.
That is all fine for personal projects. However, professional web apps typically have more complex requirements than just supporting the latest versions of major browsers without any further optimizations, so the only viable way of meeting them is to use the complex tooling.
idk. I’ve worked with complex python codebases and they also tend to have some complex setups like virtualenvs, django manage scripts, required system software, custom python versions etc. Thankfully my workplace has this system set up by python experts who maintain all the complex config and document how colleagues can get this thing running on their machine. At work I don’t need to understand the complex setup, and if I were to write a python project personally, I wouldn’t make it this complex.
A complex front-end codebase at your workplace should be similar. Hopefully a seasoned front-end developer is maintaining this complexity, and if a back-end developer needs to contribute a trivial change, they can just `npm start` (documented in the README.md), which sets in motion a whole lot of complexity that the back-end developer doesn’t need to understand, but that just works.
Yes, but... chances are your backend dev will have a system-provided or just plain outdated Node version so they'll get a cryptic exception and will have to reach for assistance.
Ideally you would have a pipeline building the project on each branch commit so that the backend dev can push, wait 5 min, fetch the artifacts for that branch and run it locally to test it. Or even better - if there's a pipeline chances are there's a Dockerfile doing the build but it will not support live-reload.
Either way, it gets complex as soon as you have to get out of your comfort zone. Especially if the docs are outdated/verbose/non-existent.
My setup for simple to somewhat complex web apps is to download a minified version of Preact and htm and just import it as ES6 modules. The builds from unpkg work just fine although it seems you can't use the hooks extension since it has an unpkg URL hard-coded. I haven't looked into building Preact myself.
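Roughly like this, following the pattern from the htm docs (a sketch; check unpkg for the current URLs):

    // app.js - loaded via <script type="module">, no build step
    import { h, render } from 'https://unpkg.com/preact?module';
    import htm from 'https://unpkg.com/htm?module';

    const html = htm.bind(h);

    function App({ name }) {
      return html`<h1>Hello ${name}</h1>`;
    }

    render(html`<${App} name="world" />`, document.body);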
One can complain that the tools could be better, but not that they exist. They exist because they solve a problem that other platforms don't have: real-time delivery over the web and compatibility with a variety of runtimes.
I have to sing some praises for Webpack for a second.
I work at a company with a large (> 10k files) old (> 10 years) JS codebase. It used to rely on a home-grown build tool, but making our builds both fast and modern took the time of multiple full-time engineers. Webpack isn't "fast" like Rust is fast, and it's not old enough to be as well-documented or understandable as I'd wish, but:
1. it is flexible enough to fit all of our weird edge cases.
2. it has a really solid community of people working on it.
3. it made adopting new features like TypeScript extremely simple for us.
I don't see us upgrading from 4 to 5 eagerly (at least until it's been production-proven), but I'm excited about this release, and I love all the hard work that's gone into it. Congrats to the team — it's no small feat.
I can totally relate. I have been working on a similar codebase (6k modules), once built with Grunt and then migrated to Webpack. It has proven to be a mature tool.
The past year or so I've moved almost exclusively to backend at work (from Vue frontend) and simultaneously switched to vanilla JS for all new side projects. For those I use native ES modules heavily for internal code, and use very few external dependencies and import them as global scripts like a heretic. Heck I often don't even use npm for node projects anymore.
For my latest project I'm even trying to make the entire interactive UI without any JS. I've sadly never done this.
The sense of relief getting out of "modern" JS has been palpable. No more build steps. No endless dependabot PRs. No painful dependency updates. Just good clean fun development.
There are tradeoffs. vdom frameworks really can add a lot of productivity vs hand-coding dom updates. But if I really need it I can use something like Mithril which still doesn't require a build step.
For personal projects that I'm not certain are going to require a battle-tested stack fully capable of handling any weird business requirement from product, I always start simple.
This strategy has worked great for personal projects. However, I would never start this way for a commercial product.
Version 5, and this tool still has absolutely awful UX.
I have no problem configuring Webpack, but it's ridiculous that:
- it needs two plugins to generate CSS files [1] (see the sketch below), when the homepage lists CSS front and center as one of the supported outputs; [2]
- it needs a plugin to have a non-trash output log. [3]
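For the record, that means more or less every project carries something like this (a sketch of the usual css-loader + mini-css-extract-plugin wiring):

    // webpack.config.js (excerpt)
    const MiniCssExtractPlugin = require('mini-css-extract-plugin');

    module.exports = {
      module: {
        rules: [
          { test: /\.css$/, use: [MiniCssExtractPlugin.loader, 'css-loader'] },
        ],
      },
      plugins: [new MiniCssExtractPlugin()],
    };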
Parcel was supposed to fix the situation, and in many cases its UX is an order of magnitude better, but in exchange it brings a slew of unfixed bugs in even the simplest projects that stray only lightly from the base usage.
Maybe in 2030 the JS community will have figured something out.
I saw this headline and kind of winced because I just finished doing battle with our webpack config(s), but top-level await, automatic Worker entry points, cjs tree shaking, and import.meta support are huge! I'm also grateful for the removal of Node polyfilling -- it tends to just add confusion when you're testing across different environments.
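For anyone who hasn't seen the new Worker handling: webpack 5 statically picks up this pattern and turns it into a separate entry point (a sketch; the worker file name is made up):

    // main.js
    const worker = new Worker(new URL('./heavy-work.js', import.meta.url));
    worker.postMessage({ start: true });
    worker.onmessage = (e) => console.log('result:', e.data);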
The last time I tried to bump webpacker from 3 to 4 at <dayjob> was absolute hell. The fancy chunk splitting logic was spewing out non-deterministic bundles across different machines in our infra. Took a lot of deep digging in the internal graph representation to find the root cause.
I’m scrolling and scrolling waiting for a paragraph that tries to sell me on why I’d bother and risk upgrading. I think the best I got were patch notes.
The benchmark is dominated by terser and SourceMap performance. What you see are different terser configurations: webpack 4 with compress.passes: 1 and webpack 5 with compress.passes: 2. Enabling SourceMaps makes minimizing extra expensive.
Thank you for making this comment - without it I would've ignored this new release.
Failure of bundlers (Webpack, Parcel, Rollup, etc) to support import.meta.url has been a major headache for me. Making my JS library usable in sites that use such bundlers in their build chain has led to me making some - questionable, let's say - decisions in the library's code. The sooner I can remove my "solutions", the happier I will be!
The parser was updated, which brings the ability to use new (stage 4) ES syntax without having to transpile (quick sketch below the list):
- Nullish coalescing
- Optional chaining
- Numeric separators
- Logical assignment operators
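For reference, those four look like this (a quick, self-contained sketch):

    const config = { port: undefined };
    const user = { address: null };

    const port = config.port ?? 3000;  // nullish coalescing -> 3000
    const city = user.address?.city;   // optional chaining -> undefined
    const budget = 1_000_000;          // numeric separators
    let retries = null;
    retries ??= 3;                     // logical assignment -> 3
    console.log(port, city, budget, retries);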
I'm surprised it wasn't mentioned in the notes -- I guess everyone is transpiling for production anyway, but it's still nice not having to do so in development.
Now if only they had a way to activate support for stage 3 features (https://github.com/acornjs/acorn-stage3) so all the class field stuff could be added to that list (modern browsers have already supported it to varying degrees for a while now).
The biggest thing I've been excited about is having an offline cache for fast recompilations. It'll make startup times a lot faster, which is a big improvement for developer experience (especially in larger codebases).
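Turning it on is just a small config block (a sketch following the webpack 5 docs):

    // webpack.config.js (excerpt)
    module.exports = {
      // ...
      cache: {
        type: 'filesystem',
        // invalidate the cache when the config itself changes
        buildDependencies: { config: [__filename] },
      },
    };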
Up to a point; eventually, when the code base is large enough, you spend so much time serialising/deserialising and passing things between workers that it can be faster to disable the cache.
From experience with one of the later betas (around 22 or 23?) we saw no improvement or slight regression with the disk cache vs the memory cache for incremental builds. Initial builds (warm starts) were faster, but there was still significant overhead reading the cache - this is an area where any optimisation would yield big results. I haven't had the chance to try the released version yet, but my understanding is there weren't any significant changes in this area.
We've been using Webpack for a number of years now with our massive webapp.
A few months ago we started using esbuild for our development setups. Build times went from ~6 minutes to ~1.
This is a sweet spot for us, as esbuild is performing really well and has made our development much easier. We still rely on webpack for our staging and production builds.
The reason for this is that esbuild doesn't support a variety of features we need for production (transpiling to older ES, uploading bundles to S3, etc).
But I'm very happy with our "esbuild for development, webpack for deployments" process.
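Conceptually, the dev-side build is little more than a call like this (a sketch, not our actual setup; the entry and outdir are made up):

    // build-dev.js - esbuild's JS API
    require('esbuild').build({
      entryPoints: ['src/index.tsx'],
      bundle: true,
      sourcemap: true,
      minify: false,
      outdir: 'dist',
    }).catch(() => process.exit(1));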
Any reason other people are not taking a similar approach?
ESBuild will transpile to different versions of ES6. It won’t drop down to 5.
I assert that you don’t need 5 support because you aren’t actually QAing IE11 and if you did, you’ll learn that it’s actually been broken and no one reported it.
As someone who fought battles with webpack and webpack 2, I’ve since been pleasantly surprised that each upgrade has just gotten simpler and more stable, and I can progressively delete more of my own code as the features get built in. Big thanks to the maintainers!
Eventually the packing part can be done in a compiled language. https://github.com/evanw/esbuild is very fast at doing some things already but not others.
And I guess webpack wins because you can configure it to do anything easily already? And with compiled languages 'configurability' is hard to do?
I've been putting off learning Webpack for far too long.
Can anyone provide some kind of a syllabus to help me figure out what there is to learn about it, starting from almost no knowledge at all?
I feel like it's a critical enough piece of modern web infrastructure that it's worth me taking the time to fully understand how to use it and what it's capable of.
This pretty much nails the problem with Webpack in my view. It doesn't seem to be based on any kind of strong, logical, easy to understand core, but instead is comprised of lots of interwoven threads of magic that no-one really understands.
This makes any attempt to debug and solve problems with it a total nightmare, like crawling through a tar pit. You might make it, but you're gonna feel all sticky and dirty coming out the other side.
That said, kudos to the developers, I use it every day, but I wish it were a lot simpler.
For me, I need webpack to offer ready-made "configuration sets" that are easy to use. Let's say one of them is a React JSX-to-JS "set", bundled with the css, style, file and url loaders. Then those who want to start with webpack and TypeScript React only need to use that "set" and define the input/output.
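Today you still have to spell that "set" out yourself, roughly like this (a sketch of the usual wiring; ts-loader here stands in for whichever TS/JSX loader you prefer):

    // webpack.config.js (excerpt) - one common "React + TypeScript + CSS" rule set
    module.exports = {
      entry: './src/index.tsx',
      resolve: { extensions: ['.tsx', '.ts', '.js'] },
      module: {
        rules: [
          { test: /\.tsx?$/, use: 'ts-loader', exclude: /node_modules/ },
          { test: /\.css$/, use: ['style-loader', 'css-loader'] },
          { test: /\.(png|svg|woff2?)$/, use: 'file-loader' },
        ],
      },
    };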
As soon as you veer off the beaten path, which might as well be a tightrope, you need to understand things about webpack that are not made readily available. You need to know what exactly the CSS loader does, what _exactly_ the style loader does, what source queries are, how to _properly_ override import resolutions, and so on.
Getting this information out of the documentation is like squeezing water from a rock. It seems to be written from the point of view that you don't need to know how and what is going on, and if you do, you probably already know it.
Honestly, if your day job hasn't forced you into using a bundler, I'd recommend avoiding it if you can. For modularity, I start with native ES modules. If things got complicated enough that I thought a bundler would offer me some sort of a benefit, I'd reach for rollup first.
> I've been putting off learning Webpack for far too long.
I don't think most people really "learn" Webpack. Mostly they just tweak cut+paste examples and get back to value add work. You can definitely achieve 99% of whatever task you have without going down the warren of Webpack rabbit holes. You're committing no crime if you just relegate Webpack to the "get it working and forget it" bin; it does what it says on the tin and you can rely on it despite not being an expert.
EDIT: ok, some more context. You should be glad you haven’t had to deal with this shit show yet. Don’t walk willingly into it, there are much less painful tools like Rollup, Vite and Parcel around.
I was joking just the other day: could you imagine if someone you worked with came up with the set of design decisions that required webpack + Babel + node_modules? You would think they were insane and laugh them out of the design meeting, even the company. Maybe it’s time for a reframing - which problem are we trying to solve?
Honest question: How is that any different from someone coming up with a set of design decisions that required rustc + cargo + crate modules in (target/*/deps)?
Okay, so rustc is compiling rust to machine code. Here we have some advanced version of JavaScript (es7?) going through a series of compilations, asset extraction, tree shaking, dependency graphs that are 10000+ packages long, etc. The solutions rust and cargo chose (like having a decent standard library to reduce deps) are all better. Rust is fast and worth paying a compilation price for; is it worth it in JS so we can have shiny semantic sugar everywhere? I don’t know how crate modules work, but hopefully each library needs you to specify versions of all dependencies at the top level rather than magically hiding modules you’re adding 5 levels deep.
I could probably go on further if I knew rust better. The ecosystems are completely different IMO, and webpack as a build system leads to complexity, not clarity - it's like a machine where an animal goes in and a sausage comes out, whereas other build systems are more logical in their steps.
Yes good luck shipping that main.rs to a web browser.
The problem that Webpack solves is performance optimization in addition to what Rust offers:
- writing in a higher-level language: "write modern code and ship to IE 5" is the same as "write in Rust and ship to x64"
- take a bunch of files and make 1 or 2 out of it.
Nobody stops you from loading jQuery and 3 plugins in a script tag and calling it a day. Webpack exists because jQuery and 3 plugins can't deliver the same results nowadays.
Also, do you want to improve your code with types and linting? Try doing that without Webpack; it's not going to be much more efficient.
---
In short, Webpack and Babel basically match the Rust compiler in functionality. If you don't use them, your web app’s code will be harder to manage, not easier. That's all.
Have you ever seen a 3000 line webpack config file? I think the analogy between webpack and rust is nonsense to be honest. They are doing different things; I just bit on the previous question.
Maybe you can justify the complexity in JS projects but I find it excessive.
If you replace Webpack with Parcel, the argument holds up. Webpack just offers more bells and whistles, which in our case we need in order to wrestle legacy projects into modern build pipelines. For a new project, maybe you don’t need all that, and you also don’t need Webpack with 3000 config lines. And fwiw, with every major version, Webpack has moved to more sensible defaults for modern web bundling, meaning that if you stick with it for a new project, you likely only need 5 config lines.
Coding something in java/kotlin after months or years in only the browser world is so relaxing. Maven (or Gradle) just works, and upgrading packages is never a scary thing. And if an update does break something, it is usually a small API change in some library, which the IDE will happily highlight for you with a red squiggly line. And what I love about Java is that you have very stable libraries for everything, whereas in the javascript world, if you don't touch it for a year, you're already a dinosaur.
it is really frustrating to read infinite variations of the comment "javascript development is too complicated", "it is much simpler in other languages" etc.
yes, it is complicated. the problem is, there are certain constraints that we just cannot ignore: we want to write code in a "normal", efficient way (code organized in namespaced modules, perhaps static typing etc.), but we need it to be executed by the web-browser, maybe an old version of a web-browser. so we need solutions that work with these constraints.
if you know about simpler solutions, i'm very interested in hearing about them. but please note, it has to work with those constraints, so ideas like "just write a native application" do not help.
I wish we were all using swc.rs by now. Last time I checked it was still relying on nightly rust, but it seems that's no longer the case. This week I saw someone complaining that webpack-dev-server was taking 4.5GB (not a mistake) to "concatenate strings for older browsers" in a fairly large codebase. There is just no excuse for that.
I've had some really horrible experiences with webpack.
When I first started with it, it was daunting. Seriously, just look at the docs, it's pages upon pages of complex behavior. Some of it fairly unintuitive.
It took me days to dive in, get used to it and kind of know what I was doing, and then I promptly forgot everything I knew about it over the following months/years.
Every time I have to dive into the webpack-related config, I basically set aside a whole day and feel a deep sense of dread. This isn't normal.
Also, Webpack has horrible UX. At first I thought that was because it's complicated and covers a difficult problem space. But then I used parcel, which is leagues better in terms of usability. I absolutely love parcel and think it's so much better overall. Unfortunately, their team is much smaller and the tool is much less used, so webpack has a couple more features and is more "stable". Well, actually, it goes both ways, but most of the time parcel has some very small error that webpack might not.
Which is why I basically run both at the same time - parcel for its speed and ease of use, webpack to catch a lot of errors (like those output by ts) that for some reason parcel (1.x at least, I think 2.x is coming soon) doesn't catch.
Not only that, but also with every release links pointing to older versions stop working, breaking stackoverflow answers, blog posts, and a host of other things.