Keep Webpack Fast: A Field Guide for Better Build Performance (slack.engineering)
214 points by firloop on Jan 18, 2018 | hide | past | favorite | 47 comments


If anyone is tired of Webpack configuration issues / slow performance, give Parcel a try:

https://parceljs.org/

The project is less than two months old, so Parcel might not be a good fit for you. But if it is, it's usually a drop-in replacement. Zero configuration. Just run it and go.

We're working hard to add the most requested features: source map support is coming very soon, for example. But if you can put up with a few rough edges, Parcel promises to give you a ~5x performance increase over webpack.

(We kick off multiple processes to do the actual bundling work, and we have an asset cache, which accounts for most of the speedups.)

If you'd like to participate or have any questions, hop on our Slack! https://slack.parceljs.org/ We have about 500 devs ready to help you out and chat with you (about anything, really).

Feel free to snag an issue and start working on it: https://github.com/parcel-bundler/parcel/issues?utf8=%E2%9C%... There's plenty of work to go around.

It's been really cool to see so many people pitching in. We recently had a Japanese contributor help get Parcel up and running for multi-language projects: https://qiita.com/zacky1972/items/0ce05454b67506edc634


> If anyone is tired of Webpack configuration issues / slow performance, give Parcel a try

Shawn, half the articles about Webpack I see recently have these kinds of comments about Parcel. This goes beyond excitement into spammy behaviour. I've seen plenty of other people complaining about this too.

This was an article about Webpack performance, and you've come along to talk about unrelated Parcel features like source map support and i18n, to link to the Parcel Slack, and to ask for help working on Parcel bugs.

It really puts me off Parcel, and there's no chance I'll look at it while every article about Webpack that pops up attracts a comment from a Parcel developer spamming your project. Please stop spamming Parcel in Webpack discussions.


But why not actually try it and then come back with what you found? I.e. maybe he's here because Parcel solves most of the pressing issues.

(I myself have no skin in the game either way)


I am sorry if this is a silly question, and no offence intended.

Is Webpack beyond fixing? Tech has always been about reinventing the wheel, but the JS community seems to dial this up to a whole new level.

Isn't Webpack 4 heading towards Zero Configuration?

Isn't Webpack 4 making many performance improvements?

This isn't to say I dislike Parcel. I mean, God, I wish Rails had picked it up as the default.


Out of interest, is there an overview of Webpack 4 somewhere? This sounds interesting and it's the first I've heard of it.


I would prefer to stick with webpack and make it easy and fast again, rather than changing tools again. We've seen this with Grunt, Gulp, and now webpack. What if we changed strategy this time?


Nah, I heard Yeoman solves all these problems.


Maybe a naive question, but did you consider writing it in a language that is typically faster (C++/Go/Rust)? I understand the many benefits of it being written in JS, but there is an inherent maximum level of performance that JS can achieve in certain workloads.


My naive answer, but I'd think that the heavy lifting for bundlers like this lies mostly in the external packages that parse and modify the code (Babel, uglify, postcss, etc). The core work of the bundler to wrangle which dependencies to pass to those external libraries probably isn't a performance bottleneck, surely?


In my experience you are not wrong. Resolving a build graph can be done efficiently; the main cost is typically the tools invoked to satisfy build targets. However, if the build graph does unnecessary work, it can slow the build down by a large amount, e.g. compiling files which were already compiled. In a large project, that might be the difference between a 10-second build and a 10-minute build.


There is an attempt to write a JS module bundler in OCaml: https://github.com/fastpack/fastpack


Well, at least there is a tool in Java, and once warmed up it's basically the fastest tool available: Google Closure Compiler. If there were a Closure service that could be installed on your system, a lot of things would be way faster.


I tried out Parcel this past weekend and I was really impressed with how easy it was compared to when I tried using Webpack for the first time. Just import what you need, it finds it all, and you're good to go.


Parcel is a neat solution; the lack of source maps stops me from using it for anything serious, but I believe this is being worked on.


Is Parcel the first node module bundler to run on multiple cores? I’ve been waiting for this forever!

Definitely going to take a look at the code and try this out soon.


There’s a reason why it’s not generally done: there are no good tools for doing this in Node, nor is JavaScript particularly conducive to it.

The state of the art in such matters is Rust with its type system-assisted fearless concurrency; my favourite demonstration of how it can work is Rayon, where introducing data parallelism is typically as simple as replacing a .iter() with a .par_iter(). I’d love it if something like Rayon could be ported to Node, somehow.

(By virtue of being excellent at the low-level and having zero-cost abstractions, Rust actually ends up being a really good high-level language.)

In Node, if you want to process a zillion files in the same way, there are no good solutions. I recently needed to improve the performance of a manually-crafted, Node-centred build process; in Rust, it would have been a matter of: add Rayon, replace a .iter() with .par_iter() and we’re done—and what’s more, it’ll only flex its parallelism muscles where it’s worthwhile. But in Node it took several hours of trouble (after research, I settled on using worker-farm to help) to produce a strictly inferior result.

Rayon: https://github.com/rayon-rs/rayon

worker-farm: https://www.npmjs.com/package/worker-farm


I've also tried to naively improve a Node build process, and gave up after realizing that, for the particular project I was working on, splitting the build into multiple processes actually slowed things down compared to the single-threaded build process.

Node's single-threaded event loop works well for some use cases but makes it hard to parallelize a build process.

I would absolutely use a module bundler developed in a better-suited language and distributed on npm as a binary, like Flow is.


Then you should consider fastpack. It’s written in OCaml and uses the Flow parser. Also, it should be distributed as a binary.

Fastpack: https://github.com/fastpack/fastpack


I would also consider bazel.

https://bazel.build


Have you tried it to build web projects? Unfortunately there are very few experience reports out there.


I’ve thought in concrete terms about porting postcss to Rust before. It’s the sort of tool that is stuck being slow while it’s in JavaScript, but could be amazingly fast in Rust, for which it’s a really good match (especially algebraic data types). (Seriously, the sort of thing that typically takes a second or so in JavaScript with things like postcss I would expect to take under five milliseconds in well-written Rust, and under a hundred in badly-written Rust.)

The real problem with all these sorts of things is network effects. People aren’t going to use the Rust port of postcss unless the plugin they want to use works in it, and it can be conveniently tied into their existing toolchain, and—— and——

This is why we can’t have fast, efficient things.

I’m really looking forward to getting something of the pthread style in WebAssembly, because then, finally, blending the two should be easily feasible. You can already use things like Neon (https://www.neon-bindings.com) to write Node modules in Rust, but it’s not as easy as it should eventually be a few years down the track.


> The real problem with all these sorts of things is network effects. People aren’t going to use the Rust port of postcss unless the plugin they want to use works in it, and it can be conveniently tied into their existing toolchain, and—— and——

I've never used postcss, but I don't think this is strictly true for all JS build tools. There's libsass, written in C++. I've mentioned Flow. I'm sure there are others.

I think you're right though that webassembly is going to be big for this kind of thing.


I believe webpack can run using multiple threads at least [0]

[0] https://github.com/webpack-contrib/thread-loader
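For reference, wiring thread-loader in looks roughly like this; the pool size is illustrative, and thread-loader hands everything after it in the `use` array to a pool of worker processes:

```javascript
// webpack.config.js fragment: thread-loader runs the loaders listed
// after it in a pool of worker processes. Pool size is illustrative.
module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        use: [
          { loader: 'thread-loader', options: { workers: 4 } },
          'babel-loader', // now runs inside the worker pool
        ],
      },
    ],
  },
};
```

Note there is per-job serialization overhead, so it pays off mainly for expensive loaders like babel-loader on large projects.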


This sounds great, but in reality most projects have become so dependent on webpack that moving to another packager would be a time waster (at least from management's point of view, and heck, I might agree: for us it would take weeks if not months).


Why is Parcel 5x faster than webpack?


Maybe it's because you don't spend 5 hours reading through webpack documentation and examples?


Excellent summary of some of the techniques you can use to speed up Webpack builds. It'll be interesting to see how the ongoing work on performance in Webpack 4 relates to this [0] [1], and I'd guess that the just-announced Rust->WASM sourcemap lib work [2] will help too.

If anyone's looking for more info on this topic, my React/Redux links list has a large section of additional posts on improving Webpack build perf, code splitting, bundle size optimization, and other related techniques [3].

[0] https://github.com/webpack/webpack/issues/6244

[1] https://twitter.com/TheLarkInn/status/945486181575340032

[2] https://twitter.com/TheLarkInn/status/954053251854344192

[3] https://github.com/markerikson/react-redux-links/blob/master...


Can anyone explain how someone can make a build system and not make correctness and speed the top priorities from the start? Once you do that, parallelism is the third feature you implement, the first being running an action, and the second being a hash-based dependency graph of those actions' inputs & outputs. Once you have that dependency graph, parallelism and caching appear pretty much for free.

It looks like Google uses bazel (a parallel, dependency-graph-based build system) internally, and they have 1-2 second rebuild times on apps much larger than Slack's [0].

0 - https://medium.com/@Jakeherringbone/what-angular-is-doing-wi...
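The "parallelism for free" point can be made concrete: once you have a dependency graph, every target whose dependencies are already built can run at the same time. A small sketch (a hypothetical structure, not bazel's internals; the file names are made up):

```javascript
// Group build targets into "waves": each wave contains every target
// whose dependencies were all completed by earlier waves, so the
// members of a wave can be handed to separate workers in parallel.
function buildWaves(graph) {
  // graph: { target: [dependencies] }
  const built = new Set();
  const waves = [];
  while (built.size < Object.keys(graph).length) {
    const ready = Object.keys(graph).filter(
      (t) => !built.has(t) && graph[t].every((d) => built.has(d))
    );
    if (ready.length === 0) throw new Error('cycle in build graph');
    waves.push(ready); // each wave can run concurrently
    ready.forEach((t) => built.add(t));
  }
  return waves;
}

const waves = buildWaves({
  'app.js': ['util.js', 'ui.js'],
  'util.js': [],
  'ui.js': ['util.js'],
});
```

Caching falls out the same way: since each target's inputs are explicit, hashing them tells you whether the cached output is still valid.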


See also Pants and Buck.

I think it's a story of PostgreSQL vs. Mongo: do you start out with conservative correctness and slowly expand your feature set, or do you advertise a lot of (poorly implemented) features and, with the large userbase, slowly increase your quality?


As someone who has experienced the dev flow on apps with very large, gnarly webpack configs, 17 seconds is extremely impressive and sounds like heaven!

What a lot of devs may not be aware of, if their projects aren't of a certain scale and complexity (support for legacy code, library choices, extreme modularity, etc.), is that the magical nearly-instant hot-module reloading you experience with a clean, "normal" project should not be taken for granted, because webpack builds/reloading can become a monster! Not to say you can't scale up good webpack performance (obviously Slack has), just that it's not automatic but something that requires some TLC.

Thanks for taking the time to share, Slack team!


Hoping that someone might have some recommendations for a problem that we have in particular.

We have multiple sites, each of which has its own theme. Our various Sass files check the theme variable passed into the sass loader and decide on the correct values for particular styles. Our React components import the stylesheet and reference values accordingly. We also have a smattering of old Backbone on some pages, so it's not just React involved.
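For context, that kind of setup typically looks something like the following sketch. `data` was the option name in sass-loader 6.x (later versions renamed it `additionalData`), and the THEME environment variable here is hypothetical:

```javascript
// webpack.config.js fragment: inject a $theme Sass variable into every
// .scss file via sass-loader's `data` option, selected per build.
module.exports = {
  module: {
    rules: [
      {
        test: /\.scss$/,
        use: [
          'style-loader',
          'css-loader',
          {
            loader: 'sass-loader',
            // Prepended to every Sass file before compilation.
            options: { data: `$theme: ${process.env.THEME || 'default'};` },
          },
        ],
      },
    ],
  },
};
```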

Unfortunately, this means that we have to build our entire application for each theme. This takes an excruciating amount of time. Parallel Webpack makes it somewhat manageable, but I'd like to eliminate building for each theme altogether.

Any ideas on paths we can look into? If our application was all React, I'd consider styled components or a theme HOC. It's not though, so I've put those options on the back burner for the moment.


I use webpack every day. But the config for it scares the crap out of me.


I recently discovered Poi: https://poi.js.org/. It’s built on top of webpack. If your issues with webpack are configuration complexity, not speed, then Poi may be appealing to you.


Another option similar to Poi, that I'd recommend adding to any comparison, is https://neutrino.js.org/


I hate webpack, it has been a hack from the start. I can't wait till HTTP2 is more widely supported.

My latest projects use Polymer and a custom-built service (in Go) that automatically parses the entry points to my application and indexes the required files. When an HTTP/2 connection is made and the browser supports push, I automatically push all the required files. With browsers like Chrome the results are near instant, and I have full control over what gets pushed or not.

Personally I think we need to stop hacking the web together and properly solve these problems. I think HTTP2 with push is a good step in that direction.


I don't see how HTTP2 can replace webpack. Even if websites stop making single-file bundles of JS, I think we'll still need something to:

- Transpile from the latest ES standards to browser-supported code

- Compile from non-JS languages like TypeScript, Flow, and so on

- Do dead code elimination, minification, and tree shaking

- Bundle code shared by other bundles

- Optimize images

And on and on...

I think webpack is a mess, but I think it really is just a reflection of the complexity of the web platform.


You are just highlighting more of the problem. The fact that a bundler is responsible for doing all these things is just proof of more hacking.

Most of the things you described live outside of webpack, and webpack is NOT required to use them.


Sure, I don't disagree that webpack merely brings those features together in one place. But webpack was born because 80% of the time you make a webapp you don't want just minification, or just transpiling, or just cache-busting file names, or just file-event-based recompiles, or just bundling, or just tree shaking, and on and on. Most of the time, you want all of that stuff. People got tired of maintaining their 1000-line npm/gulp/grunt task files, so a bunch of projects were launched with the goal of eliminating that glue code. Webpack is the most popular, but there are others like Brunch or, more recently, Parcel.

I think webpack is a mess, but it will probably end up being the mess we're stuck with for some time, because while it is complicated, it does solve a lot of problems. I see a lot of parallels with the GNU make and autotools setups used to build most C projects. The layered way that autoconf, configure, and make build configuration to compile and install programs is pretty complex (to my eye). Sure, it could probably be simplified a bit, but cross-compiling code that dynamically links to libraries and installing it on different operating systems is just a complex problem.


I use and am very happy with Brunch. Yes, Webpack might have more features but my Brunch config is like 25 lines and it just works. It took me fifteen minutes to set up having struggled with Webpack for a couple of days.


I think that it's possible to use reasonably recent JavaScript without all those eliminations/minifications/tree shakings/sharing, etc. for many projects. But writing an entire project in one JavaScript file, or issuing dozens of HTTP/1 requests, is a real deal breaker for a lot of projects. I know that for most of my projects the build tool is only used for modularization. Everything else is nice to have but not required, and not worth the build complexity.


HTTP/2 can't replace webpack. But it can make builds less complex: you're no longer bundling every script, inlining every image, etc.


Still just a drop in the bucket of what Webpack can do, so the HTTP/2 post above shows a major misunderstanding of what people are using Webpack for, and probably what HTTP/2 actually is.

I don't see how you can work on a modern JS application, understand HTTP/2, and think HTTP/2 replaces Webpack. Pick two...


Yes, you don't agree with what I said so you attack the author and not the content.

Webpack is used for a lot of things that it should not be used for. As I posted above, many of the things it is used for should be out of band of your bundler.

HTTP2 removes the need for bundling by allowing a smart web server to either learn, or be told, which files will be required for rendering, so they can be pushed on the first request from the client. This eliminates the multiple requests and connections required in HTTP1; even with 4 pipelines that can be a lot of round-trip requests. If your application is not smart enough to know what to push to the client, HTTP2 still allows multiple requests over the same connection, again negating the need for bundling.
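The idea translates to Node as well. Here is a sketch using Node's http2 module (the commenter's actual service is in Go); the manifest mapping entry points to their dependencies is hypothetical:

```javascript
// Hypothetical push manifest: which assets each entry point needs.
const manifest = {
  '/index.html': ['/app.js', '/app.css'],
};

function filesToPush(path) {
  return manifest[path] || [];
}

/* In the request handler, push each dependency before responding
   (requires TLS key/cert and files under ./public):

const http2 = require('http2');
const server = http2.createSecureServer({ key, cert });
server.on('stream', (stream, headers) => {
  for (const dep of filesToPush(headers[':path'])) {
    stream.pushStream({ ':path': dep }, (err, pushStream) => {
      if (!err) pushStream.respondWithFile('public' + dep);
    });
  }
  stream.respondWithFile('public' + headers[':path']);
});
*/
```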


Hi @mbrumlow. Sean here from the webpack team.

We did a lot of study and testing on HTTP/2 and we found that your claims are entirely incorrect. But don't take my word for it, see the stats!

https://medium.com/webpack/webpack-http-2-7083ec3f3ce6

HOWEVER! HTTP/2 and webpack become even more powerful when used together! This is something we are really excited to share and build on!


Oi. They were not attacking the author. Their conclusion is sound. Given that Webpack is a lot _more_ than just bundling files together (as has been pointed out twice now), it is incorrect to say that HTTP/2 is a replacement for Webpack. No amount of explaining how HTTP/2 works changes that fact.


They said the commenter didn't understand Webpack or HTTP/2 and misrepresented what they said. That looks like attacking the author to me.

> it is incorrect to say that HTTP/2 is a replacement for Webpack.

Take a look back over the thread. They didn't say that. They said it was a step in the right direction. Then somebody started putting words in their mouth.


Agreed.



