Relevant webpack/webpack issue.
In my experience, webpack builds slow down in real projects once you pile on a certain number of components and/or dependencies. I have a test repository that seeks to demonstrate this with the following application:
- The entry point is `A.js`, which requires `B.js` and `C.js`. `B.js` is tiny and doesn't have a lot of dependencies. `C.js` is monolithic and has thousands of requires. (The shape is sketched below.)
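For concreteness, a minimal sketch of that shape, assuming CommonJS modules (the file contents here are illustrative stand-ins, not the actual test repo):

```js
// A.js: the entry point (illustrative contents)
var B = require('./B'); // tiny, few dependencies
var C = require('./C'); // monolithic, pulls in thousands of modules transitively

module.exports = function app() {
  return B() + C();
};
```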
My expectation is that when using `webpack-dev-server` in the test project, whenever I save `B.js`, webpack should recognize that `C.js` and none of its dependencies have been touched. It should compile `B.js` swiftly (in <10ms), replace it in the cache, and output the compiled `A.js` using the cached version of `C.js` from the initial compile.
However, webpack compiles 3002 hidden modules every time I save `B.js`, leading to a compile time of 960ms. This isn't bad on its own, but spirals out of control if you add some loaders like `react-hot` and `babel`.
I do have a solution: the same test project has a `dll` branch. On that branch, you can run `webpack --config webpack.dll.config.js` to generate two DLLs from `B.js` and `C.js`, which then get leveraged when compiling `A.js`. Afterwards, when using `webpack-dev-server`, whenever you save `B.js`, its DLL gets recompiled; `A.js` will notice that one of its DLLs has updated and it'll just take the old DLL of `C.js` and the new DLL of `B.js` and conjoin them into one quick happy bundle.
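For reference, here is roughly what that two-DLL setup looks like with webpack's `DllPlugin`; the entry names and `dll/` output directory are my guesses at the layout, not copied from the test repo:

```js
// webpack.dll.config.js: build B.js and C.js into separate DLLs
var path = require('path');
var webpack = require('webpack');

module.exports = {
  entry: {
    b: ['./B.js'],
    c: ['./C.js'],
  },
  output: {
    path: path.join(__dirname, 'dll'),
    filename: '[name].dll.js',
    library: '[name]_dll', // must match the DllPlugin `name` below
  },
  plugins: [
    new webpack.DllPlugin({
      name: '[name]_dll',
      path: path.join(__dirname, 'dll', '[name].manifest.json'),
    }),
  ],
};
```

The main config then points at the generated manifests with `DllReferencePlugin`, so the compile of `A.js` links against the prebuilt DLLs instead of re-traversing their module graphs:

```js
// webpack.config.js (excerpt)
plugins: [
  new webpack.DllReferencePlugin({
    context: __dirname,
    manifest: require('./dll/b.manifest.json'),
  }),
  new webpack.DllReferencePlugin({
    context: __dirname,
    manifest: require('./dll/c.manifest.json'),
  }),
],
```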
I could go further on that branch and do directory reads or dependency graph walks to generate a DLL for every component, an approach that could potentially be applied to every webpack project. That should in theory make compiling as fast as I would like it. But at that point it seems to me like I will have reimplemented (poorly) what the caching layer in webpack should do on its own, so what's going on here?
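A sketch of what that per-component generation could look like, assuming a flat `src/components` directory (the directory layout and file name here are hypothetical):

```js
// generate-dll-entries.js: build a DllPlugin entry map from a directory
// read, one DLL per component (hypothetical layout: src/components/*.js)
var fs = require('fs');
var path = require('path');

var componentsDir = path.join(__dirname, 'src', 'components');

var entry = {};
fs.readdirSync(componentsDir)
  .filter(function (file) { return path.extname(file) === '.js'; })
  .forEach(function (file) {
    var name = path.basename(file, '.js');
    entry[name] = [path.join(componentsDir, file)];
  });

module.exports = entry; // feed into the dll config's `entry` field
```

Each component would then get its own manifest, with one `DllReferencePlugin` per manifest in the main config, so only the DLLs whose sources actually changed would need rebuilding.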