webpack support #3
Thanks for the kind words! Okay, sorry in advance for all the detail here, but I think this deserves a thoughtful answer.
Hmm, this is the third or fourth lazy-cache issue we've had in the past 30 days or so. Granted, that's out of a couple of hundred million downloads or something, but it's clearly causing issues with webpack and I'd really like to get this solved. Are you familiar with webpack plugins? It seems like there ought to be a way to transform the lazy-cache calls at build time.

I created a tool a month or two ago to automate swapping out lazy-cache should the need arise. I knew there might be scenarios where we had to do this (which is why we try to be consistent in how lazy-cache and module dependencies are defined in our projects, so they'll be easy to "swap out" with non-lazy code in a build step), but I haven't had time to release it, and I don't even know if it would actually help with the webpack issue.
I think it depends on what you're doing with the projects that use lazy-cache, what your users care about, and what your concerns are as a developer. If you're using webpack, it's probably not a great pattern. We use browserify, which has worked fine so far. But this is a problem we need to solve.

Also, lazy evaluation isn't always a great pattern for long-running applications. Sometimes it is, but it totally depends on what's being lazily evaluated (like whether or not memory leaks might be a problem), when lazy code might be evaluated, and how much of an impact on latency (and thus user experience) it will have.

Based on how we use code that uses lazy-cache, the performance benefits are astonishing. We have (or use) projects that used to take 120 ms to load. One example is a project that uses inquirer. To put this in perspective, Verb is a documentation generator that loads helpers, template engines, and some libs like inquirer.

I'm a big fan of the module pattern and prefer small libraries, but it's a fact of life that some projects like verb (pick your project) are necessary for performing complex tasks that just aren't possible with small libs. So the dependency tree grows as much as is necessary to accomplish the task. And as the dependency tree grows, I tend to see more issues from users complaining (nicely and rightfully) about the size of the dependency tree. But when all is said and done, it's not usually the size of the dependency tree that these issues are about: it's latency. Most developers only start caring about the size of the dependency tree when their application feels sluggish and they go looking for answers (or, more often, a different solution altogether). At the end of the day, no one would or should care how many dependencies there are if the latency problem is solved (and of course the webpack problem).
Some devs have made the argument that this benefit, no matter how great, only applies to init, and that "the code will still be initialized, but maybe later when it impacts the user". But this is not necessarily true. It's just as possible that no matter how long an application runs, the typical user might only touch parts of the application that trigger 5-10% of the actual functions in the code base, so the rest of the codebase never needs to be initialized at all. (Please view these numbers as complete conjecture. In my experience that feels right, but I really don't know. I do performance analysis on most "major" dependencies I use, especially if I didn't write the code, but I don't actually aggregate that data or track it in a way that might be reliable to others. I would if I had an idea of what might be useful.)

In any case, I think implementors and end-users should have the ability to "opt out" of lazy-caching; last week we added support for exactly that.
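To make the trade-off discussed above concrete, here is a minimal, self-contained sketch of the lazy-require idea (illustrative only, not the library's actual implementation; `loadModule` is a hypothetical stand-in for an expensive `require`):

```javascript
// Minimal sketch of lazy requiring: defer the load until the first call,
// then cache the result. `loadModule` is a stand-in for `require`.
let loads = 0;
function loadModule(name) {
  loads++; // imagine an expensive require() happening here
  return { doSomething: () => 'did ' + name };
}

function lazy(name) {
  let cached;
  return function () {
    if (cached === undefined) cached = loadModule(name);
    return cached;
  };
}

const x = lazy('x');   // nothing is loaded yet
const before = loads;  // still 0: startup pays no cost
x().doSomething();     // first use triggers the load
x().doSomething();     // subsequent uses hit the cache
const after = loads;   // 1: loaded exactly once
```

If `x` is never called, the module is never loaded, which is where the init-time savings described above come from.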
I'll keep it short 😉 ...makes sense, but I'm not sure if / how to solve this on the webpack side, since it would need to be done at build time instead of run time. @sokra, any suggestions?
Maybe I'm not going to be able to keep this short... 😉 I was thinking a bit more about this comment: one could argue that the solution is to keep the kitchen sink out of your libraries in the first place. This case is probably different than the dynamic / flexible plugin problem, though.
But you can only keep the kitchen sink out of your own libraries. Here is a good example. Your point is valid though; I think that's the "correct" way to view the problem in an absolute sense. But we can't expect or rely on all authors to follow the same pattern. We can hope they will at some point, but "hope is not a strategy".
Yeah, it's definitely true that there are limits to what you can control. If things aren't too complicated and if I've got the time (two big ifs!), I'll just fork, use my fork in the short term, and create a pull request while I wait and ping. I think there should definitely be a movement for low-dependency code (there was a comment to this effect in a recent JavaScript Jabber). I almost always write code under the assumption it may run in the browser, so I spend a lot of time reviewing dependencies as part of my module selection criteria (which is what led me to discover your large body of work in that direction), but maybe that's my altruistic, long-game nature. Let's see if there is a webpack solution, and if not, we can bounce ideas around for the best path forward.
Sounds like a good plan. I'd definitely like to help solve this.
I'm a bit confused about this library. Isn't this:

```js
var x = lazy("x");

function sometimesCalled() {
  x().doSomething();
}
```

the same as:

```js
function sometimesCalled() {
  require("x").doSomething();
}
```
Sorry, what's the question?
@jonschlinkert I think @sokra is exploring whether the library is an anti-pattern or not. This comment is maybe a little too straight to the point (!), but this should be a good exercise to go through. For example, to eliminate circular dependencies, I've done something like this in the past:

Or, if you rely on the caching nature of most module systems, it would probably be a similar speed (e.g. an object key lookup is pretty fast in JavaScript, and module systems are optimized around cached lookups):

Like the plugin pattern, this could perhaps be a different way to achieve a similar result for some of the use cases of lazy-cache.
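The two alternatives described above might look roughly like this (a hedged sketch; `loadModule` is a hypothetical stand-in for `require`, which caches a module after its first load):

```javascript
// `loadModule` mimics require's module cache.
let loads = 0;
const registry = { x: { doSomething: () => 'ok' } }; // stand-in for real modules
const moduleCache = {};
function loadModule(name) {
  if (!(name in moduleCache)) {
    loads++; // imagine the expensive first require() here
    moduleCache[name] = registry[name];
  }
  return moduleCache[name];
}

// 1) Cache in a local variable, loading on first use:
let x;
function sometimesCalledA() {
  x = x || loadModule('x');
  return x.doSomething();
}

// 2) Rely on the module system's own cache and look it up each time:
function sometimesCalledB() {
  return loadModule('x').doSomething();
}

sometimesCalledA();
sometimesCalledB();
// Either way, the module is only loaded once; the second variant just
// pays a cheap cache lookup per call.
```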
@kmalakoff the issue with using conditional require statements inside other functions is that it has to be done in each place you want to use that library. This leads to a lot of boilerplate code and possible errors:

```js
var x;

function sometimesCalledA() {
  x = x || require('x');
  x.doSomething();
}

function sometimesCalledB() {
  x = x || require('x');
  x.doSomething();
}

function sometimesCalledC() {
  x = x || require('x');
  x.doSomething();
}

function sometimesCalledD() {
  x = x || require('x');
  x.doSomething();
}

function sometimesCalledE() {
  x.doSomething(); // bug: the conditional require was forgotten, so `x` may be undefined
}
```

lazy-cache removes that boilerplate by doing the caching internally, so you can keep the requires in a single "utils" library:

```js
var utils = require('./utils');

function sometimesCalled() {
  utils.x.doSomething();
}
```
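One way such a "utils" aggregator can stay lazy is with property getters. This is a hedged sketch of the idea, not the library's exact API; `registry` and `lazyUtils` are hypothetical names, with a counter standing in for the real `require` cost:

```javascript
// Each dependency is declared once and only loaded on first access,
// via a property getter that caches the loaded module.
let loads = 0;
const registry = { x: { doSomething: () => 'ok' } }; // stand-in for real modules

function lazyUtils(names) {
  const utils = {};
  for (const name of names) {
    let cached;
    Object.defineProperty(utils, name, {
      get() {
        if (cached === undefined) {
          loads++; // imagine require(name) here
          cached = registry[name];
        }
        return cached;
      }
    });
  }
  return utils;
}

const utils = lazyUtils(['x']);
const before = loads;    // 0: nothing loaded yet
utils.x.doSomething();   // first access loads 'x'
utils.x.doSomething();   // cached thereafter
const after = loads;     // 1
```

Callers just write `utils.x.doSomething()` everywhere, and the conditional-require boilerplate lives in one place.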
Yeah, I don't think that's a question that he can answer. It's JavaScript. We're not doing magic; we're using the wonderful features afforded by the language, and we have real performance benefits from doing so. Even entertaining the idea that this is an anti-pattern because a single library is choking on it is silly.
While we're thinking about anti-patterns, I think this is the worst offender in node.js. IMHO, node.js modules should be created under the assumption that they are node modules, using patterns that are idiomatic for node.js. By hedging and writing code like you suggest, modules end up bloated, buggy and slow, and are in fact a big contributing factor to why lazy-cache is necessary.
@doowb let's not get sidetracked by discussing a comparative example used to clarify the line of questioning.
I'm not very happy with how this discussion is going. I'd prefer not to have to unpack and respond to "it depends" or "assumptive" styles of argument, and to come up with coding examples to counter them, when examples could often be written to show both sides. Would you like me to close this issue?
Well, comments like this won't help you, e.g. "I'd prefer not to let reality get in the way of a good theoretical debate". @doowb's comment was clear, correct and to the point. It answered the question without throwing in comments about "having the long view" about something. There is no more to say on that point. If you think you can continue the discussion the way you're suggesting, we can keep this open. If not, I'd prefer to close.
@jonschlinkert the reason I responded to @doowb's comment as I did is that I believe it will not contribute to resolving this, because it was not a proposal but a clarifying example on @sokra's question. Also, it should be clear in the example that having one call site vs. many will affect your decision on whether to bring in a 3rd-party module or to solve the problem as a one-off. Plus, it was meant, and should be considered, mainly as a potential optimization on the module cache if the lookup is proven to be slow in the case of few instances for the one-off. Basically, "it depends" is where the argument ends up if we consider that this clarifying code is material to the debate.

More importantly, the example was mainly put out there to try to defuse some of the tension I sensed in your response... epic fail! 😉 I realize that you are a prolific developer like myself, so we probably both have a lot of tacit knowledge and experience that have led us to develop preferences. Plus, this library is your baby, which makes it even harder to hear critique of it! Also, I believe you are an entrepreneur like me and need to be careful where you spend your time.

I'm happy to continue if I'm not wasting my time! I can't guarantee that I won't share my beliefs and biases (I do tend to take a longer view than other developers I have met... nothing meant by it except to expose some personal information about me that can be both a strength and a weakness, but it is definitely a bias / preference that I'm aware of, which is why I raised it). Hopefully it is more clear where I'm coming from. I really need to safeguard my time, so if this is going to be a presumptuous or exaggeration-style debate, which tends to just lead to an escalation of positions, or if there is no chance to effect change, I'd prefer to bow out and close.
@kmalakoff based on this comment on the webpack issues, we would like to see a compile-time or loader-time solution. Some possibilities:

I like the first one, and we had started going down that road a few months ago, but I was unable to find the correct settings for it. @sokra, any ideas around building a plugin or helping with the configuration are welcome.
Thanks for everything in your response @kmalakoff, very perceptive. We're on the same page; I'd like to continue as well. Edit: and thanks for continuing to try to find a solution instead of writing me and this off. I appreciate that.
@doowb yeah, this might end up being a tough one. I've swapped out modules to work around lazy-cache in the browser (minimatch for micromatch, though I'm told micromatch is super performant, so I'm not happy with that solution!), but I'm really having difficulty seeing how code splitting or a loader might work. Webpack is great, but once you get into customizations at this level, I feel like it is best left up to the experts who understand the innards and principles better than me. My gut is that @sokra will judge this to be an anti-pattern (I've tried in the past to find what should be simple node-require-compatible solutions with webpack... one success and one failure), since the lazy-cache API is so dynamic, making it hard for / incompatible with static analysis. But you never know!
@jonschlinkert awesome! I'm totally happy to hear that... fingers crossed.
As a last resort, it seems like it would be trivial for us to implement a workaround ourselves. As a side note, can we agree not to delegate authority to anyone in this conversation for what makes an anti-pattern? I'd rather stay focused on finding a solution, rather than handing out "get out of jail free" cards ;)
OK. @sokra "get out of jail free" card revoked 😉
I did some research, dug into webpack, and came up with an unlazy loader that works by transforming the pattern we use to work with lazy-cache into direct requires.

As mentioned on the webpack issue, the hard part is notifying webpack users that they may need to use the unlazy-loader.

Right now, two things will happen in a webpack build that doesn't use it. For the second issue, we could wrap the require in a try / catch with expanded information.
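Roughly, the kind of source transform such a loader performs looks like this (an illustration of the idea only, NOT unlazy-loader's actual implementation; the regex is deliberately simplistic):

```javascript
// Rewrite dynamic lazy('name') calls into direct require('name') calls
// that webpack's static analysis can see at build time.
function unlazy(source) {
  return source.replace(/lazy\((['"])([^'"]+)\1\)/g, "require($1$2$1)");
}

const input = "var x = lazy('x');";
const output = unlazy(input); // "var x = require('x');"
```

A real loader would need to parse the source properly rather than regex-match, but the principle is the same: make the dependency names statically visible.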
@doowb excellent! Thank you for finding a solution! I think the try / catch with expanded information sounds right. The first time someone ignores the warnings and doesn't investigate (for example, they are trying to get something running quickly and tell themselves that they will deal with the warnings later), they will be provided additional information at runtime to help troubleshoot, so that sounds perfect.

As for notifying webpack users, (as I'm sure you are thinking!) just have the try / catch point to the unlazy-loader README, and highlight it in the lazy-cache README (with a bold webpack section header) with more information. If you are suggesting using the webpack.config.js, maybe an alias would work too.
Awesome, thanks @doowb and @kmalakoff!
@doowb sorry about the alias example. I saw that you actually made a loader and provided the webpack config in the README: https://www.npmjs.com/package/unlazy-loader. You are many steps ahead of me! 😉 Closing... |
Hi Jon,
It's pretty amazing just how many libraries I keep discovering of yours. You have blazed some amazing trails!
This module seems to come up as a bit of a problem, unfortunately, because it is pulled in by browser / cross-platform code, and I use webpack as my bundling solution. Basically, I emulate node's fs module in the browser, and micromatch pulls in braces, which pulls in this.

I realize that there isn't really a quick fix for this, except maybe to drop the use of this module in the dependent modules (which is probably a lot to ask!). I'm assuming that this lazy loading is necessary either for fast initialization time (e.g. node-based spawned scripts) or maybe for optional dependencies, or both... but I have to ask:

Given that module systems already cache the lookup of modules, is this module necessary? (I'm assuming there must be some critical use cases that benefit from lazy loading, but I'm hoping the benefit is marginal.) Or, from another angle, is there a way webpack's static analysis can be supported?

Right now, I'm having to avoid this part of your modules ecosystem, and I'd prefer not to. If you have any suggestions or ideas (besides dropping webpack), I'm all ears...