
gulp.watch, mocha and gulp-batch #80

Closed
floatdrop opened this issue Jan 2, 2014 · 16 comments
@floatdrop
Contributor

Suppose we have gulp-mocha and want to add watching, to run the tests on changes:

gulp.watch(['test/**', 'lib/**'], function () {
    gulp.src(['test/*.js'])
        .pipe(mocha({ reporter: 'list' }));
});

This will work fine until you change multiple files with some git command - watch will call mocha once for every changed file. And this is perfectly fine, because this is how gulp should work.

But this is definitely a problem - most of the grunt-related blog posts are about "How to rebuild your css/jade/etc on edit". So I see a way forward in writing a gulp plugin, throttle (or something - I can't figure out a proper name), that will buffer incoming events (for 5 seconds, then flush) and call the callback only once:

gulp.watch(['test/**', 'lib/**']).pipe(throttle(function (events) {
    gulp.src(['test/*.js'])
        .pipe(mocha({ reporter: 'list' }));
}));

(This will need #13 to be done first.)

Any suggestions on that? Maybe I'm not seeing an obvious solution, or missed something in the docs.

P.S. gutil.buffer is close, but it waits for the end of the stream. So maybe gutil.throttle?
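A minimal sketch of the buffering helper proposed above (the name `buffer` and the callback-style API are hypothetical - the actual proposal is stream-based and depends on #13; plain Node, no gulp required): it collects incoming events and fires the callback once per burst, a fixed window after the first event arrives.

```javascript
// Hypothetical "throttle"/buffer helper: collect events and call `cb`
// once with all of them, `wait` ms after the first event of a burst.
function buffer(cb, wait) {
    var events = [];
    var timer = null;
    return function (event) {
        events.push(event);
        if (!timer) {
            timer = setTimeout(function () {
                var flushed = events;
                events = [];
                timer = null;
                cb(flushed);
            }, wait);
        }
    };
}
```

With something like this, `gulp.watch(['test/**', 'lib/**'], buffer(runTests, 5000))` would run mocha once per burst of changes instead of once per file.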

@akre54

akre54 commented Jan 2, 2014

What you're asking for is a debounce. This seems like a common enough use case that it should probably be baked into the watcher itself rather than the client wrapping the callback.

@sindresorhus
Contributor

Yeah, should be baked in IMHO.

@yocontra
Member

yocontra commented Jan 2, 2014

Gaze is the gulp file watcher: https://github.com/shama/gaze/issues

@akre54

akre54 commented Jan 2, 2014

This falls somewhere between glob-watcher and gaze, since gaze (correctly) debounces changes to single files and glob-watcher (also correctly) passes all changed files. Probably best to throw the debounce on glob-watcher line 8, yeah?

I'd open a pull but the tests are currently broken.

@floatdrop
Contributor Author

@akre54: This is neither a throttle nor a debounce. I think of this more as throttle + debounce + group.

@sindresorhus: Maybe in gulp utils, but not in watch - or should other methods implement it too?

@contra: In my opinion this is not a job for gaze at all. I agree with @akre54 - this is something in between watch and a gulp plugin (or it depends on the task you want to achieve).
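The "throttle + debounce + group" combination described above could be sketched roughly like this (names and parameters hypothetical, plain Node): events are grouped; the group is flushed when changes go quiet for `quiet` ms (the debounce part), but never later than `maxWait` ms after the first event (the throttle part).

```javascript
// Sketch of "throttle + debounce + group". Each new event resets the
// quiet timer; a separate max timer caps how long a busy burst can
// postpone the flush. The callback always receives the whole group.
function group(cb, quiet, maxWait) {
    var events = [];
    var quietTimer = null;
    var maxTimer = null;
    function flush() {
        clearTimeout(quietTimer);
        clearTimeout(maxTimer);
        quietTimer = maxTimer = null;
        var flushed = events;
        events = [];
        cb(flushed);
    }
    return function (event) {
        events.push(event);
        clearTimeout(quietTimer);
        quietTimer = setTimeout(flush, quiet);
        if (!maxTimer) {
            maxTimer = setTimeout(flush, maxWait);
        }
    };
}
```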

@yocontra
Member

yocontra commented Jan 2, 2014

@akre54 glob-watcher just passes what gaze puts out. If gaze was debouncing the files before putting it out then this wouldn't be a problem. I scanned the gaze repo for "debounce" and "throttle" and didn't get any hits. Where do you see that it's debouncing?

@akre54

akre54 commented Jan 2, 2014

@yocontra
Member

yocontra commented Jan 2, 2014

@akre54 sweet - @floatdrop just sent a PR to let those options go through gulpjs/glob-watcher#2

@floatdrop
Contributor Author

These options are cool, but they don't solve this problem.

@dashed
Contributor

dashed commented Jan 2, 2014

Instead of waiting for some arbitrary time to collect files to flush, would it be better to use some sort of queue? See https://github.com/caolan/async#queueworker-concurrency.

That is, wouldn't it be better to "batch" watched files, so as to not spam the CLI of some app? Batch 20 files at a time, then "flush" on a timeout (2s) that handles outstanding batches.

You can queue via gulp.watch's event variable.

@yocontra
Member

yocontra commented Jan 2, 2014

I'll take PRs for this in glob-watcher but won't have time to implement anything until the 10th

@dashed
Contributor

dashed commented Jan 2, 2014

This seemed like an interesting problem, so I took a crack at it. @floatdrop, would the following work for you? I haven't really tested it.

var async = require('async');
var gulp = require('gulp');

// config
var batch_num = 20;
var timeout = 2000;

var bomb = void 0;
var items = [];

// Unpack a queued work item and invoke its function with the queue's
// callback appended, so jobs run strictly one at a time.
var worker = function(_work, callback) {
    var _f = _work['f'];
    var _this = _work['_this'];
    var _args = _work['_args'] || [];
    _args.push(callback);
    return _f.apply(_this, _args);
};

var queue = async.queue(worker, 1);

var gulper = function(glob) {
    gulp.src(glob)
        .pipe(...)
        // ...
};

var batchProcess = function(item, callback) {

    // A new change arrived - defuse the pending flush.
    if (bomb) {
        clearTimeout(bomb);
        bomb = void 0;
    }

    items.push(item);

    if (items.length >= batch_num) {
        // The batch is full - flush immediately.
        gulper(items);
        items = [];
    } else {
        // Otherwise re-arm the bomb to flush the leftovers on timeout.
        // (Only re-arming in this branch avoids a later empty flush
        // after a full batch has already gone out.)
        bomb = setTimeout(function() {
            gulper(items);
            items = [];
            bomb = void 0;
        }, timeout);
    }

    callback();
};

gulp.watch(['test/**', 'lib/**'], function (event) {

    var payload = {
        f: batchProcess,
        _this: this,
        _args: [event.path]
    };

    queue.push(payload);

});

@floatdrop
Contributor Author

@dashed yep, that looks like what I wanted. I started working on gulp-batch right before reading this.

@dashed
Contributor

dashed commented Jan 3, 2014

That looks really cool. I hadn't thought of using the function decorator pattern.

Much better and simpler than mine.
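The decorator pattern mentioned here could look roughly like the following (a sketch only - the `batch` helper below is hypothetical and not necessarily gulp-batch's actual API): the wrapper owns all the buffering state, and the wrapped callback only ever sees one array of collected events.

```javascript
// Hypothetical decorator: wraps a callback so it is invoked once per
// quiet period with an array of buffered events, instead of once per
// file change.
function batch(fn, wait) {
    var events = [];
    var timer = null;
    return function (event) {
        events.push(event);
        clearTimeout(timer);
        timer = setTimeout(function () {
            var flushed = events;
            events = [];
            fn(flushed);
        }, wait || 100);
    };
}

// The watch task from the opening post would then stay almost unchanged:
// gulp.watch(['test/**', 'lib/**'], batch(function (events) {
//     gulp.src(['test/*.js'])
//         .pipe(mocha({ reporter: 'list' }));
// }));
```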

@floatdrop
Contributor Author

Thanks guys! I think gulp-batch is finished, so my problem is fixed. Though it would be nice to mention it in the API docs (I think many people will stumble upon this when they try to test their code with mocha and watch).

@sindresorhus
Contributor

@floatdrop nice work. Use the gulpfriendly keyword instead of gulpplugin.
