LivePersonInc/batchelor

Proxy utility to bundle a batch of calls into one request. Using the batchelor utility reduces HTTP overhead and network round-trip latency, and helps keep your API design clean.

Features

  • Server-side parallel request processing.
  • Persistent requests for the WebSocket facade.

Installation

npm install batchelorjs --save

API

configure(options)

Configures the batchelor object.

  • log - a logger object exposing debug, info and error functions (default: empty logger).
  • transport - a transport object implementing an issueCalls function (default: internal transport using async and request).
  • maxConcurrentBatches - maximum number of concurrent batch requests (default: 50)
  • whiteList - an array containing a list of allowed hosts for processing the request (default: *, meaning allow all hosts/URLs).
  • request - an object containing the default values per request

Example options

{
    maxConcurrentBatches: 100,
    logger: console,
    request: {
        "method": "GET",
        "timeout": 10000,
        "ip": "unknown",
        "headers": {},
        "strictSSL" : true,
        "pool": {
            "maxSockets": 200
        }
    },
    whiteList: ["*"]
}

execute(batch, callback)

  • batch - a single request object (see below) or an array of single requests [required]
  • callback(err, results) - a callback function notified when batch processing has finished [required]
    The callback function gets 2 arguments:
    • err - an error object if an error occurred, null otherwise
    • results - a JSON object containing the result(s) of the batch

request

An object representing a single request in the batch, in the form:

  • name - identifier of the item; the name is used as a reference in the results. Names must be UNIQUE! [required]
  • url - URL to call for the item. GET parameters may also be given here [required]
  • method - possible values are GET, POST, or whatever methods the called API supports [required]
  • encoding - the encoding of the item (default: UTF8) [optional]
  • retries - number of retries if the timeout is reached (default: 2) [optional]
  • headers - the headers the item uses [optional]
  • body || data - the parameters the item sends when the method is POST [optional]
  • timeout - number of milliseconds to wait for the called API to respond before aborting the request; if this parameter is not provided, the timeout from the config.json file is used [optional]
  • isOnCloseRequest - flag indicating whether the item should be called when the connection is dropped; used with the WebSocket facade (default: false) [optional]
  • persistent - flag indicating whether the item should be called persistently; used with the WebSocket facade (default: false) [optional]
  • persistentDelay - delay between persistent calls in milliseconds; used with the WebSocket facade (default: 5000) [optional]

Example batches

Single request

{
	"name": "REQUEST_1",
	"method": "GET",
	"url": "jsonresponser.herokuapp.com/api/json/users",
	"timeout": 1000
}

Array of requests

[
	{
		"name": "REQUEST_1",
		"method": "GET",
		"url": "jsonresponser.herokuapp.com/api/json/users",
		"timeout": 1000
	},
	{
		"name": "REQUEST_2",
		"method": "POST",
		"url": "jsonresponser.herokuapp.com/api/json/users",
		"timeout": 1000
	}
]

stop(options)

  • options - an object containing the ids to be stopped; the ids are provided by persistent requests [required]

      options = {
          ids: ["id1", "id2"] || "id1"
      }
    

Returns an array of the stopped requests (empty if none were found).

Events

Batchelor implements the EventEmitter API and will emit the following events:

  • processing with batchId data
  • complete with batchId data
  • persistent_processed with uniqueId data
  • persistent_stopped with uniqueId data

Examples

REST using ExpressJS Version 4.5.x

var express = require('express');
var exp_app = express();
var compression = require('compression');
var bodyParser = require('body-parser');
var exp_router = express.Router();
exp_app.use(compression());
exp_app.use(bodyParser());
var batchelor = require('batchelorjs');
var configuration = {
    "maxConcurrentBatches": 100,
    "logger": console,
    "request": {
        "method": "GET",
        "timeout": 10000,
        "ip": "unknown",
        "headers": {},
        "data": ""
    },
    "whiteList": ["*"]
};


batchelor.configure(configuration);
exp_router.post("/", function (req, res, next) {
    batchelor.execute(req.body, function (err, results) {
        if (err) {
            console.log("Error occurred:", err);
            res.status(500).send("batch processing failed");
        }
        else {
            res.send(JSON.stringify(results));
        }
    });
});

exp_app.use("/", exp_router);
exp_app.listen(5050);

WebSocket - Server

var WebSocketServer = require('ws').Server;
var wss = new WebSocketServer({port: 5050});
var batchelor = require('batchelorjs');
var configuration = {
    "maxConcurrentBatches": 100,
    "logger": console,
    "request": {
        "method": "GET",
        "timeout": 10000,
        "ip": "unknown",
        "headers": {},
        "data": ""
    },
    "whiteList": ["*"]
};

batchelor.persistent.configure(configuration);
wss.on("connection", function (ws) {
    ws.on("message", function (data) {
        batchelor.persistent.execute(data,
            function (err, results) {
                ws.send(JSON.stringify(results));
            });
    });
});

Request - WebSocket Client - sending 3 types of requests

The following example sends 3 types of requests: regular, persistent, and on-close. Batchelor processes these requests and returns a response when:

  • regular: a response is returned from the given URL
  • persistent: every persistentDelay milliseconds, if there is a change in the response
  • on-close: once the connection is dropped by the client

var batch = [
    {
        name: "regular_request",
        url: "jsonresponser.herokuapp.com/api/json/users",
        method: "GET",
        timeout: 5000
    },
    {
        name: "persistent_request",
        url: "jsonresponser.herokuapp.com/api/json/users",
        method: "GET",
        timeout: 5000,
        persistent: true,
        persistentDelay: 5000
    },
    {
        name: "onclose_request",
        url: "https://www.domain.com/item/2",
        method: "POST",
        retries: 5,
        timeout: 5000,
        isOnCloseRequest: true
    }
];
var ws = new WebSocket("wss://yourdomain/path");
ws.onopen = function () {
    document.getElementById("connectionStatus").innerHTML = "Connected";
    ws.send(JSON.stringify(batch));
};
ws.onmessage = function (event) {
    document.getElementById("responseFromServer").value = event.data;
};

Response from previous request

{
    regular_request: {
        data: {
            name: "myname1",
            id: 1
        },
        statusCode: 200,
        headers: {
            "content-type": "application/json"
        }
    },
    persistent_request: {
        data: "",
        headers: {
            "server": "Cowboy",
            "connection": "keep-alive",
            "x-powered-by": "Express",
            "content-type": "application/json; charset=utf-8",
            "content-length": "116",
            "etag": "W/\"74-1635811801\"",
            "date": "Mon, 12 Jan 2015 09:53:37 GMT",
            "via": "1.1 vegur"
        },
        "statusCode": 200,
        "cancelId": "jobName_37"
    }
}

Using the cancelId received in the response, the client can send another request to the server to cancel the specific persistent request:

var cancelMessage = {
	"cancelId": "jobName_1",
	"requestName": "persistent_request"
};
ws.send(JSON.stringify(cancelMessage));
