The current BulkLoad implementation requires all data rows to be accumulated in memory before they can be sent to the database server. This is not suitable for large table imports.
A streaming bulk insert implementation must use data flow control to apply back pressure when data arrives faster than it can be sent to the database. This could be modeled on the standard Node.js stream.Writable interface: write() returns false to tell the producer to pause, and a 'drain' event tells it to resume.
It has to be determined whether a future BulkLoad streaming class should extend the stream.Writable class of the Node.js API (or the readable-stream package), or merely implement the same interface. The objective is to make it compatible with stream.Readable.pipe().