fix: commonjs types, run tsc and lint to validate changes (#397)
* fix: types weren't working for commonjs. Run tsc and lint to validate changes

* chore: needs to work on linux and BSD
DavidTanner authored Aug 15, 2023
1 parent 61d4eea commit e6870fe
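The root cause, per the diffs below: the generated CommonJS declaration files (`.d.cts`) imported their types from `./index.js`. Under TypeScript's `node16`/`nodenext` module resolution that specifier points at the ESM declarations (`index.d.ts`), so the types failed to resolve for CommonJS consumers. A sketch of the corrected shape, abridged from `packages/csv-generate/dist/cjs/sync.d.cts`:

```ts
// Abridged sketch of the fixed declaration file. The .cjs specifier makes
// TypeScript resolve against the CommonJS declarations (index.d.cts)
// rather than the ESM ones (index.d.ts).
import { Options } from './index.cjs';

declare function generate<T = any>(options: number | Options): string & Array<T>;
// export default generate;
```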
Showing 50 changed files with 1,593 additions and 44 deletions.
4 changes: 2 additions & 2 deletions CONTRIBUTING.md
@@ -2,8 +2,8 @@

* Before you open a ticket or send a pull request, [search](https://github.com/adaltas/node-csv/issues) for previous discussions about the same feature or issue. Add to the earlier ticket if you find one.

- * Before sending a pull request for a feature, be sure to have [tests](https://github.com/adaltas/node-csv/tree/master/test).
+ * Before sending a pull request for a feature, be sure to have [tests](https://github.com/adaltas/node-csv/tree/master/packages/csv/test).

- * Use the same coding style as the rest of the [codebase](https://github.com/adaltas/node-csv/tree/master/src). If you’re writing a test and if you're just getting started with CoffeeScript, there’s a nice [style guide](https://github.com/polarmobile/coffeescript-style-guide).
+ * Use the same coding style as the rest of the [codebase](https://github.com/adaltas/node-csv/tree/master/packages). If you’re writing a test and if you're just getting started with CoffeeScript, there’s a nice [style guide](https://github.com/polarmobile/coffeescript-style-guide).

* Documentation is published on [GitHub](https://github.com/adaltas/node-csv-docs) and you are invited to submit a pull request with your changes. For convenience, you can also browse the website and click on the Edit link present at the top of every page.
3 changes: 2 additions & 1 deletion demo/eslint/.eslintrc.js
@@ -11,6 +11,7 @@ module.exports = {
    ecmaVersion: 'latest',
  },
  rules: {
-   // 'import/no-unresolved': [2, { commonjs: false }],
+   'import/no-unresolved': [2, { commonjs: false }],
+   'no-console': 'off',
  },
};
2 changes: 2 additions & 0 deletions package.json
@@ -16,6 +16,8 @@
"build": "lerna run build",
"postinstall": "husky install",
"publish": "lerna publish from-git --yes",
"lint": "lerna run lint",
"pretest": "npm run lint",
"test": "lerna run test",
"test:legacy": "lerna run test:legacy",
"version": "lerna version"
2 changes: 1 addition & 1 deletion packages/csv-generate/dist/cjs/stream.d.cts
@@ -1,5 +1,5 @@

- import { Options } from './index.js';
+ import { Options } from './index.cjs';

declare function generate(options?: Options): ReadableStream<Buffer>;
// export default generate;
2 changes: 1 addition & 1 deletion packages/csv-generate/dist/cjs/sync.d.cts
@@ -1,5 +1,5 @@

- import { Options } from './index.js';
+ import { Options } from './index.cjs';

declare function generate<T = any>(options: number | Options): string & Array<T>;
// export default generate;
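With both declaration files pointing at `./index.cjs`, a TypeScript project compiled as CommonJS can resolve these types again. A hypothetical consumer sketch, assuming `csv-generate` re-exports the `Options` type as its index declarations suggest; the `length`, `seed`, and `columns` options come from the csv-generate documentation and are not part of this diff:

```ts
// consumer.cts, compiled with "module": "node16" and "moduleResolution": "node16"
import { generate } from 'csv-generate/sync';
import { Options } from 'csv-generate';

// Before this fix, resolving Options failed for CommonJS consumers.
const options: Options = { length: 2, seed: 1, columns: 2 };
const output: string = generate(options); // two generated CSV records
```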
64 changes: 64 additions & 0 deletions packages/csv-generate/dist/esm/index.js
@@ -3295,6 +3295,26 @@ BufferList.prototype.concat = function (n) {
};

// Copyright Joyent, Inc. and other Node contributors.
+ //
+ // Permission is hereby granted, free of charge, to any person obtaining a
+ // copy of this software and associated documentation files (the
+ // "Software"), to deal in the Software without restriction, including
+ // without limitation the rights to use, copy, modify, merge, publish,
+ // distribute, sublicense, and/or sell copies of the Software, and to permit
+ // persons to whom the Software is furnished to do so, subject to the
+ // following conditions:
+ //
+ // The above copyright notice and this permission notice shall be included
+ // in all copies or substantial portions of the Software.
+ //
+ // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+ // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+ // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+ // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+ // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+ // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+ // USE OR OTHER DEALINGS IN THE SOFTWARE.
+
var isBufferEncoding = Buffer.isEncoding
|| function(encoding) {
switch (encoding && encoding.toLowerCase()) {
@@ -4381,6 +4401,9 @@ function indexOf(xs, x) {
}

// A bit simpler than readable streams.
+ // Implement an async ._write(chunk, encoding, cb), and it'll handle all
+ // the drain event emission and buffering.
+
Writable.WritableState = WritableState;
inherits$1(Writable, EventEmitter);

@@ -4892,6 +4915,47 @@ function onEndNT(self) {
}

// a transform stream is a readable/writable stream where you do
+ // something with the data. Sometimes it's called a "filter",
+ // but that's not a great name for it, since that implies a thing where
+ // some bits pass through, and others are simply ignored. (That would
+ // be a valid example of a transform, of course.)
+ //
+ // While the output is causally related to the input, it's not a
+ // necessarily symmetric or synchronous transformation. For example,
+ // a zlib stream might take multiple plain-text writes(), and then
+ // emit a single compressed chunk some time in the future.
+ //
+ // Here's how this works:
+ //
+ // The Transform stream has all the aspects of the readable and writable
+ // stream classes. When you write(chunk), that calls _write(chunk,cb)
+ // internally, and returns false if there's a lot of pending writes
+ // buffered up. When you call read(), that calls _read(n) until
+ // there's enough pending readable data buffered up.
+ //
+ // In a transform stream, the written data is placed in a buffer. When
+ // _read(n) is called, it transforms the queued up data, calling the
+ // buffered _write cb's as it consumes chunks. If consuming a single
+ // written chunk would result in multiple output chunks, then the first
+ // outputted bit calls the readcb, and subsequent chunks just go into
+ // the read buffer, and will cause it to emit 'readable' if necessary.
+ //
+ // This way, back-pressure is actually determined by the reading side,
+ // since _read has to be called to start processing a new chunk. However,
+ // a pathological inflate type of transform can cause excessive buffering
+ // here. For example, imagine a stream where every byte of input is
+ // interpreted as an integer from 0-255, and then results in that many
+ // bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in
+ // 1kb of data being output. In this case, you could write a very small
+ // amount of input, and end up with a very large amount of output. In
+ // such a pathological inflating mechanism, there'd be no way to tell
+ // the system to stop doing the transform. A single 4MB write could
+ // cause the system to run out of memory.
+ //
+ // However, even in such a pathological case, only a single written chunk
+ // would be consumed, and then the rest would wait (un-transformed) until
+ // the results of the previous transformed chunk were consumed.
+
inherits$1(Transform, Duplex);

function TransformState(stream) {
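The comment blocks restored into the bundle above come from the readable-stream polyfill that the bundler inlines. The Writable one states its core contract: implement `_write(chunk, encoding, cb)` and the base class handles buffering and 'drain' emission. A minimal sketch of that contract against Node's built-in stream module, illustrative only and not part of this commit:

```ts
import { Writable } from 'node:stream';

// Collects every written chunk; buffering, backpressure, and the 'drain'
// event are handled entirely by the Writable base class.
class Collector extends Writable {
  chunks: Buffer[] = [];
  _write(chunk: Buffer, _encoding: BufferEncoding, cb: (err?: Error | null) => void): void {
    this.chunks.push(chunk);
    cb(); // signal the chunk was consumed so the next one can be delivered
  }
}

const sink = new Collector();
sink.end('hello', () => console.log(Buffer.concat(sink.chunks).toString()));
```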
64 changes: 64 additions & 0 deletions packages/csv-generate/dist/esm/sync.js
@@ -3295,6 +3295,26 @@ BufferList.prototype.concat = function (n) {
};

// Copyright Joyent, Inc. and other Node contributors.
+ //
+ // Permission is hereby granted, free of charge, to any person obtaining a
+ // copy of this software and associated documentation files (the
+ // "Software"), to deal in the Software without restriction, including
+ // without limitation the rights to use, copy, modify, merge, publish,
+ // distribute, sublicense, and/or sell copies of the Software, and to permit
+ // persons to whom the Software is furnished to do so, subject to the
+ // following conditions:
+ //
+ // The above copyright notice and this permission notice shall be included
+ // in all copies or substantial portions of the Software.
+ //
+ // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+ // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+ // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+ // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+ // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+ // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+ // USE OR OTHER DEALINGS IN THE SOFTWARE.
+
var isBufferEncoding = Buffer.isEncoding
|| function(encoding) {
switch (encoding && encoding.toLowerCase()) {
@@ -4381,6 +4401,9 @@ function indexOf(xs, x) {
}

// A bit simpler than readable streams.
+ // Implement an async ._write(chunk, encoding, cb), and it'll handle all
+ // the drain event emission and buffering.
+
Writable.WritableState = WritableState;
inherits$1(Writable, EventEmitter);

@@ -4892,6 +4915,47 @@ function onEndNT(self) {
}

// a transform stream is a readable/writable stream where you do
+ // something with the data. Sometimes it's called a "filter",
+ // but that's not a great name for it, since that implies a thing where
+ // some bits pass through, and others are simply ignored. (That would
+ // be a valid example of a transform, of course.)
+ //
+ // While the output is causally related to the input, it's not a
+ // necessarily symmetric or synchronous transformation. For example,
+ // a zlib stream might take multiple plain-text writes(), and then
+ // emit a single compressed chunk some time in the future.
+ //
+ // Here's how this works:
+ //
+ // The Transform stream has all the aspects of the readable and writable
+ // stream classes. When you write(chunk), that calls _write(chunk,cb)
+ // internally, and returns false if there's a lot of pending writes
+ // buffered up. When you call read(), that calls _read(n) until
+ // there's enough pending readable data buffered up.
+ //
+ // In a transform stream, the written data is placed in a buffer. When
+ // _read(n) is called, it transforms the queued up data, calling the
+ // buffered _write cb's as it consumes chunks. If consuming a single
+ // written chunk would result in multiple output chunks, then the first
+ // outputted bit calls the readcb, and subsequent chunks just go into
+ // the read buffer, and will cause it to emit 'readable' if necessary.
+ //
+ // This way, back-pressure is actually determined by the reading side,
+ // since _read has to be called to start processing a new chunk. However,
+ // a pathological inflate type of transform can cause excessive buffering
+ // here. For example, imagine a stream where every byte of input is
+ // interpreted as an integer from 0-255, and then results in that many
+ // bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in
+ // 1kb of data being output. In this case, you could write a very small
+ // amount of input, and end up with a very large amount of output. In
+ // such a pathological inflating mechanism, there'd be no way to tell
+ // the system to stop doing the transform. A single 4MB write could
+ // cause the system to run out of memory.
+ //
+ // However, even in such a pathological case, only a single written chunk
+ // would be consumed, and then the rest would wait (un-transformed) until
+ // the results of the previous transformed chunk were consumed.
+
inherits$1(Transform, Duplex);

function TransformState(stream) {
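The Transform comments, duplicated above into each bundle target, explain the key design: written data queues until the readable side pulls it, so backpressure is driven by the reader. A minimal sketch using Node's built-in Transform rather than the bundled polyfill:

```ts
import { Transform } from 'node:stream';

// Upper-cases each chunk: _transform consumes one written chunk, may push
// any number of output chunks, then calls cb to accept the next write.
const upper = new Transform({
  transform(chunk: Buffer, _encoding: BufferEncoding, cb) {
    this.push(chunk.toString().toUpperCase());
    cb();
  },
});

process.stdin.pipe(upper).pipe(process.stdout);
```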
64 changes: 64 additions & 0 deletions packages/csv-generate/dist/iife/index.js
@@ -3298,6 +3298,26 @@ var csv_generate = (function (exports) {
};

// Copyright Joyent, Inc. and other Node contributors.
+ //
+ // Permission is hereby granted, free of charge, to any person obtaining a
+ // copy of this software and associated documentation files (the
+ // "Software"), to deal in the Software without restriction, including
+ // without limitation the rights to use, copy, modify, merge, publish,
+ // distribute, sublicense, and/or sell copies of the Software, and to permit
+ // persons to whom the Software is furnished to do so, subject to the
+ // following conditions:
+ //
+ // The above copyright notice and this permission notice shall be included
+ // in all copies or substantial portions of the Software.
+ //
+ // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+ // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+ // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+ // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+ // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+ // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+ // USE OR OTHER DEALINGS IN THE SOFTWARE.
+
var isBufferEncoding = Buffer.isEncoding
|| function(encoding) {
switch (encoding && encoding.toLowerCase()) {
@@ -4384,6 +4404,9 @@ var csv_generate = (function (exports) {
}

// A bit simpler than readable streams.
+ // Implement an async ._write(chunk, encoding, cb), and it'll handle all
+ // the drain event emission and buffering.
+
Writable.WritableState = WritableState;
inherits$1(Writable, EventEmitter);

@@ -4895,6 +4918,47 @@ var csv_generate = (function (exports) {
}

// a transform stream is a readable/writable stream where you do
+ // something with the data. Sometimes it's called a "filter",
+ // but that's not a great name for it, since that implies a thing where
+ // some bits pass through, and others are simply ignored. (That would
+ // be a valid example of a transform, of course.)
+ //
+ // While the output is causally related to the input, it's not a
+ // necessarily symmetric or synchronous transformation. For example,
+ // a zlib stream might take multiple plain-text writes(), and then
+ // emit a single compressed chunk some time in the future.
+ //
+ // Here's how this works:
+ //
+ // The Transform stream has all the aspects of the readable and writable
+ // stream classes. When you write(chunk), that calls _write(chunk,cb)
+ // internally, and returns false if there's a lot of pending writes
+ // buffered up. When you call read(), that calls _read(n) until
+ // there's enough pending readable data buffered up.
+ //
+ // In a transform stream, the written data is placed in a buffer. When
+ // _read(n) is called, it transforms the queued up data, calling the
+ // buffered _write cb's as it consumes chunks. If consuming a single
+ // written chunk would result in multiple output chunks, then the first
+ // outputted bit calls the readcb, and subsequent chunks just go into
+ // the read buffer, and will cause it to emit 'readable' if necessary.
+ //
+ // This way, back-pressure is actually determined by the reading side,
+ // since _read has to be called to start processing a new chunk. However,
+ // a pathological inflate type of transform can cause excessive buffering
+ // here. For example, imagine a stream where every byte of input is
+ // interpreted as an integer from 0-255, and then results in that many
+ // bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in
+ // 1kb of data being output. In this case, you could write a very small
+ // amount of input, and end up with a very large amount of output. In
+ // such a pathological inflating mechanism, there'd be no way to tell
+ // the system to stop doing the transform. A single 4MB write could
+ // cause the system to run out of memory.
+ //
+ // However, even in such a pathological case, only a single written chunk
+ // would be consumed, and then the rest would wait (un-transformed) until
+ // the results of the previous transformed chunk were consumed.
+
inherits$1(Transform, Duplex);

function TransformState(stream) {
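The same comments end with a worked example of a "pathological inflate": if every input byte n expands to n output bytes, the four bytes {ff,ff,ff,ff} yield 4 * 255 = 1020 bytes, roughly the 1kb the comment mentions. A hedged sketch of such a transform, again with Node's built-in API rather than the bundled polyfill:

```ts
import { Transform } from 'node:stream';

// Each input byte n expands to n zero bytes of output, so a tiny write
// can produce a large amount of buffered readable data.
const inflate = new Transform({
  transform(chunk: Buffer, _encoding: BufferEncoding, cb) {
    for (const byte of chunk) this.push(Buffer.alloc(byte));
    cb(); // only now is the next written chunk consumed
  },
});

inflate.on('data', (out: Buffer) => console.log(out.length));
inflate.end(Buffer.from([0xff, 0xff, 0xff, 0xff])); // logs each pushed chunk's size (255)
```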