diff --git a/CHANGELOG.md b/CHANGELOG.md
index f9593cb605ae1e..a5a237ca0fa58c 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -33,7 +33,8 @@ release.
If you have two-factor authentication enabled then you'll be prompted to
-provide an otp token, or may use the --otp=...
option to specify it on
+
If you have two-factor authentication enabled then you'll be prompted to provide a second factor, or may use the --otp=...
option to specify it on
the command line.
If your account is not paid, then attempts to publish scoped packages will fail with an HTTP 402 status code (logically enough), unless you use --access=public.
diff --git a/deps/npm/docs/output/commands/npm-audit.html b/deps/npm/docs/output/commands/npm-audit.html
index 1a83cbe47604ec..eda45d9bd23db3 100644
--- a/deps/npm/docs/output/commands/npm-audit.html
+++ b/deps/npm/docs/output/commands/npm-audit.html
@@ -454,7 +454,7 @@
include-workspace-root
This value is not exported to the environment for child processes.
install-links
When set file: protocol dependencies will be packed and installed as regular
diff --git a/deps/npm/docs/output/commands/npm-ci.html b/deps/npm/docs/output/commands/npm-ci.html
index 07f07b5630562e..cd2c21a3c215b6 100644
--- a/deps/npm/docs/output/commands/npm-ci.html
+++ b/deps/npm/docs/output/commands/npm-ci.html
@@ -229,13 +229,13 @@
global
install-strategy
Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct
-deps at top-level. linked: (coming soon) install in node_modules/.store,
+deps at top-level. linked: (experimental) install in node_modules/.store,
link in place, unhoisted.
legacy-bundling
include-workspace-root
This value is not exported to the environment for child processes.
install-links
When set file: protocol dependencies will be packed and installed as regular
diff --git a/deps/npm/docs/output/commands/npm-dedupe.html b/deps/npm/docs/output/commands/npm-dedupe.html
index 143398d8d7a27b..d77f4a59c7de98 100644
--- a/deps/npm/docs/output/commands/npm-dedupe.html
+++ b/deps/npm/docs/output/commands/npm-dedupe.html
@@ -198,13 +198,13 @@
install-strategy
Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct
-deps at top-level. linked: (coming soon) install in node_modules/.store,
+deps at top-level. linked: (experimental) install in node_modules/.store,
link in place, unhoisted.
legacy-bundling
include-workspace-root
This value is not exported to the environment for child processes.
install-links
When set file: protocol dependencies will be packed and installed as regular
diff --git a/deps/npm/docs/output/commands/npm-dist-tag.html b/deps/npm/docs/output/commands/npm-dist-tag.html
index b3396d7cbb53cf..1e75f62f1fcb49 100644
--- a/deps/npm/docs/output/commands/npm-dist-tag.html
+++ b/deps/npm/docs/output/commands/npm-dist-tag.html
@@ -160,13 +160,13 @@
--tag
config if not specified. If you have
two-factor authentication on auth-and-writes then you’ll need to include a
one-time password on the command line with
---otp <one-time password>
, or at the OTP prompt.
+--otp <one-time password>
, or go through a second factor flow based on your authtype
.
rm: Clear a tag that is no longer in use from the package. If you have
two-factor authentication on auth-and-writes then you’ll need to include
a one-time password on the command line with --otp <one-time password>
,
-or at the OTP prompt.
authtype
ls: Show all of the dist-tags for a package, defaulting to the package in
diff --git a/deps/npm/docs/output/commands/npm-find-dupes.html b/deps/npm/docs/output/commands/npm-find-dupes.html
index 0dd2bc375069ee..82d5eba473f1df 100644
--- a/deps/npm/docs/output/commands/npm-find-dupes.html
+++ b/deps/npm/docs/output/commands/npm-find-dupes.html
@@ -155,13 +155,13 @@
install-strategy
Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct
-deps at top-level. linked: (coming soon) install in node_modules/.store,
+deps at top-level. linked: (experimental) install in node_modules/.store,
link in place, unhoisted.
legacy-bundling
include-workspace-root
This value is not exported to the environment for child processes.
install-links
When set file: protocol dependencies will be packed and installed as regular
diff --git a/deps/npm/docs/output/commands/npm-install-ci-test.html b/deps/npm/docs/output/commands/npm-install-ci-test.html
index 0d3dea5b4cc530..b2f6a3affbbafa 100644
--- a/deps/npm/docs/output/commands/npm-install-ci-test.html
+++ b/deps/npm/docs/output/commands/npm-install-ci-test.html
@@ -186,13 +186,13 @@
global
install-strategy
Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct
-deps at top-level. linked: (coming soon) install in node_modules/.store,
+deps at top-level. linked: (experimental) install in node_modules/.store,
link in place, unhoisted.
legacy-bundling
include-workspace-root
This value is not exported to the environment for child processes.
install-links
When set file: protocol dependencies will be packed and installed as regular
diff --git a/deps/npm/docs/output/commands/npm-install-test.html b/deps/npm/docs/output/commands/npm-install-test.html
index ff53148d4fb09d..27ab7c3ae5b23a 100644
--- a/deps/npm/docs/output/commands/npm-install-test.html
+++ b/deps/npm/docs/output/commands/npm-install-test.html
@@ -187,13 +187,13 @@
global
install-strategy
Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct
-deps at top-level. linked: (coming soon) install in node_modules/.store,
+deps at top-level. linked: (experimental) install in node_modules/.store,
link in place, unhoisted.
legacy-bundling
include-workspace-root
This value is not exported to the environment for child processes.
install-links
When set file: protocol dependencies will be packed and installed as regular
diff --git a/deps/npm/docs/output/commands/npm-install.html b/deps/npm/docs/output/commands/npm-install.html
index 40dc5d32b3d746..0c72b0c0917200 100644
--- a/deps/npm/docs/output/commands/npm-install.html
+++ b/deps/npm/docs/output/commands/npm-install.html
@@ -513,13 +513,13 @@
global
install-strategy
Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct
-deps at top-level. linked: (coming soon) install in node_modules/.store,
+deps at top-level. linked: (experimental) install in node_modules/.store,
link in place, unhoisted.
legacy-bundling
include-workspace-root
This value is not exported to the environment for child processes.
install-links
When set file: protocol dependencies will be packed and installed as regular
diff --git a/deps/npm/docs/output/commands/npm-link.html b/deps/npm/docs/output/commands/npm-link.html
index d23f83826aa9d7..9686106d8b8d03 100644
--- a/deps/npm/docs/output/commands/npm-link.html
+++ b/deps/npm/docs/output/commands/npm-link.html
@@ -250,13 +250,13 @@
global
install-strategy
Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct
-deps at top-level. linked: (coming soon) install in node_modules/.store,
+deps at top-level. linked: (experimental) install in node_modules/.store,
link in place, unhoisted.
legacy-bundling
include-workspace-root
This value is not exported to the environment for child processes.
install-links
When set file: protocol dependencies will be packed and installed as regular
diff --git a/deps/npm/docs/output/commands/npm-ls.html b/deps/npm/docs/output/commands/npm-ls.html
index 4bb067ab67a65b..8411b221adb082 100644
--- a/deps/npm/docs/output/commands/npm-ls.html
+++ b/deps/npm/docs/output/commands/npm-ls.html
@@ -160,7 +160,7 @@
npm ls promzard
in npm's source tree will show:
-npm@9.3.1 /path/to/npm
+npm@9.5.0 /path/to/npm
└─┬ init-package-json@0.0.4
└── promzard@0.1.5
@@ -333,7 +333,7 @@ include-workspace-root
This value is not exported to the environment for child processes.
install-links
-- Default: true
+- Default: false
- Type: Boolean
When set file: protocol dependencies will be packed and installed as regular
diff --git a/deps/npm/docs/output/commands/npm-owner.html b/deps/npm/docs/output/commands/npm-owner.html
index 3566602a011109..dab34abe97c14a 100644
--- a/deps/npm/docs/output/commands/npm-owner.html
+++ b/deps/npm/docs/output/commands/npm-owner.html
@@ -166,8 +166,8 @@
Description
or you can't. Future versions may contain more fine-grained access levels, but
that is not implemented at this time.
If you have two-factor authentication enabled with auth-and-writes
(see
-npm-profile
) then you'll need to include an otp
-on the command line when changing ownership with --otp
.
+npm-profile
) then you'll need to go through a second factor
+flow when changing ownership or include an otp on the command line with --otp
.
Configuration
registry
diff --git a/deps/npm/docs/output/commands/npm-prune.html b/deps/npm/docs/output/commands/npm-prune.html
index b2cbbbeddd949b..9ca44c4474052e 100644
--- a/deps/npm/docs/output/commands/npm-prune.html
+++ b/deps/npm/docs/output/commands/npm-prune.html
@@ -268,7 +268,7 @@ include-workspace-root
This value is not exported to the environment for child processes.
install-links
-- Default: true
+- Default: false
- Type: Boolean
When set file: protocol dependencies will be packed and installed as regular
diff --git a/deps/npm/docs/output/commands/npm-publish.html b/deps/npm/docs/output/commands/npm-publish.html
index 80a136308a19c5..a2a4f0aad05647 100644
--- a/deps/npm/docs/output/commands/npm-publish.html
+++ b/deps/npm/docs/output/commands/npm-publish.html
@@ -142,7 +142,7 @@
npm-publish
Table of contents
-
+
Synopsis
@@ -306,6 +306,12 @@ include-workspace-root
all workspaces via the workspaces
flag, will cause npm to operate only on
the specified workspaces, and not on the root project.
This value is not exported to the environment for child processes.
+provenance
+
+- Default: false
+- Type: Boolean
+
+Indicates that a provenance statement should be generated.
See Also
- package spec
diff --git a/deps/npm/docs/output/commands/npm-rebuild.html b/deps/npm/docs/output/commands/npm-rebuild.html
index 51243f22324a58..4d2ce115e6a362 100644
--- a/deps/npm/docs/output/commands/npm-rebuild.html
+++ b/deps/npm/docs/output/commands/npm-rebuild.html
@@ -250,7 +250,7 @@ include-workspace-root
This value is not exported to the environment for child processes.
install-links
-- Default: true
+- Default: false
- Type: Boolean
When set file: protocol dependencies will be packed and installed as regular
diff --git a/deps/npm/docs/output/commands/npm-team.html b/deps/npm/docs/output/commands/npm-team.html
index a67381f3508dab..ac8740e917cc8f 100644
--- a/deps/npm/docs/output/commands/npm-team.html
+++ b/deps/npm/docs/output/commands/npm-team.html
@@ -162,7 +162,8 @@
Description
as @org:newteam
in these commands.
If you have two-factor authentication enabled in auth-and-writes
mode, then
you can provide a code from your authenticator with [--otp <otpcode>]
.
-If you don't include this then you will be prompted.
+If you don't include this then you will be taken through a second factor flow based
+on your authtype
.
-
create / destroy:
diff --git a/deps/npm/docs/output/commands/npm-uninstall.html b/deps/npm/docs/output/commands/npm-uninstall.html
index 4e6d43fb9dc48f..98a9f08418a14e 100644
--- a/deps/npm/docs/output/commands/npm-uninstall.html
+++ b/deps/npm/docs/output/commands/npm-uninstall.html
@@ -234,7 +234,7 @@
include-workspace-root
This value is not exported to the environment for child processes.
install-links
-- Default: true
+- Default: false
- Type: Boolean
When set file: protocol dependencies will be packed and installed as regular
diff --git a/deps/npm/docs/output/commands/npm-update.html b/deps/npm/docs/output/commands/npm-update.html
index 162eb4fe54b99e..32c3da827ca805 100644
--- a/deps/npm/docs/output/commands/npm-update.html
+++ b/deps/npm/docs/output/commands/npm-update.html
@@ -277,13 +277,13 @@
global
install-strategy
- Default: "hoisted"
-- Type: "hoisted", "nested", or "shallow"
+- Type: "hoisted", "nested", "shallow", or "linked"
Sets the strategy for installing packages in node_modules. hoisted
(default): Install non-duplicated in top-level, and duplicated as necessary
within directory structure. nested: (formerly --legacy-bundling) install in
place, no hoisting. shallow (formerly --global-style) only install direct
-deps at top-level. linked: (coming soon) install in node_modules/.store,
+deps at top-level. linked: (experimental) install in node_modules/.store,
link in place, unhoisted.
legacy-bundling
@@ -447,7 +447,7 @@ include-workspace-root
This value is not exported to the environment for child processes.
install-links
-- Default: true
+- Default: false
- Type: Boolean
When set file: protocol dependencies will be packed and installed as regular
diff --git a/deps/npm/docs/output/commands/npm.html b/deps/npm/docs/output/commands/npm.html
index 76f4fc5cc17c9c..f3678c6bf9393f 100644
--- a/deps/npm/docs/output/commands/npm.html
+++ b/deps/npm/docs/output/commands/npm.html
@@ -150,7 +150,7 @@
Table of contents
Note: This command is unaware of workspaces.
Version
-9.3.1
+9.5.0
Description
npm is the package manager for the Node JavaScript platform. It puts
modules in place so that node can find them, and manages dependency
diff --git a/deps/npm/docs/output/using-npm/config.html b/deps/npm/docs/output/using-npm/config.html
index 35bc3529efaeef..3887fe1a564e4f 100644
--- a/deps/npm/docs/output/using-npm/config.html
+++ b/deps/npm/docs/output/using-npm/config.html
@@ -142,7 +142,7 @@
config
Table of contents
-- Description
- Shorthands and Other CLI Niceties
- Config Settings
_auth
access
all
allow-same-version
audit
audit-level
auth-type
before
bin-links
browser
ca
cache
cafile
call
ci-name
cidr
color
commit-hooks
depth
description
diff
diff-dst-prefix
diff-ignore-all-space
diff-name-only
diff-no-prefix
diff-src-prefix
diff-text
diff-unified
dry-run
editor
engine-strict
fetch-retries
fetch-retry-factor
fetch-retry-maxtimeout
fetch-retry-mintimeout
fetch-timeout
force
foreground-scripts
format-package-lock
fund
git
git-tag-version
global
globalconfig
heading
https-proxy
if-present
ignore-scripts
include
include-staged
include-workspace-root
init-author-email
init-author-name
init-author-url
init-license
init-module
init-version
install-links
install-strategy
json
legacy-peer-deps
link
local-address
location
lockfile-version
loglevel
logs-dir
logs-max
long
maxsockets
message
node-options
noproxy
offline
omit
omit-lockfile-registry-resolved
otp
pack-destination
package
package-lock
package-lock-only
parseable
prefer-offline
prefer-online
prefix
preid
progress
proxy
read-only
rebuild-bundle
registry
replace-registry-host
save
save-bundle
save-dev
save-exact
save-optional
save-peer
save-prefix
save-prod
scope
script-shell
searchexclude
searchlimit
searchopts
searchstaleness
shell
sign-git-commit
sign-git-tag
strict-peer-deps
strict-ssl
tag
tag-version-prefix
timing
umask
unicode
update-notifier
usage
user-agent
userconfig
version
versions
viewer
which
workspace
workspaces
workspaces-update
yes
also
cache-max
cache-min
cert
dev
global-style
init.author.email
init.author.name
init.author.url
init.license
init.module
init.version
key
legacy-bundling
only
optional
production
shrinkwrap
tmp
- See also
+- Description
- Shorthands and Other CLI Niceties
- Config Settings
_auth
access
all
allow-same-version
audit
audit-level
auth-type
before
bin-links
browser
ca
cache
cafile
call
ci-name
cidr
color
commit-hooks
depth
description
diff
diff-dst-prefix
diff-ignore-all-space
diff-name-only
diff-no-prefix
diff-src-prefix
diff-text
diff-unified
dry-run
editor
engine-strict
fetch-retries
fetch-retry-factor
fetch-retry-maxtimeout
fetch-retry-mintimeout
fetch-timeout
force
foreground-scripts
format-package-lock
fund
git
git-tag-version
global
globalconfig
heading
https-proxy
if-present
ignore-scripts
include
include-staged
include-workspace-root
init-author-email
init-author-name
init-author-url
init-license
init-module
init-version
install-links
install-strategy
json
legacy-peer-deps
link
local-address
location
lockfile-version
loglevel
logs-dir
logs-max
long
maxsockets
message
node-options
noproxy
offline
omit
omit-lockfile-registry-resolved
otp
pack-destination
package
package-lock
package-lock-only
parseable
prefer-offline
prefer-online
prefix
preid
progress
provenance
proxy
read-only
rebuild-bundle
registry
replace-registry-host
save
save-bundle
save-dev
save-exact
save-optional
save-peer
save-prefix
save-prod
scope
script-shell
searchexclude
searchlimit
searchopts
searchstaleness
shell
sign-git-commit
sign-git-tag
strict-peer-deps
strict-ssl
tag
tag-version-prefix
timing
umask
unicode
update-notifier
usage
user-agent
userconfig
version
versions
viewer
which
workspace
workspaces
workspaces-update
yes
also
cache-max
cache-min
cert
dev
global-style
init.author.email
init.author.name
init.author.url
init.license
init.module
init.version
key
legacy-bundling
only
optional
production
shrinkwrap
tmp
- See also
Description
@@ -724,7 +724,7 @@ init-version
number, if not already set in package.json.
install-links
-- Default: true
+- Default: false
- Type: Boolean
When set file: protocol dependencies will be packed and installed as regular
@@ -733,13 +733,13 @@
install-links
install-strategy
- Default: "hoisted"
-- Type: "hoisted", "nested", or "shallow"
+- Type: "hoisted", "nested", "shallow", or "linked"
Sets the strategy for installing packages in node_modules. hoisted
(default): Install non-duplicated in top-level, and duplicated as necessary
within directory structure. nested: (formerly --legacy-bundling) install in
place, no hoisting. shallow (formerly --global-style) only install direct
-deps at top-level. linked: (coming soon) install in node_modules/.store,
+deps at top-level. linked: (experimental) install in node_modules/.store,
link in place, unhoisted.
json
@@ -989,6 +989,12 @@ progress
When set to true
, npm will display a progress bar during time intensive
operations, if process.stderr
is a TTY.
Set to false
to suppress the progress bar.
+provenance
+
+- Default: false
+- Type: Boolean
+
+Indicates that a provenance statement should be generated.
proxy
- Default: null
diff --git a/deps/npm/lib/commands/audit.js b/deps/npm/lib/commands/audit.js
index 13886ea6350b66..05830fff69c2ea 100644
--- a/deps/npm/lib/commands/audit.js
+++ b/deps/npm/lib/commands/audit.js
@@ -24,8 +24,8 @@ class VerifySignatures {
this.missing = []
this.checkedPackages = new Set()
this.auditedWithKeysCount = 0
- this.verifiedCount = 0
- this.output = []
+ this.verifiedSignatureCount = 0
+ this.verifiedAttestationCount = 0
this.exitCode = 0
}
@@ -60,13 +60,13 @@ class VerifySignatures {
const hasNoInvalidOrMissing = invalid.length === 0 && missing.length === 0
if (!hasNoInvalidOrMissing) {
- this.exitCode = 1
+ process.exitCode = 1
}
if (this.npm.config.get('json')) {
- this.appendOutput(JSON.stringify({
- invalid: this.makeJSON(invalid),
- missing: this.makeJSON(missing),
+ this.npm.output(JSON.stringify({
+ invalid,
+ missing,
}, null, 2))
return
}
@@ -76,52 +76,92 @@ class VerifySignatures {
const auditedPlural = this.auditedWithKeysCount > 1 ? 's' : ''
const timing = `audited ${this.auditedWithKeysCount} package${auditedPlural} in ` +
`${Math.floor(Number(elapsed) / 1e9)}s`
- this.appendOutput(`${timing}\n`)
+ this.npm.output(timing)
+ this.npm.output('')
+
+ const verifiedBold = this.npm.chalk.bold('verified')
+ if (this.verifiedSignatureCount) {
+ if (this.verifiedSignatureCount === 1) {
+ /* eslint-disable-next-line max-len */
+ this.npm.output(`${this.verifiedSignatureCount} package has a ${verifiedBold} registry signature`)
+ } else {
+ /* eslint-disable-next-line max-len */
+ this.npm.output(`${this.verifiedSignatureCount} packages have ${verifiedBold} registry signatures`)
+ }
+ this.npm.output('')
+ }
- if (this.verifiedCount) {
- const verifiedBold = this.npm.chalk.bold('verified')
- const msg = this.verifiedCount === 1 ?
- `${this.verifiedCount} package has a ${verifiedBold} registry signature\n` :
- `${this.verifiedCount} packages have ${verifiedBold} registry signatures\n`
- this.appendOutput(msg)
+ if (this.verifiedAttestationCount) {
+ if (this.verifiedAttestationCount === 1) {
+ /* eslint-disable-next-line max-len */
+ this.npm.output(`${this.verifiedAttestationCount} package has a ${verifiedBold} attestation`)
+ } else {
+ /* eslint-disable-next-line max-len */
+ this.npm.output(`${this.verifiedAttestationCount} packages have ${verifiedBold} attestations`)
+ }
+ this.npm.output('')
}
if (missing.length) {
const missingClr = this.npm.chalk.bold(this.npm.chalk.red('missing'))
- const msg = missing.length === 1 ?
- `package has a ${missingClr} registry signature` :
- `packages have ${missingClr} registry signatures`
- this.appendOutput(
- `${missing.length} ${msg} but the registry is ` +
- `providing signing keys:\n`
+ if (missing.length === 1) {
+ /* eslint-disable-next-line max-len */
+ this.npm.output(`1 package has a ${missingClr} registry signature but the registry is providing signing keys:`)
+ } else {
+ /* eslint-disable-next-line max-len */
+ this.npm.output(`${missing.length} packages have ${missingClr} registry signatures but the registry is providing signing keys:`)
+ }
+ this.npm.output('')
+ missing.map(m =>
+ this.npm.output(`${this.npm.chalk.red(`${m.name}@${m.version}`)} (${m.registry})`)
)
- this.appendOutput(this.humanOutput(missing))
}
if (invalid.length) {
+ if (missing.length) {
+ this.npm.output('')
+ }
const invalidClr = this.npm.chalk.bold(this.npm.chalk.red('invalid'))
- const msg = invalid.length === 1 ?
- `${invalid.length} package has an ${invalidClr} registry signature:\n` :
- `${invalid.length} packages have ${invalidClr} registry signatures:\n`
- this.appendOutput(
- `${missing.length ? '\n' : ''}${msg}`
- )
- this.appendOutput(this.humanOutput(invalid))
- const tamperMsg = invalid.length === 1 ?
- `\nSomeone might have tampered with this package since it was ` +
- `published on the registry!\n` :
- `\nSomeone might have tampered with these packages since they where ` +
- `published on the registry!\n`
- this.appendOutput(tamperMsg)
- }
- }
+ // We can have either invalid signatures or invalid provenance
+ const invalidSignatures = this.invalid.filter(i => i.code === 'EINTEGRITYSIGNATURE')
+ if (invalidSignatures.length) {
+ if (invalidSignatures.length === 1) {
+ this.npm.output(`1 package has an ${invalidClr} registry signature:`)
+ } else {
+ /* eslint-disable-next-line max-len */
+ this.npm.output(`${invalidSignatures.length} packages have ${invalidClr} registry signatures:`)
+ }
+ this.npm.output('')
+ invalidSignatures.map(i =>
+ this.npm.output(`${this.npm.chalk.red(`${i.name}@${i.version}`)} (${i.registry})`)
+ )
+ this.npm.output('')
+ }
- appendOutput (...args) {
- this.output.push(...args.flat())
- }
+ const invalidAttestations = this.invalid.filter(i => i.code === 'EATTESTATIONVERIFY')
+ if (invalidAttestations.length) {
+ if (invalidAttestations.length === 1) {
+ this.npm.output(`1 package has an ${invalidClr} attestation:`)
+ } else {
+ /* eslint-disable-next-line max-len */
+ this.npm.output(`${invalidAttestations.length} packages have ${invalidClr} attestations:`)
+ }
+ this.npm.output('')
+ invalidAttestations.map(i =>
+ this.npm.output(`${this.npm.chalk.red(`${i.name}@${i.version}`)} (${i.registry})`)
+ )
+ this.npm.output('')
+ }
- report () {
- return { report: this.output.join('\n'), exitCode: this.exitCode }
+ if (invalid.length === 1) {
+ /* eslint-disable-next-line max-len */
+ this.npm.output(`Someone might have tampered with this package since it was published on the registry!`)
+ } else {
+ /* eslint-disable-next-line max-len */
+ this.npm.output(`Someone might have tampered with these packages since they were published on the registry!`)
+ }
+ this.npm.output('')
+ }
}
getEdgesOut (nodes, filterSet) {
@@ -242,18 +282,22 @@ class VerifySignatures {
const {
_integrity: integrity,
_signatures,
+ _attestations,
_resolved: resolved,
} = await pacote.manifest(`${name}@${version}`, {
verifySignatures: true,
+ verifyAttestations: true,
...this.buildRegistryConfig(registry),
...this.npm.flatOptions,
})
const signatures = _signatures || []
- return {
+ const result = {
integrity,
signatures,
+ attestations: _attestations,
resolved,
}
+ return result
}
async getVerifiedInfo (edge) {
@@ -276,61 +320,52 @@ class VerifySignatures {
}
try {
- const { integrity, signatures, resolved } = await this.verifySignatures(
+ const { integrity, signatures, attestations, resolved } = await this.verifySignatures(
name, version, registry
)
// Currently we only care about missing signatures on registries that provide a public key
// We could make this configurable in the future with a strict/paranoid mode
if (signatures.length) {
- this.verifiedCount += 1
+ this.verifiedSignatureCount += 1
} else if (keys.length) {
this.missing.push({
- name,
- version,
- location,
- resolved,
integrity,
+ location,
+ name,
registry,
+ resolved,
+ version,
})
}
+
+ // Track verified attestations separately to registry signatures, as all
+ // packages on registries with signing keys are expected to have registry
+ // signatures, but not all packages have provenance and publish attestations.
+ if (attestations) {
+ this.verifiedAttestationCount += 1
+ }
} catch (e) {
- if (e.code === 'EINTEGRITYSIGNATURE') {
- const { signature, keyid, integrity, resolved } = e
+ if (e.code === 'EINTEGRITYSIGNATURE' || e.code === 'EATTESTATIONVERIFY') {
this.invalid.push({
+ code: e.code,
+ message: e.message,
+ integrity: e.integrity,
+ keyid: e.keyid,
+ location,
name,
+ registry,
+ resolved: e.resolved,
+ signature: e.signature,
+ predicateType: e.predicateType,
type,
version,
- resolved,
- location,
- integrity,
- registry,
- signature,
- keyid,
})
} else {
throw e
}
}
}
-
- humanOutput (list) {
- return list.map(v =>
- `${this.npm.chalk.red(`${v.name}@${v.version}`)} (${v.registry})`
- ).join('\n')
- }
-
- makeJSON (deps) {
- return deps.map(d => ({
- name: d.name,
- version: d.version,
- location: d.location,
- resolved: d.resolved,
- integrity: d.integrity,
- signature: d.signature,
- keyid: d.keyid,
- }))
- }
}
class Audit extends ArboristWorkspaceCmd {
@@ -432,9 +467,6 @@ class Audit extends ArboristWorkspaceCmd {
const verify = new VerifySignatures(tree, filterSet, this.npm, { ...opts })
await verify.run()
- const result = verify.report()
- process.exitCode = process.exitCode || result.exitCode
- this.npm.output(result.report)
}
}
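With the refactor above, VerifySignatures writes its report directly through this.npm.output, sets process.exitCode itself, and asks pacote to verify provenance attestations alongside registry signatures, tracking the two counts separately. A minimal sketch of that verification call, using only the option names, manifest fields, and error codes visible in this diff; the helper name and shape are illustrative:

  const pacote = require('pacote')

  async function checkPackage (spec, opts = {}) {
    try {
      const { _signatures, _attestations } = await pacote.manifest(spec, {
        verifySignatures: true,
        verifyAttestations: true,
        ...opts,
      })
      return {
        signed: Boolean(_signatures && _signatures.length),
        attested: Boolean(_attestations),
      }
    } catch (e) {
      // EINTEGRITYSIGNATURE: registry signature failed to verify
      // EATTESTATIONVERIFY: provenance/publish attestation failed to verify
      if (e.code === 'EINTEGRITYSIGNATURE' || e.code === 'EATTESTATIONVERIFY') {
        return { invalid: true, code: e.code, message: e.message }
      }
      throw e
    }
  }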
diff --git a/deps/npm/lib/commands/init.js b/deps/npm/lib/commands/init.js
index 16ece46589d7cb..1e5661a7840f26 100644
--- a/deps/npm/lib/commands/init.js
+++ b/deps/npm/lib/commands/init.js
@@ -165,24 +165,20 @@ class Init extends BaseCommand {
].join('\n'))
}
- // XXX promisify init-package-json
- await new Promise((res, rej) => {
- initJson(path, initFile, this.npm.config, (er, data) => {
- log.resume()
- log.enableProgress()
- log.silly('package data', data)
- if (er && er.message === 'canceled') {
- log.warn('init', 'canceled')
- return res()
- }
- if (er) {
- rej(er)
- } else {
- log.info('init', 'written successfully')
- res(data)
- }
- })
- })
+ try {
+ const data = await initJson(path, initFile, this.npm.config)
+ log.silly('package data', data)
+ return data
+ } catch (er) {
+ if (er.message === 'canceled') {
+ log.warn('init', 'canceled')
+ } else {
+ throw er
+ }
+ } finally {
+ log.resume()
+ log.enableProgress()
+ }
}
async setWorkspace (pkg, workspacePath) {
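The change above drops the hand-rolled Promise wrapper because init-package-json is now awaited directly; the only special case kept is treating a 'canceled' rejection as a warning rather than an error. A small sketch of consuming it that way, assuming the promise-returning signature this diff relies on (the wrapper name is illustrative):

  const initJson = require('init-package-json')

  async function runInit (path, initFile, config) {
    try {
      // resolves with the written package data
      return await initJson(path, initFile, config)
    } catch (er) {
      if (er.message === 'canceled') {
        // user backed out of the prompts; treat as a no-op
        return undefined
      }
      throw er
    }
  }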
diff --git a/deps/npm/lib/commands/publish.js b/deps/npm/lib/commands/publish.js
index 76faea9457f748..8befbc5ca34cec 100644
--- a/deps/npm/lib/commands/publish.js
+++ b/deps/npm/lib/commands/publish.js
@@ -35,6 +35,7 @@ class Publish extends BaseCommand {
'workspace',
'workspaces',
'include-workspace-root',
+ 'provenance',
]
static usage = ['']
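The new 'provenance' entry in the params list above surfaces the --provenance flag (defined further down in definitions.js) on npm publish. A minimal sketch of reading that flag through the same config accessor the rest of this diff uses; the helper name is illustrative, and how the value reaches the registry client is not shown here:

  // Hypothetical helper; npm.config.get(...) is the accessor used throughout this diff.
  function provenanceRequested (npm) {
    // Boolean config, default false per the definition added below
    return npm.config.get('provenance') === true
  }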
diff --git a/deps/npm/lib/commands/unpublish.js b/deps/npm/lib/commands/unpublish.js
index 9985e2e39f1405..f1bcded192e5ad 100644
--- a/deps/npm/lib/commands/unpublish.js
+++ b/deps/npm/lib/commands/unpublish.js
@@ -26,7 +26,10 @@ class Unpublish extends BaseCommand {
async getKeysOfVersions (name, opts) {
const pkgUri = npa(name).escapedName
- const json = await npmFetch.json(`${pkgUri}?write=true`, opts)
+ const json = await npmFetch.json(`${pkgUri}?write=true`, {
+ ...opts,
+ spec: name,
+ })
return Object.keys(json.versions)
}
diff --git a/deps/npm/lib/utils/auth.js b/deps/npm/lib/utils/auth.js
index 8b9125a1c3ef06..729ce32c2a7a8f 100644
--- a/deps/npm/lib/utils/auth.js
+++ b/deps/npm/lib/utils/auth.js
@@ -8,16 +8,27 @@ const adduser = async (npm, { creds, ...opts }) => {
const authType = npm.config.get('auth-type')
let res
if (authType === 'web') {
- res = await profile.adduserWeb((url, emitter) => {
- openUrlPrompt(
- npm,
- url,
- 'Create your account at',
- 'Press ENTER to open in the browser...',
- emitter
- )
- }, opts)
- } else {
+ try {
+ res = await profile.adduserWeb((url, emitter) => {
+ openUrlPrompt(
+ npm,
+ url,
+ 'Create your account at',
+ 'Press ENTER to open in the browser...',
+ emitter
+ )
+ }, opts)
+ } catch (err) {
+ if (err.code === 'ENYI') {
+ log.verbose('web add user not supported, trying couch')
+ } else {
+ throw err
+ }
+ }
+ }
+
+ // auth type !== web or ENYI error w/ web adduser
+ if (!res) {
const username = await read.username('Username:', creds.username)
const password = await read.password('Password:', creds.password)
const email = await read.email('Email: (this IS public) ', creds.email)
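Both adduser and login now share the same shape: attempt the web flow first, and fall back to the classic username/password prompts only when the registry rejects web auth with an ENYI error. A generic sketch of that fallback with illustrative names; only the ENYI code and the !res fallback check come from this diff:

  // webFlow and couchFlow stand in for profile.adduserWeb/loginWeb and the
  // prompt-based flows; both are assumed to return the auth result.
  async function webFirst (webFlow, couchFlow) {
    let res
    try {
      res = await webFlow()
    } catch (err) {
      if (err.code !== 'ENYI') {
        throw err
      }
      // registry does not implement web auth; fall through to couch
    }
    return res || couchFlow()
  }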
@@ -44,16 +55,27 @@ const login = async (npm, { creds, ...opts }) => {
const authType = npm.config.get('auth-type')
let res
if (authType === 'web') {
- res = await profile.loginWeb((url, emitter) => {
- openUrlPrompt(
- npm,
- url,
- 'Login at',
- 'Press ENTER to open in the browser...',
- emitter
- )
- }, opts)
- } else {
+ try {
+ res = await profile.loginWeb((url, emitter) => {
+ openUrlPrompt(
+ npm,
+ url,
+ 'Login at',
+ 'Press ENTER to open in the browser...',
+ emitter
+ )
+ }, opts)
+ } catch (err) {
+ if (err.code === 'ENYI') {
+ log.verbose('web login not supported, trying couch')
+ } else {
+ throw err
+ }
+ }
+ }
+
+ // auth type !== web or ENYI error w/ web login
+ if (!res) {
const username = await read.username('Username:', creds.username)
const password = await read.password('Password:', creds.password)
res = await otplease(npm, opts, (reqOpts) =>
diff --git a/deps/npm/lib/utils/config/definitions.js b/deps/npm/lib/utils/config/definitions.js
index 9ddbafd46f7bc8..4b9eb1f64cbbd1 100644
--- a/deps/npm/lib/utils/config/definitions.js
+++ b/deps/npm/lib/utils/config/definitions.js
@@ -1078,7 +1078,7 @@ define('init.version', {
})
define('install-links', {
- default: true,
+ default: false,
type: Boolean,
description: `
When set file: protocol dependencies will be packed and installed as
@@ -1090,14 +1090,14 @@ define('install-links', {
define('install-strategy', {
default: 'hoisted',
- type: ['hoisted', 'nested', 'shallow'],
+ type: ['hoisted', 'nested', 'shallow', 'linked'],
description: `
Sets the strategy for installing packages in node_modules.
hoisted (default): Install non-duplicated in top-level, and duplicated as
necessary within directory structure.
nested: (formerly --legacy-bundling) install in place, no hoisting.
shallow (formerly --global-style) only install direct deps at top-level.
- linked: (coming soon) install in node_modules/.store, link in place,
+ linked: (experimental) install in node_modules/.store, link in place,
unhoisted.
`,
flatten,
@@ -1620,6 +1620,15 @@ define('progress', {
},
})
+define('provenance', {
+ default: false,
+ type: Boolean,
+ description: `
+ Indicates that a provenance statement should be generated.
+ `,
+ flatten,
+})
+
define('proxy', {
default: null,
type: [null, false, url], // allow proxy to be disabled explicitly
diff --git a/deps/npm/lib/utils/read-user-info.js b/deps/npm/lib/utils/read-user-info.js
index 26d5b36d55b582..1cac8ee6d2668b 100644
--- a/deps/npm/lib/utils/read-user-info.js
+++ b/deps/npm/lib/utils/read-user-info.js
@@ -1,5 +1,4 @@
-const { promisify } = require('util')
-const readAsync = promisify(require('read'))
+const read = require('read')
const userValidate = require('npm-user-validate')
const log = require('./log-shim.js')
@@ -17,9 +16,9 @@ const passwordPrompt = 'npm password: '
const usernamePrompt = 'npm username: '
const emailPrompt = 'email (this IS public): '
-function read (opts) {
+function readWithProgress (opts) {
log.clearProgress()
- return readAsync(opts).finally(() => log.showProgress())
+ return read(opts).finally(() => log.showProgress())
}
function readOTP (msg = otpPrompt, otp, isRetry) {
@@ -27,7 +26,7 @@ function readOTP (msg = otpPrompt, otp, isRetry) {
return otp.replace(/\s+/g, '')
}
- return read({ prompt: msg, default: otp || '' })
+ return readWithProgress({ prompt: msg, default: otp || '' })
.then((rOtp) => readOTP(msg, rOtp, true))
}
@@ -36,7 +35,7 @@ function readPassword (msg = passwordPrompt, password, isRetry) {
return password
}
- return read({ prompt: msg, silent: true, default: password || '' })
+ return readWithProgress({ prompt: msg, silent: true, default: password || '' })
.then((rPassword) => readPassword(msg, rPassword, true))
}
@@ -50,7 +49,7 @@ function readUsername (msg = usernamePrompt, username, isRetry) {
}
}
- return read({ prompt: msg, default: username || '' })
+ return readWithProgress({ prompt: msg, default: username || '' })
.then((rUsername) => readUsername(msg, rUsername, true))
}
@@ -64,6 +63,6 @@ function readEmail (msg = emailPrompt, email, isRetry) {
}
}
- return read({ prompt: msg, default: email || '' })
+ return readWithProgress({ prompt: msg, default: email || '' })
.then((username) => readEmail(msg, username, true))
}
diff --git a/deps/npm/man/man1/npm-access.1 b/deps/npm/man/man1/npm-access.1
index 91381c80c2c909..732813c2b8642f 100644
--- a/deps/npm/man/man1/npm-access.1
+++ b/deps/npm/man/man1/npm-access.1
@@ -1,4 +1,4 @@
-.TH "NPM-ACCESS" "1" "January 2023" "" ""
+.TH "NPM-ACCESS" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-access\fR - Set access level on published packages
.SS "Synopsis"
@@ -55,7 +55,7 @@ You have been given read-write privileges for a package, either as a member of a
.RE 0
.P
-If you have two-factor authentication enabled then you'll be prompted to provide an otp token, or may use the \fB--otp=...\fR option to specify it on the command line.
+If you have two-factor authentication enabled then you'll be prompted to provide a second factor, or may use the \fB--otp=...\fR option to specify it on the command line.
.P
If your account is not paid, then attempts to publish scoped packages will fail with an HTTP 402 status code (logically enough), unless you use \fB--access=public\fR.
.P
diff --git a/deps/npm/man/man1/npm-adduser.1 b/deps/npm/man/man1/npm-adduser.1
index 4e9a687fb6752c..5a4ee97566cc08 100644
--- a/deps/npm/man/man1/npm-adduser.1
+++ b/deps/npm/man/man1/npm-adduser.1
@@ -1,4 +1,4 @@
-.TH "NPM-ADDUSER" "1" "January 2023" "" ""
+.TH "NPM-ADDUSER" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-adduser\fR - Add a registry user account
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-audit.1 b/deps/npm/man/man1/npm-audit.1
index f99fa41719b110..f0c8475937f7fa 100644
--- a/deps/npm/man/man1/npm-audit.1
+++ b/deps/npm/man/man1/npm-audit.1
@@ -1,4 +1,4 @@
-.TH "NPM-AUDIT" "1" "January 2023" "" ""
+.TH "NPM-AUDIT" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-audit\fR - Run a security audit
.SS "Synopsis"
@@ -403,7 +403,7 @@ This value is not exported to the environment for child processes.
.SS "\fBinstall-links\fR"
.RS 0
.IP \(bu 4
-Default: true
+Default: false
.IP \(bu 4
Type: Boolean
.RE 0
diff --git a/deps/npm/man/man1/npm-bugs.1 b/deps/npm/man/man1/npm-bugs.1
index 61d2835a912352..1f2ee1642dab02 100644
--- a/deps/npm/man/man1/npm-bugs.1
+++ b/deps/npm/man/man1/npm-bugs.1
@@ -1,4 +1,4 @@
-.TH "NPM-BUGS" "1" "January 2023" "" ""
+.TH "NPM-BUGS" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-bugs\fR - Report bugs for a package in a web browser
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-cache.1 b/deps/npm/man/man1/npm-cache.1
index 026f08d17d1ca2..5559b3be352bfb 100644
--- a/deps/npm/man/man1/npm-cache.1
+++ b/deps/npm/man/man1/npm-cache.1
@@ -1,4 +1,4 @@
-.TH "NPM-CACHE" "1" "January 2023" "" ""
+.TH "NPM-CACHE" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-cache\fR - Manipulates packages cache
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-ci.1 b/deps/npm/man/man1/npm-ci.1
index 9335c9aec45f8c..66b688f9717612 100644
--- a/deps/npm/man/man1/npm-ci.1
+++ b/deps/npm/man/man1/npm-ci.1
@@ -1,4 +1,4 @@
-.TH "NPM-CI" "1" "January 2023" "" ""
+.TH "NPM-CI" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-ci\fR - Clean install a project
.SS "Synopsis"
@@ -114,11 +114,11 @@ man pages are linked to \fB{prefix}/share/man\fR
.IP \(bu 4
Default: "hoisted"
.IP \(bu 4
-Type: "hoisted", "nested", or "shallow"
+Type: "hoisted", "nested", "shallow", or "linked"
.RE 0
.P
-Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (coming soon) install in node_modules/.store, link in place, unhoisted.
+Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (experimental) install in node_modules/.store, link in place, unhoisted.
.SS "\fBlegacy-bundling\fR"
.RS 0
.IP \(bu 4
@@ -314,7 +314,7 @@ This value is not exported to the environment for child processes.
.SS "\fBinstall-links\fR"
.RS 0
.IP \(bu 4
-Default: true
+Default: false
.IP \(bu 4
Type: Boolean
.RE 0
diff --git a/deps/npm/man/man1/npm-completion.1 b/deps/npm/man/man1/npm-completion.1
index c2f5d2a2506d23..b82899fd35db14 100644
--- a/deps/npm/man/man1/npm-completion.1
+++ b/deps/npm/man/man1/npm-completion.1
@@ -1,4 +1,4 @@
-.TH "NPM-COMPLETION" "1" "January 2023" "" ""
+.TH "NPM-COMPLETION" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-completion\fR - Tab Completion for npm
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-config.1 b/deps/npm/man/man1/npm-config.1
index 1f4e831af8d63b..0cf96143ff8e3b 100644
--- a/deps/npm/man/man1/npm-config.1
+++ b/deps/npm/man/man1/npm-config.1
@@ -1,4 +1,4 @@
-.TH "NPM-CONFIG" "1" "January 2023" "" ""
+.TH "NPM-CONFIG" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-config\fR - Manage the npm configuration files
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-dedupe.1 b/deps/npm/man/man1/npm-dedupe.1
index ff45a8d0fd0a43..302d63b3c2c107 100644
--- a/deps/npm/man/man1/npm-dedupe.1
+++ b/deps/npm/man/man1/npm-dedupe.1
@@ -1,4 +1,4 @@
-.TH "NPM-DEDUPE" "1" "January 2023" "" ""
+.TH "NPM-DEDUPE" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-dedupe\fR - Reduce duplication in the package tree
.SS "Synopsis"
@@ -70,11 +70,11 @@ Note: \fBnpm dedupe\fR will never update the semver values of direct dependencie
.IP \(bu 4
Default: "hoisted"
.IP \(bu 4
-Type: "hoisted", "nested", or "shallow"
+Type: "hoisted", "nested", "shallow", or "linked"
.RE 0
.P
-Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (coming soon) install in node_modules/.store, link in place, unhoisted.
+Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (experimental) install in node_modules/.store, link in place, unhoisted.
.SS "\fBlegacy-bundling\fR"
.RS 0
.IP \(bu 4
@@ -258,7 +258,7 @@ This value is not exported to the environment for child processes.
.SS "\fBinstall-links\fR"
.RS 0
.IP \(bu 4
-Default: true
+Default: false
.IP \(bu 4
Type: Boolean
.RE 0
diff --git a/deps/npm/man/man1/npm-deprecate.1 b/deps/npm/man/man1/npm-deprecate.1
index ad4f34b53c263e..d525e703e841d9 100644
--- a/deps/npm/man/man1/npm-deprecate.1
+++ b/deps/npm/man/man1/npm-deprecate.1
@@ -1,4 +1,4 @@
-.TH "NPM-DEPRECATE" "1" "January 2023" "" ""
+.TH "NPM-DEPRECATE" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-deprecate\fR - Deprecate a version of a package
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-diff.1 b/deps/npm/man/man1/npm-diff.1
index 410548534a6e0d..5d2c240f839f95 100644
--- a/deps/npm/man/man1/npm-diff.1
+++ b/deps/npm/man/man1/npm-diff.1
@@ -1,4 +1,4 @@
-.TH "NPM-DIFF" "1" "January 2023" "" ""
+.TH "NPM-DIFF" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-diff\fR - The registry diff command
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-dist-tag.1 b/deps/npm/man/man1/npm-dist-tag.1
index f87010d6019de8..bd700f0fefe5ed 100644
--- a/deps/npm/man/man1/npm-dist-tag.1
+++ b/deps/npm/man/man1/npm-dist-tag.1
@@ -1,4 +1,4 @@
-.TH "NPM-DIST-TAG" "1" "January 2023" "" ""
+.TH "NPM-DIST-TAG" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-dist-tag\fR - Modify package distribution tags
.SS "Synopsis"
@@ -17,9 +17,9 @@ alias: dist-tags
Add, remove, and enumerate distribution tags on a package:
.RS 0
.IP \(bu 4
-add: Tags the specified version of the package with the specified tag, or the \fB\fB--tag\fR config\fR \fI\(la/using-npm/config#tag\(ra\fR if not specified. If you have two-factor authentication on auth-and-writes then you\[cq]ll need to include a one-time password on the command line with \fB--otp \fR, or at the OTP prompt.
+add: Tags the specified version of the package with the specified tag, or the \fB\fB--tag\fR config\fR \fI\(la/using-npm/config#tag\(ra\fR if not specified. If you have two-factor authentication on auth-and-writes then you\[cq]ll need to include a one-time password on the command line with \fB--otp \fR, or go through a second factor flow based on your \fBauthtype\fR.
.IP \(bu 4
-rm: Clear a tag that is no longer in use from the package. If you have two-factor authentication on auth-and-writes then you\[cq]ll need to include a one-time password on the command line with \fB--otp \fR, or at the OTP prompt.
+rm: Clear a tag that is no longer in use from the package. If you have two-factor authentication on auth-and-writes then you\[cq]ll need to include a one-time password on the command line with \fB--otp \fR, or go through a second factor flow based on your \fBauthtype\fR
.IP \(bu 4
ls: Show all of the dist-tags for a package, defaulting to the package in the current prefix. This is the default action if none is specified.
.RE 0
diff --git a/deps/npm/man/man1/npm-docs.1 b/deps/npm/man/man1/npm-docs.1
index c3832ae6f52279..6ee27d9348b7b1 100644
--- a/deps/npm/man/man1/npm-docs.1
+++ b/deps/npm/man/man1/npm-docs.1
@@ -1,4 +1,4 @@
-.TH "NPM-DOCS" "1" "January 2023" "" ""
+.TH "NPM-DOCS" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-docs\fR - Open documentation for a package in a web browser
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-doctor.1 b/deps/npm/man/man1/npm-doctor.1
index a33e7acf628325..d1b5a4165452af 100644
--- a/deps/npm/man/man1/npm-doctor.1
+++ b/deps/npm/man/man1/npm-doctor.1
@@ -1,4 +1,4 @@
-.TH "NPM-DOCTOR" "1" "January 2023" "" ""
+.TH "NPM-DOCTOR" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-doctor\fR - Check your npm environment
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-edit.1 b/deps/npm/man/man1/npm-edit.1
index 4b933ac01c88a4..de723dfabe983b 100644
--- a/deps/npm/man/man1/npm-edit.1
+++ b/deps/npm/man/man1/npm-edit.1
@@ -1,4 +1,4 @@
-.TH "NPM-EDIT" "1" "January 2023" "" ""
+.TH "NPM-EDIT" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-edit\fR - Edit an installed package
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-exec.1 b/deps/npm/man/man1/npm-exec.1
index b8b6cc7e96187a..10d06ae263408f 100644
--- a/deps/npm/man/man1/npm-exec.1
+++ b/deps/npm/man/man1/npm-exec.1
@@ -1,4 +1,4 @@
-.TH "NPM-EXEC" "1" "January 2023" "" ""
+.TH "NPM-EXEC" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-exec\fR - Run a command from a local or remote npm package
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-explain.1 b/deps/npm/man/man1/npm-explain.1
index b1b5305d0d99e7..a8f5e0e2b63487 100644
--- a/deps/npm/man/man1/npm-explain.1
+++ b/deps/npm/man/man1/npm-explain.1
@@ -1,4 +1,4 @@
-.TH "NPM-EXPLAIN" "1" "January 2023" "" ""
+.TH "NPM-EXPLAIN" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-explain\fR - Explain installed packages
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-explore.1 b/deps/npm/man/man1/npm-explore.1
index de402db28c648e..82452e0cba6eaa 100644
--- a/deps/npm/man/man1/npm-explore.1
+++ b/deps/npm/man/man1/npm-explore.1
@@ -1,4 +1,4 @@
-.TH "NPM-EXPLORE" "1" "January 2023" "" ""
+.TH "NPM-EXPLORE" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-explore\fR - Browse an installed package
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-find-dupes.1 b/deps/npm/man/man1/npm-find-dupes.1
index 5ec681c48c39ea..4834f54df6abd3 100644
--- a/deps/npm/man/man1/npm-find-dupes.1
+++ b/deps/npm/man/man1/npm-find-dupes.1
@@ -1,4 +1,4 @@
-.TH "NPM-FIND-DUPES" "1" "January 2023" "" ""
+.TH "NPM-FIND-DUPES" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-find-dupes\fR - Find duplication in the package tree
.SS "Synopsis"
@@ -17,11 +17,11 @@ Runs \fBnpm dedupe\fR in \fB--dry-run\fR mode, making npm only output the duplic
.IP \(bu 4
Default: "hoisted"
.IP \(bu 4
-Type: "hoisted", "nested", or "shallow"
+Type: "hoisted", "nested", "shallow", or "linked"
.RE 0
.P
-Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (coming soon) install in node_modules/.store, link in place, unhoisted.
+Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (experimental) install in node_modules/.store, link in place, unhoisted.
.SS "\fBlegacy-bundling\fR"
.RS 0
.IP \(bu 4
@@ -193,7 +193,7 @@ This value is not exported to the environment for child processes.
.SS "\fBinstall-links\fR"
.RS 0
.IP \(bu 4
-Default: true
+Default: false
.IP \(bu 4
Type: Boolean
.RE 0
diff --git a/deps/npm/man/man1/npm-fund.1 b/deps/npm/man/man1/npm-fund.1
index 7c1fe64f033cd1..a0a31eebd445ac 100644
--- a/deps/npm/man/man1/npm-fund.1
+++ b/deps/npm/man/man1/npm-fund.1
@@ -1,4 +1,4 @@
-.TH "NPM-FUND" "1" "January 2023" "" ""
+.TH "NPM-FUND" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-fund\fR - Retrieve funding information
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-help-search.1 b/deps/npm/man/man1/npm-help-search.1
index 5d7a40e7ca3cec..846f507f51e4d7 100644
--- a/deps/npm/man/man1/npm-help-search.1
+++ b/deps/npm/man/man1/npm-help-search.1
@@ -1,4 +1,4 @@
-.TH "NPM-HELP-SEARCH" "1" "January 2023" "" ""
+.TH "NPM-HELP-SEARCH" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-help-search\fR - Search npm help documentation
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-help.1 b/deps/npm/man/man1/npm-help.1
index c0e77e0b5fc83d..cffb33d49f91ef 100644
--- a/deps/npm/man/man1/npm-help.1
+++ b/deps/npm/man/man1/npm-help.1
@@ -1,4 +1,4 @@
-.TH "NPM-HELP" "1" "January 2023" "" ""
+.TH "NPM-HELP" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-help\fR - Get help on npm
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-hook.1 b/deps/npm/man/man1/npm-hook.1
index 8ba80f625cebbb..0cd0935d9498a9 100644
--- a/deps/npm/man/man1/npm-hook.1
+++ b/deps/npm/man/man1/npm-hook.1
@@ -1,4 +1,4 @@
-.TH "NPM-HOOK" "1" "January 2023" "" ""
+.TH "NPM-HOOK" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-hook\fR - Manage registry hooks
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-init.1 b/deps/npm/man/man1/npm-init.1
index f5fc5fbc359195..763e2914fa1e67 100644
--- a/deps/npm/man/man1/npm-init.1
+++ b/deps/npm/man/man1/npm-init.1
@@ -1,4 +1,4 @@
-.TH "NPM-INIT" "1" "January 2023" "" ""
+.TH "NPM-INIT" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-init\fR - Create a package.json file
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-install-ci-test.1 b/deps/npm/man/man1/npm-install-ci-test.1
index 96b0bf5acc4c55..fbe641322dac4b 100644
--- a/deps/npm/man/man1/npm-install-ci-test.1
+++ b/deps/npm/man/man1/npm-install-ci-test.1
@@ -1,4 +1,4 @@
-.TH "NPM-INSTALL-CI-TEST" "1" "January 2023" "" ""
+.TH "NPM-INSTALL-CI-TEST" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-install-ci-test\fR - Install a project with a clean slate and run tests
.SS "Synopsis"
@@ -62,11 +62,11 @@ man pages are linked to \fB{prefix}/share/man\fR
.IP \(bu 4
Default: "hoisted"
.IP \(bu 4
-Type: "hoisted", "nested", or "shallow"
+Type: "hoisted", "nested", "shallow", or "linked"
.RE 0
.P
-Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (coming soon) install in node_modules/.store, link in place, unhoisted.
+Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (experimental) install in node_modules/.store, link in place, unhoisted.
.SS "\fBlegacy-bundling\fR"
.RS 0
.IP \(bu 4
@@ -262,7 +262,7 @@ This value is not exported to the environment for child processes.
.SS "\fBinstall-links\fR"
.RS 0
.IP \(bu 4
-Default: true
+Default: false
.IP \(bu 4
Type: Boolean
.RE 0
diff --git a/deps/npm/man/man1/npm-install-test.1 b/deps/npm/man/man1/npm-install-test.1
index 5b5b6823cf3ca4..528ec9b75adf02 100644
--- a/deps/npm/man/man1/npm-install-test.1
+++ b/deps/npm/man/man1/npm-install-test.1
@@ -1,4 +1,4 @@
-.TH "NPM-INSTALL-TEST" "1" "January 2023" "" ""
+.TH "NPM-INSTALL-TEST" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-install-test\fR - Install package(s) and run tests
.SS "Synopsis"
@@ -62,11 +62,11 @@ man pages are linked to \fB{prefix}/share/man\fR
.IP \(bu 4
Default: "hoisted"
.IP \(bu 4
-Type: "hoisted", "nested", or "shallow"
+Type: "hoisted", "nested", "shallow", or "linked"
.RE 0
.P
-Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (coming soon) install in node_modules/.store, link in place, unhoisted.
+Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (experimental) install in node_modules/.store, link in place, unhoisted.
.SS "\fBlegacy-bundling\fR"
.RS 0
.IP \(bu 4
@@ -262,7 +262,7 @@ This value is not exported to the environment for child processes.
.SS "\fBinstall-links\fR"
.RS 0
.IP \(bu 4
-Default: true
+Default: false
.IP \(bu 4
Type: Boolean
.RE 0
diff --git a/deps/npm/man/man1/npm-install.1 b/deps/npm/man/man1/npm-install.1
index 55e5abe3f888a9..4eb982a825f3e3 100644
--- a/deps/npm/man/man1/npm-install.1
+++ b/deps/npm/man/man1/npm-install.1
@@ -1,4 +1,4 @@
-.TH "NPM-INSTALL" "1" "January 2023" "" ""
+.TH "NPM-INSTALL" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-install\fR - Install a package
.SS "Synopsis"
@@ -424,11 +424,11 @@ man pages are linked to \fB{prefix}/share/man\fR
.IP \(bu 4
Default: "hoisted"
.IP \(bu 4
-Type: "hoisted", "nested", or "shallow"
+Type: "hoisted", "nested", "shallow", or "linked"
.RE 0
.P
-Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (coming soon) install in node_modules/.store, link in place, unhoisted.
+Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (experimental) install in node_modules/.store, link in place, unhoisted.
.SS "\fBlegacy-bundling\fR"
.RS 0
.IP \(bu 4
@@ -624,7 +624,7 @@ This value is not exported to the environment for child processes.
.SS "\fBinstall-links\fR"
.RS 0
.IP \(bu 4
-Default: true
+Default: false
.IP \(bu 4
Type: Boolean
.RE 0
diff --git a/deps/npm/man/man1/npm-link.1 b/deps/npm/man/man1/npm-link.1
index 42ecdee92aa522..84c3e7fd7ccc3c 100644
--- a/deps/npm/man/man1/npm-link.1
+++ b/deps/npm/man/man1/npm-link.1
@@ -1,4 +1,4 @@
-.TH "NPM-LINK" "1" "January 2023" "" ""
+.TH "NPM-LINK" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-link\fR - Symlink a package folder
.SS "Synopsis"
@@ -129,11 +129,11 @@ man pages are linked to \fB{prefix}/share/man\fR
.IP \(bu 4
Default: "hoisted"
.IP \(bu 4
-Type: "hoisted", "nested", or "shallow"
+Type: "hoisted", "nested", "shallow", or "linked"
.RE 0
.P
-Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (coming soon) install in node_modules/.store, link in place, unhoisted.
+Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (experimental) install in node_modules/.store, link in place, unhoisted.
.SS "\fBlegacy-bundling\fR"
.RS 0
.IP \(bu 4
@@ -317,7 +317,7 @@ This value is not exported to the environment for child processes.
.SS "\fBinstall-links\fR"
.RS 0
.IP \(bu 4
-Default: true
+Default: false
.IP \(bu 4
Type: Boolean
.RE 0
diff --git a/deps/npm/man/man1/npm-login.1 b/deps/npm/man/man1/npm-login.1
index 3b99aefeecf75a..83535a93440fa7 100644
--- a/deps/npm/man/man1/npm-login.1
+++ b/deps/npm/man/man1/npm-login.1
@@ -1,4 +1,4 @@
-.TH "NPM-LOGIN" "1" "January 2023" "" ""
+.TH "NPM-LOGIN" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-login\fR - Login to a registry user account
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-logout.1 b/deps/npm/man/man1/npm-logout.1
index 3bb5aa671422ed..945b6ef8e02003 100644
--- a/deps/npm/man/man1/npm-logout.1
+++ b/deps/npm/man/man1/npm-logout.1
@@ -1,4 +1,4 @@
-.TH "NPM-LOGOUT" "1" "January 2023" "" ""
+.TH "NPM-LOGOUT" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-logout\fR - Log out of the registry
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-ls.1 b/deps/npm/man/man1/npm-ls.1
index 76a67908222357..3d12f7a42dfb47 100644
--- a/deps/npm/man/man1/npm-ls.1
+++ b/deps/npm/man/man1/npm-ls.1
@@ -1,4 +1,4 @@
-.TH "NPM-LS" "1" "January 2023" "" ""
+.TH "NPM-LS" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-ls\fR - List installed packages
.SS "Synopsis"
@@ -20,7 +20,7 @@ Positional arguments are \fBname@version-range\fR identifiers, which will limit
.P
.RS 2
.nf
-npm@9.3.1 /path/to/npm
+npm@9.5.0 /path/to/npm
└─┬ init-package-json@0.0.4
└── promzard@0.1.5
.fi
@@ -235,7 +235,7 @@ This value is not exported to the environment for child processes.
.SS "\fBinstall-links\fR"
.RS 0
.IP \(bu 4
-Default: true
+Default: false
.IP \(bu 4
Type: Boolean
.RE 0
diff --git a/deps/npm/man/man1/npm-org.1 b/deps/npm/man/man1/npm-org.1
index ba4ee6c72a5ebc..2e952687b307ac 100644
--- a/deps/npm/man/man1/npm-org.1
+++ b/deps/npm/man/man1/npm-org.1
@@ -1,4 +1,4 @@
-.TH "NPM-ORG" "1" "January 2023" "" ""
+.TH "NPM-ORG" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-org\fR - Manage orgs
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-outdated.1 b/deps/npm/man/man1/npm-outdated.1
index 0cd7762886c959..683321321f6876 100644
--- a/deps/npm/man/man1/npm-outdated.1
+++ b/deps/npm/man/man1/npm-outdated.1
@@ -1,4 +1,4 @@
-.TH "NPM-OUTDATED" "1" "January 2023" "" ""
+.TH "NPM-OUTDATED" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-outdated\fR - Check for outdated packages
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-owner.1 b/deps/npm/man/man1/npm-owner.1
index f2015740b02cd9..bf73df68cdf48b 100644
--- a/deps/npm/man/man1/npm-owner.1
+++ b/deps/npm/man/man1/npm-owner.1
@@ -1,4 +1,4 @@
-.TH "NPM-OWNER" "1" "January 2023" "" ""
+.TH "NPM-OWNER" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-owner\fR - Manage package owners
.SS "Synopsis"
@@ -27,7 +27,7 @@ rm: Remove a user from the package owner list. This immediately revokes their pr
.P
Note that there is only one level of access. Either you can modify a package, or you can't. Future versions may contain more fine-grained access levels, but that is not implemented at this time.
.P
-If you have two-factor authentication enabled with \fBauth-and-writes\fR (see npm help npm-profile) then you'll need to include an otp on the command line when changing ownership with \fB--otp\fR.
+If you have two-factor authentication enabled with \fBauth-and-writes\fR (see npm help npm-profile) then you'll need to go through a second factor flow when changing ownership or include an otp on the command line with \fB--otp\fR.
.SS "Configuration"
.SS "\fBregistry\fR"
.RS 0
diff --git a/deps/npm/man/man1/npm-pack.1 b/deps/npm/man/man1/npm-pack.1
index 934bc9cf4ec176..e6b19f2eb061c6 100644
--- a/deps/npm/man/man1/npm-pack.1
+++ b/deps/npm/man/man1/npm-pack.1
@@ -1,4 +1,4 @@
-.TH "NPM-PACK" "1" "January 2023" "" ""
+.TH "NPM-PACK" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-pack\fR - Create a tarball from a package
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-ping.1 b/deps/npm/man/man1/npm-ping.1
index 59c5191676dba1..4ae24a8cd6eb46 100644
--- a/deps/npm/man/man1/npm-ping.1
+++ b/deps/npm/man/man1/npm-ping.1
@@ -1,4 +1,4 @@
-.TH "NPM-PING" "1" "January 2023" "" ""
+.TH "NPM-PING" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-ping\fR - Ping npm registry
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-pkg.1 b/deps/npm/man/man1/npm-pkg.1
index 4f3437fb2fedb1..d67f473563af1a 100644
--- a/deps/npm/man/man1/npm-pkg.1
+++ b/deps/npm/man/man1/npm-pkg.1
@@ -1,4 +1,4 @@
-.TH "NPM-PKG" "1" "January 2023" "" ""
+.TH "NPM-PKG" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-pkg\fR - Manages your package.json
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-prefix.1 b/deps/npm/man/man1/npm-prefix.1
index c51646626a4982..a14aec7c6b3043 100644
--- a/deps/npm/man/man1/npm-prefix.1
+++ b/deps/npm/man/man1/npm-prefix.1
@@ -1,4 +1,4 @@
-.TH "NPM-PREFIX" "1" "January 2023" "" ""
+.TH "NPM-PREFIX" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-prefix\fR - Display prefix
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-profile.1 b/deps/npm/man/man1/npm-profile.1
index cd5cba95150b22..40bc3df8e842a7 100644
--- a/deps/npm/man/man1/npm-profile.1
+++ b/deps/npm/man/man1/npm-profile.1
@@ -1,4 +1,4 @@
-.TH "NPM-PROFILE" "1" "January 2023" "" ""
+.TH "NPM-PROFILE" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-profile\fR - Change settings on your registry profile
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-prune.1 b/deps/npm/man/man1/npm-prune.1
index 928ccd5d806082..86546ac112f1cd 100644
--- a/deps/npm/man/man1/npm-prune.1
+++ b/deps/npm/man/man1/npm-prune.1
@@ -1,4 +1,4 @@
-.TH "NPM-PRUNE" "1" "January 2023" "" ""
+.TH "NPM-PRUNE" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-prune\fR - Remove extraneous packages
.SS "Synopsis"
@@ -152,7 +152,7 @@ This value is not exported to the environment for child processes.
.SS "\fBinstall-links\fR"
.RS 0
.IP \(bu 4
-Default: true
+Default: false
.IP \(bu 4
Type: Boolean
.RE 0
diff --git a/deps/npm/man/man1/npm-publish.1 b/deps/npm/man/man1/npm-publish.1
index d75c7b39eb60aa..97b912987d0bd0 100644
--- a/deps/npm/man/man1/npm-publish.1
+++ b/deps/npm/man/man1/npm-publish.1
@@ -1,4 +1,4 @@
-.TH "NPM-PUBLISH" "1" "January 2023" "" ""
+.TH "NPM-PUBLISH" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-publish\fR - Publish a package
.SS "Synopsis"
@@ -172,6 +172,16 @@ Include the workspace root when workspaces are enabled for a command.
When false, specifying individual workspaces via the \fBworkspace\fR config, or all workspaces via the \fBworkspaces\fR flag, will cause npm to operate only on the specified workspaces, and not on the root project.
.P
This value is not exported to the environment for child processes.
+.SS "\fBprovenance\fR"
+.RS 0
+.IP \(bu 4
+Default: false
+.IP \(bu 4
+Type: Boolean
+.RE 0
+
+.P
+Indicates that a provenance statement should be generated.
.SS "See Also"
.RS 0
.IP \(bu 4
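
Note (illustrative, not part of the patch): the provenance option documented above reappears in config.7 below. A hedged sketch of the programmatic equivalent, assuming libnpmpublish forwards a provenance option alongside its other publish options; the manifest, tarball, and token values are placeholders:

const { publish } = require('libnpmpublish')

async function publishWithProvenance (manifest, tarball, token) {
  // provenance: true asks the client to generate and attach a provenance
  // statement, the same intent as running `npm publish --provenance`
  await publish(manifest, tarball, {
    access: 'public',
    provenance: true,
    '//registry.npmjs.org/:_authToken': token,
  })
}
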
diff --git a/deps/npm/man/man1/npm-query.1 b/deps/npm/man/man1/npm-query.1
index 0e6d444e39948b..24ba7853adba8f 100644
--- a/deps/npm/man/man1/npm-query.1
+++ b/deps/npm/man/man1/npm-query.1
@@ -1,4 +1,4 @@
-.TH "NPM-QUERY" "1" "January 2023" "" ""
+.TH "NPM-QUERY" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-query\fR - Dependency selector query
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-rebuild.1 b/deps/npm/man/man1/npm-rebuild.1
index ce56c880b634af..1c00566dd1701a 100644
--- a/deps/npm/man/man1/npm-rebuild.1
+++ b/deps/npm/man/man1/npm-rebuild.1
@@ -1,4 +1,4 @@
-.TH "NPM-REBUILD" "1" "January 2023" "" ""
+.TH "NPM-REBUILD" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-rebuild\fR - Rebuild a package
.SS "Synopsis"
@@ -132,7 +132,7 @@ This value is not exported to the environment for child processes.
.SS "\fBinstall-links\fR"
.RS 0
.IP \(bu 4
-Default: true
+Default: false
.IP \(bu 4
Type: Boolean
.RE 0
diff --git a/deps/npm/man/man1/npm-repo.1 b/deps/npm/man/man1/npm-repo.1
index 836ae8de258883..65bb96687f571e 100644
--- a/deps/npm/man/man1/npm-repo.1
+++ b/deps/npm/man/man1/npm-repo.1
@@ -1,4 +1,4 @@
-.TH "NPM-REPO" "1" "January 2023" "" ""
+.TH "NPM-REPO" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-repo\fR - Open package repository page in the browser
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-restart.1 b/deps/npm/man/man1/npm-restart.1
index 00b83e280b5cfd..9323e2841a261e 100644
--- a/deps/npm/man/man1/npm-restart.1
+++ b/deps/npm/man/man1/npm-restart.1
@@ -1,4 +1,4 @@
-.TH "NPM-RESTART" "1" "January 2023" "" ""
+.TH "NPM-RESTART" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-restart\fR - Restart a package
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-root.1 b/deps/npm/man/man1/npm-root.1
index 9333aa4545235b..bd943bb95a6d9c 100644
--- a/deps/npm/man/man1/npm-root.1
+++ b/deps/npm/man/man1/npm-root.1
@@ -1,4 +1,4 @@
-.TH "NPM-ROOT" "1" "January 2023" "" ""
+.TH "NPM-ROOT" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-root\fR - Display npm root
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-run-script.1 b/deps/npm/man/man1/npm-run-script.1
index 44a6a76a8b3844..afcf4d655bf218 100644
--- a/deps/npm/man/man1/npm-run-script.1
+++ b/deps/npm/man/man1/npm-run-script.1
@@ -1,4 +1,4 @@
-.TH "NPM-RUN-SCRIPT" "1" "January 2023" "" ""
+.TH "NPM-RUN-SCRIPT" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-run-script\fR - Run arbitrary package scripts
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-search.1 b/deps/npm/man/man1/npm-search.1
index 3fe6f69992ed6a..2241f81e2250e8 100644
--- a/deps/npm/man/man1/npm-search.1
+++ b/deps/npm/man/man1/npm-search.1
@@ -1,4 +1,4 @@
-.TH "NPM-SEARCH" "1" "January 2023" "" ""
+.TH "NPM-SEARCH" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-search\fR - Search for packages
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-shrinkwrap.1 b/deps/npm/man/man1/npm-shrinkwrap.1
index 875f0b58bad501..5192bc1222f3dc 100644
--- a/deps/npm/man/man1/npm-shrinkwrap.1
+++ b/deps/npm/man/man1/npm-shrinkwrap.1
@@ -1,4 +1,4 @@
-.TH "NPM-SHRINKWRAP" "1" "January 2023" "" ""
+.TH "NPM-SHRINKWRAP" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-shrinkwrap\fR - Lock down dependency versions for publication
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-star.1 b/deps/npm/man/man1/npm-star.1
index 627d11b3667df7..ecb2ffe84fa809 100644
--- a/deps/npm/man/man1/npm-star.1
+++ b/deps/npm/man/man1/npm-star.1
@@ -1,4 +1,4 @@
-.TH "NPM-STAR" "1" "January 2023" "" ""
+.TH "NPM-STAR" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-star\fR - Mark your favorite packages
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-stars.1 b/deps/npm/man/man1/npm-stars.1
index 6660429c892b48..49ffc90e03ce10 100644
--- a/deps/npm/man/man1/npm-stars.1
+++ b/deps/npm/man/man1/npm-stars.1
@@ -1,4 +1,4 @@
-.TH "NPM-STARS" "1" "January 2023" "" ""
+.TH "NPM-STARS" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-stars\fR - View packages marked as favorites
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-start.1 b/deps/npm/man/man1/npm-start.1
index abdc41d8c5194d..ecf3ad2838266c 100644
--- a/deps/npm/man/man1/npm-start.1
+++ b/deps/npm/man/man1/npm-start.1
@@ -1,4 +1,4 @@
-.TH "NPM-START" "1" "January 2023" "" ""
+.TH "NPM-START" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-start\fR - Start a package
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-stop.1 b/deps/npm/man/man1/npm-stop.1
index 9c9437aafe6543..4a5f604941e09d 100644
--- a/deps/npm/man/man1/npm-stop.1
+++ b/deps/npm/man/man1/npm-stop.1
@@ -1,4 +1,4 @@
-.TH "NPM-STOP" "1" "January 2023" "" ""
+.TH "NPM-STOP" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-stop\fR - Stop a package
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-team.1 b/deps/npm/man/man1/npm-team.1
index 917dbce667b9a3..30399634bc8bc8 100644
--- a/deps/npm/man/man1/npm-team.1
+++ b/deps/npm/man/man1/npm-team.1
@@ -1,4 +1,4 @@
-.TH "NPM-TEAM" "1" "January 2023" "" ""
+.TH "NPM-TEAM" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-team\fR - Manage organization teams and team memberships
.SS "Synopsis"
@@ -20,7 +20,7 @@ Used to manage teams in organizations, and change team memberships. Does not han
.P
Teams must always be fully qualified with the organization/scope they belong to when operating on them, separated by a colon (\fB:\fR). That is, if you have a \fBnewteam\fR team in an \fBorg\fR organization, you must always refer to that team as \fB@org:newteam\fR in these commands.
.P
-If you have two-factor authentication enabled in \fBauth-and-writes\fR mode, then you can provide a code from your authenticator with \fB\[lB]--otp \[rB]\fR. If you don't include this then you will be prompted.
+If you have two-factor authentication enabled in \fBauth-and-writes\fR mode, then you can provide a code from your authenticator with \fB\[lB]--otp \[rB]\fR. If you don't include this then you will be taken through a second factor flow based on your \fBauthtype\fR.
.RS 0
.IP \(bu 4
create / destroy: Create a new team, or destroy an existing one. Note: You cannot remove the \fBdevelopers\fR team, learn more.
diff --git a/deps/npm/man/man1/npm-test.1 b/deps/npm/man/man1/npm-test.1
index 5863931a62e8fa..ddd45f0757b9ff 100644
--- a/deps/npm/man/man1/npm-test.1
+++ b/deps/npm/man/man1/npm-test.1
@@ -1,4 +1,4 @@
-.TH "NPM-TEST" "1" "January 2023" "" ""
+.TH "NPM-TEST" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-test\fR - Test a package
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-token.1 b/deps/npm/man/man1/npm-token.1
index 1c7f976201283e..202f55bd69e68e 100644
--- a/deps/npm/man/man1/npm-token.1
+++ b/deps/npm/man/man1/npm-token.1
@@ -1,4 +1,4 @@
-.TH "NPM-TOKEN" "1" "January 2023" "" ""
+.TH "NPM-TOKEN" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-token\fR - Manage your authentication tokens
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-uninstall.1 b/deps/npm/man/man1/npm-uninstall.1
index 231b203955aa63..9c90a9863215c9 100644
--- a/deps/npm/man/man1/npm-uninstall.1
+++ b/deps/npm/man/man1/npm-uninstall.1
@@ -1,4 +1,4 @@
-.TH "NPM-UNINSTALL" "1" "January 2023" "" ""
+.TH "NPM-UNINSTALL" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-uninstall\fR - Remove a package
.SS "Synopsis"
@@ -118,7 +118,7 @@ This value is not exported to the environment for child processes.
.SS "\fBinstall-links\fR"
.RS 0
.IP \(bu 4
-Default: true
+Default: false
.IP \(bu 4
Type: Boolean
.RE 0
diff --git a/deps/npm/man/man1/npm-unpublish.1 b/deps/npm/man/man1/npm-unpublish.1
index 99024be559c3c3..8b250dd67bcc41 100644
--- a/deps/npm/man/man1/npm-unpublish.1
+++ b/deps/npm/man/man1/npm-unpublish.1
@@ -1,4 +1,4 @@
-.TH "NPM-UNPUBLISH" "1" "January 2023" "" ""
+.TH "NPM-UNPUBLISH" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-unpublish\fR - Remove a package from the registry
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-unstar.1 b/deps/npm/man/man1/npm-unstar.1
index 8263c1aad3f280..c76968c9050b5b 100644
--- a/deps/npm/man/man1/npm-unstar.1
+++ b/deps/npm/man/man1/npm-unstar.1
@@ -1,4 +1,4 @@
-.TH "NPM-UNSTAR" "1" "January 2023" "" ""
+.TH "NPM-UNSTAR" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-unstar\fR - Remove an item from your favorite packages
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-update.1 b/deps/npm/man/man1/npm-update.1
index 644a6787de55a5..7819631243fccd 100644
--- a/deps/npm/man/man1/npm-update.1
+++ b/deps/npm/man/man1/npm-update.1
@@ -1,4 +1,4 @@
-.TH "NPM-UPDATE" "1" "January 2023" "" ""
+.TH "NPM-UPDATE" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-update\fR - Update packages
.SS "Synopsis"
@@ -170,11 +170,11 @@ man pages are linked to \fB{prefix}/share/man\fR
.IP \(bu 4
Default: "hoisted"
.IP \(bu 4
-Type: "hoisted", "nested", or "shallow"
+Type: "hoisted", "nested", "shallow", or "linked"
.RE 0
.P
-Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (coming soon) install in node_modules/.store, link in place, unhoisted.
+Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (experimental) install in node_modules/.store, link in place, unhoisted.
.SS "\fBlegacy-bundling\fR"
.RS 0
.IP \(bu 4
@@ -370,7 +370,7 @@ This value is not exported to the environment for child processes.
.SS "\fBinstall-links\fR"
.RS 0
.IP \(bu 4
-Default: true
+Default: false
.IP \(bu 4
Type: Boolean
.RE 0
diff --git a/deps/npm/man/man1/npm-version.1 b/deps/npm/man/man1/npm-version.1
index 59248f1a4ef769..66bfdbdc598bdd 100644
--- a/deps/npm/man/man1/npm-version.1
+++ b/deps/npm/man/man1/npm-version.1
@@ -1,4 +1,4 @@
-.TH "NPM-VERSION" "1" "January 2023" "" ""
+.TH "NPM-VERSION" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-version\fR - Bump a package version
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-view.1 b/deps/npm/man/man1/npm-view.1
index 8de0a2d9f7df87..6075c163d0e8d2 100644
--- a/deps/npm/man/man1/npm-view.1
+++ b/deps/npm/man/man1/npm-view.1
@@ -1,4 +1,4 @@
-.TH "NPM-VIEW" "1" "January 2023" "" ""
+.TH "NPM-VIEW" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-view\fR - View registry info
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm-whoami.1 b/deps/npm/man/man1/npm-whoami.1
index a609e66bf52059..d96a3648776381 100644
--- a/deps/npm/man/man1/npm-whoami.1
+++ b/deps/npm/man/man1/npm-whoami.1
@@ -1,4 +1,4 @@
-.TH "NPM-WHOAMI" "1" "January 2023" "" ""
+.TH "NPM-WHOAMI" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm-whoami\fR - Display npm username
.SS "Synopsis"
diff --git a/deps/npm/man/man1/npm.1 b/deps/npm/man/man1/npm.1
index 1286027b16231c..3e1b5acf5c0bfc 100644
--- a/deps/npm/man/man1/npm.1
+++ b/deps/npm/man/man1/npm.1
@@ -1,4 +1,4 @@
-.TH "NPM" "1" "January 2023" "" ""
+.TH "NPM" "1" "February 2023" "" ""
.SH "NAME"
\fBnpm\fR - javascript package manager
.SS "Synopsis"
@@ -12,7 +12,7 @@ npm
Note: This command is unaware of workspaces.
.SS "Version"
.P
-9.3.1
+9.5.0
.SS "Description"
.P
npm is the package manager for the Node JavaScript platform. It puts modules in place so that node can find them, and manages dependency conflicts intelligently.
diff --git a/deps/npm/man/man1/npx.1 b/deps/npm/man/man1/npx.1
index ab3a83236bfcd1..079dffeecdbe2e 100644
--- a/deps/npm/man/man1/npx.1
+++ b/deps/npm/man/man1/npx.1
@@ -1,4 +1,4 @@
-.TH "NPX" "1" "January 2023" "" ""
+.TH "NPX" "1" "February 2023" "" ""
.SH "NAME"
\fBnpx\fR - Run a command from a local or remote npm package
.SS "Synopsis"
diff --git a/deps/npm/man/man5/folders.5 b/deps/npm/man/man5/folders.5
index 09870f292aee49..50bf4c8c7aec20 100644
--- a/deps/npm/man/man5/folders.5
+++ b/deps/npm/man/man5/folders.5
@@ -1,4 +1,4 @@
-.TH "FOLDERS" "5" "January 2023" "" ""
+.TH "FOLDERS" "5" "February 2023" "" ""
.SH "NAME"
\fBfolders\fR - Folder Structures Used by npm
.SS "Description"
diff --git a/deps/npm/man/man5/install.5 b/deps/npm/man/man5/install.5
index ecc6a9b72379ca..68af465f5ee21a 100644
--- a/deps/npm/man/man5/install.5
+++ b/deps/npm/man/man5/install.5
@@ -1,4 +1,4 @@
-.TH "INSTALL" "5" "January 2023" "" ""
+.TH "INSTALL" "5" "February 2023" "" ""
.SH "NAME"
\fBinstall\fR - Download and install node and npm
.SS "Description"
diff --git a/deps/npm/man/man5/npm-global.5 b/deps/npm/man/man5/npm-global.5
index 09870f292aee49..50bf4c8c7aec20 100644
--- a/deps/npm/man/man5/npm-global.5
+++ b/deps/npm/man/man5/npm-global.5
@@ -1,4 +1,4 @@
-.TH "FOLDERS" "5" "January 2023" "" ""
+.TH "FOLDERS" "5" "February 2023" "" ""
.SH "NAME"
\fBfolders\fR - Folder Structures Used by npm
.SS "Description"
diff --git a/deps/npm/man/man5/npm-json.5 b/deps/npm/man/man5/npm-json.5
index 6d7c946a06110d..ceafaf29104e66 100644
--- a/deps/npm/man/man5/npm-json.5
+++ b/deps/npm/man/man5/npm-json.5
@@ -1,4 +1,4 @@
-.TH "PACKAGE.JSON" "5" "January 2023" "" ""
+.TH "PACKAGE.JSON" "5" "February 2023" "" ""
.SH "NAME"
\fBpackage.json\fR - Specifics of npm's package.json handling
.SS "Description"
diff --git a/deps/npm/man/man5/npm-shrinkwrap-json.5 b/deps/npm/man/man5/npm-shrinkwrap-json.5
index 2e498029e420b6..12a5dc05887fda 100644
--- a/deps/npm/man/man5/npm-shrinkwrap-json.5
+++ b/deps/npm/man/man5/npm-shrinkwrap-json.5
@@ -1,4 +1,4 @@
-.TH "NPM-SHRINKWRAP.JSON" "5" "January 2023" "" ""
+.TH "NPM-SHRINKWRAP.JSON" "5" "February 2023" "" ""
.SH "NAME"
\fBnpm-shrinkwrap.json\fR - A publishable lockfile
.SS "Description"
diff --git a/deps/npm/man/man5/npmrc.5 b/deps/npm/man/man5/npmrc.5
index 0828d2ea7ba317..a568984bc80aaa 100644
--- a/deps/npm/man/man5/npmrc.5
+++ b/deps/npm/man/man5/npmrc.5
@@ -1,4 +1,4 @@
-.TH "NPMRC" "5" "January 2023" "" ""
+.TH "NPMRC" "5" "February 2023" "" ""
.SH "NAME"
\fBnpmrc\fR - The npm config files
.SS "Description"
diff --git a/deps/npm/man/man5/package-json.5 b/deps/npm/man/man5/package-json.5
index 6d7c946a06110d..ceafaf29104e66 100644
--- a/deps/npm/man/man5/package-json.5
+++ b/deps/npm/man/man5/package-json.5
@@ -1,4 +1,4 @@
-.TH "PACKAGE.JSON" "5" "January 2023" "" ""
+.TH "PACKAGE.JSON" "5" "February 2023" "" ""
.SH "NAME"
\fBpackage.json\fR - Specifics of npm's package.json handling
.SS "Description"
diff --git a/deps/npm/man/man5/package-lock-json.5 b/deps/npm/man/man5/package-lock-json.5
index d10a93bc31ab82..20be57b80db4fb 100644
--- a/deps/npm/man/man5/package-lock-json.5
+++ b/deps/npm/man/man5/package-lock-json.5
@@ -1,4 +1,4 @@
-.TH "PACKAGE-LOCK.JSON" "5" "January 2023" "" ""
+.TH "PACKAGE-LOCK.JSON" "5" "February 2023" "" ""
.SH "NAME"
\fBpackage-lock.json\fR - A manifestation of the manifest
.SS "Description"
diff --git a/deps/npm/man/man7/config.7 b/deps/npm/man/man7/config.7
index 6f8d20d6efc965..dc4261c2803a8c 100644
--- a/deps/npm/man/man7/config.7
+++ b/deps/npm/man/man7/config.7
@@ -1,4 +1,4 @@
-.TH "CONFIG" "7" "January 2023" "" ""
+.TH "CONFIG" "7" "February 2023" "" ""
.SH "NAME"
\fBconfig\fR - More than you probably want to know about npm configuration
.SS "Description"
@@ -815,7 +815,7 @@ The value that \fBnpm init\fR should use by default for the package version numb
.SS "\fBinstall-links\fR"
.RS 0
.IP \(bu 4
-Default: true
+Default: false
.IP \(bu 4
Type: Boolean
.RE 0
@@ -827,11 +827,11 @@ When set file: protocol dependencies will be packed and installed as regular dep
.IP \(bu 4
Default: "hoisted"
.IP \(bu 4
-Type: "hoisted", "nested", or "shallow"
+Type: "hoisted", "nested", "shallow", or "linked"
.RE 0
.P
-Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (coming soon) install in node_modules/.store, link in place, unhoisted.
+Sets the strategy for installing packages in node_modules. hoisted (default): Install non-duplicated in top-level, and duplicated as necessary within directory structure. nested: (formerly --legacy-bundling) install in place, no hoisting. shallow (formerly --global-style) only install direct deps at top-level. linked: (experimental) install in node_modules/.store, link in place, unhoisted.
.SS "\fBjson\fR"
.RS 0
.IP \(bu 4
@@ -1168,6 +1168,16 @@ Type: Boolean
When set to \fBtrue\fR, npm will display a progress bar during time intensive operations, if \fBprocess.stderr\fR is a TTY.
.P
Set to \fBfalse\fR to suppress the progress bar.
+.SS "\fBprovenance\fR"
+.RS 0
+.IP \(bu 4
+Default: false
+.IP \(bu 4
+Type: Boolean
+.RE 0
+
+.P
+Indicates that a provenance statement should be generated.
.SS "\fBproxy\fR"
.RS 0
.IP \(bu 4
diff --git a/deps/npm/man/man7/dependency-selectors.7 b/deps/npm/man/man7/dependency-selectors.7
index e3b0271a2601f7..f1670a3a20e885 100644
--- a/deps/npm/man/man7/dependency-selectors.7
+++ b/deps/npm/man/man7/dependency-selectors.7
@@ -1,4 +1,4 @@
-.TH "QUERYING" "7" "January 2023" "" ""
+.TH "QUERYING" "7" "February 2023" "" ""
.SH "NAME"
\fBQuerying\fR - Dependency Selector Syntax & Querying
.SS "Description"
diff --git a/deps/npm/man/man7/developers.7 b/deps/npm/man/man7/developers.7
index 8b5184dcb662dc..a15142ba205a4b 100644
--- a/deps/npm/man/man7/developers.7
+++ b/deps/npm/man/man7/developers.7
@@ -1,4 +1,4 @@
-.TH "DEVELOPERS" "7" "January 2023" "" ""
+.TH "DEVELOPERS" "7" "February 2023" "" ""
.SH "NAME"
\fBdevelopers\fR - Developer Guide
.SS "Description"
diff --git a/deps/npm/man/man7/logging.7 b/deps/npm/man/man7/logging.7
index f894a4b36fdfd8..d1cc3bdf00d8ca 100644
--- a/deps/npm/man/man7/logging.7
+++ b/deps/npm/man/man7/logging.7
@@ -1,4 +1,4 @@
-.TH "LOGGING" "7" "January 2023" "" ""
+.TH "LOGGING" "7" "February 2023" "" ""
.SH "NAME"
\fBLogging\fR - Why, What & How We Log
.SS "Description"
diff --git a/deps/npm/man/man7/orgs.7 b/deps/npm/man/man7/orgs.7
index 1aefdbb4145658..7aa5da2ede48f4 100644
--- a/deps/npm/man/man7/orgs.7
+++ b/deps/npm/man/man7/orgs.7
@@ -1,4 +1,4 @@
-.TH "ORGS" "7" "January 2023" "" ""
+.TH "ORGS" "7" "February 2023" "" ""
.SH "NAME"
\fBorgs\fR - Working with Teams & Orgs
.SS "Description"
diff --git a/deps/npm/man/man7/package-spec.7 b/deps/npm/man/man7/package-spec.7
index 9c376719ce9c53..4ae0095bd13b25 100644
--- a/deps/npm/man/man7/package-spec.7
+++ b/deps/npm/man/man7/package-spec.7
@@ -1,4 +1,4 @@
-.TH "PACKAGE-SPEC" "7" "January 2023" "" ""
+.TH "PACKAGE-SPEC" "7" "February 2023" "" ""
.SH "NAME"
\fBpackage-spec\fR - Package name specifier
.SS "Description"
diff --git a/deps/npm/man/man7/registry.7 b/deps/npm/man/man7/registry.7
index da418a424014e8..e67c8bb2b01158 100644
--- a/deps/npm/man/man7/registry.7
+++ b/deps/npm/man/man7/registry.7
@@ -1,4 +1,4 @@
-.TH "REGISTRY" "7" "January 2023" "" ""
+.TH "REGISTRY" "7" "February 2023" "" ""
.SH "NAME"
\fBregistry\fR - The JavaScript Package Registry
.SS "Description"
diff --git a/deps/npm/man/man7/removal.7 b/deps/npm/man/man7/removal.7
index aea330f2f6260e..82e39c8cb1acc3 100644
--- a/deps/npm/man/man7/removal.7
+++ b/deps/npm/man/man7/removal.7
@@ -1,4 +1,4 @@
-.TH "REMOVAL" "7" "January 2023" "" ""
+.TH "REMOVAL" "7" "February 2023" "" ""
.SH "NAME"
\fBremoval\fR - Cleaning the Slate
.SS "Synopsis"
diff --git a/deps/npm/man/man7/scope.7 b/deps/npm/man/man7/scope.7
index 9b892290e9b5cd..670b2f88c82116 100644
--- a/deps/npm/man/man7/scope.7
+++ b/deps/npm/man/man7/scope.7
@@ -1,4 +1,4 @@
-.TH "SCOPE" "7" "January 2023" "" ""
+.TH "SCOPE" "7" "February 2023" "" ""
.SH "NAME"
\fBscope\fR - Scoped packages
.SS "Description"
diff --git a/deps/npm/man/man7/scripts.7 b/deps/npm/man/man7/scripts.7
index 443d9430e76016..5f0ee7735d6a22 100644
--- a/deps/npm/man/man7/scripts.7
+++ b/deps/npm/man/man7/scripts.7
@@ -1,4 +1,4 @@
-.TH "SCRIPTS" "7" "January 2023" "" ""
+.TH "SCRIPTS" "7" "February 2023" "" ""
.SH "NAME"
\fBscripts\fR - How npm handles the "scripts" field
.SS "Description"
diff --git a/deps/npm/man/man7/workspaces.7 b/deps/npm/man/man7/workspaces.7
index e0ec139dd3429c..6e1e9bf59e08f2 100644
--- a/deps/npm/man/man7/workspaces.7
+++ b/deps/npm/man/man7/workspaces.7
@@ -1,4 +1,4 @@
-.TH "WORKSPACES" "7" "January 2023" "" ""
+.TH "WORKSPACES" "7" "February 2023" "" ""
.SH "NAME"
\fBworkspaces\fR - Working with workspaces
.SS "Description"
diff --git a/deps/npm/node_modules/@npmcli/arborist/lib/arborist/build-ideal-tree.js b/deps/npm/node_modules/@npmcli/arborist/lib/arborist/build-ideal-tree.js
index a9c4b4bc0bb6df..2ea66ac3364149 100644
--- a/deps/npm/node_modules/@npmcli/arborist/lib/arborist/build-ideal-tree.js
+++ b/deps/npm/node_modules/@npmcli/arborist/lib/arborist/build-ideal-tree.js
@@ -1232,6 +1232,7 @@ This is a one-time fix-up, please be patient...
const isWorkspace = this.idealTree.workspaces && this.idealTree.workspaces.has(spec.name)
// spec is a directory, link it unless installLinks is set or it's a workspace
+ // TODO post arborist refactor, will need to check for installStrategy=linked
if (spec.type === 'directory' && (isWorkspace || !installLinks)) {
return this[_linkFromSpec](name, spec, parent, edge)
}
diff --git a/deps/npm/node_modules/@npmcli/arborist/lib/arborist/index.js b/deps/npm/node_modules/@npmcli/arborist/lib/arborist/index.js
index 091e0b46574cc1..afcc3ec27150b6 100644
--- a/deps/npm/node_modules/@npmcli/arborist/lib/arborist/index.js
+++ b/deps/npm/node_modules/@npmcli/arborist/lib/arborist/index.js
@@ -42,6 +42,7 @@ const mixins = [
require('./load-virtual.js'),
require('./rebuild.js'),
require('./reify.js'),
+ require('./isolated-reifier.js'),
]
const _workspacesEnabled = Symbol.for('workspacesEnabled')
diff --git a/deps/npm/node_modules/@npmcli/arborist/lib/arborist/isolated-reifier.js b/deps/npm/node_modules/@npmcli/arborist/lib/arborist/isolated-reifier.js
new file mode 100644
index 00000000000000..f4f1bb8e443624
--- /dev/null
+++ b/deps/npm/node_modules/@npmcli/arborist/lib/arborist/isolated-reifier.js
@@ -0,0 +1,453 @@
+const _makeIdealGraph = Symbol('makeIdealGraph')
+const _createIsolatedTree = Symbol.for('createIsolatedTree')
+const _createBundledTree = Symbol('createBundledTree')
+const fs = require('fs')
+const pacote = require('pacote')
+const { join } = require('path')
+const { depth } = require('treeverse')
+const crypto = require('crypto')
+
+// cache complicated function results
+const memoize = (fn) => {
+ const memo = new Map()
+ return async function (arg) {
+ const key = arg
+ if (memo.has(key)) {
+ return memo.get(key)
+ }
+ const result = {}
+ memo.set(key, result)
+ await fn(result, arg)
+ return result
+ }
+}
+
+module.exports = cls => class IsolatedReifier extends cls {
+ /**
+ * Create an ideal graph.
+ *
+ * An implementation of npm RFC-0042
+ * https://github.com/npm/rfcs/blob/main/accepted/0042-isolated-mode.md
+ *
+ * This entire file should be considered technical debt that will be resolved
+ * with an Arborist refactor or rewrite. Embedded logic in Nodes and Links,
+ * and the incremental state of building trees and reifying contain too many
+ * assumptions to do a linked mode properly.
+ *
+ * Instead, this approach takes a tree built from build-ideal-tree, and
+ * returns a new tree-like structure without the embedded logic of Node and
+ * Link classes.
+ *
+ * Since the RFC requires leaving the package-lock in place, this approach
+ * temporarily replaces the tree state for a couple of steps of reifying.
+ *
+ **/
+ async [_makeIdealGraph] (options) {
+ /* Make sure that the ideal tree is built as the rest of
+ * the algorithm depends on it.
+ */
+ const bitOpt = {
+ ...options,
+ complete: false,
+ }
+ await this.buildIdealTree(bitOpt)
+ const idealTree = this.idealTree
+
+ this.rootNode = {}
+ const root = this.rootNode
+ this.counter = 0
+
+ // memoize to cache generating proxy Nodes
+ this.externalProxyMemo = memoize(this.externalProxy.bind(this))
+ this.workspaceProxyMemo = memoize(this.workspaceProxy.bind(this))
+
+ root.external = []
+ root.isProjectRoot = true
+ root.localLocation = idealTree.location
+ root.localPath = idealTree.path
+ root.workspaces = await Promise.all(
+ Array.from(idealTree.fsChildren.values(), this.workspaceProxyMemo))
+ const processed = new Set()
+ const queue = [idealTree, ...idealTree.fsChildren]
+ while (queue.length !== 0) {
+ const next = queue.pop()
+ if (processed.has(next.location)) {
+ continue
+ }
+ processed.add(next.location)
+ next.edgesOut.forEach(e => {
+ if (!e.to || (next.package.bundleDependencies || next.package.bundledDependencies || []).includes(e.to.name)) {
+ return
+ }
+ queue.push(e.to)
+ })
+ if (!next.isProjectRoot && !next.isWorkspace) {
+ root.external.push(await this.externalProxyMemo(next))
+ }
+ }
+
+ await this.assignCommonProperties(idealTree, root)
+
+ this.idealGraph = root
+ }
+
+ async workspaceProxy (result, node) {
+ result.localLocation = node.location
+ result.localPath = node.path
+ result.isWorkspace = true
+ result.resolved = node.resolved
+ await this.assignCommonProperties(node, result)
+ }
+
+ async externalProxy (result, node) {
+ await this.assignCommonProperties(node, result)
+ if (node.hasShrinkwrap) {
+ const dir = join(
+ node.root.path,
+ 'node_modules',
+ '.store',
+ `${node.name}@${node.version}`
+ )
+ fs.mkdirSync(dir, { recursive: true })
+ // TODO this approach feels wrong
+ // and shouldn't be necessary for shrinkwraps
+ await pacote.extract(node.resolved, dir, {
+ ...this.options,
+ resolved: node.resolved,
+ integrity: node.integrity,
+ })
+ const Arborist = this.constructor
+ const arb = new Arborist({ ...this.options, path: dir })
+ await arb[_makeIdealGraph]({ dev: false })
+ this.rootNode.external.push(...arb.idealGraph.external)
+ arb.idealGraph.external.forEach(e => {
+ e.root = this.rootNode
+ e.id = `${node.id}=>${e.id}`
+ })
+ result.localDependencies = []
+ result.externalDependencies = arb.idealGraph.externalDependencies
+ result.externalOptionalDependencies = arb.idealGraph.externalOptionalDependencies
+ result.dependencies = [
+ ...result.externalDependencies,
+ ...result.localDependencies,
+ ...result.externalOptionalDependencies,
+ ]
+ }
+ result.optional = node.optional
+ result.resolved = node.resolved
+ result.version = node.version
+ }
+
+ async assignCommonProperties (node, result) {
+ function validEdgesOut (node) {
+ return [...node.edgesOut.values()].filter(e => e.to && e.to.target && !(node.package.bundledDependencies || node.package.bundleDependencies || []).includes(e.to.name))
+ }
+ const edges = validEdgesOut(node)
+ const optionalDeps = edges.filter(e => e.optional).map(e => e.to.target)
+ const nonOptionalDeps = edges.filter(e => !e.optional).map(e => e.to.target)
+
+ result.localDependencies = await Promise.all(nonOptionalDeps.filter(n => n.isWorkspace).map(this.workspaceProxyMemo))
+ result.externalDependencies = await Promise.all(nonOptionalDeps.filter(n => !n.isWorkspace).map(this.externalProxyMemo))
+ result.externalOptionalDependencies = await Promise.all(optionalDeps.map(this.externalProxyMemo))
+ result.dependencies = [
+ ...result.externalDependencies,
+ ...result.localDependencies,
+ ...result.externalOptionalDependencies,
+ ]
+ result.root = this.rootNode
+ result.id = this.counter++
+ result.name = node.name
+ result.package = { ...node.package }
+ result.package.bundleDependencies = undefined
+ result.hasInstallScript = node.hasInstallScript
+ }
+
+ async [_createBundledTree] () {
+ // TODO: make sure that idealTree object exists
+ const idealTree = this.idealTree
+ // TODO: test workspaces having bundled deps
+ const queue = []
+
+ for (const [, edge] of idealTree.edgesOut) {
+ if (edge.to && (idealTree.package.bundleDependencies || idealTree.package.bundledDependencies || []).includes(edge.to.name)) {
+ queue.push({ from: idealTree, to: edge.to })
+ }
+ }
+ for (const child of idealTree.fsChildren) {
+ for (const [, edge] of child.edgesOut) {
+ if (edge.to && (child.package.bundleDependencies || child.package.bundledDependencies || []).includes(edge.to.name)) {
+ queue.push({ from: child, to: edge.to })
+ }
+ }
+ }
+
+ const processed = new Set()
+ const nodes = new Map()
+ const edges = []
+ while (queue.length !== 0) {
+ const nextEdge = queue.pop()
+ const key = `${nextEdge.from.location}=>${nextEdge.to.location}`
+ // should be impossible, unless bundled is duped
+ /* istanbul ignore next */
+ if (processed.has(key)) {
+ continue
+ }
+ processed.add(key)
+ const from = nextEdge.from
+ if (!from.isRoot && !from.isWorkspace) {
+ nodes.set(from.location, { location: from.location, resolved: from.resolved, name: from.name, optional: from.optional, pkg: { ...from.package, bundleDependencies: undefined } })
+ }
+ const to = nextEdge.to
+ nodes.set(to.location, { location: to.location, resolved: to.resolved, name: to.name, optional: to.optional, pkg: { ...to.package, bundleDependencies: undefined } })
+ edges.push({ from: from.isRoot ? 'root' : from.location, to: to.location })
+
+ to.edgesOut.forEach(e => {
+ // an edge out should always have a to
+ /* istanbul ignore else */
+ if (e.to) {
+ queue.push({ from: e.from, to: e.to })
+ }
+ })
+ }
+ return { edges, nodes }
+ }
+
+ async [_createIsolatedTree] (idealTree) {
+ await this[_makeIdealGraph](this.options)
+
+ const proxiedIdealTree = this.idealGraph
+
+ const bundledTree = await this[_createBundledTree]()
+
+ const treeHash = (startNode) => {
+ // generate short hash based on the dependency tree
+ // starting at this node
+ const deps = []
+ const branch = []
+ depth({
+ tree: startNode,
+ getChildren: node => node.dependencies,
+ filter: node => node,
+ visit: node => {
+ branch.push(`${node.name}@${node.version}`)
+ deps.push(`${branch.join('->')}::${node.resolved}`)
+ },
+ leave: () => {
+ branch.pop()
+ },
+ })
+ deps.sort()
+ return crypto.createHash('shake256', { outputLength: 16 })
+ .update(deps.join(','))
+ .digest('base64')
+ // Node v14 doesn't support base64url
+ .replace(/\+/g, '-')
+ .replace(/\//g, '_')
+ .replace(/=+$/m, '')
+ }
+
+ const getKey = (idealTreeNode) => {
+ return `${idealTreeNode.name}@${idealTreeNode.version}-${treeHash(idealTreeNode)}`
+ }
+
+ const root = {
+ fsChildren: [],
+ integrity: null,
+ inventory: new Map(),
+ isLink: false,
+ isRoot: true,
+ binPaths: [],
+ edgesIn: new Set(),
+ edgesOut: new Map(),
+ hasShrinkwrap: false,
+ parent: null,
+ // TODO: we should probably not reference this.idealTree
+ resolved: this.idealTree.resolved,
+ isTop: true,
+ path: proxiedIdealTree.root.localPath,
+ realpath: proxiedIdealTree.root.localPath,
+ package: proxiedIdealTree.root.package,
+ meta: { loadedFromDisk: false },
+ global: false,
+ isProjectRoot: true,
+ children: [],
+ }
+ // root.inventory.set('', t)
+ // root.meta = this.idealTree.meta
+ // TODO We should better mock the inventory object because it is used by audit-report.js ... maybe
+ root.inventory.query = () => {
+ return []
+ }
+ const processed = new Set()
+ proxiedIdealTree.workspaces.forEach(c => {
+ const workspace = {
+ edgesIn: new Set(),
+ edgesOut: new Map(),
+ children: [],
+ hasInstallScript: c.hasInstallScript,
+ binPaths: [],
+ package: c.package,
+ location: c.localLocation,
+ path: c.localPath,
+ realpath: c.localPath,
+ resolved: c.resolved,
+ }
+ root.fsChildren.push(workspace)
+ root.inventory.set(workspace.location, workspace)
+ })
+ const generateChild = (node, location, pkg, inStore) => {
+ const newChild = {
+ global: false,
+ globalTop: false,
+ isProjectRoot: false,
+ isTop: false,
+ location,
+ name: node.name,
+ optional: node.optional,
+ top: { path: proxiedIdealTree.root.localPath },
+ children: [],
+ edgesIn: new Set(),
+ edgesOut: new Map(),
+ binPaths: [],
+ fsChildren: [],
+ /* istanbul ignore next -- emulate Node */
+ getBundler () {
+ return null
+ },
+ hasShrinkwrap: false,
+ inDepBundle: false,
+ integrity: null,
+ isLink: false,
+ isRoot: false,
+ isInStore: inStore,
+ path: join(proxiedIdealTree.root.localPath, location),
+ realpath: join(proxiedIdealTree.root.localPath, location),
+ resolved: node.resolved,
+ version: pkg.version,
+ package: pkg,
+ }
+ newChild.target = newChild
+ root.children.push(newChild)
+ root.inventory.set(newChild.location, newChild)
+ }
+ proxiedIdealTree.external.forEach(c => {
+ const key = getKey(c)
+ if (processed.has(key)) {
+ return
+ }
+ processed.add(key)
+ const location = join('node_modules', '.store', key, 'node_modules', c.name)
+ generateChild(c, location, c.package, true)
+ })
+ bundledTree.nodes.forEach(node => {
+ generateChild(node, node.location, node.pkg, false)
+ })
+ bundledTree.edges.forEach(e => {
+ const from = e.from === 'root' ? root : root.inventory.get(e.from)
+ const to = root.inventory.get(e.to)
+ // Maybe optional should be propagated from the original edge
+ const edge = { optional: false, from, to }
+ from.edgesOut.set(to.name, edge)
+ to.edgesIn.add(edge)
+ })
+ const memo = new Set()
+
+ function processEdges (node, externalEdge) {
+ externalEdge = !!externalEdge
+ const key = getKey(node)
+ if (memo.has(key)) {
+ return
+ }
+ memo.add(key)
+
+ let from, nmFolder
+ if (externalEdge) {
+ const fromLocation = join('node_modules', '.store', key, 'node_modules', node.name)
+ from = root.children.find(c => c.location === fromLocation)
+ nmFolder = join('node_modules', '.store', key, 'node_modules')
+ } else {
+ from = node.isProjectRoot ? root : root.fsChildren.find(c => c.location === node.localLocation)
+ nmFolder = join(node.localLocation, 'node_modules')
+ }
+
+ const processDeps = (dep, optional, external) => {
+ optional = !!optional
+ external = !!external
+
+ const location = join(nmFolder, dep.name)
+ const binNames = dep.package.bin && Object.keys(dep.package.bin) || []
+ const toKey = getKey(dep)
+
+ let target
+ if (external) {
+ const toLocation = join('node_modules', '.store', toKey, 'node_modules', dep.name)
+ target = root.children.find(c => c.location === toLocation)
+ } else {
+ target = root.fsChildren.find(c => c.location === dep.localLocation)
+ }
+ // TODO: we should no-op if an edge has already been created with the same fromKey and toKey
+
+ binNames.forEach(bn => {
+ target.binPaths.push(join(from.realpath, 'node_modules', '.bin', bn))
+ })
+
+ const link = {
+ global: false,
+ globalTop: false,
+ isProjectRoot: false,
+ edgesIn: new Set(),
+ edgesOut: new Map(),
+ binPaths: [],
+ isTop: false,
+ optional,
+ location: location,
+ path: join(dep.root.localPath, nmFolder, dep.name),
+ realpath: target.path,
+ name: toKey,
+ resolved: dep.resolved,
+ top: { path: dep.root.localPath },
+ children: [],
+ fsChildren: [],
+ isLink: true,
+ isStoreLink: true,
+ isRoot: false,
+ package: { _id: 'abc', bundleDependencies: undefined, deprecated: undefined, bin: target.package.bin, scripts: dep.package.scripts },
+ target,
+ }
+ const newEdge1 = { optional, from, to: link }
+ from.edgesOut.set(dep.name, newEdge1)
+ link.edgesIn.add(newEdge1)
+ const newEdge2 = { optional: false, from: link, to: target }
+ link.edgesOut.set(dep.name, newEdge2)
+ target.edgesIn.add(newEdge2)
+ root.children.push(link)
+ }
+
+ for (const dep of node.localDependencies) {
+ processEdges(dep, false)
+ // nonOptional, local
+ processDeps(dep, false, false)
+ }
+ for (const dep of node.externalDependencies) {
+ processEdges(dep, true)
+ // nonOptional, external
+ processDeps(dep, false, true)
+ }
+ for (const dep of node.externalOptionalDependencies) {
+ processEdges(dep, true)
+ // optional, external
+ processDeps(dep, true, true)
+ }
+ }
+
+ processEdges(proxiedIdealTree, false)
+ for (const node of proxiedIdealTree.workspaces) {
+ processEdges(node, false)
+ }
+ root.children.forEach(c => c.parent = root)
+ root.children.forEach(c => c.root = root)
+ root.root = root
+ root.target = root
+ return root
+ }
+}
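
Note (illustrative, not part of the patch): the store layout created by isolated-reifier.js above keys every external package as name@version-<hash>, where the hash is a shake256 digest of the package's resolved dependency branches. A standalone sketch of that keying scheme, using a hand-rolled walk instead of treeverse and a toy node shape:

const crypto = require('crypto')

function treeHash (startNode) {
  const deps = []
  const walk = (node, branch) => {
    branch.push(`${node.name}@${node.version}`)
    deps.push(`${branch.join('->')}::${node.resolved}`)
    for (const child of node.dependencies || []) {
      walk(child, branch)
    }
    branch.pop()
  }
  walk(startNode, [])
  deps.sort()
  // same construction as the reifier: 16-byte shake256, encoded base64url-style
  return crypto.createHash('shake256', { outputLength: 16 })
    .update(deps.join(','))
    .digest('base64')
    .replace(/\+/g, '-')
    .replace(/\//g, '_')
    .replace(/=+$/m, '')
}

const storeKey = (node) => `${node.name}@${node.version}-${treeHash(node)}`

// toy usage
console.log(storeKey({
  name: 'a',
  version: '1.0.0',
  resolved: 'https://example.com/a-1.0.0.tgz',
  dependencies: [
    { name: 'b', version: '2.0.0', resolved: 'https://example.com/b-2.0.0.tgz', dependencies: [] },
  ],
}))
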
diff --git a/deps/npm/node_modules/@npmcli/arborist/lib/arborist/rebuild.js b/deps/npm/node_modules/@npmcli/arborist/lib/arborist/rebuild.js
index 6a675320d864bb..e8df69e328ce85 100644
--- a/deps/npm/node_modules/@npmcli/arborist/lib/arborist/rebuild.js
+++ b/deps/npm/node_modules/@npmcli/arborist/lib/arborist/rebuild.js
@@ -89,6 +89,7 @@ module.exports = cls => class Builder extends cls {
const {
depNodes,
linkNodes,
+ storeNodes,
} = this[_retrieveNodesByType](nodes)
// build regular deps
@@ -99,6 +100,10 @@ module.exports = cls => class Builder extends cls {
this[_resetQueues]()
await this[_build](linkNodes, { type: 'links' })
}
+ if (storeNodes.size) {
+ this[_resetQueues]()
+ await this[_build](storeNodes, { type: 'storelinks' })
+ }
process.emit('timeEnd', 'build')
}
@@ -130,9 +135,12 @@ module.exports = cls => class Builder extends cls {
[_retrieveNodesByType] (nodes) {
const depNodes = new Set()
const linkNodes = new Set()
+ const storeNodes = new Set()
for (const node of nodes) {
- if (node.isLink) {
+ if (node.isStoreLink) {
+ storeNodes.add(node)
+ } else if (node.isLink) {
linkNodes.add(node)
} else {
depNodes.add(node)
@@ -154,6 +162,7 @@ module.exports = cls => class Builder extends cls {
return {
depNodes,
linkNodes,
+ storeNodes,
}
}
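
Note (illustrative, not part of the patch): the rebuild change above adds a third bucket so store links get their own build pass after regular deps and ordinary links. A toy sketch of the same partition, with plain objects standing in for Arborist nodes:

function retrieveNodesByType (nodes) {
  const depNodes = new Set()
  const linkNodes = new Set()
  const storeNodes = new Set()
  for (const node of nodes) {
    if (node.isStoreLink) {
      storeNodes.add(node)
    } else if (node.isLink) {
      linkNodes.add(node)
    } else {
      depNodes.add(node)
    }
  }
  return { depNodes, linkNodes, storeNodes }
}

const { storeNodes } = retrieveNodesByType([
  { name: 'a' },
  { name: 'b', isLink: true },
  { name: 'c', isStoreLink: true },
])
console.log([...storeNodes].map(n => n.name)) // [ 'c' ]
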
diff --git a/deps/npm/node_modules/@npmcli/arborist/lib/arborist/reify.js b/deps/npm/node_modules/@npmcli/arborist/lib/arborist/reify.js
index e5ccec5c71d960..87993cca876d66 100644
--- a/deps/npm/node_modules/@npmcli/arborist/lib/arborist/reify.js
+++ b/deps/npm/node_modules/@npmcli/arborist/lib/arborist/reify.js
@@ -1,5 +1,4 @@
// mixin implementing the reify method
-
const onExit = require('../signal-handling.js')
const pacote = require('pacote')
const AuditReport = require('../audit-report.js')
@@ -10,8 +9,9 @@ const debug = require('../debug.js')
const walkUp = require('walk-up-path')
const log = require('proc-log')
const hgi = require('hosted-git-info')
+const rpj = require('read-package-json-fast')
-const { dirname, resolve, relative } = require('path')
+const { dirname, resolve, relative, join } = require('path')
const { depth: dfwalk } = require('treeverse')
const {
lstat,
@@ -82,7 +82,6 @@ const _rollbackRetireShallowNodes = Symbol.for('rollbackRetireShallowNodes')
const _rollbackCreateSparseTree = Symbol.for('rollbackCreateSparseTree')
const _rollbackMoveBackRetiredUnchanged = Symbol.for('rollbackMoveBackRetiredUnchanged')
const _saveIdealTree = Symbol.for('saveIdealTree')
-const _saveLockFile = Symbol('saveLockFile')
const _copyIdealToActual = Symbol('copyIdealToActual')
const _addOmitsToTrashList = Symbol('addOmitsToTrashList')
const _packageLockOnly = Symbol('packageLockOnly')
@@ -106,6 +105,8 @@ const _resolvedAdd = Symbol.for('resolvedAdd')
const _usePackageLock = Symbol.for('usePackageLock')
const _formatPackageLock = Symbol.for('formatPackageLock')
+const _createIsolatedTree = Symbol.for('createIsolatedTree')
+
module.exports = cls => class Reifier extends cls {
constructor (options) {
super(options)
@@ -138,6 +139,8 @@ module.exports = cls => class Reifier extends cls {
// public method
async reify (options = {}) {
+ const linked = (options.installStrategy || this.options.installStrategy) === 'linked'
+
if (this[_packageLockOnly] && this[_global]) {
const er = new Error('cannot generate lockfile for global packages')
er.code = 'ESHRINKWRAPGLOBAL'
@@ -154,8 +157,22 @@ module.exports = cls => class Reifier extends cls {
process.emit('time', 'reify')
await this[_validatePath]()
await this[_loadTrees](options)
+
+ const oldTree = this.idealTree
+ if (linked) {
+ // swap out the tree with the isolated tree
+ // this is currently technical debt which will be resolved in a refactor
+ // of Node/Link trees
+ log.warn('reify', 'The "linked" install strategy is EXPERIMENTAL and may contain bugs.')
+ this.idealTree = await this[_createIsolatedTree](this.idealTree)
+ }
await this[_diffTrees]()
await this[_reifyPackages]()
+ if (linked) {
+ // swap back in the idealTree
+ // so that the lockfile is preserved
+ this.idealTree = oldTree
+ }
await this[_saveIdealTree](options)
await this[_copyIdealToActual]()
// This is a very bad pattern and I can't wait to stop doing it
@@ -634,44 +651,40 @@ module.exports = cls => class Reifier extends cls {
}
async [_extractOrLink] (node) {
- // in normal cases, node.resolved should *always* be set by now.
- // however, it is possible when a lockfile is damaged, or very old,
- // or in some other race condition bugs in npm v6, that a previously
- // bundled dependency will have just a version, but no resolved value,
- // and no 'bundled: true' setting.
- // Do the best with what we have, or else remove it from the tree
- // entirely, since we can't possibly reify it.
- let res = null
- if (node.resolved) {
- const registryResolved = this[_registryResolved](node.resolved)
- if (registryResolved) {
- res = `${node.name}@${registryResolved}`
- }
- } else if (node.packageName && node.version) {
- res = `${node.packageName}@${node.version}`
- }
-
- // no idea what this thing is. remove it from the tree.
- if (!res) {
- const warning = 'invalid or damaged lockfile detected\n' +
- 'please re-try this operation once it completes\n' +
- 'so that the damage can be corrected, or perform\n' +
- 'a fresh install with no lockfile if the problem persists.'
- log.warn('reify', warning)
- log.verbose('reify', 'unrecognized node in tree', node.path)
- node.parent = null
- node.fsParent = null
- this[_addNodeToTrashList](node)
- return
- }
-
const nm = resolve(node.parent.path, 'node_modules')
await this[_validateNodeModules](nm)
- if (node.isLink) {
- await rm(node.path, { recursive: true, force: true })
- await this[_symlink](node)
- } else {
+ if (!node.isLink) {
+ // in normal cases, node.resolved should *always* be set by now.
+ // however, it is possible when a lockfile is damaged, or very old,
+ // or in some other race condition bugs in npm v6, that a previously
+ // bundled dependency will have just a version, but no resolved value,
+ // and no 'bundled: true' setting.
+ // Do the best with what we have, or else remove it from the tree
+ // entirely, since we can't possibly reify it.
+ let res = null
+ if (node.resolved) {
+ const registryResolved = this[_registryResolved](node.resolved)
+ if (registryResolved) {
+ res = `${node.name}@${registryResolved}`
+ }
+ } else if (node.package.name && node.version) {
+ res = `${node.package.name}@${node.version}`
+ }
+
+ // no idea what this thing is. remove it from the tree.
+ if (!res) {
+ const warning = 'invalid or damaged lockfile detected\n' +
+ 'please re-try this operation once it completes\n' +
+ 'so that the damage can be corrected, or perform\n' +
+ 'a fresh install with no lockfile if the problem persists.'
+ log.warn('reify', warning)
+ log.verbose('reify', 'unrecognized node in tree', node.path)
+ node.parent = null
+ node.fsParent = null
+ this[_addNodeToTrashList](node)
+ return
+ }
await debug(async () => {
const st = await lstat(node.path).catch(e => null)
if (st && !st.isDirectory()) {
@@ -688,7 +701,17 @@ module.exports = cls => class Reifier extends cls {
resolved: node.resolved,
integrity: node.integrity,
})
+ // store nodes don't use Node class so node.package doesn't get updated
+ if (node.isInStore) {
+ const pkg = await rpj(join(node.path, 'package.json'))
+ node.package.scripts = pkg.scripts
+ }
+ return
}
+
+ // node.isLink
+ await rm(node.path, { recursive: true, force: true })
+ await this[_symlink](node)
}
async [_symlink] (node) {
@@ -1380,64 +1403,53 @@ module.exports = cls => class Reifier extends cls {
}
}
- // preserve indentation, if possible
- const {
- [Symbol.for('indent')]: indent,
- } = this.idealTree.package
- const format = indent === undefined ? ' ' : indent
-
- const saveOpt = {
- format: (this[_formatPackageLock] && format) ? format
- : this[_formatPackageLock],
- }
-
- const promises = [this[_saveLockFile](saveOpt)]
-
- const updatePackageJson = async (tree) => {
- const pkgJson = await PackageJson.load(tree.path)
- .catch(() => new PackageJson(tree.path))
- const {
- dependencies = {},
- devDependencies = {},
- optionalDependencies = {},
- peerDependencies = {},
- // bundleDependencies is not required by PackageJson like the other fields here
- // PackageJson also doesn't omit an empty array for this field so defaulting this
- // to an empty array would add that field to every package.json file.
- bundleDependencies,
- } = tree.package
-
- pkgJson.update({
- dependencies,
- devDependencies,
- optionalDependencies,
- peerDependencies,
- bundleDependencies,
- })
- await pkgJson.save()
- }
-
if (save) {
for (const tree of updatedTrees) {
// refresh the edges so they have the correct specs
tree.package = tree.package
- promises.push(updatePackageJson(tree))
+ const pkgJson = await PackageJson.load(tree.path)
+ .catch(() => new PackageJson(tree.path))
+ const {
+ dependencies = {},
+ devDependencies = {},
+ optionalDependencies = {},
+ peerDependencies = {},
+ // bundleDependencies is not required by PackageJson like the other
+ // fields here PackageJson also doesn't omit an empty array for this
+ // field so defaulting this to an empty array would add that field to
+ // every package.json file.
+ bundleDependencies,
+ } = tree.package
+
+ pkgJson.update({
+ dependencies,
+ devDependencies,
+ optionalDependencies,
+ peerDependencies,
+ bundleDependencies,
+ })
+ await pkgJson.save()
}
}
- await Promise.all(promises)
- process.emit('timeEnd', 'reify:save')
- return true
- }
+ // before now edge specs could be changing, affecting the `requires` field
+ // in the package lock, so we hold off saving to the very last action
+ if (this[_usePackageLock]) {
+ // preserve indentation, if possible
+ let format = this.idealTree.package[Symbol.for('indent')]
+ if (format === undefined) {
+ format = ' '
+ }
- async [_saveLockFile] (saveOpt) {
- if (!this[_usePackageLock]) {
- return
+ // TODO this ignores options.save
+ await this.idealTree.meta.save({
+ format: (this[_formatPackageLock] && format) ? format
+ : this[_formatPackageLock],
+ })
}
- const { meta } = this.idealTree
-
- return meta.save(saveOpt)
+ process.emit('timeEnd', 'reify:save')
+ return true
}
async [_copyIdealToActual] () {
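
Note (illustrative, not part of the patch): the lockfile-save block above preserves the original indentation by reading the Symbol.for('indent') property that json-parse-even-better-errors attaches to parsed JSON. A small standalone sketch of that trick, with a hard-coded source string:

const parseJson = require('json-parse-even-better-errors')

const src = '{\n    "name": "demo",\n    "version": "1.0.0"\n}\n'
const pkg = parseJson(src)

// fall back to a default indent when none was detected, mirroring the
// `format === undefined` branch in the reify() change above
let format = pkg[Symbol.for('indent')]
if (format === undefined) {
  format = '  '
}
console.log(JSON.stringify(pkg, null, format))
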
diff --git a/deps/npm/node_modules/@npmcli/arborist/lib/link.js b/deps/npm/node_modules/@npmcli/arborist/lib/link.js
index ebdbc94285f1cc..197f96c5c2ddb9 100644
--- a/deps/npm/node_modules/@npmcli/arborist/lib/link.js
+++ b/deps/npm/node_modules/@npmcli/arborist/lib/link.js
@@ -8,7 +8,7 @@ const _delistFromMeta = Symbol.for('_delistFromMeta')
const _refreshLocation = Symbol.for('_refreshLocation')
class Link extends Node {
constructor (options) {
- const { root, realpath, target, parent, fsParent } = options
+ const { root, realpath, target, parent, fsParent, isStoreLink } = options
if (!realpath && !(target && target.path)) {
throw new TypeError('must provide realpath for Link node')
@@ -23,6 +23,8 @@ class Link extends Node {
: null),
})
+ this.isStoreLink = isStoreLink || false
+
if (target) {
this.target = target
} else if (this.realpath === this.root.path) {
diff --git a/deps/npm/node_modules/@npmcli/arborist/lib/node.js b/deps/npm/node_modules/@npmcli/arborist/lib/node.js
index b90a2acf8f8ae6..b21a3d8e3de0a3 100644
--- a/deps/npm/node_modules/@npmcli/arborist/lib/node.js
+++ b/deps/npm/node_modules/@npmcli/arborist/lib/node.js
@@ -91,6 +91,7 @@ class Node {
installLinks = false,
legacyPeerDeps = false,
linksIn,
+ isInStore = false,
hasShrinkwrap,
overrides,
loadOverrides = false,
@@ -113,6 +114,7 @@ class Node {
this[_workspaces] = null
this.errors = error ? [error] : []
+ this.isInStore = isInStore
// this will usually be null, except when modeling a
// package's dependencies in a virtual root.
diff --git a/deps/npm/node_modules/@npmcli/arborist/package.json b/deps/npm/node_modules/@npmcli/arborist/package.json
index a7e8132123fba0..f27b1458d8d59c 100644
--- a/deps/npm/node_modules/@npmcli/arborist/package.json
+++ b/deps/npm/node_modules/@npmcli/arborist/package.json
@@ -1,37 +1,37 @@
{
"name": "@npmcli/arborist",
- "version": "6.1.6",
+ "version": "6.2.2",
"description": "Manage node_modules trees",
"dependencies": {
"@isaacs/string-locale-compare": "^1.1.0",
"@npmcli/fs": "^3.1.0",
"@npmcli/installed-package-contents": "^2.0.0",
- "@npmcli/map-workspaces": "^3.0.0",
+ "@npmcli/map-workspaces": "^3.0.2",
"@npmcli/metavuln-calculator": "^5.0.0",
- "@npmcli/name-from-folder": "^1.0.1",
+ "@npmcli/name-from-folder": "^2.0.0",
"@npmcli/node-gyp": "^3.0.0",
"@npmcli/package-json": "^3.0.0",
"@npmcli/query": "^3.0.0",
"@npmcli/run-script": "^6.0.0",
"bin-links": "^4.0.1",
- "cacache": "^17.0.3",
+ "cacache": "^17.0.4",
"common-ancestor-path": "^1.0.1",
"hosted-git-info": "^6.1.1",
"json-parse-even-better-errors": "^3.0.0",
"json-stringify-nice": "^1.1.4",
- "minimatch": "^5.1.1",
+ "minimatch": "^6.1.6",
"nopt": "^7.0.0",
"npm-install-checks": "^6.0.0",
"npm-package-arg": "^10.1.0",
"npm-pick-manifest": "^8.0.1",
"npm-registry-fetch": "^14.0.3",
"npmlog": "^7.0.1",
- "pacote": "^15.0.7",
+ "pacote": "^15.0.8",
"parse-conflict-json": "^3.0.0",
"proc-log": "^3.0.0",
"promise-all-reject-late": "^1.0.0",
"promise-call-limit": "^1.0.1",
- "read-package-json-fast": "^3.0.1",
+ "read-package-json-fast": "^3.0.2",
"semver": "^7.3.7",
"ssri": "^10.0.1",
"treeverse": "^3.0.0",
@@ -39,26 +39,24 @@
},
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.11.0",
+ "@npmcli/template-oss": "4.11.4",
"benchmark": "^2.1.4",
"chalk": "^4.1.0",
"minify-registry-metadata": "^3.0.0",
- "nock": "^13.2.0",
- "tap": "^16.3.2",
+ "nock": "^13.3.0",
+ "tap": "^16.3.4",
+ "tar-stream": "^3.0.0",
"tcompare": "^5.0.6"
},
"scripts": {
"test": "tap",
"posttest": "node ../.. run lint",
"snap": "tap",
- "postsnap": "npm run lintfix",
"test-proxy": "ARBORIST_TEST_PROXY=1 tap --snapshot",
- "eslint": "eslint",
"lint": "eslint \"**/*.js\"",
"lintfix": "node ../.. run lint -- --fix",
"benchmark": "node scripts/benchmark.js",
"benchclean": "rm -rf scripts/benchmark/*/",
- "npmclilint": "npmcli-lint",
"postlint": "template-oss-check",
"template-oss-apply": "template-oss-apply --force"
},
@@ -100,7 +98,7 @@
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.11.0",
+ "version": "4.11.4",
"content": "../../scripts/template-oss/index.js"
}
}
diff --git a/deps/npm/node_modules/@npmcli/config/lib/index.js b/deps/npm/node_modules/@npmcli/config/lib/index.js
index 1ddf2678391959..6520e5616ff41c 100644
--- a/deps/npm/node_modules/@npmcli/config/lib/index.js
+++ b/deps/npm/node_modules/@npmcli/config/lib/index.js
@@ -482,8 +482,9 @@ class Config {
if (problem.action === 'delete') {
this.delete(problem.key, problem.where)
} else if (problem.action === 'rename') {
- const old = this.get(problem.from, problem.where)
- this.set(problem.to, old, problem.where)
+ const raw = this.data.get(problem.where).raw?.[problem.from]
+ const calculated = this.get(problem.from, problem.where)
+ this.set(problem.to, raw || calculated, problem.where)
this.delete(problem.from, problem.where)
}
}
diff --git a/deps/npm/node_modules/@npmcli/config/package.json b/deps/npm/node_modules/@npmcli/config/package.json
index 50d860c1c941e9..38c063e358beb1 100644
--- a/deps/npm/node_modules/@npmcli/config/package.json
+++ b/deps/npm/node_modules/@npmcli/config/package.json
@@ -1,6 +1,6 @@
{
"name": "@npmcli/config",
- "version": "6.1.1",
+ "version": "6.1.3",
"files": [
"bin/",
"lib/"
@@ -33,15 +33,15 @@
},
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.11.0",
- "tap": "^16.3.2"
+ "@npmcli/template-oss": "4.11.4",
+ "tap": "^16.3.4"
},
"dependencies": {
- "@npmcli/map-workspaces": "^3.0.0",
+ "@npmcli/map-workspaces": "^3.0.2",
"ini": "^3.0.0",
"nopt": "^7.0.0",
"proc-log": "^3.0.0",
- "read-package-json-fast": "^3.0.0",
+ "read-package-json-fast": "^3.0.2",
"semver": "^7.3.5",
"walk-up-path": "^1.0.0"
},
@@ -50,6 +50,6 @@
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.11.0"
+ "version": "4.11.4"
}
}
diff --git a/deps/npm/node_modules/@npmcli/map-workspaces/package.json b/deps/npm/node_modules/@npmcli/map-workspaces/package.json
index c8113cb25eb32b..3f5270360c5fc0 100644
--- a/deps/npm/node_modules/@npmcli/map-workspaces/package.json
+++ b/deps/npm/node_modules/@npmcli/map-workspaces/package.json
@@ -1,6 +1,6 @@
{
"name": "@npmcli/map-workspaces",
- "version": "3.0.0",
+ "version": "3.0.2",
"main": "lib/index.js",
"files": [
"bin/",
@@ -42,18 +42,18 @@
]
},
"devDependencies": {
- "@npmcli/eslint-config": "^3.0.1",
- "@npmcli/template-oss": "4.5.1",
+ "@npmcli/eslint-config": "^4.0.0",
+ "@npmcli/template-oss": "4.11.3",
"tap": "^16.0.1"
},
"dependencies": {
- "@npmcli/name-from-folder": "^1.0.1",
+ "@npmcli/name-from-folder": "^2.0.0",
"glob": "^8.0.1",
- "minimatch": "^5.0.1",
+ "minimatch": "^6.1.6",
"read-package-json-fast": "^3.0.0"
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.5.1"
+ "version": "4.11.3"
}
}
diff --git a/deps/npm/node_modules/node-gyp/node_modules/@npmcli/move-file/LICENSE.md b/deps/npm/node_modules/@npmcli/move-file/LICENSE.md
similarity index 100%
rename from deps/npm/node_modules/node-gyp/node_modules/@npmcli/move-file/LICENSE.md
rename to deps/npm/node_modules/@npmcli/move-file/LICENSE.md
diff --git a/deps/npm/node_modules/node-gyp/node_modules/@npmcli/move-file/lib/index.js b/deps/npm/node_modules/@npmcli/move-file/lib/index.js
similarity index 100%
rename from deps/npm/node_modules/node-gyp/node_modules/@npmcli/move-file/lib/index.js
rename to deps/npm/node_modules/@npmcli/move-file/lib/index.js
diff --git a/deps/npm/node_modules/node-gyp/node_modules/@npmcli/move-file/package.json b/deps/npm/node_modules/@npmcli/move-file/package.json
similarity index 100%
rename from deps/npm/node_modules/node-gyp/node_modules/@npmcli/move-file/package.json
rename to deps/npm/node_modules/@npmcli/move-file/package.json
diff --git a/deps/npm/node_modules/@npmcli/name-from-folder/index.js b/deps/npm/node_modules/@npmcli/name-from-folder/lib/index.js
similarity index 100%
rename from deps/npm/node_modules/@npmcli/name-from-folder/index.js
rename to deps/npm/node_modules/@npmcli/name-from-folder/lib/index.js
diff --git a/deps/npm/node_modules/@npmcli/name-from-folder/package.json b/deps/npm/node_modules/@npmcli/name-from-folder/package.json
index 9569b4e66e90c9..f0aa5b16dba1aa 100644
--- a/deps/npm/node_modules/@npmcli/name-from-folder/package.json
+++ b/deps/npm/node_modules/@npmcli/name-from-folder/package.json
@@ -1,27 +1,43 @@
{
"name": "@npmcli/name-from-folder",
- "version": "1.0.1",
+ "version": "2.0.0",
"files": [
- "index.js"
+ "bin/",
+ "lib/"
],
+ "main": "lib/index.js",
"description": "Get the package name from a folder path",
"repository": {
"type": "git",
- "url": "git+https://github.com/npm/name-from-folder"
+ "url": "https://github.com/npm/name-from-folder.git"
},
- "author": "Isaac Z. Schlueter (https://izs.me)",
+ "author": "GitHub Inc.",
"license": "ISC",
"scripts": {
"test": "tap",
"snap": "tap",
- "preversion": "npm test",
- "postversion": "npm publish",
- "prepublishOnly": "git push origin --follow-tags"
- },
- "tap": {
- "check-coverage": true
+ "lint": "eslint \"**/*.js\"",
+ "postlint": "template-oss-check",
+ "template-oss-apply": "template-oss-apply --force",
+ "lintfix": "npm run lint -- --fix",
+ "posttest": "npm run lint"
},
"devDependencies": {
- "tap": "^14.10.7"
+ "@npmcli/eslint-config": "^4.0.1",
+ "@npmcli/template-oss": "4.11.0",
+ "tap": "^16.3.2"
+ },
+ "engines": {
+ "node": "^14.17.0 || ^16.13.0 || >=18.0.0"
+ },
+ "templateOSS": {
+ "//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
+ "version": "4.11.0"
+ },
+ "tap": {
+ "nyc-arg": [
+ "--exclude",
+ "tap-snapshots/**"
+ ]
}
}
diff --git a/deps/npm/node_modules/@npmcli/promise-spawn/lib/index.js b/deps/npm/node_modules/@npmcli/promise-spawn/lib/index.js
index 1d422045d558c1..571ff6b9169c9b 100644
--- a/deps/npm/node_modules/@npmcli/promise-spawn/lib/index.js
+++ b/deps/npm/node_modules/@npmcli/promise-spawn/lib/index.js
@@ -131,7 +131,7 @@ const open = (_args, opts = {}, extra = {}) => {
let platform = process.platform
// process.platform === 'linux' may actually indicate WSL, if that's the case
// we want to treat things as win32 anyway so the host can open the argument
- if (platform === 'linux' && os.release().includes('Microsoft')) {
+ if (platform === 'linux' && os.release().toLowerCase().includes('microsoft')) {
platform = 'win32'
}
diff --git a/deps/npm/node_modules/@npmcli/promise-spawn/package.json b/deps/npm/node_modules/@npmcli/promise-spawn/package.json
index c21e84fe835997..2080d9f5be9f04 100644
--- a/deps/npm/node_modules/@npmcli/promise-spawn/package.json
+++ b/deps/npm/node_modules/@npmcli/promise-spawn/package.json
@@ -1,6 +1,6 @@
{
"name": "@npmcli/promise-spawn",
- "version": "6.0.1",
+ "version": "6.0.2",
"files": [
"bin/",
"lib/"
@@ -32,8 +32,8 @@
},
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.8.0",
- "minipass": "^3.1.1",
+ "@npmcli/template-oss": "4.11.0",
+ "minipass": "^4.0.0",
"spawk": "^1.7.1",
"tap": "^16.0.1"
},
@@ -42,7 +42,7 @@
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.8.0"
+ "version": "4.11.0"
},
"dependencies": {
"which": "^3.0.0"
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_duplex.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_duplex.js
deleted file mode 100644
index 105eebbd1c6af8..00000000000000
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_duplex.js
+++ /dev/null
@@ -1,3 +0,0 @@
-'use strict' // Keep this file as an alias for the full stream module.
-
-module.exports = require('./stream').Duplex
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_passthrough.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_passthrough.js
deleted file mode 100644
index 31358e6d129574..00000000000000
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_passthrough.js
+++ /dev/null
@@ -1,3 +0,0 @@
-'use strict' // Keep this file as an alias for the full stream module.
-
-module.exports = require('./stream').PassThrough
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_readable.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_readable.js
deleted file mode 100644
index abd53db4ca77d7..00000000000000
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_readable.js
+++ /dev/null
@@ -1,3 +0,0 @@
-'use strict' // Keep this file as an alias for the full stream module.
-
-module.exports = require('./stream').Readable
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_transform.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_transform.js
deleted file mode 100644
index 98ea338248e0a6..00000000000000
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_transform.js
+++ /dev/null
@@ -1,3 +0,0 @@
-'use strict' // Keep this file as an alias for the full stream module.
-
-module.exports = require('./stream').Transform
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_writable.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_writable.js
deleted file mode 100644
index 07204c42954708..00000000000000
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_writable.js
+++ /dev/null
@@ -1,3 +0,0 @@
-'use strict' // Keep this file as an alias for the full stream module.
-
-module.exports = require('./stream').Writable
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/buffer_list.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/buffer_list.js
deleted file mode 100644
index e22914f0edca0c..00000000000000
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/buffer_list.js
+++ /dev/null
@@ -1,180 +0,0 @@
-'use strict'
-
-const { StringPrototypeSlice, SymbolIterator, TypedArrayPrototypeSet, Uint8Array } = require('../../ours/primordials')
-
-const { Buffer } = require('buffer')
-
-const { inspect } = require('../../ours/util')
-
-module.exports = class BufferList {
- constructor() {
- this.head = null
- this.tail = null
- this.length = 0
- }
-
- push(v) {
- const entry = {
- data: v,
- next: null
- }
- if (this.length > 0) this.tail.next = entry
- else this.head = entry
- this.tail = entry
- ++this.length
- }
-
- unshift(v) {
- const entry = {
- data: v,
- next: this.head
- }
- if (this.length === 0) this.tail = entry
- this.head = entry
- ++this.length
- }
-
- shift() {
- if (this.length === 0) return
- const ret = this.head.data
- if (this.length === 1) this.head = this.tail = null
- else this.head = this.head.next
- --this.length
- return ret
- }
-
- clear() {
- this.head = this.tail = null
- this.length = 0
- }
-
- join(s) {
- if (this.length === 0) return ''
- let p = this.head
- let ret = '' + p.data
-
- while ((p = p.next) !== null) ret += s + p.data
-
- return ret
- }
-
- concat(n) {
- if (this.length === 0) return Buffer.alloc(0)
- const ret = Buffer.allocUnsafe(n >>> 0)
- let p = this.head
- let i = 0
-
- while (p) {
- TypedArrayPrototypeSet(ret, p.data, i)
- i += p.data.length
- p = p.next
- }
-
- return ret
- } // Consumes a specified amount of bytes or characters from the buffered data.
-
- consume(n, hasStrings) {
- const data = this.head.data
-
- if (n < data.length) {
- // `slice` is the same for buffers and strings.
- const slice = data.slice(0, n)
- this.head.data = data.slice(n)
- return slice
- }
-
- if (n === data.length) {
- // First chunk is a perfect match.
- return this.shift()
- } // Result spans more than one buffer.
-
- return hasStrings ? this._getString(n) : this._getBuffer(n)
- }
-
- first() {
- return this.head.data
- }
-
- *[SymbolIterator]() {
- for (let p = this.head; p; p = p.next) {
- yield p.data
- }
- } // Consumes a specified amount of characters from the buffered data.
-
- _getString(n) {
- let ret = ''
- let p = this.head
- let c = 0
-
- do {
- const str = p.data
-
- if (n > str.length) {
- ret += str
- n -= str.length
- } else {
- if (n === str.length) {
- ret += str
- ++c
- if (p.next) this.head = p.next
- else this.head = this.tail = null
- } else {
- ret += StringPrototypeSlice(str, 0, n)
- this.head = p
- p.data = StringPrototypeSlice(str, n)
- }
-
- break
- }
-
- ++c
- } while ((p = p.next) !== null)
-
- this.length -= c
- return ret
- } // Consumes a specified amount of bytes from the buffered data.
-
- _getBuffer(n) {
- const ret = Buffer.allocUnsafe(n)
- const retLen = n
- let p = this.head
- let c = 0
-
- do {
- const buf = p.data
-
- if (n > buf.length) {
- TypedArrayPrototypeSet(ret, buf, retLen - n)
- n -= buf.length
- } else {
- if (n === buf.length) {
- TypedArrayPrototypeSet(ret, buf, retLen - n)
- ++c
- if (p.next) this.head = p.next
- else this.head = this.tail = null
- } else {
- TypedArrayPrototypeSet(ret, new Uint8Array(buf.buffer, buf.byteOffset, n), retLen - n)
- this.head = p
- p.data = buf.slice(n)
- }
-
- break
- }
-
- ++c
- } while ((p = p.next) !== null)
-
- this.length -= c
- return ret
- } // Make sure the linked list only shows the minimal necessary information.
-
- [Symbol.for('nodejs.util.inspect.custom')](_, options) {
- return inspect(this, {
- ...options,
- // Only inspect one level.
- depth: 0,
- // It should not recurse.
- customInspect: false
- })
- }
-}
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/destroy.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/destroy.js
deleted file mode 100644
index e04306f0b1904e..00000000000000
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/destroy.js
+++ /dev/null
@@ -1,337 +0,0 @@
-'use strict'
-
-/* replacement start */
-
-const process = require('process')
-/* replacement end */
-
-const {
- aggregateTwoErrors,
- codes: { ERR_MULTIPLE_CALLBACK },
- AbortError
-} = require('../../ours/errors')
-
-const { Symbol } = require('../../ours/primordials')
-
-const { kDestroyed, isDestroyed, isFinished, isServerRequest } = require('./utils')
-
-const kDestroy = Symbol('kDestroy')
-const kConstruct = Symbol('kConstruct')
-
-function checkError(err, w, r) {
- if (err) {
- // Avoid V8 leak, https://github.com/nodejs/node/pull/34103#issuecomment-652002364
- err.stack // eslint-disable-line no-unused-expressions
-
- if (w && !w.errored) {
- w.errored = err
- }
-
- if (r && !r.errored) {
- r.errored = err
- }
- }
-} // Backwards compat. cb() is undocumented and unused in core but
-// unfortunately might be used by modules.
-
-function destroy(err, cb) {
- const r = this._readableState
- const w = this._writableState // With duplex streams we use the writable side for state.
-
- const s = w || r
-
- if ((w && w.destroyed) || (r && r.destroyed)) {
- if (typeof cb === 'function') {
- cb()
- }
-
- return this
- } // We set destroyed to true before firing error callbacks in order
- // to make it re-entrance safe in case destroy() is called within callbacks
-
- checkError(err, w, r)
-
- if (w) {
- w.destroyed = true
- }
-
- if (r) {
- r.destroyed = true
- } // If still constructing then defer calling _destroy.
-
- if (!s.constructed) {
- this.once(kDestroy, function (er) {
- _destroy(this, aggregateTwoErrors(er, err), cb)
- })
- } else {
- _destroy(this, err, cb)
- }
-
- return this
-}
-
-function _destroy(self, err, cb) {
- let called = false
-
- function onDestroy(err) {
- if (called) {
- return
- }
-
- called = true
- const r = self._readableState
- const w = self._writableState
- checkError(err, w, r)
-
- if (w) {
- w.closed = true
- }
-
- if (r) {
- r.closed = true
- }
-
- if (typeof cb === 'function') {
- cb(err)
- }
-
- if (err) {
- process.nextTick(emitErrorCloseNT, self, err)
- } else {
- process.nextTick(emitCloseNT, self)
- }
- }
-
- try {
- self._destroy(err || null, onDestroy)
- } catch (err) {
- onDestroy(err)
- }
-}
-
-function emitErrorCloseNT(self, err) {
- emitErrorNT(self, err)
- emitCloseNT(self)
-}
-
-function emitCloseNT(self) {
- const r = self._readableState
- const w = self._writableState
-
- if (w) {
- w.closeEmitted = true
- }
-
- if (r) {
- r.closeEmitted = true
- }
-
- if ((w && w.emitClose) || (r && r.emitClose)) {
- self.emit('close')
- }
-}
-
-function emitErrorNT(self, err) {
- const r = self._readableState
- const w = self._writableState
-
- if ((w && w.errorEmitted) || (r && r.errorEmitted)) {
- return
- }
-
- if (w) {
- w.errorEmitted = true
- }
-
- if (r) {
- r.errorEmitted = true
- }
-
- self.emit('error', err)
-}
-
-function undestroy() {
- const r = this._readableState
- const w = this._writableState
-
- if (r) {
- r.constructed = true
- r.closed = false
- r.closeEmitted = false
- r.destroyed = false
- r.errored = null
- r.errorEmitted = false
- r.reading = false
- r.ended = r.readable === false
- r.endEmitted = r.readable === false
- }
-
- if (w) {
- w.constructed = true
- w.destroyed = false
- w.closed = false
- w.closeEmitted = false
- w.errored = null
- w.errorEmitted = false
- w.finalCalled = false
- w.prefinished = false
- w.ended = w.writable === false
- w.ending = w.writable === false
- w.finished = w.writable === false
- }
-}
-
-function errorOrDestroy(stream, err, sync) {
- // We have tests that rely on errors being emitted
- // in the same tick, so changing this is semver major.
- // For now when you opt-in to autoDestroy we allow
- // the error to be emitted nextTick. In a future
- // semver major update we should change the default to this.
- const r = stream._readableState
- const w = stream._writableState
-
- if ((w && w.destroyed) || (r && r.destroyed)) {
- return this
- }
-
- if ((r && r.autoDestroy) || (w && w.autoDestroy)) stream.destroy(err)
- else if (err) {
- // Avoid V8 leak, https://github.com/nodejs/node/pull/34103#issuecomment-652002364
- err.stack // eslint-disable-line no-unused-expressions
-
- if (w && !w.errored) {
- w.errored = err
- }
-
- if (r && !r.errored) {
- r.errored = err
- }
-
- if (sync) {
- process.nextTick(emitErrorNT, stream, err)
- } else {
- emitErrorNT(stream, err)
- }
- }
-}
-
-function construct(stream, cb) {
- if (typeof stream._construct !== 'function') {
- return
- }
-
- const r = stream._readableState
- const w = stream._writableState
-
- if (r) {
- r.constructed = false
- }
-
- if (w) {
- w.constructed = false
- }
-
- stream.once(kConstruct, cb)
-
- if (stream.listenerCount(kConstruct) > 1) {
- // Duplex
- return
- }
-
- process.nextTick(constructNT, stream)
-}
-
-function constructNT(stream) {
- let called = false
-
- function onConstruct(err) {
- if (called) {
- errorOrDestroy(stream, err !== null && err !== undefined ? err : new ERR_MULTIPLE_CALLBACK())
- return
- }
-
- called = true
- const r = stream._readableState
- const w = stream._writableState
- const s = w || r
-
- if (r) {
- r.constructed = true
- }
-
- if (w) {
- w.constructed = true
- }
-
- if (s.destroyed) {
- stream.emit(kDestroy, err)
- } else if (err) {
- errorOrDestroy(stream, err, true)
- } else {
- process.nextTick(emitConstructNT, stream)
- }
- }
-
- try {
- stream._construct(onConstruct)
- } catch (err) {
- onConstruct(err)
- }
-}
-
-function emitConstructNT(stream) {
- stream.emit(kConstruct)
-}
-
-function isRequest(stream) {
- return stream && stream.setHeader && typeof stream.abort === 'function'
-}
-
-function emitCloseLegacy(stream) {
- stream.emit('close')
-}
-
-function emitErrorCloseLegacy(stream, err) {
- stream.emit('error', err)
- process.nextTick(emitCloseLegacy, stream)
-} // Normalize destroy for legacy.
-
-function destroyer(stream, err) {
- if (!stream || isDestroyed(stream)) {
- return
- }
-
- if (!err && !isFinished(stream)) {
- err = new AbortError()
- } // TODO: Remove isRequest branches.
-
- if (isServerRequest(stream)) {
- stream.socket = null
- stream.destroy(err)
- } else if (isRequest(stream)) {
- stream.abort()
- } else if (isRequest(stream.req)) {
- stream.req.abort()
- } else if (typeof stream.destroy === 'function') {
- stream.destroy(err)
- } else if (typeof stream.close === 'function') {
- // TODO: Don't lose err?
- stream.close()
- } else if (err) {
- process.nextTick(emitErrorCloseLegacy, stream, err)
- } else {
- process.nextTick(emitCloseLegacy, stream)
- }
-
- if (!stream.destroyed) {
- stream[kDestroyed] = true
- }
-}
-
-module.exports = {
- construct,
- destroyer,
- destroy,
- undestroy,
- errorOrDestroy
-}
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/end-of-stream.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/end-of-stream.js
deleted file mode 100644
index acc13be7d26425..00000000000000
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/end-of-stream.js
+++ /dev/null
@@ -1,262 +0,0 @@
-/* replacement start */
-const process = require('process')
-/* replacement end */
-// Ported from https://github.com/mafintosh/end-of-stream with
-// permission from the author, Mathias Buus (@mafintosh).
-
-;('use strict')
-
-const { AbortError, codes } = require('../../ours/errors')
-
-const { ERR_INVALID_ARG_TYPE, ERR_STREAM_PREMATURE_CLOSE } = codes
-
-const { kEmptyObject, once } = require('../../ours/util')
-
-const { validateAbortSignal, validateFunction, validateObject } = require('../validators')
-
-const { Promise } = require('../../ours/primordials')
-
-const {
- isClosed,
- isReadable,
- isReadableNodeStream,
- isReadableFinished,
- isReadableErrored,
- isWritable,
- isWritableNodeStream,
- isWritableFinished,
- isWritableErrored,
- isNodeStream,
- willEmitClose: _willEmitClose
-} = require('./utils')
-
-function isRequest(stream) {
- return stream.setHeader && typeof stream.abort === 'function'
-}
-
-const nop = () => {}
-
-function eos(stream, options, callback) {
- var _options$readable, _options$writable
-
- if (arguments.length === 2) {
- callback = options
- options = kEmptyObject
- } else if (options == null) {
- options = kEmptyObject
- } else {
- validateObject(options, 'options')
- }
-
- validateFunction(callback, 'callback')
- validateAbortSignal(options.signal, 'options.signal')
- callback = once(callback)
- const readable =
- (_options$readable = options.readable) !== null && _options$readable !== undefined
- ? _options$readable
- : isReadableNodeStream(stream)
- const writable =
- (_options$writable = options.writable) !== null && _options$writable !== undefined
- ? _options$writable
- : isWritableNodeStream(stream)
-
- if (!isNodeStream(stream)) {
- // TODO: Webstreams.
- throw new ERR_INVALID_ARG_TYPE('stream', 'Stream', stream)
- }
-
- const wState = stream._writableState
- const rState = stream._readableState
-
- const onlegacyfinish = () => {
- if (!stream.writable) {
- onfinish()
- }
- } // TODO (ronag): Improve soft detection to include core modules and
- // common ecosystem modules that do properly emit 'close' but fail
- // this generic check.
-
- let willEmitClose =
- _willEmitClose(stream) && isReadableNodeStream(stream) === readable && isWritableNodeStream(stream) === writable
- let writableFinished = isWritableFinished(stream, false)
-
- const onfinish = () => {
- writableFinished = true // Stream should not be destroyed here. If it is that
- // means that user space is doing something differently and
- // we cannot trust willEmitClose.
-
- if (stream.destroyed) {
- willEmitClose = false
- }
-
- if (willEmitClose && (!stream.readable || readable)) {
- return
- }
-
- if (!readable || readableFinished) {
- callback.call(stream)
- }
- }
-
- let readableFinished = isReadableFinished(stream, false)
-
- const onend = () => {
- readableFinished = true // Stream should not be destroyed here. If it is that
- // means that user space is doing something differently and
- // we cannot trust willEmitClose.
-
- if (stream.destroyed) {
- willEmitClose = false
- }
-
- if (willEmitClose && (!stream.writable || writable)) {
- return
- }
-
- if (!writable || writableFinished) {
- callback.call(stream)
- }
- }
-
- const onerror = (err) => {
- callback.call(stream, err)
- }
-
- let closed = isClosed(stream)
-
- const onclose = () => {
- closed = true
- const errored = isWritableErrored(stream) || isReadableErrored(stream)
-
- if (errored && typeof errored !== 'boolean') {
- return callback.call(stream, errored)
- }
-
- if (readable && !readableFinished && isReadableNodeStream(stream, true)) {
- if (!isReadableFinished(stream, false)) return callback.call(stream, new ERR_STREAM_PREMATURE_CLOSE())
- }
-
- if (writable && !writableFinished) {
- if (!isWritableFinished(stream, false)) return callback.call(stream, new ERR_STREAM_PREMATURE_CLOSE())
- }
-
- callback.call(stream)
- }
-
- const onrequest = () => {
- stream.req.on('finish', onfinish)
- }
-
- if (isRequest(stream)) {
- stream.on('complete', onfinish)
-
- if (!willEmitClose) {
- stream.on('abort', onclose)
- }
-
- if (stream.req) {
- onrequest()
- } else {
- stream.on('request', onrequest)
- }
- } else if (writable && !wState) {
- // legacy streams
- stream.on('end', onlegacyfinish)
- stream.on('close', onlegacyfinish)
- } // Not all streams will emit 'close' after 'aborted'.
-
- if (!willEmitClose && typeof stream.aborted === 'boolean') {
- stream.on('aborted', onclose)
- }
-
- stream.on('end', onend)
- stream.on('finish', onfinish)
-
- if (options.error !== false) {
- stream.on('error', onerror)
- }
-
- stream.on('close', onclose)
-
- if (closed) {
- process.nextTick(onclose)
- } else if (
- (wState !== null && wState !== undefined && wState.errorEmitted) ||
- (rState !== null && rState !== undefined && rState.errorEmitted)
- ) {
- if (!willEmitClose) {
- process.nextTick(onclose)
- }
- } else if (
- !readable &&
- (!willEmitClose || isReadable(stream)) &&
- (writableFinished || isWritable(stream) === false)
- ) {
- process.nextTick(onclose)
- } else if (
- !writable &&
- (!willEmitClose || isWritable(stream)) &&
- (readableFinished || isReadable(stream) === false)
- ) {
- process.nextTick(onclose)
- } else if (rState && stream.req && stream.aborted) {
- process.nextTick(onclose)
- }
-
- const cleanup = () => {
- callback = nop
- stream.removeListener('aborted', onclose)
- stream.removeListener('complete', onfinish)
- stream.removeListener('abort', onclose)
- stream.removeListener('request', onrequest)
- if (stream.req) stream.req.removeListener('finish', onfinish)
- stream.removeListener('end', onlegacyfinish)
- stream.removeListener('close', onlegacyfinish)
- stream.removeListener('finish', onfinish)
- stream.removeListener('end', onend)
- stream.removeListener('error', onerror)
- stream.removeListener('close', onclose)
- }
-
- if (options.signal && !closed) {
- const abort = () => {
- // Keep it because cleanup removes it.
- const endCallback = callback
- cleanup()
- endCallback.call(
- stream,
- new AbortError(undefined, {
- cause: options.signal.reason
- })
- )
- }
-
- if (options.signal.aborted) {
- process.nextTick(abort)
- } else {
- const originalCallback = callback
- callback = once((...args) => {
- options.signal.removeEventListener('abort', abort)
- originalCallback.apply(stream, args)
- })
- options.signal.addEventListener('abort', abort)
- }
- }
-
- return cleanup
-}
-
-function finished(stream, opts) {
- return new Promise((resolve, reject) => {
- eos(stream, opts, (err) => {
- if (err) {
- reject(err)
- } else {
- resolve()
- }
- })
- })
-}
-
-module.exports = eos
-module.exports.finished = finished
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/from.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/from.js
deleted file mode 100644
index 89ce1a8bec79d6..00000000000000
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/from.js
+++ /dev/null
@@ -1,115 +0,0 @@
-'use strict'
-
-/* replacement start */
-
-const process = require('process')
-/* replacement end */
-
-const { PromisePrototypeThen, SymbolAsyncIterator, SymbolIterator } = require('../../ours/primordials')
-
-const { Buffer } = require('buffer')
-
-const { ERR_INVALID_ARG_TYPE, ERR_STREAM_NULL_VALUES } = require('../../ours/errors').codes
-
-function from(Readable, iterable, opts) {
- let iterator
-
- if (typeof iterable === 'string' || iterable instanceof Buffer) {
- return new Readable({
- objectMode: true,
- ...opts,
-
- read() {
- this.push(iterable)
- this.push(null)
- }
- })
- }
-
- let isAsync
-
- if (iterable && iterable[SymbolAsyncIterator]) {
- isAsync = true
- iterator = iterable[SymbolAsyncIterator]()
- } else if (iterable && iterable[SymbolIterator]) {
- isAsync = false
- iterator = iterable[SymbolIterator]()
- } else {
- throw new ERR_INVALID_ARG_TYPE('iterable', ['Iterable'], iterable)
- }
-
- const readable = new Readable({
- objectMode: true,
- highWaterMark: 1,
- // TODO(ronag): What options should be allowed?
- ...opts
- }) // Flag to protect against _read
- // being called before last iteration completion.
-
- let reading = false
-
- readable._read = function () {
- if (!reading) {
- reading = true
- next()
- }
- }
-
- readable._destroy = function (error, cb) {
- PromisePrototypeThen(
- close(error),
- () => process.nextTick(cb, error), // nextTick is here in case cb throws
- (e) => process.nextTick(cb, e || error)
- )
- }
-
- async function close(error) {
- const hadError = error !== undefined && error !== null
- const hasThrow = typeof iterator.throw === 'function'
-
- if (hadError && hasThrow) {
- const { value, done } = await iterator.throw(error)
- await value
-
- if (done) {
- return
- }
- }
-
- if (typeof iterator.return === 'function') {
- const { value } = await iterator.return()
- await value
- }
- }
-
- async function next() {
- for (;;) {
- try {
- const { value, done } = isAsync ? await iterator.next() : iterator.next()
-
- if (done) {
- readable.push(null)
- } else {
- const res = value && typeof value.then === 'function' ? await value : value
-
- if (res === null) {
- reading = false
- throw new ERR_STREAM_NULL_VALUES()
- } else if (readable.push(res)) {
- continue
- } else {
- reading = false
- }
- }
- } catch (err) {
- readable.destroy(err)
- }
-
- break
- }
- }
-
- return readable
-}
-
-module.exports = from
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/pipeline.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/pipeline.js
deleted file mode 100644
index c170b48327eee6..00000000000000
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/pipeline.js
+++ /dev/null
@@ -1,434 +0,0 @@
-/* replacement start */
-const process = require('process')
-/* replacement end */
-// Ported from https://github.com/mafintosh/pump with
-// permission from the author, Mathias Buus (@mafintosh).
-
-;('use strict')
-
-const { ArrayIsArray, Promise, SymbolAsyncIterator } = require('../../ours/primordials')
-
-const eos = require('./end-of-stream')
-
-const { once } = require('../../ours/util')
-
-const destroyImpl = require('./destroy')
-
-const Duplex = require('./duplex')
-
-const {
- aggregateTwoErrors,
- codes: {
- ERR_INVALID_ARG_TYPE,
- ERR_INVALID_RETURN_VALUE,
- ERR_MISSING_ARGS,
- ERR_STREAM_DESTROYED,
- ERR_STREAM_PREMATURE_CLOSE
- },
- AbortError
-} = require('../../ours/errors')
-
-const { validateFunction, validateAbortSignal } = require('../validators')
-
-const { isIterable, isReadable, isReadableNodeStream, isNodeStream } = require('./utils')
-
-const AbortController = globalThis.AbortController || require('abort-controller').AbortController
-
-let PassThrough
-let Readable
-
-function destroyer(stream, reading, writing) {
- let finished = false
- stream.on('close', () => {
- finished = true
- })
- const cleanup = eos(
- stream,
- {
- readable: reading,
- writable: writing
- },
- (err) => {
- finished = !err
- }
- )
- return {
- destroy: (err) => {
- if (finished) return
- finished = true
- destroyImpl.destroyer(stream, err || new ERR_STREAM_DESTROYED('pipe'))
- },
- cleanup
- }
-}
-
-function popCallback(streams) {
- // Streams should never be an empty array. It should always contain at least
- // a single stream. Therefore optimize for the average case instead of
- // checking for length === 0 as well.
- validateFunction(streams[streams.length - 1], 'streams[stream.length - 1]')
- return streams.pop()
-}
-
-function makeAsyncIterable(val) {
- if (isIterable(val)) {
- return val
- } else if (isReadableNodeStream(val)) {
- // Legacy streams are not Iterable.
- return fromReadable(val)
- }
-
- throw new ERR_INVALID_ARG_TYPE('val', ['Readable', 'Iterable', 'AsyncIterable'], val)
-}
-
-async function* fromReadable(val) {
- if (!Readable) {
- Readable = require('./readable')
- }
-
- yield* Readable.prototype[SymbolAsyncIterator].call(val)
-}
-
-async function pump(iterable, writable, finish, { end }) {
- let error
- let onresolve = null
-
- const resume = (err) => {
- if (err) {
- error = err
- }
-
- if (onresolve) {
- const callback = onresolve
- onresolve = null
- callback()
- }
- }
-
- const wait = () =>
- new Promise((resolve, reject) => {
- if (error) {
- reject(error)
- } else {
- onresolve = () => {
- if (error) {
- reject(error)
- } else {
- resolve()
- }
- }
- }
- })
-
- writable.on('drain', resume)
- const cleanup = eos(
- writable,
- {
- readable: false
- },
- resume
- )
-
- try {
- if (writable.writableNeedDrain) {
- await wait()
- }
-
- for await (const chunk of iterable) {
- if (!writable.write(chunk)) {
- await wait()
- }
- }
-
- if (end) {
- writable.end()
- }
-
- await wait()
- finish()
- } catch (err) {
- finish(error !== err ? aggregateTwoErrors(error, err) : err)
- } finally {
- cleanup()
- writable.off('drain', resume)
- }
-}
-
-function pipeline(...streams) {
- return pipelineImpl(streams, once(popCallback(streams)))
-}
-
-function pipelineImpl(streams, callback, opts) {
- if (streams.length === 1 && ArrayIsArray(streams[0])) {
- streams = streams[0]
- }
-
- if (streams.length < 2) {
- throw new ERR_MISSING_ARGS('streams')
- }
-
- const ac = new AbortController()
- const signal = ac.signal
- const outerSignal = opts === null || opts === undefined ? undefined : opts.signal // Need to cleanup event listeners if last stream is readable
- // https://github.com/nodejs/node/issues/35452
-
- const lastStreamCleanup = []
- validateAbortSignal(outerSignal, 'options.signal')
-
- function abort() {
- finishImpl(new AbortError())
- }
-
- outerSignal === null || outerSignal === undefined ? undefined : outerSignal.addEventListener('abort', abort)
- let error
- let value
- const destroys = []
- let finishCount = 0
-
- function finish(err) {
- finishImpl(err, --finishCount === 0)
- }
-
- function finishImpl(err, final) {
- if (err && (!error || error.code === 'ERR_STREAM_PREMATURE_CLOSE')) {
- error = err
- }
-
- if (!error && !final) {
- return
- }
-
- while (destroys.length) {
- destroys.shift()(error)
- }
-
- outerSignal === null || outerSignal === undefined ? undefined : outerSignal.removeEventListener('abort', abort)
- ac.abort()
-
- if (final) {
- if (!error) {
- lastStreamCleanup.forEach((fn) => fn())
- }
-
- process.nextTick(callback, error, value)
- }
- }
-
- let ret
-
- for (let i = 0; i < streams.length; i++) {
- const stream = streams[i]
- const reading = i < streams.length - 1
- const writing = i > 0
- const end = reading || (opts === null || opts === undefined ? undefined : opts.end) !== false
- const isLastStream = i === streams.length - 1
-
- if (isNodeStream(stream)) {
- if (end) {
- const { destroy, cleanup } = destroyer(stream, reading, writing)
- destroys.push(destroy)
-
- if (isReadable(stream) && isLastStream) {
- lastStreamCleanup.push(cleanup)
- }
- } // Catch stream errors that occur after pipe/pump has completed.
-
- function onError(err) {
- if (err && err.name !== 'AbortError' && err.code !== 'ERR_STREAM_PREMATURE_CLOSE') {
- finish(err)
- }
- }
-
- stream.on('error', onError)
-
- if (isReadable(stream) && isLastStream) {
- lastStreamCleanup.push(() => {
- stream.removeListener('error', onError)
- })
- }
- }
-
- if (i === 0) {
- if (typeof stream === 'function') {
- ret = stream({
- signal
- })
-
- if (!isIterable(ret)) {
- throw new ERR_INVALID_RETURN_VALUE('Iterable, AsyncIterable or Stream', 'source', ret)
- }
- } else if (isIterable(stream) || isReadableNodeStream(stream)) {
- ret = stream
- } else {
- ret = Duplex.from(stream)
- }
- } else if (typeof stream === 'function') {
- ret = makeAsyncIterable(ret)
- ret = stream(ret, {
- signal
- })
-
- if (reading) {
- if (!isIterable(ret, true)) {
- throw new ERR_INVALID_RETURN_VALUE('AsyncIterable', `transform[${i - 1}]`, ret)
- }
- } else {
- var _ret
-
- if (!PassThrough) {
- PassThrough = require('./passthrough')
- } // If the last argument to pipeline is not a stream
- // we must create a proxy stream so that pipeline(...)
- // always returns a stream which can be further
- // composed through `.pipe(stream)`.
-
- const pt = new PassThrough({
- objectMode: true
- }) // Handle Promises/A+ spec, `then` could be a getter that throws on
- // second use.
-
- const then = (_ret = ret) === null || _ret === undefined ? undefined : _ret.then
-
- if (typeof then === 'function') {
- finishCount++
- then.call(
- ret,
- (val) => {
- value = val
-
- if (val != null) {
- pt.write(val)
- }
-
- if (end) {
- pt.end()
- }
-
- process.nextTick(finish)
- },
- (err) => {
- pt.destroy(err)
- process.nextTick(finish, err)
- }
- )
- } else if (isIterable(ret, true)) {
- finishCount++
- pump(ret, pt, finish, {
- end
- })
- } else {
- throw new ERR_INVALID_RETURN_VALUE('AsyncIterable or Promise', 'destination', ret)
- }
-
- ret = pt
- const { destroy, cleanup } = destroyer(ret, false, true)
- destroys.push(destroy)
-
- if (isLastStream) {
- lastStreamCleanup.push(cleanup)
- }
- }
- } else if (isNodeStream(stream)) {
- if (isReadableNodeStream(ret)) {
- finishCount += 2
- const cleanup = pipe(ret, stream, finish, {
- end
- })
-
- if (isReadable(stream) && isLastStream) {
- lastStreamCleanup.push(cleanup)
- }
- } else if (isIterable(ret)) {
- finishCount++
- pump(ret, stream, finish, {
- end
- })
- } else {
- throw new ERR_INVALID_ARG_TYPE('val', ['Readable', 'Iterable', 'AsyncIterable'], ret)
- }
-
- ret = stream
- } else {
- ret = Duplex.from(stream)
- }
- }
-
- if (
- (signal !== null && signal !== undefined && signal.aborted) ||
- (outerSignal !== null && outerSignal !== undefined && outerSignal.aborted)
- ) {
- process.nextTick(abort)
- }
-
- return ret
-}
-
-function pipe(src, dst, finish, { end }) {
- let ended = false
- dst.on('close', () => {
- if (!ended) {
- // Finish if the destination closes before the source has completed.
- finish(new ERR_STREAM_PREMATURE_CLOSE())
- }
- })
- src.pipe(dst, {
- end
- })
-
- if (end) {
- // Compat. Before node v10.12.0 stdio used to throw an error so
- // pipe() did/does not end() stdio destinations.
- // Now they allow it but "secretly" don't close the underlying fd.
- src.once('end', () => {
- ended = true
- dst.end()
- })
- } else {
- finish()
- }
-
- eos(
- src,
- {
- readable: true,
- writable: false
- },
- (err) => {
- const rState = src._readableState
-
- if (
- err &&
- err.code === 'ERR_STREAM_PREMATURE_CLOSE' &&
- rState &&
- rState.ended &&
- !rState.errored &&
- !rState.errorEmitted
- ) {
- // Some readable streams will emit 'close' before 'end'. However, since
- // this is on the readable side 'end' should still be emitted if the
- // stream has been ended and no error emitted. This should be allowed in
- // favor of backwards compatibility. Since the stream is piped to a
- // destination this should not result in any observable difference.
- // We don't need to check if this is a writable premature close since
- // eos will only fail with premature close on the reading side for
- // duplex streams.
- src.once('end', finish).once('error', finish)
- } else {
- finish(err)
- }
- }
- )
- return eos(
- dst,
- {
- readable: false,
- writable: true
- },
- finish
- )
-}
-
-module.exports = {
- pipelineImpl,
- pipeline
-}
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/state.js b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/state.js
deleted file mode 100644
index e7fcebdde9de14..00000000000000
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/state.js
+++ /dev/null
@@ -1,33 +0,0 @@
-'use strict'
-
-const { MathFloor, NumberIsInteger } = require('../../ours/primordials')
-
-const { ERR_INVALID_ARG_VALUE } = require('../../ours/errors').codes
-
-function highWaterMarkFrom(options, isDuplex, duplexKey) {
- return options.highWaterMark != null ? options.highWaterMark : isDuplex ? options[duplexKey] : null
-}
-
-function getDefaultHighWaterMark(objectMode) {
- return objectMode ? 16 : 16 * 1024
-}
-
-function getHighWaterMark(state, options, duplexKey, isDuplex) {
- const hwm = highWaterMarkFrom(options, isDuplex, duplexKey)
-
- if (hwm != null) {
- if (!NumberIsInteger(hwm) || hwm < 0) {
- const name = isDuplex ? `options.${duplexKey}` : 'options.highWaterMark'
- throw new ERR_INVALID_ARG_VALUE(name, hwm)
- }
-
- return MathFloor(hwm)
- } // Default value
-
- return getDefaultHighWaterMark(state.objectMode)
-}
-
-module.exports = {
- getHighWaterMark,
- getDefaultHighWaterMark
-}
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/package.json b/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/package.json
deleted file mode 100644
index 1b65f2332a4476..00000000000000
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/package.json
+++ /dev/null
@@ -1,84 +0,0 @@
-{
- "name": "readable-stream",
- "version": "4.2.0",
- "description": "Node.js Streams, a user-land copy of the stream library from Node.js",
- "homepage": "https://github.com/nodejs/readable-stream",
- "license": "MIT",
- "licenses": [
- {
- "type": "MIT",
- "url": "https://choosealicense.com/licenses/mit/"
- }
- ],
- "keywords": [
- "readable",
- "stream",
- "pipe"
- ],
- "repository": {
- "type": "git",
- "url": "git://github.com/nodejs/readable-stream"
- },
- "bugs": {
- "url": "https://github.com/nodejs/readable-stream/issues"
- },
- "main": "lib/ours/index.js",
- "files": [
- "lib",
- "LICENSE",
- "README.md"
- ],
- "browser": {
- "util": "./lib/ours/util.js",
- "./lib/ours/index.js": "./lib/ours/browser.js"
- },
- "scripts": {
- "build": "node build/build.mjs",
- "postbuild": "prettier -w lib test",
- "test": "tap --rcfile=./tap.yml test/parallel/test-*.js test/ours/test-*.js",
- "test:prepare": "node test/browser/runner-prepare.mjs",
- "test:browsers": "node test/browser/runner-browser.mjs",
- "test:bundlers": "node test/browser/runner-node.mjs",
- "coverage": "c8 -c ./c8.json tap --rcfile=./tap.yml test/parallel/test-*.js test/ours/test-*.js",
- "format": "prettier -w src lib test",
- "lint": "eslint src"
- },
- "dependencies": {
- "abort-controller": "^3.0.0",
- "buffer": "^6.0.3",
- "events": "^3.3.0",
- "process": "^0.11.10"
- },
- "devDependencies": {
- "@babel/core": "^7.17.10",
- "@babel/plugin-proposal-nullish-coalescing-operator": "^7.16.7",
- "@babel/plugin-proposal-optional-chaining": "^7.16.7",
- "@rollup/plugin-commonjs": "^22.0.0",
- "@rollup/plugin-inject": "^4.0.4",
- "@rollup/plugin-node-resolve": "^13.3.0",
- "@sinonjs/fake-timers": "^9.1.2",
- "browserify": "^17.0.0",
- "c8": "^7.11.2",
- "esbuild": "^0.14.39",
- "esbuild-plugin-alias": "^0.2.1",
- "eslint": "^8.15.0",
- "eslint-config-standard": "^17.0.0",
- "eslint-plugin-import": "^2.26.0",
- "eslint-plugin-n": "^15.2.0",
- "eslint-plugin-promise": "^6.0.0",
- "playwright": "^1.21.1",
- "prettier": "^2.6.2",
- "rollup": "^2.72.1",
- "rollup-plugin-polyfill-node": "^0.9.0",
- "tap": "^16.2.0",
- "tap-mocha-reporter": "^5.0.3",
- "tape": "^5.5.3",
- "tar": "^6.1.11",
- "undici": "^5.1.1",
- "webpack": "^5.72.1",
- "webpack-cli": "^4.9.2"
- },
- "engines": {
- "node": "^12.22.0 || ^14.17.0 || >=16.0.0"
- }
-}
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/buffer/AUTHORS.md b/deps/npm/node_modules/buffer/AUTHORS.md
similarity index 100%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/buffer/AUTHORS.md
rename to deps/npm/node_modules/buffer/AUTHORS.md
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/buffer/LICENSE b/deps/npm/node_modules/buffer/LICENSE
similarity index 100%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/buffer/LICENSE
rename to deps/npm/node_modules/buffer/LICENSE
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/buffer/index.d.ts b/deps/npm/node_modules/buffer/index.d.ts
similarity index 100%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/buffer/index.d.ts
rename to deps/npm/node_modules/buffer/index.d.ts
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/buffer/index.js b/deps/npm/node_modules/buffer/index.js
similarity index 100%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/buffer/index.js
rename to deps/npm/node_modules/buffer/index.js
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/buffer/package.json b/deps/npm/node_modules/buffer/package.json
similarity index 100%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/buffer/package.json
rename to deps/npm/node_modules/buffer/package.json
diff --git a/deps/npm/node_modules/cacache/package.json b/deps/npm/node_modules/cacache/package.json
index 6ab0d2a9dd8a36..189a19bf315976 100644
--- a/deps/npm/node_modules/cacache/package.json
+++ b/deps/npm/node_modules/cacache/package.json
@@ -1,6 +1,6 @@
{
"name": "cacache",
- "version": "17.0.3",
+ "version": "17.0.4",
"cache-version": {
"content": "2",
"index": "5"
@@ -46,7 +46,7 @@
"license": "ISC",
"dependencies": {
"@npmcli/fs": "^3.1.0",
- "fs-minipass": "^2.1.0",
+ "fs-minipass": "^3.0.0",
"glob": "^8.0.1",
"lru-cache": "^7.7.1",
"minipass": "^4.0.0",
@@ -61,7 +61,7 @@
},
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.10.0",
+ "@npmcli/template-oss": "4.11.0",
"tap": "^16.0.0"
},
"engines": {
@@ -70,7 +70,7 @@
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
"windowsCI": false,
- "version": "4.10.0"
+ "version": "4.11.0"
},
"author": "GitHub Inc.",
"tap": {
diff --git a/deps/npm/node_modules/ci-info/LICENSE b/deps/npm/node_modules/ci-info/LICENSE
index 74871920628718..44ca33aa611e73 100644
--- a/deps/npm/node_modules/ci-info/LICENSE
+++ b/deps/npm/node_modules/ci-info/LICENSE
@@ -1,6 +1,6 @@
The MIT License (MIT)

-Copyright (c) 2016-2022 Thomas Watson Steen
+Copyright (c) 2016-2023 Thomas Watson Steen

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
diff --git a/deps/npm/node_modules/ci-info/index.d.ts b/deps/npm/node_modules/ci-info/index.d.ts
index 816d63a867cffd..8aebdcf25858d2 100644
--- a/deps/npm/node_modules/ci-info/index.d.ts
+++ b/deps/npm/node_modules/ci-info/index.d.ts
@@ -43,15 +43,16 @@ export const CODESHIP: boolean;
export const DRONE: boolean;
export const DSARI: boolean;
export const EAS: boolean;
+export const GERRIT: boolean;
export const GITHUB_ACTIONS: boolean;
export const GITLAB: boolean;
export const GOCD: boolean;
export const GOOGLE_CLOUD_BUILD: boolean;
-export const LAYERCI: boolean;
-export const GERRIT: boolean;
+export const HARNESS: boolean;
export const HEROKU: boolean;
export const HUDSON: boolean;
export const JENKINS: boolean;
+export const LAYERCI: boolean;
export const MAGNUM: boolean;
export const NETLIFY: boolean;
export const NEVERCODE: boolean;
diff --git a/deps/npm/node_modules/ci-info/index.js b/deps/npm/node_modules/ci-info/index.js
index e91c518557897a..47907264581eb1 100644
--- a/deps/npm/node_modules/ci-info/index.js
+++ b/deps/npm/node_modules/ci-info/index.js
@@ -54,7 +54,8 @@ vendors.forEach(function (vendor) {
})

exports.isCI = !!(
- env.BUILD_ID || // Jenkins, Cloudbees
+ env.CI !== 'false' && // Bypass all checks if CI env is explicitly set to 'false'
+ (env.BUILD_ID || // Jenkins, Cloudbees
env.BUILD_NUMBER || // Jenkins, TeamCity
env.CI || // Travis CI, CircleCI, Cirrus CI, Gitlab CI, Appveyor, CodeShip, dsari
env.CI_APP_ID || // Appflow
@@ -64,7 +65,7 @@ exports.isCI = !!(
env.CONTINUOUS_INTEGRATION || // Travis CI, Cirrus CI
env.RUN_ID || // TaskCluster, dsari
exports.name ||
- false
+ false)
)

function checkEnv (obj) {
diff --git a/deps/npm/node_modules/ci-info/package.json b/deps/npm/node_modules/ci-info/package.json
index cadaa15026239b..3edae7417a33ef 100644
--- a/deps/npm/node_modules/ci-info/package.json
+++ b/deps/npm/node_modules/ci-info/package.json
@@ -1,6 +1,6 @@
{
"name": "ci-info",
- "version": "3.7.0",
+ "version": "3.8.0",
"description": "Get details about the current Continuous Integration environment",
"main": "index.js",
"typings": "index.d.ts",
@@ -22,6 +22,12 @@
"index.d.ts",
"CHANGELOG.md"
],
+ "funding": [
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/sibiraj-s"
+ }
+ ],
"scripts": {
"lint:fix": "standard --fix",
"test": "standard && node test.js",
diff --git a/deps/npm/node_modules/ci-info/vendors.json b/deps/npm/node_modules/ci-info/vendors.json
index 00c1601b511937..213711755273ae 100644
--- a/deps/npm/node_modules/ci-info/vendors.json
+++ b/deps/npm/node_modules/ci-info/vendors.json
@@ -48,7 +48,10 @@
"name": "Buildkite",
"constant": "BUILDKITE",
"env": "BUILDKITE",
- "pr": { "env": "BUILDKITE_PULL_REQUEST", "ne": "false" }
+ "pr": {
+ "env": "BUILDKITE_PULL_REQUEST",
+ "ne": "false"
+ }
},
{
"name": "CircleCI",
@@ -66,7 +69,12 @@
"name": "Codefresh",
"constant": "CODEFRESH",
"env": "CF_BUILD_ID",
- "pr": { "any": ["CF_PULL_REQUEST_NUMBER", "CF_PULL_REQUEST_ID"] }
+ "pr": {
+ "any": [
+ "CF_PULL_REQUEST_NUMBER",
+ "CF_PULL_REQUEST_ID"
+ ]
+ }
},
{
"name": "Codemagic",
@@ -77,13 +85,17 @@
{
"name": "Codeship",
"constant": "CODESHIP",
- "env": { "CI_NAME": "codeship" }
+ "env": {
+ "CI_NAME": "codeship"
+ }
},
{
"name": "Drone",
"constant": "DRONE",
"env": "DRONE",
- "pr": { "DRONE_BUILD_EVENT": "pull_request" }
+ "pr": {
+ "DRONE_BUILD_EVENT": "pull_request"
+ }
},
{
"name": "dsari",
@@ -95,11 +107,18 @@
"constant": "EAS",
"env": "EAS_BUILD"
},
+ {
+ "name": "Gerrit",
+ "constant": "GERRIT",
+ "env": "GERRIT_PROJECT"
+ },
{
"name": "GitHub Actions",
"constant": "GITHUB_ACTIONS",
"env": "GITHUB_ACTIONS",
- "pr": { "GITHUB_EVENT_NAME": "pull_request" }
+ "pr": {
+ "GITHUB_EVENT_NAME": "pull_request"
+ }
},
{
"name": "GitLab CI",
@@ -118,20 +137,17 @@
"env": "BUILDER_OUTPUT"
},
{
- "name": "LayerCI",
- "constant": "LAYERCI",
- "env": "LAYERCI",
- "pr": "LAYERCI_PULL_REQUEST"
- },
- {
- "name": "Gerrit",
- "constant": "GERRIT",
- "env": "GERRIT_PROJECT"
+ "name": "Harness CI",
+ "constant": "HARNESS",
+ "env": "HARNESS_BUILD_ID"
},
{
"name": "Heroku",
"constant": "HEROKU",
- "env": { "env": "NODE", "includes": "/app/.heroku/node/bin/node" }
+ "env": {
+ "env": "NODE",
+ "includes": "/app/.heroku/node/bin/node"
+ }
},
{
"name": "Hudson",
@@ -141,8 +157,22 @@
{
"name": "Jenkins",
"constant": "JENKINS",
- "env": ["JENKINS_URL", "BUILD_ID"],
- "pr": { "any": ["ghprbPullId", "CHANGE_ID"] }
+ "env": [
+ "JENKINS_URL",
+ "BUILD_ID"
+ ],
+ "pr": {
+ "any": [
+ "ghprbPullId",
+ "CHANGE_ID"
+ ]
+ }
+ },
+ {
+ "name": "LayerCI",
+ "constant": "LAYERCI",
+ "env": "LAYERCI",
+ "pr": "LAYERCI_PULL_REQUEST"
},
{
"name": "Magnum CI",
@@ -153,13 +183,19 @@
"name": "Netlify CI",
"constant": "NETLIFY",
"env": "NETLIFY",
- "pr": { "env": "PULL_REQUEST", "ne": "false" }
+ "pr": {
+ "env": "PULL_REQUEST",
+ "ne": "false"
+ }
},
{
"name": "Nevercode",
"constant": "NEVERCODE",
"env": "NEVERCODE",
- "pr": { "env": "NEVERCODE_PULL_REQUEST", "ne": "false" }
+ "pr": {
+ "env": "NEVERCODE_PULL_REQUEST",
+ "ne": "false"
+ }
},
{
"name": "ReleaseHub",
@@ -170,7 +206,9 @@
"name": "Render",
"constant": "RENDER",
"env": "RENDER",
- "pr": { "IS_PULL_REQUEST": "true" }
+ "pr": {
+ "IS_PULL_REQUEST": "true"
+ }
},
{
"name": "Sail CI",
@@ -182,7 +220,10 @@
"name": "Screwdriver",
"constant": "SCREWDRIVER",
"env": "SCREWDRIVER",
- "pr": { "env": "SD_PULL_REQUEST", "ne": "false" }
+ "pr": {
+ "env": "SD_PULL_REQUEST",
+ "ne": "false"
+ }
},
{
"name": "Semaphore",
@@ -194,7 +235,9 @@
"name": "Shippable",
"constant": "SHIPPABLE",
"env": "SHIPPABLE",
- "pr": { "IS_PULL_REQUEST": "true" }
+ "pr": {
+ "IS_PULL_REQUEST": "true"
+ }
},
{
"name": "Solano CI",
@@ -205,7 +248,9 @@
{
"name": "Sourcehut",
"constant": "SOURCEHUT",
- "env": { "CI_NAME": "sourcehut" }
+ "env": {
+ "CI_NAME": "sourcehut"
+ }
},
{
"name": "Strider CD",
@@ -215,7 +260,10 @@
{
"name": "TaskCluster",
"constant": "TASKCLUSTER",
- "env": ["TASK_ID", "RUN_ID"]
+ "env": [
+ "TASK_ID",
+ "RUN_ID"
+ ]
},
{
"name": "TeamCity",
@@ -226,12 +274,20 @@
"name": "Travis CI",
"constant": "TRAVIS",
"env": "TRAVIS",
- "pr": { "env": "TRAVIS_PULL_REQUEST", "ne": "false" }
+ "pr": {
+ "env": "TRAVIS_PULL_REQUEST",
+ "ne": "false"
+ }
},
{
"name": "Vercel",
"constant": "VERCEL",
- "env": { "any": ["NOW_BUILDER", "VERCEL"] }
+ "env": {
+ "any": [
+ "NOW_BUILDER",
+ "VERCEL"
+ ]
+ }
},
{
"name": "Visual Studio App Center",
@@ -241,8 +297,12 @@
{
"name": "Woodpecker",
"constant": "WOODPECKER",
- "env": { "CI": "woodpecker" },
- "pr": { "CI_BUILD_EVENT": "pull_request" }
+ "env": {
+ "CI": "woodpecker"
+ },
+ "pr": {
+ "CI_BUILD_EVENT": "pull_request"
+ }
},
{
"name": "Xcode Cloud",
diff --git a/deps/npm/node_modules/cmd-shim/lib/index.js b/deps/npm/node_modules/cmd-shim/lib/index.js
index cf223feb2aa65b..76ea2cb6d624d8 100644
--- a/deps/npm/node_modules/cmd-shim/lib/index.js
+++ b/deps/npm/node_modules/cmd-shim/lib/index.js
@@ -19,7 +19,9 @@ const {
const { dirname, relative } = require('path')
const toBatchSyntax = require('./to-batch-syntax')
-const shebangExpr = /^#!\s*(?:\/usr\/bin\/env\s*((?:[^ \t=]+=[^ \t=]+\s+)*))?([^ \t]+)(.*)$/
+// linting disabled because this regex is really long
+// eslint-disable-next-line max-len
+const shebangExpr = /^#!\s*(?:\/usr\/bin\/env\s+(?:-S\s+)?((?:[^ \t=]+=[^ \t=]+\s+)*))?([^ \t]+)(.*)$/
const cmdShimIfExists = (from, to) =>
stat(from).then(() => cmdShim(from, to), () => {})
diff --git a/deps/npm/node_modules/cmd-shim/package.json b/deps/npm/node_modules/cmd-shim/package.json
index 80cbec8aec6a78..4e52de2412f24a 100644
--- a/deps/npm/node_modules/cmd-shim/package.json
+++ b/deps/npm/node_modules/cmd-shim/package.json
@@ -1,6 +1,6 @@
{
"name": "cmd-shim",
- "version": "6.0.0",
+ "version": "6.0.1",
"description": "Used in npm for command line application support",
"scripts": {
"test": "tap",
@@ -17,8 +17,8 @@
},
"license": "ISC",
"devDependencies": {
- "@npmcli/eslint-config": "^3.0.1",
- "@npmcli/template-oss": "4.5.1",
+ "@npmcli/eslint-config": "^4.0.0",
+ "@npmcli/template-oss": "4.11.0",
"tap": "^16.0.1"
},
"files": [
@@ -41,6 +41,6 @@
"author": "GitHub Inc.",
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.5.1"
+ "version": "4.11.0"
}
}
diff --git a/deps/npm/node_modules/defaults/LICENSE b/deps/npm/node_modules/defaults/LICENSE
index d88b0720784808..11eb6fdebf3b63 100644
--- a/deps/npm/node_modules/defaults/LICENSE
+++ b/deps/npm/node_modules/defaults/LICENSE
@@ -1,5 +1,6 @@
The MIT License (MIT)

+Copyright (c) 2022 Sindre Sorhus
Copyright (c) 2015 Elijah Insua

Permission is hereby granted, free of charge, to any person obtaining a copy
diff --git a/deps/npm/node_modules/defaults/package.json b/deps/npm/node_modules/defaults/package.json
index 854016d56fefd7..44f72b1714ce4e 100644
--- a/deps/npm/node_modules/defaults/package.json
+++ b/deps/npm/node_modules/defaults/package.json
@@ -1,26 +1,33 @@
{
- "name": "defaults",
- "version": "1.0.3",
- "description": "merge single level defaults over a config object",
- "main": "index.js",
- "scripts": {
- "test": "node test.js"
- },
- "repository": {
- "type": "git",
- "url": "git://github.com/tmpvar/defaults.git"
- },
- "keywords": [
- "config",
- "defaults"
- ],
- "author": "Elijah Insua ",
- "license": "MIT",
- "readmeFilename": "README.md",
- "dependencies": {
- "clone": "^1.0.2"
- },
- "devDependencies": {
- "tap": "^2.0.0"
- }
+ "name": "defaults",
+ "version": "1.0.4",
+ "description": "merge single level defaults over a config object",
+ "main": "index.js",
+ "funding": "https://github.com/sponsors/sindresorhus",
+ "scripts": {
+ "test": "node test.js"
+ },
+ "repository": {
+ "type": "git",
+ "url": "git://github.com/sindresorhus/node-defaults.git"
+ },
+ "keywords": [
+ "config",
+ "defaults",
+ "options",
+ "object",
+ "merge",
+ "assign",
+ "properties",
+ "deep"
+ ],
+ "author": "Elijah Insua ",
+ "license": "MIT",
+ "readmeFilename": "README.md",
+ "dependencies": {
+ "clone": "^1.0.2"
+ },
+ "devDependencies": {
+ "tap": "^2.0.0"
+ }
}
diff --git a/deps/npm/node_modules/fs-minipass/lib/index.js b/deps/npm/node_modules/fs-minipass/lib/index.js
new file mode 100644
index 00000000000000..f9d5082a4d6426
--- /dev/null
+++ b/deps/npm/node_modules/fs-minipass/lib/index.js
@@ -0,0 +1,443 @@
+'use strict'
+const MiniPass = require('minipass')
+const EE = require('events').EventEmitter
+const fs = require('fs')
+
+const writev = fs.writev
+
+const _autoClose = Symbol('_autoClose')
+const _close = Symbol('_close')
+const _ended = Symbol('_ended')
+const _fd = Symbol('_fd')
+const _finished = Symbol('_finished')
+const _flags = Symbol('_flags')
+const _flush = Symbol('_flush')
+const _handleChunk = Symbol('_handleChunk')
+const _makeBuf = Symbol('_makeBuf')
+const _mode = Symbol('_mode')
+const _needDrain = Symbol('_needDrain')
+const _onerror = Symbol('_onerror')
+const _onopen = Symbol('_onopen')
+const _onread = Symbol('_onread')
+const _onwrite = Symbol('_onwrite')
+const _open = Symbol('_open')
+const _path = Symbol('_path')
+const _pos = Symbol('_pos')
+const _queue = Symbol('_queue')
+const _read = Symbol('_read')
+const _readSize = Symbol('_readSize')
+const _reading = Symbol('_reading')
+const _remain = Symbol('_remain')
+const _size = Symbol('_size')
+const _write = Symbol('_write')
+const _writing = Symbol('_writing')
+const _defaultFlag = Symbol('_defaultFlag')
+const _errored = Symbol('_errored')
+
+class ReadStream extends MiniPass {
+ constructor (path, opt) {
+ opt = opt || {}
+ super(opt)
+
+ this.readable = true
+ this.writable = false
+
+ if (typeof path !== 'string') {
+ throw new TypeError('path must be a string')
+ }
+
+ this[_errored] = false
+ this[_fd] = typeof opt.fd === 'number' ? opt.fd : null
+ this[_path] = path
+ this[_readSize] = opt.readSize || 16 * 1024 * 1024
+ this[_reading] = false
+ this[_size] = typeof opt.size === 'number' ? opt.size : Infinity
+ this[_remain] = this[_size]
+ this[_autoClose] = typeof opt.autoClose === 'boolean' ?
+ opt.autoClose : true
+
+ if (typeof this[_fd] === 'number') {
+ this[_read]()
+ } else {
+ this[_open]()
+ }
+ }
+
+ get fd () {
+ return this[_fd]
+ }
+
+ get path () {
+ return this[_path]
+ }
+
+ write () {
+ throw new TypeError('this is a readable stream')
+ }
+
+ end () {
+ throw new TypeError('this is a readable stream')
+ }
+
+ [_open] () {
+ fs.open(this[_path], 'r', (er, fd) => this[_onopen](er, fd))
+ }
+
+ [_onopen] (er, fd) {
+ if (er) {
+ this[_onerror](er)
+ } else {
+ this[_fd] = fd
+ this.emit('open', fd)
+ this[_read]()
+ }
+ }
+
+ [_makeBuf] () {
+ return Buffer.allocUnsafe(Math.min(this[_readSize], this[_remain]))
+ }
+
+ [_read] () {
+ if (!this[_reading]) {
+ this[_reading] = true
+ const buf = this[_makeBuf]()
+ /* istanbul ignore if */
+ if (buf.length === 0) {
+ return process.nextTick(() => this[_onread](null, 0, buf))
+ }
+ fs.read(this[_fd], buf, 0, buf.length, null, (er, br, b) =>
+ this[_onread](er, br, b))
+ }
+ }
+
+ [_onread] (er, br, buf) {
+ this[_reading] = false
+ if (er) {
+ this[_onerror](er)
+ } else if (this[_handleChunk](br, buf)) {
+ this[_read]()
+ }
+ }
+
+ [_close] () {
+ if (this[_autoClose] && typeof this[_fd] === 'number') {
+ const fd = this[_fd]
+ this[_fd] = null
+ fs.close(fd, er => er ? this.emit('error', er) : this.emit('close'))
+ }
+ }
+
+ [_onerror] (er) {
+ this[_reading] = true
+ this[_close]()
+ this.emit('error', er)
+ }
+
+ [_handleChunk] (br, buf) {
+ let ret = false
+ // no effect if infinite
+ this[_remain] -= br
+ if (br > 0) {
+ ret = super.write(br < buf.length ? buf.slice(0, br) : buf)
+ }
+
+ if (br === 0 || this[_remain] <= 0) {
+ ret = false
+ this[_close]()
+ super.end()
+ }
+
+ return ret
+ }
+
+ emit (ev, data) {
+ switch (ev) {
+ case 'prefinish':
+ case 'finish':
+ break
+
+ case 'drain':
+ if (typeof this[_fd] === 'number') {
+ this[_read]()
+ }
+ break
+
+ case 'error':
+ if (this[_errored]) {
+ return
+ }
+ this[_errored] = true
+ return super.emit(ev, data)
+
+ default:
+ return super.emit(ev, data)
+ }
+ }
+}
+
+class ReadStreamSync extends ReadStream {
+ [_open] () {
+ let threw = true
+ try {
+ this[_onopen](null, fs.openSync(this[_path], 'r'))
+ threw = false
+ } finally {
+ if (threw) {
+ this[_close]()
+ }
+ }
+ }
+
+ [_read] () {
+ let threw = true
+ try {
+ if (!this[_reading]) {
+ this[_reading] = true
+ do {
+ const buf = this[_makeBuf]()
+ /* istanbul ignore next */
+ const br = buf.length === 0 ? 0
+ : fs.readSync(this[_fd], buf, 0, buf.length, null)
+ if (!this[_handleChunk](br, buf)) {
+ break
+ }
+ } while (true)
+ this[_reading] = false
+ }
+ threw = false
+ } finally {
+ if (threw) {
+ this[_close]()
+ }
+ }
+ }
+
+ [_close] () {
+ if (this[_autoClose] && typeof this[_fd] === 'number') {
+ const fd = this[_fd]
+ this[_fd] = null
+ fs.closeSync(fd)
+ this.emit('close')
+ }
+ }
+}
+
+class WriteStream extends EE {
+ constructor (path, opt) {
+ opt = opt || {}
+ super(opt)
+ this.readable = false
+ this.writable = true
+ this[_errored] = false
+ this[_writing] = false
+ this[_ended] = false
+ this[_needDrain] = false
+ this[_queue] = []
+ this[_path] = path
+ this[_fd] = typeof opt.fd === 'number' ? opt.fd : null
+ this[_mode] = opt.mode === undefined ? 0o666 : opt.mode
+ this[_pos] = typeof opt.start === 'number' ? opt.start : null
+ this[_autoClose] = typeof opt.autoClose === 'boolean' ?
+ opt.autoClose : true
+
+ // truncating makes no sense when writing into the middle
+ const defaultFlag = this[_pos] !== null ? 'r+' : 'w'
+ this[_defaultFlag] = opt.flags === undefined
+ this[_flags] = this[_defaultFlag] ? defaultFlag : opt.flags
+
+ if (this[_fd] === null) {
+ this[_open]()
+ }
+ }
+
+ emit (ev, data) {
+ if (ev === 'error') {
+ if (this[_errored]) {
+ return
+ }
+ this[_errored] = true
+ }
+ return super.emit(ev, data)
+ }
+
+ get fd () {
+ return this[_fd]
+ }
+
+ get path () {
+ return this[_path]
+ }
+
+ [_onerror] (er) {
+ this[_close]()
+ this[_writing] = true
+ this.emit('error', er)
+ }
+
+ [_open] () {
+ fs.open(this[_path], this[_flags], this[_mode],
+ (er, fd) => this[_onopen](er, fd))
+ }
+
+ [_onopen] (er, fd) {
+ if (this[_defaultFlag] &&
+ this[_flags] === 'r+' &&
+ er && er.code === 'ENOENT') {
+ this[_flags] = 'w'
+ this[_open]()
+ } else if (er) {
+ this[_onerror](er)
+ } else {
+ this[_fd] = fd
+ this.emit('open', fd)
+ if (!this[_writing]) {
+ this[_flush]()
+ }
+ }
+ }
+
+ end (buf, enc) {
+ if (buf) {
+ this.write(buf, enc)
+ }
+
+ this[_ended] = true
+
+ // synthetic after-write logic, where drain/finish live
+ if (!this[_writing] && !this[_queue].length &&
+ typeof this[_fd] === 'number') {
+ this[_onwrite](null, 0)
+ }
+ return this
+ }
+
+ write (buf, enc) {
+ if (typeof buf === 'string') {
+ buf = Buffer.from(buf, enc)
+ }
+
+ if (this[_ended]) {
+ this.emit('error', new Error('write() after end()'))
+ return false
+ }
+
+ if (this[_fd] === null || this[_writing] || this[_queue].length) {
+ this[_queue].push(buf)
+ this[_needDrain] = true
+ return false
+ }
+
+ this[_writing] = true
+ this[_write](buf)
+ return true
+ }
+
+ [_write] (buf) {
+ fs.write(this[_fd], buf, 0, buf.length, this[_pos], (er, bw) =>
+ this[_onwrite](er, bw))
+ }
+
+ [_onwrite] (er, bw) {
+ if (er) {
+ this[_onerror](er)
+ } else {
+ if (this[_pos] !== null) {
+ this[_pos] += bw
+ }
+ if (this[_queue].length) {
+ this[_flush]()
+ } else {
+ this[_writing] = false
+
+ if (this[_ended] && !this[_finished]) {
+ this[_finished] = true
+ this[_close]()
+ this.emit('finish')
+ } else if (this[_needDrain]) {
+ this[_needDrain] = false
+ this.emit('drain')
+ }
+ }
+ }
+ }
+
+ [_flush] () {
+ if (this[_queue].length === 0) {
+ if (this[_ended]) {
+ this[_onwrite](null, 0)
+ }
+ } else if (this[_queue].length === 1) {
+ this[_write](this[_queue].pop())
+ } else {
+ const iovec = this[_queue]
+ this[_queue] = []
+ writev(this[_fd], iovec, this[_pos],
+ (er, bw) => this[_onwrite](er, bw))
+ }
+ }
+
+ [_close] () {
+ if (this[_autoClose] && typeof this[_fd] === 'number') {
+ const fd = this[_fd]
+ this[_fd] = null
+ fs.close(fd, er => er ? this.emit('error', er) : this.emit('close'))
+ }
+ }
+}
+
+class WriteStreamSync extends WriteStream {
+ [_open] () {
+ let fd
+ // only wrap in a try{} block if we know we'll retry, to avoid
+ // the rethrow obscuring the error's source frame in most cases.
+ if (this[_defaultFlag] && this[_flags] === 'r+') {
+ try {
+ fd = fs.openSync(this[_path], this[_flags], this[_mode])
+ } catch (er) {
+ if (er.code === 'ENOENT') {
+ this[_flags] = 'w'
+ return this[_open]()
+ } else {
+ throw er
+ }
+ }
+ } else {
+ fd = fs.openSync(this[_path], this[_flags], this[_mode])
+ }
+
+ this[_onopen](null, fd)
+ }
+
+ [_close] () {
+ if (this[_autoClose] && typeof this[_fd] === 'number') {
+ const fd = this[_fd]
+ this[_fd] = null
+ fs.closeSync(fd)
+ this.emit('close')
+ }
+ }
+
+ [_write] (buf) {
+ // throw the original, but try to close if it fails
+ let threw = true
+ try {
+ this[_onwrite](null,
+ fs.writeSync(this[_fd], buf, 0, buf.length, this[_pos]))
+ threw = false
+ } finally {
+ if (threw) {
+ try {
+ this[_close]()
+ } catch {
+ // ok error
+ }
+ }
+ }
+ }
+}
+
+exports.ReadStream = ReadStream
+exports.ReadStreamSync = ReadStreamSync
+
+exports.WriteStream = WriteStream
+exports.WriteStreamSync = WriteStreamSync
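fs-minipass 3.x moves the implementation to `lib/index.js` while keeping the same four exports. A minimal usage sketch; the file names are illustrative and not part of the patch:

```js
const fsm = require('fs-minipass')

// ReadStream is a minipass stream, so it can be piped or collected
const src = new fsm.ReadStream('input.txt')

// WriteStream auto-closes its fd by default once writing finishes
const dest = new fsm.WriteStream('output.txt')

src.pipe(dest)
dest.on('close', () => console.log('copied'))
```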
diff --git a/deps/npm/node_modules/fs-minipass/package.json b/deps/npm/node_modules/fs-minipass/package.json
index 2f2436cb5c3b1a..cba0d0cbc2dd85 100644
--- a/deps/npm/node_modules/fs-minipass/package.json
+++ b/deps/npm/node_modules/fs-minipass/package.json
@@ -1,19 +1,22 @@
{
"name": "fs-minipass",
- "version": "2.1.0",
- "main": "index.js",
+ "version": "3.0.1",
+ "main": "lib/index.js",
"scripts": {
"test": "tap",
- "preversion": "npm test",
- "postversion": "npm publish",
- "postpublish": "git push origin --follow-tags"
+ "lint": "eslint \"**/*.js\"",
+ "postlint": "template-oss-check",
+ "template-oss-apply": "template-oss-apply --force",
+ "lintfix": "npm run lint -- --fix",
+ "snap": "tap",
+ "posttest": "npm run lint"
},
"keywords": [],
- "author": "Isaac Z. Schlueter (http://blog.izs.me/)",
+ "author": "GitHub Inc.",
"license": "ISC",
"repository": {
"type": "git",
- "url": "git+https://github.com/npm/fs-minipass.git"
+ "url": "https://github.com/npm/fs-minipass.git"
},
"bugs": {
"url": "https://github.com/npm/fs-minipass/issues"
@@ -21,19 +24,30 @@
"homepage": "https://github.com/npm/fs-minipass#readme",
"description": "fs read and write streams based on minipass",
"dependencies": {
- "minipass": "^3.0.0"
+ "minipass": "^4.0.0"
},
"devDependencies": {
- "mutate-fs": "^2.0.1",
- "tap": "^14.6.4"
+ "@npmcli/eslint-config": "^4.0.1",
+ "@npmcli/template-oss": "4.11.3",
+ "mutate-fs": "^2.1.1",
+ "tap": "^16.3.2"
},
"files": [
- "index.js"
+ "bin/",
+ "lib/"
],
"tap": {
- "check-coverage": true
+ "check-coverage": true,
+ "nyc-arg": [
+ "--exclude",
+ "tap-snapshots/**"
+ ]
},
"engines": {
- "node": ">= 8"
+ "node": "^14.17.0 || ^16.13.0 || >=18.0.0"
+ },
+ "templateOSS": {
+ "//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
+ "version": "4.11.3"
}
}
diff --git a/deps/npm/node_modules/glob/common.js b/deps/npm/node_modules/glob/common.js
index e094f750472f78..61a4452f097dcd 100644
--- a/deps/npm/node_modules/glob/common.js
+++ b/deps/npm/node_modules/glob/common.js
@@ -57,6 +57,12 @@ function setopts (self, pattern, options) {
pattern = "**/" + pattern
}
+ self.windowsPathsNoEscape = !!options.windowsPathsNoEscape ||
+ options.allowWindowsEscape === false
+ if (self.windowsPathsNoEscape) {
+ pattern = pattern.replace(/\\/g, '/')
+ }
+
self.silent = !!options.silent
self.pattern = pattern
self.strict = options.strict !== false
@@ -112,8 +118,6 @@ function setopts (self, pattern, options) {
// Note that they are not supported in Glob itself anyway.
options.nonegate = true
options.nocomment = true
- // always treat \ in patterns as escapes, not path separators
- options.allowWindowsEscape = true
self.minimatch = new Minimatch(pattern, options)
self.options = self.minimatch.options
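The glob 8.1.0 hunk replaces the hard-coded `allowWindowsEscape` behavior with an opt-in `windowsPathsNoEscape` option that rewrites backslashes to forward slashes before matching. A sketch of how a caller would use it (the pattern is illustrative):

```js
const glob = require('glob')

// With windowsPathsNoEscape, '\' is treated as a path separator rather
// than an escape character, so Windows-style patterns match as expected.
glob('src\\lib\\*.js', { windowsPathsNoEscape: true }, (er, files) => {
  if (er) {
    throw er
  }
  console.log(files) // same results as globbing 'src/lib/*.js'
})
```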
diff --git a/deps/npm/node_modules/glob/node_modules/minimatch/LICENSE b/deps/npm/node_modules/glob/node_modules/minimatch/LICENSE
new file mode 100644
index 00000000000000..1493534e60dce4
--- /dev/null
+++ b/deps/npm/node_modules/glob/node_modules/minimatch/LICENSE
@@ -0,0 +1,15 @@
+The ISC License
+
+Copyright (c) 2011-2023 Isaac Z. Schlueter and Contributors
+
+Permission to use, copy, modify, and/or distribute this software for any
+purpose with or without fee is hereby granted, provided that the above
+copyright notice and this permission notice appear in all copies.
+
+THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
+WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
+MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
+ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
+WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
+ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR
+IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
diff --git a/deps/npm/node_modules/minimatch/lib/path.js b/deps/npm/node_modules/glob/node_modules/minimatch/lib/path.js
similarity index 100%
rename from deps/npm/node_modules/minimatch/lib/path.js
rename to deps/npm/node_modules/glob/node_modules/minimatch/lib/path.js
diff --git a/deps/npm/node_modules/minimatch/minimatch.js b/deps/npm/node_modules/glob/node_modules/minimatch/minimatch.js
similarity index 90%
rename from deps/npm/node_modules/minimatch/minimatch.js
rename to deps/npm/node_modules/glob/node_modules/minimatch/minimatch.js
index 9e8917a46165fb..6c8bfc35181c6d 100644
--- a/deps/npm/node_modules/minimatch/minimatch.js
+++ b/deps/npm/node_modules/glob/node_modules/minimatch/minimatch.js
@@ -157,7 +157,9 @@ minimatch.match = (list, pattern, options = {}) => {
// replace stuff like \* with *
const globUnescape = s => s.replace(/\\(.)/g, '$1')
+const charUnescape = s => s.replace(/\\([^-\]])/g, '$1')
const regExpEscape = s => s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, '\\$&')
+const braExpEscape = s => s.replace(/[[\]\\]/g, '\\$&')
class Minimatch {
constructor (pattern, options) {
@@ -425,7 +427,7 @@ class Minimatch {
if (pattern === '') return ''
let re = ''
- let hasMagic = !!options.nocase
+ let hasMagic = false
let escaping = false
// ? => one single character
const patternListStack = []
@@ -438,11 +440,23 @@ class Minimatch {
let pl
let sp
// . and .. never match anything that doesn't start with .,
- // even when options.dot is set.
- const patternStart = pattern.charAt(0) === '.' ? '' // anything
- // not (start or / followed by . or .. followed by / or end)
- : options.dot ? '(?!(?:^|\\\/)\\.{1,2}(?:$|\\\/))'
- : '(?!\\.)'
+ // even when options.dot is set. However, if the pattern
+ // starts with ., then traversal patterns can match.
+ let dotTravAllowed = pattern.charAt(0) === '.'
+ let dotFileAllowed = options.dot || dotTravAllowed
+ const patternStart = () =>
+ dotTravAllowed
+ ? ''
+ : dotFileAllowed
+ ? '(?!(?:^|\\/)\\.{1,2}(?:$|\\/))'
+ : '(?!\\.)'
+ const subPatternStart = (p) =>
+ p.charAt(0) === '.'
+ ? ''
+ : options.dot
+ ? '(?!(?:^|\\/)\\.{1,2}(?:$|\\/))'
+ : '(?!\\.)'
+
const clearStateChar = () => {
if (stateChar) {
@@ -492,6 +506,11 @@ class Minimatch {
}
case '\\':
+ if (inClass && pattern.charAt(i + 1) === '-') {
+ re += c
+ continue
+ }
+
clearStateChar()
escaping = true
continue
@@ -526,7 +545,7 @@ class Minimatch {
if (options.noext) clearStateChar()
continue
- case '(':
+ case '(': {
if (inClass) {
re += '('
continue
@@ -537,46 +556,64 @@ class Minimatch {
continue
}
- patternListStack.push({
+ const plEntry = {
type: stateChar,
start: i - 1,
reStart: re.length,
open: plTypes[stateChar].open,
- close: plTypes[stateChar].close
- })
- // negation is (?:(?!js)[^/]*)
- re += stateChar === '!' ? '(?:(?!(?:' : '(?:'
+ close: plTypes[stateChar].close,
+ }
+ this.debug(this.pattern, '\t', plEntry)
+ patternListStack.push(plEntry)
+ // negation is (?:(?!(?:js)(?:))[^/]*)
+ re += plEntry.open
+ // next entry starts with a dot maybe?
+ if (plEntry.start === 0 && plEntry.type !== '!') {
+ dotTravAllowed = true
+ re += subPatternStart(pattern.slice(i + 1))
+ }
this.debug('plType %j %j', stateChar, re)
stateChar = false
- continue
+ continue
+ }
- case ')':
- if (inClass || !patternListStack.length) {
+ case ')': {
+ const plEntry = patternListStack[patternListStack.length - 1]
+ if (inClass || !plEntry) {
re += '\\)'
continue
}
+ patternListStack.pop()
+ // closing an extglob
clearStateChar()
hasMagic = true
- pl = patternListStack.pop()
+ pl = plEntry
// negation is (?:(?!js)[^/]*)
// The others are (?:)
re += pl.close
if (pl.type === '!') {
- negativeLists.push(pl)
+ negativeLists.push(Object.assign(pl, { reEnd: re.length }))
}
- pl.reEnd = re.length
- continue
+ continue
+ }
- case '|':
- if (inClass || !patternListStack.length) {
+ case '|': {
+ const plEntry = patternListStack[patternListStack.length - 1]
+ if (inClass || !plEntry) {
re += '\\|'
continue
}
clearStateChar()
re += '|'
- continue
+ // next subpattern can start with a dot?
+ if (plEntry.start === 0 && plEntry.type !== '!') {
+ dotTravAllowed = true
+ re += subPatternStart(pattern.slice(i + 1))
+ }
+ continue
+ }
// these are mostly the same in regexp and glob
case '[':
@@ -604,8 +641,6 @@ class Minimatch {
continue
}
- // handle the case where we left a class open.
- // "[z-a]" is valid, equivalent to "\[z-a\]"
// split where the last [ was, make sure we don't have
// an invalid re. if so, re-walk the contents of the
// would-be class to re-translate any characters that
@@ -615,20 +650,16 @@ class Minimatch {
// to do safely. For now, this is safe and works.
cs = pattern.substring(classStart + 1, i)
try {
- RegExp('[' + cs + ']')
+ RegExp('[' + braExpEscape(charUnescape(cs)) + ']')
+ // looks good, finish up the class.
+ re += c
} catch (er) {
- // not a valid class!
- sp = this.parse(cs, SUBPARSE)
- re = re.substring(0, reClassStart) + '\\[' + sp[0] + '\\]'
- hasMagic = hasMagic || sp[1]
- inClass = false
- continue
+ // out of order ranges in JS are errors, but in glob syntax,
+ // they're just a range that matches nothing.
+ re = re.substring(0, reClassStart) + '(?:$.)' // match nothing ever
}
-
- // finish up the class.
hasMagic = true
inClass = false
- re += c
continue
default:
@@ -721,14 +752,16 @@ class Minimatch {
// Handle nested stuff like *(*.js|!(*.json)), where open parens
// mean that we should *not* include the ) in the bit that is considered
// "after" the negated section.
- const openParensBefore = nlBefore.split('(').length - 1
+ const closeParensBefore = nlBefore.split(')').length
+ const openParensBefore = nlBefore.split('(').length - closeParensBefore
let cleanAfter = nlAfter
for (let i = 0; i < openParensBefore; i++) {
cleanAfter = cleanAfter.replace(/\)[+*?]?/, '')
}
nlAfter = cleanAfter
- const dollar = nlAfter === '' && isSub !== SUBPARSE ? '$' : ''
+ const dollar = nlAfter === '' && isSub !== SUBPARSE ? '(?:$|\\/)' : ''
+
re = nlBefore + nlFirst + nlAfter + dollar + nlLast
}
@@ -740,7 +773,7 @@ class Minimatch {
}
if (addPatternStart) {
- re = patternStart + re
+ re = patternStart() + re
}
// parsing just a piece of a larger pattern.
@@ -748,6 +781,11 @@ class Minimatch {
return [re, hasMagic]
}
+ // if it's nocase, and the lcase/uppercase don't match, it's magic
+ if (options.nocase && !hasMagic) {
+ hasMagic = pattern.toUpperCase() !== pattern.toLowerCase()
+ }
+
// skip the regexp for non-magical patterns
// unescape anything in it, though, so that it'll be
// an exact match against a file etc.
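The vendored minimatch copy (pinned to the legacy 5.x line for glob) picks up the reworked character-class handling: escaped dashes inside a class are preserved, and an out-of-order range such as `[z-a]` now compiles to a never-matching group instead of falling back to a literal string. A rough illustration, assuming 5.1.x semantics:

```js
const minimatch = require('minimatch')

console.log(minimatch('x', '[z-a]'))      // false: the range matches nothing
console.log(minimatch('[z-a]', '[z-a]'))  // false: no longer re-parsed as a literal
console.log(minimatch('a-c', 'a[\\-]c'))  // true: an escaped '-' inside a class survives
```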
diff --git a/deps/npm/node_modules/glob/node_modules/minimatch/package.json b/deps/npm/node_modules/glob/node_modules/minimatch/package.json
new file mode 100644
index 00000000000000..c8809dbb3119d9
--- /dev/null
+++ b/deps/npm/node_modules/glob/node_modules/minimatch/package.json
@@ -0,0 +1,35 @@
+{
+ "author": "Isaac Z. Schlueter (http://blog.izs.me)",
+ "name": "minimatch",
+ "description": "a glob matcher in javascript",
+ "publishConfig": {
+ "tag": "legacy-v5"
+ },
+ "version": "5.1.6",
+ "repository": {
+ "type": "git",
+ "url": "git://github.com/isaacs/minimatch.git"
+ },
+ "main": "minimatch.js",
+ "scripts": {
+ "test": "tap",
+ "snap": "tap",
+ "preversion": "npm test",
+ "postversion": "npm publish",
+ "prepublishOnly": "git push origin --follow-tags"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "dependencies": {
+ "brace-expansion": "^2.0.1"
+ },
+ "devDependencies": {
+ "tap": "^16.3.2"
+ },
+ "license": "ISC",
+ "files": [
+ "minimatch.js",
+ "lib"
+ ]
+}
diff --git a/deps/npm/node_modules/glob/package.json b/deps/npm/node_modules/glob/package.json
index 5134253e32226f..ca0fd916211b51 100644
--- a/deps/npm/node_modules/glob/package.json
+++ b/deps/npm/node_modules/glob/package.json
@@ -2,7 +2,7 @@
"author": "Isaac Z. Schlueter (http://blog.izs.me/)",
"name": "glob",
"description": "a little globber",
- "version": "8.0.3",
+ "version": "8.1.0",
"repository": {
"type": "git",
"url": "git://github.com/isaacs/node-glob.git"
diff --git a/deps/npm/node_modules/http-cache-semantics/index.js b/deps/npm/node_modules/http-cache-semantics/index.js
index 4f6c2f30498b45..31fba4860024cd 100644
--- a/deps/npm/node_modules/http-cache-semantics/index.js
+++ b/deps/npm/node_modules/http-cache-semantics/index.js
@@ -7,6 +7,7 @@ const statusCodeCacheableByDefault = new Set([
206,
300,
301,
+ 308,
404,
405,
410,
@@ -79,10 +80,10 @@ function parseCacheControl(header) {
// TODO: When there is more than one value present for a given directive (e.g., two Expires header fields, multiple Cache-Control: max-age directives),
// the directive's value is considered invalid. Caches are encouraged to consider responses that have invalid freshness information to be stale
- const parts = header.trim().split(/\s*,\s*/); // TODO: lame parsing
+ const parts = header.trim().split(/,/);
for (const part of parts) {
- const [k, v] = part.split(/\s*=\s*/, 2);
- cc[k] = v === undefined ? true : v.replace(/^"|"$/g, ''); // TODO: lame unquoting
+ const [k, v] = part.split(/=/, 2);
+ cc[k.trim()] = v === undefined ? true : v.trim().replace(/^"|"$/g, '');
}
return cc;
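http-cache-semantics 4.1.1 also tightens `Cache-Control` parsing: directives are split on bare commas and trimmed afterwards, rather than matched with the unbounded `\s*,\s*` and `\s*=\s*` patterns (and 308 joins the status codes that are cacheable by default). A stand-alone sketch that mirrors the patched parsing, not an import of the package:

```js
function parseCacheControl (header) {
  const cc = {};
  const parts = header.trim().split(/,/);
  for (const part of parts) {
    const [k, v] = part.split(/=/, 2);
    cc[k.trim()] = v === undefined ? true : v.trim().replace(/^"|"$/g, '');
  }
  return cc;
}

console.log(parseCacheControl('max-age=300, must-revalidate'));
// { 'max-age': '300', 'must-revalidate': true }
```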
diff --git a/deps/npm/node_modules/http-cache-semantics/package.json b/deps/npm/node_modules/http-cache-semantics/package.json
index 897798d8ccc79c..defbb045a63832 100644
--- a/deps/npm/node_modules/http-cache-semantics/package.json
+++ b/deps/npm/node_modules/http-cache-semantics/package.json
@@ -1,6 +1,6 @@
{
"name": "http-cache-semantics",
- "version": "4.1.0",
+ "version": "4.1.1",
"description": "Parses Cache-Control and other headers. Helps building correct HTTP caches and proxies",
"repository": "https://github.com/kornelski/http-cache-semantics.git",
"main": "index.js",
@@ -13,12 +13,6 @@
"author": "Kornel Lesiński (https://kornel.ski/)",
"license": "BSD-2-Clause",
"devDependencies": {
- "eslint": "^5.13.0",
- "eslint-plugin-prettier": "^3.0.1",
- "husky": "^0.14.3",
- "lint-staged": "^8.1.3",
- "mocha": "^5.1.0",
- "prettier": "^1.14.3",
- "prettier-eslint-cli": "^4.7.1"
+ "mocha": "^10.0"
}
}
diff --git a/deps/npm/node_modules/ignore-walk/package.json b/deps/npm/node_modules/ignore-walk/package.json
index 3d977ed4b84f66..97a2854857939b 100644
--- a/deps/npm/node_modules/ignore-walk/package.json
+++ b/deps/npm/node_modules/ignore-walk/package.json
@@ -1,11 +1,11 @@
{
"name": "ignore-walk",
- "version": "6.0.0",
+ "version": "6.0.1",
"description": "Nested/recursive `.gitignore`/`.npmignore` parsing and filtering.",
"main": "lib/index.js",
"devDependencies": {
- "@npmcli/eslint-config": "^3.0.1",
- "@npmcli/template-oss": "4.5.1",
+ "@npmcli/eslint-config": "^4.0.0",
+ "@npmcli/template-oss": "4.11.3",
"mkdirp": "^1.0.4",
"mutate-fs": "^2.1.1",
"rimraf": "^3.0.2",
@@ -21,6 +21,7 @@
"postsnap": "npm run lintfix --",
"postlint": "template-oss-check",
"template-oss-apply": "template-oss-apply --force",
+ "test:windows-coverage": "npm pkg set tap.statements=99 --json && npm pkg set tap.branches=98 --json && npm pkg set tap.lines=99 --json",
"snap": "tap"
},
"keywords": [
@@ -42,12 +43,13 @@
"lib/"
],
"dependencies": {
- "minimatch": "^5.0.1"
+ "minimatch": "^6.1.6"
},
"tap": {
"test-env": "LC_ALL=sk",
"before": "test/00-setup.js",
"after": "test/zz-cleanup.js",
+ "timeout": 600,
"jobs": 1,
"nyc-arg": [
"--exclude",
@@ -59,7 +61,7 @@
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.5.1",
- "windowsCI": false
+ "version": "4.11.3",
+ "content": "scripts/template-oss"
}
}
diff --git a/deps/npm/node_modules/init-package-json/lib/default-input.js b/deps/npm/node_modules/init-package-json/lib/default-input.js
index fe5abfdd85e45d..490e83c139887f 100644
--- a/deps/npm/node_modules/init-package-json/lib/default-input.js
+++ b/deps/npm/node_modules/init-package-json/lib/default-input.js
@@ -1,190 +1,160 @@
-/* eslint-disable no-undef */
-var fs = require('fs')
-var path = require('path')
-var validateLicense = require('validate-npm-package-license')
-var validateName = require('validate-npm-package-name')
-var npa = require('npm-package-arg')
-var semver = require('semver')
+/* globals config, dirname, package, basename, yes, prompt */
+
+const fs = require('fs/promises')
+const path = require('path')
+const validateLicense = require('validate-npm-package-license')
+const validateName = require('validate-npm-package-name')
+const npa = require('npm-package-arg')
+const semver = require('semver')
// more popular packages should go here, maybe?
-function isTestPkg (p) {
- return !!p.match(/^(expresso|mocha|tap|coffee-script|coco|streamline)$/)
-}
+const isTestPkg = (p) => !!p.match(/^(expresso|mocha|tap|coffee-script|coco|streamline)$/)
-function niceName (n) {
- return n.replace(/^node-|[.-]js$/g, '').replace(/\s+/g, ' ').replace(/ /g, '-').toLowerCase()
-}
+const invalid = (msg) => Object.assign(new Error(msg), { notValid: true })
-function readDeps (test, excluded) {
- return function (cb) {
- fs.readdir('node_modules', function (readdirErr, dir) {
- if (readdirErr) {
- return cb()
- }
- var deps = {}
- var n = dir.length
- if (n === 0) {
- return cb(null, deps)
- }
- dir.forEach(function (d) {
- if (d.match(/^\./)) {
- return next()
- }
- if (test !== isTestPkg(d) || excluded[d]) {
- return next()
- }
+const readDeps = (test, excluded) => async () => {
+ const dirs = await fs.readdir('node_modules').catch(() => null)
- var dp = path.join(dirname, 'node_modules', d, 'package.json')
- fs.readFile(dp, 'utf8', function (readFileErr, p) {
- if (readFileErr) {
- return next()
- }
- try {
- p = JSON.parse(p)
- } catch (e) {
- return next()
- }
- if (!p.version) {
- return next()
- }
- if (p._requiredBy) {
- if (!p._requiredBy.some(function (req) {
- return req === '#USER'
- })) {
- return next()
- }
- }
- deps[d] = config.get('save-exact') ? p.version : config.get('save-prefix') + p.version
- return next()
- })
- })
- function next () {
- if (--n === 0) {
- return cb(null, deps)
- }
- }
- })
+ if (!dirs) {
+ return
}
+
+ const deps = {}
+ for (const dir of dirs) {
+ if (dir.match(/^\./) || test !== isTestPkg(dir) || excluded[dir]) {
+ continue
+ }
+
+ const dp = path.join(dirname, 'node_modules', dir, 'package.json')
+ const p = await fs.readFile(dp, 'utf8').then((d) => JSON.parse(d)).catch(() => null)
+
+ if (!p || !p.version || p?._requiredBy?.some((r) => r === '#USER')) {
+ continue
+ }
+
+ deps[dir] = config.get('save-exact') ? p.version : config.get('save-prefix') + p.version
+ }
+
+ return deps
}
-var name = niceName(package.name || basename)
-var spec
-try {
- spec = npa(name)
-} catch (e) {
- spec = {}
+const getConfig = (key) => {
+ // dots take precedence over dashes
+ const def = config?.defaults?.[`init.${key}`]
+ const val = config.get(`init.${key}`)
+ return (val !== def && val) ? val : config.get(`init-${key.replace(/\./g, '-')}`)
}
-var scope = config.get('scope')
-if (scope) {
- if (scope.charAt(0) !== '@') {
- scope = '@' + scope
+
+const getName = () => {
+ const rawName = package.name || basename
+ let name = rawName
+ .replace(/^node-|[.-]js$/g, '')
+ .replace(/\s+/g, ' ')
+ .replace(/ /g, '-')
+ .toLowerCase()
+
+ let spec
+ try {
+ spec = npa(name)
+ } catch {
+ spec = {}
}
- if (spec.scope) {
- name = scope + '/' + spec.name.split('/')[1]
- } else {
- name = scope + '/' + name
+
+ let scope = config.get('scope')
+
+ if (scope) {
+ if (scope.charAt(0) !== '@') {
+ scope = '@' + scope
+ }
+ if (spec.scope) {
+ name = scope + '/' + spec.name.split('/')[1]
+ } else {
+ name = scope + '/' + name
+ }
}
+
+ return name
}
-exports.name = yes ? name : prompt('package name', name, function (data) {
- var its = validateName(data)
+
+const name = getName()
+exports.name = yes ? name : prompt('package name', name, (data) => {
+ const its = validateName(data)
if (its.validForNewPackages) {
return data
}
- var errors = (its.errors || []).concat(its.warnings || [])
- var er = new Error('Sorry, ' + errors.join(' and ') + '.')
- er.notValid = true
- return er
+ const errors = (its.errors || []).concat(its.warnings || [])
+ return invalid(`Sorry, ${errors.join(' and ')}.`)
})
-const defaultDottedInitVersion = config &&
- config.defaults &&
- config.defaults['init.version']
-const dottedInitVersion =
- config.get('init.version') !== defaultDottedInitVersion &&
- config.get('init.version')
-var version = package.version ||
- dottedInitVersion ||
- config.get('init-version') ||
- '1.0.0'
-exports.version = yes ?
- version :
- prompt('version', version, function (promptedVersion) {
- if (semver.valid(promptedVersion)) {
- return promptedVersion
- }
- var er = new Error('Invalid version: "' + promptedVersion + '"')
- er.notValid = true
- return er
- })
+const version = package.version || getConfig('version') || '1.0.0'
+exports.version = yes ? version : prompt('version', version, (v) => {
+ if (semver.valid(v)) {
+ return v
+ }
+ return invalid(`Invalid version: "${v}"`)
+})
if (!package.description) {
exports.description = yes ? '' : prompt('description')
}
if (!package.main) {
- exports.main = function (cb) {
- fs.readdir(dirname, function (er, f) {
- if (er) {
- f = []
- }
+ exports.main = async () => {
+ const files = await fs.readdir(dirname)
+ .then(list => list.filter((f) => f.match(/\.js$/)))
+ .catch(() => [])
- f = f.filter(function (filtered) {
- return filtered.match(/\.js$/)
- })
-
- if (f.indexOf('index.js') !== -1) {
- f = 'index.js'
- } else if (f.indexOf('main.js') !== -1) {
- f = 'main.js'
- } else if (f.indexOf(basename + '.js') !== -1) {
- f = basename + '.js'
- } else {
- f = f[0]
- }
+ let index
+ if (files.includes('index.js')) {
+ index = 'index.js'
+ } else if (files.includes('main.js')) {
+ index = 'main.js'
+ } else if (files.includes(basename + '.js')) {
+ index = basename + '.js'
+ } else {
+ index = files[0] || 'index.js'
+ }
- var index = f || 'index.js'
- return cb(null, yes ? index : prompt('entry point', index))
- })
+ return yes ? index : prompt('entry point', index)
}
}
if (!package.bin) {
- exports.bin = function (cb) {
- fs.readdir(path.resolve(dirname, 'bin'), function (er, d) {
- // no bins
- if (er) {
- return cb()
- }
+ exports.bin = async () => {
+ try {
+ const d = await fs.readdir(path.resolve(dirname, 'bin'))
// just take the first js file we find there, or nada
let r = d.find(f => f.match(/\.js$/))
if (r) {
r = `bin/${r}`
}
- return cb(null, r)
- })
+ return r
+ } catch {
+ // no bins
+ }
}
}
-exports.directories = function (cb) {
- fs.readdir(dirname, function (er, dirs) {
- if (er) {
- return cb(er)
- }
- var res = {}
- dirs.forEach(function (d) {
- switch (d) {
- case 'example': case 'examples': return res.example = d
- case 'test': case 'tests': return res.test = d
- case 'doc': case 'docs': return res.doc = d
- case 'man': return res.man = d
- case 'lib': return res.lib = d
- }
- })
- if (Object.keys(res).length === 0) {
- res = undefined
+exports.directories = async () => {
+ const dirs = await fs.readdir(dirname)
+
+ const res = dirs.reduce((acc, d) => {
+ if (/^examples?$/.test(d)) {
+ acc.example = d
+ } else if (/^tests?$/.test(d)) {
+ acc.test = d
+ } else if (/^docs?$/.test(d)) {
+ acc.doc = d
+ } else if (d === 'man') {
+ acc.man = d
+ } else if (d === 'lib') {
+ acc.lib = d
}
- return cb(null, res)
- })
+
+ return acc
+ }, {})
+
+ return Object.keys(res).length === 0 ? undefined : res
}
if (!package.dependencies) {
@@ -196,116 +166,97 @@ if (!package.devDependencies) {
}
// MUST have a test script!
-var s = package.scripts || {}
-var notest = 'echo "Error: no test specified" && exit 1'
if (!package.scripts) {
- exports.scripts = function (cb) {
- fs.readdir(path.join(dirname, 'node_modules'), function (er, d) {
- setupScripts(d || [], cb)
- })
- }
-}
-function setupScripts (d, cb) {
- // check to see what framework is in use, if any
- function tx (test) {
- return test || notest
- }
- if (!s.test || s.test === notest) {
- var commands = {
- tap: 'tap test/*.js',
- expresso: 'expresso test',
- mocha: 'mocha',
- }
- var command
- Object.keys(commands).forEach(function (k) {
- if (d.indexOf(k) !== -1) {
- command = commands[k]
+ const scripts = package.scripts || {}
+ const notest = 'echo "Error: no test specified" && exit 1'
+ exports.scripts = async () => {
+ const d = await fs.readdir(path.join(dirname, 'node_modules')).catch(() => [])
+
+ // check to see what framework is in use, if any
+ let command
+ if (!scripts.test || scripts.test === notest) {
+ const commands = {
+ tap: 'tap test/*.js',
+ expresso: 'expresso test',
+ mocha: 'mocha',
+ }
+ for (const [k, v] of Object.entries(commands)) {
+ if (d.includes(k)) {
+ command = v
+ }
}
- })
- var ps = 'test command'
- if (yes) {
- s.test = command || notest
- } else {
- s.test = command ? prompt(ps, command, tx) : prompt(ps, tx)
}
+
+ const promptArgs = ['test command', (t) => t || notest]
+ if (command) {
+ promptArgs.splice(1, 0, command)
+ }
+ scripts.test = yes ? command || notest : prompt(...promptArgs)
+
+ return scripts
}
- return cb(null, s)
}
if (!package.repository) {
- exports.repository = function (cb) {
- fs.readFile('.git/config', 'utf8', function (er, gconf) {
- if (er || !gconf) {
- return cb(null, yes ? '' : prompt('git repository'))
- }
- gconf = gconf.split(/\r?\n/)
- var i = gconf.indexOf('[remote "origin"]')
- if (i !== -1) {
- var u = gconf[i + 1]
- if (!u.match(/^\s*url =/)) {
- u = gconf[i + 2]
- }
- if (!u.match(/^\s*url =/)) {
- u = null
- } else {
- u = u.replace(/^\s*url = /, '')
- }
+ exports.repository = async () => {
+ const gconf = await fs.readFile('.git/config', 'utf8').catch(() => '')
+ const lines = gconf.split(/\r?\n/)
+
+ let url
+ const i = lines.indexOf('[remote "origin"]')
+
+ if (i !== -1) {
+ url = gconf[i + 1]
+ if (!url.match(/^\s*url =/)) {
+ url = gconf[i + 2]
}
- if (u && u.match(/^git@github.com:/)) {
- u = u.replace(/^git@github.com:/, 'https://github.com/')
+ if (!url.match(/^\s*url =/)) {
+ url = null
+ } else {
+ url = url.replace(/^\s*url = /, '')
}
+ }
+
+ if (url && url.match(/^git@github.com:/)) {
+ url = url.replace(/^git@github.com:/, 'https://github.com/')
+ }
- return cb(null, yes ? u : prompt('git repository', u))
- })
+ return yes ? url || '' : prompt('git repository', url || undefined)
}
}
if (!package.keywords) {
- exports.keywords = yes ? '' : prompt('keywords', function (promptedKeywords) {
- if (!promptedKeywords) {
- return undefined
+ exports.keywords = yes ? '' : prompt('keywords', (data) => {
+ if (!data) {
+ return
}
- if (Array.isArray(promptedKeywords)) {
- promptedKeywords = promptedKeywords.join(' ')
+ if (Array.isArray(data)) {
+ data = data.join(' ')
}
- if (typeof promptedKeywords !== 'string') {
- return promptedKeywords
+ if (typeof data !== 'string') {
+ return data
}
- return promptedKeywords.split(/[\s,]+/)
+ return data.split(/[\s,]+/)
})
}
if (!package.author) {
- exports.author = config.get('init.author.name') ||
- config.get('init-author-name')
+ const authorName = getConfig('author.name')
+ exports.author = authorName
? {
- name: config.get('init.author.name') ||
- config.get('init-author-name'),
- email: config.get('init.author.email') ||
- config.get('init-author-email'),
- url: config.get('init.author.url') ||
- config.get('init-author-url'),
+ name: authorName,
+ email: getConfig('author.email'),
+ url: getConfig('author.url'),
}
: yes ? '' : prompt('author')
}
-const defaultDottedInitLicense = config &&
- config.defaults &&
- config.defaults['init.license']
-const dottedInitLicense =
- config.get('init.license') !== defaultDottedInitLicense &&
- config.get('init.license')
-var license = package.license ||
- dottedInitLicense ||
- config.get('init-license') ||
- 'ISC'
-exports.license = yes ? license : prompt('license', license, function (data) {
- var its = validateLicense(data)
+const license = package.license || getConfig('license') || 'ISC'
+exports.license = yes ? license : prompt('license', license, (data) => {
+ const its = validateLicense(data)
if (its.validForNewPackages) {
return data
}
- var errors = (its.errors || []).concat(its.warnings || [])
- var er = new Error('Sorry, ' + errors.join(' and ') + '.')
- er.notValid = true
- return er
+ const errors = (its.errors || []).concat(its.warnings || [])
+ return invalid(`Sorry, ${errors.join(' and ')}.`)
})
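One detail of the default-input rewrite worth calling out is the new `getConfig` helper, which uses a dotted `init.*` key only when it has been changed from its default and otherwise falls back to the legacy dashed `init-*` spelling. A small sketch of that resolution order, with a stand-in config object rather than npm's real config:

```js
const getConfig = (config) => (key) => {
  const def = config?.defaults?.[`init.${key}`]
  const val = config.get(`init.${key}`)
  return (val !== def && val) ? val : config.get(`init-${key.replace(/\./g, '-')}`)
}

const fakeConfig = {
  defaults: { 'init.license': 'ISC' },
  data: { 'init.license': 'ISC', 'init-license': 'MIT' },
  get (k) { return this.data[k] },
}

// the dotted key still holds its default value, so the dashed value wins
console.log(getConfig(fakeConfig)('license')) // 'MIT'
```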
diff --git a/deps/npm/node_modules/init-package-json/lib/init-package-json.js b/deps/npm/node_modules/init-package-json/lib/init-package-json.js
index 230bcd81747bdc..077ebd96ffc529 100644
--- a/deps/npm/node_modules/init-package-json/lib/init-package-json.js
+++ b/deps/npm/node_modules/init-package-json/lib/init-package-json.js
@@ -1,184 +1,145 @@
-module.exports = init
-module.exports.yes = yes
-
-var PZ = require('promzard').PromZard
-var path = require('path')
-var def = require.resolve('./default-input.js')
+const promzard = require('promzard')
+const path = require('path')
+const fs = require('fs/promises')
+const semver = require('semver')
+const read = require('read')
+const util = require('util')
+const rpj = require('read-package-json')
-var fs = require('fs')
-var semver = require('semver')
-var read = require('read')
+const def = require.resolve('./default-input.js')
// to validate the data object at the end as a worthwhile package
// and assign default values for things.
-var readJson = require('read-package-json')
+const _extraSet = rpj.extraSet
+const _rpj = util.promisify(rpj)
+const _rpjExtras = util.promisify(rpj.extras)
+const readPkgJson = async (file, pkg) => {
+ // only do a few of these. no need for mans or contributors if they're in the files
+ rpj.extraSet = _extraSet.filter(f => f.name !== 'authors' && f.name !== 'mans')
+ const p = pkg ? _rpjExtras(file, pkg) : _rpj(file)
+ return p.catch(() => ({})).finally(() => rpj.extraSet = _extraSet)
+}
+
+const isYes = (c) => !!(c.get('yes') || c.get('y') || c.get('force') || c.get('f'))
+
+const getConfig = (c = {}) => {
+ // accept either a plain-jane object, or a config object with a "get" method.
+ if (typeof c.get !== 'function') {
+ const data = c
+ return {
+ get: (k) => data[k],
+ toJSON: () => data,
+ }
+ }
+ return c
+}
-function yes (conf) {
- return !!(
- conf.get('yes') || conf.get('y') ||
- conf.get('force') || conf.get('f')
- )
+const stringifyPerson = (p) => {
+ if (typeof p === 'string') {
+ return p
+ }
+ const { name = '', url, web, email, mail } = p
+ const u = url || web
+ const e = email || mail
+ return `${name}${e ? ` <${e}>` : ''}${u ? ` (${u})` : ''}`
}
-function init (dir, input, config, cb) {
- if (typeof config === 'function') {
- cb = config
- config = {}
+async function init (dir, input = def, c = {}) {
+ const config = getConfig(c)
+ const yes = isYes(config)
+ const packageFile = path.resolve(dir, 'package.json')
+
+ const pkg = await readPkgJson(packageFile)
+
+ if (!semver.valid(pkg.version)) {
+ delete pkg.version
}
- // accept either a plain-jane object, or a config object
- // with a "get" method.
- if (typeof config.get !== 'function') {
- var data = config
- config = {
- get: function (k) {
- return data[k]
- },
- toJSON: function () {
- return data
- },
+ // make sure that the input is valid. if not, use the default
+ const pzData = await promzard(path.resolve(input), {
+ yes,
+ config,
+ filename: packageFile,
+ dirname: path.dirname(packageFile),
+ basename: path.basename(path.dirname(packageFile)),
+ package: pkg,
+ }, { backupFile: def })
+
+ for (const [k, v] of Object.entries(pzData)) {
+ if (v != null) {
+ pkg[k] = v
}
}
- var packageFile = path.resolve(dir, 'package.json')
- input = path.resolve(input)
- var pkg
- var ctx = { yes: yes(config) }
-
- var es = readJson.extraSet
- readJson.extraSet = es.filter(function (fn) {
- return fn.name !== 'authors' && fn.name !== 'mans'
- })
- readJson(packageFile, function (er, d) {
- readJson.extraSet = es
-
- if (er) {
- pkg = {}
- } else {
- pkg = d
- }
+ const pkgExtras = await readPkgJson(packageFile, pkg)
+
+ // turn the objects into somewhat more humane strings.
+ if (pkgExtras.author) {
+ pkgExtras.author = stringifyPerson(pkgExtras.author)
+ }
- ctx.filename = packageFile
- ctx.dirname = path.dirname(packageFile)
- ctx.basename = path.basename(ctx.dirname)
- if (!pkg.version || !semver.valid(pkg.version)) {
- delete pkg.version
+ for (const set of ['maintainers', 'contributors']) {
+ if (Array.isArray(pkgExtras[set])) {
+ pkgExtras[set] = pkgExtras[set].map(stringifyPerson)
}
+ }
- ctx.package = pkg
- ctx.config = config || {}
-
- // make sure that the input is valid.
- // if not, use the default
- var pz = new PZ(input, ctx)
- pz.backupFile = def
- pz.on('error', cb)
- pz.on('data', function (pzData) {
- Object.keys(pzData).forEach(function (k) {
- if (pzData[k] !== undefined && pzData[k] !== null) {
- pkg[k] = pzData[k]
- }
- })
-
- // only do a few of these.
- // no need for mans or contributors if they're in the files
- es = readJson.extraSet
- readJson.extraSet = es.filter(function (fn) {
- return fn.name !== 'authors' && fn.name !== 'mans'
- })
- readJson.extras(packageFile, pkg, function (extrasErr, pkgWithExtras) {
- if (extrasErr) {
- return cb(extrasErr, pkgWithExtras)
- }
- readJson.extraSet = es
- pkgWithExtras = unParsePeople(pkgWithExtras)
- // no need for the readme now.
- delete pkgWithExtras.readme
- delete pkgWithExtras.readmeFilename
-
- // really don't want to have this lying around in the file
- delete pkgWithExtras._id
-
- // ditto
- delete pkgWithExtras.gitHead
-
- // if the repo is empty, remove it.
- if (!pkgWithExtras.repository) {
- delete pkgWithExtras.repository
- }
-
- // readJson filters out empty descriptions, but init-package-json
- // traditionally leaves them alone
- if (!pkgWithExtras.description) {
- pkgWithExtras.description = pzData.description
- }
-
- var stringified = JSON.stringify(updateDeps(pkgWithExtras), null, 2) + '\n'
- function write (writeYes) {
- fs.writeFile(packageFile, stringified, 'utf8', function (writeFileErr) {
- if (!writeFileErr && writeYes && !config.get('silent')) {
- console.log('Wrote to %s:\n\n%s\n', packageFile, stringified)
- }
- return cb(writeFileErr, pkgWithExtras)
- })
- }
- if (ctx.yes) {
- return write(true)
- }
- console.log('About to write to %s:\n\n%s\n', packageFile, stringified)
- read({ prompt: 'Is this OK? ', default: 'yes' }, function (promptErr, ok) {
- if (promptErr) {
- return cb(promptErr)
- }
- if (!ok || ok.toLowerCase().charAt(0) !== 'y') {
- console.log('Aborted.')
- } else {
- return write()
- }
- })
- })
- })
- })
-}
+ // no need for the readme now.
+ delete pkgExtras.readme
+ delete pkgExtras.readmeFilename
+
+ // really don't want to have this lying around in the file
+ delete pkgExtras._id
+
+ // ditto
+ delete pkgExtras.gitHead
+
+ // if the repo is empty, remove it.
+ if (!pkgExtras.repository) {
+ delete pkgExtras.repository
+ }
+
+ // readJson filters out empty descriptions, but init-package-json
+ // traditionally leaves them alone
+ if (!pkgExtras.description) {
+ pkgExtras.description = pzData.description
+ }
-function updateDeps (depsData) {
// optionalDependencies don't need to be repeated in two places
- if (depsData.dependencies) {
- if (depsData.optionalDependencies) {
- for (const name of Object.keys(depsData.optionalDependencies)) {
- delete depsData.dependencies[name]
+ if (pkgExtras.dependencies) {
+ if (pkgExtras.optionalDependencies) {
+ for (const name of Object.keys(pkgExtras.optionalDependencies)) {
+ delete pkgExtras.dependencies[name]
}
}
- if (Object.keys(depsData.dependencies).length === 0) {
- delete depsData.dependencies
+ if (Object.keys(pkgExtras.dependencies).length === 0) {
+ delete pkgExtras.dependencies
}
}
- return depsData
-}
+ const stringified = JSON.stringify(pkgExtras, null, 2) + '\n'
+ const msg = util.format('%s:\n\n%s\n', packageFile, stringified)
+ const write = () => fs.writeFile(packageFile, stringified, 'utf8')
-// turn the objects into somewhat more humane strings.
-function unParsePeople (data) {
- if (data.author) {
- data.author = unParsePerson(data.author)
- }['maintainers', 'contributors'].forEach(function (set) {
- if (!Array.isArray(data[set])) {
- return
+ if (yes) {
+ await write()
+ if (!config.get('silent')) {
+ console.log(`Wrote to ${msg}`)
}
- data[set] = data[set].map(unParsePerson)
- })
- return data
-}
+ return pkgExtras
+ }
-function unParsePerson (person) {
- if (typeof person === 'string') {
- return person
+ console.log(`About to write to ${msg}`)
+ const ok = await read({ prompt: 'Is this OK? ', default: 'yes' })
+ if (!ok || !ok.toLowerCase().startsWith('y')) {
+ console.log('Aborted.')
+ return
}
- var name = person.name || ''
- var u = person.url || person.web
- var url = u ? (' (' + u + ')') : ''
- var e = person.email || person.mail
- var email = e ? (' <' + e + '>') : ''
- return name + email + url
+
+ await write()
+ return pkgExtras
}
+
+module.exports = init
+module.exports.yes = isYes
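With the callback plumbing gone, init-package-json 5.0.0 is consumed as a promise. A minimal usage sketch; the directory and options shown are illustrative:

```js
const init = require('init-package-json')

// the second argument is the promzard input file; leaving it undefined
// falls back to the bundled default-input.js
init(process.cwd(), undefined, { yes: true })
  .then((pkg) => console.log('wrote package.json for', pkg.name))
  .catch((err) => console.error(err))
```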
diff --git a/deps/npm/node_modules/init-package-json/package.json b/deps/npm/node_modules/init-package-json/package.json
index 6641323e9f9c04..e2cb1fe25ebba7 100644
--- a/deps/npm/node_modules/init-package-json/package.json
+++ b/deps/npm/node_modules/init-package-json/package.json
@@ -1,6 +1,6 @@
{
"name": "init-package-json",
- "version": "4.0.1",
+ "version": "5.0.0",
"main": "lib/init-package-json.js",
"scripts": {
"test": "tap",
@@ -20,8 +20,8 @@
"description": "A node module to get your node module started",
"dependencies": {
"npm-package-arg": "^10.0.0",
- "promzard": "^0.3.0",
- "read": "^1.0.7",
+ "promzard": "^1.0.0",
+ "read": "^2.0.0",
"read-package-json": "^6.0.0",
"semver": "^7.3.5",
"validate-npm-package-license": "^3.0.4",
@@ -30,16 +30,18 @@
"devDependencies": {
"@npmcli/config": "^6.0.0",
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.6.1",
+ "@npmcli/template-oss": "4.11.3",
"tap": "^16.0.1"
},
"engines": {
"node": "^14.17.0 || ^16.13.0 || >=18.0.0"
},
"tap": {
- "statements": "94",
- "branches": "83",
- "lines": "94",
+ "statements": 95,
+ "branches": 78,
+ "lines": 94,
+ "jobs": 1,
+ "test-ignore": "fixtures/",
"nyc-arg": [
"--exclude",
"tap-snapshots/**"
@@ -61,6 +63,6 @@
],
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.6.1"
+ "version": "4.11.3"
}
}
diff --git a/deps/npm/node_modules/is-core-module/core.json b/deps/npm/node_modules/is-core-module/core.json
index b1d7e46606b77e..d73579c43cbf31 100644
--- a/deps/npm/node_modules/is-core-module/core.json
+++ b/deps/npm/node_modules/is-core-module/core.json
@@ -57,6 +57,8 @@
"node:https": [">= 14.18 && < 15", ">= 16"],
"inspector": ">= 8",
"node:inspector": [">= 14.18 && < 15", ">= 16"],
+ "inspector/promises": [">= 19"],
+ "node:inspector/promises": [">= 19"],
"_linklist": "< 8",
"module": true,
"node:module": [">= 14.18 && < 15", ">= 16"],
diff --git a/deps/npm/node_modules/is-core-module/package.json b/deps/npm/node_modules/is-core-module/package.json
index c2830d63202a9e..5c1a87142ba4bd 100644
--- a/deps/npm/node_modules/is-core-module/package.json
+++ b/deps/npm/node_modules/is-core-module/package.json
@@ -1,6 +1,6 @@
{
"name": "is-core-module",
- "version": "2.10.0",
+ "version": "2.11.0",
"description": "Is this specifier a node.js core module?",
"main": "index.js",
"sideEffects": false,
@@ -9,6 +9,7 @@
"./package.json": "./package.json"
},
"scripts": {
+ "prepack": "npmignore --auto --commentLines=autogenerated",
"prepublish": "not-in-publish || npm run prepublishOnly",
"prepublishOnly": "safe-publish-latest",
"lint": "eslint .",
@@ -45,14 +46,15 @@
},
"devDependencies": {
"@ljharb/eslint-config": "^21.0.0",
- "aud": "^2.0.0",
+ "aud": "^2.0.1",
"auto-changelog": "^2.4.0",
"eslint": "=8.8.0",
"mock-property": "^1.0.0",
+ "npmignore": "^0.3.0",
"nyc": "^10.3.2",
"safe-publish-latest": "^2.0.0",
"semver": "^6.3.0",
- "tape": "^5.5.3"
+ "tape": "^5.6.1"
},
"auto-changelog": {
"output": "CHANGELOG.md",
@@ -61,5 +63,10 @@
"commitLimit": false,
"backfillLimit": false,
"hideCredit": true
+ },
+ "publishConfig": {
+ "ignore": [
+ ".github"
+ ]
}
}
diff --git a/deps/npm/node_modules/just-diff-apply/index.js b/deps/npm/node_modules/just-diff-apply/index.cjs
similarity index 100%
rename from deps/npm/node_modules/just-diff-apply/index.js
rename to deps/npm/node_modules/just-diff-apply/index.cjs
diff --git a/deps/npm/node_modules/just-diff-apply/package.json b/deps/npm/node_modules/just-diff-apply/package.json
index 5317303a8bd1b6..be2879aacfadc2 100644
--- a/deps/npm/node_modules/just-diff-apply/package.json
+++ b/deps/npm/node_modules/just-diff-apply/package.json
@@ -1,17 +1,17 @@
{
"name": "just-diff-apply",
- "version": "5.4.1",
+ "version": "5.5.0",
"description": "Apply a diff to an object. Optionally supports jsonPatch protocol",
- "main": "index.js",
- "module": "index.mjs",
+ "type": "module",
"exports": {
".": {
- "require": "./index.js",
"types": "./index.d.ts",
- "default": "./index.mjs"
+ "require": "./index.cjs",
+ "import": "./index.mjs"
},
"./package.json": "./package.json"
},
+ "main": "index.cjs",
"types": "index.d.ts",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
diff --git a/deps/npm/node_modules/just-diff/index.js b/deps/npm/node_modules/just-diff/index.cjs
similarity index 100%
rename from deps/npm/node_modules/just-diff/index.js
rename to deps/npm/node_modules/just-diff/index.cjs
diff --git a/deps/npm/node_modules/just-diff/package.json b/deps/npm/node_modules/just-diff/package.json
index 035daf034fcc9e..4456df5db5215a 100644
--- a/deps/npm/node_modules/just-diff/package.json
+++ b/deps/npm/node_modules/just-diff/package.json
@@ -1,17 +1,17 @@
{
"name": "just-diff",
- "version": "5.1.1",
+ "version": "5.2.0",
"description": "Return an object representing the diffs between two objects. Supports jsonPatch protocol",
- "main": "index.js",
- "module": "index.mjs",
+ "type": "module",
"exports": {
".": {
- "require": "./index.js",
"types": "./index.d.ts",
- "default": "./index.mjs"
+ "require": "./index.cjs",
+ "import": "./index.mjs"
},
"./package.json": "./package.json"
},
+ "main": "index.cjs",
"types": "index.d.ts",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
diff --git a/deps/npm/node_modules/libnpmaccess/package.json b/deps/npm/node_modules/libnpmaccess/package.json
index 1e27f79597c021..ae4cb8b21eb4bb 100644
--- a/deps/npm/node_modules/libnpmaccess/package.json
+++ b/deps/npm/node_modules/libnpmaccess/package.json
@@ -1,6 +1,6 @@
{
"name": "libnpmaccess",
- "version": "7.0.1",
+ "version": "7.0.2",
"description": "programmatic library for `npm access` commands",
"author": "GitHub Inc.",
"license": "ISC",
@@ -17,9 +17,9 @@
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
"@npmcli/mock-registry": "^1.0.0",
- "@npmcli/template-oss": "4.11.0",
- "nock": "^13.2.4",
- "tap": "^16.3.2"
+ "@npmcli/template-oss": "4.11.4",
+ "nock": "^13.3.0",
+ "tap": "^16.3.4"
},
"repository": {
"type": "git",
@@ -41,7 +41,7 @@
],
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.11.0",
+ "version": "4.11.4",
"content": "../../scripts/template-oss/index.js"
},
"tap": {
diff --git a/deps/npm/node_modules/libnpmdiff/package.json b/deps/npm/node_modules/libnpmdiff/package.json
index 9e7e3bdb41fc5c..1814f56957d119 100644
--- a/deps/npm/node_modules/libnpmdiff/package.json
+++ b/deps/npm/node_modules/libnpmdiff/package.json
@@ -1,6 +1,6 @@
{
"name": "libnpmdiff",
- "version": "5.0.7",
+ "version": "5.0.10",
"description": "The registry diff",
"repository": {
"type": "git",
@@ -32,7 +32,6 @@
],
"license": "ISC",
"scripts": {
- "eslint": "eslint",
"lint": "eslint \"**/*.js\"",
"lintfix": "node ../.. run lint -- --fix",
"test": "tap",
@@ -43,23 +42,23 @@
},
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.11.0",
- "tap": "^16.3.2"
+ "@npmcli/template-oss": "4.11.4",
+ "tap": "^16.3.4"
},
"dependencies": {
- "@npmcli/arborist": "^6.1.6",
+ "@npmcli/arborist": "^6.2.2",
"@npmcli/disparity-colors": "^3.0.0",
"@npmcli/installed-package-contents": "^2.0.0",
"binary-extensions": "^2.2.0",
"diff": "^5.1.0",
- "minimatch": "^5.1.1",
+ "minimatch": "^6.1.6",
"npm-package-arg": "^10.1.0",
- "pacote": "^15.0.7",
+ "pacote": "^15.0.8",
"tar": "^6.1.13"
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.11.0",
+ "version": "4.11.4",
"content": "../../scripts/template-oss/index.js"
},
"tap": {
diff --git a/deps/npm/node_modules/libnpmexec/lib/index.js b/deps/npm/node_modules/libnpmexec/lib/index.js
index ea4ff5d5474240..719f81ec11d57b 100644
--- a/deps/npm/node_modules/libnpmexec/lib/index.js
+++ b/deps/npm/node_modules/libnpmexec/lib/index.js
@@ -1,8 +1,6 @@
'use strict'
const { mkdir } = require('fs/promises')
-const { promisify } = require('util')
-
const Arborist = require('@npmcli/arborist')
const ciInfo = require('ci-info')
const crypto = require('crypto')
@@ -10,7 +8,7 @@ const log = require('proc-log')
const npa = require('npm-package-arg')
const npmlog = require('npmlog')
const pacote = require('pacote')
-const read = promisify(require('read'))
+const read = require('read')
const semver = require('semver')
const { fileExists, localFileExists } = require('./file-exists.js')
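The libnpmexec change reflects the move to read@2, which returns a promise natively, so the `promisify` wrapper is no longer needed; the same pattern appears in the init-package-json hunk above. A minimal sketch of the prompt call, with illustrative prompt text:

```js
const read = require('read')

const confirmInstall = async (spec) => {
  const answer = await read({ prompt: `Need to install ${spec}. OK? `, default: 'yes' })
  return /^y/i.test(answer)
}
```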
diff --git a/deps/npm/node_modules/libnpmexec/package.json b/deps/npm/node_modules/libnpmexec/package.json
index c0092d4c8767b4..6e3696a3a7d053 100644
--- a/deps/npm/node_modules/libnpmexec/package.json
+++ b/deps/npm/node_modules/libnpmexec/package.json
@@ -1,6 +1,6 @@
{
"name": "libnpmexec",
- "version": "5.0.7",
+ "version": "5.0.10",
"files": [
"bin/",
"lib/"
@@ -52,31 +52,30 @@
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
"@npmcli/mock-registry": "^1.0.0",
- "@npmcli/template-oss": "4.11.0",
+ "@npmcli/template-oss": "4.11.4",
"bin-links": "^4.0.1",
- "just-extend": "^6.1.1",
- "just-safe-set": "^4.1.1",
+ "just-extend": "^6.2.0",
+ "just-safe-set": "^4.2.1",
"minify-registry-metadata": "^3.0.0",
- "mkdirp": "^1.0.4",
- "tap": "^16.3.2"
+ "tap": "^16.3.4"
},
"dependencies": {
- "@npmcli/arborist": "^6.1.6",
+ "@npmcli/arborist": "^6.2.2",
"@npmcli/run-script": "^6.0.0",
"chalk": "^4.1.0",
- "ci-info": "^3.7.0",
+ "ci-info": "^3.7.1",
"npm-package-arg": "^10.1.0",
"npmlog": "^7.0.1",
- "pacote": "^15.0.7",
+ "pacote": "^15.0.8",
"proc-log": "^3.0.0",
- "read": "^1.0.7",
- "read-package-json-fast": "^3.0.1",
+ "read": "^2.0.0",
+ "read-package-json-fast": "^3.0.2",
"semver": "^7.3.7",
"walk-up-path": "^1.0.0"
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.11.0",
+ "version": "4.11.4",
"content": "../../scripts/template-oss/index.js"
}
}
diff --git a/deps/npm/node_modules/libnpmfund/package.json b/deps/npm/node_modules/libnpmfund/package.json
index c0de224fba7ef1..21dae2a78bd616 100644
--- a/deps/npm/node_modules/libnpmfund/package.json
+++ b/deps/npm/node_modules/libnpmfund/package.json
@@ -1,6 +1,6 @@
{
"name": "libnpmfund",
- "version": "4.0.7",
+ "version": "4.0.10",
"main": "lib/index.js",
"files": [
"bin/",
@@ -31,7 +31,6 @@
],
"license": "ISC",
"scripts": {
- "eslint": "eslint",
"lint": "eslint \"**/*.js\"",
"lintfix": "node ../.. run lint -- --fix",
"posttest": "node ../.. run lint",
@@ -42,18 +41,18 @@
},
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.11.0",
- "tap": "^16.3.2"
+ "@npmcli/template-oss": "4.11.4",
+ "tap": "^16.3.4"
},
"dependencies": {
- "@npmcli/arborist": "^6.1.6"
+ "@npmcli/arborist": "^6.2.2"
},
"engines": {
"node": "^14.17.0 || ^16.13.0 || >=18.0.0"
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.11.0",
+ "version": "4.11.4",
"content": "../../scripts/template-oss/index.js"
},
"tap": {
diff --git a/deps/npm/node_modules/libnpmhook/package.json b/deps/npm/node_modules/libnpmhook/package.json
index b157f97e685b47..493b64359cc220 100644
--- a/deps/npm/node_modules/libnpmhook/package.json
+++ b/deps/npm/node_modules/libnpmhook/package.json
@@ -1,6 +1,6 @@
{
"name": "libnpmhook",
- "version": "9.0.1",
+ "version": "9.0.3",
"description": "programmatic API for managing npm registry hooks",
"main": "lib/index.js",
"files": [
@@ -8,8 +8,6 @@
"lib/"
],
"scripts": {
- "prerelease": "npm t",
- "postrelease": "npm publish && git push --follow-tags",
"test": "tap",
"lint": "eslint \"**/*.js\"",
"postlint": "template-oss-check",
@@ -37,16 +35,16 @@
},
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.11.0",
- "nock": "^13.2.4",
- "tap": "^16.3.2"
+ "@npmcli/template-oss": "4.11.4",
+ "nock": "^13.3.0",
+ "tap": "^16.3.4"
},
"engines": {
"node": "^14.17.0 || ^16.13.0 || >=18.0.0"
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.11.0",
+ "version": "4.11.4",
"content": "../../scripts/template-oss/index.js"
},
"tap": {
diff --git a/deps/npm/node_modules/libnpmorg/package.json b/deps/npm/node_modules/libnpmorg/package.json
index 529a7ff9d2c97f..97d957492eae91 100644
--- a/deps/npm/node_modules/libnpmorg/package.json
+++ b/deps/npm/node_modules/libnpmorg/package.json
@@ -1,6 +1,6 @@
{
"name": "libnpmorg",
- "version": "5.0.1",
+ "version": "5.0.3",
"description": "Programmatic api for `npm org` commands",
"author": "GitHub Inc.",
"main": "lib/index.js",
@@ -28,10 +28,10 @@
],
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.11.0",
- "minipass": "^4.0.0",
- "nock": "^13.2.4",
- "tap": "^16.3.2"
+ "@npmcli/template-oss": "4.11.4",
+ "minipass": "^4.0.2",
+ "nock": "^13.3.0",
+ "tap": "^16.3.4"
},
"repository": {
"type": "git",
@@ -49,7 +49,7 @@
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.11.0",
+ "version": "4.11.4",
"content": "../../scripts/template-oss/index.js"
},
"tap": {
diff --git a/deps/npm/node_modules/libnpmpack/package.json b/deps/npm/node_modules/libnpmpack/package.json
index 035edaa9808d5a..671e4b24720dbc 100644
--- a/deps/npm/node_modules/libnpmpack/package.json
+++ b/deps/npm/node_modules/libnpmpack/package.json
@@ -1,6 +1,6 @@
{
"name": "libnpmpack",
- "version": "5.0.7",
+ "version": "5.0.10",
"description": "Programmatic API for the bits behind npm pack",
"author": "GitHub Inc.",
"main": "lib/index.js",
@@ -23,10 +23,10 @@
},
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.11.0",
- "nock": "^13.0.7",
+ "@npmcli/template-oss": "4.11.4",
+ "nock": "^13.3.0",
"spawk": "^1.7.1",
- "tap": "^16.3.2"
+ "tap": "^16.3.4"
},
"repository": {
"type": "git",
@@ -36,17 +36,17 @@
"bugs": "https://github.com/npm/libnpmpack/issues",
"homepage": "https://npmjs.com/package/libnpmpack",
"dependencies": {
- "@npmcli/arborist": "^6.1.6",
+ "@npmcli/arborist": "^6.2.2",
"@npmcli/run-script": "^6.0.0",
"npm-package-arg": "^10.1.0",
- "pacote": "^15.0.7"
+ "pacote": "^15.0.8"
},
"engines": {
"node": "^14.17.0 || ^16.13.0 || >=18.0.0"
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.11.0",
+ "version": "4.11.4",
"content": "../../scripts/template-oss/index.js"
},
"tap": {
diff --git a/deps/npm/node_modules/libnpmpublish/lib/provenance.js b/deps/npm/node_modules/libnpmpublish/lib/provenance.js
new file mode 100644
index 00000000000000..d11d210478b651
--- /dev/null
+++ b/deps/npm/node_modules/libnpmpublish/lib/provenance.js
@@ -0,0 +1,70 @@
+const { sigstore } = require('sigstore')
+
+const INTOTO_PAYLOAD_TYPE = 'application/vnd.in-toto+json'
+const INTOTO_STATEMENT_TYPE = 'https://in-toto.io/Statement/v0.1'
+const SLSA_PREDICATE_TYPE = 'https://slsa.dev/provenance/v0.2'
+
+const BUILDER_ID_PREFIX = 'https://github.com/npm/cli'
+const BUILD_TYPE_PREFIX = 'https://github.com/npm/cli/gha'
+const BUILD_TYPE_VERSION = 'v1'
+
+const generateProvenance = async (subject, opts) => {
+ const { env } = process
+ const payload = {
+ _type: INTOTO_STATEMENT_TYPE,
+ subject,
+ predicateType: SLSA_PREDICATE_TYPE,
+ predicate: {
+ buildType: `${BUILD_TYPE_PREFIX}@${BUILD_TYPE_VERSION}`,
+ builder: { id: `${BUILDER_ID_PREFIX}@${opts.npmVersion}` },
+ invocation: {
+ configSource: {
+ uri: `git+${env.GITHUB_SERVER_URL}/${env.GITHUB_REPOSITORY}@${env.GITHUB_REF}`,
+ digest: {
+ sha1: env.GITHUB_SHA,
+ },
+ entryPoint: env.GITHUB_WORKFLOW_REF,
+ },
+ parameters: {},
+ environment: {
+ GITHUB_ACTOR_ID: env.GITHUB_ACTOR_ID,
+ GITHUB_EVENT_NAME: env.GITHUB_EVENT_NAME,
+ GITHUB_REF: env.GITHUB_REF,
+ GITHUB_REF_TYPE: env.GITHUB_REF_TYPE,
+ GITHUB_REPOSITORY: env.GITHUB_REPOSITORY,
+ GITHUB_REPOSITORY_ID: env.GITHUB_REPOSITORY_ID,
+ GITHUB_REPOSITORY_OWNER_ID: env.GITHUB_REPOSITORY_OWNER_ID,
+ GITHUB_RUN_ATTEMPT: env.GITHUB_RUN_ATTEMPT,
+ GITHUB_RUN_ID: env.GITHUB_RUN_ID,
+ GITHUB_RUN_NUMBER: env.GITHUB_RUN_NUMBER,
+ GITHUB_SHA: env.GITHUB_SHA,
+ GITHUB_WORKFLOW_REF: env.GITHUB_WORKFLOW_REF,
+ GITHUB_WORKFLOW_SHA: env.GITHUB_WORKFLOW_SHA,
+ },
+ },
+ metadata: {
+ buildInvocationId: `${env.GITHUB_RUN_ID}-${env.GITHUB_RUN_ATTEMPT}`,
+ completeness: {
+ parameters: false,
+ environment: false,
+ materials: false,
+ },
+ reproducible: false,
+ },
+ materials: [
+ {
+ uri: `git+${env.GITHUB_SERVER_URL}/${env.GITHUB_REPOSITORY}`,
+ digest: {
+ sha1: env.GITHUB_SHA,
+ },
+ },
+ ],
+ },
+ }
+
+ return sigstore.attest(Buffer.from(JSON.stringify(payload)), INTOTO_PAYLOAD_TYPE, opts)
+}
+
+module.exports = {
+ generateProvenance,
+}
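For context, this new helper wraps the package subject in an in-toto/SLSA statement and hands it to sigstore for signing. A minimal sketch of how it might be driven, assuming a GitHub Actions run with OIDC available; the attestTarball wrapper and its arguments are hypothetical:

// Hypothetical sketch: sign a provenance statement for one tarball.
const crypto = require('crypto')
const { generateProvenance } = require('./provenance')

const attestTarball = async (purl, tarballData, opts) => {
  const subject = {
    name: purl, // e.g. 'pkg:npm/my-pkg@1.0.0'
    digest: { sha512: crypto.createHash('sha512').update(tarballData).digest('hex') },
  }
  // opts are forwarded to sigstore.attest(); publish.js passes its fetch opts here.
  return generateProvenance([subject], opts)
}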
diff --git a/deps/npm/node_modules/libnpmpublish/lib/publish.js b/deps/npm/node_modules/libnpmpublish/lib/publish.js
index 7d01fabf1f2b4d..353688a10eac19 100644
--- a/deps/npm/node_modules/libnpmpublish/lib/publish.js
+++ b/deps/npm/node_modules/libnpmpublish/lib/publish.js
@@ -4,6 +4,9 @@ const npa = require('npm-package-arg')
const semver = require('semver')
const { URL } = require('url')
const ssri = require('ssri')
+const ciInfo = require('ci-info')
+
+const { generateProvenance } = require('./provenance')
const publish = async (manifest, tarballData, opts) => {
if (manifest.private) {
@@ -36,7 +39,7 @@ Remove the 'private' field from the package.json to publish it.`),
)
}
- const metadata = buildMetadata(reg, pubManifest, tarballData, opts)
+ const metadata = await buildMetadata(reg, pubManifest, tarballData, spec, opts)
try {
return await npmFetch(spec.escapedName, {
@@ -89,8 +92,8 @@ const patchManifest = (_manifest, opts) => {
return manifest
}
-const buildMetadata = (registry, manifest, tarballData, opts) => {
- const { access, defaultTag, algorithms } = opts
+const buildMetadata = async (registry, manifest, tarballData, spec, opts) => {
+ const { access, defaultTag, algorithms, provenance } = opts
const root = {
_id: manifest.name,
name: manifest.name,
@@ -105,6 +108,7 @@ const buildMetadata = (registry, manifest, tarballData, opts) => {
root['dist-tags'][tag] = manifest.version
const tarballName = `${manifest.name}-${manifest.version}.tgz`
+ const provenanceBundleName = `${manifest.name}-${manifest.version}.sigstore`
const tarballURI = `${manifest.name}/-/${tarballName}`
const integrity = ssri.fromData(tarballData, {
algorithms: [...new Set(['sha1'].concat(algorithms))],
@@ -130,6 +134,41 @@ const buildMetadata = (registry, manifest, tarballData, opts) => {
length: tarballData.length,
}
+ // Handle case where --provenance flag was set to true
+ if (provenance === true) {
+ const subject = {
+ name: npa.toPurl(spec),
+ digest: { sha512: integrity.sha512[0].hexDigest() },
+ }
+
+ // Ensure that we're running in GHA and an OIDC token is available,
+ // currently the only supported build environment
+ if (ciInfo.name !== 'GitHub Actions' || !process.env.ACTIONS_ID_TOKEN_REQUEST_URL) {
+ throw Object.assign(
+ new Error('Automatic provenance generation not supported outside of GitHub Actions'),
+ { code: 'EUSAGE' }
+ )
+ }
+
+ const visibility =
+ await npmFetch.json(`${registry}/-/package/${spec.escapedName}/visibility`, opts)
+ if (!visibility.public && opts.provenance === true && opts.access !== 'public') {
+ throw Object.assign(
+ /* eslint-disable-next-line max-len */
+ new Error("Can't generate provenance for new or private package, you must set `access` to public."),
+ { code: 'EUSAGE' }
+ )
+ }
+ const provenanceBundle = await generateProvenance([subject], opts)
+
+ const serializedBundle = JSON.stringify(provenanceBundle)
+ root._attachments[provenanceBundleName] = {
+ content_type: provenanceBundle.mediaType,
+ data: serializedBundle,
+ length: serializedBundle.length,
+ }
+ }
+
return root
}
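The signed bundle then rides along as a second attachment next to the tarball in the publish document; roughly, with placeholder values and the content type taken from the bundle's own mediaType:

// Illustrative shape only; real values come from ssri, the tarball, and sigstore.
root._attachments = {
  'my-pkg-1.0.0.tgz': { content_type: 'application/octet-stream', data: '<base64 tarball>', length: 1234 },
  // added only when opts.provenance === true and the visibility/access checks pass:
  'my-pkg-1.0.0.sigstore': { content_type: '<bundle.mediaType>', data: '<serialized bundle>', length: 5678 },
}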
diff --git a/deps/npm/node_modules/libnpmpublish/package.json b/deps/npm/node_modules/libnpmpublish/package.json
index c293d566d1dc28..1b6a53eae61561 100644
--- a/deps/npm/node_modules/libnpmpublish/package.json
+++ b/deps/npm/node_modules/libnpmpublish/package.json
@@ -1,6 +1,6 @@
{
"name": "libnpmpublish",
- "version": "7.0.6",
+ "version": "7.1.0",
"description": "Programmatic API for the bits behind npm publish and unpublish",
"author": "GitHub Inc.",
"main": "lib/index.js",
@@ -14,7 +14,6 @@
],
"license": "ISC",
"scripts": {
- "eslint": "eslint",
"lint": "eslint \"**/*.js\"",
"lintfix": "node ../.. run lint -- --fix",
"test": "tap",
@@ -26,10 +25,10 @@
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
"@npmcli/mock-registry": "^1.0.0",
- "@npmcli/template-oss": "4.11.0",
+ "@npmcli/template-oss": "4.11.4",
"lodash.clonedeep": "^4.5.0",
- "nock": "^13.2.4",
- "tap": "^16.3.2"
+ "nock": "^13.3.0",
+ "tap": "^16.3.4"
},
"repository": {
"type": "git",
@@ -39,10 +38,12 @@
"bugs": "https://github.com/npm/cli/issues",
"homepage": "https://npmjs.com/package/libnpmpublish",
"dependencies": {
+ "ci-info": "^3.6.1",
"normalize-package-data": "^5.0.0",
"npm-package-arg": "^10.1.0",
"npm-registry-fetch": "^14.0.3",
"semver": "^7.3.7",
+ "sigstore": "^1.0.0",
"ssri": "^10.0.1"
},
"engines": {
@@ -50,7 +51,7 @@
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.11.0",
+ "version": "4.11.4",
"content": "../../scripts/template-oss/index.js"
},
"tap": {
diff --git a/deps/npm/node_modules/libnpmsearch/package.json b/deps/npm/node_modules/libnpmsearch/package.json
index e0d67afbbf66dc..51e1d0adf9348f 100644
--- a/deps/npm/node_modules/libnpmsearch/package.json
+++ b/deps/npm/node_modules/libnpmsearch/package.json
@@ -1,6 +1,6 @@
{
"name": "libnpmsearch",
- "version": "6.0.1",
+ "version": "6.0.2",
"description": "Programmatic API for searching in npm and compatible registries.",
"author": "GitHub Inc.",
"main": "lib/index.js",
@@ -26,9 +26,9 @@
},
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.11.0",
- "nock": "^13.2.4",
- "tap": "^16.3.2"
+ "@npmcli/template-oss": "4.11.4",
+ "nock": "^13.3.0",
+ "tap": "^16.3.4"
},
"repository": {
"type": "git",
@@ -45,7 +45,7 @@
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.11.0",
+ "version": "4.11.4",
"content": "../../scripts/template-oss/index.js"
},
"tap": {
diff --git a/deps/npm/node_modules/libnpmteam/lib/index.js b/deps/npm/node_modules/libnpmteam/lib/index.js
index 4b257ea850d6e7..9925d2c5c3bfde 100644
--- a/deps/npm/node_modules/libnpmteam/lib/index.js
+++ b/deps/npm/node_modules/libnpmteam/lib/index.js
@@ -20,15 +20,17 @@ cmd.create = (entity, opts = {}) => {
})
}
-cmd.destroy = (entity, opts = {}) => {
+cmd.destroy = async (entity, opts = {}) => {
const { scope, team } = splitEntity(entity)
validate('SSO', [scope, team, opts])
const uri = `/-/team/${eu(scope)}/${eu(team)}`
- return npmFetch.json(uri, {
+ await npmFetch(uri, {
...opts,
method: 'DELETE',
scope,
+ ignoreBody: true,
})
+ return true
}
cmd.add = (user, entity, opts = {}) => {
@@ -43,16 +45,18 @@ cmd.add = (user, entity, opts = {}) => {
})
}
-cmd.rm = (user, entity, opts = {}) => {
+cmd.rm = async (user, entity, opts = {}) => {
const { scope, team } = splitEntity(entity)
validate('SSO', [scope, team, opts])
const uri = `/-/team/${eu(scope)}/${eu(team)}/user`
- return npmFetch.json(uri, {
+ await npmFetch(uri, {
...opts,
method: 'DELETE',
scope,
body: { user },
+ ignoreBody: true,
})
+ return true
}
cmd.lsTeams = (...args) => cmd.lsTeams.stream(...args).collect()
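With this change destroy() and rm() stop parsing a JSON body out of the registry's DELETE response and simply resolve to true on success. A hedged usage sketch; the team, user, and opts values are placeholders:

// Hypothetical usage of the updated libnpmteam API.
const team = require('libnpmteam')

const cleanUp = async (opts) => {
  const teamGone = await team.destroy('@myorg:legacy', opts)          // resolves true
  const userDropped = await team.rm('some-user', '@myorg:devs', opts) // resolves true
  return teamGone && userDropped
}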
diff --git a/deps/npm/node_modules/libnpmteam/package.json b/deps/npm/node_modules/libnpmteam/package.json
index b3444c77b8dcfb..4d98dc9dc52f32 100644
--- a/deps/npm/node_modules/libnpmteam/package.json
+++ b/deps/npm/node_modules/libnpmteam/package.json
@@ -1,7 +1,7 @@
{
"name": "libnpmteam",
"description": "npm Team management APIs",
- "version": "5.0.1",
+ "version": "5.0.3",
"author": "GitHub Inc.",
"license": "ISC",
"main": "lib/index.js",
@@ -16,9 +16,9 @@
},
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.11.0",
- "nock": "^13.2.4",
- "tap": "^16.3.2"
+ "@npmcli/template-oss": "4.11.4",
+ "nock": "^13.3.0",
+ "tap": "^16.3.4"
},
"repository": {
"type": "git",
@@ -39,7 +39,7 @@
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.11.0",
+ "version": "4.11.4",
"content": "../../scripts/template-oss/index.js"
},
"tap": {
diff --git a/deps/npm/node_modules/libnpmversion/package.json b/deps/npm/node_modules/libnpmversion/package.json
index ff3855ae6c1a32..8fce14cebff382 100644
--- a/deps/npm/node_modules/libnpmversion/package.json
+++ b/deps/npm/node_modules/libnpmversion/package.json
@@ -1,6 +1,6 @@
{
"name": "libnpmversion",
- "version": "4.0.1",
+ "version": "4.0.2",
"main": "lib/index.js",
"files": [
"bin/",
@@ -32,9 +32,9 @@
},
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.11.0",
+ "@npmcli/template-oss": "4.11.4",
"require-inject": "^1.4.4",
- "tap": "^16.3.2"
+ "tap": "^16.3.4"
},
"dependencies": {
"@npmcli/git": "^4.0.1",
@@ -48,7 +48,7 @@
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.11.0",
+ "version": "4.11.4",
"content": "../../scripts/template-oss/index.js"
}
}
diff --git a/deps/npm/node_modules/lru-cache/index.d.ts b/deps/npm/node_modules/lru-cache/index.d.ts
index e5481b3f926376..ceed413c70919e 100644
--- a/deps/npm/node_modules/lru-cache/index.d.ts
+++ b/deps/npm/node_modules/lru-cache/index.d.ts
@@ -45,6 +45,7 @@ declare class LRUCache implements Iterable<[K, V]> {
public readonly max: number
public readonly maxSize: number
+ public readonly maxEntrySize: number
public readonly sizeCalculation:
| LRUCache.SizeCalculator
| undefined
@@ -320,6 +321,13 @@ declare namespace LRUCache {
max: number
}
+ type MaybeMaxEntrySizeLimit =
+ | {
+ maxEntrySize: number
+ sizeCalculation?: SizeCalculator
+ }
+ | {}
+
interface LimitedBySize {
/**
* If you wish to track item size, you must provide a maxSize
@@ -507,7 +515,8 @@ declare namespace LRUCache {
type Options = SharedOptions &
DeprecatedOptions &
- SafetyBounds
+ SafetyBounds &
+ MaybeMaxEntrySizeLimit
/**
* options which override the options set in the LRUCache constructor
diff --git a/deps/npm/node_modules/lru-cache/index.js b/deps/npm/node_modules/lru-cache/index.js
index 0a551c9d1d6f2c..fa53c12096cf64 100644
--- a/deps/npm/node_modules/lru-cache/index.js
+++ b/deps/npm/node_modules/lru-cache/index.js
@@ -157,6 +157,7 @@ class LRUCache {
noDisposeOnSet,
noUpdateTTL,
maxSize = 0,
+ maxEntrySize = 0,
sizeCalculation,
fetchMethod,
fetchContext,
@@ -180,11 +181,12 @@ class LRUCache {
this.max = max
this.maxSize = maxSize
+ this.maxEntrySize = maxEntrySize || this.maxSize
this.sizeCalculation = sizeCalculation || length
if (this.sizeCalculation) {
- if (!this.maxSize) {
+ if (!this.maxSize && !this.maxEntrySize) {
throw new TypeError(
- 'cannot set sizeCalculation without setting maxSize'
+ 'cannot set sizeCalculation without setting maxSize or maxEntrySize'
)
}
if (typeof this.sizeCalculation !== 'function') {
@@ -231,10 +233,18 @@ class LRUCache {
this.noUpdateTTL = !!noUpdateTTL
this.noDeleteOnFetchRejection = !!noDeleteOnFetchRejection
- if (this.maxSize !== 0) {
- if (!isPosInt(this.maxSize)) {
+ // NB: maxEntrySize is set to maxSize if it's set
+ if (this.maxEntrySize !== 0) {
+ if (this.maxSize !== 0) {
+ if (!isPosInt(this.maxSize)) {
+ throw new TypeError(
+ 'maxSize must be a positive integer if specified'
+ )
+ }
+ }
+ if (!isPosInt(this.maxEntrySize)) {
throw new TypeError(
- 'maxSize must be a positive integer if specified'
+ 'maxEntrySize must be a positive integer if specified'
)
}
this.initializeSizeTracking()
@@ -369,6 +379,11 @@ class LRUCache {
this.sizes[index] = 0
}
this.requireSize = (k, v, size, sizeCalculation) => {
+ // provisionally accept background fetches.
+ // actual value size will be checked when they return.
+ if (this.isBackgroundFetch(v)) {
+ return 0
+ }
if (!isPosInt(size)) {
if (sizeCalculation) {
if (typeof sizeCalculation !== 'function') {
@@ -390,9 +405,11 @@ class LRUCache {
}
this.addItemSize = (index, size) => {
this.sizes[index] = size
- const maxSize = this.maxSize - this.sizes[index]
- while (this.calculatedSize > maxSize) {
- this.evict(true)
+ if (this.maxSize) {
+ const maxSize = this.maxSize - this.sizes[index]
+ while (this.calculatedSize > maxSize) {
+ this.evict(true)
+ }
}
this.calculatedSize += this.sizes[index]
}
@@ -402,7 +419,7 @@ class LRUCache {
requireSize(k, v, size, sizeCalculation) {
if (size || sizeCalculation) {
throw new TypeError(
- 'cannot set size without setting maxSize on cache'
+ 'cannot set size without setting maxSize or maxEntrySize on cache'
)
}
}
@@ -574,7 +591,11 @@ class LRUCache {
) {
size = this.requireSize(k, v, size, sizeCalculation)
// if the item doesn't fit, don't do anything
- if (this.maxSize && size > this.maxSize) {
+ // NB: maxEntrySize set to maxSize by default
+ if (this.maxEntrySize && size > this.maxEntrySize) {
+ // have to delete, in case a background fetch is there already.
+ // in non-async cases, this is a no-op
+ this.delete(k)
return this
}
let index = this.size === 0 ? undefined : this.keyMap.get(k)
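The new maxEntrySize option caps individual entries independently of the overall maxSize, and on its own now satisfies the sizeCalculation requirement. A small sketch with arbitrary values:

// Hypothetical sketch of the new lru-cache option.
const LRUCache = require('lru-cache')

const cache = new LRUCache({
  max: 500,
  maxEntrySize: 1024,                       // entries larger than 1 KiB are skipped
  sizeCalculation: (value) => value.length, // now allowed without maxSize
})

cache.set('small', 'x'.repeat(100)) // stored
cache.set('huge', 'x'.repeat(4096)) // too big: not stored, any stale entry for 'huge' is deleted
cache.has('huge')                   // => false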
diff --git a/deps/npm/node_modules/lru-cache/package.json b/deps/npm/node_modules/lru-cache/package.json
index c3c62e0a3254e0..366ec03dfbc52f 100644
--- a/deps/npm/node_modules/lru-cache/package.json
+++ b/deps/npm/node_modules/lru-cache/package.json
@@ -1,7 +1,7 @@
{
"name": "lru-cache",
"description": "A cache object that deletes the least-recently-used items.",
- "version": "7.13.2",
+ "version": "7.14.1",
"author": "Isaac Z. Schlueter ",
"keywords": [
"mru",
diff --git a/deps/npm/node_modules/make-fetch-happen/package.json b/deps/npm/node_modules/make-fetch-happen/package.json
index 1690c932f75cd2..7c340820f0c550 100644
--- a/deps/npm/node_modules/make-fetch-happen/package.json
+++ b/deps/npm/node_modules/make-fetch-happen/package.json
@@ -1,6 +1,6 @@
{
"name": "make-fetch-happen",
- "version": "11.0.2",
+ "version": "11.0.3",
"description": "Opinionated, caching, retrying fetch client",
"main": "lib/index.js",
"files": [
@@ -35,13 +35,12 @@
"dependencies": {
"agentkeepalive": "^4.2.1",
"cacache": "^17.0.0",
- "http-cache-semantics": "^4.1.0",
+ "http-cache-semantics": "^4.1.1",
"http-proxy-agent": "^5.0.0",
"https-proxy-agent": "^5.0.0",
"is-lambda": "^1.0.1",
"lru-cache": "^7.7.1",
"minipass": "^4.0.0",
- "minipass-collect": "^1.0.2",
"minipass-fetch": "^3.0.0",
"minipass-flush": "^1.0.5",
"minipass-pipeline": "^1.2.4",
@@ -52,10 +51,8 @@
},
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.10.0",
- "mkdirp": "^1.0.4",
+ "@npmcli/template-oss": "4.11.3",
"nock": "^13.2.4",
- "rimraf": "^3.0.2",
"safe-buffer": "^5.2.1",
"standard-version": "^9.3.2",
"tap": "^16.0.0"
@@ -75,6 +72,6 @@
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.10.0"
+ "version": "4.11.3"
}
}
diff --git a/deps/npm/node_modules/minimatch/LICENSE b/deps/npm/node_modules/minimatch/LICENSE
index 9517b7d995bb03..1493534e60dce4 100644
--- a/deps/npm/node_modules/minimatch/LICENSE
+++ b/deps/npm/node_modules/minimatch/LICENSE
@@ -1,6 +1,6 @@
The ISC License
-Copyright (c) 2011-2022 Isaac Z. Schlueter and Contributors
+Copyright (c) 2011-2023 Isaac Z. Schlueter and Contributors
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
diff --git a/deps/npm/node_modules/minimatch/dist/cjs/index-cjs.d.ts b/deps/npm/node_modules/minimatch/dist/cjs/index-cjs.d.ts
new file mode 100644
index 00000000000000..29fdd1d95838fd
--- /dev/null
+++ b/deps/npm/node_modules/minimatch/dist/cjs/index-cjs.d.ts
@@ -0,0 +1,35 @@
+declare const _default: {
+ (p: string, pattern: string, options?: import("./index.js").MinimatchOptions): boolean;
+ sep: string;
+ GLOBSTAR: typeof import("./index.js").GLOBSTAR;
+ filter: (pattern: string, options?: import("./index.js").MinimatchOptions) => (p: string) => boolean;
+ defaults: (def: import("./index.js").MinimatchOptions) => any;
+ braceExpand: (pattern: string, options?: import("./index.js").MinimatchOptions) => string[];
+ makeRe: (pattern: string, options?: import("./index.js").MinimatchOptions) => false | import("./index.js").MMRegExp;
+ match: (list: string[], pattern: string, options?: import("./index.js").MinimatchOptions) => string[];
+ Minimatch: typeof import("./index.js").Minimatch;
+} & {
+ default: {
+ (p: string, pattern: string, options?: import("./index.js").MinimatchOptions): boolean;
+ sep: string;
+ GLOBSTAR: typeof import("./index.js").GLOBSTAR;
+ filter: (pattern: string, options?: import("./index.js").MinimatchOptions) => (p: string) => boolean;
+ defaults: (def: import("./index.js").MinimatchOptions) => any;
+ braceExpand: (pattern: string, options?: import("./index.js").MinimatchOptions) => string[];
+ makeRe: (pattern: string, options?: import("./index.js").MinimatchOptions) => false | import("./index.js").MMRegExp;
+ match: (list: string[], pattern: string, options?: import("./index.js").MinimatchOptions) => string[];
+ Minimatch: typeof import("./index.js").Minimatch;
+ };
+ minimatch: {
+ (p: string, pattern: string, options?: import("./index.js").MinimatchOptions): boolean;
+ sep: string;
+ GLOBSTAR: typeof import("./index.js").GLOBSTAR;
+ filter: (pattern: string, options?: import("./index.js").MinimatchOptions) => (p: string) => boolean;
+ defaults: (def: import("./index.js").MinimatchOptions) => any;
+ braceExpand: (pattern: string, options?: import("./index.js").MinimatchOptions) => string[];
+ makeRe: (pattern: string, options?: import("./index.js").MinimatchOptions) => false | import("./index.js").MMRegExp;
+ match: (list: string[], pattern: string, options?: import("./index.js").MinimatchOptions) => string[];
+ Minimatch: typeof import("./index.js").Minimatch;
+ };
+};
+export = _default;
diff --git a/deps/npm/node_modules/minimatch/dist/cjs/index-cjs.js b/deps/npm/node_modules/minimatch/dist/cjs/index-cjs.js
new file mode 100644
index 00000000000000..db73b6b933a8a5
--- /dev/null
+++ b/deps/npm/node_modules/minimatch/dist/cjs/index-cjs.js
@@ -0,0 +1,7 @@
+"use strict";
+var __importDefault = (this && this.__importDefault) || function (mod) {
+ return (mod && mod.__esModule) ? mod : { "default": mod };
+};
+const index_js_1 = __importDefault(require("./index.js"));
+module.exports = Object.assign(index_js_1.default, { default: index_js_1.default, minimatch: index_js_1.default });
+//# sourceMappingURL=index-cjs.js.map
\ No newline at end of file
diff --git a/deps/npm/node_modules/minimatch/dist/cjs/index-cjs.js.map b/deps/npm/node_modules/minimatch/dist/cjs/index-cjs.js.map
new file mode 100644
index 00000000000000..1a054859a1341b
--- /dev/null
+++ b/deps/npm/node_modules/minimatch/dist/cjs/index-cjs.js.map
@@ -0,0 +1 @@
+{"version":3,"file":"index-cjs.js","sourceRoot":"","sources":["../../src/index-cjs.ts"],"names":[],"mappings":";;;;AAAA,0DAAkC;AAElC,iBAAS,MAAM,CAAC,MAAM,CAAC,kBAAS,EAAE,EAAE,OAAO,EAAE,kBAAS,EAAE,SAAS,EAAT,kBAAS,EAAE,CAAC,CAAA"}
\ No newline at end of file
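This wrapper keeps the CommonJS entry callable while also exposing default and minimatch properties for interop. A quick sketch, assuming the package's entry points resolve to this file:

// Hypothetical interop check.
const minimatch = require('minimatch')           // still callable, as in v5
console.log(minimatch('bar.foo', '*.foo'))       // true
const { Minimatch } = minimatch                  // named class still hangs off the export
console.log(new Minimatch('*.js').match('x.js')) // true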
diff --git a/deps/npm/node_modules/minimatch/dist/cjs/index.d.ts b/deps/npm/node_modules/minimatch/dist/cjs/index.d.ts
new file mode 100644
index 00000000000000..cca07a8280d896
--- /dev/null
+++ b/deps/npm/node_modules/minimatch/dist/cjs/index.d.ts
@@ -0,0 +1,71 @@
+export interface MinimatchOptions {
+ nobrace?: boolean;
+ nocomment?: boolean;
+ nonegate?: boolean;
+ debug?: boolean;
+ noglobstar?: boolean;
+ noext?: boolean;
+ nonull?: boolean;
+ windowsPathsNoEscape?: boolean;
+ allowWindowsEscape?: boolean;
+ partial?: boolean;
+ dot?: boolean;
+ nocase?: boolean;
+ nocaseMagicOnly?: boolean;
+ matchBase?: boolean;
+ flipNegate?: boolean;
+ preserveMultipleSlashes?: boolean;
+}
+export declare const minimatch: {
+ (p: string, pattern: string, options?: MinimatchOptions): boolean;
+ sep: string;
+ GLOBSTAR: typeof GLOBSTAR;
+ filter: (pattern: string, options?: MinimatchOptions) => (p: string) => boolean;
+ defaults: (def: MinimatchOptions) => typeof minimatch;
+ braceExpand: (pattern: string, options?: MinimatchOptions) => string[];
+ makeRe: (pattern: string, options?: MinimatchOptions) => false | MMRegExp;
+ match: (list: string[], pattern: string, options?: MinimatchOptions) => string[];
+ Minimatch: typeof Minimatch;
+};
+export default minimatch;
+export declare const sep: string;
+export declare const GLOBSTAR: unique symbol;
+export declare const filter: (pattern: string, options?: MinimatchOptions) => (p: string) => boolean;
+export declare const defaults: (def: MinimatchOptions) => typeof minimatch;
+export declare const braceExpand: (pattern: string, options?: MinimatchOptions) => string[];
+declare const SUBPARSE: unique symbol;
+export declare const makeRe: (pattern: string, options?: MinimatchOptions) => false | MMRegExp;
+export declare const match: (list: string[], pattern: string, options?: MinimatchOptions) => string[];
+export type MMRegExp = RegExp & {
+ _src?: string;
+ _glob?: string;
+};
+type SubparseReturn = [string, boolean];
+type ParseReturnFiltered = string | MMRegExp | typeof GLOBSTAR;
+type ParseReturn = ParseReturnFiltered | false;
+export declare class Minimatch {
+ options: MinimatchOptions;
+ set: ParseReturnFiltered[][];
+ pattern: string;
+ windowsPathsNoEscape: boolean;
+ nonegate: boolean;
+ negate: boolean;
+ comment: boolean;
+ empty: boolean;
+ preserveMultipleSlashes: boolean;
+ partial: boolean;
+ globSet: string[];
+ globParts: string[][];
+ regexp: false | null | MMRegExp;
+ constructor(pattern: string, options?: MinimatchOptions);
+ debug(..._: any[]): void;
+ make(): void;
+ parseNegate(): void;
+ matchOne(file: string[], pattern: ParseReturn[], partial?: boolean): boolean;
+ braceExpand(): string[];
+ parse(pattern: string, isSub?: typeof SUBPARSE): ParseReturn | SubparseReturn;
+ makeRe(): false | MMRegExp;
+ slashSplit(p: string): string[];
+ match(f: string, partial?: boolean): boolean;
+ static defaults(def: MinimatchOptions): typeof Minimatch;
+}
diff --git a/deps/npm/node_modules/minimatch/dist/cjs/index.js b/deps/npm/node_modules/minimatch/dist/cjs/index.js
new file mode 100644
index 00000000000000..63fc3bdd0b109a
--- /dev/null
+++ b/deps/npm/node_modules/minimatch/dist/cjs/index.js
@@ -0,0 +1,1090 @@
+"use strict";
+var __importDefault = (this && this.__importDefault) || function (mod) {
+ return (mod && mod.__esModule) ? mod : { "default": mod };
+};
+Object.defineProperty(exports, "__esModule", { value: true });
+exports.Minimatch = exports.match = exports.makeRe = exports.braceExpand = exports.defaults = exports.filter = exports.GLOBSTAR = exports.sep = exports.minimatch = void 0;
+const minimatch = (p, pattern, options = {}) => {
+ assertValidPattern(pattern);
+ // shortcut: comments match nothing.
+ if (!options.nocomment && pattern.charAt(0) === '#') {
+ return false;
+ }
+ return new Minimatch(pattern, options).match(p);
+};
+exports.minimatch = minimatch;
+exports.default = exports.minimatch;
+// Optimized checking for the most common glob patterns.
+const starDotExtRE = /^\*+([^+@!?\*\[\(]*)$/;
+const starDotExtTest = (ext) => (f) => !f.startsWith('.') && f.endsWith(ext);
+const starDotExtTestDot = (ext) => (f) => f.endsWith(ext);
+const starDotExtTestNocase = (ext) => {
+ ext = ext.toLowerCase();
+ return (f) => !f.startsWith('.') && f.toLowerCase().endsWith(ext);
+};
+const starDotExtTestNocaseDot = (ext) => {
+ ext = ext.toLowerCase();
+ return (f) => f.toLowerCase().endsWith(ext);
+};
+const starDotStarRE = /^\*+\.\*+$/;
+const starDotStarTest = (f) => !f.startsWith('.') && f.includes('.');
+const starDotStarTestDot = (f) => f !== '.' && f !== '..' && f.includes('.');
+const dotStarRE = /^\.\*+$/;
+const dotStarTest = (f) => f !== '.' && f !== '..' && f.startsWith('.');
+const starRE = /^\*+$/;
+const starTest = (f) => f.length !== 0 && !f.startsWith('.');
+const starTestDot = (f) => f.length !== 0 && f !== '.' && f !== '..';
+const qmarksRE = /^\?+([^+@!?\*\[\(]*)?$/;
+const qmarksTestNocase = ([$0, ext = '']) => {
+ const noext = qmarksTestNoExt([$0]);
+ if (!ext)
+ return noext;
+ ext = ext.toLowerCase();
+ return (f) => noext(f) && f.toLowerCase().endsWith(ext);
+};
+const qmarksTestNocaseDot = ([$0, ext = '']) => {
+ const noext = qmarksTestNoExtDot([$0]);
+ if (!ext)
+ return noext;
+ ext = ext.toLowerCase();
+ return (f) => noext(f) && f.toLowerCase().endsWith(ext);
+};
+const qmarksTestDot = ([$0, ext = '']) => {
+ const noext = qmarksTestNoExtDot([$0]);
+ return !ext ? noext : (f) => noext(f) && f.endsWith(ext);
+};
+const qmarksTest = ([$0, ext = '']) => {
+ const noext = qmarksTestNoExt([$0]);
+ return !ext ? noext : (f) => noext(f) && f.endsWith(ext);
+};
+const qmarksTestNoExt = ([$0]) => {
+ const len = $0.length;
+ return (f) => f.length === len && !f.startsWith('.');
+};
+const qmarksTestNoExtDot = ([$0]) => {
+ const len = $0.length;
+ return (f) => f.length === len && f !== '.' && f !== '..';
+};
+/* c8 ignore start */
+const platform = typeof process === 'object' && process
+ ? (typeof process.env === 'object' &&
+ process.env &&
+ process.env.__MINIMATCH_TESTING_PLATFORM__) ||
+ process.platform
+ : 'posix';
+const isWindows = platform === 'win32';
+const path = isWindows ? { sep: '\\' } : { sep: '/' };
+/* c8 ignore stop */
+exports.sep = path.sep;
+exports.minimatch.sep = exports.sep;
+exports.GLOBSTAR = Symbol('globstar **');
+exports.minimatch.GLOBSTAR = exports.GLOBSTAR;
+const brace_expansion_1 = __importDefault(require("brace-expansion"));
+const plTypes = {
+ '!': { open: '(?:(?!(?:', close: '))[^/]*?)' },
+ '?': { open: '(?:', close: ')?' },
+ '+': { open: '(?:', close: ')+' },
+ '*': { open: '(?:', close: ')*' },
+ '@': { open: '(?:', close: ')' },
+};
+// any single thing other than /
+// don't need to escape / when using new RegExp()
+const qmark = '[^/]';
+// * => any number of characters
+const star = qmark + '*?';
+// ** when dots are allowed. Anything goes, except .. and .
+// not (^ or / followed by one or two dots followed by $ or /),
+// followed by anything, any number of times.
+const twoStarDot = '(?:(?!(?:\\/|^)(?:\\.{1,2})($|\\/)).)*?';
+// not a ^ or / followed by a dot,
+// followed by anything, any number of times.
+const twoStarNoDot = '(?:(?!(?:\\/|^)\\.).)*?';
+// "abc" -> { a:true, b:true, c:true }
+const charSet = (s) => s.split('').reduce((set, c) => {
+ set[c] = true;
+ return set;
+}, {});
+// characters that need to be escaped in RegExp.
+const reSpecials = charSet('().*{}+?[]^$\\!');
+// characters that indicate we have to add the pattern start
+const addPatternStartSet = charSet('[.(');
+const filter = (pattern, options = {}) => (p) => (0, exports.minimatch)(p, pattern, options);
+exports.filter = filter;
+exports.minimatch.filter = exports.filter;
+const ext = (a, b = {}) => Object.assign({}, a, b);
+const defaults = (def) => {
+ if (!def || typeof def !== 'object' || !Object.keys(def).length) {
+ return exports.minimatch;
+ }
+ const orig = exports.minimatch;
+ const m = (p, pattern, options = {}) => orig(p, pattern, ext(def, options));
+ return Object.assign(m, {
+ Minimatch: class Minimatch extends orig.Minimatch {
+ constructor(pattern, options = {}) {
+ super(pattern, ext(def, options));
+ }
+ static defaults(options) {
+ return orig.defaults(ext(def, options)).Minimatch;
+ }
+ },
+ filter: (pattern, options = {}) => orig.filter(pattern, ext(def, options)),
+ defaults: (options) => orig.defaults(ext(def, options)),
+ makeRe: (pattern, options = {}) => orig.makeRe(pattern, ext(def, options)),
+ braceExpand: (pattern, options = {}) => orig.braceExpand(pattern, ext(def, options)),
+ match: (list, pattern, options = {}) => orig.match(list, pattern, ext(def, options)),
+ sep: orig.sep,
+ GLOBSTAR: exports.GLOBSTAR,
+ });
+};
+exports.defaults = defaults;
+exports.minimatch.defaults = exports.defaults;
+// Brace expansion:
+// a{b,c}d -> abd acd
+// a{b,}c -> abc ac
+// a{0..3}d -> a0d a1d a2d a3d
+// a{b,c{d,e}f}g -> abg acdfg acefg
+// a{b,c}d{e,f}g -> abdeg acdeg abdeg abdfg
+//
+// Invalid sets are not expanded.
+// a{2..}b -> a{2..}b
+// a{b}c -> a{b}c
+const braceExpand = (pattern, options = {}) => {
+ assertValidPattern(pattern);
+ // Thanks to Yeting Li for
+ // improving this regexp to avoid a ReDOS vulnerability.
+ if (options.nobrace || !/\{(?:(?!\{).)*\}/.test(pattern)) {
+ // shortcut. no need to expand.
+ return [pattern];
+ }
+ return (0, brace_expansion_1.default)(pattern);
+};
+exports.braceExpand = braceExpand;
+exports.minimatch.braceExpand = exports.braceExpand;
+const MAX_PATTERN_LENGTH = 1024 * 64;
+const assertValidPattern = (pattern) => {
+ if (typeof pattern !== 'string') {
+ throw new TypeError('invalid pattern');
+ }
+ if (pattern.length > MAX_PATTERN_LENGTH) {
+ throw new TypeError('pattern is too long');
+ }
+};
+// parse a component of the expanded set.
+// At this point, no pattern may contain "/" in it
+// so we're going to return a 2d array, where each entry is the full
+// pattern, split on '/', and then turned into a regular expression.
+// A regexp is made at the end which joins each array with an
+// escaped /, and another full one which joins each regexp with |.
+//
+// Following the lead of Bash 4.1, note that "**" only has special meaning
+// when it is the *only* thing in a path portion. Otherwise, any series
+// of * is equivalent to a single *. Globstar behavior is enabled by
+// default, and can be disabled by setting options.noglobstar.
+const SUBPARSE = Symbol('subparse');
+const makeRe = (pattern, options = {}) => new Minimatch(pattern, options).makeRe();
+exports.makeRe = makeRe;
+exports.minimatch.makeRe = exports.makeRe;
+const match = (list, pattern, options = {}) => {
+ const mm = new Minimatch(pattern, options);
+ list = list.filter(f => mm.match(f));
+ if (mm.options.nonull && !list.length) {
+ list.push(pattern);
+ }
+ return list;
+};
+exports.match = match;
+exports.minimatch.match = exports.match;
+// replace stuff like \* with *
+const globUnescape = (s) => s.replace(/\\(.)/g, '$1');
+const charUnescape = (s) => s.replace(/\\([^-\]])/g, '$1');
+const regExpEscape = (s) => s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, '\\$&');
+const braExpEscape = (s) => s.replace(/[[\]\\]/g, '\\$&');
+class Minimatch {
+ options;
+ set;
+ pattern;
+ windowsPathsNoEscape;
+ nonegate;
+ negate;
+ comment;
+ empty;
+ preserveMultipleSlashes;
+ partial;
+ globSet;
+ globParts;
+ regexp;
+ constructor(pattern, options = {}) {
+ assertValidPattern(pattern);
+ options = options || {};
+ this.options = options;
+ this.pattern = pattern;
+ this.windowsPathsNoEscape =
+ !!options.windowsPathsNoEscape || options.allowWindowsEscape === false;
+ if (this.windowsPathsNoEscape) {
+ this.pattern = this.pattern.replace(/\\/g, '/');
+ }
+ this.preserveMultipleSlashes = !!options.preserveMultipleSlashes;
+ this.regexp = null;
+ this.negate = false;
+ this.nonegate = !!options.nonegate;
+ this.comment = false;
+ this.empty = false;
+ this.partial = !!options.partial;
+ this.globSet = [];
+ this.globParts = [];
+ this.set = [];
+ // make the set of regexps etc.
+ this.make();
+ }
+ debug(..._) { }
+ make() {
+ const pattern = this.pattern;
+ const options = this.options;
+ // empty patterns and comments match nothing.
+ if (!options.nocomment && pattern.charAt(0) === '#') {
+ this.comment = true;
+ return;
+ }
+ if (!pattern) {
+ this.empty = true;
+ return;
+ }
+ // step 1: figure out negation, etc.
+ this.parseNegate();
+ // step 2: expand braces
+ this.globSet = this.braceExpand();
+ if (options.debug) {
+ this.debug = (...args) => console.error(...args);
+ }
+ this.debug(this.pattern, this.globSet);
+ // step 3: now we have a set, so turn each one into a series of path-portion
+ // matching patterns.
+ // These will be regexps, except in the case of "**", which is
+ // set to the GLOBSTAR object for globstar behavior,
+ // and will not contain any / characters
+ const rawGlobParts = this.globSet.map(s => this.slashSplit(s));
+ // consecutive globstars are an unnecessary perf killer
+ // also, **/*/... is equivalent to */**/..., so swap all of those
+ // this turns a pattern like **/*/**/*/x into */*/**/x
+ // and a pattern like **/x/**/*/y becomes **/x/*/**/y
+ // the *later* we can push the **, the more efficient it is,
+ // because we can avoid having to do a recursive walk until
+ // the walked tree is as shallow as possible.
+ // Note that this is only true up to the last pattern, though, because
+ // a/*/** will only match a/b if b is a dir, but a/**/* will match a/b
+ // regardless, since it's "0 or more path segments" if it's not final.
+ if (this.options.noglobstar) {
+ // ** is * anyway
+ this.globParts = rawGlobParts;
+ }
+ else {
+ // do this swap BEFORE the reduce, so that we can turn a string
+ // of **/*/**/* into */*/**/** and then reduce the **'s into one
+ for (const parts of rawGlobParts) {
+ let swapped;
+ do {
+ swapped = false;
+ for (let i = 0; i < parts.length - 1; i++) {
+ if (parts[i] === '*' && parts[i - 1] === '**') {
+ parts[i] = '**';
+ parts[i - 1] = '*';
+ swapped = true;
+ }
+ }
+ } while (swapped);
+ }
+ this.globParts = rawGlobParts.map(parts => {
+ parts = parts.reduce((set, part) => {
+ const prev = set[set.length - 1];
+ if (part === '**' && prev === '**') {
+ return set;
+ }
+ if (part === '..') {
+ if (prev && prev !== '..' && prev !== '.' && prev !== '**') {
+ set.pop();
+ return set;
+ }
+ }
+ set.push(part);
+ return set;
+ }, []);
+ return parts.length === 0 ? [''] : parts;
+ });
+ }
+ this.debug(this.pattern, this.globParts);
+ // glob --> regexps
+ let set = this.globParts.map((s, _, __) => s.map(ss => this.parse(ss)));
+ this.debug(this.pattern, set);
+ // filter out everything that didn't compile properly.
+ this.set = set.filter(s => s.indexOf(false) === -1);
+ // do not treat the ? in UNC paths as magic
+ if (isWindows) {
+ for (let i = 0; i < this.set.length; i++) {
+ const p = this.set[i];
+ if (p[0] === '' &&
+ p[1] === '' &&
+ this.globParts[i][2] === '?' &&
+ typeof p[3] === 'string' &&
+ /^[a-z]:$/i.test(p[3])) {
+ p[2] = '?';
+ }
+ }
+ }
+ this.debug(this.pattern, this.set);
+ }
+ parseNegate() {
+ if (this.nonegate)
+ return;
+ const pattern = this.pattern;
+ let negate = false;
+ let negateOffset = 0;
+ for (let i = 0; i < pattern.length && pattern.charAt(i) === '!'; i++) {
+ negate = !negate;
+ negateOffset++;
+ }
+ if (negateOffset)
+ this.pattern = pattern.slice(negateOffset);
+ this.negate = negate;
+ }
+ // set partial to true to test if, for example,
+ // "/a/b" matches the start of "/*/b/*/d"
+ // Partial means, if you run out of file before you run
+ // out of pattern, then that's fine, as long as all
+ // the parts match.
+ matchOne(file, pattern, partial = false) {
+ const options = this.options;
+ // a UNC pattern like //?/c:/* can match a path like c:/x
+ // and vice versa
+ if (isWindows) {
+ const fileUNC = file[0] === '' &&
+ file[1] === '' &&
+ file[2] === '?' &&
+ typeof file[3] === 'string' &&
+ /^[a-z]:$/i.test(file[3]);
+ const patternUNC = pattern[0] === '' &&
+ pattern[1] === '' &&
+ pattern[2] === '?' &&
+ typeof pattern[3] === 'string' &&
+ /^[a-z]:$/i.test(pattern[3]);
+ if (fileUNC && patternUNC) {
+ const fd = file[3];
+ const pd = pattern[3];
+ if (fd.toLowerCase() === pd.toLowerCase()) {
+ file[3] = pd;
+ }
+ }
+ else if (patternUNC && typeof file[0] === 'string') {
+ const pd = pattern[3];
+ const fd = file[0];
+ if (pd.toLowerCase() === fd.toLowerCase()) {
+ pattern[3] = fd;
+ pattern = pattern.slice(3);
+ }
+ }
+ else if (fileUNC && typeof pattern[0] === 'string') {
+ const fd = file[3];
+ if (fd.toLowerCase() === pattern[0].toLowerCase()) {
+ pattern[0] = fd;
+ file = file.slice(3);
+ }
+ }
+ }
+ this.debug('matchOne', this, { file, pattern });
+ this.debug('matchOne', file.length, pattern.length);
+ for (var fi = 0, pi = 0, fl = file.length, pl = pattern.length; fi < fl && pi < pl; fi++, pi++) {
+ this.debug('matchOne loop');
+ var p = pattern[pi];
+ var f = file[fi];
+ this.debug(pattern, p, f);
+ // should be impossible.
+ // some invalid regexp stuff in the set.
+ /* c8 ignore start */
+ if (p === false) {
+ return false;
+ }
+ /* c8 ignore stop */
+ if (p === exports.GLOBSTAR) {
+ this.debug('GLOBSTAR', [pattern, p, f]);
+ // "**"
+ // a/**/b/**/c would match the following:
+ // a/b/x/y/z/c
+ // a/x/y/z/b/c
+ // a/b/x/b/x/c
+ // a/b/c
+ // To do this, take the rest of the pattern after
+ // the **, and see if it would match the file remainder.
+ // If so, return success.
+ // If not, the ** "swallows" a segment, and try again.
+ // This is recursively awful.
+ //
+ // a/**/b/**/c matching a/b/x/y/z/c
+ // - a matches a
+ // - doublestar
+ // - matchOne(b/x/y/z/c, b/**/c)
+ // - b matches b
+ // - doublestar
+ // - matchOne(x/y/z/c, c) -> no
+ // - matchOne(y/z/c, c) -> no
+ // - matchOne(z/c, c) -> no
+ // - matchOne(c, c) yes, hit
+ var fr = fi;
+ var pr = pi + 1;
+ if (pr === pl) {
+ this.debug('** at the end');
+ // a ** at the end will just swallow the rest.
+ // We have found a match.
+ // however, it will not swallow /.x, unless
+ // options.dot is set.
+ // . and .. are *never* matched by **, for explosively
+ // exponential reasons.
+ for (; fi < fl; fi++) {
+ if (file[fi] === '.' ||
+ file[fi] === '..' ||
+ (!options.dot && file[fi].charAt(0) === '.'))
+ return false;
+ }
+ return true;
+ }
+ // ok, let's see if we can swallow whatever we can.
+ while (fr < fl) {
+ var swallowee = file[fr];
+ this.debug('\nglobstar while', file, fr, pattern, pr, swallowee);
+ // XXX remove this slice. Just pass the start index.
+ if (this.matchOne(file.slice(fr), pattern.slice(pr), partial)) {
+ this.debug('globstar found match!', fr, fl, swallowee);
+ // found a match.
+ return true;
+ }
+ else {
+ // can't swallow "." or ".." ever.
+ // can only swallow ".foo" when explicitly asked.
+ if (swallowee === '.' ||
+ swallowee === '..' ||
+ (!options.dot && swallowee.charAt(0) === '.')) {
+ this.debug('dot detected!', file, fr, pattern, pr);
+ break;
+ }
+ // ** swallows a segment, and continue.
+ this.debug('globstar swallow a segment, and continue');
+ fr++;
+ }
+ }
+ // no match was found.
+ // However, in partial mode, we can't say this is necessarily over.
+ /* c8 ignore start */
+ if (partial) {
+ // ran out of file
+ this.debug('\n>>> no match, partial?', file, fr, pattern, pr);
+ if (fr === fl) {
+ return true;
+ }
+ }
+ /* c8 ignore stop */
+ return false;
+ }
+ // something other than **
+ // non-magic patterns just have to match exactly
+ // patterns with magic have been turned into regexps.
+ let hit;
+ if (typeof p === 'string') {
+ hit = f === p;
+ this.debug('string match', p, f, hit);
+ }
+ else {
+ hit = p.test(f);
+ this.debug('pattern match', p, f, hit);
+ }
+ if (!hit)
+ return false;
+ }
+ // Note: ending in / means that we'll get a final ""
+ // at the end of the pattern. This can only match a
+ // corresponding "" at the end of the file.
+ // If the file ends in /, then it can only match a
+ // a pattern that ends in /, unless the pattern just
+ // doesn't have any more for it. But, a/b/ should *not*
+ // match "a/b/*", even though "" matches against the
+ // [^/]*? pattern, except in partial mode, where it might
+ // simply not be reached yet.
+ // However, a/b/ should still satisfy a/*
+ // now either we fell off the end of the pattern, or we're done.
+ if (fi === fl && pi === pl) {
+ // ran out of pattern and filename at the same time.
+ // an exact hit!
+ return true;
+ }
+ else if (fi === fl) {
+ // ran out of file, but still had pattern left.
+ // this is ok if we're doing the match as part of
+ // a glob fs traversal.
+ return partial;
+ }
+ else if (pi === pl) {
+ // ran out of pattern, still have file left.
+ // this is only acceptable if we're on the very last
+ // empty segment of a file with a trailing slash.
+ // a/* should match a/b/
+ return fi === fl - 1 && file[fi] === '';
+ /* c8 ignore start */
+ }
+ else {
+ // should be unreachable.
+ throw new Error('wtf?');
+ }
+ /* c8 ignore stop */
+ }
+ braceExpand() {
+ return (0, exports.braceExpand)(this.pattern, this.options);
+ }
+ parse(pattern, isSub) {
+ assertValidPattern(pattern);
+ const options = this.options;
+ // shortcuts
+ if (pattern === '**') {
+ if (!options.noglobstar)
+ return exports.GLOBSTAR;
+ else
+ pattern = '*';
+ }
+ if (pattern === '')
+ return '';
+ // far and away, the most common glob pattern parts are
+ // *, *.*, and *. Add a fast check method for those.
+ let m;
+ let fastTest = null;
+ if (isSub !== SUBPARSE) {
+ if ((m = pattern.match(starRE))) {
+ fastTest = options.dot ? starTestDot : starTest;
+ }
+ else if ((m = pattern.match(starDotExtRE))) {
+ fastTest = (options.nocase
+ ? options.dot
+ ? starDotExtTestNocaseDot
+ : starDotExtTestNocase
+ : options.dot
+ ? starDotExtTestDot
+ : starDotExtTest)(m[1]);
+ }
+ else if ((m = pattern.match(qmarksRE))) {
+ fastTest = (options.nocase
+ ? options.dot
+ ? qmarksTestNocaseDot
+ : qmarksTestNocase
+ : options.dot
+ ? qmarksTestDot
+ : qmarksTest)(m);
+ }
+ else if ((m = pattern.match(starDotStarRE))) {
+ fastTest = options.dot ? starDotStarTestDot : starDotStarTest;
+ }
+ else if ((m = pattern.match(dotStarRE))) {
+ fastTest = dotStarTest;
+ }
+ }
+ let re = '';
+ let hasMagic = false;
+ let escaping = false;
+ // ? => one single character
+ const patternListStack = [];
+ const negativeLists = [];
+ let stateChar = false;
+ let inClass = false;
+ let reClassStart = -1;
+ let classStart = -1;
+ let cs;
+ let pl;
+ let sp;
+ // . and .. never match anything that doesn't start with .,
+ // even when options.dot is set. However, if the pattern
+ // starts with ., then traversal patterns can match.
+ let dotTravAllowed = pattern.charAt(0) === '.';
+ let dotFileAllowed = options.dot || dotTravAllowed;
+ const patternStart = () => dotTravAllowed
+ ? ''
+ : dotFileAllowed
+ ? '(?!(?:^|\\/)\\.{1,2}(?:$|\\/))'
+ : '(?!\\.)';
+ const subPatternStart = (p) => p.charAt(0) === '.'
+ ? ''
+ : options.dot
+ ? '(?!(?:^|\\/)\\.{1,2}(?:$|\\/))'
+ : '(?!\\.)';
+ const clearStateChar = () => {
+ if (stateChar) {
+ // we had some state-tracking character
+ // that wasn't consumed by this pass.
+ switch (stateChar) {
+ case '*':
+ re += star;
+ hasMagic = true;
+ break;
+ case '?':
+ re += qmark;
+ hasMagic = true;
+ break;
+ default:
+ re += '\\' + stateChar;
+ break;
+ }
+ this.debug('clearStateChar %j %j', stateChar, re);
+ stateChar = false;
+ }
+ };
+ for (let i = 0, c; i < pattern.length && (c = pattern.charAt(i)); i++) {
+ this.debug('%s\t%s %s %j', pattern, i, re, c);
+ // skip over any that are escaped.
+ if (escaping) {
+ // completely not allowed, even escaped.
+ // should be impossible.
+ /* c8 ignore start */
+ if (c === '/') {
+ return false;
+ }
+ /* c8 ignore stop */
+ if (reSpecials[c]) {
+ re += '\\';
+ }
+ re += c;
+ escaping = false;
+ continue;
+ }
+ switch (c) {
+ // Should already be path-split by now.
+ /* c8 ignore start */
+ case '/': {
+ return false;
+ }
+ /* c8 ignore stop */
+ case '\\':
+ if (inClass && pattern.charAt(i + 1) === '-') {
+ re += c;
+ continue;
+ }
+ clearStateChar();
+ escaping = true;
+ continue;
+ // the various stateChar values
+ // for the "extglob" stuff.
+ case '?':
+ case '*':
+ case '+':
+ case '@':
+ case '!':
+ this.debug('%s\t%s %s %j <-- stateChar', pattern, i, re, c);
+ // all of those are literals inside a class, except that
+ // the glob [!a] means [^a] in regexp
+ if (inClass) {
+ this.debug(' in class');
+ if (c === '!' && i === classStart + 1)
+ c = '^';
+ re += c;
+ continue;
+ }
+ // if we already have a stateChar, then it means
+ // that there was something like ** or +? in there.
+ // Handle the stateChar, then proceed with this one.
+ this.debug('call clearStateChar %j', stateChar);
+ clearStateChar();
+ stateChar = c;
+ // if extglob is disabled, then +(asdf|foo) isn't a thing.
+ // just clear the statechar *now*, rather than even diving into
+ // the patternList stuff.
+ if (options.noext)
+ clearStateChar();
+ continue;
+ case '(': {
+ if (inClass) {
+ re += '(';
+ continue;
+ }
+ if (!stateChar) {
+ re += '\\(';
+ continue;
+ }
+ const plEntry = {
+ type: stateChar,
+ start: i - 1,
+ reStart: re.length,
+ open: plTypes[stateChar].open,
+ close: plTypes[stateChar].close,
+ };
+ this.debug(this.pattern, '\t', plEntry);
+ patternListStack.push(plEntry);
+ // negation is (?:(?!(?:js)(?:))[^/]*)
+ re += plEntry.open;
+ // next entry starts with a dot maybe?
+ if (plEntry.start === 0 && plEntry.type !== '!') {
+ dotTravAllowed = true;
+ re += subPatternStart(pattern.slice(i + 1));
+ }
+ this.debug('plType %j %j', stateChar, re);
+ stateChar = false;
+ continue;
+ }
+ case ')': {
+ const plEntry = patternListStack[patternListStack.length - 1];
+ if (inClass || !plEntry) {
+ re += '\\)';
+ continue;
+ }
+ patternListStack.pop();
+ // closing an extglob
+ clearStateChar();
+ hasMagic = true;
+ pl = plEntry;
+ // negation is (?:(?!js)[^/]*)
+ // The others are (?:)
+ re += pl.close;
+ if (pl.type === '!') {
+ negativeLists.push(Object.assign(pl, { reEnd: re.length }));
+ }
+ continue;
+ }
+ case '|': {
+ const plEntry = patternListStack[patternListStack.length - 1];
+ if (inClass || !plEntry) {
+ re += '\\|';
+ continue;
+ }
+ clearStateChar();
+ re += '|';
+ // next subpattern can start with a dot?
+ if (plEntry.start === 0 && plEntry.type !== '!') {
+ dotTravAllowed = true;
+ re += subPatternStart(pattern.slice(i + 1));
+ }
+ continue;
+ }
+ // these are mostly the same in regexp and glob
+ case '[':
+ // swallow any state-tracking char before the [
+ clearStateChar();
+ if (inClass) {
+ re += '\\' + c;
+ continue;
+ }
+ inClass = true;
+ classStart = i;
+ reClassStart = re.length;
+ re += c;
+ continue;
+ case ']':
+ // a right bracket shall lose its special
+ // meaning and represent itself in
+ // a bracket expression if it occurs
+ // first in the list. -- POSIX.2 2.8.3.2
+ if (i === classStart + 1 || !inClass) {
+ re += '\\' + c;
+ continue;
+ }
+ // split where the last [ was, make sure we don't have
+ // an invalid re. if so, re-walk the contents of the
+ // would-be class to re-translate any characters that
+ // were passed through as-is
+ // TODO: It would probably be faster to determine this
+ // without a try/catch and a new RegExp, but it's tricky
+ // to do safely. For now, this is safe and works.
+ cs = pattern.substring(classStart + 1, i);
+ try {
+ RegExp('[' + braExpEscape(charUnescape(cs)) + ']');
+ // looks good, finish up the class.
+ re += c;
+ }
+ catch (er) {
+ // out of order ranges in JS are errors, but in glob syntax,
+ // they're just a range that matches nothing.
+ re = re.substring(0, reClassStart) + '(?:$.)'; // match nothing ever
+ }
+ hasMagic = true;
+ inClass = false;
+ continue;
+ default:
+ // swallow any state char that wasn't consumed
+ clearStateChar();
+ if (reSpecials[c] && !(c === '^' && inClass)) {
+ re += '\\';
+ }
+ re += c;
+ break;
+ } // switch
+ } // for
+ // handle the case where we left a class open.
+ // "[abc" is valid, equivalent to "\[abc"
+ if (inClass) {
+ // split where the last [ was, and escape it
+ // this is a huge pita. We now have to re-walk
+ // the contents of the would-be class to re-translate
+ // any characters that were passed through as-is
+ cs = pattern.slice(classStart + 1);
+ sp = this.parse(cs, SUBPARSE);
+ re = re.substring(0, reClassStart) + '\\[' + sp[0];
+ hasMagic = hasMagic || sp[1];
+ }
+ // handle the case where we had a +( thing at the *end*
+ // of the pattern.
+ // each pattern list stack adds 3 chars, and we need to go through
+ // and escape any | chars that were passed through as-is for the regexp.
+ // Go through and escape them, taking care not to double-escape any
+ // | chars that were already escaped.
+ for (pl = patternListStack.pop(); pl; pl = patternListStack.pop()) {
+ let tail;
+ tail = re.slice(pl.reStart + pl.open.length);
+ this.debug(this.pattern, 'setting tail', re, pl);
+ // maybe some even number of \, then maybe 1 \, followed by a |
+ tail = tail.replace(/((?:\\{2}){0,64})(\\?)\|/g, (_, $1, $2) => {
+ if (!$2) {
+ // the | isn't already escaped, so escape it.
+ $2 = '\\';
+ // should already be done
+ /* c8 ignore start */
+ }
+ /* c8 ignore stop */
+ // need to escape all those slashes *again*, without escaping the
+ // one that we need for escaping the | character. As it works out,
+ // escaping an even number of slashes can be done by simply repeating
+ // it exactly after itself. That's why this trick works.
+ //
+ // I am sorry that you have to see this.
+ return $1 + $1 + $2 + '|';
+ });
+ this.debug('tail=%j\n %s', tail, tail, pl, re);
+ const t = pl.type === '*' ? star : pl.type === '?' ? qmark : '\\' + pl.type;
+ hasMagic = true;
+ re = re.slice(0, pl.reStart) + t + '\\(' + tail;
+ }
+ // handle trailing things that only matter at the very end.
+ clearStateChar();
+ if (escaping) {
+ // trailing \\
+ re += '\\\\';
+ }
+ // only need to apply the nodot start if the re starts with
+ // something that could conceivably capture a dot
+ const addPatternStart = addPatternStartSet[re.charAt(0)];
+ // Hack to work around lack of negative lookbehind in JS
+ // A pattern like: *.!(x).!(y|z) needs to ensure that a name
+ // like 'a.xyz.yz' doesn't match. So, the first negative
+ // lookahead, has to look ALL the way ahead, to the end of
+ // the pattern.
+ for (let n = negativeLists.length - 1; n > -1; n--) {
+ const nl = negativeLists[n];
+ const nlBefore = re.slice(0, nl.reStart);
+ const nlFirst = re.slice(nl.reStart, nl.reEnd - 8);
+ let nlAfter = re.slice(nl.reEnd);
+ const nlLast = re.slice(nl.reEnd - 8, nl.reEnd) + nlAfter;
+ // Handle nested stuff like *(*.js|!(*.json)), where open parens
+ // mean that we should *not* include the ) in the bit that is considered
+ // "after" the negated section.
+ const closeParensBefore = nlBefore.split(')').length;
+ const openParensBefore = nlBefore.split('(').length - closeParensBefore;
+ let cleanAfter = nlAfter;
+ for (let i = 0; i < openParensBefore; i++) {
+ cleanAfter = cleanAfter.replace(/\)[+*?]?/, '');
+ }
+ nlAfter = cleanAfter;
+ const dollar = nlAfter === '' && isSub !== SUBPARSE ? '(?:$|\\/)' : '';
+ re = nlBefore + nlFirst + nlAfter + dollar + nlLast;
+ }
+ // if the re is not "" at this point, then we need to make sure
+ // it doesn't match against an empty path part.
+ // Otherwise a/* will match a/, which it should not.
+ if (re !== '' && hasMagic) {
+ re = '(?=.)' + re;
+ }
+ if (addPatternStart) {
+ re = patternStart() + re;
+ }
+ // parsing just a piece of a larger pattern.
+ if (isSub === SUBPARSE) {
+ return [re, hasMagic];
+ }
+ // if it's nocase, and the lcase/uppercase don't match, it's magic
+ if (options.nocase && !hasMagic && !options.nocaseMagicOnly) {
+ hasMagic = pattern.toUpperCase() !== pattern.toLowerCase();
+ }
+ // skip the regexp for non-magical patterns
+ // unescape anything in it, though, so that it'll be
+ // an exact match against a file etc.
+ if (!hasMagic) {
+ return globUnescape(pattern);
+ }
+ const flags = options.nocase ? 'i' : '';
+ try {
+ const ext = fastTest
+ ? {
+ _glob: pattern,
+ _src: re,
+ test: fastTest,
+ }
+ : {
+ _glob: pattern,
+ _src: re,
+ };
+ return Object.assign(new RegExp('^' + re + '$', flags), ext);
+ /* c8 ignore start */
+ }
+ catch (er) {
+ // should be impossible
+ // If it was an invalid regular expression, then it can't match
+ // anything. This trick looks for a character after the end of
+ // the string, which is of course impossible, except in multi-line
+ // mode, but it's not a /m regex.
+ this.debug('invalid regexp', er);
+ return new RegExp('$.');
+ }
+ /* c8 ignore stop */
+ }
+ makeRe() {
+ if (this.regexp || this.regexp === false)
+ return this.regexp;
+ // at this point, this.set is a 2d array of partial
+ // pattern strings, or "**".
+ //
+ // It's better to use .match(). This function shouldn't
+ // be used, really, but it's pretty convenient sometimes,
+ // when you just want to work with a regex.
+ const set = this.set;
+ if (!set.length) {
+ this.regexp = false;
+ return this.regexp;
+ }
+ const options = this.options;
+ const twoStar = options.noglobstar
+ ? star
+ : options.dot
+ ? twoStarDot
+ : twoStarNoDot;
+ const flags = options.nocase ? 'i' : '';
+ // regexpify non-globstar patterns
+ // if ** is only item, then we just do one twoStar
+ // if ** is first, and there are more, prepend (\/|twoStar\/)? to next
+ // if ** is last, append (\/twoStar|) to previous
+ // if ** is in the middle, append (\/|\/twoStar\/) to previous
+ // then filter out GLOBSTAR symbols
+ let re = set
+ .map(pattern => {
+ const pp = pattern.map(p => typeof p === 'string'
+ ? regExpEscape(p)
+ : p === exports.GLOBSTAR
+ ? exports.GLOBSTAR
+ : p._src);
+ pp.forEach((p, i) => {
+ const next = pp[i + 1];
+ const prev = pp[i - 1];
+ if (p !== exports.GLOBSTAR || prev === exports.GLOBSTAR) {
+ return;
+ }
+ if (prev === undefined) {
+ if (next !== undefined && next !== exports.GLOBSTAR) {
+ pp[i + 1] = '(?:\\/|' + twoStar + '\\/)?' + next;
+ }
+ else {
+ pp[i] = twoStar;
+ }
+ }
+ else if (next === undefined) {
+ pp[i - 1] = prev + '(?:\\/|' + twoStar + ')?';
+ }
+ else if (next !== exports.GLOBSTAR) {
+ pp[i - 1] = prev + '(?:\\/|\\/' + twoStar + '\\/)' + next;
+ pp[i + 1] = exports.GLOBSTAR;
+ }
+ });
+ return pp.filter(p => p !== exports.GLOBSTAR).join('/');
+ })
+ .join('|');
+ // must match entire pattern
+ // ending in a * or ** will make it less strict.
+ re = '^(?:' + re + ')$';
+ // can match anything, as long as it's not this.
+ if (this.negate)
+ re = '^(?!' + re + ').*$';
+ try {
+ this.regexp = new RegExp(re, flags);
+ /* c8 ignore start */
+ }
+ catch (ex) {
+ // should be impossible
+ this.regexp = false;
+ }
+ /* c8 ignore stop */
+ return this.regexp;
+ }
+ slashSplit(p) {
+ // if p starts with // on windows, we preserve that
+ // so that UNC paths aren't broken. Otherwise, any number of
+ // / characters are coalesced into one, unless
+ // preserveMultipleSlashes is set to true.
+ if (this.preserveMultipleSlashes) {
+ return p.split('/');
+ }
+ else if (isWindows && /^\/\/[^\/]+/.test(p)) {
+ // add an extra '' for the one we lose
+ return ['', ...p.split(/\/+/)];
+ }
+ else {
+ return p.split(/\/+/);
+ }
+ }
+ match(f, partial = this.partial) {
+ this.debug('match', f, this.pattern);
+ // short-circuit in the case of busted things.
+ // comments, etc.
+ if (this.comment) {
+ return false;
+ }
+ if (this.empty) {
+ return f === '';
+ }
+ if (f === '/' && partial) {
+ return true;
+ }
+ const options = this.options;
+ // windows: need to use /, not \
+ if (path.sep !== '/') {
+ f = f.split(path.sep).join('/');
+ }
+ // treat the test path as a set of pathparts.
+ const ff = this.slashSplit(f);
+ this.debug(this.pattern, 'split', ff);
+ // just ONE of the pattern sets in this.set needs to match
+ // in order for it to be valid. If negating, then just one
+ // match means that we have failed.
+ // Either way, return on the first hit.
+ const set = this.set;
+ this.debug(this.pattern, 'set', set);
+ // Find the basename of the path by looking for the last non-empty segment
+ let filename = ff[ff.length - 1];
+ if (!filename) {
+ for (let i = ff.length - 2; !filename && i >= 0; i--) {
+ filename = ff[i];
+ }
+ }
+ for (let i = 0; i < set.length; i++) {
+ const pattern = set[i];
+ let file = ff;
+ if (options.matchBase && pattern.length === 1) {
+ file = [filename];
+ }
+ const hit = this.matchOne(file, pattern, partial);
+ if (hit) {
+ if (options.flipNegate) {
+ return true;
+ }
+ return !this.negate;
+ }
+ }
+ // didn't get any hits. this is success if it's a negative
+ // pattern, failure otherwise.
+ if (options.flipNegate) {
+ return false;
+ }
+ return this.negate;
+ }
+ static defaults(def) {
+ return exports.minimatch.defaults(def).Minimatch;
+ }
+}
+exports.Minimatch = Minimatch;
+exports.minimatch.Minimatch = Minimatch;
+//# sourceMappingURL=index.js.map
\ No newline at end of file
diff --git a/deps/npm/node_modules/minimatch/dist/cjs/index.js.map b/deps/npm/node_modules/minimatch/dist/cjs/index.js.map
new file mode 100644
index 00000000000000..0f561db44c2027
--- /dev/null
+++ b/deps/npm/node_modules/minimatch/dist/cjs/index.js.map
@@ -0,0 +1 @@
+{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":";;;;;;AAmBO,MAAM,SAAS,GAAG,CACvB,CAAS,EACT,OAAe,EACf,UAA4B,EAAE,EAC9B,EAAE;IACF,kBAAkB,CAAC,OAAO,CAAC,CAAA;IAE3B,oCAAoC;IACpC,IAAI,CAAC,OAAO,CAAC,SAAS,IAAI,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,EAAE;QACnD,OAAO,KAAK,CAAA;KACb;IAED,OAAO,IAAI,SAAS,CAAC,OAAO,EAAE,OAAO,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAA;AACjD,CAAC,CAAA;AAbY,QAAA,SAAS,aAarB;AAED,kBAAe,iBAAS,CAAA;AAExB,wDAAwD;AACxD,MAAM,YAAY,GAAG,uBAAuB,CAAA;AAC5C,MAAM,cAAc,GAAG,CAAC,GAAW,EAAE,EAAE,CAAC,CAAC,CAAS,EAAE,EAAE,CACpD,CAAC,CAAC,CAAC,UAAU,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AACvC,MAAM,iBAAiB,GAAG,CAAC,GAAW,EAAE,EAAE,CAAC,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AACzE,MAAM,oBAAoB,GAAG,CAAC,GAAW,EAAE,EAAE;IAC3C,GAAG,GAAG,GAAG,CAAC,WAAW,EAAE,CAAA;IACvB,OAAO,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,CAAC,UAAU,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC,WAAW,EAAE,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AAC3E,CAAC,CAAA;AACD,MAAM,uBAAuB,GAAG,CAAC,GAAW,EAAE,EAAE;IAC9C,GAAG,GAAG,GAAG,CAAC,WAAW,EAAE,CAAA;IACvB,OAAO,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,WAAW,EAAE,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AACrD,CAAC,CAAA;AACD,MAAM,aAAa,GAAG,YAAY,CAAA;AAClC,MAAM,eAAe,GAAG,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,CAAC,UAAU,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AAC5E,MAAM,kBAAkB,GAAG,CAAC,CAAS,EAAE,EAAE,CACvC,CAAC,KAAK,GAAG,IAAI,CAAC,KAAK,IAAI,IAAI,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AAC5C,MAAM,SAAS,GAAG,SAAS,CAAA;AAC3B,MAAM,WAAW,GAAG,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,KAAK,GAAG,IAAI,CAAC,KAAK,IAAI,IAAI,CAAC,CAAC,UAAU,CAAC,GAAG,CAAC,CAAA;AAC/E,MAAM,MAAM,GAAG,OAAO,CAAA;AACtB,MAAM,QAAQ,GAAG,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,MAAM,KAAK,CAAC,IAAI,CAAC,CAAC,CAAC,UAAU,CAAC,GAAG,CAAC,CAAA;AACpE,MAAM,WAAW,GAAG,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,MAAM,KAAK,CAAC,IAAI,CAAC,KAAK,GAAG,IAAI,CAAC,KAAK,IAAI,CAAA;AAC5E,MAAM,QAAQ,GAAG,wBAAwB,CAAA;AACzC,MAAM,gBAAgB,GAAG,CAAC,CAAC,EAAE,EAAE,GAAG,GAAG,EAAE,CAAmB,EAAE,EAAE;IAC5D,MAAM,KAAK,GAAG,eAAe,CAAC,CAAC,EAAE,CAAC,CAAC,CAAA;IACnC,IAAI,CAAC,GAAG;QAAE,OAAO,KAAK,CAAA;IACtB,GAAG,GAAG,GAAG,CAAC,WAAW,EAAE,CAAA;IACvB,OAAO,CAAC,CAAS,EAAE,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,WAAW,EAAE,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AACjE,CAAC,CAAA;AACD,MAAM,mBAAmB,GAAG,CAAC,CAAC,EAAE,EAAE,GAAG,GAAG,EAAE,CAAmB,EAAE,EAAE;IAC/D,MAAM,KAAK,GAAG,kBAAkB,CAAC,CAAC,EAAE,CAAC,CAAC,CAAA;IACtC,IAAI,CAAC,GAAG;QAAE,OAAO,KAAK,CAAA;IACtB,GAAG,GAAG,GAAG,CAAC,WAAW,EAAE,CAAA;IACvB,OAAO,CAAC,CAAS,EAAE,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,WAAW,EAAE,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AACjE,CAAC,CAAA;AACD,MAAM,aAAa,GAAG,CAAC,CAAC,EAAE,EAAE,GAAG,GAAG,EAAE,CAAmB,EAAE,EAAE;IACzD,MAAM,KAAK,GAAG,kBAAkB,CAAC,CAAC,EAAE,CAAC,CAAC,CAAA;IACtC,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAS,EAAE,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AAClE,CAAC,CAAA;AACD,MAAM,UAAU,GAAG,CAAC,CAAC,EAAE,EAAE,GAAG,GAAG,EAAE,CAAmB,EAAE,EAAE;IACtD,MAAM,KAAK,GAAG,eAAe,CAAC,CAAC,EAAE,CAAC,CAAC,CAAA;IACnC,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAS,EAAE,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AAClE,CAAC,CAAA;AACD,MAAM,eAAe,GAAG,CAAC,CAAC,EAAE,CAAmB,EAAE,EAAE;IACjD,MAAM,GAAG,GAAG,EAAE,CAAC,MAAM,CAAA;IACrB,OAAO,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,MAAM,KAAK,GAAG,IAAI,CAAC,CAAC,CAAC,UAAU,CAAC,GAAG,CAAC,CAAA;AAC9D,CAAC,CAAA;AACD,MAAM,kBAAkB,GAAG,CAAC,CAAC,EAAE,CAAmB,EAAE,EAAE;IACpD,MAAM,GAAG,GAAG,EAAE,CAAC,MAAM,CAAA;IACrB,OAAO,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,MAAM,KAAK,GAAG,IAAI,CAAC,KAAK,GAAG,IAAI,CAAC,KAAK,IA
AI,CAAA;AACnE,CAAC,CAAA;AAED,qBAAqB;AACrB,MAAM,QAAQ,GACZ,OAAO,OAAO,KAAK,QAAQ,IAAI,OAAO;IACpC,CAAC,CAAC,CAAC,OAAO,OAAO,CAAC,GAAG,KAAK,QAAQ;QAC9B,OAAO,CAAC,GAAG;QACX,OAAO,CAAC,GAAG,CAAC,8BAA8B,CAAC;QAC7C,OAAO,CAAC,QAAQ;IAClB,CAAC,CAAC,OAAO,CAAA;AACb,MAAM,SAAS,GAAG,QAAQ,KAAK,OAAO,CAAA;AACtC,MAAM,IAAI,GAAG,SAAS,CAAC,CAAC,CAAC,EAAE,GAAG,EAAE,IAAI,EAAE,CAAC,CAAC,CAAC,EAAE,GAAG,EAAE,GAAG,EAAE,CAAA;AACrD,oBAAoB;AAEP,QAAA,GAAG,GAAG,IAAI,CAAC,GAAG,CAAA;AAC3B,iBAAS,CAAC,GAAG,GAAG,WAAG,CAAA;AAEN,QAAA,QAAQ,GAAG,MAAM,CAAC,aAAa,CAAC,CAAA;AAC7C,iBAAS,CAAC,QAAQ,GAAG,gBAAQ,CAAA;AAC7B,sEAAoC;AAEpC,MAAM,OAAO,GAAG;IACd,GAAG,EAAE,EAAE,IAAI,EAAE,WAAW,EAAE,KAAK,EAAE,WAAW,EAAE;IAC9C,GAAG,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE,KAAK,EAAE,IAAI,EAAE;IACjC,GAAG,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE,KAAK,EAAE,IAAI,EAAE;IACjC,GAAG,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE,KAAK,EAAE,IAAI,EAAE;IACjC,GAAG,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE,KAAK,EAAE,GAAG,EAAE;CACjC,CAAA;AAGD,gCAAgC;AAChC,iDAAiD;AACjD,MAAM,KAAK,GAAG,MAAM,CAAA;AAEpB,gCAAgC;AAChC,MAAM,IAAI,GAAG,KAAK,GAAG,IAAI,CAAA;AAEzB,4DAA4D;AAC5D,+DAA+D;AAC/D,6CAA6C;AAC7C,MAAM,UAAU,GAAG,yCAAyC,CAAA;AAE5D,kCAAkC;AAClC,6CAA6C;AAC7C,MAAM,YAAY,GAAG,yBAAyB,CAAA;AAE9C,sCAAsC;AACtC,MAAM,OAAO,GAAG,CAAC,CAAS,EAAE,EAAE,CAC5B,CAAC,CAAC,KAAK,CAAC,EAAE,CAAC,CAAC,MAAM,CAAC,CAAC,GAA6B,EAAE,CAAC,EAAE,EAAE;IACtD,GAAG,CAAC,CAAC,CAAC,GAAG,IAAI,CAAA;IACb,OAAO,GAAG,CAAA;AACZ,CAAC,EAAE,EAAE,CAAC,CAAA;AAER,gDAAgD;AAChD,MAAM,UAAU,GAAG,OAAO,CAAC,iBAAiB,CAAC,CAAA;AAE7C,4DAA4D;AAC5D,MAAM,kBAAkB,GAAG,OAAO,CAAC,KAAK,CAAC,CAAA;AAElC,MAAM,MAAM,GACjB,CAAC,OAAe,EAAE,UAA4B,EAAE,EAAE,EAAE,CACpD,CAAC,CAAS,EAAE,EAAE,CACZ,IAAA,iBAAS,EAAC,CAAC,EAAE,OAAO,EAAE,OAAO,CAAC,CAAA;AAHrB,QAAA,MAAM,UAGe;AAClC,iBAAS,CAAC,MAAM,GAAG,cAAM,CAAA;AAEzB,MAAM,GAAG,GAAG,CAAC,CAAmB,EAAE,IAAsB,EAAE,EAAE,EAAE,CAC5D,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,CAAC,EAAE,CAAC,CAAC,CAAA;AAElB,MAAM,QAAQ,GAAG,CAAC,GAAqB,EAAoB,EAAE;IAClE,IAAI,CAAC,GAAG,IAAI,OAAO,GAAG,KAAK,QAAQ,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,MAAM,EAAE;QAC/D,OAAO,iBAAS,CAAA;KACjB;IAED,MAAM,IAAI,GAAG,iBAAS,CAAA;IAEtB,MAAM,CAAC,GAAG,CAAC,CAAS,EAAE,OAAe,EAAE,UAA4B,EAAE,EAAE,EAAE,CACvE,IAAI,CAAC,CAAC,EAAE,OAAO,EAAE,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC,CAAA;IAErC,OAAO,MAAM,CAAC,MAAM,CAAC,CAAC,EAAE;QACtB,SAAS,EAAE,MAAM,SAAU,SAAQ,IAAI,CAAC,SAAS;YAC/C,YAAY,OAAe,EAAE,UAA4B,EAAE;gBACzD,KAAK,CAAC,OAAO,EAAE,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC,CAAA;YACnC,CAAC;YACD,MAAM,CAAC,QAAQ,CAAC,OAAyB;gBACvC,OAAO,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC,CAAC,SAAS,CAAA;YACnD,CAAC;SACF;QAED,MAAM,EAAE,CAAC,OAAe,EAAE,UAA4B,EAAE,EAAE,EAAE,CAC1D,IAAI,CAAC,MAAM,CAAC,OAAO,EAAE,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC;QAEzC,QAAQ,EAAE,CAAC,OAAyB,EAAE,EAAE,CAAC,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC;QAEzE,MAAM,EAAE,CAAC,OAAe,EAAE,UAA4B,EAAE,EAAE,EAAE,CAC1D,IAAI,CAAC,MAAM,CAAC,OAAO,EAAE,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC;QAEzC,WAAW,EAAE,CAAC,OAAe,EAAE,UAA4B,EAAE,EAAE,EAAE,CAC/D,IAAI,CAAC,WAAW,CAAC,OAAO,EAAE,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC;QAE9C,KAAK,EAAE,CAAC,IAAc,EAAE,OAAe,EAAE,UAA4B,EAAE,EAAE,EAAE,CACzE,IAAI,CAAC,KAAK,CAAC,IAAI,EAAE,OAAO,EAAE,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC;QAE9C,GAAG,EAAE,IAAI,CAAC,GAAG;QACb,QAAQ,EAAE,gBAA2B;KACtC,CAAC,CAAA;AACJ,CAAC,CAAA;AArCY,QAAA,QAAQ,YAqCpB;AACD,iBAAS,CAAC,QAAQ,GAAG,gBAAQ,CAAA;AAE7B,mBAAmB;AACnB,qBAAqB;AACrB,mBAAmB;AACnB,8BAA8B;AAC9B,mCAAmC;AACnC,2CAA2C;AAC3C,EAAE;AACF,iCAAiC;AACjC,qBAAqB;AACrB,iBAAiB;AACV,MAAM,WAAW,GAAG,CACzB,OAAe,EACf,UAA4B,EAAE,EAC9B,EAAE;IACF,kBAAkB,CAAC,OAAO,CAAC,CAAA;IAE3B,wDAAwD;IACxD,wDAAwD;IACxD,IAAI,OAAO,CAAC,OAAO,IAAI,CAAC,kBAAkB,CAAC,IAAI,CAAC,OAAO,CAA
C,EAAE;QACxD,+BAA+B;QAC/B,OAAO,CAAC,OAAO,CAAC,CAAA;KACjB;IAED,OAAO,IAAA,yBAAM,EAAC,OAAO,CAAC,CAAA;AACxB,CAAC,CAAA;AAdY,QAAA,WAAW,eAcvB;AACD,iBAAS,CAAC,WAAW,GAAG,mBAAW,CAAA;AAEnC,MAAM,kBAAkB,GAAG,IAAI,GAAG,EAAE,CAAA;AACpC,MAAM,kBAAkB,GAA2B,CACjD,OAAY,EACe,EAAE;IAC7B,IAAI,OAAO,OAAO,KAAK,QAAQ,EAAE;QAC/B,MAAM,IAAI,SAAS,CAAC,iBAAiB,CAAC,CAAA;KACvC;IAED,IAAI,OAAO,CAAC,MAAM,GAAG,kBAAkB,EAAE;QACvC,MAAM,IAAI,SAAS,CAAC,qBAAqB,CAAC,CAAA;KAC3C;AACH,CAAC,CAAA;AAED,yCAAyC;AACzC,kDAAkD;AAClD,oEAAoE;AACpE,oEAAoE;AACpE,6DAA6D;AAC7D,kEAAkE;AAClE,EAAE;AACF,0EAA0E;AAC1E,wEAAwE;AACxE,qEAAqE;AACrE,8DAA8D;AAC9D,MAAM,QAAQ,GAAG,MAAM,CAAC,UAAU,CAAC,CAAA;AAE5B,MAAM,MAAM,GAAG,CAAC,OAAe,EAAE,UAA4B,EAAE,EAAE,EAAE,CACxE,IAAI,SAAS,CAAC,OAAO,EAAE,OAAO,CAAC,CAAC,MAAM,EAAE,CAAA;AAD7B,QAAA,MAAM,UACuB;AAC1C,iBAAS,CAAC,MAAM,GAAG,cAAM,CAAA;AAElB,MAAM,KAAK,GAAG,CACnB,IAAc,EACd,OAAe,EACf,UAA4B,EAAE,EAC9B,EAAE;IACF,MAAM,EAAE,GAAG,IAAI,SAAS,CAAC,OAAO,EAAE,OAAO,CAAC,CAAA;IAC1C,IAAI,GAAG,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAA;IACpC,IAAI,EAAE,CAAC,OAAO,CAAC,MAAM,IAAI,CAAC,IAAI,CAAC,MAAM,EAAE;QACrC,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,CAAA;KACnB;IACD,OAAO,IAAI,CAAA;AACb,CAAC,CAAA;AAXY,QAAA,KAAK,SAWjB;AACD,iBAAS,CAAC,KAAK,GAAG,aAAK,CAAA;AAEvB,+BAA+B;AAC/B,MAAM,YAAY,GAAG,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,OAAO,CAAC,QAAQ,EAAE,IAAI,CAAC,CAAA;AAC7D,MAAM,YAAY,GAAG,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,OAAO,CAAC,aAAa,EAAE,IAAI,CAAC,CAAA;AAClE,MAAM,YAAY,GAAG,CAAC,CAAS,EAAE,EAAE,CACjC,CAAC,CAAC,OAAO,CAAC,0BAA0B,EAAE,MAAM,CAAC,CAAA;AAC/C,MAAM,YAAY,GAAG,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,OAAO,CAAC,UAAU,EAAE,MAAM,CAAC,CAAA;AAsBjE,MAAa,SAAS;IACpB,OAAO,CAAkB;IACzB,GAAG,CAAyB;IAC5B,OAAO,CAAQ;IAEf,oBAAoB,CAAS;IAC7B,QAAQ,CAAS;IACjB,MAAM,CAAS;IACf,OAAO,CAAS;IAChB,KAAK,CAAS;IACd,uBAAuB,CAAS;IAChC,OAAO,CAAS;IAChB,OAAO,CAAU;IACjB,SAAS,CAAY;IAErB,MAAM,CAAyB;IAC/B,YAAY,OAAe,EAAE,UAA4B,EAAE;QACzD,kBAAkB,CAAC,OAAO,CAAC,CAAA;QAE3B,OAAO,GAAG,OAAO,IAAI,EAAE,CAAA;QACvB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAA;QACtB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAA;QACtB,IAAI,CAAC,oBAAoB;YACvB,CAAC,CAAC,OAAO,CAAC,oBAAoB,IAAI,OAAO,CAAC,kBAAkB,KAAK,KAAK,CAAA;QACxE,IAAI,IAAI,CAAC,oBAAoB,EAAE;YAC7B,IAAI,CAAC,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC,OAAO,CAAC,KAAK,EAAE,GAAG,CAAC,CAAA;SAChD;QACD,IAAI,CAAC,uBAAuB,GAAG,CAAC,CAAC,OAAO,CAAC,uBAAuB,CAAA;QAChE,IAAI,CAAC,MAAM,GAAG,IAAI,CAAA;QAClB,IAAI,CAAC,MAAM,GAAG,KAAK,CAAA;QACnB,IAAI,CAAC,QAAQ,GAAG,CAAC,CAAC,OAAO,CAAC,QAAQ,CAAA;QAClC,IAAI,CAAC,OAAO,GAAG,KAAK,CAAA;QACpB,IAAI,CAAC,KAAK,GAAG,KAAK,CAAA;QAClB,IAAI,CAAC,OAAO,GAAG,CAAC,CAAC,OAAO,CAAC,OAAO,CAAA;QAEhC,IAAI,CAAC,OAAO,GAAG,EAAE,CAAA;QACjB,IAAI,CAAC,SAAS,GAAG,EAAE,CAAA;QACnB,IAAI,CAAC,GAAG,GAAG,EAAE,CAAA;QAEb,+BAA+B;QAC/B,IAAI,CAAC,IAAI,EAAE,CAAA;IACb,CAAC;IAED,KAAK,CAAC,GAAG,CAAQ,IAAG,CAAC;IAErB,IAAI;QACF,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAC5B,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAE5B,6CAA6C;QAC7C,IAAI,CAAC,OAAO,CAAC,SAAS,IAAI,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,EAAE;YACnD,IAAI,CAAC,OAAO,GAAG,IAAI,CAAA;YACnB,OAAM;SACP;QAED,IAAI,CAAC,OAAO,EAAE;YACZ,IAAI,CAAC,KAAK,GAAG,IAAI,CAAA;YACjB,OAAM;SACP;QAED,oCAAoC;QACpC,IAAI,CAAC,WAAW,EAAE,CAAA;QAElB,wBAAwB;QACxB,IAAI,CAAC,OAAO,GAAG,IAAI,CAAC,WAAW,EAAE,CAAA;QAEjC,IAAI,OAAO,CAAC,KAAK,EAAE;YACjB,IAAI,CAAC,KAAK,GAAG,CAAC,GAAG,IAAW,EAAE,EAAE,CAAC,OAAO,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,CAAA;SACxD;QAED,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,OAAO,CAAC,CAAA;QAEtC,4EAA4E;QAC5E,qBAAqB;QACrB,8DAA8D;QAC9D,oDAAoD;QACpD,wCAAwC;QACxC,MAAM,YAAY,GAAG,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,CAAA;QAE9D,
sDAAsD;QACtD,iEAAiE;QACjE,sDAAsD;QACtD,qDAAqD;QACrD,4DAA4D;QAC5D,2DAA2D;QAC3D,6CAA6C;QAC7C,sEAAsE;QACtE,sEAAsE;QACtE,sEAAsE;QACtE,IAAI,IAAI,CAAC,OAAO,CAAC,UAAU,EAAE;YAC3B,iBAAiB;YACjB,IAAI,CAAC,SAAS,GAAG,YAAY,CAAA;SAC9B;aAAM;YACL,+DAA+D;YAC/D,gEAAgE;YAChE,KAAK,MAAM,KAAK,IAAI,YAAY,EAAE;gBAChC,IAAI,OAAgB,CAAA;gBACpB,GAAG;oBACD,OAAO,GAAG,KAAK,CAAA;oBACf,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,KAAK,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC,EAAE,EAAE;wBACzC,IAAI,KAAK,CAAC,CAAC,CAAC,KAAK,GAAG,IAAI,KAAK,CAAC,CAAC,GAAG,CAAC,CAAC,KAAK,IAAI,EAAE;4BAC7C,KAAK,CAAC,CAAC,CAAC,GAAG,IAAI,CAAA;4BACf,KAAK,CAAC,CAAC,GAAG,CAAC,CAAC,GAAG,GAAG,CAAA;4BAClB,OAAO,GAAG,IAAI,CAAA;yBACf;qBACF;iBACF,QAAQ,OAAO,EAAC;aAClB;YACD,IAAI,CAAC,SAAS,GAAG,YAAY,CAAC,GAAG,CAAC,KAAK,CAAC,EAAE;gBACxC,KAAK,GAAG,KAAK,CAAC,MAAM,CAAC,CAAC,GAAa,EAAE,IAAI,EAAE,EAAE;oBAC3C,MAAM,IAAI,GAAG,GAAG,CAAC,GAAG,CAAC,MAAM,GAAG,CAAC,CAAC,CAAA;oBAChC,IAAI,IAAI,KAAK,IAAI,IAAI,IAAI,KAAK,IAAI,EAAE;wBAClC,OAAO,GAAG,CAAA;qBACX;oBACD,IAAI,IAAI,KAAK,IAAI,EAAE;wBACjB,IAAI,IAAI,IAAI,IAAI,KAAK,IAAI,IAAI,IAAI,KAAK,GAAG,IAAI,IAAI,KAAK,IAAI,EAAE;4BAC1D,GAAG,CAAC,GAAG,EAAE,CAAA;4BACT,OAAO,GAAG,CAAA;yBACX;qBACF;oBACD,GAAG,CAAC,IAAI,CAAC,IAAI,CAAC,CAAA;oBACd,OAAO,GAAG,CAAA;gBACZ,CAAC,EAAE,EAAE,CAAC,CAAA;gBACN,OAAO,KAAK,CAAC,MAAM,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,KAAK,CAAA;YAC1C,CAAC,CAAC,CAAA;SACH;QAED,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,SAAS,CAAC,CAAA;QAExC,mBAAmB;QACnB,IAAI,GAAG,GAAG,IAAI,CAAC,SAAS,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE,EAAE,EAAE,CAAC,CAAC,CAAC,GAAG,CAAC,EAAE,CAAC,EAAE,CAAC,IAAI,CAAC,KAAK,CAAC,EAAE,CAAC,CAAC,CAAC,CAAA;QAEvE,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,GAAG,CAAC,CAAA;QAE7B,sDAAsD;QACtD,IAAI,CAAC,GAAG,GAAG,GAAG,CAAC,MAAM,CACnB,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,OAAO,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC,CACF,CAAA;QAE5B,2CAA2C;QAC3C,IAAI,SAAS,EAAE;YACb,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,GAAG,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;gBACxC,MAAM,CAAC,GAAG,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC,CAAA;gBACrB,IACE,CAAC,CAAC,CAAC,CAAC,KAAK,EAAE;oBACX,CAAC,CAAC,CAAC,CAAC,KAAK,EAAE;oBACX,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,KAAK,GAAG;oBAC5B,OAAO,CAAC,CAAC,CAAC,CAAC,KAAK,QAAQ;oBACxB,WAAW,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,EACtB;oBACA,CAAC,CAAC,CAAC,CAAC,GAAG,GAAG,CAAA;iBACX;aACF;SACF;QAED,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,GAAG,CAAC,CAAA;IACpC,CAAC;IAED,WAAW;QACT,IAAI,IAAI,CAAC,QAAQ;YAAE,OAAM;QAEzB,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAC5B,IAAI,MAAM,GAAG,KAAK,CAAA;QAClB,IAAI,YAAY,GAAG,CAAC,CAAA;QAEpB,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,OAAO,CAAC,MAAM,IAAI,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,EAAE,CAAC,EAAE,EAAE;YACpE,MAAM,GAAG,CAAC,MAAM,CAAA;YAChB,YAAY,EAAE,CAAA;SACf;QAED,IAAI,YAAY;YAAE,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC,KAAK,CAAC,YAAY,CAAC,CAAA;QAC5D,IAAI,CAAC,MAAM,GAAG,MAAM,CAAA;IACtB,CAAC;IAED,+CAA+C;IAC/C,yCAAyC;IACzC,uDAAuD;IACvD,mDAAmD;IACnD,mBAAmB;IACnB,QAAQ,CAAC,IAAc,EAAE,OAAsB,EAAE,UAAmB,KAAK;QACvE,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAE5B,yDAAyD;QACzD,iBAAiB;QACjB,IAAI,SAAS,EAAE;YACb,MAAM,OAAO,GACX,IAAI,CAAC,CAAC,CAAC,KAAK,EAAE;gBACd,IAAI,CAAC,CAAC,CAAC,KAAK,EAAE;gBACd,IAAI,CAAC,CAAC,CAAC,KAAK,GAAG;gBACf,OAAO,IAAI,CAAC,CAAC,CAAC,KAAK,QAAQ;gBAC3B,WAAW,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAA;YAC3B,MAAM,UAAU,GACd,OAAO,CAAC,CAAC,CAAC,KAAK,EAAE;gBACjB,OAAO,CAAC,CAAC,CAAC,KAAK,EAAE;gBACjB,OAAO,CAAC,CAAC,CAAC,KAAK,GAAG;gBAClB,OAAO,OAAO,CAAC,CAAC,CAAC,KAAK,QAAQ;gBAC9B,WAAW,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,CAAA;YAE9B,IAAI,OAAO,IAAI,UAAU,EAAE;gBACzB,MAAM,EAAE,GAAG,IAAI,CAAC,CAAC,CAAW,CAAA;gBAC5B,MAAM
,EAAE,GAAG,OAAO,CAAC,CAAC,CAAW,CAAA;gBAC/B,IAAI,EAAE,CAAC,WAAW,EAAE,KAAK,EAAE,CAAC,WAAW,EAAE,EAAE;oBACzC,IAAI,CAAC,CAAC,CAAC,GAAG,EAAE,CAAA;iBACb;aACF;iBAAM,IAAI,UAAU,IAAI,OAAO,IAAI,CAAC,CAAC,CAAC,KAAK,QAAQ,EAAE;gBACpD,MAAM,EAAE,GAAG,OAAO,CAAC,CAAC,CAAW,CAAA;gBAC/B,MAAM,EAAE,GAAG,IAAI,CAAC,CAAC,CAAC,CAAA;gBAClB,IAAI,EAAE,CAAC,WAAW,EAAE,KAAK,EAAE,CAAC,WAAW,EAAE,EAAE;oBACzC,OAAO,CAAC,CAAC,CAAC,GAAG,EAAE,CAAA;oBACf,OAAO,GAAG,OAAO,CAAC,KAAK,CAAC,CAAC,CAAC,CAAA;iBAC3B;aACF;iBAAM,IAAI,OAAO,IAAI,OAAO,OAAO,CAAC,CAAC,CAAC,KAAK,QAAQ,EAAE;gBACpD,MAAM,EAAE,GAAG,IAAI,CAAC,CAAC,CAAC,CAAA;gBAClB,IAAI,EAAE,CAAC,WAAW,EAAE,KAAK,OAAO,CAAC,CAAC,CAAC,CAAC,WAAW,EAAE,EAAE;oBACjD,OAAO,CAAC,CAAC,CAAC,GAAG,EAAE,CAAA;oBACf,IAAI,GAAG,IAAI,CAAC,KAAK,CAAC,CAAC,CAAC,CAAA;iBACrB;aACF;SACF;QAED,IAAI,CAAC,KAAK,CAAC,UAAU,EAAE,IAAI,EAAE,EAAE,IAAI,EAAE,OAAO,EAAE,CAAC,CAAA;QAC/C,IAAI,CAAC,KAAK,CAAC,UAAU,EAAE,IAAI,CAAC,MAAM,EAAE,OAAO,CAAC,MAAM,CAAC,CAAA;QAEnD,KACE,IAAI,EAAE,GAAG,CAAC,EAAE,EAAE,GAAG,CAAC,EAAE,EAAE,GAAG,IAAI,CAAC,MAAM,EAAE,EAAE,GAAG,OAAO,CAAC,MAAM,EACzD,EAAE,GAAG,EAAE,IAAI,EAAE,GAAG,EAAE,EAClB,EAAE,EAAE,EAAE,EAAE,EAAE,EACV;YACA,IAAI,CAAC,KAAK,CAAC,eAAe,CAAC,CAAA;YAC3B,IAAI,CAAC,GAAG,OAAO,CAAC,EAAE,CAAC,CAAA;YACnB,IAAI,CAAC,GAAG,IAAI,CAAC,EAAE,CAAC,CAAA;YAEhB,IAAI,CAAC,KAAK,CAAC,OAAO,EAAE,CAAC,EAAE,CAAC,CAAC,CAAA;YAEzB,wBAAwB;YACxB,wCAAwC;YACxC,qBAAqB;YACrB,IAAI,CAAC,KAAK,KAAK,EAAE;gBACf,OAAO,KAAK,CAAA;aACb;YACD,oBAAoB;YAEpB,IAAI,CAAC,KAAK,gBAAQ,EAAE;gBAClB,IAAI,CAAC,KAAK,CAAC,UAAU,EAAE,CAAC,OAAO,EAAE,CAAC,EAAE,CAAC,CAAC,CAAC,CAAA;gBAEvC,OAAO;gBACP,yCAAyC;gBACzC,cAAc;gBACd,cAAc;gBACd,cAAc;gBACd,QAAQ;gBACR,iDAAiD;gBACjD,wDAAwD;gBACxD,yBAAyB;gBACzB,sDAAsD;gBACtD,6BAA6B;gBAC7B,EAAE;gBACF,mCAAmC;gBACnC,gBAAgB;gBAChB,eAAe;gBACf,kCAAkC;gBAClC,oBAAoB;gBACpB,mBAAmB;gBACnB,qCAAqC;gBACrC,mCAAmC;gBACnC,iCAAiC;gBACjC,kCAAkC;gBAClC,IAAI,EAAE,GAAG,EAAE,CAAA;gBACX,IAAI,EAAE,GAAG,EAAE,GAAG,CAAC,CAAA;gBACf,IAAI,EAAE,KAAK,EAAE,EAAE;oBACb,IAAI,CAAC,KAAK,CAAC,eAAe,CAAC,CAAA;oBAC3B,8CAA8C;oBAC9C,yBAAyB;oBACzB,2CAA2C;oBAC3C,sBAAsB;oBACtB,sDAAsD;oBACtD,uBAAuB;oBACvB,OAAO,EAAE,GAAG,EAAE,EAAE,EAAE,EAAE,EAAE;wBACpB,IACE,IAAI,CAAC,EAAE,CAAC,KAAK,GAAG;4BAChB,IAAI,CAAC,EAAE,CAAC,KAAK,IAAI;4BACjB,CAAC,CAAC,OAAO,CAAC,GAAG,IAAI,IAAI,CAAC,EAAE,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,CAAC;4BAE5C,OAAO,KAAK,CAAA;qBACf;oBACD,OAAO,IAAI,CAAA;iBACZ;gBAED,mDAAmD;gBACnD,OAAO,EAAE,GAAG,EAAE,EAAE;oBACd,IAAI,SAAS,GAAG,IAAI,CAAC,EAAE,CAAC,CAAA;oBAExB,IAAI,CAAC,KAAK,CAAC,kBAAkB,EAAE,IAAI,EAAE,EAAE,EAAE,OAAO,EAAE,EAAE,EAAE,SAAS,CAAC,CAAA;oBAEhE,qDAAqD;oBACrD,IAAI,IAAI,CAAC,QAAQ,CAAC,IAAI,CAAC,KAAK,CAAC,EAAE,CAAC,EAAE,OAAO,CAAC,KAAK,CAAC,EAAE,CAAC,EAAE,OAAO,CAAC,EAAE;wBAC7D,IAAI,CAAC,KAAK,CAAC,uBAAuB,EAAE,EAAE,EAAE,EAAE,EAAE,SAAS,CAAC,CAAA;wBACtD,iBAAiB;wBACjB,OAAO,IAAI,CAAA;qBACZ;yBAAM;wBACL,kCAAkC;wBAClC,iDAAiD;wBACjD,IACE,SAAS,KAAK,GAAG;4BACjB,SAAS,KAAK,IAAI;4BAClB,CAAC,CAAC,OAAO,CAAC,GAAG,IAAI,SAAS,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,CAAC,EAC7C;4BACA,IAAI,CAAC,KAAK,CAAC,eAAe,EAAE,IAAI,EAAE,EAAE,EAAE,OAAO,EAAE,EAAE,CAAC,CAAA;4BAClD,MAAK;yBACN;wBAED,uCAAuC;wBACvC,IAAI,CAAC,KAAK,CAAC,0CAA0C,CAAC,CAAA;wBACtD,EAAE,EAAE,CAAA;qBACL;iBACF;gBAED,sBAAsB;gBACtB,mEAAmE;gBACnE,qBAAqB;gBACrB,IAAI,OAAO,EAAE;oBACX,kBAAkB;oBAClB,IAAI,CAAC,KAAK,CAAC,0BAA0B,EAAE,IAAI,EAAE,EAAE,EAAE,OAAO,EAAE,EAAE,CAAC,CAAA;oBAC7D,IAAI,EAAE,KAAK,EAAE,EAAE;wBACb,OAAO,IAAI,CAAA;qBACZ;iBACF;gBACD,oBAAoB;gBACpB,OAAO,KAAK,CAAA;aACb;YAED,0BAA0B;YAC1B,gDAAgD;YAChD,qDAAqD;YACrD,IAAI,GAAY,CAAA;YAChB,IAAI,OAAO,CAAC,KAAK,QAAQ,EAAE;gBACzB,GAAG,GAAG,CAAC,KAAK,CAAC,CAAA;gBACb,IAAI,CAAC,KAAK,CAAC,cAAc,EAAE,CAAC
,EAAE,CAAC,EAAE,GAAG,CAAC,CAAA;aACtC;iBAAM;gBACL,GAAG,GAAG,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,CAAA;gBACf,IAAI,CAAC,KAAK,CAAC,eAAe,EAAE,CAAC,EAAE,CAAC,EAAE,GAAG,CAAC,CAAA;aACvC;YAED,IAAI,CAAC,GAAG;gBAAE,OAAO,KAAK,CAAA;SACvB;QAED,oDAAoD;QACpD,oDAAoD;QACpD,2CAA2C;QAC3C,kDAAkD;QAClD,oDAAoD;QACpD,uDAAuD;QACvD,oDAAoD;QACpD,yDAAyD;QACzD,6BAA6B;QAC7B,yCAAyC;QAEzC,gEAAgE;QAChE,IAAI,EAAE,KAAK,EAAE,IAAI,EAAE,KAAK,EAAE,EAAE;YAC1B,oDAAoD;YACpD,gBAAgB;YAChB,OAAO,IAAI,CAAA;SACZ;aAAM,IAAI,EAAE,KAAK,EAAE,EAAE;YACpB,+CAA+C;YAC/C,iDAAiD;YACjD,uBAAuB;YACvB,OAAO,OAAO,CAAA;SACf;aAAM,IAAI,EAAE,KAAK,EAAE,EAAE;YACpB,4CAA4C;YAC5C,oDAAoD;YACpD,iDAAiD;YACjD,wBAAwB;YACxB,OAAO,EAAE,KAAK,EAAE,GAAG,CAAC,IAAI,IAAI,CAAC,EAAE,CAAC,KAAK,EAAE,CAAA;YAEvC,qBAAqB;SACtB;aAAM;YACL,yBAAyB;YACzB,MAAM,IAAI,KAAK,CAAC,MAAM,CAAC,CAAA;SACxB;QACD,oBAAoB;IACtB,CAAC;IAED,WAAW;QACT,OAAO,IAAA,mBAAW,EAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,OAAO,CAAC,CAAA;IAChD,CAAC;IAED,KAAK,CACH,OAAe,EACf,KAAuB;QAEvB,kBAAkB,CAAC,OAAO,CAAC,CAAA;QAE3B,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAE5B,YAAY;QACZ,IAAI,OAAO,KAAK,IAAI,EAAE;YACpB,IAAI,CAAC,OAAO,CAAC,UAAU;gBAAE,OAAO,gBAAQ,CAAA;;gBACnC,OAAO,GAAG,GAAG,CAAA;SACnB;QACD,IAAI,OAAO,KAAK,EAAE;YAAE,OAAO,EAAE,CAAA;QAE7B,uDAAuD;QACvD,0DAA0D;QAC1D,IAAI,CAA0B,CAAA;QAC9B,IAAI,QAAQ,GAAoC,IAAI,CAAA;QACpD,IAAI,KAAK,KAAK,QAAQ,EAAE;YACtB,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC,EAAE;gBAC/B,QAAQ,GAAG,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,WAAW,CAAC,CAAC,CAAC,QAAQ,CAAA;aAChD;iBAAM,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,KAAK,CAAC,YAAY,CAAC,CAAC,EAAE;gBAC5C,QAAQ,GAAG,CACT,OAAO,CAAC,MAAM;oBACZ,CAAC,CAAC,OAAO,CAAC,GAAG;wBACX,CAAC,CAAC,uBAAuB;wBACzB,CAAC,CAAC,oBAAoB;oBACxB,CAAC,CAAC,OAAO,CAAC,GAAG;wBACb,CAAC,CAAC,iBAAiB;wBACnB,CAAC,CAAC,cAAc,CACnB,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAA;aACR;iBAAM,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,KAAK,CAAC,QAAQ,CAAC,CAAC,EAAE;gBACxC,QAAQ,GAAG,CACT,OAAO,CAAC,MAAM;oBACZ,CAAC,CAAC,OAAO,CAAC,GAAG;wBACX,CAAC,CAAC,mBAAmB;wBACrB,CAAC,CAAC,gBAAgB;oBACpB,CAAC,CAAC,OAAO,CAAC,GAAG;wBACb,CAAC,CAAC,aAAa;wBACf,CAAC,CAAC,UAAU,CACf,CAAC,CAAC,CAAC,CAAA;aACL;iBAAM,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,KAAK,CAAC,aAAa,CAAC,CAAC,EAAE;gBAC7C,QAAQ,GAAG,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,kBAAkB,CAAC,CAAC,CAAC,eAAe,CAAA;aAC9D;iBAAM,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,KAAK,CAAC,SAAS,CAAC,CAAC,EAAE;gBACzC,QAAQ,GAAG,WAAW,CAAA;aACvB;SACF;QAED,IAAI,EAAE,GAAG,EAAE,CAAA;QACX,IAAI,QAAQ,GAAG,KAAK,CAAA;QACpB,IAAI,QAAQ,GAAG,KAAK,CAAA;QACpB,4BAA4B;QAC5B,MAAM,gBAAgB,GAAuB,EAAE,CAAA;QAC/C,MAAM,aAAa,GAA+B,EAAE,CAAA;QACpD,IAAI,SAAS,GAAsB,KAAK,CAAA;QACxC,IAAI,OAAO,GAAG,KAAK,CAAA;QACnB,IAAI,YAAY,GAAG,CAAC,CAAC,CAAA;QACrB,IAAI,UAAU,GAAG,CAAC,CAAC,CAAA;QACnB,IAAI,EAAU,CAAA;QACd,IAAI,EAAgC,CAAA;QACpC,IAAI,EAAkB,CAAA;QACtB,2DAA2D;QAC3D,yDAAyD;QACzD,oDAAoD;QACpD,IAAI,cAAc,GAAG,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,CAAA;QAC9C,IAAI,cAAc,GAAG,OAAO,CAAC,GAAG,IAAI,cAAc,CAAA;QAClD,MAAM,YAAY,GAAG,GAAG,EAAE,CACxB,cAAc;YACZ,CAAC,CAAC,EAAE;YACJ,CAAC,CAAC,cAAc;gBAChB,CAAC,CAAC,gCAAgC;gBAClC,CAAC,CAAC,SAAS,CAAA;QACf,MAAM,eAAe,GAAG,CAAC,CAAS,EAAE,EAAE,CACpC,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG;YACjB,CAAC,CAAC,EAAE;YACJ,CAAC,CAAC,OAAO,CAAC,GAAG;gBACb,CAAC,CAAC,gCAAgC;gBAClC,CAAC,CAAC,SAAS,CAAA;QAEf,MAAM,cAAc,GAAG,GAAG,EAAE;YAC1B,IAAI,SAAS,EAAE;gBACb,uCAAuC;gBACvC,qCAAqC;gBACrC,QAAQ,SAAS,EAAE;oBACjB,KAAK,GAAG;wBACN,EAAE,IAAI,IAAI,CAAA;wBACV,QAAQ,GAAG,IAAI,CAAA;wBACf,MAAK;oBACP,KAAK,GAAG;wBACN,EAAE,IAAI,KAAK,CAAA;wBACX,QAAQ,GAAG,IAAI,CAAA;wBACf,MAAK;oBACP;wBACE,EAAE,IAAI,IAAI,GAAG,SAAS,CAAA;wBACtB,MAAK;iBACR;gBACD,IAAI,CAAC,KAAK,CAAC,sBAAsB,EAAE,SAAS,EAAE,EAAE,CAAC,CAAA;gBACjD,SAAS,GAAG,KAAK,CAAA;aAClB
;QACH,CAAC,CAAA;QAED,KACE,IAAI,CAAC,GAAG,CAAC,EAAE,CAAS,EACpB,CAAC,GAAG,OAAO,CAAC,MAAM,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,EAC7C,CAAC,EAAE,EACH;YACA,IAAI,CAAC,KAAK,CAAC,cAAc,EAAE,OAAO,EAAE,CAAC,EAAE,EAAE,EAAE,CAAC,CAAC,CAAA;YAE7C,kCAAkC;YAClC,IAAI,QAAQ,EAAE;gBACZ,wCAAwC;gBACxC,wBAAwB;gBACxB,qBAAqB;gBACrB,IAAI,CAAC,KAAK,GAAG,EAAE;oBACb,OAAO,KAAK,CAAA;iBACb;gBACD,oBAAoB;gBAEpB,IAAI,UAAU,CAAC,CAAC,CAAC,EAAE;oBACjB,EAAE,IAAI,IAAI,CAAA;iBACX;gBACD,EAAE,IAAI,CAAC,CAAA;gBACP,QAAQ,GAAG,KAAK,CAAA;gBAChB,SAAQ;aACT;YAED,QAAQ,CAAC,EAAE;gBACT,uCAAuC;gBACvC,qBAAqB;gBACrB,KAAK,GAAG,CAAC,CAAC;oBACR,OAAO,KAAK,CAAA;iBACb;gBACD,oBAAoB;gBAEpB,KAAK,IAAI;oBACP,IAAI,OAAO,IAAI,OAAO,CAAC,MAAM,CAAC,CAAC,GAAG,CAAC,CAAC,KAAK,GAAG,EAAE;wBAC5C,EAAE,IAAI,CAAC,CAAA;wBACP,SAAQ;qBACT;oBAED,cAAc,EAAE,CAAA;oBAChB,QAAQ,GAAG,IAAI,CAAA;oBACf,SAAQ;gBAEV,+BAA+B;gBAC/B,2BAA2B;gBAC3B,KAAK,GAAG,CAAC;gBACT,KAAK,GAAG,CAAC;gBACT,KAAK,GAAG,CAAC;gBACT,KAAK,GAAG,CAAC;gBACT,KAAK,GAAG;oBACN,IAAI,CAAC,KAAK,CAAC,4BAA4B,EAAE,OAAO,EAAE,CAAC,EAAE,EAAE,EAAE,CAAC,CAAC,CAAA;oBAE3D,wDAAwD;oBACxD,qCAAqC;oBACrC,IAAI,OAAO,EAAE;wBACX,IAAI,CAAC,KAAK,CAAC,YAAY,CAAC,CAAA;wBACxB,IAAI,CAAC,KAAK,GAAG,IAAI,CAAC,KAAK,UAAU,GAAG,CAAC;4BAAE,CAAC,GAAG,GAAG,CAAA;wBAC9C,EAAE,IAAI,CAAC,CAAA;wBACP,SAAQ;qBACT;oBAED,gDAAgD;oBAChD,mDAAmD;oBACnD,oDAAoD;oBACpD,IAAI,CAAC,KAAK,CAAC,wBAAwB,EAAE,SAAS,CAAC,CAAA;oBAC/C,cAAc,EAAE,CAAA;oBAChB,SAAS,GAAG,CAAC,CAAA;oBACb,0DAA0D;oBAC1D,+DAA+D;oBAC/D,yBAAyB;oBACzB,IAAI,OAAO,CAAC,KAAK;wBAAE,cAAc,EAAE,CAAA;oBACnC,SAAQ;gBAEV,KAAK,GAAG,CAAC,CAAC;oBACR,IAAI,OAAO,EAAE;wBACX,EAAE,IAAI,GAAG,CAAA;wBACT,SAAQ;qBACT;oBAED,IAAI,CAAC,SAAS,EAAE;wBACd,EAAE,IAAI,KAAK,CAAA;wBACX,SAAQ;qBACT;oBAED,MAAM,OAAO,GAAqB;wBAChC,IAAI,EAAE,SAAS;wBACf,KAAK,EAAE,CAAC,GAAG,CAAC;wBACZ,OAAO,EAAE,EAAE,CAAC,MAAM;wBAClB,IAAI,EAAE,OAAO,CAAC,SAAS,CAAC,CAAC,IAAI;wBAC7B,KAAK,EAAE,OAAO,CAAC,SAAS,CAAC,CAAC,KAAK;qBAChC,CAAA;oBACD,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,EAAE,OAAO,CAAC,CAAA;oBACvC,gBAAgB,CAAC,IAAI,CAAC,OAAO,CAAC,CAAA;oBAC9B,4CAA4C;oBAC5C,EAAE,IAAI,OAAO,CAAC,IAAI,CAAA;oBAClB,sCAAsC;oBACtC,IAAI,OAAO,CAAC,KAAK,KAAK,CAAC,IAAI,OAAO,CAAC,IAAI,KAAK,GAAG,EAAE;wBAC/C,cAAc,GAAG,IAAI,CAAA;wBACrB,EAAE,IAAI,eAAe,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAA;qBAC5C;oBACD,IAAI,CAAC,KAAK,CAAC,cAAc,EAAE,SAAS,EAAE,EAAE,CAAC,CAAA;oBACzC,SAAS,GAAG,KAAK,CAAA;oBACjB,SAAQ;iBACT;gBAED,KAAK,GAAG,CAAC,CAAC;oBACR,MAAM,OAAO,GAAG,gBAAgB,CAAC,gBAAgB,CAAC,MAAM,GAAG,CAAC,CAAC,CAAA;oBAC7D,IAAI,OAAO,IAAI,CAAC,OAAO,EAAE;wBACvB,EAAE,IAAI,KAAK,CAAA;wBACX,SAAQ;qBACT;oBACD,gBAAgB,CAAC,GAAG,EAAE,CAAA;oBAEtB,qBAAqB;oBACrB,cAAc,EAAE,CAAA;oBAChB,QAAQ,GAAG,IAAI,CAAA;oBACf,EAAE,GAAG,OAAO,CAAA;oBACZ,8BAA8B;oBAC9B,qCAAqC;oBACrC,EAAE,IAAI,EAAE,CAAC,KAAK,CAAA;oBACd,IAAI,EAAE,CAAC,IAAI,KAAK,GAAG,EAAE;wBACnB,aAAa,CAAC,IAAI,CAAC,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,EAAE,KAAK,EAAE,EAAE,CAAC,MAAM,EAAE,CAAC,CAAC,CAAA;qBAC5D;oBACD,SAAQ;iBACT;gBAED,KAAK,GAAG,CAAC,CAAC;oBACR,MAAM,OAAO,GAAG,gBAAgB,CAAC,gBAAgB,CAAC,MAAM,GAAG,CAAC,CAAC,CAAA;oBAC7D,IAAI,OAAO,IAAI,CAAC,OAAO,EAAE;wBACvB,EAAE,IAAI,KAAK,CAAA;wBACX,SAAQ;qBACT;oBAED,cAAc,EAAE,CAAA;oBAChB,EAAE,IAAI,GAAG,CAAA;oBACT,wCAAwC;oBACxC,IAAI,OAAO,CAAC,KAAK,KAAK,CAAC,IAAI,OAAO,CAAC,IAAI,KAAK,GAAG,EAAE;wBAC/C,cAAc,GAAG,IAAI,CAAA;wBACrB,EAAE,IAAI,eAAe,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAA;qBAC5C;oBACD,SAAQ;iBACT;gBAED,+CAA+C;gBAC/C,KAAK,GAAG;oBACN,+CAA+C;oBAC/C,cAAc,EAAE,CAAA;oBAEhB,IAAI,OAAO,EAAE;wBACX,EAAE,IAAI,IAAI,GAAG,CAAC,CAAA;wBACd,SAAQ;qBACT;oBAED,OAAO,GAAG,IAAI,CAAA;oBACd,UAAU,GAAG,CAAC,CAAA;oBACd,YAAY,GAAG,EAAE,CAAC,MAAM,CAAA;o
BACxB,EAAE,IAAI,CAAC,CAAA;oBACP,SAAQ;gBAEV,KAAK,GAAG;oBACN,0CAA0C;oBAC1C,mCAAmC;oBACnC,qCAAqC;oBACrC,0CAA0C;oBAC1C,IAAI,CAAC,KAAK,UAAU,GAAG,CAAC,IAAI,CAAC,OAAO,EAAE;wBACpC,EAAE,IAAI,IAAI,GAAG,CAAC,CAAA;wBACd,SAAQ;qBACT;oBAED,sDAAsD;oBACtD,oDAAoD;oBACpD,qDAAqD;oBACrD,4BAA4B;oBAC5B,sDAAsD;oBACtD,wDAAwD;oBACxD,kDAAkD;oBAClD,EAAE,GAAG,OAAO,CAAC,SAAS,CAAC,UAAU,GAAG,CAAC,EAAE,CAAC,CAAC,CAAA;oBACzC,IAAI;wBACF,MAAM,CAAC,GAAG,GAAG,YAAY,CAAC,YAAY,CAAC,EAAE,CAAC,CAAC,GAAG,GAAG,CAAC,CAAA;wBAClD,mCAAmC;wBACnC,EAAE,IAAI,CAAC,CAAA;qBACR;oBAAC,OAAO,EAAE,EAAE;wBACX,4DAA4D;wBAC5D,6CAA6C;wBAC7C,EAAE,GAAG,EAAE,CAAC,SAAS,CAAC,CAAC,EAAE,YAAY,CAAC,GAAG,QAAQ,CAAA,CAAC,qBAAqB;qBACpE;oBACD,QAAQ,GAAG,IAAI,CAAA;oBACf,OAAO,GAAG,KAAK,CAAA;oBACf,SAAQ;gBAEV;oBACE,8CAA8C;oBAC9C,cAAc,EAAE,CAAA;oBAEhB,IAAI,UAAU,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,KAAK,GAAG,IAAI,OAAO,CAAC,EAAE;wBAC5C,EAAE,IAAI,IAAI,CAAA;qBACX;oBAED,EAAE,IAAI,CAAC,CAAA;oBACP,MAAK;aACR,CAAC,SAAS;SACZ,CAAC,MAAM;QAER,8CAA8C;QAC9C,yCAAyC;QACzC,IAAI,OAAO,EAAE;YACX,4CAA4C;YAC5C,+CAA+C;YAC/C,qDAAqD;YACrD,gDAAgD;YAChD,EAAE,GAAG,OAAO,CAAC,KAAK,CAAC,UAAU,GAAG,CAAC,CAAC,CAAA;YAClC,EAAE,GAAG,IAAI,CAAC,KAAK,CAAC,EAAE,EAAE,QAAQ,CAAmB,CAAA;YAC/C,EAAE,GAAG,EAAE,CAAC,SAAS,CAAC,CAAC,EAAE,YAAY,CAAC,GAAG,KAAK,GAAG,EAAE,CAAC,CAAC,CAAC,CAAA;YAClD,QAAQ,GAAG,QAAQ,IAAI,EAAE,CAAC,CAAC,CAAC,CAAA;SAC7B;QAED,uDAAuD;QACvD,kBAAkB;QAClB,kEAAkE;QAClE,wEAAwE;QACxE,mEAAmE;QACnE,qCAAqC;QACrC,KAAK,EAAE,GAAG,gBAAgB,CAAC,GAAG,EAAE,EAAE,EAAE,EAAE,EAAE,GAAG,gBAAgB,CAAC,GAAG,EAAE,EAAE;YACjE,IAAI,IAAY,CAAA;YAChB,IAAI,GAAG,EAAE,CAAC,KAAK,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,IAAI,CAAC,MAAM,CAAC,CAAA;YAC5C,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,cAAc,EAAE,EAAE,EAAE,EAAE,CAAC,CAAA;YAChD,+DAA+D;YAC/D,IAAI,GAAG,IAAI,CAAC,OAAO,CAAC,2BAA2B,EAAE,CAAC,CAAC,EAAE,EAAE,EAAE,EAAE,EAAE,EAAE;gBAC7D,IAAI,CAAC,EAAE,EAAE;oBACP,6CAA6C;oBAC7C,EAAE,GAAG,IAAI,CAAA;oBACT,yBAAyB;oBACzB,qBAAqB;iBACtB;gBACD,oBAAoB;gBAEpB,iEAAiE;gBACjE,mEAAmE;gBACnE,qEAAqE;gBACrE,yDAAyD;gBACzD,EAAE;gBACF,wCAAwC;gBACxC,OAAO,EAAE,GAAG,EAAE,GAAG,EAAE,GAAG,GAAG,CAAA;YAC3B,CAAC,CAAC,CAAA;YAEF,IAAI,CAAC,KAAK,CAAC,gBAAgB,EAAE,IAAI,EAAE,IAAI,EAAE,EAAE,EAAE,EAAE,CAAC,CAAA;YAChD,MAAM,CAAC,GACL,EAAE,CAAC,IAAI,KAAK,GAAG,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,IAAI,KAAK,GAAG,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,IAAI,GAAG,EAAE,CAAC,IAAI,CAAA;YAEnE,QAAQ,GAAG,IAAI,CAAA;YACf,EAAE,GAAG,EAAE,CAAC,KAAK,CAAC,CAAC,EAAE,EAAE,CAAC,OAAO,CAAC,GAAG,CAAC,GAAG,KAAK,GAAG,IAAI,CAAA;SAChD;QAED,2DAA2D;QAC3D,cAAc,EAAE,CAAA;QAChB,IAAI,QAAQ,EAAE;YACZ,cAAc;YACd,EAAE,IAAI,MAAM,CAAA;SACb;QAED,2DAA2D;QAC3D,iDAAiD;QACjD,MAAM,eAAe,GAAG,kBAAkB,CAAC,EAAE,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAA;QAExD,wDAAwD;QACxD,4DAA4D;QAC5D,yDAAyD;QACzD,0DAA0D;QAC1D,eAAe;QACf,KAAK,IAAI,CAAC,GAAG,aAAa,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC,GAAG,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE;YAClD,MAAM,EAAE,GAAG,aAAa,CAAC,CAAC,CAAC,CAAA;YAE3B,MAAM,QAAQ,GAAG,EAAE,CAAC,KAAK,CAAC,CAAC,EAAE,EAAE,CAAC,OAAO,CAAC,CAAA;YACxC,MAAM,OAAO,GAAG,EAAE,CAAC,KAAK,CAAC,EAAE,CAAC,OAAO,EAAE,EAAE,CAAC,KAAK,GAAG,CAAC,CAAC,CAAA;YAClD,IAAI,OAAO,GAAG,EAAE,CAAC,KAAK,CAAC,EAAE,CAAC,KAAK,CAAC,CAAA;YAChC,MAAM,MAAM,GAAG,EAAE,CAAC,KAAK,CAAC,EAAE,CAAC,KAAK,GAAG,CAAC,EAAE,EAAE,CAAC,KAAK,CAAC,GAAG,OAAO,CAAA;YAEzD,gEAAgE;YAChE,wEAAwE;YACxE,+BAA+B;YAC/B,MAAM,iBAAiB,GAAG,QAAQ,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,MAAM,CAAA;YACpD,MAAM,gBAAgB,GAAG,QAAQ,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,MAAM,GAAG,iBAAiB,CAAA;YACvE,IAAI,UAAU,GAAG,OAAO,CAAA;YACxB,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,gBAAgB,EAAE,CAAC,EAAE,EAAE;gBACzC,UAAU,GAAG,UAAU,CAAC,OAAO,CAAC,UAAU,EAAE,EAAE,CAAC,CAAA;aAChD;YACD,OAAO,G
AAG,UAAU,CAAA;YAEpB,MAAM,MAAM,GAAG,OAAO,KAAK,EAAE,IAAI,KAAK,KAAK,QAAQ,CAAC,CAAC,CAAC,WAAW,CAAC,CAAC,CAAC,EAAE,CAAA;YAEtE,EAAE,GAAG,QAAQ,GAAG,OAAO,GAAG,OAAO,GAAG,MAAM,GAAG,MAAM,CAAA;SACpD;QAED,+DAA+D;QAC/D,+CAA+C;QAC/C,oDAAoD;QACpD,IAAI,EAAE,KAAK,EAAE,IAAI,QAAQ,EAAE;YACzB,EAAE,GAAG,OAAO,GAAG,EAAE,CAAA;SAClB;QAED,IAAI,eAAe,EAAE;YACnB,EAAE,GAAG,YAAY,EAAE,GAAG,EAAE,CAAA;SACzB;QAED,4CAA4C;QAC5C,IAAI,KAAK,KAAK,QAAQ,EAAE;YACtB,OAAO,CAAC,EAAE,EAAE,QAAQ,CAAC,CAAA;SACtB;QAED,kEAAkE;QAClE,IAAI,OAAO,CAAC,MAAM,IAAI,CAAC,QAAQ,IAAI,CAAC,OAAO,CAAC,eAAe,EAAE;YAC3D,QAAQ,GAAG,OAAO,CAAC,WAAW,EAAE,KAAK,OAAO,CAAC,WAAW,EAAE,CAAA;SAC3D;QAED,2CAA2C;QAC3C,oDAAoD;QACpD,qCAAqC;QACrC,IAAI,CAAC,QAAQ,EAAE;YACb,OAAO,YAAY,CAAC,OAAO,CAAC,CAAA;SAC7B;QAED,MAAM,KAAK,GAAG,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,CAAA;QACvC,IAAI;YACF,MAAM,GAAG,GAAG,QAAQ;gBAClB,CAAC,CAAC;oBACE,KAAK,EAAE,OAAO;oBACd,IAAI,EAAE,EAAE;oBACR,IAAI,EAAE,QAAQ;iBACf;gBACH,CAAC,CAAC;oBACE,KAAK,EAAE,OAAO;oBACd,IAAI,EAAE,EAAE;iBACT,CAAA;YACL,OAAO,MAAM,CAAC,MAAM,CAAC,IAAI,MAAM,CAAC,GAAG,GAAG,EAAE,GAAG,GAAG,EAAE,KAAK,CAAC,EAAE,GAAG,CAAC,CAAA;YAC5D,qBAAqB;SACtB;QAAC,OAAO,EAAE,EAAE;YACX,uBAAuB;YACvB,+DAA+D;YAC/D,+DAA+D;YAC/D,kEAAkE;YAClE,iCAAiC;YACjC,IAAI,CAAC,KAAK,CAAC,gBAAgB,EAAE,EAAE,CAAC,CAAA;YAChC,OAAO,IAAI,MAAM,CAAC,IAAI,CAAC,CAAA;SACxB;QACD,oBAAoB;IACtB,CAAC;IAED,MAAM;QACJ,IAAI,IAAI,CAAC,MAAM,IAAI,IAAI,CAAC,MAAM,KAAK,KAAK;YAAE,OAAO,IAAI,CAAC,MAAM,CAAA;QAE5D,mDAAmD;QACnD,4BAA4B;QAC5B,EAAE;QACF,wDAAwD;QACxD,yDAAyD;QACzD,2CAA2C;QAC3C,MAAM,GAAG,GAAG,IAAI,CAAC,GAAG,CAAA;QAEpB,IAAI,CAAC,GAAG,CAAC,MAAM,EAAE;YACf,IAAI,CAAC,MAAM,GAAG,KAAK,CAAA;YACnB,OAAO,IAAI,CAAC,MAAM,CAAA;SACnB;QACD,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAE5B,MAAM,OAAO,GAAG,OAAO,CAAC,UAAU;YAChC,CAAC,CAAC,IAAI;YACN,CAAC,CAAC,OAAO,CAAC,GAAG;gBACb,CAAC,CAAC,UAAU;gBACZ,CAAC,CAAC,YAAY,CAAA;QAChB,MAAM,KAAK,GAAG,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,CAAA;QAEvC,kCAAkC;QAClC,kDAAkD;QAClD,sEAAsE;QACtE,iDAAiD;QACjD,8DAA8D;QAC9D,mCAAmC;QACnC,IAAI,EAAE,GAAG,GAAG;aACT,GAAG,CAAC,OAAO,CAAC,EAAE;YACb,MAAM,EAAE,GAAiC,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,CACvD,OAAO,CAAC,KAAK,QAAQ;gBACnB,CAAC,CAAC,YAAY,CAAC,CAAC,CAAC;gBACjB,CAAC,CAAC,CAAC,KAAK,gBAAQ;oBAChB,CAAC,CAAC,gBAAQ;oBACV,CAAC,CAAC,CAAC,CAAC,IAAI,CACqB,CAAA;YACjC,EAAE,CAAC,OAAO,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE;gBAClB,MAAM,IAAI,GAAG,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC,CAAA;gBACtB,MAAM,IAAI,GAAG,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC,CAAA;gBACtB,IAAI,CAAC,KAAK,gBAAQ,IAAI,IAAI,KAAK,gBAAQ,EAAE;oBACvC,OAAM;iBACP;gBACD,IAAI,IAAI,KAAK,SAAS,EAAE;oBACtB,IAAI,IAAI,KAAK,SAAS,IAAI,IAAI,KAAK,gBAAQ,EAAE;wBAC3C,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC,GAAG,SAAS,GAAG,OAAO,GAAG,OAAO,GAAG,IAAI,CAAA;qBACjD;yBAAM;wBACL,EAAE,CAAC,CAAC,CAAC,GAAG,OAAO,CAAA;qBAChB;iBACF;qBAAM,IAAI,IAAI,KAAK,SAAS,EAAE;oBAC7B,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC,GAAG,IAAI,GAAG,SAAS,GAAG,OAAO,GAAG,IAAI,CAAA;iBAC9C;qBAAM,IAAI,IAAI,KAAK,gBAAQ,EAAE;oBAC5B,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC,GAAG,IAAI,GAAG,YAAY,GAAG,OAAO,GAAG,MAAM,GAAG,IAAI,CAAA;oBACzD,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC,GAAG,gBAAQ,CAAA;iBACrB;YACH,CAAC,CAAC,CAAA;YACF,OAAO,EAAE,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,KAAK,gBAAQ,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,CAAA;QACjD,CAAC,CAAC;aACD,IAAI,CAAC,GAAG,CAAC,CAAA;QAEZ,4BAA4B;QAC5B,gDAAgD;QAChD,EAAE,GAAG,MAAM,GAAG,EAAE,GAAG,IAAI,CAAA;QAEvB,gDAAgD;QAChD,IAAI,IAAI,CAAC,MAAM;YAAE,EAAE,GAAG,MAAM,GAAG,EAAE,GAAG,MAAM,CAAA;QAE1C,IAAI;YACF,IAAI,CAAC,MAAM,GAAG,IAAI,MAAM,CAAC,EAAE,EAAE,KAAK,CAAC,CAAA;YACnC,qBAAqB;SACtB;QAAC,OAAO,EAAE,EAAE;YACX,uBAAuB;YACvB,IAAI,CAAC,MAAM,GAAG,KAAK,CAAA;SACpB;QACD,oBAAoB;QACpB,OAAO,IAAI,CAAC,
MAAM,CAAA;IACpB,CAAC;IAED,UAAU,CAAC,CAAS;QAClB,mDAAmD;QACnD,6DAA6D;QAC7D,8CAA8C;QAC9C,0CAA0C;QAC1C,IAAI,IAAI,CAAC,uBAAuB,EAAE;YAChC,OAAO,CAAC,CAAC,KAAK,CAAC,GAAG,CAAC,CAAA;SACpB;aAAM,IAAI,SAAS,IAAI,aAAa,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE;YAC7C,sCAAsC;YACtC,OAAO,CAAC,EAAE,EAAE,GAAG,CAAC,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC,CAAA;SAC/B;aAAM;YACL,OAAO,CAAC,CAAC,KAAK,CAAC,KAAK,CAAC,CAAA;SACtB;IACH,CAAC;IAED,KAAK,CAAC,CAAS,EAAE,OAAO,GAAG,IAAI,CAAC,OAAO;QACrC,IAAI,CAAC,KAAK,CAAC,OAAO,EAAE,CAAC,EAAE,IAAI,CAAC,OAAO,CAAC,CAAA;QACpC,8CAA8C;QAC9C,iBAAiB;QACjB,IAAI,IAAI,CAAC,OAAO,EAAE;YAChB,OAAO,KAAK,CAAA;SACb;QACD,IAAI,IAAI,CAAC,KAAK,EAAE;YACd,OAAO,CAAC,KAAK,EAAE,CAAA;SAChB;QAED,IAAI,CAAC,KAAK,GAAG,IAAI,OAAO,EAAE;YACxB,OAAO,IAAI,CAAA;SACZ;QAED,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAE5B,gCAAgC;QAChC,IAAI,IAAI,CAAC,GAAG,KAAK,GAAG,EAAE;YACpB,CAAC,GAAG,CAAC,CAAC,KAAK,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,CAAA;SAChC;QAED,6CAA6C;QAC7C,MAAM,EAAE,GAAG,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC,CAAA;QAC7B,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,OAAO,EAAE,EAAE,CAAC,CAAA;QAErC,0DAA0D;QAC1D,2DAA2D;QAC3D,mCAAmC;QACnC,uCAAuC;QAEvC,MAAM,GAAG,GAAG,IAAI,CAAC,GAAG,CAAA;QACpB,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,KAAK,EAAE,GAAG,CAAC,CAAA;QAEpC,0EAA0E;QAC1E,IAAI,QAAQ,GAAW,EAAE,CAAC,EAAE,CAAC,MAAM,GAAG,CAAC,CAAC,CAAA;QACxC,IAAI,CAAC,QAAQ,EAAE;YACb,KAAK,IAAI,CAAC,GAAG,EAAE,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC,QAAQ,IAAI,CAAC,IAAI,CAAC,EAAE,CAAC,EAAE,EAAE;gBACpD,QAAQ,GAAG,EAAE,CAAC,CAAC,CAAC,CAAA;aACjB;SACF;QAED,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,GAAG,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;YACnC,MAAM,OAAO,GAAG,GAAG,CAAC,CAAC,CAAC,CAAA;YACtB,IAAI,IAAI,GAAG,EAAE,CAAA;YACb,IAAI,OAAO,CAAC,SAAS,IAAI,OAAO,CAAC,MAAM,KAAK,CAAC,EAAE;gBAC7C,IAAI,GAAG,CAAC,QAAQ,CAAC,CAAA;aAClB;YACD,MAAM,GAAG,GAAG,IAAI,CAAC,QAAQ,CAAC,IAAI,EAAE,OAAO,EAAE,OAAO,CAAC,CAAA;YACjD,IAAI,GAAG,EAAE;gBACP,IAAI,OAAO,CAAC,UAAU,EAAE;oBACtB,OAAO,IAAI,CAAA;iBACZ;gBACD,OAAO,CAAC,IAAI,CAAC,MAAM,CAAA;aACpB;SACF;QAED,2DAA2D;QAC3D,8BAA8B;QAC9B,IAAI,OAAO,CAAC,UAAU,EAAE;YACtB,OAAO,KAAK,CAAA;SACb;QACD,OAAO,IAAI,CAAC,MAAM,CAAA;IACpB,CAAC;IAED,MAAM,CAAC,QAAQ,CAAC,GAAqB;QACnC,OAAO,iBAAS,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAC,SAAS,CAAA;IAC1C,CAAC;CACF;AAt+BD,8BAs+BC;AAED,iBAAS,CAAC,SAAS,GAAG,SAAS,CAAA"}
\ No newline at end of file
diff --git a/deps/npm/node_modules/minimatch/dist/cjs/package.json b/deps/npm/node_modules/minimatch/dist/cjs/package.json
new file mode 100644
index 00000000000000..5bbefffbabee39
--- /dev/null
+++ b/deps/npm/node_modules/minimatch/dist/cjs/package.json
@@ -0,0 +1,3 @@
+{
+ "type": "commonjs"
+}
diff --git a/deps/npm/node_modules/minimatch/dist/mjs/index.d.ts b/deps/npm/node_modules/minimatch/dist/mjs/index.d.ts
new file mode 100644
index 00000000000000..cca07a8280d896
--- /dev/null
+++ b/deps/npm/node_modules/minimatch/dist/mjs/index.d.ts
@@ -0,0 +1,71 @@
+export interface MinimatchOptions {
+ nobrace?: boolean;
+ nocomment?: boolean;
+ nonegate?: boolean;
+ debug?: boolean;
+ noglobstar?: boolean;
+ noext?: boolean;
+ nonull?: boolean;
+ windowsPathsNoEscape?: boolean;
+ allowWindowsEscape?: boolean;
+ partial?: boolean;
+ dot?: boolean;
+ nocase?: boolean;
+ nocaseMagicOnly?: boolean;
+ matchBase?: boolean;
+ flipNegate?: boolean;
+ preserveMultipleSlashes?: boolean;
+}
+export declare const minimatch: {
+ (p: string, pattern: string, options?: MinimatchOptions): boolean;
+ sep: string;
+ GLOBSTAR: typeof GLOBSTAR;
+ filter: (pattern: string, options?: MinimatchOptions) => (p: string) => boolean;
+ defaults: (def: MinimatchOptions) => typeof minimatch;
+ braceExpand: (pattern: string, options?: MinimatchOptions) => string[];
+ makeRe: (pattern: string, options?: MinimatchOptions) => false | MMRegExp;
+ match: (list: string[], pattern: string, options?: MinimatchOptions) => string[];
+ Minimatch: typeof Minimatch;
+};
+export default minimatch;
+export declare const sep: string;
+export declare const GLOBSTAR: unique symbol;
+export declare const filter: (pattern: string, options?: MinimatchOptions) => (p: string) => boolean;
+export declare const defaults: (def: MinimatchOptions) => typeof minimatch;
+export declare const braceExpand: (pattern: string, options?: MinimatchOptions) => string[];
+declare const SUBPARSE: unique symbol;
+export declare const makeRe: (pattern: string, options?: MinimatchOptions) => false | MMRegExp;
+export declare const match: (list: string[], pattern: string, options?: MinimatchOptions) => string[];
+export type MMRegExp = RegExp & {
+ _src?: string;
+ _glob?: string;
+};
+type SubparseReturn = [string, boolean];
+type ParseReturnFiltered = string | MMRegExp | typeof GLOBSTAR;
+type ParseReturn = ParseReturnFiltered | false;
+export declare class Minimatch {
+ options: MinimatchOptions;
+ set: ParseReturnFiltered[][];
+ pattern: string;
+ windowsPathsNoEscape: boolean;
+ nonegate: boolean;
+ negate: boolean;
+ comment: boolean;
+ empty: boolean;
+ preserveMultipleSlashes: boolean;
+ partial: boolean;
+ globSet: string[];
+ globParts: string[][];
+ regexp: false | null | MMRegExp;
+ constructor(pattern: string, options?: MinimatchOptions);
+ debug(..._: any[]): void;
+ make(): void;
+ parseNegate(): void;
+ matchOne(file: string[], pattern: ParseReturn[], partial?: boolean): boolean;
+ braceExpand(): string[];
+ parse(pattern: string, isSub?: typeof SUBPARSE): ParseReturn | SubparseReturn;
+ makeRe(): false | MMRegExp;
+ slashSplit(p: string): string[];
+ match(f: string, partial?: boolean): boolean;
+ static defaults(def: MinimatchOptions): typeof Minimatch;
+}
diff --git a/deps/npm/node_modules/minimatch/dist/mjs/index.js b/deps/npm/node_modules/minimatch/dist/mjs/index.js
new file mode 100644
index 00000000000000..59ac1968b5ded3
--- /dev/null
+++ b/deps/npm/node_modules/minimatch/dist/mjs/index.js
@@ -0,0 +1,1077 @@
+export const minimatch = (p, pattern, options = {}) => {
+ assertValidPattern(pattern);
+ // shortcut: comments match nothing.
+ if (!options.nocomment && pattern.charAt(0) === '#') {
+ return false;
+ }
+ return new Minimatch(pattern, options).match(p);
+};
+export default minimatch;
+// Optimized checking for the most common glob patterns.
+const starDotExtRE = /^\*+([^+@!?\*\[\(]*)$/;
+const starDotExtTest = (ext) => (f) => !f.startsWith('.') && f.endsWith(ext);
+const starDotExtTestDot = (ext) => (f) => f.endsWith(ext);
+const starDotExtTestNocase = (ext) => {
+ ext = ext.toLowerCase();
+ return (f) => !f.startsWith('.') && f.toLowerCase().endsWith(ext);
+};
+const starDotExtTestNocaseDot = (ext) => {
+ ext = ext.toLowerCase();
+ return (f) => f.toLowerCase().endsWith(ext);
+};
+const starDotStarRE = /^\*+\.\*+$/;
+const starDotStarTest = (f) => !f.startsWith('.') && f.includes('.');
+const starDotStarTestDot = (f) => f !== '.' && f !== '..' && f.includes('.');
+const dotStarRE = /^\.\*+$/;
+const dotStarTest = (f) => f !== '.' && f !== '..' && f.startsWith('.');
+const starRE = /^\*+$/;
+const starTest = (f) => f.length !== 0 && !f.startsWith('.');
+const starTestDot = (f) => f.length !== 0 && f !== '.' && f !== '..';
+const qmarksRE = /^\?+([^+@!?\*\[\(]*)?$/;
+const qmarksTestNocase = ([$0, ext = '']) => {
+ const noext = qmarksTestNoExt([$0]);
+ if (!ext)
+ return noext;
+ ext = ext.toLowerCase();
+ return (f) => noext(f) && f.toLowerCase().endsWith(ext);
+};
+const qmarksTestNocaseDot = ([$0, ext = '']) => {
+ const noext = qmarksTestNoExtDot([$0]);
+ if (!ext)
+ return noext;
+ ext = ext.toLowerCase();
+ return (f) => noext(f) && f.toLowerCase().endsWith(ext);
+};
+const qmarksTestDot = ([$0, ext = '']) => {
+ const noext = qmarksTestNoExtDot([$0]);
+ return !ext ? noext : (f) => noext(f) && f.endsWith(ext);
+};
+const qmarksTest = ([$0, ext = '']) => {
+ const noext = qmarksTestNoExt([$0]);
+ return !ext ? noext : (f) => noext(f) && f.endsWith(ext);
+};
+const qmarksTestNoExt = ([$0]) => {
+ const len = $0.length;
+ return (f) => f.length === len && !f.startsWith('.');
+};
+const qmarksTestNoExtDot = ([$0]) => {
+ const len = $0.length;
+ return (f) => f.length === len && f !== '.' && f !== '..';
+};
+/* c8 ignore start */
+const platform = typeof process === 'object' && process
+ ? (typeof process.env === 'object' &&
+ process.env &&
+ process.env.__MINIMATCH_TESTING_PLATFORM__) ||
+ process.platform
+ : 'posix';
+const isWindows = platform === 'win32';
+const path = isWindows ? { sep: '\\' } : { sep: '/' };
+/* c8 ignore stop */
+export const sep = path.sep;
+minimatch.sep = sep;
+export const GLOBSTAR = Symbol('globstar **');
+minimatch.GLOBSTAR = GLOBSTAR;
+import expand from 'brace-expansion';
+const plTypes = {
+ '!': { open: '(?:(?!(?:', close: '))[^/]*?)' },
+ '?': { open: '(?:', close: ')?' },
+ '+': { open: '(?:', close: ')+' },
+ '*': { open: '(?:', close: ')*' },
+ '@': { open: '(?:', close: ')' },
+};
+// any single thing other than /
+// don't need to escape / when using new RegExp()
+const qmark = '[^/]';
+// * => any number of characters
+const star = qmark + '*?';
+// ** when dots are allowed. Anything goes, except .. and .
+// not (^ or / followed by one or two dots followed by $ or /),
+// followed by anything, any number of times.
+const twoStarDot = '(?:(?!(?:\\/|^)(?:\\.{1,2})($|\\/)).)*?';
+// not a ^ or / followed by a dot,
+// followed by anything, any number of times.
+const twoStarNoDot = '(?:(?!(?:\\/|^)\\.).)*?';
+// "abc" -> { a:true, b:true, c:true }
+const charSet = (s) => s.split('').reduce((set, c) => {
+ set[c] = true;
+ return set;
+}, {});
+// characters that need to be escaped in RegExp.
+const reSpecials = charSet('().*{}+?[]^$\\!');
+// characters that indicate we have to add the pattern start
+const addPatternStartSet = charSet('[.(');
+export const filter = (pattern, options = {}) => (p) => minimatch(p, pattern, options);
+minimatch.filter = filter;
+const ext = (a, b = {}) => Object.assign({}, a, b);
+export const defaults = (def) => {
+ if (!def || typeof def !== 'object' || !Object.keys(def).length) {
+ return minimatch;
+ }
+ const orig = minimatch;
+ const m = (p, pattern, options = {}) => orig(p, pattern, ext(def, options));
+ return Object.assign(m, {
+ Minimatch: class Minimatch extends orig.Minimatch {
+ constructor(pattern, options = {}) {
+ super(pattern, ext(def, options));
+ }
+ static defaults(options) {
+ return orig.defaults(ext(def, options)).Minimatch;
+ }
+ },
+ filter: (pattern, options = {}) => orig.filter(pattern, ext(def, options)),
+ defaults: (options) => orig.defaults(ext(def, options)),
+ makeRe: (pattern, options = {}) => orig.makeRe(pattern, ext(def, options)),
+ braceExpand: (pattern, options = {}) => orig.braceExpand(pattern, ext(def, options)),
+ match: (list, pattern, options = {}) => orig.match(list, pattern, ext(def, options)),
+ sep: orig.sep,
+ GLOBSTAR: GLOBSTAR,
+ });
+};
+minimatch.defaults = defaults;
+// Brace expansion:
+// a{b,c}d -> abd acd
+// a{b,}c -> abc ac
+// a{0..3}d -> a0d a1d a2d a3d
+// a{b,c{d,e}f}g -> abg acdfg acefg
+// a{b,c}d{e,f}g -> abdeg abdfg acdeg acdfg
+//
+// Invalid sets are not expanded.
+// a{2..}b -> a{2..}b
+// a{b}c -> a{b}c
+export const braceExpand = (pattern, options = {}) => {
+ assertValidPattern(pattern);
+ // Thanks to Yeting Li for
+ // improving this regexp to avoid a ReDOS vulnerability.
+ if (options.nobrace || !/\{(?:(?!\{).)*\}/.test(pattern)) {
+ // shortcut. no need to expand.
+ return [pattern];
+ }
+ return expand(pattern);
+};
+minimatch.braceExpand = braceExpand;
+const MAX_PATTERN_LENGTH = 1024 * 64;
+const assertValidPattern = (pattern) => {
+ if (typeof pattern !== 'string') {
+ throw new TypeError('invalid pattern');
+ }
+ if (pattern.length > MAX_PATTERN_LENGTH) {
+ throw new TypeError('pattern is too long');
+ }
+};
+// parse a component of the expanded set.
+// At this point, no pattern may contain "/" in it
+// so we're going to return a 2d array, where each entry is the full
+// pattern, split on '/', and then turned into a regular expression.
+// A regexp is made at the end which joins each array with an
+// escaped /, and another full one which joins each regexp with |.
+//
+// Following the lead of Bash 4.1, note that "**" only has special meaning
+// when it is the *only* thing in a path portion. Otherwise, any series
+// of * is equivalent to a single *. Globstar behavior is enabled by
+// default, and can be disabled by setting options.noglobstar.
+const SUBPARSE = Symbol('subparse');
+export const makeRe = (pattern, options = {}) => new Minimatch(pattern, options).makeRe();
+minimatch.makeRe = makeRe;
+export const match = (list, pattern, options = {}) => {
+ const mm = new Minimatch(pattern, options);
+ list = list.filter(f => mm.match(f));
+ if (mm.options.nonull && !list.length) {
+ list.push(pattern);
+ }
+ return list;
+};
+minimatch.match = match;
+// replace stuff like \* with *
+const globUnescape = (s) => s.replace(/\\(.)/g, '$1');
+const charUnescape = (s) => s.replace(/\\([^-\]])/g, '$1');
+const regExpEscape = (s) => s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, '\\$&');
+const braExpEscape = (s) => s.replace(/[[\]\\]/g, '\\$&');
+export class Minimatch {
+ options;
+ set;
+ pattern;
+ windowsPathsNoEscape;
+ nonegate;
+ negate;
+ comment;
+ empty;
+ preserveMultipleSlashes;
+ partial;
+ globSet;
+ globParts;
+ regexp;
+ constructor(pattern, options = {}) {
+ assertValidPattern(pattern);
+ options = options || {};
+ this.options = options;
+ this.pattern = pattern;
+ this.windowsPathsNoEscape =
+ !!options.windowsPathsNoEscape || options.allowWindowsEscape === false;
+ if (this.windowsPathsNoEscape) {
+ this.pattern = this.pattern.replace(/\\/g, '/');
+ }
+ this.preserveMultipleSlashes = !!options.preserveMultipleSlashes;
+ this.regexp = null;
+ this.negate = false;
+ this.nonegate = !!options.nonegate;
+ this.comment = false;
+ this.empty = false;
+ this.partial = !!options.partial;
+ this.globSet = [];
+ this.globParts = [];
+ this.set = [];
+ // make the set of regexps etc.
+ this.make();
+ }
+ debug(..._) { }
+ make() {
+ const pattern = this.pattern;
+ const options = this.options;
+ // empty patterns and comments match nothing.
+ if (!options.nocomment && pattern.charAt(0) === '#') {
+ this.comment = true;
+ return;
+ }
+ if (!pattern) {
+ this.empty = true;
+ return;
+ }
+ // step 1: figure out negation, etc.
+ this.parseNegate();
+ // step 2: expand braces
+ this.globSet = this.braceExpand();
+ if (options.debug) {
+ this.debug = (...args) => console.error(...args);
+ }
+ this.debug(this.pattern, this.globSet);
+ // step 3: now we have a set, so turn each one into a series of path-portion
+ // matching patterns.
+ // These will be regexps, except in the case of "**", which is
+ // set to the GLOBSTAR object for globstar behavior,
+ // and will not contain any / characters
+ const rawGlobParts = this.globSet.map(s => this.slashSplit(s));
+ // consecutive globstars are an unnecessary perf killer
+ // also, **/*/... is equivalent to */**/..., so swap all of those
+ // this turns a pattern like **/*/**/*/x into */*/**/x
+ // and a pattern like **/x/**/*/y becomes **/x/*/**/y
+ // the *later* we can push the **, the more efficient it is,
+ // because we can avoid having to do a recursive walk until
+ // the walked tree is as shallow as possible.
+ // Note that this is only true up to the last pattern, though, because
+ // a/*/** will only match a/b if b is a dir, but a/**/* will match a/b
+ // regardless, since it's "0 or more path segments" if it's not final.
+ if (this.options.noglobstar) {
+ // ** is * anyway
+ this.globParts = rawGlobParts;
+ }
+ else {
+ // do this swap BEFORE the reduce, so that we can turn a string
+ // of **/*/**/* into */*/**/** and then reduce the **'s into one
+ for (const parts of rawGlobParts) {
+ let swapped;
+ do {
+ swapped = false;
+ for (let i = 0; i < parts.length - 1; i++) {
+ if (parts[i] === '*' && parts[i - 1] === '**') {
+ parts[i] = '**';
+ parts[i - 1] = '*';
+ swapped = true;
+ }
+ }
+ } while (swapped);
+ }
+ this.globParts = rawGlobParts.map(parts => {
+ parts = parts.reduce((set, part) => {
+ const prev = set[set.length - 1];
+ if (part === '**' && prev === '**') {
+ return set;
+ }
+ if (part === '..') {
+ if (prev && prev !== '..' && prev !== '.' && prev !== '**') {
+ set.pop();
+ return set;
+ }
+ }
+ set.push(part);
+ return set;
+ }, []);
+ return parts.length === 0 ? [''] : parts;
+ });
+ }
+ this.debug(this.pattern, this.globParts);
+ // glob --> regexps
+ let set = this.globParts.map((s, _, __) => s.map(ss => this.parse(ss)));
+ this.debug(this.pattern, set);
+ // filter out everything that didn't compile properly.
+ this.set = set.filter(s => s.indexOf(false) === -1);
+ // do not treat the ? in UNC paths as magic
+ if (isWindows) {
+ for (let i = 0; i < this.set.length; i++) {
+ const p = this.set[i];
+ if (p[0] === '' &&
+ p[1] === '' &&
+ this.globParts[i][2] === '?' &&
+ typeof p[3] === 'string' &&
+ /^[a-z]:$/i.test(p[3])) {
+ p[2] = '?';
+ }
+ }
+ }
+ this.debug(this.pattern, this.set);
+ }
+ parseNegate() {
+ if (this.nonegate)
+ return;
+ const pattern = this.pattern;
+ let negate = false;
+ let negateOffset = 0;
+ for (let i = 0; i < pattern.length && pattern.charAt(i) === '!'; i++) {
+ negate = !negate;
+ negateOffset++;
+ }
+ if (negateOffset)
+ this.pattern = pattern.slice(negateOffset);
+ this.negate = negate;
+ }
+ // set partial to true to test if, for example,
+ // "/a/b" matches the start of "/*/b/*/d"
+ // Partial means, if you run out of file before you run
+ // out of pattern, then that's fine, as long as all
+ // the parts match.
+ matchOne(file, pattern, partial = false) {
+ const options = this.options;
+ // a UNC pattern like //?/c:/* can match a path like c:/x
+ // and vice versa
+ if (isWindows) {
+ const fileUNC = file[0] === '' &&
+ file[1] === '' &&
+ file[2] === '?' &&
+ typeof file[3] === 'string' &&
+ /^[a-z]:$/i.test(file[3]);
+ const patternUNC = pattern[0] === '' &&
+ pattern[1] === '' &&
+ pattern[2] === '?' &&
+ typeof pattern[3] === 'string' &&
+ /^[a-z]:$/i.test(pattern[3]);
+ if (fileUNC && patternUNC) {
+ const fd = file[3];
+ const pd = pattern[3];
+ if (fd.toLowerCase() === pd.toLowerCase()) {
+ file[3] = pd;
+ }
+ }
+ else if (patternUNC && typeof file[0] === 'string') {
+ const pd = pattern[3];
+ const fd = file[0];
+ if (pd.toLowerCase() === fd.toLowerCase()) {
+ pattern[3] = fd;
+ pattern = pattern.slice(3);
+ }
+ }
+ else if (fileUNC && typeof pattern[0] === 'string') {
+ const fd = file[3];
+ if (fd.toLowerCase() === pattern[0].toLowerCase()) {
+ pattern[0] = fd;
+ file = file.slice(3);
+ }
+ }
+ }
+ this.debug('matchOne', this, { file, pattern });
+ this.debug('matchOne', file.length, pattern.length);
+ for (var fi = 0, pi = 0, fl = file.length, pl = pattern.length; fi < fl && pi < pl; fi++, pi++) {
+ this.debug('matchOne loop');
+ var p = pattern[pi];
+ var f = file[fi];
+ this.debug(pattern, p, f);
+ // should be impossible.
+ // some invalid regexp stuff in the set.
+ /* c8 ignore start */
+ if (p === false) {
+ return false;
+ }
+ /* c8 ignore stop */
+ if (p === GLOBSTAR) {
+ this.debug('GLOBSTAR', [pattern, p, f]);
+ // "**"
+ // a/**/b/**/c would match the following:
+ // a/b/x/y/z/c
+ // a/x/y/z/b/c
+ // a/b/x/b/x/c
+ // a/b/c
+ // To do this, take the rest of the pattern after
+ // the **, and see if it would match the file remainder.
+ // If so, return success.
+ // If not, the ** "swallows" a segment, and try again.
+ // This is recursively awful.
+ //
+ // a/**/b/**/c matching a/b/x/y/z/c
+ // - a matches a
+ // - doublestar
+ // - matchOne(b/x/y/z/c, b/**/c)
+ // - b matches b
+ // - doublestar
+ // - matchOne(x/y/z/c, c) -> no
+ // - matchOne(y/z/c, c) -> no
+ // - matchOne(z/c, c) -> no
+ // - matchOne(c, c) yes, hit
+ var fr = fi;
+ var pr = pi + 1;
+ if (pr === pl) {
+ this.debug('** at the end');
+ // a ** at the end will just swallow the rest.
+ // We have found a match.
+ // however, it will not swallow /.x, unless
+ // options.dot is set.
+ // . and .. are *never* matched by **, for explosively
+ // exponential reasons.
+ for (; fi < fl; fi++) {
+ if (file[fi] === '.' ||
+ file[fi] === '..' ||
+ (!options.dot && file[fi].charAt(0) === '.'))
+ return false;
+ }
+ return true;
+ }
+ // ok, let's see if we can swallow whatever we can.
+ while (fr < fl) {
+ var swallowee = file[fr];
+ this.debug('\nglobstar while', file, fr, pattern, pr, swallowee);
+ // XXX remove this slice. Just pass the start index.
+ if (this.matchOne(file.slice(fr), pattern.slice(pr), partial)) {
+ this.debug('globstar found match!', fr, fl, swallowee);
+ // found a match.
+ return true;
+ }
+ else {
+ // can't swallow "." or ".." ever.
+ // can only swallow ".foo" when explicitly asked.
+ if (swallowee === '.' ||
+ swallowee === '..' ||
+ (!options.dot && swallowee.charAt(0) === '.')) {
+ this.debug('dot detected!', file, fr, pattern, pr);
+ break;
+ }
+ // ** swallows a segment, and continue.
+ this.debug('globstar swallow a segment, and continue');
+ fr++;
+ }
+ }
+ // no match was found.
+ // However, in partial mode, we can't say this is necessarily over.
+ /* c8 ignore start */
+ if (partial) {
+ // ran out of file
+ this.debug('\n>>> no match, partial?', file, fr, pattern, pr);
+ if (fr === fl) {
+ return true;
+ }
+ }
+ /* c8 ignore stop */
+ return false;
+ }
+ // something other than **
+ // non-magic patterns just have to match exactly
+ // patterns with magic have been turned into regexps.
+ let hit;
+ if (typeof p === 'string') {
+ hit = f === p;
+ this.debug('string match', p, f, hit);
+ }
+ else {
+ hit = p.test(f);
+ this.debug('pattern match', p, f, hit);
+ }
+ if (!hit)
+ return false;
+ }
+ // Note: ending in / means that we'll get a final ""
+ // at the end of the pattern. This can only match a
+ // corresponding "" at the end of the file.
+ // If the file ends in /, then it can only match a
+ // a pattern that ends in /, unless the pattern just
+ // doesn't have any more for it. But, a/b/ should *not*
+ // match "a/b/*", even though "" matches against the
+ // [^/]*? pattern, except in partial mode, where it might
+ // simply not be reached yet.
+ // However, a/b/ should still satisfy a/*
+ // now either we fell off the end of the pattern, or we're done.
+ if (fi === fl && pi === pl) {
+ // ran out of pattern and filename at the same time.
+ // an exact hit!
+ return true;
+ }
+ else if (fi === fl) {
+ // ran out of file, but still had pattern left.
+ // this is ok if we're doing the match as part of
+ // a glob fs traversal.
+ return partial;
+ }
+ else if (pi === pl) {
+ // ran out of pattern, still have file left.
+ // this is only acceptable if we're on the very last
+ // empty segment of a file with a trailing slash.
+ // a/* should match a/b/
+ return fi === fl - 1 && file[fi] === '';
+ /* c8 ignore start */
+ }
+ else {
+ // should be unreachable.
+ throw new Error('wtf?');
+ }
+ /* c8 ignore stop */
+ }
+ braceExpand() {
+ return braceExpand(this.pattern, this.options);
+ }
+ parse(pattern, isSub) {
+ assertValidPattern(pattern);
+ const options = this.options;
+ // shortcuts
+ if (pattern === '**') {
+ if (!options.noglobstar)
+ return GLOBSTAR;
+ else
+ pattern = '*';
+ }
+ if (pattern === '')
+ return '';
+ // far and away, the most common glob pattern parts are
+ // *, *.*, and *. Add a fast check method for those.
+ let m;
+ let fastTest = null;
+ if (isSub !== SUBPARSE) {
+ if ((m = pattern.match(starRE))) {
+ fastTest = options.dot ? starTestDot : starTest;
+ }
+ else if ((m = pattern.match(starDotExtRE))) {
+ fastTest = (options.nocase
+ ? options.dot
+ ? starDotExtTestNocaseDot
+ : starDotExtTestNocase
+ : options.dot
+ ? starDotExtTestDot
+ : starDotExtTest)(m[1]);
+ }
+ else if ((m = pattern.match(qmarksRE))) {
+ fastTest = (options.nocase
+ ? options.dot
+ ? qmarksTestNocaseDot
+ : qmarksTestNocase
+ : options.dot
+ ? qmarksTestDot
+ : qmarksTest)(m);
+ }
+ else if ((m = pattern.match(starDotStarRE))) {
+ fastTest = options.dot ? starDotStarTestDot : starDotStarTest;
+ }
+ else if ((m = pattern.match(dotStarRE))) {
+ fastTest = dotStarTest;
+ }
+ }
+ let re = '';
+ let hasMagic = false;
+ let escaping = false;
+ // ? => one single character
+ const patternListStack = [];
+ const negativeLists = [];
+ let stateChar = false;
+ let inClass = false;
+ let reClassStart = -1;
+ let classStart = -1;
+ let cs;
+ let pl;
+ let sp;
+ // . and .. never match anything that doesn't start with .,
+ // even when options.dot is set. However, if the pattern
+ // starts with ., then traversal patterns can match.
+ let dotTravAllowed = pattern.charAt(0) === '.';
+ let dotFileAllowed = options.dot || dotTravAllowed;
+ const patternStart = () => dotTravAllowed
+ ? ''
+ : dotFileAllowed
+ ? '(?!(?:^|\\/)\\.{1,2}(?:$|\\/))'
+ : '(?!\\.)';
+ const subPatternStart = (p) => p.charAt(0) === '.'
+ ? ''
+ : options.dot
+ ? '(?!(?:^|\\/)\\.{1,2}(?:$|\\/))'
+ : '(?!\\.)';
+ const clearStateChar = () => {
+ if (stateChar) {
+ // we had some state-tracking character
+ // that wasn't consumed by this pass.
+ switch (stateChar) {
+ case '*':
+ re += star;
+ hasMagic = true;
+ break;
+ case '?':
+ re += qmark;
+ hasMagic = true;
+ break;
+ default:
+ re += '\\' + stateChar;
+ break;
+ }
+ this.debug('clearStateChar %j %j', stateChar, re);
+ stateChar = false;
+ }
+ };
+ for (let i = 0, c; i < pattern.length && (c = pattern.charAt(i)); i++) {
+ this.debug('%s\t%s %s %j', pattern, i, re, c);
+ // skip over any that are escaped.
+ if (escaping) {
+ // completely not allowed, even escaped.
+ // should be impossible.
+ /* c8 ignore start */
+ if (c === '/') {
+ return false;
+ }
+ /* c8 ignore stop */
+ if (reSpecials[c]) {
+ re += '\\';
+ }
+ re += c;
+ escaping = false;
+ continue;
+ }
+ switch (c) {
+ // Should already be path-split by now.
+ /* c8 ignore start */
+ case '/': {
+ return false;
+ }
+ /* c8 ignore stop */
+ case '\\':
+ if (inClass && pattern.charAt(i + 1) === '-') {
+ re += c;
+ continue;
+ }
+ clearStateChar();
+ escaping = true;
+ continue;
+ // the various stateChar values
+ // for the "extglob" stuff.
+ case '?':
+ case '*':
+ case '+':
+ case '@':
+ case '!':
+ this.debug('%s\t%s %s %j <-- stateChar', pattern, i, re, c);
+ // all of those are literals inside a class, except that
+ // the glob [!a] means [^a] in regexp
+ if (inClass) {
+ this.debug(' in class');
+ if (c === '!' && i === classStart + 1)
+ c = '^';
+ re += c;
+ continue;
+ }
+ // if we already have a stateChar, then it means
+ // that there was something like ** or +? in there.
+ // Handle the stateChar, then proceed with this one.
+ this.debug('call clearStateChar %j', stateChar);
+ clearStateChar();
+ stateChar = c;
+ // if extglob is disabled, then +(asdf|foo) isn't a thing.
+ // just clear the statechar *now*, rather than even diving into
+ // the patternList stuff.
+ if (options.noext)
+ clearStateChar();
+ continue;
+ case '(': {
+ if (inClass) {
+ re += '(';
+ continue;
+ }
+ if (!stateChar) {
+ re += '\\(';
+ continue;
+ }
+ const plEntry = {
+ type: stateChar,
+ start: i - 1,
+ reStart: re.length,
+ open: plTypes[stateChar].open,
+ close: plTypes[stateChar].close,
+ };
+ this.debug(this.pattern, '\t', plEntry);
+ patternListStack.push(plEntry);
+ // negation is (?:(?!(?:js)(?:))[^/]*)
+ re += plEntry.open;
+ // next entry starts with a dot maybe?
+ if (plEntry.start === 0 && plEntry.type !== '!') {
+ dotTravAllowed = true;
+ re += subPatternStart(pattern.slice(i + 1));
+ }
+ this.debug('plType %j %j', stateChar, re);
+ stateChar = false;
+ continue;
+ }
+ case ')': {
+ const plEntry = patternListStack[patternListStack.length - 1];
+ if (inClass || !plEntry) {
+ re += '\\)';
+ continue;
+ }
+ patternListStack.pop();
+ // closing an extglob
+ clearStateChar();
+ hasMagic = true;
+ pl = plEntry;
+ // negation is (?:(?!js)[^/]*)
+ // The others are (?:)
+ re += pl.close;
+ if (pl.type === '!') {
+ negativeLists.push(Object.assign(pl, { reEnd: re.length }));
+ }
+ continue;
+ }
+ case '|': {
+ const plEntry = patternListStack[patternListStack.length - 1];
+ if (inClass || !plEntry) {
+ re += '\\|';
+ continue;
+ }
+ clearStateChar();
+ re += '|';
+ // next subpattern can start with a dot?
+ if (plEntry.start === 0 && plEntry.type !== '!') {
+ dotTravAllowed = true;
+ re += subPatternStart(pattern.slice(i + 1));
+ }
+ continue;
+ }
+ // these are mostly the same in regexp and glob
+ case '[':
+ // swallow any state-tracking char before the [
+ clearStateChar();
+ if (inClass) {
+ re += '\\' + c;
+ continue;
+ }
+ inClass = true;
+ classStart = i;
+ reClassStart = re.length;
+ re += c;
+ continue;
+ case ']':
+ // a right bracket shall lose its special
+ // meaning and represent itself in
+ // a bracket expression if it occurs
+ // first in the list. -- POSIX.2 2.8.3.2
+ if (i === classStart + 1 || !inClass) {
+ re += '\\' + c;
+ continue;
+ }
+ // split where the last [ was, make sure we don't have
+ // an invalid re. if so, re-walk the contents of the
+ // would-be class to re-translate any characters that
+ // were passed through as-is
+ // TODO: It would probably be faster to determine this
+ // without a try/catch and a new RegExp, but it's tricky
+ // to do safely. For now, this is safe and works.
+ cs = pattern.substring(classStart + 1, i);
+ try {
+ RegExp('[' + braExpEscape(charUnescape(cs)) + ']');
+ // looks good, finish up the class.
+ re += c;
+ }
+ catch (er) {
+ // out of order ranges in JS are errors, but in glob syntax,
+ // they're just a range that matches nothing.
+ re = re.substring(0, reClassStart) + '(?:$.)'; // match nothing ever
+ }
+ hasMagic = true;
+ inClass = false;
+ continue;
+ default:
+ // swallow any state char that wasn't consumed
+ clearStateChar();
+ if (reSpecials[c] && !(c === '^' && inClass)) {
+ re += '\\';
+ }
+ re += c;
+ break;
+ } // switch
+ } // for
+ // handle the case where we left a class open.
+ // "[abc" is valid, equivalent to "\[abc"
+ if (inClass) {
+ // split where the last [ was, and escape it
+ // this is a huge pita. We now have to re-walk
+ // the contents of the would-be class to re-translate
+ // any characters that were passed through as-is
+ cs = pattern.slice(classStart + 1);
+ sp = this.parse(cs, SUBPARSE);
+ re = re.substring(0, reClassStart) + '\\[' + sp[0];
+ hasMagic = hasMagic || sp[1];
+ }
+ // handle the case where we had a +( thing at the *end*
+ // of the pattern.
+ // each pattern list stack adds 3 chars, and we need to go through
+ // and escape any | chars that were passed through as-is for the regexp.
+ // Go through and escape them, taking care not to double-escape any
+ // | chars that were already escaped.
+ for (pl = patternListStack.pop(); pl; pl = patternListStack.pop()) {
+ let tail;
+ tail = re.slice(pl.reStart + pl.open.length);
+ this.debug(this.pattern, 'setting tail', re, pl);
+ // maybe some even number of \, then maybe 1 \, followed by a |
+ tail = tail.replace(/((?:\\{2}){0,64})(\\?)\|/g, (_, $1, $2) => {
+ if (!$2) {
+ // the | isn't already escaped, so escape it.
+ $2 = '\\';
+ // should already be done
+ /* c8 ignore start */
+ }
+ /* c8 ignore stop */
+ // need to escape all those slashes *again*, without escaping the
+ // one that we need for escaping the | character. As it works out,
+ // escaping an even number of slashes can be done by simply repeating
+ // it exactly after itself. That's why this trick works.
+ //
+ // I am sorry that you have to see this.
+ return $1 + $1 + $2 + '|';
+ });
+ this.debug('tail=%j\n %s', tail, tail, pl, re);
+ const t = pl.type === '*' ? star : pl.type === '?' ? qmark : '\\' + pl.type;
+ hasMagic = true;
+ re = re.slice(0, pl.reStart) + t + '\\(' + tail;
+ }
+ // handle trailing things that only matter at the very end.
+ clearStateChar();
+ if (escaping) {
+ // trailing \\
+ re += '\\\\';
+ }
+ // only need to apply the nodot start if the re starts with
+ // something that could conceivably capture a dot
+ const addPatternStart = addPatternStartSet[re.charAt(0)];
+ // Hack to work around lack of negative lookbehind in JS
+ // A pattern like: *.!(x).!(y|z) needs to ensure that a name
+ // like 'a.xyz.yz' doesn't match. So, the first negative
+ // lookahead, has to look ALL the way ahead, to the end of
+ // the pattern.
+ for (let n = negativeLists.length - 1; n > -1; n--) {
+ const nl = negativeLists[n];
+ const nlBefore = re.slice(0, nl.reStart);
+ const nlFirst = re.slice(nl.reStart, nl.reEnd - 8);
+ let nlAfter = re.slice(nl.reEnd);
+ const nlLast = re.slice(nl.reEnd - 8, nl.reEnd) + nlAfter;
+ // Handle nested stuff like *(*.js|!(*.json)), where open parens
+ // mean that we should *not* include the ) in the bit that is considered
+ // "after" the negated section.
+ const closeParensBefore = nlBefore.split(')').length;
+ const openParensBefore = nlBefore.split('(').length - closeParensBefore;
+ let cleanAfter = nlAfter;
+ for (let i = 0; i < openParensBefore; i++) {
+ cleanAfter = cleanAfter.replace(/\)[+*?]?/, '');
+ }
+ nlAfter = cleanAfter;
+ const dollar = nlAfter === '' && isSub !== SUBPARSE ? '(?:$|\\/)' : '';
+ re = nlBefore + nlFirst + nlAfter + dollar + nlLast;
+ }
+ // if the re is not "" at this point, then we need to make sure
+ // it doesn't match against an empty path part.
+ // Otherwise a/* will match a/, which it should not.
+ if (re !== '' && hasMagic) {
+ re = '(?=.)' + re;
+ }
+ if (addPatternStart) {
+ re = patternStart() + re;
+ }
+ // parsing just a piece of a larger pattern.
+ if (isSub === SUBPARSE) {
+ return [re, hasMagic];
+ }
+ // if it's nocase, and the lcase/uppercase don't match, it's magic
+ if (options.nocase && !hasMagic && !options.nocaseMagicOnly) {
+ hasMagic = pattern.toUpperCase() !== pattern.toLowerCase();
+ }
+ // skip the regexp for non-magical patterns
+ // unescape anything in it, though, so that it'll be
+ // an exact match against a file etc.
+ if (!hasMagic) {
+ return globUnescape(pattern);
+ }
+ const flags = options.nocase ? 'i' : '';
+ try {
+ const ext = fastTest
+ ? {
+ _glob: pattern,
+ _src: re,
+ test: fastTest,
+ }
+ : {
+ _glob: pattern,
+ _src: re,
+ };
+ return Object.assign(new RegExp('^' + re + '$', flags), ext);
+ /* c8 ignore start */
+ }
+ catch (er) {
+ // should be impossible
+ // If it was an invalid regular expression, then it can't match
+ // anything. This trick looks for a character after the end of
+ // the string, which is of course impossible, except in multi-line
+ // mode, but it's not a /m regex.
+ this.debug('invalid regexp', er);
+ return new RegExp('$.');
+ }
+ /* c8 ignore stop */
+ }
+ makeRe() {
+ if (this.regexp || this.regexp === false)
+ return this.regexp;
+ // at this point, this.set is a 2d array of partial
+ // pattern strings, or "**".
+ //
+ // It's better to use .match(). This function shouldn't
+ // be used, really, but it's pretty convenient sometimes,
+ // when you just want to work with a regex.
+ const set = this.set;
+ if (!set.length) {
+ this.regexp = false;
+ return this.regexp;
+ }
+ const options = this.options;
+ const twoStar = options.noglobstar
+ ? star
+ : options.dot
+ ? twoStarDot
+ : twoStarNoDot;
+ const flags = options.nocase ? 'i' : '';
+ // regexpify non-globstar patterns
+ // if ** is only item, then we just do one twoStar
+ // if ** is first, and there are more, prepend (\/|twoStar\/)? to next
+ // if ** is last, append (\/twoStar|) to previous
+ // if ** is in the middle, append (\/|\/twoStar\/) to previous
+ // then filter out GLOBSTAR symbols
+ let re = set
+ .map(pattern => {
+ const pp = pattern.map(p => typeof p === 'string'
+ ? regExpEscape(p)
+ : p === GLOBSTAR
+ ? GLOBSTAR
+ : p._src);
+ pp.forEach((p, i) => {
+ const next = pp[i + 1];
+ const prev = pp[i - 1];
+ if (p !== GLOBSTAR || prev === GLOBSTAR) {
+ return;
+ }
+ if (prev === undefined) {
+ if (next !== undefined && next !== GLOBSTAR) {
+ pp[i + 1] = '(?:\\/|' + twoStar + '\\/)?' + next;
+ }
+ else {
+ pp[i] = twoStar;
+ }
+ }
+ else if (next === undefined) {
+ pp[i - 1] = prev + '(?:\\/|' + twoStar + ')?';
+ }
+ else if (next !== GLOBSTAR) {
+ pp[i - 1] = prev + '(?:\\/|\\/' + twoStar + '\\/)' + next;
+ pp[i + 1] = GLOBSTAR;
+ }
+ });
+ return pp.filter(p => p !== GLOBSTAR).join('/');
+ })
+ .join('|');
+ // must match entire pattern
+ // ending in a * or ** will make it less strict.
+ re = '^(?:' + re + ')$';
+ // can match anything, as long as it's not this.
+ if (this.negate)
+ re = '^(?!' + re + ').*$';
+ try {
+ this.regexp = new RegExp(re, flags);
+ /* c8 ignore start */
+ }
+ catch (ex) {
+ // should be impossible
+ this.regexp = false;
+ }
+ /* c8 ignore stop */
+ return this.regexp;
+ }
+ slashSplit(p) {
+ // if p starts with // on windows, we preserve that
+ // so that UNC paths aren't broken. Otherwise, any number of
+ // / characters are coalesced into one, unless
+ // preserveMultipleSlashes is set to true.
+ if (this.preserveMultipleSlashes) {
+ return p.split('/');
+ }
+ else if (isWindows && /^\/\/[^\/]+/.test(p)) {
+ // add an extra '' for the one we lose
+ return ['', ...p.split(/\/+/)];
+ }
+ else {
+ return p.split(/\/+/);
+ }
+ }
+ match(f, partial = this.partial) {
+ this.debug('match', f, this.pattern);
+ // short-circuit in the case of busted things.
+ // comments, etc.
+ if (this.comment) {
+ return false;
+ }
+ if (this.empty) {
+ return f === '';
+ }
+ if (f === '/' && partial) {
+ return true;
+ }
+ const options = this.options;
+ // windows: need to use /, not \
+ if (path.sep !== '/') {
+ f = f.split(path.sep).join('/');
+ }
+ // treat the test path as a set of pathparts.
+ const ff = this.slashSplit(f);
+ this.debug(this.pattern, 'split', ff);
+ // just ONE of the pattern sets in this.set needs to match
+ // in order for it to be valid. If negating, then just one
+ // match means that we have failed.
+ // Either way, return on the first hit.
+ const set = this.set;
+ this.debug(this.pattern, 'set', set);
+ // Find the basename of the path by looking for the last non-empty segment
+ let filename = ff[ff.length - 1];
+ if (!filename) {
+ for (let i = ff.length - 2; !filename && i >= 0; i--) {
+ filename = ff[i];
+ }
+ }
+ for (let i = 0; i < set.length; i++) {
+ const pattern = set[i];
+ let file = ff;
+ if (options.matchBase && pattern.length === 1) {
+ file = [filename];
+ }
+ const hit = this.matchOne(file, pattern, partial);
+ if (hit) {
+ if (options.flipNegate) {
+ return true;
+ }
+ return !this.negate;
+ }
+ }
+ // didn't get any hits. this is success if it's a negative
+ // pattern, failure otherwise.
+ if (options.flipNegate) {
+ return false;
+ }
+ return this.negate;
+ }
+ static defaults(def) {
+ return minimatch.defaults(def).Minimatch;
+ }
+}
+minimatch.Minimatch = Minimatch;
+//# sourceMappingURL=index.js.map
\ No newline at end of file
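
For orientation, a small usage sketch of the matcher this file defines. It is hedged: the paths are made up, and only options that the parser above actually reads (`dot`, `nocase`) are used.

    import { minimatch, Minimatch } from 'minimatch'

    // ** never matches dot-led segments unless options.dot is set
    // (see the GLOBSTAR branch of matchOne above).
    console.log(minimatch('src/lib/a.js', 'src/**/*.js'))                   // true
    console.log(minimatch('src/.cache/a.js', 'src/**/*.js'))                // false
    console.log(minimatch('src/.cache/a.js', 'src/**/*.js', { dot: true })) // true

    // A leading '!' flips the result (parseNegate above); nocase folds case
    // in both the fast-path tests and the compiled regexps.
    const mm = new Minimatch('!*.TMP', { nocase: true })
    console.log(mm.match('junk.tmp')) // false: the path matches the negated pattern
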
diff --git a/deps/npm/node_modules/minimatch/dist/mjs/index.js.map b/deps/npm/node_modules/minimatch/dist/mjs/index.js.map
new file mode 100644
index 00000000000000..854172c5bd0f01
--- /dev/null
+++ b/deps/npm/node_modules/minimatch/dist/mjs/index.js.map
@@ -0,0 +1 @@
+{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":"AAmBA,MAAM,CAAC,MAAM,SAAS,GAAG,CACvB,CAAS,EACT,OAAe,EACf,UAA4B,EAAE,EAC9B,EAAE;IACF,kBAAkB,CAAC,OAAO,CAAC,CAAA;IAE3B,oCAAoC;IACpC,IAAI,CAAC,OAAO,CAAC,SAAS,IAAI,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,EAAE;QACnD,OAAO,KAAK,CAAA;KACb;IAED,OAAO,IAAI,SAAS,CAAC,OAAO,EAAE,OAAO,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAA;AACjD,CAAC,CAAA;AAED,eAAe,SAAS,CAAA;AAExB,wDAAwD;AACxD,MAAM,YAAY,GAAG,uBAAuB,CAAA;AAC5C,MAAM,cAAc,GAAG,CAAC,GAAW,EAAE,EAAE,CAAC,CAAC,CAAS,EAAE,EAAE,CACpD,CAAC,CAAC,CAAC,UAAU,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AACvC,MAAM,iBAAiB,GAAG,CAAC,GAAW,EAAE,EAAE,CAAC,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AACzE,MAAM,oBAAoB,GAAG,CAAC,GAAW,EAAE,EAAE;IAC3C,GAAG,GAAG,GAAG,CAAC,WAAW,EAAE,CAAA;IACvB,OAAO,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,CAAC,UAAU,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC,WAAW,EAAE,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AAC3E,CAAC,CAAA;AACD,MAAM,uBAAuB,GAAG,CAAC,GAAW,EAAE,EAAE;IAC9C,GAAG,GAAG,GAAG,CAAC,WAAW,EAAE,CAAA;IACvB,OAAO,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,WAAW,EAAE,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AACrD,CAAC,CAAA;AACD,MAAM,aAAa,GAAG,YAAY,CAAA;AAClC,MAAM,eAAe,GAAG,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,CAAC,UAAU,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AAC5E,MAAM,kBAAkB,GAAG,CAAC,CAAS,EAAE,EAAE,CACvC,CAAC,KAAK,GAAG,IAAI,CAAC,KAAK,IAAI,IAAI,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AAC5C,MAAM,SAAS,GAAG,SAAS,CAAA;AAC3B,MAAM,WAAW,GAAG,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,KAAK,GAAG,IAAI,CAAC,KAAK,IAAI,IAAI,CAAC,CAAC,UAAU,CAAC,GAAG,CAAC,CAAA;AAC/E,MAAM,MAAM,GAAG,OAAO,CAAA;AACtB,MAAM,QAAQ,GAAG,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,MAAM,KAAK,CAAC,IAAI,CAAC,CAAC,CAAC,UAAU,CAAC,GAAG,CAAC,CAAA;AACpE,MAAM,WAAW,GAAG,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,MAAM,KAAK,CAAC,IAAI,CAAC,KAAK,GAAG,IAAI,CAAC,KAAK,IAAI,CAAA;AAC5E,MAAM,QAAQ,GAAG,wBAAwB,CAAA;AACzC,MAAM,gBAAgB,GAAG,CAAC,CAAC,EAAE,EAAE,GAAG,GAAG,EAAE,CAAmB,EAAE,EAAE;IAC5D,MAAM,KAAK,GAAG,eAAe,CAAC,CAAC,EAAE,CAAC,CAAC,CAAA;IACnC,IAAI,CAAC,GAAG;QAAE,OAAO,KAAK,CAAA;IACtB,GAAG,GAAG,GAAG,CAAC,WAAW,EAAE,CAAA;IACvB,OAAO,CAAC,CAAS,EAAE,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,WAAW,EAAE,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AACjE,CAAC,CAAA;AACD,MAAM,mBAAmB,GAAG,CAAC,CAAC,EAAE,EAAE,GAAG,GAAG,EAAE,CAAmB,EAAE,EAAE;IAC/D,MAAM,KAAK,GAAG,kBAAkB,CAAC,CAAC,EAAE,CAAC,CAAC,CAAA;IACtC,IAAI,CAAC,GAAG;QAAE,OAAO,KAAK,CAAA;IACtB,GAAG,GAAG,GAAG,CAAC,WAAW,EAAE,CAAA;IACvB,OAAO,CAAC,CAAS,EAAE,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,WAAW,EAAE,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AACjE,CAAC,CAAA;AACD,MAAM,aAAa,GAAG,CAAC,CAAC,EAAE,EAAE,GAAG,GAAG,EAAE,CAAmB,EAAE,EAAE;IACzD,MAAM,KAAK,GAAG,kBAAkB,CAAC,CAAC,EAAE,CAAC,CAAC,CAAA;IACtC,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAS,EAAE,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AAClE,CAAC,CAAA;AACD,MAAM,UAAU,GAAG,CAAC,CAAC,EAAE,EAAE,GAAG,GAAG,EAAE,CAAmB,EAAE,EAAE;IACtD,MAAM,KAAK,GAAG,eAAe,CAAC,CAAC,EAAE,CAAC,CAAC,CAAA;IACnC,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAS,EAAE,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAA;AAClE,CAAC,CAAA;AACD,MAAM,eAAe,GAAG,CAAC,CAAC,EAAE,CAAmB,EAAE,EAAE;IACjD,MAAM,GAAG,GAAG,EAAE,CAAC,MAAM,CAAA;IACrB,OAAO,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,MAAM,KAAK,GAAG,IAAI,CAAC,CAAC,CAAC,UAAU,CAAC,GAAG,CAAC,CAAA;AAC9D,CAAC,CAAA;AACD,MAAM,kBAAkB,GAAG,CAAC,CAAC,EAAE,CAAmB,EAAE,EAAE;IACpD,MAAM,GAAG,GAAG,EAAE,CAAC,MAAM,CAAA;IACrB,OAAO,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,MAAM,KAAK,GAAG,IAAI,CAAC,KAAK,GAAG,IAAI,CAAC,KAAK,IAAI,CAAA;AACnE,CAAC,
CAAA;AAED,qBAAqB;AACrB,MAAM,QAAQ,GACZ,OAAO,OAAO,KAAK,QAAQ,IAAI,OAAO;IACpC,CAAC,CAAC,CAAC,OAAO,OAAO,CAAC,GAAG,KAAK,QAAQ;QAC9B,OAAO,CAAC,GAAG;QACX,OAAO,CAAC,GAAG,CAAC,8BAA8B,CAAC;QAC7C,OAAO,CAAC,QAAQ;IAClB,CAAC,CAAC,OAAO,CAAA;AACb,MAAM,SAAS,GAAG,QAAQ,KAAK,OAAO,CAAA;AACtC,MAAM,IAAI,GAAG,SAAS,CAAC,CAAC,CAAC,EAAE,GAAG,EAAE,IAAI,EAAE,CAAC,CAAC,CAAC,EAAE,GAAG,EAAE,GAAG,EAAE,CAAA;AACrD,oBAAoB;AAEpB,MAAM,CAAC,MAAM,GAAG,GAAG,IAAI,CAAC,GAAG,CAAA;AAC3B,SAAS,CAAC,GAAG,GAAG,GAAG,CAAA;AAEnB,MAAM,CAAC,MAAM,QAAQ,GAAG,MAAM,CAAC,aAAa,CAAC,CAAA;AAC7C,SAAS,CAAC,QAAQ,GAAG,QAAQ,CAAA;AAC7B,OAAO,MAAM,MAAM,iBAAiB,CAAA;AAEpC,MAAM,OAAO,GAAG;IACd,GAAG,EAAE,EAAE,IAAI,EAAE,WAAW,EAAE,KAAK,EAAE,WAAW,EAAE;IAC9C,GAAG,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE,KAAK,EAAE,IAAI,EAAE;IACjC,GAAG,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE,KAAK,EAAE,IAAI,EAAE;IACjC,GAAG,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE,KAAK,EAAE,IAAI,EAAE;IACjC,GAAG,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE,KAAK,EAAE,GAAG,EAAE;CACjC,CAAA;AAGD,gCAAgC;AAChC,iDAAiD;AACjD,MAAM,KAAK,GAAG,MAAM,CAAA;AAEpB,gCAAgC;AAChC,MAAM,IAAI,GAAG,KAAK,GAAG,IAAI,CAAA;AAEzB,4DAA4D;AAC5D,+DAA+D;AAC/D,6CAA6C;AAC7C,MAAM,UAAU,GAAG,yCAAyC,CAAA;AAE5D,kCAAkC;AAClC,6CAA6C;AAC7C,MAAM,YAAY,GAAG,yBAAyB,CAAA;AAE9C,sCAAsC;AACtC,MAAM,OAAO,GAAG,CAAC,CAAS,EAAE,EAAE,CAC5B,CAAC,CAAC,KAAK,CAAC,EAAE,CAAC,CAAC,MAAM,CAAC,CAAC,GAA6B,EAAE,CAAC,EAAE,EAAE;IACtD,GAAG,CAAC,CAAC,CAAC,GAAG,IAAI,CAAA;IACb,OAAO,GAAG,CAAA;AACZ,CAAC,EAAE,EAAE,CAAC,CAAA;AAER,gDAAgD;AAChD,MAAM,UAAU,GAAG,OAAO,CAAC,iBAAiB,CAAC,CAAA;AAE7C,4DAA4D;AAC5D,MAAM,kBAAkB,GAAG,OAAO,CAAC,KAAK,CAAC,CAAA;AAEzC,MAAM,CAAC,MAAM,MAAM,GACjB,CAAC,OAAe,EAAE,UAA4B,EAAE,EAAE,EAAE,CACpD,CAAC,CAAS,EAAE,EAAE,CACZ,SAAS,CAAC,CAAC,EAAE,OAAO,EAAE,OAAO,CAAC,CAAA;AAClC,SAAS,CAAC,MAAM,GAAG,MAAM,CAAA;AAEzB,MAAM,GAAG,GAAG,CAAC,CAAmB,EAAE,IAAsB,EAAE,EAAE,EAAE,CAC5D,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,CAAC,EAAE,CAAC,CAAC,CAAA;AAEzB,MAAM,CAAC,MAAM,QAAQ,GAAG,CAAC,GAAqB,EAAoB,EAAE;IAClE,IAAI,CAAC,GAAG,IAAI,OAAO,GAAG,KAAK,QAAQ,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,MAAM,EAAE;QAC/D,OAAO,SAAS,CAAA;KACjB;IAED,MAAM,IAAI,GAAG,SAAS,CAAA;IAEtB,MAAM,CAAC,GAAG,CAAC,CAAS,EAAE,OAAe,EAAE,UAA4B,EAAE,EAAE,EAAE,CACvE,IAAI,CAAC,CAAC,EAAE,OAAO,EAAE,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC,CAAA;IAErC,OAAO,MAAM,CAAC,MAAM,CAAC,CAAC,EAAE;QACtB,SAAS,EAAE,MAAM,SAAU,SAAQ,IAAI,CAAC,SAAS;YAC/C,YAAY,OAAe,EAAE,UAA4B,EAAE;gBACzD,KAAK,CAAC,OAAO,EAAE,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC,CAAA;YACnC,CAAC;YACD,MAAM,CAAC,QAAQ,CAAC,OAAyB;gBACvC,OAAO,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC,CAAC,SAAS,CAAA;YACnD,CAAC;SACF;QAED,MAAM,EAAE,CAAC,OAAe,EAAE,UAA4B,EAAE,EAAE,EAAE,CAC1D,IAAI,CAAC,MAAM,CAAC,OAAO,EAAE,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC;QAEzC,QAAQ,EAAE,CAAC,OAAyB,EAAE,EAAE,CAAC,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC;QAEzE,MAAM,EAAE,CAAC,OAAe,EAAE,UAA4B,EAAE,EAAE,EAAE,CAC1D,IAAI,CAAC,MAAM,CAAC,OAAO,EAAE,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC;QAEzC,WAAW,EAAE,CAAC,OAAe,EAAE,UAA4B,EAAE,EAAE,EAAE,CAC/D,IAAI,CAAC,WAAW,CAAC,OAAO,EAAE,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC;QAE9C,KAAK,EAAE,CAAC,IAAc,EAAE,OAAe,EAAE,UAA4B,EAAE,EAAE,EAAE,CACzE,IAAI,CAAC,KAAK,CAAC,IAAI,EAAE,OAAO,EAAE,GAAG,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC;QAE9C,GAAG,EAAE,IAAI,CAAC,GAAG;QACb,QAAQ,EAAE,QAA2B;KACtC,CAAC,CAAA;AACJ,CAAC,CAAA;AACD,SAAS,CAAC,QAAQ,GAAG,QAAQ,CAAA;AAE7B,mBAAmB;AACnB,qBAAqB;AACrB,mBAAmB;AACnB,8BAA8B;AAC9B,mCAAmC;AACnC,2CAA2C;AAC3C,EAAE;AACF,iCAAiC;AACjC,qBAAqB;AACrB,iBAAiB;AACjB,MAAM,CAAC,MAAM,WAAW,GAAG,CACzB,OAAe,EACf,UAA4B,EAAE,EAC9B,EAAE;IACF,kBAAkB,CAAC,OAAO,CAAC,CAAA;IAE3B,wDAAwD;IACxD,wDAAwD;IACxD,IAAI,OAAO,CAAC,OAAO,IAAI,CAAC,kBAAkB,CAAC,IAAI,CAAC,OAAO,CAAC,EAA
E;QACxD,+BAA+B;QAC/B,OAAO,CAAC,OAAO,CAAC,CAAA;KACjB;IAED,OAAO,MAAM,CAAC,OAAO,CAAC,CAAA;AACxB,CAAC,CAAA;AACD,SAAS,CAAC,WAAW,GAAG,WAAW,CAAA;AAEnC,MAAM,kBAAkB,GAAG,IAAI,GAAG,EAAE,CAAA;AACpC,MAAM,kBAAkB,GAA2B,CACjD,OAAY,EACe,EAAE;IAC7B,IAAI,OAAO,OAAO,KAAK,QAAQ,EAAE;QAC/B,MAAM,IAAI,SAAS,CAAC,iBAAiB,CAAC,CAAA;KACvC;IAED,IAAI,OAAO,CAAC,MAAM,GAAG,kBAAkB,EAAE;QACvC,MAAM,IAAI,SAAS,CAAC,qBAAqB,CAAC,CAAA;KAC3C;AACH,CAAC,CAAA;AAED,yCAAyC;AACzC,kDAAkD;AAClD,oEAAoE;AACpE,oEAAoE;AACpE,6DAA6D;AAC7D,kEAAkE;AAClE,EAAE;AACF,0EAA0E;AAC1E,wEAAwE;AACxE,qEAAqE;AACrE,8DAA8D;AAC9D,MAAM,QAAQ,GAAG,MAAM,CAAC,UAAU,CAAC,CAAA;AAEnC,MAAM,CAAC,MAAM,MAAM,GAAG,CAAC,OAAe,EAAE,UAA4B,EAAE,EAAE,EAAE,CACxE,IAAI,SAAS,CAAC,OAAO,EAAE,OAAO,CAAC,CAAC,MAAM,EAAE,CAAA;AAC1C,SAAS,CAAC,MAAM,GAAG,MAAM,CAAA;AAEzB,MAAM,CAAC,MAAM,KAAK,GAAG,CACnB,IAAc,EACd,OAAe,EACf,UAA4B,EAAE,EAC9B,EAAE;IACF,MAAM,EAAE,GAAG,IAAI,SAAS,CAAC,OAAO,EAAE,OAAO,CAAC,CAAA;IAC1C,IAAI,GAAG,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAA;IACpC,IAAI,EAAE,CAAC,OAAO,CAAC,MAAM,IAAI,CAAC,IAAI,CAAC,MAAM,EAAE;QACrC,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,CAAA;KACnB;IACD,OAAO,IAAI,CAAA;AACb,CAAC,CAAA;AACD,SAAS,CAAC,KAAK,GAAG,KAAK,CAAA;AAEvB,+BAA+B;AAC/B,MAAM,YAAY,GAAG,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,OAAO,CAAC,QAAQ,EAAE,IAAI,CAAC,CAAA;AAC7D,MAAM,YAAY,GAAG,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,OAAO,CAAC,aAAa,EAAE,IAAI,CAAC,CAAA;AAClE,MAAM,YAAY,GAAG,CAAC,CAAS,EAAE,EAAE,CACjC,CAAC,CAAC,OAAO,CAAC,0BAA0B,EAAE,MAAM,CAAC,CAAA;AAC/C,MAAM,YAAY,GAAG,CAAC,CAAS,EAAE,EAAE,CAAC,CAAC,CAAC,OAAO,CAAC,UAAU,EAAE,MAAM,CAAC,CAAA;AAsBjE,MAAM,OAAO,SAAS;IACpB,OAAO,CAAkB;IACzB,GAAG,CAAyB;IAC5B,OAAO,CAAQ;IAEf,oBAAoB,CAAS;IAC7B,QAAQ,CAAS;IACjB,MAAM,CAAS;IACf,OAAO,CAAS;IAChB,KAAK,CAAS;IACd,uBAAuB,CAAS;IAChC,OAAO,CAAS;IAChB,OAAO,CAAU;IACjB,SAAS,CAAY;IAErB,MAAM,CAAyB;IAC/B,YAAY,OAAe,EAAE,UAA4B,EAAE;QACzD,kBAAkB,CAAC,OAAO,CAAC,CAAA;QAE3B,OAAO,GAAG,OAAO,IAAI,EAAE,CAAA;QACvB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAA;QACtB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAA;QACtB,IAAI,CAAC,oBAAoB;YACvB,CAAC,CAAC,OAAO,CAAC,oBAAoB,IAAI,OAAO,CAAC,kBAAkB,KAAK,KAAK,CAAA;QACxE,IAAI,IAAI,CAAC,oBAAoB,EAAE;YAC7B,IAAI,CAAC,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC,OAAO,CAAC,KAAK,EAAE,GAAG,CAAC,CAAA;SAChD;QACD,IAAI,CAAC,uBAAuB,GAAG,CAAC,CAAC,OAAO,CAAC,uBAAuB,CAAA;QAChE,IAAI,CAAC,MAAM,GAAG,IAAI,CAAA;QAClB,IAAI,CAAC,MAAM,GAAG,KAAK,CAAA;QACnB,IAAI,CAAC,QAAQ,GAAG,CAAC,CAAC,OAAO,CAAC,QAAQ,CAAA;QAClC,IAAI,CAAC,OAAO,GAAG,KAAK,CAAA;QACpB,IAAI,CAAC,KAAK,GAAG,KAAK,CAAA;QAClB,IAAI,CAAC,OAAO,GAAG,CAAC,CAAC,OAAO,CAAC,OAAO,CAAA;QAEhC,IAAI,CAAC,OAAO,GAAG,EAAE,CAAA;QACjB,IAAI,CAAC,SAAS,GAAG,EAAE,CAAA;QACnB,IAAI,CAAC,GAAG,GAAG,EAAE,CAAA;QAEb,+BAA+B;QAC/B,IAAI,CAAC,IAAI,EAAE,CAAA;IACb,CAAC;IAED,KAAK,CAAC,GAAG,CAAQ,IAAG,CAAC;IAErB,IAAI;QACF,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAC5B,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAE5B,6CAA6C;QAC7C,IAAI,CAAC,OAAO,CAAC,SAAS,IAAI,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,EAAE;YACnD,IAAI,CAAC,OAAO,GAAG,IAAI,CAAA;YACnB,OAAM;SACP;QAED,IAAI,CAAC,OAAO,EAAE;YACZ,IAAI,CAAC,KAAK,GAAG,IAAI,CAAA;YACjB,OAAM;SACP;QAED,oCAAoC;QACpC,IAAI,CAAC,WAAW,EAAE,CAAA;QAElB,wBAAwB;QACxB,IAAI,CAAC,OAAO,GAAG,IAAI,CAAC,WAAW,EAAE,CAAA;QAEjC,IAAI,OAAO,CAAC,KAAK,EAAE;YACjB,IAAI,CAAC,KAAK,GAAG,CAAC,GAAG,IAAW,EAAE,EAAE,CAAC,OAAO,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,CAAA;SACxD;QAED,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,OAAO,CAAC,CAAA;QAEtC,4EAA4E;QAC5E,qBAAqB;QACrB,8DAA8D;QAC9D,oDAAoD;QACpD,wCAAwC;QACxC,MAAM,YAAY,GAAG,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,CAAA;QAE9D,sDAAsD;QACtD,iEAAiE;QACjE,sDAAsD;QACtD,qDAAqD;QACrD,4D
AA4D;QAC5D,2DAA2D;QAC3D,6CAA6C;QAC7C,sEAAsE;QACtE,sEAAsE;QACtE,sEAAsE;QACtE,IAAI,IAAI,CAAC,OAAO,CAAC,UAAU,EAAE;YAC3B,iBAAiB;YACjB,IAAI,CAAC,SAAS,GAAG,YAAY,CAAA;SAC9B;aAAM;YACL,+DAA+D;YAC/D,gEAAgE;YAChE,KAAK,MAAM,KAAK,IAAI,YAAY,EAAE;gBAChC,IAAI,OAAgB,CAAA;gBACpB,GAAG;oBACD,OAAO,GAAG,KAAK,CAAA;oBACf,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,KAAK,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC,EAAE,EAAE;wBACzC,IAAI,KAAK,CAAC,CAAC,CAAC,KAAK,GAAG,IAAI,KAAK,CAAC,CAAC,GAAG,CAAC,CAAC,KAAK,IAAI,EAAE;4BAC7C,KAAK,CAAC,CAAC,CAAC,GAAG,IAAI,CAAA;4BACf,KAAK,CAAC,CAAC,GAAG,CAAC,CAAC,GAAG,GAAG,CAAA;4BAClB,OAAO,GAAG,IAAI,CAAA;yBACf;qBACF;iBACF,QAAQ,OAAO,EAAC;aAClB;YACD,IAAI,CAAC,SAAS,GAAG,YAAY,CAAC,GAAG,CAAC,KAAK,CAAC,EAAE;gBACxC,KAAK,GAAG,KAAK,CAAC,MAAM,CAAC,CAAC,GAAa,EAAE,IAAI,EAAE,EAAE;oBAC3C,MAAM,IAAI,GAAG,GAAG,CAAC,GAAG,CAAC,MAAM,GAAG,CAAC,CAAC,CAAA;oBAChC,IAAI,IAAI,KAAK,IAAI,IAAI,IAAI,KAAK,IAAI,EAAE;wBAClC,OAAO,GAAG,CAAA;qBACX;oBACD,IAAI,IAAI,KAAK,IAAI,EAAE;wBACjB,IAAI,IAAI,IAAI,IAAI,KAAK,IAAI,IAAI,IAAI,KAAK,GAAG,IAAI,IAAI,KAAK,IAAI,EAAE;4BAC1D,GAAG,CAAC,GAAG,EAAE,CAAA;4BACT,OAAO,GAAG,CAAA;yBACX;qBACF;oBACD,GAAG,CAAC,IAAI,CAAC,IAAI,CAAC,CAAA;oBACd,OAAO,GAAG,CAAA;gBACZ,CAAC,EAAE,EAAE,CAAC,CAAA;gBACN,OAAO,KAAK,CAAC,MAAM,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,KAAK,CAAA;YAC1C,CAAC,CAAC,CAAA;SACH;QAED,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,SAAS,CAAC,CAAA;QAExC,mBAAmB;QACnB,IAAI,GAAG,GAAG,IAAI,CAAC,SAAS,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE,EAAE,EAAE,CAAC,CAAC,CAAC,GAAG,CAAC,EAAE,CAAC,EAAE,CAAC,IAAI,CAAC,KAAK,CAAC,EAAE,CAAC,CAAC,CAAC,CAAA;QAEvE,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,GAAG,CAAC,CAAA;QAE7B,sDAAsD;QACtD,IAAI,CAAC,GAAG,GAAG,GAAG,CAAC,MAAM,CACnB,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,OAAO,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC,CACF,CAAA;QAE5B,2CAA2C;QAC3C,IAAI,SAAS,EAAE;YACb,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,GAAG,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;gBACxC,MAAM,CAAC,GAAG,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC,CAAA;gBACrB,IACE,CAAC,CAAC,CAAC,CAAC,KAAK,EAAE;oBACX,CAAC,CAAC,CAAC,CAAC,KAAK,EAAE;oBACX,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,KAAK,GAAG;oBAC5B,OAAO,CAAC,CAAC,CAAC,CAAC,KAAK,QAAQ;oBACxB,WAAW,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,EACtB;oBACA,CAAC,CAAC,CAAC,CAAC,GAAG,GAAG,CAAA;iBACX;aACF;SACF;QAED,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,GAAG,CAAC,CAAA;IACpC,CAAC;IAED,WAAW;QACT,IAAI,IAAI,CAAC,QAAQ;YAAE,OAAM;QAEzB,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAC5B,IAAI,MAAM,GAAG,KAAK,CAAA;QAClB,IAAI,YAAY,GAAG,CAAC,CAAA;QAEpB,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,OAAO,CAAC,MAAM,IAAI,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,EAAE,CAAC,EAAE,EAAE;YACpE,MAAM,GAAG,CAAC,MAAM,CAAA;YAChB,YAAY,EAAE,CAAA;SACf;QAED,IAAI,YAAY;YAAE,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC,KAAK,CAAC,YAAY,CAAC,CAAA;QAC5D,IAAI,CAAC,MAAM,GAAG,MAAM,CAAA;IACtB,CAAC;IAED,+CAA+C;IAC/C,yCAAyC;IACzC,uDAAuD;IACvD,mDAAmD;IACnD,mBAAmB;IACnB,QAAQ,CAAC,IAAc,EAAE,OAAsB,EAAE,UAAmB,KAAK;QACvE,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAE5B,yDAAyD;QACzD,iBAAiB;QACjB,IAAI,SAAS,EAAE;YACb,MAAM,OAAO,GACX,IAAI,CAAC,CAAC,CAAC,KAAK,EAAE;gBACd,IAAI,CAAC,CAAC,CAAC,KAAK,EAAE;gBACd,IAAI,CAAC,CAAC,CAAC,KAAK,GAAG;gBACf,OAAO,IAAI,CAAC,CAAC,CAAC,KAAK,QAAQ;gBAC3B,WAAW,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAA;YAC3B,MAAM,UAAU,GACd,OAAO,CAAC,CAAC,CAAC,KAAK,EAAE;gBACjB,OAAO,CAAC,CAAC,CAAC,KAAK,EAAE;gBACjB,OAAO,CAAC,CAAC,CAAC,KAAK,GAAG;gBAClB,OAAO,OAAO,CAAC,CAAC,CAAC,KAAK,QAAQ;gBAC9B,WAAW,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,CAAA;YAE9B,IAAI,OAAO,IAAI,UAAU,EAAE;gBACzB,MAAM,EAAE,GAAG,IAAI,CAAC,CAAC,CAAW,CAAA;gBAC5B,MAAM,EAAE,GAAG,OAAO,CAAC,CAAC,CAAW,CAAA;gBAC/B,IAAI,EAAE,C
AAC,WAAW,EAAE,KAAK,EAAE,CAAC,WAAW,EAAE,EAAE;oBACzC,IAAI,CAAC,CAAC,CAAC,GAAG,EAAE,CAAA;iBACb;aACF;iBAAM,IAAI,UAAU,IAAI,OAAO,IAAI,CAAC,CAAC,CAAC,KAAK,QAAQ,EAAE;gBACpD,MAAM,EAAE,GAAG,OAAO,CAAC,CAAC,CAAW,CAAA;gBAC/B,MAAM,EAAE,GAAG,IAAI,CAAC,CAAC,CAAC,CAAA;gBAClB,IAAI,EAAE,CAAC,WAAW,EAAE,KAAK,EAAE,CAAC,WAAW,EAAE,EAAE;oBACzC,OAAO,CAAC,CAAC,CAAC,GAAG,EAAE,CAAA;oBACf,OAAO,GAAG,OAAO,CAAC,KAAK,CAAC,CAAC,CAAC,CAAA;iBAC3B;aACF;iBAAM,IAAI,OAAO,IAAI,OAAO,OAAO,CAAC,CAAC,CAAC,KAAK,QAAQ,EAAE;gBACpD,MAAM,EAAE,GAAG,IAAI,CAAC,CAAC,CAAC,CAAA;gBAClB,IAAI,EAAE,CAAC,WAAW,EAAE,KAAK,OAAO,CAAC,CAAC,CAAC,CAAC,WAAW,EAAE,EAAE;oBACjD,OAAO,CAAC,CAAC,CAAC,GAAG,EAAE,CAAA;oBACf,IAAI,GAAG,IAAI,CAAC,KAAK,CAAC,CAAC,CAAC,CAAA;iBACrB;aACF;SACF;QAED,IAAI,CAAC,KAAK,CAAC,UAAU,EAAE,IAAI,EAAE,EAAE,IAAI,EAAE,OAAO,EAAE,CAAC,CAAA;QAC/C,IAAI,CAAC,KAAK,CAAC,UAAU,EAAE,IAAI,CAAC,MAAM,EAAE,OAAO,CAAC,MAAM,CAAC,CAAA;QAEnD,KACE,IAAI,EAAE,GAAG,CAAC,EAAE,EAAE,GAAG,CAAC,EAAE,EAAE,GAAG,IAAI,CAAC,MAAM,EAAE,EAAE,GAAG,OAAO,CAAC,MAAM,EACzD,EAAE,GAAG,EAAE,IAAI,EAAE,GAAG,EAAE,EAClB,EAAE,EAAE,EAAE,EAAE,EAAE,EACV;YACA,IAAI,CAAC,KAAK,CAAC,eAAe,CAAC,CAAA;YAC3B,IAAI,CAAC,GAAG,OAAO,CAAC,EAAE,CAAC,CAAA;YACnB,IAAI,CAAC,GAAG,IAAI,CAAC,EAAE,CAAC,CAAA;YAEhB,IAAI,CAAC,KAAK,CAAC,OAAO,EAAE,CAAC,EAAE,CAAC,CAAC,CAAA;YAEzB,wBAAwB;YACxB,wCAAwC;YACxC,qBAAqB;YACrB,IAAI,CAAC,KAAK,KAAK,EAAE;gBACf,OAAO,KAAK,CAAA;aACb;YACD,oBAAoB;YAEpB,IAAI,CAAC,KAAK,QAAQ,EAAE;gBAClB,IAAI,CAAC,KAAK,CAAC,UAAU,EAAE,CAAC,OAAO,EAAE,CAAC,EAAE,CAAC,CAAC,CAAC,CAAA;gBAEvC,OAAO;gBACP,yCAAyC;gBACzC,cAAc;gBACd,cAAc;gBACd,cAAc;gBACd,QAAQ;gBACR,iDAAiD;gBACjD,wDAAwD;gBACxD,yBAAyB;gBACzB,sDAAsD;gBACtD,6BAA6B;gBAC7B,EAAE;gBACF,mCAAmC;gBACnC,gBAAgB;gBAChB,eAAe;gBACf,kCAAkC;gBAClC,oBAAoB;gBACpB,mBAAmB;gBACnB,qCAAqC;gBACrC,mCAAmC;gBACnC,iCAAiC;gBACjC,kCAAkC;gBAClC,IAAI,EAAE,GAAG,EAAE,CAAA;gBACX,IAAI,EAAE,GAAG,EAAE,GAAG,CAAC,CAAA;gBACf,IAAI,EAAE,KAAK,EAAE,EAAE;oBACb,IAAI,CAAC,KAAK,CAAC,eAAe,CAAC,CAAA;oBAC3B,8CAA8C;oBAC9C,yBAAyB;oBACzB,2CAA2C;oBAC3C,sBAAsB;oBACtB,sDAAsD;oBACtD,uBAAuB;oBACvB,OAAO,EAAE,GAAG,EAAE,EAAE,EAAE,EAAE,EAAE;wBACpB,IACE,IAAI,CAAC,EAAE,CAAC,KAAK,GAAG;4BAChB,IAAI,CAAC,EAAE,CAAC,KAAK,IAAI;4BACjB,CAAC,CAAC,OAAO,CAAC,GAAG,IAAI,IAAI,CAAC,EAAE,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,CAAC;4BAE5C,OAAO,KAAK,CAAA;qBACf;oBACD,OAAO,IAAI,CAAA;iBACZ;gBAED,mDAAmD;gBACnD,OAAO,EAAE,GAAG,EAAE,EAAE;oBACd,IAAI,SAAS,GAAG,IAAI,CAAC,EAAE,CAAC,CAAA;oBAExB,IAAI,CAAC,KAAK,CAAC,kBAAkB,EAAE,IAAI,EAAE,EAAE,EAAE,OAAO,EAAE,EAAE,EAAE,SAAS,CAAC,CAAA;oBAEhE,qDAAqD;oBACrD,IAAI,IAAI,CAAC,QAAQ,CAAC,IAAI,CAAC,KAAK,CAAC,EAAE,CAAC,EAAE,OAAO,CAAC,KAAK,CAAC,EAAE,CAAC,EAAE,OAAO,CAAC,EAAE;wBAC7D,IAAI,CAAC,KAAK,CAAC,uBAAuB,EAAE,EAAE,EAAE,EAAE,EAAE,SAAS,CAAC,CAAA;wBACtD,iBAAiB;wBACjB,OAAO,IAAI,CAAA;qBACZ;yBAAM;wBACL,kCAAkC;wBAClC,iDAAiD;wBACjD,IACE,SAAS,KAAK,GAAG;4BACjB,SAAS,KAAK,IAAI;4BAClB,CAAC,CAAC,OAAO,CAAC,GAAG,IAAI,SAAS,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,CAAC,EAC7C;4BACA,IAAI,CAAC,KAAK,CAAC,eAAe,EAAE,IAAI,EAAE,EAAE,EAAE,OAAO,EAAE,EAAE,CAAC,CAAA;4BAClD,MAAK;yBACN;wBAED,uCAAuC;wBACvC,IAAI,CAAC,KAAK,CAAC,0CAA0C,CAAC,CAAA;wBACtD,EAAE,EAAE,CAAA;qBACL;iBACF;gBAED,sBAAsB;gBACtB,mEAAmE;gBACnE,qBAAqB;gBACrB,IAAI,OAAO,EAAE;oBACX,kBAAkB;oBAClB,IAAI,CAAC,KAAK,CAAC,0BAA0B,EAAE,IAAI,EAAE,EAAE,EAAE,OAAO,EAAE,EAAE,CAAC,CAAA;oBAC7D,IAAI,EAAE,KAAK,EAAE,EAAE;wBACb,OAAO,IAAI,CAAA;qBACZ;iBACF;gBACD,oBAAoB;gBACpB,OAAO,KAAK,CAAA;aACb;YAED,0BAA0B;YAC1B,gDAAgD;YAChD,qDAAqD;YACrD,IAAI,GAAY,CAAA;YAChB,IAAI,OAAO,CAAC,KAAK,QAAQ,EAAE;gBACzB,GAAG,GAAG,CAAC,KAAK,CAAC,CAAA;gBACb,IAAI,CAAC,KAAK,CAAC,cAAc,EAAE,CAAC,EAAE,CAAC,EAAE,GAAG,CAAC,CAAA;aACtC;iBAAM;gBACL,GAAG,G
AAG,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,CAAA;gBACf,IAAI,CAAC,KAAK,CAAC,eAAe,EAAE,CAAC,EAAE,CAAC,EAAE,GAAG,CAAC,CAAA;aACvC;YAED,IAAI,CAAC,GAAG;gBAAE,OAAO,KAAK,CAAA;SACvB;QAED,oDAAoD;QACpD,oDAAoD;QACpD,2CAA2C;QAC3C,kDAAkD;QAClD,oDAAoD;QACpD,uDAAuD;QACvD,oDAAoD;QACpD,yDAAyD;QACzD,6BAA6B;QAC7B,yCAAyC;QAEzC,gEAAgE;QAChE,IAAI,EAAE,KAAK,EAAE,IAAI,EAAE,KAAK,EAAE,EAAE;YAC1B,oDAAoD;YACpD,gBAAgB;YAChB,OAAO,IAAI,CAAA;SACZ;aAAM,IAAI,EAAE,KAAK,EAAE,EAAE;YACpB,+CAA+C;YAC/C,iDAAiD;YACjD,uBAAuB;YACvB,OAAO,OAAO,CAAA;SACf;aAAM,IAAI,EAAE,KAAK,EAAE,EAAE;YACpB,4CAA4C;YAC5C,oDAAoD;YACpD,iDAAiD;YACjD,wBAAwB;YACxB,OAAO,EAAE,KAAK,EAAE,GAAG,CAAC,IAAI,IAAI,CAAC,EAAE,CAAC,KAAK,EAAE,CAAA;YAEvC,qBAAqB;SACtB;aAAM;YACL,yBAAyB;YACzB,MAAM,IAAI,KAAK,CAAC,MAAM,CAAC,CAAA;SACxB;QACD,oBAAoB;IACtB,CAAC;IAED,WAAW;QACT,OAAO,WAAW,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,OAAO,CAAC,CAAA;IAChD,CAAC;IAED,KAAK,CACH,OAAe,EACf,KAAuB;QAEvB,kBAAkB,CAAC,OAAO,CAAC,CAAA;QAE3B,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAE5B,YAAY;QACZ,IAAI,OAAO,KAAK,IAAI,EAAE;YACpB,IAAI,CAAC,OAAO,CAAC,UAAU;gBAAE,OAAO,QAAQ,CAAA;;gBACnC,OAAO,GAAG,GAAG,CAAA;SACnB;QACD,IAAI,OAAO,KAAK,EAAE;YAAE,OAAO,EAAE,CAAA;QAE7B,uDAAuD;QACvD,0DAA0D;QAC1D,IAAI,CAA0B,CAAA;QAC9B,IAAI,QAAQ,GAAoC,IAAI,CAAA;QACpD,IAAI,KAAK,KAAK,QAAQ,EAAE;YACtB,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC,EAAE;gBAC/B,QAAQ,GAAG,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,WAAW,CAAC,CAAC,CAAC,QAAQ,CAAA;aAChD;iBAAM,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,KAAK,CAAC,YAAY,CAAC,CAAC,EAAE;gBAC5C,QAAQ,GAAG,CACT,OAAO,CAAC,MAAM;oBACZ,CAAC,CAAC,OAAO,CAAC,GAAG;wBACX,CAAC,CAAC,uBAAuB;wBACzB,CAAC,CAAC,oBAAoB;oBACxB,CAAC,CAAC,OAAO,CAAC,GAAG;wBACb,CAAC,CAAC,iBAAiB;wBACnB,CAAC,CAAC,cAAc,CACnB,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAA;aACR;iBAAM,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,KAAK,CAAC,QAAQ,CAAC,CAAC,EAAE;gBACxC,QAAQ,GAAG,CACT,OAAO,CAAC,MAAM;oBACZ,CAAC,CAAC,OAAO,CAAC,GAAG;wBACX,CAAC,CAAC,mBAAmB;wBACrB,CAAC,CAAC,gBAAgB;oBACpB,CAAC,CAAC,OAAO,CAAC,GAAG;wBACb,CAAC,CAAC,aAAa;wBACf,CAAC,CAAC,UAAU,CACf,CAAC,CAAC,CAAC,CAAA;aACL;iBAAM,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,KAAK,CAAC,aAAa,CAAC,CAAC,EAAE;gBAC7C,QAAQ,GAAG,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,kBAAkB,CAAC,CAAC,CAAC,eAAe,CAAA;aAC9D;iBAAM,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,KAAK,CAAC,SAAS,CAAC,CAAC,EAAE;gBACzC,QAAQ,GAAG,WAAW,CAAA;aACvB;SACF;QAED,IAAI,EAAE,GAAG,EAAE,CAAA;QACX,IAAI,QAAQ,GAAG,KAAK,CAAA;QACpB,IAAI,QAAQ,GAAG,KAAK,CAAA;QACpB,4BAA4B;QAC5B,MAAM,gBAAgB,GAAuB,EAAE,CAAA;QAC/C,MAAM,aAAa,GAA+B,EAAE,CAAA;QACpD,IAAI,SAAS,GAAsB,KAAK,CAAA;QACxC,IAAI,OAAO,GAAG,KAAK,CAAA;QACnB,IAAI,YAAY,GAAG,CAAC,CAAC,CAAA;QACrB,IAAI,UAAU,GAAG,CAAC,CAAC,CAAA;QACnB,IAAI,EAAU,CAAA;QACd,IAAI,EAAgC,CAAA;QACpC,IAAI,EAAkB,CAAA;QACtB,2DAA2D;QAC3D,yDAAyD;QACzD,oDAAoD;QACpD,IAAI,cAAc,GAAG,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,CAAA;QAC9C,IAAI,cAAc,GAAG,OAAO,CAAC,GAAG,IAAI,cAAc,CAAA;QAClD,MAAM,YAAY,GAAG,GAAG,EAAE,CACxB,cAAc;YACZ,CAAC,CAAC,EAAE;YACJ,CAAC,CAAC,cAAc;gBAChB,CAAC,CAAC,gCAAgC;gBAClC,CAAC,CAAC,SAAS,CAAA;QACf,MAAM,eAAe,GAAG,CAAC,CAAS,EAAE,EAAE,CACpC,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG;YACjB,CAAC,CAAC,EAAE;YACJ,CAAC,CAAC,OAAO,CAAC,GAAG;gBACb,CAAC,CAAC,gCAAgC;gBAClC,CAAC,CAAC,SAAS,CAAA;QAEf,MAAM,cAAc,GAAG,GAAG,EAAE;YAC1B,IAAI,SAAS,EAAE;gBACb,uCAAuC;gBACvC,qCAAqC;gBACrC,QAAQ,SAAS,EAAE;oBACjB,KAAK,GAAG;wBACN,EAAE,IAAI,IAAI,CAAA;wBACV,QAAQ,GAAG,IAAI,CAAA;wBACf,MAAK;oBACP,KAAK,GAAG;wBACN,EAAE,IAAI,KAAK,CAAA;wBACX,QAAQ,GAAG,IAAI,CAAA;wBACf,MAAK;oBACP;wBACE,EAAE,IAAI,IAAI,GAAG,SAAS,CAAA;wBACtB,MAAK;iBACR;gBACD,IAAI,CAAC,KAAK,CAAC,sBAAsB,EAAE,SAAS,EAAE,EAAE,CAAC,CAAA;gBACjD,SAAS,GAAG,KAAK,CAAA;aAClB;QACH,CAAC,CAAA;QAED,KACE,IAAI,CAAC,GAAG,CAAC,EAAE,CAAS,EACpB,
CAAC,GAAG,OAAO,CAAC,MAAM,IAAI,CAAC,CAAC,GAAG,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,EAC7C,CAAC,EAAE,EACH;YACA,IAAI,CAAC,KAAK,CAAC,cAAc,EAAE,OAAO,EAAE,CAAC,EAAE,EAAE,EAAE,CAAC,CAAC,CAAA;YAE7C,kCAAkC;YAClC,IAAI,QAAQ,EAAE;gBACZ,wCAAwC;gBACxC,wBAAwB;gBACxB,qBAAqB;gBACrB,IAAI,CAAC,KAAK,GAAG,EAAE;oBACb,OAAO,KAAK,CAAA;iBACb;gBACD,oBAAoB;gBAEpB,IAAI,UAAU,CAAC,CAAC,CAAC,EAAE;oBACjB,EAAE,IAAI,IAAI,CAAA;iBACX;gBACD,EAAE,IAAI,CAAC,CAAA;gBACP,QAAQ,GAAG,KAAK,CAAA;gBAChB,SAAQ;aACT;YAED,QAAQ,CAAC,EAAE;gBACT,uCAAuC;gBACvC,qBAAqB;gBACrB,KAAK,GAAG,CAAC,CAAC;oBACR,OAAO,KAAK,CAAA;iBACb;gBACD,oBAAoB;gBAEpB,KAAK,IAAI;oBACP,IAAI,OAAO,IAAI,OAAO,CAAC,MAAM,CAAC,CAAC,GAAG,CAAC,CAAC,KAAK,GAAG,EAAE;wBAC5C,EAAE,IAAI,CAAC,CAAA;wBACP,SAAQ;qBACT;oBAED,cAAc,EAAE,CAAA;oBAChB,QAAQ,GAAG,IAAI,CAAA;oBACf,SAAQ;gBAEV,+BAA+B;gBAC/B,2BAA2B;gBAC3B,KAAK,GAAG,CAAC;gBACT,KAAK,GAAG,CAAC;gBACT,KAAK,GAAG,CAAC;gBACT,KAAK,GAAG,CAAC;gBACT,KAAK,GAAG;oBACN,IAAI,CAAC,KAAK,CAAC,4BAA4B,EAAE,OAAO,EAAE,CAAC,EAAE,EAAE,EAAE,CAAC,CAAC,CAAA;oBAE3D,wDAAwD;oBACxD,qCAAqC;oBACrC,IAAI,OAAO,EAAE;wBACX,IAAI,CAAC,KAAK,CAAC,YAAY,CAAC,CAAA;wBACxB,IAAI,CAAC,KAAK,GAAG,IAAI,CAAC,KAAK,UAAU,GAAG,CAAC;4BAAE,CAAC,GAAG,GAAG,CAAA;wBAC9C,EAAE,IAAI,CAAC,CAAA;wBACP,SAAQ;qBACT;oBAED,gDAAgD;oBAChD,mDAAmD;oBACnD,oDAAoD;oBACpD,IAAI,CAAC,KAAK,CAAC,wBAAwB,EAAE,SAAS,CAAC,CAAA;oBAC/C,cAAc,EAAE,CAAA;oBAChB,SAAS,GAAG,CAAC,CAAA;oBACb,0DAA0D;oBAC1D,+DAA+D;oBAC/D,yBAAyB;oBACzB,IAAI,OAAO,CAAC,KAAK;wBAAE,cAAc,EAAE,CAAA;oBACnC,SAAQ;gBAEV,KAAK,GAAG,CAAC,CAAC;oBACR,IAAI,OAAO,EAAE;wBACX,EAAE,IAAI,GAAG,CAAA;wBACT,SAAQ;qBACT;oBAED,IAAI,CAAC,SAAS,EAAE;wBACd,EAAE,IAAI,KAAK,CAAA;wBACX,SAAQ;qBACT;oBAED,MAAM,OAAO,GAAqB;wBAChC,IAAI,EAAE,SAAS;wBACf,KAAK,EAAE,CAAC,GAAG,CAAC;wBACZ,OAAO,EAAE,EAAE,CAAC,MAAM;wBAClB,IAAI,EAAE,OAAO,CAAC,SAAS,CAAC,CAAC,IAAI;wBAC7B,KAAK,EAAE,OAAO,CAAC,SAAS,CAAC,CAAC,KAAK;qBAChC,CAAA;oBACD,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,EAAE,OAAO,CAAC,CAAA;oBACvC,gBAAgB,CAAC,IAAI,CAAC,OAAO,CAAC,CAAA;oBAC9B,4CAA4C;oBAC5C,EAAE,IAAI,OAAO,CAAC,IAAI,CAAA;oBAClB,sCAAsC;oBACtC,IAAI,OAAO,CAAC,KAAK,KAAK,CAAC,IAAI,OAAO,CAAC,IAAI,KAAK,GAAG,EAAE;wBAC/C,cAAc,GAAG,IAAI,CAAA;wBACrB,EAAE,IAAI,eAAe,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAA;qBAC5C;oBACD,IAAI,CAAC,KAAK,CAAC,cAAc,EAAE,SAAS,EAAE,EAAE,CAAC,CAAA;oBACzC,SAAS,GAAG,KAAK,CAAA;oBACjB,SAAQ;iBACT;gBAED,KAAK,GAAG,CAAC,CAAC;oBACR,MAAM,OAAO,GAAG,gBAAgB,CAAC,gBAAgB,CAAC,MAAM,GAAG,CAAC,CAAC,CAAA;oBAC7D,IAAI,OAAO,IAAI,CAAC,OAAO,EAAE;wBACvB,EAAE,IAAI,KAAK,CAAA;wBACX,SAAQ;qBACT;oBACD,gBAAgB,CAAC,GAAG,EAAE,CAAA;oBAEtB,qBAAqB;oBACrB,cAAc,EAAE,CAAA;oBAChB,QAAQ,GAAG,IAAI,CAAA;oBACf,EAAE,GAAG,OAAO,CAAA;oBACZ,8BAA8B;oBAC9B,qCAAqC;oBACrC,EAAE,IAAI,EAAE,CAAC,KAAK,CAAA;oBACd,IAAI,EAAE,CAAC,IAAI,KAAK,GAAG,EAAE;wBACnB,aAAa,CAAC,IAAI,CAAC,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,EAAE,KAAK,EAAE,EAAE,CAAC,MAAM,EAAE,CAAC,CAAC,CAAA;qBAC5D;oBACD,SAAQ;iBACT;gBAED,KAAK,GAAG,CAAC,CAAC;oBACR,MAAM,OAAO,GAAG,gBAAgB,CAAC,gBAAgB,CAAC,MAAM,GAAG,CAAC,CAAC,CAAA;oBAC7D,IAAI,OAAO,IAAI,CAAC,OAAO,EAAE;wBACvB,EAAE,IAAI,KAAK,CAAA;wBACX,SAAQ;qBACT;oBAED,cAAc,EAAE,CAAA;oBAChB,EAAE,IAAI,GAAG,CAAA;oBACT,wCAAwC;oBACxC,IAAI,OAAO,CAAC,KAAK,KAAK,CAAC,IAAI,OAAO,CAAC,IAAI,KAAK,GAAG,EAAE;wBAC/C,cAAc,GAAG,IAAI,CAAA;wBACrB,EAAE,IAAI,eAAe,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAA;qBAC5C;oBACD,SAAQ;iBACT;gBAED,+CAA+C;gBAC/C,KAAK,GAAG;oBACN,+CAA+C;oBAC/C,cAAc,EAAE,CAAA;oBAEhB,IAAI,OAAO,EAAE;wBACX,EAAE,IAAI,IAAI,GAAG,CAAC,CAAA;wBACd,SAAQ;qBACT;oBAED,OAAO,GAAG,IAAI,CAAA;oBACd,UAAU,GAAG,CAAC,CAAA;oBACd,YAAY,GAAG,EAAE,CAAC,MAAM,CAAA;oBACxB,EAAE,IAAI,CAAC,CAAA;oBACP,SAAQ;gBAEV,KAAK,GAAG;oBACN,0CA
A0C;oBAC1C,mCAAmC;oBACnC,qCAAqC;oBACrC,0CAA0C;oBAC1C,IAAI,CAAC,KAAK,UAAU,GAAG,CAAC,IAAI,CAAC,OAAO,EAAE;wBACpC,EAAE,IAAI,IAAI,GAAG,CAAC,CAAA;wBACd,SAAQ;qBACT;oBAED,sDAAsD;oBACtD,oDAAoD;oBACpD,qDAAqD;oBACrD,4BAA4B;oBAC5B,sDAAsD;oBACtD,wDAAwD;oBACxD,kDAAkD;oBAClD,EAAE,GAAG,OAAO,CAAC,SAAS,CAAC,UAAU,GAAG,CAAC,EAAE,CAAC,CAAC,CAAA;oBACzC,IAAI;wBACF,MAAM,CAAC,GAAG,GAAG,YAAY,CAAC,YAAY,CAAC,EAAE,CAAC,CAAC,GAAG,GAAG,CAAC,CAAA;wBAClD,mCAAmC;wBACnC,EAAE,IAAI,CAAC,CAAA;qBACR;oBAAC,OAAO,EAAE,EAAE;wBACX,4DAA4D;wBAC5D,6CAA6C;wBAC7C,EAAE,GAAG,EAAE,CAAC,SAAS,CAAC,CAAC,EAAE,YAAY,CAAC,GAAG,QAAQ,CAAA,CAAC,qBAAqB;qBACpE;oBACD,QAAQ,GAAG,IAAI,CAAA;oBACf,OAAO,GAAG,KAAK,CAAA;oBACf,SAAQ;gBAEV;oBACE,8CAA8C;oBAC9C,cAAc,EAAE,CAAA;oBAEhB,IAAI,UAAU,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,KAAK,GAAG,IAAI,OAAO,CAAC,EAAE;wBAC5C,EAAE,IAAI,IAAI,CAAA;qBACX;oBAED,EAAE,IAAI,CAAC,CAAA;oBACP,MAAK;aACR,CAAC,SAAS;SACZ,CAAC,MAAM;QAER,8CAA8C;QAC9C,yCAAyC;QACzC,IAAI,OAAO,EAAE;YACX,4CAA4C;YAC5C,+CAA+C;YAC/C,qDAAqD;YACrD,gDAAgD;YAChD,EAAE,GAAG,OAAO,CAAC,KAAK,CAAC,UAAU,GAAG,CAAC,CAAC,CAAA;YAClC,EAAE,GAAG,IAAI,CAAC,KAAK,CAAC,EAAE,EAAE,QAAQ,CAAmB,CAAA;YAC/C,EAAE,GAAG,EAAE,CAAC,SAAS,CAAC,CAAC,EAAE,YAAY,CAAC,GAAG,KAAK,GAAG,EAAE,CAAC,CAAC,CAAC,CAAA;YAClD,QAAQ,GAAG,QAAQ,IAAI,EAAE,CAAC,CAAC,CAAC,CAAA;SAC7B;QAED,uDAAuD;QACvD,kBAAkB;QAClB,kEAAkE;QAClE,wEAAwE;QACxE,mEAAmE;QACnE,qCAAqC;QACrC,KAAK,EAAE,GAAG,gBAAgB,CAAC,GAAG,EAAE,EAAE,EAAE,EAAE,EAAE,GAAG,gBAAgB,CAAC,GAAG,EAAE,EAAE;YACjE,IAAI,IAAY,CAAA;YAChB,IAAI,GAAG,EAAE,CAAC,KAAK,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,IAAI,CAAC,MAAM,CAAC,CAAA;YAC5C,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,cAAc,EAAE,EAAE,EAAE,EAAE,CAAC,CAAA;YAChD,+DAA+D;YAC/D,IAAI,GAAG,IAAI,CAAC,OAAO,CAAC,2BAA2B,EAAE,CAAC,CAAC,EAAE,EAAE,EAAE,EAAE,EAAE,EAAE;gBAC7D,IAAI,CAAC,EAAE,EAAE;oBACP,6CAA6C;oBAC7C,EAAE,GAAG,IAAI,CAAA;oBACT,yBAAyB;oBACzB,qBAAqB;iBACtB;gBACD,oBAAoB;gBAEpB,iEAAiE;gBACjE,mEAAmE;gBACnE,qEAAqE;gBACrE,yDAAyD;gBACzD,EAAE;gBACF,wCAAwC;gBACxC,OAAO,EAAE,GAAG,EAAE,GAAG,EAAE,GAAG,GAAG,CAAA;YAC3B,CAAC,CAAC,CAAA;YAEF,IAAI,CAAC,KAAK,CAAC,gBAAgB,EAAE,IAAI,EAAE,IAAI,EAAE,EAAE,EAAE,EAAE,CAAC,CAAA;YAChD,MAAM,CAAC,GACL,EAAE,CAAC,IAAI,KAAK,GAAG,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,IAAI,KAAK,GAAG,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,IAAI,GAAG,EAAE,CAAC,IAAI,CAAA;YAEnE,QAAQ,GAAG,IAAI,CAAA;YACf,EAAE,GAAG,EAAE,CAAC,KAAK,CAAC,CAAC,EAAE,EAAE,CAAC,OAAO,CAAC,GAAG,CAAC,GAAG,KAAK,GAAG,IAAI,CAAA;SAChD;QAED,2DAA2D;QAC3D,cAAc,EAAE,CAAA;QAChB,IAAI,QAAQ,EAAE;YACZ,cAAc;YACd,EAAE,IAAI,MAAM,CAAA;SACb;QAED,2DAA2D;QAC3D,iDAAiD;QACjD,MAAM,eAAe,GAAG,kBAAkB,CAAC,EAAE,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAA;QAExD,wDAAwD;QACxD,4DAA4D;QAC5D,yDAAyD;QACzD,0DAA0D;QAC1D,eAAe;QACf,KAAK,IAAI,CAAC,GAAG,aAAa,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC,GAAG,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE;YAClD,MAAM,EAAE,GAAG,aAAa,CAAC,CAAC,CAAC,CAAA;YAE3B,MAAM,QAAQ,GAAG,EAAE,CAAC,KAAK,CAAC,CAAC,EAAE,EAAE,CAAC,OAAO,CAAC,CAAA;YACxC,MAAM,OAAO,GAAG,EAAE,CAAC,KAAK,CAAC,EAAE,CAAC,OAAO,EAAE,EAAE,CAAC,KAAK,GAAG,CAAC,CAAC,CAAA;YAClD,IAAI,OAAO,GAAG,EAAE,CAAC,KAAK,CAAC,EAAE,CAAC,KAAK,CAAC,CAAA;YAChC,MAAM,MAAM,GAAG,EAAE,CAAC,KAAK,CAAC,EAAE,CAAC,KAAK,GAAG,CAAC,EAAE,EAAE,CAAC,KAAK,CAAC,GAAG,OAAO,CAAA;YAEzD,gEAAgE;YAChE,wEAAwE;YACxE,+BAA+B;YAC/B,MAAM,iBAAiB,GAAG,QAAQ,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,MAAM,CAAA;YACpD,MAAM,gBAAgB,GAAG,QAAQ,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,MAAM,GAAG,iBAAiB,CAAA;YACvE,IAAI,UAAU,GAAG,OAAO,CAAA;YACxB,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,gBAAgB,EAAE,CAAC,EAAE,EAAE;gBACzC,UAAU,GAAG,UAAU,CAAC,OAAO,CAAC,UAAU,EAAE,EAAE,CAAC,CAAA;aAChD;YACD,OAAO,GAAG,UAAU,CAAA;YAEpB,MAAM,MAAM,GAAG,OAAO,KAAK,EAAE,IAAI,KAAK,KA
AK,QAAQ,CAAC,CAAC,CAAC,WAAW,CAAC,CAAC,CAAC,EAAE,CAAA;YAEtE,EAAE,GAAG,QAAQ,GAAG,OAAO,GAAG,OAAO,GAAG,MAAM,GAAG,MAAM,CAAA;SACpD;QAED,+DAA+D;QAC/D,+CAA+C;QAC/C,oDAAoD;QACpD,IAAI,EAAE,KAAK,EAAE,IAAI,QAAQ,EAAE;YACzB,EAAE,GAAG,OAAO,GAAG,EAAE,CAAA;SAClB;QAED,IAAI,eAAe,EAAE;YACnB,EAAE,GAAG,YAAY,EAAE,GAAG,EAAE,CAAA;SACzB;QAED,4CAA4C;QAC5C,IAAI,KAAK,KAAK,QAAQ,EAAE;YACtB,OAAO,CAAC,EAAE,EAAE,QAAQ,CAAC,CAAA;SACtB;QAED,kEAAkE;QAClE,IAAI,OAAO,CAAC,MAAM,IAAI,CAAC,QAAQ,IAAI,CAAC,OAAO,CAAC,eAAe,EAAE;YAC3D,QAAQ,GAAG,OAAO,CAAC,WAAW,EAAE,KAAK,OAAO,CAAC,WAAW,EAAE,CAAA;SAC3D;QAED,2CAA2C;QAC3C,oDAAoD;QACpD,qCAAqC;QACrC,IAAI,CAAC,QAAQ,EAAE;YACb,OAAO,YAAY,CAAC,OAAO,CAAC,CAAA;SAC7B;QAED,MAAM,KAAK,GAAG,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,CAAA;QACvC,IAAI;YACF,MAAM,GAAG,GAAG,QAAQ;gBAClB,CAAC,CAAC;oBACE,KAAK,EAAE,OAAO;oBACd,IAAI,EAAE,EAAE;oBACR,IAAI,EAAE,QAAQ;iBACf;gBACH,CAAC,CAAC;oBACE,KAAK,EAAE,OAAO;oBACd,IAAI,EAAE,EAAE;iBACT,CAAA;YACL,OAAO,MAAM,CAAC,MAAM,CAAC,IAAI,MAAM,CAAC,GAAG,GAAG,EAAE,GAAG,GAAG,EAAE,KAAK,CAAC,EAAE,GAAG,CAAC,CAAA;YAC5D,qBAAqB;SACtB;QAAC,OAAO,EAAE,EAAE;YACX,uBAAuB;YACvB,+DAA+D;YAC/D,+DAA+D;YAC/D,kEAAkE;YAClE,iCAAiC;YACjC,IAAI,CAAC,KAAK,CAAC,gBAAgB,EAAE,EAAE,CAAC,CAAA;YAChC,OAAO,IAAI,MAAM,CAAC,IAAI,CAAC,CAAA;SACxB;QACD,oBAAoB;IACtB,CAAC;IAED,MAAM;QACJ,IAAI,IAAI,CAAC,MAAM,IAAI,IAAI,CAAC,MAAM,KAAK,KAAK;YAAE,OAAO,IAAI,CAAC,MAAM,CAAA;QAE5D,mDAAmD;QACnD,4BAA4B;QAC5B,EAAE;QACF,wDAAwD;QACxD,yDAAyD;QACzD,2CAA2C;QAC3C,MAAM,GAAG,GAAG,IAAI,CAAC,GAAG,CAAA;QAEpB,IAAI,CAAC,GAAG,CAAC,MAAM,EAAE;YACf,IAAI,CAAC,MAAM,GAAG,KAAK,CAAA;YACnB,OAAO,IAAI,CAAC,MAAM,CAAA;SACnB;QACD,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAE5B,MAAM,OAAO,GAAG,OAAO,CAAC,UAAU;YAChC,CAAC,CAAC,IAAI;YACN,CAAC,CAAC,OAAO,CAAC,GAAG;gBACb,CAAC,CAAC,UAAU;gBACZ,CAAC,CAAC,YAAY,CAAA;QAChB,MAAM,KAAK,GAAG,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,CAAA;QAEvC,kCAAkC;QAClC,kDAAkD;QAClD,sEAAsE;QACtE,iDAAiD;QACjD,8DAA8D;QAC9D,mCAAmC;QACnC,IAAI,EAAE,GAAG,GAAG;aACT,GAAG,CAAC,OAAO,CAAC,EAAE;YACb,MAAM,EAAE,GAAiC,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,CACvD,OAAO,CAAC,KAAK,QAAQ;gBACnB,CAAC,CAAC,YAAY,CAAC,CAAC,CAAC;gBACjB,CAAC,CAAC,CAAC,KAAK,QAAQ;oBAChB,CAAC,CAAC,QAAQ;oBACV,CAAC,CAAC,CAAC,CAAC,IAAI,CACqB,CAAA;YACjC,EAAE,CAAC,OAAO,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,EAAE;gBAClB,MAAM,IAAI,GAAG,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC,CAAA;gBACtB,MAAM,IAAI,GAAG,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC,CAAA;gBACtB,IAAI,CAAC,KAAK,QAAQ,IAAI,IAAI,KAAK,QAAQ,EAAE;oBACvC,OAAM;iBACP;gBACD,IAAI,IAAI,KAAK,SAAS,EAAE;oBACtB,IAAI,IAAI,KAAK,SAAS,IAAI,IAAI,KAAK,QAAQ,EAAE;wBAC3C,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC,GAAG,SAAS,GAAG,OAAO,GAAG,OAAO,GAAG,IAAI,CAAA;qBACjD;yBAAM;wBACL,EAAE,CAAC,CAAC,CAAC,GAAG,OAAO,CAAA;qBAChB;iBACF;qBAAM,IAAI,IAAI,KAAK,SAAS,EAAE;oBAC7B,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC,GAAG,IAAI,GAAG,SAAS,GAAG,OAAO,GAAG,IAAI,CAAA;iBAC9C;qBAAM,IAAI,IAAI,KAAK,QAAQ,EAAE;oBAC5B,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC,GAAG,IAAI,GAAG,YAAY,GAAG,OAAO,GAAG,MAAM,GAAG,IAAI,CAAA;oBACzD,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC,GAAG,QAAQ,CAAA;iBACrB;YACH,CAAC,CAAC,CAAA;YACF,OAAO,EAAE,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,KAAK,QAAQ,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,CAAA;QACjD,CAAC,CAAC;aACD,IAAI,CAAC,GAAG,CAAC,CAAA;QAEZ,4BAA4B;QAC5B,gDAAgD;QAChD,EAAE,GAAG,MAAM,GAAG,EAAE,GAAG,IAAI,CAAA;QAEvB,gDAAgD;QAChD,IAAI,IAAI,CAAC,MAAM;YAAE,EAAE,GAAG,MAAM,GAAG,EAAE,GAAG,MAAM,CAAA;QAE1C,IAAI;YACF,IAAI,CAAC,MAAM,GAAG,IAAI,MAAM,CAAC,EAAE,EAAE,KAAK,CAAC,CAAA;YACnC,qBAAqB;SACtB;QAAC,OAAO,EAAE,EAAE;YACX,uBAAuB;YACvB,IAAI,CAAC,MAAM,GAAG,KAAK,CAAA;SACpB;QACD,oBAAoB;QACpB,OAAO,IAAI,CAAC,MAAM,CAAA;IACpB,CAAC;IAED,UAAU,CAAC,CAAS;QAClB,mDAAmD;QACnD,6DAA6D;QAC
7D,8CAA8C;QAC9C,0CAA0C;QAC1C,IAAI,IAAI,CAAC,uBAAuB,EAAE;YAChC,OAAO,CAAC,CAAC,KAAK,CAAC,GAAG,CAAC,CAAA;SACpB;aAAM,IAAI,SAAS,IAAI,aAAa,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE;YAC7C,sCAAsC;YACtC,OAAO,CAAC,EAAE,EAAE,GAAG,CAAC,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC,CAAA;SAC/B;aAAM;YACL,OAAO,CAAC,CAAC,KAAK,CAAC,KAAK,CAAC,CAAA;SACtB;IACH,CAAC;IAED,KAAK,CAAC,CAAS,EAAE,OAAO,GAAG,IAAI,CAAC,OAAO;QACrC,IAAI,CAAC,KAAK,CAAC,OAAO,EAAE,CAAC,EAAE,IAAI,CAAC,OAAO,CAAC,CAAA;QACpC,8CAA8C;QAC9C,iBAAiB;QACjB,IAAI,IAAI,CAAC,OAAO,EAAE;YAChB,OAAO,KAAK,CAAA;SACb;QACD,IAAI,IAAI,CAAC,KAAK,EAAE;YACd,OAAO,CAAC,KAAK,EAAE,CAAA;SAChB;QAED,IAAI,CAAC,KAAK,GAAG,IAAI,OAAO,EAAE;YACxB,OAAO,IAAI,CAAA;SACZ;QAED,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAA;QAE5B,gCAAgC;QAChC,IAAI,IAAI,CAAC,GAAG,KAAK,GAAG,EAAE;YACpB,CAAC,GAAG,CAAC,CAAC,KAAK,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,CAAA;SAChC;QAED,6CAA6C;QAC7C,MAAM,EAAE,GAAG,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC,CAAA;QAC7B,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,OAAO,EAAE,EAAE,CAAC,CAAA;QAErC,0DAA0D;QAC1D,2DAA2D;QAC3D,mCAAmC;QACnC,uCAAuC;QAEvC,MAAM,GAAG,GAAG,IAAI,CAAC,GAAG,CAAA;QACpB,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,KAAK,EAAE,GAAG,CAAC,CAAA;QAEpC,0EAA0E;QAC1E,IAAI,QAAQ,GAAW,EAAE,CAAC,EAAE,CAAC,MAAM,GAAG,CAAC,CAAC,CAAA;QACxC,IAAI,CAAC,QAAQ,EAAE;YACb,KAAK,IAAI,CAAC,GAAG,EAAE,CAAC,MAAM,GAAG,CAAC,EAAE,CAAC,QAAQ,IAAI,CAAC,IAAI,CAAC,EAAE,CAAC,EAAE,EAAE;gBACpD,QAAQ,GAAG,EAAE,CAAC,CAAC,CAAC,CAAA;aACjB;SACF;QAED,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,GAAG,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;YACnC,MAAM,OAAO,GAAG,GAAG,CAAC,CAAC,CAAC,CAAA;YACtB,IAAI,IAAI,GAAG,EAAE,CAAA;YACb,IAAI,OAAO,CAAC,SAAS,IAAI,OAAO,CAAC,MAAM,KAAK,CAAC,EAAE;gBAC7C,IAAI,GAAG,CAAC,QAAQ,CAAC,CAAA;aAClB;YACD,MAAM,GAAG,GAAG,IAAI,CAAC,QAAQ,CAAC,IAAI,EAAE,OAAO,EAAE,OAAO,CAAC,CAAA;YACjD,IAAI,GAAG,EAAE;gBACP,IAAI,OAAO,CAAC,UAAU,EAAE;oBACtB,OAAO,IAAI,CAAA;iBACZ;gBACD,OAAO,CAAC,IAAI,CAAC,MAAM,CAAA;aACpB;SACF;QAED,2DAA2D;QAC3D,8BAA8B;QAC9B,IAAI,OAAO,CAAC,UAAU,EAAE;YACtB,OAAO,KAAK,CAAA;SACb;QACD,OAAO,IAAI,CAAC,MAAM,CAAA;IACpB,CAAC;IAED,MAAM,CAAC,QAAQ,CAAC,GAAqB;QACnC,OAAO,SAAS,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAC,SAAS,CAAA;IAC1C,CAAC;CACF;AAED,SAAS,CAAC,SAAS,GAAG,SAAS,CAAA"}
\ No newline at end of file
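
The mapping blob above is machine-generated, but it is what lets tooling relate the compiled dist/mjs/index.js back to the original TypeScript. A hedged way to see it in action (the deliberate type error exists only for illustration):

    // Run with: node --enable-source-maps example.mjs
    import { Minimatch } from 'minimatch'

    try {
      // assertValidPattern() in the compiled file rejects non-string patterns.
      new Minimatch({})
    } catch (er) {
      // With --enable-source-maps, stack frames are rewritten to the
      // src/index.ts positions recorded in this .map file.
      console.error(er.stack)
    }
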
diff --git a/deps/npm/node_modules/minimatch/dist/mjs/package.json b/deps/npm/node_modules/minimatch/dist/mjs/package.json
new file mode 100644
index 00000000000000..3dbc1ca591c055
--- /dev/null
+++ b/deps/npm/node_modules/minimatch/dist/mjs/package.json
@@ -0,0 +1,3 @@
+{
+ "type": "module"
+}
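
This one-field package.json is the usual dual-build trick: Node picks CommonJS vs ESM per file from the nearest package.json, so everything under dist/mjs is parsed as ESM while dist/cjs inherits the CommonJS default from the package root. A small sketch that just reads the marker (paths assume the package is installed under node_modules):

    import fs from 'node:fs'

    const nested = JSON.parse(
      fs.readFileSync('node_modules/minimatch/dist/mjs/package.json', 'utf8')
    )
    console.log(nested.type) // 'module' -> *.js under dist/mjs is loaded as ESM

    // The root node_modules/minimatch/package.json has no "type" field,
    // so *.js under dist/cjs keeps the CommonJS default.
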
diff --git a/deps/npm/node_modules/minimatch/package.json b/deps/npm/node_modules/minimatch/package.json
index 8e237d3f15504f..58f289adf75e76 100644
--- a/deps/npm/node_modules/minimatch/package.json
+++ b/deps/npm/node_modules/minimatch/package.json
@@ -2,18 +2,54 @@
"author": "Isaac Z. Schlueter (http://blog.izs.me)",
"name": "minimatch",
"description": "a glob matcher in javascript",
- "version": "5.1.1",
+ "version": "6.2.0",
"repository": {
"type": "git",
"url": "git://github.com/isaacs/minimatch.git"
},
- "main": "minimatch.js",
+ "main": "./dist/cjs/index-cjs.js",
+ "module": "./dist/mjs/index.js",
+ "types": "./dist/cjs/index.d.ts",
+ "exports": {
+ ".": {
+ "import": {
+ "types": "./dist/mjs/index.d.ts",
+ "default": "./dist/mjs/index.js"
+ },
+ "require": {
+ "types": "./dist/cjs/index-cjs.d.ts",
+ "default": "./dist/cjs/index-cjs.js"
+ }
+ }
+ },
+ "files": [
+ "dist"
+ ],
"scripts": {
- "test": "tap",
- "snap": "tap",
"preversion": "npm test",
"postversion": "npm publish",
- "prepublishOnly": "git push origin --follow-tags"
+ "prepublishOnly": "git push origin --follow-tags",
+ "preprepare": "rm -rf dist",
+ "prepare": "tsc -p tsconfig.json && tsc -p tsconfig-esm.json",
+ "postprepare": "bash fixup.sh",
+ "pretest": "npm run prepare",
+ "presnap": "npm run prepare",
+ "test": "c8 tap",
+ "snap": "c8 tap",
+ "format": "prettier --write . --loglevel warn",
+ "benchmark": "node benchmark/index.js",
+ "typedoc": "typedoc --tsconfig tsconfig-esm.json ./src/*.ts"
+ },
+ "prettier": {
+ "semi": false,
+ "printWidth": 80,
+ "tabWidth": 2,
+ "useTabs": false,
+ "singleQuote": true,
+ "jsxSingleQuote": false,
+ "bracketSameLine": true,
+ "arrowParens": "avoid",
+ "endOfLine": "lf"
},
"engines": {
"node": ">=10"
@@ -22,11 +58,29 @@
"brace-expansion": "^2.0.1"
},
"devDependencies": {
- "tap": "^16.3.2"
+ "@types/brace-expansion": "^1.1.0",
+ "@types/node": "^18.11.9",
+ "@types/tap": "^15.0.7",
+ "c8": "^7.12.0",
+ "eslint-config-prettier": "^8.6.0",
+ "mkdirp": "1",
+ "prettier": "^2.8.2",
+ "tap": "^16.3.3",
+ "ts-node": "^10.9.1",
+ "typedoc": "^0.23.21",
+ "typescript": "^4.9.3"
},
- "license": "ISC",
- "files": [
- "minimatch.js",
- "lib"
- ]
+ "tap": {
+ "coverage": false,
+ "node-arg": [
+ "--no-warnings",
+ "--loader",
+ "ts-node/esm"
+ ],
+ "ts": false
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/isaacs"
+ },
+ "license": "ISC"
}
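
The conditional `exports` map added here is what routes consumers to the matching build, while `main`, `module`, and `types` remain for older resolvers. A hedged consumer-side sketch (file names as listed in the map; the CommonJS form is left as a comment because a single file cannot use both loaders):

    // From an ES module, the "import" condition resolves to dist/mjs/index.js:
    import { minimatch } from 'minimatch'
    console.log(minimatch('foo.js', '*.js')) // true

    // From a CommonJS file, the "require" condition resolves to
    // dist/cjs/index-cjs.js instead:
    //   const minimatch = require('minimatch')
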
diff --git a/deps/npm/node_modules/minipass/LICENSE b/deps/npm/node_modules/minipass/LICENSE
index bf1dece2e1f122..97f8e32ed82e4c 100644
--- a/deps/npm/node_modules/minipass/LICENSE
+++ b/deps/npm/node_modules/minipass/LICENSE
@@ -1,6 +1,6 @@
The ISC License
-Copyright (c) 2017-2022 npm, Inc., Isaac Z. Schlueter, and Contributors
+Copyright (c) 2017-2023 npm, Inc., Isaac Z. Schlueter, and Contributors
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
diff --git a/deps/npm/node_modules/minipass/index.d.ts b/deps/npm/node_modules/minipass/index.d.ts
index f68ce8a259c471..93a06eb357109b 100644
--- a/deps/npm/node_modules/minipass/index.d.ts
+++ b/deps/npm/node_modules/minipass/index.d.ts
@@ -140,8 +140,8 @@ declare class Minipass<
listener: () => any
): this
- [Symbol.iterator](): Iterator
- [Symbol.asyncIterator](): AsyncIterator
+ [Symbol.iterator](): Generator
+ [Symbol.asyncIterator](): AsyncGenerator
}
export = Minipass
diff --git a/deps/npm/node_modules/minipass/index.js b/deps/npm/node_modules/minipass/index.js
index d5003ed9a57546..5d45de8d39f76e 100644
--- a/deps/npm/node_modules/minipass/index.js
+++ b/deps/npm/node_modules/minipass/index.js
@@ -1,8 +1,11 @@
'use strict'
-const proc = typeof process === 'object' && process ? process : {
- stdout: null,
- stderr: null,
-}
+const proc =
+ typeof process === 'object' && process
+ ? process
+ : {
+ stdout: null,
+ stderr: null,
+ }
const EE = require('events')
const Stream = require('stream')
const SD = require('string_decoder').StringDecoder
@@ -27,7 +30,10 @@ const BUFFERLENGTH = Symbol('bufferLength')
const BUFFERPUSH = Symbol('bufferPush')
const BUFFERSHIFT = Symbol('bufferShift')
const OBJECTMODE = Symbol('objectMode')
+// internal event when stream is destroyed
const DESTROYED = Symbol('destroyed')
+// internal event when stream has an error
+const ERROR = Symbol('error')
const EMITDATA = Symbol('emitData')
const EMITEND = Symbol('emitEnd')
const EMITEND2 = Symbol('emitEnd2')
@@ -36,54 +42,51 @@ const ASYNC = Symbol('async')
const defer = fn => Promise.resolve().then(fn)
// TODO remove when Node v8 support drops
-const doIter = global._MP_NO_ITERATOR_SYMBOLS_ !== '1'
-const ASYNCITERATOR = doIter && Symbol.asyncIterator
- || Symbol('asyncIterator not implemented')
-const ITERATOR = doIter && Symbol.iterator
- || Symbol('iterator not implemented')
+const doIter = global._MP_NO_ITERATOR_SYMBOLS_ !== '1'
+const ASYNCITERATOR =
+ (doIter && Symbol.asyncIterator) || Symbol('asyncIterator not implemented')
+const ITERATOR =
+ (doIter && Symbol.iterator) || Symbol('iterator not implemented')
// events that mean 'the stream is over'
// these are treated specially, and re-emitted
// if they are listened for after emitting.
-const isEndish = ev =>
- ev === 'end' ||
- ev === 'finish' ||
- ev === 'prefinish'
+const isEndish = ev => ev === 'end' || ev === 'finish' || ev === 'prefinish'
-const isArrayBuffer = b => b instanceof ArrayBuffer ||
- typeof b === 'object' &&
- b.constructor &&
- b.constructor.name === 'ArrayBuffer' &&
- b.byteLength >= 0
+const isArrayBuffer = b =>
+ b instanceof ArrayBuffer ||
+ (typeof b === 'object' &&
+ b.constructor &&
+ b.constructor.name === 'ArrayBuffer' &&
+ b.byteLength >= 0)
const isArrayBufferView = b => !Buffer.isBuffer(b) && ArrayBuffer.isView(b)
class Pipe {
- constructor (src, dest, opts) {
+ constructor(src, dest, opts) {
this.src = src
this.dest = dest
this.opts = opts
this.ondrain = () => src[RESUME]()
dest.on('drain', this.ondrain)
}
- unpipe () {
+ unpipe() {
this.dest.removeListener('drain', this.ondrain)
}
// istanbul ignore next - only here for the prototype
- proxyErrors () {}
- end () {
+ proxyErrors() {}
+ end() {
this.unpipe()
- if (this.opts.end)
- this.dest.end()
+ if (this.opts.end) this.dest.end()
}
}
class PipeProxyErrors extends Pipe {
- unpipe () {
+ unpipe() {
this.src.removeListener('error', this.proxyErrors)
super.unpipe()
}
- constructor (src, dest, opts) {
+ constructor(src, dest, opts) {
super(src, dest, opts)
this.proxyErrors = er => dest.emit('error', er)
src.on('error', this.proxyErrors)
@@ -91,21 +94,18 @@ class PipeProxyErrors extends Pipe {
}
module.exports = class Minipass extends Stream {
- constructor (options) {
+ constructor(options) {
super()
this[FLOWING] = false
// whether we're explicitly paused
this[PAUSED] = false
this[PIPES] = []
this[BUFFER] = []
- this[OBJECTMODE] = options && options.objectMode || false
- if (this[OBJECTMODE])
- this[ENCODING] = null
- else
- this[ENCODING] = options && options.encoding || null
- if (this[ENCODING] === 'buffer')
- this[ENCODING] = null
- this[ASYNC] = options && !!options.async || false
+ this[OBJECTMODE] = (options && options.objectMode) || false
+ if (this[OBJECTMODE]) this[ENCODING] = null
+ else this[ENCODING] = (options && options.encoding) || null
+ if (this[ENCODING] === 'buffer') this[ENCODING] = null
+ this[ASYNC] = (options && !!options.async) || false
this[DECODER] = this[ENCODING] ? new SD(this[ENCODING]) : null
this[EOF] = false
this[EMITTED_END] = false
@@ -124,15 +124,21 @@ module.exports = class Minipass extends Stream {
}
}
- get bufferLength () { return this[BUFFERLENGTH] }
+ get bufferLength() {
+ return this[BUFFERLENGTH]
+ }
- get encoding () { return this[ENCODING] }
- set encoding (enc) {
- if (this[OBJECTMODE])
- throw new Error('cannot set encoding in objectMode')
+ get encoding() {
+ return this[ENCODING]
+ }
+ set encoding(enc) {
+ if (this[OBJECTMODE]) throw new Error('cannot set encoding in objectMode')
- if (this[ENCODING] && enc !== this[ENCODING] &&
- (this[DECODER] && this[DECODER].lastNeed || this[BUFFERLENGTH]))
+ if (
+ this[ENCODING] &&
+ enc !== this[ENCODING] &&
+ ((this[DECODER] && this[DECODER].lastNeed) || this[BUFFERLENGTH])
+ )
throw new Error('cannot change encoding')
if (this[ENCODING] !== enc) {
@@ -144,33 +150,41 @@ module.exports = class Minipass extends Stream {
this[ENCODING] = enc
}
- setEncoding (enc) {
+ setEncoding(enc) {
this.encoding = enc
}
- get objectMode () { return this[OBJECTMODE] }
- set objectMode (om) { this[OBJECTMODE] = this[OBJECTMODE] || !!om }
+ get objectMode() {
+ return this[OBJECTMODE]
+ }
+ set objectMode(om) {
+ this[OBJECTMODE] = this[OBJECTMODE] || !!om
+ }
- get ['async'] () { return this[ASYNC] }
- set ['async'] (a) { this[ASYNC] = this[ASYNC] || !!a }
+ get ['async']() {
+ return this[ASYNC]
+ }
+ set ['async'](a) {
+ this[ASYNC] = this[ASYNC] || !!a
+ }
- write (chunk, encoding, cb) {
- if (this[EOF])
- throw new Error('write after end')
+ write(chunk, encoding, cb) {
+ if (this[EOF]) throw new Error('write after end')
if (this[DESTROYED]) {
- this.emit('error', Object.assign(
- new Error('Cannot call write after a stream was destroyed'),
- { code: 'ERR_STREAM_DESTROYED' }
- ))
+ this.emit(
+ 'error',
+ Object.assign(
+ new Error('Cannot call write after a stream was destroyed'),
+ { code: 'ERR_STREAM_DESTROYED' }
+ )
+ )
return true
}
- if (typeof encoding === 'function')
- cb = encoding, encoding = 'utf8'
+ if (typeof encoding === 'function') (cb = encoding), (encoding = 'utf8')
- if (!encoding)
- encoding = 'utf8'
+ if (!encoding) encoding = 'utf8'
const fn = this[ASYNC] ? defer : f => f()
@@ -181,8 +195,7 @@ module.exports = class Minipass extends Stream {
if (!this[OBJECTMODE] && !Buffer.isBuffer(chunk)) {
if (isArrayBufferView(chunk))
chunk = Buffer.from(chunk.buffer, chunk.byteOffset, chunk.byteLength)
- else if (isArrayBuffer(chunk))
- chunk = Buffer.from(chunk)
+ else if (isArrayBuffer(chunk)) chunk = Buffer.from(chunk)
else if (typeof chunk !== 'string')
// use the setter so we throw if we have encoding set
this.objectMode = true
@@ -192,19 +205,14 @@ module.exports = class Minipass extends Stream {
// this yields better performance, fewer checks later.
if (this[OBJECTMODE]) {
/* istanbul ignore if - maybe impossible? */
- if (this.flowing && this[BUFFERLENGTH] !== 0)
- this[FLUSH](true)
+ if (this.flowing && this[BUFFERLENGTH] !== 0) this[FLUSH](true)
- if (this.flowing)
- this.emit('data', chunk)
- else
- this[BUFFERPUSH](chunk)
+ if (this.flowing) this.emit('data', chunk)
+ else this[BUFFERPUSH](chunk)
- if (this[BUFFERLENGTH] !== 0)
- this.emit('readable')
+ if (this[BUFFERLENGTH] !== 0) this.emit('readable')
- if (cb)
- fn(cb)
+ if (cb) fn(cb)
return this.flowing
}
@@ -212,18 +220,18 @@ module.exports = class Minipass extends Stream {
// at this point the chunk is a buffer or string
// don't buffer it up or send it to the decoder
if (!chunk.length) {
- if (this[BUFFERLENGTH] !== 0)
- this.emit('readable')
- if (cb)
- fn(cb)
+ if (this[BUFFERLENGTH] !== 0) this.emit('readable')
+ if (cb) fn(cb)
return this.flowing
}
// fast-path writing strings of same encoding to a stream with
// an empty buffer, skipping the buffer/decoder dance
- if (typeof chunk === 'string' &&
- // unless it is a string already ready for us to use
- !(encoding === this[ENCODING] && !this[DECODER].lastNeed)) {
+ if (
+ typeof chunk === 'string' &&
+ // unless it is a string already ready for us to use
+ !(encoding === this[ENCODING] && !this[DECODER].lastNeed)
+ ) {
chunk = Buffer.from(chunk, encoding)
}
@@ -231,40 +239,31 @@ module.exports = class Minipass extends Stream {
chunk = this[DECODER].write(chunk)
// Note: flushing CAN potentially switch us into not-flowing mode
- if (this.flowing && this[BUFFERLENGTH] !== 0)
- this[FLUSH](true)
+ if (this.flowing && this[BUFFERLENGTH] !== 0) this[FLUSH](true)
- if (this.flowing)
- this.emit('data', chunk)
- else
- this[BUFFERPUSH](chunk)
+ if (this.flowing) this.emit('data', chunk)
+ else this[BUFFERPUSH](chunk)
- if (this[BUFFERLENGTH] !== 0)
- this.emit('readable')
+ if (this[BUFFERLENGTH] !== 0) this.emit('readable')
- if (cb)
- fn(cb)
+ if (cb) fn(cb)
return this.flowing
}
- read (n) {
- if (this[DESTROYED])
- return null
+ read(n) {
+ if (this[DESTROYED]) return null
if (this[BUFFERLENGTH] === 0 || n === 0 || n > this[BUFFERLENGTH]) {
this[MAYBE_EMIT_END]()
return null
}
- if (this[OBJECTMODE])
- n = null
+ if (this[OBJECTMODE]) n = null
if (this[BUFFER].length > 1 && !this[OBJECTMODE]) {
- if (this.encoding)
- this[BUFFER] = [this[BUFFER].join('')]
- else
- this[BUFFER] = [Buffer.concat(this[BUFFER], this[BUFFERLENGTH])]
+ if (this.encoding) this[BUFFER] = [this[BUFFER].join('')]
+ else this[BUFFER] = [Buffer.concat(this[BUFFER], this[BUFFERLENGTH])]
}
const ret = this[READ](n || null, this[BUFFER][0])
@@ -272,9 +271,8 @@ module.exports = class Minipass extends Stream {
return ret
}
- [READ] (n, chunk) {
- if (n === chunk.length || n === null)
- this[BUFFERSHIFT]()
+ [READ](n, chunk) {
+ if (n === chunk.length || n === null) this[BUFFERSHIFT]()
else {
this[BUFFER][0] = chunk.slice(n)
chunk = chunk.slice(0, n)
@@ -283,21 +281,16 @@ module.exports = class Minipass extends Stream {
this.emit('data', chunk)
- if (!this[BUFFER].length && !this[EOF])
- this.emit('drain')
+ if (!this[BUFFER].length && !this[EOF]) this.emit('drain')
return chunk
}
- end (chunk, encoding, cb) {
- if (typeof chunk === 'function')
- cb = chunk, chunk = null
- if (typeof encoding === 'function')
- cb = encoding, encoding = 'utf8'
- if (chunk)
- this.write(chunk, encoding)
- if (cb)
- this.once('end', cb)
+ end(chunk, encoding, cb) {
+ if (typeof chunk === 'function') (cb = chunk), (chunk = null)
+ if (typeof encoding === 'function') (cb = encoding), (encoding = 'utf8')
+ if (chunk) this.write(chunk, encoding)
+ if (cb) this.once('end', cb)
this[EOF] = true
this.writable = false
@@ -305,106 +298,93 @@ module.exports = class Minipass extends Stream {
// even if we're not reading.
// we'll re-emit if a new 'end' listener is added anyway.
// This makes MP more suitable to write-only use cases.
- if (this.flowing || !this[PAUSED])
- this[MAYBE_EMIT_END]()
+ if (this.flowing || !this[PAUSED]) this[MAYBE_EMIT_END]()
return this
}
// don't let the internal resume be overwritten
- [RESUME] () {
- if (this[DESTROYED])
- return
+ [RESUME]() {
+ if (this[DESTROYED]) return
this[PAUSED] = false
this[FLOWING] = true
this.emit('resume')
- if (this[BUFFER].length)
- this[FLUSH]()
- else if (this[EOF])
- this[MAYBE_EMIT_END]()
- else
- this.emit('drain')
+ if (this[BUFFER].length) this[FLUSH]()
+ else if (this[EOF]) this[MAYBE_EMIT_END]()
+ else this.emit('drain')
}
- resume () {
+ resume() {
return this[RESUME]()
}
- pause () {
+ pause() {
this[FLOWING] = false
this[PAUSED] = true
}
- get destroyed () {
+ get destroyed() {
return this[DESTROYED]
}
- get flowing () {
+ get flowing() {
return this[FLOWING]
}
- get paused () {
+ get paused() {
return this[PAUSED]
}
- [BUFFERPUSH] (chunk) {
- if (this[OBJECTMODE])
- this[BUFFERLENGTH] += 1
- else
- this[BUFFERLENGTH] += chunk.length
+ [BUFFERPUSH](chunk) {
+ if (this[OBJECTMODE]) this[BUFFERLENGTH] += 1
+ else this[BUFFERLENGTH] += chunk.length
this[BUFFER].push(chunk)
}
- [BUFFERSHIFT] () {
+ [BUFFERSHIFT]() {
if (this[BUFFER].length) {
- if (this[OBJECTMODE])
- this[BUFFERLENGTH] -= 1
- else
- this[BUFFERLENGTH] -= this[BUFFER][0].length
+ if (this[OBJECTMODE]) this[BUFFERLENGTH] -= 1
+ else this[BUFFERLENGTH] -= this[BUFFER][0].length
}
return this[BUFFER].shift()
}
- [FLUSH] (noDrain) {
+ [FLUSH](noDrain) {
do {} while (this[FLUSHCHUNK](this[BUFFERSHIFT]()))
- if (!noDrain && !this[BUFFER].length && !this[EOF])
- this.emit('drain')
+ if (!noDrain && !this[BUFFER].length && !this[EOF]) this.emit('drain')
}
- [FLUSHCHUNK] (chunk) {
+ [FLUSHCHUNK](chunk) {
return chunk ? (this.emit('data', chunk), this.flowing) : false
}
- pipe (dest, opts) {
- if (this[DESTROYED])
- return
+ pipe(dest, opts) {
+ if (this[DESTROYED]) return
const ended = this[EMITTED_END]
opts = opts || {}
- if (dest === proc.stdout || dest === proc.stderr)
- opts.end = false
- else
- opts.end = opts.end !== false
+ if (dest === proc.stdout || dest === proc.stderr) opts.end = false
+ else opts.end = opts.end !== false
opts.proxyErrors = !!opts.proxyErrors
// piping an ended stream ends immediately
if (ended) {
- if (opts.end)
- dest.end()
+ if (opts.end) dest.end()
} else {
- this[PIPES].push(!opts.proxyErrors ? new Pipe(this, dest, opts)
- : new PipeProxyErrors(this, dest, opts))
- if (this[ASYNC])
- defer(() => this[RESUME]())
- else
- this[RESUME]()
+ this[PIPES].push(
+ !opts.proxyErrors
+ ? new Pipe(this, dest, opts)
+ : new PipeProxyErrors(this, dest, opts)
+ )
+ if (this[ASYNC]) defer(() => this[RESUME]())
+ else this[RESUME]()
}
return dest
}
- unpipe (dest) {
+ unpipe(dest) {
const p = this[PIPES].find(p => p.dest === dest)
if (p) {
this[PIPES].splice(this[PIPES].indexOf(p), 1)
@@ -412,68 +392,68 @@ module.exports = class Minipass extends Stream {
}
}
- addListener (ev, fn) {
+ addListener(ev, fn) {
return this.on(ev, fn)
}
- on (ev, fn) {
+ on(ev, fn) {
const ret = super.on(ev, fn)
- if (ev === 'data' && !this[PIPES].length && !this.flowing)
- this[RESUME]()
+ if (ev === 'data' && !this[PIPES].length && !this.flowing) this[RESUME]()
else if (ev === 'readable' && this[BUFFERLENGTH] !== 0)
super.emit('readable')
else if (isEndish(ev) && this[EMITTED_END]) {
super.emit(ev)
this.removeAllListeners(ev)
} else if (ev === 'error' && this[EMITTED_ERROR]) {
- if (this[ASYNC])
- defer(() => fn.call(this, this[EMITTED_ERROR]))
- else
- fn.call(this, this[EMITTED_ERROR])
+ if (this[ASYNC]) defer(() => fn.call(this, this[EMITTED_ERROR]))
+ else fn.call(this, this[EMITTED_ERROR])
}
return ret
}
- get emittedEnd () {
+ get emittedEnd() {
return this[EMITTED_END]
}
- [MAYBE_EMIT_END] () {
- if (!this[EMITTING_END] &&
- !this[EMITTED_END] &&
- !this[DESTROYED] &&
- this[BUFFER].length === 0 &&
- this[EOF]) {
+ [MAYBE_EMIT_END]() {
+ if (
+ !this[EMITTING_END] &&
+ !this[EMITTED_END] &&
+ !this[DESTROYED] &&
+ this[BUFFER].length === 0 &&
+ this[EOF]
+ ) {
this[EMITTING_END] = true
this.emit('end')
this.emit('prefinish')
this.emit('finish')
- if (this[CLOSED])
- this.emit('close')
+ if (this[CLOSED]) this.emit('close')
this[EMITTING_END] = false
}
}
- emit (ev, data, ...extra) {
+ emit(ev, data, ...extra) {
// error and close are only events allowed after calling destroy()
if (ev !== 'error' && ev !== 'close' && ev !== DESTROYED && this[DESTROYED])
return
else if (ev === 'data') {
- return !data ? false
- : this[ASYNC] ? defer(() => this[EMITDATA](data))
+ return !data
+ ? false
+ : this[ASYNC]
+ ? defer(() => this[EMITDATA](data))
: this[EMITDATA](data)
} else if (ev === 'end') {
return this[EMITEND]()
} else if (ev === 'close') {
this[CLOSED] = true
// don't emit close before 'end' and 'finish'
- if (!this[EMITTED_END] && !this[DESTROYED])
- return
+ if (!this[EMITTED_END] && !this[DESTROYED]) return
const ret = super.emit('close')
this.removeAllListeners('close')
return ret
} else if (ev === 'error') {
this[EMITTED_ERROR] = data
+ super.emit(ERROR, data)
const ret = super.emit('error', data)
this[MAYBE_EMIT_END]()
return ret
@@ -493,29 +473,25 @@ module.exports = class Minipass extends Stream {
return ret
}
- [EMITDATA] (data) {
+ [EMITDATA](data) {
for (const p of this[PIPES]) {
- if (p.dest.write(data) === false)
- this.pause()
+ if (p.dest.write(data) === false) this.pause()
}
const ret = super.emit('data', data)
this[MAYBE_EMIT_END]()
return ret
}
- [EMITEND] () {
- if (this[EMITTED_END])
- return
+ [EMITEND]() {
+ if (this[EMITTED_END]) return
this[EMITTED_END] = true
this.readable = false
- if (this[ASYNC])
- defer(() => this[EMITEND2]())
- else
- this[EMITEND2]()
+ if (this[ASYNC]) defer(() => this[EMITEND2]())
+ else this[EMITEND2]()
}
- [EMITEND2] () {
+ [EMITEND2]() {
if (this[DECODER]) {
const data = this[DECODER].end()
if (data) {
@@ -535,33 +511,34 @@ module.exports = class Minipass extends Stream {
}
// const all = await stream.collect()
- collect () {
+ collect() {
const buf = []
- if (!this[OBJECTMODE])
- buf.dataLength = 0
+ if (!this[OBJECTMODE]) buf.dataLength = 0
// set the promise first, in case an error is raised
// by triggering the flow here.
const p = this.promise()
this.on('data', c => {
buf.push(c)
- if (!this[OBJECTMODE])
- buf.dataLength += c.length
+ if (!this[OBJECTMODE]) buf.dataLength += c.length
})
return p.then(() => buf)
}
// const data = await stream.concat()
- concat () {
+ concat() {
return this[OBJECTMODE]
? Promise.reject(new Error('cannot concat in objectMode'))
: this.collect().then(buf =>
this[OBJECTMODE]
? Promise.reject(new Error('cannot concat in objectMode'))
- : this[ENCODING] ? buf.join('') : Buffer.concat(buf, buf.dataLength))
+ : this[ENCODING]
+ ? buf.join('')
+ : Buffer.concat(buf, buf.dataLength)
+ )
}
// stream.promise().then(() => done, er => emitted error)
- promise () {
+ promise() {
return new Promise((resolve, reject) => {
this.on(DESTROYED, () => reject(new Error('stream destroyed')))
this.on('error', er => reject(er))
@@ -570,20 +547,26 @@ module.exports = class Minipass extends Stream {
}
// for await (let chunk of stream)
- [ASYNCITERATOR] () {
+ [ASYNCITERATOR]() {
+ let stopped = false
+ const stop = () => {
+ this.pause()
+ stopped = true
+ return Promise.resolve({ done: true })
+ }
const next = () => {
+ if (stopped) return stop()
const res = this.read()
- if (res !== null)
- return Promise.resolve({ done: false, value: res })
+ if (res !== null) return Promise.resolve({ done: false, value: res })
- if (this[EOF])
- return Promise.resolve({ done: true })
+ if (this[EOF]) return stop()
let resolve = null
let reject = null
const onerr = er => {
this.removeListener('data', ondata)
this.removeListener('end', onend)
+ stop()
reject(er)
}
const ondata = value => {
@@ -595,6 +578,7 @@ module.exports = class Minipass extends Stream {
const onend = () => {
this.removeListener('error', onerr)
this.removeListener('data', ondata)
+ stop()
resolve({ done: true })
}
const ondestroy = () => onerr(new Error('stream destroyed'))
@@ -608,25 +592,49 @@ module.exports = class Minipass extends Stream {
})
}
- return { next }
+ return {
+ next,
+ throw: stop,
+ return: stop,
+ [ASYNCITERATOR]() {
+ return this
+ },
+ }
}
// for (let chunk of stream)
- [ITERATOR] () {
+ [ITERATOR]() {
+ let stopped = false
+ const stop = () => {
+ this.pause()
+ this.removeListener(ERROR, stop)
+ this.removeListener('end', stop)
+ stopped = true
+ return { done: true }
+ }
+
const next = () => {
+ if (stopped) return stop()
const value = this.read()
- const done = value === null
- return { value, done }
+ return value === null ? stop() : { value }
+ }
+ this.once('end', stop)
+ this.once(ERROR, stop)
+
+ return {
+ next,
+ throw: stop,
+ return: stop,
+ [ITERATOR]() {
+ return this
+ },
}
- return { next }
}
- destroy (er) {
+ destroy(er) {
if (this[DESTROYED]) {
- if (er)
- this.emit('error', er)
- else
- this.emit(DESTROYED)
+ if (er) this.emit('error', er)
+ else this.emit(DESTROYED)
return this
}
@@ -636,22 +644,23 @@ module.exports = class Minipass extends Stream {
this[BUFFER].length = 0
this[BUFFERLENGTH] = 0
- if (typeof this.close === 'function' && !this[CLOSED])
- this.close()
+ if (typeof this.close === 'function' && !this[CLOSED]) this.close()
- if (er)
- this.emit('error', er)
- else // if no error to emit, still reject pending promises
- this.emit(DESTROYED)
+ if (er) this.emit('error', er)
+ // if no error to emit, still reject pending promises
+ else this.emit(DESTROYED)
return this
}
- static isStream (s) {
- return !!s && (s instanceof Minipass || s instanceof Stream ||
- s instanceof EE && (
- typeof s.pipe === 'function' || // readable
- (typeof s.write === 'function' && typeof s.end === 'function') // writable
- ))
+ static isStream(s) {
+ return (
+ !!s &&
+ (s instanceof Minipass ||
+ s instanceof Stream ||
+ (s instanceof EE &&
+ (typeof s.pipe === 'function' || // readable
+ (typeof s.write === 'function' && typeof s.end === 'function')))) // writable
+ )
}
}
diff --git a/deps/npm/node_modules/minipass/package.json b/deps/npm/node_modules/minipass/package.json
index ca30e694aa4497..43051ad5b6c831 100644
--- a/deps/npm/node_modules/minipass/package.json
+++ b/deps/npm/node_modules/minipass/package.json
@@ -1,12 +1,9 @@
{
"name": "minipass",
- "version": "4.0.0",
+ "version": "4.0.3",
"description": "minimal implementation of a PassThrough stream",
"main": "index.js",
"types": "index.d.ts",
- "dependencies": {
- "yallist": "^4.0.0"
- },
"devDependencies": {
"@types/node": "^17.0.41",
"end-of-stream": "^1.4.0",
@@ -14,13 +11,16 @@
"tap": "^16.2.0",
"through2": "^2.0.3",
"ts-node": "^10.8.1",
+ "typedoc": "^0.23.24",
"typescript": "^4.7.3"
},
"scripts": {
"test": "tap",
"preversion": "npm test",
"postversion": "npm publish",
- "postpublish": "git push origin --follow-tags"
+ "postpublish": "git push origin --follow-tags",
+ "typedoc": "typedoc ./index.d.ts",
+ "format": "prettier --write . --loglevel warn"
},
"repository": {
"type": "git",
diff --git a/deps/npm/node_modules/mute-stream/lib/index.js b/deps/npm/node_modules/mute-stream/lib/index.js
new file mode 100644
index 00000000000000..368f727e2c3ed8
--- /dev/null
+++ b/deps/npm/node_modules/mute-stream/lib/index.js
@@ -0,0 +1,142 @@
+const Stream = require('stream')
+
+class MuteStream extends Stream {
+ #isTTY = null
+
+ constructor (opts = {}) {
+ super(opts)
+ this.writable = this.readable = true
+ this.muted = false
+ this.on('pipe', this._onpipe)
+ this.replace = opts.replace
+
+ // For readline-type situations
+ // The prompt portion at the start of a line being redrawn after a ctrl char
+ // is seen (such as backspace) won't be redrawn as the replacement
+ this._prompt = opts.prompt || null
+ this._hadControl = false
+ }
+
+ #destSrc (key, def) {
+ if (this._dest) {
+ return this._dest[key]
+ }
+ if (this._src) {
+ return this._src[key]
+ }
+ return def
+ }
+
+ #proxy (method, ...args) {
+ if (typeof this._dest?.[method] === 'function') {
+ this._dest[method](...args)
+ }
+ if (typeof this._src?.[method] === 'function') {
+ this._src[method](...args)
+ }
+ }
+
+ get isTTY () {
+ if (this.#isTTY !== null) {
+ return this.#isTTY
+ }
+ return this.#destSrc('isTTY', false)
+ }
+
+ // basically just replace the getter/setter with a regular value
+ set isTTY (val) {
+ this.#isTTY = val
+ }
+
+ get rows () {
+ return this.#destSrc('rows')
+ }
+
+ get columns () {
+ return this.#destSrc('columns')
+ }
+
+ mute () {
+ this.muted = true
+ }
+
+ unmute () {
+ this.muted = false
+ }
+
+ _onpipe (src) {
+ this._src = src
+ }
+
+ pipe (dest, options) {
+ this._dest = dest
+ return super.pipe(dest, options)
+ }
+
+ pause () {
+ if (this._src) {
+ return this._src.pause()
+ }
+ }
+
+ resume () {
+ if (this._src) {
+ return this._src.resume()
+ }
+ }
+
+ write (c) {
+ if (this.muted) {
+ if (!this.replace) {
+ return true
+ }
+ // eslint-disable-next-line no-control-regex
+ if (c.match(/^\u001b/)) {
+ if (c.indexOf(this._prompt) === 0) {
+ c = c.slice(this._prompt.length)
+ c = c.replace(/./g, this.replace)
+ c = this._prompt + c
+ }
+ this._hadControl = true
+ return this.emit('data', c)
+ } else {
+ if (this._prompt && this._hadControl &&
+ c.indexOf(this._prompt) === 0) {
+ this._hadControl = false
+ this.emit('data', this._prompt)
+ c = c.slice(this._prompt.length)
+ }
+ c = c.toString().replace(/./g, this.replace)
+ }
+ }
+ this.emit('data', c)
+ }
+
+ end (c) {
+ if (this.muted) {
+ if (c && this.replace) {
+ c = c.toString().replace(/./g, this.replace)
+ } else {
+ c = null
+ }
+ }
+ if (c) {
+ this.emit('data', c)
+ }
+ this.emit('end')
+ }
+
+ destroy (...args) {
+ return this.#proxy('destroy', ...args)
+ }
+
+ destroySoon (...args) {
+ return this.#proxy('destroySoon', ...args)
+ }
+
+ close (...args) {
+ return this.#proxy('close', ...args)
+ }
+}
+
+module.exports = MuteStream
diff --git a/deps/npm/node_modules/mute-stream/mute.js b/deps/npm/node_modules/mute-stream/mute.js
deleted file mode 100644
index a24fc09975bb32..00000000000000
--- a/deps/npm/node_modules/mute-stream/mute.js
+++ /dev/null
@@ -1,145 +0,0 @@
-var Stream = require('stream')
-
-module.exports = MuteStream
-
-// var out = new MuteStream(process.stdout)
-// argument auto-pipes
-function MuteStream (opts) {
- Stream.apply(this)
- opts = opts || {}
- this.writable = this.readable = true
- this.muted = false
- this.on('pipe', this._onpipe)
- this.replace = opts.replace
-
- // For readline-type situations
- // This much at the start of a line being redrawn after a ctrl char
- // is seen (such as backspace) won't be redrawn as the replacement
- this._prompt = opts.prompt || null
- this._hadControl = false
-}
-
-MuteStream.prototype = Object.create(Stream.prototype)
-
-Object.defineProperty(MuteStream.prototype, 'constructor', {
- value: MuteStream,
- enumerable: false
-})
-
-MuteStream.prototype.mute = function () {
- this.muted = true
-}
-
-MuteStream.prototype.unmute = function () {
- this.muted = false
-}
-
-Object.defineProperty(MuteStream.prototype, '_onpipe', {
- value: onPipe,
- enumerable: false,
- writable: true,
- configurable: true
-})
-
-function onPipe (src) {
- this._src = src
-}
-
-Object.defineProperty(MuteStream.prototype, 'isTTY', {
- get: getIsTTY,
- set: setIsTTY,
- enumerable: true,
- configurable: true
-})
-
-function getIsTTY () {
- return( (this._dest) ? this._dest.isTTY
- : (this._src) ? this._src.isTTY
- : false
- )
-}
-
-// basically just get replace the getter/setter with a regular value
-function setIsTTY (isTTY) {
- Object.defineProperty(this, 'isTTY', {
- value: isTTY,
- enumerable: true,
- writable: true,
- configurable: true
- })
-}
-
-Object.defineProperty(MuteStream.prototype, 'rows', {
- get: function () {
- return( this._dest ? this._dest.rows
- : this._src ? this._src.rows
- : undefined )
- }, enumerable: true, configurable: true })
-
-Object.defineProperty(MuteStream.prototype, 'columns', {
- get: function () {
- return( this._dest ? this._dest.columns
- : this._src ? this._src.columns
- : undefined )
- }, enumerable: true, configurable: true })
-
-
-MuteStream.prototype.pipe = function (dest, options) {
- this._dest = dest
- return Stream.prototype.pipe.call(this, dest, options)
-}
-
-MuteStream.prototype.pause = function () {
- if (this._src) return this._src.pause()
-}
-
-MuteStream.prototype.resume = function () {
- if (this._src) return this._src.resume()
-}
-
-MuteStream.prototype.write = function (c) {
- if (this.muted) {
- if (!this.replace) return true
- if (c.match(/^\u001b/)) {
- if(c.indexOf(this._prompt) === 0) {
- c = c.substr(this._prompt.length);
- c = c.replace(/./g, this.replace);
- c = this._prompt + c;
- }
- this._hadControl = true
- return this.emit('data', c)
- } else {
- if (this._prompt && this._hadControl &&
- c.indexOf(this._prompt) === 0) {
- this._hadControl = false
- this.emit('data', this._prompt)
- c = c.substr(this._prompt.length)
- }
- c = c.toString().replace(/./g, this.replace)
- }
- }
- this.emit('data', c)
-}
-
-MuteStream.prototype.end = function (c) {
- if (this.muted) {
- if (c && this.replace) {
- c = c.toString().replace(/./g, this.replace)
- } else {
- c = null
- }
- }
- if (c) this.emit('data', c)
- this.emit('end')
-}
-
-function proxy (fn) { return function () {
- var d = this._dest
- var s = this._src
- if (d && d[fn]) d[fn].apply(d, arguments)
- if (s && s[fn]) s[fn].apply(s, arguments)
-}}
-
-MuteStream.prototype.destroy = proxy('destroy')
-MuteStream.prototype.destroySoon = proxy('destroySoon')
-MuteStream.prototype.close = proxy('close')
diff --git a/deps/npm/node_modules/mute-stream/package.json b/deps/npm/node_modules/mute-stream/package.json
index 56ebb363b92511..37b2f5070ed69f 100644
--- a/deps/npm/node_modules/mute-stream/package.json
+++ b/deps/npm/node_modules/mute-stream/package.json
@@ -1,29 +1,52 @@
{
"name": "mute-stream",
- "version": "0.0.8",
- "main": "mute.js",
- "directories": {
- "test": "test"
- },
+ "version": "1.0.0",
+ "main": "lib/index.js",
"devDependencies": {
- "tap": "^12.1.1"
+ "@npmcli/eslint-config": "^4.0.0",
+ "@npmcli/template-oss": "4.11.0",
+ "tap": "^16.3.0"
},
"scripts": {
- "test": "tap test/*.js --cov"
+ "test": "tap",
+ "lint": "eslint \"**/*.js\"",
+ "postlint": "template-oss-check",
+ "template-oss-apply": "template-oss-apply --force",
+ "lintfix": "npm run lint -- --fix",
+ "snap": "tap",
+ "posttest": "npm run lint"
},
"repository": {
"type": "git",
- "url": "git://github.com/isaacs/mute-stream"
+ "url": "https://github.com/npm/mute-stream.git"
},
"keywords": [
"mute",
"stream",
"pipe"
],
- "author": "Isaac Z. Schlueter (http://blog.izs.me/)",
+ "author": "GitHub Inc.",
"license": "ISC",
"description": "Bytes go in, but they don't come out (when muted).",
"files": [
- "mute.js"
- ]
+ "bin/",
+ "lib/"
+ ],
+ "tap": {
+ "statements": 70,
+ "branches": 60,
+ "functions": 81,
+ "lines": 70,
+ "nyc-arg": [
+ "--exclude",
+ "tap-snapshots/**"
+ ]
+ },
+ "engines": {
+ "node": "^14.17.0 || ^16.13.0 || >=18.0.0"
+ },
+ "templateOSS": {
+ "//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
+ "version": "4.11.0"
+ }
}
diff --git a/deps/npm/node_modules/node-gyp/.github/workflows/tests.yml b/deps/npm/node_modules/node-gyp/.github/workflows/tests.yml
index a3b68bdd5d387d..8f34d4e11fdc6f 100644
--- a/deps/npm/node_modules/node-gyp/.github/workflows/tests.yml
+++ b/deps/npm/node_modules/node-gyp/.github/workflows/tests.yml
@@ -14,7 +14,7 @@ jobs:
max-parallel: 15
matrix:
node: [14.x, 16.x, 18.x]
- python: ["3.6", "3.8", "3.10"]
+ python: ["3.7", "3.9", "3.11"]
os: [macos-latest, ubuntu-latest, windows-latest]
runs-on: ${{ matrix.os }}
steps:
diff --git a/deps/npm/node_modules/node-gyp/CHANGELOG.md b/deps/npm/node_modules/node-gyp/CHANGELOG.md
index 54f0b4f427a546..4131521515994e 100644
--- a/deps/npm/node_modules/node-gyp/CHANGELOG.md
+++ b/deps/npm/node_modules/node-gyp/CHANGELOG.md
@@ -1,5 +1,17 @@
# Changelog
+### [9.3.1](https://www.github.com/nodejs/node-gyp/compare/v9.3.0...v9.3.1) (2022-12-16)
+
+
+### Bug Fixes
+
+* increase node 12 support to ^12.13 ([#2771](https://www.github.com/nodejs/node-gyp/issues/2771)) ([888efb9](https://www.github.com/nodejs/node-gyp/commit/888efb9055857afee6a6b54550722cf9ae3ee323))
+
+
+### Miscellaneous
+
+* update python test matrix ([#2774](https://www.github.com/nodejs/node-gyp/issues/2774)) ([38f01fa](https://www.github.com/nodejs/node-gyp/commit/38f01fa57d10fdb3db7697121d957bc2e0e96508))
+
## [9.3.0](https://www.github.com/nodejs/node-gyp/compare/v9.2.0...v9.3.0) (2022-10-10)
diff --git a/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/glob/common.js b/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/glob/common.js
index e094f750472f78..61a4452f097dcd 100644
--- a/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/glob/common.js
+++ b/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/glob/common.js
@@ -57,6 +57,12 @@ function setopts (self, pattern, options) {
pattern = "**/" + pattern
}
+ self.windowsPathsNoEscape = !!options.windowsPathsNoEscape ||
+ options.allowWindowsEscape === false
+ if (self.windowsPathsNoEscape) {
+ pattern = pattern.replace(/\\/g, '/')
+ }
+
self.silent = !!options.silent
self.pattern = pattern
self.strict = options.strict !== false
@@ -112,8 +118,6 @@ function setopts (self, pattern, options) {
// Note that they are not supported in Glob itself anyway.
options.nonegate = true
options.nocomment = true
- // always treat \ in patterns as escapes, not path separators
- options.allowWindowsEscape = true
self.minimatch = new Minimatch(pattern, options)
self.options = self.minimatch.options
diff --git a/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/glob/package.json b/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/glob/package.json
index 5134253e32226f..ca0fd916211b51 100644
--- a/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/glob/package.json
+++ b/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/glob/package.json
@@ -2,7 +2,7 @@
"author": "Isaac Z. Schlueter (http://blog.izs.me/)",
"name": "glob",
"description": "a little globber",
- "version": "8.0.3",
+ "version": "8.1.0",
"repository": {
"type": "git",
"url": "git://github.com/isaacs/node-glob.git"
diff --git a/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/minimatch/LICENSE b/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/minimatch/LICENSE
index 9517b7d995bb03..1493534e60dce4 100644
--- a/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/minimatch/LICENSE
+++ b/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/minimatch/LICENSE
@@ -1,6 +1,6 @@
The ISC License
-Copyright (c) 2011-2022 Isaac Z. Schlueter and Contributors
+Copyright (c) 2011-2023 Isaac Z. Schlueter and Contributors
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
diff --git a/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/minimatch/minimatch.js b/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/minimatch/minimatch.js
index 71c96a1fb71cce..6c8bfc35181c6d 100644
--- a/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/minimatch/minimatch.js
+++ b/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/minimatch/minimatch.js
@@ -157,7 +157,9 @@ minimatch.match = (list, pattern, options = {}) => {
// replace stuff like \* with *
const globUnescape = s => s.replace(/\\(.)/g, '$1')
+const charUnescape = s => s.replace(/\\([^-\]])/g, '$1')
const regExpEscape = s => s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, '\\$&')
+const braExpEscape = s => s.replace(/[[\]\\]/g, '\\$&')
class Minimatch {
constructor (pattern, options) {
@@ -243,7 +245,7 @@ class Minimatch {
negateOffset++
}
- if (negateOffset) this.pattern = pattern.substr(negateOffset)
+ if (negateOffset) this.pattern = pattern.slice(negateOffset)
this.negate = negate
}
@@ -425,7 +427,7 @@ class Minimatch {
if (pattern === '') return ''
let re = ''
- let hasMagic = !!options.nocase
+ let hasMagic = false
let escaping = false
// ? => one single character
const patternListStack = []
@@ -438,11 +440,23 @@ class Minimatch {
let pl
let sp
// . and .. never match anything that doesn't start with .,
- // even when options.dot is set.
- const patternStart = pattern.charAt(0) === '.' ? '' // anything
- // not (start or / followed by . or .. followed by / or end)
- : options.dot ? '(?!(?:^|\\\/)\\.{1,2}(?:$|\\\/))'
- : '(?!\\.)'
+ // even when options.dot is set. However, if the pattern
+ // starts with ., then traversal patterns can match.
+ let dotTravAllowed = pattern.charAt(0) === '.'
+ let dotFileAllowed = options.dot || dotTravAllowed
+ const patternStart = () =>
+ dotTravAllowed
+ ? ''
+ : dotFileAllowed
+ ? '(?!(?:^|\\/)\\.{1,2}(?:$|\\/))'
+ : '(?!\\.)'
+ const subPatternStart = (p) =>
+ p.charAt(0) === '.'
+ ? ''
+ : options.dot
+ ? '(?!(?:^|\\/)\\.{1,2}(?:$|\\/))'
+ : '(?!\\.)'
+
const clearStateChar = () => {
if (stateChar) {
@@ -492,6 +506,11 @@ class Minimatch {
}
case '\\':
+ if (inClass && pattern.charAt(i + 1) === '-') {
+ re += c
+ continue
+ }
+
clearStateChar()
escaping = true
continue
@@ -526,7 +545,7 @@ class Minimatch {
if (options.noext) clearStateChar()
continue
- case '(':
+ case '(': {
if (inClass) {
re += '('
continue
@@ -537,46 +556,64 @@ class Minimatch {
continue
}
- patternListStack.push({
+ const plEntry = {
type: stateChar,
start: i - 1,
reStart: re.length,
open: plTypes[stateChar].open,
- close: plTypes[stateChar].close
- })
- // negation is (?:(?!js)[^/]*)
- re += stateChar === '!' ? '(?:(?!(?:' : '(?:'
+ close: plTypes[stateChar].close,
+ }
+ this.debug(this.pattern, '\t', plEntry)
+ patternListStack.push(plEntry)
+ // negation is (?:(?!(?:js)(?:))[^/]*)
+ re += plEntry.open
+ // next entry starts with a dot maybe?
+ if (plEntry.start === 0 && plEntry.type !== '!') {
+ dotTravAllowed = true
+ re += subPatternStart(pattern.slice(i + 1))
+ }
this.debug('plType %j %j', stateChar, re)
stateChar = false
- continue
+ continue
+ }
- case ')':
- if (inClass || !patternListStack.length) {
+ case ')': {
+ const plEntry = patternListStack[patternListStack.length - 1]
+ if (inClass || !plEntry) {
re += '\\)'
continue
}
+ patternListStack.pop()
+ // closing an extglob
clearStateChar()
hasMagic = true
- pl = patternListStack.pop()
+ pl = plEntry
// negation is (?:(?!js)[^/]*)
// The others are (?:)
re += pl.close
if (pl.type === '!') {
- negativeLists.push(pl)
+ negativeLists.push(Object.assign(pl, { reEnd: re.length }))
}
- pl.reEnd = re.length
- continue
+ continue
+ }
- case '|':
- if (inClass || !patternListStack.length) {
+ case '|': {
+ const plEntry = patternListStack[patternListStack.length - 1]
+ if (inClass || !plEntry) {
re += '\\|'
continue
}
clearStateChar()
re += '|'
- continue
+ // next subpattern can start with a dot?
+ if (plEntry.start === 0 && plEntry.type !== '!') {
+ dotTravAllowed = true
+ re += subPatternStart(pattern.slice(i + 1))
+ }
+ continue
+ }
// these are mostly the same in regexp and glob
case '[':
@@ -604,8 +641,6 @@ class Minimatch {
continue
}
- // handle the case where we left a class open.
- // "[z-a]" is valid, equivalent to "\[z-a\]"
// split where the last [ was, make sure we don't have
// an invalid re. if so, re-walk the contents of the
// would-be class to re-translate any characters that
@@ -615,20 +650,16 @@ class Minimatch {
// to do safely. For now, this is safe and works.
cs = pattern.substring(classStart + 1, i)
try {
- RegExp('[' + cs + ']')
+ RegExp('[' + braExpEscape(charUnescape(cs)) + ']')
+ // looks good, finish up the class.
+ re += c
} catch (er) {
- // not a valid class!
- sp = this.parse(cs, SUBPARSE)
- re = re.substr(0, reClassStart) + '\\[' + sp[0] + '\\]'
- hasMagic = hasMagic || sp[1]
- inClass = false
- continue
+ // out of order ranges in JS are errors, but in glob syntax,
+ // they're just a range that matches nothing.
+ re = re.substring(0, reClassStart) + '(?:$.)' // match nothing ever
}
-
- // finish up the class.
hasMagic = true
inClass = false
- re += c
continue
default:
@@ -652,9 +683,9 @@ class Minimatch {
// this is a huge pita. We now have to re-walk
// the contents of the would-be class to re-translate
// any characters that were passed through as-is
- cs = pattern.substr(classStart + 1)
+ cs = pattern.slice(classStart + 1)
sp = this.parse(cs, SUBPARSE)
- re = re.substr(0, reClassStart) + '\\[' + sp[0]
+ re = re.substring(0, reClassStart) + '\\[' + sp[0]
hasMagic = hasMagic || sp[1]
}
@@ -721,14 +752,16 @@ class Minimatch {
// Handle nested stuff like *(*.js|!(*.json)), where open parens
// mean that we should *not* include the ) in the bit that is considered
// "after" the negated section.
- const openParensBefore = nlBefore.split('(').length - 1
+ const closeParensBefore = nlBefore.split(')').length
+ const openParensBefore = nlBefore.split('(').length - closeParensBefore
let cleanAfter = nlAfter
for (let i = 0; i < openParensBefore; i++) {
cleanAfter = cleanAfter.replace(/\)[+*?]?/, '')
}
nlAfter = cleanAfter
- const dollar = nlAfter === '' && isSub !== SUBPARSE ? '$' : ''
+ const dollar = nlAfter === '' && isSub !== SUBPARSE ? '(?:$|\\/)' : ''
+
re = nlBefore + nlFirst + nlAfter + dollar + nlLast
}
@@ -740,7 +773,7 @@ class Minimatch {
}
if (addPatternStart) {
- re = patternStart + re
+ re = patternStart() + re
}
// parsing just a piece of a larger pattern.
@@ -748,6 +781,11 @@ class Minimatch {
return [re, hasMagic]
}
+ // if it's nocase, and the lcase/uppercase don't match, it's magic
+ if (options.nocase && !hasMagic) {
+ hasMagic = pattern.toUpperCase() !== pattern.toLowerCase()
+ }
+
// skip the regexp for non-magical patterns
// unescape anything in it, though, so that it'll be
// an exact match against a file etc.
diff --git a/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/minimatch/package.json b/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/minimatch/package.json
index 8e1a84285d38f3..c8809dbb3119d9 100644
--- a/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/minimatch/package.json
+++ b/deps/npm/node_modules/node-gyp/node_modules/cacache/node_modules/minimatch/package.json
@@ -2,7 +2,10 @@
"author": "Isaac Z. Schlueter (http://blog.izs.me)",
"name": "minimatch",
"description": "a glob matcher in javascript",
- "version": "5.1.0",
+ "publishConfig": {
+ "tag": "legacy-v5"
+ },
+ "version": "5.1.6",
"repository": {
"type": "git",
"url": "git://github.com/isaacs/minimatch.git"
@@ -22,7 +25,7 @@
"brace-expansion": "^2.0.1"
},
"devDependencies": {
- "tap": "^15.1.6"
+ "tap": "^16.3.2"
},
"license": "ISC",
"files": [
diff --git a/deps/npm/node_modules/node-gyp/node_modules/fs-minipass/LICENSE b/deps/npm/node_modules/node-gyp/node_modules/fs-minipass/LICENSE
new file mode 100644
index 00000000000000..19129e315fe593
--- /dev/null
+++ b/deps/npm/node_modules/node-gyp/node_modules/fs-minipass/LICENSE
@@ -0,0 +1,15 @@
+The ISC License
+
+Copyright (c) Isaac Z. Schlueter and Contributors
+
+Permission to use, copy, modify, and/or distribute this software for any
+purpose with or without fee is hereby granted, provided that the above
+copyright notice and this permission notice appear in all copies.
+
+THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
+WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
+MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
+ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
+WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
+ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR
+IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
diff --git a/deps/npm/node_modules/fs-minipass/index.js b/deps/npm/node_modules/node-gyp/node_modules/fs-minipass/index.js
similarity index 100%
rename from deps/npm/node_modules/fs-minipass/index.js
rename to deps/npm/node_modules/node-gyp/node_modules/fs-minipass/index.js
diff --git a/deps/npm/node_modules/node-gyp/node_modules/fs-minipass/package.json b/deps/npm/node_modules/node-gyp/node_modules/fs-minipass/package.json
new file mode 100644
index 00000000000000..2f2436cb5c3b1a
--- /dev/null
+++ b/deps/npm/node_modules/node-gyp/node_modules/fs-minipass/package.json
@@ -0,0 +1,39 @@
+{
+ "name": "fs-minipass",
+ "version": "2.1.0",
+ "main": "index.js",
+ "scripts": {
+ "test": "tap",
+ "preversion": "npm test",
+ "postversion": "npm publish",
+ "postpublish": "git push origin --follow-tags"
+ },
+ "keywords": [],
+ "author": "Isaac Z. Schlueter (http://blog.izs.me/)",
+ "license": "ISC",
+ "repository": {
+ "type": "git",
+ "url": "git+https://github.com/npm/fs-minipass.git"
+ },
+ "bugs": {
+ "url": "https://github.com/npm/fs-minipass/issues"
+ },
+ "homepage": "https://github.com/npm/fs-minipass#readme",
+ "description": "fs read and write streams based on minipass",
+ "dependencies": {
+ "minipass": "^3.0.0"
+ },
+ "devDependencies": {
+ "mutate-fs": "^2.0.1",
+ "tap": "^14.6.4"
+ },
+ "files": [
+ "index.js"
+ ],
+ "tap": {
+ "check-coverage": true
+ },
+ "engines": {
+ "node": ">= 8"
+ }
+}
diff --git a/deps/npm/node_modules/readable-stream/CONTRIBUTING.md b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/CONTRIBUTING.md
similarity index 100%
rename from deps/npm/node_modules/readable-stream/CONTRIBUTING.md
rename to deps/npm/node_modules/node-gyp/node_modules/readable-stream/CONTRIBUTING.md
diff --git a/deps/npm/node_modules/readable-stream/GOVERNANCE.md b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/GOVERNANCE.md
similarity index 100%
rename from deps/npm/node_modules/readable-stream/GOVERNANCE.md
rename to deps/npm/node_modules/node-gyp/node_modules/readable-stream/GOVERNANCE.md
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/LICENSE b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/LICENSE
similarity index 100%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/LICENSE
rename to deps/npm/node_modules/node-gyp/node_modules/readable-stream/LICENSE
diff --git a/deps/npm/node_modules/readable-stream/errors-browser.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/errors-browser.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/errors-browser.js
rename to deps/npm/node_modules/node-gyp/node_modules/readable-stream/errors-browser.js
diff --git a/deps/npm/node_modules/readable-stream/errors.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/errors.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/errors.js
rename to deps/npm/node_modules/node-gyp/node_modules/readable-stream/errors.js
diff --git a/deps/npm/node_modules/readable-stream/experimentalWarning.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/experimentalWarning.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/experimentalWarning.js
rename to deps/npm/node_modules/node-gyp/node_modules/readable-stream/experimentalWarning.js
diff --git a/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_duplex.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_duplex.js
new file mode 100644
index 00000000000000..67525192250f6d
--- /dev/null
+++ b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_duplex.js
@@ -0,0 +1,139 @@
+// Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+// a duplex stream is just a stream that is both readable and writable.
+// Since JS doesn't have multiple prototypal inheritance, this class
+// prototypally inherits from Readable, and then parasitically from
+// Writable.
+'use strict';
+/**/
+
+var objectKeys = Object.keys || function (obj) {
+ var keys = [];
+
+ for (var key in obj) {
+ keys.push(key);
+ }
+
+ return keys;
+};
+/* */
+
+
+module.exports = Duplex;
+
+var Readable = require('./_stream_readable');
+
+var Writable = require('./_stream_writable');
+
+require('inherits')(Duplex, Readable);
+
+{
+ // Allow the keys array to be GC'ed.
+ var keys = objectKeys(Writable.prototype);
+
+ for (var v = 0; v < keys.length; v++) {
+ var method = keys[v];
+ if (!Duplex.prototype[method]) Duplex.prototype[method] = Writable.prototype[method];
+ }
+}
+
+function Duplex(options) {
+ if (!(this instanceof Duplex)) return new Duplex(options);
+ Readable.call(this, options);
+ Writable.call(this, options);
+ this.allowHalfOpen = true;
+
+ if (options) {
+ if (options.readable === false) this.readable = false;
+ if (options.writable === false) this.writable = false;
+
+ if (options.allowHalfOpen === false) {
+ this.allowHalfOpen = false;
+ this.once('end', onend);
+ }
+ }
+}
+
+Object.defineProperty(Duplex.prototype, 'writableHighWaterMark', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function get() {
+ return this._writableState.highWaterMark;
+ }
+});
+Object.defineProperty(Duplex.prototype, 'writableBuffer', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function get() {
+ return this._writableState && this._writableState.getBuffer();
+ }
+});
+Object.defineProperty(Duplex.prototype, 'writableLength', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function get() {
+ return this._writableState.length;
+ }
+}); // the no-half-open enforcer
+
+function onend() {
+ // If the writable side ended, then we're ok.
+ if (this._writableState.ended) return; // no more data can be written.
+ // But allow more writes to happen in this tick.
+
+ process.nextTick(onEndNT, this);
+}
+
+function onEndNT(self) {
+ self.end();
+}
+
+Object.defineProperty(Duplex.prototype, 'destroyed', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function get() {
+ if (this._readableState === undefined || this._writableState === undefined) {
+ return false;
+ }
+
+ return this._readableState.destroyed && this._writableState.destroyed;
+ },
+ set: function set(value) {
+ // we ignore the value if the stream
+ // has not been initialized yet
+ if (this._readableState === undefined || this._writableState === undefined) {
+ return;
+ } // backward compatibility, the user is explicitly
+ // managing destroyed
+
+
+ this._readableState.destroyed = value;
+ this._writableState.destroyed = value;
+ }
+});
\ No newline at end of file
diff --git a/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_passthrough.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_passthrough.js
new file mode 100644
index 00000000000000..32e7414c5a8271
--- /dev/null
+++ b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_passthrough.js
@@ -0,0 +1,39 @@
+// Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+// a passthrough stream.
+// basically just the most minimal sort of Transform stream.
+// Every written chunk gets output as-is.
+'use strict';
+
+module.exports = PassThrough;
+
+var Transform = require('./_stream_transform');
+
+require('inherits')(PassThrough, Transform);
+
+function PassThrough(options) {
+ if (!(this instanceof PassThrough)) return new PassThrough(options);
+ Transform.call(this, options);
+}
+
+PassThrough.prototype._transform = function (chunk, encoding, cb) {
+ cb(null, chunk);
+};
\ No newline at end of file
diff --git a/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_readable.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_readable.js
new file mode 100644
index 00000000000000..192d451488f208
--- /dev/null
+++ b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_readable.js
@@ -0,0 +1,1124 @@
+// Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+'use strict';
+
+module.exports = Readable;
+/**/
+
+var Duplex;
+/* */
+
+Readable.ReadableState = ReadableState;
+/**/
+
+var EE = require('events').EventEmitter;
+
+var EElistenerCount = function EElistenerCount(emitter, type) {
+ return emitter.listeners(type).length;
+};
+/* */
+
+/**/
+
+
+var Stream = require('./internal/streams/stream');
+/* */
+
+
+var Buffer = require('buffer').Buffer;
+
+var OurUint8Array = global.Uint8Array || function () {};
+
+function _uint8ArrayToBuffer(chunk) {
+ return Buffer.from(chunk);
+}
+
+function _isUint8Array(obj) {
+ return Buffer.isBuffer(obj) || obj instanceof OurUint8Array;
+}
+/**/
+
+
+var debugUtil = require('util');
+
+var debug;
+
+if (debugUtil && debugUtil.debuglog) {
+ debug = debugUtil.debuglog('stream');
+} else {
+ debug = function debug() {};
+}
+/* */
+
+
+var BufferList = require('./internal/streams/buffer_list');
+
+var destroyImpl = require('./internal/streams/destroy');
+
+var _require = require('./internal/streams/state'),
+ getHighWaterMark = _require.getHighWaterMark;
+
+var _require$codes = require('../errors').codes,
+ ERR_INVALID_ARG_TYPE = _require$codes.ERR_INVALID_ARG_TYPE,
+ ERR_STREAM_PUSH_AFTER_EOF = _require$codes.ERR_STREAM_PUSH_AFTER_EOF,
+ ERR_METHOD_NOT_IMPLEMENTED = _require$codes.ERR_METHOD_NOT_IMPLEMENTED,
+ ERR_STREAM_UNSHIFT_AFTER_END_EVENT = _require$codes.ERR_STREAM_UNSHIFT_AFTER_END_EVENT; // Lazy loaded to improve the startup performance.
+
+
+var StringDecoder;
+var createReadableStreamAsyncIterator;
+var from;
+
+require('inherits')(Readable, Stream);
+
+var errorOrDestroy = destroyImpl.errorOrDestroy;
+var kProxyEvents = ['error', 'close', 'destroy', 'pause', 'resume'];
+
+function prependListener(emitter, event, fn) {
+ // Sadly this is not cacheable as some libraries bundle their own
+ // event emitter implementation with them.
+ if (typeof emitter.prependListener === 'function') return emitter.prependListener(event, fn); // This is a hack to make sure that our error handler is attached before any
+ // userland ones. NEVER DO THIS. This is here only because this code needs
+ // to continue to work with older versions of Node.js that do not include
+ // the prependListener() method. The goal is to eventually remove this hack.
+
+ if (!emitter._events || !emitter._events[event]) emitter.on(event, fn);else if (Array.isArray(emitter._events[event])) emitter._events[event].unshift(fn);else emitter._events[event] = [fn, emitter._events[event]];
+}
+
+function ReadableState(options, stream, isDuplex) {
+ Duplex = Duplex || require('./_stream_duplex');
+ options = options || {}; // Duplex streams are both readable and writable, but share
+ // the same options object.
+ // However, some cases require setting options to different
+ // values for the readable and the writable sides of the duplex stream.
+ // These options can be provided separately as readableXXX and writableXXX.
+
+ if (typeof isDuplex !== 'boolean') isDuplex = stream instanceof Duplex; // object stream flag. Used to make read(n) ignore n and to
+ // make all the buffer merging and length checks go away
+
+ this.objectMode = !!options.objectMode;
+ if (isDuplex) this.objectMode = this.objectMode || !!options.readableObjectMode; // the point at which it stops calling _read() to fill the buffer
+ // Note: 0 is a valid value, means "don't call _read preemptively ever"
+
+ this.highWaterMark = getHighWaterMark(this, options, 'readableHighWaterMark', isDuplex); // A linked list is used to store data chunks instead of an array because the
+ // linked list can remove elements from the beginning faster than
+ // array.shift()
+
+ this.buffer = new BufferList();
+ this.length = 0;
+ this.pipes = null;
+ this.pipesCount = 0;
+ this.flowing = null;
+ this.ended = false;
+ this.endEmitted = false;
+ this.reading = false; // a flag to be able to tell if the event 'readable'/'data' is emitted
+ // immediately, or on a later tick. We set this to true at first, because
+ // any actions that shouldn't happen until "later" should generally also
+ // not happen before the first read call.
+
+ this.sync = true; // whenever we return null, then we set a flag to say
+ // that we're awaiting a 'readable' event emission.
+
+ this.needReadable = false;
+ this.emittedReadable = false;
+ this.readableListening = false;
+ this.resumeScheduled = false;
+ this.paused = true; // Should close be emitted on destroy. Defaults to true.
+
+ this.emitClose = options.emitClose !== false; // Should .destroy() be called after 'end' (and potentially 'finish')
+
+ this.autoDestroy = !!options.autoDestroy; // has it been destroyed
+
+ this.destroyed = false; // Crypto is kind of old and crusty. Historically, its default string
+ // encoding is 'binary' so we have to make this configurable.
+ // Everything else in the universe uses 'utf8', though.
+
+ this.defaultEncoding = options.defaultEncoding || 'utf8'; // the number of writers that are awaiting a drain event in .pipe()s
+
+ this.awaitDrain = 0; // if true, a maybeReadMore has been scheduled
+
+ this.readingMore = false;
+ this.decoder = null;
+ this.encoding = null;
+
+ if (options.encoding) {
+ if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder;
+ this.decoder = new StringDecoder(options.encoding);
+ this.encoding = options.encoding;
+ }
+}
+
+function Readable(options) {
+ Duplex = Duplex || require('./_stream_duplex');
+ if (!(this instanceof Readable)) return new Readable(options); // Checking for a Stream.Duplex instance is faster here instead of inside
+ // the ReadableState constructor, at least with V8 6.5
+
+ var isDuplex = this instanceof Duplex;
+ this._readableState = new ReadableState(options, this, isDuplex); // legacy
+
+ this.readable = true;
+
+ if (options) {
+ if (typeof options.read === 'function') this._read = options.read;
+ if (typeof options.destroy === 'function') this._destroy = options.destroy;
+ }
+
+ Stream.call(this);
+}
+
+Object.defineProperty(Readable.prototype, 'destroyed', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function get() {
+ if (this._readableState === undefined) {
+ return false;
+ }
+
+ return this._readableState.destroyed;
+ },
+ set: function set(value) {
+ // we ignore the value if the stream
+ // has not been initialized yet
+ if (!this._readableState) {
+ return;
+ } // backward compatibility, the user is explicitly
+ // managing destroyed
+
+
+ this._readableState.destroyed = value;
+ }
+});
+Readable.prototype.destroy = destroyImpl.destroy;
+Readable.prototype._undestroy = destroyImpl.undestroy;
+
+Readable.prototype._destroy = function (err, cb) {
+ cb(err);
+}; // Manually shove something into the read() buffer.
+// This returns true if the highWaterMark has not been hit yet,
+// similar to how Writable.write() returns true if you should
+// write() some more.
+
+
+Readable.prototype.push = function (chunk, encoding) {
+ var state = this._readableState;
+ var skipChunkCheck;
+
+ if (!state.objectMode) {
+ if (typeof chunk === 'string') {
+ encoding = encoding || state.defaultEncoding;
+
+ if (encoding !== state.encoding) {
+ chunk = Buffer.from(chunk, encoding);
+ encoding = '';
+ }
+
+ skipChunkCheck = true;
+ }
+ } else {
+ skipChunkCheck = true;
+ }
+
+ return readableAddChunk(this, chunk, encoding, false, skipChunkCheck);
+}; // Unshift should *always* be something directly out of read()
+
+
+Readable.prototype.unshift = function (chunk) {
+ return readableAddChunk(this, chunk, null, true, false);
+};
+
+function readableAddChunk(stream, chunk, encoding, addToFront, skipChunkCheck) {
+ debug('readableAddChunk', chunk);
+ var state = stream._readableState;
+
+ if (chunk === null) {
+ state.reading = false;
+ onEofChunk(stream, state);
+ } else {
+ var er;
+ if (!skipChunkCheck) er = chunkInvalid(state, chunk);
+
+ if (er) {
+ errorOrDestroy(stream, er);
+ } else if (state.objectMode || chunk && chunk.length > 0) {
+ if (typeof chunk !== 'string' && !state.objectMode && Object.getPrototypeOf(chunk) !== Buffer.prototype) {
+ chunk = _uint8ArrayToBuffer(chunk);
+ }
+
+ if (addToFront) {
+ if (state.endEmitted) errorOrDestroy(stream, new ERR_STREAM_UNSHIFT_AFTER_END_EVENT());else addChunk(stream, state, chunk, true);
+ } else if (state.ended) {
+ errorOrDestroy(stream, new ERR_STREAM_PUSH_AFTER_EOF());
+ } else if (state.destroyed) {
+ return false;
+ } else {
+ state.reading = false;
+
+ if (state.decoder && !encoding) {
+ chunk = state.decoder.write(chunk);
+ if (state.objectMode || chunk.length !== 0) addChunk(stream, state, chunk, false);else maybeReadMore(stream, state);
+ } else {
+ addChunk(stream, state, chunk, false);
+ }
+ }
+ } else if (!addToFront) {
+ state.reading = false;
+ maybeReadMore(stream, state);
+ }
+ } // We can push more data if we are below the highWaterMark.
+ // Also, if we have no data yet, we can stand some more bytes.
+ // This is to work around cases where hwm=0, such as the repl.
+
+
+ return !state.ended && (state.length < state.highWaterMark || state.length === 0);
+}
+
+function addChunk(stream, state, chunk, addToFront) {
+ if (state.flowing && state.length === 0 && !state.sync) {
+ state.awaitDrain = 0;
+ stream.emit('data', chunk);
+ } else {
+ // update the buffer info.
+ state.length += state.objectMode ? 1 : chunk.length;
+ if (addToFront) state.buffer.unshift(chunk);else state.buffer.push(chunk);
+ if (state.needReadable) emitReadable(stream);
+ }
+
+ maybeReadMore(stream, state);
+}
+
+function chunkInvalid(state, chunk) {
+ var er;
+
+ if (!_isUint8Array(chunk) && typeof chunk !== 'string' && chunk !== undefined && !state.objectMode) {
+ er = new ERR_INVALID_ARG_TYPE('chunk', ['string', 'Buffer', 'Uint8Array'], chunk);
+ }
+
+ return er;
+}
+
+Readable.prototype.isPaused = function () {
+ return this._readableState.flowing === false;
+}; // backwards compatibility.
+
+
+Readable.prototype.setEncoding = function (enc) {
+ if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder;
+ var decoder = new StringDecoder(enc);
+ this._readableState.decoder = decoder; // If setEncoding(null), decoder.encoding equals utf8
+
+ this._readableState.encoding = this._readableState.decoder.encoding; // Iterate over current buffer to convert already stored Buffers:
+
+ var p = this._readableState.buffer.head;
+ var content = '';
+
+ while (p !== null) {
+ content += decoder.write(p.data);
+ p = p.next;
+ }
+
+ this._readableState.buffer.clear();
+
+ if (content !== '') this._readableState.buffer.push(content);
+ this._readableState.length = content.length;
+ return this;
+}; // Don't raise the hwm > 1GB
+
+
+var MAX_HWM = 0x40000000;
+
+function computeNewHighWaterMark(n) {
+ if (n >= MAX_HWM) {
+ // TODO(ronag): Throw ERR_VALUE_OUT_OF_RANGE.
+ n = MAX_HWM;
+ } else {
+ // Get the next highest power of 2 to prevent increasing hwm excessively in
+ // tiny amounts
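+    // (e.g. n = 1000: n-- gives 999, the shifts smear it to 1023, n++ yields 1024)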
+ n--;
+ n |= n >>> 1;
+ n |= n >>> 2;
+ n |= n >>> 4;
+ n |= n >>> 8;
+ n |= n >>> 16;
+ n++;
+ }
+
+ return n;
+} // This function is designed to be inlinable, so please take care when making
+// changes to the function body.
+
+
+function howMuchToRead(n, state) {
+ if (n <= 0 || state.length === 0 && state.ended) return 0;
+ if (state.objectMode) return 1;
+
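+  // n !== n is only true when n is NaN, i.e. read() was called without a numeric size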
+ if (n !== n) {
+ // Only flow one buffer at a time
+ if (state.flowing && state.length) return state.buffer.head.data.length;else return state.length;
+ } // If we're asking for more than the current hwm, then raise the hwm.
+
+
+ if (n > state.highWaterMark) state.highWaterMark = computeNewHighWaterMark(n);
+ if (n <= state.length) return n; // Don't have enough
+
+ if (!state.ended) {
+ state.needReadable = true;
+ return 0;
+ }
+
+ return state.length;
+} // you can override either this method, or the async _read(n) below.
+
+
+Readable.prototype.read = function (n) {
+ debug('read', n);
+ n = parseInt(n, 10);
+ var state = this._readableState;
+ var nOrig = n;
+ if (n !== 0) state.emittedReadable = false; // if we're doing read(0) to trigger a readable event, but we
+ // already have a bunch of data in the buffer, then just trigger
+ // the 'readable' event and move on.
+
+ if (n === 0 && state.needReadable && ((state.highWaterMark !== 0 ? state.length >= state.highWaterMark : state.length > 0) || state.ended)) {
+ debug('read: emitReadable', state.length, state.ended);
+ if (state.length === 0 && state.ended) endReadable(this);else emitReadable(this);
+ return null;
+ }
+
+ n = howMuchToRead(n, state); // if we've ended, and we're now clear, then finish it up.
+
+ if (n === 0 && state.ended) {
+ if (state.length === 0) endReadable(this);
+ return null;
+ } // All the actual chunk generation logic needs to be
+ // *below* the call to _read. The reason is that in certain
+ // synthetic stream cases, such as passthrough streams, _read
+ // may be a completely synchronous operation which may change
+ // the state of the read buffer, providing enough data when
+ // before there was *not* enough.
+ //
+ // So, the steps are:
+ // 1. Figure out what the state of things will be after we do
+ // a read from the buffer.
+ //
+ // 2. If that resulting state will trigger a _read, then call _read.
+ // Note that this may be asynchronous, or synchronous. Yes, it is
+ // deeply ugly to write APIs this way, but that still doesn't mean
+ // that the Readable class should behave improperly, as streams are
+ // designed to be sync/async agnostic.
+ // Take note if the _read call is sync or async (ie, if the read call
+ // has returned yet), so that we know whether or not it's safe to emit
+ // 'readable' etc.
+ //
+ // 3. Actually pull the requested chunks out of the buffer and return.
+ // if we need a readable event, then we need to do some reading.
+
+
+ var doRead = state.needReadable;
+ debug('need readable', doRead); // if we currently have less than the highWaterMark, then also read some
+
+ if (state.length === 0 || state.length - n < state.highWaterMark) {
+ doRead = true;
+ debug('length less than watermark', doRead);
+ } // however, if we've ended, then there's no point, and if we're already
+ // reading, then it's unnecessary.
+
+
+ if (state.ended || state.reading) {
+ doRead = false;
+ debug('reading or ended', doRead);
+ } else if (doRead) {
+ debug('do read');
+ state.reading = true;
+ state.sync = true; // if the length is currently zero, then we *need* a readable event.
+
+ if (state.length === 0) state.needReadable = true; // call internal read method
+
+ this._read(state.highWaterMark);
+
+ state.sync = false; // If _read pushed data synchronously, then `reading` will be false,
+ // and we need to re-evaluate how much data we can return to the user.
+
+ if (!state.reading) n = howMuchToRead(nOrig, state);
+ }
+
+ var ret;
+ if (n > 0) ret = fromList(n, state);else ret = null;
+
+ if (ret === null) {
+ state.needReadable = state.length <= state.highWaterMark;
+ n = 0;
+ } else {
+ state.length -= n;
+ state.awaitDrain = 0;
+ }
+
+ if (state.length === 0) {
+ // If we have nothing in the buffer, then we want to know
+ // as soon as we *do* get something into the buffer.
+ if (!state.ended) state.needReadable = true; // If we tried to read() past the EOF, then emit end on the next tick.
+
+ if (nOrig !== n && state.ended) endReadable(this);
+ }
+
+ if (ret !== null) this.emit('data', ret);
+ return ret;
+};
+
+function onEofChunk(stream, state) {
+ debug('onEofChunk');
+ if (state.ended) return;
+
+ if (state.decoder) {
+ var chunk = state.decoder.end();
+
+ if (chunk && chunk.length) {
+ state.buffer.push(chunk);
+ state.length += state.objectMode ? 1 : chunk.length;
+ }
+ }
+
+ state.ended = true;
+
+ if (state.sync) {
+ // if we are sync, wait until next tick to emit the data.
+ // Otherwise we risk emitting data in the flow()
+ // the readable code triggers during a read() call
+ emitReadable(stream);
+ } else {
+ // emit 'readable' now to make sure it gets picked up.
+ state.needReadable = false;
+
+ if (!state.emittedReadable) {
+ state.emittedReadable = true;
+ emitReadable_(stream);
+ }
+ }
+} // Don't emit readable right away in sync mode, because this can trigger
+// another read() call => stack overflow. This way, it might trigger
+// a nextTick recursion warning, but that's not so bad.
+
+
+function emitReadable(stream) {
+ var state = stream._readableState;
+ debug('emitReadable', state.needReadable, state.emittedReadable);
+ state.needReadable = false;
+
+ if (!state.emittedReadable) {
+ debug('emitReadable', state.flowing);
+ state.emittedReadable = true;
+ process.nextTick(emitReadable_, stream);
+ }
+}
+
+function emitReadable_(stream) {
+ var state = stream._readableState;
+ debug('emitReadable_', state.destroyed, state.length, state.ended);
+
+ if (!state.destroyed && (state.length || state.ended)) {
+ stream.emit('readable');
+ state.emittedReadable = false;
+ } // The stream needs another readable event if
+ // 1. It is not flowing, as the flow mechanism will take
+ // care of it.
+ // 2. It is not ended.
+ // 3. It is below the highWaterMark, so we can schedule
+ // another readable later.
+
+
+ state.needReadable = !state.flowing && !state.ended && state.length <= state.highWaterMark;
+ flow(stream);
+} // at this point, the user has presumably seen the 'readable' event,
+// and called read() to consume some data. that may have triggered
+// in turn another _read(n) call, in which case reading = true if
+// it's in progress.
+// However, if we're not ended, or reading, and the length < hwm,
+// then go ahead and try to read some more preemptively.
+
+
+function maybeReadMore(stream, state) {
+ if (!state.readingMore) {
+ state.readingMore = true;
+ process.nextTick(maybeReadMore_, stream, state);
+ }
+}
+
+function maybeReadMore_(stream, state) {
+ // Attempt to read more data if we should.
+ //
+ // The conditions for reading more data are (one of):
+ // - Not enough data buffered (state.length < state.highWaterMark). The loop
+ // is responsible for filling the buffer with enough data if such data
+ // is available. If highWaterMark is 0 and we are not in the flowing mode
+ // we should _not_ attempt to buffer any extra data. We'll get more data
+ // when the stream consumer calls read() instead.
+ // - No data in the buffer, and the stream is in flowing mode. In this mode
+ // the loop below is responsible for ensuring read() is called. Failing to
+ // call read here would abort the flow and there's no other mechanism for
+ // continuing the flow if the stream consumer has just subscribed to the
+ // 'data' event.
+ //
+ // In addition to the above conditions to keep reading data, the following
+ // conditions prevent the data from being read:
+ // - The stream has ended (state.ended).
+ // - There is already a pending 'read' operation (state.reading). This is a
+  //   case where the stream has called the implementation defined _read()
+ // method, but they are processing the call asynchronously and have _not_
+ // called push() with new data. In this case we skip performing more
+ // read()s. The execution ends in this method again after the _read() ends
+ // up calling push() with more data.
+ while (!state.reading && !state.ended && (state.length < state.highWaterMark || state.flowing && state.length === 0)) {
+ var len = state.length;
+ debug('maybeReadMore read 0');
+ stream.read(0);
+ if (len === state.length) // didn't get any data, stop spinning.
+ break;
+ }
+
+ state.readingMore = false;
+} // abstract method. to be overridden in specific implementation classes.
+// call cb(er, data) where data is <= n in length.
+// for virtual (non-string, non-buffer) streams, "length" is somewhat
+// arbitrary, and perhaps not very meaningful.
+
+
+Readable.prototype._read = function (n) {
+ errorOrDestroy(this, new ERR_METHOD_NOT_IMPLEMENTED('_read()'));
+};
+
+Readable.prototype.pipe = function (dest, pipeOpts) {
+ var src = this;
+ var state = this._readableState;
+
+ switch (state.pipesCount) {
+ case 0:
+ state.pipes = dest;
+ break;
+
+ case 1:
+ state.pipes = [state.pipes, dest];
+ break;
+
+ default:
+ state.pipes.push(dest);
+ break;
+ }
+
+ state.pipesCount += 1;
+ debug('pipe count=%d opts=%j', state.pipesCount, pipeOpts);
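+  // Note: process.stdout and process.stderr are never auto-ended, regardless of pipeOpts.end.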
+ var doEnd = (!pipeOpts || pipeOpts.end !== false) && dest !== process.stdout && dest !== process.stderr;
+ var endFn = doEnd ? onend : unpipe;
+ if (state.endEmitted) process.nextTick(endFn);else src.once('end', endFn);
+ dest.on('unpipe', onunpipe);
+
+ function onunpipe(readable, unpipeInfo) {
+ debug('onunpipe');
+
+ if (readable === src) {
+ if (unpipeInfo && unpipeInfo.hasUnpiped === false) {
+ unpipeInfo.hasUnpiped = true;
+ cleanup();
+ }
+ }
+ }
+
+ function onend() {
+ debug('onend');
+ dest.end();
+ } // when the dest drains, it reduces the awaitDrain counter
+ // on the source. This would be more elegant with a .once()
+ // handler in flow(), but adding and removing repeatedly is
+ // too slow.
+
+
+ var ondrain = pipeOnDrain(src);
+ dest.on('drain', ondrain);
+ var cleanedUp = false;
+
+ function cleanup() {
+ debug('cleanup'); // cleanup event handlers once the pipe is broken
+
+ dest.removeListener('close', onclose);
+ dest.removeListener('finish', onfinish);
+ dest.removeListener('drain', ondrain);
+ dest.removeListener('error', onerror);
+ dest.removeListener('unpipe', onunpipe);
+ src.removeListener('end', onend);
+ src.removeListener('end', unpipe);
+ src.removeListener('data', ondata);
+ cleanedUp = true; // if the reader is waiting for a drain event from this
+ // specific writer, then it would cause it to never start
+ // flowing again.
+ // So, if this is awaiting a drain, then we just call it now.
+ // If we don't know, then assume that we are waiting for one.
+
+ if (state.awaitDrain && (!dest._writableState || dest._writableState.needDrain)) ondrain();
+ }
+
+ src.on('data', ondata);
+
+ function ondata(chunk) {
+ debug('ondata');
+ var ret = dest.write(chunk);
+ debug('dest.write', ret);
+
+ if (ret === false) {
+ // If the user unpiped during `dest.write()`, it is possible
+ // to get stuck in a permanently paused state if that write
+ // also returned false.
+ // => Check whether `dest` is still a piping destination.
+ if ((state.pipesCount === 1 && state.pipes === dest || state.pipesCount > 1 && indexOf(state.pipes, dest) !== -1) && !cleanedUp) {
+ debug('false write response, pause', state.awaitDrain);
+ state.awaitDrain++;
+ }
+
+ src.pause();
+ }
+ } // if the dest has an error, then stop piping into it.
+ // however, don't suppress the throwing behavior for this.
+
+
+ function onerror(er) {
+ debug('onerror', er);
+ unpipe();
+ dest.removeListener('error', onerror);
+ if (EElistenerCount(dest, 'error') === 0) errorOrDestroy(dest, er);
+ } // Make sure our error handler is attached before userland ones.
+
+
+ prependListener(dest, 'error', onerror); // Both close and finish should trigger unpipe, but only once.
+
+ function onclose() {
+ dest.removeListener('finish', onfinish);
+ unpipe();
+ }
+
+ dest.once('close', onclose);
+
+ function onfinish() {
+ debug('onfinish');
+ dest.removeListener('close', onclose);
+ unpipe();
+ }
+
+ dest.once('finish', onfinish);
+
+ function unpipe() {
+ debug('unpipe');
+ src.unpipe(dest);
+ } // tell the dest that it's being piped to
+
+
+ dest.emit('pipe', src); // start the flow if it hasn't been started already.
+
+ if (!state.flowing) {
+ debug('pipe resume');
+ src.resume();
+ }
+
+ return dest;
+};
+
+function pipeOnDrain(src) {
+ return function pipeOnDrainFunctionResult() {
+ var state = src._readableState;
+ debug('pipeOnDrain', state.awaitDrain);
+ if (state.awaitDrain) state.awaitDrain--;
+
+ if (state.awaitDrain === 0 && EElistenerCount(src, 'data')) {
+ state.flowing = true;
+ flow(src);
+ }
+ };
+}
+
+Readable.prototype.unpipe = function (dest) {
+ var state = this._readableState;
+ var unpipeInfo = {
+ hasUnpiped: false
+ }; // if we're not piping anywhere, then do nothing.
+
+ if (state.pipesCount === 0) return this; // just one destination. most common case.
+
+ if (state.pipesCount === 1) {
+ // passed in one, but it's not the right one.
+ if (dest && dest !== state.pipes) return this;
+ if (!dest) dest = state.pipes; // got a match.
+
+ state.pipes = null;
+ state.pipesCount = 0;
+ state.flowing = false;
+ if (dest) dest.emit('unpipe', this, unpipeInfo);
+ return this;
+ } // slow case. multiple pipe destinations.
+
+
+ if (!dest) {
+ // remove all.
+ var dests = state.pipes;
+ var len = state.pipesCount;
+ state.pipes = null;
+ state.pipesCount = 0;
+ state.flowing = false;
+
+ for (var i = 0; i < len; i++) {
+ dests[i].emit('unpipe', this, {
+ hasUnpiped: false
+ });
+ }
+
+ return this;
+ } // try to find the right one.
+
+
+ var index = indexOf(state.pipes, dest);
+ if (index === -1) return this;
+ state.pipes.splice(index, 1);
+ state.pipesCount -= 1;
+ if (state.pipesCount === 1) state.pipes = state.pipes[0];
+ dest.emit('unpipe', this, unpipeInfo);
+ return this;
+}; // set up data events if they are asked for
+// Ensure readable listeners eventually get something
+
+
+Readable.prototype.on = function (ev, fn) {
+ var res = Stream.prototype.on.call(this, ev, fn);
+ var state = this._readableState;
+
+ if (ev === 'data') {
+ // update readableListening so that resume() may be a no-op
+ // a few lines down. This is needed to support once('readable').
+ state.readableListening = this.listenerCount('readable') > 0; // Try start flowing on next tick if stream isn't explicitly paused
+
+ if (state.flowing !== false) this.resume();
+ } else if (ev === 'readable') {
+ if (!state.endEmitted && !state.readableListening) {
+ state.readableListening = state.needReadable = true;
+ state.flowing = false;
+ state.emittedReadable = false;
+ debug('on readable', state.length, state.reading);
+
+ if (state.length) {
+ emitReadable(this);
+ } else if (!state.reading) {
+ process.nextTick(nReadingNextTick, this);
+ }
+ }
+ }
+
+ return res;
+};
+
+Readable.prototype.addListener = Readable.prototype.on;
+
+Readable.prototype.removeListener = function (ev, fn) {
+ var res = Stream.prototype.removeListener.call(this, ev, fn);
+
+ if (ev === 'readable') {
+ // We need to check if there is someone still listening to
+ // readable and reset the state. However this needs to happen
+ // after readable has been emitted but before I/O (nextTick) to
+ // support once('readable', fn) cycles. This means that calling
+ // resume within the same tick will have no
+ // effect.
+ process.nextTick(updateReadableListening, this);
+ }
+
+ return res;
+};
+
+Readable.prototype.removeAllListeners = function (ev) {
+ var res = Stream.prototype.removeAllListeners.apply(this, arguments);
+
+ if (ev === 'readable' || ev === undefined) {
+ // We need to check if there is someone still listening to
+ // readable and reset the state. However this needs to happen
+ // after readable has been emitted but before I/O (nextTick) to
+ // support once('readable', fn) cycles. This means that calling
+ // resume within the same tick will have no
+ // effect.
+ process.nextTick(updateReadableListening, this);
+ }
+
+ return res;
+};
+
+function updateReadableListening(self) {
+ var state = self._readableState;
+ state.readableListening = self.listenerCount('readable') > 0;
+
+ if (state.resumeScheduled && !state.paused) {
+ // flowing needs to be set to true now, otherwise
+ // the upcoming resume will not flow.
+ state.flowing = true; // crude way to check if we should resume
+ } else if (self.listenerCount('data') > 0) {
+ self.resume();
+ }
+}
+
+function nReadingNextTick(self) {
+ debug('readable nexttick read 0');
+ self.read(0);
+} // pause() and resume() are remnants of the legacy readable stream API
+// If the user uses them, then switch into old mode.
+
+
+Readable.prototype.resume = function () {
+ var state = this._readableState;
+
+ if (!state.flowing) {
+ debug('resume'); // we flow only if there is no one listening
+ // for readable, but we still have to call
+ // resume()
+
+ state.flowing = !state.readableListening;
+ resume(this, state);
+ }
+
+ state.paused = false;
+ return this;
+};
+
+function resume(stream, state) {
+ if (!state.resumeScheduled) {
+ state.resumeScheduled = true;
+ process.nextTick(resume_, stream, state);
+ }
+}
+
+function resume_(stream, state) {
+ debug('resume', state.reading);
+
+ if (!state.reading) {
+ stream.read(0);
+ }
+
+ state.resumeScheduled = false;
+ stream.emit('resume');
+ flow(stream);
+ if (state.flowing && !state.reading) stream.read(0);
+}
+
+Readable.prototype.pause = function () {
+ debug('call pause flowing=%j', this._readableState.flowing);
+
+ if (this._readableState.flowing !== false) {
+ debug('pause');
+ this._readableState.flowing = false;
+ this.emit('pause');
+ }
+
+ this._readableState.paused = true;
+ return this;
+};
+
+function flow(stream) {
+ var state = stream._readableState;
+ debug('flow', state.flowing);
+
+ while (state.flowing && stream.read() !== null) {
+ ;
+ }
+} // wrap an old-style stream as the async data source.
+// This is *not* part of the readable stream interface.
+// It is an ugly unfortunate mess of history.
+
+
+Readable.prototype.wrap = function (stream) {
+ var _this = this;
+
+ var state = this._readableState;
+ var paused = false;
+ stream.on('end', function () {
+ debug('wrapped end');
+
+ if (state.decoder && !state.ended) {
+ var chunk = state.decoder.end();
+ if (chunk && chunk.length) _this.push(chunk);
+ }
+
+ _this.push(null);
+ });
+ stream.on('data', function (chunk) {
+ debug('wrapped data');
+ if (state.decoder) chunk = state.decoder.write(chunk); // don't skip over falsy values in objectMode
+
+ if (state.objectMode && (chunk === null || chunk === undefined)) return;else if (!state.objectMode && (!chunk || !chunk.length)) return;
+
+ var ret = _this.push(chunk);
+
+ if (!ret) {
+ paused = true;
+ stream.pause();
+ }
+ }); // proxy all the other methods.
+ // important when wrapping filters and duplexes.
+
+ for (var i in stream) {
+ if (this[i] === undefined && typeof stream[i] === 'function') {
+ this[i] = function methodWrap(method) {
+ return function methodWrapReturnFunction() {
+ return stream[method].apply(stream, arguments);
+ };
+ }(i);
+ }
+ } // proxy certain important events.
+
+
+ for (var n = 0; n < kProxyEvents.length; n++) {
+ stream.on(kProxyEvents[n], this.emit.bind(this, kProxyEvents[n]));
+ } // when we try to consume some more bytes, simply unpause the
+ // underlying stream.
+
+
+ this._read = function (n) {
+ debug('wrapped _read', n);
+
+ if (paused) {
+ paused = false;
+ stream.resume();
+ }
+ };
+
+ return this;
+};
+
+if (typeof Symbol === 'function') {
+ Readable.prototype[Symbol.asyncIterator] = function () {
+ if (createReadableStreamAsyncIterator === undefined) {
+ createReadableStreamAsyncIterator = require('./internal/streams/async_iterator');
+ }
+
+ return createReadableStreamAsyncIterator(this);
+ };
+}
+
+Object.defineProperty(Readable.prototype, 'readableHighWaterMark', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function get() {
+ return this._readableState.highWaterMark;
+ }
+});
+Object.defineProperty(Readable.prototype, 'readableBuffer', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function get() {
+ return this._readableState && this._readableState.buffer;
+ }
+});
+Object.defineProperty(Readable.prototype, 'readableFlowing', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function get() {
+ return this._readableState.flowing;
+ },
+ set: function set(state) {
+ if (this._readableState) {
+ this._readableState.flowing = state;
+ }
+ }
+}); // exposed for testing purposes only.
+
+Readable._fromList = fromList;
+Object.defineProperty(Readable.prototype, 'readableLength', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function get() {
+ return this._readableState.length;
+ }
+}); // Pluck off n bytes from an array of buffers.
+// Length is the combined lengths of all the buffers in the list.
+// This function is designed to be inlinable, so please take care when making
+// changes to the function body.
+
+function fromList(n, state) {
+ // nothing buffered
+ if (state.length === 0) return null;
+ var ret;
+ if (state.objectMode) ret = state.buffer.shift();else if (!n || n >= state.length) {
+ // read it all, truncate the list
+ if (state.decoder) ret = state.buffer.join('');else if (state.buffer.length === 1) ret = state.buffer.first();else ret = state.buffer.concat(state.length);
+ state.buffer.clear();
+ } else {
+ // read part of list
+ ret = state.buffer.consume(n, state.decoder);
+ }
+ return ret;
+}
+
+function endReadable(stream) {
+ var state = stream._readableState;
+ debug('endReadable', state.endEmitted);
+
+ if (!state.endEmitted) {
+ state.ended = true;
+ process.nextTick(endReadableNT, state, stream);
+ }
+}
+
+function endReadableNT(state, stream) {
+ debug('endReadableNT', state.endEmitted, state.length); // Check that we didn't get one last unshift.
+
+ if (!state.endEmitted && state.length === 0) {
+ state.endEmitted = true;
+ stream.readable = false;
+ stream.emit('end');
+
+ if (state.autoDestroy) {
+ // In case of duplex streams we need a way to detect
+ // if the writable side is ready for autoDestroy as well
+ var wState = stream._writableState;
+
+ if (!wState || wState.autoDestroy && wState.finished) {
+ stream.destroy();
+ }
+ }
+ }
+}
+
+if (typeof Symbol === 'function') {
+ Readable.from = function (iterable, opts) {
+ if (from === undefined) {
+ from = require('./internal/streams/from');
+ }
+
+ return from(Readable, iterable, opts);
+ };
+}
+
+function indexOf(xs, x) {
+ for (var i = 0, l = xs.length; i < l; i++) {
+ if (xs[i] === x) return i;
+ }
+
+ return -1;
+}
\ No newline at end of file
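For orientation, the file above implements the push()/read() backpressure contract its comments describe: _read(size) is called whenever the buffered length drops below highWaterMark, and push() returns false once the buffer fills up again. A minimal consumer-side sketch of that contract, assuming the package's top-level Readable export (Node's built-in stream module exposes the same constructor options):

var Readable = require('readable-stream').Readable; // require('stream').Readable behaves the same

// A tiny counting source: _read() fires whenever the buffer drains below
// highWaterMark; pushing null signals EOF.
var count = 0;
var source = new Readable({
  highWaterMark: 4, // bytes; deliberately small so backpressure kicks in early
  read: function () {
    count += 1;
    this.push(count > 10 ? null : String(count));
  }
});

source.on('data', function (chunk) { console.log('chunk', chunk.toString()); });
source.on('end', function () { console.log('done'); });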
diff --git a/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_transform.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_transform.js
new file mode 100644
index 00000000000000..41a738c4e93599
--- /dev/null
+++ b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_transform.js
@@ -0,0 +1,201 @@
+// Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+// a transform stream is a readable/writable stream where you do
+// something with the data. Sometimes it's called a "filter",
+// but that's not a great name for it, since that implies a thing where
+// some bits pass through, and others are simply ignored. (That would
+// be a valid example of a transform, of course.)
+//
+// While the output is causally related to the input, it's not a
+// necessarily symmetric or synchronous transformation. For example,
+// a zlib stream might take multiple plain-text writes(), and then
+// emit a single compressed chunk some time in the future.
+//
+// Here's how this works:
+//
+// The Transform stream has all the aspects of the readable and writable
+// stream classes. When you write(chunk), that calls _write(chunk,cb)
+// internally, and returns false if there's a lot of pending writes
+// buffered up. When you call read(), that calls _read(n) until
+// there's enough pending readable data buffered up.
+//
+// In a transform stream, the written data is placed in a buffer. When
+// _read(n) is called, it transforms the queued up data, calling the
+// buffered _write cb's as it consumes chunks. If consuming a single
+// written chunk would result in multiple output chunks, then the first
+// outputted bit calls the readcb, and subsequent chunks just go into
+// the read buffer, and will cause it to emit 'readable' if necessary.
+//
+// This way, back-pressure is actually determined by the reading side,
+// since _read has to be called to start processing a new chunk. However,
+// a pathological inflate type of transform can cause excessive buffering
+// here. For example, imagine a stream where every byte of input is
+// interpreted as an integer from 0-255, and then results in that many
+// bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in
+// 1kb of data being output. In this case, you could write a very small
+// amount of input, and end up with a very large amount of output. In
+// such a pathological inflating mechanism, there'd be no way to tell
+// the system to stop doing the transform. A single 4MB write could
+// cause the system to run out of memory.
+//
+// However, even in such a pathological case, only a single written chunk
+// would be consumed, and then the rest would wait (un-transformed) until
+// the results of the previous transformed chunk were consumed.
+'use strict';
+
+module.exports = Transform;
+
+var _require$codes = require('../errors').codes,
+ ERR_METHOD_NOT_IMPLEMENTED = _require$codes.ERR_METHOD_NOT_IMPLEMENTED,
+ ERR_MULTIPLE_CALLBACK = _require$codes.ERR_MULTIPLE_CALLBACK,
+ ERR_TRANSFORM_ALREADY_TRANSFORMING = _require$codes.ERR_TRANSFORM_ALREADY_TRANSFORMING,
+ ERR_TRANSFORM_WITH_LENGTH_0 = _require$codes.ERR_TRANSFORM_WITH_LENGTH_0;
+
+var Duplex = require('./_stream_duplex');
+
+require('inherits')(Transform, Duplex);
+
+function afterTransform(er, data) {
+ var ts = this._transformState;
+ ts.transforming = false;
+ var cb = ts.writecb;
+
+ if (cb === null) {
+ return this.emit('error', new ERR_MULTIPLE_CALLBACK());
+ }
+
+ ts.writechunk = null;
+ ts.writecb = null;
+ if (data != null) // single equals check for both `null` and `undefined`
+ this.push(data);
+ cb(er);
+ var rs = this._readableState;
+ rs.reading = false;
+
+ if (rs.needReadable || rs.length < rs.highWaterMark) {
+ this._read(rs.highWaterMark);
+ }
+}
+
+function Transform(options) {
+ if (!(this instanceof Transform)) return new Transform(options);
+ Duplex.call(this, options);
+ this._transformState = {
+ afterTransform: afterTransform.bind(this),
+ needTransform: false,
+ transforming: false,
+ writecb: null,
+ writechunk: null,
+ writeencoding: null
+ }; // start out asking for a readable event once data is transformed.
+
+ this._readableState.needReadable = true; // we have implemented the _read method, and done the other things
+ // that Readable wants before the first _read call, so unset the
+ // sync guard flag.
+
+ this._readableState.sync = false;
+
+ if (options) {
+ if (typeof options.transform === 'function') this._transform = options.transform;
+ if (typeof options.flush === 'function') this._flush = options.flush;
+ } // When the writable side finishes, then flush out anything remaining.
+
+
+ this.on('prefinish', prefinish);
+}
+
+function prefinish() {
+ var _this = this;
+
+ if (typeof this._flush === 'function' && !this._readableState.destroyed) {
+ this._flush(function (er, data) {
+ done(_this, er, data);
+ });
+ } else {
+ done(this, null, null);
+ }
+}
+
+Transform.prototype.push = function (chunk, encoding) {
+ this._transformState.needTransform = false;
+ return Duplex.prototype.push.call(this, chunk, encoding);
+}; // This is the part where you do stuff!
+// override this function in implementation classes.
+// 'chunk' is an input chunk.
+//
+// Call `push(newChunk)` to pass along transformed output
+// to the readable side. You may call 'push' zero or more times.
+//
+// Call `cb(err)` when you are done with this chunk. If you pass
+// an error, then that'll put the hurt on the whole operation. If you
+// never call cb(), then you'll never get another chunk.
+
+
+Transform.prototype._transform = function (chunk, encoding, cb) {
+ cb(new ERR_METHOD_NOT_IMPLEMENTED('_transform()'));
+};
+
+Transform.prototype._write = function (chunk, encoding, cb) {
+ var ts = this._transformState;
+ ts.writecb = cb;
+ ts.writechunk = chunk;
+ ts.writeencoding = encoding;
+
+ if (!ts.transforming) {
+ var rs = this._readableState;
+ if (ts.needTransform || rs.needReadable || rs.length < rs.highWaterMark) this._read(rs.highWaterMark);
+ }
+}; // Doesn't matter what the args are here.
+// _transform does all the work.
+// That we got here means that the readable side wants more data.
+
+
+Transform.prototype._read = function (n) {
+ var ts = this._transformState;
+
+ if (ts.writechunk !== null && !ts.transforming) {
+ ts.transforming = true;
+
+ this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform);
+ } else {
+ // mark that we need a transform, so that any data that comes in
+ // will get processed, now that we've asked for it.
+ ts.needTransform = true;
+ }
+};
+
+Transform.prototype._destroy = function (err, cb) {
+ Duplex.prototype._destroy.call(this, err, function (err2) {
+ cb(err2);
+ });
+};
+
+function done(stream, er, data) {
+ if (er) return stream.emit('error', er);
+ if (data != null) // single equals check for both `null` and `undefined`
+ stream.push(data); // TODO(BridgeAR): Write a test for these two error cases
+ // if there's nothing in the write buffer, then that means
+ // that nothing more will ever be provided
+
+ if (stream._writableState.length) throw new ERR_TRANSFORM_WITH_LENGTH_0();
+ if (stream._transformState.transforming) throw new ERR_TRANSFORM_ALREADY_TRANSFORMING();
+ return stream.push(null);
+}
\ No newline at end of file
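The header comment above walks through the Transform contract: write(chunk) buffers data, _transform(chunk, encoding, cb) consumes it, push() feeds the readable side, and cb() releases the next buffered write. A minimal sketch using the public constructor options, assuming the package's top-level Transform export (Node's built-in stream module behaves the same):

var Transform = require('readable-stream').Transform; // or require('stream').Transform

// Upper-cases each chunk. Calling cb() is what lets the next written chunk
// through; _flush runs after the writable side ends, before 'end' is emitted
// on the readable side.
var upcase = new Transform({
  transform: function (chunk, encoding, cb) {
    this.push(chunk.toString().toUpperCase());
    cb();
  },
  flush: function (cb) {
    this.push('\n');
    cb();
  }
});

process.stdin.pipe(upcase).pipe(process.stdout);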
diff --git a/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_writable.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_writable.js
new file mode 100644
index 00000000000000..a2634d7c24fd5e
--- /dev/null
+++ b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/_stream_writable.js
@@ -0,0 +1,697 @@
+// Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+// A bit simpler than readable streams.
+// Implement an async ._write(chunk, encoding, cb), and it'll handle all
+// the drain event emission and buffering.
+'use strict';
+
+module.exports = Writable;
+/* <replacement> */
+
+function WriteReq(chunk, encoding, cb) {
+ this.chunk = chunk;
+ this.encoding = encoding;
+ this.callback = cb;
+ this.next = null;
+} // It seems like a linked list, but it is not:
+// there will be only 2 of these for each stream
+
+
+function CorkedRequest(state) {
+ var _this = this;
+
+ this.next = null;
+ this.entry = null;
+
+ this.finish = function () {
+ onCorkedFinish(_this, state);
+ };
+}
+/* </replacement> */
+
+/*<replacement>*/
+
+
+var Duplex;
+/*</replacement>*/
+
+Writable.WritableState = WritableState;
+/*<replacement>*/
+
+var internalUtil = {
+ deprecate: require('util-deprecate')
+};
+/*</replacement>*/
+
+/*<replacement>*/
+
+var Stream = require('./internal/streams/stream');
+/*</replacement>*/
+
+
+var Buffer = require('buffer').Buffer;
+
+var OurUint8Array = global.Uint8Array || function () {};
+
+function _uint8ArrayToBuffer(chunk) {
+ return Buffer.from(chunk);
+}
+
+function _isUint8Array(obj) {
+ return Buffer.isBuffer(obj) || obj instanceof OurUint8Array;
+}
+
+var destroyImpl = require('./internal/streams/destroy');
+
+var _require = require('./internal/streams/state'),
+ getHighWaterMark = _require.getHighWaterMark;
+
+var _require$codes = require('../errors').codes,
+ ERR_INVALID_ARG_TYPE = _require$codes.ERR_INVALID_ARG_TYPE,
+ ERR_METHOD_NOT_IMPLEMENTED = _require$codes.ERR_METHOD_NOT_IMPLEMENTED,
+ ERR_MULTIPLE_CALLBACK = _require$codes.ERR_MULTIPLE_CALLBACK,
+ ERR_STREAM_CANNOT_PIPE = _require$codes.ERR_STREAM_CANNOT_PIPE,
+ ERR_STREAM_DESTROYED = _require$codes.ERR_STREAM_DESTROYED,
+ ERR_STREAM_NULL_VALUES = _require$codes.ERR_STREAM_NULL_VALUES,
+ ERR_STREAM_WRITE_AFTER_END = _require$codes.ERR_STREAM_WRITE_AFTER_END,
+ ERR_UNKNOWN_ENCODING = _require$codes.ERR_UNKNOWN_ENCODING;
+
+var errorOrDestroy = destroyImpl.errorOrDestroy;
+
+require('inherits')(Writable, Stream);
+
+function nop() {}
+
+function WritableState(options, stream, isDuplex) {
+ Duplex = Duplex || require('./_stream_duplex');
+ options = options || {}; // Duplex streams are both readable and writable, but share
+ // the same options object.
+ // However, some cases require setting options to different
+ // values for the readable and the writable sides of the duplex stream,
+ // e.g. options.readableObjectMode vs. options.writableObjectMode, etc.
+
+ if (typeof isDuplex !== 'boolean') isDuplex = stream instanceof Duplex; // object stream flag to indicate whether or not this stream
+ // contains buffers or objects.
+
+ this.objectMode = !!options.objectMode;
+ if (isDuplex) this.objectMode = this.objectMode || !!options.writableObjectMode; // the point at which write() starts returning false
+ // Note: 0 is a valid value, means that we always return false if
+ // the entire buffer is not flushed immediately on write()
+
+ this.highWaterMark = getHighWaterMark(this, options, 'writableHighWaterMark', isDuplex); // if _final has been called
+
+ this.finalCalled = false; // drain event flag.
+
+ this.needDrain = false; // at the start of calling end()
+
+ this.ending = false; // when end() has been called, and returned
+
+ this.ended = false; // when 'finish' is emitted
+
+ this.finished = false; // has it been destroyed
+
+ this.destroyed = false; // should we decode strings into buffers before passing to _write?
+ // this is here so that some node-core streams can optimize string
+ // handling at a lower level.
+
+ var noDecode = options.decodeStrings === false;
+ this.decodeStrings = !noDecode; // Crypto is kind of old and crusty. Historically, its default string
+ // encoding is 'binary' so we have to make this configurable.
+ // Everything else in the universe uses 'utf8', though.
+
+ this.defaultEncoding = options.defaultEncoding || 'utf8'; // not an actual buffer we keep track of, but a measurement
+ // of how much we're waiting to get pushed to some underlying
+ // socket or file.
+
+ this.length = 0; // a flag to see when we're in the middle of a write.
+
+ this.writing = false; // when true all writes will be buffered until .uncork() call
+
+ this.corked = 0; // a flag to be able to tell if the onwrite cb is called immediately,
+ // or on a later tick. We set this to true at first, because any
+ // actions that shouldn't happen until "later" should generally also
+ // not happen before the first write call.
+
+ this.sync = true; // a flag to know if we're processing previously buffered items, which
+ // may call the _write() callback in the same tick, so that we don't
+ // end up in an overlapped onwrite situation.
+
+ this.bufferProcessing = false; // the callback that's passed to _write(chunk,cb)
+
+ this.onwrite = function (er) {
+ onwrite(stream, er);
+ }; // the callback that the user supplies to write(chunk,encoding,cb)
+
+
+ this.writecb = null; // the amount that is being written when _write is called.
+
+ this.writelen = 0;
+ this.bufferedRequest = null;
+ this.lastBufferedRequest = null; // number of pending user-supplied write callbacks
+ // this must be 0 before 'finish' can be emitted
+
+ this.pendingcb = 0; // emit prefinish if the only thing we're waiting for is _write cbs
+ // This is relevant for synchronous Transform streams
+
+ this.prefinished = false; // True if the error was already emitted and should not be thrown again
+
+ this.errorEmitted = false; // Should close be emitted on destroy. Defaults to true.
+
+ this.emitClose = options.emitClose !== false; // Should .destroy() be called after 'finish' (and potentially 'end')
+
+ this.autoDestroy = !!options.autoDestroy; // count buffered requests
+
+ this.bufferedRequestCount = 0; // allocate the first CorkedRequest, there is always
+ // one allocated and free to use, and we maintain at most two
+
+ this.corkedRequestsFree = new CorkedRequest(this);
+}
+
+WritableState.prototype.getBuffer = function getBuffer() {
+ var current = this.bufferedRequest;
+ var out = [];
+
+ while (current) {
+ out.push(current);
+ current = current.next;
+ }
+
+ return out;
+};
+
+(function () {
+ try {
+ Object.defineProperty(WritableState.prototype, 'buffer', {
+ get: internalUtil.deprecate(function writableStateBufferGetter() {
+ return this.getBuffer();
+ }, '_writableState.buffer is deprecated. Use _writableState.getBuffer ' + 'instead.', 'DEP0003')
+ });
+ } catch (_) {}
+})(); // Test _writableState for inheritance to account for Duplex streams,
+// whose prototype chain only points to Readable.
+
+
+var realHasInstance;
+
+if (typeof Symbol === 'function' && Symbol.hasInstance && typeof Function.prototype[Symbol.hasInstance] === 'function') {
+ realHasInstance = Function.prototype[Symbol.hasInstance];
+ Object.defineProperty(Writable, Symbol.hasInstance, {
+ value: function value(object) {
+ if (realHasInstance.call(this, object)) return true;
+ if (this !== Writable) return false;
+ return object && object._writableState instanceof WritableState;
+ }
+ });
+} else {
+ realHasInstance = function realHasInstance(object) {
+ return object instanceof this;
+ };
+}
+
+function Writable(options) {
+ Duplex = Duplex || require('./_stream_duplex'); // Writable ctor is applied to Duplexes, too.
+ // `realHasInstance` is necessary because using plain `instanceof`
+ // would return false, as no `_writableState` property is attached.
+ // Trying to use the custom `instanceof` for Writable here will also break the
+ // Node.js LazyTransform implementation, which has a non-trivial getter for
+ // `_writableState` that would lead to infinite recursion.
+ // Checking for a Stream.Duplex instance is faster here instead of inside
+ // the WritableState constructor, at least with V8 6.5
+
+ var isDuplex = this instanceof Duplex;
+ if (!isDuplex && !realHasInstance.call(Writable, this)) return new Writable(options);
+ this._writableState = new WritableState(options, this, isDuplex); // legacy.
+
+ this.writable = true;
+
+ if (options) {
+ if (typeof options.write === 'function') this._write = options.write;
+ if (typeof options.writev === 'function') this._writev = options.writev;
+ if (typeof options.destroy === 'function') this._destroy = options.destroy;
+ if (typeof options.final === 'function') this._final = options.final;
+ }
+
+ Stream.call(this);
+} // Otherwise people can pipe Writable streams, which is just wrong.
+
+
+Writable.prototype.pipe = function () {
+ errorOrDestroy(this, new ERR_STREAM_CANNOT_PIPE());
+};
+
+function writeAfterEnd(stream, cb) {
+ var er = new ERR_STREAM_WRITE_AFTER_END(); // TODO: defer error events consistently everywhere, not just the cb
+
+ errorOrDestroy(stream, er);
+ process.nextTick(cb, er);
+} // Checks that a user-supplied chunk is valid, especially for the particular
+// mode the stream is in. Currently this means that `null` is never accepted
+// and undefined/non-string values are only allowed in object mode.
+
+
+function validChunk(stream, state, chunk, cb) {
+ var er;
+
+ if (chunk === null) {
+ er = new ERR_STREAM_NULL_VALUES();
+ } else if (typeof chunk !== 'string' && !state.objectMode) {
+ er = new ERR_INVALID_ARG_TYPE('chunk', ['string', 'Buffer'], chunk);
+ }
+
+ if (er) {
+ errorOrDestroy(stream, er);
+ process.nextTick(cb, er);
+ return false;
+ }
+
+ return true;
+}
+
+Writable.prototype.write = function (chunk, encoding, cb) {
+ var state = this._writableState;
+ var ret = false;
+
+ var isBuf = !state.objectMode && _isUint8Array(chunk);
+
+ if (isBuf && !Buffer.isBuffer(chunk)) {
+ chunk = _uint8ArrayToBuffer(chunk);
+ }
+
+ if (typeof encoding === 'function') {
+ cb = encoding;
+ encoding = null;
+ }
+
+ if (isBuf) encoding = 'buffer';else if (!encoding) encoding = state.defaultEncoding;
+ if (typeof cb !== 'function') cb = nop;
+ if (state.ending) writeAfterEnd(this, cb);else if (isBuf || validChunk(this, state, chunk, cb)) {
+ state.pendingcb++;
+ ret = writeOrBuffer(this, state, isBuf, chunk, encoding, cb);
+ }
+ return ret;
+};
+
+Writable.prototype.cork = function () {
+ this._writableState.corked++;
+};
+
+Writable.prototype.uncork = function () {
+ var state = this._writableState;
+
+ if (state.corked) {
+ state.corked--;
+ if (!state.writing && !state.corked && !state.bufferProcessing && state.bufferedRequest) clearBuffer(this, state);
+ }
+};
+
+Writable.prototype.setDefaultEncoding = function setDefaultEncoding(encoding) {
+ // node::ParseEncoding() requires lower case.
+ if (typeof encoding === 'string') encoding = encoding.toLowerCase();
+ if (!(['hex', 'utf8', 'utf-8', 'ascii', 'binary', 'base64', 'ucs2', 'ucs-2', 'utf16le', 'utf-16le', 'raw'].indexOf((encoding + '').toLowerCase()) > -1)) throw new ERR_UNKNOWN_ENCODING(encoding);
+ this._writableState.defaultEncoding = encoding;
+ return this;
+};
+
+Object.defineProperty(Writable.prototype, 'writableBuffer', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function get() {
+ return this._writableState && this._writableState.getBuffer();
+ }
+});
+
+function decodeChunk(state, chunk, encoding) {
+ if (!state.objectMode && state.decodeStrings !== false && typeof chunk === 'string') {
+ chunk = Buffer.from(chunk, encoding);
+ }
+
+ return chunk;
+}
+
+Object.defineProperty(Writable.prototype, 'writableHighWaterMark', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function get() {
+ return this._writableState.highWaterMark;
+ }
+}); // if we're already writing something, then just put this
+// in the queue, and wait our turn. Otherwise, call _write
+// If we return false, then we need a drain event, so set that flag.
+
+function writeOrBuffer(stream, state, isBuf, chunk, encoding, cb) {
+ if (!isBuf) {
+ var newChunk = decodeChunk(state, chunk, encoding);
+
+ if (chunk !== newChunk) {
+ isBuf = true;
+ encoding = 'buffer';
+ chunk = newChunk;
+ }
+ }
+
+ var len = state.objectMode ? 1 : chunk.length;
+ state.length += len;
+ var ret = state.length < state.highWaterMark; // we must ensure that previous needDrain will not be reset to false.
+
+ if (!ret) state.needDrain = true;
+
+ if (state.writing || state.corked) {
+ var last = state.lastBufferedRequest;
+ state.lastBufferedRequest = {
+ chunk: chunk,
+ encoding: encoding,
+ isBuf: isBuf,
+ callback: cb,
+ next: null
+ };
+
+ if (last) {
+ last.next = state.lastBufferedRequest;
+ } else {
+ state.bufferedRequest = state.lastBufferedRequest;
+ }
+
+ state.bufferedRequestCount += 1;
+ } else {
+ doWrite(stream, state, false, len, chunk, encoding, cb);
+ }
+
+ return ret;
+}
+
+function doWrite(stream, state, writev, len, chunk, encoding, cb) {
+ state.writelen = len;
+ state.writecb = cb;
+ state.writing = true;
+ state.sync = true;
+ if (state.destroyed) state.onwrite(new ERR_STREAM_DESTROYED('write'));else if (writev) stream._writev(chunk, state.onwrite);else stream._write(chunk, encoding, state.onwrite);
+ state.sync = false;
+}
+
+function onwriteError(stream, state, sync, er, cb) {
+ --state.pendingcb;
+
+ if (sync) {
+ // defer the callback if we are being called synchronously
+ // to avoid piling up things on the stack
+ process.nextTick(cb, er); // this can emit finish, and it will always happen
+ // after error
+
+ process.nextTick(finishMaybe, stream, state);
+ stream._writableState.errorEmitted = true;
+ errorOrDestroy(stream, er);
+ } else {
+ // the caller expect this to happen before if
+ // it is async
+ cb(er);
+ stream._writableState.errorEmitted = true;
+ errorOrDestroy(stream, er); // this can emit finish, but finish must
+ // always follow error
+
+ finishMaybe(stream, state);
+ }
+}
+
+function onwriteStateUpdate(state) {
+ state.writing = false;
+ state.writecb = null;
+ state.length -= state.writelen;
+ state.writelen = 0;
+}
+
+function onwrite(stream, er) {
+ var state = stream._writableState;
+ var sync = state.sync;
+ var cb = state.writecb;
+ if (typeof cb !== 'function') throw new ERR_MULTIPLE_CALLBACK();
+ onwriteStateUpdate(state);
+ if (er) onwriteError(stream, state, sync, er, cb);else {
+ // Check if we're actually ready to finish, but don't emit yet
+ var finished = needFinish(state) || stream.destroyed;
+
+ if (!finished && !state.corked && !state.bufferProcessing && state.bufferedRequest) {
+ clearBuffer(stream, state);
+ }
+
+ if (sync) {
+ process.nextTick(afterWrite, stream, state, finished, cb);
+ } else {
+ afterWrite(stream, state, finished, cb);
+ }
+ }
+}
+
+function afterWrite(stream, state, finished, cb) {
+ if (!finished) onwriteDrain(stream, state);
+ state.pendingcb--;
+ cb();
+ finishMaybe(stream, state);
+} // Must force callback to be called on nextTick, so that we don't
+// emit 'drain' before the write() consumer gets the 'false' return
+// value, and has a chance to attach a 'drain' listener.
+
+
+function onwriteDrain(stream, state) {
+ if (state.length === 0 && state.needDrain) {
+ state.needDrain = false;
+ stream.emit('drain');
+ }
+} // if there's something in the buffer waiting, then process it
+
+
+function clearBuffer(stream, state) {
+ state.bufferProcessing = true;
+ var entry = state.bufferedRequest;
+
+ if (stream._writev && entry && entry.next) {
+ // Fast case, write everything using _writev()
+ var l = state.bufferedRequestCount;
+ var buffer = new Array(l);
+ var holder = state.corkedRequestsFree;
+ holder.entry = entry;
+ var count = 0;
+ var allBuffers = true;
+
+ while (entry) {
+ buffer[count] = entry;
+ if (!entry.isBuf) allBuffers = false;
+ entry = entry.next;
+ count += 1;
+ }
+
+ buffer.allBuffers = allBuffers;
+ doWrite(stream, state, true, state.length, buffer, '', holder.finish); // doWrite is almost always async, defer these to save a bit of time
+ // as the hot path ends with doWrite
+
+ state.pendingcb++;
+ state.lastBufferedRequest = null;
+
+ if (holder.next) {
+ state.corkedRequestsFree = holder.next;
+ holder.next = null;
+ } else {
+ state.corkedRequestsFree = new CorkedRequest(state);
+ }
+
+ state.bufferedRequestCount = 0;
+ } else {
+ // Slow case, write chunks one-by-one
+ while (entry) {
+ var chunk = entry.chunk;
+ var encoding = entry.encoding;
+ var cb = entry.callback;
+ var len = state.objectMode ? 1 : chunk.length;
+ doWrite(stream, state, false, len, chunk, encoding, cb);
+ entry = entry.next;
+ state.bufferedRequestCount--; // if we didn't call the onwrite immediately, then
+ // it means that we need to wait until it does.
+ // also, that means that the chunk and cb are currently
+ // being processed, so move the buffer counter past them.
+
+ if (state.writing) {
+ break;
+ }
+ }
+
+ if (entry === null) state.lastBufferedRequest = null;
+ }
+
+ state.bufferedRequest = entry;
+ state.bufferProcessing = false;
+}
+
+Writable.prototype._write = function (chunk, encoding, cb) {
+ cb(new ERR_METHOD_NOT_IMPLEMENTED('_write()'));
+};
+
+Writable.prototype._writev = null;
+
+Writable.prototype.end = function (chunk, encoding, cb) {
+ var state = this._writableState;
+
+ if (typeof chunk === 'function') {
+ cb = chunk;
+ chunk = null;
+ encoding = null;
+ } else if (typeof encoding === 'function') {
+ cb = encoding;
+ encoding = null;
+ }
+
+ if (chunk !== null && chunk !== undefined) this.write(chunk, encoding); // .end() fully uncorks
+
+ if (state.corked) {
+ state.corked = 1;
+ this.uncork();
+ } // ignore unnecessary end() calls.
+
+
+ if (!state.ending) endWritable(this, state, cb);
+ return this;
+};
+
+Object.defineProperty(Writable.prototype, 'writableLength', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function get() {
+ return this._writableState.length;
+ }
+});
+
+function needFinish(state) {
+ return state.ending && state.length === 0 && state.bufferedRequest === null && !state.finished && !state.writing;
+}
+
+function callFinal(stream, state) {
+ stream._final(function (err) {
+ state.pendingcb--;
+
+ if (err) {
+ errorOrDestroy(stream, err);
+ }
+
+ state.prefinished = true;
+ stream.emit('prefinish');
+ finishMaybe(stream, state);
+ });
+}
+
+function prefinish(stream, state) {
+ if (!state.prefinished && !state.finalCalled) {
+ if (typeof stream._final === 'function' && !state.destroyed) {
+ state.pendingcb++;
+ state.finalCalled = true;
+ process.nextTick(callFinal, stream, state);
+ } else {
+ state.prefinished = true;
+ stream.emit('prefinish');
+ }
+ }
+}
+
+function finishMaybe(stream, state) {
+ var need = needFinish(state);
+
+ if (need) {
+ prefinish(stream, state);
+
+ if (state.pendingcb === 0) {
+ state.finished = true;
+ stream.emit('finish');
+
+ if (state.autoDestroy) {
+ // In case of duplex streams we need a way to detect
+ // if the readable side is ready for autoDestroy as well
+ var rState = stream._readableState;
+
+ if (!rState || rState.autoDestroy && rState.endEmitted) {
+ stream.destroy();
+ }
+ }
+ }
+ }
+
+ return need;
+}
+
+function endWritable(stream, state, cb) {
+ state.ending = true;
+ finishMaybe(stream, state);
+
+ if (cb) {
+ if (state.finished) process.nextTick(cb);else stream.once('finish', cb);
+ }
+
+ state.ended = true;
+ stream.writable = false;
+}
+
+function onCorkedFinish(corkReq, state, err) {
+ var entry = corkReq.entry;
+ corkReq.entry = null;
+
+ while (entry) {
+ var cb = entry.callback;
+ state.pendingcb--;
+ cb(err);
+ entry = entry.next;
+ } // reuse the free corkReq.
+
+
+ state.corkedRequestsFree.next = corkReq;
+}
+
+Object.defineProperty(Writable.prototype, 'destroyed', {
+ // making it explicit this property is not enumerable
+ // because otherwise some prototype manipulation in
+ // userland will fail
+ enumerable: false,
+ get: function get() {
+ if (this._writableState === undefined) {
+ return false;
+ }
+
+ return this._writableState.destroyed;
+ },
+ set: function set(value) {
+ // we ignore the value if the stream
+ // has not been initialized yet
+ if (!this._writableState) {
+ return;
+ } // backward compatibility, the user is explicitly
+ // managing destroyed
+
+
+ this._writableState.destroyed = value;
+ }
+});
+Writable.prototype.destroy = destroyImpl.destroy;
+Writable.prototype._undestroy = destroyImpl.undestroy;
+
+Writable.prototype._destroy = function (err, cb) {
+ cb(err);
+};
\ No newline at end of file
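The writeOrBuffer()/onwriteDrain() pair above implements the backpressure contract consumers rely on: write() returns false once the buffered length reaches the high-water mark, and 'drain' fires when the buffer empties again. A minimal consumer-side sketch against the public stream API (the chunk count and high-water mark are arbitrary):

```js
const { Writable } = require('stream')

// Illustrative slow sink that acknowledges each chunk on the next tick.
const sink = new Writable({
  highWaterMark: 4,
  write (chunk, encoding, callback) {
    setImmediate(callback)
  },
})

let i = 0
function writeSome () {
  while (i < 100) {
    // writeOrBuffer() returns false once the buffered length reaches
    // the high-water mark ...
    const ok = sink.write(`chunk ${i++}\n`)
    if (!ok) {
      // ... so pause and resume when onwriteDrain() emits 'drain'.
      sink.once('drain', writeSome)
      return
    }
  }
  sink.end()
}

writeSome()
```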
diff --git a/deps/npm/node_modules/readable-stream/lib/internal/streams/async_iterator.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/async_iterator.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/lib/internal/streams/async_iterator.js
rename to deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/async_iterator.js
diff --git a/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/buffer_list.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/buffer_list.js
new file mode 100644
index 00000000000000..cdea425f19dd96
--- /dev/null
+++ b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/buffer_list.js
@@ -0,0 +1,210 @@
+'use strict';
+
+function ownKeys(object, enumerableOnly) { var keys = Object.keys(object); if (Object.getOwnPropertySymbols) { var symbols = Object.getOwnPropertySymbols(object); if (enumerableOnly) symbols = symbols.filter(function (sym) { return Object.getOwnPropertyDescriptor(object, sym).enumerable; }); keys.push.apply(keys, symbols); } return keys; }
+
+function _objectSpread(target) { for (var i = 1; i < arguments.length; i++) { var source = arguments[i] != null ? arguments[i] : {}; if (i % 2) { ownKeys(Object(source), true).forEach(function (key) { _defineProperty(target, key, source[key]); }); } else if (Object.getOwnPropertyDescriptors) { Object.defineProperties(target, Object.getOwnPropertyDescriptors(source)); } else { ownKeys(Object(source)).forEach(function (key) { Object.defineProperty(target, key, Object.getOwnPropertyDescriptor(source, key)); }); } } return target; }
+
+function _defineProperty(obj, key, value) { if (key in obj) { Object.defineProperty(obj, key, { value: value, enumerable: true, configurable: true, writable: true }); } else { obj[key] = value; } return obj; }
+
+function _classCallCheck(instance, Constructor) { if (!(instance instanceof Constructor)) { throw new TypeError("Cannot call a class as a function"); } }
+
+function _defineProperties(target, props) { for (var i = 0; i < props.length; i++) { var descriptor = props[i]; descriptor.enumerable = descriptor.enumerable || false; descriptor.configurable = true; if ("value" in descriptor) descriptor.writable = true; Object.defineProperty(target, descriptor.key, descriptor); } }
+
+function _createClass(Constructor, protoProps, staticProps) { if (protoProps) _defineProperties(Constructor.prototype, protoProps); if (staticProps) _defineProperties(Constructor, staticProps); return Constructor; }
+
+var _require = require('buffer'),
+ Buffer = _require.Buffer;
+
+var _require2 = require('util'),
+ inspect = _require2.inspect;
+
+var custom = inspect && inspect.custom || 'inspect';
+
+function copyBuffer(src, target, offset) {
+ Buffer.prototype.copy.call(src, target, offset);
+}
+
+module.exports =
+/*#__PURE__*/
+function () {
+ function BufferList() {
+ _classCallCheck(this, BufferList);
+
+ this.head = null;
+ this.tail = null;
+ this.length = 0;
+ }
+
+ _createClass(BufferList, [{
+ key: "push",
+ value: function push(v) {
+ var entry = {
+ data: v,
+ next: null
+ };
+ if (this.length > 0) this.tail.next = entry;else this.head = entry;
+ this.tail = entry;
+ ++this.length;
+ }
+ }, {
+ key: "unshift",
+ value: function unshift(v) {
+ var entry = {
+ data: v,
+ next: this.head
+ };
+ if (this.length === 0) this.tail = entry;
+ this.head = entry;
+ ++this.length;
+ }
+ }, {
+ key: "shift",
+ value: function shift() {
+ if (this.length === 0) return;
+ var ret = this.head.data;
+ if (this.length === 1) this.head = this.tail = null;else this.head = this.head.next;
+ --this.length;
+ return ret;
+ }
+ }, {
+ key: "clear",
+ value: function clear() {
+ this.head = this.tail = null;
+ this.length = 0;
+ }
+ }, {
+ key: "join",
+ value: function join(s) {
+ if (this.length === 0) return '';
+ var p = this.head;
+ var ret = '' + p.data;
+
+ while (p = p.next) {
+ ret += s + p.data;
+ }
+
+ return ret;
+ }
+ }, {
+ key: "concat",
+ value: function concat(n) {
+ if (this.length === 0) return Buffer.alloc(0);
+ var ret = Buffer.allocUnsafe(n >>> 0);
+ var p = this.head;
+ var i = 0;
+
+ while (p) {
+ copyBuffer(p.data, ret, i);
+ i += p.data.length;
+ p = p.next;
+ }
+
+ return ret;
+ } // Consumes a specified amount of bytes or characters from the buffered data.
+
+ }, {
+ key: "consume",
+ value: function consume(n, hasStrings) {
+ var ret;
+
+ if (n < this.head.data.length) {
+ // `slice` is the same for buffers and strings.
+ ret = this.head.data.slice(0, n);
+ this.head.data = this.head.data.slice(n);
+ } else if (n === this.head.data.length) {
+ // First chunk is a perfect match.
+ ret = this.shift();
+ } else {
+ // Result spans more than one buffer.
+ ret = hasStrings ? this._getString(n) : this._getBuffer(n);
+ }
+
+ return ret;
+ }
+ }, {
+ key: "first",
+ value: function first() {
+ return this.head.data;
+ } // Consumes a specified amount of characters from the buffered data.
+
+ }, {
+ key: "_getString",
+ value: function _getString(n) {
+ var p = this.head;
+ var c = 1;
+ var ret = p.data;
+ n -= ret.length;
+
+ while (p = p.next) {
+ var str = p.data;
+ var nb = n > str.length ? str.length : n;
+ if (nb === str.length) ret += str;else ret += str.slice(0, n);
+ n -= nb;
+
+ if (n === 0) {
+ if (nb === str.length) {
+ ++c;
+ if (p.next) this.head = p.next;else this.head = this.tail = null;
+ } else {
+ this.head = p;
+ p.data = str.slice(nb);
+ }
+
+ break;
+ }
+
+ ++c;
+ }
+
+ this.length -= c;
+ return ret;
+ } // Consumes a specified amount of bytes from the buffered data.
+
+ }, {
+ key: "_getBuffer",
+ value: function _getBuffer(n) {
+ var ret = Buffer.allocUnsafe(n);
+ var p = this.head;
+ var c = 1;
+ p.data.copy(ret);
+ n -= p.data.length;
+
+ while (p = p.next) {
+ var buf = p.data;
+ var nb = n > buf.length ? buf.length : n;
+ buf.copy(ret, ret.length - n, 0, nb);
+ n -= nb;
+
+ if (n === 0) {
+ if (nb === buf.length) {
+ ++c;
+ if (p.next) this.head = p.next;else this.head = this.tail = null;
+ } else {
+ this.head = p;
+ p.data = buf.slice(nb);
+ }
+
+ break;
+ }
+
+ ++c;
+ }
+
+ this.length -= c;
+ return ret;
+ } // Make sure the linked list only shows the minimal necessary information.
+
+ }, {
+ key: custom,
+ value: function value(_, options) {
+ return inspect(this, _objectSpread({}, options, {
+ // Only inspect one level.
+ depth: 0,
+ // It should not recurse.
+ customInspect: false
+ }));
+ }
+ }]);
+
+ return BufferList;
+}();
\ No newline at end of file
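BufferList above is the linked list the readable side uses to queue incoming chunks. A small sketch of how push(), concat(), and consume() interact; requiring the internal module path directly is only for illustration and assumes readable-stream@3 is installed:

```js
// Internal module: not part of the public API, used here only to illustrate.
const BufferList = require('readable-stream/lib/internal/streams/buffer_list')

const list = new BufferList()
list.push(Buffer.from('hello '))
list.push(Buffer.from('world'))

console.log(list.length)                 // 2 (entries, not bytes)
console.log(list.concat(11).toString())  // 'hello world' (does not mutate)

// consume() takes n bytes off the front, spanning entries when needed.
const head = list.consume(7, false)
console.log(head.toString())             // 'hello w'
console.log(list.length)                 // 1 entry remains ('orld')
```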
diff --git a/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/destroy.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/destroy.js
new file mode 100644
index 00000000000000..3268a16f3b6f23
--- /dev/null
+++ b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/destroy.js
@@ -0,0 +1,105 @@
+'use strict'; // undocumented cb() API, needed for core, not for public API
+
+function destroy(err, cb) {
+ var _this = this;
+
+ var readableDestroyed = this._readableState && this._readableState.destroyed;
+ var writableDestroyed = this._writableState && this._writableState.destroyed;
+
+ if (readableDestroyed || writableDestroyed) {
+ if (cb) {
+ cb(err);
+ } else if (err) {
+ if (!this._writableState) {
+ process.nextTick(emitErrorNT, this, err);
+ } else if (!this._writableState.errorEmitted) {
+ this._writableState.errorEmitted = true;
+ process.nextTick(emitErrorNT, this, err);
+ }
+ }
+
+ return this;
+ } // we set destroyed to true before firing error callbacks in order
+ // to make it re-entrance safe in case destroy() is called within callbacks
+
+
+ if (this._readableState) {
+ this._readableState.destroyed = true;
+ } // if this is a duplex stream mark the writable part as destroyed as well
+
+
+ if (this._writableState) {
+ this._writableState.destroyed = true;
+ }
+
+ this._destroy(err || null, function (err) {
+ if (!cb && err) {
+ if (!_this._writableState) {
+ process.nextTick(emitErrorAndCloseNT, _this, err);
+ } else if (!_this._writableState.errorEmitted) {
+ _this._writableState.errorEmitted = true;
+ process.nextTick(emitErrorAndCloseNT, _this, err);
+ } else {
+ process.nextTick(emitCloseNT, _this);
+ }
+ } else if (cb) {
+ process.nextTick(emitCloseNT, _this);
+ cb(err);
+ } else {
+ process.nextTick(emitCloseNT, _this);
+ }
+ });
+
+ return this;
+}
+
+function emitErrorAndCloseNT(self, err) {
+ emitErrorNT(self, err);
+ emitCloseNT(self);
+}
+
+function emitCloseNT(self) {
+ if (self._writableState && !self._writableState.emitClose) return;
+ if (self._readableState && !self._readableState.emitClose) return;
+ self.emit('close');
+}
+
+function undestroy() {
+ if (this._readableState) {
+ this._readableState.destroyed = false;
+ this._readableState.reading = false;
+ this._readableState.ended = false;
+ this._readableState.endEmitted = false;
+ }
+
+ if (this._writableState) {
+ this._writableState.destroyed = false;
+ this._writableState.ended = false;
+ this._writableState.ending = false;
+ this._writableState.finalCalled = false;
+ this._writableState.prefinished = false;
+ this._writableState.finished = false;
+ this._writableState.errorEmitted = false;
+ }
+}
+
+function emitErrorNT(self, err) {
+ self.emit('error', err);
+}
+
+function errorOrDestroy(stream, err) {
+ // We have tests that rely on errors being emitted
+ // in the same tick, so changing this is semver major.
+ // For now when you opt-in to autoDestroy we allow
+ // the error to be emitted nextTick. In a future
+ // semver major update we should change the default to this.
+ var rState = stream._readableState;
+ var wState = stream._writableState;
+ if (rState && rState.autoDestroy || wState && wState.autoDestroy) stream.destroy(err);else stream.emit('error', err);
+}
+
+module.exports = {
+ destroy: destroy,
+ undestroy: undestroy,
+ errorOrDestroy: errorOrDestroy
+};
\ No newline at end of file
diff --git a/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/end-of-stream.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/end-of-stream.js
new file mode 100644
index 00000000000000..831f286d98fa95
--- /dev/null
+++ b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/end-of-stream.js
@@ -0,0 +1,104 @@
+// Ported from https://github.com/mafintosh/end-of-stream with
+// permission from the author, Mathias Buus (@mafintosh).
+'use strict';
+
+var ERR_STREAM_PREMATURE_CLOSE = require('../../../errors').codes.ERR_STREAM_PREMATURE_CLOSE;
+
+function once(callback) {
+ var called = false;
+ return function () {
+ if (called) return;
+ called = true;
+
+ for (var _len = arguments.length, args = new Array(_len), _key = 0; _key < _len; _key++) {
+ args[_key] = arguments[_key];
+ }
+
+ callback.apply(this, args);
+ };
+}
+
+function noop() {}
+
+function isRequest(stream) {
+ return stream.setHeader && typeof stream.abort === 'function';
+}
+
+function eos(stream, opts, callback) {
+ if (typeof opts === 'function') return eos(stream, null, opts);
+ if (!opts) opts = {};
+ callback = once(callback || noop);
+ var readable = opts.readable || opts.readable !== false && stream.readable;
+ var writable = opts.writable || opts.writable !== false && stream.writable;
+
+ var onlegacyfinish = function onlegacyfinish() {
+ if (!stream.writable) onfinish();
+ };
+
+ var writableEnded = stream._writableState && stream._writableState.finished;
+
+ var onfinish = function onfinish() {
+ writable = false;
+ writableEnded = true;
+ if (!readable) callback.call(stream);
+ };
+
+ var readableEnded = stream._readableState && stream._readableState.endEmitted;
+
+ var onend = function onend() {
+ readable = false;
+ readableEnded = true;
+ if (!writable) callback.call(stream);
+ };
+
+ var onerror = function onerror(err) {
+ callback.call(stream, err);
+ };
+
+ var onclose = function onclose() {
+ var err;
+
+ if (readable && !readableEnded) {
+ if (!stream._readableState || !stream._readableState.ended) err = new ERR_STREAM_PREMATURE_CLOSE();
+ return callback.call(stream, err);
+ }
+
+ if (writable && !writableEnded) {
+ if (!stream._writableState || !stream._writableState.ended) err = new ERR_STREAM_PREMATURE_CLOSE();
+ return callback.call(stream, err);
+ }
+ };
+
+ var onrequest = function onrequest() {
+ stream.req.on('finish', onfinish);
+ };
+
+ if (isRequest(stream)) {
+ stream.on('complete', onfinish);
+ stream.on('abort', onclose);
+ if (stream.req) onrequest();else stream.on('request', onrequest);
+ } else if (writable && !stream._writableState) {
+ // legacy streams
+ stream.on('end', onlegacyfinish);
+ stream.on('close', onlegacyfinish);
+ }
+
+ stream.on('end', onend);
+ stream.on('finish', onfinish);
+ if (opts.error !== false) stream.on('error', onerror);
+ stream.on('close', onclose);
+ return function () {
+ stream.removeListener('complete', onfinish);
+ stream.removeListener('abort', onclose);
+ stream.removeListener('request', onrequest);
+ if (stream.req) stream.req.removeListener('finish', onfinish);
+ stream.removeListener('end', onlegacyfinish);
+ stream.removeListener('close', onlegacyfinish);
+ stream.removeListener('finish', onfinish);
+ stream.removeListener('end', onend);
+ stream.removeListener('error', onerror);
+ stream.removeListener('close', onclose);
+ };
+}
+
+module.exports = eos;
\ No newline at end of file
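This is the user-land port of the end-of-stream helper that Node core exposes as stream.finished: it attaches the listeners above and returns a function that removes them. A sketch using the core equivalent (the file path is arbitrary):

```js
const fs = require('fs')
const { finished } = require('stream') // core equivalent of eos()

const rs = fs.createReadStream('/etc/hosts')

// eos(stream, opts, callback): callback fires once on 'end'/'finish',
// or with ERR_STREAM_PREMATURE_CLOSE if the stream closes early.
const cleanup = finished(rs, { readable: true, writable: false }, (err) => {
  if (err) {
    console.error('stream failed or closed early:', err.message)
  } else {
    console.log('stream ended cleanly')
  }
  // The returned function removes every listener that was attached.
  cleanup()
})

rs.resume() // drain the data so 'end' can fire
```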
diff --git a/deps/npm/node_modules/readable-stream/lib/internal/streams/from-browser.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/from-browser.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/lib/internal/streams/from-browser.js
rename to deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/from-browser.js
diff --git a/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/from.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/from.js
new file mode 100644
index 00000000000000..6c41284416799c
--- /dev/null
+++ b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/from.js
@@ -0,0 +1,64 @@
+'use strict';
+
+function asyncGeneratorStep(gen, resolve, reject, _next, _throw, key, arg) { try { var info = gen[key](arg); var value = info.value; } catch (error) { reject(error); return; } if (info.done) { resolve(value); } else { Promise.resolve(value).then(_next, _throw); } }
+
+function _asyncToGenerator(fn) { return function () { var self = this, args = arguments; return new Promise(function (resolve, reject) { var gen = fn.apply(self, args); function _next(value) { asyncGeneratorStep(gen, resolve, reject, _next, _throw, "next", value); } function _throw(err) { asyncGeneratorStep(gen, resolve, reject, _next, _throw, "throw", err); } _next(undefined); }); }; }
+
+function ownKeys(object, enumerableOnly) { var keys = Object.keys(object); if (Object.getOwnPropertySymbols) { var symbols = Object.getOwnPropertySymbols(object); if (enumerableOnly) symbols = symbols.filter(function (sym) { return Object.getOwnPropertyDescriptor(object, sym).enumerable; }); keys.push.apply(keys, symbols); } return keys; }
+
+function _objectSpread(target) { for (var i = 1; i < arguments.length; i++) { var source = arguments[i] != null ? arguments[i] : {}; if (i % 2) { ownKeys(Object(source), true).forEach(function (key) { _defineProperty(target, key, source[key]); }); } else if (Object.getOwnPropertyDescriptors) { Object.defineProperties(target, Object.getOwnPropertyDescriptors(source)); } else { ownKeys(Object(source)).forEach(function (key) { Object.defineProperty(target, key, Object.getOwnPropertyDescriptor(source, key)); }); } } return target; }
+
+function _defineProperty(obj, key, value) { if (key in obj) { Object.defineProperty(obj, key, { value: value, enumerable: true, configurable: true, writable: true }); } else { obj[key] = value; } return obj; }
+
+var ERR_INVALID_ARG_TYPE = require('../../../errors').codes.ERR_INVALID_ARG_TYPE;
+
+function from(Readable, iterable, opts) {
+ var iterator;
+
+ if (iterable && typeof iterable.next === 'function') {
+ iterator = iterable;
+ } else if (iterable && iterable[Symbol.asyncIterator]) iterator = iterable[Symbol.asyncIterator]();else if (iterable && iterable[Symbol.iterator]) iterator = iterable[Symbol.iterator]();else throw new ERR_INVALID_ARG_TYPE('iterable', ['Iterable'], iterable);
+
+ var readable = new Readable(_objectSpread({
+ objectMode: true
+ }, opts)); // Reading boolean to protect against _read
+ // being called before last iteration completion.
+
+ var reading = false;
+
+ readable._read = function () {
+ if (!reading) {
+ reading = true;
+ next();
+ }
+ };
+
+ function next() {
+ return _next2.apply(this, arguments);
+ }
+
+ function _next2() {
+ _next2 = _asyncToGenerator(function* () {
+ try {
+ var _ref = yield iterator.next(),
+ value = _ref.value,
+ done = _ref.done;
+
+ if (done) {
+ readable.push(null);
+ } else if (readable.push((yield value))) {
+ next();
+ } else {
+ reading = false;
+ }
+ } catch (err) {
+ readable.destroy(err);
+ }
+ });
+ return _next2.apply(this, arguments);
+ }
+
+ return readable;
+}
+
+module.exports = from;
\ No newline at end of file
diff --git a/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/pipeline.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/pipeline.js
new file mode 100644
index 00000000000000..6589909889c585
--- /dev/null
+++ b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/pipeline.js
@@ -0,0 +1,97 @@
+// Ported from https://github.com/mafintosh/pump with
+// permission from the author, Mathias Buus (@mafintosh).
+'use strict';
+
+var eos;
+
+function once(callback) {
+ var called = false;
+ return function () {
+ if (called) return;
+ called = true;
+ callback.apply(void 0, arguments);
+ };
+}
+
+var _require$codes = require('../../../errors').codes,
+ ERR_MISSING_ARGS = _require$codes.ERR_MISSING_ARGS,
+ ERR_STREAM_DESTROYED = _require$codes.ERR_STREAM_DESTROYED;
+
+function noop(err) {
+ // Rethrow the error if it exists to avoid swallowing it
+ if (err) throw err;
+}
+
+function isRequest(stream) {
+ return stream.setHeader && typeof stream.abort === 'function';
+}
+
+function destroyer(stream, reading, writing, callback) {
+ callback = once(callback);
+ var closed = false;
+ stream.on('close', function () {
+ closed = true;
+ });
+ if (eos === undefined) eos = require('./end-of-stream');
+ eos(stream, {
+ readable: reading,
+ writable: writing
+ }, function (err) {
+ if (err) return callback(err);
+ closed = true;
+ callback();
+ });
+ var destroyed = false;
+ return function (err) {
+ if (closed) return;
+ if (destroyed) return;
+ destroyed = true; // request.destroy just do .end - .abort is what we want
+
+ if (isRequest(stream)) return stream.abort();
+ if (typeof stream.destroy === 'function') return stream.destroy();
+ callback(err || new ERR_STREAM_DESTROYED('pipe'));
+ };
+}
+
+function call(fn) {
+ fn();
+}
+
+function pipe(from, to) {
+ return from.pipe(to);
+}
+
+function popCallback(streams) {
+ if (!streams.length) return noop;
+ if (typeof streams[streams.length - 1] !== 'function') return noop;
+ return streams.pop();
+}
+
+function pipeline() {
+ for (var _len = arguments.length, streams = new Array(_len), _key = 0; _key < _len; _key++) {
+ streams[_key] = arguments[_key];
+ }
+
+ var callback = popCallback(streams);
+ if (Array.isArray(streams[0])) streams = streams[0];
+
+ if (streams.length < 2) {
+ throw new ERR_MISSING_ARGS('streams');
+ }
+
+ var error;
+ var destroys = streams.map(function (stream, i) {
+ var reading = i < streams.length - 1;
+ var writing = i > 0;
+ return destroyer(stream, reading, writing, function (err) {
+ if (!error) error = err;
+ if (err) destroys.forEach(call);
+ if (reading) return;
+ destroys.forEach(call);
+ callback(error);
+ });
+ });
+ return streams.reduce(pipe);
+}
+
+module.exports = pipeline;
\ No newline at end of file
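pipeline() above is the user-land port of stream.pipeline from core: each stream is piped into the next, and every stream is destroyed if any of them errors or closes early. A sketch using the core equivalent (file names are illustrative):

```js
const fs = require('fs')
const zlib = require('zlib')
const { pipeline } = require('stream') // core equivalent of the port above

// pipeline(...streams, callback): pipes each stream into the next and
// tears all of them down if any one errors or closes prematurely.
pipeline(
  fs.createReadStream('archive.tar'),
  zlib.createGzip(),
  fs.createWriteStream('archive.tar.gz'),
  (err) => {
    if (err) {
      console.error('pipeline failed:', err.message)
    } else {
      console.log('pipeline succeeded')
    }
  }
)
```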
diff --git a/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/state.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/state.js
new file mode 100644
index 00000000000000..19887eb8a9070e
--- /dev/null
+++ b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/state.js
@@ -0,0 +1,27 @@
+'use strict';
+
+var ERR_INVALID_OPT_VALUE = require('../../../errors').codes.ERR_INVALID_OPT_VALUE;
+
+function highWaterMarkFrom(options, isDuplex, duplexKey) {
+ return options.highWaterMark != null ? options.highWaterMark : isDuplex ? options[duplexKey] : null;
+}
+
+function getHighWaterMark(state, options, duplexKey, isDuplex) {
+ var hwm = highWaterMarkFrom(options, isDuplex, duplexKey);
+
+ if (hwm != null) {
+ if (!(isFinite(hwm) && Math.floor(hwm) === hwm) || hwm < 0) {
+ var name = isDuplex ? duplexKey : 'highWaterMark';
+ throw new ERR_INVALID_OPT_VALUE(name, hwm);
+ }
+
+ return Math.floor(hwm);
+ } // Default value
+
+
+ return state.objectMode ? 16 : 16 * 1024;
+}
+
+module.exports = {
+ getHighWaterMark: getHighWaterMark
+};
\ No newline at end of file
diff --git a/deps/npm/node_modules/readable-stream/lib/internal/streams/stream-browser.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/stream-browser.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/lib/internal/streams/stream-browser.js
rename to deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/stream-browser.js
diff --git a/deps/npm/node_modules/readable-stream/lib/internal/streams/stream.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/stream.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/lib/internal/streams/stream.js
rename to deps/npm/node_modules/node-gyp/node_modules/readable-stream/lib/internal/streams/stream.js
diff --git a/deps/npm/node_modules/node-gyp/node_modules/readable-stream/package.json b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/package.json
new file mode 100644
index 00000000000000..0b0c4bd207ace3
--- /dev/null
+++ b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/package.json
@@ -0,0 +1,68 @@
+{
+ "name": "readable-stream",
+ "version": "3.6.0",
+ "description": "Streams3, a user-land copy of the stream library from Node.js",
+ "main": "readable.js",
+ "engines": {
+ "node": ">= 6"
+ },
+ "dependencies": {
+ "inherits": "^2.0.3",
+ "string_decoder": "^1.1.1",
+ "util-deprecate": "^1.0.1"
+ },
+ "devDependencies": {
+ "@babel/cli": "^7.2.0",
+ "@babel/core": "^7.2.0",
+ "@babel/polyfill": "^7.0.0",
+ "@babel/preset-env": "^7.2.0",
+ "airtap": "0.0.9",
+ "assert": "^1.4.0",
+ "bl": "^2.0.0",
+ "deep-strict-equal": "^0.2.0",
+ "events.once": "^2.0.2",
+ "glob": "^7.1.2",
+ "gunzip-maybe": "^1.4.1",
+ "hyperquest": "^2.1.3",
+ "lolex": "^2.6.0",
+ "nyc": "^11.0.0",
+ "pump": "^3.0.0",
+ "rimraf": "^2.6.2",
+ "tap": "^12.0.0",
+ "tape": "^4.9.0",
+ "tar-fs": "^1.16.2",
+ "util-promisify": "^2.1.0"
+ },
+ "scripts": {
+ "test": "tap -J --no-esm test/parallel/*.js test/ours/*.js",
+ "ci": "TAP=1 tap --no-esm test/parallel/*.js test/ours/*.js | tee test.tap",
+ "test-browsers": "airtap --sauce-connect --loopback airtap.local -- test/browser.js",
+ "test-browser-local": "airtap --open --local -- test/browser.js",
+ "cover": "nyc npm test",
+ "report": "nyc report --reporter=lcov",
+ "update-browser-errors": "babel -o errors-browser.js errors.js"
+ },
+ "repository": {
+ "type": "git",
+ "url": "git://github.com/nodejs/readable-stream"
+ },
+ "keywords": [
+ "readable",
+ "stream",
+ "pipe"
+ ],
+ "browser": {
+ "util": false,
+ "worker_threads": false,
+ "./errors": "./errors-browser.js",
+ "./readable.js": "./readable-browser.js",
+ "./lib/internal/streams/from.js": "./lib/internal/streams/from-browser.js",
+ "./lib/internal/streams/stream.js": "./lib/internal/streams/stream-browser.js"
+ },
+ "nyc": {
+ "include": [
+ "lib/**.js"
+ ]
+ },
+ "license": "MIT"
+}
diff --git a/deps/npm/node_modules/readable-stream/readable-browser.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/readable-browser.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/readable-browser.js
rename to deps/npm/node_modules/node-gyp/node_modules/readable-stream/readable-browser.js
diff --git a/deps/npm/node_modules/readable-stream/readable.js b/deps/npm/node_modules/node-gyp/node_modules/readable-stream/readable.js
similarity index 100%
rename from deps/npm/node_modules/readable-stream/readable.js
rename to deps/npm/node_modules/node-gyp/node_modules/readable-stream/readable.js
diff --git a/deps/npm/node_modules/node-gyp/package.json b/deps/npm/node_modules/node-gyp/package.json
index 932e8cb3b35553..f95ebeadecd70d 100644
--- a/deps/npm/node_modules/node-gyp/package.json
+++ b/deps/npm/node_modules/node-gyp/package.json
@@ -11,7 +11,7 @@
"bindings",
"gyp"
],
- "version": "9.3.0",
+ "version": "9.3.1",
"installVersion": 9,
"author": "Nathan Rajlich (http://tootallnate.net)",
"repository": {
@@ -34,7 +34,7 @@
"which": "^2.0.2"
},
"engines": {
- "node": "^12.22 || ^14.13 || >=16"
+ "node": "^12.13 || ^14.13 || >=16"
},
"devDependencies": {
"bindings": "^1.5.0",
diff --git a/deps/npm/node_modules/npm-user-validate/npm-user-validate.js b/deps/npm/node_modules/npm-user-validate/lib/index.js
similarity index 90%
rename from deps/npm/node_modules/npm-user-validate/npm-user-validate.js
rename to deps/npm/node_modules/npm-user-validate/lib/index.js
index ffd8791c7eb95d..379a31d2720e39 100644
--- a/deps/npm/node_modules/npm-user-validate/npm-user-validate.js
+++ b/deps/npm/node_modules/npm-user-validate/lib/index.js
@@ -7,17 +7,17 @@ var requirements = exports.requirements = {
lowerCase: 'Name must be lowercase',
urlSafe: 'Name may not contain non-url-safe chars',
dot: 'Name may not start with "."',
- illegal: 'Name may not contain illegal character'
+ illegal: 'Name may not contain illegal character',
},
password: {},
email: {
length: 'Email length must be less then or equal to 254 characters long',
- valid: 'Email must be an email address'
- }
+ valid: 'Email must be an email address',
+ },
}
var illegalCharacterRe = new RegExp('([' + [
- "'"
+ "'",
].join() + '])')
function username (un) {
@@ -56,6 +56,6 @@ function email (em) {
return null
}
-function pw (pw) {
+function pw () {
return null
}
diff --git a/deps/npm/node_modules/npm-user-validate/package.json b/deps/npm/node_modules/npm-user-validate/package.json
index ffcf1be7f5c62b..8cf48f80f86a8e 100644
--- a/deps/npm/node_modules/npm-user-validate/package.json
+++ b/deps/npm/node_modules/npm-user-validate/package.json
@@ -1,29 +1,48 @@
{
"name": "npm-user-validate",
- "version": "1.0.1",
+ "version": "2.0.0",
"description": "User validations for npm",
- "main": "npm-user-validate.js",
+ "main": "lib/index.js",
"devDependencies": {
- "standard": "^8.4.0",
- "standard-version": "^3.0.0",
- "tap": "^7.1.2"
+ "@npmcli/eslint-config": "^4.0.1",
+ "@npmcli/template-oss": "4.11.0",
+ "tap": "^16.3.2"
},
"scripts": {
- "pretest": "standard",
- "test": "tap --100 test/*.js"
+ "test": "tap",
+ "lint": "eslint \"**/*.js\"",
+ "postlint": "template-oss-check",
+ "template-oss-apply": "template-oss-apply --force",
+ "lintfix": "npm run lint -- --fix",
+ "snap": "tap",
+ "posttest": "npm run lint"
},
"repository": {
"type": "git",
- "url": "git://github.com/npm/npm-user-validate.git"
+ "url": "https://github.com/npm/npm-user-validate.git"
},
"keywords": [
"npm",
"validation",
"registry"
],
- "author": "Robert Kowalski ",
+ "author": "GitHub Inc.",
"license": "BSD-2-Clause",
"files": [
- "npm-user-validate.js"
- ]
+ "bin/",
+ "lib/"
+ ],
+ "engines": {
+ "node": "^14.17.0 || ^16.13.0 || >=18.0.0"
+ },
+ "templateOSS": {
+ "//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
+ "version": "4.11.0"
+ },
+ "tap": {
+ "nyc-arg": [
+ "--exclude",
+ "tap-snapshots/**"
+ ]
+ }
}
diff --git a/deps/npm/node_modules/pacote/README.md b/deps/npm/node_modules/pacote/README.md
index 75581c857e43dc..64480b254239a1 100644
--- a/deps/npm/node_modules/pacote/README.md
+++ b/deps/npm/node_modules/pacote/README.md
@@ -172,7 +172,9 @@ resolved, and other properties, as they are determined.
integrity signature of a manifest, if present. There must be a
configured `_keys` entry in the config that is scoped to the
registry the manifest is being fetched from.
-
+* `verifyAttestations` A boolean that will make pacote verify Sigstore
+ attestations, if present. There must be a configured `_keys` entry in the
+ config that is scoped to the registry the manifest is being fetched from.
### Advanced API
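As a rough illustration of the README text above: the sketch below assumes the `_keys` entry is passed under a `//<registry-host>/:_keys` config key (the exact key format is an assumption; in npm itself it is populated from the registry's `/-/npm/v1/keys` endpoint), and the key object fields mirror the keyid/pemkey/expires checks in the registry.js changes that follow. All values are placeholders.

```js
const pacote = require('pacote')

// Illustrative only: the public key entry is a placeholder, and the
// '//registry.npmjs.org/:_keys' key format is an assumption here.
pacote.manifest('some-pkg@1.2.3', {
  registry: 'https://registry.npmjs.org',
  verifyAttestations: true,
  '//registry.npmjs.org/:_keys': [{
    keyid: 'SHA256:example-key-id',
    pemkey: '-----BEGIN PUBLIC KEY-----\n...\n-----END PUBLIC KEY-----',
    expires: null,
  }],
}).then((manifest) => {
  // When dist.attestations exist and verify, they are echoed back on the manifest.
  console.log(manifest._attestations)
}).catch((err) => {
  // e.g. EMISSINGSIGNATUREKEY, EATTESTATIONSUBJECT, EATTESTATIONVERIFY (see registry.js below)
  console.error(err.code, err.message)
})
```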
diff --git a/deps/npm/node_modules/pacote/lib/registry.js b/deps/npm/node_modules/pacote/lib/registry.js
index c4c9df8e4096c9..625bedc9a77369 100644
--- a/deps/npm/node_modules/pacote/lib/registry.js
+++ b/deps/npm/node_modules/pacote/lib/registry.js
@@ -7,6 +7,8 @@ const rpj = require('read-package-json-fast')
const pickManifest = require('npm-pick-manifest')
const ssri = require('ssri')
const crypto = require('crypto')
+const npa = require('npm-package-arg')
+const { sigstore } = require('sigstore')
// Corgis are cute. 🐕🐶
const corgiDoc = 'application/vnd.npm.install-v1+json; q=1.0, application/json; q=0.8, */*'
@@ -203,7 +205,118 @@ class RegistryFetcher extends Fetcher {
mani._signatures = dist.signatures
}
}
+
+ if (dist.attestations) {
+ if (this.opts.verifyAttestations) {
+ // Always fetch attestations from the current registry host
+ const attestationsPath = new URL(dist.attestations.url).pathname
+ const attestationsUrl = removeTrailingSlashes(this.registry) + attestationsPath
+ const res = await fetch(attestationsUrl, {
+ ...this.opts,
+ // disable integrity check for attestations json payload, we check the
+ // integrity in the verification steps below
+ integrity: null,
+ })
+ const { attestations } = await res.json()
+ const bundles = attestations.map(({ predicateType, bundle }) => {
+ const statement = JSON.parse(
+ Buffer.from(bundle.dsseEnvelope.payload, 'base64').toString('utf8')
+ )
+ const keyid = bundle.dsseEnvelope.signatures[0].keyid
+ const signature = bundle.dsseEnvelope.signatures[0].sig
+
+ return {
+ predicateType,
+ bundle,
+ statement,
+ keyid,
+ signature,
+ }
+ })
+
+ const attestationKeyIds = bundles.map((b) => b.keyid).filter((k) => !!k)
+ const attestationRegistryKeys = (this.registryKeys || [])
+ .filter(key => attestationKeyIds.includes(key.keyid))
+ if (!attestationRegistryKeys.length) {
+ throw Object.assign(new Error(
+ `${mani._id} has attestations but no corresponding public key(s) can be found`
+ ), { code: 'EMISSINGSIGNATUREKEY' })
+ }
+
+ for (const { predicateType, bundle, keyid, signature, statement } of bundles) {
+ const publicKey = attestationRegistryKeys.find(key => key.keyid === keyid)
+ // Publish attestations have a keyid set and a valid public key must be found
+ if (keyid) {
+ if (!publicKey) {
+ throw Object.assign(new Error(
+ `${mani._id} has attestations with keyid: ${keyid} ` +
+ 'but no corresponding public key can be found'
+ ), { code: 'EMISSINGSIGNATUREKEY' })
+ }
+
+ const validPublicKey =
+ !publicKey.expires || (Date.parse(publicKey.expires) > Date.now())
+ if (!validPublicKey) {
+ throw Object.assign(new Error(
+ `${mani._id} has attestations with keyid: ${keyid} ` +
+ `but the corresponding public key has expired ${publicKey.expires}`
+ ), { code: 'EEXPIREDSIGNATUREKEY' })
+ }
+ }
+
+ const subject = {
+ name: statement.subject[0].name,
+ sha512: statement.subject[0].digest.sha512,
+ }
+
+ // Only type 'version' can be turned into a PURL
+ const purl = this.spec.type === 'version' ? npa.toPurl(this.spec) : this.spec
+ // Verify the statement subject matches the package, version
+ if (subject.name !== purl) {
+ throw Object.assign(new Error(
+ `${mani._id} package name and version (PURL): ${purl} ` +
+ `doesn't match what was signed: ${subject.name}`
+ ), { code: 'EATTESTATIONSUBJECT' })
+ }
+
+ // Verify the statement subject matches the tarball integrity
+ const integrityHexDigest = ssri.parse(this.integrity).hexDigest()
+ if (subject.sha512 !== integrityHexDigest) {
+ throw Object.assign(new Error(
+ `${mani._id} package integrity (hex digest): ` +
+ `${integrityHexDigest} ` +
+ `doesn't match what was signed: ${subject.sha512}`
+ ), { code: 'EATTESTATIONSUBJECT' })
+ }
+
+ try {
+ // Provenance attestations are signed with a signing certificate
+ // (including the key) so we don't need to return a public key.
+ //
+ // Publish attestations are signed with a keyid so we need to
+ // specify a public key from the keys endpoint: `registry-host.tld/-/npm/v1/keys`
+ const options = { keySelector: publicKey ? () => publicKey.pemkey : undefined }
+ await sigstore.verify(bundle, null, options)
+ } catch (e) {
+ throw Object.assign(new Error(
+ `${mani._id} failed to verify attestation: ${e.message}`
+ ), {
+ code: 'EATTESTATIONVERIFY',
+ predicateType,
+ keyid,
+ signature,
+ resolved: mani._resolved,
+ integrity: mani._integrity,
+ })
+ }
+ }
+ mani._attestations = dist.attestations
+ } else {
+ mani._attestations = dist.attestations
+ }
+ }
}
+
this.package = mani
return this.package
}
diff --git a/deps/npm/node_modules/pacote/package.json b/deps/npm/node_modules/pacote/package.json
index 77675134099853..c09fbda86aa1dd 100644
--- a/deps/npm/node_modules/pacote/package.json
+++ b/deps/npm/node_modules/pacote/package.json
@@ -1,6 +1,6 @@
{
"name": "pacote",
- "version": "15.0.7",
+ "version": "15.1.0",
"description": "JavaScript package downloader",
"author": "GitHub Inc.",
"bin": {
@@ -27,7 +27,7 @@
"devDependencies": {
"@npmcli/arborist": "^6.0.0 || ^6.0.0-pre.0",
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.10.0",
+ "@npmcli/template-oss": "4.11.4",
"hosted-git-info": "^6.0.0",
"mutate-fs": "^2.1.1",
"nock": "^13.2.4",
@@ -49,7 +49,7 @@
"@npmcli/promise-spawn": "^6.0.1",
"@npmcli/run-script": "^6.0.0",
"cacache": "^17.0.0",
- "fs-minipass": "^2.1.0",
+ "fs-minipass": "^3.0.0",
"minipass": "^4.0.0",
"npm-package-arg": "^10.0.0",
"npm-packlist": "^7.0.0",
@@ -59,6 +59,7 @@
"promise-retry": "^2.0.1",
"read-package-json": "^6.0.0",
"read-package-json-fast": "^3.0.0",
+ "sigstore": "^1.0.0",
"ssri": "^10.0.0",
"tar": "^6.1.11"
},
@@ -71,7 +72,7 @@
},
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.10.0",
+ "version": "4.11.4",
"windowsCI": false
}
}
diff --git a/deps/npm/node_modules/postcss-selector-parser/API.md b/deps/npm/node_modules/postcss-selector-parser/API.md
index 6aa1f1438f1be4..36c7298fc9753c 100644
--- a/deps/npm/node_modules/postcss-selector-parser/API.md
+++ b/deps/npm/node_modules/postcss-selector-parser/API.md
@@ -395,16 +395,18 @@ Arguments:
### `container.atPosition(line, column)`
-Returns the node at the source position `index`.
+Returns the node at the source position `line` and `column`.
```js
-selector.at(0) === selector.first;
-selector.at(0) === selector.nodes[0];
+// Input: :not(.foo),\n#foo > :matches(ol, ul)
+selector.atPosition(1, 1); // => :not(.foo)
+selector.atPosition(2, 1); // => \n#foo
```
Arguments:
-* `index`: The index of the node to return.
+* `line`: The line number of the node to return.
+* `column`: The column number of the node to return.
### `container.index(node)`
diff --git a/deps/npm/node_modules/postcss-selector-parser/dist/parser.js b/deps/npm/node_modules/postcss-selector-parser/dist/parser.js
index 8af1cff1ba07ed..b97a0fab366488 100644
--- a/deps/npm/node_modules/postcss-selector-parser/dist/parser.js
+++ b/deps/npm/node_modules/postcss-selector-parser/dist/parser.js
@@ -344,7 +344,7 @@ var Parser = /*#__PURE__*/function () {
}
lastAdded = 'attribute';
- } else if (!node.value && node.value !== "" || lastAdded === "value" && !spaceAfterMeaningfulToken) {
+ } else if (!node.value && node.value !== "" || lastAdded === "value" && !(spaceAfterMeaningfulToken || node.quoteMark)) {
var _unescaped = (0, _util.unesc)(content);
var _oldRawValue = (0, _util.getProp)(node, 'raws', 'value') || '';
diff --git a/deps/npm/node_modules/postcss-selector-parser/dist/selectors/attribute.js b/deps/npm/node_modules/postcss-selector-parser/dist/selectors/attribute.js
index 8f535e5d731299..9edc30bf133dc5 100644
--- a/deps/npm/node_modules/postcss-selector-parser/dist/selectors/attribute.js
+++ b/deps/npm/node_modules/postcss-selector-parser/dist/selectors/attribute.js
@@ -441,7 +441,8 @@ var Attribute = /*#__PURE__*/function (_Namespace) {
key: "value",
get: function get() {
return this._value;
- }
+ },
+ set:
/**
* Before 3.0, the value had to be set to an escaped value including any wrapped
* quote marks. In 3.0, the semantics of `Attribute.value` changed so that the value
@@ -454,8 +455,7 @@ var Attribute = /*#__PURE__*/function (_Namespace) {
* Instead, you should call `attr.setValue(newValue, opts)` and pass options that describe
* how the new value is quoted.
*/
- ,
- set: function set(v) {
+ function set(v) {
if (this._constructed) {
var _unescapeValue2 = unescapeValue(v),
deprecatedUsage = _unescapeValue2.deprecatedUsage,
@@ -478,6 +478,31 @@ var Attribute = /*#__PURE__*/function (_Namespace) {
this._value = v;
}
}
+ }, {
+ key: "insensitive",
+ get: function get() {
+ return this._insensitive;
+ }
+ /**
+ * Set the case insensitive flag.
+ * If the case insensitive flag changes, the raw (escaped) value at `attr.raws.insensitiveFlag`
+ * of the attribute is updated accordingly.
+ *
+ * @param {true | false} insensitive true if the attribute should match case-insensitively.
+ */
+ ,
+ set: function set(insensitive) {
+ if (!insensitive) {
+ this._insensitive = false; // "i" and "I" can be used in "this.raws.insensitiveFlag" to store the original notation.
+ // When setting `attr.insensitive = false` both should be erased to ensure correct serialization.
+
+ if (this.raws && (this.raws.insensitiveFlag === 'I' || this.raws.insensitiveFlag === 'i')) {
+ this.raws.insensitiveFlag = undefined;
+ }
+ }
+
+ this._insensitive = insensitive;
+ }
}, {
key: "attribute",
get: function get() {
diff --git a/deps/npm/node_modules/postcss-selector-parser/package.json b/deps/npm/node_modules/postcss-selector-parser/package.json
index a6f33589ba051d..9655072779a29e 100644
--- a/deps/npm/node_modules/postcss-selector-parser/package.json
+++ b/deps/npm/node_modules/postcss-selector-parser/package.json
@@ -1,6 +1,6 @@
{
"name": "postcss-selector-parser",
- "version": "6.0.10",
+ "version": "6.0.11",
"devDependencies": {
"@babel/cli": "^7.11.6",
"@babel/core": "^7.11.6",
diff --git a/deps/npm/node_modules/promzard/example/buffer.js b/deps/npm/node_modules/promzard/example/buffer.js
deleted file mode 100644
index 828f9d1df9da2f..00000000000000
--- a/deps/npm/node_modules/promzard/example/buffer.js
+++ /dev/null
@@ -1,12 +0,0 @@
-var pz = require('../promzard')
-
-var path = require('path')
-var file = path.resolve(__dirname, 'substack-input.js')
-var buf = require('fs').readFileSync(file)
-var ctx = { basename: path.basename(path.dirname(file)) }
-
-pz.fromBuffer(buf, ctx, function (er, res) {
- if (er)
- throw er
- console.error(JSON.stringify(res, null, 2))
-})
diff --git a/deps/npm/node_modules/promzard/example/index.js b/deps/npm/node_modules/promzard/example/index.js
deleted file mode 100644
index 435131f3a6d1e2..00000000000000
--- a/deps/npm/node_modules/promzard/example/index.js
+++ /dev/null
@@ -1,11 +0,0 @@
-var pz = require('../promzard')
-
-var path = require('path')
-var file = path.resolve(__dirname, 'substack-input.js')
-var ctx = { basename: path.basename(path.dirname(file)) }
-
-pz(file, ctx, function (er, res) {
- if (er)
- throw er
- console.error(JSON.stringify(res, null, 2))
-})
diff --git a/deps/npm/node_modules/promzard/example/npm-init/init-input.js b/deps/npm/node_modules/promzard/example/npm-init/init-input.js
deleted file mode 100644
index ba7781b3539e4d..00000000000000
--- a/deps/npm/node_modules/promzard/example/npm-init/init-input.js
+++ /dev/null
@@ -1,191 +0,0 @@
-var fs = require('fs')
-var path = require('path');
-
-module.exports = {
- "name" : prompt('name',
- typeof name === 'undefined'
- ? basename.replace(/^node-|[.-]js$/g, ''): name),
- "version" : prompt('version', typeof version !== "undefined"
- ? version : '0.0.0'),
- "description" : (function () {
- if (typeof description !== 'undefined' && description) {
- return description
- }
- var value;
- try {
- var src = fs.readFileSync('README.md', 'utf8');
- value = src.split('\n').filter(function (line) {
- return /\s+/.test(line)
- && line.trim() !== basename.replace(/^node-/, '')
- && !line.trim().match(/^#/)
- ;
- })[0]
- .trim()
- .replace(/^./, function (c) { return c.toLowerCase() })
- .replace(/\.$/, '')
- ;
- }
- catch (e) {
- try {
- // Wouldn't it be nice if that file mattered?
- var d = fs.readFileSync('.git/description', 'utf8')
- } catch (e) {}
- if (d.trim() && !value) value = d
- }
- return prompt('description', value);
- })(),
- "main" : (function () {
- var f
- try {
- f = fs.readdirSync(dirname).filter(function (f) {
- return f.match(/\.js$/)
- })
- if (f.indexOf('index.js') !== -1)
- f = 'index.js'
- else if (f.indexOf('main.js') !== -1)
- f = 'main.js'
- else if (f.indexOf(basename + '.js') !== -1)
- f = basename + '.js'
- else
- f = f[0]
- } catch (e) {}
-
- return prompt('entry point', f || 'index.js')
- })(),
- "bin" : function (cb) {
- fs.readdir(dirname + '/bin', function (er, d) {
- // no bins
- if (er) return cb()
- // just take the first js file we find there, or nada
- return cb(null, d.filter(function (f) {
- return f.match(/\.js$/)
- })[0])
- })
- },
- "directories" : function (cb) {
- fs.readdir('.', function (er, dirs) {
- if (er) return cb(er)
- var res = {}
- dirs.forEach(function (d) {
- switch (d) {
- case 'example': case 'examples': return res.example = d
- case 'test': case 'tests': return res.test = d
- case 'doc': case 'docs': return res.doc = d
- case 'man': return res.man = d
- }
- })
- if (Object.keys(res).length === 0) res = undefined
- return cb(null, res)
- })
- },
- "dependencies" : typeof dependencies !== 'undefined' ? dependencies
- : function (cb) {
- fs.readdir('node_modules', function (er, dir) {
- if (er) return cb()
- var deps = {}
- var n = dir.length
- dir.forEach(function (d) {
- if (d.match(/^\./)) return next()
- if (d.match(/^(expresso|mocha|tap|coffee-script|coco|streamline)$/))
- return next()
- fs.readFile('node_modules/' + d + '/package.json', function (er, p) {
- if (er) return next()
- try { p = JSON.parse(p) } catch (e) { return next() }
- if (!p.version) return next()
- deps[d] = '~' + p.version
- return next()
- })
- })
- function next () {
- if (--n === 0) return cb(null, deps)
- }
- })
- },
- "devDependencies" : typeof devDependencies !== 'undefined' ? devDependencies
- : function (cb) {
- // same as dependencies but for dev deps
- fs.readdir('node_modules', function (er, dir) {
- if (er) return cb()
- var deps = {}
- var n = dir.length
- dir.forEach(function (d) {
- if (d.match(/^\./)) return next()
- if (!d.match(/^(expresso|mocha|tap|coffee-script|coco|streamline)$/))
- return next()
- fs.readFile('node_modules/' + d + '/package.json', function (er, p) {
- if (er) return next()
- try { p = JSON.parse(p) } catch (e) { return next() }
- if (!p.version) return next()
- deps[d] = '~' + p.version
- return next()
- })
- })
- function next () {
- if (--n === 0) return cb(null, deps)
- }
- })
- },
- "scripts" : (function () {
- // check to see what framework is in use, if any
- try { var d = fs.readdirSync('node_modules') }
- catch (e) { d = [] }
- var s = typeof scripts === 'undefined' ? {} : scripts
-
- if (d.indexOf('coffee-script') !== -1)
- s.prepublish = prompt('build command',
- s.prepublish || 'coffee src/*.coffee -o lib')
-
- var notest = 'echo "Error: no test specified" && exit 1'
- function tx (test) {
- return test || notest
- }
-
- if (!s.test || s.test === notest) {
- if (d.indexOf('tap') !== -1)
- s.test = prompt('test command', 'tap test/*.js', tx)
- else if (d.indexOf('expresso') !== -1)
- s.test = prompt('test command', 'expresso test', tx)
- else if (d.indexOf('mocha') !== -1)
- s.test = prompt('test command', 'mocha', tx)
- else
- s.test = prompt('test command', tx)
- }
-
- return s
-
- })(),
-
- "repository" : (function () {
- try { var gconf = fs.readFileSync('.git/config') }
- catch (e) { gconf = null }
- if (gconf) {
- gconf = gconf.split(/\r?\n/)
- var i = gconf.indexOf('[remote "origin"]')
- if (i !== -1) {
- var u = gconf[i + 1]
- if (!u.match(/^\s*url =/)) u = gconf[i + 2]
- if (!u.match(/^\s*url =/)) u = null
- else u = u.replace(/^\s*url = /, '')
- }
- if (u && u.match(/^git@github.com:/))
- u = u.replace(/^git@github.com:/, 'git://github.com/')
- }
-
- return prompt('git repository', u)
- })(),
-
- "keywords" : prompt(function (s) {
- if (!s) return undefined
- if (Array.isArray(s)) s = s.join(' ')
- if (typeof s !== 'string') return s
- return s.split(/[\s,]+/)
- }),
- "author" : config['init.author.name']
- ? {
- "name" : config['init.author.name'],
- "email" : config['init.author.email'],
- "url" : config['init.author.url']
- }
- : undefined,
- "license" : prompt('license', 'BSD')
-}
diff --git a/deps/npm/node_modules/promzard/example/npm-init/init.js b/deps/npm/node_modules/promzard/example/npm-init/init.js
deleted file mode 100644
index 09484cd1c19911..00000000000000
--- a/deps/npm/node_modules/promzard/example/npm-init/init.js
+++ /dev/null
@@ -1,37 +0,0 @@
-var PZ = require('../../promzard').PromZard
-var path = require('path')
-var input = path.resolve(__dirname, 'init-input.js')
-
-var fs = require('fs')
-var package = path.resolve(__dirname, 'package.json')
-var pkg
-
-fs.readFile(package, 'utf8', function (er, d) {
- if (er) ctx = {}
- try { ctx = JSON.parse(d); pkg = JSON.parse(d) }
- catch (e) { ctx = {} }
-
- ctx.dirname = path.dirname(package)
- ctx.basename = path.basename(ctx.dirname)
- if (!ctx.version) ctx.version = undefined
-
- // this should be replaced with the npm conf object
- ctx.config = {}
-
- console.error('ctx=', ctx)
-
- var pz = new PZ(input, ctx)
-
- pz.on('data', function (data) {
- console.error('pz data', data)
- if (!pkg) pkg = {}
- Object.keys(data).forEach(function (k) {
- if (data[k] !== undefined && data[k] !== null) pkg[k] = data[k]
- })
- console.error('package data %s', JSON.stringify(data, null, 2))
- fs.writeFile(package, JSON.stringify(pkg, null, 2), function (er) {
- if (er) throw er
- console.log('ok')
- })
- })
-})
diff --git a/deps/npm/node_modules/promzard/example/npm-init/package.json b/deps/npm/node_modules/promzard/example/npm-init/package.json
deleted file mode 100644
index 89c6d1fb6e2acf..00000000000000
--- a/deps/npm/node_modules/promzard/example/npm-init/package.json
+++ /dev/null
@@ -1,10 +0,0 @@
-{
- "name": "npm-init",
- "version": "0.0.0",
- "description": "an initter you init wit, innit?",
- "main": "index.js",
- "scripts": {
- "test": "asdf"
- },
- "license": "BSD"
-}
\ No newline at end of file
diff --git a/deps/npm/node_modules/promzard/example/substack-input.js b/deps/npm/node_modules/promzard/example/substack-input.js
deleted file mode 100644
index bf7aedb82d41fd..00000000000000
--- a/deps/npm/node_modules/promzard/example/substack-input.js
+++ /dev/null
@@ -1,61 +0,0 @@
-module.exports = {
- "name" : basename.replace(/^node-/, ''),
- "version" : "0.0.0",
- "description" : (function (cb) {
- var fs = require('fs');
- var value;
- try {
- var src = fs.readFileSync('README.markdown', 'utf8');
- value = src.split('\n').filter(function (line) {
- return /\s+/.test(line)
- && line.trim() !== basename.replace(/^node-/, '')
- ;
- })[0]
- .trim()
- .replace(/^./, function (c) { return c.toLowerCase() })
- .replace(/\.$/, '')
- ;
- }
- catch (e) {}
-
- return prompt('description', value);
- })(),
- "main" : prompt('entry point', 'index.js'),
- "bin" : function (cb) {
- var path = require('path');
- var fs = require('fs');
- var exists = fs.exists || path.exists;
- exists('bin/cmd.js', function (ex) {
- var bin
- if (ex) {
- var bin = {}
- bin[basename.replace(/^node-/, '')] = 'bin/cmd.js'
- }
- cb(null, bin);
- });
- },
- "directories" : {
- "example" : "example",
- "test" : "test"
- },
- "dependencies" : {},
- "devDependencies" : {
- "tap" : "~0.2.5"
- },
- "scripts" : {
- "test" : "tap test/*.js"
- },
- "repository" : {
- "type" : "git",
- "url" : "git://github.com/substack/" + basename + ".git"
- },
- "homepage" : "https://github.com/substack/" + basename,
- "keywords" : prompt(function (s) { return s.split(/\s+/) }),
- "author" : {
- "name" : "James Halliday",
- "email" : "mail@substack.net",
- "url" : "http://substack.net"
- },
- "license" : "MIT",
- "engine" : { "node" : ">=0.6" }
-}
diff --git a/deps/npm/node_modules/promzard/lib/index.js b/deps/npm/node_modules/promzard/lib/index.js
new file mode 100644
index 00000000000000..2244cbbbacdb02
--- /dev/null
+++ b/deps/npm/node_modules/promzard/lib/index.js
@@ -0,0 +1,175 @@
+const fs = require('fs/promises')
+const { runInThisContext } = require('vm')
+const { promisify } = require('util')
+const { randomBytes } = require('crypto')
+const { Module } = require('module')
+const { dirname, basename } = require('path')
+const read = require('read')
+
+const files = {}
+
+class PromZard {
+ #file = null
+ #backupFile = null
+ #ctx = null
+ #unique = randomBytes(8).toString('hex')
+ #prompts = []
+
+ constructor (file, ctx = {}, options = {}) {
+ this.#file = file
+ this.#ctx = ctx
+ this.#backupFile = options.backupFile
+ }
+
+ static async promzard (file, ctx, options) {
+ const pz = new PromZard(file, ctx, options)
+ return pz.load()
+ }
+
+ static async fromBuffer (buf, ctx, options) {
+ let filename = 0
+ do {
+ filename = '\0' + Math.random()
+ } while (files[filename])
+ files[filename] = buf
+ const ret = await PromZard.promzard(filename, ctx, options)
+ delete files[filename]
+ return ret
+ }
+
+ async load () {
+ if (files[this.#file]) {
+ return this.#loaded()
+ }
+
+ try {
+ files[this.#file] = await fs.readFile(this.#file, 'utf8')
+ } catch (er) {
+ if (er && this.#backupFile) {
+ this.#file = this.#backupFile
+ this.#backupFile = null
+ return this.load()
+ }
+ throw er
+ }
+
+ return this.#loaded()
+ }
+
+ async #loaded () {
+ const mod = new Module(this.#file, module)
+ mod.loaded = true
+ mod.filename = this.#file
+ mod.id = this.#file
+ mod.paths = Module._nodeModulePaths(dirname(this.#file))
+
+ this.#ctx.prompt = this.#makePrompt()
+ this.#ctx.__filename = this.#file
+ this.#ctx.__dirname = dirname(this.#file)
+ this.#ctx.__basename = basename(this.#file)
+ this.#ctx.module = mod
+ this.#ctx.require = (p) => mod.require(p)
+ this.#ctx.require.resolve = (p) => Module._resolveFilename(p, mod)
+ this.#ctx.exports = mod.exports
+
+ const body = `(function(${Object.keys(this.#ctx).join(', ')}) { ${files[this.#file]}\n })`
+ runInThisContext(body, this.#file).apply(this.#ctx, Object.values(this.#ctx))
+ this.#ctx.res = mod.exports
+
+ return this.#walk()
+ }
+
+ #makePrompt () {
+ return (...args) => {
+ let p, d, t
+ for (let i = 0; i < args.length; i++) {
+ const a = args[i]
+ if (typeof a === 'string') {
+ if (p) {
+ d = a
+ } else {
+ p = a
+ }
+ } else if (typeof a === 'function') {
+ t = a
+ } else if (a && typeof a === 'object') {
+ p = a.prompt || p
+ d = a.default || d
+ t = a.transform || t
+ }
+ }
+ try {
+ return `${this.#unique}-${this.#prompts.length}`
+ } finally {
+ this.#prompts.push([p, d, t])
+ }
+ }
+ }
+
+ async #walk (o = this.#ctx.res) {
+ const keys = Object.keys(o)
+
+ const len = keys.length
+ let i = 0
+
+ while (i < len) {
+ const k = keys[i]
+ const v = o[k]
+ i++
+
+ if (v && typeof v === 'object') {
+ o[k] = await this.#walk(v)
+ continue
+ }
+
+ if (v && typeof v === 'string' && v.startsWith(this.#unique)) {
+ const n = +v.slice(this.#unique.length + 1)
+
+ // default to the key
+ // default to the ctx value, if there is one
+ const [prompt = k, def = this.#ctx[k], tx] = this.#prompts[n]
+
+ try {
+ o[k] = await this.#prompt(prompt, def, tx)
+ } catch (er) {
+ if (er.notValid) {
+ console.log(er.message)
+ i--
+ } else {
+ throw er
+ }
+ }
+ continue
+ }
+
+ if (typeof v === 'function') {
+ // XXX: remove v.length check to remove cb from functions
+ // would be a breaking change for `npm init`
+ // XXX: if cb is no longer an argument then this.#ctx should
+ // be passed in to allow arrow fns to be used and still access ctx
+ const fn = v.length ? promisify(v) : v
+ o[k] = await fn.call(this.#ctx)
+ // back up so that we process this one again.
+ // this is because it might return a prompt() call in the cb.
+ i--
+ continue
+ }
+ }
+
+ return o
+ }
+
+ async #prompt (prompt, def, tx) {
+ const res = await read({ prompt: prompt + ':', default: def }).then((r) => tx ? tx(r) : r)
+ // XXX: remove this to require throwing an error instead of
+ // returning it. would be a breaking change for `npm init`
+ if (res instanceof Error && res.notValid) {
+ throw res
+ }
+ return res
+ }
+}
+
+module.exports = PromZard.promzard
+module.exports.fromBuffer = PromZard.fromBuffer
+module.exports.PromZard = PromZard
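
For reference, a minimal sketch of how the rewritten promise-based promzard entry point exported above might be consumed. The `promzard(file, ctx, options)` call, the `backupFile` option, and the fact that `ctx` keys become free variables inside the template all come from the new `lib/index.js`; the template filename and surrounding script are purely illustrative.

```js
// Sketch only: consumes the promise-based API added in lib/index.js above.
// Assumes ./init-input.js is a promzard template that calls prompt().
const promzard = require('promzard')

const run = async () => {
  // Values placed on ctx are exposed as free variables inside the template.
  const ctx = { dirname: process.cwd() }
  const data = await promzard('./init-input.js', ctx, {
    // Hypothetical fallback template, read only if the first file is missing.
    backupFile: './init-input-default.js',
  })
  console.log(JSON.stringify(data, null, 2))
}

run().catch((er) => {
  console.error(er)
  process.exitCode = 1
})
```
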
diff --git a/deps/npm/node_modules/promzard/package.json b/deps/npm/node_modules/promzard/package.json
index 77e3bd65513c92..a48764dd5441b8 100644
--- a/deps/npm/node_modules/promzard/package.json
+++ b/deps/npm/node_modules/promzard/package.json
@@ -1,20 +1,48 @@
{
- "author": "Isaac Z. Schlueter (http://blog.izs.me/)",
+ "author": "GitHub Inc.",
"name": "promzard",
"description": "prompting wizardly",
- "version": "0.3.0",
+ "version": "1.0.0",
"repository": {
- "url": "git://github.com/isaacs/promzard"
+ "url": "https://github.com/npm/promzard.git",
+ "type": "git"
},
"dependencies": {
- "read": "1"
+ "read": "^2.0.0"
},
"devDependencies": {
- "tap": "~0.2.5"
+ "@npmcli/eslint-config": "^4.0.0",
+ "@npmcli/template-oss": "4.11.0",
+ "tap": "^16.3.0"
},
- "main": "promzard.js",
+ "main": "lib/index.js",
"scripts": {
- "test": "tap test/*.js"
+ "test": "tap",
+ "lint": "eslint \"**/*.js\"",
+ "postlint": "template-oss-check",
+ "template-oss-apply": "template-oss-apply --force",
+ "lintfix": "npm run lint -- --fix",
+ "snap": "tap",
+ "posttest": "npm run lint"
},
- "license": "ISC"
+ "license": "ISC",
+ "files": [
+ "bin/",
+ "lib/"
+ ],
+ "engines": {
+ "node": "^14.17.0 || ^16.13.0 || >=18.0.0"
+ },
+ "templateOSS": {
+ "//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
+ "version": "4.11.0"
+ },
+ "tap": {
+ "jobs": 1,
+ "test-ignore": "fixtures/",
+ "nyc-arg": [
+ "--exclude",
+ "tap-snapshots/**"
+ ]
+ }
}
diff --git a/deps/npm/node_modules/promzard/promzard.js b/deps/npm/node_modules/promzard/promzard.js
deleted file mode 100644
index da1abca9535e4f..00000000000000
--- a/deps/npm/node_modules/promzard/promzard.js
+++ /dev/null
@@ -1,238 +0,0 @@
-module.exports = promzard
-promzard.PromZard = PromZard
-
-var fs = require('fs')
-var vm = require('vm')
-var util = require('util')
-var files = {}
-var crypto = require('crypto')
-var EventEmitter = require('events').EventEmitter
-var read = require('read')
-
-var Module = require('module').Module
-var path = require('path')
-
-function promzard (file, ctx, cb) {
- if (typeof ctx === 'function') cb = ctx, ctx = null;
- if (!ctx) ctx = {};
- var pz = new PromZard(file, ctx)
- pz.on('error', cb)
- pz.on('data', function (data) {
- cb(null, data)
- })
-}
-promzard.fromBuffer = function (buf, ctx, cb) {
- var filename = 0
- do {
- filename = '\0' + Math.random();
- } while (files[filename])
- files[filename] = buf
- var ret = promzard(filename, ctx, cb)
- delete files[filename]
- return ret
-}
-
-function PromZard (file, ctx) {
- if (!(this instanceof PromZard))
- return new PromZard(file, ctx)
- EventEmitter.call(this)
- this.file = file
- this.ctx = ctx
- this.unique = crypto.randomBytes(8).toString('hex')
- this.load()
-}
-
-PromZard.prototype = Object.create(
- EventEmitter.prototype,
- { constructor: {
- value: PromZard,
- readable: true,
- configurable: true,
- writable: true,
- enumerable: false } } )
-
-PromZard.prototype.load = function () {
- if (files[this.file])
- return this.loaded()
-
- fs.readFile(this.file, 'utf8', function (er, d) {
- if (er && this.backupFile) {
- this.file = this.backupFile
- delete this.backupFile
- return this.load()
- }
- if (er)
- return this.emit('error', this.error = er)
- files[this.file] = d
- this.loaded()
- }.bind(this))
-}
-
-PromZard.prototype.loaded = function () {
- this.ctx.prompt = this.makePrompt()
- this.ctx.__filename = this.file
- this.ctx.__dirname = path.dirname(this.file)
- this.ctx.__basename = path.basename(this.file)
- var mod = this.ctx.module = this.makeModule()
- this.ctx.require = function (path) {
- return mod.require(path)
- }
- this.ctx.require.resolve = function(path) {
- return Module._resolveFilename(path, mod);
- }
- this.ctx.exports = mod.exports
-
- this.script = this.wrap(files[this.file])
- var fn = vm.runInThisContext(this.script, this.file)
- var args = Object.keys(this.ctx).map(function (k) {
- return this.ctx[k]
- }.bind(this))
- try { var res = fn.apply(this.ctx, args) }
- catch (er) { this.emit('error', er) }
- if (res &&
- typeof res === 'object' &&
- exports === mod.exports &&
- Object.keys(exports).length === 1) {
- this.result = res
- } else {
- this.result = mod.exports
- }
- this.walk()
-}
-
-PromZard.prototype.makeModule = function () {
- var mod = new Module(this.file, module)
- mod.loaded = true
- mod.filename = this.file
- mod.id = this.file
- mod.paths = Module._nodeModulePaths(path.dirname(this.file))
- return mod
-}
-
-PromZard.prototype.wrap = function (body) {
- var s = '(function( %s ) { %s\n })'
- var args = Object.keys(this.ctx).join(', ')
- return util.format(s, args, body)
-}
-
-PromZard.prototype.makePrompt = function () {
- this.prompts = []
- return prompt.bind(this)
- function prompt () {
- var p, d, t
- for (var i = 0; i < arguments.length; i++) {
- var a = arguments[i]
- if (typeof a === 'string' && p)
- d = a
- else if (typeof a === 'string')
- p = a
- else if (typeof a === 'function')
- t = a
- else if (a && typeof a === 'object') {
- p = a.prompt || p
- d = a.default || d
- t = a.transform || t
- }
- }
-
- try { return this.unique + '-' + this.prompts.length }
- finally { this.prompts.push([p, d, t]) }
- }
-}
-
-PromZard.prototype.walk = function (o, cb) {
- o = o || this.result
- cb = cb || function (er, res) {
- if (er)
- return this.emit('error', this.error = er)
- this.result = res
- return this.emit('data', res)
- }
- cb = cb.bind(this)
- var keys = Object.keys(o)
- var i = 0
- var len = keys.length
-
- L.call(this)
- function L () {
- if (this.error)
- return
- while (i < len) {
- var k = keys[i]
- var v = o[k]
- i++
-
- if (v && typeof v === 'object') {
- return this.walk(v, function (er, res) {
- if (er) return cb(er)
- o[k] = res
- L.call(this)
- }.bind(this))
- } else if (v &&
- typeof v === 'string' &&
- v.indexOf(this.unique) === 0) {
- var n = +v.substr(this.unique.length + 1)
- var prompt = this.prompts[n]
- if (isNaN(n) || !prompt)
- continue
-
- // default to the key
- if (undefined === prompt[0])
- prompt[0] = k
-
- // default to the ctx value, if there is one
- if (undefined === prompt[1])
- prompt[1] = this.ctx[k]
-
- return this.prompt(prompt, function (er, res) {
- if (er) {
- if (!er.notValid) {
- return this.emit('error', this.error = er);
- }
- console.log(er.message)
- i --
- return L.call(this)
- }
- o[k] = res
- L.call(this)
- }.bind(this))
- } else if (typeof v === 'function') {
- try { return v.call(this.ctx, function (er, res) {
- if (er)
- return this.emit('error', this.error = er)
- o[k] = res
- // back up so that we process this one again.
- // this is because it might return a prompt() call in the cb.
- i --
- L.call(this)
- }.bind(this)) }
- catch (er) { this.emit('error', er) }
- }
- }
- // made it to the end of the loop, maybe
- if (i >= len)
- return cb(null, o)
- }
-}
-
-PromZard.prototype.prompt = function (pdt, cb) {
- var prompt = pdt[0]
- var def = pdt[1]
- var tx = pdt[2]
-
- if (tx) {
- cb = function (cb) { return function (er, data) {
- try {
- var res = tx(data)
- if (!er && res instanceof Error && !!res.notValid) {
- return cb(res, null)
- }
- return cb(er, res)
- }
- catch (er) { this.emit('error', er) }
- }}(cb).bind(this)
- }
-
- read({ prompt: prompt + ':' , default: def }, cb)
-}
-
diff --git a/deps/npm/node_modules/promzard/test/basic.js b/deps/npm/node_modules/promzard/test/basic.js
deleted file mode 100644
index ad1c92df9c4c05..00000000000000
--- a/deps/npm/node_modules/promzard/test/basic.js
+++ /dev/null
@@ -1,91 +0,0 @@
-var tap = require('tap')
-var pz = require('../promzard.js')
-var spawn = require('child_process').spawn
-
-tap.test('run the example', function (t) {
-
- var example = require.resolve('../example/index.js')
- var node = process.execPath
-
- var expect = {
- "name": "example",
- "version": "0.0.0",
- "description": "testing description",
- "main": "test-entry.js",
- "directories": {
- "example": "example",
- "test": "test"
- },
- "dependencies": {},
- "devDependencies": {
- "tap": "~0.2.5"
- },
- "scripts": {
- "test": "tap test/*.js"
- },
- "repository": {
- "type": "git",
- "url": "git://github.com/substack/example.git"
- },
- "homepage": "https://github.com/substack/example",
- "keywords": [
- "fugazi",
- "function",
- "waiting",
- "room"
- ],
- "author": {
- "name": "James Halliday",
- "email": "mail@substack.net",
- "url": "http://substack.net"
- },
- "license": "MIT",
- "engine": {
- "node": ">=0.6"
- }
- }
-
- console.error('%s %s', node, example)
- var c = spawn(node, [example], { customFds: [-1,-1,-1] })
- var output = ''
- c.stdout.on('data', function (d) {
- output += d
- respond()
- })
-
- var actual = ''
- c.stderr.on('data', function (d) {
- actual += d
- })
-
- function respond () {
- console.error('respond', output)
- if (output.match(/description: $/)) {
- c.stdin.write('testing description\n')
- return
- }
- if (output.match(/entry point: \(index\.js\) $/)) {
- c.stdin.write('test-entry.js\n')
- return
- }
- if (output.match(/keywords: $/)) {
- c.stdin.write('fugazi function waiting room\n')
- // "read" module is weird on node >= 0.10 when not a TTY
- // requires explicit ending for reasons.
- // could dig in, but really just wanna make tests pass, whatever.
- c.stdin.end()
- return
- }
- }
-
- c.on('exit', function () {
- console.error('exit event')
- })
-
- c.on('close', function () {
- console.error('actual', actual)
- actual = JSON.parse(actual)
- t.deepEqual(actual, expect)
- t.end()
- })
-})
diff --git a/deps/npm/node_modules/promzard/test/buffer.js b/deps/npm/node_modules/promzard/test/buffer.js
deleted file mode 100644
index e1d240e2e4f480..00000000000000
--- a/deps/npm/node_modules/promzard/test/buffer.js
+++ /dev/null
@@ -1,84 +0,0 @@
-var tap = require('tap')
-var pz = require('../promzard.js')
-var spawn = require('child_process').spawn
-
-tap.test('run the example using a buffer', function (t) {
-
- var example = require.resolve('../example/buffer.js')
- var node = process.execPath
-
- var expect = {
- "name": "example",
- "version": "0.0.0",
- "description": "testing description",
- "main": "test-entry.js",
- "directories": {
- "example": "example",
- "test": "test"
- },
- "dependencies": {},
- "devDependencies": {
- "tap": "~0.2.5"
- },
- "scripts": {
- "test": "tap test/*.js"
- },
- "repository": {
- "type": "git",
- "url": "git://github.com/substack/example.git"
- },
- "homepage": "https://github.com/substack/example",
- "keywords": [
- "fugazi",
- "function",
- "waiting",
- "room"
- ],
- "author": {
- "name": "James Halliday",
- "email": "mail@substack.net",
- "url": "http://substack.net"
- },
- "license": "MIT",
- "engine": {
- "node": ">=0.6"
- }
- }
-
- var c = spawn(node, [example], { customFds: [-1,-1,-1] })
- var output = ''
- c.stdout.on('data', function (d) {
- output += d
- respond()
- })
-
- var actual = ''
- c.stderr.on('data', function (d) {
- actual += d
- })
-
- function respond () {
- if (output.match(/description: $/)) {
- c.stdin.write('testing description\n')
- return
- }
- if (output.match(/entry point: \(index\.js\) $/)) {
- c.stdin.write('test-entry.js\n')
- return
- }
- if (output.match(/keywords: $/)) {
- c.stdin.write('fugazi function waiting room\n')
- // "read" module is weird on node >= 0.10 when not a TTY
- // requires explicit ending for reasons.
- // could dig in, but really just wanna make tests pass, whatever.
- c.stdin.end()
- return
- }
- }
-
- c.on('close', function () {
- actual = JSON.parse(actual)
- t.deepEqual(actual, expect)
- t.end()
- })
-})
diff --git a/deps/npm/node_modules/promzard/test/exports.input b/deps/npm/node_modules/promzard/test/exports.input
deleted file mode 100644
index 061cbfe1055aa2..00000000000000
--- a/deps/npm/node_modules/promzard/test/exports.input
+++ /dev/null
@@ -1,5 +0,0 @@
-exports.a = 1 + 2
-exports.b = prompt('To be or not to be?', '!2b')
-exports.c = {}
-exports.c.x = prompt()
-exports.c.y = tmpdir + "/y/file.txt"
diff --git a/deps/npm/node_modules/promzard/test/exports.js b/deps/npm/node_modules/promzard/test/exports.js
deleted file mode 100644
index c17993a4e9e754..00000000000000
--- a/deps/npm/node_modules/promzard/test/exports.js
+++ /dev/null
@@ -1,48 +0,0 @@
-var test = require('tap').test;
-var promzard = require('../');
-
-if (process.argv[2] === 'child') {
- return child()
-}
-
-test('exports', function (t) {
- t.plan(1);
-
- var spawn = require('child_process').spawn
- var child = spawn(process.execPath, [__filename, 'child'])
-
- var output = ''
- child.stderr.on('data', function (c) {
- output += c
- })
-
- setTimeout(function () {
- child.stdin.write('\n');
- }, 100)
- setTimeout(function () {
- child.stdin.end('55\n');
- }, 200)
-
- child.on('close', function () {
- console.error('output=%j', output)
- output = JSON.parse(output)
- t.same({
- a : 3,
- b : '!2b',
- c : {
- x : 55,
- y : '/tmp/y/file.txt',
- }
- }, output);
- t.end()
- })
-});
-
-function child () {
- var ctx = { tmpdir : '/tmp' }
- var file = __dirname + '/exports.input';
-
- promzard(file, ctx, function (err, output) {
- console.error(JSON.stringify(output))
- });
-}
diff --git a/deps/npm/node_modules/promzard/test/fn.input b/deps/npm/node_modules/promzard/test/fn.input
deleted file mode 100644
index ed6c3f1c80c5b5..00000000000000
--- a/deps/npm/node_modules/promzard/test/fn.input
+++ /dev/null
@@ -1,18 +0,0 @@
-var fs = require('fs')
-
-module.exports = {
- "a": 1 + 2,
- "b": prompt('To be or not to be?', '!2b', function (s) {
- return s.toUpperCase() + '...'
- }),
- "c": {
- "x": prompt(function (x) { return x * 100 }),
- "y": tmpdir + "/y/file.txt"
- },
- a_function: function (cb) {
- fs.readFile(__filename, 'utf8', cb)
- },
- asyncPrompt: function (cb) {
- return cb(null, prompt('a prompt at any other time would smell as sweet'))
- }
-}
diff --git a/deps/npm/node_modules/promzard/test/fn.js b/deps/npm/node_modules/promzard/test/fn.js
deleted file mode 100644
index 899ebedbdd010c..00000000000000
--- a/deps/npm/node_modules/promzard/test/fn.js
+++ /dev/null
@@ -1,56 +0,0 @@
-var test = require('tap').test;
-var promzard = require('../');
-var fs = require('fs')
-var file = __dirname + '/fn.input';
-
-var expect = {
- a : 3,
- b : '!2B...',
- c : {
- x : 5500,
- y : '/tmp/y/file.txt',
- }
-}
-expect.a_function = fs.readFileSync(file, 'utf8')
-expect.asyncPrompt = 'async prompt'
-
-if (process.argv[2] === 'child') {
- return child()
-}
-
-test('prompt callback param', function (t) {
- t.plan(1);
-
- var spawn = require('child_process').spawn
- var child = spawn(process.execPath, [__filename, 'child'])
-
- var output = ''
- child.stderr.on('data', function (c) {
- output += c
- })
-
- child.on('close', function () {
- console.error('output=%j', output)
- output = JSON.parse(output)
- t.same(output, expect);
- t.end()
- })
-
- setTimeout(function () {
- child.stdin.write('\n')
- }, 100)
- setTimeout(function () {
- child.stdin.write('55\n')
- }, 150)
- setTimeout(function () {
- child.stdin.end('async prompt\n')
- }, 200)
-})
-
-function child () {
- var ctx = { tmpdir : '/tmp' }
- var file = __dirname + '/fn.input';
- promzard(file, ctx, function (err, output) {
- console.error(JSON.stringify(output))
- })
-}
diff --git a/deps/npm/node_modules/promzard/test/simple.input b/deps/npm/node_modules/promzard/test/simple.input
deleted file mode 100644
index e49def6470d599..00000000000000
--- a/deps/npm/node_modules/promzard/test/simple.input
+++ /dev/null
@@ -1,8 +0,0 @@
-module.exports = {
- "a": 1 + 2,
- "b": prompt('To be or not to be?', '!2b'),
- "c": {
- "x": prompt(),
- "y": tmpdir + "/y/file.txt"
- }
-}
diff --git a/deps/npm/node_modules/promzard/test/simple.js b/deps/npm/node_modules/promzard/test/simple.js
deleted file mode 100644
index 034a86475afbd5..00000000000000
--- a/deps/npm/node_modules/promzard/test/simple.js
+++ /dev/null
@@ -1,30 +0,0 @@
-var test = require('tap').test;
-var promzard = require('../');
-
-test('simple', function (t) {
- t.plan(1);
-
- var ctx = { tmpdir : '/tmp' }
- var file = __dirname + '/simple.input';
- promzard(file, ctx, function (err, output) {
- t.same(
- {
- a : 3,
- b : '!2b',
- c : {
- x : 55,
- y : '/tmp/y/file.txt',
- }
- },
- output
- );
- });
-
- setTimeout(function () {
- process.stdin.emit('data', '\n');
- }, 100);
-
- setTimeout(function () {
- process.stdin.emit('data', '55\n');
- }, 200);
-});
diff --git a/deps/npm/node_modules/promzard/test/validate.input b/deps/npm/node_modules/promzard/test/validate.input
deleted file mode 100644
index 839c0652294ac0..00000000000000
--- a/deps/npm/node_modules/promzard/test/validate.input
+++ /dev/null
@@ -1,8 +0,0 @@
-module.exports = {
- "name": prompt("name", function (data) {
- if (data === 'cool') return data
- var er = new Error('not cool')
- er.notValid = true
- return er
- })
-}
diff --git a/deps/npm/node_modules/promzard/test/validate.js b/deps/npm/node_modules/promzard/test/validate.js
deleted file mode 100644
index a120681494e60d..00000000000000
--- a/deps/npm/node_modules/promzard/test/validate.js
+++ /dev/null
@@ -1,20 +0,0 @@
-
-var promzard = require('../')
-var test = require('tap').test
-
-test('validate', function (t) {
- t.plan(2)
- var ctx = { tmpdir : '/tmp' }
- var file = __dirname + '/validate.input'
- promzard(file, ctx, function (er, found) {
- t.ok(!er)
- var wanted = { name: 'cool' }
- t.same(found, wanted)
- })
- setTimeout(function () {
- process.stdin.emit('data', 'not cool\n')
- }, 100)
- setTimeout(function () {
- process.stdin.emit('data', 'cool\n')
- }, 200)
-})
diff --git a/deps/npm/node_modules/read-package-json-fast/lib/index.js b/deps/npm/node_modules/read-package-json-fast/lib/index.js
index c5c896ffec0046..beb089db8d53ec 100644
--- a/deps/npm/node_modules/read-package-json-fast/lib/index.js
+++ b/deps/npm/node_modules/read-package-json-fast/lib/index.js
@@ -1,10 +1,6 @@
-const { promisify } = require('util')
-const fs = require('fs')
-const readFile = promisify(fs.readFile)
-const lstat = promisify(fs.lstat)
-const readdir = promisify(fs.readdir)
+const { readFile, lstat, readdir } = require('fs/promises')
const parse = require('json-parse-even-better-errors')
-
+const normalizePackageBin = require('npm-normalize-package-bin')
const { resolve, dirname, join, relative } = require('path')
const rpj = path => readFile(path, 'utf8')
@@ -14,8 +10,6 @@ const rpj = path => readFile(path, 'utf8')
throw er
})
-const normalizePackageBin = require('npm-normalize-package-bin')
-
// load the directories.bin folder as a 'bin' object
const readBinDir = async (path, data) => {
if (data.bin) {
diff --git a/deps/npm/node_modules/read-package-json-fast/package.json b/deps/npm/node_modules/read-package-json-fast/package.json
index 7baa36cfbeb9e9..4964bb0a934cb8 100644
--- a/deps/npm/node_modules/read-package-json-fast/package.json
+++ b/deps/npm/node_modules/read-package-json-fast/package.json
@@ -1,6 +1,6 @@
{
"name": "read-package-json-fast",
- "version": "3.0.1",
+ "version": "3.0.2",
"description": "Like read-package-json, but faster",
"main": "lib/index.js",
"author": "GitHub Inc.",
@@ -19,7 +19,7 @@
},
"devDependencies": {
"@npmcli/eslint-config": "^4.0.0",
- "@npmcli/template-oss": "4.5.1",
+ "@npmcli/template-oss": "4.11.0",
"tap": "^16.3.0"
},
"dependencies": {
@@ -36,7 +36,7 @@
],
"templateOSS": {
"//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
- "version": "4.5.1"
+ "version": "4.11.0"
},
"tap": {
"nyc-arg": [
diff --git a/deps/npm/node_modules/read/lib/read.js b/deps/npm/node_modules/read/lib/read.js
index a93d1b3b532c08..882b11c285bcf3 100644
--- a/deps/npm/node_modules/read/lib/read.js
+++ b/deps/npm/node_modules/read/lib/read.js
@@ -1,113 +1,82 @@
-
-module.exports = read
-
-var readline = require('readline')
-var Mute = require('mute-stream')
-
-function read (opts, cb) {
- if (opts.num) {
- throw new Error('read() no longer accepts a char number limit')
- }
-
- if (typeof opts.default !== 'undefined' &&
- typeof opts.default !== 'string' &&
- typeof opts.default !== 'number') {
+const readline = require('readline')
+const Mute = require('mute-stream')
+
+module.exports = async function read ({
+ default: def = '',
+ input = process.stdin,
+ output = process.stdout,
+ prompt = '',
+ silent,
+ timeout,
+ edit,
+ terminal,
+ replace,
+}) {
+ if (typeof def !== 'undefined' && typeof def !== 'string' && typeof def !== 'number') {
throw new Error('default value must be string or number')
}
- var input = opts.input || process.stdin
- var output = opts.output || process.stdout
- var prompt = (opts.prompt || '').trim() + ' '
- var silent = opts.silent
- var editDef = false
- var timeout = opts.timeout
+ let editDef = false
+ prompt = prompt.trim() + ' '
+ terminal = !!(terminal || output.isTTY)
- var def = opts.default || ''
if (def) {
if (silent) {
prompt += '() '
- } else if (opts.edit) {
+ } else if (edit) {
editDef = true
} else {
prompt += '(' + def + ') '
}
}
- var terminal = !!(opts.terminal || output.isTTY)
- var m = new Mute({ replace: opts.replace, prompt: prompt })
- m.pipe(output, {end: false})
+ const m = new Mute({ replace, prompt })
+ m.pipe(output, { end: false })
output = m
- var rlOpts = { input: input, output: output, terminal: terminal }
-
- if (process.version.match(/^v0\.6/)) {
- var rl = readline.createInterface(rlOpts.input, rlOpts.output)
- } else {
- var rl = readline.createInterface(rlOpts)
- }
-
-
- output.unmute()
- rl.setPrompt(prompt)
- rl.prompt()
- if (silent) {
- output.mute()
- } else if (editDef) {
- rl.line = def
- rl.cursor = def.length
- rl._refreshLine()
- }
- var called = false
- rl.on('line', onLine)
- rl.on('error', onError)
+ return new Promise((resolve, reject) => {
+ const rl = readline.createInterface({ input, output, terminal })
+ const timer = timeout && setTimeout(() => onError(new Error('timed out')), timeout)
- rl.on('SIGINT', function () {
- rl.close()
- onError(new Error('canceled'))
- })
+ output.unmute()
+ rl.setPrompt(prompt)
+ rl.prompt()
- var timer
- if (timeout) {
- timer = setTimeout(function () {
- onError(new Error('timed out'))
- }, timeout)
- }
-
- function done () {
- called = true
- rl.close()
-
- if (process.version.match(/^v0\.6/)) {
- rl.input.removeAllListeners('data')
- rl.input.removeAllListeners('keypress')
- rl.input.pause()
+ if (silent) {
+ output.mute()
+ } else if (editDef) {
+ rl.line = def
+ rl.cursor = def.length
+ rl._refreshLine()
}
- clearTimeout(timer)
- output.mute()
- output.end()
- }
-
- function onError (er) {
- if (called) return
- done()
- return cb(er)
- }
-
- function onLine (line) {
- if (called) return
- if (silent && terminal) {
- output.unmute()
- output.write('\r\n')
+ const done = () => {
+ rl.close()
+ clearTimeout(timer)
+ output.mute()
+ output.end()
}
- done()
- // truncate the \n at the end.
- line = line.replace(/\r?\n$/, '')
- var isDefault = !!(editDef && line === def)
- if (def && !line) {
- isDefault = true
- line = def
+
+ const onError = (er) => {
+ done()
+ reject(er)
}
- cb(null, line, isDefault)
- }
+
+ rl.on('error', onError)
+ rl.on('line', (line) => {
+ if (silent && terminal) {
+ output.unmute()
+ output.write('\r\n')
+ }
+ done()
+ // truncate the \n at the end.
+ const res = line.replace(/\r?\n$/, '') || def || ''
+ return resolve(res)
+ })
+
+ rl.on('SIGINT', () => {
+ rl.close()
+ onError(new Error('canceled'))
+ })
+ })
}
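
Likewise, a minimal sketch of calling the promisified `read()` from the rewrite above: the option names (`prompt`, `default`, `silent`, `replace`) and the promise-based return value are taken from the new signature, while the surrounding script is illustrative only.

```js
// Sketch only: read() now returns a promise instead of taking a callback,
// so it can be awaited directly.
const read = require('read')

const ask = async () => {
  const name = await read({ prompt: 'package name', default: 'my-pkg' })
  const token = await read({ prompt: 'auth token', silent: true, replace: '*' })
  return { name, token }
}

ask()
  .then((answers) => console.log(answers))
  .catch((er) => {
    // SIGINT ('canceled') and the timeout ('timed out') surface as rejections.
    console.error(er.message)
    process.exitCode = 1
  })
```
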
diff --git a/deps/npm/node_modules/read/package.json b/deps/npm/node_modules/read/package.json
index 3b480aa7691175..7f8d7fc249c558 100644
--- a/deps/npm/node_modules/read/package.json
+++ b/deps/npm/node_modules/read/package.json
@@ -1,27 +1,51 @@
{
"name": "read",
- "version": "1.0.7",
+ "version": "2.0.0",
"main": "lib/read.js",
"dependencies": {
- "mute-stream": "~0.0.4"
+ "mute-stream": "~1.0.0"
},
"devDependencies": {
- "tap": "^1.2.0"
+ "@npmcli/eslint-config": "^4.0.0",
+ "@npmcli/template-oss": "4.11.0",
+ "tap": "^16.3.0"
},
"engines": {
- "node": ">=0.8"
+ "node": "^14.17.0 || ^16.13.0 || >=18.0.0"
},
- "author": "Isaac Z. Schlueter (http://blog.izs.me/)",
+ "author": "GitHub Inc.",
"description": "read(1) for node programs",
"repository": {
"type": "git",
- "url": "git://github.com/isaacs/read.git"
+ "url": "https://github.com/npm/read.git"
},
"license": "ISC",
"scripts": {
- "test": "tap test/*.js"
+ "test": "tap",
+ "lint": "eslint \"**/*.js\"",
+ "postlint": "template-oss-check",
+ "template-oss-apply": "template-oss-apply --force",
+ "lintfix": "npm run lint -- --fix",
+ "snap": "tap",
+ "posttest": "npm run lint"
},
"files": [
- "lib/read.js"
- ]
+ "bin/",
+ "lib/"
+ ],
+ "templateOSS": {
+ "//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
+ "version": "4.11.0"
+ },
+ "tap": {
+ "statements": 77,
+ "branches": 75,
+ "functions": 57,
+ "lines": 78,
+ "test-ignore": "fixtures/",
+ "nyc-arg": [
+ "--exclude",
+ "tap-snapshots/**"
+ ]
+ }
}
diff --git a/deps/npm/node_modules/readable-stream/lib/_stream_duplex.js b/deps/npm/node_modules/readable-stream/lib/_stream_duplex.js
index 67525192250f6d..e03c6bf5ff87aa 100644
--- a/deps/npm/node_modules/readable-stream/lib/_stream_duplex.js
+++ b/deps/npm/node_modules/readable-stream/lib/_stream_duplex.js
@@ -1,139 +1,4 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-// a duplex stream is just a stream that is both readable and writable.
-// Since JS doesn't have multiple prototypal inheritance, this class
-// prototypally inherits from Readable, and then parasitically from
-// Writable.
-'use strict';
-/**/
+'use strict'
-var objectKeys = Object.keys || function (obj) {
- var keys = [];
-
- for (var key in obj) {
- keys.push(key);
- }
-
- return keys;
-};
-/* */
-
-
-module.exports = Duplex;
-
-var Readable = require('./_stream_readable');
-
-var Writable = require('./_stream_writable');
-
-require('inherits')(Duplex, Readable);
-
-{
- // Allow the keys array to be GC'ed.
- var keys = objectKeys(Writable.prototype);
-
- for (var v = 0; v < keys.length; v++) {
- var method = keys[v];
- if (!Duplex.prototype[method]) Duplex.prototype[method] = Writable.prototype[method];
- }
-}
-
-function Duplex(options) {
- if (!(this instanceof Duplex)) return new Duplex(options);
- Readable.call(this, options);
- Writable.call(this, options);
- this.allowHalfOpen = true;
-
- if (options) {
- if (options.readable === false) this.readable = false;
- if (options.writable === false) this.writable = false;
-
- if (options.allowHalfOpen === false) {
- this.allowHalfOpen = false;
- this.once('end', onend);
- }
- }
-}
-
-Object.defineProperty(Duplex.prototype, 'writableHighWaterMark', {
- // making it explicit this property is not enumerable
- // because otherwise some prototype manipulation in
- // userland will fail
- enumerable: false,
- get: function get() {
- return this._writableState.highWaterMark;
- }
-});
-Object.defineProperty(Duplex.prototype, 'writableBuffer', {
- // making it explicit this property is not enumerable
- // because otherwise some prototype manipulation in
- // userland will fail
- enumerable: false,
- get: function get() {
- return this._writableState && this._writableState.getBuffer();
- }
-});
-Object.defineProperty(Duplex.prototype, 'writableLength', {
- // making it explicit this property is not enumerable
- // because otherwise some prototype manipulation in
- // userland will fail
- enumerable: false,
- get: function get() {
- return this._writableState.length;
- }
-}); // the no-half-open enforcer
-
-function onend() {
- // If the writable side ended, then we're ok.
- if (this._writableState.ended) return; // no more data can be written.
- // But allow more writes to happen in this tick.
-
- process.nextTick(onEndNT, this);
-}
-
-function onEndNT(self) {
- self.end();
-}
-
-Object.defineProperty(Duplex.prototype, 'destroyed', {
- // making it explicit this property is not enumerable
- // because otherwise some prototype manipulation in
- // userland will fail
- enumerable: false,
- get: function get() {
- if (this._readableState === undefined || this._writableState === undefined) {
- return false;
- }
-
- return this._readableState.destroyed && this._writableState.destroyed;
- },
- set: function set(value) {
- // we ignore the value if the stream
- // has not been initialized yet
- if (this._readableState === undefined || this._writableState === undefined) {
- return;
- } // backward compatibility, the user is explicitly
- // managing destroyed
-
-
- this._readableState.destroyed = value;
- this._writableState.destroyed = value;
- }
-});
\ No newline at end of file
+// Keep this file as an alias for the full stream module.
+module.exports = require('./stream').Duplex
diff --git a/deps/npm/node_modules/readable-stream/lib/_stream_passthrough.js b/deps/npm/node_modules/readable-stream/lib/_stream_passthrough.js
index 32e7414c5a8271..1206dc4555fe2d 100644
--- a/deps/npm/node_modules/readable-stream/lib/_stream_passthrough.js
+++ b/deps/npm/node_modules/readable-stream/lib/_stream_passthrough.js
@@ -1,39 +1,4 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-// a passthrough stream.
-// basically just the most minimal sort of Transform stream.
-// Every written chunk gets output as-is.
-'use strict';
+'use strict'
-module.exports = PassThrough;
-
-var Transform = require('./_stream_transform');
-
-require('inherits')(PassThrough, Transform);
-
-function PassThrough(options) {
- if (!(this instanceof PassThrough)) return new PassThrough(options);
- Transform.call(this, options);
-}
-
-PassThrough.prototype._transform = function (chunk, encoding, cb) {
- cb(null, chunk);
-};
\ No newline at end of file
+// Keep this file as an alias for the full stream module.
+module.exports = require('./stream').PassThrough
diff --git a/deps/npm/node_modules/readable-stream/lib/_stream_readable.js b/deps/npm/node_modules/readable-stream/lib/_stream_readable.js
index 192d451488f208..49416586f20981 100644
--- a/deps/npm/node_modules/readable-stream/lib/_stream_readable.js
+++ b/deps/npm/node_modules/readable-stream/lib/_stream_readable.js
@@ -1,1124 +1,4 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-'use strict';
+'use strict'
-module.exports = Readable;
-/**/
-
-var Duplex;
-/* */
-
-Readable.ReadableState = ReadableState;
-/**/
-
-var EE = require('events').EventEmitter;
-
-var EElistenerCount = function EElistenerCount(emitter, type) {
- return emitter.listeners(type).length;
-};
-/* */
-
-/**/
-
-
-var Stream = require('./internal/streams/stream');
-/* */
-
-
-var Buffer = require('buffer').Buffer;
-
-var OurUint8Array = global.Uint8Array || function () {};
-
-function _uint8ArrayToBuffer(chunk) {
- return Buffer.from(chunk);
-}
-
-function _isUint8Array(obj) {
- return Buffer.isBuffer(obj) || obj instanceof OurUint8Array;
-}
-/**/
-
-
-var debugUtil = require('util');
-
-var debug;
-
-if (debugUtil && debugUtil.debuglog) {
- debug = debugUtil.debuglog('stream');
-} else {
- debug = function debug() {};
-}
-/* */
-
-
-var BufferList = require('./internal/streams/buffer_list');
-
-var destroyImpl = require('./internal/streams/destroy');
-
-var _require = require('./internal/streams/state'),
- getHighWaterMark = _require.getHighWaterMark;
-
-var _require$codes = require('../errors').codes,
- ERR_INVALID_ARG_TYPE = _require$codes.ERR_INVALID_ARG_TYPE,
- ERR_STREAM_PUSH_AFTER_EOF = _require$codes.ERR_STREAM_PUSH_AFTER_EOF,
- ERR_METHOD_NOT_IMPLEMENTED = _require$codes.ERR_METHOD_NOT_IMPLEMENTED,
- ERR_STREAM_UNSHIFT_AFTER_END_EVENT = _require$codes.ERR_STREAM_UNSHIFT_AFTER_END_EVENT; // Lazy loaded to improve the startup performance.
-
-
-var StringDecoder;
-var createReadableStreamAsyncIterator;
-var from;
-
-require('inherits')(Readable, Stream);
-
-var errorOrDestroy = destroyImpl.errorOrDestroy;
-var kProxyEvents = ['error', 'close', 'destroy', 'pause', 'resume'];
-
-function prependListener(emitter, event, fn) {
- // Sadly this is not cacheable as some libraries bundle their own
- // event emitter implementation with them.
- if (typeof emitter.prependListener === 'function') return emitter.prependListener(event, fn); // This is a hack to make sure that our error handler is attached before any
- // userland ones. NEVER DO THIS. This is here only because this code needs
- // to continue to work with older versions of Node.js that do not include
- // the prependListener() method. The goal is to eventually remove this hack.
-
- if (!emitter._events || !emitter._events[event]) emitter.on(event, fn);else if (Array.isArray(emitter._events[event])) emitter._events[event].unshift(fn);else emitter._events[event] = [fn, emitter._events[event]];
-}
-
-function ReadableState(options, stream, isDuplex) {
- Duplex = Duplex || require('./_stream_duplex');
- options = options || {}; // Duplex streams are both readable and writable, but share
- // the same options object.
- // However, some cases require setting options to different
- // values for the readable and the writable sides of the duplex stream.
- // These options can be provided separately as readableXXX and writableXXX.
-
- if (typeof isDuplex !== 'boolean') isDuplex = stream instanceof Duplex; // object stream flag. Used to make read(n) ignore n and to
- // make all the buffer merging and length checks go away
-
- this.objectMode = !!options.objectMode;
- if (isDuplex) this.objectMode = this.objectMode || !!options.readableObjectMode; // the point at which it stops calling _read() to fill the buffer
- // Note: 0 is a valid value, means "don't call _read preemptively ever"
-
- this.highWaterMark = getHighWaterMark(this, options, 'readableHighWaterMark', isDuplex); // A linked list is used to store data chunks instead of an array because the
- // linked list can remove elements from the beginning faster than
- // array.shift()
-
- this.buffer = new BufferList();
- this.length = 0;
- this.pipes = null;
- this.pipesCount = 0;
- this.flowing = null;
- this.ended = false;
- this.endEmitted = false;
- this.reading = false; // a flag to be able to tell if the event 'readable'/'data' is emitted
- // immediately, or on a later tick. We set this to true at first, because
- // any actions that shouldn't happen until "later" should generally also
- // not happen before the first read call.
-
- this.sync = true; // whenever we return null, then we set a flag to say
- // that we're awaiting a 'readable' event emission.
-
- this.needReadable = false;
- this.emittedReadable = false;
- this.readableListening = false;
- this.resumeScheduled = false;
- this.paused = true; // Should close be emitted on destroy. Defaults to true.
-
- this.emitClose = options.emitClose !== false; // Should .destroy() be called after 'end' (and potentially 'finish')
-
- this.autoDestroy = !!options.autoDestroy; // has it been destroyed
-
- this.destroyed = false; // Crypto is kind of old and crusty. Historically, its default string
- // encoding is 'binary' so we have to make this configurable.
- // Everything else in the universe uses 'utf8', though.
-
- this.defaultEncoding = options.defaultEncoding || 'utf8'; // the number of writers that are awaiting a drain event in .pipe()s
-
- this.awaitDrain = 0; // if true, a maybeReadMore has been scheduled
-
- this.readingMore = false;
- this.decoder = null;
- this.encoding = null;
-
- if (options.encoding) {
- if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder;
- this.decoder = new StringDecoder(options.encoding);
- this.encoding = options.encoding;
- }
-}
-
-function Readable(options) {
- Duplex = Duplex || require('./_stream_duplex');
- if (!(this instanceof Readable)) return new Readable(options); // Checking for a Stream.Duplex instance is faster here instead of inside
- // the ReadableState constructor, at least with V8 6.5
-
- var isDuplex = this instanceof Duplex;
- this._readableState = new ReadableState(options, this, isDuplex); // legacy
-
- this.readable = true;
-
- if (options) {
- if (typeof options.read === 'function') this._read = options.read;
- if (typeof options.destroy === 'function') this._destroy = options.destroy;
- }
-
- Stream.call(this);
-}
-
-Object.defineProperty(Readable.prototype, 'destroyed', {
- // making it explicit this property is not enumerable
- // because otherwise some prototype manipulation in
- // userland will fail
- enumerable: false,
- get: function get() {
- if (this._readableState === undefined) {
- return false;
- }
-
- return this._readableState.destroyed;
- },
- set: function set(value) {
- // we ignore the value if the stream
- // has not been initialized yet
- if (!this._readableState) {
- return;
- } // backward compatibility, the user is explicitly
- // managing destroyed
-
-
- this._readableState.destroyed = value;
- }
-});
-Readable.prototype.destroy = destroyImpl.destroy;
-Readable.prototype._undestroy = destroyImpl.undestroy;
-
-Readable.prototype._destroy = function (err, cb) {
- cb(err);
-}; // Manually shove something into the read() buffer.
-// This returns true if the highWaterMark has not been hit yet,
-// similar to how Writable.write() returns true if you should
-// write() some more.
-
-
-Readable.prototype.push = function (chunk, encoding) {
- var state = this._readableState;
- var skipChunkCheck;
-
- if (!state.objectMode) {
- if (typeof chunk === 'string') {
- encoding = encoding || state.defaultEncoding;
-
- if (encoding !== state.encoding) {
- chunk = Buffer.from(chunk, encoding);
- encoding = '';
- }
-
- skipChunkCheck = true;
- }
- } else {
- skipChunkCheck = true;
- }
-
- return readableAddChunk(this, chunk, encoding, false, skipChunkCheck);
-}; // Unshift should *always* be something directly out of read()
-
-
-Readable.prototype.unshift = function (chunk) {
- return readableAddChunk(this, chunk, null, true, false);
-};
-
-function readableAddChunk(stream, chunk, encoding, addToFront, skipChunkCheck) {
- debug('readableAddChunk', chunk);
- var state = stream._readableState;
-
- if (chunk === null) {
- state.reading = false;
- onEofChunk(stream, state);
- } else {
- var er;
- if (!skipChunkCheck) er = chunkInvalid(state, chunk);
-
- if (er) {
- errorOrDestroy(stream, er);
- } else if (state.objectMode || chunk && chunk.length > 0) {
- if (typeof chunk !== 'string' && !state.objectMode && Object.getPrototypeOf(chunk) !== Buffer.prototype) {
- chunk = _uint8ArrayToBuffer(chunk);
- }
-
- if (addToFront) {
- if (state.endEmitted) errorOrDestroy(stream, new ERR_STREAM_UNSHIFT_AFTER_END_EVENT());else addChunk(stream, state, chunk, true);
- } else if (state.ended) {
- errorOrDestroy(stream, new ERR_STREAM_PUSH_AFTER_EOF());
- } else if (state.destroyed) {
- return false;
- } else {
- state.reading = false;
-
- if (state.decoder && !encoding) {
- chunk = state.decoder.write(chunk);
- if (state.objectMode || chunk.length !== 0) addChunk(stream, state, chunk, false);else maybeReadMore(stream, state);
- } else {
- addChunk(stream, state, chunk, false);
- }
- }
- } else if (!addToFront) {
- state.reading = false;
- maybeReadMore(stream, state);
- }
- } // We can push more data if we are below the highWaterMark.
- // Also, if we have no data yet, we can stand some more bytes.
- // This is to work around cases where hwm=0, such as the repl.
-
-
- return !state.ended && (state.length < state.highWaterMark || state.length === 0);
-}
-
-function addChunk(stream, state, chunk, addToFront) {
- if (state.flowing && state.length === 0 && !state.sync) {
- state.awaitDrain = 0;
- stream.emit('data', chunk);
- } else {
- // update the buffer info.
- state.length += state.objectMode ? 1 : chunk.length;
- if (addToFront) state.buffer.unshift(chunk);else state.buffer.push(chunk);
- if (state.needReadable) emitReadable(stream);
- }
-
- maybeReadMore(stream, state);
-}
-
-function chunkInvalid(state, chunk) {
- var er;
-
- if (!_isUint8Array(chunk) && typeof chunk !== 'string' && chunk !== undefined && !state.objectMode) {
- er = new ERR_INVALID_ARG_TYPE('chunk', ['string', 'Buffer', 'Uint8Array'], chunk);
- }
-
- return er;
-}
-
-Readable.prototype.isPaused = function () {
- return this._readableState.flowing === false;
-}; // backwards compatibility.
-
-
-Readable.prototype.setEncoding = function (enc) {
- if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder;
- var decoder = new StringDecoder(enc);
- this._readableState.decoder = decoder; // If setEncoding(null), decoder.encoding equals utf8
-
- this._readableState.encoding = this._readableState.decoder.encoding; // Iterate over current buffer to convert already stored Buffers:
-
- var p = this._readableState.buffer.head;
- var content = '';
-
- while (p !== null) {
- content += decoder.write(p.data);
- p = p.next;
- }
-
- this._readableState.buffer.clear();
-
- if (content !== '') this._readableState.buffer.push(content);
- this._readableState.length = content.length;
- return this;
-}; // Don't raise the hwm > 1GB
-
-
-var MAX_HWM = 0x40000000;
-
-function computeNewHighWaterMark(n) {
- if (n >= MAX_HWM) {
- // TODO(ronag): Throw ERR_VALUE_OUT_OF_RANGE.
- n = MAX_HWM;
- } else {
- // Get the next highest power of 2 to prevent increasing hwm excessively in
- // tiny amounts
- n--;
- n |= n >>> 1;
- n |= n >>> 2;
- n |= n >>> 4;
- n |= n >>> 8;
- n |= n >>> 16;
- n++;
- }
-
- return n;
-} // This function is designed to be inlinable, so please take care when making
-// changes to the function body.
-
-
-function howMuchToRead(n, state) {
- if (n <= 0 || state.length === 0 && state.ended) return 0;
- if (state.objectMode) return 1;
-
- if (n !== n) {
- // Only flow one buffer at a time
- if (state.flowing && state.length) return state.buffer.head.data.length;else return state.length;
- } // If we're asking for more than the current hwm, then raise the hwm.
-
-
- if (n > state.highWaterMark) state.highWaterMark = computeNewHighWaterMark(n);
- if (n <= state.length) return n; // Don't have enough
-
- if (!state.ended) {
- state.needReadable = true;
- return 0;
- }
-
- return state.length;
-} // you can override either this method, or the async _read(n) below.
-
-
-Readable.prototype.read = function (n) {
- debug('read', n);
- n = parseInt(n, 10);
- var state = this._readableState;
- var nOrig = n;
- if (n !== 0) state.emittedReadable = false; // if we're doing read(0) to trigger a readable event, but we
- // already have a bunch of data in the buffer, then just trigger
- // the 'readable' event and move on.
-
- if (n === 0 && state.needReadable && ((state.highWaterMark !== 0 ? state.length >= state.highWaterMark : state.length > 0) || state.ended)) {
- debug('read: emitReadable', state.length, state.ended);
- if (state.length === 0 && state.ended) endReadable(this);else emitReadable(this);
- return null;
- }
-
- n = howMuchToRead(n, state); // if we've ended, and we're now clear, then finish it up.
-
- if (n === 0 && state.ended) {
- if (state.length === 0) endReadable(this);
- return null;
- } // All the actual chunk generation logic needs to be
- // *below* the call to _read. The reason is that in certain
- // synthetic stream cases, such as passthrough streams, _read
- // may be a completely synchronous operation which may change
- // the state of the read buffer, providing enough data when
- // before there was *not* enough.
- //
- // So, the steps are:
- // 1. Figure out what the state of things will be after we do
- // a read from the buffer.
- //
- // 2. If that resulting state will trigger a _read, then call _read.
- // Note that this may be asynchronous, or synchronous. Yes, it is
- // deeply ugly to write APIs this way, but that still doesn't mean
- // that the Readable class should behave improperly, as streams are
- // designed to be sync/async agnostic.
- // Take note if the _read call is sync or async (ie, if the read call
- // has returned yet), so that we know whether or not it's safe to emit
- // 'readable' etc.
- //
- // 3. Actually pull the requested chunks out of the buffer and return.
- // if we need a readable event, then we need to do some reading.
-
-
- var doRead = state.needReadable;
- debug('need readable', doRead); // if we currently have less than the highWaterMark, then also read some
-
- if (state.length === 0 || state.length - n < state.highWaterMark) {
- doRead = true;
- debug('length less than watermark', doRead);
- } // however, if we've ended, then there's no point, and if we're already
- // reading, then it's unnecessary.
-
-
- if (state.ended || state.reading) {
- doRead = false;
- debug('reading or ended', doRead);
- } else if (doRead) {
- debug('do read');
- state.reading = true;
- state.sync = true; // if the length is currently zero, then we *need* a readable event.
-
- if (state.length === 0) state.needReadable = true; // call internal read method
-
- this._read(state.highWaterMark);
-
- state.sync = false; // If _read pushed data synchronously, then `reading` will be false,
- // and we need to re-evaluate how much data we can return to the user.
-
- if (!state.reading) n = howMuchToRead(nOrig, state);
- }
-
- var ret;
- if (n > 0) ret = fromList(n, state);else ret = null;
-
- if (ret === null) {
- state.needReadable = state.length <= state.highWaterMark;
- n = 0;
- } else {
- state.length -= n;
- state.awaitDrain = 0;
- }
-
- if (state.length === 0) {
- // If we have nothing in the buffer, then we want to know
- // as soon as we *do* get something into the buffer.
- if (!state.ended) state.needReadable = true; // If we tried to read() past the EOF, then emit end on the next tick.
-
- if (nOrig !== n && state.ended) endReadable(this);
- }
-
- if (ret !== null) this.emit('data', ret);
- return ret;
-};
-
-function onEofChunk(stream, state) {
- debug('onEofChunk');
- if (state.ended) return;
-
- if (state.decoder) {
- var chunk = state.decoder.end();
-
- if (chunk && chunk.length) {
- state.buffer.push(chunk);
- state.length += state.objectMode ? 1 : chunk.length;
- }
- }
-
- state.ended = true;
-
- if (state.sync) {
- // if we are sync, wait until next tick to emit the data.
- // Otherwise we risk emitting data in the flow()
- // the readable code triggers during a read() call
- emitReadable(stream);
- } else {
- // emit 'readable' now to make sure it gets picked up.
- state.needReadable = false;
-
- if (!state.emittedReadable) {
- state.emittedReadable = true;
- emitReadable_(stream);
- }
- }
-} // Don't emit readable right away in sync mode, because this can trigger
-// another read() call => stack overflow. This way, it might trigger
-// a nextTick recursion warning, but that's not so bad.
-
-
-function emitReadable(stream) {
- var state = stream._readableState;
- debug('emitReadable', state.needReadable, state.emittedReadable);
- state.needReadable = false;
-
- if (!state.emittedReadable) {
- debug('emitReadable', state.flowing);
- state.emittedReadable = true;
- process.nextTick(emitReadable_, stream);
- }
-}
-
-function emitReadable_(stream) {
- var state = stream._readableState;
- debug('emitReadable_', state.destroyed, state.length, state.ended);
-
- if (!state.destroyed && (state.length || state.ended)) {
- stream.emit('readable');
- state.emittedReadable = false;
- } // The stream needs another readable event if
- // 1. It is not flowing, as the flow mechanism will take
- // care of it.
- // 2. It is not ended.
- // 3. It is below the highWaterMark, so we can schedule
- // another readable later.
-
-
- state.needReadable = !state.flowing && !state.ended && state.length <= state.highWaterMark;
- flow(stream);
-} // at this point, the user has presumably seen the 'readable' event,
-// and called read() to consume some data. that may have triggered
-// in turn another _read(n) call, in which case reading = true if
-// it's in progress.
-// However, if we're not ended, or reading, and the length < hwm,
-// then go ahead and try to read some more preemptively.
-
-
-function maybeReadMore(stream, state) {
- if (!state.readingMore) {
- state.readingMore = true;
- process.nextTick(maybeReadMore_, stream, state);
- }
-}
-
-function maybeReadMore_(stream, state) {
- // Attempt to read more data if we should.
- //
- // The conditions for reading more data are (one of):
- // - Not enough data buffered (state.length < state.highWaterMark). The loop
- // is responsible for filling the buffer with enough data if such data
- // is available. If highWaterMark is 0 and we are not in the flowing mode
- // we should _not_ attempt to buffer any extra data. We'll get more data
- // when the stream consumer calls read() instead.
- // - No data in the buffer, and the stream is in flowing mode. In this mode
- // the loop below is responsible for ensuring read() is called. Failing to
- // call read here would abort the flow and there's no other mechanism for
- // continuing the flow if the stream consumer has just subscribed to the
- // 'data' event.
- //
- // In addition to the above conditions to keep reading data, the following
- // conditions prevent the data from being read:
- // - The stream has ended (state.ended).
- // - There is already a pending 'read' operation (state.reading). This is a
- // case where the the stream has called the implementation defined _read()
- // method, but they are processing the call asynchronously and have _not_
- // called push() with new data. In this case we skip performing more
- // read()s. The execution ends in this method again after the _read() ends
- // up calling push() with more data.
- while (!state.reading && !state.ended && (state.length < state.highWaterMark || state.flowing && state.length === 0)) {
- var len = state.length;
- debug('maybeReadMore read 0');
- stream.read(0);
- if (len === state.length) // didn't get any data, stop spinning.
- break;
- }
-
- state.readingMore = false;
-} // abstract method. to be overridden in specific implementation classes.
-// call cb(er, data) where data is <= n in length.
-// for virtual (non-string, non-buffer) streams, "length" is somewhat
-// arbitrary, and perhaps not very meaningful.
-
-
-Readable.prototype._read = function (n) {
- errorOrDestroy(this, new ERR_METHOD_NOT_IMPLEMENTED('_read()'));
-};
-
-Readable.prototype.pipe = function (dest, pipeOpts) {
- var src = this;
- var state = this._readableState;
-
- switch (state.pipesCount) {
- case 0:
- state.pipes = dest;
- break;
-
- case 1:
- state.pipes = [state.pipes, dest];
- break;
-
- default:
- state.pipes.push(dest);
- break;
- }
-
- state.pipesCount += 1;
- debug('pipe count=%d opts=%j', state.pipesCount, pipeOpts);
- var doEnd = (!pipeOpts || pipeOpts.end !== false) && dest !== process.stdout && dest !== process.stderr;
- var endFn = doEnd ? onend : unpipe;
- if (state.endEmitted) process.nextTick(endFn);else src.once('end', endFn);
- dest.on('unpipe', onunpipe);
-
- function onunpipe(readable, unpipeInfo) {
- debug('onunpipe');
-
- if (readable === src) {
- if (unpipeInfo && unpipeInfo.hasUnpiped === false) {
- unpipeInfo.hasUnpiped = true;
- cleanup();
- }
- }
- }
-
- function onend() {
- debug('onend');
- dest.end();
- } // when the dest drains, it reduces the awaitDrain counter
- // on the source. This would be more elegant with a .once()
- // handler in flow(), but adding and removing repeatedly is
- // too slow.
-
-
- var ondrain = pipeOnDrain(src);
- dest.on('drain', ondrain);
- var cleanedUp = false;
-
- function cleanup() {
- debug('cleanup'); // cleanup event handlers once the pipe is broken
-
- dest.removeListener('close', onclose);
- dest.removeListener('finish', onfinish);
- dest.removeListener('drain', ondrain);
- dest.removeListener('error', onerror);
- dest.removeListener('unpipe', onunpipe);
- src.removeListener('end', onend);
- src.removeListener('end', unpipe);
- src.removeListener('data', ondata);
- cleanedUp = true; // if the reader is waiting for a drain event from this
- // specific writer, then it would cause it to never start
- // flowing again.
- // So, if this is awaiting a drain, then we just call it now.
- // If we don't know, then assume that we are waiting for one.
-
- if (state.awaitDrain && (!dest._writableState || dest._writableState.needDrain)) ondrain();
- }
-
- src.on('data', ondata);
-
- function ondata(chunk) {
- debug('ondata');
- var ret = dest.write(chunk);
- debug('dest.write', ret);
-
- if (ret === false) {
- // If the user unpiped during `dest.write()`, it is possible
- // to get stuck in a permanently paused state if that write
- // also returned false.
- // => Check whether `dest` is still a piping destination.
- if ((state.pipesCount === 1 && state.pipes === dest || state.pipesCount > 1 && indexOf(state.pipes, dest) !== -1) && !cleanedUp) {
- debug('false write response, pause', state.awaitDrain);
- state.awaitDrain++;
- }
-
- src.pause();
- }
- } // if the dest has an error, then stop piping into it.
- // however, don't suppress the throwing behavior for this.
-
-
- function onerror(er) {
- debug('onerror', er);
- unpipe();
- dest.removeListener('error', onerror);
- if (EElistenerCount(dest, 'error') === 0) errorOrDestroy(dest, er);
- } // Make sure our error handler is attached before userland ones.
-
-
- prependListener(dest, 'error', onerror); // Both close and finish should trigger unpipe, but only once.
-
- function onclose() {
- dest.removeListener('finish', onfinish);
- unpipe();
- }
-
- dest.once('close', onclose);
-
- function onfinish() {
- debug('onfinish');
- dest.removeListener('close', onclose);
- unpipe();
- }
-
- dest.once('finish', onfinish);
-
- function unpipe() {
- debug('unpipe');
- src.unpipe(dest);
- } // tell the dest that it's being piped to
-
-
- dest.emit('pipe', src); // start the flow if it hasn't been started already.
-
- if (!state.flowing) {
- debug('pipe resume');
- src.resume();
- }
-
- return dest;
-};
-
-function pipeOnDrain(src) {
- return function pipeOnDrainFunctionResult() {
- var state = src._readableState;
- debug('pipeOnDrain', state.awaitDrain);
- if (state.awaitDrain) state.awaitDrain--;
-
- if (state.awaitDrain === 0 && EElistenerCount(src, 'data')) {
- state.flowing = true;
- flow(src);
- }
- };
-}
-
-Readable.prototype.unpipe = function (dest) {
- var state = this._readableState;
- var unpipeInfo = {
- hasUnpiped: false
- }; // if we're not piping anywhere, then do nothing.
-
- if (state.pipesCount === 0) return this; // just one destination. most common case.
-
- if (state.pipesCount === 1) {
- // passed in one, but it's not the right one.
- if (dest && dest !== state.pipes) return this;
- if (!dest) dest = state.pipes; // got a match.
-
- state.pipes = null;
- state.pipesCount = 0;
- state.flowing = false;
- if (dest) dest.emit('unpipe', this, unpipeInfo);
- return this;
- } // slow case. multiple pipe destinations.
-
-
- if (!dest) {
- // remove all.
- var dests = state.pipes;
- var len = state.pipesCount;
- state.pipes = null;
- state.pipesCount = 0;
- state.flowing = false;
-
- for (var i = 0; i < len; i++) {
- dests[i].emit('unpipe', this, {
- hasUnpiped: false
- });
- }
-
- return this;
- } // try to find the right one.
-
-
- var index = indexOf(state.pipes, dest);
- if (index === -1) return this;
- state.pipes.splice(index, 1);
- state.pipesCount -= 1;
- if (state.pipesCount === 1) state.pipes = state.pipes[0];
- dest.emit('unpipe', this, unpipeInfo);
- return this;
-}; // set up data events if they are asked for
-// Ensure readable listeners eventually get something
-
-
-Readable.prototype.on = function (ev, fn) {
- var res = Stream.prototype.on.call(this, ev, fn);
- var state = this._readableState;
-
- if (ev === 'data') {
- // update readableListening so that resume() may be a no-op
- // a few lines down. This is needed to support once('readable').
- state.readableListening = this.listenerCount('readable') > 0; // Try start flowing on next tick if stream isn't explicitly paused
-
- if (state.flowing !== false) this.resume();
- } else if (ev === 'readable') {
- if (!state.endEmitted && !state.readableListening) {
- state.readableListening = state.needReadable = true;
- state.flowing = false;
- state.emittedReadable = false;
- debug('on readable', state.length, state.reading);
-
- if (state.length) {
- emitReadable(this);
- } else if (!state.reading) {
- process.nextTick(nReadingNextTick, this);
- }
- }
- }
-
- return res;
-};
-
-Readable.prototype.addListener = Readable.prototype.on;
-
-Readable.prototype.removeListener = function (ev, fn) {
- var res = Stream.prototype.removeListener.call(this, ev, fn);
-
- if (ev === 'readable') {
- // We need to check if there is someone still listening to
- // readable and reset the state. However this needs to happen
- // after readable has been emitted but before I/O (nextTick) to
- // support once('readable', fn) cycles. This means that calling
- // resume within the same tick will have no
- // effect.
- process.nextTick(updateReadableListening, this);
- }
-
- return res;
-};
-
-Readable.prototype.removeAllListeners = function (ev) {
- var res = Stream.prototype.removeAllListeners.apply(this, arguments);
-
- if (ev === 'readable' || ev === undefined) {
- // We need to check if there is someone still listening to
- // readable and reset the state. However this needs to happen
- // after readable has been emitted but before I/O (nextTick) to
- // support once('readable', fn) cycles. This means that calling
- // resume within the same tick will have no
- // effect.
- process.nextTick(updateReadableListening, this);
- }
-
- return res;
-};
-
-function updateReadableListening(self) {
- var state = self._readableState;
- state.readableListening = self.listenerCount('readable') > 0;
-
- if (state.resumeScheduled && !state.paused) {
- // flowing needs to be set to true now, otherwise
- // the upcoming resume will not flow.
- state.flowing = true; // crude way to check if we should resume
- } else if (self.listenerCount('data') > 0) {
- self.resume();
- }
-}
-
-function nReadingNextTick(self) {
- debug('readable nexttick read 0');
- self.read(0);
-} // pause() and resume() are remnants of the legacy readable stream API
-// If the user uses them, then switch into old mode.
-
-
-Readable.prototype.resume = function () {
- var state = this._readableState;
-
- if (!state.flowing) {
- debug('resume'); // we flow only if there is no one listening
- // for readable, but we still have to call
- // resume()
-
- state.flowing = !state.readableListening;
- resume(this, state);
- }
-
- state.paused = false;
- return this;
-};
-
-function resume(stream, state) {
- if (!state.resumeScheduled) {
- state.resumeScheduled = true;
- process.nextTick(resume_, stream, state);
- }
-}
-
-function resume_(stream, state) {
- debug('resume', state.reading);
-
- if (!state.reading) {
- stream.read(0);
- }
-
- state.resumeScheduled = false;
- stream.emit('resume');
- flow(stream);
- if (state.flowing && !state.reading) stream.read(0);
-}
-
-Readable.prototype.pause = function () {
- debug('call pause flowing=%j', this._readableState.flowing);
-
- if (this._readableState.flowing !== false) {
- debug('pause');
- this._readableState.flowing = false;
- this.emit('pause');
- }
-
- this._readableState.paused = true;
- return this;
-};
-
-function flow(stream) {
- var state = stream._readableState;
- debug('flow', state.flowing);
-
- while (state.flowing && stream.read() !== null) {
- ;
- }
-} // wrap an old-style stream as the async data source.
-// This is *not* part of the readable stream interface.
-// It is an ugly unfortunate mess of history.
-
-
-Readable.prototype.wrap = function (stream) {
- var _this = this;
-
- var state = this._readableState;
- var paused = false;
- stream.on('end', function () {
- debug('wrapped end');
-
- if (state.decoder && !state.ended) {
- var chunk = state.decoder.end();
- if (chunk && chunk.length) _this.push(chunk);
- }
-
- _this.push(null);
- });
- stream.on('data', function (chunk) {
- debug('wrapped data');
- if (state.decoder) chunk = state.decoder.write(chunk); // don't skip over falsy values in objectMode
-
- if (state.objectMode && (chunk === null || chunk === undefined)) return;else if (!state.objectMode && (!chunk || !chunk.length)) return;
-
- var ret = _this.push(chunk);
-
- if (!ret) {
- paused = true;
- stream.pause();
- }
- }); // proxy all the other methods.
- // important when wrapping filters and duplexes.
-
- for (var i in stream) {
- if (this[i] === undefined && typeof stream[i] === 'function') {
- this[i] = function methodWrap(method) {
- return function methodWrapReturnFunction() {
- return stream[method].apply(stream, arguments);
- };
- }(i);
- }
- } // proxy certain important events.
-
-
- for (var n = 0; n < kProxyEvents.length; n++) {
- stream.on(kProxyEvents[n], this.emit.bind(this, kProxyEvents[n]));
- } // when we try to consume some more bytes, simply unpause the
- // underlying stream.
-
-
- this._read = function (n) {
- debug('wrapped _read', n);
-
- if (paused) {
- paused = false;
- stream.resume();
- }
- };
-
- return this;
-};
-
-if (typeof Symbol === 'function') {
- Readable.prototype[Symbol.asyncIterator] = function () {
- if (createReadableStreamAsyncIterator === undefined) {
- createReadableStreamAsyncIterator = require('./internal/streams/async_iterator');
- }
-
- return createReadableStreamAsyncIterator(this);
- };
-}
-
-Object.defineProperty(Readable.prototype, 'readableHighWaterMark', {
- // making it explicit this property is not enumerable
- // because otherwise some prototype manipulation in
- // userland will fail
- enumerable: false,
- get: function get() {
- return this._readableState.highWaterMark;
- }
-});
-Object.defineProperty(Readable.prototype, 'readableBuffer', {
- // making it explicit this property is not enumerable
- // because otherwise some prototype manipulation in
- // userland will fail
- enumerable: false,
- get: function get() {
- return this._readableState && this._readableState.buffer;
- }
-});
-Object.defineProperty(Readable.prototype, 'readableFlowing', {
- // making it explicit this property is not enumerable
- // because otherwise some prototype manipulation in
- // userland will fail
- enumerable: false,
- get: function get() {
- return this._readableState.flowing;
- },
- set: function set(state) {
- if (this._readableState) {
- this._readableState.flowing = state;
- }
- }
-}); // exposed for testing purposes only.
-
-Readable._fromList = fromList;
-Object.defineProperty(Readable.prototype, 'readableLength', {
- // making it explicit this property is not enumerable
- // because otherwise some prototype manipulation in
- // userland will fail
- enumerable: false,
- get: function get() {
- return this._readableState.length;
- }
-}); // Pluck off n bytes from an array of buffers.
-// Length is the combined lengths of all the buffers in the list.
-// This function is designed to be inlinable, so please take care when making
-// changes to the function body.
-
-function fromList(n, state) {
- // nothing buffered
- if (state.length === 0) return null;
- var ret;
- if (state.objectMode) ret = state.buffer.shift();else if (!n || n >= state.length) {
- // read it all, truncate the list
- if (state.decoder) ret = state.buffer.join('');else if (state.buffer.length === 1) ret = state.buffer.first();else ret = state.buffer.concat(state.length);
- state.buffer.clear();
- } else {
- // read part of list
- ret = state.buffer.consume(n, state.decoder);
- }
- return ret;
-}
-
-function endReadable(stream) {
- var state = stream._readableState;
- debug('endReadable', state.endEmitted);
-
- if (!state.endEmitted) {
- state.ended = true;
- process.nextTick(endReadableNT, state, stream);
- }
-}
-
-function endReadableNT(state, stream) {
- debug('endReadableNT', state.endEmitted, state.length); // Check that we didn't get one last unshift.
-
- if (!state.endEmitted && state.length === 0) {
- state.endEmitted = true;
- stream.readable = false;
- stream.emit('end');
-
- if (state.autoDestroy) {
- // In case of duplex streams we need a way to detect
- // if the writable side is ready for autoDestroy as well
- var wState = stream._writableState;
-
- if (!wState || wState.autoDestroy && wState.finished) {
- stream.destroy();
- }
- }
- }
-}
-
-if (typeof Symbol === 'function') {
- Readable.from = function (iterable, opts) {
- if (from === undefined) {
- from = require('./internal/streams/from');
- }
-
- return from(Readable, iterable, opts);
- };
-}
-
-function indexOf(xs, x) {
- for (var i = 0, l = xs.length; i < l; i++) {
- if (xs[i] === x) return i;
- }
-
- return -1;
-}
\ No newline at end of file
+// Keep this file as an alias for the full stream module.
+module.exports = require('./stream').Readable
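
The deep-require file above now simply re-exports the bundled implementation, so the backpressure handshake that the removed pipe() code managed (dest.write() returning false pauses the source until 'drain' fires, then pipeOnDrain resumes it) is unchanged for consumers. A minimal sketch of that handshake, assuming Node's built-in stream module (readable-stream exposes the same classes); the slow sink and the 10 ms delay are illustrative only:

    'use strict'
    const { Readable, Writable } = require('stream')

    const src = Readable.from(['a', 'b', 'c', 'd'])

    const dest = new Writable({
      highWaterMark: 1, // tiny buffer so write() reports backpressure immediately
      write (chunk, encoding, callback) {
        // Simulate a slow sink; pipe() pauses src until this callback runs
        // and the buffered length drops below highWaterMark ('drain').
        setTimeout(() => {
          console.log('wrote', chunk.toString())
          callback()
        }, 10)
      }
    })

    src.pipe(dest).on('finish', () => console.log('done'))
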
diff --git a/deps/npm/node_modules/readable-stream/lib/_stream_transform.js b/deps/npm/node_modules/readable-stream/lib/_stream_transform.js
index 41a738c4e93599..ef227b12c57c3d 100644
--- a/deps/npm/node_modules/readable-stream/lib/_stream_transform.js
+++ b/deps/npm/node_modules/readable-stream/lib/_stream_transform.js
@@ -1,201 +1,4 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-// a transform stream is a readable/writable stream where you do
-// something with the data. Sometimes it's called a "filter",
-// but that's not a great name for it, since that implies a thing where
-// some bits pass through, and others are simply ignored. (That would
-// be a valid example of a transform, of course.)
-//
-// While the output is causally related to the input, it's not a
-// necessarily symmetric or synchronous transformation. For example,
-// a zlib stream might take multiple plain-text writes(), and then
-// emit a single compressed chunk some time in the future.
-//
-// Here's how this works:
-//
-// The Transform stream has all the aspects of the readable and writable
-// stream classes. When you write(chunk), that calls _write(chunk,cb)
-// internally, and returns false if there's a lot of pending writes
-// buffered up. When you call read(), that calls _read(n) until
-// there's enough pending readable data buffered up.
-//
-// In a transform stream, the written data is placed in a buffer. When
-// _read(n) is called, it transforms the queued up data, calling the
-// buffered _write cb's as it consumes chunks. If consuming a single
-// written chunk would result in multiple output chunks, then the first
-// outputted bit calls the readcb, and subsequent chunks just go into
-// the read buffer, and will cause it to emit 'readable' if necessary.
-//
-// This way, back-pressure is actually determined by the reading side,
-// since _read has to be called to start processing a new chunk. However,
-// a pathological inflate type of transform can cause excessive buffering
-// here. For example, imagine a stream where every byte of input is
-// interpreted as an integer from 0-255, and then results in that many
-// bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in
-// 1kb of data being output. In this case, you could write a very small
-// amount of input, and end up with a very large amount of output. In
-// such a pathological inflating mechanism, there'd be no way to tell
-// the system to stop doing the transform. A single 4MB write could
-// cause the system to run out of memory.
-//
-// However, even in such a pathological case, only a single written chunk
-// would be consumed, and then the rest would wait (un-transformed) until
-// the results of the previous transformed chunk were consumed.
-'use strict';
+'use strict'
-module.exports = Transform;
-
-var _require$codes = require('../errors').codes,
- ERR_METHOD_NOT_IMPLEMENTED = _require$codes.ERR_METHOD_NOT_IMPLEMENTED,
- ERR_MULTIPLE_CALLBACK = _require$codes.ERR_MULTIPLE_CALLBACK,
- ERR_TRANSFORM_ALREADY_TRANSFORMING = _require$codes.ERR_TRANSFORM_ALREADY_TRANSFORMING,
- ERR_TRANSFORM_WITH_LENGTH_0 = _require$codes.ERR_TRANSFORM_WITH_LENGTH_0;
-
-var Duplex = require('./_stream_duplex');
-
-require('inherits')(Transform, Duplex);
-
-function afterTransform(er, data) {
- var ts = this._transformState;
- ts.transforming = false;
- var cb = ts.writecb;
-
- if (cb === null) {
- return this.emit('error', new ERR_MULTIPLE_CALLBACK());
- }
-
- ts.writechunk = null;
- ts.writecb = null;
- if (data != null) // single equals check for both `null` and `undefined`
- this.push(data);
- cb(er);
- var rs = this._readableState;
- rs.reading = false;
-
- if (rs.needReadable || rs.length < rs.highWaterMark) {
- this._read(rs.highWaterMark);
- }
-}
-
-function Transform(options) {
- if (!(this instanceof Transform)) return new Transform(options);
- Duplex.call(this, options);
- this._transformState = {
- afterTransform: afterTransform.bind(this),
- needTransform: false,
- transforming: false,
- writecb: null,
- writechunk: null,
- writeencoding: null
- }; // start out asking for a readable event once data is transformed.
-
- this._readableState.needReadable = true; // we have implemented the _read method, and done the other things
- // that Readable wants before the first _read call, so unset the
- // sync guard flag.
-
- this._readableState.sync = false;
-
- if (options) {
- if (typeof options.transform === 'function') this._transform = options.transform;
- if (typeof options.flush === 'function') this._flush = options.flush;
- } // When the writable side finishes, then flush out anything remaining.
-
-
- this.on('prefinish', prefinish);
-}
-
-function prefinish() {
- var _this = this;
-
- if (typeof this._flush === 'function' && !this._readableState.destroyed) {
- this._flush(function (er, data) {
- done(_this, er, data);
- });
- } else {
- done(this, null, null);
- }
-}
-
-Transform.prototype.push = function (chunk, encoding) {
- this._transformState.needTransform = false;
- return Duplex.prototype.push.call(this, chunk, encoding);
-}; // This is the part where you do stuff!
-// override this function in implementation classes.
-// 'chunk' is an input chunk.
-//
-// Call `push(newChunk)` to pass along transformed output
-// to the readable side. You may call 'push' zero or more times.
-//
-// Call `cb(err)` when you are done with this chunk. If you pass
-// an error, then that'll put the hurt on the whole operation. If you
-// never call cb(), then you'll never get another chunk.
-
-
-Transform.prototype._transform = function (chunk, encoding, cb) {
- cb(new ERR_METHOD_NOT_IMPLEMENTED('_transform()'));
-};
-
-Transform.prototype._write = function (chunk, encoding, cb) {
- var ts = this._transformState;
- ts.writecb = cb;
- ts.writechunk = chunk;
- ts.writeencoding = encoding;
-
- if (!ts.transforming) {
- var rs = this._readableState;
- if (ts.needTransform || rs.needReadable || rs.length < rs.highWaterMark) this._read(rs.highWaterMark);
- }
-}; // Doesn't matter what the args are here.
-// _transform does all the work.
-// That we got here means that the readable side wants more data.
-
-
-Transform.prototype._read = function (n) {
- var ts = this._transformState;
-
- if (ts.writechunk !== null && !ts.transforming) {
- ts.transforming = true;
-
- this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform);
- } else {
- // mark that we need a transform, so that any data that comes in
- // will get processed, now that we've asked for it.
- ts.needTransform = true;
- }
-};
-
-Transform.prototype._destroy = function (err, cb) {
- Duplex.prototype._destroy.call(this, err, function (err2) {
- cb(err2);
- });
-};
-
-function done(stream, er, data) {
- if (er) return stream.emit('error', er);
- if (data != null) // single equals check for both `null` and `undefined`
- stream.push(data); // TODO(BridgeAR): Write a test for these two error cases
- // if there's nothing in the write buffer, then that means
- // that nothing more will ever be provided
-
- if (stream._writableState.length) throw new ERR_TRANSFORM_WITH_LENGTH_0();
- if (stream._transformState.transforming) throw new ERR_TRANSFORM_ALREADY_TRANSFORMING();
- return stream.push(null);
-}
\ No newline at end of file
+// Keep this file as an alias for the full stream module.
+module.exports = require('./stream').Transform
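
The removed header comments describe the Transform contract: _transform(chunk, encoding, callback) receives each written chunk, push() emits zero or more output chunks, and the callback must be called exactly once per input chunk. A minimal sketch of that contract through the public constructor options (the uppercasing transform is illustrative only):

    'use strict'
    const { Transform } = require('stream')

    const upper = new Transform({
      transform (chunk, encoding, callback) {
        // Passing data as the second callback argument is equivalent to
        // calling this.push(data) followed by callback().
        callback(null, chunk.toString().toUpperCase())
      }
    })

    upper.on('data', (chunk) => console.log(chunk.toString())) // HELLO TRANSFORM
    upper.end('hello transform')
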
diff --git a/deps/npm/node_modules/readable-stream/lib/_stream_writable.js b/deps/npm/node_modules/readable-stream/lib/_stream_writable.js
index a2634d7c24fd5e..00c7b037ce7bff 100644
--- a/deps/npm/node_modules/readable-stream/lib/_stream_writable.js
+++ b/deps/npm/node_modules/readable-stream/lib/_stream_writable.js
@@ -1,697 +1,4 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-// A bit simpler than readable streams.
-// Implement an async ._write(chunk, encoding, cb), and it'll handle all
-// the drain event emission and buffering.
-'use strict';
+'use strict'
-module.exports = Writable;
-/* */
-
-function WriteReq(chunk, encoding, cb) {
- this.chunk = chunk;
- this.encoding = encoding;
- this.callback = cb;
- this.next = null;
-} // It seems like a linked list but it is not
-// there will be only 2 of these for each stream
-
-
-function CorkedRequest(state) {
- var _this = this;
-
- this.next = null;
- this.entry = null;
-
- this.finish = function () {
- onCorkedFinish(_this, state);
- };
-}
-/* */
-
-/**/
-
-
-var Duplex;
-/* */
-
-Writable.WritableState = WritableState;
-/**/
-
-var internalUtil = {
- deprecate: require('util-deprecate')
-};
-/* */
-
-/**/
-
-var Stream = require('./internal/streams/stream');
-/* */
-
-
-var Buffer = require('buffer').Buffer;
-
-var OurUint8Array = global.Uint8Array || function () {};
-
-function _uint8ArrayToBuffer(chunk) {
- return Buffer.from(chunk);
-}
-
-function _isUint8Array(obj) {
- return Buffer.isBuffer(obj) || obj instanceof OurUint8Array;
-}
-
-var destroyImpl = require('./internal/streams/destroy');
-
-var _require = require('./internal/streams/state'),
- getHighWaterMark = _require.getHighWaterMark;
-
-var _require$codes = require('../errors').codes,
- ERR_INVALID_ARG_TYPE = _require$codes.ERR_INVALID_ARG_TYPE,
- ERR_METHOD_NOT_IMPLEMENTED = _require$codes.ERR_METHOD_NOT_IMPLEMENTED,
- ERR_MULTIPLE_CALLBACK = _require$codes.ERR_MULTIPLE_CALLBACK,
- ERR_STREAM_CANNOT_PIPE = _require$codes.ERR_STREAM_CANNOT_PIPE,
- ERR_STREAM_DESTROYED = _require$codes.ERR_STREAM_DESTROYED,
- ERR_STREAM_NULL_VALUES = _require$codes.ERR_STREAM_NULL_VALUES,
- ERR_STREAM_WRITE_AFTER_END = _require$codes.ERR_STREAM_WRITE_AFTER_END,
- ERR_UNKNOWN_ENCODING = _require$codes.ERR_UNKNOWN_ENCODING;
-
-var errorOrDestroy = destroyImpl.errorOrDestroy;
-
-require('inherits')(Writable, Stream);
-
-function nop() {}
-
-function WritableState(options, stream, isDuplex) {
- Duplex = Duplex || require('./_stream_duplex');
- options = options || {}; // Duplex streams are both readable and writable, but share
- // the same options object.
- // However, some cases require setting options to different
- // values for the readable and the writable sides of the duplex stream,
- // e.g. options.readableObjectMode vs. options.writableObjectMode, etc.
-
- if (typeof isDuplex !== 'boolean') isDuplex = stream instanceof Duplex; // object stream flag to indicate whether or not this stream
- // contains buffers or objects.
-
- this.objectMode = !!options.objectMode;
- if (isDuplex) this.objectMode = this.objectMode || !!options.writableObjectMode; // the point at which write() starts returning false
- // Note: 0 is a valid value, means that we always return false if
- // the entire buffer is not flushed immediately on write()
-
- this.highWaterMark = getHighWaterMark(this, options, 'writableHighWaterMark', isDuplex); // if _final has been called
-
- this.finalCalled = false; // drain event flag.
-
- this.needDrain = false; // at the start of calling end()
-
- this.ending = false; // when end() has been called, and returned
-
- this.ended = false; // when 'finish' is emitted
-
- this.finished = false; // has it been destroyed
-
- this.destroyed = false; // should we decode strings into buffers before passing to _write?
- // this is here so that some node-core streams can optimize string
- // handling at a lower level.
-
- var noDecode = options.decodeStrings === false;
- this.decodeStrings = !noDecode; // Crypto is kind of old and crusty. Historically, its default string
- // encoding is 'binary' so we have to make this configurable.
- // Everything else in the universe uses 'utf8', though.
-
- this.defaultEncoding = options.defaultEncoding || 'utf8'; // not an actual buffer we keep track of, but a measurement
- // of how much we're waiting to get pushed to some underlying
- // socket or file.
-
- this.length = 0; // a flag to see when we're in the middle of a write.
-
- this.writing = false; // when true all writes will be buffered until .uncork() call
-
- this.corked = 0; // a flag to be able to tell if the onwrite cb is called immediately,
- // or on a later tick. We set this to true at first, because any
- // actions that shouldn't happen until "later" should generally also
- // not happen before the first write call.
-
- this.sync = true; // a flag to know if we're processing previously buffered items, which
- // may call the _write() callback in the same tick, so that we don't
- // end up in an overlapped onwrite situation.
-
- this.bufferProcessing = false; // the callback that's passed to _write(chunk,cb)
-
- this.onwrite = function (er) {
- onwrite(stream, er);
- }; // the callback that the user supplies to write(chunk,encoding,cb)
-
-
- this.writecb = null; // the amount that is being written when _write is called.
-
- this.writelen = 0;
- this.bufferedRequest = null;
- this.lastBufferedRequest = null; // number of pending user-supplied write callbacks
- // this must be 0 before 'finish' can be emitted
-
- this.pendingcb = 0; // emit prefinish if the only thing we're waiting for is _write cbs
- // This is relevant for synchronous Transform streams
-
- this.prefinished = false; // True if the error was already emitted and should not be thrown again
-
- this.errorEmitted = false; // Should close be emitted on destroy. Defaults to true.
-
- this.emitClose = options.emitClose !== false; // Should .destroy() be called after 'finish' (and potentially 'end')
-
- this.autoDestroy = !!options.autoDestroy; // count buffered requests
-
- this.bufferedRequestCount = 0; // allocate the first CorkedRequest, there is always
- // one allocated and free to use, and we maintain at most two
-
- this.corkedRequestsFree = new CorkedRequest(this);
-}
-
-WritableState.prototype.getBuffer = function getBuffer() {
- var current = this.bufferedRequest;
- var out = [];
-
- while (current) {
- out.push(current);
- current = current.next;
- }
-
- return out;
-};
-
-(function () {
- try {
- Object.defineProperty(WritableState.prototype, 'buffer', {
- get: internalUtil.deprecate(function writableStateBufferGetter() {
- return this.getBuffer();
- }, '_writableState.buffer is deprecated. Use _writableState.getBuffer ' + 'instead.', 'DEP0003')
- });
- } catch (_) {}
-})(); // Test _writableState for inheritance to account for Duplex streams,
-// whose prototype chain only points to Readable.
-
-
-var realHasInstance;
-
-if (typeof Symbol === 'function' && Symbol.hasInstance && typeof Function.prototype[Symbol.hasInstance] === 'function') {
- realHasInstance = Function.prototype[Symbol.hasInstance];
- Object.defineProperty(Writable, Symbol.hasInstance, {
- value: function value(object) {
- if (realHasInstance.call(this, object)) return true;
- if (this !== Writable) return false;
- return object && object._writableState instanceof WritableState;
- }
- });
-} else {
- realHasInstance = function realHasInstance(object) {
- return object instanceof this;
- };
-}
-
-function Writable(options) {
- Duplex = Duplex || require('./_stream_duplex'); // Writable ctor is applied to Duplexes, too.
- // `realHasInstance` is necessary because using plain `instanceof`
- // would return false, as no `_writableState` property is attached.
- // Trying to use the custom `instanceof` for Writable here will also break the
- // Node.js LazyTransform implementation, which has a non-trivial getter for
- // `_writableState` that would lead to infinite recursion.
- // Checking for a Stream.Duplex instance is faster here instead of inside
- // the WritableState constructor, at least with V8 6.5
-
- var isDuplex = this instanceof Duplex;
- if (!isDuplex && !realHasInstance.call(Writable, this)) return new Writable(options);
- this._writableState = new WritableState(options, this, isDuplex); // legacy.
-
- this.writable = true;
-
- if (options) {
- if (typeof options.write === 'function') this._write = options.write;
- if (typeof options.writev === 'function') this._writev = options.writev;
- if (typeof options.destroy === 'function') this._destroy = options.destroy;
- if (typeof options.final === 'function') this._final = options.final;
- }
-
- Stream.call(this);
-} // Otherwise people can pipe Writable streams, which is just wrong.
-
-
-Writable.prototype.pipe = function () {
- errorOrDestroy(this, new ERR_STREAM_CANNOT_PIPE());
-};
-
-function writeAfterEnd(stream, cb) {
- var er = new ERR_STREAM_WRITE_AFTER_END(); // TODO: defer error events consistently everywhere, not just the cb
-
- errorOrDestroy(stream, er);
- process.nextTick(cb, er);
-} // Checks that a user-supplied chunk is valid, especially for the particular
-// mode the stream is in. Currently this means that `null` is never accepted
-// and undefined/non-string values are only allowed in object mode.
-
-
-function validChunk(stream, state, chunk, cb) {
- var er;
-
- if (chunk === null) {
- er = new ERR_STREAM_NULL_VALUES();
- } else if (typeof chunk !== 'string' && !state.objectMode) {
- er = new ERR_INVALID_ARG_TYPE('chunk', ['string', 'Buffer'], chunk);
- }
-
- if (er) {
- errorOrDestroy(stream, er);
- process.nextTick(cb, er);
- return false;
- }
-
- return true;
-}
-
-Writable.prototype.write = function (chunk, encoding, cb) {
- var state = this._writableState;
- var ret = false;
-
- var isBuf = !state.objectMode && _isUint8Array(chunk);
-
- if (isBuf && !Buffer.isBuffer(chunk)) {
- chunk = _uint8ArrayToBuffer(chunk);
- }
-
- if (typeof encoding === 'function') {
- cb = encoding;
- encoding = null;
- }
-
- if (isBuf) encoding = 'buffer';else if (!encoding) encoding = state.defaultEncoding;
- if (typeof cb !== 'function') cb = nop;
- if (state.ending) writeAfterEnd(this, cb);else if (isBuf || validChunk(this, state, chunk, cb)) {
- state.pendingcb++;
- ret = writeOrBuffer(this, state, isBuf, chunk, encoding, cb);
- }
- return ret;
-};
-
-Writable.prototype.cork = function () {
- this._writableState.corked++;
-};
-
-Writable.prototype.uncork = function () {
- var state = this._writableState;
-
- if (state.corked) {
- state.corked--;
- if (!state.writing && !state.corked && !state.bufferProcessing && state.bufferedRequest) clearBuffer(this, state);
- }
-};
-
-Writable.prototype.setDefaultEncoding = function setDefaultEncoding(encoding) {
- // node::ParseEncoding() requires lower case.
- if (typeof encoding === 'string') encoding = encoding.toLowerCase();
- if (!(['hex', 'utf8', 'utf-8', 'ascii', 'binary', 'base64', 'ucs2', 'ucs-2', 'utf16le', 'utf-16le', 'raw'].indexOf((encoding + '').toLowerCase()) > -1)) throw new ERR_UNKNOWN_ENCODING(encoding);
- this._writableState.defaultEncoding = encoding;
- return this;
-};
-
-Object.defineProperty(Writable.prototype, 'writableBuffer', {
- // making it explicit this property is not enumerable
- // because otherwise some prototype manipulation in
- // userland will fail
- enumerable: false,
- get: function get() {
- return this._writableState && this._writableState.getBuffer();
- }
-});
-
-function decodeChunk(state, chunk, encoding) {
- if (!state.objectMode && state.decodeStrings !== false && typeof chunk === 'string') {
- chunk = Buffer.from(chunk, encoding);
- }
-
- return chunk;
-}
-
-Object.defineProperty(Writable.prototype, 'writableHighWaterMark', {
- // making it explicit this property is not enumerable
- // because otherwise some prototype manipulation in
- // userland will fail
- enumerable: false,
- get: function get() {
- return this._writableState.highWaterMark;
- }
-}); // if we're already writing something, then just put this
-// in the queue, and wait our turn. Otherwise, call _write
-// If we return false, then we need a drain event, so set that flag.
-
-function writeOrBuffer(stream, state, isBuf, chunk, encoding, cb) {
- if (!isBuf) {
- var newChunk = decodeChunk(state, chunk, encoding);
-
- if (chunk !== newChunk) {
- isBuf = true;
- encoding = 'buffer';
- chunk = newChunk;
- }
- }
-
- var len = state.objectMode ? 1 : chunk.length;
- state.length += len;
- var ret = state.length < state.highWaterMark; // we must ensure that previous needDrain will not be reset to false.
-
- if (!ret) state.needDrain = true;
-
- if (state.writing || state.corked) {
- var last = state.lastBufferedRequest;
- state.lastBufferedRequest = {
- chunk: chunk,
- encoding: encoding,
- isBuf: isBuf,
- callback: cb,
- next: null
- };
-
- if (last) {
- last.next = state.lastBufferedRequest;
- } else {
- state.bufferedRequest = state.lastBufferedRequest;
- }
-
- state.bufferedRequestCount += 1;
- } else {
- doWrite(stream, state, false, len, chunk, encoding, cb);
- }
-
- return ret;
-}
-
-function doWrite(stream, state, writev, len, chunk, encoding, cb) {
- state.writelen = len;
- state.writecb = cb;
- state.writing = true;
- state.sync = true;
- if (state.destroyed) state.onwrite(new ERR_STREAM_DESTROYED('write'));else if (writev) stream._writev(chunk, state.onwrite);else stream._write(chunk, encoding, state.onwrite);
- state.sync = false;
-}
-
-function onwriteError(stream, state, sync, er, cb) {
- --state.pendingcb;
-
- if (sync) {
- // defer the callback if we are being called synchronously
- // to avoid piling up things on the stack
- process.nextTick(cb, er); // this can emit finish, and it will always happen
- // after error
-
- process.nextTick(finishMaybe, stream, state);
- stream._writableState.errorEmitted = true;
- errorOrDestroy(stream, er);
- } else {
-    // the caller expects this to happen before if
- // it is async
- cb(er);
- stream._writableState.errorEmitted = true;
- errorOrDestroy(stream, er); // this can emit finish, but finish must
- // always follow error
-
- finishMaybe(stream, state);
- }
-}
-
-function onwriteStateUpdate(state) {
- state.writing = false;
- state.writecb = null;
- state.length -= state.writelen;
- state.writelen = 0;
-}
-
-function onwrite(stream, er) {
- var state = stream._writableState;
- var sync = state.sync;
- var cb = state.writecb;
- if (typeof cb !== 'function') throw new ERR_MULTIPLE_CALLBACK();
- onwriteStateUpdate(state);
- if (er) onwriteError(stream, state, sync, er, cb);else {
- // Check if we're actually ready to finish, but don't emit yet
- var finished = needFinish(state) || stream.destroyed;
-
- if (!finished && !state.corked && !state.bufferProcessing && state.bufferedRequest) {
- clearBuffer(stream, state);
- }
-
- if (sync) {
- process.nextTick(afterWrite, stream, state, finished, cb);
- } else {
- afterWrite(stream, state, finished, cb);
- }
- }
-}
-
-function afterWrite(stream, state, finished, cb) {
- if (!finished) onwriteDrain(stream, state);
- state.pendingcb--;
- cb();
- finishMaybe(stream, state);
-} // Must force callback to be called on nextTick, so that we don't
-// emit 'drain' before the write() consumer gets the 'false' return
-// value, and has a chance to attach a 'drain' listener.
-
-
-function onwriteDrain(stream, state) {
- if (state.length === 0 && state.needDrain) {
- state.needDrain = false;
- stream.emit('drain');
- }
-} // if there's something in the buffer waiting, then process it
-
-
-function clearBuffer(stream, state) {
- state.bufferProcessing = true;
- var entry = state.bufferedRequest;
-
- if (stream._writev && entry && entry.next) {
- // Fast case, write everything using _writev()
- var l = state.bufferedRequestCount;
- var buffer = new Array(l);
- var holder = state.corkedRequestsFree;
- holder.entry = entry;
- var count = 0;
- var allBuffers = true;
-
- while (entry) {
- buffer[count] = entry;
- if (!entry.isBuf) allBuffers = false;
- entry = entry.next;
- count += 1;
- }
-
- buffer.allBuffers = allBuffers;
- doWrite(stream, state, true, state.length, buffer, '', holder.finish); // doWrite is almost always async, defer these to save a bit of time
- // as the hot path ends with doWrite
-
- state.pendingcb++;
- state.lastBufferedRequest = null;
-
- if (holder.next) {
- state.corkedRequestsFree = holder.next;
- holder.next = null;
- } else {
- state.corkedRequestsFree = new CorkedRequest(state);
- }
-
- state.bufferedRequestCount = 0;
- } else {
- // Slow case, write chunks one-by-one
- while (entry) {
- var chunk = entry.chunk;
- var encoding = entry.encoding;
- var cb = entry.callback;
- var len = state.objectMode ? 1 : chunk.length;
- doWrite(stream, state, false, len, chunk, encoding, cb);
- entry = entry.next;
- state.bufferedRequestCount--; // if we didn't call the onwrite immediately, then
- // it means that we need to wait until it does.
- // also, that means that the chunk and cb are currently
- // being processed, so move the buffer counter past them.
-
- if (state.writing) {
- break;
- }
- }
-
- if (entry === null) state.lastBufferedRequest = null;
- }
-
- state.bufferedRequest = entry;
- state.bufferProcessing = false;
-}
-
-Writable.prototype._write = function (chunk, encoding, cb) {
- cb(new ERR_METHOD_NOT_IMPLEMENTED('_write()'));
-};
-
-Writable.prototype._writev = null;
-
-Writable.prototype.end = function (chunk, encoding, cb) {
- var state = this._writableState;
-
- if (typeof chunk === 'function') {
- cb = chunk;
- chunk = null;
- encoding = null;
- } else if (typeof encoding === 'function') {
- cb = encoding;
- encoding = null;
- }
-
- if (chunk !== null && chunk !== undefined) this.write(chunk, encoding); // .end() fully uncorks
-
- if (state.corked) {
- state.corked = 1;
- this.uncork();
- } // ignore unnecessary end() calls.
-
-
- if (!state.ending) endWritable(this, state, cb);
- return this;
-};
-
-Object.defineProperty(Writable.prototype, 'writableLength', {
- // making it explicit this property is not enumerable
- // because otherwise some prototype manipulation in
- // userland will fail
- enumerable: false,
- get: function get() {
- return this._writableState.length;
- }
-});
-
-function needFinish(state) {
- return state.ending && state.length === 0 && state.bufferedRequest === null && !state.finished && !state.writing;
-}
-
-function callFinal(stream, state) {
- stream._final(function (err) {
- state.pendingcb--;
-
- if (err) {
- errorOrDestroy(stream, err);
- }
-
- state.prefinished = true;
- stream.emit('prefinish');
- finishMaybe(stream, state);
- });
-}
-
-function prefinish(stream, state) {
- if (!state.prefinished && !state.finalCalled) {
- if (typeof stream._final === 'function' && !state.destroyed) {
- state.pendingcb++;
- state.finalCalled = true;
- process.nextTick(callFinal, stream, state);
- } else {
- state.prefinished = true;
- stream.emit('prefinish');
- }
- }
-}
-
-function finishMaybe(stream, state) {
- var need = needFinish(state);
-
- if (need) {
- prefinish(stream, state);
-
- if (state.pendingcb === 0) {
- state.finished = true;
- stream.emit('finish');
-
- if (state.autoDestroy) {
- // In case of duplex streams we need a way to detect
- // if the readable side is ready for autoDestroy as well
- var rState = stream._readableState;
-
- if (!rState || rState.autoDestroy && rState.endEmitted) {
- stream.destroy();
- }
- }
- }
- }
-
- return need;
-}
-
-function endWritable(stream, state, cb) {
- state.ending = true;
- finishMaybe(stream, state);
-
- if (cb) {
- if (state.finished) process.nextTick(cb);else stream.once('finish', cb);
- }
-
- state.ended = true;
- stream.writable = false;
-}
-
-function onCorkedFinish(corkReq, state, err) {
- var entry = corkReq.entry;
- corkReq.entry = null;
-
- while (entry) {
- var cb = entry.callback;
- state.pendingcb--;
- cb(err);
- entry = entry.next;
- } // reuse the free corkReq.
-
-
- state.corkedRequestsFree.next = corkReq;
-}
-
-Object.defineProperty(Writable.prototype, 'destroyed', {
- // making it explicit this property is not enumerable
- // because otherwise some prototype manipulation in
- // userland will fail
- enumerable: false,
- get: function get() {
- if (this._writableState === undefined) {
- return false;
- }
-
- return this._writableState.destroyed;
- },
- set: function set(value) {
- // we ignore the value if the stream
- // has not been initialized yet
- if (!this._writableState) {
- return;
- } // backward compatibility, the user is explicitly
- // managing destroyed
-
-
- this._writableState.destroyed = value;
- }
-});
-Writable.prototype.destroy = destroyImpl.destroy;
-Writable.prototype._undestroy = destroyImpl.undestroy;
-
-Writable.prototype._destroy = function (err, cb) {
- cb(err);
-};
\ No newline at end of file
+// Keep this file as an alias for the full stream module.
+module.exports = require('./stream').Writable
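
As the removed comments note, a Writable only needs to supply an asynchronous _write(chunk, encoding, callback); buffering, the highWaterMark threshold, and the 'drain' event are handled by the base class. A minimal sketch of that behaviour using the public constructor options (the 4-byte highWaterMark is illustrative only):

    'use strict'
    const { Writable } = require('stream')

    const sink = new Writable({
      highWaterMark: 4, // write() returns false once 4+ bytes are buffered
      write (chunk, encoding, callback) {
        process.nextTick(callback) // pretend the underlying I/O is asynchronous
      }
    })

    console.log(sink.write('some data')) // false: buffered length >= highWaterMark
    sink.once('drain', () => {
      console.log('buffer flushed, safe to write again')
      sink.end()
    })
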
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/add-abort-signal.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/add-abort-signal.js
similarity index 92%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/add-abort-signal.js
rename to deps/npm/node_modules/readable-stream/lib/internal/streams/add-abort-signal.js
index 8d5a840f707938..c6ba8b9c298f18 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/add-abort-signal.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/add-abort-signal.js
@@ -1,38 +1,31 @@
'use strict'
const { AbortError, codes } = require('../../ours/errors')
-
const eos = require('./end-of-stream')
+const { ERR_INVALID_ARG_TYPE } = codes
-const { ERR_INVALID_ARG_TYPE } = codes // This method is inlined here for readable-stream
+// This method is inlined here for readable-stream
// It also does not allow for signal to not exist on the stream
// https://github.com/nodejs/node/pull/36061#discussion_r533718029
-
const validateAbortSignal = (signal, name) => {
if (typeof signal !== 'object' || !('aborted' in signal)) {
throw new ERR_INVALID_ARG_TYPE(name, 'AbortSignal', signal)
}
}
-
function isNodeStream(obj) {
return !!(obj && typeof obj.pipe === 'function')
}
-
module.exports.addAbortSignal = function addAbortSignal(signal, stream) {
validateAbortSignal(signal, 'signal')
-
if (!isNodeStream(stream)) {
throw new ERR_INVALID_ARG_TYPE('stream', 'stream.Stream', stream)
}
-
return module.exports.addAbortSignalNoValidate(signal, stream)
}
-
module.exports.addAbortSignalNoValidate = function (signal, stream) {
if (typeof signal !== 'object' || !('aborted' in signal)) {
return stream
}
-
const onAbort = () => {
stream.destroy(
new AbortError(undefined, {
@@ -40,13 +33,11 @@ module.exports.addAbortSignalNoValidate = function (signal, stream) {
})
)
}
-
if (signal.aborted) {
onAbort()
} else {
signal.addEventListener('abort', onAbort)
eos(stream, () => signal.removeEventListener('abort', onAbort))
}
-
return stream
}
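
The helper above ties an AbortSignal to a stream: once the signal aborts, the stream is destroyed with an AbortError. A minimal usage sketch, assuming the top-level addAbortSignal export that Node's stream module (and readable-stream) provide; the stream contents are illustrative only:

    'use strict'
    const { addAbortSignal, Readable } = require('stream')

    const controller = new AbortController()
    const source = addAbortSignal(controller.signal, Readable.from(['a', 'b', 'c']))

    source.on('error', (err) => console.log(err.name)) // AbortError
    controller.abort()
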
diff --git a/deps/npm/node_modules/readable-stream/lib/internal/streams/buffer_list.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/buffer_list.js
index cdea425f19dd96..b55e35cf9a0f88 100644
--- a/deps/npm/node_modules/readable-stream/lib/internal/streams/buffer_list.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/buffer_list.js
@@ -1,210 +1,157 @@
-'use strict';
-
-function ownKeys(object, enumerableOnly) { var keys = Object.keys(object); if (Object.getOwnPropertySymbols) { var symbols = Object.getOwnPropertySymbols(object); if (enumerableOnly) symbols = symbols.filter(function (sym) { return Object.getOwnPropertyDescriptor(object, sym).enumerable; }); keys.push.apply(keys, symbols); } return keys; }
-
-function _objectSpread(target) { for (var i = 1; i < arguments.length; i++) { var source = arguments[i] != null ? arguments[i] : {}; if (i % 2) { ownKeys(Object(source), true).forEach(function (key) { _defineProperty(target, key, source[key]); }); } else if (Object.getOwnPropertyDescriptors) { Object.defineProperties(target, Object.getOwnPropertyDescriptors(source)); } else { ownKeys(Object(source)).forEach(function (key) { Object.defineProperty(target, key, Object.getOwnPropertyDescriptor(source, key)); }); } } return target; }
-
-function _defineProperty(obj, key, value) { if (key in obj) { Object.defineProperty(obj, key, { value: value, enumerable: true, configurable: true, writable: true }); } else { obj[key] = value; } return obj; }
-
-function _classCallCheck(instance, Constructor) { if (!(instance instanceof Constructor)) { throw new TypeError("Cannot call a class as a function"); } }
-
-function _defineProperties(target, props) { for (var i = 0; i < props.length; i++) { var descriptor = props[i]; descriptor.enumerable = descriptor.enumerable || false; descriptor.configurable = true; if ("value" in descriptor) descriptor.writable = true; Object.defineProperty(target, descriptor.key, descriptor); } }
-
-function _createClass(Constructor, protoProps, staticProps) { if (protoProps) _defineProperties(Constructor.prototype, protoProps); if (staticProps) _defineProperties(Constructor, staticProps); return Constructor; }
-
-var _require = require('buffer'),
- Buffer = _require.Buffer;
-
-var _require2 = require('util'),
- inspect = _require2.inspect;
-
-var custom = inspect && inspect.custom || 'inspect';
-
-function copyBuffer(src, target, offset) {
- Buffer.prototype.copy.call(src, target, offset);
-}
-
-module.exports =
-/*#__PURE__*/
-function () {
- function BufferList() {
- _classCallCheck(this, BufferList);
-
- this.head = null;
- this.tail = null;
- this.length = 0;
+'use strict'
+
+const { StringPrototypeSlice, SymbolIterator, TypedArrayPrototypeSet, Uint8Array } = require('../../ours/primordials')
+const { Buffer } = require('buffer')
+const { inspect } = require('../../ours/util')
+module.exports = class BufferList {
+ constructor() {
+ this.head = null
+ this.tail = null
+ this.length = 0
}
-
- _createClass(BufferList, [{
- key: "push",
- value: function push(v) {
- var entry = {
- data: v,
- next: null
- };
- if (this.length > 0) this.tail.next = entry;else this.head = entry;
- this.tail = entry;
- ++this.length;
+ push(v) {
+ const entry = {
+ data: v,
+ next: null
}
- }, {
- key: "unshift",
- value: function unshift(v) {
- var entry = {
- data: v,
- next: this.head
- };
- if (this.length === 0) this.tail = entry;
- this.head = entry;
- ++this.length;
- }
- }, {
- key: "shift",
- value: function shift() {
- if (this.length === 0) return;
- var ret = this.head.data;
- if (this.length === 1) this.head = this.tail = null;else this.head = this.head.next;
- --this.length;
- return ret;
+ if (this.length > 0) this.tail.next = entry
+ else this.head = entry
+ this.tail = entry
+ ++this.length
+ }
+ unshift(v) {
+ const entry = {
+ data: v,
+ next: this.head
}
- }, {
- key: "clear",
- value: function clear() {
- this.head = this.tail = null;
- this.length = 0;
+ if (this.length === 0) this.tail = entry
+ this.head = entry
+ ++this.length
+ }
+ shift() {
+ if (this.length === 0) return
+ const ret = this.head.data
+ if (this.length === 1) this.head = this.tail = null
+ else this.head = this.head.next
+ --this.length
+ return ret
+ }
+ clear() {
+ this.head = this.tail = null
+ this.length = 0
+ }
+ join(s) {
+ if (this.length === 0) return ''
+ let p = this.head
+ let ret = '' + p.data
+ while ((p = p.next) !== null) ret += s + p.data
+ return ret
+ }
+ concat(n) {
+ if (this.length === 0) return Buffer.alloc(0)
+ const ret = Buffer.allocUnsafe(n >>> 0)
+ let p = this.head
+ let i = 0
+ while (p) {
+ TypedArrayPrototypeSet(ret, p.data, i)
+ i += p.data.length
+ p = p.next
}
- }, {
- key: "join",
- value: function join(s) {
- if (this.length === 0) return '';
- var p = this.head;
- var ret = '' + p.data;
-
- while (p = p.next) {
- ret += s + p.data;
- }
+ return ret
+ }
- return ret;
+ // Consumes a specified amount of bytes or characters from the buffered data.
+ consume(n, hasStrings) {
+ const data = this.head.data
+ if (n < data.length) {
+ // `slice` is the same for buffers and strings.
+ const slice = data.slice(0, n)
+ this.head.data = data.slice(n)
+ return slice
}
- }, {
- key: "concat",
- value: function concat(n) {
- if (this.length === 0) return Buffer.alloc(0);
- var ret = Buffer.allocUnsafe(n >>> 0);
- var p = this.head;
- var i = 0;
-
- while (p) {
- copyBuffer(p.data, ret, i);
- i += p.data.length;
- p = p.next;
- }
-
- return ret;
- } // Consumes a specified amount of bytes or characters from the buffered data.
-
- }, {
- key: "consume",
- value: function consume(n, hasStrings) {
- var ret;
-
- if (n < this.head.data.length) {
- // `slice` is the same for buffers and strings.
- ret = this.head.data.slice(0, n);
- this.head.data = this.head.data.slice(n);
- } else if (n === this.head.data.length) {
- // First chunk is a perfect match.
- ret = this.shift();
- } else {
- // Result spans more than one buffer.
- ret = hasStrings ? this._getString(n) : this._getBuffer(n);
- }
-
- return ret;
+ if (n === data.length) {
+ // First chunk is a perfect match.
+ return this.shift()
}
- }, {
- key: "first",
- value: function first() {
- return this.head.data;
- } // Consumes a specified amount of characters from the buffered data.
-
- }, {
- key: "_getString",
- value: function _getString(n) {
- var p = this.head;
- var c = 1;
- var ret = p.data;
- n -= ret.length;
-
- while (p = p.next) {
- var str = p.data;
- var nb = n > str.length ? str.length : n;
- if (nb === str.length) ret += str;else ret += str.slice(0, n);
- n -= nb;
-
- if (n === 0) {
- if (nb === str.length) {
- ++c;
- if (p.next) this.head = p.next;else this.head = this.tail = null;
- } else {
- this.head = p;
- p.data = str.slice(nb);
- }
+ // Result spans more than one buffer.
+ return hasStrings ? this._getString(n) : this._getBuffer(n)
+ }
+ first() {
+ return this.head.data
+ }
+ *[SymbolIterator]() {
+ for (let p = this.head; p; p = p.next) {
+ yield p.data
+ }
+ }
- break;
+ // Consumes a specified amount of characters from the buffered data.
+ _getString(n) {
+ let ret = ''
+ let p = this.head
+ let c = 0
+ do {
+ const str = p.data
+ if (n > str.length) {
+ ret += str
+ n -= str.length
+ } else {
+ if (n === str.length) {
+ ret += str
+ ++c
+ if (p.next) this.head = p.next
+ else this.head = this.tail = null
+ } else {
+ ret += StringPrototypeSlice(str, 0, n)
+ this.head = p
+ p.data = StringPrototypeSlice(str, n)
}
-
- ++c;
+ break
}
+ ++c
+ } while ((p = p.next) !== null)
+ this.length -= c
+ return ret
+ }
- this.length -= c;
- return ret;
- } // Consumes a specified amount of bytes from the buffered data.
-
- }, {
- key: "_getBuffer",
- value: function _getBuffer(n) {
- var ret = Buffer.allocUnsafe(n);
- var p = this.head;
- var c = 1;
- p.data.copy(ret);
- n -= p.data.length;
-
- while (p = p.next) {
- var buf = p.data;
- var nb = n > buf.length ? buf.length : n;
- buf.copy(ret, ret.length - n, 0, nb);
- n -= nb;
-
- if (n === 0) {
- if (nb === buf.length) {
- ++c;
- if (p.next) this.head = p.next;else this.head = this.tail = null;
- } else {
- this.head = p;
- p.data = buf.slice(nb);
- }
-
- break;
+ // Consumes a specified amount of bytes from the buffered data.
+ _getBuffer(n) {
+ const ret = Buffer.allocUnsafe(n)
+ const retLen = n
+ let p = this.head
+ let c = 0
+ do {
+ const buf = p.data
+ if (n > buf.length) {
+ TypedArrayPrototypeSet(ret, buf, retLen - n)
+ n -= buf.length
+ } else {
+ if (n === buf.length) {
+ TypedArrayPrototypeSet(ret, buf, retLen - n)
+ ++c
+ if (p.next) this.head = p.next
+ else this.head = this.tail = null
+ } else {
+ TypedArrayPrototypeSet(ret, new Uint8Array(buf.buffer, buf.byteOffset, n), retLen - n)
+ this.head = p
+ p.data = buf.slice(n)
}
-
- ++c;
+ break
}
+ ++c
+ } while ((p = p.next) !== null)
+ this.length -= c
+ return ret
+ }
- this.length -= c;
- return ret;
- } // Make sure the linked list only shows the minimal necessary information.
-
- }, {
- key: custom,
- value: function value(_, options) {
- return inspect(this, _objectSpread({}, options, {
- // Only inspect one level.
- depth: 0,
- // It should not recurse.
- customInspect: false
- }));
- }
- }]);
-
- return BufferList;
-}();
\ No newline at end of file
+ // Make sure the linked list only shows the minimal necessary information.
+ [Symbol.for('nodejs.util.inspect.custom')](_, options) {
+ return inspect(this, {
+ ...options,
+ // Only inspect one level.
+ depth: 0,
+ // It should not recurse.
+ customInspect: false
+ })
+ }
+}
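
BufferList is the internal singly linked list backing a stream's buffered data: push()/unshift() append or prepend entries, concat(n) copies n bytes into a single Buffer, and consume(n, hasStrings) takes n bytes (or characters) off the head. A rough sketch of those semantics; the deep require path is internal rather than public API and may not resolve through the package's exports map, so it is for illustration only:

    'use strict'
    const BufferList = require('readable-stream/lib/internal/streams/buffer_list')

    const list = new BufferList()
    list.push(Buffer.from('abc'))
    list.push(Buffer.from('def'))

    console.log(list.length)                       // 2 entries in the list
    console.log(list.concat(6).toString())         // 'abcdef' (6 = total bytes)
    console.log(list.consume(2, false).toString()) // 'ab'; 'c' stays at the head
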
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/compose.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/compose.js
similarity index 96%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/compose.js
rename to deps/npm/node_modules/readable-stream/lib/internal/streams/compose.js
index 0a2e810a3e886a..4a00aead883c2f 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/compose.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/compose.js
@@ -1,63 +1,48 @@
'use strict'
const { pipeline } = require('./pipeline')
-
const Duplex = require('./duplex')
-
const { destroyer } = require('./destroy')
-
const { isNodeStream, isReadable, isWritable } = require('./utils')
-
const {
AbortError,
codes: { ERR_INVALID_ARG_VALUE, ERR_MISSING_ARGS }
} = require('../../ours/errors')
-
module.exports = function compose(...streams) {
if (streams.length === 0) {
throw new ERR_MISSING_ARGS('streams')
}
-
if (streams.length === 1) {
return Duplex.from(streams[0])
}
-
const orgStreams = [...streams]
-
if (typeof streams[0] === 'function') {
streams[0] = Duplex.from(streams[0])
}
-
if (typeof streams[streams.length - 1] === 'function') {
const idx = streams.length - 1
streams[idx] = Duplex.from(streams[idx])
}
-
for (let n = 0; n < streams.length; ++n) {
if (!isNodeStream(streams[n])) {
// TODO(ronag): Add checks for non streams.
continue
}
-
if (n < streams.length - 1 && !isReadable(streams[n])) {
throw new ERR_INVALID_ARG_VALUE(`streams[${n}]`, orgStreams[n], 'must be readable')
}
-
if (n > 0 && !isWritable(streams[n])) {
throw new ERR_INVALID_ARG_VALUE(`streams[${n}]`, orgStreams[n], 'must be writable')
}
}
-
let ondrain
let onfinish
let onreadable
let onclose
let d
-
function onfinished(err) {
const cb = onclose
onclose = null
-
if (cb) {
cb(err)
} else if (err) {
@@ -66,14 +51,14 @@ module.exports = function compose(...streams) {
d.destroy()
}
}
-
const head = streams[0]
const tail = pipeline(streams, onfinished)
const writable = !!isWritable(head)
- const readable = !!isReadable(tail) // TODO(ronag): Avoid double buffering.
+ const readable = !!isReadable(tail)
+
+ // TODO(ronag): Avoid double buffering.
// Implement Writable/Readable/Duplex traits.
// See, https://github.com/nodejs/node/pull/33515.
-
d = new Duplex({
// TODO (ronag): highWaterMark?
writableObjectMode: !!(head !== null && head !== undefined && head.writableObjectMode),
@@ -81,7 +66,6 @@ module.exports = function compose(...streams) {
writable,
readable
})
-
if (writable) {
d._write = function (chunk, encoding, callback) {
if (head.write(chunk, encoding)) {
@@ -90,12 +74,10 @@ module.exports = function compose(...streams) {
ondrain = callback
}
}
-
d._final = function (callback) {
head.end()
onfinish = callback
}
-
head.on('drain', function () {
if (ondrain) {
const cb = ondrain
@@ -111,7 +93,6 @@ module.exports = function compose(...streams) {
}
})
}
-
if (readable) {
tail.on('readable', function () {
if (onreadable) {
@@ -123,32 +104,26 @@ module.exports = function compose(...streams) {
tail.on('end', function () {
d.push(null)
})
-
d._read = function () {
while (true) {
const buf = tail.read()
-
if (buf === null) {
onreadable = d._read
return
}
-
if (!d.push(buf)) {
return
}
}
}
}
-
d._destroy = function (err, callback) {
if (!err && onclose !== null) {
err = new AbortError()
}
-
onreadable = null
ondrain = null
onfinish = null
-
if (onclose === null) {
callback(err)
} else {
@@ -156,6 +131,5 @@ module.exports = function compose(...streams) {
destroyer(tail, err)
}
}
-
return d
}
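A hedged usage sketch of what this file backs: the experimental stream.compose() (available in Node since roughly v16.9), which fuses several streams into a single Duplex. The two transforms here are illustrative.

'use strict'
const { compose, Transform } = require('node:stream')

const upper = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, String(chunk).toUpperCase())
  }
})
const exclaim = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, String(chunk) + '!\n')
  }
})

// Writes go into `upper`, reads come out of `exclaim`.
const shout = compose(upper, exclaim)
shout.pipe(process.stdout)
shout.end('hello streams') // prints: HELLO STREAMS!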
diff --git a/deps/npm/node_modules/readable-stream/lib/internal/streams/destroy.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/destroy.js
index 3268a16f3b6f23..768f2d79d3a893 100644
--- a/deps/npm/node_modules/readable-stream/lib/internal/streams/destroy.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/destroy.js
@@ -1,105 +1,287 @@
-'use strict'; // undocumented cb() API, needed for core, not for public API
+'use strict'
-function destroy(err, cb) {
- var _this = this;
+/* replacement start */
- var readableDestroyed = this._readableState && this._readableState.destroyed;
- var writableDestroyed = this._writableState && this._writableState.destroyed;
+const process = require('process/')
- if (readableDestroyed || writableDestroyed) {
- if (cb) {
- cb(err);
- } else if (err) {
- if (!this._writableState) {
- process.nextTick(emitErrorNT, this, err);
- } else if (!this._writableState.errorEmitted) {
- this._writableState.errorEmitted = true;
- process.nextTick(emitErrorNT, this, err);
- }
- }
-
- return this;
- } // we set destroyed to true before firing error callbacks in order
- // to make it re-entrance safe in case destroy() is called within callbacks
+/* replacement end */
+const {
+ aggregateTwoErrors,
+ codes: { ERR_MULTIPLE_CALLBACK },
+ AbortError
+} = require('../../ours/errors')
+const { Symbol } = require('../../ours/primordials')
+const { kDestroyed, isDestroyed, isFinished, isServerRequest } = require('./utils')
+const kDestroy = Symbol('kDestroy')
+const kConstruct = Symbol('kConstruct')
+function checkError(err, w, r) {
+ if (err) {
+ // Avoid V8 leak, https://github.com/nodejs/node/pull/34103#issuecomment-652002364
+ err.stack // eslint-disable-line no-unused-expressions
- if (this._readableState) {
- this._readableState.destroyed = true;
- } // if this is a duplex stream mark the writable part as destroyed as well
+ if (w && !w.errored) {
+ w.errored = err
+ }
+ if (r && !r.errored) {
+ r.errored = err
+ }
+ }
+}
+// Backwards compat. cb() is undocumented and unused in core but
+// unfortunately might be used by modules.
+function destroy(err, cb) {
+ const r = this._readableState
+ const w = this._writableState
+ // With duplex streams we use the writable side for state.
+ const s = w || r
+ if ((w && w.destroyed) || (r && r.destroyed)) {
+ if (typeof cb === 'function') {
+ cb()
+ }
+ return this
+ }
- if (this._writableState) {
- this._writableState.destroyed = true;
+ // We set destroyed to true before firing error callbacks in order
+ // to make it re-entrance safe in case destroy() is called within callbacks
+ checkError(err, w, r)
+ if (w) {
+ w.destroyed = true
+ }
+ if (r) {
+ r.destroyed = true
}
- this._destroy(err || null, function (err) {
- if (!cb && err) {
- if (!_this._writableState) {
- process.nextTick(emitErrorAndCloseNT, _this, err);
- } else if (!_this._writableState.errorEmitted) {
- _this._writableState.errorEmitted = true;
- process.nextTick(emitErrorAndCloseNT, _this, err);
- } else {
- process.nextTick(emitCloseNT, _this);
- }
- } else if (cb) {
- process.nextTick(emitCloseNT, _this);
- cb(err);
+ // If still constructing then defer calling _destroy.
+ if (!s.constructed) {
+ this.once(kDestroy, function (er) {
+ _destroy(this, aggregateTwoErrors(er, err), cb)
+ })
+ } else {
+ _destroy(this, err, cb)
+ }
+ return this
+}
+function _destroy(self, err, cb) {
+ let called = false
+ function onDestroy(err) {
+ if (called) {
+ return
+ }
+ called = true
+ const r = self._readableState
+ const w = self._writableState
+ checkError(err, w, r)
+ if (w) {
+ w.closed = true
+ }
+ if (r) {
+ r.closed = true
+ }
+ if (typeof cb === 'function') {
+ cb(err)
+ }
+ if (err) {
+ process.nextTick(emitErrorCloseNT, self, err)
} else {
- process.nextTick(emitCloseNT, _this);
+ process.nextTick(emitCloseNT, self)
}
- });
-
- return this;
+ }
+ try {
+ self._destroy(err || null, onDestroy)
+ } catch (err) {
+ onDestroy(err)
+ }
}
-
-function emitErrorAndCloseNT(self, err) {
- emitErrorNT(self, err);
- emitCloseNT(self);
+function emitErrorCloseNT(self, err) {
+ emitErrorNT(self, err)
+ emitCloseNT(self)
}
-
function emitCloseNT(self) {
- if (self._writableState && !self._writableState.emitClose) return;
- if (self._readableState && !self._readableState.emitClose) return;
- self.emit('close');
-}
-
-function undestroy() {
- if (this._readableState) {
- this._readableState.destroyed = false;
- this._readableState.reading = false;
- this._readableState.ended = false;
- this._readableState.endEmitted = false;
+ const r = self._readableState
+ const w = self._writableState
+ if (w) {
+ w.closeEmitted = true
}
-
- if (this._writableState) {
- this._writableState.destroyed = false;
- this._writableState.ended = false;
- this._writableState.ending = false;
- this._writableState.finalCalled = false;
- this._writableState.prefinished = false;
- this._writableState.finished = false;
- this._writableState.errorEmitted = false;
+ if (r) {
+ r.closeEmitted = true
+ }
+ if ((w && w.emitClose) || (r && r.emitClose)) {
+ self.emit('close')
}
}
-
function emitErrorNT(self, err) {
- self.emit('error', err);
+ const r = self._readableState
+ const w = self._writableState
+ if ((w && w.errorEmitted) || (r && r.errorEmitted)) {
+ return
+ }
+ if (w) {
+ w.errorEmitted = true
+ }
+ if (r) {
+ r.errorEmitted = true
+ }
+ self.emit('error', err)
}
-
-function errorOrDestroy(stream, err) {
+function undestroy() {
+ const r = this._readableState
+ const w = this._writableState
+ if (r) {
+ r.constructed = true
+ r.closed = false
+ r.closeEmitted = false
+ r.destroyed = false
+ r.errored = null
+ r.errorEmitted = false
+ r.reading = false
+ r.ended = r.readable === false
+ r.endEmitted = r.readable === false
+ }
+ if (w) {
+ w.constructed = true
+ w.destroyed = false
+ w.closed = false
+ w.closeEmitted = false
+ w.errored = null
+ w.errorEmitted = false
+ w.finalCalled = false
+ w.prefinished = false
+ w.ended = w.writable === false
+ w.ending = w.writable === false
+ w.finished = w.writable === false
+ }
+}
+function errorOrDestroy(stream, err, sync) {
// We have tests that rely on errors being emitted
// in the same tick, so changing this is semver major.
// For now when you opt-in to autoDestroy we allow
// the error to be emitted nextTick. In a future
// semver major update we should change the default to this.
- var rState = stream._readableState;
- var wState = stream._writableState;
- if (rState && rState.autoDestroy || wState && wState.autoDestroy) stream.destroy(err);else stream.emit('error', err);
+
+ const r = stream._readableState
+ const w = stream._writableState
+ if ((w && w.destroyed) || (r && r.destroyed)) {
+ return this
+ }
+ if ((r && r.autoDestroy) || (w && w.autoDestroy)) stream.destroy(err)
+ else if (err) {
+ // Avoid V8 leak, https://github.com/nodejs/node/pull/34103#issuecomment-652002364
+ err.stack // eslint-disable-line no-unused-expressions
+
+ if (w && !w.errored) {
+ w.errored = err
+ }
+ if (r && !r.errored) {
+ r.errored = err
+ }
+ if (sync) {
+ process.nextTick(emitErrorNT, stream, err)
+ } else {
+ emitErrorNT(stream, err)
+ }
+ }
+}
+function construct(stream, cb) {
+ if (typeof stream._construct !== 'function') {
+ return
+ }
+ const r = stream._readableState
+ const w = stream._writableState
+ if (r) {
+ r.constructed = false
+ }
+ if (w) {
+ w.constructed = false
+ }
+ stream.once(kConstruct, cb)
+ if (stream.listenerCount(kConstruct) > 1) {
+ // Duplex
+ return
+ }
+ process.nextTick(constructNT, stream)
+}
+function constructNT(stream) {
+ let called = false
+ function onConstruct(err) {
+ if (called) {
+ errorOrDestroy(stream, err !== null && err !== undefined ? err : new ERR_MULTIPLE_CALLBACK())
+ return
+ }
+ called = true
+ const r = stream._readableState
+ const w = stream._writableState
+ const s = w || r
+ if (r) {
+ r.constructed = true
+ }
+ if (w) {
+ w.constructed = true
+ }
+ if (s.destroyed) {
+ stream.emit(kDestroy, err)
+ } else if (err) {
+ errorOrDestroy(stream, err, true)
+ } else {
+ process.nextTick(emitConstructNT, stream)
+ }
+ }
+ try {
+ stream._construct(onConstruct)
+ } catch (err) {
+ onConstruct(err)
+ }
+}
+function emitConstructNT(stream) {
+ stream.emit(kConstruct)
+}
+function isRequest(stream) {
+ return stream && stream.setHeader && typeof stream.abort === 'function'
}
+function emitCloseLegacy(stream) {
+ stream.emit('close')
+}
+function emitErrorCloseLegacy(stream, err) {
+ stream.emit('error', err)
+ process.nextTick(emitCloseLegacy, stream)
+}
+
+// Normalize destroy for legacy.
+function destroyer(stream, err) {
+ if (!stream || isDestroyed(stream)) {
+ return
+ }
+ if (!err && !isFinished(stream)) {
+ err = new AbortError()
+ }
+ // TODO: Remove isRequest branches.
+ if (isServerRequest(stream)) {
+ stream.socket = null
+ stream.destroy(err)
+ } else if (isRequest(stream)) {
+ stream.abort()
+ } else if (isRequest(stream.req)) {
+ stream.req.abort()
+ } else if (typeof stream.destroy === 'function') {
+ stream.destroy(err)
+ } else if (typeof stream.close === 'function') {
+ // TODO: Don't lose err?
+ stream.close()
+ } else if (err) {
+ process.nextTick(emitErrorCloseLegacy, stream, err)
+ } else {
+ process.nextTick(emitCloseLegacy, stream)
+ }
+ if (!stream.destroyed) {
+ stream[kDestroyed] = true
+ }
+}
module.exports = {
- destroy: destroy,
- undestroy: undestroy,
- errorOrDestroy: errorOrDestroy
-};
\ No newline at end of file
+ construct,
+ destroyer,
+ destroy,
+ undestroy,
+ errorOrDestroy
+}
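To illustrate the construct()/kConstruct coordination added above: the public _construct() hook defers _write() and _destroy() until its callback fires. A rough sketch loosely following the Node docs; FileSink and the output path are illustrative.

'use strict'
const { Writable } = require('node:stream')
const fs = require('node:fs')

class FileSink extends Writable {
  constructor(path) {
    super()
    this.path = path
    this.fd = null
  }

  // Runs before any _write()/_destroy(); an error here destroys the stream.
  _construct(callback) {
    fs.open(this.path, 'w', (err, fd) => {
      this.fd = fd
      callback(err)
    })
  }

  _write(chunk, encoding, callback) {
    fs.write(this.fd, chunk, callback)
  }

  _destroy(err, callback) {
    if (this.fd !== null) {
      fs.close(this.fd, (closeErr) => callback(closeErr || err))
    } else {
      callback(err)
    }
  }
}

new FileSink('out.txt').end('constructed, written, destroyed\n')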
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/duplex.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/duplex.js
similarity index 96%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/duplex.js
rename to deps/npm/node_modules/readable-stream/lib/internal/streams/duplex.js
index 24e5be6fffff44..dd08396738baad 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/duplex.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/duplex.js
@@ -18,10 +18,12 @@
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
// a duplex stream is just a stream that is both readable and writable.
// Since JS doesn't have multiple prototype inheritance, this class
// prototypically inherits from Readable, and then parasitically from
// Writable.
+
'use strict'
const {
@@ -30,38 +32,30 @@ const {
ObjectKeys,
ObjectSetPrototypeOf
} = require('../../ours/primordials')
-
module.exports = Duplex
-
const Readable = require('./readable')
-
const Writable = require('./writable')
-
ObjectSetPrototypeOf(Duplex.prototype, Readable.prototype)
ObjectSetPrototypeOf(Duplex, Readable)
{
- const keys = ObjectKeys(Writable.prototype) // Allow the keys array to be GC'ed.
-
+ const keys = ObjectKeys(Writable.prototype)
+ // Allow the keys array to be GC'ed.
for (let i = 0; i < keys.length; i++) {
const method = keys[i]
if (!Duplex.prototype[method]) Duplex.prototype[method] = Writable.prototype[method]
}
}
-
function Duplex(options) {
if (!(this instanceof Duplex)) return new Duplex(options)
Readable.call(this, options)
Writable.call(this, options)
-
if (options) {
this.allowHalfOpen = options.allowHalfOpen !== false
-
if (options.readable === false) {
this._readableState.readable = false
this._readableState.ended = true
this._readableState.endEmitted = true
}
-
if (options.writable === false) {
this._writableState.writable = false
this._writableState.ending = true
@@ -72,7 +66,6 @@ function Duplex(options) {
this.allowHalfOpen = true
}
}
-
ObjectDefineProperties(Duplex.prototype, {
writable: {
__proto__: null,
@@ -112,15 +105,12 @@ ObjectDefineProperties(Duplex.prototype, {
},
destroyed: {
__proto__: null,
-
get() {
if (this._readableState === undefined || this._writableState === undefined) {
return false
}
-
return this._readableState.destroyed && this._writableState.destroyed
},
-
set(value) {
// Backward compatibility, the user is explicitly
// managing destroyed.
@@ -131,27 +121,23 @@ ObjectDefineProperties(Duplex.prototype, {
}
}
})
-let webStreamsAdapters // Lazy to avoid circular references
+let webStreamsAdapters
+// Lazy to avoid circular references
function lazyWebStreams() {
if (webStreamsAdapters === undefined) webStreamsAdapters = {}
return webStreamsAdapters
}
-
Duplex.fromWeb = function (pair, options) {
return lazyWebStreams().newStreamDuplexFromReadableWritablePair(pair, options)
}
-
Duplex.toWeb = function (duplex) {
return lazyWebStreams().newReadableWritablePairFromDuplex(duplex)
}
-
let duplexify
-
Duplex.from = function (body) {
if (!duplexify) {
duplexify = require('./duplexify')
}
-
return duplexify(body, 'body')
}
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/duplexify.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/duplexify.js
similarity index 96%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/duplexify.js
rename to deps/npm/node_modules/readable-stream/lib/internal/streams/duplexify.js
index 9ff4df96433e31..43300ddc8a45bc 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/duplexify.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/duplexify.js
@@ -1,11 +1,11 @@
/* replacement start */
-const process = require('process')
+
+const process = require('process/')
+
/* replacement end */
;('use strict')
-
const bufferModule = require('buffer')
-
const {
isReadable,
isWritable,
@@ -15,24 +15,16 @@ const {
isWritableNodeStream,
isDuplexNodeStream
} = require('./utils')
-
const eos = require('./end-of-stream')
-
const {
AbortError,
codes: { ERR_INVALID_ARG_TYPE, ERR_INVALID_RETURN_VALUE }
} = require('../../ours/errors')
-
const { destroyer } = require('./destroy')
-
const Duplex = require('./duplex')
-
const Readable = require('./readable')
-
const { createDeferredPromise } = require('../../ours/util')
-
const from = require('./from')
-
const Blob = globalThis.Blob || bufferModule.Blob
const isBlob =
typeof Blob !== 'undefined'
@@ -42,21 +34,21 @@ const isBlob =
: function isBlob(b) {
return false
}
-
const AbortController = globalThis.AbortController || require('abort-controller').AbortController
+const { FunctionPrototypeCall } = require('../../ours/primordials')
-const { FunctionPrototypeCall } = require('../../ours/primordials') // This is needed for pre node 17.
-
+// This is needed for pre node 17.
class Duplexify extends Duplex {
constructor(options) {
- super(options) // https://github.com/nodejs/node/pull/34385
+ super(options)
+
+ // https://github.com/nodejs/node/pull/34385
if ((options === null || options === undefined ? undefined : options.readable) === false) {
this._readableState.readable = false
this._readableState.ended = true
this._readableState.endEmitted = true
}
-
if ((options === null || options === undefined ? undefined : options.writable) === false) {
this._writableState.writable = false
this._writableState.ending = true
@@ -65,33 +57,32 @@ class Duplexify extends Duplex {
}
}
}
-
module.exports = function duplexify(body, name) {
if (isDuplexNodeStream(body)) {
return body
}
-
if (isReadableNodeStream(body)) {
return _duplexify({
readable: body
})
}
-
if (isWritableNodeStream(body)) {
return _duplexify({
writable: body
})
}
-
if (isNodeStream(body)) {
return _duplexify({
writable: false,
readable: false
})
- } // TODO: Webstreams
+ }
+
+ // TODO: Webstreams
// if (isReadableStream(body)) {
// return _duplexify({ readable: Readable.fromWeb(body) });
// }
+
// TODO: Webstreams
// if (isWritableStream(body)) {
// return _duplexify({ writable: Writable.fromWeb(body) });
@@ -99,7 +90,6 @@ module.exports = function duplexify(body, name) {
if (typeof body === 'function') {
const { value, write, final, destroy } = fromAsyncGen(body)
-
if (isIterable(value)) {
return from(Duplexify, value, {
// TODO (ronag): highWaterMark?
@@ -109,9 +99,7 @@ module.exports = function duplexify(body, name) {
destroy
})
}
-
const then = value === null || value === undefined ? undefined : value.then
-
if (typeof then === 'function') {
let d
const promise = FunctionPrototypeCall(
@@ -131,7 +119,6 @@ module.exports = function duplexify(body, name) {
objectMode: true,
readable: false,
write,
-
final(cb) {
final(async () => {
try {
@@ -142,25 +129,23 @@ module.exports = function duplexify(body, name) {
}
})
},
-
destroy
}))
}
-
throw new ERR_INVALID_RETURN_VALUE('Iterable, AsyncIterable or AsyncFunction', name, value)
}
-
if (isBlob(body)) {
return duplexify(body.arrayBuffer())
}
-
if (isIterable(body)) {
return from(Duplexify, body, {
// TODO (ronag): highWaterMark?
objectMode: true,
writable: false
})
- } // TODO: Webstreams.
+ }
+
+ // TODO: Webstreams.
// if (
// isReadableStream(body?.readable) &&
// isWritableStream(body?.writable)
@@ -193,9 +178,7 @@ module.exports = function duplexify(body, name) {
writable
})
}
-
const then = body === null || body === undefined ? undefined : body.then
-
if (typeof then === 'function') {
let d
FunctionPrototypeCall(
@@ -205,7 +188,6 @@ module.exports = function duplexify(body, name) {
if (val != null) {
d.push(val)
}
-
d.push(null)
},
(err) => {
@@ -215,11 +197,9 @@ module.exports = function duplexify(body, name) {
return (d = new Duplexify({
objectMode: true,
writable: false,
-
read() {}
}))
}
-
throw new ERR_INVALID_ARG_TYPE(
name,
[
@@ -236,7 +216,6 @@ module.exports = function duplexify(body, name) {
body
)
}
-
function fromAsyncGen(fn) {
let { promise, resolve } = createDeferredPromise()
const ac = new AbortController()
@@ -263,35 +242,29 @@ function fromAsyncGen(fn) {
)
return {
value,
-
write(chunk, encoding, cb) {
const _resolve = resolve
resolve = null
-
_resolve({
chunk,
done: false,
cb
})
},
-
final(cb) {
const _resolve = resolve
resolve = null
-
_resolve({
done: true,
cb
})
},
-
destroy(err, cb) {
ac.abort()
cb(err)
}
}
}
-
function _duplexify(pair) {
const r = pair.readable && typeof pair.readable.read !== 'function' ? Readable.wrap(pair.readable) : pair.readable
const w = pair.writable
@@ -302,11 +275,9 @@ function _duplexify(pair) {
let onreadable
let onclose
let d
-
function onfinished(err) {
const cb = onclose
onclose = null
-
if (cb) {
cb(err)
} else if (err) {
@@ -314,10 +285,11 @@ function _duplexify(pair) {
} else if (!readable && !writable) {
d.destroy()
}
- } // TODO(ronag): Avoid double buffering.
+ }
+
+ // TODO(ronag): Avoid double buffering.
// Implement Writable/Readable/Duplex traits.
// See, https://github.com/nodejs/node/pull/33515.
-
d = new Duplexify({
// TODO (ronag): highWaterMark?
readableObjectMode: !!(r !== null && r !== undefined && r.readableObjectMode),
@@ -325,18 +297,14 @@ function _duplexify(pair) {
readable,
writable
})
-
if (writable) {
eos(w, (err) => {
writable = false
-
if (err) {
destroyer(r, err)
}
-
onfinished(err)
})
-
d._write = function (chunk, encoding, callback) {
if (w.write(chunk, encoding)) {
callback()
@@ -344,12 +312,10 @@ function _duplexify(pair) {
ondrain = callback
}
}
-
d._final = function (callback) {
w.end()
onfinish = callback
}
-
w.on('drain', function () {
if (ondrain) {
const cb = ondrain
@@ -365,15 +331,12 @@ function _duplexify(pair) {
}
})
}
-
if (readable) {
eos(r, (err) => {
readable = false
-
if (err) {
destroyer(r, err)
}
-
onfinished(err)
})
r.on('readable', function () {
@@ -386,32 +349,26 @@ function _duplexify(pair) {
r.on('end', function () {
d.push(null)
})
-
d._read = function () {
while (true) {
const buf = r.read()
-
if (buf === null) {
onreadable = d._read
return
}
-
if (!d.push(buf)) {
return
}
}
}
}
-
d._destroy = function (err, callback) {
if (!err && onclose !== null) {
err = new AbortError()
}
-
onreadable = null
ondrain = null
onfinish = null
-
if (onclose === null) {
callback(err)
} else {
@@ -420,6 +377,5 @@ function _duplexify(pair) {
destroyer(r, err)
}
}
-
return d
}
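For reference, the public surface this duplexify() implementation serves is Duplex.from() (Node >= 16.8), which wraps async functions, iterables, promises and blobs in a Duplex. A minimal sketch with an illustrative transform:

'use strict'
const { Duplex, Readable } = require('node:stream')

// An async generator function becomes a full Duplex: `source` yields what
// gets written in, and whatever the generator yields can be read back out.
const upcase = Duplex.from(async function* (source) {
  for await (const chunk of source) {
    yield String(chunk).toUpperCase()
  }
})

Readable.from(['quiet', ' words']).pipe(upcase).pipe(process.stdout)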
diff --git a/deps/npm/node_modules/readable-stream/lib/internal/streams/end-of-stream.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/end-of-stream.js
index 831f286d98fa95..57dbaa48a3ca5a 100644
--- a/deps/npm/node_modules/readable-stream/lib/internal/streams/end-of-stream.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/end-of-stream.js
@@ -1,104 +1,224 @@
-// Ported from https://github.com/mafintosh/end-of-stream with
-// permission from the author, Mathias Buus (@mafintosh).
-'use strict';
-
-var ERR_STREAM_PREMATURE_CLOSE = require('../../../errors').codes.ERR_STREAM_PREMATURE_CLOSE;
-
-function once(callback) {
- var called = false;
- return function () {
- if (called) return;
- called = true;
-
- for (var _len = arguments.length, args = new Array(_len), _key = 0; _key < _len; _key++) {
- args[_key] = arguments[_key];
- }
+/* replacement start */
- callback.apply(this, args);
- };
-}
+const process = require('process/')
-function noop() {}
+/* replacement end */
+// Ported from https://github.com/mafintosh/end-of-stream with
+// permission from the author, Mathias Buus (@mafintosh).
+;('use strict')
+const { AbortError, codes } = require('../../ours/errors')
+const { ERR_INVALID_ARG_TYPE, ERR_STREAM_PREMATURE_CLOSE } = codes
+const { kEmptyObject, once } = require('../../ours/util')
+const { validateAbortSignal, validateFunction, validateObject } = require('../validators')
+const { Promise } = require('../../ours/primordials')
+const {
+ isClosed,
+ isReadable,
+ isReadableNodeStream,
+ isReadableFinished,
+ isReadableErrored,
+ isWritable,
+ isWritableNodeStream,
+ isWritableFinished,
+ isWritableErrored,
+ isNodeStream,
+ willEmitClose: _willEmitClose
+} = require('./utils')
function isRequest(stream) {
- return stream.setHeader && typeof stream.abort === 'function';
+ return stream.setHeader && typeof stream.abort === 'function'
}
-
-function eos(stream, opts, callback) {
- if (typeof opts === 'function') return eos(stream, null, opts);
- if (!opts) opts = {};
- callback = once(callback || noop);
- var readable = opts.readable || opts.readable !== false && stream.readable;
- var writable = opts.writable || opts.writable !== false && stream.writable;
-
- var onlegacyfinish = function onlegacyfinish() {
- if (!stream.writable) onfinish();
- };
-
- var writableEnded = stream._writableState && stream._writableState.finished;
-
- var onfinish = function onfinish() {
- writable = false;
- writableEnded = true;
- if (!readable) callback.call(stream);
- };
-
- var readableEnded = stream._readableState && stream._readableState.endEmitted;
-
- var onend = function onend() {
- readable = false;
- readableEnded = true;
- if (!writable) callback.call(stream);
- };
-
- var onerror = function onerror(err) {
- callback.call(stream, err);
- };
-
- var onclose = function onclose() {
- var err;
-
- if (readable && !readableEnded) {
- if (!stream._readableState || !stream._readableState.ended) err = new ERR_STREAM_PREMATURE_CLOSE();
- return callback.call(stream, err);
+const nop = () => {}
+function eos(stream, options, callback) {
+ var _options$readable, _options$writable
+ if (arguments.length === 2) {
+ callback = options
+ options = kEmptyObject
+ } else if (options == null) {
+ options = kEmptyObject
+ } else {
+ validateObject(options, 'options')
+ }
+ validateFunction(callback, 'callback')
+ validateAbortSignal(options.signal, 'options.signal')
+ callback = once(callback)
+ const readable =
+ (_options$readable = options.readable) !== null && _options$readable !== undefined
+ ? _options$readable
+ : isReadableNodeStream(stream)
+ const writable =
+ (_options$writable = options.writable) !== null && _options$writable !== undefined
+ ? _options$writable
+ : isWritableNodeStream(stream)
+ if (!isNodeStream(stream)) {
+ // TODO: Webstreams.
+ throw new ERR_INVALID_ARG_TYPE('stream', 'Stream', stream)
+ }
+ const wState = stream._writableState
+ const rState = stream._readableState
+ const onlegacyfinish = () => {
+ if (!stream.writable) {
+ onfinish()
}
+ }
- if (writable && !writableEnded) {
- if (!stream._writableState || !stream._writableState.ended) err = new ERR_STREAM_PREMATURE_CLOSE();
- return callback.call(stream, err);
+ // TODO (ronag): Improve soft detection to include core modules and
+ // common ecosystem modules that do properly emit 'close' but fail
+ // this generic check.
+ let willEmitClose =
+ _willEmitClose(stream) && isReadableNodeStream(stream) === readable && isWritableNodeStream(stream) === writable
+ let writableFinished = isWritableFinished(stream, false)
+ const onfinish = () => {
+ writableFinished = true
+ // Stream should not be destroyed here. If it is that
+ // means that user space is doing something differently and
+ // we cannot trust willEmitClose.
+ if (stream.destroyed) {
+ willEmitClose = false
}
- };
-
- var onrequest = function onrequest() {
- stream.req.on('finish', onfinish);
- };
-
+ if (willEmitClose && (!stream.readable || readable)) {
+ return
+ }
+ if (!readable || readableFinished) {
+ callback.call(stream)
+ }
+ }
+ let readableFinished = isReadableFinished(stream, false)
+ const onend = () => {
+ readableFinished = true
+ // Stream should not be destroyed here. If it is that
+ // means that user space is doing something differently and
+ // we cannot trust willEmitClose.
+ if (stream.destroyed) {
+ willEmitClose = false
+ }
+ if (willEmitClose && (!stream.writable || writable)) {
+ return
+ }
+ if (!writable || writableFinished) {
+ callback.call(stream)
+ }
+ }
+ const onerror = (err) => {
+ callback.call(stream, err)
+ }
+ let closed = isClosed(stream)
+ const onclose = () => {
+ closed = true
+ const errored = isWritableErrored(stream) || isReadableErrored(stream)
+ if (errored && typeof errored !== 'boolean') {
+ return callback.call(stream, errored)
+ }
+ if (readable && !readableFinished && isReadableNodeStream(stream, true)) {
+ if (!isReadableFinished(stream, false)) return callback.call(stream, new ERR_STREAM_PREMATURE_CLOSE())
+ }
+ if (writable && !writableFinished) {
+ if (!isWritableFinished(stream, false)) return callback.call(stream, new ERR_STREAM_PREMATURE_CLOSE())
+ }
+ callback.call(stream)
+ }
+ const onrequest = () => {
+ stream.req.on('finish', onfinish)
+ }
if (isRequest(stream)) {
- stream.on('complete', onfinish);
- stream.on('abort', onclose);
- if (stream.req) onrequest();else stream.on('request', onrequest);
- } else if (writable && !stream._writableState) {
+ stream.on('complete', onfinish)
+ if (!willEmitClose) {
+ stream.on('abort', onclose)
+ }
+ if (stream.req) {
+ onrequest()
+ } else {
+ stream.on('request', onrequest)
+ }
+ } else if (writable && !wState) {
// legacy streams
- stream.on('end', onlegacyfinish);
- stream.on('close', onlegacyfinish);
+ stream.on('end', onlegacyfinish)
+ stream.on('close', onlegacyfinish)
}
- stream.on('end', onend);
- stream.on('finish', onfinish);
- if (opts.error !== false) stream.on('error', onerror);
- stream.on('close', onclose);
- return function () {
- stream.removeListener('complete', onfinish);
- stream.removeListener('abort', onclose);
- stream.removeListener('request', onrequest);
- if (stream.req) stream.req.removeListener('finish', onfinish);
- stream.removeListener('end', onlegacyfinish);
- stream.removeListener('close', onlegacyfinish);
- stream.removeListener('finish', onfinish);
- stream.removeListener('end', onend);
- stream.removeListener('error', onerror);
- stream.removeListener('close', onclose);
- };
+ // Not all streams will emit 'close' after 'aborted'.
+ if (!willEmitClose && typeof stream.aborted === 'boolean') {
+ stream.on('aborted', onclose)
+ }
+ stream.on('end', onend)
+ stream.on('finish', onfinish)
+ if (options.error !== false) {
+ stream.on('error', onerror)
+ }
+ stream.on('close', onclose)
+ if (closed) {
+ process.nextTick(onclose)
+ } else if (
+ (wState !== null && wState !== undefined && wState.errorEmitted) ||
+ (rState !== null && rState !== undefined && rState.errorEmitted)
+ ) {
+ if (!willEmitClose) {
+ process.nextTick(onclose)
+ }
+ } else if (
+ !readable &&
+ (!willEmitClose || isReadable(stream)) &&
+ (writableFinished || isWritable(stream) === false)
+ ) {
+ process.nextTick(onclose)
+ } else if (
+ !writable &&
+ (!willEmitClose || isWritable(stream)) &&
+ (readableFinished || isReadable(stream) === false)
+ ) {
+ process.nextTick(onclose)
+ } else if (rState && stream.req && stream.aborted) {
+ process.nextTick(onclose)
+ }
+ const cleanup = () => {
+ callback = nop
+ stream.removeListener('aborted', onclose)
+ stream.removeListener('complete', onfinish)
+ stream.removeListener('abort', onclose)
+ stream.removeListener('request', onrequest)
+ if (stream.req) stream.req.removeListener('finish', onfinish)
+ stream.removeListener('end', onlegacyfinish)
+ stream.removeListener('close', onlegacyfinish)
+ stream.removeListener('finish', onfinish)
+ stream.removeListener('end', onend)
+ stream.removeListener('error', onerror)
+ stream.removeListener('close', onclose)
+ }
+ if (options.signal && !closed) {
+ const abort = () => {
+ // Keep it because cleanup removes it.
+ const endCallback = callback
+ cleanup()
+ endCallback.call(
+ stream,
+ new AbortError(undefined, {
+ cause: options.signal.reason
+ })
+ )
+ }
+ if (options.signal.aborted) {
+ process.nextTick(abort)
+ } else {
+ const originalCallback = callback
+ callback = once((...args) => {
+ options.signal.removeEventListener('abort', abort)
+ originalCallback.apply(stream, args)
+ })
+ options.signal.addEventListener('abort', abort)
+ }
+ }
+ return cleanup
}
-
-module.exports = eos;
\ No newline at end of file
+function finished(stream, opts) {
+ return new Promise((resolve, reject) => {
+ eos(stream, opts, (err) => {
+ if (err) {
+ reject(err)
+ } else {
+ resolve()
+ }
+ })
+ })
+}
+module.exports = eos
+module.exports.finished = finished
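A usage sketch for the rewritten eos()/finished() pair: the callback form is the long-standing stream.finished(), and the promise wrapper added at the bottom mirrors require('node:stream/promises').finished. The file name below is illustrative.

'use strict'
const { finished } = require('node:stream')
const fs = require('node:fs')

const rs = fs.createReadStream('some-file.txt')

finished(rs, (err) => {
  if (err) {
    console.error('stream failed:', err)
  } else {
    console.log('stream is done reading')
  }
})

rs.resume() // drain the stream so 'end' (and therefore the callback) can fire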
diff --git a/deps/npm/node_modules/readable-stream/lib/internal/streams/from.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/from.js
index 6c41284416799c..c7e75314028794 100644
--- a/deps/npm/node_modules/readable-stream/lib/internal/streams/from.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/from.js
@@ -1,64 +1,98 @@
-'use strict';
+'use strict'
-function asyncGeneratorStep(gen, resolve, reject, _next, _throw, key, arg) { try { var info = gen[key](arg); var value = info.value; } catch (error) { reject(error); return; } if (info.done) { resolve(value); } else { Promise.resolve(value).then(_next, _throw); } }
+/* replacement start */
-function _asyncToGenerator(fn) { return function () { var self = this, args = arguments; return new Promise(function (resolve, reject) { var gen = fn.apply(self, args); function _next(value) { asyncGeneratorStep(gen, resolve, reject, _next, _throw, "next", value); } function _throw(err) { asyncGeneratorStep(gen, resolve, reject, _next, _throw, "throw", err); } _next(undefined); }); }; }
+const process = require('process/')
-function ownKeys(object, enumerableOnly) { var keys = Object.keys(object); if (Object.getOwnPropertySymbols) { var symbols = Object.getOwnPropertySymbols(object); if (enumerableOnly) symbols = symbols.filter(function (sym) { return Object.getOwnPropertyDescriptor(object, sym).enumerable; }); keys.push.apply(keys, symbols); } return keys; }
-
-function _objectSpread(target) { for (var i = 1; i < arguments.length; i++) { var source = arguments[i] != null ? arguments[i] : {}; if (i % 2) { ownKeys(Object(source), true).forEach(function (key) { _defineProperty(target, key, source[key]); }); } else if (Object.getOwnPropertyDescriptors) { Object.defineProperties(target, Object.getOwnPropertyDescriptors(source)); } else { ownKeys(Object(source)).forEach(function (key) { Object.defineProperty(target, key, Object.getOwnPropertyDescriptor(source, key)); }); } } return target; }
-
-function _defineProperty(obj, key, value) { if (key in obj) { Object.defineProperty(obj, key, { value: value, enumerable: true, configurable: true, writable: true }); } else { obj[key] = value; } return obj; }
-
-var ERR_INVALID_ARG_TYPE = require('../../../errors').codes.ERR_INVALID_ARG_TYPE;
+/* replacement end */
+const { PromisePrototypeThen, SymbolAsyncIterator, SymbolIterator } = require('../../ours/primordials')
+const { Buffer } = require('buffer')
+const { ERR_INVALID_ARG_TYPE, ERR_STREAM_NULL_VALUES } = require('../../ours/errors').codes
function from(Readable, iterable, opts) {
- var iterator;
-
- if (iterable && typeof iterable.next === 'function') {
- iterator = iterable;
- } else if (iterable && iterable[Symbol.asyncIterator]) iterator = iterable[Symbol.asyncIterator]();else if (iterable && iterable[Symbol.iterator]) iterator = iterable[Symbol.iterator]();else throw new ERR_INVALID_ARG_TYPE('iterable', ['Iterable'], iterable);
-
- var readable = new Readable(_objectSpread({
- objectMode: true
- }, opts)); // Reading boolean to protect against _read
+ let iterator
+ if (typeof iterable === 'string' || iterable instanceof Buffer) {
+ return new Readable({
+ objectMode: true,
+ ...opts,
+ read() {
+ this.push(iterable)
+ this.push(null)
+ }
+ })
+ }
+ let isAsync
+ if (iterable && iterable[SymbolAsyncIterator]) {
+ isAsync = true
+ iterator = iterable[SymbolAsyncIterator]()
+ } else if (iterable && iterable[SymbolIterator]) {
+ isAsync = false
+ iterator = iterable[SymbolIterator]()
+ } else {
+ throw new ERR_INVALID_ARG_TYPE('iterable', ['Iterable'], iterable)
+ }
+ const readable = new Readable({
+ objectMode: true,
+ highWaterMark: 1,
+ // TODO(ronag): What options should be allowed?
+ ...opts
+ })
+
+ // Flag to protect against _read
// being called before last iteration completion.
-
- var reading = false;
-
+ let reading = false
readable._read = function () {
if (!reading) {
- reading = true;
- next();
+ reading = true
+ next()
}
- };
-
- function next() {
- return _next2.apply(this, arguments);
}
-
- function _next2() {
- _next2 = _asyncToGenerator(function* () {
+ readable._destroy = function (error, cb) {
+ PromisePrototypeThen(
+ close(error),
+ () => process.nextTick(cb, error),
+ // nextTick is here in case cb throws
+ (e) => process.nextTick(cb, e || error)
+ )
+ }
+ async function close(error) {
+ const hadError = error !== undefined && error !== null
+ const hasThrow = typeof iterator.throw === 'function'
+ if (hadError && hasThrow) {
+ const { value, done } = await iterator.throw(error)
+ await value
+ if (done) {
+ return
+ }
+ }
+ if (typeof iterator.return === 'function') {
+ const { value } = await iterator.return()
+ await value
+ }
+ }
+ async function next() {
+ for (;;) {
try {
- var _ref = yield iterator.next(),
- value = _ref.value,
- done = _ref.done;
-
+ const { value, done } = isAsync ? await iterator.next() : iterator.next()
if (done) {
- readable.push(null);
- } else if (readable.push((yield value))) {
- next();
+ readable.push(null)
} else {
- reading = false;
+ const res = value && typeof value.then === 'function' ? await value : value
+ if (res === null) {
+ reading = false
+ throw new ERR_STREAM_NULL_VALUES()
+ } else if (readable.push(res)) {
+ continue
+ } else {
+ reading = false
+ }
}
} catch (err) {
- readable.destroy(err);
+ readable.destroy(err)
}
- });
- return _next2.apply(this, arguments);
+ break
+ }
}
-
- return readable;
+ return readable
}
-
-module.exports = from;
\ No newline at end of file
+module.exports = from
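This from() helper backs the public Readable.from(), which turns any iterable or async iterable into an object-mode readable; per the new branch at the top, strings and Buffers are now pushed as a single chunk. A quick sketch:

'use strict'
const { Readable } = require('node:stream')

async function* generate() {
  yield 'hello'
  yield 'readable'
  yield 'streams'
}

Readable.from(generate()).on('data', (chunk) => {
  console.log(chunk) // each yielded value arrives as one object-mode chunk
})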
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/lazy_transform.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/lazy_transform.js
similarity index 99%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/lazy_transform.js
rename to deps/npm/node_modules/readable-stream/lib/internal/streams/lazy_transform.js
index 466aa03544457e..439461a1278392 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/lazy_transform.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/lazy_transform.js
@@ -4,33 +4,24 @@
'use strict'
const { ObjectDefineProperties, ObjectDefineProperty, ObjectSetPrototypeOf } = require('../../ours/primordials')
-
const stream = require('../../stream')
-
const { getDefaultEncoding } = require('../crypto/util')
-
module.exports = LazyTransform
-
function LazyTransform(options) {
this._options = options
}
-
ObjectSetPrototypeOf(LazyTransform.prototype, stream.Transform.prototype)
ObjectSetPrototypeOf(LazyTransform, stream.Transform)
-
function makeGetter(name) {
return function () {
stream.Transform.call(this, this._options)
this._writableState.decodeStrings = false
-
if (!this._options || !this._options.defaultEncoding) {
this._writableState.defaultEncoding = getDefaultEncoding()
}
-
return this[name]
}
}
-
function makeSetter(name) {
return function (val) {
ObjectDefineProperty(this, name, {
@@ -42,7 +33,6 @@ function makeSetter(name) {
})
}
}
-
ObjectDefineProperties(LazyTransform.prototype, {
_readableState: {
__proto__: null,
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/legacy.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/legacy.js
similarity index 84%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/legacy.js
rename to deps/npm/node_modules/readable-stream/lib/internal/streams/legacy.js
index 09c3b7201376f7..d492f7ff4e6b69 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/legacy.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/legacy.js
@@ -1,66 +1,56 @@
'use strict'
const { ArrayIsArray, ObjectSetPrototypeOf } = require('../../ours/primordials')
-
const { EventEmitter: EE } = require('events')
-
function Stream(opts) {
EE.call(this, opts)
}
-
ObjectSetPrototypeOf(Stream.prototype, EE.prototype)
ObjectSetPrototypeOf(Stream, EE)
-
Stream.prototype.pipe = function (dest, options) {
const source = this
-
function ondata(chunk) {
if (dest.writable && dest.write(chunk) === false && source.pause) {
source.pause()
}
}
-
source.on('data', ondata)
-
function ondrain() {
if (source.readable && source.resume) {
source.resume()
}
}
+ dest.on('drain', ondrain)
- dest.on('drain', ondrain) // If the 'end' option is not supplied, dest.end() will be called when
+ // If the 'end' option is not supplied, dest.end() will be called when
// source gets the 'end' or 'close' events. Only dest.end() once.
-
if (!dest._isStdio && (!options || options.end !== false)) {
source.on('end', onend)
source.on('close', onclose)
}
-
let didOnEnd = false
-
function onend() {
if (didOnEnd) return
didOnEnd = true
dest.end()
}
-
function onclose() {
if (didOnEnd) return
didOnEnd = true
if (typeof dest.destroy === 'function') dest.destroy()
- } // Don't leave dangling pipes when there are errors.
+ }
+ // Don't leave dangling pipes when there are errors.
function onerror(er) {
cleanup()
-
if (EE.listenerCount(this, 'error') === 0) {
this.emit('error', er)
}
}
-
prependListener(source, 'error', onerror)
- prependListener(dest, 'error', onerror) // Remove all the event listeners that were added.
+ prependListener(dest, 'error', onerror)
+ // Remove all the event listeners that were added.
function cleanup() {
source.removeListener('data', ondata)
dest.removeListener('drain', ondrain)
@@ -72,28 +62,27 @@ Stream.prototype.pipe = function (dest, options) {
source.removeListener('close', cleanup)
dest.removeListener('close', cleanup)
}
-
source.on('end', cleanup)
source.on('close', cleanup)
dest.on('close', cleanup)
- dest.emit('pipe', source) // Allow for unix-like usage: A.pipe(B).pipe(C)
+ dest.emit('pipe', source)
+ // Allow for unix-like usage: A.pipe(B).pipe(C)
return dest
}
-
function prependListener(emitter, event, fn) {
// Sadly this is not cacheable as some libraries bundle their own
// event emitter implementation with them.
- if (typeof emitter.prependListener === 'function') return emitter.prependListener(event, fn) // This is a hack to make sure that our error handler is attached before any
+ if (typeof emitter.prependListener === 'function') return emitter.prependListener(event, fn)
+
+ // This is a hack to make sure that our error handler is attached before any
// userland ones. NEVER DO THIS. This is here only because this code needs
// to continue to work with older versions of Node.js that do not include
// the prependListener() method. The goal is to eventually remove this hack.
-
if (!emitter._events || !emitter._events[event]) emitter.on(event, fn)
else if (ArrayIsArray(emitter._events[event])) emitter._events[event].unshift(fn)
else emitter._events[event] = [fn, emitter._events[event]]
}
-
module.exports = {
Stream,
prependListener
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/operators.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/operators.js
similarity index 97%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/operators.js
rename to deps/npm/node_modules/readable-stream/lib/internal/streams/operators.js
index 11778e50f19566..323a74a17c32e9 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/operators.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/operators.js
@@ -1,18 +1,13 @@
'use strict'
const AbortController = globalThis.AbortController || require('abort-controller').AbortController
-
const {
codes: { ERR_INVALID_ARG_TYPE, ERR_MISSING_ARGS, ERR_OUT_OF_RANGE },
AbortError
} = require('../../ours/errors')
-
const { validateAbortSignal, validateInteger, validateObject } = require('../validators')
-
const kWeakHandler = require('../../ours/primordials').Symbol('kWeak')
-
const { finished } = require('./end-of-stream')
-
const {
ArrayPrototypePush,
MathFloor,
@@ -23,33 +18,25 @@ const {
PromisePrototypeThen,
Symbol
} = require('../../ours/primordials')
-
const kEmpty = Symbol('kEmpty')
const kEof = Symbol('kEof')
-
function map(fn, options) {
if (typeof fn !== 'function') {
throw new ERR_INVALID_ARG_TYPE('fn', ['Function', 'AsyncFunction'], fn)
}
-
if (options != null) {
validateObject(options, 'options')
}
-
if ((options === null || options === undefined ? undefined : options.signal) != null) {
validateAbortSignal(options.signal, 'options.signal')
}
-
let concurrency = 1
-
if ((options === null || options === undefined ? undefined : options.concurrency) != null) {
concurrency = MathFloor(options.concurrency)
}
-
validateInteger(concurrency, 'concurrency', 1)
return async function* map() {
var _options$signal, _options$signal2
-
const ac = new AbortController()
const stream = this
const queue = []
@@ -57,9 +44,7 @@ function map(fn, options) {
const signalOpt = {
signal
}
-
const abort = () => ac.abort()
-
if (
options !== null &&
options !== undefined &&
@@ -69,7 +54,6 @@ function map(fn, options) {
) {
abort()
}
-
options === null || options === undefined
? undefined
: (_options$signal2 = options.signal) === null || _options$signal2 === undefined
@@ -78,52 +62,41 @@ function map(fn, options) {
let next
let resume
let done = false
-
function onDone() {
done = true
}
-
async function pump() {
try {
for await (let val of stream) {
var _val
-
if (done) {
return
}
-
if (signal.aborted) {
throw new AbortError()
}
-
try {
val = fn(val, signalOpt)
} catch (err) {
val = PromiseReject(err)
}
-
if (val === kEmpty) {
continue
}
-
if (typeof ((_val = val) === null || _val === undefined ? undefined : _val.catch) === 'function') {
val.catch(onDone)
}
-
queue.push(val)
-
if (next) {
next()
next = null
}
-
if (!done && queue.length && queue.length >= concurrency) {
await new Promise((resolve) => {
resume = resolve
})
}
}
-
queue.push(kEof)
} catch (err) {
const val = PromiseReject(err)
@@ -131,14 +104,11 @@ function map(fn, options) {
queue.push(val)
} finally {
var _options$signal3
-
done = true
-
if (next) {
next()
next = null
}
-
options === null || options === undefined
? undefined
: (_options$signal3 = options.signal) === null || _options$signal3 === undefined
@@ -146,34 +116,26 @@ function map(fn, options) {
: _options$signal3.removeEventListener('abort', abort)
}
}
-
pump()
-
try {
while (true) {
while (queue.length > 0) {
const val = await queue[0]
-
if (val === kEof) {
return
}
-
if (signal.aborted) {
throw new AbortError()
}
-
if (val !== kEmpty) {
yield val
}
-
queue.shift()
-
if (resume) {
resume()
resume = null
}
}
-
await new Promise((resolve) => {
next = resolve
})
@@ -181,7 +143,6 @@ function map(fn, options) {
} finally {
ac.abort()
done = true
-
if (resume) {
resume()
resume = null
@@ -189,22 +150,17 @@ function map(fn, options) {
}
}.call(this)
}
-
function asIndexedPairs(options = undefined) {
if (options != null) {
validateObject(options, 'options')
}
-
if ((options === null || options === undefined ? undefined : options.signal) != null) {
validateAbortSignal(options.signal, 'options.signal')
}
-
return async function* asIndexedPairs() {
let index = 0
-
for await (const val of this) {
var _options$signal4
-
if (
options !== null &&
options !== undefined &&
@@ -216,25 +172,21 @@ function asIndexedPairs(options = undefined) {
cause: options.signal.reason
})
}
-
yield [index++, val]
}
}.call(this)
}
-
async function some(fn, options = undefined) {
for await (const unused of filter.call(this, fn, options)) {
return true
}
-
return false
}
-
async function every(fn, options = undefined) {
if (typeof fn !== 'function') {
throw new ERR_INVALID_ARG_TYPE('fn', ['Function', 'AsyncFunction'], fn)
- } // https://en.wikipedia.org/wiki/De_Morgan%27s_laws
-
+ }
+ // https://en.wikipedia.org/wiki/De_Morgan%27s_laws
return !(await some.call(
this,
async (...args) => {
@@ -243,69 +195,56 @@ async function every(fn, options = undefined) {
options
))
}
-
async function find(fn, options) {
for await (const result of filter.call(this, fn, options)) {
return result
}
-
return undefined
}
-
async function forEach(fn, options) {
if (typeof fn !== 'function') {
throw new ERR_INVALID_ARG_TYPE('fn', ['Function', 'AsyncFunction'], fn)
}
-
async function forEachFn(value, options) {
await fn(value, options)
return kEmpty
- } // eslint-disable-next-line no-unused-vars
-
+ }
+ // eslint-disable-next-line no-unused-vars
for await (const unused of map.call(this, forEachFn, options));
}
-
function filter(fn, options) {
if (typeof fn !== 'function') {
throw new ERR_INVALID_ARG_TYPE('fn', ['Function', 'AsyncFunction'], fn)
}
-
async function filterFn(value, options) {
if (await fn(value, options)) {
return value
}
-
return kEmpty
}
-
return map.call(this, filterFn, options)
-} // Specific to provide better error to reduce since the argument is only
-// missing if the stream has no items in it - but the code is still appropriate
+}
+// Specific to provide better error to reduce since the argument is only
+// missing if the stream has no items in it - but the code is still appropriate
class ReduceAwareErrMissingArgs extends ERR_MISSING_ARGS {
constructor() {
super('reduce')
this.message = 'Reduce of an empty stream requires an initial value'
}
}
-
async function reduce(reducer, initialValue, options) {
var _options$signal5
-
if (typeof reducer !== 'function') {
throw new ERR_INVALID_ARG_TYPE('reducer', ['Function', 'AsyncFunction'], reducer)
}
-
if (options != null) {
validateObject(options, 'options')
}
-
if ((options === null || options === undefined ? undefined : options.signal) != null) {
validateAbortSignal(options.signal, 'options.signal')
}
-
let hasInitialValue = arguments.length > 1
-
if (
options !== null &&
options !== undefined &&
@@ -317,14 +256,11 @@ async function reduce(reducer, initialValue, options) {
cause: options.signal.reason
})
this.once('error', () => {}) // The error is already propagated
-
await finished(this.destroy(err))
throw err
}
-
const ac = new AbortController()
const signal = ac.signal
-
if (options !== null && options !== undefined && options.signal) {
const opts = {
once: true,
@@ -332,15 +268,11 @@ async function reduce(reducer, initialValue, options) {
}
options.signal.addEventListener('abort', () => ac.abort(), opts)
}
-
let gotAnyItemFromStream = false
-
try {
for await (const value of this) {
var _options$signal6
-
gotAnyItemFromStream = true
-
if (
options !== null &&
options !== undefined &&
@@ -350,7 +282,6 @@ async function reduce(reducer, initialValue, options) {
) {
throw new AbortError()
}
-
if (!hasInitialValue) {
initialValue = value
hasInitialValue = true
@@ -360,31 +291,24 @@ async function reduce(reducer, initialValue, options) {
})
}
}
-
if (!gotAnyItemFromStream && !hasInitialValue) {
throw new ReduceAwareErrMissingArgs()
}
} finally {
ac.abort()
}
-
return initialValue
}
-
async function toArray(options) {
if (options != null) {
validateObject(options, 'options')
}
-
if ((options === null || options === undefined ? undefined : options.signal) != null) {
validateAbortSignal(options.signal, 'options.signal')
}
-
const result = []
-
for await (const val of this) {
var _options$signal7
-
if (
options !== null &&
options !== undefined &&
@@ -396,13 +320,10 @@ async function toArray(options) {
cause: options.signal.reason
})
}
-
ArrayPrototypePush(result, val)
}
-
return result
}
-
function flatMap(fn, options) {
const values = map.call(this, fn, options)
return async function* flatMap() {
@@ -411,36 +332,28 @@ function flatMap(fn, options) {
}
}.call(this)
}
-
function toIntegerOrInfinity(number) {
// We coerce here to align with the spec
// https://github.com/tc39/proposal-iterator-helpers/issues/169
number = Number(number)
-
if (NumberIsNaN(number)) {
return 0
}
-
if (number < 0) {
throw new ERR_OUT_OF_RANGE('number', '>= 0', number)
}
-
return number
}
-
function drop(number, options = undefined) {
if (options != null) {
validateObject(options, 'options')
}
-
if ((options === null || options === undefined ? undefined : options.signal) != null) {
validateAbortSignal(options.signal, 'options.signal')
}
-
number = toIntegerOrInfinity(number)
return async function* drop() {
var _options$signal8
-
if (
options !== null &&
options !== undefined &&
@@ -450,10 +363,8 @@ function drop(number, options = undefined) {
) {
throw new AbortError()
}
-
for await (const val of this) {
var _options$signal9
-
if (
options !== null &&
options !== undefined &&
@@ -463,27 +374,22 @@ function drop(number, options = undefined) {
) {
throw new AbortError()
}
-
if (number-- <= 0) {
yield val
}
}
}.call(this)
}
-
function take(number, options = undefined) {
if (options != null) {
validateObject(options, 'options')
}
-
if ((options === null || options === undefined ? undefined : options.signal) != null) {
validateAbortSignal(options.signal, 'options.signal')
}
-
number = toIntegerOrInfinity(number)
return async function* take() {
var _options$signal10
-
if (
options !== null &&
options !== undefined &&
@@ -493,10 +399,8 @@ function take(number, options = undefined) {
) {
throw new AbortError()
}
-
for await (const val of this) {
var _options$signal11
-
if (
options !== null &&
options !== undefined &&
@@ -506,7 +410,6 @@ function take(number, options = undefined) {
) {
throw new AbortError()
}
-
if (number-- > 0) {
yield val
} else {
@@ -515,7 +418,6 @@ function take(number, options = undefined) {
}
}.call(this)
}
-
module.exports.streamReturningOperators = {
asIndexedPairs,
drop,
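The operators in this file surface as the experimental iterator helpers on Readable (map, filter, take, toArray and friends, Node >= ~17.4). A hedged sketch of how they chain:

'use strict'
const { Readable } = require('node:stream')

Readable.from([1, 2, 3, 4, 5])
  .filter((n) => n % 2 === 0)             // keep even values
  .map((n) => n * 2)                      // double them
  .toArray()                              // collect into a promise of an array
  .then((result) => console.log(result))  // [4, 8]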
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/passthrough.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/passthrough.js
similarity index 99%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/passthrough.js
rename to deps/npm/node_modules/readable-stream/lib/internal/streams/passthrough.js
index 55c551723ee328..ed4f486c3baa44 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/passthrough.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/passthrough.js
@@ -18,25 +18,22 @@
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
// a passthrough stream.
// basically just the most minimal sort of Transform stream.
// Every written chunk gets output as-is.
+
'use strict'
const { ObjectSetPrototypeOf } = require('../../ours/primordials')
-
module.exports = PassThrough
-
const Transform = require('./transform')
-
ObjectSetPrototypeOf(PassThrough.prototype, Transform.prototype)
ObjectSetPrototypeOf(PassThrough, Transform)
-
function PassThrough(options) {
if (!(this instanceof PassThrough)) return new PassThrough(options)
Transform.call(this, options)
}
-
PassThrough.prototype._transform = function (chunk, encoding, cb) {
cb(null, chunk)
}
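PassThrough, as reimplemented here, is just the identity Transform; it is handy as a tap or placeholder inside a pipeline. A tiny sketch:

'use strict'
const { PassThrough } = require('node:stream')

const tap = new PassThrough()
tap.on('data', (chunk) => console.log('observed:', chunk.toString()))
tap.write('hello') // every written chunk comes straight back out
tap.end()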
diff --git a/deps/npm/node_modules/readable-stream/lib/internal/streams/pipeline.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/pipeline.js
index 6589909889c585..016e96ee6ff247 100644
--- a/deps/npm/node_modules/readable-stream/lib/internal/streams/pipeline.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/pipeline.js
@@ -1,97 +1,378 @@
+/* replacement start */
+
+const process = require('process/')
+
+/* replacement end */
// Ported from https://github.com/mafintosh/pump with
// permission from the author, Mathias Buus (@mafintosh).
-'use strict';
-
-var eos;
-function once(callback) {
- var called = false;
- return function () {
- if (called) return;
- called = true;
- callback.apply(void 0, arguments);
- };
+;('use strict')
+const { ArrayIsArray, Promise, SymbolAsyncIterator } = require('../../ours/primordials')
+const eos = require('./end-of-stream')
+const { once } = require('../../ours/util')
+const destroyImpl = require('./destroy')
+const Duplex = require('./duplex')
+const {
+ aggregateTwoErrors,
+ codes: {
+ ERR_INVALID_ARG_TYPE,
+ ERR_INVALID_RETURN_VALUE,
+ ERR_MISSING_ARGS,
+ ERR_STREAM_DESTROYED,
+ ERR_STREAM_PREMATURE_CLOSE
+ },
+ AbortError
+} = require('../../ours/errors')
+const { validateFunction, validateAbortSignal } = require('../validators')
+const { isIterable, isReadable, isReadableNodeStream, isNodeStream } = require('./utils')
+const AbortController = globalThis.AbortController || require('abort-controller').AbortController
+let PassThrough
+let Readable
+function destroyer(stream, reading, writing) {
+ let finished = false
+ stream.on('close', () => {
+ finished = true
+ })
+ const cleanup = eos(
+ stream,
+ {
+ readable: reading,
+ writable: writing
+ },
+ (err) => {
+ finished = !err
+ }
+ )
+ return {
+ destroy: (err) => {
+ if (finished) return
+ finished = true
+ destroyImpl.destroyer(stream, err || new ERR_STREAM_DESTROYED('pipe'))
+ },
+ cleanup
+ }
}
-
-var _require$codes = require('../../../errors').codes,
- ERR_MISSING_ARGS = _require$codes.ERR_MISSING_ARGS,
- ERR_STREAM_DESTROYED = _require$codes.ERR_STREAM_DESTROYED;
-
-function noop(err) {
- // Rethrow the error if it exists to avoid swallowing it
- if (err) throw err;
+function popCallback(streams) {
+ // Streams should never be an empty array. It should always contain at least
+ // a single stream. Therefore optimize for the average case instead of
+ // checking for length === 0 as well.
+ validateFunction(streams[streams.length - 1], 'streams[stream.length - 1]')
+ return streams.pop()
}
-
-function isRequest(stream) {
- return stream.setHeader && typeof stream.abort === 'function';
+function makeAsyncIterable(val) {
+ if (isIterable(val)) {
+ return val
+ } else if (isReadableNodeStream(val)) {
+ // Legacy streams are not Iterable.
+ return fromReadable(val)
+ }
+ throw new ERR_INVALID_ARG_TYPE('val', ['Readable', 'Iterable', 'AsyncIterable'], val)
}
-
-function destroyer(stream, reading, writing, callback) {
- callback = once(callback);
- var closed = false;
- stream.on('close', function () {
- closed = true;
- });
- if (eos === undefined) eos = require('./end-of-stream');
- eos(stream, {
- readable: reading,
- writable: writing
- }, function (err) {
- if (err) return callback(err);
- closed = true;
- callback();
- });
- var destroyed = false;
- return function (err) {
- if (closed) return;
- if (destroyed) return;
- destroyed = true; // request.destroy just do .end - .abort is what we want
-
- if (isRequest(stream)) return stream.abort();
- if (typeof stream.destroy === 'function') return stream.destroy();
- callback(err || new ERR_STREAM_DESTROYED('pipe'));
- };
+async function* fromReadable(val) {
+ if (!Readable) {
+ Readable = require('./readable')
+ }
+ yield* Readable.prototype[SymbolAsyncIterator].call(val)
}
-
-function call(fn) {
- fn();
+async function pump(iterable, writable, finish, { end }) {
+ let error
+ let onresolve = null
+ const resume = (err) => {
+ if (err) {
+ error = err
+ }
+ if (onresolve) {
+ const callback = onresolve
+ onresolve = null
+ callback()
+ }
+ }
+ const wait = () =>
+ new Promise((resolve, reject) => {
+ if (error) {
+ reject(error)
+ } else {
+ onresolve = () => {
+ if (error) {
+ reject(error)
+ } else {
+ resolve()
+ }
+ }
+ }
+ })
+ writable.on('drain', resume)
+ const cleanup = eos(
+ writable,
+ {
+ readable: false
+ },
+ resume
+ )
+ try {
+ if (writable.writableNeedDrain) {
+ await wait()
+ }
+ for await (const chunk of iterable) {
+ if (!writable.write(chunk)) {
+ await wait()
+ }
+ }
+ if (end) {
+ writable.end()
+ }
+ await wait()
+ finish()
+ } catch (err) {
+ finish(error !== err ? aggregateTwoErrors(error, err) : err)
+ } finally {
+ cleanup()
+ writable.off('drain', resume)
+ }
}
-
-function pipe(from, to) {
- return from.pipe(to);
-}
-
-function popCallback(streams) {
- if (!streams.length) return noop;
- if (typeof streams[streams.length - 1] !== 'function') return noop;
- return streams.pop();
+function pipeline(...streams) {
+ return pipelineImpl(streams, once(popCallback(streams)))
}
+function pipelineImpl(streams, callback, opts) {
+ if (streams.length === 1 && ArrayIsArray(streams[0])) {
+ streams = streams[0]
+ }
+ if (streams.length < 2) {
+ throw new ERR_MISSING_ARGS('streams')
+ }
+ const ac = new AbortController()
+ const signal = ac.signal
+ const outerSignal = opts === null || opts === undefined ? undefined : opts.signal
-function pipeline() {
- for (var _len = arguments.length, streams = new Array(_len), _key = 0; _key < _len; _key++) {
- streams[_key] = arguments[_key];
+ // Need to cleanup event listeners if last stream is readable
+ // https://github.com/nodejs/node/issues/35452
+ const lastStreamCleanup = []
+ validateAbortSignal(outerSignal, 'options.signal')
+ function abort() {
+ finishImpl(new AbortError())
+ }
+ outerSignal === null || outerSignal === undefined ? undefined : outerSignal.addEventListener('abort', abort)
+ let error
+ let value
+ const destroys = []
+ let finishCount = 0
+ function finish(err) {
+ finishImpl(err, --finishCount === 0)
}
+ function finishImpl(err, final) {
+ if (err && (!error || error.code === 'ERR_STREAM_PREMATURE_CLOSE')) {
+ error = err
+ }
+ if (!error && !final) {
+ return
+ }
+ while (destroys.length) {
+ destroys.shift()(error)
+ }
+ outerSignal === null || outerSignal === undefined ? undefined : outerSignal.removeEventListener('abort', abort)
+ ac.abort()
+ if (final) {
+ if (!error) {
+ lastStreamCleanup.forEach((fn) => fn())
+ }
+ process.nextTick(callback, error, value)
+ }
+ }
+ let ret
+ for (let i = 0; i < streams.length; i++) {
+ const stream = streams[i]
+ const reading = i < streams.length - 1
+ const writing = i > 0
+ const end = reading || (opts === null || opts === undefined ? undefined : opts.end) !== false
+ const isLastStream = i === streams.length - 1
+ if (isNodeStream(stream)) {
+ if (end) {
+ const { destroy, cleanup } = destroyer(stream, reading, writing)
+ destroys.push(destroy)
+ if (isReadable(stream) && isLastStream) {
+ lastStreamCleanup.push(cleanup)
+ }
+ }
- var callback = popCallback(streams);
- if (Array.isArray(streams[0])) streams = streams[0];
+ // Catch stream errors that occur after pipe/pump has completed.
+ function onError(err) {
+ if (err && err.name !== 'AbortError' && err.code !== 'ERR_STREAM_PREMATURE_CLOSE') {
+ finish(err)
+ }
+ }
+ stream.on('error', onError)
+ if (isReadable(stream) && isLastStream) {
+ lastStreamCleanup.push(() => {
+ stream.removeListener('error', onError)
+ })
+ }
+ }
+ if (i === 0) {
+ if (typeof stream === 'function') {
+ ret = stream({
+ signal
+ })
+ if (!isIterable(ret)) {
+ throw new ERR_INVALID_RETURN_VALUE('Iterable, AsyncIterable or Stream', 'source', ret)
+ }
+ } else if (isIterable(stream) || isReadableNodeStream(stream)) {
+ ret = stream
+ } else {
+ ret = Duplex.from(stream)
+ }
+ } else if (typeof stream === 'function') {
+ ret = makeAsyncIterable(ret)
+ ret = stream(ret, {
+ signal
+ })
+ if (reading) {
+ if (!isIterable(ret, true)) {
+ throw new ERR_INVALID_RETURN_VALUE('AsyncIterable', `transform[${i - 1}]`, ret)
+ }
+ } else {
+ var _ret
+ if (!PassThrough) {
+ PassThrough = require('./passthrough')
+ }
- if (streams.length < 2) {
- throw new ERR_MISSING_ARGS('streams');
- }
+ // If the last argument to pipeline is not a stream
+ // we must create a proxy stream so that pipeline(...)
+ // always returns a stream which can be further
+ // composed through `.pipe(stream)`.
- var error;
- var destroys = streams.map(function (stream, i) {
- var reading = i < streams.length - 1;
- var writing = i > 0;
- return destroyer(stream, reading, writing, function (err) {
- if (!error) error = err;
- if (err) destroys.forEach(call);
- if (reading) return;
- destroys.forEach(call);
- callback(error);
- });
- });
- return streams.reduce(pipe);
-}
+ const pt = new PassThrough({
+ objectMode: true
+ })
-module.exports = pipeline;
\ No newline at end of file
+ // Handle Promises/A+ spec, `then` could be a getter that throws on
+ // second use.
+ const then = (_ret = ret) === null || _ret === undefined ? undefined : _ret.then
+ if (typeof then === 'function') {
+ finishCount++
+ then.call(
+ ret,
+ (val) => {
+ value = val
+ if (val != null) {
+ pt.write(val)
+ }
+ if (end) {
+ pt.end()
+ }
+ process.nextTick(finish)
+ },
+ (err) => {
+ pt.destroy(err)
+ process.nextTick(finish, err)
+ }
+ )
+ } else if (isIterable(ret, true)) {
+ finishCount++
+ pump(ret, pt, finish, {
+ end
+ })
+ } else {
+ throw new ERR_INVALID_RETURN_VALUE('AsyncIterable or Promise', 'destination', ret)
+ }
+ ret = pt
+ const { destroy, cleanup } = destroyer(ret, false, true)
+ destroys.push(destroy)
+ if (isLastStream) {
+ lastStreamCleanup.push(cleanup)
+ }
+ }
+ } else if (isNodeStream(stream)) {
+ if (isReadableNodeStream(ret)) {
+ finishCount += 2
+ const cleanup = pipe(ret, stream, finish, {
+ end
+ })
+ if (isReadable(stream) && isLastStream) {
+ lastStreamCleanup.push(cleanup)
+ }
+ } else if (isIterable(ret)) {
+ finishCount++
+ pump(ret, stream, finish, {
+ end
+ })
+ } else {
+ throw new ERR_INVALID_ARG_TYPE('val', ['Readable', 'Iterable', 'AsyncIterable'], ret)
+ }
+ ret = stream
+ } else {
+ ret = Duplex.from(stream)
+ }
+ }
+ if (
+ (signal !== null && signal !== undefined && signal.aborted) ||
+ (outerSignal !== null && outerSignal !== undefined && outerSignal.aborted)
+ ) {
+ process.nextTick(abort)
+ }
+ return ret
+}
+function pipe(src, dst, finish, { end }) {
+ let ended = false
+ dst.on('close', () => {
+ if (!ended) {
+ // Finish if the destination closes before the source has completed.
+ finish(new ERR_STREAM_PREMATURE_CLOSE())
+ }
+ })
+ src.pipe(dst, {
+ end
+ })
+ if (end) {
+ // Compat. Before node v10.12.0 stdio used to throw an error so
+ // pipe() did/does not end() stdio destinations.
+ // Now they allow it but "secretly" don't close the underlying fd.
+ src.once('end', () => {
+ ended = true
+ dst.end()
+ })
+ } else {
+ finish()
+ }
+ eos(
+ src,
+ {
+ readable: true,
+ writable: false
+ },
+ (err) => {
+ const rState = src._readableState
+ if (
+ err &&
+ err.code === 'ERR_STREAM_PREMATURE_CLOSE' &&
+ rState &&
+ rState.ended &&
+ !rState.errored &&
+ !rState.errorEmitted
+ ) {
+ // Some readable streams will emit 'close' before 'end'. However, since
+ // this is on the readable side 'end' should still be emitted if the
+ // stream has been ended and no error emitted. This should be allowed in
+ // favor of backwards compatibility. Since the stream is piped to a
+ // destination this should not result in any observable difference.
+ // We don't need to check if this is a writable premature close since
+ // eos will only fail with premature close on the reading side for
+ // duplex streams.
+ src.once('end', finish).once('error', finish)
+ } else {
+ finish(err)
+ }
+ }
+ )
+ return eos(
+ dst,
+ {
+ readable: false,
+ writable: true
+ },
+ finish
+ )
+}
+module.exports = {
+ pipelineImpl,
+ pipeline
+}
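
Editor's note, not part of the patch: the rewritten pipeline.js above accepts plain Node streams as well as function stages that receive an async iterable and an { signal } option (see makeAsyncIterable and pump). A hedged sketch of that calling convention, using Node's public stream API rather than the vendored copy; the file names are hypothetical.

// Sketch only: callback-style pipeline with an async-generator transform stage.
const { pipeline } = require('stream')
const fs = require('fs')

pipeline(
  fs.createReadStream('input.txt'),          // source stream
  async function* (source, { signal }) {     // transform stage: receives an async iterable
    for await (const chunk of source) {
      yield chunk.toString().toUpperCase()
    }
  },
  fs.createWriteStream('output.txt'),        // destination stream
  (err) => {                                  // trailing callback popped by popCallback()
    if (err) console.error('pipeline failed:', err)
    else console.log('pipeline succeeded')
  }
)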
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/readable.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/readable.js
similarity index 86%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/readable.js
rename to deps/npm/node_modules/readable-stream/lib/internal/streams/readable.js
index 299299d244629b..3fc01d1f880932 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/readable.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/readable.js
@@ -1,5 +1,7 @@
/* replacement start */
-const process = require('process')
+
+const process = require('process/')
+
/* replacement end */
// Copyright Joyent, Inc. and other Node contributors.
//
@@ -23,7 +25,6 @@ const process = require('process')
// USE OR OTHER DEALINGS IN THE SOFTWARE.
;('use strict')
-
const {
ArrayPrototypeIndexOf,
NumberIsInteger,
@@ -37,30 +38,19 @@ const {
SymbolAsyncIterator,
Symbol
} = require('../../ours/primordials')
-
module.exports = Readable
Readable.ReadableState = ReadableState
-
const { EventEmitter: EE } = require('events')
-
const { Stream, prependListener } = require('./legacy')
-
const { Buffer } = require('buffer')
-
const { addAbortSignal } = require('./add-abort-signal')
-
const eos = require('./end-of-stream')
-
let debug = require('../../ours/util').debuglog('stream', (fn) => {
debug = fn
})
-
const BufferList = require('./buffer_list')
-
const destroyImpl = require('./destroy')
-
const { getHighWaterMark, getDefaultHighWaterMark } = require('./state')
-
const {
aggregateTwoErrors,
codes: {
@@ -71,118 +61,122 @@ const {
ERR_STREAM_UNSHIFT_AFTER_END_EVENT
}
} = require('../../ours/errors')
-
const { validateObject } = require('../validators')
-
const kPaused = Symbol('kPaused')
-
const { StringDecoder } = require('string_decoder')
-
const from = require('./from')
-
ObjectSetPrototypeOf(Readable.prototype, Stream.prototype)
ObjectSetPrototypeOf(Readable, Stream)
-
const nop = () => {}
-
const { errorOrDestroy } = destroyImpl
-
function ReadableState(options, stream, isDuplex) {
// Duplex streams are both readable and writable, but share
// the same options object.
// However, some cases require setting options to different
// values for the readable and the writable sides of the duplex stream.
// These options can be provided separately as readableXXX and writableXXX.
- if (typeof isDuplex !== 'boolean') isDuplex = stream instanceof require('./duplex') // Object stream flag. Used to make read(n) ignore n and to
- // make all the buffer merging and length checks go away.
+ if (typeof isDuplex !== 'boolean') isDuplex = stream instanceof require('./duplex')
+ // Object stream flag. Used to make read(n) ignore n and to
+ // make all the buffer merging and length checks go away.
this.objectMode = !!(options && options.objectMode)
- if (isDuplex) this.objectMode = this.objectMode || !!(options && options.readableObjectMode) // The point at which it stops calling _read() to fill the buffer
- // Note: 0 is a valid value, means "don't call _read preemptively ever"
+ if (isDuplex) this.objectMode = this.objectMode || !!(options && options.readableObjectMode)
+ // The point at which it stops calling _read() to fill the buffer
+ // Note: 0 is a valid value, means "don't call _read preemptively ever"
this.highWaterMark = options
? getHighWaterMark(this, options, 'readableHighWaterMark', isDuplex)
- : getDefaultHighWaterMark(false) // A linked list is used to store data chunks instead of an array because the
+ : getDefaultHighWaterMark(false)
+
+ // A linked list is used to store data chunks instead of an array because the
// linked list can remove elements from the beginning faster than
// array.shift().
-
this.buffer = new BufferList()
this.length = 0
this.pipes = []
this.flowing = null
this.ended = false
this.endEmitted = false
- this.reading = false // Stream is still being constructed and cannot be
+ this.reading = false
+
+ // Stream is still being constructed and cannot be
// destroyed until construction finished or failed.
// Async construction is opt in, therefore we start as
// constructed.
+ this.constructed = true
- this.constructed = true // A flag to be able to tell if the event 'readable'/'data' is emitted
+ // A flag to be able to tell if the event 'readable'/'data' is emitted
// immediately, or on a later tick. We set this to true at first, because
// any actions that shouldn't happen until "later" should generally also
// not happen before the first read call.
+ this.sync = true
- this.sync = true // Whenever we return null, then we set a flag to say
+ // Whenever we return null, then we set a flag to say
// that we're awaiting a 'readable' event emission.
-
this.needReadable = false
this.emittedReadable = false
this.readableListening = false
this.resumeScheduled = false
- this[kPaused] = null // True if the error was already emitted and should not be thrown again.
+ this[kPaused] = null
- this.errorEmitted = false // Should close be emitted on destroy. Defaults to true.
+ // True if the error was already emitted and should not be thrown again.
+ this.errorEmitted = false
- this.emitClose = !options || options.emitClose !== false // Should .destroy() be called after 'end' (and potentially 'finish').
+ // Should close be emitted on destroy. Defaults to true.
+ this.emitClose = !options || options.emitClose !== false
- this.autoDestroy = !options || options.autoDestroy !== false // Has it been destroyed.
+ // Should .destroy() be called after 'end' (and potentially 'finish').
+ this.autoDestroy = !options || options.autoDestroy !== false
- this.destroyed = false // Indicates whether the stream has errored. When true no further
+ // Has it been destroyed.
+ this.destroyed = false
+
+ // Indicates whether the stream has errored. When true no further
// _read calls, 'data' or 'readable' events should occur. This is needed
// since when autoDestroy is disabled we need a way to tell whether the
// stream has failed.
+ this.errored = null
- this.errored = null // Indicates whether the stream has finished destroying.
+ // Indicates whether the stream has finished destroying.
+ this.closed = false
- this.closed = false // True if close has been emitted or would have been emitted
+ // True if close has been emitted or would have been emitted
// depending on emitClose.
+ this.closeEmitted = false
- this.closeEmitted = false // Crypto is kind of old and crusty. Historically, its default string
+ // Crypto is kind of old and crusty. Historically, its default string
// encoding is 'binary' so we have to make this configurable.
// Everything else in the universe uses 'utf8', though.
+ this.defaultEncoding = (options && options.defaultEncoding) || 'utf8'
- this.defaultEncoding = (options && options.defaultEncoding) || 'utf8' // Ref the piped dest which we need a drain event on it
+ // Ref the piped dest which we need a drain event on it
// type: null | Writable | Set.
-
this.awaitDrainWriters = null
- this.multiAwaitDrain = false // If true, a maybeReadMore has been scheduled.
+ this.multiAwaitDrain = false
+ // If true, a maybeReadMore has been scheduled.
this.readingMore = false
this.dataEmitted = false
this.decoder = null
this.encoding = null
-
if (options && options.encoding) {
this.decoder = new StringDecoder(options.encoding)
this.encoding = options.encoding
}
}
-
function Readable(options) {
- if (!(this instanceof Readable)) return new Readable(options) // Checking for a Stream.Duplex instance is faster here instead of inside
- // the ReadableState constructor, at least with V8 6.5.
+ if (!(this instanceof Readable)) return new Readable(options)
+ // Checking for a Stream.Duplex instance is faster here instead of inside
+ // the ReadableState constructor, at least with V8 6.5.
const isDuplex = this instanceof require('./duplex')
-
this._readableState = new ReadableState(options, this, isDuplex)
-
if (options) {
if (typeof options.read === 'function') this._read = options.read
if (typeof options.destroy === 'function') this._destroy = options.destroy
if (typeof options.construct === 'function') this._construct = options.construct
if (options.signal && !isDuplex) addAbortSignal(options.signal, this)
}
-
Stream.call(this, options)
destroyImpl.construct(this, () => {
if (this._readableState.needReadable) {
@@ -190,38 +184,34 @@ function Readable(options) {
}
})
}
-
Readable.prototype.destroy = destroyImpl.destroy
Readable.prototype._undestroy = destroyImpl.undestroy
-
Readable.prototype._destroy = function (err, cb) {
cb(err)
}
-
Readable.prototype[EE.captureRejectionSymbol] = function (err) {
this.destroy(err)
-} // Manually shove something into the read() buffer.
+}
+
+// Manually shove something into the read() buffer.
// This returns true if the highWaterMark has not been hit yet,
// similar to how Writable.write() returns true if you should
// write() some more.
-
Readable.prototype.push = function (chunk, encoding) {
return readableAddChunk(this, chunk, encoding, false)
-} // Unshift should *always* be something directly out of read().
+}
+// Unshift should *always* be something directly out of read().
Readable.prototype.unshift = function (chunk, encoding) {
return readableAddChunk(this, chunk, encoding, true)
}
-
function readableAddChunk(stream, chunk, encoding, addToFront) {
debug('readableAddChunk', chunk)
const state = stream._readableState
let err
-
if (!state.objectMode) {
if (typeof chunk === 'string') {
encoding = encoding || state.defaultEncoding
-
if (state.encoding !== encoding) {
if (addToFront && state.encoding) {
// When unshifting, if state.encoding is set, we have to save
@@ -241,7 +231,6 @@ function readableAddChunk(stream, chunk, encoding, addToFront) {
err = new ERR_INVALID_ARG_TYPE('chunk', ['string', 'Buffer', 'Uint8Array'], chunk)
}
}
-
if (err) {
errorOrDestroy(stream, err)
} else if (chunk === null) {
@@ -258,7 +247,6 @@ function readableAddChunk(stream, chunk, encoding, addToFront) {
return false
} else {
state.reading = false
-
if (state.decoder && !encoding) {
chunk = state.decoder.write(chunk)
if (state.objectMode || chunk.length !== 0) addChunk(stream, state, chunk, false)
@@ -270,13 +258,13 @@ function readableAddChunk(stream, chunk, encoding, addToFront) {
} else if (!addToFront) {
state.reading = false
maybeReadMore(stream, state)
- } // We can push more data if we are below the highWaterMark.
+ }
+
+ // We can push more data if we are below the highWaterMark.
// Also, if we have no data yet, we can stand some more bytes.
// This is to work around cases where hwm=0, such as the repl.
-
return !state.ended && (state.length < state.highWaterMark || state.length === 0)
}
-
function addChunk(stream, state, chunk, addToFront) {
if (state.flowing && state.length === 0 && !state.sync && stream.listenerCount('data') > 0) {
// Use the guard to avoid creating `Set()` repeatedly
@@ -286,7 +274,6 @@ function addChunk(stream, state, chunk, addToFront) {
} else {
state.awaitDrainWriters = null
}
-
state.dataEmitted = true
stream.emit('data', chunk)
} else {
@@ -296,36 +283,33 @@ function addChunk(stream, state, chunk, addToFront) {
else state.buffer.push(chunk)
if (state.needReadable) emitReadable(stream)
}
-
maybeReadMore(stream, state)
}
-
Readable.prototype.isPaused = function () {
const state = this._readableState
return state[kPaused] === true || state.flowing === false
-} // Backwards compatibility.
+}
+// Backwards compatibility.
Readable.prototype.setEncoding = function (enc) {
const decoder = new StringDecoder(enc)
- this._readableState.decoder = decoder // If setEncoding(null), decoder.encoding equals utf8.
-
+ this._readableState.decoder = decoder
+ // If setEncoding(null), decoder.encoding equals utf8.
this._readableState.encoding = this._readableState.decoder.encoding
- const buffer = this._readableState.buffer // Iterate over current buffer to convert already stored Buffers:
-
+ const buffer = this._readableState.buffer
+ // Iterate over current buffer to convert already stored Buffers:
let content = ''
-
for (const data of buffer) {
content += decoder.write(data)
}
-
buffer.clear()
if (content !== '') buffer.push(content)
this._readableState.length = content.length
return this
-} // Don't raise the hwm > 1GB.
+}
+// Don't raise the hwm > 1GB.
const MAX_HWM = 0x40000000
-
function computeNewHighWaterMark(n) {
if (n > MAX_HWM) {
throw new ERR_OUT_OF_RANGE('size', '<= 1GiB', n)
@@ -340,43 +324,43 @@ function computeNewHighWaterMark(n) {
n |= n >>> 16
n++
}
-
return n
-} // This function is designed to be inlinable, so please take care when making
-// changes to the function body.
+}
+// This function is designed to be inlinable, so please take care when making
+// changes to the function body.
function howMuchToRead(n, state) {
if (n <= 0 || (state.length === 0 && state.ended)) return 0
if (state.objectMode) return 1
-
if (NumberIsNaN(n)) {
// Only flow one buffer at a time.
if (state.flowing && state.length) return state.buffer.first().length
return state.length
}
-
if (n <= state.length) return n
return state.ended ? state.length : 0
-} // You can override either this method, or the async _read(n) below.
+}
+// You can override either this method, or the async _read(n) below.
Readable.prototype.read = function (n) {
- debug('read', n) // Same as parseInt(undefined, 10), however V8 7.3 performance regressed
+ debug('read', n)
+ // Same as parseInt(undefined, 10), however V8 7.3 performance regressed
// in this scenario, so we are doing it manually.
-
if (n === undefined) {
n = NaN
} else if (!NumberIsInteger(n)) {
n = NumberParseInt(n, 10)
}
-
const state = this._readableState
- const nOrig = n // If we're asking for more than the current hwm, then raise the hwm.
+ const nOrig = n
+ // If we're asking for more than the current hwm, then raise the hwm.
if (n > state.highWaterMark) state.highWaterMark = computeNewHighWaterMark(n)
- if (n !== 0) state.emittedReadable = false // If we're doing read(0) to trigger a readable event, but we
+ if (n !== 0) state.emittedReadable = false
+
+ // If we're doing read(0) to trigger a readable event, but we
// already have a bunch of data in the buffer, then just trigger
// the 'readable' event and move on.
-
if (
n === 0 &&
state.needReadable &&
@@ -387,13 +371,15 @@ Readable.prototype.read = function (n) {
else emitReadable(this)
return null
}
+ n = howMuchToRead(n, state)
- n = howMuchToRead(n, state) // If we've ended, and we're now clear, then finish it up.
-
+ // If we've ended, and we're now clear, then finish it up.
if (n === 0 && state.ended) {
if (state.length === 0) endReadable(this)
return null
- } // All the actual chunk generation logic needs to be
+ }
+
+ // All the actual chunk generation logic needs to be
// *below* the call to _read. The reason is that in certain
// synthetic stream cases, such as passthrough streams, _read
// may be a completely synchronous operation which may change
@@ -414,88 +400,80 @@ Readable.prototype.read = function (n) {
// 'readable' etc.
//
// 3. Actually pull the requested chunks out of the buffer and return.
- // if we need a readable event, then we need to do some reading.
+ // if we need a readable event, then we need to do some reading.
let doRead = state.needReadable
- debug('need readable', doRead) // If we currently have less than the highWaterMark, then also read some.
+ debug('need readable', doRead)
+ // If we currently have less than the highWaterMark, then also read some.
if (state.length === 0 || state.length - n < state.highWaterMark) {
doRead = true
debug('length less than watermark', doRead)
- } // However, if we've ended, then there's no point, if we're already
+ }
+
+ // However, if we've ended, then there's no point, if we're already
// reading, then it's unnecessary, if we're constructing we have to wait,
// and if we're destroyed or errored, then it's not allowed,
-
if (state.ended || state.reading || state.destroyed || state.errored || !state.constructed) {
doRead = false
debug('reading, ended or constructing', doRead)
} else if (doRead) {
debug('do read')
state.reading = true
- state.sync = true // If the length is currently zero, then we *need* a readable event.
-
- if (state.length === 0) state.needReadable = true // Call internal read method
+ state.sync = true
+ // If the length is currently zero, then we *need* a readable event.
+ if (state.length === 0) state.needReadable = true
+ // Call internal read method
try {
this._read(state.highWaterMark)
} catch (err) {
errorOrDestroy(this, err)
}
-
- state.sync = false // If _read pushed data synchronously, then `reading` will be false,
+ state.sync = false
+ // If _read pushed data synchronously, then `reading` will be false,
// and we need to re-evaluate how much data we can return to the user.
-
if (!state.reading) n = howMuchToRead(nOrig, state)
}
-
let ret
if (n > 0) ret = fromList(n, state)
else ret = null
-
if (ret === null) {
state.needReadable = state.length <= state.highWaterMark
n = 0
} else {
state.length -= n
-
if (state.multiAwaitDrain) {
state.awaitDrainWriters.clear()
} else {
state.awaitDrainWriters = null
}
}
-
if (state.length === 0) {
// If we have nothing in the buffer, then we want to know
// as soon as we *do* get something into the buffer.
- if (!state.ended) state.needReadable = true // If we tried to read() past the EOF, then emit end on the next tick.
+ if (!state.ended) state.needReadable = true
+ // If we tried to read() past the EOF, then emit end on the next tick.
if (nOrig !== n && state.ended) endReadable(this)
}
-
if (ret !== null && !state.errorEmitted && !state.closeEmitted) {
state.dataEmitted = true
this.emit('data', ret)
}
-
return ret
}
-
function onEofChunk(stream, state) {
debug('onEofChunk')
if (state.ended) return
-
if (state.decoder) {
const chunk = state.decoder.end()
-
if (chunk && chunk.length) {
state.buffer.push(chunk)
state.length += state.objectMode ? 1 : chunk.length
}
}
-
state.ended = true
-
if (state.sync) {
// If we are sync, wait until next tick to emit the data.
// Otherwise we risk emitting data in the flow()
@@ -504,57 +482,56 @@ function onEofChunk(stream, state) {
} else {
// Emit 'readable' now to make sure it gets picked up.
state.needReadable = false
- state.emittedReadable = true // We have to emit readable now that we are EOF. Modules
+ state.emittedReadable = true
+ // We have to emit readable now that we are EOF. Modules
// in the ecosystem (e.g. dicer) rely on this event being sync.
-
emitReadable_(stream)
}
-} // Don't emit readable right away in sync mode, because this can trigger
+}
+
+// Don't emit readable right away in sync mode, because this can trigger
// another read() call => stack overflow. This way, it might trigger
// a nextTick recursion warning, but that's not so bad.
-
function emitReadable(stream) {
const state = stream._readableState
debug('emitReadable', state.needReadable, state.emittedReadable)
state.needReadable = false
-
if (!state.emittedReadable) {
debug('emitReadable', state.flowing)
state.emittedReadable = true
process.nextTick(emitReadable_, stream)
}
}
-
function emitReadable_(stream) {
const state = stream._readableState
debug('emitReadable_', state.destroyed, state.length, state.ended)
-
if (!state.destroyed && !state.errored && (state.length || state.ended)) {
stream.emit('readable')
state.emittedReadable = false
- } // The stream needs another readable event if:
+ }
+
+ // The stream needs another readable event if:
// 1. It is not flowing, as the flow mechanism will take
// care of it.
// 2. It is not ended.
// 3. It is below the highWaterMark, so we can schedule
// another readable later.
-
state.needReadable = !state.flowing && !state.ended && state.length <= state.highWaterMark
flow(stream)
-} // At this point, the user has presumably seen the 'readable' event,
+}
+
+// At this point, the user has presumably seen the 'readable' event,
// and called read() to consume some data. that may have triggered
// in turn another _read(n) call, in which case reading = true if
// it's in progress.
// However, if we're not ended, or reading, and the length < hwm,
// then go ahead and try to read some more preemptively.
-
function maybeReadMore(stream, state) {
if (!state.readingMore && state.constructed) {
state.readingMore = true
process.nextTick(maybeReadMore_, stream, state)
}
}
-
function maybeReadMore_(stream, state) {
// Attempt to read more data if we should.
//
@@ -591,28 +568,25 @@ function maybeReadMore_(stream, state) {
// Didn't get any data, stop spinning.
break
}
-
state.readingMore = false
-} // Abstract method. to be overridden in specific implementation classes.
+}
+
+// Abstract method. to be overridden in specific implementation classes.
// call cb(er, data) where data is <= n in length.
// for virtual (non-string, non-buffer) streams, "length" is somewhat
// arbitrary, and perhaps not very meaningful.
-
Readable.prototype._read = function (n) {
throw new ERR_METHOD_NOT_IMPLEMENTED('_read()')
}
-
Readable.prototype.pipe = function (dest, pipeOpts) {
const src = this
const state = this._readableState
-
if (state.pipes.length === 1) {
if (!state.multiAwaitDrain) {
state.multiAwaitDrain = true
state.awaitDrainWriters = new SafeSet(state.awaitDrainWriters ? [state.awaitDrainWriters] : [])
}
}
-
state.pipes.push(dest)
debug('pipe count=%d opts=%j', state.pipes.length, pipeOpts)
const doEnd = (!pipeOpts || pipeOpts.end !== false) && dest !== process.stdout && dest !== process.stderr
@@ -620,10 +594,8 @@ Readable.prototype.pipe = function (dest, pipeOpts) {
if (state.endEmitted) process.nextTick(endFn)
else src.once('end', endFn)
dest.on('unpipe', onunpipe)
-
function onunpipe(readable, unpipeInfo) {
debug('onunpipe')
-
if (readable === src) {
if (unpipeInfo && unpipeInfo.hasUnpiped === false) {
unpipeInfo.hasUnpiped = true
@@ -631,39 +603,34 @@ Readable.prototype.pipe = function (dest, pipeOpts) {
}
}
}
-
function onend() {
debug('onend')
dest.end()
}
-
let ondrain
let cleanedUp = false
-
function cleanup() {
- debug('cleanup') // Cleanup event handlers once the pipe is broken.
-
+ debug('cleanup')
+ // Cleanup event handlers once the pipe is broken.
dest.removeListener('close', onclose)
dest.removeListener('finish', onfinish)
-
if (ondrain) {
dest.removeListener('drain', ondrain)
}
-
dest.removeListener('error', onerror)
dest.removeListener('unpipe', onunpipe)
src.removeListener('end', onend)
src.removeListener('end', unpipe)
src.removeListener('data', ondata)
- cleanedUp = true // If the reader is waiting for a drain event from this
+ cleanedUp = true
+
+ // If the reader is waiting for a drain event from this
// specific writer, then it would cause it to never start
// flowing again.
// So, if this is awaiting a drain, then we just call it now.
// If we don't know, then assume that we are waiting for one.
-
if (ondrain && state.awaitDrainWriters && (!dest._writableState || dest._writableState.needDrain)) ondrain()
}
-
function pause() {
// If the user unpiped during `dest.write()`, it is possible
// to get stuck in a permanently paused state if that write
@@ -678,10 +645,8 @@ Readable.prototype.pipe = function (dest, pipeOpts) {
debug('false write response, pause', state.awaitDrainWriters.size)
state.awaitDrainWriters.add(dest)
}
-
src.pause()
}
-
if (!ondrain) {
// When the dest drains, it reduces the awaitDrain counter
// on the source. This would be more elegant with a .once()
@@ -691,28 +656,24 @@ Readable.prototype.pipe = function (dest, pipeOpts) {
dest.on('drain', ondrain)
}
}
-
src.on('data', ondata)
-
function ondata(chunk) {
debug('ondata')
const ret = dest.write(chunk)
debug('dest.write', ret)
-
if (ret === false) {
pause()
}
- } // If the dest has an error, then stop piping into it.
- // However, don't suppress the throwing behavior for this.
+ }
+ // If the dest has an error, then stop piping into it.
+ // However, don't suppress the throwing behavior for this.
function onerror(er) {
debug('onerror', er)
unpipe()
dest.removeListener('error', onerror)
-
if (dest.listenerCount('error') === 0) {
const s = dest._writableState || dest._readableState
-
if (s && !s.errorEmitted) {
// User incorrectly emitted 'error' directly on the stream.
errorOrDestroy(dest, er)
@@ -720,31 +681,32 @@ Readable.prototype.pipe = function (dest, pipeOpts) {
dest.emit('error', er)
}
}
- } // Make sure our error handler is attached before userland ones.
+ }
- prependListener(dest, 'error', onerror) // Both close and finish should trigger unpipe, but only once.
+ // Make sure our error handler is attached before userland ones.
+ prependListener(dest, 'error', onerror)
+ // Both close and finish should trigger unpipe, but only once.
function onclose() {
dest.removeListener('finish', onfinish)
unpipe()
}
-
dest.once('close', onclose)
-
function onfinish() {
debug('onfinish')
dest.removeListener('close', onclose)
unpipe()
}
-
dest.once('finish', onfinish)
-
function unpipe() {
debug('unpipe')
src.unpipe(dest)
- } // Tell the dest that it's being piped to.
+ }
+
+ // Tell the dest that it's being piped to.
+ dest.emit('pipe', src)
- dest.emit('pipe', src) // Start the flow if it hasn't been started already.
+ // Start the flow if it hasn't been started already.
if (dest.writableNeedDrain === true) {
if (state.flowing) {
@@ -754,16 +716,15 @@ Readable.prototype.pipe = function (dest, pipeOpts) {
debug('pipe resume')
src.resume()
}
-
return dest
}
-
function pipeOnDrain(src, dest) {
return function pipeOnDrainFunctionResult() {
- const state = src._readableState // `ondrain` will call directly,
+ const state = src._readableState
+
+ // `ondrain` will call directly,
// `this` maybe not a reference to dest,
// so we use the real dest here.
-
if (state.awaitDrainWriters === dest) {
debug('pipeOnDrain', 1)
state.awaitDrainWriters = null
@@ -771,53 +732,51 @@ function pipeOnDrain(src, dest) {
debug('pipeOnDrain', state.awaitDrainWriters.size)
state.awaitDrainWriters.delete(dest)
}
-
if ((!state.awaitDrainWriters || state.awaitDrainWriters.size === 0) && src.listenerCount('data')) {
src.resume()
}
}
}
-
Readable.prototype.unpipe = function (dest) {
const state = this._readableState
const unpipeInfo = {
hasUnpiped: false
- } // If we're not piping anywhere, then do nothing.
+ }
+ // If we're not piping anywhere, then do nothing.
if (state.pipes.length === 0) return this
-
if (!dest) {
// remove all.
const dests = state.pipes
state.pipes = []
this.pause()
-
for (let i = 0; i < dests.length; i++)
dests[i].emit('unpipe', this, {
hasUnpiped: false
})
-
return this
- } // Try to find the right one.
+ }
+ // Try to find the right one.
const index = ArrayPrototypeIndexOf(state.pipes, dest)
if (index === -1) return this
state.pipes.splice(index, 1)
if (state.pipes.length === 0) this.pause()
dest.emit('unpipe', this, unpipeInfo)
return this
-} // Set up data events if they are asked for
-// Ensure readable listeners eventually get something.
+}
+// Set up data events if they are asked for
+// Ensure readable listeners eventually get something.
Readable.prototype.on = function (ev, fn) {
const res = Stream.prototype.on.call(this, ev, fn)
const state = this._readableState
-
if (ev === 'data') {
// Update readableListening so that resume() may be a no-op
// a few lines down. This is needed to support once('readable').
- state.readableListening = this.listenerCount('readable') > 0 // Try start flowing on next tick if stream isn't explicitly paused.
+ state.readableListening = this.listenerCount('readable') > 0
+ // Try start flowing on next tick if stream isn't explicitly paused.
if (state.flowing !== false) this.resume()
} else if (ev === 'readable') {
if (!state.endEmitted && !state.readableListening) {
@@ -825,7 +784,6 @@ Readable.prototype.on = function (ev, fn) {
state.flowing = false
state.emittedReadable = false
debug('on readable', state.length, state.reading)
-
if (state.length) {
emitReadable(this)
} else if (!state.reading) {
@@ -833,15 +791,11 @@ Readable.prototype.on = function (ev, fn) {
}
}
}
-
return res
}
-
Readable.prototype.addListener = Readable.prototype.on
-
Readable.prototype.removeListener = function (ev, fn) {
const res = Stream.prototype.removeListener.call(this, ev, fn)
-
if (ev === 'readable') {
// We need to check if there is someone still listening to
// readable and reset the state. However this needs to happen
@@ -851,15 +805,11 @@ Readable.prototype.removeListener = function (ev, fn) {
// effect.
process.nextTick(updateReadableListening, this)
}
-
return res
}
-
Readable.prototype.off = Readable.prototype.removeListener
-
Readable.prototype.removeAllListeners = function (ev) {
const res = Stream.prototype.removeAllListeners.apply(this, arguments)
-
if (ev === 'readable' || ev === undefined) {
// We need to check if there is someone still listening to
// readable and reset the state. However this needs to happen
@@ -869,91 +819,82 @@ Readable.prototype.removeAllListeners = function (ev) {
// effect.
process.nextTick(updateReadableListening, this)
}
-
return res
}
-
function updateReadableListening(self) {
const state = self._readableState
state.readableListening = self.listenerCount('readable') > 0
-
if (state.resumeScheduled && state[kPaused] === false) {
// Flowing needs to be set to true now, otherwise
// the upcoming resume will not flow.
- state.flowing = true // Crude way to check if we should resume.
+ state.flowing = true
+
+ // Crude way to check if we should resume.
} else if (self.listenerCount('data') > 0) {
self.resume()
} else if (!state.readableListening) {
state.flowing = null
}
}
-
function nReadingNextTick(self) {
debug('readable nexttick read 0')
self.read(0)
-} // pause() and resume() are remnants of the legacy readable stream API
-// If the user uses them, then switch into old mode.
+}
+// pause() and resume() are remnants of the legacy readable stream API
+// If the user uses them, then switch into old mode.
Readable.prototype.resume = function () {
const state = this._readableState
-
if (!state.flowing) {
- debug('resume') // We flow only if there is no one listening
+ debug('resume')
+ // We flow only if there is no one listening
// for readable, but we still have to call
// resume().
-
state.flowing = !state.readableListening
resume(this, state)
}
-
state[kPaused] = false
return this
}
-
function resume(stream, state) {
if (!state.resumeScheduled) {
state.resumeScheduled = true
process.nextTick(resume_, stream, state)
}
}
-
function resume_(stream, state) {
debug('resume', state.reading)
-
if (!state.reading) {
stream.read(0)
}
-
state.resumeScheduled = false
stream.emit('resume')
flow(stream)
if (state.flowing && !state.reading) stream.read(0)
}
-
Readable.prototype.pause = function () {
debug('call pause flowing=%j', this._readableState.flowing)
-
if (this._readableState.flowing !== false) {
debug('pause')
this._readableState.flowing = false
this.emit('pause')
}
-
this._readableState[kPaused] = true
return this
}
-
function flow(stream) {
const state = stream._readableState
debug('flow', state.flowing)
-
while (state.flowing && stream.read() !== null);
-} // Wrap an old-style stream as the async data source.
+}
+
+// Wrap an old-style stream as the async data source.
// This is *not* part of the readable stream interface.
// It is an ugly unfortunate mess of history.
-
Readable.prototype.wrap = function (stream) {
- let paused = false // TODO (ronag): Should this.destroy(err) emit
+ let paused = false
+
+ // TODO (ronag): Should this.destroy(err) emit
// 'error' on the wrapped stream? Would require
// a static factory method, e.g. Readable.wrap(stream).
@@ -975,54 +916,44 @@ Readable.prototype.wrap = function (stream) {
stream.on('destroy', () => {
this.destroy()
})
-
this._read = () => {
if (paused && stream.resume) {
paused = false
stream.resume()
}
- } // Proxy all the other methods. Important when wrapping filters and duplexes.
+ }
+ // Proxy all the other methods. Important when wrapping filters and duplexes.
const streamKeys = ObjectKeys(stream)
-
for (let j = 1; j < streamKeys.length; j++) {
const i = streamKeys[j]
-
if (this[i] === undefined && typeof stream[i] === 'function') {
this[i] = stream[i].bind(stream)
}
}
-
return this
}
-
Readable.prototype[SymbolAsyncIterator] = function () {
return streamToAsyncIterator(this)
}
-
Readable.prototype.iterator = function (options) {
if (options !== undefined) {
validateObject(options, 'options')
}
-
return streamToAsyncIterator(this, options)
}
-
function streamToAsyncIterator(stream, options) {
if (typeof stream.read !== 'function') {
stream = Readable.wrap(stream, {
objectMode: true
})
}
-
const iter = createAsyncIterator(stream, options)
iter.stream = stream
return iter
}
-
async function* createAsyncIterator(stream, options) {
let callback = nop
-
function next(resolve) {
if (this === stream) {
callback()
@@ -1031,7 +962,6 @@ async function* createAsyncIterator(stream, options) {
callback = resolve
}
}
-
stream.on('readable', next)
let error
const cleanup = eos(
@@ -1045,11 +975,9 @@ async function* createAsyncIterator(stream, options) {
callback = nop
}
)
-
try {
while (true) {
const chunk = stream.destroyed ? null : stream.read()
-
if (chunk !== null) {
yield chunk
} else if (error) {
@@ -1074,23 +1002,22 @@ async function* createAsyncIterator(stream, options) {
cleanup()
}
}
-} // Making it explicit these properties are not enumerable
+}
+
+// Making it explicit these properties are not enumerable
// because otherwise some prototype manipulation in
// userland will fail.
-
ObjectDefineProperties(Readable.prototype, {
readable: {
__proto__: null,
-
get() {
- const r = this._readableState // r.readable === false means that this is part of a Duplex stream
+ const r = this._readableState
+ // r.readable === false means that this is part of a Duplex stream
// where the readable side was disabled upon construction.
// Compat. The user might manually disable readable side through
// deprecated setter.
-
return !!r && r.readable !== false && !r.destroyed && !r.errorEmitted && !r.endEmitted
},
-
set(val) {
// Backwards compat.
if (this._readableState) {
@@ -1145,7 +1072,6 @@ ObjectDefineProperties(Readable.prototype, {
readableLength: {
__proto__: null,
enumerable: false,
-
get() {
return this._readableState.length
}
@@ -1153,7 +1079,6 @@ ObjectDefineProperties(Readable.prototype, {
readableObjectMode: {
__proto__: null,
enumerable: false,
-
get() {
return this._readableState ? this._readableState.objectMode : false
}
@@ -1161,7 +1086,6 @@ ObjectDefineProperties(Readable.prototype, {
readableEncoding: {
__proto__: null,
enumerable: false,
-
get() {
return this._readableState ? this._readableState.encoding : null
}
@@ -1169,14 +1093,12 @@ ObjectDefineProperties(Readable.prototype, {
errored: {
__proto__: null,
enumerable: false,
-
get() {
return this._readableState ? this._readableState.errored : null
}
},
closed: {
__proto__: null,
-
get() {
return this._readableState ? this._readableState.closed : false
}
@@ -1184,26 +1106,24 @@ ObjectDefineProperties(Readable.prototype, {
destroyed: {
__proto__: null,
enumerable: false,
-
get() {
return this._readableState ? this._readableState.destroyed : false
},
-
set(value) {
// We ignore the value if the stream
// has not been initialized yet.
if (!this._readableState) {
return
- } // Backward compatibility, the user is explicitly
- // managing destroyed.
+ }
+ // Backward compatibility, the user is explicitly
+ // managing destroyed.
this._readableState.destroyed = value
}
},
readableEnded: {
__proto__: null,
enumerable: false,
-
get() {
return this._readableState ? this._readableState.endEmitted : false
}
@@ -1213,7 +1133,6 @@ ObjectDefineProperties(ReadableState.prototype, {
// Legacy getter for `pipesCount`.
pipesCount: {
__proto__: null,
-
get() {
return this.pipes.length
}
@@ -1221,22 +1140,22 @@ ObjectDefineProperties(ReadableState.prototype, {
// Legacy property for `paused`.
paused: {
__proto__: null,
-
get() {
return this[kPaused] !== false
},
-
set(value) {
this[kPaused] = !!value
}
}
-}) // Exposed for testing purposes only.
+})
+
+// Exposed for testing purposes only.
+Readable._fromList = fromList
-Readable._fromList = fromList // Pluck off n bytes from an array of buffers.
+// Pluck off n bytes from an array of buffers.
// Length is the combined lengths of all the buffers in the list.
// This function is designed to be inlinable, so please take care when making
// changes to the function body.
-
function fromList(n, state) {
// nothing buffered.
if (state.length === 0) return null
@@ -1254,24 +1173,21 @@ function fromList(n, state) {
}
return ret
}
-
function endReadable(stream) {
const state = stream._readableState
debug('endReadable', state.endEmitted)
-
if (!state.endEmitted) {
state.ended = true
process.nextTick(endReadableNT, state, stream)
}
}
-
function endReadableNT(state, stream) {
- debug('endReadableNT', state.endEmitted, state.length) // Check that we didn't get one last unshift.
+ debug('endReadableNT', state.endEmitted, state.length)
+ // Check that we didn't get one last unshift.
if (!state.errored && !state.closeEmitted && !state.endEmitted && state.length === 0) {
state.endEmitted = true
stream.emit('end')
-
if (stream.writable && stream.allowHalfOpen === false) {
process.nextTick(endWritableNT, stream)
} else if (state.autoDestroy) {
@@ -1280,47 +1196,40 @@ function endReadableNT(state, stream) {
const wState = stream._writableState
const autoDestroy =
!wState ||
- (wState.autoDestroy && // We don't expect the writable to ever 'finish'
+ (wState.autoDestroy &&
+ // We don't expect the writable to ever 'finish'
// if writable is explicitly set to false.
(wState.finished || wState.writable === false))
-
if (autoDestroy) {
stream.destroy()
}
}
}
}
-
function endWritableNT(stream) {
const writable = stream.writable && !stream.writableEnded && !stream.destroyed
-
if (writable) {
stream.end()
}
}
-
Readable.from = function (iterable, opts) {
return from(Readable, iterable, opts)
}
+let webStreamsAdapters
-let webStreamsAdapters // Lazy to avoid circular references
-
+// Lazy to avoid circular references
function lazyWebStreams() {
if (webStreamsAdapters === undefined) webStreamsAdapters = {}
return webStreamsAdapters
}
-
Readable.fromWeb = function (readableStream, options) {
return lazyWebStreams().newStreamReadableFromReadableStream(readableStream, options)
}
-
Readable.toWeb = function (streamReadable, options) {
return lazyWebStreams().newReadableStreamFromStreamReadable(streamReadable, options)
}
-
Readable.wrap = function (src, options) {
var _ref, _src$readableObjectMo
-
return new Readable({
objectMode:
(_ref =
@@ -1330,7 +1239,6 @@ Readable.wrap = function (src, options) {
? _ref
: true,
...options,
-
destroy(err, callback) {
destroyImpl.destroyer(src, err)
callback(err)
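
Editor's note, not part of the patch: two features visible in the readable.js rewrite above are Readable.from() and the Symbol.asyncIterator support (streamToAsyncIterator). A short consumption sketch using Node's public stream API, assuming nothing beyond what the hunks show.

// Sketch only: build a Readable from an iterable and consume it with for await.
const { Readable } = require('stream')

async function main() {
  const readable = Readable.from(['a', 'b', 'c'])
  for await (const chunk of readable) {
    console.log(chunk)
  }
}

main().catch(console.error)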
diff --git a/deps/npm/node_modules/readable-stream/lib/internal/streams/state.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/state.js
index 19887eb8a9070e..18c2d845ff0186 100644
--- a/deps/npm/node_modules/readable-stream/lib/internal/streams/state.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/state.js
@@ -1,27 +1,27 @@
-'use strict';
-
-var ERR_INVALID_OPT_VALUE = require('../../../errors').codes.ERR_INVALID_OPT_VALUE;
+'use strict'
+const { MathFloor, NumberIsInteger } = require('../../ours/primordials')
+const { ERR_INVALID_ARG_VALUE } = require('../../ours/errors').codes
function highWaterMarkFrom(options, isDuplex, duplexKey) {
- return options.highWaterMark != null ? options.highWaterMark : isDuplex ? options[duplexKey] : null;
+ return options.highWaterMark != null ? options.highWaterMark : isDuplex ? options[duplexKey] : null
+}
+function getDefaultHighWaterMark(objectMode) {
+ return objectMode ? 16 : 16 * 1024
}
-
function getHighWaterMark(state, options, duplexKey, isDuplex) {
- var hwm = highWaterMarkFrom(options, isDuplex, duplexKey);
-
+ const hwm = highWaterMarkFrom(options, isDuplex, duplexKey)
if (hwm != null) {
- if (!(isFinite(hwm) && Math.floor(hwm) === hwm) || hwm < 0) {
- var name = isDuplex ? duplexKey : 'highWaterMark';
- throw new ERR_INVALID_OPT_VALUE(name, hwm);
+ if (!NumberIsInteger(hwm) || hwm < 0) {
+ const name = isDuplex ? `options.${duplexKey}` : 'options.highWaterMark'
+ throw new ERR_INVALID_ARG_VALUE(name, hwm)
}
+ return MathFloor(hwm)
+ }
- return Math.floor(hwm);
- } // Default value
-
-
- return state.objectMode ? 16 : 16 * 1024;
+ // Default value
+ return getDefaultHighWaterMark(state.objectMode)
}
-
module.exports = {
- getHighWaterMark: getHighWaterMark
-};
\ No newline at end of file
+ getHighWaterMark,
+ getDefaultHighWaterMark
+}
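
Editor's note, not part of the patch: the state.js hunk above adds getDefaultHighWaterMark (16 objects in objectMode, 16 * 1024 bytes otherwise) and swaps the validation error to ERR_INVALID_ARG_VALUE. A small sketch of the observable defaults through Node's public stream API; exact error codes in your Node version may differ.

// Sketch only: default highWaterMark values as enforced by getDefaultHighWaterMark().
const { Readable } = require('stream')

const bytes = new Readable({ read() {} })
const objects = new Readable({ objectMode: true, read() {} })
console.log(bytes.readableHighWaterMark)    // 16384 (16 KiB)
console.log(objects.readableHighWaterMark)  // 16 (objects)
// A negative or non-integer highWaterMark is rejected up front
// (ERR_INVALID_ARG_VALUE here, ERR_INVALID_OPT_VALUE in the old code).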
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/transform.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/transform.js
similarity index 95%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/transform.js
rename to deps/npm/node_modules/readable-stream/lib/internal/streams/transform.js
index 18601011e03f4a..fa9413a447463c 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/transform.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/transform.js
@@ -18,6 +18,7 @@
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
// a transform stream is a readable/writable stream where you do
// something with the data. Sometimes it's called a "filter",
// but that's not a great name for it, since that implies a thing where
@@ -59,29 +60,24 @@
// However, even in such a pathological case, only a single written chunk
// would be consumed, and then the rest would wait (un-transformed) until
// the results of the previous transformed chunk were consumed.
+
'use strict'
const { ObjectSetPrototypeOf, Symbol } = require('../../ours/primordials')
-
module.exports = Transform
-
const { ERR_METHOD_NOT_IMPLEMENTED } = require('../../ours/errors').codes
-
const Duplex = require('./duplex')
-
const { getHighWaterMark } = require('./state')
-
ObjectSetPrototypeOf(Transform.prototype, Duplex.prototype)
ObjectSetPrototypeOf(Transform, Duplex)
const kCallback = Symbol('kCallback')
-
function Transform(options) {
- if (!(this instanceof Transform)) return new Transform(options) // TODO (ronag): This should preferably always be
+ if (!(this instanceof Transform)) return new Transform(options)
+
+ // TODO (ronag): This should preferably always be
// applied but would be semver-major. Or even better;
// make Transform a Readable with the Writable interface.
-
const readableHighWaterMark = options ? getHighWaterMark(this, options, 'readableHighWaterMark', true) : null
-
if (readableHighWaterMark === 0) {
// A Duplex will buffer both on the writable and readable side while
// a Transform just wants to buffer hwm number of elements. To avoid
@@ -97,25 +93,24 @@ function Transform(options) {
writableHighWaterMark: options.writableHighWaterMark || 0
}
}
+ Duplex.call(this, options)
- Duplex.call(this, options) // We have implemented the _read method, and done the other things
+ // We have implemented the _read method, and done the other things
// that Readable wants before the first _read call, so unset the
// sync guard flag.
-
this._readableState.sync = false
this[kCallback] = null
-
if (options) {
if (typeof options.transform === 'function') this._transform = options.transform
if (typeof options.flush === 'function') this._flush = options.flush
- } // When the writable side finishes, then flush out anything remaining.
+ }
+
+ // When the writable side finishes, then flush out anything remaining.
// Backwards compat. Some Transform streams incorrectly implement _final
// instead of or in addition to _flush. By using 'prefinish' instead of
// implementing _final we continue supporting this unfortunate use case.
-
this.on('prefinish', prefinish)
}
-
function final(cb) {
if (typeof this._flush === 'function' && !this.destroyed) {
this._flush((er, data) => {
@@ -125,59 +120,49 @@ function final(cb) {
} else {
this.destroy(er)
}
-
return
}
-
if (data != null) {
this.push(data)
}
-
this.push(null)
-
if (cb) {
cb()
}
})
} else {
this.push(null)
-
if (cb) {
cb()
}
}
}
-
function prefinish() {
if (this._final !== final) {
final.call(this)
}
}
-
Transform.prototype._final = final
-
Transform.prototype._transform = function (chunk, encoding, callback) {
throw new ERR_METHOD_NOT_IMPLEMENTED('_transform()')
}
-
Transform.prototype._write = function (chunk, encoding, callback) {
const rState = this._readableState
const wState = this._writableState
const length = rState.length
-
this._transform(chunk, encoding, (err, val) => {
if (err) {
callback(err)
return
}
-
if (val != null) {
this.push(val)
}
-
if (
- wState.ended || // Backwards compat.
- length === rState.length || // Backwards compat.
+ wState.ended ||
+ // Backwards compat.
+ length === rState.length ||
+ // Backwards compat.
rState.length < rState.highWaterMark
) {
callback()
@@ -186,7 +171,6 @@ Transform.prototype._write = function (chunk, encoding, callback) {
}
})
}
-
Transform.prototype._read = function () {
if (this[kCallback]) {
const callback = this[kCallback]
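
Editor's note, not part of the patch: the transform.js hunks above wire options.transform and options.flush into the constructor. A hedged usage sketch with Node's public stream API; the uppercase/flush behavior is illustrative only.

// Sketch only: Transform built via the options constructor.
const { Transform } = require('stream')

const upper = new Transform({
  transform(chunk, encoding, callback) {
    // push one transformed chunk per written chunk
    callback(null, chunk.toString().toUpperCase())
  },
  flush(callback) {
    // emit a trailer once the writable side finishes
    callback(null, '\n-- done --\n')
  }
})

upper.pipe(process.stdout)
upper.write('hello ')
upper.end('world')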
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/utils.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/utils.js
similarity index 97%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/utils.js
rename to deps/npm/node_modules/readable-stream/lib/internal/streams/utils.js
index b1aa7d81705c04..f87e9fe68e6a82 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/utils.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/utils.js
@@ -1,15 +1,12 @@
'use strict'
const { Symbol, SymbolAsyncIterator, SymbolIterator } = require('../../ours/primordials')
-
const kDestroyed = Symbol('kDestroyed')
const kIsErrored = Symbol('kIsErrored')
const kIsReadable = Symbol('kIsReadable')
const kIsDisturbed = Symbol('kIsDisturbed')
-
function isReadableNodeStream(obj, strict = false) {
var _obj$_readableState
-
return !!(
(
obj &&
@@ -19,7 +16,8 @@ function isReadableNodeStream(obj, strict = false) {
(!obj._writableState ||
((_obj$_readableState = obj._readableState) === null || _obj$_readableState === undefined
? undefined
- : _obj$_readableState.readable) !== false) && // Duplex
+ : _obj$_readableState.readable) !== false) &&
+ // Duplex
(!obj._writableState || obj._readableState)
) // Writable has .pipe.
)
@@ -27,7 +25,6 @@ function isReadableNodeStream(obj, strict = false) {
function isWritableNodeStream(obj) {
var _obj$_writableState
-
return !!(
(
obj &&
@@ -50,7 +47,6 @@ function isDuplexNodeStream(obj) {
typeof obj.write === 'function'
)
}
-
function isNodeStream(obj) {
return (
obj &&
@@ -60,22 +56,21 @@ function isNodeStream(obj) {
(typeof obj.pipe === 'function' && typeof obj.on === 'function'))
)
}
-
function isIterable(obj, isAsync) {
if (obj == null) return false
if (isAsync === true) return typeof obj[SymbolAsyncIterator] === 'function'
if (isAsync === false) return typeof obj[SymbolIterator] === 'function'
return typeof obj[SymbolAsyncIterator] === 'function' || typeof obj[SymbolIterator] === 'function'
}
-
function isDestroyed(stream) {
if (!isNodeStream(stream)) return null
const wState = stream._writableState
const rState = stream._readableState
const state = wState || rState
return !!(stream.destroyed || stream[kDestroyed] || (state !== null && state !== undefined && state.destroyed))
-} // Have been end():d.
+}
+// Have been end():d.
function isWritableEnded(stream) {
if (!isWritableNodeStream(stream)) return null
if (stream.writableEnded === true) return true
@@ -83,8 +78,9 @@ function isWritableEnded(stream) {
if (wState !== null && wState !== undefined && wState.errored) return false
if (typeof (wState === null || wState === undefined ? undefined : wState.ended) !== 'boolean') return null
return wState.ended
-} // Have emitted 'finish'.
+}
+// Have emitted 'finish'.
function isWritableFinished(stream, strict) {
if (!isWritableNodeStream(stream)) return null
if (stream.writableFinished === true) return true
@@ -92,8 +88,9 @@ function isWritableFinished(stream, strict) {
if (wState !== null && wState !== undefined && wState.errored) return false
if (typeof (wState === null || wState === undefined ? undefined : wState.finished) !== 'boolean') return null
return !!(wState.finished || (strict === false && wState.ended === true && wState.length === 0))
-} // Have been push(null):d.
+}
+// Have been push(null):d.
function isReadableEnded(stream) {
if (!isReadableNodeStream(stream)) return null
if (stream.readableEnded === true) return true
@@ -101,8 +98,9 @@ function isReadableEnded(stream) {
if (!rState || rState.errored) return false
if (typeof (rState === null || rState === undefined ? undefined : rState.ended) !== 'boolean') return null
return rState.ended
-} // Have emitted 'end'.
+}
+// Have emitted 'end'.
function isReadableFinished(stream, strict) {
if (!isReadableNodeStream(stream)) return null
const rState = stream._readableState
@@ -110,51 +108,40 @@ function isReadableFinished(stream, strict) {
if (typeof (rState === null || rState === undefined ? undefined : rState.endEmitted) !== 'boolean') return null
return !!(rState.endEmitted || (strict === false && rState.ended === true && rState.length === 0))
}
-
function isReadable(stream) {
if (stream && stream[kIsReadable] != null) return stream[kIsReadable]
if (typeof (stream === null || stream === undefined ? undefined : stream.readable) !== 'boolean') return null
if (isDestroyed(stream)) return false
return isReadableNodeStream(stream) && stream.readable && !isReadableFinished(stream)
}
-
function isWritable(stream) {
if (typeof (stream === null || stream === undefined ? undefined : stream.writable) !== 'boolean') return null
if (isDestroyed(stream)) return false
return isWritableNodeStream(stream) && stream.writable && !isWritableEnded(stream)
}
-
function isFinished(stream, opts) {
if (!isNodeStream(stream)) {
return null
}
-
if (isDestroyed(stream)) {
return true
}
-
if ((opts === null || opts === undefined ? undefined : opts.readable) !== false && isReadable(stream)) {
return false
}
-
if ((opts === null || opts === undefined ? undefined : opts.writable) !== false && isWritable(stream)) {
return false
}
-
return true
}
-
function isWritableErrored(stream) {
var _stream$_writableStat, _stream$_writableStat2
-
if (!isNodeStream(stream)) {
return null
}
-
if (stream.writableErrored) {
return stream.writableErrored
}
-
return (_stream$_writableStat =
(_stream$_writableStat2 = stream._writableState) === null || _stream$_writableStat2 === undefined
? undefined
@@ -162,18 +149,14 @@ function isWritableErrored(stream) {
? _stream$_writableStat
: null
}
-
function isReadableErrored(stream) {
var _stream$_readableStat, _stream$_readableStat2
-
if (!isNodeStream(stream)) {
return null
}
-
if (stream.readableErrored) {
return stream.readableErrored
}
-
return (_stream$_readableStat =
(_stream$_readableStat2 = stream._readableState) === null || _stream$_readableStat2 === undefined
? undefined
@@ -181,19 +164,15 @@ function isReadableErrored(stream) {
? _stream$_readableStat
: null
}
-
function isClosed(stream) {
if (!isNodeStream(stream)) {
return null
}
-
if (typeof stream.closed === 'boolean') {
return stream.closed
}
-
const wState = stream._writableState
const rState = stream._readableState
-
if (
typeof (wState === null || wState === undefined ? undefined : wState.closed) === 'boolean' ||
typeof (rState === null || rState === undefined ? undefined : rState.closed) === 'boolean'
@@ -203,14 +182,11 @@ function isClosed(stream) {
(rState === null || rState === undefined ? undefined : rState.closed)
)
}
-
if (typeof stream._closed === 'boolean' && isOutgoingMessage(stream)) {
return stream._closed
}
-
return null
}
-
function isOutgoingMessage(stream) {
return (
typeof stream._closed === 'boolean' &&
@@ -219,14 +195,11 @@ function isOutgoingMessage(stream) {
typeof stream._removedContLen === 'boolean'
)
}
-
function isServerResponse(stream) {
return typeof stream._sent100 === 'boolean' && isOutgoingMessage(stream)
}
-
function isServerRequest(stream) {
var _stream$req
-
return (
typeof stream._consuming === 'boolean' &&
typeof stream._dumped === 'boolean' &&
@@ -234,7 +207,6 @@ function isServerRequest(stream) {
undefined
)
}
-
function willEmitClose(stream) {
if (!isNodeStream(stream)) return null
const wState = stream._writableState
@@ -244,10 +216,8 @@ function willEmitClose(stream) {
(!state && isServerResponse(stream)) || !!(state && state.autoDestroy && state.emitClose && state.closed === false)
)
}
-
function isDisturbed(stream) {
var _stream$kIsDisturbed
-
return !!(
stream &&
((_stream$kIsDisturbed = stream[kIsDisturbed]) !== null && _stream$kIsDisturbed !== undefined
@@ -255,7 +225,6 @@ function isDisturbed(stream) {
: stream.readableDidRead || stream.readableAborted)
)
}
-
function isErrored(stream) {
var _ref,
_ref2,
@@ -267,7 +236,6 @@ function isErrored(stream) {
_stream$_writableStat3,
_stream$_readableStat4,
_stream$_writableStat4
-
return !!(
stream &&
((_ref =
@@ -298,7 +266,6 @@ function isErrored(stream) {
: _stream$_writableStat4.errored)
)
}
-
module.exports = {
kDestroyed,
isDisturbed,
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/writable.js b/deps/npm/node_modules/readable-stream/lib/internal/streams/writable.js
similarity index 85%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/writable.js
rename to deps/npm/node_modules/readable-stream/lib/internal/streams/writable.js
index 9b792e60d7fc8b..8a28003465766d 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/streams/writable.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/streams/writable.js
@@ -1,5 +1,7 @@
/* replacement start */
-const process = require('process')
+
+const process = require('process/')
+
/* replacement end */
// Copyright Joyent, Inc. and other Node contributors.
//
@@ -21,12 +23,12 @@ const process = require('process')
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
// A bit simpler than readable streams.
// Implement an async ._write(chunk, encoding, cb), and it'll handle all
// the drain event emission and buffering.
;('use strict')
-
const {
ArrayPrototypeSlice,
Error,
@@ -38,22 +40,14 @@ const {
Symbol,
SymbolHasInstance
} = require('../../ours/primordials')
-
module.exports = Writable
Writable.WritableState = WritableState
-
const { EventEmitter: EE } = require('events')
-
const Stream = require('./legacy').Stream
-
const { Buffer } = require('buffer')
-
const destroyImpl = require('./destroy')
-
const { addAbortSignal } = require('./add-abort-signal')
-
const { getHighWaterMark, getDefaultHighWaterMark } = require('./state')
-
const {
ERR_INVALID_ARG_TYPE,
ERR_METHOD_NOT_IMPLEMENTED,
@@ -65,142 +59,158 @@ const {
ERR_STREAM_WRITE_AFTER_END,
ERR_UNKNOWN_ENCODING
} = require('../../ours/errors').codes
-
const { errorOrDestroy } = destroyImpl
ObjectSetPrototypeOf(Writable.prototype, Stream.prototype)
ObjectSetPrototypeOf(Writable, Stream)
-
function nop() {}
-
const kOnFinished = Symbol('kOnFinished')
-
function WritableState(options, stream, isDuplex) {
// Duplex streams are both readable and writable, but share
// the same options object.
// However, some cases require setting options to different
// values for the readable and the writable sides of the duplex stream,
// e.g. options.readableObjectMode vs. options.writableObjectMode, etc.
- if (typeof isDuplex !== 'boolean') isDuplex = stream instanceof require('./duplex') // Object stream flag to indicate whether or not this stream
- // contains buffers or objects.
+ if (typeof isDuplex !== 'boolean') isDuplex = stream instanceof require('./duplex')
+ // Object stream flag to indicate whether or not this stream
+ // contains buffers or objects.
this.objectMode = !!(options && options.objectMode)
- if (isDuplex) this.objectMode = this.objectMode || !!(options && options.writableObjectMode) // The point at which write() starts returning false
+ if (isDuplex) this.objectMode = this.objectMode || !!(options && options.writableObjectMode)
+
+ // The point at which write() starts returning false
// Note: 0 is a valid value, means that we always return false if
// the entire buffer is not flushed immediately on write().
-
this.highWaterMark = options
? getHighWaterMark(this, options, 'writableHighWaterMark', isDuplex)
- : getDefaultHighWaterMark(false) // if _final has been called.
+ : getDefaultHighWaterMark(false)
- this.finalCalled = false // drain event flag.
+ // if _final has been called.
+ this.finalCalled = false
- this.needDrain = false // At the start of calling end()
+ // drain event flag.
+ this.needDrain = false
+ // At the start of calling end()
+ this.ending = false
+ // When end() has been called, and returned.
+ this.ended = false
+ // When 'finish' is emitted.
+ this.finished = false
- this.ending = false // When end() has been called, and returned.
+ // Has it been destroyed
+ this.destroyed = false
- this.ended = false // When 'finish' is emitted.
-
- this.finished = false // Has it been destroyed
-
- this.destroyed = false // Should we decode strings into buffers before passing to _write?
+ // Should we decode strings into buffers before passing to _write?
// this is here so that some node-core streams can optimize string
// handling at a lower level.
-
const noDecode = !!(options && options.decodeStrings === false)
- this.decodeStrings = !noDecode // Crypto is kind of old and crusty. Historically, its default string
+ this.decodeStrings = !noDecode
+
+ // Crypto is kind of old and crusty. Historically, its default string
// encoding is 'binary' so we have to make this configurable.
// Everything else in the universe uses 'utf8', though.
+ this.defaultEncoding = (options && options.defaultEncoding) || 'utf8'
- this.defaultEncoding = (options && options.defaultEncoding) || 'utf8' // Not an actual buffer we keep track of, but a measurement
+ // Not an actual buffer we keep track of, but a measurement
// of how much we're waiting to get pushed to some underlying
// socket or file.
+ this.length = 0
- this.length = 0 // A flag to see when we're in the middle of a write.
+ // A flag to see when we're in the middle of a write.
+ this.writing = false
- this.writing = false // When true all writes will be buffered until .uncork() call.
+ // When true all writes will be buffered until .uncork() call.
+ this.corked = 0
- this.corked = 0 // A flag to be able to tell if the onwrite cb is called immediately,
+ // A flag to be able to tell if the onwrite cb is called immediately,
// or on a later tick. We set this to true at first, because any
// actions that shouldn't happen until "later" should generally also
// not happen before the first write call.
+ this.sync = true
- this.sync = true // A flag to know if we're processing previously buffered items, which
+ // A flag to know if we're processing previously buffered items, which
// may call the _write() callback in the same tick, so that we don't
// end up in an overlapped onwrite situation.
+ this.bufferProcessing = false
- this.bufferProcessing = false // The callback that's passed to _write(chunk, cb).
+ // The callback that's passed to _write(chunk, cb).
+ this.onwrite = onwrite.bind(undefined, stream)
- this.onwrite = onwrite.bind(undefined, stream) // The callback that the user supplies to write(chunk, encoding, cb).
+ // The callback that the user supplies to write(chunk, encoding, cb).
+ this.writecb = null
- this.writecb = null // The amount that is being written when _write is called.
+ // The amount that is being written when _write is called.
+ this.writelen = 0
- this.writelen = 0 // Storage for data passed to the afterWrite() callback in case of
+ // Storage for data passed to the afterWrite() callback in case of
// synchronous _write() completion.
-
this.afterWriteTickInfo = null
- resetBuffer(this) // Number of pending user-supplied write callbacks
+ resetBuffer(this)
+
+ // Number of pending user-supplied write callbacks
// this must be 0 before 'finish' can be emitted.
+ this.pendingcb = 0
- this.pendingcb = 0 // Stream is still being constructed and cannot be
+ // Stream is still being constructed and cannot be
// destroyed until construction finished or failed.
// Async construction is opt in, therefore we start as
// constructed.
+ this.constructed = true
- this.constructed = true // Emit prefinish if the only thing we're waiting for is _write cbs
+ // Emit prefinish if the only thing we're waiting for is _write cbs
// This is relevant for synchronous Transform streams.
+ this.prefinished = false
- this.prefinished = false // True if the error was already emitted and should not be thrown again.
+ // True if the error was already emitted and should not be thrown again.
+ this.errorEmitted = false
- this.errorEmitted = false // Should close be emitted on destroy. Defaults to true.
+ // Should close be emitted on destroy. Defaults to true.
+ this.emitClose = !options || options.emitClose !== false
- this.emitClose = !options || options.emitClose !== false // Should .destroy() be called after 'finish' (and potentially 'end').
+ // Should .destroy() be called after 'finish' (and potentially 'end').
+ this.autoDestroy = !options || options.autoDestroy !== false
- this.autoDestroy = !options || options.autoDestroy !== false // Indicates whether the stream has errored. When true all write() calls
+ // Indicates whether the stream has errored. When true all write() calls
// should return false. This is needed since when autoDestroy
// is disabled we need a way to tell whether the stream has failed.
+ this.errored = null
- this.errored = null // Indicates whether the stream has finished destroying.
+ // Indicates whether the stream has finished destroying.
+ this.closed = false
- this.closed = false // True if close has been emitted or would have been emitted
+ // True if close has been emitted or would have been emitted
// depending on emitClose.
-
this.closeEmitted = false
this[kOnFinished] = []
}
-
function resetBuffer(state) {
state.buffered = []
state.bufferedIndex = 0
state.allBuffers = true
state.allNoop = true
}
-
WritableState.prototype.getBuffer = function getBuffer() {
return ArrayPrototypeSlice(this.buffered, this.bufferedIndex)
}
-
ObjectDefineProperty(WritableState.prototype, 'bufferedRequestCount', {
__proto__: null,
-
get() {
return this.buffered.length - this.bufferedIndex
}
})
-
function Writable(options) {
// Writable ctor is applied to Duplexes, too.
// `realHasInstance` is necessary because using plain `instanceof`
// would return false, as no `_writableState` property is attached.
+
// Trying to use the custom `instanceof` for Writable here will also break the
// Node.js LazyTransform implementation, which has a non-trivial getter for
// `_writableState` that would lead to infinite recursion.
+
// Checking for a Stream.Duplex instance is faster here instead of inside
// the WritableState constructor, at least with V8 6.5.
const isDuplex = this instanceof require('./duplex')
-
if (!isDuplex && !FunctionPrototypeSymbolHasInstance(Writable, this)) return new Writable(options)
this._writableState = new WritableState(options, this, isDuplex)
-
if (options) {
if (typeof options.write === 'function') this._write = options.write
if (typeof options.writev === 'function') this._writev = options.writev
@@ -209,19 +219,15 @@ function Writable(options) {
if (typeof options.construct === 'function') this._construct = options.construct
if (options.signal) addAbortSignal(options.signal, this)
}
-
Stream.call(this, options)
destroyImpl.construct(this, () => {
const state = this._writableState
-
if (!state.writing) {
clearBuffer(this, state)
}
-
finishMaybe(this, state)
})
}
-
ObjectDefineProperty(Writable, SymbolHasInstance, {
__proto__: null,
value: function (object) {
@@ -229,15 +235,14 @@ ObjectDefineProperty(Writable, SymbolHasInstance, {
if (this !== Writable) return false
return object && object._writableState instanceof WritableState
}
-}) // Otherwise people can pipe Writable streams, which is just wrong.
+})
+// Otherwise people can pipe Writable streams, which is just wrong.
Writable.prototype.pipe = function () {
errorOrDestroy(this, new ERR_STREAM_CANNOT_PIPE())
}
-
function _write(stream, chunk, encoding, cb) {
const state = stream._writableState
-
if (typeof encoding === 'function') {
cb = encoding
encoding = state.defaultEncoding
@@ -246,7 +251,6 @@ function _write(stream, chunk, encoding, cb) {
else if (encoding !== 'buffer' && !Buffer.isEncoding(encoding)) throw new ERR_UNKNOWN_ENCODING(encoding)
if (typeof cb !== 'function') cb = nop
}
-
if (chunk === null) {
throw new ERR_STREAM_NULL_VALUES()
} else if (!state.objectMode) {
@@ -264,71 +268,61 @@ function _write(stream, chunk, encoding, cb) {
throw new ERR_INVALID_ARG_TYPE('chunk', ['string', 'Buffer', 'Uint8Array'], chunk)
}
}
-
let err
-
if (state.ending) {
err = new ERR_STREAM_WRITE_AFTER_END()
} else if (state.destroyed) {
err = new ERR_STREAM_DESTROYED('write')
}
-
if (err) {
process.nextTick(cb, err)
errorOrDestroy(stream, err, true)
return err
}
-
state.pendingcb++
return writeOrBuffer(stream, state, chunk, encoding, cb)
}
-
Writable.prototype.write = function (chunk, encoding, cb) {
return _write(this, chunk, encoding, cb) === true
}
-
Writable.prototype.cork = function () {
this._writableState.corked++
}
-
Writable.prototype.uncork = function () {
const state = this._writableState
-
if (state.corked) {
state.corked--
if (!state.writing) clearBuffer(this, state)
}
}
-
Writable.prototype.setDefaultEncoding = function setDefaultEncoding(encoding) {
// node::ParseEncoding() requires lower case.
if (typeof encoding === 'string') encoding = StringPrototypeToLowerCase(encoding)
if (!Buffer.isEncoding(encoding)) throw new ERR_UNKNOWN_ENCODING(encoding)
this._writableState.defaultEncoding = encoding
return this
-} // If we're already writing something, then just put this
+}
+
+// If we're already writing something, then just put this
// in the queue, and wait our turn. Otherwise, call _write
// If we return false, then we need a drain event, so set that flag.
-
function writeOrBuffer(stream, state, chunk, encoding, callback) {
const len = state.objectMode ? 1 : chunk.length
- state.length += len // stream._write resets state.length
-
- const ret = state.length < state.highWaterMark // We must ensure that previous needDrain will not be reset to false.
+ state.length += len
+ // stream._write resets state.length
+ const ret = state.length < state.highWaterMark
+ // We must ensure that previous needDrain will not be reset to false.
if (!ret) state.needDrain = true
-
if (state.writing || state.corked || state.errored || !state.constructed) {
state.buffered.push({
chunk,
encoding,
callback
})
-
if (state.allBuffers && encoding !== 'buffer') {
state.allBuffers = false
}
-
if (state.allNoop && callback !== nop) {
state.allNoop = false
}
@@ -337,16 +331,14 @@ function writeOrBuffer(stream, state, chunk, encoding, callback) {
state.writecb = callback
state.writing = true
state.sync = true
-
stream._write(chunk, encoding, state.onwrite)
-
state.sync = false
- } // Return false if errored or destroyed in order to break
- // any synchronous while(stream.write(data)) loops.
+ }
+ // Return false if errored or destroyed in order to break
+ // any synchronous while(stream.write(data)) loops.
return ret && !state.errored && !state.destroyed
}
-
function doWrite(stream, state, writev, len, chunk, encoding, cb) {
state.writelen = len
state.writecb = cb
@@ -357,47 +349,42 @@ function doWrite(stream, state, writev, len, chunk, encoding, cb) {
else stream._write(chunk, encoding, state.onwrite)
state.sync = false
}
-
function onwriteError(stream, state, er, cb) {
--state.pendingcb
- cb(er) // Ensure callbacks are invoked even when autoDestroy is
+ cb(er)
+ // Ensure callbacks are invoked even when autoDestroy is
// not enabled. Passing `er` here doesn't make sense since
// it's related to one specific write, not to the buffered
// writes.
-
- errorBuffer(state) // This can emit error, but error must always follow cb.
-
+ errorBuffer(state)
+ // This can emit error, but error must always follow cb.
errorOrDestroy(stream, er)
}
-
function onwrite(stream, er) {
const state = stream._writableState
const sync = state.sync
const cb = state.writecb
-
if (typeof cb !== 'function') {
errorOrDestroy(stream, new ERR_MULTIPLE_CALLBACK())
return
}
-
state.writing = false
state.writecb = null
state.length -= state.writelen
state.writelen = 0
-
if (er) {
// Avoid V8 leak, https://github.com/nodejs/node/pull/34103#issuecomment-652002364
er.stack // eslint-disable-line no-unused-expressions
if (!state.errored) {
state.errored = er
- } // In case of duplex streams we need to notify the readable side of the
- // error.
+ }
+ // In case of duplex streams we need to notify the readable side of the
+ // error.
if (stream._readableState && !stream._readableState.errored) {
stream._readableState.errored = er
}
-
if (sync) {
process.nextTick(onwriteError, stream, state, er, cb)
} else {
@@ -407,7 +394,6 @@ function onwrite(stream, er) {
if (state.buffered.length > state.bufferedIndex) {
clearBuffer(stream, state)
}
-
if (sync) {
// It is a common case that the callback passed to .write() is always
// the same. In that case, we do not schedule a new nextTick(), but
@@ -429,40 +415,33 @@ function onwrite(stream, er) {
}
}
}
-
function afterWriteTick({ stream, state, count, cb }) {
state.afterWriteTickInfo = null
return afterWrite(stream, state, count, cb)
}
-
function afterWrite(stream, state, count, cb) {
const needDrain = !state.ending && !stream.destroyed && state.length === 0 && state.needDrain
-
if (needDrain) {
state.needDrain = false
stream.emit('drain')
}
-
while (count-- > 0) {
state.pendingcb--
cb()
}
-
if (state.destroyed) {
errorBuffer(state)
}
-
finishMaybe(stream, state)
-} // If there's something in the buffer waiting, then invoke callbacks.
+}
+// If there's something in the buffer waiting, then invoke callbacks.
function errorBuffer(state) {
if (state.writing) {
return
}
-
for (let n = state.bufferedIndex; n < state.buffered.length; ++n) {
var _state$errored
-
const { chunk, callback } = state.buffered[n]
const len = state.objectMode ? 1 : chunk.length
state.length -= len
@@ -472,37 +451,30 @@ function errorBuffer(state) {
: new ERR_STREAM_DESTROYED('write')
)
}
-
const onfinishCallbacks = state[kOnFinished].splice(0)
-
for (let i = 0; i < onfinishCallbacks.length; i++) {
var _state$errored2
-
onfinishCallbacks[i](
(_state$errored2 = state.errored) !== null && _state$errored2 !== undefined
? _state$errored2
: new ERR_STREAM_DESTROYED('end')
)
}
-
resetBuffer(state)
-} // If there's something in the buffer waiting, then process it.
+}
+// If there's something in the buffer waiting, then process it.
function clearBuffer(stream, state) {
if (state.corked || state.bufferProcessing || state.destroyed || !state.constructed) {
return
}
-
const { buffered, bufferedIndex, objectMode } = state
const bufferedLength = buffered.length - bufferedIndex
-
if (!bufferedLength) {
return
}
-
let i = bufferedIndex
state.bufferProcessing = true
-
if (bufferedLength > 1 && stream._writev) {
state.pendingcb -= bufferedLength - 1
const callback = state.allNoop
@@ -511,9 +483,9 @@ function clearBuffer(stream, state) {
for (let n = i; n < buffered.length; ++n) {
buffered[n].callback(err)
}
- } // Make a copy of `buffered` if it's going to be used by `callback` above,
+ }
+ // Make a copy of `buffered` if it's going to be used by `callback` above,
// since `doWrite` will mutate the array.
-
const chunks = state.allNoop && i === 0 ? buffered : ArrayPrototypeSlice(buffered, i)
chunks.allBuffers = state.allBuffers
doWrite(stream, state, true, state.length, chunks, '', callback)
@@ -525,7 +497,6 @@ function clearBuffer(stream, state) {
const len = objectMode ? 1 : chunk.length
doWrite(stream, state, false, len, chunk, encoding, callback)
} while (i < buffered.length && !state.writing)
-
if (i === buffered.length) {
resetBuffer(state)
} else if (i > 256) {
@@ -535,10 +506,8 @@ function clearBuffer(stream, state) {
state.bufferedIndex = i
}
}
-
state.bufferProcessing = false
}
-
Writable.prototype._write = function (chunk, encoding, cb) {
if (this._writev) {
this._writev(
@@ -554,12 +523,9 @@ Writable.prototype._write = function (chunk, encoding, cb) {
throw new ERR_METHOD_NOT_IMPLEMENTED('_write()')
}
}
-
Writable.prototype._writev = null
-
Writable.prototype.end = function (chunk, encoding, cb) {
const state = this._writableState
-
if (typeof chunk === 'function') {
cb = chunk
chunk = null
@@ -568,22 +534,19 @@ Writable.prototype.end = function (chunk, encoding, cb) {
cb = encoding
encoding = null
}
-
let err
-
if (chunk !== null && chunk !== undefined) {
const ret = _write(this, chunk, encoding)
-
if (ret instanceof Error) {
err = ret
}
- } // .end() fully uncorks.
+ }
+ // .end() fully uncorks.
if (state.corked) {
state.corked = 1
this.uncork()
}
-
if (err) {
// Do nothing...
} else if (!state.errored && !state.ending) {
@@ -592,6 +555,7 @@ Writable.prototype.end = function (chunk, encoding, cb) {
// hard error can be disproportionately destructive. It is not always
// trivial for the user to determine whether end() needs to be called
// or not.
+
state.ending = true
finishMaybe(this, state, true)
state.ended = true
@@ -600,7 +564,6 @@ Writable.prototype.end = function (chunk, encoding, cb) {
} else if (state.destroyed) {
err = new ERR_STREAM_DESTROYED('end')
}
-
if (typeof cb === 'function') {
if (err || state.finished) {
process.nextTick(cb, err)
@@ -608,10 +571,8 @@ Writable.prototype.end = function (chunk, encoding, cb) {
state[kOnFinished].push(cb)
}
}
-
return this
}
-
function needFinish(state) {
return (
state.ending &&
@@ -626,50 +587,40 @@ function needFinish(state) {
!state.closeEmitted
)
}
-
function callFinal(stream, state) {
let called = false
-
function onFinish(err) {
if (called) {
errorOrDestroy(stream, err !== null && err !== undefined ? err : ERR_MULTIPLE_CALLBACK())
return
}
-
called = true
state.pendingcb--
-
if (err) {
const onfinishCallbacks = state[kOnFinished].splice(0)
-
for (let i = 0; i < onfinishCallbacks.length; i++) {
onfinishCallbacks[i](err)
}
-
errorOrDestroy(stream, err, state.sync)
} else if (needFinish(state)) {
state.prefinished = true
- stream.emit('prefinish') // Backwards compat. Don't check state.sync here.
+ stream.emit('prefinish')
+ // Backwards compat. Don't check state.sync here.
// Some streams assume 'finish' will be emitted
// asynchronously relative to _final callback.
-
state.pendingcb++
process.nextTick(finish, stream, state)
}
}
-
state.sync = true
state.pendingcb++
-
try {
stream._final(onFinish)
} catch (err) {
onFinish(err)
}
-
state.sync = false
}
-
function prefinish(stream, state) {
if (!state.prefinished && !state.finalCalled) {
if (typeof stream._final === 'function' && !state.destroyed) {
@@ -681,11 +632,9 @@ function prefinish(stream, state) {
}
}
}
-
function finishMaybe(stream, state, sync) {
if (needFinish(state)) {
prefinish(stream, state)
-
if (state.pendingcb === 0) {
if (sync) {
state.pendingcb++
@@ -707,49 +656,41 @@ function finishMaybe(stream, state, sync) {
}
}
}
-
function finish(stream, state) {
state.pendingcb--
state.finished = true
const onfinishCallbacks = state[kOnFinished].splice(0)
-
for (let i = 0; i < onfinishCallbacks.length; i++) {
onfinishCallbacks[i]()
}
-
stream.emit('finish')
-
if (state.autoDestroy) {
// In case of duplex streams we need a way to detect
// if the readable side is ready for autoDestroy as well.
const rState = stream._readableState
const autoDestroy =
!rState ||
- (rState.autoDestroy && // We don't expect the readable to ever 'end'
+ (rState.autoDestroy &&
+ // We don't expect the readable to ever 'end'
// if readable is explicitly set to false.
(rState.endEmitted || rState.readable === false))
-
if (autoDestroy) {
stream.destroy()
}
}
}
-
ObjectDefineProperties(Writable.prototype, {
closed: {
__proto__: null,
-
get() {
return this._writableState ? this._writableState.closed : false
}
},
destroyed: {
__proto__: null,
-
get() {
return this._writableState ? this._writableState.destroyed : false
},
-
set(value) {
// Backward compatibility, the user is explicitly managing destroyed.
if (this._writableState) {
@@ -759,16 +700,14 @@ ObjectDefineProperties(Writable.prototype, {
},
writable: {
__proto__: null,
-
get() {
- const w = this._writableState // w.writable === false means that this is part of a Duplex stream
+ const w = this._writableState
+ // w.writable === false means that this is part of a Duplex stream
// where the writable side was disabled upon construction.
// Compat. The user might manually disable writable side through
// deprecated setter.
-
return !!w && w.writable !== false && !w.destroyed && !w.errored && !w.ending && !w.ended
},
-
set(val) {
// Backwards compatible.
if (this._writableState) {
@@ -778,35 +717,30 @@ ObjectDefineProperties(Writable.prototype, {
},
writableFinished: {
__proto__: null,
-
get() {
return this._writableState ? this._writableState.finished : false
}
},
writableObjectMode: {
__proto__: null,
-
get() {
return this._writableState ? this._writableState.objectMode : false
}
},
writableBuffer: {
__proto__: null,
-
get() {
return this._writableState && this._writableState.getBuffer()
}
},
writableEnded: {
__proto__: null,
-
get() {
return this._writableState ? this._writableState.ending : false
}
},
writableNeedDrain: {
__proto__: null,
-
get() {
const wState = this._writableState
if (!wState) return false
@@ -815,21 +749,18 @@ ObjectDefineProperties(Writable.prototype, {
},
writableHighWaterMark: {
__proto__: null,
-
get() {
return this._writableState && this._writableState.highWaterMark
}
},
writableCorked: {
__proto__: null,
-
get() {
return this._writableState ? this._writableState.corked : 0
}
},
writableLength: {
__proto__: null,
-
get() {
return this._writableState && this._writableState.length
}
@@ -837,7 +768,6 @@ ObjectDefineProperties(Writable.prototype, {
errored: {
__proto__: null,
enumerable: false,
-
get() {
return this._writableState ? this._writableState.errored : null
}
@@ -855,39 +785,33 @@ ObjectDefineProperties(Writable.prototype, {
}
})
const destroy = destroyImpl.destroy
-
Writable.prototype.destroy = function (err, cb) {
- const state = this._writableState // Invoke pending callbacks.
+ const state = this._writableState
+ // Invoke pending callbacks.
if (!state.destroyed && (state.bufferedIndex < state.buffered.length || state[kOnFinished].length)) {
process.nextTick(errorBuffer, state)
}
-
destroy.call(this, err, cb)
return this
}
-
Writable.prototype._undestroy = destroyImpl.undestroy
-
Writable.prototype._destroy = function (err, cb) {
cb(err)
}
-
Writable.prototype[EE.captureRejectionSymbol] = function (err) {
this.destroy(err)
}
+let webStreamsAdapters
-let webStreamsAdapters // Lazy to avoid circular references
-
+// Lazy to avoid circular references
function lazyWebStreams() {
if (webStreamsAdapters === undefined) webStreamsAdapters = {}
return webStreamsAdapters
}
-
Writable.fromWeb = function (writableStream, options) {
return lazyWebStreams().newStreamWritableFromWritableStream(writableStream, options)
}
-
Writable.toWeb = function (streamWritable) {
return lazyWebStreams().newWritableStreamFromStreamWritable(streamWritable)
}
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/validators.js b/deps/npm/node_modules/readable-stream/lib/internal/validators.js
similarity index 98%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/validators.js
rename to deps/npm/node_modules/readable-stream/lib/internal/validators.js
index 3225949f38f943..f9e6e555971a1b 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/internal/validators.js
+++ b/deps/npm/node_modules/readable-stream/lib/internal/validators.js
@@ -16,36 +16,32 @@ const {
StringPrototypeToUpperCase,
StringPrototypeTrim
} = require('../ours/primordials')
-
const {
hideStackFrames,
codes: { ERR_SOCKET_BAD_PORT, ERR_INVALID_ARG_TYPE, ERR_INVALID_ARG_VALUE, ERR_OUT_OF_RANGE, ERR_UNKNOWN_SIGNAL }
} = require('../ours/errors')
-
const { normalizeEncoding } = require('../ours/util')
-
const { isAsyncFunction, isArrayBufferView } = require('../ours/util').types
-
const signals = {}
+
/**
* @param {*} value
* @returns {boolean}
*/
-
function isInt32(value) {
return value === (value | 0)
}
+
/**
* @param {*} value
* @returns {boolean}
*/
-
function isUint32(value) {
return value === value >>> 0
}
-
const octalReg = /^[0-7]+$/
const modeDesc = 'must be a 32-bit unsigned integer or an octal string'
+
/**
* Parse and validate values that will be converted into mode_t (the S_*
* constants). Only valid numbers and octal strings are allowed. They could be
@@ -58,23 +54,20 @@ const modeDesc = 'must be a 32-bit unsigned integer or an octal string'
* @param {number} [def] If specified, will be returned for invalid values
* @returns {number}
*/
-
function parseFileMode(value, name, def) {
if (typeof value === 'undefined') {
value = def
}
-
if (typeof value === 'string') {
if (RegExpPrototypeExec(octalReg, value) === null) {
throw new ERR_INVALID_ARG_VALUE(name, value, modeDesc)
}
-
value = NumberParseInt(value, 8)
}
-
validateUint32(value, name)
return value
}
+
/**
* @callback validateInteger
* @param {*} value
@@ -85,12 +78,12 @@ function parseFileMode(value, name, def) {
*/
/** @type {validateInteger} */
-
const validateInteger = hideStackFrames((value, name, min = NumberMIN_SAFE_INTEGER, max = NumberMAX_SAFE_INTEGER) => {
if (typeof value !== 'number') throw new ERR_INVALID_ARG_TYPE(name, 'number', value)
if (!NumberIsInteger(value)) throw new ERR_OUT_OF_RANGE(name, 'an integer', value)
if (value < min || value > max) throw new ERR_OUT_OF_RANGE(name, `>= ${min} && <= ${max}`, value)
})
+
/**
* @callback validateInt32
* @param {*} value
@@ -101,21 +94,19 @@ const validateInteger = hideStackFrames((value, name, min = NumberMIN_SAFE_INTEG
*/
/** @type {validateInt32} */
-
const validateInt32 = hideStackFrames((value, name, min = -2147483648, max = 2147483647) => {
// The defaults for min and max correspond to the limits of 32-bit integers.
if (typeof value !== 'number') {
throw new ERR_INVALID_ARG_TYPE(name, 'number', value)
}
-
if (!NumberIsInteger(value)) {
throw new ERR_OUT_OF_RANGE(name, 'an integer', value)
}
-
if (value < min || value > max) {
throw new ERR_OUT_OF_RANGE(name, `>= ${min} && <= ${max}`, value)
}
})
+
/**
* @callback validateUint32
* @param {*} value
@@ -125,24 +116,21 @@ const validateInt32 = hideStackFrames((value, name, min = -2147483648, max = 214
*/
/** @type {validateUint32} */
-
const validateUint32 = hideStackFrames((value, name, positive = false) => {
if (typeof value !== 'number') {
throw new ERR_INVALID_ARG_TYPE(name, 'number', value)
}
-
if (!NumberIsInteger(value)) {
throw new ERR_OUT_OF_RANGE(name, 'an integer', value)
}
-
- const min = positive ? 1 : 0 // 2 ** 32 === 4294967296
-
- const max = 4_294_967_295
-
+ const min = positive ? 1 : 0
+ // 2 ** 32 === 4294967296
+ const max = 4294967295
if (value < min || value > max) {
throw new ERR_OUT_OF_RANGE(name, `>= ${min} && <= ${max}`, value)
}
})
+
/**
* @callback validateString
* @param {*} value
@@ -151,10 +139,10 @@ const validateUint32 = hideStackFrames((value, name, positive = false) => {
*/
/** @type {validateString} */
-
function validateString(value, name) {
if (typeof value !== 'string') throw new ERR_INVALID_ARG_TYPE(name, 'string', value)
}
+
/**
* @callback validateNumber
* @param {*} value
@@ -165,10 +153,8 @@ function validateString(value, name) {
*/
/** @type {validateNumber} */
-
function validateNumber(value, name, min = undefined, max) {
if (typeof value !== 'number') throw new ERR_INVALID_ARG_TYPE(name, 'number', value)
-
if (
(min != null && value < min) ||
(max != null && value > max) ||
@@ -181,6 +167,7 @@ function validateNumber(value, name, min = undefined, max) {
)
}
}
+
/**
* @callback validateOneOf
* @template T
@@ -190,7 +177,6 @@ function validateNumber(value, name, min = undefined, max) {
*/
/** @type {validateOneOf} */
-
const validateOneOf = hideStackFrames((value, name, oneOf) => {
if (!ArrayPrototypeIncludes(oneOf, value)) {
const allowed = ArrayPrototypeJoin(
@@ -201,6 +187,7 @@ const validateOneOf = hideStackFrames((value, name, oneOf) => {
throw new ERR_INVALID_ARG_VALUE(name, value, reason)
}
})
+
/**
* @callback validateBoolean
* @param {*} value
@@ -209,14 +196,13 @@ const validateOneOf = hideStackFrames((value, name, oneOf) => {
*/
/** @type {validateBoolean} */
-
function validateBoolean(value, name) {
if (typeof value !== 'boolean') throw new ERR_INVALID_ARG_TYPE(name, 'boolean', value)
}
-
function getOwnPropertyValueOrDefault(options, key, defaultValue) {
return options == null || !ObjectPrototypeHasOwnProperty(options, key) ? defaultValue : options[key]
}
+
/**
* @callback validateObject
* @param {*} value
@@ -229,12 +215,10 @@ function getOwnPropertyValueOrDefault(options, key, defaultValue) {
*/
/** @type {validateObject} */
-
const validateObject = hideStackFrames((value, name, options = null) => {
const allowArray = getOwnPropertyValueOrDefault(options, 'allowArray', false)
const allowFunction = getOwnPropertyValueOrDefault(options, 'allowFunction', false)
const nullable = getOwnPropertyValueOrDefault(options, 'nullable', false)
-
if (
(!nullable && value === null) ||
(!allowArray && ArrayIsArray(value)) ||
@@ -243,6 +227,7 @@ const validateObject = hideStackFrames((value, name, options = null) => {
throw new ERR_INVALID_ARG_TYPE(name, 'Object', value)
}
})
+
/**
* @callback validateArray
* @param {*} value
@@ -252,35 +237,32 @@ const validateObject = hideStackFrames((value, name, options = null) => {
*/
/** @type {validateArray} */
-
const validateArray = hideStackFrames((value, name, minLength = 0) => {
if (!ArrayIsArray(value)) {
throw new ERR_INVALID_ARG_TYPE(name, 'Array', value)
}
-
if (value.length < minLength) {
const reason = `must be longer than ${minLength}`
throw new ERR_INVALID_ARG_VALUE(name, value, reason)
}
-}) // eslint-disable-next-line jsdoc/require-returns-check
+})
+// eslint-disable-next-line jsdoc/require-returns-check
/**
* @param {*} signal
* @param {string} [name='signal']
* @returns {asserts signal is keyof signals}
*/
-
function validateSignalName(signal, name = 'signal') {
validateString(signal, name)
-
if (signals[signal] === undefined) {
if (signals[StringPrototypeToUpperCase(signal)] !== undefined) {
throw new ERR_UNKNOWN_SIGNAL(signal + ' (signals must use all capital letters)')
}
-
throw new ERR_UNKNOWN_SIGNAL(signal)
}
}
+
/**
* @callback validateBuffer
* @param {*} buffer
@@ -289,25 +271,24 @@ function validateSignalName(signal, name = 'signal') {
*/
/** @type {validateBuffer} */
-
const validateBuffer = hideStackFrames((buffer, name = 'buffer') => {
if (!isArrayBufferView(buffer)) {
throw new ERR_INVALID_ARG_TYPE(name, ['Buffer', 'TypedArray', 'DataView'], buffer)
}
})
+
/**
* @param {string} data
* @param {string} encoding
*/
-
function validateEncoding(data, encoding) {
const normalizedEncoding = normalizeEncoding(encoding)
const length = data.length
-
if (normalizedEncoding === 'hex' && length % 2 !== 0) {
throw new ERR_INVALID_ARG_VALUE('encoding', encoding, `is invalid for data of length ${length}`)
}
}
+
/**
* Check that the port number is not NaN when coerced to a number,
* is an integer and that it falls within the legal range of port numbers.
@@ -316,7 +297,6 @@ function validateEncoding(data, encoding) {
* @param {boolean} [allowZero=true]
* @returns {number}
*/
-
function validatePort(port, name = 'Port', allowZero = true) {
if (
(typeof port !== 'number' && typeof port !== 'string') ||
@@ -327,9 +307,9 @@ function validatePort(port, name = 'Port', allowZero = true) {
) {
throw new ERR_SOCKET_BAD_PORT(name, port, allowZero)
}
-
return port | 0
}
+
/**
* @callback validateAbortSignal
* @param {*} signal
@@ -337,12 +317,12 @@ function validatePort(port, name = 'Port', allowZero = true) {
*/
/** @type {validateAbortSignal} */
-
const validateAbortSignal = hideStackFrames((signal, name) => {
if (signal !== undefined && (signal === null || typeof signal !== 'object' || !('aborted' in signal))) {
throw new ERR_INVALID_ARG_TYPE(name, 'AbortSignal', signal)
}
})
+
/**
* @callback validateFunction
* @param {*} value
@@ -351,10 +331,10 @@ const validateAbortSignal = hideStackFrames((signal, name) => {
*/
/** @type {validateFunction} */
-
const validateFunction = hideStackFrames((value, name) => {
if (typeof value !== 'function') throw new ERR_INVALID_ARG_TYPE(name, 'Function', value)
})
+
/**
* @callback validatePlainFunction
* @param {*} value
@@ -363,10 +343,10 @@ const validateFunction = hideStackFrames((value, name) => {
*/
/** @type {validatePlainFunction} */
-
const validatePlainFunction = hideStackFrames((value, name) => {
if (typeof value !== 'function' || isAsyncFunction(value)) throw new ERR_INVALID_ARG_TYPE(name, 'Function', value)
})
+
/**
* @callback validateUndefined
* @param {*} value
@@ -375,23 +355,21 @@ const validatePlainFunction = hideStackFrames((value, name) => {
*/
/** @type {validateUndefined} */
-
const validateUndefined = hideStackFrames((value, name) => {
if (value !== undefined) throw new ERR_INVALID_ARG_TYPE(name, 'undefined', value)
})
+
/**
* @template T
* @param {T} value
* @param {string} name
* @param {T[]} union
*/
-
function validateUnion(value, name, union) {
if (!ArrayPrototypeIncludes(union, value)) {
throw new ERR_INVALID_ARG_TYPE(name, `('${ArrayPrototypeJoin(union, '|')}')`, value)
}
}
-
module.exports = {
isInt32,
isUint32,
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/browser.js b/deps/npm/node_modules/readable-stream/lib/ours/browser.js
similarity index 87%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/browser.js
rename to deps/npm/node_modules/readable-stream/lib/ours/browser.js
index 7083fb31e59723..39acef3d7d9f69 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/browser.js
+++ b/deps/npm/node_modules/readable-stream/lib/ours/browser.js
@@ -1,12 +1,11 @@
'use strict'
const CustomStream = require('../stream')
-
const promises = require('../stream/promises')
-
const originalDestroy = CustomStream.Readable.destroy
-module.exports = CustomStream.Readable // Explicit export naming is needed for ESM
+module.exports = CustomStream.Readable
+// Explicit export naming is needed for ESM
module.exports._uint8ArrayToBuffer = CustomStream._uint8ArrayToBuffer
module.exports._isUint8Array = CustomStream._isUint8Array
module.exports.isDisturbed = CustomStream.isDisturbed
@@ -26,11 +25,11 @@ module.exports.compose = CustomStream.compose
Object.defineProperty(CustomStream, 'promises', {
configurable: true,
enumerable: true,
-
get() {
return promises
}
})
-module.exports.Stream = CustomStream.Stream // Allow default importing
+module.exports.Stream = CustomStream.Stream
+// Allow default importing
module.exports.default = module.exports
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/errors.js b/deps/npm/node_modules/readable-stream/lib/ours/errors.js
similarity index 96%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/errors.js
rename to deps/npm/node_modules/readable-stream/lib/ours/errors.js
index 7fd9a97c94ca29..97866d14f5351d 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/errors.js
+++ b/deps/npm/node_modules/readable-stream/lib/ours/errors.js
@@ -1,6 +1,7 @@
'use strict'
const { format, inspect, AggregateError: CustomAggregateError } = require('./util')
+
/*
This file is a reduced and adapted version of the main lib/internal/errors.js file defined at
@@ -16,7 +17,8 @@ const kTypes = [
'string',
'function',
'number',
- 'object', // Accept 'Function' and 'Object' as alternative to the lower cased version.
+ 'object',
+ // Accept 'Function' and 'Object' as alternative to the lower cased version.
'Function',
'Object',
'boolean',
@@ -26,62 +28,53 @@ const kTypes = [
const classRegExp = /^([A-Z][a-z0-9]*)+$/
const nodeInternalPrefix = '__node_internal_'
const codes = {}
-
function assert(value, message) {
if (!value) {
throw new codes.ERR_INTERNAL_ASSERTION(message)
}
-} // Only use this for integers! Decimal numbers do not work with this function.
+}
+// Only use this for integers! Decimal numbers do not work with this function.
function addNumericalSeparator(val) {
let res = ''
let i = val.length
const start = val[0] === '-' ? 1 : 0
-
for (; i >= start + 4; i -= 3) {
res = `_${val.slice(i - 3, i)}${res}`
}
-
return `${val.slice(0, i)}${res}`
}
-
function getMessage(key, msg, args) {
if (typeof msg === 'function') {
assert(
- msg.length <= args.length, // Default options do not count.
+ msg.length <= args.length,
+ // Default options do not count.
`Code: ${key}; The provided arguments length (${args.length}) does not match the required ones (${msg.length}).`
)
return msg(...args)
}
-
const expectedLength = (msg.match(/%[dfijoOs]/g) || []).length
assert(
expectedLength === args.length,
`Code: ${key}; The provided arguments length (${args.length}) does not match the required ones (${expectedLength}).`
)
-
if (args.length === 0) {
return msg
}
-
return format(msg, ...args)
}
-
function E(code, message, Base) {
if (!Base) {
Base = Error
}
-
class NodeError extends Base {
constructor(...args) {
super(getMessage(code, message, args))
}
-
toString() {
return `${this.name} [${code}]: ${this.message}`
}
}
-
Object.defineProperties(NodeError.prototype, {
name: {
value: Base.name,
@@ -93,7 +86,6 @@ function E(code, message, Base) {
value() {
return `${this.name} [${code}]: ${this.message}`
},
-
writable: true,
enumerable: false,
configurable: true
@@ -103,7 +95,6 @@ function E(code, message, Base) {
NodeError.prototype[kIsNodeError] = true
codes[code] = NodeError
}
-
function hideStackFrames(fn) {
// We rename the functions that will be hidden to cut off the stacktrace
// at the outermost one
@@ -113,7 +104,6 @@ function hideStackFrames(fn) {
})
return fn
}
-
function aggregateTwoErrors(innerError, outerError) {
if (innerError && outerError && innerError !== outerError) {
if (Array.isArray(outerError.errors)) {
@@ -121,54 +111,43 @@ function aggregateTwoErrors(innerError, outerError) {
outerError.errors.push(innerError)
return outerError
}
-
const err = new AggregateError([outerError, innerError], outerError.message)
err.code = outerError.code
return err
}
-
return innerError || outerError
}
-
class AbortError extends Error {
constructor(message = 'The operation was aborted', options = undefined) {
if (options !== undefined && typeof options !== 'object') {
throw new codes.ERR_INVALID_ARG_TYPE('options', 'Object', options)
}
-
super(message, options)
this.code = 'ABORT_ERR'
this.name = 'AbortError'
}
}
-
E('ERR_ASSERTION', '%s', Error)
E(
'ERR_INVALID_ARG_TYPE',
(name, expected, actual) => {
assert(typeof name === 'string', "'name' must be a string")
-
if (!Array.isArray(expected)) {
expected = [expected]
}
-
let msg = 'The '
-
if (name.endsWith(' argument')) {
// For cases like 'first argument'
msg += `${name} `
} else {
msg += `"${name}" ${name.includes('.') ? 'property' : 'argument'} `
}
-
msg += 'must be '
const types = []
const instances = []
const other = []
-
for (const value of expected) {
assert(typeof value === 'string', 'All expected entries have to be of type string')
-
if (kTypes.includes(value)) {
types.push(value.toLowerCase())
} else if (classRegExp.test(value)) {
@@ -177,89 +156,74 @@ E(
assert(value !== 'object', 'The value "object" should be written as "Object"')
other.push(value)
}
- } // Special handle `object` in case other instances are allowed to outline
- // the differences between each other.
+ }
+ // Special handle `object` in case other instances are allowed to outline
+ // the differences between each other.
if (instances.length > 0) {
const pos = types.indexOf('object')
-
if (pos !== -1) {
types.splice(types, pos, 1)
instances.push('Object')
}
}
-
if (types.length > 0) {
switch (types.length) {
case 1:
msg += `of type ${types[0]}`
break
-
case 2:
msg += `one of type ${types[0]} or ${types[1]}`
break
-
default: {
const last = types.pop()
msg += `one of type ${types.join(', ')}, or ${last}`
}
}
-
if (instances.length > 0 || other.length > 0) {
msg += ' or '
}
}
-
if (instances.length > 0) {
switch (instances.length) {
case 1:
msg += `an instance of ${instances[0]}`
break
-
case 2:
msg += `an instance of ${instances[0]} or ${instances[1]}`
break
-
default: {
const last = instances.pop()
msg += `an instance of ${instances.join(', ')}, or ${last}`
}
}
-
if (other.length > 0) {
msg += ' or '
}
}
-
switch (other.length) {
case 0:
break
-
case 1:
if (other[0].toLowerCase() !== other[0]) {
msg += 'an '
}
-
msg += `${other[0]}`
break
-
case 2:
msg += `one of ${other[0]} or ${other[1]}`
break
-
default: {
const last = other.pop()
msg += `one of ${other.join(', ')}, or ${last}`
}
}
-
if (actual == null) {
msg += `. Received ${actual}`
} else if (typeof actual === 'function' && actual.name) {
msg += `. Received function ${actual.name}`
} else if (typeof actual === 'object') {
var _actual$constructor
-
if (
(_actual$constructor = actual.constructor) !== null &&
_actual$constructor !== undefined &&
@@ -276,14 +240,11 @@ E(
let inspected = inspect(actual, {
colors: false
})
-
if (inspected.length > 25) {
inspected = `${inspected.slice(0, 25)}...`
}
-
msg += `. Received type ${typeof actual} (${inspected})`
}
-
return msg
},
TypeError
@@ -292,11 +253,9 @@ E(
'ERR_INVALID_ARG_VALUE',
(name, value, reason = 'is invalid') => {
let inspected = inspect(value)
-
if (inspected.length > 128) {
inspected = inspected.slice(0, 128) + '...'
}
-
const type = name.includes('.') ? 'property' : 'argument'
return `The ${type} '${name}' ${reason}. Received ${inspected}`
},
@@ -306,7 +265,6 @@ E(
'ERR_INVALID_RETURN_VALUE',
(input, name, value) => {
var _value$constructor
-
const type =
value !== null &&
value !== undefined &&
@@ -326,16 +284,13 @@ E(
let msg
const len = args.length
args = (Array.isArray(args) ? args : [args]).map((a) => `"${a}"`).join(' or ')
-
switch (len) {
case 1:
msg += `The ${args[0]} argument`
break
-
case 2:
msg += `The ${args[0]} and ${args[1]} arguments`
break
-
default:
{
const last = args.pop()
@@ -343,7 +298,6 @@ E(
}
break
}
-
return `${msg} must be specified`
},
TypeError
@@ -353,21 +307,17 @@ E(
(str, range, input) => {
assert(range, 'Missing "range" argument')
let received
-
if (Number.isInteger(input) && Math.abs(input) > 2 ** 32) {
received = addNumericalSeparator(String(input))
} else if (typeof input === 'bigint') {
received = String(input)
-
if (input > 2n ** 32n || input < -(2n ** 32n)) {
received = addNumericalSeparator(received)
}
-
received += 'n'
} else {
received = inspect(input)
}
-
return `The value of "${str}" is out of range. It must be ${range}. Received ${received}`
},
RangeError
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/index.js b/deps/npm/node_modules/readable-stream/lib/ours/index.js
similarity index 91%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/index.js
rename to deps/npm/node_modules/readable-stream/lib/ours/index.js
index 1a6af8ad86bf77..6cdd2d78557677 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/index.js
+++ b/deps/npm/node_modules/readable-stream/lib/ours/index.js
@@ -1,10 +1,10 @@
'use strict'
const Stream = require('stream')
-
if (Stream && process.env.READABLE_STREAM === 'disable') {
- const promises = Stream.promises // Explicit export naming is needed for ESM
+ const promises = Stream.promises
+ // Explicit export naming is needed for ESM
module.exports._uint8ArrayToBuffer = Stream._uint8ArrayToBuffer
module.exports._isUint8Array = Stream._isUint8Array
module.exports.isDisturbed = Stream.isDisturbed
@@ -23,7 +23,6 @@ if (Stream && process.env.READABLE_STREAM === 'disable') {
Object.defineProperty(Stream, 'promises', {
configurable: true,
enumerable: true,
-
get() {
return promises
}
@@ -31,12 +30,11 @@ if (Stream && process.env.READABLE_STREAM === 'disable') {
module.exports.Stream = Stream.Stream
} else {
const CustomStream = require('../stream')
-
const promises = require('../stream/promises')
-
const originalDestroy = CustomStream.Readable.destroy
- module.exports = CustomStream.Readable // Explicit export naming is needed for ESM
+ module.exports = CustomStream.Readable
+ // Explicit export naming is needed for ESM
module.exports._uint8ArrayToBuffer = CustomStream._uint8ArrayToBuffer
module.exports._isUint8Array = CustomStream._isUint8Array
module.exports.isDisturbed = CustomStream.isDisturbed
@@ -56,12 +54,12 @@ if (Stream && process.env.READABLE_STREAM === 'disable') {
Object.defineProperty(CustomStream, 'promises', {
configurable: true,
enumerable: true,
-
get() {
return promises
}
})
module.exports.Stream = CustomStream.Stream
-} // Allow default importing
+}
+// Allow default importing
module.exports.default = module.exports
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/primordials.js b/deps/npm/node_modules/readable-stream/lib/ours/primordials.js
similarity index 98%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/primordials.js
rename to deps/npm/node_modules/readable-stream/lib/ours/primordials.js
index fab7a28e444ad8..6a98b01681caf0 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/primordials.js
+++ b/deps/npm/node_modules/readable-stream/lib/ours/primordials.js
@@ -1,4 +1,5 @@
'use strict'
+
/*
This file is a reduced and adapted version of the main lib/internal/per_context/primordials.js file defined at
@@ -6,50 +7,38 @@
Don't try to replace with the original file and keep it up to date with the upstream file.
*/
-
module.exports = {
ArrayIsArray(self) {
return Array.isArray(self)
},
-
ArrayPrototypeIncludes(self, el) {
return self.includes(el)
},
-
ArrayPrototypeIndexOf(self, el) {
return self.indexOf(el)
},
-
ArrayPrototypeJoin(self, sep) {
return self.join(sep)
},
-
ArrayPrototypeMap(self, fn) {
return self.map(fn)
},
-
ArrayPrototypePop(self, el) {
return self.pop(el)
},
-
ArrayPrototypePush(self, el) {
return self.push(el)
},
-
ArrayPrototypeSlice(self, start, end) {
return self.slice(start, end)
},
-
Error,
-
FunctionPrototypeCall(fn, thisArgs, ...args) {
return fn.call(thisArgs, ...args)
},
-
FunctionPrototypeSymbolHasInstance(self, instance) {
return Function.prototype[Symbol.hasInstance].call(self, instance)
},
-
MathFloor: Math.floor,
Number,
NumberIsInteger: Number.isInteger,
@@ -57,74 +46,55 @@ module.exports = {
NumberMAX_SAFE_INTEGER: Number.MAX_SAFE_INTEGER,
NumberMIN_SAFE_INTEGER: Number.MIN_SAFE_INTEGER,
NumberParseInt: Number.parseInt,
-
ObjectDefineProperties(self, props) {
return Object.defineProperties(self, props)
},
-
ObjectDefineProperty(self, name, prop) {
return Object.defineProperty(self, name, prop)
},
-
ObjectGetOwnPropertyDescriptor(self, name) {
return Object.getOwnPropertyDescriptor(self, name)
},
-
ObjectKeys(obj) {
return Object.keys(obj)
},
-
ObjectSetPrototypeOf(target, proto) {
return Object.setPrototypeOf(target, proto)
},
-
Promise,
-
PromisePrototypeCatch(self, fn) {
return self.catch(fn)
},
-
PromisePrototypeThen(self, thenFn, catchFn) {
return self.then(thenFn, catchFn)
},
-
PromiseReject(err) {
return Promise.reject(err)
},
-
ReflectApply: Reflect.apply,
-
RegExpPrototypeTest(self, value) {
return self.test(value)
},
-
SafeSet: Set,
String,
-
StringPrototypeSlice(self, start, end) {
return self.slice(start, end)
},
-
StringPrototypeToLowerCase(self) {
return self.toLowerCase()
},
-
StringPrototypeToUpperCase(self) {
return self.toUpperCase()
},
-
StringPrototypeTrim(self) {
return self.trim()
},
-
Symbol,
SymbolAsyncIterator: Symbol.asyncIterator,
SymbolHasInstance: Symbol.hasInstance,
SymbolIterator: Symbol.iterator,
-
TypedArrayPrototypeSet(self, buf, len) {
return self.set(buf, len)
},
-
Uint8Array
}
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/util.js b/deps/npm/node_modules/readable-stream/lib/ours/util.js
similarity index 97%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/util.js
rename to deps/npm/node_modules/readable-stream/lib/ours/util.js
index fdaaacd6753d12..e125ce17aa83c2 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/ours/util.js
+++ b/deps/npm/node_modules/readable-stream/lib/ours/util.js
@@ -1,11 +1,9 @@
'use strict'
const bufferModule = require('buffer')
-
const AsyncFunction = Object.getPrototypeOf(async function () {}).constructor
const Blob = globalThis.Blob || bufferModule.Blob
/* eslint-disable indent */
-
const isBlob =
typeof Blob !== 'undefined'
? function isBlob(b) {
@@ -16,46 +14,40 @@ const isBlob =
return false
}
/* eslint-enable indent */
-// This is a simplified version of AggregateError
+// This is a simplified version of AggregateError
class AggregateError extends Error {
constructor(errors) {
if (!Array.isArray(errors)) {
throw new TypeError(`Expected input to be an Array, got ${typeof errors}`)
}
-
let message = ''
-
for (let i = 0; i < errors.length; i++) {
message += ` ${errors[i].stack}\n`
}
-
super(message)
this.name = 'AggregateError'
this.errors = errors
}
}
-
module.exports = {
AggregateError,
kEmptyObject: Object.freeze({}),
-
once(callback) {
let called = false
return function (...args) {
if (called) {
return
}
-
called = true
callback.apply(this, args)
}
},
-
createDeferredPromise: function () {
let resolve
- let reject // eslint-disable-next-line promise/param-names
+ let reject
+ // eslint-disable-next-line promise/param-names
const promise = new Promise((res, rej) => {
resolve = res
reject = rej
@@ -66,28 +58,23 @@ module.exports = {
reject
}
},
-
promisify(fn) {
return new Promise((resolve, reject) => {
fn((err, ...args) => {
if (err) {
return reject(err)
}
-
return resolve(...args)
})
})
},
-
debuglog() {
return function () {}
},
-
format(format, ...args) {
// Simplified version of https://nodejs.org/api/util.html#utilformatformat-args
return format.replace(/%([sdifj])/g, function (...[_unused, type]) {
const replacement = args.shift()
-
if (type === 'f') {
return replacement.toFixed(6)
} else if (type === 'j') {
@@ -100,7 +87,6 @@ module.exports = {
}
})
},
-
inspect(value) {
// Vastly simplified version of https://nodejs.org/api/util.html#utilinspectobject-options
switch (typeof value) {
@@ -112,35 +98,27 @@ module.exports = {
return `\`${value}\``
}
}
-
return `'${value}'`
-
case 'number':
if (isNaN(value)) {
return 'NaN'
} else if (Object.is(value, -0)) {
return String(value)
}
-
return value
-
case 'bigint':
return `${String(value)}n`
-
case 'boolean':
case 'undefined':
return String(value)
-
case 'object':
return '{}'
}
},
-
types: {
isAsyncFunction(fn) {
return fn instanceof AsyncFunction
},
-
isArrayBufferView(arr) {
return ArrayBuffer.isView(arr)
}
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/stream.js b/deps/npm/node_modules/readable-stream/lib/stream.js
similarity index 98%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/stream.js
rename to deps/npm/node_modules/readable-stream/lib/stream.js
index f5268171458d96..e9bb6ba9080331 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/stream.js
+++ b/deps/npm/node_modules/readable-stream/lib/stream.js
@@ -1,5 +1,7 @@
/* replacement start */
+
const { Buffer } = require('buffer')
+
/* replacement end */
// Copyright Joyent, Inc. and other Node contributors.
//
@@ -23,51 +25,34 @@ const { Buffer } = require('buffer')
// USE OR OTHER DEALINGS IN THE SOFTWARE.
;('use strict')
-
const { ObjectDefineProperty, ObjectKeys, ReflectApply } = require('./ours/primordials')
-
const {
promisify: { custom: customPromisify }
} = require('./ours/util')
-
const { streamReturningOperators, promiseReturningOperators } = require('./internal/streams/operators')
-
const {
codes: { ERR_ILLEGAL_CONSTRUCTOR }
} = require('./ours/errors')
-
const compose = require('./internal/streams/compose')
-
const { pipeline } = require('./internal/streams/pipeline')
-
const { destroyer } = require('./internal/streams/destroy')
-
const eos = require('./internal/streams/end-of-stream')
-
const internalBuffer = {}
-
const promises = require('./stream/promises')
-
const utils = require('./internal/streams/utils')
-
const Stream = (module.exports = require('./internal/streams/legacy').Stream)
-
Stream.isDisturbed = utils.isDisturbed
Stream.isErrored = utils.isErrored
Stream.isReadable = utils.isReadable
Stream.Readable = require('./internal/streams/readable')
-
for (const key of ObjectKeys(streamReturningOperators)) {
const op = streamReturningOperators[key]
-
function fn(...args) {
if (new.target) {
throw ERR_ILLEGAL_CONSTRUCTOR()
}
-
return Stream.Readable.from(ReflectApply(op, this, args))
}
-
ObjectDefineProperty(fn, 'name', {
__proto__: null,
value: op.name
@@ -84,18 +69,14 @@ for (const key of ObjectKeys(streamReturningOperators)) {
writable: true
})
}
-
for (const key of ObjectKeys(promiseReturningOperators)) {
const op = promiseReturningOperators[key]
-
function fn(...args) {
if (new.target) {
throw ERR_ILLEGAL_CONSTRUCTOR()
}
-
return ReflectApply(op, this, args)
}
-
ObjectDefineProperty(fn, 'name', {
__proto__: null,
value: op.name
@@ -112,15 +93,12 @@ for (const key of ObjectKeys(promiseReturningOperators)) {
writable: true
})
}
-
Stream.Writable = require('./internal/streams/writable')
Stream.Duplex = require('./internal/streams/duplex')
Stream.Transform = require('./internal/streams/transform')
Stream.PassThrough = require('./internal/streams/passthrough')
Stream.pipeline = pipeline
-
const { addAbortSignal } = require('./internal/streams/add-abort-signal')
-
Stream.addAbortSignal = addAbortSignal
Stream.finished = eos
Stream.destroy = destroyer
@@ -129,7 +107,6 @@ ObjectDefineProperty(Stream, 'promises', {
__proto__: null,
configurable: true,
enumerable: true,
-
get() {
return promises
}
@@ -137,7 +114,6 @@ ObjectDefineProperty(Stream, 'promises', {
ObjectDefineProperty(pipeline, customPromisify, {
__proto__: null,
enumerable: true,
-
get() {
return promises.pipeline
}
@@ -145,18 +121,16 @@ ObjectDefineProperty(pipeline, customPromisify, {
ObjectDefineProperty(eos, customPromisify, {
__proto__: null,
enumerable: true,
-
get() {
return promises.finished
}
-}) // Backwards-compat with node 0.4.x
+})
+// Backwards-compat with node 0.4.x
Stream.Stream = Stream
-
Stream._isUint8Array = function isUint8Array(value) {
return value instanceof Uint8Array
}
-
Stream._uint8ArrayToBuffer = function _uint8ArrayToBuffer(chunk) {
return Buffer.from(chunk.buffer, chunk.byteOffset, chunk.byteLength)
}
diff --git a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/stream/promises.js b/deps/npm/node_modules/readable-stream/lib/stream/promises.js
similarity index 99%
rename from deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/stream/promises.js
rename to deps/npm/node_modules/readable-stream/lib/stream/promises.js
index 5e7972ee8acb77..d44dd8ad0e0f3f 100644
--- a/deps/npm/node_modules/are-we-there-yet/node_modules/readable-stream/lib/stream/promises.js
+++ b/deps/npm/node_modules/readable-stream/lib/stream/promises.js
@@ -1,25 +1,19 @@
'use strict'
const { ArrayPrototypePop, Promise } = require('../ours/primordials')
-
const { isIterable, isNodeStream } = require('../internal/streams/utils')
-
const { pipelineImpl: pl } = require('../internal/streams/pipeline')
-
const { finished } = require('../internal/streams/end-of-stream')
-
function pipeline(...streams) {
return new Promise((resolve, reject) => {
let signal
let end
const lastArg = streams[streams.length - 1]
-
if (lastArg && typeof lastArg === 'object' && !isNodeStream(lastArg) && !isIterable(lastArg)) {
const options = ArrayPrototypePop(streams)
signal = options.signal
end = options.end
}
-
pl(
streams,
(err, value) => {
@@ -36,7 +30,6 @@ function pipeline(...streams) {
)
})
}
-
module.exports = {
finished,
pipeline
diff --git a/deps/npm/node_modules/readable-stream/package.json b/deps/npm/node_modules/readable-stream/package.json
index 0b0c4bd207ace3..7df83d9eb990a9 100644
--- a/deps/npm/node_modules/readable-stream/package.json
+++ b/deps/npm/node_modules/readable-stream/package.json
@@ -1,68 +1,84 @@
{
"name": "readable-stream",
- "version": "3.6.0",
- "description": "Streams3, a user-land copy of the stream library from Node.js",
- "main": "readable.js",
- "engines": {
- "node": ">= 6"
- },
- "dependencies": {
- "inherits": "^2.0.3",
- "string_decoder": "^1.1.1",
- "util-deprecate": "^1.0.1"
- },
- "devDependencies": {
- "@babel/cli": "^7.2.0",
- "@babel/core": "^7.2.0",
- "@babel/polyfill": "^7.0.0",
- "@babel/preset-env": "^7.2.0",
- "airtap": "0.0.9",
- "assert": "^1.4.0",
- "bl": "^2.0.0",
- "deep-strict-equal": "^0.2.0",
- "events.once": "^2.0.2",
- "glob": "^7.1.2",
- "gunzip-maybe": "^1.4.1",
- "hyperquest": "^2.1.3",
- "lolex": "^2.6.0",
- "nyc": "^11.0.0",
- "pump": "^3.0.0",
- "rimraf": "^2.6.2",
- "tap": "^12.0.0",
- "tape": "^4.9.0",
- "tar-fs": "^1.16.2",
- "util-promisify": "^2.1.0"
- },
- "scripts": {
- "test": "tap -J --no-esm test/parallel/*.js test/ours/*.js",
- "ci": "TAP=1 tap --no-esm test/parallel/*.js test/ours/*.js | tee test.tap",
- "test-browsers": "airtap --sauce-connect --loopback airtap.local -- test/browser.js",
- "test-browser-local": "airtap --open --local -- test/browser.js",
- "cover": "nyc npm test",
- "report": "nyc report --reporter=lcov",
- "update-browser-errors": "babel -o errors-browser.js errors.js"
- },
- "repository": {
- "type": "git",
- "url": "git://github.com/nodejs/readable-stream"
- },
+ "version": "4.3.0",
+ "description": "Node.js Streams, a user-land copy of the stream library from Node.js",
+ "homepage": "https://github.com/nodejs/readable-stream",
+ "license": "MIT",
+ "licenses": [
+ {
+ "type": "MIT",
+ "url": "https://choosealicense.com/licenses/mit/"
+ }
+ ],
"keywords": [
"readable",
"stream",
"pipe"
],
+ "repository": {
+ "type": "git",
+ "url": "git://github.com/nodejs/readable-stream"
+ },
+ "bugs": {
+ "url": "https://github.com/nodejs/readable-stream/issues"
+ },
+ "main": "lib/ours/index.js",
+ "files": [
+ "lib",
+ "LICENSE",
+ "README.md"
+ ],
"browser": {
- "util": false,
- "worker_threads": false,
- "./errors": "./errors-browser.js",
- "./readable.js": "./readable-browser.js",
- "./lib/internal/streams/from.js": "./lib/internal/streams/from-browser.js",
- "./lib/internal/streams/stream.js": "./lib/internal/streams/stream-browser.js"
+ "util": "./lib/ours/util.js",
+ "./lib/ours/index.js": "./lib/ours/browser.js"
},
- "nyc": {
- "include": [
- "lib/**.js"
- ]
+ "scripts": {
+ "build": "node build/build.mjs",
+ "postbuild": "prettier -w lib test",
+ "test": "tap --rcfile=./tap.yml test/parallel/test-*.js test/ours/test-*.js",
+ "test:prepare": "node test/browser/runner-prepare.mjs",
+ "test:browsers": "node test/browser/runner-browser.mjs",
+ "test:bundlers": "node test/browser/runner-node.mjs",
+ "coverage": "c8 -c ./c8.json tap --rcfile=./tap.yml test/parallel/test-*.js test/ours/test-*.js",
+ "format": "prettier -w src lib test",
+ "lint": "eslint src"
+ },
+ "dependencies": {
+ "abort-controller": "^3.0.0",
+ "buffer": "^6.0.3",
+ "events": "^3.3.0",
+ "process": "^0.11.10"
},
- "license": "MIT"
+ "devDependencies": {
+ "@babel/core": "^7.17.10",
+ "@babel/plugin-proposal-nullish-coalescing-operator": "^7.16.7",
+ "@babel/plugin-proposal-optional-chaining": "^7.16.7",
+ "@rollup/plugin-commonjs": "^22.0.0",
+ "@rollup/plugin-inject": "^4.0.4",
+ "@rollup/plugin-node-resolve": "^13.3.0",
+ "@sinonjs/fake-timers": "^9.1.2",
+ "browserify": "^17.0.0",
+ "c8": "^7.11.2",
+ "esbuild": "^0.14.39",
+ "esbuild-plugin-alias": "^0.2.1",
+ "eslint": "^8.15.0",
+ "eslint-config-standard": "^17.0.0",
+ "eslint-plugin-import": "^2.26.0",
+ "eslint-plugin-n": "^15.2.0",
+ "eslint-plugin-promise": "^6.0.0",
+ "playwright": "^1.21.1",
+ "prettier": "^2.6.2",
+ "rollup": "^2.72.1",
+ "rollup-plugin-polyfill-node": "^0.9.0",
+ "tap": "^16.2.0",
+ "tap-mocha-reporter": "^5.0.3",
+ "tape": "^5.5.3",
+ "tar": "^6.1.11",
+ "undici": "^5.1.1",
+ "webpack": "^5.72.1",
+ "webpack-cli": "^4.9.2"
+ },
+ "engines": {
+ "node": "^12.22.0 || ^14.17.0 || >=16.0.0"
+ }
}
diff --git a/deps/npm/node_modules/safe-buffer/index.js b/deps/npm/node_modules/safe-buffer/index.js
index f8d3ec98852f44..22438dabbbceef 100644
--- a/deps/npm/node_modules/safe-buffer/index.js
+++ b/deps/npm/node_modules/safe-buffer/index.js
@@ -1,4 +1,3 @@
-/*! safe-buffer. MIT License. Feross Aboukhadijeh */
/* eslint-disable node/no-deprecated-api */
var buffer = require('buffer')
var Buffer = buffer.Buffer
@@ -21,8 +20,6 @@ function SafeBuffer (arg, encodingOrOffset, length) {
return Buffer(arg, encodingOrOffset, length)
}
-SafeBuffer.prototype = Object.create(Buffer.prototype)
-
// Copy static methods from Buffer
copyProps(Buffer, SafeBuffer)
diff --git a/deps/npm/node_modules/safe-buffer/package.json b/deps/npm/node_modules/safe-buffer/package.json
index f2869e256477a9..623fbc3f6b0c48 100644
--- a/deps/npm/node_modules/safe-buffer/package.json
+++ b/deps/npm/node_modules/safe-buffer/package.json
@@ -1,18 +1,18 @@
{
"name": "safe-buffer",
"description": "Safer Node.js Buffer API",
- "version": "5.2.1",
+ "version": "5.1.2",
"author": {
"name": "Feross Aboukhadijeh",
"email": "feross@feross.org",
- "url": "https://feross.org"
+ "url": "http://feross.org"
},
"bugs": {
"url": "https://github.com/feross/safe-buffer/issues"
},
"devDependencies": {
"standard": "*",
- "tape": "^5.0.0"
+ "tape": "^4.0.0"
},
"homepage": "https://github.com/feross/safe-buffer",
"keywords": [
@@ -33,19 +33,5 @@
},
"scripts": {
"test": "standard && tape test/*.js"
- },
- "funding": [
- {
- "type": "github",
- "url": "https://github.com/sponsors/feross"
- },
- {
- "type": "patreon",
- "url": "https://www.patreon.com/feross"
- },
- {
- "type": "consulting",
- "url": "https://feross.org/support"
- }
- ]
+ }
}
diff --git a/deps/npm/node_modules/sigstore/LICENSE b/deps/npm/node_modules/sigstore/LICENSE
new file mode 100644
index 00000000000000..d645695673349e
--- /dev/null
+++ b/deps/npm/node_modules/sigstore/LICENSE
@@ -0,0 +1,202 @@
+
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [yyyy] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
diff --git a/deps/npm/node_modules/sigstore/README.md b/deps/npm/node_modules/sigstore/README.md
new file mode 100644
index 00000000000000..0a8b690423a0f3
--- /dev/null
+++ b/deps/npm/node_modules/sigstore/README.md
@@ -0,0 +1,180 @@
+# sigstore-js
+
+A JavaScript library for generating and verifying Sigstore signatures. One of
+the intended uses is to sign and verify npm packages but it can be used to sign
+and verify any file.
+
+## Features
+
+* Support for signing using an OpenID Connect identity
+* Support for publishing signatures to a [Rekor][1] instance
+* Support for verifying Sigstore bundles
+
+## Prerequisites
+
+- Node.js version >= 14.17.0
+
+## Installation
+
+```
+npm install sigstore
+```
+
+## Usage
+
+```javascript
+const { sigstore } = require('sigstore')
+```
+
+```javascript
+import { sigstore } from 'sigstore'
+```
+
+### sign(payload[, options])
+
+Generates a Sigstore signature for the supplied payload. Returns a
+[Sigstore bundle][2] containing the signature and the verification material
+necessary to verify the signature.
+
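+A minimal sketch of a typical call follows (the `Buffer` payload and the use of
+default options are illustrative assumptions; the parameters are described
+below):
+
+```javascript
+const { sigstore } = require('sigstore')
+
+async function signArtifact () {
+  // Any Buffer works here; a file's contents would normally be read first.
+  const payload = Buffer.from('hello, world')
+
+  // Signing requires an OpenID Connect identity (see Features above).
+  // Resolves to a Sigstore bundle containing the signature and the
+  // verification material needed to verify it later.
+  const bundle = await sigstore.sign(payload)
+  console.log(JSON.stringify(bundle, null, 2))
+}
+
+signArtifact()
+```
+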
+* `payload` `Buffer`: The bytes of the artifact to be signed.
+* `options` `