Scan your project for vulnerabilities and automatically install any compatible
updates to vulnerable dependencies:

$ npm audit fix

Run audit fix without modifying node_modules, but still updating the
package-lock:

$ npm audit fix --package-lock-only

Skip updating devDependencies:

$ npm audit fix --only=prod

Have audit fix install semver-major updates to toplevel dependencies, not just
semver-compatible ones:

$ npm audit fix --force

Do a dry run to get an idea of what audit fix will do, and also output
install information in JSON format:

$ npm audit fix --dry-run --json

Scan your project for vulnerabilities and just show the details, without fixing
anything:

$ npm audit

Get the detailed audit report in JSON format:

$ npm audit --json

Get the detailed audit report as plain text with fields separated by tab
characters, allowing for reuse in scripting or command-line post-processing,
for example selecting some of the columns printed:

$ npm audit --parseable
To parse columns, you can use, for example, awk to print just some of them:
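A sketch of that kind of post-processing, using simulated tab-separated lines (the real column layout of npm audit --parseable varies by npm version, so the fields here are made up for illustration):

```shell
# Two simulated tab-separated lines, shaped like `npm audit --parseable`
# output (illustrative fields only), piped through awk to select the
# first two columns.
printf 'update\tlodash\tnpm update lodash\nreview\tminimist\tmanual review\n' \
  | awk -F '\t' '{print $1, $2}'
```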
The audit command submits a description of the dependencies configured in
your project to your default registry and asks for a report of known
vulnerabilities. The report returned includes instructions on how to act on
this information.

You can also have npm automatically fix the vulnerabilities by running npm
audit fix. Note that some vulnerabilities cannot be fixed automatically and
will require manual intervention or review. Also note that since npm audit fix
runs a full-fledged npm install under the hood, all configs that apply to the
installer will also apply to npm install -- so things like npm audit fix
--package-lock-only will work as expected.
+
CONTENT SUBMITTED
+
+
npm_version
+
node_version
+
platform
+
node_env
+
A scrubbed version of your package-lock.json or npm-shrinkwrap.json
+
+
SCRUBBING

In order to ensure that potentially sensitive information is not included in
the audit data bundle, some dependencies may have their names (and sometimes
versions) replaced with opaque non-reversible identifiers. This is done for
the following dependency types:

Any module referencing a scope that is configured for a non-default
registry has its name scrubbed. (That is, a scope you did a npm login
--scope=@ourscope for.)

All git dependencies have their names and specifiers scrubbed.

All remote tarball dependencies have their names and specifiers scrubbed.

All local directory and tarball dependencies have their names and specifiers
scrubbed.

The non-reversible identifiers are a sha256 of a session-specific UUID and the
value being replaced, ensuring a consistent value within the payload that is
different between runs.
This command tries to guess at the likely location of a package's
bug tracker URL, and then tries to open it using the --browser
config param. If no package name is provided, it will search for a
package.json in the current folder and use the name property.
Make sure you have a package-lock and an up-to-date install:

$ cd ./my/npm/project
$ npm install
added 154 packages in 10s
$ ls | grep package-lock

Run npm ci in that project:

$ npm ci
added 154 packages in 5s

Configure Travis to build using npm ci instead of npm install:

# .travis.yml
install:
- npm ci
# keep the npm cache around to speed up installs
cache:
  directories:
  - "$HOME/.npm"
DESCRIPTION

This command is similar to npm-install(1), except it's meant to be used in
automated environments such as test platforms, continuous integration, and
deployment. It can be significantly faster than a regular npm install by
skipping certain user-oriented features. It is also more strict than a regular
install, which can help catch errors or inconsistencies caused by the
incrementally-installed local environments of most npm users.

In short, the main differences between using npm install and npm ci are:

The project must have an existing package-lock.json or npm-shrinkwrap.json.

If dependencies in the package lock do not match those in package.json, npm ci
will exit with an error, instead of updating the package lock.

npm ci can only install entire projects at a time: individual dependencies
cannot be added with this command.

If a node_modules is already present, it will be automatically removed before
npm ci begins its install.

It will never write to package.json or any of the package-locks: installs are
essentially frozen.
The synopsis above
loads the completions into your current shell. Adding it to
your ~/.bashrc or ~/.zshrc will make the completions available
everywhere:
npm completion >> ~/.bashrc
npm completion >> ~/.zshrc
You may of course also pipe the output of npm completion to a file
such as /usr/local/etc/bash_completion.d/npm if you have a system
that will read that file for you.
When COMP_CWORD, COMP_LINE, and COMP_POINT are defined in the
environment, npm completion acts in "plumbing mode", and outputs
completions based on the arguments.
Searches the local package tree and attempts to simplify the overall
structure by moving dependencies further up the tree, where they can
be more effectively shared by multiple dependent packages.
a
+-- b <-- depends on c@1.0.x
|   `-- c@1.0.3
`-- d <-- depends on c@~1.0.9
    `-- c@1.0.10

In this case, npm-dedupe(1) will transform the tree to:
a
+-- b
+-- d
`-- c@1.0.10

Because of the hierarchical nature of node's module lookup, b and d
will both get their dependency met by the single c package at the root
level of the tree.
The deduplication algorithm walks the tree, moving each dependency as far
up in the tree as possible.
Add, remove, and enumerate distribution tags on a package:
add:
Tags the specified version of the package with the specified tag, or the
--tag config if not specified. If the tag you're adding is latest and you
have two-factor authentication on auth-and-writes then you'll need to include
an otp on the command line with --otp.
A tag can be used when installing packages as a reference to a version instead
of using a specific version number:

npm install <name>@<tag>

When installing dependencies, a preferred tagged version may be specified:

npm install --tag <tag>

This also applies to npm dedupe.
Publishing a package sets the latest tag to the published version unless the
--tag option is used. For example, npm publish --tag=beta.
By default, npm install <pkg> (without any @<version> or @<tag>
specified) installs the latest tag.
npm docs [<pkgname> [<pkgname> ...]]
npm docs .
npm home [<pkgname> [<pkgname> ...]]
npm home .

DESCRIPTION
This command tries to guess at the likely location of a package's
documentation URL, and then tries to open it using the --browser
config param. You can pass multiple package names at once. If no
package name is provided, it will search for a package.json in the current
folder and use the name property.
npm doctor runs a set of checks to ensure that your npm installation has
what it needs to manage your JavaScript packages. npm is mostly a standalone tool, but it does
have some basic requirements that must be met.
npm doctor verifies the following items in your environment, and if there are
any recommended changes, it will display them.

npm ping
By default, npm installs from the primary npm registry, registry.npmjs.org.
npm doctor hits a special ping endpoint within the registry. This can also be
checked with npm ping. If this check fails, you may be using a proxy that
needs to be configured, or may need to talk to your IT staff to get access
over HTTPS to registry.npmjs.org.

This check is also done against whichever registry you've configured (you can
see what that is by running npm config get registry), and if you're using a
private registry that doesn't support the /whoami endpoint supported by the
primary registry, this check may fail.

npm -v
While Node.js may come bundled with a particular version of npm, it's the
policy of the CLI team that we recommend all users run npm@latest if they
can. As the CLI is maintained by a small team of contributors, there are only
so many resources to go around; older
releases typically only receive critical security and regression fixes. The
team believes that the latest tested version of npm is almost always likely to
be the most functional and defect-free version of npm.

node -v
For most users, in most circumstances, the best version of Node will be the
latest long-term support (LTS) release. Those of you who want access to new
ECMAscript features or bleeding-edge changes to Node's standard library may be
running a newer version, and some of you may be required to run an older
version of Node because of enterprise change control policies. That's OK! But
in general, the npm team recommends that most users run Node.js LTS.

npm config get registry
Some of you may be installing from private package registries for your project
or company. That's great! Others of you may be following tutorials or
StackOverflow questions in an effort to troubleshoot problems you may be
having. Sometimes, this may entail changing the registry you're pointing at.
This part of npm doctor just lets you, and maybe whoever's helping you with
support, know that you're not using the default registry.

which git
While it's documented in the README, it may not be obvious that npm needs Git
installed to do many of the things that it does. Also, in some cases
– especially on Windows – you may have Git set up in such a way that it's not
accessible via your PATH so that npm can find it.
If supplied a topic, then show the appropriate documentation page.
If the topic does not exist, or if multiple terms are provided, then run
the help-search command to find a match. Note that, if help-search
finds a single subject, then it will run help on that topic, so unique
matches are equivalent to specifying a topic name.
Allows you to manage npm hooks,
including adding, removing, listing, and updating.

Hooks allow you to configure URL endpoints that will be notified whenever a
change happens to any of the supported entity types. Three different types of
entities can be watched by hooks: packages, owners, and scopes.

To create a package hook, simply reference the package name.

To create an owner hook, prefix the owner name with ~ (as in, ~youruser).

To create a scope hook, prefix the scope name with @ (as in, @yourscope).

The hook id used by update and rm is the ID listed in npm hook ls for
that particular hook.

The shared secret will be sent along to the URL endpoint so you can verify the
request came from your own configured hook.
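For example, the corresponding commands look like this (the package name, owner, scope, URL, secret, and hook ID below are all placeholders; substitute your own):

```shell
# All names here are placeholders for illustration.
npm hook add lodash https://example.com/hook my-shared-secret      # package hook
npm hook add '~npm-user' https://example.com/hook my-shared-secret # owner hook
npm hook add @npmcorp https://example.com/hook my-shared-secret    # scope hook
npm hook ls                                       # list hooks; note the ID column
npm hook update HOOK_ID https://example.com/hook my-new-secret     # ID from `npm hook ls`
npm hook rm HOOK_ID                               # remove the hook with that ID
```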
npm init <initializer> can be used to set up a new or existing npm package.

initializer in this case is an npm package named create-<initializer>, which
will be installed by npx(1), and then have its main bin
executed -- presumably creating or updating package.json and running any other
initialization-related operations.

The init command is transformed to a corresponding npx operation as follows:

npm init foo -> npx create-foo

npm init @usr/foo -> npx @usr/create-foo

npm init @usr -> npx @usr/create

Any additional options will be passed directly to the command, so npm init foo
--hello will map to npx create-foo --hello.

If the initializer is omitted (by just calling npm init), init will fall back
to legacy init behavior. It will ask you a bunch of questions, and then write a
package.json for you. It will attempt to make reasonable guesses based on
existing fields, dependencies, and options selected. It is strictly additive, so
it will keep any fields and values that were already set. You can also use
-y/--yes to skip the questionnaire altogether. If you pass --scope, it
will create a scoped package.
This command installs a package, and any packages that it depends on. If the
package has a package-lock or shrinkwrap file, the installation of dependencies
will be driven by that, with an npm-shrinkwrap.json taking precedence if both
files exist.
With the --production flag (or when the NODE_ENV environment variable
is set to production), npm will not install modules listed in
devDependencies.

NOTE: The --production flag has no particular meaning when adding a
dependency to a project.

npm install <folder>:
Install the package in the directory as a symlink in the current project.
npm install <tarball file>:
Install a package that is sitting on the filesystem. Note: if you just want
to link a dev directory into your npm root, you can do this more easily by
using npm link.

Tarball requirements:

The filename must use .tar, .tar.gz, or .tgz as the extension.

The package contents should reside in a subfolder inside the tarball (usually
it is called package/). npm strips one directory layer when installing the
package (an equivalent of tar x --strip-components=1 is run).

The package must contain a package.json file with name and version properties.

Example:

npm install ./package.tgz

npm install <tarball url>:
Fetch the tarball url, and then install it. In order to distinguish between
this and other options, the argument must start with "http://" or "https://"
Do a <name>@<tag> install, where <tag> is the "tag" config. (See
npm-config(7). The config's default value is latest.)
In most cases, this will install the version of the modules tagged as
latest on the npm registry.
Example:

npm install sax

npm install saves any specified packages into dependencies by default.
Additionally, you can control where and how they get saved with some
additional flags:
-P, --save-prod: Package will appear in your dependencies. This is the
default unless `-D` or `-O` are present.
-D, --save-dev: Package will appear in your devDependencies.
-O, --save-optional: Package will appear in your optionalDependencies.
**Note**: If there is a file or folder named `<name>` in the current
working directory, then it will try to install that, and only try to
fetch the package by name if it is not valid.

npm install [<@scope>/]<name>@<tag>:
Install the version of the package that is referenced by the specified tag.
If the tag does not exist in the registry data for that package, then this
will fail.
Install a version of the package matching the specified version range. This
will follow the same rules for resolving dependencies described in package.json(5).
<protocol> is one of git, git+ssh, git+http, git+https, or
git+file.
If #<commit-ish> is provided, it will be used to clone exactly that
commit. If the commit-ish has the format #semver:<semver>, <semver> can
be any valid semver range or exact version, and npm will look for any tags
or refs matching that range in the remote repository.
The following git environment variables are recognized by npm and will be
added to the environment when running git:
Install the package at https://gist.github.com/gistID by attempting to
clone it using git. The GitHub username associated with the gist is
optional and will not be saved in package.json.
be installed if the package has a prepare script, before the package is
done installing.

npm install sax@">=0.1.0 <0.2.0" bench supervisor

The --tag argument will apply to all of the specified install targets. If a
tag with the given name exists, the tagged version is preferred over newer
versions.
The --dry-run argument will report in the usual way what the install would
have done without actually installing anything.

The --package-lock-only argument will only update the package-lock.json,
instead of checking node_modules and downloading dependencies.
The -f or --force argument will force npm to fetch remote resources even if a
local copy exists on disk.

npm install sax --force

The -g or --global argument will cause npm to install the package globally
rather than locally. See npm-folders(5).
The --global-style argument will cause npm to install the package into
your local node_modules folder with the same layout it uses with the
global node_modules folder.
The --no-shrinkwrap argument, which will ignore an available
package lock or shrinkwrap file and use the package.json instead.
The --no-package-lock argument will prevent npm from creating a
package-lock.json file. When running with package-locks disabled npm
will not automatically prune your node modules when installing.
The --nodedir=/path/to/node/source argument will allow npm to find the
node source code so that npm can compile native modules.
The --only={prod[uction]|dev[elopment]} argument will cause either only
devDependencies or only non-devDependencies to be installed regardless of the NODE_ENV.

The --no-audit argument can be used to disable sending of audit reports to
the configured registries. See npm-audit(1) for details on what is sent.
See npm-config(7). Many of the configuration params have some
effect on installation, since that's most of what npm does.
ALGORITHM
compare the original tree with the cloned tree and make a list of
actions to take to convert one to the other
execute all of the actions, deepest first
kinds of actions are install, update, remove and move

For this package{dep} structure: A{B,C}, B{C}, C{D},
this algorithm produces:
A
+-- B
+-- C
+-- D

That is, the dependency from B to C is satisfied by the fact that A
already caused C to be installed at a higher level. D is still installed
at the top level because nothing conflicts with it.
For A{B,C}, B{C,D@1}, C{D@2}, this algorithm produces:
A
+-- B
+-- C
    `-- D@2
+-- D@1

Because B's D@1 will be installed in the top level, C now has to install D@2
privately for itself. This algorithm is deterministic, but different trees may
be produced if two dependencies are requested for installation in a different
order.
Limitations of npm's Install
There are some very rare and pathological edge-cases where a cycle can
cause npm to try to install a never-ending tree of packages. Here is
the simplest case:

A -> B -> A' -> B' -> A -> B -> A' -> B' -> A -> ...

where A is some version of a package, and A' is a different version
of the same package. Because B depends on a different version of A
than the one that is already in the tree, it must install a separate
copy. The same is true of A', which must install B'. Because B'
depends on the original version of A, the cycle continues. To avoid this
situation, npm flat-out refuses to install any name@version that is already
present anywhere in the tree of package folder ancestors.
npm link (in package dir)
npm link [<@scope>/]<pkg>[@<version>]
alias: npm ln

DESCRIPTION
Package linking is a two-step process.
First, npm link in a package folder will create a symlink in the global folder
{prefix}/lib/node_modules/<package> that links to the package where the npm
link command was executed.
cd ~/projects/node-redis # go into the package directory
npm link # creates global link
cd ~/projects/node-bloggy # go into some other package directory.
npm link redis # link-install the package

Now, any changes to ~/projects/node-redis will be reflected in
~/projects/node-bloggy/node_modules/node-redis/. Note that the link should
be to the package name, not the directory name for that package.
You may also shortcut the two steps in one. For example, to do the
above use-case in a shorter way:
cd ~/projects/node-bloggy # go into the dir of your main project
npm link ../node-redis # link the dir of your dependency

The second line is the equivalent of doing:
(cd ../node-redis; npm link)
npm link redis

That is, it first creates a global link, and then links the global
installation target into your project's node_modules folder.

Note that in this case, you are referring to the directory name, node-redis,
rather than the package name redis.
If your linked package is scoped (see npm-scope(7)) your link command must
include that scope, e.g.
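For instance, with a placeholder scope and package name:

```shell
# A scoped package must be linked by its full, scope-qualified name.
npm link @myorg/privatepackage
```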
When logged into a registry that supports token-based authentication, tell the
server to end this token's session. This will invalidate the token everywhere
you're using it, not just for the current environment.
scope
Default: The scope of your current project, if any, otherwise none.
If specified, you will be logged out of the specified scope. See npm-scope(7).
npm ls [[<@scope>/]<pkg> ...]
aliases: list, la, ll

DESCRIPTION
This command will print to stdout all the versions of packages that are
installed, as well as their dependencies, in a tree-structure.
Positional arguments are name@version-range identifiers, which will
limit the results to only the paths to the packages named. Note that
nested packages will also show the paths to the specified packages.
For example, running npm ls promzard in npm's source tree will show:
  └── promzard@0.1.5

It will print out extraneous, missing, and invalid packages.
If a project specifies git urls for dependencies these are shown
in parentheses after the name@version to make it easier for users to
recognize potential forks of a project.
prod / production
Default: false
Display only the dependency tree for packages in dependencies.
This command will check the registry to see if any (or, specific) installed
packages are currently outdated.
In the output:
package type (when using --long / -l) tells you whether this package is
a dependency or a devDependency. Packages not included in package.json
are always marked dependencies.

Red means there's a newer version matching your semver requirements, so you
should update now.

Yellow indicates that there's a newer version above your semver requirements
(usually new major, or new 0.x minor) so proceed with caution.
glob requires ^5, which prevents npm from installing glob@6, which is
latest.
something immutable, like a commit SHA), or it might not, so npm outdated and
npm update have to fetch Git repos to check. This is why currently doing a
reinstall of a Git dependency always forces a new clone and install.

npm@3.5.2 is marked as "wanted", but "latest" is npm@3.5.1 because npm
uses dist-tags to manage its latest and next release channels. npm update
will install the newest version, but npm install npm (with no semver range)
will install whatever's tagged as latest.
once is just plain out of date. Reinstalling node_modules from scratch or
running npm update will bring it up to spec.
For anything that's installable (that is, a package folder, tarball,
tarball url, name@tag, name@version, name, or scoped name), this
command will fetch it to the cache, and then copy the tarball to the
If the same package is specified multiple times, then the file will be
overwritten the second time.
If no arguments are supplied, then npm packs the current package folder.

The --dry-run argument will do everything that pack usually does without
actually packing anything. Reports on what would have gone into the tarball.
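A quick sketch of that usage:

```shell
# Preview the tarball contents without writing anything to disk...
npm pack --dry-run
# ...then create the <name>-<version>.tgz file in the current directory.
npm pack
```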
npm profile set <property> <value>:
Set the value of a profile property. You can set the following properties this way:
email, fullname, homepage, freenode, twitter, github
This command removes "extraneous" packages. If a package name is
provided, then only packages matching one of the supplied names are
removed.
Extraneous packages are packages that are not listed on the parent
package's dependencies list.
If the --production flag is specified or the NODE_ENV environment
variable is set to production, this command will remove the packages
specified in your devDependencies. Setting --no-production will
negate NODE_ENV being set to production.

If the --dry-run flag is used then no changes will actually be made.

If the --json flag is used then the changes npm prune made (or would
have made with --dry-run) are printed as a JSON object.

In normal operation with package-locks enabled, extraneous modules are
pruned automatically when modules are installed and you'll only need
this command with the --production flag.

If you've disabled package-locks then extraneous modules will not be removed
and it's up to you to run npm prune from time-to-time to remove them.
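A sketch combining these flags, run from inside a project directory:

```shell
# Preview, as JSON, what would be pruned without touching node_modules...
npm prune --production --dry-run --json
# ...then actually remove extraneous modules and devDependencies.
npm prune --production
```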
npm publish [<tarball>|<folder>] [--tag <tag>] [--access <public|restricted>] [--otp otpcode] [--dry-run]
Publishes '.' if no argument supplied
Sets tag 'latest' if no --tag specified

DESCRIPTION
Publishes a package to the registry so that it can be installed by name. All
files in the package directory are included if no local .gitignore or
.npmignore file exists. If both files exist and a file is ignored by
.gitignore but not by .npmignore then it will be included.
then you can provide a code from your authenticator with this. If you
don't include this and you're running from a TTY then you'll be prompted.

[--dry-run]
Does everything publish would do except actually publishing to the registry.
Reports the details of what would have been published.

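For example:

```shell
# Report what would have been published without uploading anything.
npm publish --dry-run
```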
Fails if the package name and version combination already exists in
the specified registry.
As of npm@5, both a sha1sum and an integrity field with a sha512sum of the
tarball will be submitted to the registry during publication. Subsequent
installs will use the strongest supported algorithm to verify downloads.

Similar to --dry-run, see npm-pack(1), which figures out the files to be
included and packs them into a tarball to be uploaded to the registry.
This command runs the npm build command on the matched folders. This is useful
when you install a new version of node, and must recompile all your C++ addons with
the new binary.
This command tries to guess at the likely location of a package's
repository URL, and then tries to open it using the --browser
config param. If no package name is provided, it will search for a
package.json in the current folder and use the name property.
npm run-script <command> [--silent] [-- <args>...]
alias: npm run

DESCRIPTION
This runs an arbitrary command from a package's "scripts" object. If no
"command" is provided, it will list the available scripts. run[-script] is
used by the test, start, restart, and stop commands, but can be called
directly, as well. When the scripts in the package are printed out, they're
separated into lifecycle (test, start, restart) and directly-run scripts.
As of npm@2.0.0, you can
use custom arguments when executing scripts. The special option -- is used by
getopt to delimit the end of the options. npm will pass
all the arguments after the -- directly to your script:

npm run test -- --grep="pattern"

The arguments will only be passed to the script specified after npm run
and not to any pre or post script.
The env script is a special built-in command that can be used to list
environment variables that will be available to the script at runtime. If an
"env" command is defined in your package, it will take precedence over the
built-in.
locally-installed dependencies can be used without the node_modules/.bin
prefix. For example, if there is a devDependency on tap in your package,
you should write:

"scripts": {"test": "tap test/*.js"}

instead of

"scripts": {"test": "node_modules/.bin/tap test/*.js"}

The actual shell your script is run within is platform dependent. By default,
on Unix-like systems it is the /bin/sh command, on Windows it is the cmd.exe.
The actual shell referred to by /bin/sh also depends on the system.
If you try to run a script without having a node_modules directory and it fails,
you will be given a warning to run npm install, just in case you've forgotten.
You can use the --silent flag to prevent showing npm ERR! output on error.

You can use the --if-present flag to avoid exiting with a non-zero exit code
when the script is undefined. This lets you run potentially undefined scripts
without breaking the execution chain.
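For example, in a CI script that runs an optional step ("lint" is an illustrative script name, not one npm defines):

```shell
# Exits 0 even if the package has no "lint" script defined.
npm run lint --if-present
```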
Search the registry for packages matching the search terms. npm search
performs a linear, incremental, lexically-ordered search through package
metadata for all files in the registry. If color is enabled, it will further
highlight the matches in the results.
This command repurposes package-lock.json into a publishable
npm-shrinkwrap.json or simply creates a new one. The file created and updated
by this command will then take precedence over any other existing or future
package-lock.json files.
This runs an arbitrary command specified in the package's "start" property of
its "scripts" object. If no "start" property is specified on the
"scripts" object, it will run node server.js.
npm token create [--read-only] [--cidr=<cidr-ranges>]:
Create a new authentication token. It can be --read-only or accept a list of
CIDR ranges to limit use of this token to.
npm token revoke <token|id>:
This removes an authentication token, making it immediately unusable. This can accept
both complete tokens (as you get back from npm token create and will
find in your .npmrc) and ids as seen in the npm token list output.
This will NOT accept the truncated token found in npm token list output.
It is generally considered bad behavior to remove versions of a library
that others are depending on!
Consider using the deprecate command instead, if your intent is to encourage
users to upgrade.

DESCRIPTION
If no version is specified, or if all versions are removed then
the root package entry is removed from the registry entirely.
Even if a package version is unpublished, that specific name and
version combination can never be reused. In order to publish the
package again, a new version number must be used. Additionally,
new versions of packages with every version unpublished may not
be republished until 24 hours have passed.
With the default registry (registry.npmjs.org), unpublish is
only allowed with versions published in the last 72 hours. If you
are trying to unpublish a version published longer ago than that,
contact support@npmjs.com.
This command will update all the packages listed to the latest version
(specified by the tag config), respecting semver.
It will also install missing packages. As with all commands that install
packages, the --dev flag will cause devDependencies to be processed as well.
If no package name is specified, all packages in the specified location (global
or local) will be updated.

As of npm@2.6.1, npm update will only inspect top-level packages.
Prior versions of npm would also recursively inspect all dependencies.
To get the old behavior, use npm --depth 9999 update.

As of npm@5.0.0, npm update will change package.json to save the
new version as the minimum required dependency. To get the old behavior,
use npm update --no-save.
EXAMPLES

IMPORTANT VERSION NOTE: these examples assume npm@2.6.1 or later. For
older versions of npm, you must specify --depth 0 to get the behavior
described below.
For the examples below, assume that the current package is app and it depends
on dependencies, dep1 (dep2, .. etc.). The published versions of dep1 are:
"0.4.0",
"0.2.0"
]
}

Caret Dependencies
If app's package.json contains:
"dependencies": {
  "dep1": "^1.1.1"
}

Then npm update will install dep1@1.2.2, because 1.2.2 is latest and
1.2.2 satisfies ^1.1.1.
Tilde Dependencies
However, if app's package.json contains:
"dependencies": {
"dep1": "~1.1.1"
}

In this case, running npm update will install dep1@1.1.2. Even though the latest
tag points to 1.2.2, this version does not satisfy ~1.1.1, which is equivalent
to >=1.1.1 <1.2.0. So the highest-sorting version that satisfies ~1.1.1 is used,
which is 1.1.2.
Caret Dependencies below 1.0.0
Suppose app has a caret dependency on a version below 1.0.0, for example:
"dependencies": {
"dep1": "^0.2.0"
}

npm update will install dep1@0.2.0, because there are no other
versions which satisfy ^0.2.0.
If the dependency were on ^0.4.0:
"dependencies": {
"dep1": "^0.4.0"
}

Then npm update will install dep1@0.4.1, because that is the highest-sorting
version that satisfies ^0.4.0 (>=0.4.0 <0.5.0).
Updating Globally-Installed Packages
npm update -g will apply the update action to each globally installed
package that is outdated -- that is, has a version that is different from
latest.

npm version [<newversion> | major | minor | patch | premajor | preminor | prepatch | prerelease [--preid=<prerelease-id>] | from-git]
'npm [-v | --version]' to print npm version
'npm view <pkg> version' to view a package's published version
'npm ls' to inspect current package/dependency versions

DESCRIPTION
Run this in a package directory to bump the version and write the new
data back to package.json, package-lock.json, and, if present,
npm-shrinkwrap.json.
The newversion argument should be a valid semver string, a
valid second argument to semver.inc (one of patch, minor, major,
prepatch, preminor, premajor, prerelease), or from-git. In the second case,
the existing version will be incremented by 1 in the specified field.
If supplied with the -m or --message config option, npm will
use it as a commit message when creating a version commit. If the
message config contains %s then that will be replaced with the
resulting version number. For example:

npm version patch -m "Upgrade to %s for reasons"

If the sign-git-tag config is set, then the tag will be signed using
the -s flag to git. Note that you must have a default GPG key set up
in your git config for this to work properly. For example:
$ npm config set sign-git-tag true
user: "isaacs (http://blog.izs.me/) <i@izs.me>"
2048-bit RSA key, ID 6C481CF6, created 2010-08-31
Enter passphrase:

If preversion, version, or postversion are in the scripts property of
the package.json, they will be executed as part of running npm version.
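For example, a scripts block along these lines (a sketch matching the workflow described next; the build command and the dist and build/temp directories are illustrative):

```json
"scripts": {
  "preversion": "npm test",
  "version": "npm run build && git add -A dist",
  "postversion": "git push && git push --tags && rm -rf build/temp"
}
```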
This runs all your tests, and proceeds only if they pass. Then runs your build script, and
adds everything in the dist directory to the commit. After the commit, it pushes the new commit
and tag up to the server, and deletes the build/temp directory.
CONFIGURATION
allow-same-version
Default: false
Type: Boolean

Prevents throwing an error when npm version is used to set the new version
to the same value as the current version.
npm view [<@scope>/]<name>[@<version>] [<field>[.<subfield>]...]
aliases: info, show, v

DESCRIPTION
This command shows data about a package and prints it to the stream
referenced by the outfd config, which defaults to stdout.
To show the package registry entry for the connect package, you can do
this:

npm view connect

The default version is "latest" if unspecified.
Field names can be specified after the package descriptor.
For example, to show the dependencies of the ronn package at version
0.3.5, you could do the following:

npm view ronn@0.3.5 dependencies

You can view child fields by separating them with a period.
To view the git repository URL for the latest version of npm, you could
do this:

npm view npm repository.url

This makes it easy to view information about a dependency with a bit of
shell scripting. For example, to view all the data about the version of
opts that ronn depends on, you can do this:

npm view opts@$(npm view ronn dependencies.opts)

For fields that are arrays, requesting a non-numeric field will return
all of the values from the objects in the list. For example, to get all
the contributor email addresses for the express package, you can do this:

npm view express contributors.email

You may also use numeric indices in square braces to specifically select
an item in an array field. To just get the email address of the first
contributor in the list, you can do this:

npm view express contributors[0].email

Multiple fields may be specified, and will be printed one after another.
For example, to get all the contributor names and email addresses, you
can do this:

npm view express contributors.name contributors.email
"Person" fields are shown as a string if they would be shown as an
object. So, for example, this will show the list of npm contributors in
the shortened string format. (See package.json(5) for more on this.)

npm view npm contributors

If a version range is provided, then data will be printed for every
If a version range is provided, then data will be printed for every
matching version of the package. This will show which version of jsdom
was required by each matching version of yui3:

npm view yui3@'>0.5.4' dependencies.jsdom

To show the connect package version history, you can do
this:

npm view connect versions

OUTPUT
If only a single string field for a single version is output, then it
will not be colorized or quoted, so as to enable piping the output to
another command. If the field is an object, it will be output as a JavaScript object literal.
npm is the package manager for the Node JavaScript platform. It puts
modules in place so that node can find them, and manages dependency
conflicts intelligently.
Most commonly, it is used to publish, discover, install, and develop node
programs.
You can configure npm to use any compatible registry you like, and even run
your own registry. Use of someone else's registry may be governed by their
terms of use.
INTRODUCTION
You probably got npm because you want to install stuff.
Use npm install blerg to install the latest version of "blerg". Check out
npm-install(1) for more info. It can do a lot of stuff.
global mode:
npm installs packages into the install prefix at
prefix/lib/node_modules and bins are installed in prefix/bin.

local mode:
npm installs packages into the current project directory, which
defaults to the current working directory. Packages are installed to
./node_modules, and bins are installed to ./node_modules/.bin.
CONFIGURATION
npm is extremely configurable. It reads its configuration options from
5 places.

Command line switches:
Set a config with --key val. All keys take a value, even if they
are booleans (the config parser doesn't know what the options are at
the time of parsing). If no value is provided, then the option is set
to boolean true.

Environment Variables:
Set any config by prefixing the name in an environment variable with
npm_config_. For example, export npm_config_key=val.

User Configs:
The file at $HOME/.npmrc is an ini-formatted list of configs. If
present, it is parsed. If the userconfig option is set in the cli
or env, then that will be used instead.

Global Configs:
The file found at ../etc/npmrc (from the node executable, by default
this resolves to /usr/local/etc/npmrc) will be parsed if it is found.
If the globalconfig option is set in the cli, env, or user config,
then that file is parsed instead.

Defaults:
npm's default configuration options are defined in
lib/utils/config-defs.js. These must not be changed.
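For instance, a user-level config file might look like this (a sketch; the loglevel and prefix values are illustrative, and the prefix line mirrors the npmrc example later in this document):

```ini
; $HOME/.npmrc -- user-level configs, one key = value per line
loglevel = warn
prefix = ${HOME}/.npm-packages
```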
If you would like to contribute, but don't know what to work on, read
the contributing guidelines and check the issues list.
Be sure to follow the template and bug reporting guidelines. You can also ask
for help in the support forum if you're
unsure if it's actually a bug or are having trouble coming up with a detailed
reproduction to report.
 `-- quux (3.2.0) <---[E]

Since foo depends directly on bar@1.2.3 and baz@1.2.3, those are
installed in foo's node_modules folder.
Even though the latest copy of blerg is 1.3.7, foo has a specific
dependency on version 1.2.5. So, that gets installed at [A]. Since the
parent installation of blerg satisfies bar's dependency on blerg@1.x,
it does not install another copy under [B].
Bar [B] also has dependencies on baz and asdf, so those are installed in
bar's node_modules folder. Because it depends on baz@2.x, it cannot
re-use the baz@1.2.3 installed in the parent node_modules folder [D],
and must install its own copy [C].
Underneath bar, the baz -> quux -> bar dependency creates a cycle.
However, because bar is already in quux's ancestry [B], it does not
unpack another copy of bar into that folder.
A lot of the behavior described in this document is affected by the config
settings described in npm-config(7).
name

If you plan to publish your package, the most important things in your
package.json are the name and version fields as they will be required. The name
and version together form an identifier that is assumed to be completely unique.
Changes to the package should come along with changes to the version. If you don't
plan to publish your package, the name and version fields are optional.
The name is what your thing is called.
Some rules:
A name can be optionally prefixed by a scope, e.g. @myorg/mypackage. See
npm-scope(7) for more detail.
version

If you plan to publish your package, the most important things in your
package.json are the name and version fields as they will be required. The name
and version together form an identifier that is assumed to be completely unique.
Changes to the package should come along with changes to the version. If you don't
plan to publish your package, the name and version fields are optional.
Version must be parseable by
node-semver, which is bundled
with npm as a dependency. (npm install semver to use it yourself.)
keywords
Put keywords in it. It's an array of strings. This helps people
discover your package as it's listed in npm search.
bugs
The url to your project's issue tracker and / or the email address to which
issues should be reported. These are helpful for people who encounter issues
with your package.
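For example (a sketch; the url and email values are placeholders):

```json
{
  "bugs": {
    "url": "https://github.com/owner/project/issues",
    "email": "project@hostname.com"
  }
}
```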
You can specify either one or both values. If you want to provide only a url,
you can specify the value for "bugs" as a simple string instead of an object.
If a url is provided, it will be used by the npm bugs command.
license
You should specify a license for your package so that people know how they are
permitted to use it, and any restrictions you're placing on it.
If you're using a common license such as BSD-2-Clause or MIT, add a
current SPDX license identifier for the license you're using, like this:

{ "license" : "MIT" }

If your package is licensed under multiple common licenses, use an SPDX
license expression, like this:

{ "license" : "(MIT OR Apache-2.0)" }

Finally, if you do not wish to grant others the right to use a private or
unpublished package under any terms:

{ "license" : "UNLICENSED" }

Consider also setting "private": true to prevent accidental publication.
people fields: author, contributors
The "author" is one person. "contributors" is an array of people. A "person"
is an object with a "name" field and optionally "url" and "email", like this:
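For example (the name, email, and url values are placeholders; email and url are optional):

```json
{
  "name": "Barney Rubble",
  "email": "b@rubble.com",
  "url": "http://barnyrubble.tumblr.com/"
}
```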
npm also sets a top-level "maintainers" field with your npm user info.
files

The optional files field is an array of file patterns that describes
the entries to be included when your package is installed as a
dependency. File patterns follow a similar syntax to .gitignore, but
reversed: including a file, directory, or glob pattern (*, **/*, and such)
will make it so that file is included in the tarball when it's packed. Omitting
the field will make it default to ["*"], which means it will include all files.

Some special files and directories are also included or excluded regardless of
whether they exist in the files array (see below).
You can also provide a .npmignore file in the root of your package or
in subdirectories, which will keep files from being included. At the
root of your package it will not override the "files" field, but in
main
This should be a module ID relative to the root of your package folder.
For most modules, it makes the most sense to have a main script and often not
much else.

browser

If your module is meant to be used client-side, the browser field should be
used instead of the main field. This is helpful to hint users that it might
rely on primitives that aren't available in Node.js modules (e.g. window).
bin
A lot of packages have one or more executable files that they'd like to
install into the PATH. npm makes this pretty easy (in fact, it uses this
feature to install the "npm" executable.)
To use this, supply a bin field in your package.json which is a map of
command name to local file name. On install, npm will symlink that file into
prefix/bin for global installs, or ./node_modules/.bin/ for local
installs.
For example, myapp could have this:

{ "bin" : { "myapp" : "./cli.js" } }

So, when you install myapp, it'll create a symlink from the cli.js script to
/usr/local/bin/myapp.
If you have a single executable, and its name should be the name
of the package, then you can just supply it as a string. For example:
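A sketch of the single-string form (the package name and path are placeholders; it behaves as if bin were a map from the package name to that path):

```json
{
  "name": "my-program",
  "version": "1.2.5",
  "bin": "./path/to/program"
}
```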
Man files must end with a number, and optionally a .gz suffix if they are
compressed. The number dictates which man section the file is installed into.
"repository": {
  "type" : "svn",
  "url" : "https://v8.googlecode.com/svn/trunk/"
}

The URL should be a publicly available (perhaps read-only) url that can be handed
directly to a VCS program without any modification. It should not be a url to an
html project page that you put in your browser. It's for computers.
For GitHub, GitHub gist, Bitbucket, or GitLab repositories you can use the same
The "scripts" property is a dictionary containing script commands that are run
at various times in the lifecycle of your package. The key is the lifecycle
event, and the value is the command to run at that point.
config
A "config" object can be used to set configuration parameters used in package
scripts that persist across upgrades. For instance, if a package had the
following:

{ "name" : "foo", "config" : { "port" : "8080" } }

and then had a "start" command that then referenced the
npm_package_config_port environment variable, then the user could
override that by doing npm config set foo:port 8001.
<protocol> is one of git, git+ssh, git+http, git+https, or
git+file.
If #<commit-ish> is provided, it will be used to clone exactly that
commit. If the commit-ish has the format #semver:<semver>, <semver> can
be any valid semver range or exact version, and npm will look for any tags
or refs matching that range in the remote repository, much as it would for a
registry dependency. If neither #<commit-ish> or #semver:<semver> is
specified, then master is used.
As of version 1.1.65, you can refer to GitHub urls as just "foo":
"user/foo-project". Just as with git URLs, a commit-ish suffix can be
included. For example:
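A sketch of the GitHub shorthand (the dependencies shown are illustrative; the second entry pins a specific commit via the commit-ish suffix):

```json
{
  "name": "foo",
  "version": "0.0.0",
  "dependencies": {
    "express": "expressjs/express",
    "mocha": "mochajs/mocha#4727d357ea"
  }
}
```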
As of version 2.0.0 you can provide a path to a local directory that contains a
package. Local paths can be saved using npm install -S or
npm install --save, using any of these forms:
../foo/bar
~/foo/bar
./foo/bar
/foo/bar

in which case they will be normalized to a relative path and added to your
package.json. For example:
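A sketch of the normalized result (the package names and the path are placeholders):

```json
{
  "name": "baz",
  "dependencies": {
    "bar": "file:../foo/bar"
  }
}
```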
This feature is helpful for local offline development and creating
tests that require npm installing where you don't want to hit an
external server, but should not be used when publishing packages
to the public registry.
The prepare script will be run before publishing, so that users
can consume the functionality without requiring them to compile it
themselves. In dev mode (ie, locally running npm install), it'll
run this script as well, so that you can test it easily.
peerDependencies
"peerDependencies": {
"tea": "2.x"
}
}

This ensures your package tea-latte can be installed along with the second
major version of the host package tea only. npm install tea-latte could
possibly yield the following dependency graph:
├── tea-latte@1.3.5
└── tea@2.2.0

NOTE: npm versions 1 and 2 will automatically install peerDependencies if
they are not explicitly depended upon higher in the dependency tree. In the
next major version of npm (npm@3), this will no longer be the case. You will
receive a warning that the peerDependency is not installed instead. The
Trying to install another plugin with a conflicting requirement will cause an
error. For this reason, make sure your plugin requirement is as broad as
possible, and not to lock it down to specific patch versions.

Assuming the host complies with semver, only changes in
the host package's major version will break your plugin. Thus, if you've worked
with every 1.x version of the host package, use "^1.0" or "1.x" to express
this. If you depend on features introduced in 1.5.2, use ">= 1.5.2 < 2".
bundledDependencies
For example, if we define a package.json like this:

{
  "name": "awesome-web-framework",
  "version": "1.0.0",
  "bundledDependencies": ["renderized", "super-streams"]
}

we can obtain awesome-web-framework-1.0.0.tgz file by running npm pack.
This file contains the dependencies renderized and super-streams which
can be installed in a new project by executing npm install
awesome-web-framework-1.0.0.tgz.
optionalDependencies
if (foo) {
foo.doFooThings()
}

Entries in optionalDependencies will override entries of the same name in
dependencies, so it's usually best to only put in one place.
engines
You can specify the version of node that your stuff works on:

{ "engines" : { "node" : ">=0.10.3 <0.12" } }

And, like with dependencies, if you don't specify the version (or if you
specify "*" as the version), then any version of node will do.
If you specify an "engines" field, then npm will require that "node" be
somewhere on that list. If "engines" is omitted, then npm will just assume
that it works on node.
You can also use the "engines" field to specify which versions of npm
are capable of properly installing your program. For example:

{ "engines" : { "npm" : "~1.0.20" } }

Unless the user has set the engine-strict config flag, this
field is advisory only and will only produce warnings when your package is installed as a dependency.
engineStrict
This feature was removed in npm 3.0.0
os
You can specify which operating systems your
module will run on:

"os" : [ "darwin", "linux" ]

You can also blacklist instead of whitelist operating systems,
just prepend the blacklisted os with a '!':

"os" : [ "!win32" ]

The host operating system is determined by process.platform.
It is allowed to both blacklist, and whitelist, although there isn't any
good reason to do this.
cpu
If your code only runs on certain cpu architectures,
you can specify which ones.

"cpu" : [ "x64", "ia32" ]

Like the os option, you can also blacklist architectures:

"cpu" : [ "!arm", "!mips" ]

The host architecture is determined by process.arch.
preferGlobal
DEPRECATED
This option used to trigger an npm warning, but it will no longer warn. It is
publishConfig
This is a set of config values that will be used at publish-time. It's
especially handy if you want to set the tag, registry or access, so that
you can ensure that a given package is not tagged with "latest", published
to the global public registry or that a scoped module is private by default.

Any config values can be overridden, but only "tag", "registry" and "access"
probably matter for the purposes of publishing.
See npm-config(7) to see the list of config options that can be
overridden.
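The package-lock discussion that follows assumes three published packages along these lines (a sketch; the names and version ranges are illustrative, showing A's, B's, and C's package.json respectively):

```json
{ "name": "A", "version": "0.1.0", "dependencies": { "B": "<0.1.0" } }
{ "name": "B", "version": "0.0.1", "dependencies": { "C": "<0.1.0" } }
{ "name": "C", "version": "0.0.1" }
```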
If these are the only versions of A, B, and C available in the
registry, then a normal npm install A will install:
A@0.1.0
`-- B@0.0.1
 `-- C@0.0.1

However, if B@0.0.2 is published, then a fresh npm install A will
install:
A@0.1.0
`-- B@0.0.2
 `-- C@0.0.1

assuming the new version did not modify B's dependencies. Of course,
the new version of B could include a new version of C and any number
of new dependencies. If such changes are undesirable, the author of A
could specify a dependency on B@0.0.1. However, if A's author and B's
author are not the same person, there's no way for A's author to say
that he or she does not want to pull in newly published versions of C
when B hasn't changed at all.
}
}
}
}

This file describes an exact, and more importantly reproducible
node_modules tree. Once it's present, any future installation will base its
work off this file, instead of recalculating dependency versions off
package.json(5).
Using a locked package is no different than using any package without a package
lock: any commands that update node_modules and/or package.json's
dependencies will automatically sync the existing lockfile. This includes npm
on. Additionally, the diffs from these changes are human-readable and will
inform you of any changes npm has made to your node_modules, so you can notice
if any transitive dependencies were updated, hoisted, etc.

Resolving lockfile conflicts

Occasionally, two separate npm install runs will create package locks that
cause merge conflicts in source control systems. As of npm@5.7.0, these
conflicts can be resolved by manually fixing any package.json conflicts, and
then running npm install [--package-lock-only] again. npm will automatically
resolve any conflicts for you and write a merged package lock that includes
all the dependencies from both branches in a reasonable tree. If
--package-lock-only is provided, it will do this without also modifying your
local node_modules/.

To make this process seamless on git, consider installing
npm-merge-driver, which will teach git how
to do this itself without any user interaction. In short: $ npx
npm-merge-driver install -g will let you do this, and even works with
pre-npm@5.7.0 versions of npm 5, albeit a bit more noisily. Note that if
package.json itself conflicts, you will have to resolve that by hand and run
npm install manually, even with the merge driver.
All npm config files are an ini-formatted list of key = value
parameters. Environment variables can be replaced using
${VARIABLE_NAME}. For example:

prefix = ${HOME}/.npm-packages

Each of these files is loaded, and config options are resolved in
priority order. For example, a setting in the userconfig file would
override the setting in the globalconfig file.
Array values are specified by adding "[]" after the key name. For
example:
key[] = "first value"
key[] = "second value"

Comments
Lines in .npmrc files are interpreted as comments when they begin with a ; or # character. .npmrc files are parsed by npm/ini, which specifies this comment syntax.
For example:
# last modified: 01 Jan 2016
; Set a new registry for a scoped package
-@myscope:registry=https://mycustomregistry.example.org
-
When working locally in a project, a .npmrc file in the root of the
project (ie, a sibling of node_modules and package.json) will set
config values specific to this project.
transitive dependency of a non-optional dependency of the top level.
All optional dependencies should be included even if they're uninstallable
on the current platform.

requires

This is a mapping of module name to version. This is a list of everything this module requires, regardless of where it will be installed. The version should match, via normal matching rules, a dependency either in our dependencies or in a level higher than us.
dependencies
The dependencies of this dependency, exactly as at the top level.
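A hypothetical lockfile fragment (module names invented, and real entries also carry resolved and integrity fields) illustrating the two fields: requires records what the module asks for, while a nested dependencies object records what was actually installed beneath it:

```json
"some-dep": {
  "version": "1.0.1",
  "requires": {
    "tiny-lib": "^2.0.0"
  },
  "dependencies": {
    "tiny-lib": {
      "version": "2.3.0"
    }
  }
}
```

Here tiny-lib@2.3.0 satisfies the ^2.0.0 requirement via normal semver matching.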
A lot of the behavior described in this document is affected by the config
settings described in npm-config(7).
name

If you plan to publish your package, the most important things in your package.json are the name and version fields as they will be required. The name and version together form an identifier that is assumed to be completely unique. Changes to the package should come along with changes to the version. If you don't plan to publish your package, the name and version fields are optional.
The name is what your thing is called.
Some rules:
A name can be optionally prefixed by a scope, e.g. @myorg/mypackage. See
npm-scope(7) for more detail.
version

If you plan to publish your package, the most important things in your package.json are the name and version fields as they will be required. The name and version together form an identifier that is assumed to be completely unique. Changes to the package should come along with changes to the version. If you don't plan to publish your package, the name and version fields are optional.
Version must be parseable by
node-semver, which is bundled
with npm as a dependency. (npm install semver to use it yourself.)
keywords
discover your package as it's listed in npm search.
The url to your project's issue tracker and / or the email address to which
issues should be reported. These are helpful for people who encounter issues
with your package.
You can specify either one or both values. If you want to provide only a url,
you can specify the value for "bugs" as a simple string instead of an object.
If a url is provided, it will be used by the npm bugs command.
license
permitted to use it, and any restrictions you're placing on it.
If you're using a common license such as BSD-2-Clause or MIT, add a
current SPDX license identifier for the license you're using. Multiple
licenses can be combined into an SPDX license expression, like this:

{ "license": "(MIT OR Apache-2.0)" }

Finally, if you do not wish to grant others the right to use a private or
unpublished package under any terms:

{ "license": "UNLICENSED" }

Consider also setting "private": true to prevent accidental publication.
people fields: author, contributors
The "author" is one person. "contributors" is an array of people. A "person"
is an object with a "name" field and optionally "url" and "email", like this:
npm also sets a top-level "maintainers" field with your npm user info.
files

The optional files field is an array of file patterns that describes
the entries to be included when your package is installed as a
dependency. File patterns follow a similar syntax to .gitignore, but
reversed: including a file, directory, or glob pattern (*, **/*, and such)
will make it so that that file is included in the tarball when it's packed.
Omitting the field will make it default to ["*"], which means it will
include all files.

Some special files and directories are also included or excluded regardless of
whether they exist in the files array (see below).
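For instance, a package that ships only its compiled output might use something like this (paths hypothetical):

```json
"files": [
  "lib/",
  "index.js"
]
```

Naming the lib/ folder pulls in everything inside it, unless another rule excludes a particular file.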
You can also provide a .npmignore file in the root of your package or
in subdirectories, which will keep files from being included. At the
root of your package it will not override the "files" field, but in
main
This should be a module ID relative to the root of your package folder.
For most modules, it makes the most sense to have a main script and often not
much else.

browser

If your module is meant to be used client-side, the browser field should be
used instead of the main field. This is helpful to hint to users that it might
rely on primitives that aren't available in Node.js modules (e.g. window).
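A minimal sketch (file names hypothetical):

```json
{
  "main": "./index.js",
  "browser": "./browser.js"
}
```

Tools that understand the field will resolve the package to ./browser.js when building for the browser, and to ./index.js otherwise.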
bin
A lot of packages have one or more executable files that they'd like to
install into the PATH. npm makes this pretty easy (in fact, it uses this
prefix/bin for global installs, or ./node_modules/.bin/ for local
installs.
For example, myapp could have this:

{ "bin" : { "myapp" : "./cli.js" } }

So, when you install myapp, it'll create a symlink from the cli.js script to
/usr/local/bin/myapp.
If you have a single executable, and its name should be the name
of the package, then you can just supply it as a string. For example:
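A sketch of the string form, assuming a hypothetical package named my-program:

```json
{ "name": "my-program", "version": "1.2.5", "bin": "./path/to/program" }
```

This behaves the same as spelling it out as { "bin": { "my-program": "./path/to/program" } }.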
Man files must end with a number, and optionally a .gz suffix if they are
compressed. The number dictates which man section the file is installed into.
"repository": {
  "type" : "svn",
  "url" : "https://v8.googlecode.com/svn/trunk/"
}

The URL should be a publicly available (perhaps read-only) url that can be handed
directly to a VCS program without any modification. It should not be a url to an
html project page that you put in your browser. It's for computers.
For GitHub, GitHub gist, Bitbucket, or GitLab repositories you can use the same
The "scripts" property is a dictionary containing script commands that are run
at various times in the lifecycle of your package. The key is the lifecycle
event, and the value is the command to run at that point.
config
scripts that persist across upgrades. For instance, if a package had the
following:
{ "name" : "foo", "config" : { "port" : "8080" } }

and then had a "start" command that then referenced the
npm_package_config_port environment variable, then the user could
override that by doing npm config set foo:port 8001.
<protocol> is one of git, git+ssh, git+http, git+https, or
git+file.
If #<commit-ish> is provided, it will be used to clone exactly that
commit. If the commit-ish has the format #semver:<semver>, <semver> can
Git URLs as Dependencies
registry dependency. If neither #<commit-ish> or #semver:<semver> is
specified, then master is used.
As of version 1.1.65, you can refer to GitHub urls as just "foo":
"user/foo-project". Just as with git URLs, a commit-ish suffix can be
included. For example:
As of version 2.0.0 you can provide a path to a local directory that contains a
package. Local paths can be saved using npm install -S or
npm install --save, using any of these forms:
../foo/bar
~/foo/bar
./foo/bar
/foo/bar

in which case they will be normalized to a relative path and added to your
package.json. For example:
"dependencies": {
  "bar": "file:../foo/bar"
}

This feature is helpful for local offline development and creating
tests that require npm installing where you don't want to hit an
external server, but should not be used when publishing packages
to the public registry.
The prepare script will be run before publishing, so that users
can consume the functionality without requiring them to compile it
themselves. In dev mode (ie, locally running npm install), it'll
run this script as well, so that you can test it easily.
peerDependencies
{
  "name": "tea-latte",
  "version": "1.3.5",
  "peerDependencies": {
    "tea": "2.x"
  }
}

This ensures your package tea-latte can be installed along with the second
major version of the host package tea only. npm install tea-latte could
possibly yield the following dependency graph:
├── tea-latte@1.3.5
└── tea@2.2.0

NOTE: npm versions 1 and 2 will automatically install peerDependencies if
they are not explicitly depended upon higher in the dependency tree. In the
next major version of npm (npm@3), this will no longer be the case. You will
receive a warning that the peerDependency is not installed instead. The
Trying to install another plugin with a conflicting requirement will cause an
error. For this reason, make sure your plugin requirement is as broad as
possible, and not to lock it down to specific patch versions.

Assuming the host complies with semver, only changes in
the host package's major version will break your plugin. Thus, if you've worked
with every 1.x version of the host package, use "^1.0" or "1.x" to express
this. If you depend on features introduced in 1.5.2, use ">= 1.5.2 < 2".
we can obtain awesome-web-framework-1.0.0.tgz file by running npm pack.
This file contains the dependencies renderized and super-streams which
can be installed in a new project by executing npm install
awesome-web-framework-1.0.0.tgz.
optionalDependencies
if (foo) {
foo.doFooThings()
}

Entries in optionalDependencies will override entries of the same name in
dependencies, so it's usually best to only put in one place.
engines
You can specify the version of node that your stuff works on:

{ "engines" : { "node" : ">=0.10.3 <0.12" } }

And, like with dependencies, if you don't specify the version (or if you
specify "*" as the version), then any version of node will do.
If you specify an "engines" field, then npm will require that "node" be
somewhere on that list. If "engines" is omitted, then npm will just assume
that it works on node.
You can also use the "engines" field to specify which versions of npm
are capable of properly installing your program. For example:

{ "engines" : { "npm" : "~1.0.20" } }

Unless the user has set the engine-strict config flag, this
field is advisory only and will only produce warnings when your package is installed as a dependency.
engineStrict
This feature was removed in npm 3.0.0
os
You can specify which operating systems your
module will run on:

"os" : [ "darwin", "linux" ]

You can also blacklist instead of whitelist operating systems,
just prepend the blacklisted os with a '!':

"os" : [ "!win32" ]

The host operating system is determined by process.platform
It is allowed to both blacklist, and whitelist, although there isn't any
good reason to do this.
cpu
If your code only runs on certain cpu architectures,
you can specify which ones.

"cpu" : [ "x64", "ia32" ]

Like the os option, you can also blacklist architectures:

"cpu" : [ "!arm", "!mips" ]

The host architecture is determined by process.arch
preferGlobal
DEPRECATED
This option used to trigger an npm warning, but it will no longer warn. It is
publishConfig
especially handy if you want to set the tag, registry or access, so that
you can ensure that a given package is not tagged with "latest", published
to the global public registry or that a scoped module is private by default.

Any config values can be overridden, but only "tag", "registry" and "access"
probably matter for the purposes of publishing.
See npm-config(7) to see the list of config options that can be
overridden.
If multiple single-character shorthands are strung together, and the
resulting combination is unambiguously not some other configuration
param, then it is expanded to its various component pieces. For
example:
npm ls -gpld
# same as:
npm ls --global --parseable --long --loglevel info

Per-Package Config Settings
When running scripts (see npm-scripts(7)) the package.json "config"
keys are overwritten in the environment if there is a config param of
<name>[@<version>]:<key>. For example, if the package.json has
this:
When "dev" or "development" and running local npm shrinkwrap,
npm outdated, or npm update, is an alias for --dev.

audit

Default: true
Type: Boolean

When "true" submit audit reports alongside npm install runs to the default
registry and all registries configured for scopes. See the documentation
for npm-audit(1) for details on what is submitted.

audit-level

Default: "low"
Type: 'low', 'moderate', 'high', 'critical'

The minimum level of vulnerability for npm audit to exit with
a non-zero exit code.
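For example, a CI setup that only wants npm audit to fail on moderate or worse findings might put this in its .npmrc (a sketch; like any config, the same value can also be passed on the command line):

```ini
audit-level=moderate
```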
auth-type
Default: 'legacy'
ca
The Certificate Authority signing certificate that is trusted for SSL
connections to the registry. Values should be in PEM format (Windows calls it "Base-64 encoded X.509 (.CER)") with newlines
replaced by the string "\n". For example:
Set to null to only allow "known" registrars, or to a specific CA cert
to trust only that specific signing authority.
Multiple CAs can be trusted by specifying an array of certificates:
ca[]="..."
ca[]="..."

See also the strict-ssl config.
cafile
Default: null
cert
A client certificate to pass when accessing the registry. Values should be in
PEM format (Windows calls it "Base-64 encoded X.509 (.CER)") with newlines replaced by the string "\n". For example:
It is not the path to a certificate file (and there is no "certfile" option).
cidr
Default: null
color
If false, never shows colors. If "always" then always shows colors.
If true, then only prints color codes for tty file descriptors.

This option can also be changed using the environment: colors are
disabled when the environment variable NO_COLOR is set to any value.
depth
Default: Infinity
dry-run
Indicates that you don't want npm to make any changes and that it should
only report what it would have done. This can be passed into any of the
commands that modify your local installation, eg, install, update,
dedupe, uninstall. This is NOT currently honored by some network related
commands, eg dist-tags, owner, etc.
editor
Default: EDITOR environment variable if set, or "vi" on Posix,
key
A client key to pass when accessing the registry. Values should be in PEM
format with newlines replaced by the string "\n". For example:
The registry you want to send cli metrics to if send-metrics is true.
node-version
Type: semver or false
The node version to use when checking a package's engines map.

noproxy

Default: null
Type: String or Array

A comma-separated string or an array of domain extensions that a proxy should not be used for.
offline
Default: false
package-lock
If set to false, then ignore package-lock.json files when installing. This
will also prevent writing package-lock.json if save is true.

When package-locks are disabled, automatic pruning of extraneous
modules will also be disabled. To remove extraneous modules with
package-locks disabled use npm prune.
This option is an alias for --shrinkwrap.
package-lock-only
Default: false
Type: Boolean

If set to true, it will update only the package-lock.json,
instead of checking node_modules and downloading dependencies.
parseable
prefix
The location to install global items. If set on the command line, then
it forces non-global commands to run in the specified folder.

preid

Default: ""
Type: String

The "prerelease identifier" to use as a prefix for the "prerelease" part of a
semver. Like the rc in 1.2.0-rc.8.
production
Default: false
rollback
Remove failed installs.
save

Default: true
Type: Boolean
Save installed packages to a package.json file as dependencies.
shrinkwrap
If set to false, then ignore npm-shrinkwrap.json files when installing. This
will also prevent writing npm-shrinkwrap.json if save is true.
This option is an alias for --package-lock.

sign-git-commit

Default: false
Type: Boolean

If set to true, then the npm version command will commit the new package
version using -S to add a signature.

Note that git requires you to have set up GPG keys in your git configs
for this to work properly.
sign-git-tag
Default: false
unsafe-perm
Set to true to suppress the UID/GID switching when running package
scripts. If set explicitly to false, then installing as a non-root user
will fail.

update-notifier

Default: true
Type: Boolean

Set to false to suppress the update notification when using an older
version of npm than the latest.
These are man pages. If you install npm, you should be able to
then do man npm-thing to get the documentation on a particular
topic, or npm help thing to see the same information.

What is a package
A package is:
a) a folder containing a program described by a package.json file
git+https://user@hostname/project/blah.git#commit-ish

The commit-ish can be any tag, sha, or branch which can be supplied as
an argument to git checkout. The default is master.
The package.json File
You need to have a package.json file in the root of your project to do
use the name to specify that it runs on node, or is in JavaScript.
You can use the "engines" field to explicitly state the versions of
node (or whatever else) that your program requires, and it's pretty
well assumed that it's JavaScript.
It does not necessarily need to match your github repository name.
So, node-foo and bar-js are bad names. foo or bar are better.
publish it, but you'll be publishing a broken or pointless package.
So don't do that.
In the root of your package, do this:

npm install . -g

That'll show you that it's working. If you'd rather just create a symlink
package that points to your working directory, then do this:

npm link

Use npm ls -g to see if it's there.
To test a local install, go into some other folder, and then do:
cd ../some-other-folder
npm install ../my-package

to install it locally into the node_modules folder in that other place.
Then go into the node-repl, and try using require("my-thing") to
bring in your module's main module.
Create a User Account
Create a user with the adduser command. It works like this:
After a few weeks, if there's no resolution, we'll sort it out.
Don't squat on package names. Publish code or move out of the way.
DESCRIPTION
some other user wants to use that name. Here are some common ways that happens
(each of these is based on actual events.)

Alice writes a JavaScript module foo, which is not node-specific. Alice
doesn't use node at all. Yusuf wants to use foo in node, so he wraps it in
an npm module. Some time later, Alice starts using node, and wants to take
over management of her program.

Yusuf writes an npm module foo, and publishes it. Perhaps much later, Alice
finds a bug in foo, and fixes it. She sends a pull request to Yusuf, but
Yusuf doesn't have the time to deal with it, because he has a new job and a
new baby and is focused on his new Erlang project, and kind of not involved
with node any more. Alice would like to publish a new foo, but can't,
because the name is taken.

Yusuf writes a 10-line flow-control library, and calls it foo, and
publishes it to the npm registry. Being a simple little thing, it never
really has to be updated. Alice works for Foo Inc, the makers of the
critically acclaimed and widely-marketed foo JavaScript toolkit framework.
They publish it to npm as foojs, but people are routinely confused when
npm install foo is some different thing.

Yusuf writes a parser for the widely-known foo file format, because he
needs it for work. Then, he gets a new job, and never updates the prototype.
Later on, Alice writes a much more complete foo parser, but can't publish,
npm owner ls foo. This will tell Alice the email address of the owner
(Yusuf).

Alice emails Yusuf, explaining the situation as respectfully as possible,
and what she would like to do with the module name. She adds the npm support
staff support@npmjs.com to the CC list of the email. Mention in the email
that Yusuf can run npm owner add alice foo to add Alice as an owner of the
foo package.

After a reasonable amount of time, if Yusuf has not responded, or if Yusuf
and Alice can't come to any sort of resolution, email support
support@npmjs.com and we'll sort it out. ("Reasonable" is usually at least
4 weeks.)

REASONING
In almost every case so far, the parties involved have been able to reach an
EXCEPTIONS
Code of Conduct such as hateful
language, pornographic content, or harassment.

If you see bad behavior like this, please report it to abuse@npmjs.com right
away. You are never expected to resolve abusive behavior on your own. We are
here to help.
TRADEMARKS
If you think another npm publisher is infringing your trademark, such as by
using a confusingly similar package name, email abuse@npmjs.com with a link to
the package or user account on https://www.npmjs.com/.
Attach a copy of your trademark registration certificate.
If we see that the package's publisher is intentionally misleading others by
misusing your registered mark without permission, we will transfer the package
name to you. Otherwise, we will contact the package publisher and ask them to
Each org is automatically given a developers team, so you can see the whole list of team members in your org. This team automatically gets read-write access to all packages, but you can change that with the access command.
Create a new team:

npm team create <org:team>

Add members to that team:

npm team add <org:team> <user>

Publish a package and adjust package access
In package directory, run

npm init --scope=<org>

to scope it for your org & publish as usual
Grant access:

npm access grant <read-only|read-write> <org:team> [<package>]

To resolve packages by name and version, npm talks to a registry website
that implements the CommonJS Package Registry specification for reading
package info.

You can configure npm to use any compatible registry you like, and even run
your own registry. Use of someone else's registry may be governed by their
terms of use.

npm's package registry implementation supports several
write APIs as well, to allow for publishing packages and managing user
account information.

The official public npm registry is at https://registry.npmjs.org/.
follows the usual rules for package names (URL-safe characters, no leading dots
or underscores). When used in package names, scopes are preceded by an @ symbol
and followed by a slash, e.g.

@somescope/somepackagename

Scopes are a way of grouping related packages together, and also affect a few
Scopes are a way of grouping related packages together, and also affect a few
things about the way npm treats the package.
Each npm user/organization has their own scope, and only you can add packages
in your scope. This means you don't have to worry about someone taking your
Installing scoped packages
contain any number of scoped packages.
A scoped package is installed by referencing it by name, preceded by an
@ symbol, in npm install:
You can then publish the module with npm publish or npm publish
--access restricted, and it will be present in the npm registry, with
Associating a scope with a registry
seamlessly use a mix of packages from the primary npm registry and one or more
private registries, such as npm Enterprise.
You can associate a scope with a registry at login, e.g.
Scopes have a many-to-one relationship with registries: one registry can
host multiple scopes, but a scope only ever points to one registry.
You can also associate a scope with a registry using npm config:

npm config set @myco:registry http://reg.example.com

Once a scope is associated with a registry, any npm install for a package
with that scope will request packages from that registry instead. Any
npm publish for a package name that contains the scope will be published to
that registry instead.
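The npm config set command above effectively records an entry like this in your .npmrc (using the example registry URL from the text):

```ini
@myco:registry=http://reg.example.com
```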
Additionally, arbitrary scripts can be executed by running npm
run-script <stage>. Pre and post commands with matching
names will be run for those as well (e.g. premyscript, myscript,
postmyscript). Scripts from dependencies can be run with npm explore
<pkg> -- npm run <stage>.
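The pre/post convention can be sketched with a hypothetical script name:

```json
"scripts": {
  "premyscript": "echo before",
  "myscript": "echo running",
  "postmyscript": "echo after"
}
```

Running npm run myscript executes the three commands in pre, main, post order.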
PREPUBLISH AND PREPARE
DEPRECATION NOTE

Since npm@1.1.71, the npm CLI has run the prepublish script for both npm
publish and npm install, because it's a convenient way to prepare a package
for use (some common use cases are described in the section below). It has
also turned out to be, in practice, very confusing
(see https://github.com/npm/npm/issues/10074). As of npm@4.0.0, a new
event has been introduced, prepare, that preserves this existing behavior. A
new event, prepublishOnly, has been added as a transitional strategy to
allow users to avoid the confusing behavior of existing npm versions and only
run on npm publish (for instance, running the tests one last time to ensure
they're in good shape).
{ "name" : "foo", "dependencies" : { "bar" : "0.1.x" }, "scripts": { "start" : "bar ./test" } }

then you could run npm start to execute the bar script, which is
exported into the node_modules/.bin directory on npm install.
package.json vars
The package.json fields are tacked onto the npm_package_ prefix. So,
for instance, if you had {"name":"foo", "version":"1.2.5"} in your
package.json file, then your package scripts would have the
npm_package_name environment variable set to "foo", and the
npm_package_version set to "1.2.5". You can access these variables
in your code with process.env.npm_package_name and
process.env.npm_package_version, and so on for other fields.
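As a sketch, a script could read those fields like this (the variables are set by hand here to simulate what npm exports during a lifecycle run; outside one they would not exist):

```javascript
// Simulate the env vars npm would export for {"name":"foo","version":"1.2.5"}.
process.env.npm_package_name = 'foo';
process.env.npm_package_version = '1.2.5';

// Inside a real lifecycle script, only the two reads below are needed.
const id = `${process.env.npm_package_name}@${process.env.npm_package_version}`;
console.log(id); // foo@1.2.5
```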
configuration
Configuration parameters are put in the environment with the
npm_config_ prefix. For instance, you can view the effective root
Special: package.json "config" object
if the package.json has this:
Lastly, the npm_lifecycle_event environment variable is set to
whichever stage of the cycle is being executed. So, you could have a
single script used for different parts of the process which switches
Objects are flattened following this format, so if you had
{"scripts":{"install":"foo.js"}} in your package.json, then you'd
see this in the script:

npm_package_scripts_install=foo.js
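The flattening rule can be sketched as a small function (an illustration of the naming scheme only, not npm's actual implementation, which also sanitizes special characters):

```javascript
// Join nested package.json keys with underscores under the npm_package_ prefix.
function flatten(obj, prefix = 'npm_package') {
  const out = {};
  for (const [key, value] of Object.entries(obj)) {
    const name = `${prefix}_${key}`;
    if (value !== null && typeof value === 'object') {
      Object.assign(out, flatten(value, name)); // recurse into nested objects
    } else {
      out[name] = String(value);
    }
  }
  return out;
}

console.log(flatten({ scripts: { install: 'foo.js' } }));
// { npm_package_scripts_install: 'foo.js' }
```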
then scripts/install.js will be called for the install
and post-install stages of the lifecycle, and scripts/uninstall.js
will be called when the package is uninstalled. Since
scripts/install.js is running for two different phases, it would
If you installed things with npm, then your best bet is to uninstall
them with npm first, and then install them again once you have a
proper install. This can help find any symlinks that are lying
around:

ls -laF /usr/local/{lib/node{,/.npm},bin,share/man} | grep npm

Prior to version 0.3, npm used shim files for executables and node
modules. To track those down, you can do the following:
-l --loose
Interpret versions and ranges loosely
-c --coerce
Coerce a string into SemVer if possible
(does not imply --loose)

Program exits successfully if any valid version satisfies
all supplied ranges, and prints all satisfying versions.
If no satisfying versions are found, then exits failure.
Versions are printed in ascending order, so supplying
multiple versions to the utility will just sort them.

Versions
A "version" is described by the v2.0.0 specification found at
http://semver.org/.
A leading "=" or "v" character is stripped off and ignored.
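That stripping rule can be sketched with a small helper (a hypothetical function for illustration, not the parser from the semver package):

```javascript
// Sketch: strip a leading "=" or "v" before parsing, per the rule above.
// Hypothetical helper, not the real semver parser.
function stripLeading (version) {
  return version.trim().replace(/^[=v]+/, '')
}

console.log(stripLeading('v1.2.3')) // → '1.2.3'
console.log(stripLeading('=2.0.0')) // → '2.0.0'
```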
@@ -117,23 +120,20 @@
Prerelease Tags
Prerelease Identifiers
The method .inc takes an additional identifier string argument that
will append the value of the string as a prerelease identifier:
Advanced range syntax desugars to primitive comparators in
deterministic ways.
Advanced ranges may be combined in the same way as primitive
comparators using white space or ||.

Hyphen Ranges X.Y.Z - A.B.C
Specifies an inclusive set.
1.2.3 - 2.3.4 := >=1.2.3 <=2.3.4
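The desugaring above can be sketched for the full-version case (a toy helper; partial versions like `1.2 - 2.3.4` need extra handling that the spec defines and this sketch omits):

```javascript
// Sketch: desugar a hyphen range of two full versions into primitive
// comparators, per the rule above. Partial versions are not handled.
function desugarHyphen (range) {
  const [low, high] = range.split(' - ')
  return '>=' + low + ' <=' + high
}

console.log(desugarHyphen('1.2.3 - 2.3.4')) // → '>=1.2.3 <=2.3.4'
```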
@@ -166,7 +166,7 @@
X-Ranges 1.2.x 1.X
1 := 1.x.x := >=1.0.0 <2.0.0
1.2 := 1.2.x := >=1.2.0 <1.3.0

Tilde Ranges ~1.2.3 ~1.2 ~1
Allows patch-level changes if a minor version is specified on the
comparator. Allows minor-level changes if not.
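For the full-version case, that rule desugars as in this sketch (a toy helper; partial versions like `~1.2` or `~1` are not covered):

```javascript
// Sketch of the tilde rule above for a full X.Y.Z version: only
// patch-level changes are allowed, so ~1.2.3 means >=1.2.3 <1.3.0.
function desugarTilde (version) {
  const [major, minor, patch] = version.split('.').map(Number)
  return '>=' + major + '.' + minor + '.' + patch +
    ' <' + major + '.' + (minor + 1) + '.0'
}

console.log(desugarTilde('1.2.3')) // → '>=1.2.3 <1.3.0'
```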
@@ -182,7 +182,7 @@
Tilde Ranges ~1.2.3 ~1.2
1.2.4-beta.2 would not, because it is a prerelease of a
different [major, minor, patch] tuple.

Caret Ranges ^1.2.3 ^0.2.5 ^0.0.4
Allows changes that do not modify the left-most non-zero digit in the
[major, minor, patch] tuple. In other words, this allows patch and
minor updates for versions 1.0.0 and above, patch updates for
@@ -225,7 +225,7 @@
Caret Ranges ^1.2.3
Range Grammar
Putting all this together, here is a Backus-Naur grammar for ranges,
for the benefit of parser authors:
pre ::= parts
build ::= parts
parts ::= part ( '.' part ) *
part ::= nr | [-0-9A-Za-z]+
Functions
All methods and classes take a final loose boolean argument that, if
true, will be more forgiving about not-quite-valid semver strings.
@@ -324,6 +323,21 @@
Ranges
satisfy the range.
If you want to know if a version satisfies or does not satisfy a
range, use the satisfies(version, range) function.
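In the spirit of that function, here is a toy comparator check (the real `satisfies` handles the whole range grammar; this sketch handles only a full version against a single `>=` comparator):

```javascript
// Sketch: check one full version against a ">=" comparator, in the
// spirit of satisfies(version, range) described above. The real
// function parses the full range grammar; this toy does not.
function satisfiesGte (version, comparator) {
  const a = version.split('.').map(Number)
  const b = comparator.replace(/^>=/, '').split('.').map(Number)
  for (let i = 0; i < 3; i++) {
    if (a[i] !== b[i]) return a[i] > b[i]
  }
  return true // equal versions satisfy ">="
}

console.log(satisfiesGte('1.4.0', '>=1.2.3')) // → true
console.log(satisfiesGte('1.2.0', '>=1.2.3')) // → false
```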
Coercion

coerce(version): Coerces a string to semver if possible

This aims to provide a very forgiving translation of a non-semver
string to semver. It looks for the first digit in a string, and
consumes all remaining characters which satisfy at least a partial semver
(e.g., 1, 1.2, 1.2.3) up to the max permitted length (256 characters).
Longer versions are simply truncated (4.6.3.9.2-alpha2 becomes 4.6.3).
All surrounding text is simply ignored (v3.4 replaces v3.3.1 becomes 3.4.0).
Only text which lacks digits will fail coercion (version one is not valid).
The maximum length for any semver component considered for coercion is 16 characters;
longer components will be ignored (10000000000000000.4.7.4 becomes 4.7.4).
The maximum value for any semver component is Number.MAX_SAFE_INTEGER (2**53 - 1);
higher-value components are invalid (9999999999999999.4.7.4 is likely invalid).
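The find-and-truncate behavior described above can be sketched with a toy regex (this is not the semver package's implementation; it omits the 16-character component skipping and the MAX_SAFE_INTEGER bound):

```javascript
// Sketch of the coercion described above: find the first digit, take up
// to three dotted numeric parts, and pad missing parts with zeros.
// Toy only: does not skip over-long components or cap component values.
function coerceSketch (str) {
  const m = /(?:^|\D)(\d+)(?:\.(\d+))?(?:\.(\d+))?/.exec(str)
  if (!m) return null
  return m[1] + '.' + (m[2] || '0') + '.' + (m[3] || '0')
}

console.log(coerceSketch('4.6.3.9.2-alpha2')) // → '4.6.3'
console.log(coerceSketch('v3.4 replaces v3.3.1')) // → '3.4.0'
console.log(coerceSketch('version one')) // → null
```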
diff --git a/deps/npm/lib/access.js b/deps/npm/lib/access.js
index ad7a1f54bd34a1..164ea3b7d741a1 100644
--- a/deps/npm/lib/access.js
+++ b/deps/npm/lib/access.js
@@ -1,4 +1,5 @@
'use strict'
+/* eslint-disable standard/no-callback-literal */
var resolve = require('path').resolve
@@ -21,7 +22,7 @@ access.usage =
'npm access edit [<package>]'
access.subcommands = ['public', 'restricted', 'grant', 'revoke',
- 'ls-packages', 'ls-collaborators', 'edit']
+ 'ls-packages', 'ls-collaborators', 'edit']
access.completion = function (opts, cb) {
var argv = opts.conf.argv.remain
diff --git a/deps/npm/lib/adduser.js b/deps/npm/lib/adduser.js
index 0aac6b7fbb4330..e1c221032568d6 100644
--- a/deps/npm/lib/adduser.js
+++ b/deps/npm/lib/adduser.js
@@ -17,7 +17,7 @@ adduser.usage = usage(
function adduser (args, cb) {
if (!crypto) {
return cb(new Error(
- 'You must compile node with ssl support to use the adduser feature'
+ 'You must compile node with ssl support to use the adduser feature'
))
}
diff --git a/deps/npm/lib/audit.js b/deps/npm/lib/audit.js
new file mode 100644
index 00000000000000..06852610e64663
--- /dev/null
+++ b/deps/npm/lib/audit.js
@@ -0,0 +1,273 @@
+'use strict'
+
+const Bluebird = require('bluebird')
+
+const audit = require('./install/audit.js')
+const fs = require('graceful-fs')
+const Installer = require('./install.js').Installer
+const lockVerify = require('lock-verify')
+const log = require('npmlog')
+const npa = require('npm-package-arg')
+const npm = require('./npm.js')
+const output = require('./utils/output.js')
+const parseJson = require('json-parse-better-errors')
+
+const readFile = Bluebird.promisify(fs.readFile)
+
+module.exports = auditCmd
+
+const usage = require('./utils/usage')
+auditCmd.usage = usage(
+ 'audit',
+ '\nnpm audit [--json]' +
+ '\nnpm audit fix ' +
+ '[--force|--package-lock-only|--dry-run|--production|--only=(dev|prod)]'
+)
+
+auditCmd.completion = function (opts, cb) {
+ const argv = opts.conf.argv.remain
+
+ switch (argv[2]) {
+ case 'audit':
+ return cb(null, [])
+ default:
+ return cb(new Error(argv[2] + ' not recognized'))
+ }
+}
+
+class Auditor extends Installer {
+ constructor (where, dryrun, args, opts) {
+ super(where, dryrun, args, opts)
+ this.deepArgs = (opts && opts.deepArgs) || []
+ this.runId = opts.runId || ''
+ this.audit = false
+ }
+
+ loadAllDepsIntoIdealTree (cb) {
+ Bluebird.fromNode(cb => super.loadAllDepsIntoIdealTree(cb)).then(() => {
+ if (this.deepArgs && this.deepArgs.length) {
+ this.deepArgs.forEach(arg => {
+ arg.reduce((acc, child, ii) => {
+ if (!acc) {
+ // We might not always be able to find `target` through the given
+ // path. If we can't we'll just ignore it.
+ return
+ }
+ const spec = npa(child)
+ const target = (
+ acc.requires.find(n => n.package.name === spec.name) ||
+ acc.requires.find(
+ n => audit.scrub(n.package.name, this.runId) === spec.name
+ )
+ )
+ if (target && ii === arg.length - 1) {
+ target.loaded = false
+ // This kills `hasModernMeta()` and forces a re-fetch
+ target.package = {
+ name: spec.name,
+ version: spec.fetchSpec,
+ _requested: target.package._requested
+ }
+ delete target.fakeChild
+ let parent = target.parent
+ while (parent) {
+ parent.loaded = false
+ parent = parent.parent
+ }
+ target.requiredBy.forEach(par => {
+ par.loaded = false
+ delete par.fakeChild
+ })
+ }
+ return target
+ }, this.idealTree)
+ })
+ return Bluebird.fromNode(cb => super.loadAllDepsIntoIdealTree(cb))
+ }
+ }).nodeify(cb)
+ }
+
+ // no top level lifecycles on audit
+ runPreinstallTopLevelLifecycles (cb) { cb() }
+ runPostinstallTopLevelLifecycles (cb) { cb() }
+}
+
+function maybeReadFile (name) {
+ const file = `${npm.prefix}/${name}`
+ return readFile(file)
+ .then((data) => {
+ try {
+ return parseJson(data)
+ } catch (ex) {
+ ex.code = 'EJSONPARSE'
+ throw ex
+ }
+ })
+ .catch({code: 'ENOENT'}, () => null)
+ .catch((ex) => {
+ ex.file = file
+ throw ex
+ })
+}
+
+function filterEnv (action) {
+ const includeDev = npm.config.get('dev') ||
+ (!/^prod(uction)?$/.test(npm.config.get('only')) && !npm.config.get('production')) ||
+ /^dev(elopment)?$/.test(npm.config.get('only')) ||
+ /^dev(elopment)?$/.test(npm.config.get('also'))
+ const includeProd = !/^dev(elopment)?$/.test(npm.config.get('only'))
+ const resolves = action.resolves.filter(({dev}) => {
+ return (dev && includeDev) || (!dev && includeProd)
+ })
+ if (resolves.length) {
+ return Object.assign({}, action, {resolves})
+ }
+}
+
+function auditCmd (args, cb) {
+ if (npm.config.get('global')) {
+ const err = new Error('`npm audit` does not support testing globals')
+ err.code = 'EAUDITGLOBAL'
+ throw err
+ }
+ if (args.length && args[0] !== 'fix') {
+ return cb(new Error('Invalid audit subcommand: `' + args[0] + '`\n\nUsage:\n' + auditCmd.usage))
+ }
+ return Bluebird.all([
+ maybeReadFile('npm-shrinkwrap.json'),
+ maybeReadFile('package-lock.json'),
+ maybeReadFile('package.json')
+ ]).spread((shrinkwrap, lockfile, pkgJson) => {
+ const sw = shrinkwrap || lockfile
+ if (!pkgJson) {
+ const err = new Error('No package.json found: Cannot audit a project without a package.json')
+ err.code = 'EAUDITNOPJSON'
+ throw err
+ }
+ if (!sw) {
+ const err = new Error('Neither npm-shrinkwrap.json nor package-lock.json found: Cannot audit a project without a lockfile')
+ err.code = 'EAUDITNOLOCK'
+ throw err
+ } else if (shrinkwrap && lockfile) {
+ log.warn('audit', 'Both npm-shrinkwrap.json and package-lock.json exist, using npm-shrinkwrap.json.')
+ }
+ const requires = Object.assign(
+ {},
+ (pkgJson && pkgJson.dependencies) || {},
+ (pkgJson && pkgJson.devDependencies) || {}
+ )
+ return lockVerify(npm.prefix).then((result) => {
+ if (result.status) return audit.generate(sw, requires)
+
+ const lockFile = shrinkwrap ? 'npm-shrinkwrap.json' : 'package-lock.json'
+ const err = new Error(`Errors were found in your ${lockFile}, run npm install to fix them.\n ` +
+ result.errors.join('\n '))
+ err.code = 'ELOCKVERIFY'
+ throw err
+ })
+ }).then((auditReport) => {
+ return audit.submitForFullReport(auditReport)
+ }).catch((err) => {
+ if (err.statusCode === 404 || err.statusCode >= 500) {
+ const ne = new Error(`Your configured registry (${npm.config.get('registry')}) does not support audit requests.`)
+ ne.code = 'ENOAUDIT'
+ ne.wrapped = err
+ throw ne
+ }
+ throw err
+ }).then((auditResult) => {
+ if (args[0] === 'fix') {
+ const actions = (auditResult.actions || []).reduce((acc, action) => {
+ action = filterEnv(action)
+ if (!action) { return acc }
+ if (action.isMajor) {
+ acc.major.add(`${action.module}@${action.target}`)
+ action.resolves.forEach(({id, path}) => acc.majorFixes.add(`${id}::${path}`))
+ } else if (action.action === 'install') {
+ acc.install.add(`${action.module}@${action.target}`)
+ action.resolves.forEach(({id, path}) => acc.installFixes.add(`${id}::${path}`))
+ } else if (action.action === 'update') {
+ const name = action.module
+ const version = action.target
+ action.resolves.forEach(vuln => {
+ acc.updateFixes.add(`${vuln.id}::${vuln.path}`)
+ const modPath = vuln.path.split('>')
+ const newPath = modPath.slice(
+ 0, modPath.indexOf(name)
+ ).concat(`${name}@${version}`)
+ if (newPath.length === 1) {
+ acc.install.add(newPath[0])
+ } else {
+ acc.update.add(newPath.join('>'))
+ }
+ })
+ } else if (action.action === 'review') {
+ action.resolves.forEach(({id, path}) => acc.review.add(`${id}::${path}`))
+ }
+ return acc
+ }, {
+ install: new Set(),
+ installFixes: new Set(),
+ update: new Set(),
+ updateFixes: new Set(),
+ major: new Set(),
+ majorFixes: new Set(),
+ review: new Set()
+ })
+ return Bluebird.try(() => {
+ const installMajor = npm.config.get('force')
+ const installCount = actions.install.size + (installMajor ? actions.major.size : 0) + actions.update.size
+ const vulnFixCount = new Set([...actions.installFixes, ...actions.updateFixes, ...(installMajor ? actions.majorFixes : [])]).size
+ const metavuln = auditResult.metadata.vulnerabilities
+ const total = Object.keys(metavuln).reduce((acc, key) => acc + metavuln[key], 0)
+ if (installCount) {
+ log.verbose(
+ 'audit',
+ 'installing',
+ [...actions.install, ...(installMajor ? actions.major : []), ...actions.update]
+ )
+ }
+ return Bluebird.fromNode(cb => {
+ new Auditor(
+ npm.prefix,
+ !!npm.config.get('dry-run'),
+ [...actions.install, ...(installMajor ? actions.major : [])],
+ {
+ runId: auditResult.runId,
+ deepArgs: [...actions.update].map(u => u.split('>'))
+ }
+ ).run(cb)
+ }).then(() => {
+ const numScanned = auditResult.metadata.totalDependencies
+ if (!npm.config.get('json') && !npm.config.get('parseable')) {
+ output(`fixed ${vulnFixCount} of ${total} vulnerabilit${total === 1 ? 'y' : 'ies'} in ${numScanned} scanned package${numScanned === 1 ? '' : 's'}`)
+ if (actions.review.size) {
+ output(` ${actions.review.size} vulnerabilit${actions.review.size === 1 ? 'y' : 'ies'} required manual review and could not be updated`)
+ }
+ if (actions.major.size) {
+ output(` ${actions.major.size} package update${actions.major.size === 1 ? '' : 's'} for ${actions.majorFixes.size} vuln${actions.majorFixes.size === 1 ? '' : 's'} involved breaking changes`)
+ if (installMajor) {
+ output(' (installed due to `--force` option)')
+ } else {
+ output(' (use `npm audit fix --force` to install breaking changes;' +
+ ' or refer to `npm audit` for steps to fix these manually)')
+ }
+ }
+ }
+ })
+ })
+ } else {
+ const levels = ['low', 'moderate', 'high', 'critical']
+ const minLevel = levels.indexOf(npm.config.get('audit-level'))
+ const vulns = levels.reduce((count, level, i) => {
+ return i < minLevel ? count : count + (auditResult.metadata.vulnerabilities[level] || 0)
+ }, 0)
+ if (vulns > 0) process.exitCode = 1
+ if (npm.config.get('parseable')) {
+ return audit.printParseableReport(auditResult)
+ } else {
+ return audit.printFullReport(auditResult)
+ }
+ }
+ }).asCallback(cb)
+}
diff --git a/deps/npm/lib/auth/legacy.js b/deps/npm/lib/auth/legacy.js
index 92bf44c119af39..8c25df0288e677 100644
--- a/deps/npm/lib/auth/legacy.js
+++ b/deps/npm/lib/auth/legacy.js
@@ -6,52 +6,74 @@ const npm = require('../npm.js')
const output = require('../utils/output.js')
const pacoteOpts = require('../config/pacote')
const fetchOpts = require('../config/fetch-opts')
+const openUrl = require('../utils/open-url')
-module.exports.login = function login (creds, registry, scope, cb) {
- let username = creds.username || ''
- let password = creds.password || ''
- let email = creds.email || ''
- const auth = {}
- if (npm.config.get('otp')) auth.otp = npm.config.get('otp')
+const openerPromise = (url) => new Promise((resolve, reject) => {
+ openUrl(url, 'to complete your login please visit', (er) => er ? reject(er) : resolve())
+})
- return read.username('Username:', username, {log: log}).then((u) => {
- username = u
- return read.password('Password: ', password)
+const loginPrompter = (creds) => {
+ const opts = { log: log }
+ return read.username('Username:', creds.username, opts).then((u) => {
+ creds.username = u
+ return read.password('Password:', creds.password)
}).then((p) => {
- password = p
- return read.email('Email: (this IS public) ', email, {log: log})
+ creds.password = p
+ return read.email('Email: (this IS public) ', creds.email, opts)
}).then((e) => {
- email = e
- return profile.login(username, password, {registry: registry, auth: auth}).catch((err) => {
+ creds.email = e
+ return creds
+ })
+}
+
+module.exports.login = (creds, registry, scope, cb) => {
+ const conf = {
+ log: log,
+ creds: creds,
+ registry: registry,
+ auth: {
+ otp: npm.config.get('otp')
+ },
+ scope: scope,
+ opts: fetchOpts.fromPacote(pacoteOpts())
+ }
+ login(conf).then((newCreds) => cb(null, newCreds)).catch(cb)
+}
+
+function login (conf) {
+ return profile.login(openerPromise, loginPrompter, conf)
+ .catch((err) => {
if (err.code === 'EOTP') throw err
- return profile.adduser(username, email, password, {
- registry: registry,
- opts: fetchOpts.fromPacote(pacoteOpts())
+ const u = conf.creds.username
+ const p = conf.creds.password
+ const e = conf.creds.email
+ if (!(u && p && e)) throw err
+ return profile.adduserCouch(u, e, p, conf)
+ })
+ .catch((err) => {
+ if (err.code !== 'EOTP') throw err
+ return read.otp('Enter one-time password from your authenticator app: ').then((otp) => {
+ conf.auth.otp = otp
+ const u = conf.creds.username
+ const p = conf.creds.password
+ return profile.loginCouch(u, p, conf)
})
- }).catch((err) => {
- if (err.code === 'EOTP' && !auth.otp) {
- return read.otp('Authenticator provided OTP:').then((otp) => {
- auth.otp = otp
- return profile.login(username, password, {registry: registry, auth: auth})
- })
+ }).then((result) => {
+ const newCreds = {}
+ if (result && result.token) {
+ newCreds.token = result.token
} else {
- throw err
+ newCreds.username = conf.creds.username
+ newCreds.password = conf.creds.password
+ newCreds.email = conf.creds.email
+ newCreds.alwaysAuth = npm.config.get('always-auth')
}
- })
- }).then((result) => {
- const newCreds = {}
- if (result && result.token) {
- newCreds.token = result.token
- } else {
- newCreds.username = username
- newCreds.password = password
- newCreds.email = email
- newCreds.alwaysAuth = npm.config.get('always-auth')
- }
- log.info('adduser', 'Authorized user %s', username)
- const scopeMessage = scope ? ' to scope ' + scope : ''
- output('Logged in as %s%s on %s.', username, scopeMessage, registry)
- cb(null, newCreds)
- }).catch(cb)
+ const usermsg = conf.creds.username ? ' user ' + conf.creds.username : ''
+ conf.log.info('login', 'Authorized' + usermsg)
+ const scopeMessage = conf.scope ? ' to scope ' + conf.scope : ''
+ const userout = conf.creds.username ? ' as ' + conf.creds.username : ''
+ output('Logged in%s%s on %s.', userout, scopeMessage, conf.registry)
+ return newCreds
+ })
}
diff --git a/deps/npm/lib/auth/sso.js b/deps/npm/lib/auth/sso.js
index faffe2fa595033..519ca8496c74c2 100644
--- a/deps/npm/lib/auth/sso.js
+++ b/deps/npm/lib/auth/sso.js
@@ -1,7 +1,7 @@
var log = require('npmlog')
var npm = require('../npm.js')
var output = require('../utils/output')
-var opener = require('opener')
+var openUrl = require('../utils/open-url')
module.exports.login = function login (creds, registry, scope, cb) {
var ssoType = npm.config.get('sso-type')
@@ -22,10 +22,7 @@ module.exports.login = function login (creds, registry, scope, cb) {
if (!doc || !doc.token) return cb(new Error('no SSO token returned'))
if (!doc.sso) return cb(new Error('no SSO URL returned by services'))
- output('If your browser doesn\'t open, visit ' +
- doc.sso +
- ' to complete authentication')
- opener(doc.sso, { command: npm.config.get('browser') }, function () {
+ openUrl(doc.sso, 'to complete your login please visit', function () {
pollForSession(registry, doc.token, function (err, username) {
if (err) return cb(err)
diff --git a/deps/npm/lib/bugs.js b/deps/npm/lib/bugs.js
index 5f166c33f6f2f6..10300d1e136203 100644
--- a/deps/npm/lib/bugs.js
+++ b/deps/npm/lib/bugs.js
@@ -1,8 +1,7 @@
module.exports = bugs
-var npm = require('./npm.js')
var log = require('npmlog')
-var opener = require('opener')
+var openUrl = require('./utils/open-url')
var fetchPackageMetadata = require('./fetch-package-metadata.js')
var usage = require('./utils/usage')
@@ -27,6 +26,6 @@ function bugs (args, cb) {
url = 'https://www.npmjs.org/package/' + d.name
}
log.silly('bugs', 'url', url)
- opener(url, { command: npm.config.get('browser') }, cb)
+ openUrl(url, 'bug list available at the following URL', cb)
})
}
diff --git a/deps/npm/lib/build.js b/deps/npm/lib/build.js
index 395f9437b4576c..f8b3c4933ed1be 100644
--- a/deps/npm/lib/build.js
+++ b/deps/npm/lib/build.js
@@ -106,7 +106,7 @@ function rebuildBundles (pkg, folder, cb) {
if (!npm.config.get('rebuild-bundle')) return cb()
var deps = Object.keys(pkg.dependencies || {})
- .concat(Object.keys(pkg.devDependencies || {}))
+ .concat(Object.keys(pkg.devDependencies || {}))
var bundles = pkg.bundleDependencies || pkg.bundledDependencies || []
fs.readdir(path.resolve(folder, 'node_modules'), function (er, files) {
@@ -119,7 +119,7 @@ function rebuildBundles (pkg, folder, cb) {
chain(files.filter(function (file) {
// rebuild if:
// not a .folder, like .bin or .hooks
- return !file.match(/^[\._-]/) &&
+ return !file.match(/^[._-]/) &&
// not some old 0.x style bundle
file.indexOf('@') === -1 &&
// either not a dep, or explicitly bundled
diff --git a/deps/npm/lib/cache.js b/deps/npm/lib/cache.js
index 8bd2d5fcb1aea4..169f192cad5f2c 100644
--- a/deps/npm/lib/cache.js
+++ b/deps/npm/lib/cache.js
@@ -1,4 +1,5 @@
'use strict'
+/* eslint-disable standard/no-callback-literal */
const BB = require('bluebird')
@@ -68,7 +69,7 @@ function clean (args) {
}
const cachePath = path.join(npm.cache, '_cacache')
if (!npm.config.get('force')) {
- return BB.reject(new Error("As of npm@5, the npm cache self-heals from corruption issues and data extracted from the cache is guaranteed to be valid. If you want to make sure everything is consistent, use 'npm cache verify' instead.\n\nIf you're sure you want to delete the entire cache, rerun this command with --force."))
+ return BB.reject(new Error("As of npm@5, the npm cache self-heals from corruption issues and data extracted from the cache is guaranteed to be valid. If you want to make sure everything is consistent, use 'npm cache verify' instead. On the other hand, if you're debugging an issue with the installer, you can use `npm install --cache /tmp/empty-cache` to use a temporary cache instead of nuking the actual one.\n\nIf you're sure you want to delete the entire cache, rerun this command with --force."))
}
// TODO - remove specific packages or package versions
return rm(cachePath)
diff --git a/deps/npm/lib/ci.js b/deps/npm/lib/ci.js
new file mode 100644
index 00000000000000..e71d89cfddb2f8
--- /dev/null
+++ b/deps/npm/lib/ci.js
@@ -0,0 +1,40 @@
+'use strict'
+
+const Installer = require('libcipm')
+const lifecycleOpts = require('./config/lifecycle.js')
+const npm = require('./npm.js')
+const npmlog = require('npmlog')
+const pacoteOpts = require('./config/pacote.js')
+
+ci.usage = 'npm ci'
+
+ci.completion = (cb) => cb(null, [])
+
+Installer.CipmConfig.impl(npm.config, {
+ get: npm.config.get,
+ set: npm.config.set,
+ toLifecycle (moreOpts) {
+ return lifecycleOpts(moreOpts)
+ },
+ toPacote (moreOpts) {
+ return pacoteOpts(moreOpts)
+ }
+})
+
+module.exports = ci
+function ci (args, cb) {
+ return new Installer({
+ config: npm.config,
+ log: npmlog
+ })
+ .run()
+ .then(
+ (details) => {
+ npmlog.disableProgress()
+ console.error(`added ${details.pkgCount} packages in ${
+ details.runTime / 1000
+ }s`)
+ }
+ )
+ .then(() => cb(), cb)
+}
diff --git a/deps/npm/lib/completion.js b/deps/npm/lib/completion.js
index 3157255bfb625b..a682c134a77377 100644
--- a/deps/npm/lib/completion.js
+++ b/deps/npm/lib/completion.js
@@ -49,7 +49,7 @@ function completion (args, cb) {
if (isWindowsShell) {
var e = new Error('npm completion supported only in MINGW / Git bash on Windows')
e.code = 'ENOTSUP'
- e.errno = require('constants').ENOTSUP
+ e.errno = require('constants').ENOTSUP // eslint-disable-line node/no-deprecated-api
return cb(e)
}
@@ -150,7 +150,7 @@ function dumpScript (cb) {
fs.readFile(p, 'utf8', function (er, d) {
if (er) return cb(er)
- d = d.replace(/^\#\!.*?\n/, '')
+ d = d.replace(/^#!.*?\n/, '')
process.stdout.write(d, function () { cb() })
process.stdout.on('error', function (er) {
diff --git a/deps/npm/lib/config.js b/deps/npm/lib/config.js
index d260c04a54ce65..0d4161d3b53e85 100644
--- a/deps/npm/lib/config.js
+++ b/deps/npm/lib/config.js
@@ -1,3 +1,4 @@
+/* eslint-disable standard/no-callback-literal */
module.exports = config
var log = require('npmlog')
@@ -9,6 +10,8 @@ var types = npmconf.defs.types
var ini = require('ini')
var editor = require('editor')
var os = require('os')
+var path = require('path')
+var mkdirp = require('mkdirp')
var umask = require('./utils/umask')
var usage = require('./utils/usage')
var output = require('./utils/output')
@@ -39,7 +42,7 @@ config.completion = function (opts, cb) {
// todo: complete with valid values, if possible.
if (argv.length > 3) return cb(null, [])
// fallthrough
- /*eslint no-fallthrough:0*/
+ /* eslint no-fallthrough:0 */
case 'get':
case 'delete':
case 'rm':
@@ -89,7 +92,7 @@ function edit (cb) {
data = [
';;;;',
'; npm ' + (npm.config.get('global')
- ? 'globalconfig' : 'userconfig') + ' file',
+ ? 'globalconfig' : 'userconfig') + ' file',
'; this is a simple ini-formatted file',
'; lines that start with semi-colons are comments.',
'; read `npm help config` for help on the various options',
@@ -111,16 +114,19 @@ function edit (cb) {
.replace(/\n/g, '\n; ')
.split('\n'))
}, []))
- .concat([''])
- .join(os.EOL)
- writeFileAtomic(
- f,
- data,
- function (er) {
- if (er) return cb(er)
- editor(f, { editor: e }, noProgressTillDone(cb))
- }
- )
+ .concat([''])
+ .join(os.EOL)
+ mkdirp(path.dirname(f), function (er) {
+ if (er) return cb(er)
+ writeFileAtomic(
+ f,
+ data,
+ function (er) {
+ if (er) return cb(er)
+ editor(f, { editor: e }, noProgressTillDone(cb))
+ }
+ )
+ })
})
})
}
diff --git a/deps/npm/lib/config/cmd-list.js b/deps/npm/lib/config/cmd-list.js
index 49c445a4f0d14f..2069b5ea33ec78 100644
--- a/deps/npm/lib/config/cmd-list.js
+++ b/deps/npm/lib/config/cmd-list.js
@@ -4,8 +4,10 @@ var shorthands = {
'rb': 'rebuild',
'list': 'ls',
'ln': 'link',
+ 'create': 'init',
'i': 'install',
'it': 'install-test',
+ 'cit': 'install-ci-test',
'up': 'update',
'c': 'config',
's': 'search',
@@ -22,6 +24,8 @@ var affordances = {
'la': 'ls',
'll': 'ls',
'verison': 'version',
+ 'ic': 'ci',
+ 'innit': 'init',
'isntall': 'install',
'dist-tags': 'dist-tag',
'apihelp': 'help',
@@ -41,11 +45,14 @@ var affordances = {
'remove': 'uninstall',
'rm': 'uninstall',
'r': 'uninstall',
- 'rum': 'run-script'
+ 'rum': 'run-script',
+ 'sit': 'cit',
+ 'urn': 'run-script'
}
// these are filenames in .
var cmdList = [
+ 'ci',
'install',
'install-test',
'uninstall',
@@ -58,6 +65,7 @@ var cmdList = [
'prune',
'pack',
'dedupe',
+ 'hook',
'rebuild',
'link',
@@ -76,6 +84,7 @@ var cmdList = [
'shrinkwrap',
'token',
'profile',
+ 'audit',
'help',
'help-search',
diff --git a/deps/npm/lib/config/core.js b/deps/npm/lib/config/core.js
index 50cf4772e79f48..b9851f98d0e0c7 100644
--- a/deps/npm/lib/config/core.js
+++ b/deps/npm/lib/config/core.js
@@ -21,18 +21,20 @@ exports.defs = configDefs
Object.defineProperty(exports, 'defaults', { get: function () {
return configDefs.defaults
-}, enumerable: true })
+},
+enumerable: true })
Object.defineProperty(exports, 'types', { get: function () {
return configDefs.types
-}, enumerable: true })
+},
+enumerable: true })
exports.validate = validate
var myUid = process.env.SUDO_UID !== undefined
- ? process.env.SUDO_UID : (process.getuid && process.getuid())
+ ? process.env.SUDO_UID : (process.getuid && process.getuid())
var myGid = process.env.SUDO_GID !== undefined
- ? process.env.SUDO_GID : (process.getgid && process.getgid())
+ ? process.env.SUDO_GID : (process.getgid && process.getgid())
var loading = false
var loadCbs = []
@@ -153,17 +155,10 @@ function load_ (builtin, rc, cli, cb) {
// annoying humans and their expectations!
if (conf.get('prefix')) {
var etc = path.resolve(conf.get('prefix'), 'etc')
- mkdirp(etc, function () {
- defaults.globalconfig = path.resolve(etc, 'npmrc')
- defaults.globalignorefile = path.resolve(etc, 'npmignore')
- afterUserContinuation()
- })
- } else {
- afterUserContinuation()
+ defaults.globalconfig = path.resolve(etc, 'npmrc')
+ defaults.globalignorefile = path.resolve(etc, 'npmignore')
}
- }
- function afterUserContinuation () {
conf.addFile(conf.get('globalconfig'), 'global')
// move the builtin into the conf stack now.
@@ -274,7 +269,7 @@ Conf.prototype.save = function (where, cb) {
if (cb) return cb(er)
else return this.emit('error', er)
}
- this._saving --
+ this._saving--
if (this._saving === 0) {
if (cb) cb()
this.emit('save')
@@ -283,7 +278,7 @@ Conf.prototype.save = function (where, cb) {
then = then.bind(this)
done = done.bind(this)
- this._saving ++
+ this._saving++
var mode = where === 'user' ? '0600' : '0666'
if (!data.trim()) {
@@ -331,7 +326,10 @@ Conf.prototype.parse = function (content, file) {
Conf.prototype.add = function (data, marker) {
try {
Object.keys(data).forEach(function (k) {
- data[k] = parseField(data[k], k)
+ const newKey = envReplace(k)
+ const newField = parseField(data[k], newKey)
+ delete data[k]
+ data[newKey] = newField
})
} catch (e) {
this.emit('error', e)
@@ -351,8 +349,8 @@ Conf.prototype.addEnv = function (env) {
// leave first char untouched, even if
// it is a '_' - convert all other to '-'
var p = k.toLowerCase()
- .replace(/^npm_config_/, '')
- .replace(/(?!^)_/g, '-')
+ .replace(/^npm_config_/, '')
+ .replace(/(?!^)_/g, '-')
conf[p] = env[k]
})
return CC.prototype.addEnv.call(this, '', conf, 'env')
diff --git a/deps/npm/lib/config/defaults.js b/deps/npm/lib/config/defaults.js
index c049f213fa76d1..991a2129f68944 100644
--- a/deps/npm/lib/config/defaults.js
+++ b/deps/npm/lib/config/defaults.js
@@ -82,7 +82,7 @@ if (home) process.env.HOME = home
else home = path.resolve(temp, 'npm-' + uidOrPid)
var cacheExtra = process.platform === 'win32' ? 'npm-cache' : '.npm'
-var cacheRoot = process.platform === 'win32' && process.env.APPDATA || home
+var cacheRoot = (process.platform === 'win32' && process.env.APPDATA) || home
var cache = path.resolve(cacheRoot, cacheExtra)
var globalPrefix
@@ -109,6 +109,8 @@ Object.defineProperty(exports, 'defaults', {get: function () {
'allow-same-version': false,
'always-auth': false,
also: null,
+ audit: true,
+ 'audit-level': 'low',
'auth-type': 'legacy',
'bin-links': true,
@@ -130,7 +132,7 @@ Object.defineProperty(exports, 'defaults', {get: function () {
cidr: null,
- color: true,
+ color: process.env.NO_COLOR == null,
depth: Infinity,
description: true,
dev: false,
@@ -152,7 +154,7 @@ Object.defineProperty(exports, 'defaults', {get: function () {
globalconfig: path.resolve(globalPrefix, 'etc', 'npmrc'),
'global-style': false,
group: process.platform === 'win32' ? 0
- : process.env.SUDO_GID || (process.getgid && process.getgid()),
+ : process.env.SUDO_GID || (process.getgid && process.getgid()),
'ham-it-up': false,
heading: 'npm',
'if-present': false,
@@ -189,10 +191,12 @@ Object.defineProperty(exports, 'defaults', {get: function () {
'prefer-offline': false,
'prefer-online': false,
prefix: globalPrefix,
+ preid: '',
production: process.env.NODE_ENV === 'production',
'progress': !process.env.TRAVIS && !process.env.CI,
proxy: null,
'https-proxy': null,
+ 'noproxy': null,
'user-agent': 'npm/{npm-version} ' +
'node/{node-version} ' +
'{platform} ' +
@@ -218,6 +222,7 @@ Object.defineProperty(exports, 'defaults', {get: function () {
'send-metrics': false,
shell: osenv.shell(),
shrinkwrap: true,
+ 'sign-git-commit': false,
'sign-git-tag': false,
'sso-poll-frequency': 500,
'sso-type': 'oauth',
@@ -232,6 +237,7 @@ Object.defineProperty(exports, 'defaults', {get: function () {
!(process.getuid && process.setuid &&
process.getgid && process.setgid) ||
process.getuid() !== 0,
+ 'update-notifier': true,
usage: false,
user: process.platform === 'win32' ? 0 : 'nobody',
userconfig: path.resolve(home, '.npmrc'),
@@ -251,6 +257,8 @@ exports.types = {
'allow-same-version': Boolean,
'always-auth': Boolean,
also: [null, 'dev', 'development'],
+ audit: Boolean,
+ 'audit-level': ['low', 'moderate', 'high', 'critical'],
'auth-type': ['legacy', 'sso', 'saml', 'oauth'],
'bin-links': Boolean,
browser: [null, String],
@@ -300,8 +308,6 @@ exports.types = {
key: [null, String],
'legacy-bundling': Boolean,
link: Boolean,
- // local-address must be listed as an IP for a local network interface
- // must be IPv4 due to node bug
'local-address': getLocalAddresses(),
loglevel: ['silent', 'error', 'warn', 'notice', 'http', 'timing', 'info', 'verbose', 'silly'],
logstream: Stream,
@@ -312,17 +318,19 @@ exports.types = {
'metrics-registry': [null, String],
'node-options': [null, String],
'node-version': [null, semver],
+ 'noproxy': [null, String, Array],
offline: Boolean,
'onload-script': [null, String],
only: [null, 'dev', 'development', 'prod', 'production'],
optional: Boolean,
'package-lock': Boolean,
- otp: Number,
+ otp: [null, String],
'package-lock-only': Boolean,
parseable: Boolean,
'prefer-offline': Boolean,
'prefer-online': Boolean,
prefix: path,
+ preid: String,
production: Boolean,
progress: Boolean,
proxy: [null, false, url], // allow proxy to be disabled explicitly
@@ -347,6 +355,7 @@ exports.types = {
'send-metrics': Boolean,
shell: String,
shrinkwrap: Boolean,
+ 'sign-git-commit': Boolean,
'sign-git-tag': Boolean,
'sso-poll-frequency': Number,
'sso-type': [null, 'oauth', 'saml'],
@@ -356,6 +365,7 @@ exports.types = {
tmp: path,
unicode: Boolean,
'unsafe-perm': Boolean,
+ 'update-notifier': Boolean,
usage: Boolean,
user: [Number, String],
userconfig: path,
@@ -378,16 +388,9 @@ function getLocalAddresses () {
interfaces = {}
}
- return Object.keys(interfaces).map(function (nic) {
- return interfaces[nic].filter(function (addr) {
- return addr.family === 'IPv4'
- })
- .map(function (addr) {
- return addr.address
- })
- }).reduce(function (curr, next) {
- return curr.concat(next)
- }, []).concat(undefined)
+ return Object.keys(interfaces).map(
+ nic => interfaces[nic].map(({address}) => address)
+ ).reduce((curr, next) => curr.concat(next), []).concat(undefined)
}
exports.shorthands = {
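Outside the patch context, the rewritten `getLocalAddresses` above can be exercised standalone. This is a sketch using a hand-built object shaped like `os.networkInterfaces()` output; note the rewrite also drops the old IPv4-only filter, so IPv6 addresses now appear in the result:

```javascript
// Flatten pattern used by the new getLocalAddresses: map each NIC to
// its address strings, reduce-concat into one flat array, and append
// `undefined` as the "no explicit binding" default.
function getLocalAddresses (interfaces) {
  return Object.keys(interfaces).map(
    nic => interfaces[nic].map(({address}) => address)
  ).reduce((curr, next) => curr.concat(next), []).concat(undefined)
}

// Example input shaped like os.networkInterfaces() output
const fake = {
  lo: [{address: '127.0.0.1', family: 'IPv4'}],
  eth0: [
    {address: '192.168.0.2', family: 'IPv4'},
    {address: 'fe80::1', family: 'IPv6'}
  ]
}
const addrs = getLocalAddresses(fake)
```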
diff --git a/deps/npm/lib/config/fetch-opts.js b/deps/npm/lib/config/fetch-opts.js
index 1a030c378ea09c..213c293d6c7c9e 100644
--- a/deps/npm/lib/config/fetch-opts.js
+++ b/deps/npm/lib/config/fetch-opts.js
@@ -26,12 +26,12 @@ function fromPacote (opts) {
function getCacheMode (opts) {
return opts.offline
- ? 'only-if-cached'
- : opts.preferOffline
- ? 'force-cache'
- : opts.preferOnline
- ? 'no-cache'
- : 'default'
+ ? 'only-if-cached'
+ : opts.preferOffline
+ ? 'force-cache'
+ : opts.preferOnline
+ ? 'no-cache'
+ : 'default'
}
function getHeaders (uri, registry, opts) {
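The `fetch-opts.js` hunk above only reindents the ternary chain, but the precedence it encodes is worth spelling out: `offline` wins over `prefer-offline`, which wins over `prefer-online`. A standalone sketch of the same logic:

```javascript
// Cache-mode precedence from getCacheMode: offline forces cache-only,
// prefer-offline uses cache when present, prefer-online bypasses it.
function getCacheMode (opts) {
  return opts.offline ? 'only-if-cached'
    : opts.preferOffline ? 'force-cache'
      : opts.preferOnline ? 'no-cache'
        : 'default'
}
```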
diff --git a/deps/npm/lib/config/get-credentials-by-uri.js b/deps/npm/lib/config/get-credentials-by-uri.js
index d04f6137de9ba1..21926c68659932 100644
--- a/deps/npm/lib/config/get-credentials-by-uri.js
+++ b/deps/npm/lib/config/get-credentials-by-uri.js
@@ -34,20 +34,26 @@ function getCredentialsByURI (uri) {
return c
}
+ if (this.get(nerfed + ':-authtoken')) {
+ c.token = this.get(nerfed + ':-authtoken')
+ // the bearer token is enough, don't confuse things
+ return c
+ }
+
// Handle the old-style _auth= style for the default
// registry, if set.
var authDef = this.get('_auth')
var userDef = this.get('username')
var passDef = this.get('_password')
if (authDef && !(userDef && passDef)) {
- authDef = new Buffer(authDef, 'base64').toString()
+ authDef = Buffer.from(authDef, 'base64').toString()
authDef = authDef.split(':')
userDef = authDef.shift()
passDef = authDef.join(':')
}
if (this.get(nerfed + ':_password')) {
- c.password = new Buffer(this.get(nerfed + ':_password'), 'base64').toString('utf8')
+ c.password = Buffer.from(this.get(nerfed + ':_password'), 'base64').toString('utf8')
} else if (nerfed === defnerf && passDef) {
c.password = passDef
}
@@ -65,7 +71,7 @@ function getCredentialsByURI (uri) {
}
if (c.username && c.password) {
- c.auth = new Buffer(c.username + ':' + c.password).toString('base64')
+ c.auth = Buffer.from(c.username + ':' + c.password).toString('base64')
}
return c
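The credential changes above migrate from the deprecated `new Buffer(...)` constructor to `Buffer.from(...)`, which behaves identically for string input but avoids the unsafe uninitialized-memory overloads. A minimal round-trip sketch of the old-style `_auth` decoding:

```javascript
// Encode user:password as base64 (as stored in _auth), then decode it
// back the way getCredentialsByURI does, splitting only on the first
// ':' so passwords containing ':' survive intact.
const legacyAuth = Buffer.from('user:s3cret', 'utf8').toString('base64')
const decoded = Buffer.from(legacyAuth, 'base64').toString('utf8')
const parts = decoded.split(':')
const username = parts.shift()
const password = parts.join(':')
```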
diff --git a/deps/npm/lib/config/load-prefix.js b/deps/npm/lib/config/load-prefix.js
index c2af00c7f61da5..090865d2157c02 100644
--- a/deps/npm/lib/config/load-prefix.js
+++ b/deps/npm/lib/config/load-prefix.js
@@ -34,7 +34,7 @@ function loadPrefix (cb) {
Object.defineProperty(this, 'localPrefix',
{ set: function (prefix) { p = prefix },
get: function () { return p },
- enumerable: true })
+ enumerable: true })
// try to guess at a good node_modules location.
// If we are *explicitly* given a prefix on the cli, then
diff --git a/deps/npm/lib/config/pacote.js b/deps/npm/lib/config/pacote.js
index ec43178c7727dd..505b69da375a44 100644
--- a/deps/npm/lib/config/pacote.js
+++ b/deps/npm/lib/config/pacote.js
@@ -26,6 +26,7 @@ function pacoteOpts (moreOpts) {
defaultTag: npm.config.get('tag'),
dirPacker: pack.packGitDep,
hashAlgorithm: 'sha1',
+ includeDeprecated: false,
key: npm.config.get('key'),
localAddress: npm.config.get('local-address'),
log: log,
@@ -37,6 +38,7 @@ function pacoteOpts (moreOpts) {
preferOnline: npm.config.get('prefer-online') || npm.config.get('cache-max') <= 0,
projectScope: npm.projectScope,
proxy: npm.config.get('https-proxy') || npm.config.get('proxy'),
+ noProxy: npm.config.get('noproxy'),
refer: npm.registry.refer,
registry: npm.config.get('registry'),
retry: {
diff --git a/deps/npm/lib/config/set-credentials-by-uri.js b/deps/npm/lib/config/set-credentials-by-uri.js
index 74211380d86b6f..4723d561a8af6f 100644
--- a/deps/npm/lib/config/set-credentials-by-uri.js
+++ b/deps/npm/lib/config/set-credentials-by-uri.js
@@ -23,7 +23,7 @@ function setCredentialsByURI (uri, c) {
this.del(nerfed + ':_authToken', 'user')
- var encoded = new Buffer(c.password, 'utf8').toString('base64')
+ var encoded = Buffer.from(c.password, 'utf8').toString('base64')
this.set(nerfed + ':_password', encoded, 'user')
this.set(nerfed + ':username', c.username, 'user')
this.set(nerfed + ':email', c.email, 'user')
diff --git a/deps/npm/lib/dedupe.js b/deps/npm/lib/dedupe.js
index 71e60619c4f1b7..325faeaabcd43f 100644
--- a/deps/npm/lib/dedupe.js
+++ b/deps/npm/lib/dedupe.js
@@ -134,7 +134,7 @@ function hoistChildren_ (tree, diff, seen, next) {
if (seen.has(tree)) return next()
seen.add(tree)
asyncMap(tree.children, function (child, done) {
- if (!tree.parent) return hoistChildren_(child, diff, seen, done)
+ if (!tree.parent || child.fromBundle || child.package._inBundle) return hoistChildren_(child, diff, seen, done)
var better = findRequirement(tree.parent, moduleName(child), getRequested(child) || npa(packageId(child)))
if (better) {
return chain([
@@ -142,7 +142,7 @@ function hoistChildren_ (tree, diff, seen, next) {
[andComputeMetadata(tree)]
], done)
}
- var hoistTo = earliestInstallable(tree, tree.parent, child.package)
+ var hoistTo = earliestInstallable(tree, tree.parent, child.package, log)
if (hoistTo) {
move(child, hoistTo, diff)
chain([
diff --git a/deps/npm/lib/deprecate.js b/deps/npm/lib/deprecate.js
index 15ae58e01457ce..9b71d1de494ad7 100644
--- a/deps/npm/lib/deprecate.js
+++ b/deps/npm/lib/deprecate.js
@@ -1,3 +1,4 @@
+/* eslint-disable standard/no-callback-literal */
var npm = require('./npm.js')
var mapToRegistry = require('./utils/map-to-registry.js')
var npa = require('npm-package-arg')
diff --git a/deps/npm/lib/dist-tag.js b/deps/npm/lib/dist-tag.js
index 7c20ea99015304..bd0c5ae8a27a7d 100644
--- a/deps/npm/lib/dist-tag.js
+++ b/deps/npm/lib/dist-tag.js
@@ -1,3 +1,4 @@
+/* eslint-disable standard/no-callback-literal */
module.exports = distTag
var log = require('npmlog')
diff --git a/deps/npm/lib/docs.js b/deps/npm/lib/docs.js
index 2248702a461954..6d67da4e120907 100644
--- a/deps/npm/lib/docs.js
+++ b/deps/npm/lib/docs.js
@@ -1,7 +1,6 @@
module.exports = docs
-var npm = require('./npm.js')
-var opener = require('opener')
+var openUrl = require('./utils/open-url')
var log = require('npmlog')
var fetchPackageMetadata = require('./fetch-package-metadata.js')
var usage = require('./utils/usage')
@@ -37,6 +36,6 @@ function getDoc (project, cb) {
if (er) return cb(er)
var url = d.homepage
if (!url) url = 'https://www.npmjs.org/package/' + d.name
- return opener(url, {command: npm.config.get('browser')}, cb)
+ return openUrl(url, 'docs available at the following URL', cb)
})
}
diff --git a/deps/npm/lib/edit.js b/deps/npm/lib/edit.js
index 8e9bbd179709e5..48bcd5d346cad6 100644
--- a/deps/npm/lib/edit.js
+++ b/deps/npm/lib/edit.js
@@ -22,8 +22,8 @@ function edit (args, cb) {
))
}
p = p.split('/')
- .join('/node_modules/')
- .replace(/(\/node_modules)+/, '/node_modules')
+ .join('/node_modules/')
+ .replace(/(\/node_modules)+/, '/node_modules')
var f = path.resolve(npm.dir, p)
fs.lstat(f, function (er) {
if (er) return cb(er)
diff --git a/deps/npm/lib/help-search.js b/deps/npm/lib/help-search.js
index ffbe554b7bc064..475f305e49103c 100644
--- a/deps/npm/lib/help-search.js
+++ b/deps/npm/lib/help-search.js
@@ -70,7 +70,7 @@ function searchFiles (args, files, cb) {
if (nextLine) {
for (a = 0, ll = args.length; a < ll && !match; a++) {
match = nextLine.toLowerCase()
- .indexOf(args[a].toLowerCase()) !== -1
+ .indexOf(args[a].toLowerCase()) !== -1
}
if (match) {
// skip over the next line, and the line after it.
@@ -107,7 +107,7 @@ function searchFiles (args, files, cb) {
lines.forEach(function (line) {
args.forEach(function (arg) {
var hit = (line || '').toLowerCase()
- .split(arg.toLowerCase()).length - 1
+ .split(arg.toLowerCase()).length - 1
if (hit > 0) {
found[arg] = (found[arg] || 0) + hit
totalHits += hit
@@ -144,12 +144,12 @@ function searchFiles (args, files, cb) {
// then by number of matching lines
results = results.sort(function (a, b) {
return a.found.length > b.found.length ? -1
- : a.found.length < b.found.length ? 1
- : a.totalHits > b.totalHits ? -1
- : a.totalHits < b.totalHits ? 1
- : a.lines.length > b.lines.length ? -1
- : a.lines.length < b.lines.length ? 1
- : 0
+ : a.found.length < b.found.length ? 1
+ : a.totalHits > b.totalHits ? -1
+ : a.totalHits < b.totalHits ? 1
+ : a.lines.length > b.lines.length ? -1
+ : a.lines.length < b.lines.length ? 1
+ : 0
})
cb(null, results)
@@ -170,7 +170,7 @@ function formatResults (args, results, cb) {
}).join(' ')
out += ((new Array(Math.max(1, cols - out.length - r.length)))
- .join(' ')) + r
+ .join(' ')) + r
if (!npm.config.get('long')) return out
diff --git a/deps/npm/lib/help.js b/deps/npm/lib/help.js
index 64c80f78745647..3f70f2dc1f84c7 100644
--- a/deps/npm/lib/help.js
+++ b/deps/npm/lib/help.js
@@ -10,7 +10,7 @@ var path = require('path')
var spawn = require('./utils/spawn')
var npm = require('./npm.js')
var log = require('npmlog')
-var opener = require('opener')
+var openUrl = require('./utils/open-url')
var glob = require('glob')
var didYouMean = require('./utils/did-you-mean')
var cmdList = require('./config/cmd-list').cmdList
@@ -97,8 +97,8 @@ function pickMan (mans, pref_) {
var an = a.match(nre)[1]
var bn = b.match(nre)[1]
return an === bn ? (a > b ? -1 : 1)
- : pref[an] < pref[bn] ? -1
- : 1
+ : pref[an] < pref[bn] ? -1
+ : 1
})
return mans[0]
}
@@ -127,7 +127,7 @@ function viewMan (man, cb) {
break
case 'browser':
- opener(htmlMan(man), { command: npm.config.get('browser') }, cb)
+ openUrl(htmlMan(man), 'help available at the following URL', cb)
break
default:
@@ -168,12 +168,12 @@ function npmUsage (valid, cb) {
'',
'where <command> is one of:',
npm.config.get('long') ? usages()
- : ' ' + wrap(commands),
+ : ' ' + wrap(commands),
'',
- 'npm <cmd> -h quick help on <cmd>',
- 'npm -l display full usage info',
- 'npm help <term> search for help on <term>',
- 'npm help npm involved overview',
+ 'npm <command> -h quick help on <command>',
+ 'npm -l display full usage info',
+ 'npm help <term> search for help on <term>',
+ 'npm help npm involved overview',
'',
'Specify configs in the ini-formatted file:',
' ' + npm.config.get('userconfig'),
@@ -184,7 +184,7 @@ function npmUsage (valid, cb) {
].join('\n'))
if (npm.argv.length > 1) {
- didYouMean(npm.argv[1], commands)
+ output(didYouMean(npm.argv[1], commands))
}
cb(valid)
diff --git a/deps/npm/lib/hook.js b/deps/npm/lib/hook.js
new file mode 100644
index 00000000000000..b0552c74740ea3
--- /dev/null
+++ b/deps/npm/lib/hook.js
@@ -0,0 +1,135 @@
+'use strict'
+
+const BB = require('bluebird')
+
+const crypto = require('crypto')
+const hookApi = require('libnpmhook')
+const log = require('npmlog')
+const npm = require('./npm.js')
+const output = require('./utils/output.js')
+const pudding = require('figgy-pudding')
+const relativeDate = require('tiny-relative-date')
+const Table = require('cli-table3')
+const usage = require('./utils/usage.js')
+const validate = require('aproba')
+
+hook.usage = usage([
+ 'npm hook add <pkg> <url> <secret> [--type=<type>]',
+ 'npm hook ls [pkg]',
+ 'npm hook rm <id>',
+ 'npm hook update <id> <url> [secret]'
+])
+
+hook.completion = (opts, cb) => {
+ validate('OF', [opts, cb])
+ return cb(null, []) // fill in this array with completion values
+}
+
+const npmSession = crypto.randomBytes(8).toString('hex')
+const hookConfig = pudding()
+function config () {
+ return hookConfig({
+ refer: npm.refer,
+ projectScope: npm.projectScope,
+ log,
+ npmSession
+ }, npm.config)
+}
+
+module.exports = (args, cb) => BB.try(() => hook(args)).nodeify(cb)
+function hook (args) {
+ switch (args[0]) {
+ case 'add':
+ return add(args[1], args[2], args[3])
+ case 'ls':
+ return ls(args[1])
+ case 'rm':
+ return rm(args[1])
+ case 'update':
+ case 'up':
+ return update(args[1], args[2], args[3])
+ }
+}
+
+function add (pkg, uri, secret) {
+ return hookApi.add(pkg, uri, secret, config())
+ .then((hook) => {
+ if (npm.config.get('json')) {
+ output(JSON.stringify(hook, null, 2))
+ } else {
+ output(`+ ${hookName(hook)} ${
+ npm.config.get('unicode') ? ' ➜ ' : ' -> '
+ } ${hook.endpoint}`)
+ }
+ })
+}
+
+function ls (pkg) {
+ return hookApi.ls(pkg, config())
+ .then((hooks) => {
+ if (npm.config.get('json')) {
+ output(JSON.stringify(hooks, null, 2))
+ } else if (!hooks.length) {
+ output("You don't have any hooks configured yet.")
+ } else {
+ if (hooks.length === 1) {
+ output('You have one hook configured.')
+ } else {
+ output(`You have ${hooks.length} hooks configured.`)
+ }
+ const table = new Table({head: ['id', 'target', 'endpoint']})
+ hooks.forEach((hook) => {
+ table.push([
+ {rowSpan: 2, content: hook.id},
+ hookName(hook),
+ hook.endpoint
+ ])
+ if (hook.last_delivery) {
+ table.push([
+ {
+ colSpan: 1,
+ content: `triggered ${relativeDate(hook.last_delivery)}`
+ },
+ hook.response_code
+ ])
+ } else {
+ table.push([{colSpan: 2, content: 'never triggered'}])
+ }
+ })
+ output(table.toString())
+ }
+ })
+}
+
+function rm (id) {
+ return hookApi.rm(id, config())
+ .then((hook) => {
+ if (npm.config.get('json')) {
+ output(JSON.stringify(hook, null, 2))
+ } else {
+ output(`- ${hookName(hook)} ${
+ npm.config.get('unicode') ? ' ✘ ' : ' X '
+ } ${hook.endpoint}`)
+ }
+ })
+}
+
+function update (id, uri, secret) {
+ return hookApi.update(id, uri, secret, config())
+ .then((hook) => {
+ if (npm.config.get('json')) {
+ output(JSON.stringify(hook, null, 2))
+ } else {
+ output(`+ ${hookName(hook)} ${
+ npm.config.get('unicode') ? ' ➜ ' : ' -> '
+ } ${hook.endpoint}`)
+ }
+ })
+}
+
+function hookName (hook) {
+ let target = hook.name
+ if (hook.type === 'scope') { target = '@' + target }
+ if (hook.type === 'owner') { target = '~' + target }
+ return target
+}
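The `hookName` helper at the end of the new `hook.js` maps the hook's `type` to the display prefix npm uses for hook targets. Extracted as a standalone sketch:

```javascript
// Scope hooks render as '@name', owner hooks as '~name', and plain
// package hooks keep the bare package name.
function hookName (hook) {
  let target = hook.name
  if (hook.type === 'scope') { target = '@' + target }
  if (hook.type === 'owner') { target = '~' + target }
  return target
}
```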
diff --git a/deps/npm/lib/init.js b/deps/npm/lib/init.js
index 000fa1a5b689e9..9d873689f6b7e1 100644
--- a/deps/npm/lib/init.js
+++ b/deps/npm/lib/init.js
@@ -2,15 +2,59 @@
module.exports = init
+var path = require('path')
var log = require('npmlog')
+var npa = require('npm-package-arg')
var npm = require('./npm.js')
+var npx = require('libnpx')
var initJson = require('init-package-json')
+var isRegistry = require('./utils/is-registry.js')
var output = require('./utils/output.js')
var noProgressTillDone = require('./utils/no-progress-while-running').tillDone
+var usage = require('./utils/usage')
-init.usage = 'npm init [--force|-f|--yes|-y]'
+init.usage = usage(
+ 'init',
+ '\nnpm init [--force|-f|--yes|-y|--scope]' +
+ '\nnpm init <@scope> (same as `npx <@scope>/create`)' +
+ '\nnpm init [<@scope>/]<name> (same as `npx [<@scope>/]create-<name>`)'
+)
function init (args, cb) {
+ if (args.length) {
+ var NPM_PATH = path.resolve(__dirname, '../bin/npm-cli.js')
+ var initerName = args[0]
+ var packageName = initerName
+ if (/^@[^/]+$/.test(initerName)) {
+ packageName = initerName + '/create'
+ } else {
+ var req = npa(initerName)
+ if (req.type === 'git' && req.hosted) {
+ var { user, project } = req.hosted
+ packageName = initerName
+ .replace(user + '/' + project, user + '/create-' + project)
+ } else if (isRegistry(req)) {
+ packageName = req.name.replace(/^(@[^/]+\/)?/, '$1create-')
+ if (req.rawSpec) {
+ packageName += '@' + req.rawSpec
+ }
+ } else {
+ var err = new Error(
+ 'Unrecognized initializer: ' + initerName +
+ '\nFor more package binary executing power check out `npx`:' +
+ '\nhttps://www.npmjs.com/package/npx'
+ )
+ err.code = 'EUNSUPPORTED'
+ throw err
+ }
+ }
+ var npxArgs = [process.argv0, '[fake arg]', '--always-spawn', packageName, ...process.argv.slice(4)]
+ var parsed = npx.parseArgs(npxArgs, NPM_PATH)
+
+ return npx(parsed)
+ .then(() => cb())
+ .catch(cb)
+ }
var dir = process.cwd()
log.pause()
var initFile = npm.config.get('init-module')
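The `init.js` change above delegates `npm init <initializer>` to npx by rewriting the argument into a `create-*` package name. The mapping can be sketched as a standalone helper (`createPackageName` is a hypothetical name for illustration; the real code inlines this logic and also handles git-hosted specs):

```javascript
// npm init foo        -> create-foo
// npm init @scope     -> @scope/create
// npm init @scope/foo -> @scope/create-foo
// An optional rawSpec (e.g. a version) is appended with '@'.
function createPackageName (name, rawSpec) {
  if (/^@[^/]+$/.test(name)) return name + '/create'
  let packageName = name.replace(/^(@[^/]+\/)?/, '$1create-')
  if (rawSpec) packageName += '@' + rawSpec
  return packageName
}
```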
diff --git a/deps/npm/lib/install-ci-test.js b/deps/npm/lib/install-ci-test.js
new file mode 100644
index 00000000000000..26120f4a216dfa
--- /dev/null
+++ b/deps/npm/lib/install-ci-test.js
@@ -0,0 +1,26 @@
+'use strict'
+
+// npm install-ci-test
+// Runs `npm ci` and then runs `npm test`
+
+module.exports = installTest
+var ci = require('./ci.js')
+var test = require('./test.js')
+var usage = require('./utils/usage')
+
+installTest.usage = usage(
+ 'install-ci-test',
+ '\nnpm install-ci-test [args]' +
+ '\nSame args as `npm ci`'
+)
+
+installTest.completion = ci.completion
+
+function installTest (args, cb) {
+ ci(args, function (er) {
+ if (er) {
+ return cb(er)
+ }
+ test([], cb)
+ })
+}
diff --git a/deps/npm/lib/install.js b/deps/npm/lib/install.js
index 42906f2394e895..e15bc479191001 100644
--- a/deps/npm/lib/install.js
+++ b/deps/npm/lib/install.js
@@ -1,4 +1,6 @@
'use strict'
+/* eslint-disable camelcase */
+/* eslint-disable standard/no-callback-literal */
// npm install
//
// See doc/cli/npm-install.md for more description
@@ -135,6 +137,7 @@ var validateTree = require('./install/validate-tree.js')
var validateArgs = require('./install/validate-args.js')
var saveRequested = require('./install/save.js').saveRequested
var saveShrinkwrap = require('./install/save.js').saveShrinkwrap
+var audit = require('./install/audit.js')
var getSaveType = require('./install/save.js').getSaveType
var doSerialActions = require('./install/actions.js').doSerial
var doReverseSerialActions = require('./install/actions.js').doReverseSerial
@@ -181,8 +184,8 @@ function install (where, args, cb) {
var globalTop = path.resolve(npm.globalDir, '..')
if (!where) {
where = npm.config.get('global')
- ? globalTop
- : npm.prefix
+ ? globalTop
+ : npm.prefix
}
validate('SAF', [where, args, cb])
// the /path/to/node_modules/..
@@ -220,6 +223,8 @@ function Installer (where, dryrun, args, opts) {
this.noPackageJsonOk = !!args.length
this.topLevelLifecycles = !args.length
+ this.autoPrune = npm.config.get('package-lock')
+
const dev = npm.config.get('dev')
const only = npm.config.get('only')
const onlyProd = /^prod(uction)?$/.test(only)
@@ -234,6 +239,7 @@ function Installer (where, dryrun, args, opts) {
this.link = opts.link != null ? opts.link : npm.config.get('link')
this.saveOnlyLock = opts.saveOnlyLock
this.global = opts.global != null ? opts.global : this.where === path.resolve(npm.globalDir, '..')
+ this.audit = npm.config.get('audit') && !this.global
this.started = Date.now()
}
Installer.prototype = {}
@@ -294,7 +300,9 @@ Installer.prototype.run = function (_cb) {
[this, this.finishTracker, 'generateActionsToTake'],
[this, this.debugActions, 'diffTrees', 'differences'],
- [this, this.debugActions, 'decomposeActions', 'todo'])
+ [this, this.debugActions, 'decomposeActions', 'todo'],
+ [this, this.startAudit]
+ )
if (this.packageLockOnly) {
postInstallSteps.push(
@@ -436,8 +444,8 @@ Installer.prototype.pruneIdealTree = function (cb) {
// if our lock file didn't have the requires field and there
// are any fake children then forgo pruning until we have more info.
if (!this.idealTree.hasRequiresFromLock && this.idealTree.children.some((n) => n.fakeChild)) return cb()
- var toPrune = this.idealTree.children
- .filter(isExtraneous)
+ const toPrune = this.idealTree.children
+ .filter((child) => isExtraneous(child) && (this.autoPrune || child.removing))
.map((n) => ({name: moduleName(n)}))
return removeExtraneous(toPrune, this.idealTree, cb)
}
@@ -456,21 +464,13 @@ Installer.prototype.loadAllDepsIntoIdealTree = function (cb) {
steps.push([loadRequestedDeps, this.args, this.idealTree, saveDeps, cg.newGroup('loadRequestedDeps')])
} else {
const depsToPreload = Object.assign({},
- this.dev ? this.idealTree.package.devDependencies : {},
- this.prod ? this.idealTree.package.dependencies : {}
+ this.idealTree.package.devDependencies,
+ this.idealTree.package.dependencies
)
- if (this.prod || this.dev) {
- steps.push(
- [prefetchDeps, this.idealTree, depsToPreload, cg.newGroup('prefetchDeps')])
- }
- if (this.prod) {
- steps.push(
- [loadDeps, this.idealTree, cg.newGroup('loadDeps')])
- }
- if (this.dev) {
- steps.push(
- [loadDevDeps, this.idealTree, cg.newGroup('loadDevDeps')])
- }
+ steps.push(
+ [prefetchDeps, this.idealTree, depsToPreload, cg.newGroup('prefetchDeps')],
+ [loadDeps, this.idealTree, cg.newGroup('loadDeps')],
+ [loadDevDeps, this.idealTree, cg.newGroup('loadDevDeps')])
}
steps.push(
[loadExtraneous.andResolveDeps, this.idealTree, cg.newGroup('loadExtraneous')])
@@ -630,6 +630,16 @@ Installer.prototype.runPostinstallTopLevelLifecycles = function (cb) {
chain(steps, cb)
}
+Installer.prototype.startAudit = function (cb) {
+ if (!this.audit) return cb()
+ this.auditSubmission = Bluebird.try(() => {
+ return audit.generateFromInstall(this.idealTree, this.differences, this.args, this.remove)
+ }).then((auditData) => {
+ return audit.submitForInstallReport(auditData)
+ }).catch(_ => {})
+ cb()
+}
+
Installer.prototype.saveToDependencies = function (cb) {
validate('F', arguments)
if (this.failing) return cb()
@@ -692,27 +702,19 @@ Installer.prototype.readLocalPackageData = function (cb) {
Installer.prototype.cloneCurrentTreeToIdealTree = function (cb) {
validate('F', arguments)
log.silly('install', 'cloneCurrentTreeToIdealTree')
- this.idealTree = copyTree(this.currentTree, (child) => {
- // Filter out any children we didn't install ourselves. They need to be
- // reinstalled in order for things to be correct.
- return child.isTop || isLink(child) || (
- child.package &&
- child.package._resolved &&
- (child.package._integrity || child.package._shasum)
- )
- })
+
+ this.idealTree = copyTree(this.currentTree)
this.idealTree.warnings = []
cb()
}
-function isLink (child) {
- return child.isLink || (child.parent && isLink(child.parent))
-}
-
Installer.prototype.loadShrinkwrap = function (cb) {
validate('F', arguments)
log.silly('install', 'loadShrinkwrap')
- readShrinkwrap.andInflate(this.idealTree, cb)
+ readShrinkwrap.andInflate(this.idealTree, iferr(cb, () => {
+ computeMetadata(this.idealTree)
+ cb()
+ }))
}
Installer.prototype.getInstalledModules = function () {
@@ -760,20 +762,32 @@ Installer.prototype.printInstalled = function (cb) {
diffs.push(['remove', r])
})
}
- if (npm.config.get('json')) {
- return this.printInstalledForJSON(diffs, cb)
- } else if (npm.config.get('parseable')) {
- return this.printInstalledForParseable(diffs, cb)
- } else {
- return this.printInstalledForHuman(diffs, cb)
- }
+ return Bluebird.try(() => {
+ if (!this.auditSubmission) return
+ return Bluebird.resolve(this.auditSubmission).timeout(10000).catch(() => null)
+ }).then((auditResult) => {
+ if (auditResult && !auditResult.metadata) {
+ log.warn('audit', 'Audit result from registry missing metadata. This is probably an issue with the registry.')
+ }
+ // maybe write audit report w/ hash of pjson & shrinkwrap for later reading by `npm audit`
+ if (npm.config.get('json')) {
+ return this.printInstalledForJSON(diffs, auditResult)
+ } else if (npm.config.get('parseable')) {
+ return this.printInstalledForParseable(diffs, auditResult)
+ } else {
+ return this.printInstalledForHuman(diffs, auditResult)
+ }
+ }).asCallback(cb)
}
-Installer.prototype.printInstalledForHuman = function (diffs, cb) {
+Installer.prototype.printInstalledForHuman = function (diffs, auditResult) {
var removed = 0
var added = 0
var updated = 0
var moved = 0
+ // Count the number of contributors to packages added, tracking
+ // contributors we've seen, so we can produce a running unique count.
+ var contributors = new Set()
diffs.forEach(function (action) {
var mutation = action[0]
var pkg = action[1]
@@ -784,6 +798,26 @@ Installer.prototype.printInstalledForHuman = function (diffs, cb) {
++moved
} else if (mutation === 'add') {
++added
+ // Count contributors to added packages. Start by combining `author`
+ // and `contributors` data into a single array of contributor-people
+ // for this package.
+ var people = []
+ var meta = pkg.package
+ if (meta.author) people.push(meta.author)
+ if (meta.contributors && Array.isArray(meta.contributors)) {
+ people = people.concat(meta.contributors)
+ }
+ // Make sure a normalized string for every person behind this
+ // package is in `contributors`.
+ people.forEach(function (person) {
+ // Ignore errors from malformed `author` and `contributors`.
+ try {
+ var normalized = normalizePerson(person)
+ } catch (error) {
+ return
+ }
+ if (!contributors.has(normalized)) contributors.add(normalized)
+ })
} else if (mutation === 'update' || mutation === 'update-linked') {
++updated
}
@@ -795,10 +829,17 @@ Installer.prototype.printInstalledForHuman = function (diffs, cb) {
}).join('\n') + '\n'
}
var actions = []
- if (added) actions.push('added ' + packages(added))
+ if (added) {
+ var action = 'added ' + packages(added)
+ if (contributors.size) action += from(contributors.size)
+ actions.push(action)
+ }
if (removed) actions.push('removed ' + packages(removed))
if (updated) actions.push('updated ' + packages(updated))
if (moved) actions.push('moved ' + packages(moved))
+ if (auditResult && auditResult.metadata && auditResult.metadata.totalDependencies) {
+ actions.push('audited ' + packages(auditResult.metadata.totalDependencies))
+ }
if (actions.length === 0) {
report += 'up to date'
} else if (actions.length === 1) {
@@ -810,14 +851,31 @@ Installer.prototype.printInstalledForHuman = function (diffs, cb) {
report += ' in ' + ((Date.now() - this.started) / 1000) + 's'
output(report)
- return cb()
+ return auditResult && audit.printInstallReport(auditResult)
function packages (num) {
return num + ' package' + (num > 1 ? 's' : '')
}
+
+ function from (num) {
+ return ' from ' + num + ' contributor' + (num > 1 ? 's' : '')
+ }
+
+ // Values of `author` and elements of `contributors` in `package.json`
+ // files can be e-mail style strings or Objects with `name`, `email,
+ // and `url` String properties. Convert Objects to Strings so that
+ // we can efficiently keep a set of contributors we have already seen.
+ function normalizePerson (argument) {
+ if (typeof argument === 'string') return argument
+ var returned = ''
+ if (argument.name) returned += argument.name
+ if (argument.email) returned += ' <' + argument.email + '>'
+ if (argument.url) returned += ' (' + argument.url + ')'
+ return returned
+ }
}
-Installer.prototype.printInstalledForJSON = function (diffs, cb) {
+Installer.prototype.printInstalledForJSON = function (diffs, auditResult) {
var result = {
added: [],
removed: [],
@@ -825,6 +883,7 @@ Installer.prototype.printInstalledForJSON = function (diffs, cb) {
moved: [],
failed: [],
warnings: [],
+ audit: auditResult,
elapsed: Date.now() - this.started
}
var self = this
@@ -855,7 +914,6 @@ Installer.prototype.printInstalledForJSON = function (diffs, cb) {
}
})
output(JSON.stringify(result, null, 2))
- cb()
function flattenMessage (msg) {
return msg.map(function (logline) { return logline.slice(1).join(' ') }).join('\n')
@@ -879,7 +937,7 @@ Installer.prototype.printInstalledForJSON = function (diffs, cb) {
}
}
-Installer.prototype.printInstalledForParseable = function (diffs, cb) {
+Installer.prototype.printInstalledForParseable = function (diffs) {
var self = this
diffs.forEach(function (action) {
var mutation = action[0]
@@ -897,7 +955,6 @@ Installer.prototype.printInstalledForParseable = function (diffs, cb) {
(previousVersion || '') + '\t' +
(previousPath || ''))
})
- return cb()
}
Installer.prototype.debugActions = function (name, actionListName, cb) {
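The contributor counting added to `printInstalledForHuman` above hinges on normalizing each `author`/`contributors` entry to a string so a `Set` can count unique people. A self-contained sketch of that flow (sample package data is illustrative):

```javascript
// Convert email-style strings or {name, email, url} objects into a
// single canonical string, then collect them into a Set so the same
// person counted from two packages is only counted once.
function normalizePerson (argument) {
  if (typeof argument === 'string') return argument
  var returned = ''
  if (argument.name) returned += argument.name
  if (argument.email) returned += ' <' + argument.email + '>'
  if (argument.url) returned += ' (' + argument.url + ')'
  return returned
}

const contributors = new Set()
const added = [
  {author: 'Ann <ann@example.com>'},
  {author: {name: 'Ann', email: 'ann@example.com'}, contributors: [{name: 'Bob'}]}
]
added.forEach(pkg => {
  const people = [].concat(pkg.author || [], pkg.contributors || [])
  people.forEach(person => {
    // Ignore malformed author/contributors entries, as the patch does.
    try { contributors.add(normalizePerson(person)) } catch (e) {}
  })
})
```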
diff --git a/deps/npm/lib/install/action/extract-worker.js b/deps/npm/lib/install/action/extract-worker.js
index 24508c780495eb..2b082b4a574c25 100644
--- a/deps/npm/lib/install/action/extract-worker.js
+++ b/deps/npm/lib/install/action/extract-worker.js
@@ -10,9 +10,9 @@ module.exports = (args, cb) => {
const spec = parsed[0]
const extractTo = parsed[1]
const opts = parsed[2]
- if (!opts.log && opts.loglevel) {
+ if (!opts.log) {
opts.log = npmlog
- opts.log.level = opts.loglevel
}
+ opts.log.level = opts.loglevel || opts.log.level
BB.resolve(extract(spec, extractTo, opts)).nodeify(cb)
}
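The `extract-worker.js` hunk above fixes a logging edge case: previously the log level was applied only when `opts.loglevel` was set and `opts.log` was missing. The new behavior can be sketched as a small helper (`applyLogOpts` is a hypothetical name; the worker inlines this):

```javascript
// Always attach a default logger when none is supplied, then let an
// explicit loglevel override it, falling back to the logger's own level.
function applyLogOpts (opts, defaultLog) {
  if (!opts.log) {
    opts.log = defaultLog
  }
  opts.log.level = opts.loglevel || opts.log.level
  return opts
}

const withLevel = applyLogOpts({loglevel: 'silly'}, {level: 'notice'})
const withoutLevel = applyLogOpts({}, {level: 'warn'})
```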
diff --git a/deps/npm/lib/install/action/extract.js b/deps/npm/lib/install/action/extract.js
index 6b827f36ea92fd..e8d7a6c4f6d1f0 100644
--- a/deps/npm/lib/install/action/extract.js
+++ b/deps/npm/lib/install/action/extract.js
@@ -4,9 +4,7 @@ const BB = require('bluebird')
const stat = BB.promisify(require('graceful-fs').stat)
const gentlyRm = BB.promisify(require('../../utils/gently-rm.js'))
-const log = require('npmlog')
const mkdirp = BB.promisify(require('mkdirp'))
-const moduleName = require('../../utils/module-name.js')
const moduleStagingPath = require('../module-staging-path.js')
const move = require('../../utils/move.js')
const npa = require('npm-package-arg')
@@ -59,12 +57,11 @@ function extract (staging, pkg, log) {
pacoteOpts = require('../../config/pacote')
}
const opts = pacoteOpts({
- integrity: pkg.package._integrity
+ integrity: pkg.package._integrity,
+ resolved: pkg.package._resolved
})
const args = [
- pkg.package._resolved
- ? npa.resolve(pkg.package.name, pkg.package._resolved)
- : pkg.package._requested,
+ pkg.package._requested,
extractTo,
opts
]
@@ -112,18 +109,6 @@ function readBundled (pkg, staging, extractTo) {
}, {concurrency: 10})
}
-function getTree (pkg) {
- while (pkg.parent) pkg = pkg.parent
- return pkg
-}
-
-function warn (pkg, code, msg) {
- const tree = getTree(pkg)
- const err = new Error(msg)
- err.code = code
- tree.warnings.push(err)
-}
-
function stageBundledModule (bundler, child, staging, parentPath) {
const stageFrom = path.join(parentPath, 'node_modules', child.package.name)
const stageTo = moduleStagingPath(staging, child)
@@ -146,15 +131,6 @@ function finishModule (bundler, child, stageTo, stageFrom) {
return move(stageFrom, stageTo)
})
} else {
- return stat(stageFrom).then(() => {
- const bundlerId = packageId(bundler)
- if (!getTree(bundler).warnings.some((w) => {
- return w.code === 'EBUNDLEOVERRIDE'
- })) {
- warn(bundler, 'EBUNDLEOVERRIDE', `${bundlerId} had bundled packages that do not match the required version(s). They have been replaced with non-bundled versions.`)
- }
- log.verbose('bundle', `EBUNDLEOVERRIDE: Replacing ${bundlerId}'s bundled version of ${moduleName(child)} with ${packageId(child)}.`)
- return gentlyRm(stageFrom)
- }, () => {})
+ return stat(stageFrom).then(() => gentlyRm(stageFrom), () => {})
}
}
diff --git a/deps/npm/lib/install/action/fetch.js b/deps/npm/lib/install/action/fetch.js
index a4d760fe829a2b..5ad34e29dd27ef 100644
--- a/deps/npm/lib/install/action/fetch.js
+++ b/deps/npm/lib/install/action/fetch.js
@@ -12,5 +12,5 @@ function fetch (staging, pkg, log, next) {
log.silly('fetch', packageId(pkg))
const opts = pacoteOpts({integrity: pkg.package._integrity})
return finished(pacote.tarball.stream(pkg.package._requested, opts))
- .then(() => next(), next)
+ .then(() => next(), next)
}
diff --git a/deps/npm/lib/install/actions.js b/deps/npm/lib/install/actions.js
index 9608a943a5aeec..a34d03ffe21465 100644
--- a/deps/npm/lib/install/actions.js
+++ b/deps/npm/lib/install/actions.js
@@ -118,7 +118,7 @@ function doParallel (type, staging, actionsToRun, log, next) {
}
return acc
}, [])
- log.silly('doParallel', type + ' ' + actionsToRun.length)
+ log.silly('doParallel', type + ' ' + acts.length)
time(log)
if (!acts.length) { return next() }
return withInit(actions[type], () => {
diff --git a/deps/npm/lib/install/audit.js b/deps/npm/lib/install/audit.js
new file mode 100644
index 00000000000000..23a60beb311389
--- /dev/null
+++ b/deps/npm/lib/install/audit.js
@@ -0,0 +1,282 @@
+'use strict'
+exports.generate = generate
+exports.generateFromInstall = generateFromInstall
+exports.submitForInstallReport = submitForInstallReport
+exports.submitForFullReport = submitForFullReport
+exports.printInstallReport = printInstallReport
+exports.printParseableReport = printParseableReport
+exports.printFullReport = printFullReport
+
+const Bluebird = require('bluebird')
+const auditReport = require('npm-audit-report')
+const treeToShrinkwrap = require('../shrinkwrap.js').treeToShrinkwrap
+const packageId = require('../utils/package-id.js')
+const output = require('../utils/output.js')
+const npm = require('../npm.js')
+const qw = require('qw')
+const registryFetch = require('npm-registry-fetch')
+const zlib = require('zlib')
+const gzip = Bluebird.promisify(zlib.gzip)
+const log = require('npmlog')
+const perf = require('../utils/perf.js')
+const url = require('url')
+const npa = require('npm-package-arg')
+const uuid = require('uuid')
+const ssri = require('ssri')
+const cloneDeep = require('lodash.clonedeep')
+const pacoteOpts = require('../config/pacote.js')
+
+// used when scrubbing module names/specifiers
+const runId = uuid.v4()
+
+function submitForInstallReport (auditData) {
+ const cfg = npm.config // avoid the no-dynamic-lookups test
+ const scopedRegistries = cfg.keys.filter(_ => /:registry$/.test(_)).map(_ => cfg.get(_))
+ perf.emit('time', 'audit compress')
+ // TODO: registryFetch will be adding native support for `Content-Encoding: gzip` at which point
+ // we'll pass in something like `gzip: true` and not need to JSON stringify, gzip or headers.
+ return gzip(JSON.stringify(auditData)).then(body => {
+ perf.emit('timeEnd', 'audit compress')
+    log.info('audit', 'Submitting payload of ' + body.length + ' bytes')
+ scopedRegistries.forEach(reg => {
+ // we don't care about the response so destroy the stream if we can, or leave it flowing
+ // so it can eventually finish and clean up after itself
+ fetchAudit(url.resolve(reg, '/-/npm/v1/security/audits/quick'))
+ .then(_ => {
+ _.body.on('error', () => {})
+ if (_.body.destroy) {
+ _.body.destroy()
+ } else {
+ _.body.resume()
+ }
+ }, _ => {})
+ })
+ perf.emit('time', 'audit submit')
+ return fetchAudit('/-/npm/v1/security/audits/quick', body).then(response => {
+ perf.emit('timeEnd', 'audit submit')
+ perf.emit('time', 'audit body')
+ return response.json()
+ }).then(result => {
+ perf.emit('timeEnd', 'audit body')
+ return result
+ })
+ })
+}
+
+function submitForFullReport (auditData) {
+ perf.emit('time', 'audit compress')
+ // TODO: registryFetch will be adding native support for `Content-Encoding: gzip` at which point
+ // we'll pass in something like `gzip: true` and not need to JSON stringify, gzip or headers.
+ return gzip(JSON.stringify(auditData)).then(body => {
+ perf.emit('timeEnd', 'audit compress')
+ log.info('audit', 'Submitting payload of ' + body.length + ' bytes')
+ perf.emit('time', 'audit submit')
+ return fetchAudit('/-/npm/v1/security/audits', body).then(response => {
+ perf.emit('timeEnd', 'audit submit')
+ perf.emit('time', 'audit body')
+ return response.json()
+ }).then(result => {
+ perf.emit('timeEnd', 'audit body')
+ result.runId = runId
+ return result
+ })
+ })
+}
+
+function fetchAudit (href, body) {
+ const opts = pacoteOpts()
+ return registryFetch(href, {
+ method: 'POST',
+ headers: { 'Content-Encoding': 'gzip', 'Content-Type': 'application/json' },
+ config: npm.config,
+ npmSession: opts.npmSession,
+ projectScope: npm.projectScope,
+ log: log,
+ body: body
+ })
+}
+
+function printInstallReport (auditResult) {
+ return auditReport(auditResult, {
+ reporter: 'install',
+ withColor: npm.color,
+ withUnicode: npm.config.get('unicode')
+ }).then(result => output(result.report))
+}
+
+function printFullReport (auditResult) {
+ return auditReport(auditResult, {
+ log: output,
+ reporter: npm.config.get('json') ? 'json' : 'detail',
+ withColor: npm.color,
+ withUnicode: npm.config.get('unicode')
+ }).then(result => output(result.report))
+}
+
+function printParseableReport (auditResult) {
+ return auditReport(auditResult, {
+ log: output,
+ reporter: 'parseable',
+ withColor: npm.color,
+ withUnicode: npm.config.get('unicode')
+ }).then(result => output(result.report))
+}
+
+function generate (shrinkwrap, requires, diffs, install, remove) {
+ const sw = cloneDeep(shrinkwrap)
+ delete sw.lockfileVersion
+ sw.requires = scrubRequires(requires)
+ scrubDeps(sw.dependencies)
+
+ // sw.diffs = diffs || {}
+ sw.install = (install || []).map(scrubArg)
+ sw.remove = (remove || []).map(scrubArg)
+ return generateMetadata().then((md) => {
+ sw.metadata = md
+ return sw
+ })
+}
+
+const scrubKeys = qw`version`
+const deleteKeys = qw`from resolved`
+
+function scrubDeps (deps) {
+ if (!deps) return
+ Object.keys(deps).forEach(name => {
+ if (!shouldScrubName(name) && !shouldScrubSpec(name, deps[name].version)) return
+ const value = deps[name]
+ delete deps[name]
+ deps[scrub(name)] = value
+ })
+ Object.keys(deps).forEach(name => {
+ for (let toScrub of scrubKeys) {
+ if (!deps[name][toScrub]) continue
+ deps[name][toScrub] = scrubSpec(name, deps[name][toScrub])
+ }
+ for (let toDelete of deleteKeys) delete deps[name][toDelete]
+
+ scrubRequires(deps[name].requires)
+ scrubDeps(deps[name].dependencies)
+ })
+}
+
+function scrubRequires (reqs) {
+ if (!reqs) return reqs
+ Object.keys(reqs).forEach(name => {
+ const spec = reqs[name]
+ if (shouldScrubName(name) || shouldScrubSpec(name, spec)) {
+ delete reqs[name]
+ reqs[scrub(name)] = scrubSpec(name, spec)
+ } else {
+ reqs[name] = scrubSpec(name, spec)
+ }
+ })
+ return reqs
+}
+
+function getScope (name) {
+ if (name[0] === '@') return name.slice(0, name.indexOf('/'))
+}
+
+function shouldScrubName (name) {
+ const scope = getScope(name)
+ const cfg = npm.config // avoid the no-dynamic-lookups test
+ return Boolean(scope && cfg.get(scope + ':registry'))
+}
+function shouldScrubSpec (name, spec) {
+ const req = npa.resolve(name, spec)
+ return !req.registry
+}
+
+function scrubArg (arg) {
+ const req = npa(arg)
+ let name = req.name
+ if (shouldScrubName(name) || shouldScrubSpec(name, req.rawSpec)) {
+ name = scrubName(name)
+ }
+ const spec = scrubSpec(req.name, req.rawSpec)
+ return name + '@' + spec
+}
+
+function scrubName (name) {
+ return shouldScrubName(name) ? scrub(name) : name
+}
+
+function scrubSpec (name, spec) {
+ const req = npa.resolve(name, spec)
+ if (req.registry) return spec
+ if (req.type === 'git') {
+ return 'git+ssh://' + scrub(spec)
+ } else if (req.type === 'remote') {
+ return 'https://' + scrub(spec)
+ } else if (req.type === 'directory') {
+ return 'file:' + scrub(spec)
+ } else if (req.type === 'file') {
+ return 'file:' + scrub(spec) + '.tar'
+ } else {
+ return scrub(spec)
+ }
+}
+
+module.exports.scrub = scrub
+function scrub (value, rid) {
+ return ssri.fromData((rid || runId) + ' ' + value, {algorithms: ['sha256']}).hexDigest()
+}
+
+function generateMetadata () {
+ const meta = {}
+ meta.npm_version = npm.version
+ meta.node_version = process.version
+ meta.platform = process.platform
+ meta.node_env = process.env.NODE_ENV
+
+ return Promise.resolve(meta)
+}
+/*
+ const head = path.resolve(npm.prefix, '.git/HEAD')
+ return readFile(head, 'utf8').then((head) => {
+ if (!head.match(/^ref: /)) {
+ meta.commit_hash = head.trim()
+ return
+ }
+ const headFile = head.replace(/^ref: /, '').trim()
+ meta.branch = headFile.replace(/^refs[/]heads[/]/, '')
+ return readFile(path.resolve(npm.prefix, '.git', headFile), 'utf8')
+ }).then((commitHash) => {
+ meta.commit_hash = commitHash.trim()
+ const proc = spawn('git', qw`diff --quiet --exit-code package.json package-lock.json`, {cwd: npm.prefix, stdio: 'ignore'})
+ return new Promise((resolve, reject) => {
+ proc.once('error', reject)
+ proc.on('exit', (code, signal) => {
+ if (signal == null) meta.state = code === 0 ? 'clean' : 'dirty'
+ resolve()
+ })
+ })
+ }).then(() => meta, () => meta)
+*/
+
+function generateFromInstall (tree, diffs, install, remove) {
+ const requires = {}
+ tree.requires.forEach((pkg) => {
+ requires[pkg.package.name] = tree.package.dependencies[pkg.package.name] || tree.package.devDependencies[pkg.package.name] || pkg.package.version
+ })
+
+ const auditInstall = (install || []).filter((a) => a.name).map(packageId)
+ const auditRemove = (remove || []).filter((a) => a.name).map(packageId)
+ const auditDiffs = {}
+ diffs.forEach((action) => {
+ const mutation = action[0]
+ const child = action[1]
+ if (mutation !== 'add' && mutation !== 'update' && mutation !== 'remove') return
+ if (!auditDiffs[mutation]) auditDiffs[mutation] = []
+ if (mutation === 'add') {
+ auditDiffs[mutation].push({location: child.location})
+ } else if (mutation === 'update') {
+ auditDiffs[mutation].push({location: child.location, previous: packageId(child.oldPkg)})
+ } else if (mutation === 'remove') {
+ auditDiffs[mutation].push({previous: packageId(child)})
+ }
+ })
+
+ return generate(treeToShrinkwrap(tree), requires, auditDiffs, auditInstall, auditRemove)
+}
diff --git a/deps/npm/lib/install/copy-tree.js b/deps/npm/lib/install/copy-tree.js
index a5b558cf598b73..2bf7064f334896 100644
--- a/deps/npm/lib/install/copy-tree.js
+++ b/deps/npm/lib/install/copy-tree.js
@@ -1,27 +1,26 @@
'use strict'
var createNode = require('./node.js').create
-module.exports = function (tree, filter) {
- return copyTree(tree, {}, filter)
+module.exports = function (tree) {
+ return copyTree(tree, {})
}
-function copyTree (tree, cache, filter) {
- if (filter && !filter(tree)) { return null }
+function copyTree (tree, cache) {
if (cache[tree.path]) { return cache[tree.path] }
var newTree = cache[tree.path] = createNode(Object.assign({}, tree))
- copyModuleList(newTree, 'children', cache, filter)
+ copyModuleList(newTree, 'children', cache)
newTree.children.forEach(function (child) {
child.parent = newTree
})
- copyModuleList(newTree, 'requires', cache, filter)
- copyModuleList(newTree, 'requiredBy', cache, filter)
+ copyModuleList(newTree, 'requires', cache)
+ copyModuleList(newTree, 'requiredBy', cache)
return newTree
}
-function copyModuleList (tree, key, cache, filter) {
+function copyModuleList (tree, key, cache) {
var newList = []
if (tree[key]) {
tree[key].forEach(function (child) {
- const copy = copyTree(child, cache, filter)
+ const copy = copyTree(child, cache)
if (copy) {
newList.push(copy)
}
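Editor's note: copy-tree's cache (keyed by node path) is what makes the copy safe for shared and cyclic references: each path is copied exactly once and the copy is reused. A simplified sketch of the idea, covering only `children` (the real code also rewires `requires` and `requiredBy` via `copyModuleList`):

```javascript
// Memoize copies by node path so shared/cyclic references resolve to
// the same copied node instead of recursing forever.
function copyTree (tree, cache = {}) {
  if (cache[tree.path]) return cache[tree.path]
  const copy = cache[tree.path] = Object.assign({}, tree)
  copy.children = (tree.children || []).map((c) => copyTree(c, cache))
  copy.children.forEach((c) => { c.parent = copy })
  return copy
}
```

Note the cache entry is stored *before* recursing into children; that ordering is what terminates cycles.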
diff --git a/deps/npm/lib/install/decompose-actions.js b/deps/npm/lib/install/decompose-actions.js
index 57dc7cd6874647..ba08e6e7684e51 100644
--- a/deps/npm/lib/install/decompose-actions.js
+++ b/deps/npm/lib/install/decompose-actions.js
@@ -1,72 +1,79 @@
'use strict'
var validate = require('aproba')
-var asyncMap = require('slide').asyncMap
var npm = require('../npm.js')
module.exports = function (differences, decomposed, next) {
validate('AAF', arguments)
- asyncMap(differences, function (action, done) {
+ differences.forEach((action) => {
var cmd = action[0]
var pkg = action[1]
switch (cmd) {
case 'add':
- addSteps(decomposed, pkg, done)
+ addSteps(decomposed, pkg)
break
case 'update':
- updateSteps(decomposed, pkg, done)
+ updateSteps(decomposed, pkg)
break
case 'move':
- moveSteps(decomposed, pkg, done)
+ moveSteps(decomposed, pkg)
break
case 'remove':
- removeSteps(decomposed, pkg, done)
+ removeSteps(decomposed, pkg)
break
default:
- defaultSteps(decomposed, cmd, pkg, done)
+ defaultSteps(decomposed, cmd, pkg)
}
- }, next)
+ })
+ next()
+}
+
+function addAction (decomposed, action, pkg) {
+ if (decomposed.some((_) => _[0] === action && _[1] === pkg)) return
+ decomposed.push([action, pkg])
}
-function addSteps (decomposed, pkg, done) {
+function addSteps (decomposed, pkg) {
+ if (pkg.fromBundle) {
+ // make sure our source module exists to extract ourselves from
+ // if we're installing our source module anyway, the duplication
+ // of these steps will be elided by `addAction` automatically
+ addAction(decomposed, 'fetch', pkg.fromBundle)
+ addAction(decomposed, 'extract', pkg.fromBundle)
+ }
if (!pkg.fromBundle && !pkg.isLink) {
- decomposed.push(['fetch', pkg])
- decomposed.push(['extract', pkg])
+ addAction(decomposed, 'fetch', pkg)
+ addAction(decomposed, 'extract', pkg)
}
if (!pkg.fromBundle || npm.config.get('rebuild-bundle')) {
- decomposed.push(['preinstall', pkg])
- decomposed.push(['build', pkg])
- decomposed.push(['install', pkg])
- decomposed.push(['postinstall', pkg])
+ addAction(decomposed, 'preinstall', pkg)
+ addAction(decomposed, 'build', pkg)
+ addAction(decomposed, 'install', pkg)
+ addAction(decomposed, 'postinstall', pkg)
}
if (!pkg.fromBundle || !pkg.isLink) {
- decomposed.push(['finalize', pkg])
+ addAction(decomposed, 'finalize', pkg)
}
- decomposed.push(['refresh-package-json', pkg])
- done()
+ addAction(decomposed, 'refresh-package-json', pkg)
}
-function updateSteps (decomposed, pkg, done) {
- removeSteps(decomposed, pkg.oldPkg, () => {
- addSteps(decomposed, pkg, done)
- })
+function updateSteps (decomposed, pkg) {
+ removeSteps(decomposed, pkg.oldPkg)
+ addSteps(decomposed, pkg)
}
-function removeSteps (decomposed, pkg, done) {
- decomposed.push(['unbuild', pkg])
- decomposed.push(['remove', pkg])
- done()
+function removeSteps (decomposed, pkg) {
+ addAction(decomposed, 'unbuild', pkg)
+ addAction(decomposed, 'remove', pkg)
}
-function moveSteps (decomposed, pkg, done) {
- decomposed.push(['move', pkg])
- decomposed.push(['build', pkg])
- decomposed.push(['install', pkg])
- decomposed.push(['postinstall', pkg])
- decomposed.push(['refresh-package-json', pkg])
- done()
+function moveSteps (decomposed, pkg) {
+ addAction(decomposed, 'move', pkg)
+ addAction(decomposed, 'build', pkg)
+ addAction(decomposed, 'install', pkg)
+ addAction(decomposed, 'postinstall', pkg)
+ addAction(decomposed, 'refresh-package-json', pkg)
}
-function defaultSteps (decomposed, cmd, pkg, done) {
- decomposed.push([cmd, pkg])
- done()
+function defaultSteps (decomposed, cmd, pkg) {
+ addAction(decomposed, cmd, pkg)
}
diff --git a/deps/npm/lib/install/deps.js b/deps/npm/lib/install/deps.js
index 93c4adffd7e554..c36265093b090b 100644
--- a/deps/npm/lib/install/deps.js
+++ b/deps/npm/lib/install/deps.js
@@ -33,6 +33,7 @@ var getSaveType = require('./save.js').getSaveType
var unixFormatPath = require('../utils/unix-format-path.js')
var isExtraneous = require('./is-extraneous.js')
var isRegistry = require('../utils/is-registry.js')
+var hasModernMeta = require('./has-modern-meta.js')
// The export functions in this module mutate a dependency tree, adding
// items to them.
@@ -50,6 +51,12 @@ function doesChildVersionMatch (child, requested, requestor) {
return path.relative(child.realpath, requested.fetchSpec) === ''
}
+ if (requested.type === 'git' && child.fromShrinkwrap) {
+ const fromSw = child.package._from ? npa(child.package._from) : child.fromShrinkwrap
+ fromSw.name = requested.name // we're only checking specifiers here
+ if (fromSw.toString() === requested.toString()) return true
+ }
+
if (!registryTypes[requested.type]) {
var childReq = child.package._requested
if (childReq) {
@@ -65,7 +72,7 @@ function doesChildVersionMatch (child, requested, requestor) {
// You'll see this scenario happen with at least tags and git dependencies.
// Some buggy clients will write spaces into the module name part of a _from.
if (child.package._from) {
- var fromReq = npa.resolve(moduleName(child), child.package._from.replace(new RegExp('^\s*' + moduleName(child) + '\s*@'), ''))
+ var fromReq = npa.resolve(moduleName(child), child.package._from.replace(new RegExp('^\\s*' + moduleName(child) + '\\s*@'), ''))
if (fromReq.rawSpec === requested.rawSpec) return true
if (fromReq.type === requested.type && fromReq.saveSpec && fromReq.saveSpec === requested.saveSpec) return true
}
@@ -78,8 +85,8 @@ function doesChildVersionMatch (child, requested, requestor) {
}
}
-function childDependencySpecifier (tree, name, spec) {
- return npa.resolve(name, spec, packageRelativePath(tree))
+function childDependencySpecifier (tree, name, spec, where) {
+ return npa.resolve(name, spec, where || packageRelativePath(tree))
}
exports.computeMetadata = computeMetadata
@@ -104,14 +111,13 @@ function computeMetadata (tree, seen) {
resolveWithExistingModule(child, tree)
return true
}
- return
}
const deps = tree.package.dependencies || {}
const reqs = tree.swRequires || {}
for (let name of Object.keys(deps)) {
if (findChild(name, deps[name])) continue
- if (findChild(name, reqs[name])) continue
+ if (name in reqs && findChild(name, reqs[name])) continue
tree.missingDeps[name] = deps[name]
}
if (tree.isTop) {
@@ -186,15 +192,14 @@ function packageRelativePath (tree) {
var requested = tree.package._requested || {}
var isLocal = requested.type === 'directory' || requested.type === 'file'
return isLocal ? requested.fetchSpec
- : (tree.isLink || tree.isInLink) && !preserveSymlinks() ? tree.realpath
- : tree.path
+ : (tree.isLink || tree.isInLink) && !preserveSymlinks() ? tree.realpath
+ : tree.path
}
function matchingDep (tree, name) {
if (!tree || !tree.package) return
if (tree.package.dependencies && tree.package.dependencies[name]) return tree.package.dependencies[name]
if (tree.package.devDependencies && tree.package.devDependencies[name]) return tree.package.devDependencies[name]
- return
}
exports.getAllMetadata = function (args, tree, where, next) {
@@ -261,6 +266,7 @@ exports.loadRequestedDeps = function (args, tree, saveToDependencies, log, next)
delete tree.package[saveType][childName]
}
}
+ if (child.save === 'optionalDependencies') tree.package.dependencies[childName] = child.saveSpec
}
// For things the user asked to install, that aren't a dependency (or
@@ -282,10 +288,12 @@ function computeVersionSpec (tree, child) {
validate('OO', arguments)
var requested
var childReq = child.package._requested
- if (childReq && (isNotEmpty(childReq.saveSpec) || (isNotEmpty(childReq.rawSpec) && isNotEmpty(childReq.fetchSpec)))) {
+ if (child.isLink) {
+ requested = npa.resolve(child.package.name, 'file:' + child.realpath, getTop(tree).path)
+ } else if (childReq && (isNotEmpty(childReq.saveSpec) || (isNotEmpty(childReq.rawSpec) && isNotEmpty(childReq.fetchSpec)))) {
requested = child.package._requested
} else if (child.package._from) {
- requested = npa(child.package._from)
+ requested = npa(child.package._from, tree.path)
} else {
requested = npa.resolve(child.package.name, child.package.version)
}
@@ -299,7 +307,7 @@ function computeVersionSpec (tree, child) {
}
return rangeDescriptor + version
} else if (requested.type === 'directory' || requested.type === 'file') {
- return 'file:' + unixFormatPath(path.relative(tree.path, requested.fetchSpec))
+ return 'file:' + unixFormatPath(path.relative(getTop(tree).path, requested.fetchSpec))
} else {
return requested.saveSpec || requested.rawSpec
}
@@ -332,9 +340,21 @@ exports.removeDeps = function (args, tree, saveToDependencies, next) {
parent.requires = parent.requires.filter((child) => child !== pkgToRemove)
}
pkgToRemove.requiredBy = pkgToRemove.requiredBy.filter((parent) => parent !== tree)
+ flagAsRemoving(pkgToRemove)
}
next()
}
+
+function flagAsRemoving (toRemove, seen) {
+ if (!seen) seen = new Set()
+ if (seen.has(toRemove)) return
+ seen.add(toRemove)
+ toRemove.removing = true
+ toRemove.requires.forEach((required) => {
+ flagAsRemoving(required, seen)
+ })
+}
+
exports.removeExtraneous = function (args, tree, next) {
for (let pkg of args) {
var pkgName = moduleName(pkg)
@@ -369,8 +389,22 @@ function andForEachChild (load, next) {
function isDepOptional (tree, name, pkg) {
if (pkg.package && pkg.package._optional) return true
- if (!tree.package.optionalDependencies) return false
- if (tree.package.optionalDependencies[name] != null) return true
+ const optDeps = tree.package.optionalDependencies
+ if (optDeps && optDeps[name] != null) return true
+
+ const devDeps = tree.package.devDependencies
+ if (devDeps && devDeps[name] != null) {
+ const includeDev = npm.config.get('dev') ||
+ (!/^prod(uction)?$/.test(npm.config.get('only')) && !npm.config.get('production')) ||
+ /^dev(elopment)?$/.test(npm.config.get('only')) ||
+ /^dev(elopment)?$/.test(npm.config.get('also'))
+ return !includeDev
+ }
+ const prodDeps = tree.package.dependencies
+ if (prodDeps && prodDeps[name] != null) {
+ const includeProd = !/^dev(elopment)?$/.test(npm.config.get('only'))
+ return !includeProd
+ }
return false
}
@@ -461,12 +495,6 @@ function loadDeps (tree, log, next) {
if (!tree.package.dependencies) tree.package.dependencies = {}
asyncMap(Object.keys(tree.package.dependencies), function (dep, done) {
var version = tree.package.dependencies[dep]
- if (tree.package.optionalDependencies &&
- tree.package.optionalDependencies[dep] &&
- !npm.config.get('optional')) {
- return done()
- }
-
addDependency(dep, version, tree, log.newGroup('loadDep:' + dep), andHandleOptionalErrors(log, tree, dep, done))
}, andForEachChild(loadDeps, andFinishTracker(log, next)))
}
@@ -481,7 +509,7 @@ exports.loadDevDeps = function (tree, log, next) {
if (tree.package.dependencies[dep]) return done()
var logGroup = log.newGroup('loadDevDep:' + dep)
- addDependency(dep, tree.package.devDependencies[dep], tree, logGroup, done)
+ addDependency(dep, tree.package.devDependencies[dep], tree, logGroup, andHandleOptionalErrors(log, tree, dep, done))
}, andForEachChild(loadDeps, andFinishTracker(log, next)))
}
@@ -519,14 +547,14 @@ function addDependency (name, versionSpec, tree, log, done) {
try {
var req = childDependencySpecifier(tree, name, versionSpec)
if (tree.swRequires && tree.swRequires[name]) {
- var swReq = childDependencySpecifier(tree, name, tree.swRequires[name])
+ var swReq = childDependencySpecifier(tree, name, tree.swRequires[name], tree.package._where)
}
} catch (err) {
return done(err)
}
var child = findRequirement(tree, name, req)
if (!child && swReq) child = findRequirement(tree, name, swReq)
- if (child) {
+ if (hasModernMeta(child)) {
resolveWithExistingModule(child, tree)
if (child.package._shrinkwrap === undefined) {
readShrinkwrap.andInflate(child, function (er) { next(er, child, log) })
@@ -534,12 +562,42 @@ function addDependency (name, versionSpec, tree, log, done) {
next(null, child, log)
}
} else {
+ if (child) {
+ if (req.registry) {
+ req = childDependencySpecifier(tree, name, child.package.version)
+ }
+ if (child.fromBundle) reportBundleOverride(child, log)
+ removeObsoleteDep(child, log)
+ }
fetchPackageMetadata(req, packageRelativePath(tree), {tracker: log.newItem('fetchMetadata')}, iferr(next, function (pkg) {
resolveWithNewModule(pkg, tree, log, next)
}))
}
}
+function getTop (pkg) {
+ const seen = new Set()
+ while (pkg.parent && !seen.has(pkg.parent)) {
+ pkg = pkg.parent
+ seen.add(pkg)
+ }
+ return pkg
+}
+
+function reportBundleOverride (child, log) {
+ const code = 'EBUNDLEOVERRIDE'
+ const top = getTop(child.fromBundle)
+ const bundlerId = packageId(child.fromBundle)
+ if (!top.warnings.some((w) => {
+ return w.code === code
+ })) {
+ const err = new Error(`${bundlerId} had bundled packages that do not match the required version(s). They have been replaced with non-bundled versions.`)
+ err.code = code
+ top.warnings.push(err)
+ }
+ if (log) log.verbose('bundle', `${code}: Replacing ${bundlerId}'s bundled version of ${moduleName(child)} with ${packageId(child)}.`)
+}
+
function resolveWithExistingModule (child, tree) {
validate('OO', arguments)
addRequiredDep(tree, child)
@@ -592,7 +650,7 @@ function resolveWithNewModule (pkg, tree, log, next) {
return isInstallable(pkg, (err) => {
let installable = !err
addBundled(pkg, (bundleErr) => {
- var parent = earliestInstallable(tree, tree, pkg) || tree
+ var parent = earliestInstallable(tree, tree, pkg, log) || tree
var isLink = pkg._requested.type === 'directory'
var child = createChild({
package: pkg,
@@ -609,7 +667,10 @@ function resolveWithNewModule (pkg, tree, log, next) {
var hasBundled = child.children.length
var replaced = replaceModuleByName(parent, 'children', child)
- if (replaced) removeObsoleteDep(replaced)
+ if (replaced) {
+ if (replaced.fromBundle) reportBundleOverride(replaced, log)
+ removeObsoleteDep(replaced)
+ }
addRequiredDep(tree, child)
child.location = flatNameFromTree(child)
@@ -694,12 +755,25 @@ function preserveSymlinks () {
// Find the highest level in the tree that we can install this module in.
// If the module isn't installed above us yet, that'd be the very top.
// If it is, then it's the level below where its installed.
-var earliestInstallable = exports.earliestInstallable = function (requiredBy, tree, pkg) {
- validate('OOO', arguments)
+var earliestInstallable = exports.earliestInstallable = function (requiredBy, tree, pkg, log) {
+ validate('OOOO', arguments)
+
function undeletedModuleMatches (child) {
return !child.removed && moduleName(child) === pkg.name
}
- if (tree.children.some(undeletedModuleMatches)) return null
+ const undeletedMatches = tree.children.filter(undeletedModuleMatches)
+ if (undeletedMatches.length) {
+ // if there's a conflict with another child AT THE SAME level then we're replacing it, so
+ // mark it as removed and continue with resolution normally.
+ if (tree === requiredBy) {
+ undeletedMatches.forEach((pkg) => {
+ if (pkg.fromBundle) reportBundleOverride(pkg, log)
+ removeObsoleteDep(pkg, log)
+ })
+ } else {
+ return null
+ }
+ }
// If any of the children of this tree have conflicting
// binaries then we need to decline to install this package here.
@@ -738,5 +812,5 @@ var earliestInstallable = exports.earliestInstallable = function (requiredBy, tr
if (!preserveSymlinks() && /^[.][.][\\/]/.test(path.relative(tree.parent.realpath, tree.realpath))) return tree
- return (earliestInstallable(requiredBy, tree.parent, pkg) || tree)
+ return (earliestInstallable(requiredBy, tree.parent, pkg, log) || tree)
}
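Editor's note: among the deps.js changes, the new `getTop()` walks parent pointers up to the root with a seen-set guard, so even a malformed, cyclic tree cannot hang the installer. A standalone sketch:

```javascript
// Walk to the topmost ancestor; the Set stops the loop if parent
// pointers ever form a cycle.
function getTop (pkg) {
  const seen = new Set()
  while (pkg.parent && !seen.has(pkg.parent)) {
    pkg = pkg.parent
    seen.add(pkg)
  }
  return pkg
}

const root = {name: 'root'}
const child = {name: 'child', parent: root}
const leaf = {name: 'leaf', parent: child}
console.log(getTop(leaf).name) // root
```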
diff --git a/deps/npm/lib/install/diff-trees.js b/deps/npm/lib/install/diff-trees.js
index 4316f351cc6f6a..346846fdc0ffed 100644
--- a/deps/npm/lib/install/diff-trees.js
+++ b/deps/npm/lib/install/diff-trees.js
@@ -8,6 +8,7 @@ var log = require('npmlog')
var path = require('path')
var ssri = require('ssri')
var moduleName = require('../utils/module-name.js')
+var isOnlyOptional = require('./is-only-optional.js')
// we don't use get-requested because we're operating on files on disk, and
// we don't want to extropolate from what _should_ be there.
@@ -50,7 +51,7 @@ function pkgIntegrity (pkg) {
if (Object.keys(integrity).length === 0) return
return integrity
} catch (ex) {
- return
+
}
}
@@ -70,6 +71,9 @@ function sriMatch (aa, bb) {
function pkgAreEquiv (aa, bb) {
// coming in we know they share a path…
+ // if one is inside a link and the other is not, then they are not equivalent
+ // this happens when we're replacing a linked dep with a non-linked version
+ if (aa.isInLink !== bb.isInLink) return false
// if they share package metadata _identity_, they're the same thing
if (aa.package === bb.package) return true
// if they share integrity information, they're the same thing
@@ -162,6 +166,11 @@ var sortActions = module.exports.sortActions = function (differences) {
sorted.unshift(action)
}
+ // safety net, anything excluded above gets tacked on the end
+ differences.forEach((_) => {
+ if (sorted.indexOf(_) === -1) sorted.push(_)
+ })
+
return sorted
}
@@ -213,9 +222,8 @@ var diffTrees = module.exports._diffTrees = function (oldTree, newTree) {
pkg.fromPath = toMv.pkg.path
setAction(differences, 'move', pkg)
delete toRemove[toMv.flatname]
- // we don't generate add actions for things found in links (which already exist on disk) or
- // for bundled modules (which will be installed when we install their parent)
- } else if (!(pkg.isInLink && pkg.fromBundle)) {
+ // we don't generate add actions for things found in links (which already exist on disk)
+ } else if (!pkg.isInLink || !(pkg.fromBundle && pkg.fromBundle.isLink)) {
setAction(differences, 'add', pkg)
}
}
@@ -227,18 +235,26 @@ var diffTrees = module.exports._diffTrees = function (oldTree, newTree) {
.map((flatname) => toRemove[flatname])
.forEach((pkg) => setAction(differences, 'remove', pkg))
+ return filterActions(differences)
+}
+
+function filterActions (differences) {
+ const includeOpt = npm.config.get('optional')
const includeDev = npm.config.get('dev') ||
(!/^prod(uction)?$/.test(npm.config.get('only')) && !npm.config.get('production')) ||
/^dev(elopment)?$/.test(npm.config.get('only')) ||
/^dev(elopment)?$/.test(npm.config.get('also'))
const includeProd = !/^dev(elopment)?$/.test(npm.config.get('only'))
- if (!includeProd || !includeDev) {
- log.silly('diff-trees', 'filtering actions:', 'includeDev', includeDev, 'includeProd', includeProd)
- differences = differences.filter((diff) => {
- const pkg = diff[1]
- const pkgIsOnlyDev = isOnlyDev(pkg)
- return (!includeProd && pkgIsOnlyDev) || (includeDev && pkgIsOnlyDev) || (includeProd && !pkgIsOnlyDev)
- })
- }
- return differences
+ if (includeProd && includeDev && includeOpt) return differences
+
+ log.silly('diff-trees', 'filtering actions:', 'includeDev', includeDev, 'includeProd', includeProd, 'includeOpt', includeOpt)
+ return differences.filter((diff) => {
+ const pkg = diff[1]
+ const pkgIsOnlyDev = isOnlyDev(pkg)
+ const pkgIsOnlyOpt = isOnlyOptional(pkg)
+ if (!includeProd && pkgIsOnlyDev) return true
+ if (includeDev && pkgIsOnlyDev) return true
+ if (includeProd && !pkgIsOnlyDev && (includeOpt || !pkgIsOnlyOpt)) return true
+ return false
+ })
}
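Editor's note: the refactored `filterActions()` reduces to a pure predicate over three include flags and two per-package classifications. Extracted as a standalone function for clarity (the function name and parameter names are mine, mirroring the diff's locals):

```javascript
// Keep an action when the package's dev/optional classification
// matches what this install is configured to include.
function keepAction (pkgIsOnlyDev, pkgIsOnlyOpt, includeDev, includeProd, includeOpt) {
  if (!includeProd && pkgIsOnlyDev) return true
  if (includeDev && pkgIsOnlyDev) return true
  if (includeProd && !pkgIsOnlyDev && (includeOpt || !pkgIsOnlyOpt)) return true
  return false
}

// e.g. an optional-only dependency under --no-optional is dropped:
console.log(keepAction(false, true, true, true, false)) // false
```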
diff --git a/deps/npm/lib/install/get-requested.js b/deps/npm/lib/install/get-requested.js
index f6c44d14634356..ab410ffc9b6e3c 100644
--- a/deps/npm/lib/install/get-requested.js
+++ b/deps/npm/lib/install/get-requested.js
@@ -2,9 +2,9 @@
const npa = require('npm-package-arg')
const moduleName = require('../utils/module-name.js')
-module.exports = function (child) {
+module.exports = function (child, reqBy) {
if (!child.requiredBy.length) return
- const reqBy = child.requiredBy[0]
+ if (!reqBy) reqBy = child.requiredBy[0]
const deps = reqBy.package.dependencies || {}
const devDeps = reqBy.package.devDependencies || {}
const name = moduleName(child)
diff --git a/deps/npm/lib/install/has-modern-meta.js b/deps/npm/lib/install/has-modern-meta.js
new file mode 100644
index 00000000000000..bf801d0d31f5f7
--- /dev/null
+++ b/deps/npm/lib/install/has-modern-meta.js
@@ -0,0 +1,20 @@
+'use strict'
+module.exports = hasModernMeta
+
+const npa = require('npm-package-arg')
+const moduleName = require('../utils/module-name.js')
+
+function isLink (child) {
+ return child.isLink || (child.parent && isLink(child.parent))
+}
+
+function hasModernMeta (child) {
+ if (!child) return false
+ const resolved = child.package._resolved && npa.resolve(moduleName(child), child.package._resolved)
+ const version = npa.resolve(moduleName(child), child.package.version)
+ return child.isTop ||
+ isLink(child) ||
+ child.fromBundle || child.package._inBundle ||
+ child.package._integrity || child.package._shasum ||
+ (resolved && resolved.type === 'git') || (version && version.type === 'git')
+}
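Editor's note: the `isLink()` helper in the new file walks parent pointers, so a package counts as linked when it or any ancestor is a symlinked directory. A sketch (coerced to a boolean here; the original returns whatever truthy value it finds):

```javascript
// A node is "in a link" if it is a link itself or sits anywhere
// under a linked ancestor.
function isLink (child) {
  return Boolean(child.isLink || (child.parent && isLink(child.parent)))
}

const top = {isLink: true}
const nested = {isLink: false, parent: {isLink: false, parent: top}}
console.log(isLink(nested)) // true
```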
diff --git a/deps/npm/lib/install/inflate-shrinkwrap.js b/deps/npm/lib/install/inflate-shrinkwrap.js
index 43ac9136f010f4..bf1ab7065724c8 100644
--- a/deps/npm/lib/install/inflate-shrinkwrap.js
+++ b/deps/npm/lib/install/inflate-shrinkwrap.js
@@ -14,6 +14,9 @@ const realizeShrinkwrapSpecifier = require('./realize-shrinkwrap-specifier.js')
const validate = require('aproba')
const path = require('path')
const isRegistry = require('../utils/is-registry.js')
+const hasModernMeta = require('./has-modern-meta.js')
+const ssri = require('ssri')
+const npa = require('npm-package-arg')
module.exports = function (tree, sw, opts, finishInflating) {
if (!fetchPackageMetadata) {
@@ -66,11 +69,43 @@ function normalizePackageDataNoErrors (pkg) {
}
}
+function quotemeta (str) {
+ return str.replace(/([^A-Za-z_0-9/])/g, '\\$1')
+}
+
+function tarballToVersion (name, tb) {
+ const registry = quotemeta(npm.config.get('registry'))
+ .replace(/https?:/, 'https?:')
+ .replace(/([^/])$/, '$1/')
+ let matchRegTarball
+ if (name) {
+ const nameMatch = quotemeta(name)
+ matchRegTarball = new RegExp(`^${registry}${nameMatch}/-/${nameMatch}-(.*)[.]tgz$`)
+ } else {
+ matchRegTarball = new RegExp(`^${registry}(.*)?/-/\\1-(.*)[.]tgz$`)
+ }
+ const match = tb.match(matchRegTarball)
+ if (!match) return
+ return match[2] || match[1]
+}
+
function inflatableChild (onDiskChild, name, topPath, tree, sw, requested, opts) {
validate('OSSOOOO|ZSSOOOO', arguments)
- if (onDiskChild && childIsEquivalent(sw, requested, onDiskChild)) {
+ const usesIntegrity = (
+ requested.registry ||
+ requested.type === 'remote' ||
+ requested.type === 'file'
+ )
+ const regTarball = tarballToVersion(name, sw.version)
+ if (regTarball) {
+ sw.resolved = sw.version
+ sw.version = regTarball
+ }
+ if (sw.requires) Object.keys(sw.requires).map(_ => { sw.requires[_] = tarballToVersion(_, sw.requires[_]) || sw.requires[_] })
+ const modernLink = requested.type === 'directory' && !sw.from
+ if (hasModernMeta(onDiskChild) && childIsEquivalent(sw, requested, onDiskChild)) {
// The version on disk matches the shrinkwrap entry.
- if (!onDiskChild.fromShrinkwrap) onDiskChild.fromShrinkwrap = true
+ if (!onDiskChild.fromShrinkwrap) onDiskChild.fromShrinkwrap = requested
onDiskChild.package._requested = requested
onDiskChild.package._spec = requested.rawSpec
onDiskChild.package._where = topPath
@@ -88,7 +123,7 @@ function inflatableChild (onDiskChild, name, topPath, tree, sw, requested, opts)
onDiskChild.swRequires = sw.requires
tree.children.push(onDiskChild)
return BB.resolve(onDiskChild)
- } else if ((sw.version && sw.integrity) || sw.bundled) {
+ } else if ((sw.version && (sw.integrity || !usesIntegrity) && (requested.type !== 'directory' || modernLink)) || sw.bundled) {
// The shrinkwrap entry has an integrity field. We can fake a pkg to get
// the installer to do a content-address fetch from the cache, if possible.
return BB.resolve(makeFakeChild(name, topPath, tree, sw, requested))
@@ -100,13 +135,18 @@ function inflatableChild (onDiskChild, name, topPath, tree, sw, requested, opts)
}
}
+function isGit (sw) {
+ const version = npa.resolve(sw.name, sw.version)
+ return (version && version.type === 'git')
+}
+
function makeFakeChild (name, topPath, tree, sw, requested) {
const from = sw.from || requested.raw
const pkg = {
name: name,
version: sw.version,
_id: name + '@' + sw.version,
- _resolved: adaptResolved(requested, sw.resolved),
+ _resolved: sw.resolved || (isGit(sw) && sw.version),
_requested: requested,
_optional: sw.optional,
_development: sw.dev,
@@ -127,15 +167,15 @@ function makeFakeChild (name, topPath, tree, sw, requested) {
}
const child = createChild({
package: pkg,
- loaded: true,
+ loaded: false,
parent: tree,
children: [],
- fromShrinkwrap: true,
+ fromShrinkwrap: requested,
fakeChild: sw,
fromBundle: sw.bundled ? tree.fromBundle || tree : null,
path: childPath(tree.path, pkg),
- realpath: childPath(tree.realpath, pkg),
- location: tree.location + '/' + pkg.name,
+ realpath: requested.type === 'directory' ? requested.fetchSpec : childPath(tree.realpath, pkg),
+ location: (tree.location === '/' ? '' : tree.location + '/') + pkg.name,
isLink: requested.type === 'directory',
isInLink: tree.isLink,
swRequires: sw.requires
@@ -144,23 +184,6 @@ function makeFakeChild (name, topPath, tree, sw, requested) {
return child
}
-function adaptResolved (requested, resolved) {
- const registry = requested.scope
- ? npm.config.get(`${requested.scope}:registry`) || npm.config.get('registry')
- : npm.config.get('registry')
- if (!isRegistry(requested) || (resolved && resolved.indexOf(registry) === 0)) {
- // Nothing to worry about here. Pass it through.
- return resolved
- } else {
- // We could fast-path for registry.npmjs.org here, but if we do, it
- // would end up getting written back to the `resolved` field. By always
- // returning `null` for other registries, `pacote.extract()` will take
- // care of any required metadata fetches internally, without altering
- // the tree we're going to write out to shrinkwrap/lockfile.
- return null
- }
-}
-
function fetchChild (topPath, tree, sw, requested) {
return fetchPackageMetadata(requested, topPath).then((pkg) => {
pkg._from = sw.from || requested.raw
@@ -178,7 +201,7 @@ function fetchChild (topPath, tree, sw, requested) {
path: childPath(tree.path, pkg),
realpath: isLink ? requested.fetchSpec : childPath(tree.realpath, pkg),
children: pkg._bundled || [],
- location: tree.location + '/' + pkg.name,
+ location: (tree.location === '/' ? '' : tree.location + '/') + pkg.name,
fromBundle: null,
isLink: isLink,
isInLink: tree.isLink,
@@ -196,7 +219,11 @@ function fetchChild (topPath, tree, sw, requested) {
function childIsEquivalent (sw, requested, child) {
if (!child) return false
if (child.fromShrinkwrap) return true
- if (sw.integrity && child.package._integrity === sw.integrity) return true
+ if (
+ sw.integrity &&
+ child.package._integrity &&
+ ssri.parse(sw.integrity).match(child.package._integrity)
+ ) return true
if (child.isLink && requested.type === 'directory') return path.relative(child.realpath, requested.fetchSpec) === ''
if (sw.resolved) return child.package._resolved === sw.resolved
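The `tarballToVersion` helper added in this file recovers a version from a registry tarball URL so old lockfiles that stored tarball URLs in `version` can be normalized. A standalone sketch with the registry URL passed in as a parameter (the real helper reads `npm.config`, handles the unnamed back-reference form, and applies an http/https-flexibility replace that is omitted here):

```javascript
// Sketch of tarballToVersion from inflate-shrinkwrap.js: escape the registry
// URL and name, then capture the version out of the standard
// <registry>/<name>/-/<name>-<version>.tgz layout.
function quotemeta (str) {
  return str.replace(/([^A-Za-z_0-9/])/g, '\\$1')
}

function tarballToVersion (registryUrl, name, tb) {
  const registry = quotemeta(registryUrl).replace(/([^/])$/, '$1/')
  const nameMatch = quotemeta(name)
  const matchRegTarball = new RegExp(`^${registry}${nameMatch}/-/${nameMatch}-(.*)[.]tgz$`)
  const match = tb.match(matchRegTarball)
  return match ? match[1] : undefined
}

console.log(tarballToVersion(
  'https://registry.npmjs.org/',
  'lodash',
  'https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz'
)) // 4.17.21
```

A URL from a different host falls through and returns `undefined`, leaving the lockfile entry untouched.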
diff --git a/deps/npm/lib/install/is-only-optional.js b/deps/npm/lib/install/is-only-optional.js
index 7366e9abe1b326..72d6f065e6745b 100644
--- a/deps/npm/lib/install/is-only-optional.js
+++ b/deps/npm/lib/install/is-only-optional.js
@@ -11,8 +11,9 @@ function isOptional (node, seen) {
return false
}
seen.add(node)
-
+ const swOptional = node.fromShrinkwrap && node.package._optional
return node.requiredBy.every(function (req) {
+ if (req.fakeChild && swOptional) return true
return isOptDep(req, node.package.name) || isOptional(req, seen)
})
}
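The new short-circuit above treats a node as optional when the lockfile marked it `_optional` and the requirer is a fake (lockfile-inflated) child. A toy sketch of that `every` walk, with the `isOptDep` check and the recursive call stubbed out so only the new branch is exercised:

```javascript
// Sketch of the requiredBy check from is-only-optional.js. The stub return
// stands in for isOptDep(req, name) || isOptional(req, seen).
function isOnlyOptional (node) {
  const swOptional = node.fromShrinkwrap && node.package._optional
  return node.requiredBy.every((req) => {
    if (req.fakeChild && swOptional) return true
    return false // stand-in for the real optional-dep checks
  })
}

const node = {
  fromShrinkwrap: {},           // truthy: this node came from the lockfile
  package: { _optional: true }, // and the lockfile flagged it optional
  requiredBy: [{ fakeChild: {} }]
}
console.log(isOnlyOptional(node)) // true
```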
diff --git a/deps/npm/lib/install/read-shrinkwrap.js b/deps/npm/lib/install/read-shrinkwrap.js
index 45e883caa2f5e2..70746780111275 100644
--- a/deps/npm/lib/install/read-shrinkwrap.js
+++ b/deps/npm/lib/install/read-shrinkwrap.js
@@ -25,14 +25,7 @@ function readShrinkwrap (child, next) {
log.warn('read-shrinkwrap', 'Ignoring package-lock.json because there is already an npm-shrinkwrap.json. Please use only one of the two.')
}
const name = shrinkwrap ? 'npm-shrinkwrap.json' : 'package-lock.json'
- let parsed = null
- if (shrinkwrap || lockfile) {
- try {
- parsed = parseJSON(shrinkwrap || lockfile)
- } catch (ex) {
- throw ex
- }
- }
+ const parsed = parsePkgLock(shrinkwrap || lockfile, name)
if (parsed && parsed.lockfileVersion !== PKGLOCK_VERSION) {
log.warn('read-shrinkwrap', `This version of npm is compatible with lockfileVersion@${PKGLOCK_VERSION}, but ${name} was generated for lockfileVersion@${parsed.lockfileVersion || 0}. I'll try to do my best with it!`)
}
@@ -43,7 +36,8 @@ function readShrinkwrap (child, next) {
function maybeReadFile (name, child) {
return readFileAsync(
- path.join(child.path, name)
+ path.join(child.path, name),
+ 'utf8'
).catch({code: 'ENOENT'}, () => null)
}
@@ -56,3 +50,59 @@ module.exports.andInflate = function (child, next) {
}
}))
}
+
+const PARENT_RE = /\|{7,}/g
+const OURS_RE = /<{7,}/g
+const THEIRS_RE = /={7,}/g
+const END_RE = />{7,}/g
+
+module.exports._isDiff = isDiff
+function isDiff (str) {
+ return str.match(OURS_RE) && str.match(THEIRS_RE) && str.match(END_RE)
+}
+
+module.exports._parsePkgLock = parsePkgLock
+function parsePkgLock (str, filename) {
+ if (!str) { return null }
+ try {
+ return parseJSON(str)
+ } catch (e) {
+ if (isDiff(str)) {
+ log.warn('conflict', `A git conflict was detected in ${filename}. Attempting to auto-resolve.`)
+ log.warn('conflict', 'To make this happen automatically on git rebase/merge, consider using the npm-merge-driver:')
+ log.warn('conflict', '$ npx npm-merge-driver install -g')
+ const pieces = str.split(/[\n\r]+/g).reduce((acc, line) => {
+ if (line.match(PARENT_RE)) acc.state = 'parent'
+ else if (line.match(OURS_RE)) acc.state = 'ours'
+ else if (line.match(THEIRS_RE)) acc.state = 'theirs'
+ else if (line.match(END_RE)) acc.state = 'top'
+ else {
+ if (acc.state === 'top' || acc.state === 'ours') acc.ours += line
+ if (acc.state === 'top' || acc.state === 'theirs') acc.theirs += line
+ if (acc.state === 'top' || acc.state === 'parent') acc.parent += line
+ }
+ return acc
+ }, {
+ state: 'top',
+ ours: '',
+ theirs: '',
+ parent: ''
+ })
+ try {
+ const ours = parseJSON(pieces.ours)
+ const theirs = parseJSON(pieces.theirs)
+ return reconcileLockfiles(ours, theirs)
+ } catch (_e) {
+ log.error('conflict', `Automatic conflict resolution failed. Please manually resolve conflicts in ${filename} and try again.`)
+ log.silly('conflict', `Error during resolution: ${_e}`)
+ throw e
+ }
+ } else {
+ throw e
+ }
+ }
+}
+
+function reconcileLockfiles (parent, ours, theirs) {
+ return Object.assign({}, ours, theirs)
+}
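The auto-resolution above splits a conflicted lockfile along git's marker lines before re-parsing each side as JSON. A dependency-free sketch of that reducer (the `parent` branch for diff3-style `|||||||` markers is dropped for brevity):

```javascript
// Sketch of the conflict-splitting reducer from read-shrinkwrap.js: lines
// are routed into "ours" and "theirs" buffers based on marker state.
const OURS_RE = /<{7,}/
const THEIRS_RE = /={7,}/
const END_RE = />{7,}/

function splitConflict (str) {
  return str.split(/[\n\r]+/g).reduce((acc, line) => {
    if (line.match(OURS_RE)) acc.state = 'ours'
    else if (line.match(THEIRS_RE)) acc.state = 'theirs'
    else if (line.match(END_RE)) acc.state = 'top'
    else {
      if (acc.state === 'top' || acc.state === 'ours') acc.ours += line
      if (acc.state === 'top' || acc.state === 'theirs') acc.theirs += line
    }
    return acc
  }, { state: 'top', ours: '', theirs: '' })
}

const conflicted = [
  '{',
  '<<<<<<< HEAD',
  '"version": "1.0.0"',
  '=======',
  '"version": "1.0.1"',
  '>>>>>>> branch',
  '}'
].join('\n')

const pieces = splitConflict(conflicted)
console.log(JSON.parse(pieces.ours))   // { version: '1.0.0' }
console.log(JSON.parse(pieces.theirs)) // { version: '1.0.1' }
```

Lines outside any conflict region (state `top`) are shared by both sides, which is what lets each buffer parse as complete JSON.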
diff --git a/deps/npm/lib/install/save.js b/deps/npm/lib/install/save.js
index f0c61f555d64e4..7227e78852ac75 100644
--- a/deps/npm/lib/install/save.js
+++ b/deps/npm/lib/install/save.js
@@ -1,8 +1,8 @@
'use strict'
-const createShrinkwrap = require('../shrinkwrap.js').createShrinkwrap
const deepSortObject = require('../utils/deep-sort-object.js')
const detectIndent = require('detect-indent')
+const detectNewline = require('detect-newline')
const fs = require('graceful-fs')
const iferr = require('iferr')
const log = require('npmlog')
@@ -10,6 +10,7 @@ const moduleName = require('../utils/module-name.js')
const npm = require('../npm.js')
const parseJSON = require('../utils/parse-json.js')
const path = require('path')
+const stringifyPackage = require('stringify-package')
const validate = require('aproba')
const without = require('lodash.without')
const writeFileAtomic = require('write-file-atomic')
@@ -44,9 +45,9 @@ exports.saveShrinkwrap = saveShrinkwrap
function saveShrinkwrap (tree, next) {
validate('OF', arguments)
if (!npm.config.get('shrinkwrap') || !npm.config.get('package-lock')) {
- next()
+ return next()
}
- createShrinkwrap(tree, {silent: false}, next)
+ require('../shrinkwrap.js').createShrinkwrap(tree, {silent: false}, next)
}
function savePackageJson (tree, next) {
@@ -60,7 +61,8 @@ function savePackageJson (tree, next) {
// don't use readJson, because we don't want to do all the other
// tricky npm-specific stuff that's in there.
fs.readFile(saveTarget, 'utf8', iferr(next, function (packagejson) {
- const indent = detectIndent(packagejson).indent || ' '
+ const indent = detectIndent(packagejson).indent
+ const newline = detectNewline(packagejson)
try {
tree.package = parseJSON(packagejson)
} catch (ex) {
@@ -122,7 +124,7 @@ function savePackageJson (tree, next) {
tree.package.bundleDependencies = deepSortObject(bundle)
}
- var json = JSON.stringify(tree.package, null, indent) + '\n'
+ var json = stringifyPackage(tree.package, indent, newline)
if (json === packagejson) {
log.verbose('shrinkwrap', 'skipping write for package.json because there were no changes.')
next()
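With detect-indent, detect-newline, and stringify-package, save.js now preserves the file's existing indent and newline style instead of forcing two-space indentation and `\n`. A hand-rolled sketch of what those three packages do together (the real packages handle many more cases, such as tab indents and mixed styles):

```javascript
// Minimal stand-ins for detect-indent, detect-newline, and
// stringify-package, to show why savePackageJson threads them through.
function detectIndent (json) {
  const m = json.match(/^[ \t]+/m) // first indented line wins
  return m ? m[0] : '  '
}
function detectNewline (json) {
  return json.includes('\r\n') ? '\r\n' : '\n'
}
function stringifyPackage (pkg, indent, newline) {
  const json = JSON.stringify(pkg, null, indent || 2)
  return newline === '\r\n' ? json.replace(/\n/g, '\r\n') + '\r\n' : json + '\n'
}

const original = '{\n    "name": "x"\n}\n' // four-space indent, LF endings
const updated = stringifyPackage(
  { name: 'x', version: '1.0.0' },
  detectIndent(original),
  detectNewline(original)
)
console.log(updated)
```

The round-trip matters because `save.js` skips the write entirely when the regenerated text equals the original, so style-preserving output avoids spurious diffs.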
diff --git a/deps/npm/lib/link.js b/deps/npm/lib/link.js
index 158d9b06456ba3..e05526c4080d3b 100644
--- a/deps/npm/lib/link.js
+++ b/deps/npm/lib/link.js
@@ -25,7 +25,7 @@ link.completion = function (opts, cb) {
var dir = npm.globalDir
fs.readdir(dir, function (er, files) {
cb(er, files.filter(function (f) {
- return !f.match(/^[\._-]/)
+ return !f.match(/^[._-]/)
}))
})
}
@@ -37,7 +37,7 @@ function link (args, cb) {
var msg = 'npm link not supported on windows prior to node 0.7.9'
var e = new Error(msg)
e.code = 'ENOTSUP'
- e.errno = require('constants').ENOTSUP
+ e.errno = require('constants').ENOTSUP // eslint-disable-line node/no-deprecated-api
return cb(e)
}
}
@@ -148,8 +148,8 @@ function linkPkg (folder, cb_) {
er = new Error('Package must have a name field to be linked')
return cb(er)
}
- if (npm.config.get('dry-run')) return resultPrinter(path.basename(me), me, target, cb)
var target = path.resolve(npm.globalDir, d.name)
+ if (npm.config.get('dry-run')) return resultPrinter(path.basename(me), me, target, cb)
symlink(me, target, false, true, function (er) {
if (er) return cb(er)
log.verbose('link', 'build target', target)
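The reorder in `linkPkg` fixes a `var`-hoisting bug: the `--dry-run` early return previously read `target` before its assignment had run. A minimal illustration with a hypothetical path:

```javascript
// Why the dry-run check moved below the assignment: `var` declarations hoist
// to the top of the function, but their assignments do not.
function before (dryRun) {
  if (dryRun) return target // declared (hoisted) but unassigned: undefined
  var target = '/usr/lib/node_modules/pkg'
  return target
}

function after (dryRun) {
  var target = '/usr/lib/node_modules/pkg'
  if (dryRun) return target
  return target
}

console.log(before(true)) // undefined
console.log(after(true))  // /usr/lib/node_modules/pkg
```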
diff --git a/deps/npm/lib/ls.js b/deps/npm/lib/ls.js
index 7c0ea71e773f98..bb5e433f78fdea 100644
--- a/deps/npm/lib/ls.js
+++ b/deps/npm/lib/ls.js
@@ -139,9 +139,9 @@ function filterByEnv (data) {
return
}
- if ((dev && inList(devKeys, name)) || // only --dev
- (production && inList(prodKeys, name)) || // only --production
- (!dev && !production)) { // no --production|--dev|--only=xxx
+ if ((dev && inList(devKeys, name)) || // only --dev
+ (production && inList(prodKeys, name)) || // only --production
+ (!dev && !production)) { // no --production|--dev|--only=xxx
dependencies[name] = data.dependencies[name]
}
})
@@ -165,7 +165,7 @@ function alphasort (a, b) {
a = a.toLowerCase()
b = b.toLowerCase()
return a > b ? 1
- : a < b ? -1 : 0
+ : a < b ? -1 : 0
}
function isCruft (data) {
@@ -520,16 +520,16 @@ function makeParseable_ (data, long, dir, depth, parent, d) {
if (data.missing) {
if (depth < npm.config.get('depth')) {
data = npm.config.get('long')
- ? path.resolve(parent.path, 'node_modules', d) +
+ ? path.resolve(parent.path, 'node_modules', d) +
':' + d + '@' + JSON.stringify(data.requiredBy) + ':INVALID:MISSING'
- : ''
+ : ''
} else {
data = path.resolve(dir || '', 'node_modules', d || '') +
(npm.config.get('long')
- ? ':' + d + '@' + JSON.stringify(data.requiredBy) +
+ ? ':' + d + '@' + JSON.stringify(data.requiredBy) +
':' + // no realpath resolved
':MAXDEPTH'
- : '')
+ : '')
}
return data
diff --git a/deps/npm/lib/npm.js b/deps/npm/lib/npm.js
index e58712603bf839..da5a3636021223 100644
--- a/deps/npm/lib/npm.js
+++ b/deps/npm/lib/npm.js
@@ -1,6 +1,6 @@
;(function () {
// windows: running 'npm blah' in this folder will invoke WSH, not node.
- /*globals WScript*/
+ /* globals WScript */
if (typeof WScript !== 'undefined') {
WScript.echo(
'npm does not work when run\n' +
@@ -164,11 +164,13 @@
})
return commandCache[a]
- }, enumerable: fullList.indexOf(c) !== -1, configurable: true })
+ },
+ enumerable: fullList.indexOf(c) !== -1,
+ configurable: true })
// make css-case commands callable via camelCase as well
- if (c.match(/\-([a-z])/)) {
- addCommand(c.replace(/\-([a-z])/g, function (a, b) {
+ if (c.match(/-([a-z])/)) {
+ addCommand(c.replace(/-([a-z])/g, function (a, b) {
return b.toUpperCase()
}))
}
@@ -189,7 +191,9 @@
}
if (plumbing.indexOf(c) !== -1) return c
var a = abbrevs[c]
- if (aliases[a]) a = aliases[a]
+ while (aliases[a]) {
+ a = aliases[a]
+ }
return a
}
@@ -288,7 +292,11 @@
var color = config.get('color')
- log.level = config.get('loglevel')
+ if (npm.config.get('timing') && npm.config.get('loglevel') === 'notice') {
+ log.level = 'timing'
+ } else {
+ log.level = config.get('loglevel')
+ }
log.heading = config.get('heading') || 'npm'
log.stream = config.get('logstream')
@@ -411,8 +419,8 @@
{
get: function () {
return (process.platform !== 'win32')
- ? path.resolve(npm.globalPrefix, 'lib', 'node_modules')
- : path.resolve(npm.globalPrefix, 'node_modules')
+ ? path.resolve(npm.globalPrefix, 'lib', 'node_modules')
+ : path.resolve(npm.globalPrefix, 'node_modules')
},
enumerable: true
})
@@ -455,7 +463,9 @@
}
npm.commands[n](args, cb)
}
- }, enumerable: false, configurable: true })
+ },
+ enumerable: false,
+ configurable: true })
})
if (require.main === module) {
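The `deref` change above follows alias chains instead of a single alias hop. A sketch with a made-up two-level alias table (npm's real table is larger; like the original, this assumes the table is acyclic):

```javascript
// Alias chasing from the npm.js change: 'un' -> 'rm' -> 'uninstall'.
const aliases = { un: 'rm', rm: 'uninstall' } // hypothetical table

function deref (c) {
  let a = c
  while (aliases[a]) {
    a = aliases[a]
  }
  return a
}

console.log(deref('un'))      // uninstall
console.log(deref('install')) // install (no alias, passes through)
```

The old single-lookup version would have resolved `un` only as far as `rm`, missing the second hop.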
diff --git a/deps/npm/lib/outdated.js b/deps/npm/lib/outdated.js
index a38137b66c88c5..8b0a43d6ba336c 100644
--- a/deps/npm/lib/outdated.js
+++ b/deps/npm/lib/outdated.js
@@ -24,13 +24,14 @@ var os = require('os')
var url = require('url')
var path = require('path')
var readPackageTree = require('read-package-tree')
-var readJson = require('read-package-json')
var asyncMap = require('slide').asyncMap
var color = require('ansicolors')
var styles = require('ansistyles')
var table = require('text-table')
var semver = require('semver')
var npa = require('npm-package-arg')
+var pickManifest = require('npm-pick-manifest')
+var fetchPackageMetadata = require('./fetch-package-metadata.js')
var mutateIntoLogicalTree = require('./install/mutate-into-logical-tree.js')
var npm = require('./npm.js')
var long = npm.config.get('long')
@@ -41,7 +42,6 @@ var computeVersionSpec = require('./install/deps.js').computeVersionSpec
var moduleName = require('./utils/module-name.js')
var output = require('./utils/output.js')
var ansiTrim = require('./utils/ansi-trim')
-var fetchPackageMetadata = require('./fetch-package-metadata.js')
function uniq (list) {
// we maintain the array because we need an array, not iterator, return
@@ -89,11 +89,11 @@ function outdated (args, silent, cb) {
} else {
var outList = list.map(makePretty)
var outHead = [ 'Package',
- 'Current',
- 'Wanted',
- 'Latest',
- 'Location'
- ]
+ 'Current',
+ 'Wanted',
+ 'Latest',
+ 'Location'
+ ]
if (long) outHead.push('Package Type')
var outTable = [outHead].concat(outList)
@@ -117,25 +117,19 @@ function outdated (args, silent, cb) {
// [[ dir, dep, has, want, latest, type ]]
function makePretty (p) {
- var dep = p[0]
var depname = p[1]
- var dir = dep.path
var has = p[2]
var want = p[3]
var latest = p[4]
var type = p[6]
var deppath = p[7]
- if (!npm.config.get('global')) {
- dir = path.relative(process.cwd(), dir)
- }
-
var columns = [ depname,
- has || 'MISSING',
- want,
- latest,
- deppath
- ]
+ has || 'MISSING',
+ want,
+ latest,
+ deppath
+ ]
if (long) columns[5] = type
if (npm.color) {
@@ -183,10 +177,10 @@ function makeJSON (list) {
dir = path.relative(process.cwd(), dir)
}
out[depname] = { current: has,
- wanted: want,
- latest: latest,
- location: dir
- }
+ wanted: want,
+ latest: latest,
+ location: dir
+ }
if (long) out[depname].type = type
})
return JSON.stringify(out, null, 2)
@@ -202,13 +196,15 @@ function outdated_ (args, path, tree, parentHas, depth, cb) {
var types = {}
var pkg = tree.package
+ if (!tree.children) tree.children = []
+
var deps = tree.error ? tree.children : tree.children.filter((child) => !isExtraneous(child))
deps.forEach(function (dep) {
types[moduleName(dep)] = 'dependencies'
})
- Object.keys(tree.missingDeps).forEach(function (name) {
+ Object.keys(tree.missingDeps || {}).forEach(function (name) {
deps.push({
package: { name: name },
path: tree.path,
@@ -262,7 +258,7 @@ function outdated_ (args, path, tree, parentHas, depth, cb) {
!npm.config.get('global')
)
if (doUpdate) {
- Object.keys(pkg.devDependencies).forEach(function (k) {
+ Object.keys(pkg.devDependencies || {}).forEach(function (k) {
if (!(k in parentHas)) {
deps[k] = pkg.devDependencies[k]
types[k] = 'devDependencies'
@@ -276,8 +272,8 @@ function outdated_ (args, path, tree, parentHas, depth, cb) {
deps = deps.filter(function (dep) { return dep !== child })
}
has[child.package.name] = {
- version: child.package.version,
- from: child.package._from
+ version: child.isLink ? 'linked' : child.package.version,
+ from: child.isLink ? 'file:' + child.path : child.package._from
}
})
@@ -286,11 +282,17 @@ function outdated_ (args, path, tree, parentHas, depth, cb) {
// otherwise dive into the folder
asyncMap(deps, function (dep, cb) {
var name = moduleName(dep)
- var required = (tree.package.dependencies)[name] ||
- (tree.package.optionalDependencies)[name] ||
- (tree.package.devDependencies)[name] ||
- computeVersionSpec(tree, dep) ||
- '*'
+ var required
+ if (tree.package.dependencies && name in tree.package.dependencies) {
+ required = tree.package.dependencies[name]
+ } else if (tree.package.optionalDependencies && name in tree.package.optionalDependencies) {
+ required = tree.package.optionalDependencies[name]
+ } else if (tree.package.devDependencies && name in tree.package.devDependencies) {
+ required = tree.package.devDependencies[name]
+ } else if (has[name]) {
+ required = computeVersionSpec(tree, dep)
+ }
+
if (!long) return shouldUpdate(args, dep, name, has, required, depth, path, cb)
shouldUpdate(args, dep, name, has, required, depth, path, cb, types[name])
@@ -309,11 +311,11 @@ function shouldUpdate (args, tree, dep, has, req, depth, pkgpath, cb, type) {
// show user that no viable version can be found
if (er) return cb(er)
outdated_(args,
- pkgpath,
- tree,
- has,
- depth + 1,
- cb)
+ pkgpath,
+ tree,
+ has,
+ depth + 1,
+ cb)
}
function doIt (wanted, latest) {
@@ -324,23 +326,32 @@ function shouldUpdate (args, tree, dep, has, req, depth, pkgpath, cb, type) {
}
if (args.length && args.indexOf(dep) === -1) return skip()
+
+ if (tree.isLink && req == null) return skip()
+
+ if (req == null || req === '') req = '*'
+
var parsed = npa.resolve(dep, req)
- if (tree.isLink && tree.parent && tree.parent.isTop) {
- return doIt('linked', 'linked')
- }
- if (parsed.type === 'git' || parsed.type === 'hosted') {
+ if (parsed.type === 'directory') {
+ if (tree.isLink) {
+ return skip()
+ } else {
+ return doIt('linked', 'linked')
+ }
+ } else if (parsed.type === 'git') {
return doIt('git', 'git')
- }
+ } else if (parsed.type === 'file') {
+ return updateLocalDeps()
+ } else {
+ return mapToRegistry(dep, npm.config, function (er, uri, auth) {
+ if (er) return cb(er)
- // search for the latest package
- mapToRegistry(dep, npm.config, function (er, uri, auth) {
- if (er) return cb(er)
-
- npm.registry.get(uri, { auth: auth }, updateDeps)
- })
+ npm.registry.get(uri, { auth: auth }, updateDeps)
+ })
+ }
function updateLocalDeps (latestRegistryVersion) {
- readJson(path.resolve(parsed.fetchSpec, 'package.json'), function (er, localDependency) {
+ fetchPackageMetadata('file:' + parsed.fetchSpec, '.', (er, localDependency) => {
if (er) return cb()
var wanted = localDependency.version
@@ -363,63 +374,31 @@ function shouldUpdate (args, tree, dep, has, req, depth, pkgpath, cb, type) {
}
function updateDeps (er, d) {
- if (er) {
- if (parsed.type !== 'directory' && parsed.type !== 'file') return cb(er)
- return updateLocalDeps()
- }
-
- if (!d || !d['dist-tags'] || !d.versions) return cb()
- var l = d.versions[d['dist-tags'].latest]
- if (!l) return cb()
-
- var r = req
- if (d['dist-tags'][req]) {
- r = d['dist-tags'][req]
- }
-
- if (semver.validRange(r, true)) {
- // some kind of semver range.
- // see if it's in the doc.
- var vers = Object.keys(d.versions)
- var v = semver.maxSatisfying(vers, r, true)
- if (v) {
- return onCacheAdd(null, d.versions[v])
- }
- }
+ if (er) return cb(er)
- // We didn't find the version in the doc. See if we can find it in metadata.
- var spec = dep
- if (req) {
- spec = dep + '@' + req
- }
- fetchPackageMetadata(spec, '', onCacheAdd)
-
- function onCacheAdd (er, d) {
- // if this fails, then it means we can't update this thing.
- // it's probably a thing that isn't published.
- if (er) {
- if (er.code && er.code === 'ETARGET') {
- // no viable version found
- return skip(er)
- }
+ try {
+ var l = pickManifest(d, 'latest')
+ var m = pickManifest(d, req)
+ } catch (er) {
+ if (er.code === 'ETARGET') {
+ return skip(er)
+ } else {
return skip()
}
+ }
- // check that the url origin hasn't changed (#1727) and that
- // there is no newer version available
- var dFromUrl = d._from && url.parse(d._from).protocol
- var cFromUrl = curr && curr.from && url.parse(curr.from).protocol
-
- if (!curr ||
- dFromUrl && cFromUrl && d._from !== curr.from ||
- d.version !== curr.version ||
- d.version !== l.version) {
- if (parsed.type === 'file' || parsed.type === 'directory') return updateLocalDeps(l.version)
-
- doIt(d.version, l.version)
- } else {
- skip()
- }
+ // check that the url origin hasn't changed (#1727) and that
+ // there is no newer version available
+ var dFromUrl = m._from && url.parse(m._from).protocol
+ var cFromUrl = curr && curr.from && url.parse(curr.from).protocol
+
+ if (!curr ||
+ (dFromUrl && cFromUrl && m._from !== curr.from) ||
+ m.version !== curr.version ||
+ m.version !== l.version) {
+ doIt(m.version, l.version)
+ } else {
+ skip()
}
}
}
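The rewritten spec lookup in outdated.js checks each dependency map with `in` so a missing map no longer throws, and only installed-but-unlisted packages fall through to `computeVersionSpec`. A standalone sketch with that fallback stubbed to `'*'`:

```javascript
// Lookup order from the outdated.js rewrite: dependencies, then
// optionalDependencies, then devDependencies, then (only for installed
// packages) a computed spec. computeVersionSpec is stubbed here.
function requiredSpec (pkg, name, has) {
  if (pkg.dependencies && name in pkg.dependencies) return pkg.dependencies[name]
  if (pkg.optionalDependencies && name in pkg.optionalDependencies) return pkg.optionalDependencies[name]
  if (pkg.devDependencies && name in pkg.devDependencies) return pkg.devDependencies[name]
  if (has[name]) return '*' // stand-in for computeVersionSpec(tree, dep)
  return undefined
}

const pkg = { dependencies: { a: '^1.0.0' }, devDependencies: { b: '~2.0.0' } }
console.log(requiredSpec(pkg, 'a', {}))        // ^1.0.0
console.log(requiredSpec(pkg, 'b', {}))        // ~2.0.0
console.log(requiredSpec(pkg, 'c', { c: {} })) // * (installed, unlisted)
console.log(requiredSpec(pkg, 'd', {}))        // undefined
```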
diff --git a/deps/npm/lib/owner.js b/deps/npm/lib/owner.js
index 64d086af78f9b1..3c2660ace113d5 100644
--- a/deps/npm/lib/owner.js
+++ b/deps/npm/lib/owner.js
@@ -1,3 +1,4 @@
+/* eslint-disable standard/no-callback-literal */
module.exports = owner
var npm = require('./npm.js')
@@ -53,7 +54,7 @@ owner.completion = function (opts, cb) {
})
}
// else fallthrough
- /*eslint no-fallthrough:0*/
+ /* eslint no-fallthrough:0 */
case 'add':
if (argv.length > 3) {
theUser = encodeURIComponent(argv[3])
diff --git a/deps/npm/lib/pack.js b/deps/npm/lib/pack.js
index f6a0eff805f50f..3b3f5b7bbc7007 100644
--- a/deps/npm/lib/pack.js
+++ b/deps/npm/lib/pack.js
@@ -6,7 +6,9 @@
const BB = require('bluebird')
+const byteSize = require('byte-size')
const cacache = require('cacache')
+const columnify = require('columnify')
const cp = require('child_process')
const deprCheck = require('./utils/depr-check')
const fpm = require('./fetch-package-metadata')
@@ -28,8 +30,9 @@ const pinflight = require('promise-inflight')
const readJson = BB.promisify(require('read-package-json'))
const tar = require('tar')
const packlist = require('npm-packlist')
+const ssri = require('ssri')
-pack.usage = 'npm pack [[<@scope>/]...]'
+pack.usage = 'npm pack [[<@scope>/]...] [--dry-run]'
// if it can be installed, it can be packed.
pack.completion = install.completion
@@ -46,35 +49,55 @@ function pack (args, silent, cb) {
BB.all(
args.map((arg) => pack_(arg, cwd))
- ).then((files) => {
- if (!silent) {
- output(files.map((f) => path.relative(cwd, f)).join('\n'))
+ ).then((tarballs) => {
+ if (!silent && npm.config.get('json')) {
+ output(JSON.stringify(tarballs, null, 2))
+ } else if (!silent) {
+ tarballs.forEach(logContents)
+ output(tarballs.map((f) => path.relative(cwd, f.filename)).join('\n'))
}
- cb(null, files)
- }, cb)
+ return tarballs
+ }).nodeify(cb)
}
-// add to cache, then cp to the cwd
function pack_ (pkg, dir) {
return BB.fromNode((cb) => fpm(pkg, dir, cb)).then((mani) => {
let name = mani.name[0] === '@'
// scoped packages get special treatment
- ? mani.name.substr(1).replace(/\//g, '-')
- : mani.name
+ ? mani.name.substr(1).replace(/\//g, '-')
+ : mani.name
const target = `${name}-${mani.version}.tgz`
return pinflight(target, () => {
+ const dryRun = npm.config.get('dry-run')
if (mani._requested.type === 'directory') {
- return prepareDirectory(mani._resolved).then(() => {
- return packDirectory(mani, mani._resolved, target)
+ return prepareDirectory(mani._resolved)
+ .then(() => {
+ return packDirectory(mani, mani._resolved, target, target, true, dryRun)
+ })
+ } else if (dryRun) {
+ log.verbose('pack', '--dry-run mode enabled. Skipping write.')
+ return cacache.tmp.withTmp(npm.tmp, {tmpPrefix: 'packing'}, (tmp) => {
+ const tmpTarget = path.join(tmp, path.basename(target))
+ return packFromPackage(pkg, tmpTarget, target)
})
} else {
- return pacote.tarball.toFile(pkg, target, pacoteOpts())
- .then(() => target)
+ return packFromPackage(pkg, target, target)
}
})
})
}
+function packFromPackage (arg, target, filename) {
+ const opts = pacoteOpts()
+ return pacote.tarball.toFile(arg, target, pacoteOpts())
+ .then(() => cacache.tmp.withTmp(npm.tmp, {tmpPrefix: 'unpacking'}, (tmp) => {
+ const tmpTarget = path.join(tmp, filename)
+ return pacote.extract(arg, tmpTarget, opts)
+ .then(() => readJson(path.join(tmpTarget, 'package.json')))
+ }))
+ .then((pkg) => getContents(pkg, target, filename))
+}
+
module.exports.prepareDirectory = prepareDirectory
function prepareDirectory (dir) {
return readJson(path.join(dir, 'package.json')).then((pkg) => {
@@ -105,7 +128,7 @@ function prepareDirectory (dir) {
}
module.exports.packDirectory = packDirectory
-function packDirectory (mani, dir, target) {
+function packDirectory (mani, dir, target, filename, logIt, dryRun) {
deprCheck(mani)
return readJson(path.join(dir, 'package.json')).then((pkg) => {
return lifecycle(pkg, 'prepack', dir)
@@ -120,22 +143,128 @@ function packDirectory (mani, dir, target) {
cwd: dir,
prefix: 'package/',
portable: true,
- noMtime: true,
+ // Provide a specific date in the 1980s for the benefit of zip,
+ // which is confounded by files dated at the Unix epoch 0.
+ mtime: new Date('1985-10-26T08:15:00.000Z'),
gzip: true
}
- return packlist({ path: dir })
+ return BB.resolve(packlist({ path: dir }))
// NOTE: node-tar does some Magic Stuff depending on prefixes for files
// specifically with @ signs, so we just neutralize that one
// and any such future "features" by prepending `./`
.then((files) => tar.create(tarOpt, files.map((f) => `./${f}`)))
- .then(() => move(tmpTarget, target, {Promise: BB, fs}))
- .then(() => lifecycle(pkg, 'postpack', dir))
- .then(() => target)
+ .then(() => getContents(pkg, tmpTarget, filename, logIt))
+ // thread the content info through
+ .tap(() => {
+ if (dryRun) {
+ log.verbose('pack', '--dry-run mode enabled. Skipping write.')
+ } else {
+ return move(tmpTarget, target, {Promise: BB, fs})
+ }
+ })
+ .tap(() => lifecycle(pkg, 'postpack', dir))
})
})
}
+module.exports.logContents = logContents
+function logContents (tarball) {
+ log.notice('')
+ log.notice('', `${npm.config.get('unicode') ? '📦 ' : 'package:'} ${tarball.name}@${tarball.version}`)
+ log.notice('=== Tarball Contents ===')
+ if (tarball.files.length) {
+ log.notice('', columnify(tarball.files.map((f) => {
+ const bytes = byteSize(f.size)
+ return {path: f.path, size: `${bytes.value}${bytes.unit}`}
+ }), {
+ include: ['size', 'path'],
+ showHeaders: false
+ }))
+ }
+ if (tarball.bundled.length) {
+ log.notice('=== Bundled Dependencies ===')
+ tarball.bundled.forEach((name) => log.notice('', name))
+ }
+ log.notice('=== Tarball Details ===')
+ log.notice('', columnify([
+ {name: 'name:', value: tarball.name},
+ {name: 'version:', value: tarball.version},
+ tarball.filename && {name: 'filename:', value: tarball.filename},
+ {name: 'package size:', value: byteSize(tarball.size)},
+ {name: 'unpacked size:', value: byteSize(tarball.unpackedSize)},
+ {name: 'shasum:', value: tarball.shasum},
+ {
+ name: 'integrity:',
+ value: tarball.integrity.toString().substr(0, 20) + '[...]' + tarball.integrity.toString().substr(80)},
+ tarball.bundled.length && {name: 'bundled deps:', value: tarball.bundled.length},
+ tarball.bundled.length && {name: 'bundled files:', value: tarball.entryCount - tarball.files.length},
+ tarball.bundled.length && {name: 'own files:', value: tarball.files.length},
+ {name: 'total files:', value: tarball.entryCount}
+ ].filter((x) => x), {
+ include: ['name', 'value'],
+ showHeaders: false
+ }))
+ log.notice('', '')
+}
+
+module.exports.getContents = getContents
+function getContents (pkg, target, filename, silent) {
+ const bundledWanted = new Set(
+ pkg.bundleDependencies ||
+ pkg.bundledDependencies ||
+ []
+ )
+ const files = []
+ const bundled = new Set()
+ let totalEntries = 0
+ let totalEntrySize = 0
+ return tar.t({
+ file: target,
+ onentry (entry) {
+ totalEntries++
+ totalEntrySize += entry.size
+ const p = entry.path
+ if (p.startsWith('package/node_modules/')) {
+ const name = p.match(/^package\/node_modules\/((?:@[^/]+\/)?[^/]+)/)[1]
+ if (bundledWanted.has(name)) {
+ bundled.add(name)
+ }
+ } else {
+ files.push({
+ path: entry.path.replace(/^package\//, ''),
+ size: entry.size,
+ mode: entry.mode
+ })
+ }
+ },
+ strip: 1
+ })
+ .then(() => BB.all([
+ BB.fromNode((cb) => fs.stat(target, cb)),
+ ssri.fromStream(fs.createReadStream(target), {
+ algorithms: ['sha1', 'sha512']
+ })
+ ]))
+ .then(([stat, integrity]) => {
+ const shasum = integrity['sha1'][0].hexDigest()
+ return {
+ id: pkg._id,
+ name: pkg.name,
+ version: pkg.version,
+ from: pkg._from,
+ size: stat.size,
+ unpackedSize: totalEntrySize,
+ shasum,
+ integrity: ssri.parse(integrity['sha512'][0]),
+ filename,
+ files,
+ entryCount: totalEntries,
+ bundled: Array.from(bundled)
+ }
+ })
+}
+
const PASSTHROUGH_OPTS = [
'always-auth',
'auth-type',
@@ -170,7 +299,7 @@ function packGitDep (manifest, dir) {
return acc
}, [])
const child = cp.spawn(process.env.NODE || process.execPath, [
- require.main.filename,
+ require.resolve('../bin/npm-cli.js'),
'install',
'--dev',
'--prod',
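The new `getContents` in pack.js classifies each tar entry path as either a bundled dependency or one of the package's own files. The per-entry logic, extracted into a standalone sketch with hypothetical paths:

```javascript
// Path classification from getContents: entries under package/node_modules/
// are matched against the wanted bundled-dependency names (scoped names
// included); everything else is recorded as an own file.
function classifyEntry (p, bundledWanted, files, bundled) {
  if (p.startsWith('package/node_modules/')) {
    const name = p.match(/^package\/node_modules\/((?:@[^/]+\/)?[^/]+)/)[1]
    if (bundledWanted.has(name)) bundled.add(name)
  } else {
    files.push(p.replace(/^package\//, ''))
  }
}

const files = []
const bundled = new Set()
const wanted = new Set(['@scope/dep'])
;[
  'package/index.js',
  'package/node_modules/@scope/dep/index.js',
  'package/node_modules/lodash/lodash.js' // present but not in bundleDependencies
].forEach((p) => classifyEntry(p, wanted, files, bundled))

console.log(files)        // [ 'index.js' ]
console.log([...bundled]) // [ '@scope/dep' ]
```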
diff --git a/deps/npm/lib/profile.js b/deps/npm/lib/profile.js
index 587a26ca8b5e68..ff01db90f722f4 100644
--- a/deps/npm/lib/profile.js
+++ b/deps/npm/lib/profile.js
@@ -4,7 +4,7 @@ const npm = require('./npm.js')
const log = require('npmlog')
const output = require('./utils/output.js')
const qw = require('qw')
-const Table = require('cli-table2')
+const Table = require('cli-table3')
const ansistyles = require('ansistyles')
const Bluebird = require('bluebird')
const readUserInfo = require('./utils/read-user-info.js')
@@ -82,7 +82,18 @@ function config () {
registry: npm.config.get('registry'),
otp: npm.config.get('otp')
}
- conf.auth = npm.config.getCredentialsByURI(conf.registry)
+ const creds = npm.config.getCredentialsByURI(conf.registry)
+ if (creds.token) {
+ conf.auth = {token: creds.token}
+ } else if (creds.username) {
+ conf.auth = {basic: {username: creds.username, password: creds.password}}
+ } else if (creds.auth) {
+ const auth = Buffer.from(creds.auth, 'base64').toString().split(':', 2)
+ conf.auth = {basic: {username: auth[0], password: auth[1]}}
+ } else {
+ conf.auth = {}
+ }
+
if (conf.otp) conf.auth.otp = conf.otp
return conf
}
@@ -126,7 +137,6 @@ function get (args) {
output(`${key}\t${info[key]}`)
}
})
- return
} else {
const table = new Table()
Object.keys(cleaned).forEach((k) => table.push({[ansistyles.bright(k)]: cleaned[k]}))
@@ -155,12 +165,17 @@ function set (args) {
return Promise.reject(Error(`"${prop}" is not a property we can set. Valid properties are: ` + writableProfileKeys.join(', ')))
}
return Bluebird.try(() => {
- if (prop !== 'password') return
- return readUserInfo.password('Current password: ').then((current) => {
- return readPasswords().then((newpassword) => {
- value = {old: current, new: newpassword}
+ if (prop === 'password') {
+ return readUserInfo.password('Current password: ').then((current) => {
+ return readPasswords().then((newpassword) => {
+ value = {old: current, new: newpassword}
+ })
})
- })
+ } else if (prop === 'email') {
+ return readUserInfo.password('Password: ').then((current) => {
+ return {password: current, email: value}
+ })
+ }
function readPasswords () {
return readUserInfo.password('New password: ').then((password1) => {
return readUserInfo.password(' Again: ').then((password2) => {
@@ -180,7 +195,7 @@ function set (args) {
newUser[prop] = value
return profile.set(newUser, conf).catch((err) => {
if (err.code !== 'EOTP') throw err
- return readUserInfo.otp('Enter OTP: ').then((otp) => {
+ return readUserInfo.otp().then((otp) => {
conf.auth.otp = otp
return profile.set(newUser, conf)
})
@@ -247,7 +262,7 @@ function enable2fa (args) {
return pulseTillDone.withPromise(profile.set({tfa: {password, mode: 'disable'}}, conf))
} else {
if (conf.auth.otp) return
- return readUserInfo.otp('Enter OTP: ').then((otp) => {
+ return readUserInfo.otp('Enter one-time password from your authenticator app: ').then((otp) => {
conf.auth.otp = otp
})
}
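The credential handling added to `config()` above (and duplicated in `token.js` below) normalizes npm's stored registry credentials into either a token or a basic-auth shape. A standalone sketch of that logic, with `normalizeAuth` as a hypothetical name for illustration:

```javascript
// Hypothetical standalone version of the credential normalization in the
// profile.js / token.js config() changes above. The legacy _auth field is
// base64("username:password"); as in the patch, split(':', 2) takes only
// the first two segments, so a password containing ':' would be truncated.
function normalizeAuth (creds) {
  if (creds.token) return {token: creds.token}
  if (creds.username) return {basic: {username: creds.username, password: creds.password}}
  if (creds.auth) {
    const auth = Buffer.from(creds.auth, 'base64').toString().split(':', 2)
    return {basic: {username: auth[0], password: auth[1]}}
  }
  return {}
}

console.log(normalizeAuth({auth: Buffer.from('alice:s3cret').toString('base64')}))
// { basic: { username: 'alice', password: 's3cret' } }
```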
diff --git a/deps/npm/lib/prune.js b/deps/npm/lib/prune.js
index 4ac8139576bd04..010e471e4b328e 100644
--- a/deps/npm/lib/prune.js
+++ b/deps/npm/lib/prune.js
@@ -26,6 +26,7 @@ function prune (args, cb) {
function Pruner (where, dryrun, args) {
Installer.call(this, where, dryrun, args)
+ this.autoPrune = true
}
util.inherits(Pruner, Installer)
@@ -64,3 +65,4 @@ Pruner.prototype.loadAllDepsIntoIdealTree = function (cb) {
Pruner.prototype.runPreinstallTopLevelLifecycles = function (cb) { cb() }
Pruner.prototype.runPostinstallTopLevelLifecycles = function (cb) { cb() }
+Pruner.prototype.saveToDependencies = function (cb) { cb() }
diff --git a/deps/npm/lib/publish.js b/deps/npm/lib/publish.js
index 20bd2603e6ff0a..1ae87d7900fa14 100644
--- a/deps/npm/lib/publish.js
+++ b/deps/npm/lib/publish.js
@@ -16,11 +16,11 @@ const pacote = require('pacote')
const pacoteOpts = require('./config/pacote')
const path = require('path')
const readJson = BB.promisify(require('read-package-json'))
+const readUserInfo = require('./utils/read-user-info.js')
const semver = require('semver')
const statAsync = BB.promisify(require('graceful-fs').stat)
-const readUserInfo = require('./utils/read-user-info.js')
-publish.usage = 'npm publish [|] [--tag ] [--access ]' +
+publish.usage = 'npm publish [|] [--tag ] [--access ] [--dry-run]' +
"\n\nPublishes '.' if no argument supplied" +
'\n\nSets tag `latest` if no --tag specified'
@@ -47,10 +47,16 @@ function publish (args, isRetry, cb) {
return cb(new Error('Tag name must not be a valid SemVer range: ' + t))
}
- publish_(args[0]).then((pkg) => {
- output(`+ ${pkg._id}`)
- cb()
- }, cb)
+ return publish_(args[0])
+ .then((tarball) => {
+ const silent = log.level === 'silent'
+ if (!silent && npm.config.get('json')) {
+ output(JSON.stringify(tarball, null, 2))
+ } else if (!silent) {
+ output(`+ ${tarball.id}`)
+ }
+ })
+ .nodeify(cb)
}
function publish_ (arg) {
@@ -76,6 +82,7 @@ function publish_ (arg) {
function publishFromDirectory (arg) {
// All this readJson is because any of the given scripts might modify the
// package.json in question, so we need to refresh after every step.
+ let contents
return pack.prepareDirectory(arg).then(() => {
return readJson(path.join(arg, 'package.json'))
}).then((pkg) => {
@@ -85,9 +92,10 @@ function publishFromDirectory (arg) {
}).then((pkg) => {
return cacache.tmp.withTmp(npm.tmp, {tmpPrefix: 'fromDir'}, (tmpDir) => {
const target = path.join(tmpDir, 'package.tgz')
- return pack.packDirectory(pkg, arg, target).then(() => {
- return upload(arg, pkg, false, target)
- })
+ return pack.packDirectory(pkg, arg, target, null, true)
+ .tap((c) => { contents = c })
+ .then((c) => !npm.config.get('json') && pack.logContents(c))
+ .then(() => upload(arg, pkg, false, target))
})
}).then(() => {
return readJson(path.join(arg, 'package.json'))
@@ -96,6 +104,7 @@ function publishFromDirectory (arg) {
}).tap((pkg) => {
return lifecycle(pkg, 'postpublish', arg)
})
+ .then(() => contents)
}
function publishFromPackage (arg) {
@@ -104,9 +113,13 @@ function publishFromPackage (arg) {
const target = path.join(tmp, 'package.json')
const opts = pacoteOpts()
return pacote.tarball.toFile(arg, target, opts)
- .then(() => pacote.extract(arg, extracted, opts))
- .then(() => readJson(path.join(extracted, 'package.json')))
- .tap((pkg) => upload(arg, pkg, false, target))
+ .then(() => pacote.extract(arg, extracted, opts))
+ .then(() => readJson(path.join(extracted, 'package.json')))
+ .then((pkg) => {
+ return BB.resolve(pack.getContents(pkg, target))
+ .tap((c) => !npm.config.get('json') && pack.logContents(c))
+ .tap(() => upload(arg, pkg, false, target))
+ })
})
}
@@ -120,7 +133,6 @@ function upload (arg, pkg, isRetry, cached) {
"Remove the 'private' field from the package.json to publish it."
))
}
-
const mappedConfig = getPublishConfig(
pkg.publishConfig,
npm.config,
@@ -151,7 +163,7 @@ function upload (arg, pkg, isRetry, cached) {
const params = {
metadata: pkg,
- body: createReadStream(cached),
+ body: !npm.config.get('dry-run') && createReadStream(cached),
auth: auth
}
@@ -165,6 +177,11 @@ function upload (arg, pkg, isRetry, cached) {
params.access = config.get('access')
}
+ if (npm.config.get('dry-run')) {
+ log.verbose('publish', '--dry-run mode enabled. Skipping upload.')
+ return BB.resolve()
+ }
+
log.showProgress('publish:' + pkg._id)
return BB.fromNode((cb) => {
registry.publish(registryBase, params, cb)
@@ -192,7 +209,7 @@ function upload (arg, pkg, isRetry, cached) {
if (err.code !== 'EOTP' && !(err.code === 'E401' && /one-time pass/.test(err.message))) throw err
// we prompt on stdout and read answers from stdin, so they need to be ttys.
if (!process.stdin.isTTY || !process.stdout.isTTY) throw err
- return readUserInfo.otp('Enter OTP: ').then((otp) => {
+ return readUserInfo.otp().then((otp) => {
npm.config.set('otp', otp)
return upload(arg, pkg, isRetry, cached)
})
diff --git a/deps/npm/lib/repo.js b/deps/npm/lib/repo.js
index d7e79d76ab6b42..d5aa81a6a00ebd 100644
--- a/deps/npm/lib/repo.js
+++ b/deps/npm/lib/repo.js
@@ -2,8 +2,7 @@ module.exports = repo
repo.usage = 'npm repo []'
-var npm = require('./npm.js')
-var opener = require('opener')
+var openUrl = require('./utils/open-url')
var hostedGitInfo = require('hosted-git-info')
var url_ = require('url')
var fetchPackageMetadata = require('./fetch-package-metadata.js')
@@ -32,7 +31,7 @@ function getUrlAndOpen (d, cb) {
if (!url) return cb(new Error('no repository: could not get url'))
- opener(url, { command: npm.config.get('browser') }, cb)
+ openUrl(url, 'repository available at the following URL', cb)
}
function unknownHostedUrl (url) {
@@ -43,8 +42,8 @@ function unknownHostedUrl (url) {
}
url = url_.parse(url)
var protocol = url.protocol === 'https:'
- ? 'https:'
- : 'http:'
+ ? 'https:'
+ : 'http:'
return protocol + '//' + (url.host || '') +
url.path.replace(/\.git$/, '')
} catch (e) {}
diff --git a/deps/npm/lib/run-script.js b/deps/npm/lib/run-script.js
index fb7781f55179b4..33025763117190 100644
--- a/deps/npm/lib/run-script.js
+++ b/deps/npm/lib/run-script.js
@@ -8,6 +8,8 @@ var log = require('npmlog')
var chain = require('slide').chain
var usage = require('./utils/usage')
var output = require('./utils/output.js')
+var didYouMean = require('./utils/did-you-mean')
+var isWindowsShell = require('./utils/is-windows-shell.js')
runScript.usage = usage(
'run-script',
@@ -32,7 +34,7 @@ runScript.completion = function (opts, cb) {
if (scripts.indexOf(argv[2]) !== -1) return cb()
// ok, try to find out which package it was, then
var pref = npm.config.get('global') ? npm.config.get('prefix')
- : npm.localPrefix
+ : npm.localPrefix
var pkgDir = path.resolve(pref, 'node_modules', argv[2], 'package.json')
readJson(pkgDir, function (er, d) {
if (er && er.code !== 'ENOENT' && er.code !== 'ENOTDIR') return cb(er)
@@ -138,7 +140,7 @@ function run (pkg, wd, cmd, args, cb) {
if (cmd === 'test') {
pkg.scripts.test = 'echo \'Error: no test specified\''
} else if (cmd === 'env') {
- if (process.platform === 'win32') {
+ if (isWindowsShell) {
log.verbose('run-script using default platform env: SET (Windows)')
pkg.scripts[cmd] = 'SET'
} else {
@@ -148,7 +150,9 @@ function run (pkg, wd, cmd, args, cb) {
} else if (npm.config.get('if-present')) {
return cb(null)
} else {
- return cb(new Error('missing script: ' + cmd))
+ let suggestions = didYouMean(cmd, Object.keys(pkg.scripts))
+ suggestions = suggestions ? '\n' + suggestions : ''
+ return cb(new Error('missing script: ' + cmd + suggestions))
}
}
cmds = [cmd]
diff --git a/deps/npm/lib/search/format-package-stream.js b/deps/npm/lib/search/format-package-stream.js
index a312e3f48379f0..bb0f552ba09d19 100644
--- a/deps/npm/lib/search/format-package-stream.js
+++ b/deps/npm/lib/search/format-package-stream.js
@@ -50,8 +50,8 @@ function prettify (data, num, opts) {
var pkg = normalizePackage(data, opts)
var columns = opts.description
- ? ['name', 'description', 'author', 'date', 'version', 'keywords']
- : ['name', 'author', 'date', 'version', 'keywords']
+ ? ['name', 'description', 'author', 'date', 'version', 'keywords']
+ : ['name', 'author', 'date', 'version', 'keywords']
if (opts.parseable) {
return columns.map(function (col) {
@@ -157,16 +157,16 @@ function normalizePackage (data, opts) {
return '=' + m.username
}).join(' '),
keywords: Array.isArray(data.keywords)
- ? data.keywords.join(' ')
- : typeof data.keywords === 'string'
- ? data.keywords.replace(/[,\s]+/, ' ')
- : '',
+ ? data.keywords.join(' ')
+ : typeof data.keywords === 'string'
+ ? data.keywords.replace(/[,\s]+/, ' ')
+ : '',
version: data.version,
- date: data.date &&
+ date: (data.date &&
(data.date.toISOString() // remove time
.split('T').join(' ')
.replace(/:[0-9]{2}\.[0-9]{3}Z$/, ''))
- .slice(0, -5) ||
+ .slice(0, -5)) ||
'prehistoric'
}
}
diff --git a/deps/npm/lib/search/package-filter.js b/deps/npm/lib/search/package-filter.js
index ac2950f46b71aa..892adb08c96a50 100644
--- a/deps/npm/lib/search/package-filter.js
+++ b/deps/npm/lib/search/package-filter.js
@@ -8,16 +8,16 @@ function filter (data, include, exclude, opts) {
function getWords (data, opts) {
return [ data.name ]
- .concat((opts && opts.description) ? data.description : [])
- .concat((data.maintainers || []).map(function (m) {
- return '=' + m.name
- }))
- .concat(data.versions && data.versions.length && data.url && ('<' + data.url + '>'))
- .concat(data.keywords || [])
- .map(function (f) { return f && f.trim && f.trim() })
- .filter(function (f) { return f })
- .join(' ')
- .toLowerCase()
+ .concat((opts && opts.description) ? data.description : [])
+ .concat((data.maintainers || []).map(function (m) {
+ return '=' + m.name
+ }))
+ .concat(data.versions && data.versions.length && data.url && ('<' + data.url + '>'))
+ .concat(data.keywords || [])
+ .map(function (f) { return f && f.trim && f.trim() })
+ .filter(function (f) { return f })
+ .join(' ')
+ .toLowerCase()
}
function filterWords (data, include, exclude, opts) {
diff --git a/deps/npm/lib/shrinkwrap.js b/deps/npm/lib/shrinkwrap.js
index ddfff2c681655e..90a4426523cabc 100644
--- a/deps/npm/lib/shrinkwrap.js
+++ b/deps/npm/lib/shrinkwrap.js
@@ -4,6 +4,7 @@ const BB = require('bluebird')
const chain = require('slide').chain
const detectIndent = require('detect-indent')
+const detectNewline = require('detect-newline')
const readFile = BB.promisify(require('graceful-fs').readFile)
const getRequested = require('./install/get-requested.js')
const id = require('./install/deps.js')
@@ -18,6 +19,7 @@ const npm = require('./npm.js')
const path = require('path')
const readPackageTree = BB.promisify(require('read-package-tree'))
const ssri = require('ssri')
+const stringifyPackage = require('stringify-package')
const validate = require('aproba')
const writeFileAtomic = require('write-file-atomic')
const unixFormatPath = require('./utils/unix-format-path.js')
@@ -32,6 +34,8 @@ const PKGLOCK_VERSION = npm.lockfileVersion
shrinkwrap.usage = 'npm shrinkwrap'
module.exports = exports = shrinkwrap
+exports.treeToShrinkwrap = treeToShrinkwrap
+
function shrinkwrap (args, silent, cb) {
if (typeof cb !== 'function') {
cb = silent
@@ -103,14 +107,13 @@ function shrinkwrapDeps (deps, top, tree, seen) {
if (seen.has(tree)) return
seen.add(tree)
sortModules(tree.children).forEach(function (child) {
- if (child.fakeChild) {
- deps[moduleName(child)] = child.fakeChild
- return
- }
var childIsOnlyDev = isOnlyDev(child)
var pkginfo = deps[moduleName(child)] = {}
- var requested = child.package._requested || getRequested(child) || {}
+ var requested = getRequested(child) || child.package._requested || {}
pkginfo.version = childVersion(top, child, requested)
+ if (requested.type === 'git' && child.package._from) {
+ pkginfo.from = child.package._from
+ }
if (child.fromBundle || child.isInLink) {
pkginfo.bundled = true
} else {
@@ -121,7 +124,7 @@ function shrinkwrapDeps (deps, top, tree, seen) {
// tarball and we can't (yet) create consistent tarballs from a stable
// source.
if (requested.type !== 'git') {
- pkginfo.integrity = child.package._integrity
+ pkginfo.integrity = child.package._integrity || undefined
if (!pkginfo.integrity && child.package._shasum) {
pkginfo.integrity = ssri.fromHex(child.package._shasum, 'sha1')
}
@@ -132,8 +135,8 @@ function shrinkwrapDeps (deps, top, tree, seen) {
if (child.requires.length) {
pkginfo.requires = {}
sortModules(child.requires).forEach((required) => {
- var requested = required.package._requested || getRequested(required) || {}
- pkginfo.requires[moduleName(required)] = childVersion(top, required, requested)
+ var requested = getRequested(required, child) || required.package._requested || {}
+ pkginfo.requires[moduleName(required)] = childRequested(top, required, requested)
})
}
if (child.children.length) {
@@ -161,6 +164,24 @@ function childVersion (top, child, req) {
}
}
+function childRequested (top, child, requested) {
+ if (requested.type === 'directory' || requested.type === 'file') {
+ return 'file:' + unixFormatPath(path.relative(top.path, child.package._resolved || requested.fetchSpec))
+ } else if (!isRegistry(requested) && !child.fromBundle) {
+ return child.package._resolved || requested.saveSpec || requested.rawSpec
+ } else if (requested.type === 'tag') {
+ // tags are not ranges we can match against, so we invent a "reasonable"
+ // one based on what we actually installed.
+ return npm.config.get('save-prefix') + child.package.version
+ } else if (requested.saveSpec || requested.rawSpec) {
+ return requested.saveSpec || requested.rawSpec
+ } else if (child.package._from || (child.package._requested && child.package._requested.rawSpec)) {
+ return child.package._from.replace(/^@?[^@]+@/, '') || child.package._requested.rawSpec
+ } else {
+ return child.package.version
+ }
+}
+
function shrinkwrap_ (dir, pkginfo, opts, cb) {
save(dir, pkginfo, opts, cb)
}
@@ -179,11 +200,12 @@ function save (dir, pkginfo, opts, cb) {
{
path: path.resolve(dir, opts.defaultFile || PKGLOCK),
data: '{}',
- indent: (pkg && pkg.indent) || 2
+ indent: pkg && pkg.indent,
+ newline: pkg && pkg.newline
}
)
- const updated = updateLockfileMetadata(pkginfo, pkg && pkg.data)
- const swdata = JSON.stringify(updated, null, info.indent) + '\n'
+ const updated = updateLockfileMetadata(pkginfo, pkg && JSON.parse(pkg.raw))
+ const swdata = stringifyPackage(updated, info.indent, info.newline)
if (swdata === info.raw) {
// skip writing if file is identical
log.verbose('shrinkwrap', `skipping write for ${path.basename(info.path)} because there were no changes.`)
@@ -244,9 +266,8 @@ function checkPackageFile (dir, name) {
return {
path: file,
raw: data,
- data: JSON.parse(data),
- indent: detectIndent(data).indent || 2
+ indent: detectIndent(data).indent,
+ newline: detectNewline(data)
}
}).catch({code: 'ENOENT'}, () => {})
}
-
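The shrinkwrap changes above make lockfile writes preserve the existing file's indentation and newline style via `detect-indent`, `detect-newline`, and `stringify-package`. A rough standalone sketch of that behavior, with simplified stand-ins for those three dependencies (the real packages handle more edge cases):

```javascript
// Simplified stand-ins for the detect-indent, detect-newline and
// stringify-package dependencies used by shrinkwrap's save() above.
function detectIndent (txt) {
  const m = txt.match(/^[ \t]+/m) // first indented line decides the style
  return m ? m[0] : null
}
function detectNewline (txt) {
  return txt.indexOf('\r\n') !== -1 ? '\r\n' : '\n'
}
function stringifyPackage (data, indent, newline) {
  const json = JSON.stringify(data, null, indent || 2) + '\n'
  return newline === '\r\n' ? json.replace(/\n/g, '\r\n') : json
}

// A tab-indented, CRLF lockfile keeps its formatting when rewritten:
const existing = '{\r\n\t"name": "x"\r\n}\r\n'
const out = stringifyPackage({name: 'x', version: '1.0.0'},
  detectIndent(existing), detectNewline(existing))
```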
diff --git a/deps/npm/lib/team.js b/deps/npm/lib/team.js
index f99063b2787148..2d9e61cd4384b6 100644
--- a/deps/npm/lib/team.js
+++ b/deps/npm/lib/team.js
@@ -1,3 +1,4 @@
+/* eslint-disable standard/no-callback-literal */
var mapToRegistry = require('./utils/map-to-registry.js')
var npm = require('./npm')
var output = require('./utils/output.js')
@@ -41,7 +42,7 @@ function team (args, cb) {
try {
return npm.registry.team(cmd, uri, {
auth: auth,
- scope: entity[0],
+ scope: entity[0].replace(/^@/, ''), // '@' prefix on scope is optional.
team: entity[1],
user: args.shift()
}, function (err, data) {
diff --git a/deps/npm/lib/test.js b/deps/npm/lib/test.js
index 06138ac00a3be7..05bffed86d59c1 100644
--- a/deps/npm/lib/test.js
+++ b/deps/npm/lib/test.js
@@ -1,3 +1,4 @@
+/* eslint-disable standard/no-callback-literal */
module.exports = test
const testCmd = require('./utils/lifecycle-cmd.js')('test')
diff --git a/deps/npm/lib/token.js b/deps/npm/lib/token.js
index 2a3b65e6ad7015..d442d37eb806bc 100644
--- a/deps/npm/lib/token.js
+++ b/deps/npm/lib/token.js
@@ -2,10 +2,10 @@
const profile = require('npm-profile')
const npm = require('./npm.js')
const output = require('./utils/output.js')
-const Table = require('cli-table2')
+const Table = require('cli-table3')
const Bluebird = require('bluebird')
-const isCidrV4 = require('is-cidr').isCidrV4
-const isCidrV6 = require('is-cidr').isCidrV6
+const isCidrV4 = require('is-cidr').v4
+const isCidrV6 = require('is-cidr').v6
const readUserInfo = require('./utils/read-user-info.js')
const ansistyles = require('ansistyles')
const log = require('npmlog')
@@ -13,6 +13,8 @@ const pulseTillDone = require('./utils/pulse-till-done.js')
module.exports = token
+token._validateCIDRList = validateCIDRList
+
token.usage =
'npm token list\n' +
'npm token revoke \n' +
@@ -81,7 +83,17 @@ function config () {
registry: npm.config.get('registry'),
otp: npm.config.get('otp')
}
- conf.auth = npm.config.getCredentialsByURI(conf.registry)
+ const creds = npm.config.getCredentialsByURI(conf.registry)
+ if (creds.token) {
+ conf.auth = {token: creds.token}
+ } else if (creds.username) {
+ conf.auth = {basic: {username: creds.username, password: creds.password}}
+ } else if (creds.auth) {
+ const auth = Buffer.from(creds.auth, 'base64').toString().split(':', 2)
+ conf.auth = {basic: {username: auth[0], password: auth[1]}}
+ } else {
+ conf.auth = {}
+ }
if (conf.otp) conf.auth.otp = conf.otp
return conf
}
@@ -149,8 +161,14 @@ function rm (args) {
}
})
return Bluebird.map(toRemove, (key) => {
- progress.info('token', 'removing', key)
- profile.removeToken(key, conf).then(() => profile.completeWork(1))
+ return profile.removeToken(key, conf).catch((ex) => {
+ if (ex.code !== 'EOTP') throw ex
+ log.info('token', 'failed because revoking this token requires OTP')
+ return readUserInfo.otp().then((otp) => {
+ conf.auth.otp = otp
+ return profile.removeToken(key, conf)
+ })
+ })
})
})).then(() => {
if (conf.json) {
@@ -174,7 +192,7 @@ function create (args) {
return profile.createToken(password, readonly, validCIDR, conf).catch((ex) => {
if (ex.code !== 'EOTP') throw ex
log.info('token', 'failed because it requires OTP')
- return readUserInfo.otp('Authenticator provided OTP:').then((otp) => {
+ return readUserInfo.otp().then((otp) => {
conf.auth.otp = otp
log.info('token', 'creating with OTP')
return pulseTillDone.withPromise(profile.createToken(password, readonly, validCIDR, conf))
@@ -205,7 +223,8 @@ function validateCIDR (cidr) {
}
function validateCIDRList (cidrs) {
- const list = Array.isArray(cidrs) ? cidrs : cidrs ? cidrs.split(/,\s*/) : []
+ const maybeList = cidrs ? (Array.isArray(cidrs) ? cidrs : [cidrs]) : []
+ const list = maybeList.length === 1 ? maybeList[0].split(/,\s*/) : maybeList
list.forEach(validateCIDR)
return list
}
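The `validateCIDRList` change above lets `--cidr` accept either an array of ranges or a single comma-separated string. A standalone sketch of just the list normalization (the `is-cidr` validation step is omitted for brevity, and `toCIDRList` is a hypothetical name):

```javascript
// Standalone sketch of the list normalization in validateCIDRList above:
// a lone string may itself be a comma-separated list of CIDR ranges, so a
// single-element input is split on commas before validation would run.
function toCIDRList (cidrs) {
  const maybeList = cidrs ? (Array.isArray(cidrs) ? cidrs : [cidrs]) : []
  return maybeList.length === 1 ? maybeList[0].split(/,\s*/) : maybeList
}

console.log(toCIDRList('10.0.0.0/8, 192.168.1.0/24'))
// [ '10.0.0.0/8', '192.168.1.0/24' ]
console.log(toCIDRList(['10.0.0.0/8', '192.168.1.0/24']))
// [ '10.0.0.0/8', '192.168.1.0/24' ]
```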
diff --git a/deps/npm/lib/unbuild.js b/deps/npm/lib/unbuild.js
index 78293c9ca269b6..d527778e92b07c 100644
--- a/deps/npm/lib/unbuild.js
+++ b/deps/npm/lib/unbuild.js
@@ -77,7 +77,7 @@ function rmBins (pkg, folder, parent, top, cb) {
asyncMap(Object.keys(pkg.bin), function (b, cb) {
if (process.platform === 'win32') {
chain([ [gentlyRm, path.resolve(binRoot, b) + '.cmd', true, folder],
- [gentlyRm, path.resolve(binRoot, b), true, folder] ], cb)
+ [gentlyRm, path.resolve(binRoot, b), true, folder] ], cb)
} else {
gentlyRm(path.resolve(binRoot, b), true, folder, cb)
}
diff --git a/deps/npm/lib/uninstall.js b/deps/npm/lib/uninstall.js
index 333d3e9d69c3c9..c4bd23ea319959 100644
--- a/deps/npm/lib/uninstall.js
+++ b/deps/npm/lib/uninstall.js
@@ -29,8 +29,8 @@ function uninstall (args, cb) {
if (args.length === 1 && args[0] === '.') args = []
const where = npm.config.get('global') || !args.length
- ? path.resolve(npm.globalDir, '..')
- : npm.prefix
+ ? path.resolve(npm.globalDir, '..')
+ : npm.prefix
args = args.filter(function (a) {
return path.resolve(a) !== where
diff --git a/deps/npm/lib/unpublish.js b/deps/npm/lib/unpublish.js
index 4ea8187025f808..c2e9edd8006f51 100644
--- a/deps/npm/lib/unpublish.js
+++ b/deps/npm/lib/unpublish.js
@@ -1,3 +1,4 @@
+/* eslint-disable standard/no-callback-literal */
module.exports = unpublish
@@ -100,10 +101,10 @@ function gotProject (project, version, publishConfig, cb_) {
// remove from the cache first
// npm.commands.cache(['clean', project, version], function (er) {
- // if (er) {
- // log.error('unpublish', 'Failed to clean cache')
- // return cb(er)
- // }
+ // if (er) {
+ // log.error('unpublish', 'Failed to clean cache')
+ // return cb(er)
+ // }
mapToRegistry(project, config, function (er, uri, auth) {
if (er) return cb(er)
diff --git a/deps/npm/lib/update.js b/deps/npm/lib/update.js
index efb56f5e415ba7..9b1345f9dfbfb4 100644
--- a/deps/npm/lib/update.js
+++ b/deps/npm/lib/update.js
@@ -57,7 +57,7 @@ function update_ (args) {
// use the initial installation method (repo, tar, git) for updating
if (url.parse(ww.req).protocol) ww.what = ww.req
- const where = ww.dep.parent && ww.dep.parent.path || ww.dep.path
+ const where = (ww.dep.parent && ww.dep.parent.path) || ww.dep.path
const isTransitive = !(ww.dep.requiredBy || []).some((p) => p.isTop)
const key = where + ':' + String(isTransitive)
if (!toInstall[key]) toInstall[key] = {where: where, opts: {saveOnlyLock: isTransitive}, what: []}
diff --git a/deps/npm/lib/utils/did-you-mean.js b/deps/npm/lib/utils/did-you-mean.js
index 8e72dde5fa0132..479f04755d7db9 100644
--- a/deps/npm/lib/utils/did-you-mean.js
+++ b/deps/npm/lib/utils/did-you-mean.js
@@ -1,19 +1,16 @@
var meant = require('meant')
-var output = require('./output.js')
function didYouMean (scmd, commands) {
var bestSimilarity = meant(scmd, commands).map(function (str) {
return ' ' + str
})
- if (bestSimilarity.length === 0) return
+ if (bestSimilarity.length === 0) return ''
if (bestSimilarity.length === 1) {
- output('\nDid you mean this?\n' + bestSimilarity[0])
+ return '\nDid you mean this?\n' + bestSimilarity[0]
} else {
- output(
- ['\nDid you mean one of these?']
- .concat(bestSimilarity.slice(0, 3)).join('\n')
- )
+ return ['\nDid you mean one of these?']
+ .concat(bestSimilarity.slice(0, 3)).join('\n')
}
}
diff --git a/deps/npm/lib/utils/error-handler.js b/deps/npm/lib/utils/error-handler.js
index b2fd45a5f3e5fb..c6481abf6737d6 100644
--- a/deps/npm/lib/utils/error-handler.js
+++ b/deps/npm/lib/utils/error-handler.js
@@ -57,7 +57,7 @@ process.on('exit', function (code) {
log.error('', 'cb() never called!')
console.error('')
log.error('', 'This is an error with npm itself. Please report this error at:')
- log.error('', ' ')
+ log.error('', ' ')
writeLogFile()
}
@@ -247,6 +247,6 @@ function writeLogFile () {
log.record.length = 0
wroteLogFile = true
} catch (ex) {
- return
+
}
}
diff --git a/deps/npm/lib/utils/error-message.js b/deps/npm/lib/utils/error-message.js
index 85504f5edc5bb5..6e148981833d32 100644
--- a/deps/npm/lib/utils/error-message.js
+++ b/deps/npm/lib/utils/error-message.js
@@ -9,6 +9,17 @@ function errorMessage (er) {
var short = []
var detail = []
switch (er.code) {
+ case 'ENOAUDIT':
+ short.push(['audit', er.message])
+ break
+ case 'EAUDITNOPJSON':
+ short.push(['audit', er.message])
+ break
+ case 'EAUDITNOLOCK':
+ short.push(['audit', er.message])
+ detail.push(['audit', 'Try creating one first with: npm i --package-lock-only'])
+ break
+
case 'ECONNREFUSED':
short.push(['', er])
detail.push([
@@ -23,8 +34,17 @@ function errorMessage (er) {
case 'EACCES':
case 'EPERM':
short.push(['', er])
- detail.push(['', ['\nPlease try running this command again as root/Administrator.'
- ].join('\n')])
+ detail.push([
+ '',
+ [
+ '\nThe operation was rejected by your operating system.',
+ (process.platform === 'win32'
+ ? 'It\'s possible that the file was already in use (by a text editor or antivirus),\nor that you lack permissions to access it.'
+ : 'It is likely you do not have the permissions to access this file as the current user'),
+ '\nIf you believe this might be a permissions issue, please double-check the',
+ 'permissions of the file and its containing directories, or try running',
+ 'the command again as root/Administrator (though this is not recommended).'
+ ].join('\n')])
break
case 'ELIFECYCLE':
@@ -52,17 +72,32 @@ function errorMessage (er) {
break
case 'EJSONPARSE':
- short.push(['', er.message])
- short.push(['', 'File: ' + er.file])
+ const path = require('path')
+ // Check whether we ran into a conflict in our own package.json
+ if (er.file === path.join(npm.prefix, 'package.json')) {
+ const isDiff = require('../install/read-shrinkwrap.js')._isDiff
+ const txt = require('fs').readFileSync(er.file, 'utf8')
+ if (isDiff(txt)) {
+ detail.push([
+ '',
+ [
+ 'Merge conflict detected in your package.json.',
+ '',
+ 'Please resolve the package.json conflict and retry the command:',
+ '',
+ `$ ${process.argv.join(' ')}`
+ ].join('\n')
+ ])
+ break
+ }
+ }
+ short.push(['JSON.parse', er.message])
detail.push([
- '',
+ 'JSON.parse',
[
'Failed to parse package.json data.',
- 'package.json must be actual JSON, not just JavaScript.',
- '',
- 'Tell the package author to fix their package.json file.'
- ].join('\n'),
- 'JSON.parse'
+ 'package.json must be actual JSON, not just JavaScript.'
+ ].join('\n')
])
break
@@ -70,7 +105,7 @@ function errorMessage (er) {
case 'E401':
// the E401 message checking is a hack till we replace npm-registry-client with something
// OTP aware.
- if (er.code === 'EOTP' || (er.code === 'E401' && /one-time pass/.test(er.message))) {
+ if (er.code === 'EOTP' || /one-time pass/.test(er.message)) {
short.push(['', 'This operation requires a one-time password from your authenticator.'])
detail.push([
'',
@@ -80,42 +115,40 @@ function errorMessage (er) {
'it, or it timed out. Please try again.'
].join('\n')
])
- break
} else {
// npm ERR! code E401
// npm ERR! Unable to authenticate, need: Basic
- if (er.headers && er.headers['www-authenticate']) {
- const auth = er.headers['www-authenticate'].map((au) => au.split(/,\s*/))[0] || []
- if (auth.indexOf('Bearer') !== -1) {
- short.push(['', 'Unable to authenticate, your authentication token seems to be invalid.'])
- detail.push([
- '',
- [
- 'To correct this please trying logging in again with:',
- ' npm login'
- ].join('\n')
- ])
- break
- } else if (auth.indexOf('Basic') !== -1) {
- short.push(['', 'Incorrect or missing password.'])
- detail.push([
+ const auth = (er.headers && er.headers['www-authenticate'] && er.headers['www-authenticate'].map((au) => au.split(/,\s*/))[0]) || []
+ if (auth.indexOf('Bearer') !== -1) {
+ short.push(['', 'Unable to authenticate, your authentication token seems to be invalid.'])
+ detail.push([
+ '',
+ [
+ 'To correct this please try logging in again with:',
+ ' npm login'
+ ].join('\n')
+ ])
+ } else if (auth.indexOf('Basic') !== -1) {
+ short.push(['', 'Incorrect or missing password.'])
+ detail.push([
+ '',
+ [
+ 'If you were trying to login, change your password, create an',
+ 'authentication token or enable two-factor authentication then',
+ 'that means you likely typed your password in incorrectly.',
+ 'Please try again, or recover your password at:',
+ ' https://www.npmjs.com/forgot',
'',
- [
- 'If you were trying to login, change your password, create an',
- 'authentication token or enable two-factor authentication then',
- 'that means you likely typed your password in incorrectly.',
- 'Please try again, or recover your password at:',
- ' https://www.npmjs.com/forgot',
- '',
- 'If you were doing some other operation then your saved credentials are',
- 'probably out of date. To correct this please try logging in again with:',
- ' npm login'
- ].join('\n')
- ])
- break
- }
+ 'If you were doing some other operation then your saved credentials are',
+ 'probably out of date. To correct this please try logging in again with:',
+ ' npm login'
+ ].join('\n')
+ ])
+ } else {
+ short.push(['', er.message || er])
}
}
+ break
case 'E404':
// There's no need to have 404 in the message as well.
@@ -271,7 +304,7 @@ function errorMessage (er) {
])
break
} // else passthrough
- /*eslint no-fallthrough:0*/
+ /* eslint no-fallthrough:0 */
case 'ENOSPC':
short.push(['nospc', er.message])
@@ -315,7 +348,7 @@ function errorMessage (er) {
'typeerror',
[
'This is an error with npm itself. Please report this error at:',
- ' '
+ ' '
].join('\n')
])
break
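The reworked E401 branch above inspects the `www-authenticate` response header to decide which hint to print. A hypothetical standalone sketch of that inspection (`authKind` is an illustrative name, not part of the patch):

```javascript
// Hypothetical standalone sketch of the www-authenticate inspection in the
// E401 branch above: the first header value is split on commas, and the
// schemes it lists decide between the invalid-token hint ('bearer'), the
// wrong-password hint ('basic'), or the generic fallback ('other').
function authKind (headers) {
  const auth = (headers && headers['www-authenticate'] &&
    headers['www-authenticate'].map((au) => au.split(/,\s*/))[0]) || []
  if (auth.indexOf('Bearer') !== -1) return 'bearer'
  if (auth.indexOf('Basic') !== -1) return 'basic'
  return 'other'
}

console.log(authKind({'www-authenticate': ['Bearer']})) // 'bearer'
console.log(authKind({'www-authenticate': ['Basic']})) // 'basic'
console.log(authKind(undefined)) // 'other'
```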
diff --git a/deps/npm/lib/utils/gunzip-maybe.js b/deps/npm/lib/utils/gunzip-maybe.js
index db75f0601713f5..adf7e4402aa2d9 100644
--- a/deps/npm/lib/utils/gunzip-maybe.js
+++ b/deps/npm/lib/utils/gunzip-maybe.js
@@ -11,8 +11,8 @@ function gunzip () {
var stream = duplex()
var peeker = through(function (chunk, enc, cb) {
var newStream = hasGzipHeader(chunk)
- ? zlib.createGunzip()
- : through()
+ ? zlib.createGunzip()
+ : through()
stream.setReadable(newStream)
stream.setWritable(newStream)
stream.write(chunk)
diff --git a/deps/npm/lib/utils/is-registry.js b/deps/npm/lib/utils/is-registry.js
index 434cdbecbf18ad..e5f08e16a0b961 100644
--- a/deps/npm/lib/utils/is-registry.js
+++ b/deps/npm/lib/utils/is-registry.js
@@ -9,4 +9,3 @@ function isRegistry (req) {
if (req.type === 'range' || req.type === 'version' || req.type === 'tag') return true
return false
}
-
diff --git a/deps/npm/lib/utils/is-windows-bash.js b/deps/npm/lib/utils/is-windows-bash.js
index bc4ac7db493f3a..0a6c179680e8c6 100644
--- a/deps/npm/lib/utils/is-windows-bash.js
+++ b/deps/npm/lib/utils/is-windows-bash.js
@@ -1,3 +1,4 @@
'use strict'
var isWindows = require('./is-windows.js')
-module.exports = isWindows && /^MINGW(32|64)$/.test(process.env.MSYSTEM)
+module.exports = isWindows &&
+ (/^MINGW(32|64)$/.test(process.env.MSYSTEM) || process.env.TERM === 'cygwin')
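The is-windows-bash change above widens detection to cover Cygwin terminals as well as MINGW shells. A minimal sketch of that predicate, parameterized over the environment so it can be exercised anywhere (npm's real module reads `process.env` directly):

```javascript
'use strict'
// Sketch of the detection logic: a shell counts as "Windows bash" when we are
// on Windows AND either MSYSTEM names a MINGW shell or TERM reports cygwin.
function isWindowsBash (isWindows, env) {
  return isWindows &&
    (/^MINGW(32|64)$/.test(env.MSYSTEM) || env.TERM === 'cygwin')
}
```

Note that `/^MINGW(32|64)$/.test(undefined)` coerces to the string `"undefined"` and fails the match, so a missing `MSYSTEM` falls through to the `TERM` check.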
diff --git a/deps/npm/lib/utils/metrics-launch.js b/deps/npm/lib/utils/metrics-launch.js
index 821f8bc7e4fde3..7e2a8d1cc98d61 100644
--- a/deps/npm/lib/utils/metrics-launch.js
+++ b/deps/npm/lib/utils/metrics-launch.js
@@ -1,4 +1,5 @@
'use strict'
+/* eslint-disable camelcase */
module.exports = launchSendMetrics
var fs = require('graceful-fs')
var child_process = require('child_process')
diff --git a/deps/npm/lib/utils/open-url.js b/deps/npm/lib/utils/open-url.js
new file mode 100644
index 00000000000000..7a48d2e868959b
--- /dev/null
+++ b/deps/npm/lib/utils/open-url.js
@@ -0,0 +1,16 @@
+'use strict'
+const npm = require('../npm.js')
+const output = require('./output.js')
+const opener = require('opener')
+
+// attempt to open URL in web-browser, print address otherwise:
+module.exports = function open (url, errMsg, cb, browser = npm.config.get('browser')) {
+ opener(url, { command: npm.config.get('browser') }, (er) => {
+ if (er && er.code === 'ENOENT') {
+ output(`${errMsg}:\n\n${url}`)
+ return cb()
+ } else {
+ return cb(er)
+ }
+ })
+}
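The new open-url.js tries the configured browser and, if the browser binary is missing, falls back to printing the URL. A dependency-free sketch of that fallback (the opener and printer are injected here; npm's real module uses the `opener` package and its own `output` helper):

```javascript
'use strict'
// Sketch of the open-url fallback: ENOENT from the opener means "no browser
// found", which is downgraded to printing the URL; any other outcome (success
// or a different error) is passed straight to the callback.
function openUrl (opener, print, url, errMsg, cb) {
  opener(url, (er) => {
    if (er && er.code === 'ENOENT') {
      print(`${errMsg}:\n\n${url}`)
      return cb() // fallback counts as success
    }
    return cb(er)
  })
}
```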
diff --git a/deps/npm/lib/utils/parse-json.js b/deps/npm/lib/utils/parse-json.js
index 5c0b959a0d39ee..c2ebac35819adc 100644
--- a/deps/npm/lib/utils/parse-json.js
+++ b/deps/npm/lib/utils/parse-json.js
@@ -1,13 +1,14 @@
'use strict'
+var parseJsonWithErrors = require('json-parse-better-errors')
var parseJSON = module.exports = function (content) {
- return JSON.parse(stripBOM(content))
+ return parseJsonWithErrors(stripBOM(content))
}
parseJSON.noExceptions = function (content) {
try {
return parseJSON(content)
} catch (ex) {
- return
+
}
}
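The parse-json change above swaps `JSON.parse` for `json-parse-better-errors`, which only changes the error message on failure, not the success path. A stdlib-only sketch of the helper's shape, including the `noExceptions` variant:

```javascript
'use strict'
// Sketch of parse-json.js: strip a UTF-8 BOM, parse, and expose a variant
// that returns undefined instead of throwing on malformed input.
function stripBOM (content) {
  return content.charCodeAt(0) === 0xFEFF ? content.slice(1) : content
}
function parseJSON (content) {
  return JSON.parse(stripBOM(content))
}
parseJSON.noExceptions = function (content) {
  try {
    return parseJSON(content)
  } catch (ex) {
    // fall through: callers treat undefined as "not parseable"
  }
}
```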
diff --git a/deps/npm/lib/utils/perf.js b/deps/npm/lib/utils/perf.js
index 04232632254531..d314860792d2a3 100644
--- a/deps/npm/lib/utils/perf.js
+++ b/deps/npm/lib/utils/perf.js
@@ -18,10 +18,9 @@ function time (name) {
function timeEnd (name) {
if (name in timings) {
- process.emit('timing', name, Date.now() - timings[name])
+ perf.emit('timing', name, Date.now() - timings[name])
delete timings[name]
} else {
log.silly('timing', "Tried to end timer that doesn't exist:", name)
- return
}
}
diff --git a/deps/npm/lib/utils/pick-manifest-from-registry-metadata.js b/deps/npm/lib/utils/pick-manifest-from-registry-metadata.js
index e2c0d2e5aa45e0..589cef207dcd9d 100644
--- a/deps/npm/lib/utils/pick-manifest-from-registry-metadata.js
+++ b/deps/npm/lib/utils/pick-manifest-from-registry-metadata.js
@@ -21,6 +21,6 @@ function pickManifestFromRegistryMetadata (spec, tag, versions, metadata) {
} else if (spec === '*' && versions.length && tagged && metadata.versions[tagged]) {
return {resolvedTo: tag, manifest: metadata.versions[tagged]}
} else {
- return
+
}
}
diff --git a/deps/npm/lib/utils/read-user-info.js b/deps/npm/lib/utils/read-user-info.js
index 359432cf70de85..445bdfeea3e846 100644
--- a/deps/npm/lib/utils/read-user-info.js
+++ b/deps/npm/lib/utils/read-user-info.js
@@ -19,7 +19,15 @@ function read (opts) {
}
function readOTP (msg, otp, isRetry) {
- if (!msg) msg = 'Enter OTP: '
+ if (!msg) {
+ msg = [
+ 'There was an error while trying authentication due to OTP (One-Time-Password).',
+ 'The One-Time-Password is generated via applications like Authy or',
+ 'Google Authenticator, for more information see:',
+ 'https://docs.npmjs.com/getting-started/using-two-factor-authentication',
+ 'Enter OTP: '
+ ].join('\n')
+ }
if (isRetry && otp && /^[\d ]+$|^[A-Fa-f0-9]{64,64}$/.test(otp)) return otp.replace(/\s+/g, '')
return read({prompt: msg, default: otp || ''})
@@ -63,4 +71,3 @@ function readEmail (msg, email, opts, isRetry) {
return read({prompt: msg, default: email || ''})
.then((username) => readEmail(msg, username, opts, true))
}
-
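The readOTP retry path above accepts a value immediately when it already looks like a one-time password (digits and spaces) or a 64-character hex recovery code, stripping whitespace; anything else re-prompts. That shortcut can be sketched as a pure function (the `null` return standing in for "prompt again"):

```javascript
'use strict'
// Sketch of the OTP shortcut: npm's regex accepts digit/space codes or a
// 64-hex recovery code, then collapses whitespace before resubmitting.
function normalizeOTP (otp) {
  if (otp && /^[\d ]+$|^[A-Fa-f0-9]{64,64}$/.test(otp)) {
    return otp.replace(/\s+/g, '')
  }
  return null // caller falls back to prompting the user
}
```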
diff --git a/deps/npm/lib/utils/unsupported.js b/deps/npm/lib/utils/unsupported.js
index b3a8a9b33ae29b..09d7784dd5125d 100644
--- a/deps/npm/lib/utils/unsupported.js
+++ b/deps/npm/lib/utils/unsupported.js
@@ -1,11 +1,11 @@
'use strict'
var semver = require('semver')
var supportedNode = [
- {ver: '4', min: '4.7.0'},
{ver: '6', min: '6.0.0'},
- {ver: '7', min: '7.0.0'},
{ver: '8', min: '8.0.0'},
- {ver: '9', min: '9.0.0'}
+ {ver: '9', min: '9.0.0'},
+ {ver: '10', min: '10.0.0'},
+ {ver: '11', min: '11.0.0'}
]
var knownBroken = '<4.7.0'
@@ -25,7 +25,7 @@ exports.checkForBrokenNode = function () {
supportedNode.forEach(function (rel) {
if (semver.satisfies(nodejs.version, rel.ver)) {
console.error('Node.js ' + rel.ver + " is supported but the specific version you're running has")
- console.error(`a bug known to break npm. Please update to at least ${rel.min} to use this`)
+ console.error('a bug known to break npm. Please update to at least ' + rel.min + ' to use this')
console.error('version of npm. You can find the latest release of Node.js at https://nodejs.org/')
process.exit(1)
}
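The unsupported.js table above pins one minimum version per supported Node major line. npm resolves the row with `semver.satisfies()`; since every row is keyed by a plain major number, the lookup can be sketched with a simple major-version match (an assumption of this sketch, not npm's implementation):

```javascript
'use strict'
// Sketch of the support-table lookup: find the row whose `ver` matches the
// running major version and report its minimum; null means "not in the table".
const supportedNode = [
  {ver: '6', min: '6.0.0'},
  {ver: '8', min: '8.0.0'},
  {ver: '9', min: '9.0.0'},
  {ver: '10', min: '10.0.0'},
  {ver: '11', min: '11.0.0'}
]
function minimumFor (version) {
  const major = version.replace(/^v/, '').split('.')[0]
  const row = supportedNode.find((rel) => rel.ver === major)
  return row ? row.min : null
}
```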
diff --git a/deps/npm/lib/version.js b/deps/npm/lib/version.js
index edcd664f2a7c4e..4439f679b3b894 100644
--- a/deps/npm/lib/version.js
+++ b/deps/npm/lib/version.js
@@ -4,6 +4,7 @@ const BB = require('bluebird')
const assert = require('assert')
const chain = require('slide').chain
const detectIndent = require('detect-indent')
+const detectNewline = require('detect-newline')
const fs = require('graceful-fs')
const readFile = BB.promisify(require('graceful-fs').readFile)
const git = require('./utils/git.js')
@@ -14,9 +15,10 @@ const output = require('./utils/output.js')
const parseJSON = require('./utils/parse-json.js')
const path = require('path')
const semver = require('semver')
+const stringifyPackage = require('stringify-package')
const writeFileAtomic = require('write-file-atomic')
-version.usage = 'npm version [<newversion> | major | minor | patch | premajor | preminor | prepatch | prerelease | from-git]' +
+version.usage = 'npm version [<newversion> | major | minor | patch | premajor | preminor | prepatch | prerelease [--preid=<prerelease-id>] | from-git]' +
'\n(run in package dir)\n' +
"'npm -v' or 'npm --version' to print npm version " +
'(' + npm.version + ')\n' +
@@ -33,7 +35,7 @@ function version (args, silent, cb_) {
}
if (args.length > 1) return cb_(version.usage)
- readPackage(function (er, data, indent) {
+ readPackage(function (er, data, indent, newline) {
if (!args.length) return dump(data, cb_)
if (er) {
@@ -45,7 +47,7 @@ function version (args, silent, cb_) {
retrieveTagVersion(silent, data, cb_)
} else {
var newVersion = semver.valid(args[0])
- if (!newVersion) newVersion = semver.inc(data.version, args[0])
+ if (!newVersion) newVersion = semver.inc(data.version, args[0], npm.config.get('preid'))
if (!newVersion) return cb_(version.usage)
persistVersion(newVersion, silent, data, cb_)
}
@@ -115,14 +117,16 @@ function readPackage (cb) {
fs.readFile(packagePath, 'utf8', function (er, data) {
if (er) return cb(new Error(er))
var indent
+ var newline
try {
- indent = detectIndent(data).indent || ' '
+ indent = detectIndent(data).indent
+ newline = detectNewline(data)
data = JSON.parse(data)
} catch (e) {
er = e
data = null
}
- cb(er, data, indent)
+ cb(er, data, indent, newline)
})
}
@@ -132,10 +136,10 @@ function updatePackage (newVersion, silent, cb_) {
cb_(er)
}
- readPackage(function (er, data, indent) {
+ readPackage(function (er, data, indent, newline) {
if (er) return cb(new Error(er))
data.version = newVersion
- write(data, 'package.json', indent, cb)
+ write(data, 'package.json', indent, newline, cb)
})
}
@@ -168,15 +172,17 @@ function updateShrinkwrap (newVersion, cb) {
const file = shrinkwrap ? SHRINKWRAP : PKGLOCK
let data
let indent
+ let newline
try {
data = parseJSON(shrinkwrap || lockfile)
- indent = detectIndent(shrinkwrap || lockfile).indent || ' '
+ indent = detectIndent(shrinkwrap || lockfile).indent
+ newline = detectNewline(shrinkwrap || lockfile)
} catch (err) {
log.error('version', `Bad ${file} data.`)
return cb(err)
}
data.version = newVersion
- write(data, file, indent, (err) => {
+ write(data, file, indent, newline, (err) => {
if (err) {
log.error('version', `Failed to update version in ${file}`)
return cb(err)
@@ -288,9 +294,10 @@ function buildCommitArgs (args) {
function _commit (version, localData, cb) {
const options = { env: process.env }
const message = npm.config.get('message').replace(/%s/g, version)
- const sign = npm.config.get('sign-git-tag')
- const commitArgs = buildCommitArgs([ 'commit', '-m', message ])
- const flagForTag = sign ? '-sm' : '-am'
+ const signTag = npm.config.get('sign-git-tag')
+ const signCommit = npm.config.get('sign-git-commit')
+ const commitArgs = buildCommitArgs([ 'commit', signCommit ? '-S -m' : '-m', message ])
+ const flagForTag = signTag ? '-sm' : '-am'
stagePackageFiles(localData, options).then(() => {
return git.exec(commitArgs, options)
@@ -307,9 +314,9 @@ function _commit (version, localData, cb) {
function stagePackageFiles (localData, options) {
return addLocalFile('package.json', options, false).then(() => {
if (localData.hasShrinkwrap) {
- return addLocalFile('npm-shrinkwrap.json', options, false)
+ return addLocalFile('npm-shrinkwrap.json', options, true)
} else if (localData.hasPackageLock) {
- return addLocalFile('package-lock.json', options, false)
+ return addLocalFile('package-lock.json', options, true)
}
})
}
@@ -317,18 +324,18 @@ function stagePackageFiles (localData, options) {
function addLocalFile (file, options, ignoreFailure) {
const p = git.exec(['add', path.join(npm.localPrefix, file)], options)
return ignoreFailure
- ? p.catch(() => {})
- : p
+ ? p.catch(() => {})
+ : p
}
-function write (data, file, indent, cb) {
+function write (data, file, indent, newline, cb) {
assert(data && typeof data === 'object', 'must pass data to version write')
assert(typeof file === 'string', 'must pass filename to write to version write')
log.verbose('version.write', 'data', data, 'to', file)
writeFileAtomic(
path.join(npm.localPrefix, file),
- new Buffer(JSON.stringify(data, null, indent || 2) + '\n'),
+ stringifyPackage(data, indent, newline),
cb
)
}
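The version.js changes above thread the file's detected indentation and line endings through to the writer, so `npm version` re-serializes package.json in the style it found rather than forcing two-space indents and `\n`. A stdlib-only sketch of what detect-indent, detect-newline, and stringify-package contribute (the real packages are more thorough than these one-liners):

```javascript
'use strict'
// Sketch of style-preserving JSON rewriting: detect the file's indent and
// newline, then re-stringify with the same conventions plus a trailing EOL.
function detectIndent (src) {
  const m = src.match(/^[ \t]+/m)
  return m ? m[0] : '  '
}
function detectNewline (src) {
  return src.includes('\r\n') ? '\r\n' : '\n'
}
function stringifyPackage (data, indent, newline) {
  const json = JSON.stringify(data, null, indent || 2)
  return (newline === '\r\n' ? json.replace(/\n/g, '\r\n') : json) + newline
}
```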
diff --git a/deps/npm/lib/view.js b/deps/npm/lib/view.js
index e0904048df8ab4..b7d7f6ec803100 100644
--- a/deps/npm/lib/view.js
+++ b/deps/npm/lib/view.js
@@ -1,6 +1,15 @@
+'use strict'
+
// npm view [pkg [pkg ...]]
module.exports = view
+const BB = require('bluebird')
+
+const byteSize = require('byte-size')
+const color = require('ansicolors')
+const columns = require('cli-columns')
+const relativeDate = require('tiny-relative-date')
+const style = require('ansistyles')
var npm = require('./npm.js')
var readJson = require('read-package-json')
var log = require('npmlog')
@@ -111,7 +120,7 @@ function fetchAndRead (nv, args, silent, cb) {
npm.registry.get(uri, { auth: auth }, function (er, data) {
if (er) return cb(er)
- if (data['dist-tags'] && data['dist-tags'].hasOwnProperty(version)) {
+ if (data['dist-tags'] && data['dist-tags'][version]) {
version = data['dist-tags'][version]
}
@@ -146,20 +155,162 @@ function fetchAndRead (nv, args, silent, cb) {
})
}
})
- results = results.reduce(reducer, {})
- var retval = results
+ var retval = results.reduce(reducer, {})
if (args.length === 1 && args[0] === '') {
retval = cleanBlanks(retval)
log.silly('cleanup', retval)
}
- if (error || silent) cb(error, retval)
- else printData(results, data._id, cb.bind(null, error, retval))
+ if (error || silent) {
+ cb(error, retval)
+ } else if (
+ !npm.config.get('json') &&
+ args.length === 1 &&
+ args[0] === ''
+ ) {
+ data.version = version
+ BB.all(results.map((v) => prettyView(data, v[Object.keys(v)[0]][''])))
+ .nodeify(cb)
+ .then(() => retval)
+ } else {
+ printData(retval, data._id, cb.bind(null, error, retval))
+ }
})
})
}
+function prettyView (packument, manifest) {
+ // More modern, pretty printing of default view
+ const unicode = npm.config.get('unicode')
+ return BB.try(() => {
+ if (!manifest) {
+ log.error(
+ 'view',
+ 'No matching versions.\n' +
+ 'To see a list of versions, run:\n' +
+ `> npm view ${packument.name} versions`
+ )
+ return
+ }
+ const tags = []
+ Object.keys(packument['dist-tags']).forEach((t) => {
+ const version = packument['dist-tags'][t]
+ tags.push(`${style.bright(color.green(t))}: ${version}`)
+ })
+ const unpackedSize = manifest.dist.unpackedSize &&
+ byteSize(manifest.dist.unpackedSize)
+ const licenseField = manifest.license || manifest.licence || 'Proprietary'
+ const info = {
+ name: color.green(manifest.name),
+ version: color.green(manifest.version),
+ bins: Object.keys(manifest.bin || {}).map(color.yellow),
+ versions: color.yellow(packument.versions.length + ''),
+ description: manifest.description,
+ deprecated: manifest.deprecated,
+ keywords: (packument.keywords || []).map(color.yellow),
+ license: typeof licenseField === 'string'
+ ? licenseField
+ : (licenseField.type || 'Proprietary'),
+ deps: Object.keys(manifest.dependencies || {}).map((dep) => {
+ return `${color.yellow(dep)}: ${manifest.dependencies[dep]}`
+ }),
+ publisher: manifest._npmUser && unparsePerson({
+ name: color.yellow(manifest._npmUser.name),
+ email: color.cyan(manifest._npmUser.email)
+ }),
+ modified: color.yellow(relativeDate(packument.time[packument.version])),
+ maintainers: (packument.maintainers || []).map((u) => unparsePerson({
+ name: color.yellow(u.name),
+ email: color.cyan(u.email)
+ })),
+ repo: (
+ manifest.bugs && (manifest.bugs.url || manifest.bugs)
+ ) || (
+ manifest.repository && (manifest.repository.url || manifest.repository)
+ ),
+ site: (
+ manifest.homepage && (manifest.homepage.url || manifest.homepage)
+ ),
+ stars: color.yellow('' + packument.users ? Object.keys(packument.users || {}).length : 0),
+ tags,
+ tarball: color.cyan(manifest.dist.tarball),
+ shasum: color.yellow(manifest.dist.shasum),
+ integrity: manifest.dist.integrity && color.yellow(manifest.dist.integrity),
+ fileCount: manifest.dist.fileCount && color.yellow(manifest.dist.fileCount),
+ unpackedSize: unpackedSize && color.yellow(unpackedSize.value) + ' ' + unpackedSize.unit
+ }
+ if (info.license.toLowerCase().trim() === 'proprietary') {
+ info.license = style.bright(color.red(info.license))
+ } else {
+ info.license = color.green(info.license)
+ }
+ console.log('')
+ console.log(
+ style.underline(style.bright(`${info.name}@${info.version}`)) +
+ ' | ' + info.license +
+ ' | deps: ' + (info.deps.length ? color.cyan(info.deps.length) : color.green('none')) +
+ ' | versions: ' + info.versions
+ )
+ info.description && console.log(info.description)
+ if (info.repo || info.site) {
+ info.site && console.log(color.cyan(info.site))
+ }
+
+ const warningSign = unicode ? ' ⚠️ ' : '!!'
+ info.deprecated && console.log(
+ `\n${style.bright(color.red('DEPRECATED'))}${
+ warningSign
+ } - ${info.deprecated}`
+ )
+
+ if (info.keywords.length) {
+ console.log('')
+ console.log('keywords:', info.keywords.join(', '))
+ }
+
+ if (info.bins.length) {
+ console.log('')
+ console.log('bin:', info.bins.join(', '))
+ }
+
+ console.log('')
+ console.log('dist')
+ console.log('.tarball:', info.tarball)
+ console.log('.shasum:', info.shasum)
+ info.integrity && console.log('.integrity:', info.integrity)
+ info.unpackedSize && console.log('.unpackedSize:', info.unpackedSize)
+
+ const maxDeps = 24
+ if (info.deps.length) {
+ console.log('')
+ console.log('dependencies:')
+ console.log(columns(info.deps.slice(0, maxDeps), {padding: 1}))
+ if (info.deps.length > maxDeps) {
+ console.log(`(...and ${info.deps.length - maxDeps} more.)`)
+ }
+ }
+
+ if (info.maintainers && info.maintainers.length) {
+ console.log('')
+ console.log('maintainers:')
+ info.maintainers.forEach((u) => console.log('-', u))
+ }
+
+ console.log('')
+ console.log('dist-tags:')
+ console.log(columns(info.tags))
+
+ if (info.publisher || info.modified) {
+ let publishInfo = 'published'
+ if (info.modified) { publishInfo += ` ${info.modified}` }
+ if (info.publisher) { publishInfo += ` by ${info.publisher}` }
+ console.log('')
+ console.log(publishInfo)
+ }
+ })
+}
+
function cleanBlanks (obj) {
var clean = {}
Object.keys(obj).forEach(function (version) {
@@ -323,8 +474,8 @@ function cleanup (data) {
if (keys.length <= 3 &&
data.name &&
(keys.length === 1 ||
- keys.length === 3 && data.email && data.url ||
- keys.length === 2 && (data.email || data.url))) {
+ (keys.length === 3 && data.email && data.url) ||
+ (keys.length === 2 && (data.email || data.url)))) {
data = unparsePerson(data)
}
return data
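The new `prettyView` above renders `unpackedSize` by splitting a byte count into a `{value, unit}` pair (via the `byte-size` package) and joining them as "value unit". A rough stdlib sketch of that split, assuming byte-size's default metric units (kB = 1000 B):

```javascript
'use strict'
// Sketch of byte-size's output shape: scale the count down by 1000s and
// report one decimal place once a unit prefix is applied.
function byteSize (bytes) {
  const units = ['B', 'kB', 'MB', 'GB']
  let i = 0
  while (bytes >= 1000 && i < units.length - 1) {
    bytes /= 1000
    i++
  }
  return { value: i === 0 ? String(bytes) : bytes.toFixed(1), unit: units[i] }
}
```

The exact rounding rules here are an assumption; the real package's formatting differs in detail.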
diff --git a/deps/npm/lib/xmas.js b/deps/npm/lib/xmas.js
index 25535533e10fbd..65c0c131abd484 100644
--- a/deps/npm/lib/xmas.js
+++ b/deps/npm/lib/xmas.js
@@ -48,7 +48,7 @@ module.exports = function (args, cb) {
w('\n\n')
log.heading = ''
log.addLevel('npm', 100000, log.headingStyle)
- log.npm('loves you', 'Happy Xmas, Noders!')
+ log.npm('loves you', 'Happy Xmas, JavaScripters!')
cb()
}
var dg = false
diff --git a/deps/npm/man/man1/npm-README.1 b/deps/npm/man/man1/npm-README.1
index e0ecb72d36c9d8..ecfa0c9ce130db 100644
--- a/deps/npm/man/man1/npm-README.1
+++ b/deps/npm/man/man1/npm-README.1
@@ -1,30 +1,27 @@
-.TH "NPM" "1" "February 2018" "" ""
+.TH "NPM" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm\fR \- a JavaScript package manager
.P
-Build Status \fIhttps://img\.shields\.io/travis/npm/npm/latest\.svg\fR \fIhttps://travis\-ci\.org/npm/npm\fR
+Build Status \fIhttps://img\.shields\.io/travis/npm/cli/latest\.svg\fR \fIhttps://travis\-ci\.org/npm/cli\fR
.SH SYNOPSIS
.P
This is just enough info to get you up and running\.
.P
-Much more info available via \fBnpm help\fP once it's installed\.
+Much more info will be available via \fBnpm help\fP once it's installed\.
.SH IMPORTANT
.P
-\fBYou need node v4 or higher to run this program\.\fR
+\fBYou need node v6 or higher to run this program\.\fR
.P
-To install an old \fBand unsupported\fR version of npm that works on node v0\.12
+To install an old \fBand unsupported\fR version of npm that works on node v5
and prior, clone the git repo and dig through the old tags and branches\.
.P
-\fBnpm is configured to use npm, Inc\.'s public package registry at
-https://registry\.npmjs\.org by default\.\fR
+\fBnpm is configured to use npm, Inc\.'s public registry at
+https://registry\.npmjs\.org by default\.\fR Use of the npm public registry
+is subject to terms of use available at https://www\.npmjs\.com/policies/terms\|\.
.P
You can configure npm to use any compatible registry you
like, and even run your own registry\. Check out the doc on
registries \fIhttps://docs\.npmjs\.com/misc/registry\fR\|\.
-.P
-Use of someone else's registry may be governed by terms of use\. The
-terms of use for the default public registry are available at
-https://www\.npmjs\.com\|\.
.SH Super Easy Install
.P
npm is bundled with node \fIhttps://nodejs\.org/en/download/\fR\|\.
@@ -85,7 +82,7 @@ experience if you run a recent version of npm\. To upgrade, either use Microsoft
upgrade tool \fIhttps://github\.com/felixrieseberg/npm\-windows\-upgrade\fR,
download a new version of Node \fIhttps://nodejs\.org/en/download/\fR,
or follow the Windows upgrade instructions in the
-npm Troubleshooting Guide \fI\|\./TROUBLESHOOTING\.md\fR\|\.
+Installing/upgrading npm \fIhttps://npm\.community/t/installing\-upgrading\-npm/251/2\fR post\.
.P
If that's not fancy enough for you, then you can fetch the code with
git, and mess with it directly\.
@@ -155,14 +152,14 @@ When you find issues, please report them:
.RS 0
.IP \(bu 2
web:
-https://github\.com/npm/npm/issues
+https://npm\.community/c/bugs
.RE
.P
Be sure to include \fIall\fR of the output from the npm command that didn't work
as expected\. The \fBnpm\-debug\.log\fP file is also helpful to provide\.
.P
-You can also find npm people in \fB#npm\fP on https://package\.community/ or
+You can also find npm people in \fB#npm\fP on https:// or
on Twitter \fIhttps://twitter\.com/npm_support\fR\|\. Whoever responds will no
doubt tell you to put the output in a gist or email\.
.SH SEE ALSO
diff --git a/deps/npm/man/man1/npm-access.1 b/deps/npm/man/man1/npm-access.1
index 478e759bbfe353..18780136c84132 100644
--- a/deps/npm/man/man1/npm-access.1
+++ b/deps/npm/man/man1/npm-access.1
@@ -1,4 +1,4 @@
-.TH "NPM\-ACCESS" "1" "February 2018" "" ""
+.TH "NPM\-ACCESS" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-access\fR \- Set access level on published packages
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-adduser.1 b/deps/npm/man/man1/npm-adduser.1
index 881dc5be2e0375..72dfffdf692f3d 100644
--- a/deps/npm/man/man1/npm-adduser.1
+++ b/deps/npm/man/man1/npm-adduser.1
@@ -1,4 +1,4 @@
-.TH "NPM\-ADDUSER" "1" "February 2018" "" ""
+.TH "NPM\-ADDUSER" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-adduser\fR \- Add a registry user account
.SH SYNOPSIS
@@ -31,7 +31,7 @@ your existing record\.
.SH CONFIGURATION
.SS registry
.P
-Default: https://registry\.npmjs\.org/
+Default: https://
.P
The base URL of the npm package registry\. If \fBscope\fP is also specified,
this registry will only be used for packages with that scope\. \fBscope\fP defaults
diff --git a/deps/npm/man/man1/npm-audit.1 b/deps/npm/man/man1/npm-audit.1
new file mode 100644
index 00000000000000..e84f5b54ae938e
--- /dev/null
+++ b/deps/npm/man/man1/npm-audit.1
@@ -0,0 +1,150 @@
+.TH "NPM\-AUDIT" "1" "August 2018" "" ""
+.SH "NAME"
+\fBnpm-audit\fR \- Run a security audit
+.SH SYNOPSIS
+.P
+.RS 2
+.nf
+npm audit [\-\-json|\-\-parseable]
+npm audit fix [\-\-force|\-\-package\-lock\-only|\-\-dry\-run|\-\-production|\-\-only=dev]
+.fi
+.RE
+.SH EXAMPLES
+.P
+Scan your project for vulnerabilities and automatically install any compatible
+updates to vulnerable dependencies:
+.P
+.RS 2
+.nf
+$ npm audit fix
+.fi
+.RE
+.P
+Run \fBaudit fix\fP without modifying \fBnode_modules\fP, but still updating the
+pkglock:
+.P
+.RS 2
+.nf
+$ npm audit fix \-\-package\-lock\-only
+.fi
+.RE
+.P
+Skip updating \fBdevDependencies\fP:
+.P
+.RS 2
+.nf
+$ npm audit fix \-\-only=prod
+.fi
+.RE
+.P
+Have \fBaudit fix\fP install semver\-major updates to toplevel dependencies, not just
+semver\-compatible ones:
+.P
+.RS 2
+.nf
+$ npm audit fix \-\-force
+.fi
+.RE
+.P
+Do a dry run to get an idea of what \fBaudit fix\fP will do, and \fIalso\fR output
+install information in JSON format:
+.P
+.RS 2
+.nf
+$ npm audit fix \-\-dry\-run \-\-json
+.fi
+.RE
+.P
+Scan your project for vulnerabilities and just show the details, without fixing
+anything:
+.P
+.RS 2
+.nf
+$ npm audit
+.fi
+.RE
+.P
+Get the detailed audit report in JSON format:
+.P
+.RS 2
+.nf
+$ npm audit \-\-json
+.fi
+.RE
+.P
+Get the detailed audit report in plain text result, separated by tab characters, allowing for
+future reuse in scripting or command line post processing, like for example, selecting
+some of the columns printed:
+.P
+.RS 2
+.nf
+$ npm audit \-\-parseable
+.fi
+.RE
+.P
+To parse columns, you can use for example \fBawk\fP, and just print some of them:
+.P
+.RS 2
+.nf
+$ npm audit \-\-parseable | awk \-F $'\\t' '{print $1,$4}'
+.fi
+.RE
+.SH DESCRIPTION
+.P
+The audit command submits a description of the dependencies configured in
+your project to your default registry and asks for a report of known
+vulnerabilities\. The report returned includes instructions on how to act on
+this information\.
+.P
+You can also have npm automatically fix the vulnerabilities by running \fBnpm
+audit fix\fP\|\. Note that some vulnerabilities cannot be fixed automatically and
+will require manual intervention or review\. Also note that since \fBnpm audit fix\fP
+runs a full\-fledged \fBnpm install\fP under the hood, all configs that apply to the
+installer will also apply to \fBnpm install\fP \-\- so things like \fBnpm audit fix
+\-\-package\-lock\-only\fP will work as expected\.
+.SH CONTENT SUBMITTED
+.RS 0
+.IP \(bu 2
+npm_version
+.IP \(bu 2
+node_version
+.IP \(bu 2
+platform
+.IP \(bu 2
+node_env
+.IP \(bu 2
+A scrubbed version of your package\-lock\.json or npm\-shrinkwrap\.json
+
+.RE
+.SS SCRUBBING
+.P
+In order to ensure that potentially sensitive information is not included in
+the audit data bundle, some dependencies may have their names (and sometimes
+versions) replaced with opaque non\-reversible identifiers\. It is done for
+the following dependency types:
+.RS 0
+.IP \(bu 2
+Any module referencing a scope that is configured for a non\-default
+registry has its name scrubbed\. (That is, a scope you did a \fBnpm login \-\-scope=@ourscope\fP for\.)
+.IP \(bu 2
+All git dependencies have their names and specifiers scrubbed\.
+.IP \(bu 2
+All remote tarball dependencies have their names and specifiers scrubbed\.
+.IP \(bu 2
+All local directory and tarball dependencies have their names and specifiers scrubbed\.
+
+.RE
+.P
+The non\-reversible identifiers are a sha256 of a session\-specific UUID and the
+value being replaced, ensuring a consistent value within the payload that is
+different between runs\.
+.SH SEE ALSO
+.RS 0
+.IP \(bu 2
+npm help install
+.IP \(bu 2
+npm help 5 package\-locks
+.IP \(bu 2
+npm help 7 config
+
+.RE
diff --git a/deps/npm/man/man1/npm-bin.1 b/deps/npm/man/man1/npm-bin.1
index 2f371421b9e73d..ca90c9ecde60fc 100644
--- a/deps/npm/man/man1/npm-bin.1
+++ b/deps/npm/man/man1/npm-bin.1
@@ -1,4 +1,4 @@
-.TH "NPM\-BIN" "1" "February 2018" "" ""
+.TH "NPM\-BIN" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-bin\fR \- Display npm bin folder
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-bugs.1 b/deps/npm/man/man1/npm-bugs.1
index b2d549b8cd5a8e..40520bf0d43a52 100644
--- a/deps/npm/man/man1/npm-bugs.1
+++ b/deps/npm/man/man1/npm-bugs.1
@@ -1,4 +1,4 @@
-.TH "NPM\-BUGS" "1" "February 2018" "" ""
+.TH "NPM\-BUGS" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-bugs\fR \- Bugs for a package in a web browser maybe
.SH SYNOPSIS
@@ -30,7 +30,7 @@ The browser that is called by the \fBnpm bugs\fP command to open websites\.
.SS registry
.RS 0
.IP \(bu 2
-Default: https://registry\.npmjs\.org/
+Default: https://
.IP \(bu 2
Type: url
diff --git a/deps/npm/man/man1/npm-build.1 b/deps/npm/man/man1/npm-build.1
index 39bc2de42d4b5c..758d56904f4ad9 100644
--- a/deps/npm/man/man1/npm-build.1
+++ b/deps/npm/man/man1/npm-build.1
@@ -1,4 +1,4 @@
-.TH "NPM\-BUILD" "1" "February 2018" "" ""
+.TH "NPM\-BUILD" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-build\fR \- Build a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-bundle.1 b/deps/npm/man/man1/npm-bundle.1
index f3302bb27a9c88..56c37b7b01fb57 100644
--- a/deps/npm/man/man1/npm-bundle.1
+++ b/deps/npm/man/man1/npm-bundle.1
@@ -1,4 +1,4 @@
-.TH "NPM\-BUNDLE" "1" "February 2018" "" ""
+.TH "NPM\-BUNDLE" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-bundle\fR \- REMOVED
.SH DESCRIPTION
diff --git a/deps/npm/man/man1/npm-cache.1 b/deps/npm/man/man1/npm-cache.1
index 736a5d4072803e..fc5e85858c9045 100644
--- a/deps/npm/man/man1/npm-cache.1
+++ b/deps/npm/man/man1/npm-cache.1
@@ -1,4 +1,4 @@
-.TH "NPM\-CACHE" "1" "February 2018" "" ""
+.TH "NPM\-CACHE" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-cache\fR \- Manipulates packages cache
.SH SYNOPSIS
@@ -88,9 +88,9 @@ npm help publish
.IP \(bu 2
npm help pack
.IP \(bu 2
-https://npm\.im/cacache
+https://
.IP \(bu 2
-https://npm\.im/pacote
+https://
.RE
diff --git a/deps/npm/man/man1/npm-ci.1 b/deps/npm/man/man1/npm-ci.1
new file mode 100644
index 00000000000000..48e10b443ef74f
--- /dev/null
+++ b/deps/npm/man/man1/npm-ci.1
@@ -0,0 +1,76 @@
+.TH "NPM\-CI" "1" "August 2018" "" ""
+.SH "NAME"
+\fBnpm-ci\fR \- Install a project with a clean slate
+.SH SYNOPSIS
+.P
+.RS 2
+.nf
+npm ci
+.fi
+.RE
+.SH EXAMPLE
+.P
+Make sure you have a package\-lock and an up\-to\-date install:
+.P
+.RS 2
+.nf
+$ cd \./my/npm/project
+$ npm install
+added 154 packages in 10s
+$ ls | grep package\-lock
+.fi
+.RE
+.P
+Run \fBnpm ci\fP in that project
+.P
+.RS 2
+.nf
+$ npm ci
+added 154 packages in 5s
+.fi
+.RE
+.P
+Configure Travis to build using \fBnpm ci\fP instead of \fBnpm install\fP:
+.P
+.RS 2
+.nf
+# \.travis\.yml
+install:
+\- npm ci
+# keep the npm cache around to speed up installs
+cache:
+ directories:
+ \- "$HOME/\.npm"
+.fi
+.RE
+.SH DESCRIPTION
+.P
+This command is similar to npm help \fBnpm\-install\fP, except it's meant to be used in
+automated environments such as test platforms, continuous integration, and
+deployment\. It can be significantly faster than a regular npm install by
+skipping certain user\-oriented features\. It is also more strict than a regular
+install, which can help catch errors or inconsistencies caused by the
+incrementally\-installed local environments of most npm users\.
+.P
+In short, the main differences between using \fBnpm install\fP and \fBnpm ci\fP are:
+.RS 0
+.IP \(bu 2
+The project \fBmust\fR have an existing \fBpackage\-lock\.json\fP or \fBnpm\-shrinkwrap\.json\fP\|\.
+.IP \(bu 2
+If dependencies in the package lock do not match those in \fBpackage\.json\fP, \fBnpm ci\fP will exit with an error, instead of updating the package lock\.
+.IP \(bu 2
+\fBnpm ci\fP can only install entire projects at a time: individual dependencies cannot be added with this command\.
+.IP \(bu 2
+If a \fBnode_modules\fP is already present, it will be automatically removed before \fBnpm ci\fP begins its install\.
+.IP \(bu 2
+It will never write to \fBpackage\.json\fP or any of the package\-locks: installs are essentially frozen\.
+
+.RE
+.SH SEE ALSO
+.RS 0
+.IP \(bu 2
+npm help install
+.IP \(bu 2
+npm help 5 package\-locks
+
+.RE
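Among the `npm ci` rules above, the strictness check is the interesting one: if package.json declares a dependency the lockfile does not cover, the install errors out instead of updating the lock. A toy illustration of that check (not npm's actual implementation, and ignoring version-range matching):

```javascript
'use strict'
// Sketch of the npm ci mismatch rule: report every declared dependency
// (regular or dev) that has no entry in the lockfile.
function lockMismatches (pkg, lock) {
  const wanted = Object.assign({}, pkg.dependencies, pkg.devDependencies)
  const locked = lock.dependencies || {}
  return Object.keys(wanted).filter((name) => !(name in locked))
}
```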
diff --git a/deps/npm/man/man1/npm-completion.1 b/deps/npm/man/man1/npm-completion.1
index d7e3c745fff7ae..ad8f911f0cc112 100644
--- a/deps/npm/man/man1/npm-completion.1
+++ b/deps/npm/man/man1/npm-completion.1
@@ -1,4 +1,4 @@
-.TH "NPM\-COMPLETION" "1" "February 2018" "" ""
+.TH "NPM\-COMPLETION" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-completion\fR \- Tab Completion for npm
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-config.1 b/deps/npm/man/man1/npm-config.1
index 7299171a025b34..9a523dbe15e74a 100644
--- a/deps/npm/man/man1/npm-config.1
+++ b/deps/npm/man/man1/npm-config.1
@@ -1,4 +1,4 @@
-.TH "NPM\-CONFIG" "1" "February 2018" "" ""
+.TH "NPM\-CONFIG" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-config\fR \- Manage the npm configuration files
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-dedupe.1 b/deps/npm/man/man1/npm-dedupe.1
index 5c437e458a43ac..a5d84d05a0a264 100644
--- a/deps/npm/man/man1/npm-dedupe.1
+++ b/deps/npm/man/man1/npm-dedupe.1
@@ -1,4 +1,4 @@
-.TH "NPM\-DEDUPE" "1" "February 2018" "" ""
+.TH "NPM\-DEDUPE" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-dedupe\fR \- Reduce duplication
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-deprecate.1 b/deps/npm/man/man1/npm-deprecate.1
index 6dd2fee7399c66..d146fe9ae6b60a 100644
--- a/deps/npm/man/man1/npm-deprecate.1
+++ b/deps/npm/man/man1/npm-deprecate.1
@@ -1,4 +1,4 @@
-.TH "NPM\-DEPRECATE" "1" "February 2018" "" ""
+.TH "NPM\-DEPRECATE" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-deprecate\fR \- Deprecate a version of a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-dist-tag.1 b/deps/npm/man/man1/npm-dist-tag.1
index 4b3d814c1c8d74..6aa8ab5523aa5d 100644
--- a/deps/npm/man/man1/npm-dist-tag.1
+++ b/deps/npm/man/man1/npm-dist-tag.1
@@ -1,4 +1,4 @@
-.TH "NPM\-DIST\-TAG" "1" "February 2018" "" ""
+.TH "NPM\-DIST\-TAG" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-dist-tag\fR \- Modify package distribution tags
.SH SYNOPSIS
@@ -19,7 +19,7 @@ Add, remove, and enumerate distribution tags on a package:
.IP \(bu 2
add:
Tags the specified version of the package with the specified tag, or the
-\fB\-\-tag\fP config if not specified\. The tag you're adding is \fBlatest\fP and you
+\fB\-\-tag\fP config if not specified\. If the tag you're adding is \fBlatest\fP and you
have two\-factor authentication on auth\-and\-writes then you'll need to include
an otp on the command line with \fB\-\-otp\fP\|\.
.IP \(bu 2
diff --git a/deps/npm/man/man1/npm-docs.1 b/deps/npm/man/man1/npm-docs.1
index f5a50daf6e4a92..b7246688c2185d 100644
--- a/deps/npm/man/man1/npm-docs.1
+++ b/deps/npm/man/man1/npm-docs.1
@@ -1,4 +1,4 @@
-.TH "NPM\-DOCS" "1" "February 2018" "" ""
+.TH "NPM\-DOCS" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-docs\fR \- Docs for a package in a web browser maybe
.SH SYNOPSIS
@@ -32,7 +32,7 @@ The browser that is called by the \fBnpm docs\fP command to open websites\.
.SS registry
.RS 0
.IP \(bu 2
-Default: https://registry\.npmjs\.org/
+Default: https://
.IP \(bu 2
Type: url
diff --git a/deps/npm/man/man1/npm-doctor.1 b/deps/npm/man/man1/npm-doctor.1
index 494eb0c2c0f6ab..0193d88f272c75 100644
--- a/deps/npm/man/man1/npm-doctor.1
+++ b/deps/npm/man/man1/npm-doctor.1
@@ -1,4 +1,4 @@
-.TH "NPM\-DOCTOR" "1" "February 2018" "" ""
+.TH "NPM\-DOCTOR" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-doctor\fR \- Check your environments
.SH SYNOPSIS
@@ -111,4 +111,3 @@ npm help help
npm help ping
.RE
-
diff --git a/deps/npm/man/man1/npm-edit.1 b/deps/npm/man/man1/npm-edit.1
index 704a8e47ee6454..3d51b31c511baf 100644
--- a/deps/npm/man/man1/npm-edit.1
+++ b/deps/npm/man/man1/npm-edit.1
@@ -1,4 +1,4 @@
-.TH "NPM\-EDIT" "1" "February 2018" "" ""
+.TH "NPM\-EDIT" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-edit\fR \- Edit an installed package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-explore.1 b/deps/npm/man/man1/npm-explore.1
index ad4f16f8f110ff..525b2b8dc027eb 100644
--- a/deps/npm/man/man1/npm-explore.1
+++ b/deps/npm/man/man1/npm-explore.1
@@ -1,4 +1,4 @@
-.TH "NPM\-EXPLORE" "1" "February 2018" "" ""
+.TH "NPM\-EXPLORE" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-explore\fR \- Browse an installed package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-help-search.1 b/deps/npm/man/man1/npm-help-search.1
index 85ee70a77635fa..75a7bac374c1fb 100644
--- a/deps/npm/man/man1/npm-help-search.1
+++ b/deps/npm/man/man1/npm-help-search.1
@@ -1,4 +1,4 @@
-.TH "NPM\-HELP\-SEARCH" "1" "February 2018" "" ""
+.TH "NPM\-HELP\-SEARCH" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-help-search\fR \- Search npm help documentation
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-help.1 b/deps/npm/man/man1/npm-help.1
index 96fd790f040066..ce810be351c58e 100644
--- a/deps/npm/man/man1/npm-help.1
+++ b/deps/npm/man/man1/npm-help.1
@@ -1,4 +1,4 @@
-.TH "NPM\-HELP" "1" "February 2018" "" ""
+.TH "NPM\-HELP" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-help\fR \- Get help on npm
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-hook.1 b/deps/npm/man/man1/npm-hook.1
new file mode 100644
index 00000000000000..1f5b414a31bc65
--- /dev/null
+++ b/deps/npm/man/man1/npm-hook.1
@@ -0,0 +1,97 @@
+.TH "NPM\-HOOK" "1" "August 2018" "" ""
+.SH "NAME"
+\fBnpm-hook\fR \- Manage registry hooks
+.SH SYNOPSIS
+.P
+.RS 2
+.nf
+npm hook ls [pkg]
+npm hook add <entity> <url> <secret>
+npm hook update <id> <url> [secret]
+npm hook rm <id>
+.fi
+.RE
+.SH EXAMPLE
+.P
+Add a hook to watch a package for changes:
+.P
+.RS 2
+.nf
+$ npm hook add lodash https://example\.com/ my\-shared\-secret
+.fi
+.RE
+.P
+Add a hook to watch packages belonging to the user \fBsubstack\fP:
+.P
+.RS 2
+.nf
+$ npm hook add ~substack https://example\.com/ my\-shared\-secret
+.fi
+.RE
+.P
+Add a hook to watch packages in the scope \fB@npm\fP
+.P
+.RS 2
+.nf
+$ npm hook add @npm https://example\.com/ my\-shared\-secret
+.fi
+.RE
+.P
+List all your active hooks:
+.P
+.RS 2
+.nf
+$ npm hook ls
+.fi
+.RE
+.P
+List your active hooks for the \fBlodash\fP package:
+.P
+.RS 2
+.nf
+$ npm hook ls lodash
+.fi
+.RE
+.P
+Update an existing hook's url:
+.P
+.RS 2
+.nf
+$ npm hook update id\-deadbeef https://my\-new\-website\.here/
+.fi
+.RE
+.P
+Remove a hook:
+.P
+.RS 2
+.nf
+$ npm hook rm id\-deadbeef
+.fi
+.RE
+.SH DESCRIPTION
+.P
+Allows you to manage npm
+hooks \fIhttps://blog\.npmjs\.org/post/145260155635/introducing\-hooks\-get\-notifications\-of\-npm\fR,
+including adding, removing, listing, and updating\.
+.P
+Hooks allow you to configure URL endpoints that will be notified whenever a
+change happens to any of the supported entity types\. Three different types of
+entities can be watched by hooks: packages, owners, and scopes\.
+.P
+To create a package hook, simply reference the package name\.
+.P
+To create an owner hook, prefix the owner name with \fB~\fP (as in, \fB~youruser\fP)\.
+.P
+To create a scope hook, prefix the scope name with \fB@\fP (as in, \fB@yourscope\fP)\.
+.P
+The hook \fBid\fP used by \fBupdate\fP and \fBrm\fP is the ID listed in \fBnpm hook ls\fP for
+that particular hook\.
+.P
+The shared secret will be sent along to the URL endpoint so you can verify the
+request came from your own configured hook\.
+.SH SEE ALSO
+.RS 0
+.IP \(bu 2
+"Introducing Hooks" blog post \fIhttps://blog\.npmjs\.org/post/145260155635/introducing\-hooks\-get\-notifications\-of\-npm\fR
+
+.RE
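
The entity rules described in this new man page (plain name for a package, `~` prefix for an owner, `@` prefix for a scope) can be sketched as a small shell helper. `hook_type` is a hypothetical name used only for illustration; npm performs this classification internally:

```shell
#!/bin/sh
# Classify a hook target the way `npm hook add` interprets it:
# plain name -> package, ~name -> owner, @name -> scope.
hook_type() {
  case "$1" in
    '~'*) echo owner ;;    # e.g. ~substack
    '@'*) echo scope ;;    # e.g. @npm
    *)    echo package ;;  # e.g. lodash
  esac
}

hook_type lodash       # package
hook_type '~substack'  # owner (quoted to avoid shell tilde expansion)
hook_type '@npm'       # scope
```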
diff --git a/deps/npm/man/man1/npm-init.1 b/deps/npm/man/man1/npm-init.1
index aec58ddb6f7b84..60d64e272cc473 100644
--- a/deps/npm/man/man1/npm-init.1
+++ b/deps/npm/man/man1/npm-init.1
@@ -1,39 +1,81 @@
-.TH "NPM\-INIT" "1" "February 2018" "" ""
+.TH "NPM\-INIT" "1" "August 2018" "" ""
.SH "NAME"
-\fBnpm-init\fR \- Interactively create a package\.json file
+\fBnpm-init\fR \- create a package\.json file
.SH SYNOPSIS
.P
.RS 2
.nf
-npm init [\-f|\-\-force|\-y|\-\-yes]
+npm init [\-\-force|\-f|\-\-yes|\-y|\-\-scope]
+npm init <@scope> (same as `npx <@scope>/create`)
+npm init [<@scope>/]<name> (same as `npx [<@scope>/]create\-<name>`)
+.fi
+.RE
+.SH EXAMPLES
+.P
+Create a new React\-based project using \fBcreate\-react\-app\fP \fIhttps://npm\.im/create\-react\-app\fR:
+.P
+.RS 2
+.nf
+$ npm init react\-app \./my\-react\-app
+.fi
+.RE
+.P
+Create a new \fBesm\fP\-compatible package using \fBcreate\-esm\fP \fIhttps://npm\.im/create\-esm\fR:
+.P
+.RS 2
+.nf
+$ mkdir my\-esm\-lib && cd my\-esm\-lib
+$ npm init esm \-\-yes
.fi
.RE
-.SH DESCRIPTION
.P
-This will ask you a bunch of questions, and then write a package\.json for you\.
+Generate a plain old package\.json using legacy init:
.P
-It attempts to make reasonable guesses about what you want things to be set to,
-and then writes a package\.json file with the options you've selected\.
+.RS 2
+.nf
+$ mkdir my\-npm\-pkg && cd my\-npm\-pkg
+$ git init
+$ npm init
+.fi
+.RE
.P
-If you already have a package\.json file, it'll read that first, and default to
-the options in there\.
+Generate it without having it ask any questions:
.P
-It is strictly additive, so it does not delete options from your package\.json
-without a really good reason to do so\.
+.RS 2
+.nf
+$ npm init \-y
+.fi
+.RE
+.SH DESCRIPTION
+.P
+\fBnpm init <initializer>\fP can be used to set up a new or existing npm package\.
.P
-If you invoke it with \fB\-f\fP, \fB\-\-force\fP, \fB\-y\fP, or \fB\-\-yes\fP, it will use only
-defaults and not prompt you for any options\.
-.SH CONFIGURATION
-.SS scope
+\fBinitializer\fP in this case is an npm package named \fBcreate\-<initializer>\fP, which
+will be installed by npm help \fBnpx\fP \fIhttps://npm\.im/npx\fR, and then have its main bin
+executed \-\- presumably creating or updating \fBpackage\.json\fP and running any other
+initialization\-related operations\.
+.P
+The init command is transformed to a corresponding \fBnpx\fP operation as follows:
.RS 0
.IP \(bu 2
-Default: none
+\fBnpm init foo\fP \-> \fBnpx create\-foo\fP
+.IP \(bu 2
+\fBnpm init @usr/foo\fP \-> \fBnpx @usr/create\-foo\fP
.IP \(bu 2
-Type: String
+\fBnpm init @usr\fP \-> \fBnpx @usr/create\fP
.RE
.P
-The scope under which the new module should be created\.
+Any additional options will be passed directly to the command, so \fBnpm init foo
+\-\-hello\fP will map to \fBnpx create\-foo \-\-hello\fP\|\.
+.P
+If the initializer is omitted (by just calling \fBnpm init\fP), init will fall back
+to legacy init behavior\. It will ask you a bunch of questions, and then write a
+package\.json for you\. It will attempt to make reasonable guesses based on
+existing fields, dependencies, and options selected\. It is strictly additive, so
+it will keep any fields and values that were already set\. You can also use
+\fB\-y\fP/\fB\-\-yes\fP to skip the questionnaire altogether\. If you pass \fB\-\-scope\fP, it
+will create a scoped package\.
.SH SEE ALSO
.RS 0
.IP \(bu 2
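
The init-to-npx name transformation documented above can be sketched as a shell function. `init_to_npx` is illustrative only, not part of the npm CLI; npm applies this mapping internally before delegating to `npx`:

```shell
#!/bin/sh
# Sketch of the documented npm-init -> npx name transformation.
init_to_npx() {
  case "$1" in
    @*/*) echo "npx ${1%%/*}/create-${1#*/}" ;;  # npm init @usr/foo -> npx @usr/create-foo
    @*)   echo "npx $1/create" ;;                # npm init @usr     -> npx @usr/create
    *)    echo "npx create-$1" ;;                # npm init foo      -> npx create-foo
  esac
}

init_to_npx foo        # prints: npx create-foo
init_to_npx @usr/foo   # prints: npx @usr/create-foo
init_to_npx @usr       # prints: npx @usr/create
```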
diff --git a/deps/npm/man/man1/npm-install-ci-test.1 b/deps/npm/man/man1/npm-install-ci-test.1
new file mode 100644
index 00000000000000..8f513dc56bba87
--- /dev/null
+++ b/deps/npm/man/man1/npm-install-ci-test.1
@@ -0,0 +1,23 @@
+.TH "NPM" "" "August 2018" "" ""
+.SH "NAME"
+\fBnpm\fR
+.SH SYNOPSIS
+.P
+.RS 2
+.nf
+npm install\-ci\-test
+
+alias: npm cit
+.fi
+.RE
+.SH DESCRIPTION
+.P
+This command runs an \fBnpm ci\fP followed immediately by an \fBnpm test\fP\|\.
+.SH SEE ALSO
+.RS 0
+.IP \(bu 2
+npm help ci
+.IP \(bu 2
+npm help test
+
+.RE
diff --git a/deps/npm/man/man1/npm-install-test.1 b/deps/npm/man/man1/npm-install-test.1
index fc48e5db574141..fc8124b35edd98 100644
--- a/deps/npm/man/man1/npm-install-test.1
+++ b/deps/npm/man/man1/npm-install-test.1
@@ -1,4 +1,4 @@
-.TH "NPM" "" "February 2018" "" ""
+.TH "NPM" "" "August 2018" "" ""
.SH "NAME"
\fBnpm\fR
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-install.1 b/deps/npm/man/man1/npm-install.1
index a0f4adf16bb132..6830f133ab0ba5 100644
--- a/deps/npm/man/man1/npm-install.1
+++ b/deps/npm/man/man1/npm-install.1
@@ -1,4 +1,4 @@
-.TH "NPM\-INSTALL" "1" "February 2018" "" ""
+.TH "NPM\-INSTALL" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-install\fR \- Install a package
.SH SYNOPSIS
@@ -62,6 +62,11 @@ after packing it up into a tarball (b)\.
With the \fB\-\-production\fP flag (or when the \fBNODE_ENV\fP environment variable
is set to \fBproduction\fP), npm will not install modules listed in
\fBdevDependencies\fP\|\.
+.QP
+NOTE: The \fB\-\-production\fP flag has no particular meaning when adding a
+ dependency to a project\.
+
+.
.IP \(bu 2
\fBnpm install <folder>\fP:
Install the package in the directory as a symlink in the current project\.
@@ -72,14 +77,24 @@ after packing it up into a tarball (b)\.
\fBnpm install <tarball file>\fP:
Install a package that is sitting on the filesystem\. Note: if you just want
to link a dev directory into your npm root, you can do this more easily by
- using \fBnpm link\fP\|\. The filename \fImust\fR use \fB\|\.tar\fP, \fB\|\.tar\.gz\fP, or \fB\|\.tgz\fP as
- the extension\.
- Example:
+ using \fBnpm link\fP\|\.
+ Tarball requirements:
+.RS 0
+.IP \(bu 2
+The filename \fImust\fR use \fB\|\.tar\fP, \fB\|\.tar\.gz\fP, or \fB\|\.tgz\fP as
+the extension\.
+.IP \(bu 2
+The package contents should reside in a subfolder inside the tarball (usually it is called \fBpackage/\fP)\. npm strips one directory layer when installing the package (an equivalent of \fBtar x \-\-strip\-components=1\fP is run)\.
+.IP \(bu 2
+The package must contain a \fBpackage\.json\fP file with \fBname\fP and \fBversion\fP properties\.
+Example:
.P
.RS 2
.nf
- npm install \./package\.tgz
+npm install \./package\.tgz
.fi
+.RE
+
.RE
.IP \(bu 2
\fBnpm install <tarball url>\fP:
@@ -249,11 +264,11 @@ Examples:
.P
.RS 2
.nf
-npm install git+ssh://git@github\.com:npm/npm\.git#v1\.0\.27
-npm install git+ssh://git@github\.com:npm/npm#semver:^5\.0
-npm install git+https://isaacs@github\.com/npm/npm\.git
-npm install git://github\.com/npm/npm\.git#v1\.0\.27
-GIT_SSH_COMMAND='ssh \-i ~/\.ssh/custom_ident' npm install git+ssh://git@github\.com:npm/npm\.git
+npm install git+ssh://git@github\.com:npm/cli\.git#v1\.0\.27
+npm install git+ssh://git@github\.com:npm/cli#semver:^5\.0
+npm install git+https://isaacs@github\.com/npm/cli\.git
+npm install git://github\.com/npm/cli\.git#v1\.0\.27
+GIT_SSH_COMMAND='ssh \-i ~/\.ssh/custom_ident' npm install git+ssh://git@github\.com:npm/cli\.git
.fi
.RE
@@ -397,7 +412,8 @@ The \fB\-\-no\-shrinkwrap\fP argument, which will ignore an available
package lock or shrinkwrap file and use the package\.json instead\.
.P
The \fB\-\-no\-package\-lock\fP argument will prevent npm from creating a
-\fBpackage\-lock\.json\fP file\.
+\fBpackage\-lock\.json\fP file\. When running with package\-locks disabled, npm
+will not automatically prune your node modules when installing\.
.P
The \fB\-\-nodedir=/path/to/node/source\fP argument will allow npm to find the
node source code so that npm can compile native modules\.
@@ -405,6 +421,9 @@ node source code so that npm can compile native modules\.
The \fB\-\-only={prod[uction]|dev[elopment]}\fP argument will cause either only
\fBdevDependencies\fP or only non\-\fBdevDependencies\fP to be installed regardless of the \fBNODE_ENV\fP\|\.
.P
+The \fB\-\-no\-audit\fP argument can be used to disable sending of audit reports to
+the configured registries\. See npm help \fBnpm\-audit\fP for details on what is sent\.
+.P
See npm help 7 \fBnpm\-config\fP\|\. Many of the configuration params have some
effect on installation, since that's most of what npm does\.
.SH ALGORITHM
@@ -496,6 +515,8 @@ npm help 5 folders
.IP \(bu 2
npm help update
.IP \(bu 2
+npm help audit
+.IP \(bu 2
npm help link
.IP \(bu 2
npm help rebuild
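
The new tarball requirements added to npm-install.1 above (contents under a `package/` subfolder, one directory layer stripped on install) can be demonstrated with plain `tar`; the temp paths and `demo-pkg` name here are scratch stand-ins, not anything npm itself creates:

```shell
#!/bin/sh
# Demonstrates the tarball layout npm expects: contents under package/,
# with one leading directory stripped at install time.
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/package"
printf '{"name":"demo-pkg","version":"1.0.0"}\n' > "$tmp/package/package.json"

# Pack: contents live in a package/ subfolder inside the tarball.
tar czf "$tmp/demo-pkg.tgz" -C "$tmp" package

# Install side: npm effectively runs `tar x --strip-components=1`,
# so package.json lands at the top of the module folder.
mkdir "$tmp/extracted"
tar xzf "$tmp/demo-pkg.tgz" -C "$tmp/extracted" --strip-components=1
ls "$tmp/extracted"   # package.json
```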
diff --git a/deps/npm/man/man1/npm-link.1 b/deps/npm/man/man1/npm-link.1
index 1b4f3c6dd16727..711d73bfb8887b 100644
--- a/deps/npm/man/man1/npm-link.1
+++ b/deps/npm/man/man1/npm-link.1
@@ -1,4 +1,4 @@
-.TH "NPM\-LINK" "1" "February 2018" "" ""
+.TH "NPM\-LINK" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-link\fR \- Symlink a package folder
.SH SYNOPSIS
@@ -66,13 +66,16 @@ The second line is the equivalent of doing:
.RS 2
.nf
(cd \.\./node\-redis; npm link)
-npm link node\-redis
+npm link redis
.fi
.RE
.P
That is, it first creates a global link, and then links the global
installation target into your project's \fBnode_modules\fP folder\.
.P
+Note that in this case, you are referring to the directory name, \fBnode\-redis\fP,
+rather than the package name \fBredis\fP\|\.
+.P
If your linked package is scoped (see npm help 7 \fBnpm\-scope\fP) your link command must
include that scope, e\.g\.
.P
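
The two-step linking and the directory-name-versus-package-name distinction clarified in this npm-link.1 hunk can be sketched with plain symlinks; the `global/` and `project/` paths are scratch stand-ins for npm's real global and local `node_modules` folders:

```shell
#!/bin/sh
# Sketch of npm link's two steps, using the man page's example of a
# directory named node-redis containing a package named redis.
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/global/node_modules" "$tmp/project/node_modules" "$tmp/node-redis"
printf '{"name":"redis"}\n' > "$tmp/node-redis/package.json"

# Step 1 (`cd ../node-redis; npm link`): global link under the *package* name.
ln -s "$tmp/node-redis" "$tmp/global/node_modules/redis"

# Step 2 (`npm link redis`): link the global copy into the project.
ln -s "$tmp/global/node_modules/redis" "$tmp/project/node_modules/redis"

ls -l "$tmp/project/node_modules"
```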
diff --git a/deps/npm/man/man1/npm-logout.1 b/deps/npm/man/man1/npm-logout.1
index b4f2f3f4578cef..cc0bbe63770b2e 100644
--- a/deps/npm/man/man1/npm-logout.1
+++ b/deps/npm/man/man1/npm-logout.1
@@ -1,4 +1,4 @@
-.TH "NPM\-LOGOUT" "1" "February 2018" "" ""
+.TH "NPM\-LOGOUT" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-logout\fR \- Log out of the registry
.SH SYNOPSIS
@@ -23,7 +23,7 @@ connected to that scope, if set\.
.SH CONFIGURATION
.SS registry
.P
-Default: https://registry\.npmjs\.org/
+Default: https://
.P
The base URL of the npm package registry\. If \fBscope\fP is also specified,
it takes precedence\.
diff --git a/deps/npm/man/man1/npm-ls.1 b/deps/npm/man/man1/npm-ls.1
index 2127aed2ec0ef7..28d875bd735507 100644
--- a/deps/npm/man/man1/npm-ls.1
+++ b/deps/npm/man/man1/npm-ls.1
@@ -1,4 +1,4 @@
-.TH "NPM\-LS" "1" "February 2018" "" ""
+.TH "NPM\-LS" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-ls\fR \- List installed packages
.SH SYNOPSIS
@@ -22,7 +22,7 @@ For example, running \fBnpm ls promzard\fP in npm's source tree will show:
.P
.RS 2
.nf
-npm@5.6.0 /path/to/npm
+npm@6.4.1 /path/to/npm
└─┬ init\-package\-json@0\.0\.4
└── promzard@0\.1\.5
.fi
@@ -98,7 +98,7 @@ Default: false
.RE
.P
Display only the dependency tree for packages in \fBdependencies\fP\|\.
-.SS dev
+.SS dev / development
.RS 0
.IP \(bu 2
Type: Boolean
diff --git a/deps/npm/man/man1/npm-outdated.1 b/deps/npm/man/man1/npm-outdated.1
index e580351ad312ac..0da04eec2a6300 100644
--- a/deps/npm/man/man1/npm-outdated.1
+++ b/deps/npm/man/man1/npm-outdated.1
@@ -1,4 +1,4 @@
-.TH "NPM\-OUTDATED" "1" "February 2018" "" ""
+.TH "NPM\-OUTDATED" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-outdated\fR \- Check for outdated packages
.SH SYNOPSIS
@@ -34,6 +34,10 @@ always be seeing only top\-level dependencies that are outdated\.
\fBpackage type\fP (when using \fB\-\-long\fP / \fB\-l\fP) tells you whether this package is
a \fBdependency\fP or a \fBdevDependency\fP\|\. Packages not included in \fBpackage\.json\fP
are always marked \fBdependencies\fP\|\.
+.IP \(bu 2
+Red means there's a newer version matching your semver requirements, so you should update now\.
+.IP \(bu 2
+Yellow indicates that there's a newer version above your semver requirements (usually new major, or new 0\.x minor) so proceed with caution\.
.RE
.SS An example
@@ -75,10 +79,9 @@ something immutable, like a commit SHA), or it might not, so \fBnpm outdated\fP
\fBnpm update\fP have to fetch Git repos to check\. This is why currently doing a
reinstall of a Git dependency always forces a new clone and install\.
.IP \(bu 2
-\fBnpm@3\.5\.2\fP is marked as "wanted", but "latest" is \fBnpm@3\.5\.1\fP because npm
-uses dist\-tags to manage its \fBlatest\fP and \fBnext\fP release channels\. \fBnpm update\fP
-will install the \fInewest\fR version, but \fBnpm install npm\fP (with no semver range)
-will install whatever's tagged as \fBlatest\fP\|\.
+\fBnpm@3\.5\.2\fP is marked as "wanted", but "latest" is \fBnpm@3\.5\.1\fP because npm
+uses dist\-tags to manage its \fBlatest\fP and \fBnext\fP release channels\. \fBnpm update\fP
+will install the \fInewest\fR version, but \fBnpm install npm\fP (with no semver range) will install whatever's tagged as \fBlatest\fP\|\.
.IP \(bu 2
\fBonce\fP is just plain out of date\. Reinstalling \fBnode_modules\fP from scratch or
running \fBnpm update\fP will bring it up to spec\.
diff --git a/deps/npm/man/man1/npm-owner.1 b/deps/npm/man/man1/npm-owner.1
index 7308f29e164fc8..c5820e3f74a84b 100644
--- a/deps/npm/man/man1/npm-owner.1
+++ b/deps/npm/man/man1/npm-owner.1
@@ -1,4 +1,4 @@
-.TH "NPM\-OWNER" "1" "February 2018" "" ""
+.TH "NPM\-OWNER" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-owner\fR \- Manage package owners
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-pack.1 b/deps/npm/man/man1/npm-pack.1
index 488e8c7ff19f71..b13ca0cbac74f3 100644
--- a/deps/npm/man/man1/npm-pack.1
+++ b/deps/npm/man/man1/npm-pack.1
@@ -1,11 +1,11 @@
-.TH "NPM\-PACK" "1" "February 2018" "" ""
+.TH "NPM\-PACK" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-pack\fR \- Create a tarball from a package
.SH SYNOPSIS
.P
.RS 2
.nf
-npm pack [[<@scope>/]<pkg>\.\.\.]
+npm pack [[<@scope>/]<pkg>\.\.\.] [\-\-dry\-run]
.fi
.RE
.SH DESCRIPTION
@@ -20,6 +20,9 @@ If the same package is specified multiple times, then the file will be
overwritten the second time\.
.P
If no arguments are supplied, then npm packs the current package folder\.
+.P
+The \fB\-\-dry\-run\fP argument will do everything that pack usually does without
+actually packing anything\. Reports on what would have gone into the tarball\.
.SH SEE ALSO
.RS 0
.IP \(bu 2
diff --git a/deps/npm/man/man1/npm-ping.1 b/deps/npm/man/man1/npm-ping.1
index a06b56c25b5c7d..7432e645889e37 100644
--- a/deps/npm/man/man1/npm-ping.1
+++ b/deps/npm/man/man1/npm-ping.1
@@ -1,4 +1,4 @@
-.TH "NPM\-PING" "1" "February 2018" "" ""
+.TH "NPM\-PING" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-ping\fR \- Ping npm registry
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-prefix.1 b/deps/npm/man/man1/npm-prefix.1
index 859fa041b0b4f1..28fced48b8c020 100644
--- a/deps/npm/man/man1/npm-prefix.1
+++ b/deps/npm/man/man1/npm-prefix.1
@@ -1,4 +1,4 @@
-.TH "NPM\-PREFIX" "1" "February 2018" "" ""
+.TH "NPM\-PREFIX" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-prefix\fR \- Display prefix
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-profile.1 b/deps/npm/man/man1/npm-profile.1
index 6c57d425b64a33..a93bb1f889a189 100644
--- a/deps/npm/man/man1/npm-profile.1
+++ b/deps/npm/man/man1/npm-profile.1
@@ -1,4 +1,4 @@
-.TH "NPM\-PROFILE" "1" "February 2018" "" ""
+.TH "NPM\-PROFILE" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-profile\fR \- Change settings on your registry profile
.SH SYNOPSIS
@@ -89,4 +89,3 @@ available on non npmjs\.com registries\.
npm help 7 config
.RE
-
diff --git a/deps/npm/man/man1/npm-prune.1 b/deps/npm/man/man1/npm-prune.1
index 843ff8f7478f4c..be400b75044da4 100644
--- a/deps/npm/man/man1/npm-prune.1
+++ b/deps/npm/man/man1/npm-prune.1
@@ -1,11 +1,11 @@
-.TH "NPM\-PRUNE" "1" "February 2018" "" ""
+.TH "NPM\-PRUNE" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-prune\fR \- Remove extraneous packages
.SH SYNOPSIS
.P
.RS 2
.nf
-npm prune [[<@scope>/]<pkg>\.\.\.] [\-\-production]
+npm prune [[<@scope>/]<pkg>\.\.\.] [\-\-production] [\-\-dry\-run] [\-\-json]
.fi
.RE
.SH DESCRIPTION
@@ -19,8 +19,20 @@ package's dependencies list\.
.P
If the \fB\-\-production\fP flag is specified or the \fBNODE_ENV\fP environment
variable is set to \fBproduction\fP, this command will remove the packages
-specified in your \fBdevDependencies\fP\|\. Setting \fB\-\-production=false\fP will
+specified in your \fBdevDependencies\fP\|\. Setting \fB\-\-no\-production\fP will
negate \fBNODE_ENV\fP being set to \fBproduction\fP\|\.
+.P
+If the \fB\-\-dry\-run\fP flag is used then no changes will actually be made\.
+.P
+If the \fB\-\-json\fP flag is used then the changes \fBnpm prune\fP made (or would
+have made with \fB\-\-dry\-run\fP) are printed as a JSON object\.
+.P
+In normal operation with package\-locks enabled, extraneous modules are
+pruned automatically when modules are installed and you'll only need
+this command with the \fB\-\-production\fP flag\.
+.P
+If you've disabled package\-locks then extraneous modules will not be removed
+and it's up to you to run \fBnpm prune\fP from time\-to\-time to remove them\.
.SH SEE ALSO
.RS 0
.IP \(bu 2
diff --git a/deps/npm/man/man1/npm-publish.1 b/deps/npm/man/man1/npm-publish.1
index 1477e11168e021..3034f6aaa32203 100644
--- a/deps/npm/man/man1/npm-publish.1
+++ b/deps/npm/man/man1/npm-publish.1
@@ -1,11 +1,11 @@
-.TH "NPM\-PUBLISH" "1" "February 2018" "" ""
+.TH "NPM\-PUBLISH" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-publish\fR \- Publish a package
.SH SYNOPSIS
.P
.RS 2
.nf
-npm publish [<tarball>|<folder>] [\-\-tag <tag>] [\-\-access <public|restricted>] [\-\-otp otpcode]
+npm publish [<tarball>|<folder>] [\-\-tag <tag>] [\-\-access <public|restricted>] [\-\-otp otpcode] [\-\-dry\-run]
Publishes '\.' if no argument supplied
Sets tag 'latest' if no \-\-tag specified
@@ -48,6 +48,10 @@ to publish scoped packages\.
If you have two\-factor authentication enabled in \fBauth\-and\-writes\fP mode
then you can provide a code from your authenticator with this\. If you
don't include this and you're running from a TTY then you'll be prompted\.
+.IP \(bu 2
+\fB[\-\-dry\-run]\fP
+Does everything publish would do except actually publishing to the registry\.
+Reports the details of what would have been published\.
.RE
.P
@@ -62,9 +66,8 @@ As of \fBnpm@5\fP, both a sha1sum and an integrity field with a sha512sum of the
tarball will be submitted to the registry during publication\. Subsequent
installs will use the strongest supported algorithm to verify downloads\.
.P
-For a "dry run" that does everything except actually publishing to the
-registry, see npm help \fBnpm\-pack\fP, which figures out the files to be included and
-packs them into a tarball to be uploaded to the registry\.
+Similar to \fB\-\-dry\-run\fP, see npm help \fBnpm\-pack\fP, which figures out the files to be
+included and packs them into a tarball to be uploaded to the registry\.
.SH SEE ALSO
.RS 0
.IP \(bu 2
diff --git a/deps/npm/man/man1/npm-rebuild.1 b/deps/npm/man/man1/npm-rebuild.1
index aa65a577b985b5..c4cc56fdb394d1 100644
--- a/deps/npm/man/man1/npm-rebuild.1
+++ b/deps/npm/man/man1/npm-rebuild.1
@@ -1,4 +1,4 @@
-.TH "NPM\-REBUILD" "1" "February 2018" "" ""
+.TH "NPM\-REBUILD" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-rebuild\fR \- Rebuild a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-repo.1 b/deps/npm/man/man1/npm-repo.1
index 032e0bc17588bb..060da8a0f2cc07 100644
--- a/deps/npm/man/man1/npm-repo.1
+++ b/deps/npm/man/man1/npm-repo.1
@@ -1,4 +1,4 @@
-.TH "NPM\-REPO" "1" "February 2018" "" ""
+.TH "NPM\-REPO" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-repo\fR \- Open package repository page in the browser
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-restart.1 b/deps/npm/man/man1/npm-restart.1
index fe1e66a20fa879..117b0faceaa37b 100644
--- a/deps/npm/man/man1/npm-restart.1
+++ b/deps/npm/man/man1/npm-restart.1
@@ -1,4 +1,4 @@
-.TH "NPM\-RESTART" "1" "February 2018" "" ""
+.TH "NPM\-RESTART" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-restart\fR \- Restart a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-root.1 b/deps/npm/man/man1/npm-root.1
index ff0d64168deda2..cd811853ff25dc 100644
--- a/deps/npm/man/man1/npm-root.1
+++ b/deps/npm/man/man1/npm-root.1
@@ -1,4 +1,4 @@
-.TH "NPM\-ROOT" "1" "February 2018" "" ""
+.TH "NPM\-ROOT" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-root\fR \- Display npm root
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-run-script.1 b/deps/npm/man/man1/npm-run-script.1
index 1e56a9cf9d1c82..e1ecf2c775aded 100644
--- a/deps/npm/man/man1/npm-run-script.1
+++ b/deps/npm/man/man1/npm-run-script.1
@@ -1,4 +1,4 @@
-.TH "NPM\-RUN\-SCRIPT" "1" "February 2018" "" ""
+.TH "NPM\-RUN\-SCRIPT" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-run-script\fR \- Run arbitrary package scripts
.SH SYNOPSIS
@@ -18,9 +18,9 @@ used by the test, start, restart, and stop commands, but can be called
directly, as well\. When the scripts in the package are printed out, they're
separated into lifecycle (test, start, restart) and directly\-run scripts\.
.P
-As of \fBnpm@2\.0\.0\fP \fIhttp://blog\.npmjs\.org/post/98131109725/npm\-2\-0\-0\fR, you can
+As of \fBnpm@2\.0\.0\fP \fIhttps://blog\.npmjs\.org/post/98131109725/npm\-2\-0\-0\fR, you can
use custom arguments when executing scripts\. The special option \fB\-\-\fP is used by
-getopt \fIhttp://goo\.gl/KxMmtG\fR to delimit the end of the options\. npm will pass
+getopt \fIhttps://goo\.gl/KxMmtG\fR to delimit the end of the options\. npm will pass
all the arguments after the \fB\-\-\fP directly to your script:
.P
.RS 2
@@ -53,7 +53,7 @@ instead of
.P
.RS 2
.nf
-"scripts": {"test": "node_modules/\.bin/tap test/\\*\.js"}
+"scripts": {"test": "node_modules/\.bin/tap test/\\*\.js"}
.fi
.RE
.P
@@ -62,7 +62,7 @@ to run your tests\.
The actual shell your script is run within is platform dependent\. By default,
on Unix\-like systems it is the \fB/bin/sh\fP command, on Windows it is the \fBcmd\.exe\fP\|\.
The actual shell referred to by \fB/bin/sh\fP also depends on the system\.
-As of \fBnpm@5\.1\.0\fP \fIhttps://github\.com/npm/npm/releases/tag/v5\.1\.0\fR you can
+As of \fBnpm@5\.1\.0\fP \fIhttps://github\.com/npm/npm/releases/tag/v5\.1\.0\fR you can
customize the shell with the \fBscript\-shell\fP configuration\.
.P
Scripts are run from the root of the module, regardless of what your current
@@ -82,6 +82,10 @@ If you try to run a script without having a \fBnode_modules\fP directory and it
you will be given a warning to run \fBnpm install\fP, just in case you've forgotten\.
.P
You can use the \fB\-\-silent\fP flag to prevent showing \fBnpm ERR!\fP output on error\.
+.P
+You can use the \fB\-\-if\-present\fP flag to avoid exiting with a non\-zero exit code
+when the script is undefined\. This lets you run potentially undefined scripts
+without breaking the execution chain\.
.SH SEE ALSO
.RS 0
.IP \(bu 2
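
The `--` delimiter behavior this npm-run-script.1 hunk describes (flags before `--` go to npm, everything after is handed to the script verbatim) can be illustrated with a generic option loop; `parse` is a toy stand-in, not npm's actual argument handling:

```shell
#!/bin/sh
# Illustrates the getopt-style `--` delimiter npm run-script relies on:
# flags before `--` are consumed, everything after passes through verbatim.
parse() {
  opts=""
  args=""
  while [ $# -gt 0 ]; do
    case "$1" in
      --) shift; args="$*"; break ;;  # stop option parsing here
      -*) opts="$opts $1"; shift ;;
      *)  args="$args $1"; shift ;;
    esac
  done
  echo "options:$opts"
  echo "script args: $args"
}

# Roughly what `npm test --silent -- --grep=pattern` separates out:
parse --silent -- --grep=pattern
```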
diff --git a/deps/npm/man/man1/npm-search.1 b/deps/npm/man/man1/npm-search.1
index f58b5cbef95e8f..53055265e46108 100644
--- a/deps/npm/man/man1/npm-search.1
+++ b/deps/npm/man/man1/npm-search.1
@@ -1,4 +1,4 @@
-.TH "NPM\-SEARCH" "1" "February 2018" "" ""
+.TH "NPM\-SEARCH" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-search\fR \- Search for packages
.SH SYNOPSIS
@@ -110,7 +110,7 @@ The age of the cache, in seconds, before another registry request is made\.
.SS registry
.RS 0
.IP \(bu 2
-Default: https://registry\.npmjs\.org/
+Default: https://
.IP \(bu 2
Type: url
diff --git a/deps/npm/man/man1/npm-shrinkwrap.1 b/deps/npm/man/man1/npm-shrinkwrap.1
index 6bb356604a1bb4..60f3c1b7fd0530 100644
--- a/deps/npm/man/man1/npm-shrinkwrap.1
+++ b/deps/npm/man/man1/npm-shrinkwrap.1
@@ -1,4 +1,4 @@
-.TH "NPM\-SHRINKWRAP" "1" "February 2018" "" ""
+.TH "NPM\-SHRINKWRAP" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-shrinkwrap\fR \- Lock down dependency versions for publication
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-star.1 b/deps/npm/man/man1/npm-star.1
index 885336ea6b5178..2a5db1609a7660 100644
--- a/deps/npm/man/man1/npm-star.1
+++ b/deps/npm/man/man1/npm-star.1
@@ -1,4 +1,4 @@
-.TH "NPM\-STAR" "1" "February 2018" "" ""
+.TH "NPM\-STAR" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-star\fR \- Mark your favorite packages
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-stars.1 b/deps/npm/man/man1/npm-stars.1
index 9671d63e45d78a..511f65d4ec3f80 100644
--- a/deps/npm/man/man1/npm-stars.1
+++ b/deps/npm/man/man1/npm-stars.1
@@ -1,4 +1,4 @@
-.TH "NPM\-STARS" "1" "February 2018" "" ""
+.TH "NPM\-STARS" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-stars\fR \- View packages marked as favorites
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-start.1 b/deps/npm/man/man1/npm-start.1
index 2861e4e7bcc66c..1eb7bab7e77009 100644
--- a/deps/npm/man/man1/npm-start.1
+++ b/deps/npm/man/man1/npm-start.1
@@ -1,4 +1,4 @@
-.TH "NPM\-START" "1" "February 2018" "" ""
+.TH "NPM\-START" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-start\fR \- Start a package
.SH SYNOPSIS
@@ -14,7 +14,7 @@ This runs an arbitrary command specified in the package's \fB"start"\fP property
its \fB"scripts"\fP object\. If no \fB"start"\fP property is specified on the
\fB"scripts"\fP object, it will run \fBnode server\.js\fP\|\.
.P
-As of \fBnpm@2\.0\.0\fP \fIhttp://blog\.npmjs\.org/post/98131109725/npm\-2\-0\-0\fR, you can
+As of \fBnpm@2\.0\.0\fP \fIhttps://blog\.npmjs\.org/post/98131109725/npm\-2\-0\-0\fR, you can
use custom arguments when executing scripts\. Refer to npm help run\-script for
more details\.
.SH SEE ALSO
diff --git a/deps/npm/man/man1/npm-stop.1 b/deps/npm/man/man1/npm-stop.1
index 4441cd9fc0fac1..0ac18b8e13f965 100644
--- a/deps/npm/man/man1/npm-stop.1
+++ b/deps/npm/man/man1/npm-stop.1
@@ -1,4 +1,4 @@
-.TH "NPM\-STOP" "1" "February 2018" "" ""
+.TH "NPM\-STOP" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-stop\fR \- Stop a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-team.1 b/deps/npm/man/man1/npm-team.1
index 7bc9025cf97795..652c7b65fe706a 100644
--- a/deps/npm/man/man1/npm-team.1
+++ b/deps/npm/man/man1/npm-team.1
@@ -1,4 +1,4 @@
-.TH "NPM\-TEAM" "1" "February 2018" "" ""
+.TH "NPM\-TEAM" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-team\fR \- Manage organization teams and team memberships
.SH SYNOPSIS
@@ -37,6 +37,9 @@ ls:
If performed on an organization name, will return a list of existing teams
under that organization\. If performed on a team, it will instead return a list
of all users belonging to that particular team\.
+.IP \(bu 2
+edit:
+Edit a current team\.
.RE
.SH DETAILS
diff --git a/deps/npm/man/man1/npm-test.1 b/deps/npm/man/man1/npm-test.1
index d7d8cafe00d2a1..3b52ebe1610fd4 100644
--- a/deps/npm/man/man1/npm-test.1
+++ b/deps/npm/man/man1/npm-test.1
@@ -1,4 +1,4 @@
-.TH "NPM\-TEST" "1" "February 2018" "" ""
+.TH "NPM\-TEST" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-test\fR \- Test a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-token.1 b/deps/npm/man/man1/npm-token.1
index 2e6821dea38d26..796e6ae2c59661 100644
--- a/deps/npm/man/man1/npm-token.1
+++ b/deps/npm/man/man1/npm-token.1
@@ -1,4 +1,4 @@
-.TH "NPM\-TOKEN" "1" "February 2018" "" ""
+.TH "NPM\-TOKEN" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-token\fR \- Manage your authentication tokens
.SH SYNOPSIS
@@ -67,8 +67,7 @@ two\-factor authentication enabled, an otp\.
\fBnpm token revoke \fP:
This removes an authentication token, making it immediately unusable\. This can accept
both complete tokens (as you get back from \fBnpm token create\fP and will
-find in your \fB\|\.npmrc\fP) and ids as seen in the \fBnpm token list\fP output\.
+find in your \fB\|\.npmrc\fP) and ids as seen in the \fBnpm token list\fP output\.
This will NOT accept the truncated token found in \fBnpm token list\fP output\.
.RE
-
diff --git a/deps/npm/man/man1/npm-uninstall.1 b/deps/npm/man/man1/npm-uninstall.1
index f05cae0ba674d6..4f5c95fe53bf4e 100644
--- a/deps/npm/man/man1/npm-uninstall.1
+++ b/deps/npm/man/man1/npm-uninstall.1
@@ -1,4 +1,4 @@
-.TH "NPM\-UNINSTALL" "1" "February 2018" "" ""
+.TH "NPM\-UNINSTALL" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-uninstall\fR \- Remove a package
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-unpublish.1 b/deps/npm/man/man1/npm-unpublish.1
index 90d6e92bf6f810..0a7c8f28a11e9d 100644
--- a/deps/npm/man/man1/npm-unpublish.1
+++ b/deps/npm/man/man1/npm-unpublish.1
@@ -1,4 +1,4 @@
-.TH "NPM\-UNPUBLISH" "1" "February 2018" "" ""
+.TH "NPM\-UNPUBLISH" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-unpublish\fR \- Remove a package from the registry
.SH SYNOPSIS
@@ -26,13 +26,15 @@ If no version is specified, or if all versions are removed then
the root package entry is removed from the registry entirely\.
.P
Even if a package version is unpublished, that specific name and
-version combination can never be reused\. In order to publish the
-package again, a new version number must be used\.
+version combination can never be reused\. In order to publish the
+package again, a new version number must be used\. Additionally,
+new versions of packages with every version unpublished may not
+be republished until 24 hours have passed\.
.P
With the default registry (\fBregistry\.npmjs\.org\fP), unpublish is
-only allowed with versions published in the last 24 hours\. If you
+only allowed with versions published in the last 72 hours\. If you
are trying to unpublish a version published longer ago than that,
-contact support@npmjs\.com\.
+contact support@npmjs\.com\|\.
.P
The scope is optional and follows the usual rules for npm help 7 \fBnpm\-scope\fP\|\.
.SH SEE ALSO
diff --git a/deps/npm/man/man1/npm-update.1 b/deps/npm/man/man1/npm-update.1
index 5ffde2e0b0d431..69013e8a1e9180 100644
--- a/deps/npm/man/man1/npm-update.1
+++ b/deps/npm/man/man1/npm-update.1
@@ -1,4 +1,4 @@
-.TH "NPM\-UPDATE" "1" "February 2018" "" ""
+.TH "NPM\-UPDATE" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-update\fR \- Update a package
.SH SYNOPSIS
@@ -25,13 +25,17 @@ packages\.
If no package name is specified, all packages in the specified location (global
or local) will be updated\.
.P
-As of \fBnpm@2\.6\.1\fP, the \fBnpm update\fP will only inspect top\-level packages\.
-Prior versions of \fBnpm\fP would also recursively inspect all dependencies\.
-To get the old behavior, use \fBnpm \-\-depth 9999 update\fP\|\.
+As of \fBnpm@2\.6\.1\fP, the \fBnpm update\fP will only inspect top\-level packages\.
+Prior versions of \fBnpm\fP would also recursively inspect all dependencies\.
+To get the old behavior, use \fBnpm \-\-depth 9999 update\fP\|\.
+.P
+As of \fBnpm@5\.0\.0\fP, the \fBnpm update\fP will change \fBpackage\.json\fP to save the
+new version as the minimum required dependency\. To get the old behavior,
+use \fBnpm update \-\-no\-save\fP\|\.
.SH EXAMPLES
.P
-IMPORTANT VERSION NOTE: these examples assume \fBnpm@2\.6\.1\fP or later\. For
-older versions of \fBnpm\fP, you must specify \fB\-\-depth 0\fP to get the behavior
+IMPORTANT VERSION NOTE: these examples assume \fBnpm@2\.6\.1\fP or later\. For
+older versions of \fBnpm\fP, you must specify \fB\-\-depth 0\fP to get the behavior
described below\.
.P
For the examples below, assume that the current package is \fBapp\fP and it depends
@@ -67,8 +71,7 @@ If \fBapp\fP\|'s \fBpackage\.json\fP contains:
.fi
.RE
.P
-Then \fBnpm update\fP will install \fBdep1@1\.2\.2\fP, because \fB1\.2\.2\fP is \fBlatest\fP and
-\fB1\.2\.2\fP satisfies \fB^1\.1\.1\fP\|\.
+Then \fBnpm update\fP will install \fBdep1@1\.2\.2\fP, because \fB1\.2\.2\fP is \fBlatest\fP and
+\fB1\.2\.2\fP satisfies \fB^1\.1\.1\fP\|\.
.SS Tilde Dependencies
.P
However, if \fBapp\fP\|'s \fBpackage\.json\fP contains:
@@ -81,10 +84,9 @@ However, if \fBapp\fP\|'s \fBpackage\.json\fP contains:
.fi
.RE
.P
-In this case, running \fBnpm update\fP will install \fBdep1@1\.1\.2\fP\|\. Even though the \fBlatest\fP
-tag points to \fB1\.2\.2\fP, this version does not satisfy \fB~1\.1\.1\fP, which is equivalent
-to \fB>=1\.1\.1 <1\.2\.0\fP\|\. So the highest\-sorting version that satisfies \fB~1\.1\.1\fP is used,
-which is \fB1\.1\.2\fP\|\.
+In this case, running \fBnpm update\fP will install \fBdep1@1\.1\.2\fP\|\. Even though the \fBlatest\fP
+tag points to \fB1\.2\.2\fP, this version does not satisfy \fB~1\.1\.1\fP, which is equivalent
+to \fB>=1\.1\.1 <1\.2\.0\fP\|\. So the highest\-sorting version that satisfies \fB~1\.1\.1\fP is used,
+which is \fB1\.1\.2\fP\|\.
.SS Caret Dependencies below 1\.0\.0
.P
Suppose \fBapp\fP has a caret dependency on a version below \fB1\.0\.0\fP, for example:
@@ -97,8 +99,8 @@ Suppose \fBapp\fP has a caret dependency on a version below \fB1\.0\.0\fP, for e
.fi
.RE
.P
-\fBnpm update\fP will install \fBdep1@0\.2\.0\fP, because there are no other
-versions which satisfy \fB^0\.2\.0\fP\|\.
+\fBnpm update\fP will install \fBdep1@0\.2\.0\fP, because there are no other
+versions which satisfy \fB^0\.2\.0\fP\|\.
.P
If the dependence were on \fB^0\.4\.0\fP:
.P
@@ -110,36 +112,8 @@ If the dependence were on \fB^0\.4\.0\fP:
.fi
.RE
.P
-Then \fBnpm update\fP will install \fBdep1@0\.4\.1\fP, because that is the highest\-sorting
-version that satisfies \fB^0\.4\.0\fP (\fB>= 0\.4\.0 <0\.5\.0\fP)
-.SS Recording Updates with \fB\-\-save\fP
-.P
-When you want to update a package and save the new version as
-the minimum required dependency in \fBpackage\.json\fP, you can use
-\fBnpm update \-S\fP or \fBnpm update \-\-save\fP\|\. For example if
-\fBpackage\.json\fP contains:
-.P
-.RS 2
-.nf
-"dependencies": {
- "dep1": "^1\.1\.1"
-}
-.fi
-.RE
-.P
-Then \fBnpm update \-\-save\fP will install \fBdep1@1\.2\.2\fP (i\.e\., \fBlatest\fP),
-and \fBpackage\.json\fP will be modified:
-.P
-.RS 2
-.nf
-"dependencies": {
- "dep1": "^1\.2\.2"
-}
-.fi
-.RE
-.P
-Note that \fBnpm\fP will only write an updated version to \fBpackage\.json\fP
-if it installs a new package\.
+Then \fBnpm update\fP will install \fBdep1@0\.4\.1\fP, because that is the highest\-sorting
+version that satisfies \fB^0\.4\.0\fP (\fB>= 0\.4\.0 <0\.5\.0\fP)
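The tilde/caret selection rules in the examples above can be sketched in a few lines of JavaScript. This is a simplified, hypothetical sketch, not npm's actual resolver (which uses the `node-semver` package and handles prereleases, x-ranges, hyphen ranges, and more); the function names are invented for illustration, and only plain `x.y.z` versions with `^`/`~` ranges are handled:

```javascript
// Sketch of how tilde and caret ranges constrain an update target.
// Handles only plain x.y.z versions and "^"/"~" prefixed ranges.

function parse(v) {
  return v.split(".").map(Number); // [major, minor, patch]
}

function cmp(a, b) {
  for (let i = 0; i < 3; i++) {
    if (a[i] !== b[i]) return a[i] - b[i];
  }
  return 0;
}

// "~1.1.1" -> >=1.1.1 <1.2.0  (patch-level changes only)
// "^1.1.1" -> >=1.1.1 <2.0.0  (no breaking major changes)
// "^0.2.0" -> >=0.2.0 <0.3.0  (caret below 1.0.0 pins the minor)
function satisfies(version, range) {
  const op = range[0];
  const base = parse(range.slice(1));
  const v = parse(version);
  if (cmp(v, base) < 0) return false;
  const upper = base.slice();
  if (op === "~" || (op === "^" && base[0] === 0)) {
    upper[1] += 1; upper[2] = 0;               // bump minor
  } else {
    upper[0] += 1; upper[1] = 0; upper[2] = 0; // bump major
  }
  return cmp(v, upper) < 0;
}

// Pick the highest published version that satisfies the range,
// mirroring what `npm update` does in the examples above.
function highestSatisfying(versions, range) {
  const ok = versions.filter((v) => satisfies(v, range));
  ok.sort((a, b) => cmp(parse(a), parse(b)));
  return ok[ok.length - 1];
}

const published = ["1.1.1", "1.1.2", "1.2.2"];
console.log(highestSatisfying(published, "^1.1.1")); // "1.2.2"
console.log(highestSatisfying(published, "~1.1.1")); // "1.1.2"
```

This reproduces the doc's examples: `^1.1.1` admits `1.2.2`, while `~1.1.1` caps the range below `1.2.0`, so `1.1.2` is chosen instead.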
.SS Updating Globally\-Installed Packages
.P
\fBnpm update \-g\fP will apply the \fBupdate\fP action to each globally installed
diff --git a/deps/npm/man/man1/npm-version.1 b/deps/npm/man/man1/npm-version.1
index 0f15ccc28bd075..bb61682dc9e292 100644
--- a/deps/npm/man/man1/npm-version.1
+++ b/deps/npm/man/man1/npm-version.1
@@ -1,11 +1,11 @@
-.TH "NPM\-VERSION" "1" "February 2018" "" ""
+.TH "NPM\-VERSION" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-version\fR \- Bump a package version
.SH SYNOPSIS
.P
.RS 2
.nf
-npm version [ | major | minor | patch | premajor | preminor | prepatch | prerelease | from\-git]
+npm version [ | major | minor | patch | premajor | preminor | prepatch | prerelease [\-\-preid=] | from\-git]
\|'npm [\-v | \-\-version]' to print npm version
\|'npm view version' to view a package's published version
@@ -15,7 +15,7 @@ npm version [ | major | minor | patch | premajor | preminor | prepat
.SH DESCRIPTION
.P
Run this in a package directory to bump the version and write the new
-data back to \fBpackage\.json\fP and, if present, \fBnpm\-shrinkwrap\.json\fP\|\.
+data back to \fBpackage\.json\fP, \fBpackage\-lock\.json\fP, and, if present, \fBnpm\-shrinkwrap\.json\fP\|\.
.P
The \fBnewversion\fP argument should be a valid semver string, a
valid second argument to semver\.inc \fIhttps://github\.com/npm/node\-semver#functions\fR (one of \fBpatch\fP, \fBminor\fP, \fBmajor\fP,
@@ -109,7 +109,7 @@ Type: Boolean
.RE
.P
-Prevents throwing an error when \fBnpm version\fP is used to set the new version
+Prevents throwing an error when \fBnpm version\fP is used to set the new version
to the same value as the current version\.
.SS git\-tag\-version
.RS 0
diff --git a/deps/npm/man/man1/npm-view.1 b/deps/npm/man/man1/npm-view.1
index 47609c5e3be818..1863eba6b88c36 100644
--- a/deps/npm/man/man1/npm-view.1
+++ b/deps/npm/man/man1/npm-view.1
@@ -1,4 +1,4 @@
-.TH "NPM\-VIEW" "1" "February 2018" "" ""
+.TH "NPM\-VIEW" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-view\fR \- View registry info
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm-whoami.1 b/deps/npm/man/man1/npm-whoami.1
index d2108627c10cfc..4e7d7f0652a04a 100644
--- a/deps/npm/man/man1/npm-whoami.1
+++ b/deps/npm/man/man1/npm-whoami.1
@@ -1,4 +1,4 @@
-.TH "NPM\-WHOAMI" "1" "February 2018" "" ""
+.TH "NPM\-WHOAMI" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-whoami\fR \- Display npm username
.SH SYNOPSIS
diff --git a/deps/npm/man/man1/npm.1 b/deps/npm/man/man1/npm.1
index 05a9467bf80eb9..d8a04d3e2ef365 100644
--- a/deps/npm/man/man1/npm.1
+++ b/deps/npm/man/man1/npm.1
@@ -1,4 +1,4 @@
-.TH "NPM" "1" "February 2018" "" ""
+.TH "NPM" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm\fR \- javascript package manager
.SH SYNOPSIS
@@ -10,7 +10,7 @@ npm [args]
.RE
.SH VERSION
.P
-5.6.0
+6.4.1
.SH DESCRIPTION
.P
npm is the package manager for the Node JavaScript platform\. It puts
@@ -22,6 +22,15 @@ Most commonly, it is used to publish, discover, install, and develop node
programs\.
.P
Run \fBnpm help\fP to get a list of available commands\.
+.SH IMPORTANT
+.P
+npm is configured to use npm, Inc\.'s public registry at
+https://registry\.npmjs\.org by default\. Use of the npm public registry is
+subject to terms of use available at https://www\.npmjs\.com/policies/terms
+.P
+You can configure npm to use any compatible registry you like, and even run
+your own registry\. Use of someone else's registry may be governed by their
+terms of use\.
.SH INTRODUCTION
.P
You probably got npm because you want to install stuff\.
@@ -54,12 +63,10 @@ In particular, npm has two modes of operation:
.RS 0
.IP \(bu 2
global mode:
-.br
npm installs packages into the install prefix at
\fBprefix/lib/node_modules\fP and bins are installed in \fBprefix/bin\fP\|\.
.IP \(bu 2
local mode:
-.br
npm installs packages into the current project directory, which
defaults to the current working directory\. Packages are installed to
\fB\|\./node_modules\fP, and bins are installed to \fB\|\./node_modules/\.bin\fP\|\.
@@ -102,32 +109,27 @@ npm is extremely configurable\. It reads its configuration options from
.RS 0
.IP \(bu 2
Command line switches:
-.br
Set a config with \fB\-\-key val\fP\|\. All keys take a value, even if they
are booleans (the config parser doesn't know what the options are at
-the time of parsing\.) If no value is provided, then the option is set
+the time of parsing)\. If no value is provided, then the option is set
to boolean \fBtrue\fP\|\.
.IP \(bu 2
Environment Variables:
-.br
Set any config by prefixing the name in an environment variable with
\fBnpm_config_\fP\|\. For example, \fBexport npm_config_key=val\fP\|\.
.IP \(bu 2
User Configs:
-.br
The file at $HOME/\.npmrc is an ini\-formatted list of configs\. If
present, it is parsed\. If the \fBuserconfig\fP option is set in the cli
or env, then that will be used instead\.
.IP \(bu 2
Global Configs:
-.br
The file found at \.\./etc/npmrc (from the node executable, by default
this resolves to /usr/local/etc/npmrc) will be parsed if it is found\.
If the \fBglobalconfig\fP option is set in the cli, env, or user config,
then that file is parsed instead\.
.IP \(bu 2
Defaults:
-.br
npm's default configuration options are defined in
lib/utils/config\-defs\.js\. These must not be changed\.
@@ -137,28 +139,16 @@ See npm help 7 \fBnpm\-config\fP for much much more information\.
.SH CONTRIBUTIONS
.P
Patches welcome!
-.RS 0
-.IP \(bu 2
-code:
-Read through npm help 7 \fBnpm\-coding\-style\fP if you plan to submit code\.
-You don't have to agree with it, but you do have to follow it\.
-.IP \(bu 2
-docs:
-If you find an error in the documentation, edit the appropriate markdown
-file in the "doc" folder\. (Don't worry about generating the man page\.)
-
-.RE
-.P
-Contributors are listed in npm's \fBpackage\.json\fP file\. You can view them
-easily by doing \fBnpm view npm contributors\fP\|\.
.P
If you would like to contribute, but don't know what to work on, read
the contributing guidelines and check the issues list\.
.RS 0
.IP \(bu 2
-https://github\.com/npm/npm/wiki/Contributing\-Guidelines
+CONTRIBUTING\.md \fIhttps://github\.com/npm/cli/blob/latest/CONTRIBUTING\.md\fR
.IP \(bu 2
-https://github\.com/npm/npm/issues
+Bug tracker \fIhttps://npm\.community/c/bugs\fR
+.IP \(bu 2
+Support tracker \fIhttps://npm\.community/c/support\fR
.RE
.SH BUGS
@@ -167,20 +157,19 @@ When you find issues, please report them:
.RS 0
.IP \(bu 2
web:
-https://github\.com/npm/npm/issues
+https://npm\.community/c/bugs
.RE
.P
-Be sure to include \fIall\fR of the output from the npm command that didn't work
-as expected\. The \fBnpm\-debug\.log\fP file is also helpful to provide\.
-.P
-You can also look for isaacs in #node\.js on irc://irc\.freenode\.net\. He
-will no doubt tell you to put the output in a gist or email\.
+Be sure to follow the template and bug reporting guidelines\. You can also ask
+for help in the support forum \fIhttps://npm\.community/c/support\fR if you're
+unsure if it's actually a bug or are having trouble coming up with a detailed
+reproduction to report\.
.SH AUTHOR
.P
Isaac Z\. Schlueter \fIhttp://blog\.izs\.me/\fR ::
isaacs \fIhttps://github\.com/isaacs/\fR ::
-@izs \fIhttp://twitter\.com/izs\fR ::
+@izs \fIhttps://twitter\.com/izs\fR ::
i@izs\.me
.SH SEE ALSO
.RS 0
diff --git a/deps/npm/man/man1/npx.1 b/deps/npm/man/man1/npx.1
index 4a4117bafce59c..d00c489c39b213 100644
--- a/deps/npm/man/man1/npx.1
+++ b/deps/npm/man/man1/npx.1
@@ -1,4 +1,4 @@
-.TH "NPX" "1" "October 2017" "npx@9.7.0" "User Commands"
+.TH "NPX" "1" "April 2018" "npx@10.1.1" "User Commands"
.SH "NAME"
\fBnpx\fR \- execute npm package binaries
.SH SYNOPSIS
@@ -101,6 +101,13 @@ $ npx \-\-node\-arg=\-\-inspect cowsay
Debugger listening on ws://127\.0\.0\.1:9229/\.\.\.\.
.fi
.RE
+.SS Specify a node version to run npm scripts (or anything else!)
+.P
+.RS 2
+.nf
+npx \-p node@8 npm run build
+.fi
+.RE
.SH SHELL AUTO FALLBACK
.P
You can configure \fBnpx\fP to run as your default fallback command when you type something in the command line with an \fB@\fP but the command is not found\. This includes installing packages that were not found in the local prefix either\.
@@ -165,4 +172,3 @@ This work is released by its authors into the public domain under CC0\-1\.0\. Se
\fBnpm\-config(7)\fP
.RE
-
diff --git a/deps/npm/man/man5/npm-folders.5 b/deps/npm/man/man5/npm-folders.5
index b7b9928ee8a628..93b6c855835f13 100644
--- a/deps/npm/man/man5/npm-folders.5
+++ b/deps/npm/man/man5/npm-folders.5
@@ -1,4 +1,4 @@
-.TH "NPM\-FOLDERS" "5" "February 2018" "" ""
+.TH "NPM\-FOLDERS" "5" "August 2018" "" ""
.SH "NAME"
\fBnpm-folders\fR \- Folder Structures Used by npm
.SH DESCRIPTION
@@ -69,7 +69,7 @@ Man pages are not installed on Windows systems\.
.SS Cache
.P
See npm help \fBnpm\-cache\fP\|\. Cache files are stored in \fB~/\.npm\fP on Posix, or
-\fB~/npm\-cache\fP on Windows\.
+\fB%AppData%/npm\-cache\fP on Windows\.
.P
This is controlled by the \fBcache\fP configuration param\.
.SS Temp Files
@@ -173,17 +173,17 @@ foo
.fi
.RE
.P
-Since foo depends directly on \fBbar@1\.2\.3\fP and \fBbaz@1\.2\.3\fP, those are
-installed in foo's \fBnode_modules\fP folder\.
+Since foo depends directly on \fBbar@1\.2\.3\fP and \fBbaz@1\.2\.3\fP, those are
+installed in foo's \fBnode_modules\fP folder\.
.P
Even though the latest copy of blerg is 1\.3\.7, foo has a specific
dependency on version 1\.2\.5\. So, that gets installed at [A]\. Since the
-parent installation of blerg satisfies bar's dependency on \fBblerg@1\.x\fP,
+parent installation of blerg satisfies bar's dependency on \fBblerg@1\.x\fP,
it does not install another copy under [B]\.
.P
Bar [B] also has dependencies on baz and asdf, so those are installed in
-bar's \fBnode_modules\fP folder\. Because it depends on \fBbaz@2\.x\fP, it cannot
-re\-use the \fBbaz@1\.2\.3\fP installed in the parent \fBnode_modules\fP folder [D],
+bar's \fBnode_modules\fP folder\. Because it depends on \fBbaz@2\.x\fP, it cannot
+re\-use the \fBbaz@1\.2\.3\fP installed in the parent \fBnode_modules\fP folder [D],
and must install its own copy [C]\.
.P
Underneath bar, the \fBbaz \-> quux \-> bar\fP dependency creates a cycle\.
diff --git a/deps/npm/man/man5/npm-global.5 b/deps/npm/man/man5/npm-global.5
index b7b9928ee8a628..93b6c855835f13 100644
--- a/deps/npm/man/man5/npm-global.5
+++ b/deps/npm/man/man5/npm-global.5
@@ -1,4 +1,4 @@
-.TH "NPM\-FOLDERS" "5" "February 2018" "" ""
+.TH "NPM\-FOLDERS" "5" "August 2018" "" ""
.SH "NAME"
\fBnpm-folders\fR \- Folder Structures Used by npm
.SH DESCRIPTION
@@ -69,7 +69,7 @@ Man pages are not installed on Windows systems\.
.SS Cache
.P
See npm help \fBnpm\-cache\fP\|\. Cache files are stored in \fB~/\.npm\fP on Posix, or
-\fB~/npm\-cache\fP on Windows\.
+\fB%AppData%/npm\-cache\fP on Windows\.
.P
This is controlled by the \fBcache\fP configuration param\.
.SS Temp Files
@@ -173,17 +173,17 @@ foo
.fi
.RE
.P
-Since foo depends directly on \fBbar@1\.2\.3\fP and \fBbaz@1\.2\.3\fP, those are
-installed in foo's \fBnode_modules\fP folder\.
+Since foo depends directly on \fBbar@1\.2\.3\fP and \fBbaz@1\.2\.3\fP, those are
+installed in foo's \fBnode_modules\fP folder\.
.P
Even though the latest copy of blerg is 1\.3\.7, foo has a specific
dependency on version 1\.2\.5\. So, that gets installed at [A]\. Since the
-parent installation of blerg satisfies bar's dependency on \fBblerg@1\.x\fP,
+parent installation of blerg satisfies bar's dependency on \fBblerg@1\.x\fP,
it does not install another copy under [B]\.
.P
Bar [B] also has dependencies on baz and asdf, so those are installed in
-bar's \fBnode_modules\fP folder\. Because it depends on \fBbaz@2\.x\fP, it cannot
-re\-use the \fBbaz@1\.2\.3\fP installed in the parent \fBnode_modules\fP folder [D],
+bar's \fBnode_modules\fP folder\. Because it depends on \fBbaz@2\.x\fP, it cannot
+re\-use the \fBbaz@1\.2\.3\fP installed in the parent \fBnode_modules\fP folder [D],
and must install its own copy [C]\.
.P
Underneath bar, the \fBbaz \-> quux \-> bar\fP dependency creates a cycle\.
diff --git a/deps/npm/man/man5/npm-json.5 b/deps/npm/man/man5/npm-json.5
index dd7c36bf2dabde..efa8cafa0697bd 100644
--- a/deps/npm/man/man5/npm-json.5
+++ b/deps/npm/man/man5/npm-json.5
@@ -1,4 +1,4 @@
-.TH "PACKAGE\.JSON" "5" "February 2018" "" ""
+.TH "PACKAGE\.JSON" "5" "August 2018" "" ""
.SH "NAME"
\fBpackage.json\fR \- Specifics of npm's package\.json handling
.SH DESCRIPTION
@@ -10,11 +10,11 @@ A lot of the behavior described in this document is affected by the config
settings described in npm help 7 \fBnpm\-config\fP\|\.
.SH name
.P
-The \fImost\fR important things in your package\.json are the name and version fields\.
-Those are actually required, and your package won't install without
-them\. The name and version together form an identifier that is assumed
-to be completely unique\. Changes to the package should come along with
-changes to the version\.
+If you plan to publish your package, the \fImost\fR important things in your
+package\.json are the name and version fields as they will be required\. The name
+and version together form an identifier that is assumed to be completely unique\.
+Changes to the package should come along with changes to the version\. If you don't
+plan to publish your package, the name and version fields are optional\.
.P
The name is what your thing is called\.
.P
@@ -54,11 +54,11 @@ A name can be optionally prefixed by a scope, e\.g\. \fB@myorg/mypackage\fP\|\.
npm help 7 \fBnpm\-scope\fP for more detail\.
.SH version
.P
-The \fImost\fR important things in your package\.json are the name and version fields\.
-Those are actually required, and your package won't install without
-them\. The name and version together form an identifier that is assumed
-to be completely unique\. Changes to the package should come along with
-changes to the version\.
+If you plan to publish your package, the \fImost\fR important things in your
+package\.json are the name and version fields as they will be required\. The name
+and version together form an identifier that is assumed to be completely unique\.
+Changes to the package should come along with changes to the version\. If you don't
+plan to publish your package, the name and version fields are optional\.
.P
Version must be parseable by
node\-semver \fIhttps://github\.com/isaacs/node\-semver\fR, which is bundled
@@ -76,6 +76,14 @@ discover your package as it's listed in \fBnpm search\fP\|\.
.SH homepage
.P
The url to the project homepage\.
+.P
+Example:
+.P
+.RS 2
+.nf
+"homepage": "https://github\.com/owner/project#readme"
+.fi
+.RE
.SH bugs
.P
The url to your project's issue tracker and / or the email address to which
@@ -115,7 +123,7 @@ Ideally you should pick one that is
OSI \fIhttps://opensource\.org/licenses/alphabetical\fR approved\.
.P
If your package is licensed under multiple common licenses, use an SPDX license
-expression syntax version 2\.0 string \fIhttps://npmjs\.com/package/spdx\fR, like this:
+expression syntax version 2\.0 string \fIhttps://www\.npmjs\.com/package/spdx\fR, like this:
.P
.RS 2
.nf
@@ -207,13 +215,15 @@ Both email and url are optional either way\.
npm also sets a top\-level "maintainers" field with your npm user info\.
.SH files
.P
-The optional "files" field is an array of file patterns that describes
+The optional \fBfiles\fP field is an array of file patterns that describes
the entries to be included when your package is installed as a
-dependency\. If the files array is omitted, everything except
-automatically\-excluded files will be included in your publish\. If you
-name a folder in the array, then it will also include the files inside
-that folder (unless they would be ignored by another rule in this
-section\.)\.
+dependency\. File patterns follow a similar syntax to \fB\|\.gitignore\fP, but
+reversed: including a file, directory, or glob pattern (\fB*\fP, \fB**/*\fP, and such)
+will make it so that file is included in the tarball when it's packed\. Omitting
+the field will make it default to \fB["*"]\fP, which means it will include all files\.
+.P
+Some special files and directories are also included or excluded regardless of
+whether they exist in the \fBfiles\fP array (see below)\.
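The reversed-`.gitignore` semantics described in that hunk can be illustrated with a small, hypothetical `package.json` fragment (the paths `dist` and `lib/**/*.js` are invented for the example): naming a directory includes everything under it, and glob patterns include matching files, while omitting the field entirely defaults to `["*"]`:

```json
{
  "files": [
    "dist",
    "lib/**/*.js"
  ]
}
```

Special files such as `package.json` and the README are included regardless of this list, as the surrounding text notes.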
.P
You can also provide a \fB\|\.npmignore\fP file in the root of your package or
in subdirectories, which will keep files from being included\. At the
@@ -288,6 +298,11 @@ This should be a module ID relative to the root of your package folder\.
.P
For most modules, it makes the most sense to have a main script and often not
much else\.
+.SH browser
+.P
+If your module is meant to be used client\-side, the browser field should be
+used instead of the main field\. This is helpful to hint users that it might
+rely on primitives that aren't available in Node\.js modules\. (e\.g\. \fBwindow\fP)
.SH bin
.P
A lot of packages have one or more executable files that they'd like to
@@ -432,15 +447,15 @@ Do it like this:
.P
.RS 2
.nf
-"repository" :
- { "type" : "git"
- , "url" : "https://github\.com/npm/npm\.git"
- }
+"repository": {
+ "type" : "git",
+ "url" : "https://github\.com/npm/cli\.git"
+}
-"repository" :
- { "type" : "svn"
- , "url" : "https://v8\.googlecode\.com/svn/trunk/"
- }
+"repository": {
+ "type" : "svn",
+ "url" : "https://v8\.googlecode\.com/svn/trunk/"
+}
.fi
.RE
.P
@@ -590,10 +605,10 @@ Examples:
.P
.RS 2
.nf
-git+ssh://git@github\.com:npm/npm\.git#v1\.0\.27
-git+ssh://git@github\.com:npm/npm#semver:^5\.0
-git+https://isaacs@github\.com/npm/npm\.git
-git://github\.com/npm/npm\.git#v1\.0\.27
+git+ssh://git@github\.com:npm/cli\.git#v1\.0\.27
+git+ssh://git@github\.com:npm/cli#semver:^5\.0
+git+https://isaacs@github\.com/npm/cli\.git
+git://github\.com/npm/cli\.git#v1\.0\.27
.fi
.RE
.SS GitHub URLs
@@ -730,7 +745,7 @@ Trying to install another plugin with a conflicting requirement will cause an
error\. For this reason, make sure your plugin requirement is as broad as
possible, and not to lock it down to specific patch versions\.
.P
-Assuming the host complies with semver \fIhttp://semver\.org/\fR, only changes in
+Assuming the host complies with semver \fIhttps://semver\.org/\fR, only changes in
the host package's major version will break your plugin\. Thus, if you've worked
with every 1\.x version of the host package, use \fB"^1\.0"\fP or \fB"1\.x"\fP to express
this\. If you depend on features introduced in 1\.5\.2, use \fB">= 1\.5\.2 < 2"\fP\|\.
@@ -901,8 +916,8 @@ especially handy if you want to set the tag, registry or access, so that
you can ensure that a given package is not tagged with "latest", published
to the global public registry or that a scoped module is private by default\.
.P
-Any config values can be overridden, but of course only "tag", "registry" and
-"access" probably matter for the purposes of publishing\.
+Any config values can be overridden, but only "tag", "registry" and "access"
+probably matter for the purposes of publishing\.
.P
See npm help 7 \fBnpm\-config\fP to see the list of config options that can be
overridden\.
diff --git a/deps/npm/man/man5/npm-package-locks.5 b/deps/npm/man/man5/npm-package-locks.5
index cd3f4bfeaa7852..e7329365f97f47 100644
--- a/deps/npm/man/man5/npm-package-locks.5
+++ b/deps/npm/man/man5/npm-package-locks.5
@@ -1,4 +1,4 @@
-.TH "NPM\-PACKAGE\-LOCKS" "5" "February 2018" "" ""
+.TH "NPM\-PACKAGE\-LOCKS" "5" "August 2018" "" ""
.SH "NAME"
\fBnpm-package-locks\fR \- An explanation of npm lockfiles
.SH DESCRIPTION
@@ -71,7 +71,7 @@ A@0\.1\.0
.fi
.RE
.P
-However, if B@0\.0\.2 is published, then a fresh \fBnpm install A\fP will
+However, if \fBB@0\.0\.2\fP is published, then a fresh \fBnpm install A\fP will
install:
.P
.RS 2
@@ -85,7 +85,7 @@ A@0\.1\.0
assuming the new version did not modify B's dependencies\. Of course,
the new version of B could include a new version of C and any number
of new dependencies\. If such changes are undesirable, the author of A
-could specify a dependency on B@0\.0\.1\. However, if A's author and B's
+could specify a dependency on \fBB@0\.0\.1\fP\|\. However, if A's author and B's
author are not the same person, there's no way for A's author to say
that he or she does not want to pull in newly published versions of C
when B hasn't changed at all\.
@@ -167,10 +167,25 @@ package source to get the exact same dependency tree that you were developing
on\. Additionally, the diffs from these changes are human\-readable and will
inform you of any changes npm has made to your \fBnode_modules\fP, so you can notice
if any transitive dependencies were updated, hoisted, etc\.
+.SS Resolving lockfile conflicts
+.P
+Occasionally, two separate npm install will create package locks that cause
+merge conflicts in source control systems\. As of \fBnpm@5\.7\.0\fP, these conflicts
+can be resolved by manually fixing any \fBpackage\.json\fP conflicts, and then
+running \fBnpm install [\-\-package\-lock\-only]\fP again\. npm will automatically
+resolve any conflicts for you and write a merged package lock that includes all
+the dependencies from both branches in a reasonable tree\. If
+\fB\-\-package\-lock\-only\fP is provided, it will do this without also modifying your
+local \fBnode_modules/\fP\|\.
+.P
+To make this process seamless on git, consider installing
+\fBnpm\-merge\-driver\fP \fIhttps://npm\.im/npm\-merge\-driver\fR, which will teach git how
+to do this itself without any user interaction\. In short: \fB$ npx
+npm\-merge\-driver install \-g\fP will let you do this, and even works with
+pre\-\fBnpm@5\.7\.0\fP versions of npm 5, albeit a bit more noisily\. Note that if
+\fBpackage\.json\fP itself conflicts, you will have to resolve that by hand and
+run \fBnpm install\fP manually, even with the merge driver\.
.SH SEE ALSO
.RS 0
.IP \(bu 2
-https://medium\.com/@sdboyer/so\-you\-want\-to\-write\-a\-package\-manager\-4ae9c17d9527
+https://medium\.com/@sdboyer/so\-you\-want\-to\-write\-a\-package\-manager\-4ae9c17d9527
.IP \(bu 2
npm help 5 package\.json
.IP \(bu 2
@@ -181,4 +196,3 @@ npm help 5 shrinkwrap\.json
npm help shrinkwrap
.RE
-
diff --git a/deps/npm/man/man5/npm-shrinkwrap.json.5 b/deps/npm/man/man5/npm-shrinkwrap.json.5
index 92cf65b526f518..1663a5274bc3f9 100644
--- a/deps/npm/man/man5/npm-shrinkwrap.json.5
+++ b/deps/npm/man/man5/npm-shrinkwrap.json.5
@@ -1,4 +1,4 @@
-.TH "NPM\-SHRINKWRAP\.JSON" "5" "February 2018" "" ""
+.TH "NPM\-SHRINKWRAP\.JSON" "5" "August 2018" "" ""
.SH "NAME"
\fBnpm-shrinkwrap.json\fR \- A publishable lockfile
.SH DESCRIPTION
@@ -30,4 +30,3 @@ npm help 5 package\.json
npm help install
.RE
-
diff --git a/deps/npm/man/man5/npmrc.5 b/deps/npm/man/man5/npmrc.5
index c70b95af669ef8..addad5e27434d8 100644
--- a/deps/npm/man/man5/npmrc.5
+++ b/deps/npm/man/man5/npmrc.5
@@ -1,4 +1,4 @@
-.TH "NPMRC" "5" "February 2018" "" ""
+.TH "NPMRC" "5" "August 2018" "" ""
.SH "NAME"
\fBnpmrc\fR \- The npm config files
.SH DESCRIPTION
diff --git a/deps/npm/man/man5/package-lock.json.5 b/deps/npm/man/man5/package-lock.json.5
index 0a57eea26e4577..172b659e432e46 100644
--- a/deps/npm/man/man5/package-lock.json.5
+++ b/deps/npm/man/man5/package-lock.json.5
@@ -1,4 +1,4 @@
-.TH "PACKAGE\-LOCK\.JSON" "5" "February 2018" "" ""
+.TH "PACKAGE\-LOCK\.JSON" "5" "August 2018" "" ""
.SH "NAME"
\fBpackage-lock.json\fR \- A manifestation of the manifest
.SH DESCRIPTION
@@ -127,6 +127,12 @@ transitive dependency of a non\-optional dependency of the top level\.
.P
All optional dependencies should be included even if they're uninstallable
on the current platform\.
+.SS requires
+.P
+This is a mapping of module name to version\. This is a list of everything
+this module requires, regardless of where it will be installed\. The version
+should match via normal matching rules a dependency either in our
+\fBdependencies\fP or in a level higher than us\.
.SS dependencies
.P
The dependencies of this dependency, exactly as at the top level\.
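The relationship between `requires` and a nested `dependencies` entry described in that hunk can be sketched with a hypothetical package-lock fragment (module names `b` and `c` are invented for illustration): `requires` records what `b` needs regardless of where it ends up on disk, and the version that actually resolves it, here `c@2.1.0` satisfying `^2.0.0`, may live either nested under `b` or at a higher level of the tree:

```json
"b": {
  "version": "1.2.3",
  "requires": {
    "c": "^2.0.0"
  },
  "dependencies": {
    "c": {
      "version": "2.1.0"
    }
  }
}
```

A real lockfile entry would also carry `resolved` and `integrity` fields, omitted here since their values are registry-specific.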
@@ -137,9 +143,10 @@ npm help shrinkwrap
.IP \(bu 2
npm help 5 shrinkwrap\.json
.IP \(bu 2
+npm help 5 package\-locks
+.IP \(bu 2
npm help 5 package\.json
.IP \(bu 2
npm help install
.RE
-
diff --git a/deps/npm/man/man5/package.json.5 b/deps/npm/man/man5/package.json.5
index dd7c36bf2dabde..efa8cafa0697bd 100644
--- a/deps/npm/man/man5/package.json.5
+++ b/deps/npm/man/man5/package.json.5
@@ -1,4 +1,4 @@
-.TH "PACKAGE\.JSON" "5" "February 2018" "" ""
+.TH "PACKAGE\.JSON" "5" "August 2018" "" ""
.SH "NAME"
\fBpackage.json\fR \- Specifics of npm's package\.json handling
.SH DESCRIPTION
@@ -10,11 +10,11 @@ A lot of the behavior described in this document is affected by the config
settings described in npm help 7 \fBnpm\-config\fP\|\.
.SH name
.P
-The \fImost\fR important things in your package\.json are the name and version fields\.
-Those are actually required, and your package won't install without
-them\. The name and version together form an identifier that is assumed
-to be completely unique\. Changes to the package should come along with
-changes to the version\.
+If you plan to publish your package, the \fImost\fR important things in your
+package\.json are the name and version fields as they will be required\. The name
+and version together form an identifier that is assumed to be completely unique\.
+Changes to the package should come along with changes to the version\. If you don't
+plan to publish your package, the name and version fields are optional\.
.P
The name is what your thing is called\.
.P
@@ -54,11 +54,11 @@ A name can be optionally prefixed by a scope, e\.g\. \fB@myorg/mypackage\fP\|\.
npm help 7 \fBnpm\-scope\fP for more detail\.
.SH version
.P
-The \fImost\fR important things in your package\.json are the name and version fields\.
-Those are actually required, and your package won't install without
-them\. The name and version together form an identifier that is assumed
-to be completely unique\. Changes to the package should come along with
-changes to the version\.
+If you plan to publish your package, the \fImost\fR important things in your
+package\.json are the name and version fields as they will be required\. The name
+and version together form an identifier that is assumed to be completely unique\.
+Changes to the package should come along with changes to the version\. If you don't
+plan to publish your package, the name and version fields are optional\.
.P
Version must be parseable by
node\-semver \fIhttps://github\.com/isaacs/node\-semver\fR, which is bundled
@@ -76,6 +76,14 @@ discover your package as it's listed in \fBnpm search\fP\|\.
.SH homepage
.P
The url to the project homepage\.
+.P
+Example:
+.P
+.RS 2
+.nf
+"homepage": "https://github\.com/owner/project#readme"
+.fi
+.RE
.SH bugs
.P
The url to your project's issue tracker and / or the email address to which
@@ -115,7 +123,7 @@ Ideally you should pick one that is
OSI \fIhttps://opensource\.org/licenses/alphabetical\fR approved\.
.P
If your package is licensed under multiple common licenses, use an SPDX license
-expression syntax version 2\.0 string \fIhttps://npmjs\.com/package/spdx\fR, like this:
+expression syntax version 2\.0 string \fIhttps://www\.npmjs\.com/package/spdx\fR, like this:
.P
.RS 2
.nf
@@ -207,13 +215,15 @@ Both email and url are optional either way\.
npm also sets a top\-level "maintainers" field with your npm user info\.
.SH files
.P
-The optional "files" field is an array of file patterns that describes
+The optional \fBfiles\fP field is an array of file patterns that describes
the entries to be included when your package is installed as a
-dependency\. If the files array is omitted, everything except
-automatically\-excluded files will be included in your publish\. If you
-name a folder in the array, then it will also include the files inside
-that folder (unless they would be ignored by another rule in this
-section\.)\.
+dependency\. File patterns follow a similar syntax to \fB\|\.gitignore\fP, but
+reversed: including a file, directory, or glob pattern (\fB*\fP, \fB**/*\fP, and such)
+will make it so that file is included in the tarball when it's packed\. Omitting
+the field will make it default to \fB["*"]\fP, which means it will include all files\.
+.P
+Some special files and directories are also included or excluded regardless of
+whether they exist in the \fBfiles\fP array (see below)\.
.P
You can also provide a \fB\|\.npmignore\fP file in the root of your package or
in subdirectories, which will keep files from being included\. At the
@@ -288,6 +298,11 @@ This should be a module ID relative to the root of your package folder\.
.P
For most modules, it makes the most sense to have a main script and often not
much else\.
+.SH browser
+.P
+If your module is meant to be used client\-side, the browser field should be
+used instead of the main field\. This is helpful to hint to users that it might
+rely on primitives that aren't available in Node\.js modules (e\.g\. \fBwindow\fP)\.
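A minimal sketch of a package\.json using the browser field alongside main (the package name and file paths here are hypothetical):

```json
{
  "name": "my-widget",
  "version": "1.0.0",
  "main": "lib/node-entry.js",
  "browser": "lib/browser-entry.js"
}
```

Bundlers that understand the browser field will resolve \fBlib/browser\-entry\.js\fP, while Node\.js continues to load \fBlib/node\-entry\.js\fP\|\.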
.SH bin
.P
A lot of packages have one or more executable files that they'd like to
@@ -432,15 +447,15 @@ Do it like this:
.P
.RS 2
.nf
-"repository" :
- { "type" : "git"
- , "url" : "https://github\.com/npm/npm\.git"
- }
+"repository": {
+ "type" : "git",
+ "url" : "https://github\.com/npm/cli\.git"
+}
-"repository" :
- { "type" : "svn"
- , "url" : "https://v8\.googlecode\.com/svn/trunk/"
- }
+"repository": {
+ "type" : "svn",
+ "url" : "https://v8\.googlecode\.com/svn/trunk/"
+}
.fi
.RE
.P
@@ -590,10 +605,10 @@ Examples:
.P
.RS 2
.nf
-git+ssh://git@github\.com:npm/npm\.git#v1\.0\.27
-git+ssh://git@github\.com:npm/npm#semver:^5\.0
-git+https://isaacs@github\.com/npm/npm\.git
-git://github\.com/npm/npm\.git#v1\.0\.27
+git+ssh://git@github\.com:npm/cli\.git#v1\.0\.27
+git+ssh://git@github\.com:npm/cli#semver:^5\.0
+git+https://isaacs@github\.com/npm/cli\.git
+git://github\.com/npm/cli\.git#v1\.0\.27
.fi
.RE
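As an illustration (the dependency name and tag are hypothetical), one of the git URLs above would appear in a dependencies map like this:

```json
{
  "dependencies": {
    "cli": "git+ssh://git@github.com:npm/cli.git#v1.0.27"
  }
}
```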
.SS GitHub URLs
@@ -730,7 +745,7 @@ Trying to install another plugin with a conflicting requirement will cause an
error\. For this reason, make sure your plugin requirement is as broad as
possible, and not to lock it down to specific patch versions\.
.P
-Assuming the host complies with semver \fIhttp://semver\.org/\fR, only changes in
+Assuming the host complies with semver \fIhttps://semver\.org/\fR, only changes in
the host package's major version will break your plugin\. Thus, if you've worked
with every 1\.x version of the host package, use \fB"^1\.0"\fP or \fB"1\.x"\fP to express
this\. If you depend on features introduced in 1\.5\.2, use \fB">= 1\.5\.2 < 2"\fP\|\.
@@ -901,8 +916,8 @@ especially handy if you want to set the tag, registry or access, so that
you can ensure that a given package is not tagged with "latest", published
to the global public registry or that a scoped module is private by default\.
.P
-Any config values can be overridden, but of course only "tag", "registry" and
-"access" probably matter for the purposes of publishing\.
+Any config values can be overridden, but only "tag", "registry" and "access"
+probably matter for the purposes of publishing\.
.P
See npm help 7 \fBnpm\-config\fP to see the list of config options that can be
overridden\.
diff --git a/deps/npm/man/man7/npm-coding-style.7 b/deps/npm/man/man7/npm-coding-style.7
index b562f77b9e8f07..455c30d9d2a9cc 100644
--- a/deps/npm/man/man7/npm-coding-style.7
+++ b/deps/npm/man/man7/npm-coding-style.7
@@ -1,4 +1,4 @@
-.TH "NPM\-CODING\-STYLE" "7" "February 2018" "" ""
+.TH "NPM\-CODING\-STYLE" "7" "August 2018" "" ""
.SH "NAME"
\fBnpm-coding-style\fR \- npm's "funny" coding style
.SH DESCRIPTION
@@ -147,7 +147,7 @@ var alsoOk = "String contains 'single' quotes or apostrophe"
.RE
.SH Whitespace
.P
-Put a single space in front of ( for anything other than a function call\.
+Put a single space in front of \fB(\fP for anything other than a function call\.
Also use a single space wherever it makes things more readable\.
.P
Don't leave trailing whitespace at the end of lines\. Don't indent empty
diff --git a/deps/npm/man/man7/npm-config.7 b/deps/npm/man/man7/npm-config.7
index 4748fb823a2ad4..22847a0443c23f 100644
--- a/deps/npm/man/man7/npm-config.7
+++ b/deps/npm/man/man7/npm-config.7
@@ -1,4 +1,4 @@
-.TH "NPM\-CONFIG" "7" "February 2018" "" ""
+.TH "NPM\-CONFIG" "7" "August 2018" "" ""
.SH "NAME"
\fBnpm-config\fR \- More than you probably want to know about npm configuration
.SH DESCRIPTION
@@ -211,6 +211,29 @@ Type: String
.P
When "dev" or "development" and running local \fBnpm shrinkwrap\fP,
\fBnpm outdated\fP, or \fBnpm update\fP, is an alias for \fB\-\-dev\fP\|\.
+.SS audit
+.RS 0
+.IP \(bu 2
+Default: true
+.IP \(bu 2
+Type: Boolean
+
+.RE
+.P
+When "true" submit audit reports alongside \fBnpm install\fP runs to the default
+registry and all registries configured for scopes\. See the documentation
+for npm help audit for details on what is submitted\.
+.SS audit\-level
+.RS 0
+.IP \(bu 2
+Default: \fB"low"\fP
+.IP \(bu 2
+Type: \fB\|'low'\fP, \fB\|'moderate'\fP, \fB\|'high'\fP, \fB\|'critical'\fP
+
+.RE
+.P
+The minimum level of vulnerability for \fBnpm audit\fP to exit with
+a non\-zero exit code\.
.SS auth\-type
.RS 0
.IP \(bu 2
@@ -394,6 +417,9 @@ Type: Boolean or \fB"always"\fP
.P
If false, never shows colors\. If \fB"always"\fP then always shows colors\.
If true, then only prints color codes for tty file descriptors\.
+.P
+This option can also be changed using the environment: colors are
+disabled when the environment variable \fBNO_COLOR\fP is set to any value\.
.SS depth
.RS 0
.IP \(bu 2
@@ -442,8 +468,8 @@ Type: Boolean
Indicates that you don't want npm to make any changes and that it should
only report what it would have done\. This can be passed into any of the
commands that modify your local installation, eg, \fBinstall\fP, \fBupdate\fP,
-\fBdedupe\fP, \fBuninstall\fP\|\. This is NOT currently honored by network related
-commands, eg \fBdist\-tags\fP, \fBowner\fP, \fBpublish\fP, etc\.
+\fBdedupe\fP, \fBuninstall\fP\|\. This is NOT currently honored by some network related
+commands, eg \fBdist\-tags\fP, \fBowner\fP, etc\.
.SS editor
.RS 0
.IP \(bu 2
@@ -902,7 +928,7 @@ Any "%s" in the message will be replaced with the version number\.
.SS metrics\-registry
.RS 0
.IP \(bu 2
-Default: The value of \fBregistry\fP (which defaults to "https://registry\.npmjs\.org/")
+Default: The value of \fBregistry\fP (which defaults to "https://registry\.npmjs\.org/")
.IP \(bu 2
Type: String
@@ -931,6 +957,16 @@ Type: semver or false
.RE
.P
The node version to use when checking a package's \fBengines\fP map\.
+.SS noproxy
+.RS 0
+.IP \(bu 2
+Default: null
+.IP \(bu 2
+Type: String or Array
+
+.RE
+.P
+A comma\-separated string or an array of domain extensions that a proxy should not be used for\.
.SS offline
.RS 0
.IP \(bu 2
@@ -1009,6 +1045,10 @@ Type: Boolean
If set to false, then ignore \fBpackage\-lock\.json\fP files when installing\. This
will also prevent \fIwriting\fR \fBpackage\-lock\.json\fP if \fBsave\fP is true\.
.P
+When package\-locks are disabled, automatic pruning of extraneous
+modules will also be disabled\. To remove extraneous modules with
+package\-locks disabled use \fBnpm prune\fP\|\.
+.P
This option is an alias for \fB\-\-shrinkwrap\fP\|\.
.SS package\-lock\-only
.RS 0
@@ -1019,7 +1059,7 @@ Type: Boolean
.RE
.P
-If set to true, it will update only the \fBpackage\-json\fP,
+If set to true, it will update only the \fBpackage\-lock\.json\fP,
instead of checking \fBnode_modules\fP and downloading dependencies\.
.SS parseable
.RS 0
@@ -1067,6 +1107,17 @@ Type: path
.P
The location to install global items\. If set on the command line, then
it forces non\-global commands to run in the specified folder\.
+.SS preid
+.RS 0
+.IP \(bu 2
+Default: ""
+.IP \(bu 2
+Type: String
+
+.RE
+.P
+The "prerelease identifier" to use as a prefix for the "prerelease" part of a
+semver\. Like the \fBrc\fP in \fB1\.2\.0\-rc\.8\fP\|\.
.SS production
.RS 0
.IP \(bu 2
@@ -1133,7 +1184,7 @@ Rebuild bundled dependencies after installation\.
.SS registry
.RS 0
.IP \(bu 2
-Default: https://registry\.npmjs\.org/
+Default: https://registry\.npmjs\.org/
.IP \(bu 2
Type: url
@@ -1153,7 +1204,7 @@ Remove failed installs\.
.SS save
.RS 0
.IP \(bu 2
-Default: false
+Default: true
.IP \(bu 2
Type: Boolean
@@ -1379,6 +1430,20 @@ If set to false, then ignore \fBnpm\-shrinkwrap\.json\fP files when installing\.
will also prevent \fIwriting\fR \fBnpm\-shrinkwrap\.json\fP if \fBsave\fP is true\.
.P
This option is an alias for \fB\-\-package\-lock\fP\|\.
+.SS sign\-git\-commit
+.RS 0
+.IP \(bu 2
+Default: false
+.IP \(bu 2
+Type: Boolean
+
+.RE
+.P
+If set to true, then the \fBnpm version\fP command will commit the new package
+version using \fB\-S\fP to add a signature\.
+.P
+Note that git requires you to have set up GPG keys in your git configs
+for this to work properly\.
.SS sign\-git\-tag
.RS 0
.IP \(bu 2
@@ -1505,6 +1570,17 @@ Type: Boolean
Set to true to suppress the UID/GID switching when running package
scripts\. If set explicitly to false, then installing as a non\-root user
will fail\.
+.SS update\-notifier
+.RS 0
+.IP \(bu 2
+Default: true
+.IP \(bu 2
+Type: Boolean
+
+.RE
+.P
+Set to false to suppress the update notification when using an older
+version of npm than the latest\.
.SS usage
.RS 0
.IP \(bu 2
diff --git a/deps/npm/man/man7/npm-developers.7 b/deps/npm/man/man7/npm-developers.7
index edd4abba388f50..22ccbd70d2efaa 100644
--- a/deps/npm/man/man7/npm-developers.7
+++ b/deps/npm/man/man7/npm-developers.7
@@ -1,4 +1,4 @@
-.TH "NPM\-DEVELOPERS" "7" "February 2018" "" ""
+.TH "NPM\-DEVELOPERS" "7" "August 2018" "" ""
.SH "NAME"
\fBnpm-developers\fR \- Developer Guide
.SH DESCRIPTION
@@ -68,7 +68,7 @@ This should be a string that identifies your project\. Please do not
use the name to specify that it runs on node, or is in JavaScript\.
You can use the "engines" field to explicitly state the versions of
node (or whatever else) that your program requires, and it's pretty
-well assumed that it's javascript\.
+well assumed that it's JavaScript\.
It does not necessarily need to match your github repository name\.
So, \fBnode\-foo\fP and \fBbar\-js\fP are bad names\. \fBfoo\fP or \fBbar\fP are better\.
.IP \(bu 2
diff --git a/deps/npm/man/man7/npm-disputes.7 b/deps/npm/man/man7/npm-disputes.7
index b4bb49d0b23f79..23cd81e0b546f4 100644
--- a/deps/npm/man/man7/npm-disputes.7
+++ b/deps/npm/man/man7/npm-disputes.7
@@ -1,4 +1,4 @@
-.TH "NPM\-DISPUTES" "7" "February 2018" "" ""
+.TH "NPM\-DISPUTES" "7" "August 2018" "" ""
.SH "NAME"
\fBnpm-disputes\fR \- Handling Module Name Disputes
.P
@@ -46,7 +46,7 @@ publishes it to the npm registry\. Being a simple little thing, it never
really has to be updated\. Alice works for Foo Inc, the makers of the
critically acclaimed and widely\-marketed \fBfoo\fP JavaScript toolkit framework\.
They publish it to npm as \fBfoojs\fP, but people are routinely confused when
-\fBnpm install\fPfoo`` is some different thing\.
+\fBnpm install foo\fP is some different thing\.
.IP 4. 3
Yusuf writes a parser for the widely\-known \fBfoo\fP file format, because he
needs it for work\. Then, he gets a new job, and never updates the prototype\.
@@ -120,8 +120,8 @@ here to help\.\fR
.P
If you think another npm publisher is infringing your trademark, such as by
using a confusingly similar package name, email abuse@npmjs\.com with a link to
-the package or user account on \fIhttps://npmjs\.com\fR\|\. Attach a
-copy of your trademark registration certificate\.
+the package or user account on https://www\.npmjs\.com/ \fIhttps://www\.npmjs\.com/\fR\|\.
+Attach a copy of your trademark registration certificate\.
.P
If we see that the package's publisher is intentionally misleading others by
misusing your registered mark without permission, we will transfer the package
@@ -131,7 +131,7 @@ metadata\.
.SH CHANGES
.P
This is a living document and may be updated from time to time\. Please refer to
-the git history for this document \fIhttps://github\.com/npm/npm/commits/master/doc/misc/npm\-disputes\.md\fR
+the git history for this document \fIhttps://github\.com/npm/cli/commits/latest/doc/misc/npm\-disputes\.md\fR
to view the changes\.
.SH LICENSE
.P
diff --git a/deps/npm/man/man7/npm-index.7 b/deps/npm/man/man7/npm-index.7
index 515c1a3f0b807a..209d6ae4093c39 100644
--- a/deps/npm/man/man7/npm-index.7
+++ b/deps/npm/man/man7/npm-index.7
@@ -1,4 +1,4 @@
-.TH "NPM\-INDEX" "7" "February 2018" "" ""
+.TH "NPM\-INDEX" "7" "August 2018" "" ""
.SH "NAME"
\fBnpm-index\fR \- Index of all npm documentation
.SS npm help README
@@ -16,6 +16,9 @@ Set access level on published packages
.SS npm help adduser
.P
Add a registry user account
+.SS npm help audit
+.P
+Run a security audit
.SS npm help bin
.P
Display npm bin folder
@@ -31,6 +34,9 @@ REMOVED
.SS npm help cache
.P
Manipulates packages cache
+.SS npm help ci
+.P
+Install a project with a clean slate
.SS npm help completion
.P
Tab Completion for npm
@@ -64,9 +70,15 @@ Search npm help documentation
.SS npm help help
.P
Get help on npm
+.SS npm help hook
+.P
+Manage registry hooks
.SS npm help init
.P
-Interactively create a package\.json file
+Create a package\.json file
+.SS npm help install\-ci\-test
+.P
+Install a project with a clean slate and run tests
.SS npm help install\-test
.P
Install package(s) and run tests
diff --git a/deps/npm/man/man7/npm-orgs.7 b/deps/npm/man/man7/npm-orgs.7
index 1c6193835e9e9f..db865891615f5e 100644
--- a/deps/npm/man/man7/npm-orgs.7
+++ b/deps/npm/man/man7/npm-orgs.7
@@ -1,4 +1,4 @@
-.TH "NPM\-ORGS" "7" "February 2018" "" ""
+.TH "NPM\-ORGS" "7" "August 2018" "" ""
.SH "NAME"
\fBnpm-orgs\fR \- Working with Teams & Orgs
.SH DESCRIPTION
diff --git a/deps/npm/man/man7/npm-registry.7 b/deps/npm/man/man7/npm-registry.7
index 9b7878edc584f1..e2ca1842233b11 100644
--- a/deps/npm/man/man7/npm-registry.7
+++ b/deps/npm/man/man7/npm-registry.7
@@ -1,4 +1,4 @@
-.TH "NPM\-REGISTRY" "7" "February 2018" "" ""
+.TH "NPM\-REGISTRY" "7" "August 2018" "" ""
.SH "NAME"
\fBnpm-registry\fR \- The JavaScript Package Registry
.SH DESCRIPTION
@@ -7,12 +7,20 @@ To resolve packages by name and version, npm talks to a registry website
that implements the CommonJS Package Registry specification for reading
package info\.
.P
-Additionally, npm's package registry implementation supports several
+npm is configured to use npm, Inc\.'s public registry at
+https://registry\.npmjs\.org by default\. Use of the npm public registry is
+subject to terms of use available at https://www\.npmjs\.com/policies/terms\|\.
+.P
+You can configure npm to use any compatible registry you like, and even run
+your own registry\. Use of someone else's registry may be governed by their
+terms of use\.
+.P
+npm's package registry implementation supports several
write APIs as well, to allow for publishing packages and managing user
account information\.
.P
-The official public npm registry is at https://registry\.npmjs\.org/\|\. It
-is powered by a CouchDB database, of which there is a public mirror at
+The npm public registry is powered by a CouchDB database,
+of which there is a public mirror at
https://skimdb\.npmjs\.com/registry\|\. The code for the couchapp is
available at https://github\.com/npm/npm\-registry\-couchapp\|\.
.P
@@ -44,8 +52,8 @@ build farms\.
.RE
.P
-The npm registry does not to correlate the information in these headers with
-any authenticated accounts that may be used in the same requests\.
+The npm registry does not try to correlate the information in these headers
+with any authenticated accounts that may be used in the same requests\.
.SH Can I run my own private registry?
.P
Yes!
@@ -79,7 +87,7 @@ No, but it's way easier\. Basically, yes, you do, or you have to
effectively implement the entire CouchDB API anyway\.
.SH Is there a website or something to see package docs and such?
.P
-Yes, head over to https://npmjs\.com/
+Yes, head over to https://www\.npmjs\.com/
.SH SEE ALSO
.RS 0
.IP \(bu 2
diff --git a/deps/npm/man/man7/npm-scope.7 b/deps/npm/man/man7/npm-scope.7
index 30c2b8d22e019a..1a541398444923 100644
--- a/deps/npm/man/man7/npm-scope.7
+++ b/deps/npm/man/man7/npm-scope.7
@@ -1,4 +1,4 @@
-.TH "NPM\-SCOPE" "7" "February 2018" "" ""
+.TH "NPM\-SCOPE" "7" "August 2018" "" ""
.SH "NAME"
\fBnpm-scope\fR \- Scoped packages
.SH DESCRIPTION
@@ -86,7 +86,7 @@ to \fBpublic\fP as if you had run \fBnpm access public\fP after publishing\.
.SS Publishing private scoped packages to the npm registry
.P
To publish a private scoped package to the npm registry, you must have
-an npm Private Modules \fIhttps://www\.npmjs\.com/private\-modules\fR
+an npm Private Modules \fIhttps://docs\.npmjs\.com/private\-modules/intro\fR
account\.
.P
You can then publish the module with \fBnpm publish\fP or \fBnpm publish
diff --git a/deps/npm/man/man7/npm-scripts.7 b/deps/npm/man/man7/npm-scripts.7
index 9fa22c2b4909c5..3a7e80870af281 100644
--- a/deps/npm/man/man7/npm-scripts.7
+++ b/deps/npm/man/man7/npm-scripts.7
@@ -1,4 +1,4 @@
-.TH "NPM\-SCRIPTS" "7" "February 2018" "" ""
+.TH "NPM\-SCRIPTS" "7" "August 2018" "" ""
.SH "NAME"
\fBnpm-scripts\fR \- How npm handles the "scripts" field
.SH DESCRIPTION
@@ -78,15 +78,15 @@ names will be run for those as well (e\.g\. \fBpremyscript\fP, \fBmyscript\fP,
.SH PREPUBLISH AND PREPARE
.SS DEPRECATION NOTE
.P
-Since \fBnpm@1\.1\.71\fP, the npm CLI has run the \fBprepublish\fP script for both \fBnpm
-publish\fP and \fBnpm install\fP, because it's a convenient way to prepare a package
+Since \fBnpm@1\.1\.71\fP, the npm CLI has run the \fBprepublish\fP script for both \fBnpm
+publish\fP and \fBnpm install\fP, because it's a convenient way to prepare a package
for use (some common use cases are described in the section below)\. It has
-also turned out to be, in practice, very
-confusing \fIhttps://github\.com/npm/npm/issues/10074\fR\|\. As of \fBnpm@4\.0\.0\fP, a new
-event has been introduced, \fBprepare\fP, that preserves this existing behavior\. A
-\fInew\fR event, \fBprepublishOnly\fP has been added as a transitional strategy to
+also turned out to be, in practice, very
+confusing \fIhttps://github\.com/npm/npm/issues/10074\fR\|\. As of \fBnpm@4\.0\.0\fP, a new
+event has been introduced, \fBprepare\fP, that preserves this existing behavior\. A
+\fInew\fR event, \fBprepublishOnly\fP has been added as a transitional strategy to
allow users to avoid the confusing behavior of existing npm versions and only
-run on \fBnpm publish\fP (for instance, running the tests one last time to ensure
+run on \fBnpm publish\fP (for instance, running the tests one last time to ensure
they're in good shape)\.
.P
See https://github\.com/npm/npm/issues/10074 for a much lengthier
@@ -170,7 +170,9 @@ The package\.json fields are tacked onto the \fBnpm_package_\fP prefix\. So,
for instance, if you had \fB{"name":"foo", "version":"1\.2\.5"}\fP in your
package\.json file, then your package scripts would have the
\fBnpm_package_name\fP environment variable set to "foo", and the
-\fBnpm_package_version\fP set to "1\.2\.5"
+\fBnpm_package_version\fP set to "1\.2\.5"\. You can access these variables
+in your code with \fBprocess\.env\.npm_package_name\fP and
+\fBprocess\.env\.npm_package_version\fP, and so on for other fields\.
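The lookup described above can be sketched as follows; \fBpackageId\fP is a hypothetical helper, not part of npm, and outside an \fBnpm run\fP invocation the \fBnpm_package_*\fP variables are simply undefined:

```javascript
// Sketch: read the npm-provided package fields from the environment.
// A script should not assume these variables exist, since they are only
// set when the script is launched by npm.
function packageId (env) {
  var name = env.npm_package_name
  var version = env.npm_package_version
  return (name && version) ? name + '@' + version : null
}

// Prints e.g. "foo@1.2.5" under `npm run`, or null when run directly.
console.log(packageId(process.env))
console.log(packageId({ npm_package_name: 'foo', npm_package_version: '1.2.5' }))
```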
.SS configuration
.P
Configuration parameters are put in the environment with the
diff --git a/deps/npm/man/man7/removing-npm.7 b/deps/npm/man/man7/removing-npm.7
index b25a310f18a6e8..99fa7b79981995 100644
--- a/deps/npm/man/man7/removing-npm.7
+++ b/deps/npm/man/man7/removing-npm.7
@@ -1,4 +1,4 @@
-.TH "NPM\-REMOVAL" "1" "February 2018" "" ""
+.TH "NPM\-REMOVAL" "1" "August 2018" "" ""
.SH "NAME"
\fBnpm-removal\fR \- Cleaning the Slate
.SH SYNOPSIS
diff --git a/deps/npm/man/man7/semver.7 b/deps/npm/man/man7/semver.7
index 5851cc3aee3e69..abc92686aaee77 100644
--- a/deps/npm/man/man7/semver.7
+++ b/deps/npm/man/man7/semver.7
@@ -1,4 +1,4 @@
-.TH "SEMVER" "7" "February 2018" "" ""
+.TH "SEMVER" "7" "August 2018" "" ""
.SH "NAME"
\fBsemver\fR \- The semantic versioner for npm
.SH Install
@@ -23,6 +23,8 @@ semver\.clean(' =v1\.2\.3 ') // '1\.2\.3'
semver\.satisfies('1\.2\.3', '1\.x || >=2\.5\.0 || 5\.0\.0 \- 7\.2\.3') // true
semver\.gt('1\.2\.3', '9\.8\.7') // false
semver\.lt('1\.2\.3', '9\.8\.7') // true
+semver\.valid(semver\.coerce('v2')) // '2\.0\.0'
+semver\.valid(semver\.coerce('42\.6\.7\.9\.3\-alpha')) // '42\.6\.7'
.fi
.RE
.P
@@ -57,6 +59,10 @@ Options:
\-l \-\-loose
Interpret versions and ranges loosely
+\-c \-\-coerce
+ Coerce a string into SemVer if possible
+ (does not imply \-\-loose)
+
Program exits successfully if any valid version satisfies
all supplied ranges, and prints all satisfying versions\.
@@ -455,4 +461,22 @@ satisfy the range\.
.P
If you want to know if a version satisfies or does not satisfy a
range, use the \fBsatisfies(version, range)\fP function\.
+.SS Coercion
+.RS 0
+.IP \(bu 2
+\fBcoerce(version)\fP: Coerces a string to semver if possible
+
+.RE
+.P
+This aims to provide a very forgiving translation of a non\-semver
+string to semver\. It looks for the first digit in a string, and
+consumes all remaining characters which satisfy at least a partial semver
+(e\.g\., \fB1\fP, \fB1\.2\fP, \fB1\.2\.3\fP) up to the max permitted length (256 characters)\.
+Longer versions are simply truncated (\fB4\.6\.3\.9\.2\-alpha2\fP becomes \fB4\.6\.3\fP)\.
+All surrounding text is simply ignored (\fBv3\.4 replaces v3\.3\.1\fP becomes \fB3\.4\.0\fP)\.
+Only text which lacks digits will fail coercion (\fBversion one\fP is not valid)\.
+The maximum length for any semver component considered for coercion is 16 characters;
+longer components will be ignored (\fB10000000000000000\.4\.7\.4\fP becomes \fB4\.7\.4\fP)\.
+The maximum value for any semver component is \fBNumber\.MAX_SAFE_INTEGER || (2**53 \- 1)\fP;
+higher value components are invalid (\fB9999999999999999\.4\.7\.4\fP is likely invalid)\.
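The digit-scanning rule above can be sketched in a toy form; \fBcoerceLoose\fP is NOT the real \fBsemver\.coerce\fP implementation, just an illustration of the idea for the simple cases:

```javascript
// Toy sketch of the coercion rule: grab the first run of digits and up to
// three dot-separated numeric components, zero-padding any missing parts.
// (The real semver.coerce handles more edge cases, e.g. overlong components.)
function coerceLoose (input) {
  var m = /(\d{1,16})(?:\.(\d{1,16}))?(?:\.(\d{1,16}))?/.exec(input)
  if (!m) return null // strings without any digits fail coercion
  return m[1] + '.' + (m[2] || '0') + '.' + (m[3] || '0')
}

coerceLoose('v2')                   // '2.0.0'
coerceLoose('4.6.3.9.2-alpha2')     // '4.6.3'
coerceLoose('v3.4 replaces v3.3.1') // '3.4.0'
coerceLoose('version one')          // null
```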
diff --git a/deps/npm/node_modules/JSONStream/.npmignore b/deps/npm/node_modules/JSONStream/.npmignore
deleted file mode 100644
index a9a9d586e5944f..00000000000000
--- a/deps/npm/node_modules/JSONStream/.npmignore
+++ /dev/null
@@ -1,2 +0,0 @@
-node_modules/*
-node_modules
diff --git a/deps/npm/node_modules/JSONStream/.travis.yml b/deps/npm/node_modules/JSONStream/.travis.yml
index 5f30bb5bd1aad4..2f60c363d24cf4 100644
--- a/deps/npm/node_modules/JSONStream/.travis.yml
+++ b/deps/npm/node_modules/JSONStream/.travis.yml
@@ -4,5 +4,3 @@ node_js:
- 5
- 6
sudo: false
-
-
diff --git a/deps/npm/node_modules/JSONStream/LICENSE.MIT b/deps/npm/node_modules/JSONStream/LICENSE.MIT
index 6eafbd734a6e06..49e7da41fec2be 100644
--- a/deps/npm/node_modules/JSONStream/LICENSE.MIT
+++ b/deps/npm/node_modules/JSONStream/LICENSE.MIT
@@ -2,23 +2,23 @@ The MIT License
Copyright (c) 2011 Dominic Tarr
-Permission is hereby granted, free of charge,
-to any person obtaining a copy of this software and
-associated documentation files (the "Software"), to
-deal in the Software without restriction, including
-without limitation the rights to use, copy, modify,
-merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom
-the Software is furnished to do so,
+Permission is hereby granted, free of charge,
+to any person obtaining a copy of this software and
+associated documentation files (the "Software"), to
+deal in the Software without restriction, including
+without limitation the rights to use, copy, modify,
+merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom
+the Software is furnished to do so,
subject to the following conditions:
-The above copyright notice and this permission notice
+The above copyright notice and this permission notice
shall be included in all copies or substantial portions of the Software.
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
-OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
-IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR
-ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
-TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
+OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR
+ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
diff --git a/deps/npm/node_modules/JSONStream/bin.js b/deps/npm/node_modules/JSONStream/bin.js
new file mode 100755
index 00000000000000..32209630f2f026
--- /dev/null
+++ b/deps/npm/node_modules/JSONStream/bin.js
@@ -0,0 +1,10 @@
+#! /usr/bin/env node
+
+var JSONStream = require('./')
+
+if(!module.parent && process.title !== 'browser') {
+ process.stdin
+ .pipe(JSONStream.parse(process.argv[2]))
+ .pipe(JSONStream.stringify('[', ',\n', ']\n', 2))
+ .pipe(process.stdout)
+}
diff --git a/deps/npm/node_modules/JSONStream/examples/all_docs.js b/deps/npm/node_modules/JSONStream/examples/all_docs.js
index fa87fe52da53dc..f20781e18c9dbf 100644
--- a/deps/npm/node_modules/JSONStream/examples/all_docs.js
+++ b/deps/npm/node_modules/JSONStream/examples/all_docs.js
@@ -6,7 +6,7 @@ var parser = JSONStream.parse(['rows', true]) //emit parts that match this path
, req = request({url: 'http://isaacs.couchone.com/registry/_all_docs'})
, logger = es.mapSync(function (data) { //create a stream that logs to stderr,
console.error(data)
- return data
+ return data
})
req.pipe(parser)
diff --git a/deps/npm/node_modules/JSONStream/index.js b/deps/npm/node_modules/JSONStream/index.js
index 6d68a19ae20272..a92967f568a4c2 100755
--- a/deps/npm/node_modules/JSONStream/index.js
+++ b/deps/npm/node_modules/JSONStream/index.js
@@ -1,5 +1,3 @@
-#! /usr/bin/env node
-
'use strict'
var Parser = require('jsonparse')
@@ -104,7 +102,7 @@ exports.parse = function (path, map) {
count ++
var actualPath = this.stack.slice(1).map(function(element) { return element.key }).concat([this.key])
- var data = this.value[this.key]
+ var data = value
if(null != data)
if(null != (data = map ? map(data, actualPath) : data)) {
if (emitKey || emitPath) {
@@ -117,7 +115,7 @@ exports.parse = function (path, map) {
stream.queue(data)
}
- delete this.value[this.key]
+ if (this.value) delete this.value[this.key]
for(var k in this.stack)
if (!Object.isFrozen(this.stack[k]))
this.stack[k].value = null
@@ -244,10 +242,3 @@ exports.stringifyObject = function (op, sep, cl, indent) {
return stream
}
-if(!module.parent && process.title !== 'browser') {
- process.stdin
- .pipe(exports.parse(process.argv[2]))
- .pipe(exports.stringify('[', ',\n', ']\n', 2))
- .pipe(process.stdout)
-}
-
diff --git a/deps/npm/node_modules/JSONStream/node_modules/jsonparse/LICENSE b/deps/npm/node_modules/JSONStream/node_modules/jsonparse/LICENSE
deleted file mode 100644
index 6dc24be5e5027a..00000000000000
--- a/deps/npm/node_modules/JSONStream/node_modules/jsonparse/LICENSE
+++ /dev/null
@@ -1,24 +0,0 @@
-The MIT License
-
-Copyright (c) 2012 Tim Caswell
-
-Permission is hereby granted, free of charge,
-to any person obtaining a copy of this software and
-associated documentation files (the "Software"), to
-deal in the Software without restriction, including
-without limitation the rights to use, copy, modify,
-merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom
-the Software is furnished to do so,
-subject to the following conditions:
-
-The above copyright notice and this permission notice
-shall be included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
-OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
-IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR
-ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
-TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
-SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
diff --git a/deps/npm/node_modules/JSONStream/node_modules/jsonparse/README.markdown b/deps/npm/node_modules/JSONStream/node_modules/jsonparse/README.markdown
deleted file mode 100644
index 0f405d359fe6cb..00000000000000
--- a/deps/npm/node_modules/JSONStream/node_modules/jsonparse/README.markdown
+++ /dev/null
@@ -1,11 +0,0 @@
-This is a streaming JSON parser. For a simpler, sax-based version see this gist: https://gist.github.com/1821394
-
-The MIT License (MIT)
-Copyright (c) 2011-2012 Tim Caswell
-
-Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
-
diff --git a/deps/npm/node_modules/JSONStream/node_modules/jsonparse/package.json b/deps/npm/node_modules/JSONStream/node_modules/jsonparse/package.json
deleted file mode 100644
index fe0cfcd1f4a151..00000000000000
--- a/deps/npm/node_modules/JSONStream/node_modules/jsonparse/package.json
+++ /dev/null
@@ -1,58 +0,0 @@
-{
- "_from": "jsonparse@^1.2.0",
- "_id": "jsonparse@1.3.1",
- "_inBundle": false,
- "_integrity": "sha1-P02uSpH6wxX3EGL4UhzCOfE2YoA=",
- "_location": "/JSONStream/jsonparse",
- "_phantomChildren": {},
- "_requested": {
- "type": "range",
- "registry": true,
- "raw": "jsonparse@^1.2.0",
- "name": "jsonparse",
- "escapedName": "jsonparse",
- "rawSpec": "^1.2.0",
- "saveSpec": null,
- "fetchSpec": "^1.2.0"
- },
- "_requiredBy": [
- "/JSONStream"
- ],
- "_resolved": "https://registry.npmjs.org/jsonparse/-/jsonparse-1.3.1.tgz",
- "_shasum": "3f4dae4a91fac315f71062f8521cc239f1366280",
- "_spec": "jsonparse@^1.2.0",
- "_where": "/Users/rebecca/code/npm/node_modules/JSONStream",
- "author": {
- "name": "Tim Caswell",
- "email": "tim@creationix.com"
- },
- "bugs": {
- "url": "http://github.com/creationix/jsonparse/issues"
- },
- "bundleDependencies": false,
- "deprecated": false,
- "description": "This is a pure-js JSON streaming parser for node.js",
- "devDependencies": {
- "tap": "~0.3.3",
- "tape": "~0.1.1"
- },
- "engines": [
- "node >= 0.2.0"
- ],
- "homepage": "https://github.com/creationix/jsonparse#readme",
- "license": "MIT",
- "main": "jsonparse.js",
- "name": "jsonparse",
- "repository": {
- "type": "git",
- "url": "git+ssh://git@github.com/creationix/jsonparse.git"
- },
- "scripts": {
- "test": "tap test/*.js"
- },
- "tags": [
- "json",
- "stream"
- ],
- "version": "1.3.1"
-}
diff --git a/deps/npm/node_modules/JSONStream/node_modules/through/LICENSE.MIT b/deps/npm/node_modules/JSONStream/node_modules/through/LICENSE.MIT
deleted file mode 100644
index 6eafbd734a6e06..00000000000000
--- a/deps/npm/node_modules/JSONStream/node_modules/through/LICENSE.MIT
+++ /dev/null
@@ -1,24 +0,0 @@
-The MIT License
-
-Copyright (c) 2011 Dominic Tarr
-
-Permission is hereby granted, free of charge,
-to any person obtaining a copy of this software and
-associated documentation files (the "Software"), to
-deal in the Software without restriction, including
-without limitation the rights to use, copy, modify,
-merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom
-the Software is furnished to do so,
-subject to the following conditions:
-
-The above copyright notice and this permission notice
-shall be included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
-OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
-IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR
-ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
-TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
-SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
diff --git a/deps/npm/node_modules/JSONStream/node_modules/through/index.js b/deps/npm/node_modules/JSONStream/node_modules/through/index.js
deleted file mode 100644
index ca5fc5901fd875..00000000000000
--- a/deps/npm/node_modules/JSONStream/node_modules/through/index.js
+++ /dev/null
@@ -1,108 +0,0 @@
-var Stream = require('stream')
-
-// through
-//
-// a stream that does nothing but re-emit the input.
-// useful for aggregating a series of changing but not ending streams into one stream)
-
-exports = module.exports = through
-through.through = through
-
-//create a readable writable stream.
-
-function through (write, end, opts) {
- write = write || function (data) { this.queue(data) }
- end = end || function () { this.queue(null) }
-
- var ended = false, destroyed = false, buffer = [], _ended = false
- var stream = new Stream()
- stream.readable = stream.writable = true
- stream.paused = false
-
-// stream.autoPause = !(opts && opts.autoPause === false)
- stream.autoDestroy = !(opts && opts.autoDestroy === false)
-
- stream.write = function (data) {
- write.call(this, data)
- return !stream.paused
- }
-
- function drain() {
- while(buffer.length && !stream.paused) {
- var data = buffer.shift()
- if(null === data)
- return stream.emit('end')
- else
- stream.emit('data', data)
- }
- }
-
- stream.queue = stream.push = function (data) {
-// console.error(ended)
- if(_ended) return stream
- if(data === null) _ended = true
- buffer.push(data)
- drain()
- return stream
- }
-
- //this will be registered as the first 'end' listener
- //must call destroy next tick, to make sure we're after any
- //stream piped from here.
- //this is only a problem if end is not emitted synchronously.
- //a nicer way to do this is to make sure this is the last listener for 'end'
-
- stream.on('end', function () {
- stream.readable = false
- if(!stream.writable && stream.autoDestroy)
- process.nextTick(function () {
- stream.destroy()
- })
- })
-
- function _end () {
- stream.writable = false
- end.call(stream)
- if(!stream.readable && stream.autoDestroy)
- stream.destroy()
- }
-
- stream.end = function (data) {
- if(ended) return
- ended = true
- if(arguments.length) stream.write(data)
- _end() // will emit or queue
- return stream
- }
-
- stream.destroy = function () {
- if(destroyed) return
- destroyed = true
- ended = true
- buffer.length = 0
- stream.writable = stream.readable = false
- stream.emit('close')
- return stream
- }
-
- stream.pause = function () {
- if(stream.paused) return
- stream.paused = true
- return stream
- }
-
- stream.resume = function () {
- if(stream.paused) {
- stream.paused = false
- stream.emit('resume')
- }
- drain()
- //may have become paused again,
- //as drain emits 'data'.
- if(!stream.paused)
- stream.emit('drain')
- return stream
- }
- return stream
-}
-
diff --git a/deps/npm/node_modules/JSONStream/node_modules/through/package.json b/deps/npm/node_modules/JSONStream/node_modules/through/package.json
deleted file mode 100644
index 645e8a4f6a3487..00000000000000
--- a/deps/npm/node_modules/JSONStream/node_modules/through/package.json
+++ /dev/null
@@ -1,72 +0,0 @@
-{
- "_from": "through@>=2.2.7 <3",
- "_id": "through@2.3.8",
- "_integrity": "sha1-DdTJ/6q8NXlgsbckEV1+Doai4fU=",
- "_location": "/JSONStream/through",
- "_phantomChildren": {},
- "_requested": {
- "type": "range",
- "registry": true,
- "raw": "through@>=2.2.7 <3",
- "name": "through",
- "escapedName": "through",
- "rawSpec": ">=2.2.7 <3",
- "saveSpec": null,
- "fetchSpec": ">=2.2.7 <3"
- },
- "_requiredBy": [
- "/JSONStream"
- ],
- "_resolved": "https://registry.npmjs.org/through/-/through-2.3.8.tgz",
- "_shasum": "0dd4c9ffaabc357960b1b724115d7e0e86a2e1f5",
- "_shrinkwrap": null,
- "_spec": "through@>=2.2.7 <3",
- "_where": "/Users/zkat/Documents/code/npm/node_modules/JSONStream",
- "author": {
- "name": "Dominic Tarr",
- "email": "dominic.tarr@gmail.com",
- "url": "dominictarr.com"
- },
- "bin": null,
- "bugs": {
- "url": "https://github.com/dominictarr/through/issues"
- },
- "bundleDependencies": false,
- "dependencies": {},
- "deprecated": false,
- "description": "simplified stream construction",
- "devDependencies": {
- "from": "~0.1.3",
- "stream-spec": "~0.3.5",
- "tape": "~2.3.2"
- },
- "homepage": "https://github.com/dominictarr/through",
- "keywords": [
- "stream",
- "streams",
- "user-streams",
- "pipe"
- ],
- "license": "MIT",
- "main": "index.js",
- "name": "through",
- "optionalDependencies": {},
- "peerDependencies": {},
- "repository": {
- "type": "git",
- "url": "git+https://github.com/dominictarr/through.git"
- },
- "scripts": {
- "test": "set -e; for t in test/*.js; do node $t; done"
- },
- "testling": {
- "browsers": [
- "ie/8..latest",
- "ff/15..latest",
- "chrome/20..latest",
- "safari/5.1..latest"
- ],
- "files": "test/*.js"
- },
- "version": "2.3.8"
-}
diff --git a/deps/npm/node_modules/JSONStream/node_modules/through/readme.markdown b/deps/npm/node_modules/JSONStream/node_modules/through/readme.markdown
deleted file mode 100644
index cb34c8135f53eb..00000000000000
--- a/deps/npm/node_modules/JSONStream/node_modules/through/readme.markdown
+++ /dev/null
@@ -1,64 +0,0 @@
-#through
-
-[![build status](https://secure.travis-ci.org/dominictarr/through.png)](http://travis-ci.org/dominictarr/through)
-[![testling badge](https://ci.testling.com/dominictarr/through.png)](https://ci.testling.com/dominictarr/through)
-
-Easy way to create a `Stream` that is both `readable` and `writable`.
-
-* Pass in optional `write` and `end` methods.
-* `through` takes care of pause/resume logic if you use `this.queue(data)` instead of `this.emit('data', data)`.
-* Use `this.pause()` and `this.resume()` to manage flow.
-* Check `this.paused` to see current flow state. (`write` always returns `!this.paused`).
-
-This function is the basis for most of the synchronous streams in
-[event-stream](http://github.com/dominictarr/event-stream).
-
-``` js
-var through = require('through')
-
-through(function write(data) {
- this.queue(data) //data *must* not be null
- },
- function end () { //optional
- this.queue(null)
- })
-```
-
-Or, can also be used _without_ buffering on pause, use `this.emit('data', data)`,
-and this.emit('end')
-
-``` js
-var through = require('through')
-
-through(function write(data) {
- this.emit('data', data)
- //this.pause()
- },
- function end () { //optional
- this.emit('end')
- })
-```
-
-## Extended Options
-
-You will probably not need these 99% of the time.
-
-### autoDestroy=false
-
-By default, `through` emits close when the writable
-and readable side of the stream has ended.
-If that is not desired, set `autoDestroy=false`.
-
-``` js
-var through = require('through')
-
-//like this
-var ts = through(write, end, {autoDestroy: false})
-//or like this
-var ts = through(write, end)
-ts.autoDestroy = false
-```
-
-## License
-
-MIT / Apache2
diff --git a/deps/npm/node_modules/JSONStream/node_modules/through/test/index.js b/deps/npm/node_modules/JSONStream/node_modules/through/test/index.js
deleted file mode 100644
index 96da82f97c74cf..00000000000000
--- a/deps/npm/node_modules/JSONStream/node_modules/through/test/index.js
+++ /dev/null
@@ -1,133 +0,0 @@
-
-var test = require('tape')
-var spec = require('stream-spec')
-var through = require('../')
-
-/*
- I'm using these two functions, and not streams and pipe
- so there is less to break. if this test fails it must be
- the implementation of _through_
-*/
-
-function write(array, stream) {
- array = array.slice()
- function next() {
- while(array.length)
- if(stream.write(array.shift()) === false)
- return stream.once('drain', next)
-
- stream.end()
- }
-
- next()
-}
-
-function read(stream, callback) {
- var actual = []
- stream.on('data', function (data) {
- actual.push(data)
- })
- stream.once('end', function () {
- callback(null, actual)
- })
- stream.once('error', function (err) {
- callback(err)
- })
-}
-
-test('simple defaults', function(assert) {
-
- var l = 1000
- , expected = []
-
- while(l--) expected.push(l * Math.random())
-
- var t = through()
- var s = spec(t).through().pausable()
-
- read(t, function (err, actual) {
- assert.ifError(err)
- assert.deepEqual(actual, expected)
- assert.end()
- })
-
- t.on('close', s.validate)
-
- write(expected, t)
-});
-
-test('simple functions', function(assert) {
-
- var l = 1000
- , expected = []
-
- while(l--) expected.push(l * Math.random())
-
- var t = through(function (data) {
- this.emit('data', data*2)
- })
- var s = spec(t).through().pausable()
-
-
- read(t, function (err, actual) {
- assert.ifError(err)
- assert.deepEqual(actual, expected.map(function (data) {
- return data*2
- }))
- assert.end()
- })
-
- t.on('close', s.validate)
-
- write(expected, t)
-})
-
-test('pauses', function(assert) {
-
- var l = 1000
- , expected = []
-
- while(l--) expected.push(l) //Math.random())
-
- var t = through()
-
- var s = spec(t)
- .through()
- .pausable()
-
- t.on('data', function () {
- if(Math.random() > 0.1) return
- t.pause()
- process.nextTick(function () {
- t.resume()
- })
- })
-
- read(t, function (err, actual) {
- assert.ifError(err)
- assert.deepEqual(actual, expected)
- })
-
- t.on('close', function () {
- s.validate()
- assert.end()
- })
-
- write(expected, t)
-})
-
-test('does not soft-end on `undefined`', function(assert) {
- var stream = through()
- , count = 0
-
- stream.on('data', function (data) {
- count++
- })
-
- stream.write(undefined)
- stream.write(undefined)
-
- assert.equal(count, 2)
-
- assert.end()
-})
diff --git a/deps/npm/node_modules/JSONStream/package.json b/deps/npm/node_modules/JSONStream/package.json
index 2086da717bcf41..23588d31b13a4a 100644
--- a/deps/npm/node_modules/JSONStream/package.json
+++ b/deps/npm/node_modules/JSONStream/package.json
@@ -1,35 +1,35 @@
{
- "_from": "JSONStream@~1.3.1",
- "_id": "JSONStream@1.3.1",
+ "_from": "JSONStream@1.3.4",
+ "_id": "JSONStream@1.3.4",
"_inBundle": false,
- "_integrity": "sha1-cH92HgHa6eFvG8+TcDt4xwlmV5o=",
+ "_integrity": "sha512-Y7vfi3I5oMOYIr+WxV8NZxDSwcbNgzdKYsTNInmycOq9bUYwGg9ryu57Wg5NLmCjqdFPNUmpMBo3kSJN9tCbXg==",
"_location": "/JSONStream",
"_phantomChildren": {},
"_requested": {
- "type": "range",
+ "type": "version",
"registry": true,
- "raw": "JSONStream@~1.3.1",
+ "raw": "JSONStream@1.3.4",
"name": "JSONStream",
"escapedName": "JSONStream",
- "rawSpec": "~1.3.1",
+ "rawSpec": "1.3.4",
"saveSpec": null,
- "fetchSpec": "~1.3.1"
+ "fetchSpec": "1.3.4"
},
"_requiredBy": [
"#USER",
"/"
],
- "_resolved": "https://registry.npmjs.org/JSONStream/-/JSONStream-1.3.1.tgz",
- "_shasum": "707f761e01dae9e16f1bcf93703b78c70966579a",
- "_spec": "JSONStream@~1.3.1",
- "_where": "/Users/rebecca/code/npm",
+ "_resolved": "https://registry.npmjs.org/JSONStream/-/JSONStream-1.3.4.tgz",
+ "_shasum": "615bb2adb0cd34c8f4c447b5f6512fa1d8f16a2e",
+ "_spec": "JSONStream@1.3.4",
+ "_where": "/Users/zkat/Documents/code/work/npm",
"author": {
"name": "Dominic Tarr",
"email": "dominic.tarr@gmail.com",
"url": "http://bit.ly/dominictarr"
},
"bin": {
- "JSONStream": "./index.js"
+ "JSONStream": "./bin.js"
},
"bugs": {
"url": "https://github.com/dominictarr/JSONStream/issues"
@@ -71,5 +71,5 @@
"scripts": {
"test": "set -e; for t in test/*.js; do echo '***' $t '***'; node $t; done"
},
- "version": "1.3.1"
+ "version": "1.3.4"
}
diff --git a/deps/npm/node_modules/JSONStream/readme.markdown b/deps/npm/node_modules/JSONStream/readme.markdown
index 422c3df2cc616a..7e94ddd7f4c029 100644
--- a/deps/npm/node_modules/JSONStream/readme.markdown
+++ b/deps/npm/node_modules/JSONStream/readme.markdown
@@ -118,9 +118,9 @@ stream.on('data', function(data) {
### recursive patterns (..)
-`JSONStream.parse('docs..value')`
+`JSONStream.parse('docs..value')`
(or `JSONStream.parse(['docs', {recurse: true}, 'value'])` using an array)
-will emit every `value` object that is a child, grand-child, etc. of the
+will emit every `value` object that is a child, grand-child, etc. of the
`docs` object. In this example, it will match exactly 5 times at various depth
levels, emitting 0, 1, 2, 3 and 4 as results.
@@ -204,4 +204,3 @@ https://github.com/Floby/node-json-streams
## license
Dual-licensed under the MIT License or the Apache License, version 2.0
-
diff --git a/deps/npm/node_modules/JSONStream/test/bool.js b/deps/npm/node_modules/JSONStream/test/bool.js
index 6c386d609f07f5..9b87b1730f107d 100644
--- a/deps/npm/node_modules/JSONStream/test/bool.js
+++ b/deps/npm/node_modules/JSONStream/test/bool.js
@@ -13,7 +13,7 @@ var fs = require ('fs')
lies: true,
nothing: [null],
// stuff: [Math.random(),Math.random(),Math.random()]
- }
+ }
: ['AOREC', 'reoubaor', {ouec: 62642}, [[[], {}, 53]]]
)
}
@@ -25,7 +25,7 @@ var expected = []
, called = 0
, count = 10
, ended = false
-
+
while (count --)
expected.push(randomObj())
@@ -34,7 +34,7 @@ while (count --)
stringify,
JSONStream.parse([true]),
es.writeArray(function (err, lines) {
-
+
it(lines).has(expected)
console.error('PASSED')
})
diff --git a/deps/npm/node_modules/JSONStream/test/disabled/doubledot1.js b/deps/npm/node_modules/JSONStream/test/doubledot1.js
similarity index 99%
rename from deps/npm/node_modules/JSONStream/test/disabled/doubledot1.js
rename to deps/npm/node_modules/JSONStream/test/doubledot1.js
index 78149b93f6e7c3..ceaa3edb33162b 100644
--- a/deps/npm/node_modules/JSONStream/test/disabled/doubledot1.js
+++ b/deps/npm/node_modules/JSONStream/test/doubledot1.js
@@ -11,7 +11,7 @@ var expected = JSON.parse(fs.readFileSync(file))
, parsed = []
fs.createReadStream(file).pipe(parser)
-
+
parser.on('data', function (data) {
called ++
parsed.push(data)
diff --git a/deps/npm/node_modules/JSONStream/test/disabled/doubledot2.js b/deps/npm/node_modules/JSONStream/test/doubledot2.js
similarity index 80%
rename from deps/npm/node_modules/JSONStream/test/disabled/doubledot2.js
rename to deps/npm/node_modules/JSONStream/test/doubledot2.js
index f99d88197b2ed4..980024153c697a 100644
--- a/deps/npm/node_modules/JSONStream/test/disabled/doubledot2.js
+++ b/deps/npm/node_modules/JSONStream/test/doubledot2.js
@@ -11,7 +11,7 @@
, parsed = []
fs.createReadStream(file).pipe(parser)
-
+
parser.on('data', function (data) {
called ++
parsed.push(data)
@@ -22,8 +22,9 @@
})
process.on('exit', function () {
- it(called).equal(5)
+ var expectedValues = [0, [1], {"a": 2}, "3", 4]
+ it(called).equal(expectedValues.length)
for (var i = 0 ; i < 5 ; i++)
- it(parsed[i]).deepEqual(i)
+ it(parsed[i]).deepEqual(expectedValues[i])
console.error('PASSED')
})
diff --git a/deps/npm/node_modules/JSONStream/test/fixtures/depth.json b/deps/npm/node_modules/JSONStream/test/fixtures/depth.json
index 868062f30657af..9b4bfb93764403 100644
--- a/deps/npm/node_modules/JSONStream/test/fixtures/depth.json
+++ b/deps/npm/node_modules/JSONStream/test/fixtures/depth.json
@@ -7,9 +7,9 @@
"some": "property"
}
},
- {"value": 1},
- {"value": 2},
- {"blbl": [{}, {"a":0, "b":1, "value":3}, 10]},
+ {"value": [1]},
+ {"value": {"a":2}},
+ {"blbl": [{}, {"a":0, "b":1, "value":"3"}, 10]},
{"value": 4}
]
-}
\ No newline at end of file
+}
diff --git a/deps/npm/node_modules/JSONStream/test/fn.js b/deps/npm/node_modules/JSONStream/test/fn.js
index 4acc672627fd16..01e61e88fa6b61 100644
--- a/deps/npm/node_modules/JSONStream/test/fn.js
+++ b/deps/npm/node_modules/JSONStream/test/fn.js
@@ -17,7 +17,7 @@ var expected = JSON.parse(fs.readFileSync(file))
, parsed = []
fs.createReadStream(file).pipe(parser)
-
+
parser.on('data', function (data) {
called ++
it.has({
diff --git a/deps/npm/node_modules/JSONStream/test/gen.js b/deps/npm/node_modules/JSONStream/test/gen.js
index c233722ac31a20..75e87d56e45a49 100644
--- a/deps/npm/node_modules/JSONStream/test/gen.js
+++ b/deps/npm/node_modules/JSONStream/test/gen.js
@@ -111,7 +111,7 @@ var tape = require('tape')
items++
if(Math.random() < 0.01) console.log(items, '...')
});
-
+
parser.on('end', function () {
t.equal(items, size)
});
@@ -126,10 +126,10 @@ var tape = require('tape')
console.log(stat)
if(err)
generateTestData(testJSONStreamParse_causesOutOfMem);
- else
+ else
testJSONStreamParse_causesOutOfMem()
})
})
-
+
// }
diff --git a/deps/npm/node_modules/JSONStream/test/keys.js b/deps/npm/node_modules/JSONStream/test/keys.js
index 747723d11e2cc3..86b65b257b9572 100644
--- a/deps/npm/node_modules/JSONStream/test/keys.js
+++ b/deps/npm/node_modules/JSONStream/test/keys.js
@@ -41,7 +41,7 @@ test('keys via array', function(t) {
test('path via array', function(t) {
var stream = JSONStream.parse(['obj',{emitPath: true}]);
-
+
var paths = [];
var values = [];
stream.on('data', function(data) {
diff --git a/deps/npm/node_modules/JSONStream/test/map.js b/deps/npm/node_modules/JSONStream/test/map.js
index 29b9d896913570..6c05fc68406c4b 100644
--- a/deps/npm/node_modules/JSONStream/test/map.js
+++ b/deps/npm/node_modules/JSONStream/test/map.js
@@ -37,4 +37,3 @@ test('filter function', function (t) {
stream.end()
})
-
diff --git a/deps/npm/node_modules/JSONStream/test/null.js b/deps/npm/node_modules/JSONStream/test/null.js
index 95dd60c0af04dc..25628ee585568c 100644
--- a/deps/npm/node_modules/JSONStream/test/null.js
+++ b/deps/npm/node_modules/JSONStream/test/null.js
@@ -14,7 +14,7 @@ var test = require('tape')
test ('null properties', function (t) {
var actual = []
- var stream =
+ var stream =
JSONStream.parse('*.optional')
.on('data', function (v) { actual.push(v) })
diff --git a/deps/npm/node_modules/JSONStream/test/parsejson.js b/deps/npm/node_modules/JSONStream/test/parsejson.js
index a94333446df2af..7f157175f5c48d 100644
--- a/deps/npm/node_modules/JSONStream/test/parsejson.js
+++ b/deps/npm/node_modules/JSONStream/test/parsejson.js
@@ -7,7 +7,7 @@
var r = Math.random()
, Parser = require('jsonparse')
, p = new Parser()
- , assert = require('assert')
+ , assert = require('assert')
, times = 20
while (times --) {
diff --git a/deps/npm/node_modules/JSONStream/test/stringify.js b/deps/npm/node_modules/JSONStream/test/stringify.js
index b6de85ed253f22..20b996957524b9 100644
--- a/deps/npm/node_modules/JSONStream/test/stringify.js
+++ b/deps/npm/node_modules/JSONStream/test/stringify.js
@@ -13,7 +13,7 @@ var fs = require ('fs')
lies: true,
nothing: [null],
stuff: [Math.random(),Math.random(),Math.random()]
- }
+ }
: ['AOREC', 'reoubaor', {ouec: 62642}, [[[], {}, 53]]]
)
}
@@ -25,7 +25,7 @@ var expected = []
, called = 0
, count = 10
, ended = false
-
+
while (count --)
expected.push(randomObj())
@@ -34,7 +34,7 @@ while (count --)
stringify,
//JSONStream.parse([/./]),
es.writeArray(function (err, lines) {
-
+
it(JSON.parse(lines.join(''))).deepEqual(expected)
console.error('PASSED')
})
diff --git a/deps/npm/node_modules/JSONStream/test/stringify_object.js b/deps/npm/node_modules/JSONStream/test/stringify_object.js
index 9490115a0db996..73a2b8350d83cf 100644
--- a/deps/npm/node_modules/JSONStream/test/stringify_object.js
+++ b/deps/npm/node_modules/JSONStream/test/stringify_object.js
@@ -16,7 +16,7 @@ var fs = require ('fs')
lies: true,
nothing: [null],
stuff: [Math.random(),Math.random(),Math.random()]
- }
+ }
: ['AOREC', 'reoubaor', {ouec: 62642}, [[[], {}, 53]]]
)
}
@@ -24,7 +24,7 @@ var fs = require ('fs')
for (var ix = 0; ix < pending; ix++) (function (count) {
var expected = {}
, stringify = JSONStream.stringifyObject()
-
+
es.connect(
stringify,
es.writeArray(function (err, lines) {
diff --git a/deps/npm/node_modules/JSONStream/test/test.js b/deps/npm/node_modules/JSONStream/test/test.js
index 8ea7c2e1f13895..adc3d7569590ec 100644
--- a/deps/npm/node_modules/JSONStream/test/test.js
+++ b/deps/npm/node_modules/JSONStream/test/test.js
@@ -13,7 +13,7 @@ var expected = JSON.parse(fs.readFileSync(file))
, parsed = []
fs.createReadStream(file).pipe(parser)
-
+
parser.on('data', function (data) {
called ++
it.has({
diff --git a/deps/npm/node_modules/JSONStream/test/test2.js b/deps/npm/node_modules/JSONStream/test/test2.js
index d09df7be4d3ee0..a77ca3910a9cfe 100644
--- a/deps/npm/node_modules/JSONStream/test/test2.js
+++ b/deps/npm/node_modules/JSONStream/test/test2.js
@@ -13,7 +13,7 @@ var expected = JSON.parse(fs.readFileSync(file))
, parsed = []
fs.createReadStream(file).pipe(parser)
-
+
parser.on('data', function (data) {
called ++
it(data).deepEqual(expected)
diff --git a/deps/npm/node_modules/JSONStream/test/two-ways.js b/deps/npm/node_modules/JSONStream/test/two-ways.js
index 8f3b89c8bfe6ec..a74dfba36e86f7 100644
--- a/deps/npm/node_modules/JSONStream/test/two-ways.js
+++ b/deps/npm/node_modules/JSONStream/test/two-ways.js
@@ -13,7 +13,7 @@ var fs = require ('fs')
lies: true,
nothing: [null],
// stuff: [Math.random(),Math.random(),Math.random()]
- }
+ }
: ['AOREC', 'reoubaor', {ouec: 62642}, [[[], {}, 53]]]
)
}
@@ -25,7 +25,7 @@ var expected = []
, called = 0
, count = 10
, ended = false
-
+
while (count --)
expected.push(randomObj())
@@ -34,7 +34,7 @@ while (count --)
stringify,
JSONStream.parse([/./]),
es.writeArray(function (err, lines) {
-
+
it(lines).has(expected)
console.error('PASSED')
})
diff --git a/deps/npm/node_modules/abbrev/package.json b/deps/npm/node_modules/abbrev/package.json
index 0c44f79d60baf2..4c05db1efe758f 100644
--- a/deps/npm/node_modules/abbrev/package.json
+++ b/deps/npm/node_modules/abbrev/package.json
@@ -1,4 +1,10 @@
{
+ "_args": [
+ [
+ "abbrev@1.1.1",
+ "/Users/rebecca/code/npm"
+ ]
+ ],
"_from": "abbrev@1.1.1",
"_id": "abbrev@1.1.1",
"_inBundle": false,
@@ -16,14 +22,12 @@
"fetchSpec": "1.1.1"
},
"_requiredBy": [
- "#USER",
"/",
"/node-gyp/nopt",
"/nopt"
],
"_resolved": "https://registry.npmjs.org/abbrev/-/abbrev-1.1.1.tgz",
- "_shasum": "f8f2c887ad10bf67f634f005b6987fed3179aac8",
- "_spec": "abbrev@1.1.1",
+ "_spec": "1.1.1",
"_where": "/Users/rebecca/code/npm",
"author": {
"name": "Isaac Z. Schlueter",
@@ -32,8 +36,6 @@
"bugs": {
"url": "https://github.com/isaacs/abbrev-js/issues"
},
- "bundleDependencies": false,
- "deprecated": false,
"description": "Like ruby's abbrev module, but in js",
"devDependencies": {
"tap": "^10.1"
diff --git a/deps/npm/node_modules/pacote/node_modules/make-fetch-happen/node_modules/http-proxy-agent/node_modules/agent-base/.travis.yml b/deps/npm/node_modules/agent-base/.travis.yml
similarity index 100%
rename from deps/npm/node_modules/pacote/node_modules/make-fetch-happen/node_modules/http-proxy-agent/node_modules/agent-base/.travis.yml
rename to deps/npm/node_modules/agent-base/.travis.yml
diff --git a/deps/npm/node_modules/agent-base/History.md b/deps/npm/node_modules/agent-base/History.md
new file mode 100644
index 00000000000000..80c88dc401f960
--- /dev/null
+++ b/deps/npm/node_modules/agent-base/History.md
@@ -0,0 +1,113 @@
+
+4.2.0 / 2018-01-15
+==================
+
+ * Add support for returning an `http.Agent` instance
+ * Optimize promisifying logic
+ * Set `timeout` to null for proper cleanup
+ * Remove Node.js <= 0.11.3 special-casing from test case
+
+4.1.2 / 2017-11-20
+==================
+
+ * test Node 9 on Travis
+ * ensure that `https.get()` uses the patched `https.request()`
+
+4.1.1 / 2017-07-20
+==================
+
+ * Correct `https.request()` with a String (#9)
+
+4.1.0 / 2017-06-26
+==================
+
+ * mix in Agent options into Request options
+ * throw when nothing is returned from agent-base callback
+ * do not modify the options object for https requests
+
+4.0.1 / 2017-06-13
+==================
+
+ * add `this` context tests and fixes
+
+4.0.0 / 2017-06-06
+==================
+
+ * drop support for Node.js < 4
+ * drop old versions of Node.js from Travis-CI
+ * specify Node.js >= 4.0.0 in `engines.node`
+ * remove more old code
+ * remove "extend" dependency
+ * remove "semver" dependency
+ * make the Promise logic a bit cleaner
+ * add async function pseudo-example to README
+ * use direct return in README example
+
+3.0.0 / 2017-06-02
+==================
+
+ * drop support for Node.js v0.8 and v0.10
+ * add support for async, Promises, and direct return
+ * add a couple `options` test cases
+ * implement a `"timeout"` option
+ * rename main file to `index.js`
+ * test Node 8 on Travis
+
+2.1.1 / 2017-05-30
+==================
+
+ * Revert [`fe2162e`](https://github.com/TooTallNate/node-agent-base/commit/fe2162e0ba18123f5b301cba4de1e9dd74e437cd) and [`270bdc9`](https://github.com/TooTallNate/node-agent-base/commit/270bdc92eb8e3bd0444d1e5266e8e9390aeb3095) (fixes #7)
+
+2.1.0 / 2017-05-26
+==================
+
+ * unref is not supported for node < 0.9.1 (@pi0)
+ * add tests to dangling socket (@pi0)
+ * check unref() is supported (@pi0)
+ * fix dangling sockets problem (@pi0)
+ * add basic "ws" module tests
+ * make `Agent` be subclassable
+ * turn `addRequest()` into a named function
+ * test: Node.js v4 likes to call `cork` on the stream (#3, @tomhughes)
+ * travis: test node v4, v5, v6 and v7
+
+2.0.1 / 2015-09-10
+==================
+
+ * package: update "semver" to v5.0.1 for WebPack (#1, @vhpoet)
+
+2.0.0 / 2015-07-10
+==================
+
+ * refactor to patch Node.js core for more consistent `opts` values
+ * ensure that HTTP(s) default port numbers are always given
+ * test: use ssl-cert-snakeoil SSL certs
+ * test: add tests for arbitrary options
+ * README: add API section
+ * README: make the Agent HTTP/HTTPS generic in the example
+ * README: use SVG for Travis-CI badge
+
+1.0.2 / 2015-06-27
+==================
+
+ * agent: set `req._hadError` to true after emitting "error"
+ * package: update "mocha" to v2
+ * test: add artificial HTTP GET request test
+ * test: add artificial data events test
+ * test: fix artifical GET response test on node > v0.11.3
+ * test: use a real timeout for the async error test
+
+1.0.1 / 2013-09-09
+==================
+
+ * Fix passing an "error" object to the callback function on the first tick
+
+1.0.0 / 2013-09-09
+==================
+
+ * New API: now you pass a callback function directly
+
+0.0.1 / 2013-07-09
+==================
+
+ * Initial release
diff --git a/deps/npm/node_modules/agent-base/README.md b/deps/npm/node_modules/agent-base/README.md
new file mode 100644
index 00000000000000..dbeceab8a125f6
--- /dev/null
+++ b/deps/npm/node_modules/agent-base/README.md
@@ -0,0 +1,145 @@
+agent-base
+==========
+### Turn a function into an [`http.Agent`][http.Agent] instance
+[![Build Status](https://travis-ci.org/TooTallNate/node-agent-base.svg?branch=master)](https://travis-ci.org/TooTallNate/node-agent-base)
+
+This module provides an `http.Agent` generator. That is, you pass it an async
+callback function, and it returns a new `http.Agent` instance that will invoke the
+given callback function when sending outbound HTTP requests.
+
+#### Some subclasses:
+
+Here's some more interesting uses of `agent-base`.
+Send a pull request to list yours!
+
+ * [`http-proxy-agent`][http-proxy-agent]: An HTTP(s) proxy `http.Agent` implementation for HTTP endpoints
+ * [`https-proxy-agent`][https-proxy-agent]: An HTTP(s) proxy `http.Agent` implementation for HTTPS endpoints
+ * [`pac-proxy-agent`][pac-proxy-agent]: A PAC file proxy `http.Agent` implementation for HTTP and HTTPS
+ * [`socks-proxy-agent`][socks-proxy-agent]: A SOCKS (v4a) proxy `http.Agent` implementation for HTTP and HTTPS
+
+
+Installation
+------------
+
+Install with `npm`:
+
+``` bash
+$ npm install agent-base
+```
+
+
+Example
+-------
+
+Here's a minimal example that creates a new `net.Socket` connection to the server
+for every HTTP request (i.e. the equivalent of `agent: false` option):
+
+```js
+var net = require('net');
+var tls = require('tls');
+var url = require('url');
+var http = require('http');
+var agent = require('agent-base');
+
+var endpoint = 'http://nodejs.org/api/';
+var parsed = url.parse(endpoint);
+
+// This is the important part!
+parsed.agent = agent(function (req, opts) {
+ var socket;
+ // `secureEndpoint` is true when using the https module
+ if (opts.secureEndpoint) {
+ socket = tls.connect(opts);
+ } else {
+ socket = net.connect(opts);
+ }
+ return socket;
+});
+
+// Everything else works just like normal...
+http.get(parsed, function (res) {
+ console.log('"response" event!', res.headers);
+ res.pipe(process.stdout);
+});
+```
+
+Returning a Promise or using an `async` function is also supported:
+
+```js
+agent(async function (req, opts) {
+ await sleep(1000);
+ // etc…
+});
+```
+
+Return another `http.Agent` instance to "pass through" the responsibility
+for that HTTP request to that agent:
+
+```js
+agent(function (req, opts) {
+ return opts.secureEndpoint ? https.globalAgent : http.globalAgent;
+});
+```
+
+
+API
+---
+
+## Agent(Function callback[, Object options]) → [http.Agent][]
+
+Creates a base `http.Agent` that will execute the callback function `callback`
+for every HTTP request that it is used as the `agent` for. The callback function
+is responsible for creating a `stream.Duplex` instance of some kind that will be
+used as the underlying socket in the HTTP request.
+
+The `options` object accepts the following properties:
+
+ * `timeout` - Number - Timeout for the `callback()` function in milliseconds. Defaults to Infinity (optional).
+
+The callback function should have the following signature:
+
+### callback(http.ClientRequest req, Object options, Function cb) → undefined
+
+The ClientRequest `req` can be accessed to read request headers and
+and the path, etc. The `options` object contains the options passed
+to the `http.request()`/`https.request()` function call, and is formatted
+to be directly passed to `net.connect()`/`tls.connect()`, or however
+else you want a Socket to be created. Pass the created socket to
+the callback function `cb` once created, and the HTTP request will
+continue to proceed.
+
+If the `https` module is used to invoke the HTTP request, then the
+`secureEndpoint` property on `options` _will be set to `true`_.
+
+
+License
+-------
+
+(The MIT License)
+
+Copyright (c) 2013 Nathan Rajlich <nathan@tootallnate.net>
+
+Permission is hereby granted, free of charge, to any person obtaining
+a copy of this software and associated documentation files (the
+'Software'), to deal in the Software without restriction, including
+without limitation the rights to use, copy, modify, merge, publish,
+distribute, sublicense, and/or sell copies of the Software, and to
+permit persons to whom the Software is furnished to do so, subject to
+the following conditions:
+
+The above copyright notice and this permission notice shall be
+included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
+EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+[http-proxy-agent]: https://github.com/TooTallNate/node-http-proxy-agent
+[https-proxy-agent]: https://github.com/TooTallNate/node-https-proxy-agent
+[pac-proxy-agent]: https://github.com/TooTallNate/node-pac-proxy-agent
+[socks-proxy-agent]: https://github.com/TooTallNate/node-socks-proxy-agent
+[http.Agent]: https://nodejs.org/api/http.html#http_class_http_agent
diff --git a/deps/npm/node_modules/agent-base/index.js b/deps/npm/node_modules/agent-base/index.js
new file mode 100644
index 00000000000000..b1f42e6317431d
--- /dev/null
+++ b/deps/npm/node_modules/agent-base/index.js
@@ -0,0 +1,160 @@
+'use strict';
+require('./patch-core');
+const inherits = require('util').inherits;
+const promisify = require('es6-promisify');
+const EventEmitter = require('events').EventEmitter;
+
+module.exports = Agent;
+
+function isAgent(v) {
+ return v && typeof v.addRequest === 'function';
+}
+
+/**
+ * Base `http.Agent` implementation.
+ * No pooling/keep-alive is implemented by default.
+ *
+ * @param {Function} callback
+ * @api public
+ */
+function Agent(callback, _opts) {
+ if (!(this instanceof Agent)) {
+ return new Agent(callback, _opts);
+ }
+
+ EventEmitter.call(this);
+
+ // The callback gets promisified lazily if it has 3 parameters
+ // (i.e. it takes a trailing callback function)
+ this._promisifiedCallback = false;
+
+ let opts = _opts;
+ if ('function' === typeof callback) {
+ this.callback = callback;
+ } else if (callback) {
+ opts = callback;
+ }
+
+ // timeout for the socket to be returned from the callback
+ this.timeout = (opts && opts.timeout) || null;
+
+ this.options = opts;
+}
+inherits(Agent, EventEmitter);
+
+/**
+ * Override this function in your subclass!
+ */
+Agent.prototype.callback = function callback(req, opts) {
+ throw new Error(
+ '"agent-base" has no default implementation, you must subclass and override `callback()`'
+ );
+};
+
+/**
+ * Called by node-core's "_http_client.js" module when creating
+ * a new HTTP request with this Agent instance.
+ *
+ * @api public
+ */
+Agent.prototype.addRequest = function addRequest(req, _opts) {
+ const ownOpts = Object.assign({}, _opts);
+
+ // Set default `host` for HTTP to localhost
+ if (null == ownOpts.host) {
+ ownOpts.host = 'localhost';
+ }
+
+ // Set default `port` for HTTP if none was explicitly specified
+ if (null == ownOpts.port) {
+ ownOpts.port = ownOpts.secureEndpoint ? 443 : 80;
+ }
+
+ const opts = Object.assign({}, this.options, ownOpts);
+
+ if (opts.host && opts.path) {
+ // If both a `host` and `path` are specified then it's most likely the
+ // result of a `url.parse()` call... we need to remove the `path` portion so
+ // that `net.connect()` doesn't attempt to open that as a unix socket file.
+ delete opts.path;
+ }
+
+ delete opts.agent;
+ delete opts.hostname;
+ delete opts._defaultAgent;
+ delete opts.defaultPort;
+ delete opts.createConnection;
+
+ // Hint to use "Connection: close"
+ // XXX: non-documented `http` module API :(
+ req._last = true;
+ req.shouldKeepAlive = false;
+
+ // Create the `stream.Duplex` instance
+ let timeout;
+ let timedOut = false;
+ const timeoutMs = this.timeout;
+
+ function onerror(err) {
+ if (req._hadError) return;
+ req.emit('error', err);
+ // For Safety. Some additional errors might fire later on
+ // and we need to make sure we don't double-fire the error event.
+ req._hadError = true;
+ }
+
+ function ontimeout() {
+ timeout = null;
+ timedOut = true;
+ const err = new Error(
+ 'A "socket" was not created for HTTP request before ' + timeoutMs + 'ms'
+ );
+ err.code = 'ETIMEOUT';
+ onerror(err);
+ }
+
+ function callbackError(err) {
+ if (timedOut) return;
+ if (timeout != null) {
+ clearTimeout(timeout);
+ timeout = null;
+ }
+ onerror(err);
+ }
+
+ function onsocket(socket) {
+ if (timedOut) return;
+ if (timeout != null) {
+ clearTimeout(timeout);
+ timeout = null;
+ }
+ if (isAgent(socket)) {
+ // `socket` is actually an http.Agent instance, so relinquish
+ // responsibility for this `req` to the Agent from here on
+ socket.addRequest(req, opts);
+ } else if (socket) {
+ req.onSocket(socket);
+ } else {
+ const err = new Error(
+ `no Duplex stream was returned to agent-base for \`${req.method} ${req.path}\``
+ );
+ onerror(err);
+ }
+ }
+
+ if (!this._promisifiedCallback && this.callback.length >= 3) {
+ // Legacy callback function - convert to a Promise
+ this.callback = promisify(this.callback, this);
+ this._promisifiedCallback = true;
+ }
+
+ if (timeoutMs > 0) {
+ timeout = setTimeout(ontimeout, timeoutMs);
+ }
+
+ try {
+ Promise.resolve(this.callback(req, opts)).then(onsocket, callbackError);
+ } catch (err) {
+ Promise.reject(err).catch(callbackError);
+ }
+};
diff --git a/deps/npm/node_modules/agent-base/package.json b/deps/npm/node_modules/agent-base/package.json
new file mode 100644
index 00000000000000..59c5e0be25956a
--- /dev/null
+++ b/deps/npm/node_modules/agent-base/package.json
@@ -0,0 +1,69 @@
+{
+ "_from": "agent-base@4",
+ "_id": "agent-base@4.2.0",
+ "_inBundle": false,
+ "_integrity": "sha512-c+R/U5X+2zz2+UCrCFv6odQzJdoqI+YecuhnAJLa1zYaMc13zPfwMwZrr91Pd1DYNo/yPRbiM4WVf9whgwFsIg==",
+ "_location": "/agent-base",
+ "_phantomChildren": {},
+ "_requested": {
+ "type": "range",
+ "registry": true,
+ "raw": "agent-base@4",
+ "name": "agent-base",
+ "escapedName": "agent-base",
+ "rawSpec": "4",
+ "saveSpec": null,
+ "fetchSpec": "4"
+ },
+ "_requiredBy": [
+ "/http-proxy-agent",
+ "/https-proxy-agent",
+ "/npm-profile/socks-proxy-agent",
+ "/npm-registry-fetch/socks-proxy-agent",
+ "/socks-proxy-agent"
+ ],
+ "_resolved": "https://registry.npmjs.org/agent-base/-/agent-base-4.2.0.tgz",
+ "_shasum": "9838b5c3392b962bad031e6a4c5e1024abec45ce",
+ "_spec": "agent-base@4",
+ "_where": "/Users/rebecca/code/npm/node_modules/http-proxy-agent",
+ "author": {
+ "name": "Nathan Rajlich",
+ "email": "nathan@tootallnate.net",
+ "url": "http://n8.io/"
+ },
+ "bugs": {
+ "url": "https://github.com/TooTallNate/node-agent-base/issues"
+ },
+ "bundleDependencies": false,
+ "dependencies": {
+ "es6-promisify": "^5.0.0"
+ },
+ "deprecated": false,
+ "description": "Turn a function into an `http.Agent` instance",
+ "devDependencies": {
+ "mocha": "^3.4.2",
+ "ws": "^3.0.0"
+ },
+ "engines": {
+ "node": ">= 4.0.0"
+ },
+ "homepage": "https://github.com/TooTallNate/node-agent-base#readme",
+ "keywords": [
+ "http",
+ "agent",
+ "base",
+ "barebones",
+ "https"
+ ],
+ "license": "MIT",
+ "main": "./index.js",
+ "name": "agent-base",
+ "repository": {
+ "type": "git",
+ "url": "git://github.com/TooTallNate/node-agent-base.git"
+ },
+ "scripts": {
+ "test": "mocha --reporter spec"
+ },
+ "version": "4.2.0"
+}
diff --git a/deps/npm/node_modules/agent-base/patch-core.js b/deps/npm/node_modules/agent-base/patch-core.js
new file mode 100644
index 00000000000000..47d26a72b0a65e
--- /dev/null
+++ b/deps/npm/node_modules/agent-base/patch-core.js
@@ -0,0 +1,37 @@
+'use strict';
+const url = require('url');
+const https = require('https');
+
+/**
+ * This currently needs to be applied to all Node.js versions
+ * in order to determine if the `req` is an HTTP or HTTPS request.
+ *
+ * There is currently no PR attempting to move this property upstream.
+ */
+https.request = (function(request) {
+ return function(_options, cb) {
+ let options;
+ if (typeof _options === 'string') {
+ options = url.parse(_options);
+ } else {
+ options = Object.assign({}, _options);
+ }
+ if (null == options.port) {
+ options.port = 443;
+ }
+ options.secureEndpoint = true;
+ return request.call(https, options, cb);
+ };
+})(https.request);
+
+/**
+ * This is needed for Node.js >= 9.0.0 to make sure `https.get()` uses the
+ * patched `https.request()`.
+ *
+ * Ref: https://github.com/nodejs/node/commit/5118f31
+ */
+https.get = function(options, cb) {
+ const req = https.request(options, cb);
+ req.end();
+ return req;
+};
diff --git a/deps/npm/node_modules/npm-profile/node_modules/make-fetch-happen/node_modules/http-proxy-agent/node_modules/agent-base/test/ssl-cert-snakeoil.key b/deps/npm/node_modules/agent-base/test/ssl-cert-snakeoil.key
similarity index 100%
rename from deps/npm/node_modules/npm-profile/node_modules/make-fetch-happen/node_modules/http-proxy-agent/node_modules/agent-base/test/ssl-cert-snakeoil.key
rename to deps/npm/node_modules/agent-base/test/ssl-cert-snakeoil.key
diff --git a/deps/npm/node_modules/npm-profile/node_modules/make-fetch-happen/node_modules/http-proxy-agent/node_modules/agent-base/test/ssl-cert-snakeoil.pem b/deps/npm/node_modules/agent-base/test/ssl-cert-snakeoil.pem
similarity index 100%
rename from deps/npm/node_modules/npm-profile/node_modules/make-fetch-happen/node_modules/http-proxy-agent/node_modules/agent-base/test/ssl-cert-snakeoil.pem
rename to deps/npm/node_modules/agent-base/test/ssl-cert-snakeoil.pem
diff --git a/deps/npm/node_modules/agent-base/test/test.js b/deps/npm/node_modules/agent-base/test/test.js
new file mode 100644
index 00000000000000..da2e91983548b4
--- /dev/null
+++ b/deps/npm/node_modules/agent-base/test/test.js
@@ -0,0 +1,673 @@
+/**
+ * Module dependencies.
+ */
+
+var fs = require('fs');
+var url = require('url');
+var net = require('net');
+var tls = require('tls');
+var http = require('http');
+var https = require('https');
+var WebSocket = require('ws');
+var assert = require('assert');
+var events = require('events');
+var inherits = require('util').inherits;
+var Agent = require('../');
+
+var PassthroughAgent = Agent(function(req, opts) {
+ return opts.secureEndpoint ? https.globalAgent : http.globalAgent;
+});
+
+describe('Agent', function() {
+ describe('subclass', function() {
+ it('should be subclassable', function(done) {
+ function MyAgent() {
+ Agent.call(this);
+ }
+ inherits(MyAgent, Agent);
+
+ MyAgent.prototype.callback = function(req, opts, fn) {
+ assert.equal(req.path, '/foo');
+ assert.equal(req.getHeader('host'), '127.0.0.1:1234');
+ assert.equal(opts.secureEndpoint, true);
+ done();
+ };
+
+ var info = url.parse('https://127.0.0.1:1234/foo');
+ info.agent = new MyAgent();
+ https.get(info);
+ });
+ });
+ describe('options', function() {
+ it('should support an options Object as first argument', function() {
+ var agent = new Agent({ timeout: 1000 });
+ assert.equal(1000, agent.timeout);
+ });
+ it('should support an options Object as second argument', function() {
+ var agent = new Agent(function() {}, { timeout: 1000 });
+ assert.equal(1000, agent.timeout);
+ });
+ it('should be mixed in with HTTP request options', function(done) {
+ var agent = new Agent({
+ host: 'my-proxy.com',
+ port: 3128,
+ foo: 'bar'
+ });
+ agent.callback = function(req, opts, fn) {
+ assert.equal('bar', opts.foo);
+ assert.equal('a', opts.b);
+
+ // `host` and `port` are special-cases, and should always be
+ // overwritten in the request `opts` inside the agent-base callback
+ assert.equal('localhost', opts.host);
+ assert.equal(80, opts.port);
+ done();
+ };
+ var opts = {
+ b: 'a',
+ agent: agent
+ };
+ http.get(opts);
+ });
+ });
+ describe('`this` context', function() {
+ it('should be the Agent instance', function(done) {
+ var called = false;
+ var agent = new Agent();
+ agent.callback = function() {
+ called = true;
+ assert.equal(this, agent);
+ };
+ var info = url.parse('http://127.0.0.1/foo');
+ info.agent = agent;
+ var req = http.get(info);
+ req.on('error', function(err) {
+ assert(/no Duplex stream was returned/.test(err.message));
+ done();
+ });
+ });
+ it('should be the Agent instance with callback signature', function(done) {
+ var called = false;
+ var agent = new Agent();
+ agent.callback = function(req, opts, fn) {
+ called = true;
+ assert.equal(this, agent);
+ fn();
+ };
+ var info = url.parse('http://127.0.0.1/foo');
+ info.agent = agent;
+ var req = http.get(info);
+ req.on('error', function(err) {
+ assert(/no Duplex stream was returned/.test(err.message));
+ done();
+ });
+ });
+ });
+ describe('"error" event', function() {
+ it('should be invoked on `http.ClientRequest` instance if `callback()` has not been defined', function(
+ done
+ ) {
+ var agent = new Agent();
+ var info = url.parse('http://127.0.0.1/foo');
+ info.agent = agent;
+ var req = http.get(info);
+ req.on('error', function(err) {
+ assert.equal(
+ '"agent-base" has no default implementation, you must subclass and override `callback()`',
+ err.message
+ );
+ done();
+ });
+ });
+ it('should be invoked on `http.ClientRequest` instance if Error passed to callback function on the first tick', function(
+ done
+ ) {
+ var agent = new Agent(function(req, opts, fn) {
+ fn(new Error('is this caught?'));
+ });
+ var info = url.parse('http://127.0.0.1/foo');
+ info.agent = agent;
+ var req = http.get(info);
+ req.on('error', function(err) {
+ assert.equal('is this caught?', err.message);
+ done();
+ });
+ });
+ it('should be invoked on `http.ClientRequest` instance if Error passed to callback function after the first tick', function(
+ done
+ ) {
+ var agent = new Agent(function(req, opts, fn) {
+ setTimeout(function() {
+ fn(new Error('is this caught?'));
+ }, 10);
+ });
+ var info = url.parse('http://127.0.0.1/foo');
+ info.agent = agent;
+ var req = http.get(info);
+ req.on('error', function(err) {
+ assert.equal('is this caught?', err.message);
+ done();
+ });
+ });
+ });
+ describe('artificial "streams"', function() {
+ it('should send a GET request', function(done) {
+ var stream = new events.EventEmitter();
+
+ // needed for the `http` module to call .write() on the stream
+ stream.writable = true;
+
+ stream.write = function(str) {
+ assert(0 == str.indexOf('GET / HTTP/1.1'));
+ done();
+ };
+
+ // needed for `http` module in Node.js 4
+ stream.cork = function() {};
+
+ var opts = {
+ method: 'GET',
+ host: '127.0.0.1',
+ path: '/',
+ port: 80,
+ agent: new Agent(function(req, opts, fn) {
+ fn(null, stream);
+ })
+ };
+ var req = http.request(opts);
+ req.end();
+ });
+ it('should receive a GET response', function(done) {
+ var stream = new events.EventEmitter();
+ var opts = {
+ method: 'GET',
+ host: '127.0.0.1',
+ path: '/',
+ port: 80,
+ agent: new Agent(function(req, opts, fn) {
+ fn(null, stream);
+ })
+ };
+ var req = http.request(opts, function(res) {
+ assert.equal('0.9', res.httpVersion);
+ assert.equal(111, res.statusCode);
+ assert.equal('bar', res.headers.foo);
+ done();
+ });
+
+ // have to wait for the "socket" event since `http.ClientRequest`
+ // doesn't *actually* attach the listeners to the "stream" until
+ // this happens
+ req.once('socket', function() {
+ var buf = Buffer.from(
+ 'HTTP/0.9 111\r\n' +
+ 'Foo: bar\r\n' +
+ 'Set-Cookie: 1\r\n' +
+ 'Set-Cookie: 2\r\n\r\n'
+ );
+ stream.emit('data', buf);
+ });
+
+ req.end();
+ });
+ });
+});
+
+describe('"http" module', function() {
+ var server;
+ var port;
+
+ // setup test HTTP server
+ before(function(done) {
+ server = http.createServer();
+ server.listen(0, function() {
+ port = server.address().port;
+ done();
+ });
+ });
+
+ // shut down test HTTP server
+ after(function(done) {
+ server.once('close', function() {
+ done();
+ });
+ server.close();
+ });
+
+ it('should work for basic HTTP requests', function(done) {
+ var called = false;
+ var agent = new Agent(function(req, opts, fn) {
+ called = true;
+ var socket = net.connect(opts);
+ fn(null, socket);
+ });
+
+ // add HTTP server "request" listener
+ var gotReq = false;
+ server.once('request', function(req, res) {
+ gotReq = true;
+ res.setHeader('X-Foo', 'bar');
+ res.setHeader('X-Url', req.url);
+ res.end();
+ });
+
+ var info = url.parse('http://127.0.0.1:' + port + '/foo');
+ info.agent = agent;
+ http.get(info, function(res) {
+ assert.equal('bar', res.headers['x-foo']);
+ assert.equal('/foo', res.headers['x-url']);
+ assert(gotReq);
+ assert(called);
+ done();
+ });
+ });
+
+ it('should support direct return in `connect()`', function(done) {
+ var called = false;
+ var agent = new Agent(function(req, opts) {
+ called = true;
+ return net.connect(opts);
+ });
+
+ // add HTTP server "request" listener
+ var gotReq = false;
+ server.once('request', function(req, res) {
+ gotReq = true;
+ res.setHeader('X-Foo', 'bar');
+ res.setHeader('X-Url', req.url);
+ res.end();
+ });
+
+ var info = url.parse('http://127.0.0.1:' + port + '/foo');
+ info.agent = agent;
+ http.get(info, function(res) {
+ assert.equal('bar', res.headers['x-foo']);
+ assert.equal('/foo', res.headers['x-url']);
+ assert(gotReq);
+ assert(called);
+ done();
+ });
+ });
+
+ it('should support returning a Promise in `connect()`', function(done) {
+ var called = false;
+ var agent = new Agent(function(req, opts) {
+ return new Promise(function(resolve, reject) {
+ called = true;
+ resolve(net.connect(opts));
+ });
+ });
+
+ // add HTTP server "request" listener
+ var gotReq = false;
+ server.once('request', function(req, res) {
+ gotReq = true;
+ res.setHeader('X-Foo', 'bar');
+ res.setHeader('X-Url', req.url);
+ res.end();
+ });
+
+ var info = url.parse('http://127.0.0.1:' + port + '/foo');
+ info.agent = agent;
+ http.get(info, function(res) {
+ assert.equal('bar', res.headers['x-foo']);
+ assert.equal('/foo', res.headers['x-url']);
+ assert(gotReq);
+ assert(called);
+ done();
+ });
+ });
+
+ it('should set the `Connection: close` response header', function(done) {
+ var called = false;
+ var agent = new Agent(function(req, opts, fn) {
+ called = true;
+ var socket = net.connect(opts);
+ fn(null, socket);
+ });
+
+ // add HTTP server "request" listener
+ var gotReq = false;
+ server.once('request', function(req, res) {
+ gotReq = true;
+ res.setHeader('X-Url', req.url);
+ assert.equal('close', req.headers.connection);
+ res.end();
+ });
+
+ var info = url.parse('http://127.0.0.1:' + port + '/bar');
+ info.agent = agent;
+ http.get(info, function(res) {
+ assert.equal('/bar', res.headers['x-url']);
+ assert.equal('close', res.headers.connection);
+ assert(gotReq);
+ assert(called);
+ done();
+ });
+ });
+
+ it('should pass through options from `http.request()`', function(done) {
+ var agent = new Agent(function(req, opts, fn) {
+ assert.equal('google.com', opts.host);
+ assert.equal('bar', opts.foo);
+ done();
+ });
+
+ http.get({
+ host: 'google.com',
+ foo: 'bar',
+ agent: agent
+ });
+ });
+
+ it('should default to port 80', function(done) {
+ var agent = new Agent(function(req, opts, fn) {
+ assert.equal(80, opts.port);
+ done();
+ });
+
+ // (probably) not hitting a real HTTP server here,
+ // so no need to add a httpServer request listener
+ http.get({
+ host: '127.0.0.1',
+ path: '/foo',
+ agent: agent
+ });
+ });
+
+ it('should support the "timeout" option', function(done) {
+ // ensure we timeout after the "error" event had a chance to trigger
+ this.timeout(1000);
+ this.slow(800);
+
+ var agent = new Agent(
+ function(req, opts, fn) {
+ // this function will time out
+ },
+ { timeout: 100 }
+ );
+
+ var opts = url.parse('http://nodejs.org');
+ opts.agent = agent;
+
+ var req = http.get(opts);
+ req.once('error', function(err) {
+ assert.equal('ETIMEOUT', err.code);
+ req.abort();
+ done();
+ });
+ });
+
+ describe('PassthroughAgent', function() {
+ it('should pass through to `http.globalAgent`', function(done) {
+ // add HTTP server "request" listener
+ var gotReq = false;
+ server.once('request', function(req, res) {
+ gotReq = true;
+ res.setHeader('X-Foo', 'bar');
+ res.setHeader('X-Url', req.url);
+ res.end();
+ });
+
+ var info = url.parse('http://127.0.0.1:' + port + '/foo');
+ info.agent = PassthroughAgent;
+ http.get(info, function(res) {
+ assert.equal('bar', res.headers['x-foo']);
+ assert.equal('/foo', res.headers['x-url']);
+ assert(gotReq);
+ done();
+ });
+ });
+ });
+});
+
+describe('"https" module', function() {
+ var server;
+ var port;
+
+ // setup test HTTPS server
+ before(function(done) {
+ var options = {
+ key: fs.readFileSync(__dirname + '/ssl-cert-snakeoil.key'),
+ cert: fs.readFileSync(__dirname + '/ssl-cert-snakeoil.pem')
+ };
+ server = https.createServer(options);
+ server.listen(0, function() {
+ port = server.address().port;
+ done();
+ });
+ });
+
+ // shut down test HTTP server
+ after(function(done) {
+ server.once('close', function() {
+ done();
+ });
+ server.close();
+ });
+
+ it('should not modify the passed in Options object', function(done) {
+ var called = false;
+ var agent = new Agent(function(req, opts, fn) {
+ called = true;
+ assert.equal(true, opts.secureEndpoint);
+ assert.equal(443, opts.port);
+ assert.equal('localhost', opts.host);
+ });
+ var opts = { agent: agent };
+ var req = https.request(opts);
+ assert.equal(true, called);
+ assert.equal(false, 'secureEndpoint' in opts);
+ assert.equal(false, 'port' in opts);
+ done();
+ });
+
+ it('should work with a String URL', function(done) {
+ var endpoint = 'https://127.0.0.1:' + port;
+ var req = https.get(endpoint);
+
+ // it's gonna error out since `rejectUnauthorized` is not being passed in
+ req.on('error', function(err) {
+ assert.equal(err.code, 'DEPTH_ZERO_SELF_SIGNED_CERT');
+ done();
+ });
+ });
+
+ it('should work for basic HTTPS requests', function(done) {
+ var called = false;
+ var agent = new Agent(function(req, opts, fn) {
+ called = true;
+ assert(opts.secureEndpoint);
+ var socket = tls.connect(opts);
+ fn(null, socket);
+ });
+
+ // add HTTPS server "request" listener
+ var gotReq = false;
+ server.once('request', function(req, res) {
+ gotReq = true;
+ res.setHeader('X-Foo', 'bar');
+ res.setHeader('X-Url', req.url);
+ res.end();
+ });
+
+ var info = url.parse('https://127.0.0.1:' + port + '/foo');
+ info.agent = agent;
+ info.rejectUnauthorized = false;
+ https.get(info, function(res) {
+ assert.equal('bar', res.headers['x-foo']);
+ assert.equal('/foo', res.headers['x-url']);
+ assert(gotReq);
+ assert(called);
+ done();
+ });
+ });
+
+ it('should pass through options from `https.request()`', function(done) {
+ var agent = new Agent(function(req, opts, fn) {
+ assert.equal('google.com', opts.host);
+ assert.equal('bar', opts.foo);
+ done();
+ });
+
+ https.get({
+ host: 'google.com',
+ foo: 'bar',
+ agent: agent
+ });
+ });
+
+ it('should default to port 443', function(done) {
+ var agent = new Agent(function(req, opts, fn) {
+ assert.equal(true, opts.secureEndpoint);
+ assert.equal(false, opts.rejectUnauthorized);
+ assert.equal(443, opts.port);
+ done();
+ });
+
+ // (probably) not hitting a real HTTPS server here,
+ // so no need to add a httpsServer request listener
+ https.get({
+ host: '127.0.0.1',
+ path: '/foo',
+ agent: agent,
+ rejectUnauthorized: false
+ });
+ });
+
+ describe('PassthroughAgent', function() {
+ it('should pass through to `https.globalAgent`', function(done) {
+ // add HTTP server "request" listener
+ var gotReq = false;
+ server.once('request', function(req, res) {
+ gotReq = true;
+ res.setHeader('X-Foo', 'bar');
+ res.setHeader('X-Url', req.url);
+ res.end();
+ });
+
+ var info = url.parse('https://127.0.0.1:' + port + '/foo');
+ info.agent = PassthroughAgent;
+ info.rejectUnauthorized = false;
+ https.get(info, function(res) {
+ assert.equal('bar', res.headers['x-foo']);
+ assert.equal('/foo', res.headers['x-url']);
+ assert(gotReq);
+ done();
+ });
+ });
+ });
+});
+
+describe('"ws" server', function() {
+ var wss;
+ var server;
+ var port;
+
+ // setup test HTTP server
+ before(function(done) {
+ server = http.createServer();
+ wss = new WebSocket.Server({ server: server });
+ server.listen(0, function() {
+ port = server.address().port;
+ done();
+ });
+ });
+
+ // shut down test HTTP server
+ after(function(done) {
+ server.once('close', function() {
+ done();
+ });
+ server.close();
+ });
+
+ it('should work for basic WebSocket connections', function(done) {
+ function onconnection(ws) {
+ ws.on('message', function(data) {
+ assert.equal('ping', data);
+ ws.send('pong');
+ });
+ }
+ wss.on('connection', onconnection);
+
+ var agent = new Agent(function(req, opts, fn) {
+ var socket = net.connect(opts);
+ fn(null, socket);
+ });
+
+ var client = new WebSocket('ws://127.0.0.1:' + port + '/', {
+ agent: agent
+ });
+
+ client.on('open', function() {
+ client.send('ping');
+ });
+
+ client.on('message', function(data) {
+ assert.equal('pong', data);
+ client.close();
+ wss.removeListener('connection', onconnection);
+ done();
+ });
+ });
+});
+
+describe('"wss" server', function() {
+ var wss;
+ var server;
+ var port;
+
+ // setup test HTTP server
+ before(function(done) {
+ var options = {
+ key: fs.readFileSync(__dirname + '/ssl-cert-snakeoil.key'),
+ cert: fs.readFileSync(__dirname + '/ssl-cert-snakeoil.pem')
+ };
+ server = https.createServer(options);
+ wss = new WebSocket.Server({ server: server });
+ server.listen(0, function() {
+ port = server.address().port;
+ done();
+ });
+ });
+
+ // shut down test HTTP server
+ after(function(done) {
+ server.once('close', function() {
+ done();
+ });
+ server.close();
+ });
+
+ it('should work for secure WebSocket connections', function(done) {
+ function onconnection(ws) {
+ ws.on('message', function(data) {
+ assert.equal('ping', data);
+ ws.send('pong');
+ });
+ }
+ wss.on('connection', onconnection);
+
+ var agent = new Agent(function(req, opts, fn) {
+ var socket = tls.connect(opts);
+ fn(null, socket);
+ });
+
+ var client = new WebSocket('wss://127.0.0.1:' + port + '/', {
+ agent: agent,
+ rejectUnauthorized: false
+ });
+
+ client.on('open', function() {
+ client.send('ping');
+ });
+
+ client.on('message', function(data) {
+ assert.equal('pong', data);
+ client.close();
+ wss.removeListener('connection', onconnection);
+ done();
+ });
+ });
+});
diff --git a/deps/npm/node_modules/agentkeepalive/History.md b/deps/npm/node_modules/agentkeepalive/History.md
new file mode 100644
index 00000000000000..da67a1c4f6f94e
--- /dev/null
+++ b/deps/npm/node_modules/agentkeepalive/History.md
@@ -0,0 +1,148 @@
+
+3.4.1 / 2018-03-08
+==================
+
+**fixes**
+ * [[`4d3a3b1`](http://github.com/node-modules/agentkeepalive/commit/4d3a3b1f7b16595febbbd39eeed72b2663549014)] - fix: Handle ipv6 addresses in host-header correctly with TLS (#53) (Mattias Holmlund <>)
+
+**others**
+ * [[`55a7a5c`](http://github.com/node-modules/agentkeepalive/commit/55a7a5cd33e97f9a8370083dcb041c5552f10ac9)] - test: stop timer after test end (fengmk2 <>)
+
+3.4.0 / 2018-02-27
+==================
+
+**features**
+ * [[`bc7cadb`](http://github.com/node-modules/agentkeepalive/commit/bc7cadb30ecd2071e2b341ac53ae1a2b8155c43d)] - feat: use socket custom freeSocketKeepAliveTimeout first (#59) (fengmk2 <>)
+
+**others**
+ * [[`138eda8`](http://github.com/node-modules/agentkeepalive/commit/138eda81e10b632aaa87bea0cb66d8667124c4e8)] - doc: fix `keepAliveMsecs` params description (#55) (Hongcai Deng <>)
+
+3.3.0 / 2017-06-20
+==================
+
+ * feat: add statusChanged getter (#51)
+ * chore: format License
+
+3.2.0 / 2017-06-10
+==================
+
+ * feat: add expiring active sockets
+ * test: add node 8 (#49)
+
+3.1.0 / 2017-02-20
+==================
+
+ * feat: timeout support humanize ms (#48)
+
+3.0.0 / 2016-12-20
+==================
+
+ * fix: emit agent socket close event
+ * test: add remove excess calls to removeSocket
+ * test: use egg-ci
+ * test: refactor test with eslint rules
+ * feat: merge _http_agent.js from 7.2.1
+
+2.2.0 / 2016-06-26
+==================
+
+ * feat: Add browser shim (noop) for isomorphic use. (#39)
+ * chore: add security check badge
+
+2.1.1 / 2016-04-06
+==================
+
+ * https: fix ssl socket leak when keepalive is used
+ * chore: remove circle ci image
+
+2.1.0 / 2016-04-02
+==================
+
+ * fix: opened sockets number overflow maxSockets
+
+2.0.5 / 2016-03-16
+==================
+
+ * fix: pick _evictSession to httpsAgent
+
+2.0.4 / 2016-03-13
+==================
+
+ * test: add Circle ci
+ * test: add appveyor ci build
+ * refactor: make sure only one error listener
+ * chore: use codecov
+ * fix: handle idle socket error
+ * test: run on more node versions
+
+2.0.3 / 2015-08-03
+==================
+
+ * fix: add default error handler to avoid Unhandled error event throw
+
+2.0.2 / 2015-04-25
+==================
+
+ * fix: remove socket from freeSockets on 'timeout' (@pmalouin)
+
+2.0.1 / 2015-04-19
+==================
+
+ * fix: add timeoutSocketCount to getCurrentStatus()
+ * feat(getCurrentStatus): add getCurrentStatus
+
+2.0.0 / 2015-04-01
+==================
+
+ * fix: socket.destroyed always be undefined on 0.10.x
+ * Make it compatible with node v0.10.x (@lattmann)
+
+1.2.1 / 2015-03-23
+==================
+
+ * patch from iojs: don't overwrite servername option
+ * patch commits from joyent/node
+ * add max sockets test case
+ * add nagle algorithm delayed link
+
+1.2.0 / 2014-09-02
+==================
+
+ * allow set keepAliveTimeout = 0
+ * support timeout on working socket. fixed #6
+
+1.1.0 / 2014-08-28
+==================
+
+ * add some socket counter for deep monitor
+
+1.0.0 / 2014-08-13
+==================
+
+ * update _http_agent, only support 0.11+, only support node 0.11.0+
+
+0.2.2 / 2013-11-19
+==================
+
+ * support node 0.8 and node 0.10
+
+0.2.1 / 2013-11-08
+==================
+
+ * fix socket does not timeout bug, it will hang on life, must use 0.2.x on node 0.11
+
+0.2.0 / 2013-11-06
+==================
+
+ * use keepalive agent on node 0.11+ impl
+
+0.1.5 / 2013-06-24
+==================
+
+ * support coveralls
+ * add node 0.10 test
+ * add 0.8.22 original https.js
+ * add original http.js module to diff
+ * update jscover
+ * mv pem to fixtures
+ * add https agent usage
diff --git a/deps/npm/node_modules/agentkeepalive/README.md b/deps/npm/node_modules/agentkeepalive/README.md
new file mode 100644
index 00000000000000..ce067f10c7fb7a
--- /dev/null
+++ b/deps/npm/node_modules/agentkeepalive/README.md
@@ -0,0 +1,248 @@
+# agentkeepalive
+
+[![NPM version][npm-image]][npm-url]
+[![build status][travis-image]][travis-url]
+[![Appveyor status][appveyor-image]][appveyor-url]
+[![Test coverage][codecov-image]][codecov-url]
+[![David deps][david-image]][david-url]
+[![Known Vulnerabilities][snyk-image]][snyk-url]
+[![npm download][download-image]][download-url]
+
+[npm-image]: https://img.shields.io/npm/v/agentkeepalive.svg?style=flat
+[npm-url]: https://npmjs.org/package/agentkeepalive
+[travis-image]: https://img.shields.io/travis/node-modules/agentkeepalive.svg?style=flat
+[travis-url]: https://travis-ci.org/node-modules/agentkeepalive
+[appveyor-image]: https://ci.appveyor.com/api/projects/status/k7ct4s47di6m5uy2?svg=true
+[appveyor-url]: https://ci.appveyor.com/project/fengmk2/agentkeepalive
+[codecov-image]: https://codecov.io/gh/node-modules/agentkeepalive/branch/master/graph/badge.svg
+[codecov-url]: https://codecov.io/gh/node-modules/agentkeepalive
+[david-image]: https://img.shields.io/david/node-modules/agentkeepalive.svg?style=flat
+[david-url]: https://david-dm.org/node-modules/agentkeepalive
+[snyk-image]: https://snyk.io/test/npm/agentkeepalive/badge.svg?style=flat-square
+[snyk-url]: https://snyk.io/test/npm/agentkeepalive
+[download-image]: https://img.shields.io/npm/dm/agentkeepalive.svg?style=flat-square
+[download-url]: https://npmjs.org/package/agentkeepalive
+
+Node.js's missing keep-alive `http.Agent`. Supports both `http` and `https`.
+
+## What's different from original `http.Agent`?
+
+- `keepAlive=true` by default
+- Disable Nagle's algorithm: `socket.setNoDelay(true)`
+- Add free socket timeout: avoid leaking inactive sockets in the free-sockets queue.
+- Add active socket timeout: avoid leaking inactive sockets in the active-sockets queue.
+
+## Install
+
+```bash
+$ npm install agentkeepalive --save
+```
+
+## new Agent([options])
+
+* `options` {Object} Set of configurable options to set on the agent.
+ Can have the following fields:
+ * `keepAlive` {Boolean} Keep sockets around in a pool to be used by
+ other requests in the future. Default = `true`.
+  * `keepAliveMsecs` {Number} When using the keepAlive option, specifies the initial delay
+    for TCP Keep-Alive packets. Ignored when the keepAlive option is false or undefined.
+    Default = `1000`. Only relevant if `keepAlive` is set to `true`.
+ * `freeSocketKeepAliveTimeout`: {Number} Sets the free socket to timeout
+ after `freeSocketKeepAliveTimeout` milliseconds of inactivity on the free socket.
+ Default is `15000`.
+ Only relevant if `keepAlive` is set to `true`.
+ * `timeout`: {Number} Sets the working socket to timeout
+ after `timeout` milliseconds of inactivity on the working socket.
+ Default is `freeSocketKeepAliveTimeout * 2`.
+ * `maxSockets` {Number} Maximum number of sockets to allow per
+ host. Default = `Infinity`.
+ * `maxFreeSockets` {Number} Maximum number of sockets to leave open
+ in a free state. Only relevant if `keepAlive` is set to `true`.
+ Default = `256`.
+  * `socketActiveTTL` {Number} Sets the socket active time to live, even if it's in use.
+    If not set, the behaviour is unchanged (the socket is released only when free).
+    Default = `null`.
+
+## Usage
+
+```js
+const http = require('http');
+const Agent = require('agentkeepalive');
+
+const keepaliveAgent = new Agent({
+ maxSockets: 100,
+ maxFreeSockets: 10,
+ timeout: 60000,
+ freeSocketKeepAliveTimeout: 30000, // free socket keepalive for 30 seconds
+});
+
+const options = {
+ host: 'cnodejs.org',
+ port: 80,
+ path: '/',
+ method: 'GET',
+ agent: keepaliveAgent,
+};
+
+const req = http.request(options, res => {
+ console.log('STATUS: ' + res.statusCode);
+ console.log('HEADERS: ' + JSON.stringify(res.headers));
+ res.setEncoding('utf8');
+ res.on('data', function (chunk) {
+ console.log('BODY: ' + chunk);
+ });
+});
+req.on('error', e => {
+ console.log('problem with request: ' + e.message);
+});
+req.end();
+
+setTimeout(() => {
+ if (keepaliveAgent.statusChanged) {
+ console.log('[%s] agent status changed: %j', Date(), keepaliveAgent.getCurrentStatus());
+ }
+}, 2000);
+
+```
+
+### `getter agent.statusChanged`
+
+Returns whether the counters have changed since the last checkpoint.
+
+### `agent.getCurrentStatus()`
+
+`agent.getCurrentStatus()` returns an object describing the status of this agent:
+
+```js
+{
+ createSocketCount: 10,
+ closeSocketCount: 5,
+ timeoutSocketCount: 0,
+ requestCount: 5,
+ freeSockets: { 'localhost:57479:': 3 },
+ sockets: { 'localhost:57479:': 5 },
+ requests: {}
+}
+```
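The counters above are cumulative, so `statusChanged` is useful for periodic monitoring. For illustration only, a dependency-free sketch of how the deltas between two `getCurrentStatus()` snapshots could be computed (`diffStatus` is a hypothetical helper, not part of this package):

```javascript
// Hypothetical helper: report how much each cumulative counter grew
// between two getCurrentStatus() snapshots.
function diffStatus(prev, next) {
  const counters = ['createSocketCount', 'closeSocketCount', 'timeoutSocketCount', 'requestCount'];
  const delta = {};
  for (const key of counters) {
    delta[key] = (next[key] || 0) - (prev[key] || 0);
  }
  return delta;
}

// Example snapshot shapes taken from the README above.
const before = { createSocketCount: 10, closeSocketCount: 5, timeoutSocketCount: 0, requestCount: 5 };
const after = { createSocketCount: 14, closeSocketCount: 6, timeoutSocketCount: 1, requestCount: 9 };
console.log(diffStatus(before, after));
// { createSocketCount: 4, closeSocketCount: 1, timeoutSocketCount: 1, requestCount: 4 }
```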
+
+### Support `https`
+
+```js
+const https = require('https');
+const HttpsAgent = require('agentkeepalive').HttpsAgent;
+
+const keepaliveAgent = new HttpsAgent();
+// https://www.google.com/search?q=nodejs&sugexp=chrome,mod=12&sourceid=chrome&ie=UTF-8
+const options = {
+ host: 'www.google.com',
+ port: 443,
+ path: '/search?q=nodejs&sugexp=chrome,mod=12&sourceid=chrome&ie=UTF-8',
+ method: 'GET',
+ agent: keepaliveAgent,
+};
+
+const req = https.request(options, res => {
+ console.log('STATUS: ' + res.statusCode);
+ console.log('HEADERS: ' + JSON.stringify(res.headers));
+ res.setEncoding('utf8');
+ res.on('data', chunk => {
+ console.log('BODY: ' + chunk);
+ });
+});
+
+req.on('error', e => {
+ console.log('problem with request: ' + e.message);
+});
+req.end();
+
+setTimeout(() => {
+ console.log('agent status: %j', keepaliveAgent.getCurrentStatus());
+}, 2000);
+```
+
+## [Benchmark](https://github.com/node-modules/agentkeepalive/tree/master/benchmark)
+
+Run the benchmark:
+
+```bash
+cd benchmark
+sh start.sh
+```
+
+Intel(R) Core(TM)2 Duo CPU P8600 @ 2.40GHz
+
+node@v0.8.9
+
+50 maxSockets, 60 concurrent, 1000 requests per concurrent, 5ms delay
+
+Keep alive agent (30 seconds):
+
+```
+Transactions: 60000 hits
+Availability: 100.00 %
+Elapsed time: 29.70 secs
+Data transferred: 14.88 MB
+Response time: 0.03 secs
+Transaction rate: 2020.20 trans/sec
+Throughput: 0.50 MB/sec
+Concurrency: 59.84
+Successful transactions: 60000
+Failed transactions: 0
+Longest transaction: 0.15
+Shortest transaction: 0.01
+```
+
+Normal agent:
+
+```
+Transactions: 60000 hits
+Availability: 100.00 %
+Elapsed time: 46.53 secs
+Data transferred: 14.88 MB
+Response time: 0.05 secs
+Transaction rate: 1289.49 trans/sec
+Throughput: 0.32 MB/sec
+Concurrency: 59.81
+Successful transactions: 60000
+Failed transactions: 0
+Longest transaction: 0.45
+Shortest transaction: 0.00
+```
+
+Socket created:
+
+```
+[proxy.js:120000] keepalive, 50 created, 60000 requestFinished, 1200 req/socket, 0 requests, 0 sockets, 0 unusedSockets, 50 timeout
+{" <10ms":662," <15ms":17825," <20ms":20552," <30ms":17646," <40ms":2315," <50ms":567," <100ms":377," <150ms":56," <200ms":0," >=200ms+":0}
+----------------------------------------------------------------
+[proxy.js:120000] normal , 53866 created, 84260 requestFinished, 1.56 req/socket, 0 requests, 0 sockets
+{" <10ms":75," <15ms":1112," <20ms":10947," <30ms":32130," <40ms":8228," <50ms":3002," <100ms":4274," <150ms":181," <200ms":18," >=200ms+":33}
+```
+
+## License
+
+```
+(The MIT License)
+
+Copyright(c) node-modules and other contributors.
+Copyright(c) 2012 - 2015 fengmk2
+
+Permission is hereby granted, free of charge, to any person obtaining
+a copy of this software and associated documentation files (the
+'Software'), to deal in the Software without restriction, including
+without limitation the rights to use, copy, modify, merge, publish,
+distribute, sublicense, and/or sell copies of the Software, and to
+permit persons to whom the Software is furnished to do so, subject to
+the following conditions:
+
+The above copyright notice and this permission notice shall be
+included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
+EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+```
diff --git a/deps/npm/node_modules/npm-profile/node_modules/make-fetch-happen/node_modules/agentkeepalive/browser.js b/deps/npm/node_modules/agentkeepalive/browser.js
similarity index 100%
rename from deps/npm/node_modules/npm-profile/node_modules/make-fetch-happen/node_modules/agentkeepalive/browser.js
rename to deps/npm/node_modules/agentkeepalive/browser.js
diff --git a/deps/npm/node_modules/npm-profile/node_modules/make-fetch-happen/node_modules/agentkeepalive/index.js b/deps/npm/node_modules/agentkeepalive/index.js
similarity index 100%
rename from deps/npm/node_modules/npm-profile/node_modules/make-fetch-happen/node_modules/agentkeepalive/index.js
rename to deps/npm/node_modules/agentkeepalive/index.js
diff --git a/deps/npm/node_modules/agentkeepalive/lib/_http_agent.js b/deps/npm/node_modules/agentkeepalive/lib/_http_agent.js
new file mode 100644
index 00000000000000..83f1d115eac84a
--- /dev/null
+++ b/deps/npm/node_modules/agentkeepalive/lib/_http_agent.js
@@ -0,0 +1,416 @@
+// Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+// patch from https://github.com/nodejs/node/blob/v7.2.1/lib/_http_agent.js
+
+'use strict';
+
+const net = require('net');
+const util = require('util');
+const EventEmitter = require('events');
+const debug = util.debuglog('http');
+
+// New Agent code.
+
+// The largest departure from the previous implementation is that
+// an Agent instance holds connections for a variable number of host:ports.
+// Surprisingly, this is still API compatible as far as third parties are
+// concerned. The only code that really notices the difference is the
+// request object.
+
+// Another departure is that all code related to HTTP parsing is in
+// ClientRequest.onSocket(). The Agent is now *strictly*
+// concerned with managing a connection pool.
+
+function Agent(options) {
+ if (!(this instanceof Agent))
+ return new Agent(options);
+
+ EventEmitter.call(this);
+
+ var self = this;
+
+ self.defaultPort = 80;
+ self.protocol = 'http:';
+
+ self.options = util._extend({}, options);
+
+ // don't confuse net and make it think that we're connecting to a pipe
+ self.options.path = null;
+ self.requests = {};
+ self.sockets = {};
+ self.freeSockets = {};
+ self.keepAliveMsecs = self.options.keepAliveMsecs || 1000;
+ self.keepAlive = self.options.keepAlive || false;
+ self.maxSockets = self.options.maxSockets || Agent.defaultMaxSockets;
+ self.maxFreeSockets = self.options.maxFreeSockets || 256;
+
+ // [patch start]
+  // free keep-alive socket timeout. By default free sockets do not have a timeout.
+  self.freeSocketKeepAliveTimeout = self.options.freeSocketKeepAliveTimeout || 0;
+  // working socket timeout. By default working sockets do not have a timeout.
+ self.timeout = self.options.timeout || 0;
+ // the socket active time to live, even if it's in use
+ this.socketActiveTTL = this.options.socketActiveTTL || null;
+ // [patch end]
+
+ self.on('free', function(socket, options) {
+ var name = self.getName(options);
+ debug('agent.on(free)', name);
+
+ if (socket.writable &&
+ self.requests[name] && self.requests[name].length) {
+ // [patch start]
+ debug('continue handle next request');
+ // [patch end]
+ self.requests[name].shift().onSocket(socket);
+ if (self.requests[name].length === 0) {
+ // don't leak
+ delete self.requests[name];
+ }
+ } else {
+ // If there are no pending requests, then put it in
+ // the freeSockets pool, but only if we're allowed to do so.
+ var req = socket._httpMessage;
+ if (req &&
+ req.shouldKeepAlive &&
+ socket.writable &&
+ self.keepAlive) {
+ var freeSockets = self.freeSockets[name];
+ var freeLen = freeSockets ? freeSockets.length : 0;
+ var count = freeLen;
+ if (self.sockets[name])
+ count += self.sockets[name].length;
+
+ if (count > self.maxSockets || freeLen >= self.maxFreeSockets) {
+ socket.destroy();
+ } else {
+ freeSockets = freeSockets || [];
+ self.freeSockets[name] = freeSockets;
+ socket.setKeepAlive(true, self.keepAliveMsecs);
+ socket.unref();
+ socket._httpMessage = null;
+ self.removeSocket(socket, options);
+ freeSockets.push(socket);
+
+ // [patch start]
+ // Add a default error handler to avoid Unhandled 'error' event throw on idle socket
+ // https://github.com/node-modules/agentkeepalive/issues/25
+ // https://github.com/nodejs/node/pull/4482 (fixed in >= 4.4.0 and >= 5.4.0)
+ if (socket.listeners('error').length === 0) {
+ socket.once('error', freeSocketErrorListener);
+ }
+ // set free keepalive timer
+ // try to use socket custom freeSocketKeepAliveTimeout first
+ const freeSocketKeepAliveTimeout = socket.freeSocketKeepAliveTimeout || self.freeSocketKeepAliveTimeout;
+ socket.setTimeout(freeSocketKeepAliveTimeout);
+ debug(`push to free socket queue and wait for ${freeSocketKeepAliveTimeout}ms`);
+ // [patch end]
+ }
+ } else {
+ socket.destroy();
+ }
+ }
+ });
+}
+
+util.inherits(Agent, EventEmitter);
+exports.Agent = Agent;
+
+// [patch start]
+function freeSocketErrorListener(err) {
+ var socket = this;
+ debug('SOCKET ERROR on FREE socket:', err.message, err.stack);
+ socket.destroy();
+ socket.emit('agentRemove');
+}
+// [patch end]
+
+Agent.defaultMaxSockets = Infinity;
+
+Agent.prototype.createConnection = net.createConnection;
+
+// Get the key for a given set of request options
+Agent.prototype.getName = function getName(options) {
+ var name = options.host || 'localhost';
+
+ name += ':';
+ if (options.port)
+ name += options.port;
+
+ name += ':';
+ if (options.localAddress)
+ name += options.localAddress;
+
+ // Pacify parallel/test-http-agent-getname by only appending
+ // the ':' when options.family is set.
+ if (options.family === 4 || options.family === 6)
+ name += ':' + options.family;
+
+ return name;
+};
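+// Example keys produced by getName() above (illustrative; these
+// comments are not part of the upstream source):
+//   { host: 'example.com', port: 80 }            => 'example.com:80:'
+//   { host: 'example.com', port: 80, family: 4 } => 'example.com:80::4'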
+
+// [patch start]
+function handleSocketCreation(req) {
+ return function(err, newSocket) {
+ if (err) {
+ process.nextTick(function() {
+ req.emit('error', err);
+ });
+ return;
+ }
+ req.onSocket(newSocket);
+ }
+}
+// [patch end]
+
+Agent.prototype.addRequest = function addRequest(req, options, port/*legacy*/,
+ localAddress/*legacy*/) {
+ // Legacy API: addRequest(req, host, port, localAddress)
+ if (typeof options === 'string') {
+ options = {
+ host: options,
+ port,
+ localAddress
+ };
+ }
+
+ options = util._extend({}, options);
+ options = util._extend(options, this.options);
+
+ if (!options.servername)
+ options.servername = calculateServerName(options, req);
+
+ var name = this.getName(options);
+ if (!this.sockets[name]) {
+ this.sockets[name] = [];
+ }
+
+ var freeLen = this.freeSockets[name] ? this.freeSockets[name].length : 0;
+ var sockLen = freeLen + this.sockets[name].length;
+
+ if (freeLen) {
+ // we have a free socket, so use that.
+ var socket = this.freeSockets[name].shift();
+ debug('have free socket');
+
+ // [patch start]
+ // remove free socket error event handler
+ socket.removeListener('error', freeSocketErrorListener);
+ // restart the default timer
+ socket.setTimeout(this.timeout);
+
+ if (this.socketActiveTTL && Date.now() - socket.createdTime > this.socketActiveTTL) {
+ debug(`socket ${socket.createdTime} expired`);
+ socket.destroy();
+ return this.createSocket(req, options, handleSocketCreation(req));
+ }
+ // [patch end]
+
+ // don't leak
+ if (!this.freeSockets[name].length)
+ delete this.freeSockets[name];
+
+ socket.ref();
+ req.onSocket(socket);
+ this.sockets[name].push(socket);
+ } else if (sockLen < this.maxSockets) {
+ debug('call onSocket', sockLen, freeLen);
+ // If we are under maxSockets create a new one.
+ // [patch start]
+ this.createSocket(req, options, handleSocketCreation(req));
+ // [patch end]
+ } else {
+ debug('wait for socket');
+ // We are over limit so we'll add it to the queue.
+ if (!this.requests[name]) {
+ this.requests[name] = [];
+ }
+ this.requests[name].push(req);
+ }
+};
+
+Agent.prototype.createSocket = function createSocket(req, options, cb) {
+ var self = this;
+ options = util._extend({}, options);
+ options = util._extend(options, self.options);
+
+ if (!options.servername)
+ options.servername = calculateServerName(options, req);
+
+ var name = self.getName(options);
+ options._agentKey = name;
+
+ debug('createConnection', name, options);
+ options.encoding = null;
+ var called = false;
+ const newSocket = self.createConnection(options, oncreate);
+ // [patch start]
+ if (newSocket) {
+ oncreate(null, Object.assign(newSocket, { createdTime: Date.now() }));
+ }
+ // [patch end]
+ function oncreate(err, s) {
+ if (called)
+ return;
+ called = true;
+ if (err)
+ return cb(err);
+ if (!self.sockets[name]) {
+ self.sockets[name] = [];
+ }
+ self.sockets[name].push(s);
+ debug('sockets', name, self.sockets[name].length);
+
+ function onFree() {
+ self.emit('free', s, options);
+ }
+ s.on('free', onFree);
+
+ function onClose(err) {
+ debug('CLIENT socket onClose');
+ // This is the only place where sockets get removed from the Agent.
+ // If you want to remove a socket from the pool, just close it.
+ // All socket errors end in a close event anyway.
+ self.removeSocket(s, options);
+
+ // [patch start]
+ self.emit('close');
+ // [patch end]
+ }
+ s.on('close', onClose);
+
+ // [patch start]
+ // start socket timeout handler
+ function onTimeout() {
+ debug('CLIENT socket onTimeout');
+ s.destroy();
+ // Remove it from freeSockets immediately to prevent new requests from being sent through this socket.
+ self.removeSocket(s, options);
+ self.emit('timeout');
+ }
+ s.on('timeout', onTimeout);
+ // set the default timer
+ s.setTimeout(self.timeout);
+ // [patch end]
+
+ function onRemove() {
+ // We need this function for cases like HTTP 'upgrade'
+ // (defined by WebSockets) where we need to remove a socket from the
+ // pool because it'll be locked up indefinitely
+ debug('CLIENT socket onRemove');
+ self.removeSocket(s, options);
+ s.removeListener('close', onClose);
+ s.removeListener('free', onFree);
+ s.removeListener('agentRemove', onRemove);
+
+ // [patch start]
+ // remove socket timeout handler
+ s.setTimeout(0, onTimeout);
+ // [patch end]
+ }
+ s.on('agentRemove', onRemove);
+ cb(null, s);
+ }
+};
+
+function calculateServerName(options, req) {
+ let servername = options.host;
+ const hostHeader = req.getHeader('host');
+ if (hostHeader) {
+ // abc => abc
+ // abc:123 => abc
+ // [::1] => ::1
+ // [::1]:123 => ::1
+ if (hostHeader.startsWith('[')) {
+ const index = hostHeader.indexOf(']');
+ if (index === -1) {
+ // Leading '[', but no ']'. Need to do something...
+ servername = hostHeader;
+ } else {
+ servername = hostHeader.substr(1, index - 1);
+ }
+ } else {
+ servername = hostHeader.split(':', 1)[0];
+ }
+ }
+ return servername;
+}
+
+Agent.prototype.removeSocket = function removeSocket(s, options) {
+ var name = this.getName(options);
+ debug('removeSocket', name, 'writable:', s.writable);
+ var sets = [this.sockets];
+
+ // If the socket was destroyed, remove it from the free buffers too.
+ if (!s.writable)
+ sets.push(this.freeSockets);
+
+ for (var sk = 0; sk < sets.length; sk++) {
+ var sockets = sets[sk];
+
+ if (sockets[name]) {
+ var index = sockets[name].indexOf(s);
+ if (index !== -1) {
+ sockets[name].splice(index, 1);
+ // Don't leak
+ if (sockets[name].length === 0)
+ delete sockets[name];
+ }
+ }
+ }
+
+ // [patch start]
+ var freeLen = this.freeSockets[name] ? this.freeSockets[name].length : 0;
+  var sockLen = freeLen + (this.sockets[name] ? this.sockets[name].length : 0);
+ // [patch end]
+
+ if (this.requests[name] && this.requests[name].length && sockLen < this.maxSockets) {
+ debug('removeSocket, have a request, make a socket');
+ var req = this.requests[name][0];
+ // If we have pending requests and a socket gets closed make a new one
+ this.createSocket(req, options, function(err, newSocket) {
+ if (err) {
+ process.nextTick(function() {
+ req.emit('error', err);
+ });
+ return;
+ }
+ newSocket.emit('free');
+ });
+ }
+};
+
+Agent.prototype.destroy = function destroy() {
+ var sets = [this.freeSockets, this.sockets];
+ for (var s = 0; s < sets.length; s++) {
+ var set = sets[s];
+ var keys = Object.keys(set);
+ for (var v = 0; v < keys.length; v++) {
+ var setName = set[keys[v]];
+ for (var n = 0; n < setName.length; n++) {
+ setName[n].destroy();
+ }
+ }
+ }
+};
+
+exports.globalAgent = new Agent();
diff --git a/deps/npm/node_modules/npm-profile/node_modules/make-fetch-happen/node_modules/agentkeepalive/lib/agent.js b/deps/npm/node_modules/agentkeepalive/lib/agent.js
similarity index 100%
rename from deps/npm/node_modules/npm-profile/node_modules/make-fetch-happen/node_modules/agentkeepalive/lib/agent.js
rename to deps/npm/node_modules/agentkeepalive/lib/agent.js
diff --git a/deps/npm/node_modules/npm-profile/node_modules/make-fetch-happen/node_modules/agentkeepalive/lib/https_agent.js b/deps/npm/node_modules/agentkeepalive/lib/https_agent.js
similarity index 100%
rename from deps/npm/node_modules/npm-profile/node_modules/make-fetch-happen/node_modules/agentkeepalive/lib/https_agent.js
rename to deps/npm/node_modules/agentkeepalive/lib/https_agent.js
diff --git a/deps/npm/node_modules/agentkeepalive/package.json b/deps/npm/node_modules/agentkeepalive/package.json
new file mode 100644
index 00000000000000..c0ce0576bc1070
--- /dev/null
+++ b/deps/npm/node_modules/agentkeepalive/package.json
@@ -0,0 +1,84 @@
+{
+ "_from": "agentkeepalive@^3.4.1",
+ "_id": "agentkeepalive@3.4.1",
+ "_inBundle": false,
+ "_integrity": "sha512-MPIwsZU9PP9kOrZpyu2042kYA8Fdt/AedQYkYXucHgF9QoD9dXVp0ypuGnHXSR0hTstBxdt85Xkh4JolYfK5wg==",
+ "_location": "/agentkeepalive",
+ "_phantomChildren": {},
+ "_requested": {
+ "type": "range",
+ "registry": true,
+ "raw": "agentkeepalive@^3.4.1",
+ "name": "agentkeepalive",
+ "escapedName": "agentkeepalive",
+ "rawSpec": "^3.4.1",
+ "saveSpec": null,
+ "fetchSpec": "^3.4.1"
+ },
+ "_requiredBy": [
+ "/make-fetch-happen",
+ "/npm-profile/make-fetch-happen",
+ "/npm-registry-fetch/make-fetch-happen"
+ ],
+ "_resolved": "https://registry.npmjs.org/agentkeepalive/-/agentkeepalive-3.4.1.tgz",
+ "_shasum": "aa95aebc3a749bca5ed53e3880a09f5235b48f0c",
+ "_spec": "agentkeepalive@^3.4.1",
+ "_where": "/Users/rebecca/code/npm/node_modules/make-fetch-happen",
+ "author": {
+ "name": "fengmk2",
+ "email": "fengmk2@gmail.com",
+ "url": "https://fengmk2.com"
+ },
+ "browser": "browser.js",
+ "bugs": {
+ "url": "https://github.com/node-modules/agentkeepalive/issues"
+ },
+ "bundleDependencies": false,
+ "ci": {
+ "version": "4.3.2, 4, 6, 8, 9"
+ },
+ "dependencies": {
+ "humanize-ms": "^1.2.1"
+ },
+ "deprecated": false,
+ "description": "Missing keepalive http.Agent",
+ "devDependencies": {
+ "autod": "^2.8.0",
+ "egg-bin": "^1.10.3",
+ "egg-ci": "^1.7.0",
+ "eslint": "^3.19.0",
+ "eslint-config-egg": "^4.2.0",
+ "pedding": "^1.1.0"
+ },
+ "engines": {
+ "node": ">= 4.0.0"
+ },
+ "files": [
+ "index.js",
+ "browser.js",
+ "lib"
+ ],
+ "homepage": "https://github.com/node-modules/agentkeepalive#readme",
+ "keywords": [
+ "http",
+ "https",
+ "agent",
+ "keepalive",
+ "agentkeepalive"
+ ],
+ "license": "MIT",
+ "main": "index.js",
+ "name": "agentkeepalive",
+ "repository": {
+ "type": "git",
+ "url": "git://github.com/node-modules/agentkeepalive.git"
+ },
+ "scripts": {
+ "autod": "autod",
+ "ci": "npm run lint && npm run cov",
+ "cov": "egg-bin cov",
+ "lint": "eslint lib test index.js",
+ "test": "egg-bin test"
+ },
+ "version": "3.4.1"
+}
diff --git a/deps/npm/node_modules/request/node_modules/har-validator/node_modules/ajv/.tonic_example.js b/deps/npm/node_modules/ajv/.tonic_example.js
similarity index 100%
rename from deps/npm/node_modules/request/node_modules/har-validator/node_modules/ajv/.tonic_example.js
rename to deps/npm/node_modules/ajv/.tonic_example.js
diff --git a/deps/npm/node_modules/ajv/LICENSE b/deps/npm/node_modules/ajv/LICENSE
new file mode 100644
index 00000000000000..09f090263b226a
--- /dev/null
+++ b/deps/npm/node_modules/ajv/LICENSE
@@ -0,0 +1,21 @@
+The MIT License (MIT)
+
+Copyright (c) 2015 Evgeny Poberezkin
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
diff --git a/deps/npm/node_modules/ajv/README.md b/deps/npm/node_modules/ajv/README.md
new file mode 100644
index 00000000000000..63a265f04d9e84
--- /dev/null
+++ b/deps/npm/node_modules/ajv/README.md
@@ -0,0 +1,1327 @@
+
+
+# Ajv: Another JSON Schema Validator
+
+The fastest JSON Schema validator for Node.js and browser with draft 6 support.
+
+
+[![Build Status](https://travis-ci.org/epoberezkin/ajv.svg?branch=master)](https://travis-ci.org/epoberezkin/ajv)
+[![npm version](https://badge.fury.io/js/ajv.svg)](https://www.npmjs.com/package/ajv)
+[![npm@beta](https://img.shields.io/npm/v/ajv/beta.svg)](https://github.com/epoberezkin/ajv/tree/beta)
+[![npm downloads](https://img.shields.io/npm/dm/ajv.svg)](https://www.npmjs.com/package/ajv)
+[![Coverage Status](https://coveralls.io/repos/epoberezkin/ajv/badge.svg?branch=master&service=github)](https://coveralls.io/github/epoberezkin/ajv?branch=master)
+[![Greenkeeper badge](https://badges.greenkeeper.io/epoberezkin/ajv.svg)](https://greenkeeper.io/)
+[![Gitter](https://img.shields.io/gitter/room/ajv-validator/ajv.svg)](https://gitter.im/ajv-validator/ajv)
+
+
+__Please note__: Ajv [version 6](https://github.com/epoberezkin/ajv/tree/beta) with [JSON Schema draft-07](http://json-schema.org/work-in-progress) support is released. Use `npm install ajv@beta` to install.
+
+
+## Using version 5
+
+[JSON Schema draft-06](https://trac.tools.ietf.org/html/draft-wright-json-schema-validation-01) is published.
+
+[Ajv version 5.0.0](https://github.com/epoberezkin/ajv/releases/tag/5.0.0) that supports draft-06 is released. It may require either migrating your schemas or updating your code (to continue using draft-04 and v5 schemas).
+
+__Please note__: To use Ajv with draft-04 schemas you need to explicitly add meta-schema to the validator instance:
+
+```javascript
+ajv.addMetaSchema(require('ajv/lib/refs/json-schema-draft-04.json'));
+```
+
+
+## Contents
+
+- [Performance](#performance)
+- [Features](#features)
+- [Getting started](#getting-started)
+- [Frequently Asked Questions](https://github.com/epoberezkin/ajv/blob/master/FAQ.md)
+- [Using in browser](#using-in-browser)
+- [Command line interface](#command-line-interface)
+- Validation
+ - [Keywords](#validation-keywords)
+ - [Formats](#formats)
+ - [Combining schemas with $ref](#ref)
+ - [$data reference](#data-reference)
+ - NEW: [$merge and $patch keywords](#merge-and-patch-keywords)
+ - [Defining custom keywords](#defining-custom-keywords)
+ - [Asynchronous schema compilation](#asynchronous-schema-compilation)
+ - [Asynchronous validation](#asynchronous-validation)
+- Modifying data during validation
+ - [Filtering data](#filtering-data)
+ - [Assigning defaults](#assigning-defaults)
+ - [Coercing data types](#coercing-data-types)
+- API
+ - [Methods](#api)
+ - [Options](#options)
+ - [Validation errors](#validation-errors)
+- [Related packages](#related-packages)
+- [Packages using Ajv](#some-packages-using-ajv)
+- [Tests, Contributing, History, License](#tests)
+
+
+## Performance
+
+Ajv generates code using [doT templates](https://github.com/olado/doT) to turn JSON schemas into super-fast validation functions that are efficient for v8 optimization.
+
+Currently Ajv is the fastest and the most standards-compliant validator according to these benchmarks:
+
+- [json-schema-benchmark](https://github.com/ebdrup/json-schema-benchmark) - 50% faster than the second place
+- [jsck benchmark](https://github.com/pandastrike/jsck#benchmarks) - 20-190% faster
+- [z-schema benchmark](https://rawgit.com/zaggino/z-schema/master/benchmark/results.html)
+- [themis benchmark](https://cdn.rawgit.com/playlyfe/themis/master/benchmark/results.html)
+
+
+Performance of different validators by [json-schema-benchmark](https://github.com/ebdrup/json-schema-benchmark):
+
+[![performance](https://chart.googleapis.com/chart?chxt=x,y&cht=bhs&chco=76A4FB&chls=2.0&chbh=32,4,1&chs=600x416&chxl=-1:|djv|ajv|json-schema-validator-generator|jsen|is-my-json-valid|themis|z-schema|jsck|skeemas|json-schema-library|tv4&chd=t:100,98,72.1,66.8,50.1,15.1,6.1,3.8,1.2,0.7,0.2)](https://github.com/ebdrup/json-schema-benchmark/blob/master/README.md#performance)
+
+
+## Features
+
+- Ajv implements full JSON Schema [draft 6](http://json-schema.org/) and draft 4 standards:
+ - all validation keywords (see [JSON Schema validation keywords](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md))
+ - full support of remote refs (remote schemas have to be added with `addSchema` or compiled to be available)
+ - support of circular references between schemas
+ - correct string lengths for strings with unicode pairs (can be turned off)
+ - [formats](#formats) defined by JSON Schema draft 4 standard and custom formats (can be turned off)
+ - [validates schemas against meta-schema](#api-validateschema)
+- supports [browsers](#using-in-browser) and Node.js 0.10-8.x
+- [asynchronous loading](#asynchronous-schema-compilation) of referenced schemas during compilation
+- "All errors" validation mode with [option allErrors](#options)
+- [error messages with parameters](#validation-errors) describing error reasons to allow creating custom error messages
+- i18n error messages support with [ajv-i18n](https://github.com/epoberezkin/ajv-i18n) package
+- [filtering data](#filtering-data) from additional properties
+- [assigning defaults](#assigning-defaults) to missing properties and items
+- [coercing data](#coercing-data-types) to the types specified in `type` keywords
+- [custom keywords](#defining-custom-keywords)
+- draft-6 keywords `const`, `contains` and `propertyNames`
+- draft-6 boolean schemas (`true`/`false` as a schema to always pass/fail).
+- keywords `switch`, `patternRequired`, `formatMaximum` / `formatMinimum` and `formatExclusiveMaximum` / `formatExclusiveMinimum` from [JSON-schema extension proposals](https://github.com/json-schema/json-schema/wiki/v5-Proposals) with [ajv-keywords](https://github.com/epoberezkin/ajv-keywords) package
+- [$data reference](#data-reference) to use values from the validated data as values for the schema keywords
+- [asynchronous validation](#asynchronous-validation) of custom formats and keywords
+
+Currently Ajv is the only validator that passes all the tests from [JSON Schema Test Suite](https://github.com/json-schema/JSON-Schema-Test-Suite) (according to [json-schema-benchmark](https://github.com/ebdrup/json-schema-benchmark)), apart from the test requiring that `1.0` is not an integer, which is impossible to satisfy in JavaScript.
+
+
+## Install
+
+```
+npm install ajv
+```
+
+or to install [version 6](https://github.com/epoberezkin/ajv/tree/beta):
+
+```
+npm install ajv@beta
+```
+
+
+## Getting started
+
+Try it in the Node.js REPL: https://tonicdev.com/npm/ajv
+
+
+The fastest validation call:
+
+```javascript
+var Ajv = require('ajv');
+var ajv = new Ajv(); // options can be passed, e.g. {allErrors: true}
+var validate = ajv.compile(schema);
+var valid = validate(data);
+if (!valid) console.log(validate.errors);
+```
+
+or with less code
+
+```javascript
+// ...
+var valid = ajv.validate(schema, data);
+if (!valid) console.log(ajv.errors);
+// ...
+```
+
+or
+
+```javascript
+// ...
+var valid = ajv.addSchema(schema, 'mySchema')
+ .validate('mySchema', data);
+if (!valid) console.log(ajv.errorsText());
+// ...
+```
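The `errorsText()` convenience method condenses the errors array into a single readable string. A minimal sketch of that behaviour (the `data` prefix and `', '` separator are assumed defaults here, not Ajv's guaranteed output):

```javascript
// Illustrative sketch of errorsText(); not Ajv's actual implementation.
// Assumes the default "data" variable name and ", " separator.
function errorsText(errors, separator) {
  if (!errors || errors.length === 0) return 'No errors';
  return errors.map(function (e) {
    return 'data' + e.dataPath + ' ' + e.message;
  }).join(separator || ', ');
}
```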
+
+See [API](#api) and [Options](#options) for more details.
+
+Ajv compiles schemas to functions and caches them in all cases (using schema serialized with [fast-json-stable-stringify](https://github.com/epoberezkin/fast-json-stable-stringify) or a custom function as a key), so that the next time the same schema is used (not necessarily the same object instance) it won't be compiled again.
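The caching behaviour can be pictured with a small sketch (plain `JSON.stringify` stands in for fast-json-stable-stringify here, so unlike the real cache this one is sensitive to property order; the "compilation" is a trivial stand-in):

```javascript
// Illustrative sketch of Ajv's compile cache; not Ajv's actual code.
var cache = new Map();
var compilations = 0;

function compileCached(schema) {
  var key = JSON.stringify(schema); // real Ajv uses a stable stringify
  if (!cache.has(key)) {
    compilations++;
    // stand-in for real schema compilation
    cache.set(key, function validate(data) { return typeof data === schema.type; });
  }
  return cache.get(key);
}
```

Calling `compileCached` twice with structurally equal (but distinct) schema objects compiles only once and returns the same function.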
+
+The best performance is achieved when using compiled functions returned by `compile` or `getSchema` methods (there is no additional function call).
+
+__Please note__: every time a validation function or `ajv.validate` is called, the `errors` property is overwritten. You need to copy the `errors` array reference to another variable if you want to use it later (e.g., in a callback). See [Validation errors](#validation-errors).
+
+
+## Using in browser
+
+You can require Ajv directly from the code you browserify - in this case Ajv will be a part of your bundle.
+
+If you need to use Ajv in several bundles you can create a separate UMD bundle using `npm run bundle` script (thanks to [siddo420](https://github.com/siddo420)).
+
+Then you need to load Ajv in the browser:
+```html
+<script src="ajv.min.js"></script>
+```
+
+This bundle can be used with different module systems; it creates global `Ajv` if no module system is found.
+
+The browser bundle is available on [cdnjs](https://cdnjs.com/libraries/ajv).
+
+Ajv is tested with these browsers:
+
+[![Sauce Test Status](https://saucelabs.com/browser-matrix/epoberezkin.svg)](https://saucelabs.com/u/epoberezkin)
+
+__Please note__: some frameworks, e.g. Dojo, may redefine global require in such a way that it is not compatible with the CommonJS module format. In that case the Ajv bundle has to be loaded before the framework, and then you can use global Ajv (see issue [#234](https://github.com/epoberezkin/ajv/issues/234)).
+
+
+## Command line interface
+
+CLI is available as a separate npm package [ajv-cli](https://github.com/jessedc/ajv-cli). It supports:
+
+- compiling JSON-schemas to test their validity
+- BETA: generating standalone module exporting a validation function to be used without Ajv (using [ajv-pack](https://github.com/epoberezkin/ajv-pack))
+- migrating schemas to draft-06 (using [json-schema-migrate](https://github.com/epoberezkin/json-schema-migrate))
+- validating data file(s) against JSON-schema
+- testing expected validity of data against JSON-schema
+- referenced schemas
+- custom meta-schemas
+- files in JSON and JavaScript format
+- all Ajv options
+- reporting changes in data after validation in [JSON-patch](https://tools.ietf.org/html/rfc6902) format
+
+
+## Validation keywords
+
+Ajv supports all validation keywords from draft 4 of JSON-schema standard:
+
+- [type](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md#type)
+- [for numbers](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md#keywords-for-numbers) - maximum, minimum, exclusiveMaximum, exclusiveMinimum, multipleOf
+- [for strings](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md#keywords-for-strings) - maxLength, minLength, pattern, format
+- [for arrays](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md#keywords-for-arrays) - maxItems, minItems, uniqueItems, items, additionalItems, [contains](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md#contains)
+- [for objects](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md#keywords-for-objects) - maxProperties, minProperties, required, properties, patternProperties, additionalProperties, dependencies, [propertyNames](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md#propertynames)
+- [for all types](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md#keywords-for-all-types) - enum, [const](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md#const)
+- [compound keywords](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md#compound-keywords) - not, oneOf, anyOf, allOf
+
+With [ajv-keywords](https://github.com/epoberezkin/ajv-keywords) package Ajv also supports validation keywords from [JSON Schema extension proposals](https://github.com/json-schema/json-schema/wiki/v5-Proposals) for JSON-schema standard:
+
+- [switch](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md#switch-proposed) - conditional validation with a sequence of if/then clauses
+- [patternRequired](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md#patternrequired-proposed) - like `required` but with patterns that some property should match.
+- [formatMaximum, formatMinimum, formatExclusiveMaximum, formatExclusiveMinimum](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md#formatmaximum--formatminimum-and-exclusiveformatmaximum--exclusiveformatminimum-proposed) - setting limits for date, time, etc.
+
+See [JSON Schema validation keywords](https://github.com/epoberezkin/ajv/blob/master/KEYWORDS.md) for more details.
+
+
+## Formats
+
+The following formats are supported for string validation with "format" keyword:
+
+- _date_: full-date according to [RFC3339](http://tools.ietf.org/html/rfc3339#section-5.6).
+- _time_: time with optional time-zone.
+- _date-time_: date-time from the same source (time-zone is mandatory). `date`, `time` and `date-time` validate ranges in `full` mode and only regexp in `fast` mode (see [options](#options)).
+- _uri_: full uri with optional protocol.
+- _url_: [URL record](https://url.spec.whatwg.org/#concept-url).
+- _uri-template_: URI template according to [RFC6570](https://tools.ietf.org/html/rfc6570)
+- _email_: email address.
+- _hostname_: host name according to [RFC1034](http://tools.ietf.org/html/rfc1034#section-3.5).
+- _ipv4_: IP address v4.
+- _ipv6_: IP address v6.
+- _regex_: tests whether a string is a valid regular expression by passing it to RegExp constructor.
+- _uuid_: Universally Unique IDentifier according to [RFC4122](http://tools.ietf.org/html/rfc4122).
+- _json-pointer_: JSON-pointer according to [RFC6901](https://tools.ietf.org/html/rfc6901).
+- _relative-json-pointer_: relative JSON-pointer according to [this draft](http://tools.ietf.org/html/draft-luff-relative-json-pointer-00).
+
+There are two modes of format validation: `fast` and `full`. This mode affects formats `date`, `time`, `date-time`, `uri`, `email`, and `hostname`. See [Options](#options) for details.
+
+You can add additional formats and replace any of the formats above using [addFormat](#api-addformat) method.
+
+The option `unknownFormats` allows changing the default behaviour when an unknown format is encountered: Ajv can either fail schema compilation (the default) or ignore it (the default in versions before 5.0.0). You can also whitelist specific format(s) to be ignored. See [Options](#options) for details.
+
+You can find patterns used for format validation and the sources that were used in [formats.js](https://github.com/epoberezkin/ajv/blob/master/lib/compile/formats.js).
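As an illustration of the two modes, here is a simplified sketch of `fast` (regexp only) versus `full` (regexp plus range checks) validation of the `date` format. The pattern and the range logic are simplified for brevity and are not Ajv's actual ones:

```javascript
// Illustrative sketch of "fast" vs "full" date format checking.
var DATE = /^(\d{4})-(\d{2})-(\d{2})$/;
var DAYS = [0, 31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31];

function isLeapYear(y) { return y % 4 === 0 && (y % 100 !== 0 || y % 400 === 0); }

// fast mode: regexp only
function dateFast(str) { return DATE.test(str); }

// full mode: regexp plus range checks
function dateFull(str) {
  var m = DATE.exec(str);
  if (!m) return false;
  var year = +m[1], month = +m[2], day = +m[3];
  return month >= 1 && month <= 12 &&
         day >= 1 && day <= (month === 2 && isLeapYear(year) ? 29 : DAYS[month]);
}
```

So `"2016-02-30"` passes in `fast` mode (it matches the pattern) but fails in `full` mode (the day is out of range).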
+
+
+## Combining schemas with $ref
+
+You can structure your validation logic across multiple schema files and have schemas reference each other using `$ref` keyword.
+
+Example:
+
+```javascript
+var schema = {
+ "$id": "http://example.com/schemas/schema.json",
+ "type": "object",
+ "properties": {
+ "foo": { "$ref": "defs.json#/definitions/int" },
+ "bar": { "$ref": "defs.json#/definitions/str" }
+ }
+};
+
+var defsSchema = {
+ "$id": "http://example.com/schemas/defs.json",
+ "definitions": {
+ "int": { "type": "integer" },
+ "str": { "type": "string" }
+ }
+};
+```
+
+Now to compile your schema you can either pass all schemas to Ajv instance:
+
+```javascript
+var ajv = new Ajv({schemas: [schema, defsSchema]});
+var validate = ajv.getSchema('http://example.com/schemas/schema.json');
+```
+
+or use `addSchema` method:
+
+```javascript
+var ajv = new Ajv;
+var validate = ajv.addSchema(defsSchema)
+ .compile(schema);
+```
+
+See [Options](#options) and [addSchema](#api) method.
+
+__Please note__:
+- `$ref` is resolved as the uri-reference using schema $id as the base URI (see the example).
+- References can be recursive (and mutually recursive) to implement the schemas for different data structures (such as linked lists, trees, graphs, etc.).
+- You don't have to host your schema files at the URIs that you use as schema $id. These URIs are only used to identify the schemas, and according to JSON Schema specification validators should not expect to be able to download the schemas from these URIs.
+- The actual location of the schema file in the file system is not used.
+- You can pass the identifier of the schema as the second parameter of `addSchema` method or as a property name in `schemas` option. This identifier can be used instead of (or in addition to) schema $id.
+- You cannot have the same $id (or the schema identifier) used for more than one schema - an exception will be thrown.
+- You can implement dynamic resolution of the referenced schemas using `compileAsync` method. In this way you can store schemas in any system (files, web, database, etc.) and reference them without explicitly adding to Ajv instance. See [Asynchronous schema compilation](#asynchronous-schema-compilation).
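To make the resolution concrete, here is a hypothetical sketch of resolving just the fragment part of a `$ref` (an [RFC 6901](https://tools.ietf.org/html/rfc6901) JSON-pointer) against an already-registered schema. Real Ajv also resolves the base URI part of the reference against the schema `$id`, which is omitted here:

```javascript
// Illustrative sketch of resolving a $ref fragment such as "#/definitions/int".
function resolvePointer(schema, pointer) {
  if (pointer === '' || pointer === '#') return schema;
  return pointer
    .replace(/^#?\//, '')              // drop the leading "#/" or "/"
    .split('/')
    .map(function (seg) {              // unescape per RFC 6901
      return seg.replace(/~1/g, '/').replace(/~0/g, '~');
    })
    .reduce(function (obj, seg) {
      return obj === undefined ? undefined : obj[seg];
    }, schema);
}
```

With the `defsSchema` from the example above, `resolvePointer(defsSchema, '#/definitions/int')` yields the `{ "type": "integer" }` subschema.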
+
+
+## $data reference
+
+With `$data` option you can use values from the validated data as the values for the schema keywords. See [proposal](https://github.com/json-schema/json-schema/wiki/$data-(v5-proposal)) for more information about how it works.
+
+`$data` reference is supported in the keywords: const, enum, format, maximum/minimum, exclusiveMaximum / exclusiveMinimum, maxLength / minLength, maxItems / minItems, maxProperties / minProperties, formatMaximum / formatMinimum, formatExclusiveMaximum / formatExclusiveMinimum, multipleOf, pattern, required, uniqueItems.
+
+The value of "$data" should be a [JSON-pointer](https://tools.ietf.org/html/rfc6901) to the data (the root is always the top level data object, even if the $data reference is inside a referenced subschema) or a [relative JSON-pointer](http://tools.ietf.org/html/draft-luff-relative-json-pointer-00) (it is relative to the current point in data; if the $data reference is inside a referenced subschema it cannot point to the data outside of the root level for this subschema).
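A hypothetical sketch of how the relative form is interpreted (`resolveRelative` is not Ajv's internal code; `currentPath` stands for the list of keys from the data root down to the point where the `$data` reference appears):

```javascript
// Illustrative sketch of relative JSON-pointer resolution for $data.
// "1/larger" means: go 1 level up, then to property "larger".
// "0#" means: go 0 levels up and yield the current property name itself.
function resolveRelative(root, currentPath, pointer) {
  var m = /^(\d+)(#?)(?:\/(.*))?$/.exec(pointer);
  if (!m) return undefined;
  var up = +m[1];
  var path = currentPath.slice(0, currentPath.length - up);
  if (m[2] === '#') return path[path.length - 1]; // key name, not value
  if (m[3]) path = path.concat(m[3].split('/'));
  return path.reduce(function (obj, key) {
    return obj === undefined ? undefined : obj[key];
  }, root);
}
```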
+
+Examples.
+
+This schema requires that the value in the property `smaller` is less than or equal to the value in the property `larger`:
+
+```javascript
+var ajv = new Ajv({$data: true});
+
+var schema = {
+ "properties": {
+ "smaller": {
+ "type": "number",
+ "maximum": { "$data": "1/larger" }
+ },
+ "larger": { "type": "number" }
+ }
+};
+
+var validData = {
+ smaller: 5,
+ larger: 7
+};
+
+ajv.validate(schema, validData); // true
+```
+
+This schema requires that the properties have the same format as their field names:
+
+```javascript
+var schema = {
+ "additionalProperties": {
+ "type": "string",
+ "format": { "$data": "0#" }
+ }
+};
+
+var validData = {
+ 'date-time': '1963-06-19T08:30:06.283185Z',
+ email: 'joe.bloggs@example.com'
+}
+```
+
+`$data` reference is resolved safely - it won't throw even if some property is undefined. If `$data` resolves to `undefined` the validation succeeds (with the exception of the `const` keyword). If `$data` resolves to an incorrect type (e.g., not a "number" for the `maximum` keyword) the validation fails.
+
+
+## $merge and $patch keywords
+
+With the package [ajv-merge-patch](https://github.com/epoberezkin/ajv-merge-patch) you can use the keywords `$merge` and `$patch` that allow extending JSON-schemas with patches using formats [JSON Merge Patch (RFC 7396)](https://tools.ietf.org/html/rfc7396) and [JSON Patch (RFC 6902)](https://tools.ietf.org/html/rfc6902).
+
+To add keywords `$merge` and `$patch` to Ajv instance use this code:
+
+```javascript
+require('ajv-merge-patch')(ajv);
+```
+
+Examples.
+
+Using `$merge`:
+
+```json
+{
+ "$merge": {
+ "source": {
+ "type": "object",
+ "properties": { "p": { "type": "string" } },
+ "additionalProperties": false
+ },
+ "with": {
+ "properties": { "q": { "type": "number" } }
+ }
+ }
+}
+```
+
+Using `$patch`:
+
+```json
+{
+ "$patch": {
+ "source": {
+ "type": "object",
+ "properties": { "p": { "type": "string" } },
+ "additionalProperties": false
+ },
+ "with": [
+ { "op": "add", "path": "/properties/q", "value": { "type": "number" } }
+ ]
+ }
+}
+```
+
+The schemas above are equivalent to this schema:
+
+```json
+{
+ "type": "object",
+ "properties": {
+ "p": { "type": "string" },
+ "q": { "type": "number" }
+ },
+ "additionalProperties": false
+}
+```
+
+The properties `source` and `with` in the keywords `$merge` and `$patch` can use absolute or relative `$ref` to point to other schemas previously added to the Ajv instance or to the fragments of the current schema.
+
+See the package [ajv-merge-patch](https://github.com/epoberezkin/ajv-merge-patch) for more information.
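For intuition, `$merge` applies JSON Merge Patch semantics. A minimal sketch of the RFC 7396 algorithm itself (the keyword implementation in ajv-merge-patch does more, e.g. resolving `$ref` in `source` and `with`):

```javascript
// Illustrative sketch of JSON Merge Patch (RFC 7396).
function mergePatch(target, patch) {
  if (typeof patch !== 'object' || patch === null || Array.isArray(patch))
    return patch;                       // non-objects replace the target wholesale
  if (typeof target !== 'object' || target === null || Array.isArray(target))
    target = {};
  var result = {};
  Object.keys(target).forEach(function (k) { result[k] = target[k]; });
  Object.keys(patch).forEach(function (k) {
    if (patch[k] === null) delete result[k];    // null removes a member
    else result[k] = mergePatch(result[k], patch[k]);
  });
  return result;
}
```

Applied to the `$merge` example above, the patch adds `properties.q` while keeping `properties.p` and `additionalProperties: false` from the source.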
+
+
+## Defining custom keywords
+
+The advantages of using custom keywords are:
+
+- allow creating validation scenarios that cannot be expressed using JSON Schema
+- simplify your schemas
+- help bring a bigger part of the validation logic into your schemas
+- make your schemas more expressive, less verbose and closer to your application domain
+- implement custom data processors that modify your data (`modifying` option MUST be used in keyword definition) and/or create side effects while the data is being validated
+
+If a keyword is used only for side-effects and its validation result is pre-defined, use option `valid: true/false` in keyword definition to simplify both generated code (no error handling in case of `valid: true`) and your keyword functions (no need to return any validation result).
+
+When extending the JSON-schema standard with custom keywords you have to consider the portability and comprehensibility of your schemas: you will have to support these custom keywords on other platforms and document them properly so that everybody can understand what they do.
+
+You can define custom keywords with [addKeyword](#api-addkeyword) method. Keywords are defined on the `ajv` instance level - new instances will not have previously defined keywords.
+
+Ajv allows defining keywords with:
+- validation function
+- compilation function
+- macro function
+- inline compilation function that should return code (as string) that will be inlined in the currently compiled schema.
+
+Example. `range` and `exclusiveRange` keywords using compiled schema:
+
+```javascript
+ajv.addKeyword('range', {
+ type: 'number',
+ compile: function (sch, parentSchema) {
+ var min = sch[0];
+ var max = sch[1];
+
+ return parentSchema.exclusiveRange === true
+ ? function (data) { return data > min && data < max; }
+ : function (data) { return data >= min && data <= max; }
+ }
+});
+
+var schema = { "range": [2, 4], "exclusiveRange": true };
+var validate = ajv.compile(schema);
+console.log(validate(2.01)); // true
+console.log(validate(3.99)); // true
+console.log(validate(2)); // false
+console.log(validate(4)); // false
+```
+
+Several custom keywords (typeof, instanceof, range and propertyNames) are defined in [ajv-keywords](https://github.com/epoberezkin/ajv-keywords) package - they can be used for your schemas and as a starting point for your own custom keywords.
+
+See [Defining custom keywords](https://github.com/epoberezkin/ajv/blob/master/CUSTOM.md) for more details.
+
+
+## Asynchronous schema compilation
+
+During asynchronous compilation remote references are loaded using the supplied function. See the `compileAsync` [method](#api-compileAsync) and the `loadSchema` [option](#options).
+
+Example:
+
+```javascript
+var ajv = new Ajv({ loadSchema: loadSchema });
+
+ajv.compileAsync(schema).then(function (validate) {
+ var valid = validate(data);
+ // ...
+});
+
+function loadSchema(uri) {
+ return request.json(uri).then(function (res) {
+ if (res.statusCode >= 400)
+ throw new Error('Loading error: ' + res.statusCode);
+ return res.body;
+ });
+}
+```
+
+__Please note__: [Option](#options) `missingRefs` should NOT be set to `"ignore"` or `"fail"` for asynchronous compilation to work.
+
+
+## Asynchronous validation
+
+Example in Node.js REPL: https://tonicdev.com/esp/ajv-asynchronous-validation
+
+You can define custom formats and keywords that perform validation asynchronously by accessing database or some other service. You should add `async: true` in the keyword or format definition (see [addFormat](#api-addformat), [addKeyword](#api-addkeyword) and [Defining custom keywords](#defining-custom-keywords)).
+
+If your schema uses asynchronous formats/keywords or refers to some schema that contains them it should have `"$async": true` keyword so that Ajv can compile it correctly. If asynchronous format/keyword or reference to asynchronous schema is used in the schema without `$async` keyword Ajv will throw an exception during schema compilation.
+
+__Please note__: all asynchronous subschemas that are referenced from the current or other schemas should have `"$async": true` keyword as well, otherwise the schema compilation will fail.
+
+Validation function for an asynchronous custom format/keyword should return a promise that resolves with `true` or `false` (or rejects with `new Ajv.ValidationError(errors)` if you want to return custom errors from the keyword function). Ajv compiles asynchronous schemas either to [es7 async functions](http://tc39.github.io/ecmascript-asyncawait/), which can optionally be transpiled with [nodent](https://github.com/MatAtBread/nodent) or with [regenerator](https://github.com/facebook/regenerator), or to [generator functions](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/function*), which can optionally be transpiled with regenerator as well. You can also supply any other transpiler as a function. See [Options](#options).
+
+The compiled validation function has `$async: true` property (if the schema is asynchronous), so you can differentiate these functions if you are using both synchronous and asynchronous schemas.
+
+If you are using generators, the compiled validation function can be either wrapped with [co](https://github.com/tj/co) (the default) or returned as a generator function that can be used directly, e.g. in [koa](http://koajs.com/) 1.0. `co` is a small library; it is included in Ajv (both as an npm dependency and in the browser bundle).
+
+Async functions are currently supported in Chrome 55, Firefox 52, Node.js 7 (with --harmony-async-await) and MS Edge 13 (with flag).
+
+Generator functions are currently supported in Chrome, Firefox and Node.js.
+
+If you are using Ajv in other browsers or in older versions of Node.js you should use one of the available transpiling options. All provided async modes use the global Promise class. If your platform does not have Promise you should use a polyfill that defines it.
+
+Validation result will be a promise that resolves with validated data or rejects with an exception `Ajv.ValidationError` that contains the array of validation errors in `errors` property.
+
+
+Example:
+
+```javascript
+/**
+ * Default mode is non-transpiled generator function wrapped with `co`.
+ * Using package ajv-async (https://github.com/epoberezkin/ajv-async)
+ * you can auto-detect the best async mode.
+ * In this case, without "async" and "transpile" options
+ * (or with option {async: true})
+ * Ajv will choose the first supported/installed option in this order:
+ * 1. native async function
+ * 2. native generator function wrapped with co
+ * 3. es7 async functions transpiled with nodent
+ * 4. es7 async functions transpiled with regenerator
+ */
+
+var setupAsync = require('ajv-async');
+var ajv = setupAsync(new Ajv);
+
+ajv.addKeyword('idExists', {
+ async: true,
+ type: 'number',
+ validate: checkIdExists
+});
+
+
+function checkIdExists(schema, data) {
+ return knex(schema.table)
+ .select('id')
+ .where('id', data)
+ .then(function (rows) {
+ return !!rows.length; // true if record is found
+ });
+}
+
+var schema = {
+ "$async": true,
+ "properties": {
+ "userId": {
+ "type": "integer",
+ "idExists": { "table": "users" }
+ },
+ "postId": {
+ "type": "integer",
+ "idExists": { "table": "posts" }
+ }
+ }
+};
+
+var validate = ajv.compile(schema);
+
+validate({ userId: 1, postId: 19 })
+.then(function (data) {
+ console.log('Data is valid', data); // { userId: 1, postId: 19 }
+})
+.catch(function (err) {
+ if (!(err instanceof Ajv.ValidationError)) throw err;
+ // data is invalid
+ console.log('Validation errors:', err.errors);
+});
+```
+
+### Using transpilers with asynchronous validation functions
+
+To use a transpiler you should separately install it (or load its bundle in the browser).
+
+Ajv npm package includes minified browser bundles of regenerator and nodent in dist folder.
+
+
+#### Using nodent
+
+```javascript
+var setupAsync = require('ajv-async');
+var ajv = new Ajv({ /* async: 'es7', */ transpile: 'nodent' });
+setupAsync(ajv);
+var validate = ajv.compile(schema); // transpiled es7 async function
+validate(data).then(successFunc).catch(errorFunc);
+```
+
+`npm install nodent` or use `nodent.min.js` from dist folder of npm package.
+
+
+#### Using regenerator
+
+```javascript
+var setupAsync = require('ajv-async');
+var ajv = new Ajv({ /* async: 'es7', */ transpile: 'regenerator' });
+setupAsync(ajv);
+var validate = ajv.compile(schema); // transpiled es7 async function
+validate(data).then(successFunc).catch(errorFunc);
+```
+
+`npm install regenerator` or use `regenerator.min.js` from dist folder of npm package.
+
+
+#### Using other transpilers
+
+```javascript
+var ajv = new Ajv({ async: 'es7', processCode: transpileFunc });
+var validate = ajv.compile(schema); // transpiled es7 async function
+validate(data).then(successFunc).catch(errorFunc);
+```
+
+See [Options](#options).
+
+
+#### Comparison of async modes
+
+|mode|transpile speed*|run-time speed*|bundle size|
+|---|:-:|:-:|:-:|
+|es7 async (native)|-|0.75|-|
+|generators (native)|-|1.0|-|
+|es7.nodent|1.35|1.1|215Kb|
+|es7.regenerator|1.0|2.7|1109Kb|
+|regenerator|1.0|3.2|1109Kb|
+
+\* Relative performance in Node.js 7.x — smaller is better.
+
+[nodent](https://github.com/MatAtBread/nodent) has several advantages:
+
+- much smaller browser bundle than regenerator
+- almost the same performance of generated code as native generators in Node.js and the latest Chrome
+- much better performance than native generators in other browsers
+- works in IE 9 (regenerator does not)
+
+
+## Filtering data
+
+With [option `removeAdditional`](#options) (added by [andyscott](https://github.com/andyscott)) you can filter data during the validation.
+
+This option modifies original data.
+
+Example:
+
+```javascript
+var ajv = new Ajv({ removeAdditional: true });
+var schema = {
+ "additionalProperties": false,
+ "properties": {
+ "foo": { "type": "number" },
+ "bar": {
+ "additionalProperties": { "type": "number" },
+ "properties": {
+ "baz": { "type": "string" }
+ }
+ }
+ }
+}
+
+var data = {
+ "foo": 0,
+ "additional1": 1, // will be removed; `additionalProperties` == false
+ "bar": {
+ "baz": "abc",
+ "additional2": 2 // will NOT be removed; `additionalProperties` != false
+ },
+}
+
+var validate = ajv.compile(schema);
+
+console.log(validate(data)); // true
+console.log(data); // { "foo": 0, "bar": { "baz": "abc", "additional2": 2 } }
+```
+
+If the `removeAdditional` option in the example above were `"all"`, then both `additional1` and `additional2` properties would have been removed.
+
+If the option were `"failing"`, then the property `additional1` would have been removed regardless of its value, and the property `additional2` would have been removed only if its value failed the schema in the inner `additionalProperties` (so in the example above it would have stayed because it passes that schema, but any non-number value would have been removed).
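A rough sketch of the `removeAdditional: true` behaviour for a single object level (real Ajv performs this inside the generated validation function and also takes `patternProperties` into account, which this sketch ignores):

```javascript
// Illustrative sketch; not Ajv's generated code.
function removeAdditional(data, schema) {
  if (schema.additionalProperties !== false) return data; // only remove when forbidden
  var allowed = Object.keys(schema.properties || {});
  Object.keys(data).forEach(function (key) {
    if (allowed.indexOf(key) === -1) delete data[key]; // modifies the original data
  });
  return data;
}
```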
+
+__Please note__: If you use `removeAdditional` option with `additionalProperties` keyword inside `anyOf`/`oneOf` keywords your validation can fail with this schema, for example:
+
+```json
+{
+ "type": "object",
+ "oneOf": [
+ {
+ "properties": {
+ "foo": { "type": "string" }
+ },
+ "required": [ "foo" ],
+ "additionalProperties": false
+ },
+ {
+ "properties": {
+ "bar": { "type": "integer" }
+ },
+ "required": [ "bar" ],
+ "additionalProperties": false
+ }
+ ]
+}
+```
+
+The intention of the schema above is to allow objects with either the string property "foo" or the integer property "bar", but not with both and not with any other properties.
+
+With the option `removeAdditional: true` the validation will pass for the object `{ "foo": "abc" }` but will fail for the object `{ "bar": 1 }`. This happens because, while the first subschema in `oneOf` is being validated, the property `bar` is removed - according to the standard it is an additional property there, because it is not included in the `properties` keyword of that subschema.
+
+While this behaviour is unexpected (issues [#129](https://github.com/epoberezkin/ajv/issues/129), [#134](https://github.com/epoberezkin/ajv/issues/134)), it is correct. To have the expected behaviour (both objects are allowed and additional properties are removed) the schema has to be refactored in this way:
+
+```json
+{
+ "type": "object",
+ "properties": {
+ "foo": { "type": "string" },
+ "bar": { "type": "integer" }
+ },
+ "additionalProperties": false,
+ "oneOf": [
+ { "required": [ "foo" ] },
+ { "required": [ "bar" ] }
+ ]
+}
+```
+
+The schema above is also more efficient - it will compile into a faster function.
+
+
+## Assigning defaults
+
+With [option `useDefaults`](#options) Ajv will assign values from `default` keyword in the schemas of `properties` and `items` (when it is the array of schemas) to the missing properties and items.
+
+This option modifies original data.
+
+__Please note__: by default the `default` value is inserted into the generated validation code as a literal (starting from v4.0), so the value inserted into the data will be a deep clone of the default in the schema.
+
+If you need to insert the default value in the data by reference pass the option `useDefaults: "shared"`.
+
+Inserting defaults by reference can be faster (in case you have an object in `default`) and it allows to have dynamic values in defaults, e.g. timestamp, without recompiling the schema. The side effect is that modifying the default value in any validated data instance will change the default in the schema and in other validated data instances. See example 3 below.
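The difference between the two modes can be sketched in plain JavaScript (`applyDefault` is a hypothetical helper, not an Ajv API; the deep clone uses `JSON.parse(JSON.stringify(...))` for brevity):

```javascript
// Illustrative sketch of cloned vs shared default insertion.
function applyDefault(data, key, defaultValue, shared) {
  if (data[key] === undefined) {
    data[key] = shared
      ? defaultValue                              // same object every time
      : JSON.parse(JSON.stringify(defaultValue)); // deep clone per instance
  }
  return data;
}
```

With `shared`, mutating the inserted value in one data instance changes the default seen by every later instance; with a clone it does not.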
+
+
+Example 1 (`default` in `properties`):
+
+```javascript
+var ajv = new Ajv({ useDefaults: true });
+var schema = {
+ "type": "object",
+ "properties": {
+ "foo": { "type": "number" },
+ "bar": { "type": "string", "default": "baz" }
+ },
+ "required": [ "foo", "bar" ]
+};
+
+var data = { "foo": 1 };
+
+var validate = ajv.compile(schema);
+
+console.log(validate(data)); // true
+console.log(data); // { "foo": 1, "bar": "baz" }
+```
+
+Example 2 (`default` in `items`):
+
+```javascript
+var schema = {
+ "type": "array",
+ "items": [
+ { "type": "number" },
+ { "type": "string", "default": "foo" }
+ ]
+}
+
+var data = [ 1 ];
+
+var validate = ajv.compile(schema);
+
+console.log(validate(data)); // true
+console.log(data); // [ 1, "foo" ]
+```
+
+Example 3 (inserting "defaults" by reference):
+
+```javascript
+var ajv = new Ajv({ useDefaults: 'shared' });
+
+var schema = {
+ properties: {
+ foo: {
+ default: { bar: 1 }
+ }
+ }
+}
+
+var validate = ajv.compile(schema);
+
+var data = {};
+console.log(validate(data)); // true
+console.log(data); // { foo: { bar: 1 } }
+
+data.foo.bar = 2;
+
+var data2 = {};
+console.log(validate(data2)); // true
+console.log(data2); // { foo: { bar: 2 } }
+```
+
+`default` keywords in other cases are ignored:
+
+- not in `properties` or `items` subschemas
+- in schemas inside `anyOf`, `oneOf` and `not` (see [#42](https://github.com/epoberezkin/ajv/issues/42))
+- in `if` subschema of `switch` keyword
+- in schemas generated by custom macro keywords
+
+
+## Coercing data types
+
+When you are validating user inputs all your data properties are usually strings. The option `coerceTypes` allows you to have your data types coerced to the types specified in your schema `type` keywords, both to pass the validation and to use the correctly typed data afterwards.
+
+This option modifies original data.
+
+__Please note__: if you pass a scalar value to the validating function its type will be coerced and it will pass the validation, but the value of the variable you pass won't be updated because scalars are passed by value.
+
+
+Example 1:
+
+```javascript
+var ajv = new Ajv({ coerceTypes: true });
+var schema = {
+ "type": "object",
+ "properties": {
+ "foo": { "type": "number" },
+ "bar": { "type": "boolean" }
+ },
+ "required": [ "foo", "bar" ]
+};
+
+var data = { "foo": "1", "bar": "false" };
+
+var validate = ajv.compile(schema);
+
+console.log(validate(data)); // true
+console.log(data); // { "foo": 1, "bar": false }
+```
+
+Example 2 (array coercions):
+
+```javascript
+var ajv = new Ajv({ coerceTypes: 'array' });
+var schema = {
+ "properties": {
+ "foo": { "type": "array", "items": { "type": "number" } },
+ "bar": { "type": "boolean" }
+ }
+};
+
+var data = { "foo": "1", "bar": ["false"] };
+
+var validate = ajv.compile(schema);
+
+console.log(validate(data)); // true
+console.log(data); // { "foo": [1], "bar": false }
+```
+
+The coercion rules, as you can see from the example, are different from JavaScript both to validate user input as expected and to have the coercion reversible (to correctly validate cases where different types are defined in subschemas of "anyOf" and other compound keywords).
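A sketch of a few of those rules (`coerce` is a hypothetical helper, not Ajv's code): a string coerces to boolean only when it is exactly `"true"` or `"false"`, and to number only when it actually parses as one, so coercion can fail instead of producing the surprising values that plain JavaScript conversions do:

```javascript
// Illustrative sketch of a subset of the coercion rules.
// Returns { ok: true, value } on success or { ok: false } when coercion fails.
function coerce(value, type) {
  if (type === 'number' && typeof value === 'string') {
    var n = +value;
    return value.trim() !== '' && !isNaN(n) ? { ok: true, value: n } : { ok: false };
  }
  if (type === 'boolean' && typeof value === 'string') {
    if (value === 'true') return { ok: true, value: true };
    if (value === 'false') return { ok: true, value: false }; // unlike Boolean('false')
    return { ok: false };
  }
  if (type === 'string' && typeof value === 'number')
    return { ok: true, value: String(value) };
  return typeof value === type ? { ok: true, value: value } : { ok: false };
}
```

Compare `coerce('false', 'boolean')`, which yields `false`, with plain JavaScript, where `Boolean('false')` is `true`.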
+
+See [Coercion rules](https://github.com/epoberezkin/ajv/blob/master/COERCION.md) for details.
+
+
+## API
+
+##### new Ajv(Object options) -> Object
+
+Create Ajv instance.
+
+
+##### .compile(Object schema) -> Function<Object data>
+
+Generate validating function and cache the compiled schema for future use.
+
+Validating function returns boolean and has properties `errors` with the errors from the last validation (`null` if there were no errors) and `schema` with the reference to the original schema.
+
+Unless the option `validateSchema` is false, the schema will be validated against meta-schema and if schema is invalid the error will be thrown. See [options](#options).
+
+
+##### .compileAsync(Object schema [, Boolean meta] [, Function callback]) -> Promise
+
+Asynchronous version of `compile` method that loads missing remote schemas using asynchronous function in `options.loadSchema`. This function returns a Promise that resolves to a validation function. An optional callback passed to `compileAsync` will be called with 2 parameters: error (or null) and validating function. The returned promise will reject (and the callback will be called with an error) when:
+
+- missing schema can't be loaded (`loadSchema` returns a Promise that rejects).
+- a schema containing a missing reference is loaded, but the reference cannot be resolved.
+- schema (or some loaded/referenced schema) is invalid.
+
+The function compiles the schema, loading missing schemas (and meta-schemas) one at a time until all of them are loaded.
+
+You can asynchronously compile meta-schema by passing `true` as the second parameter.
+
+See example in [Asynchronous compilation](#asynchronous-schema-compilation).
+
+
+##### .validate(Object schema|String key|String ref, data) -> Boolean
+
+Validate data using passed schema (it will be compiled and cached).
+
+Instead of the schema you can use the key that was previously passed to `addSchema`, the schema id if it was present in the schema or any previously resolved reference.
+
+Validation errors will be available in the `errors` property of Ajv instance (`null` if there were no errors).
+
+__Please note__: every time this method is called the errors are overwritten so you need to copy them to another variable if you want to use them later.
+
+If the schema is asynchronous (has `$async` keyword on the top level) this method returns a Promise. See [Asynchronous validation](#asynchronous-validation).
+
+
+##### .addSchema(Array<Object>|Object schema [, String key]) -> Ajv
+
+Add schema(s) to validator instance. This method does not compile schemas (but it still validates them). Because of that dependencies can be added in any order and circular dependencies are supported. It also prevents unnecessary compilation of schemas that are containers for other schemas but not used as a whole.
+
+An array of schemas can be passed (the schemas should have ids); in that case the second parameter is ignored.
+
+A key can be passed that can later be used to reference the schema; it will be used as the schema id if there is no id inside the schema. If the key is not passed, the schema id will be used as the key.
+
+
+Once the schema is added, it (and all the references inside it) can be referenced in other schemas and used to validate data.
+
+Although `addSchema` does not compile schemas, explicit compilation is not required - the schema will be compiled the first time it is used.
+
+By default the schema is validated against meta-schema before it is added, and if the schema does not pass validation the exception is thrown. This behaviour is controlled by `validateSchema` option.
+
+__Please note__: Ajv uses the [method chaining syntax](https://en.wikipedia.org/wiki/Method_chaining) for all methods with the prefix `add*` and `remove*`.
+This allows you to do nice things like the following.
+
+```javascript
+var validate = new Ajv().addSchema(schema).addFormat(name, regex).getSchema(uri);
+```
+
+##### .addMetaSchema(Array<Object>|Object schema [, String key]) -> Ajv
+
+Adds meta schema(s) that can be used to validate other schemas. This function should be used instead of `addSchema` because there may be instance options that would compile a meta-schema incorrectly (at the moment only the `removeAdditional` option).
+
+There is no need to explicitly add draft 6 meta schema (http://json-schema.org/draft-06/schema and http://json-schema.org/schema) - it is added by default, unless option `meta` is set to `false`. You only need to use it if you have a changed meta-schema that you want to use to validate your schemas. See `validateSchema`.
+
+
+##### .validateSchema(Object schema) -> Boolean
+
+Validates a schema. This method should be used to validate schemas rather than `validate` due to the inconsistency of the `uri` format in the JSON Schema standard.
+
+By default this method is called automatically when the schema is added, so you rarely need to use it directly.
+
+If the schema doesn't have a `$schema` property, it is validated against the draft 6 meta-schema (the option `meta` should not be false).
+
+If the schema has a `$schema` property, the schema with this id (which should have been added previously) is used to validate the passed schema.
+
+Errors will be available at `ajv.errors`.
+
+
+##### .getSchema(String key) -> Function<Object data>
+
+Retrieve compiled schema previously added with `addSchema` by the key passed to `addSchema` or by its full reference (id). The returned validating function has `schema` property with the reference to the original schema.
+
+
+##### .removeSchema([Object schema|String key|String ref|RegExp pattern]) -> Ajv
+
+Remove added/cached schema. Even if the schema is referenced by other schemas, it can be safely removed, as dependent schemas have local references.
+
+Schema can be removed using:
+- key passed to `addSchema`
+- its full reference (id)
+- RegExp that should match schema id or key (meta-schemas won't be removed)
+- actual schema object that will be stable-stringified to remove schema from cache
+
+If no parameter is passed all schemas but meta-schemas will be removed and the cache will be cleared.
+
+
+##### .addFormat(String name, String|RegExp|Function|Object format) -> Ajv
+
+Add custom format to validate strings or numbers. It can also be used to replace pre-defined formats for Ajv instance.
+
+Strings are converted to RegExp.
+
+Function should return validation result as `true` or `false`.
+
+If an object is passed it can have the properties `validate`, `compare`, `async` and `type`:
+
+- _validate_: a string, RegExp or a function as described above.
+- _compare_: an optional comparison function that accepts two strings and compares them according to the format meaning. This function is used with keywords `formatMaximum`/`formatMinimum` (defined in [ajv-keywords](https://github.com/epoberezkin/ajv-keywords) package). It should return `1` if the first value is bigger than the second value, `-1` if it is smaller and `0` if it is equal.
+- _async_: an optional `true` value if `validate` is an asynchronous function; in this case it should return a promise that resolves with a value `true` or `false`.
+- _type_: an optional type of data that the format applies to. It can be `"string"` (default) or `"number"` (see https://github.com/epoberezkin/ajv/issues/291#issuecomment-259923858). If the type of data is different, the validation will pass.
+
+Custom formats can be also added via `formats` option.
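As an illustration, a complete format object for a hypothetical `"yyyy-mm-dd"` format could look like the sketch below; the format name, the regex and the comparison logic are assumptions for this example, not part of Ajv.

```javascript
// Sketch of a format object for .addFormat (illustrative, not Ajv source).
var dateFormat = {
  // validate: a string, RegExp or function as described above
  validate: /^\d\d\d\d-\d\d-\d\d$/,
  // compare: used by formatMaximum/formatMinimum from ajv-keywords;
  // ISO-style dates compare correctly as plain strings
  compare: function (a, b) {
    if (a > b) return 1;
    if (a < b) return -1;
    return 0;
  },
  type: 'string' // the default, shown for clarity
};
```

It would then be registered with `ajv.addFormat('yyyy-mm-dd', dateFormat)`.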
+
+
+##### .addKeyword(String keyword, Object definition) -> Ajv
+
+Add custom validation keyword to Ajv instance.
+
+Keyword should be different from all standard JSON schema keywords and different from previously defined keywords. There is no way to redefine keywords or to remove keyword definition from the instance.
+
+Keyword must start with a letter, `_` or `$`, and may continue with letters, numbers, `_`, `$`, or `-`.
+It is recommended to use an application-specific prefix for keywords to avoid current and future name collisions.
+
+Example Keywords:
+- `"xyz-example"`: valid, and uses prefix for the xyz project to avoid name collisions.
+- `"example"`: valid, but not recommended as it could collide with future versions of JSON schema etc.
+- `"3-example"`: invalid as numbers are not allowed to be the first character in a keyword
+
+Keyword definition is an object with the following properties:
+
+- _type_: optional string or array of strings with data type(s) that the keyword applies to. If not present, the keyword will apply to all types.
+- _validate_: validating function
+- _compile_: compiling function
+- _macro_: macro function
+- _inline_: compiling function that returns code (as string)
+- _schema_: an optional `false` value used with "validate" keyword to not pass schema
+- _metaSchema_: an optional meta-schema for keyword schema
+- _modifying_: `true` MUST be passed if keyword modifies data
+- _valid_: pass `true`/`false` to pre-define validation result, the result returned from validation function will be ignored. This option cannot be used with macro keywords.
+- _$data_: an optional `true` value to support [$data reference](#data-reference) as the value of custom keyword. The reference will be resolved at validation time. If the keyword has meta-schema it would be extended to allow $data and it will be used to validate the resolved value. Supporting $data reference requires that keyword has validating function (as the only option or in addition to compile, macro or inline function).
+- _async_: an optional `true` value if the validation function is asynchronous (whether it is compiled or passed in _validate_ property); in this case it should return a promise that resolves with a value `true` or `false`. This option is ignored in case of "macro" and "inline" keywords.
+- _errors_: an optional boolean indicating whether keyword returns errors. If this property is not set Ajv will determine if the errors were set in case of failed validation.
+
+_compile_, _macro_ and _inline_ are mutually exclusive, only one should be used at a time. _validate_ can be used separately or in addition to them to support $data reference.
+
+__Please note__: If the keyword is validating data type that is different from the type(s) in its definition, the validation function will not be called (and expanded macro will not be used), so there is no need to check for data type inside validation function or inside schema returned by macro function (unless you want to enforce a specific type and for some reason do not want to use a separate `type` keyword for that). In the same way as standard keywords work, if the keyword does not apply to the data type being validated, the validation of this keyword will succeed.
+
+See [Defining custom keywords](#defining-custom-keywords) for more details.
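For illustration, a definition using `compile` could look like this (the keyword name `x-range` and its schema shape are hypothetical, chosen here only to show the structure):

```javascript
// Sketch of a custom keyword definition for .addKeyword (not Ajv source).
var rangeDef = {
  type: 'number', // only called when the data is a number
  compile: function (sch) {
    // sch is the keyword value in the schema, e.g. [2, 4] meaning [min, max]
    var min = sch[0], max = sch[1];
    // the returned function validates the data
    return function (data) {
      return data >= min && data <= max;
    };
  },
  metaSchema: { // validates the keyword value itself
    type: 'array',
    items: { type: 'number' },
    minItems: 2,
    maxItems: 2
  }
};
```

It would be registered as `ajv.addKeyword('x-range', rangeDef)` and used in schemas as `{"type": "number", "x-range": [2, 4]}`.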
+
+
+##### .getKeyword(String keyword) -> Object|Boolean
+
+Returns custom keyword definition, `true` for pre-defined keywords and `false` if the keyword is unknown.
+
+
+##### .removeKeyword(String keyword) -> Ajv
+
+Removes a custom or pre-defined keyword so you can redefine it.
+
+While this method can be used to extend pre-defined keywords, it can also be used to completely change their meaning, which may lead to unexpected results.
+
+__Please note__: schemas compiled before the keyword is removed will continue to work without changes. To recompile schemas use `removeSchema` method and compile them again.
+
+
+##### .errorsText([Array<Object> errors [, Object options]]) -> String
+
+Returns the text with all errors in a String.
+
+Options can have properties `separator` (string used to separate errors, ", " by default) and `dataVar` (the variable name that dataPaths are prefixed with, "data" by default).
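A rough plain-JavaScript sketch of what this method produces (simplified for illustration; the real implementation handles more cases):

```javascript
// Simplified illustration of errorsText behaviour, not Ajv's actual code.
function errorsText(errors, options) {
  options = options || {};
  var separator = options.separator || ', ';
  var dataVar = options.dataVar || 'data';
  // each error becomes "<dataVar><dataPath> <message>"
  return errors
    .map(function (e) { return dataVar + e.dataPath + ' ' + e.message; })
    .join(separator);
}
```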
+
+
+## Options
+
+Defaults:
+
+```javascript
+{
+ // validation and reporting options:
+ $data: false,
+ allErrors: false,
+ verbose: false,
+ jsonPointers: false,
+ uniqueItems: true,
+ unicode: true,
+ format: 'fast',
+ formats: {},
+ unknownFormats: true,
+ schemas: {},
+ logger: undefined,
+ // referenced schema options:
+  schemaId: undefined, // recommended '$id'
+ missingRefs: true,
+ extendRefs: 'ignore', // recommended 'fail'
+ loadSchema: undefined, // function(uri: string): Promise {}
+ // options to modify validated data:
+ removeAdditional: false,
+ useDefaults: false,
+ coerceTypes: false,
+ // asynchronous validation options:
+ async: 'co*',
+ transpile: undefined, // requires ajv-async package
+ // advanced options:
+ meta: true,
+ validateSchema: true,
+ addUsedSchema: true,
+ inlineRefs: true,
+ passContext: false,
+ loopRequired: Infinity,
+ ownProperties: false,
+ multipleOfPrecision: false,
+ errorDataPath: 'object',
+ messages: true,
+ sourceCode: false,
+ processCode: undefined, // function (str: string): string {}
+ cache: new Cache,
+ serialize: undefined
+}
+```
+
+##### Validation and reporting options
+
+- _$data_: support [$data references](#data-reference). Draft 6 meta-schema that is added by default will be extended to allow them. If you want to use another meta-schema you need to use $dataMetaSchema method to add support for $data reference. See [API](#api).
+- _allErrors_: check all rules collecting all errors. Default is to return after the first error.
+- _verbose_: include the reference to the part of the schema (`schema` and `parentSchema`) and validated data in errors (false by default).
+- _jsonPointers_: set `dataPath` property of errors using [JSON Pointers](https://tools.ietf.org/html/rfc6901) instead of JavaScript property access notation.
+- _uniqueItems_: validate `uniqueItems` keyword (true by default).
+- _unicode_: calculate correct length of strings with unicode pairs (true by default). Pass `false` to use `.length` of strings that is faster, but gives "incorrect" lengths of strings with unicode pairs - each unicode pair is counted as two characters.
+- _format_: formats validation mode ('fast' by default). Pass 'full' for more correct and slower validation, or `false` not to validate formats at all. E.g., 25:00:00 and 2015/14/33 will be invalid time and date in 'full' mode but they will be valid in 'fast' mode.
+- _formats_: an object with custom formats. Keys and values will be passed to `addFormat` method.
+- _unknownFormats_: handling of unknown formats. Option values:
+ - `true` (default) - if an unknown format is encountered the exception is thrown during schema compilation. If `format` keyword value is [$data reference](#data-reference) and it is unknown the validation will fail.
+ - `[String]` - an array of unknown format names that will be ignored. This option can be used to allow usage of third party schemas with format(s) for which you don't have definitions, but still fail if another unknown format is used. If `format` keyword value is [$data reference](#data-reference) and it is not in this array the validation will fail.
+ - `"ignore"` - to log warning during schema compilation and always pass validation (the default behaviour in versions before 5.0.0). This option is not recommended, as it allows to mistype format name and it won't be validated without any error message. This behaviour is required by JSON-schema specification.
+- _schemas_: an array or object of schemas that will be added to the instance. If you pass an array, the schemas must have IDs in them. When an object is passed, the method `addSchema(value, key)` will be called for each schema in this object.
+- _logger_: sets the logging method. Default is the global `console` object that should have methods `log`, `warn` and `error`. Option values:
+ - custom logger - it should have methods `log`, `warn` and `error`. If any of these methods is missing an exception will be thrown.
+ - `false` - logging is disabled.
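The behaviour that the `unicode` option corrects can be seen in plain JavaScript: `.length` counts UTF-16 code units, so a character outside the Basic Multilingual Plane (stored as a surrogate pair) counts as two. An illustrative sketch:

```javascript
// Why the unicode option exists: .length counts UTF-16 code units.
// U+1D4B3 is a single character but is stored as a surrogate pair.
var s = '\uD835\uDCB3';
var codeUnits = s.length;              // what unicode: false effectively uses
var codePoints = Array.from(s).length; // the "correct" length (unicode: true)
```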
+
+
+##### Referenced schema options
+
+- _schemaId_: this option defines which keywords are used as schema URI. Option value:
+ - `"$id"` (recommended) - only use `$id` keyword as schema URI (as specified in JSON Schema draft-06), ignore `id` keyword (if it is present a warning will be logged).
+ - `"id"` - only use `id` keyword as schema URI (as specified in JSON Schema draft-04), ignore `$id` keyword (if it is present a warning will be logged).
+  - `undefined` (default) - use both `$id` and `id` keywords as schema URI. If both are present (in the same schema object) and different, an exception will be thrown during schema compilation.
+- _missingRefs_: handling of missing referenced schemas. Option values:
+ - `true` (default) - if the reference cannot be resolved during compilation the exception is thrown. The thrown error has properties `missingRef` (with hash fragment) and `missingSchema` (without it). Both properties are resolved relative to the current base id (usually schema id, unless it was substituted).
+ - `"ignore"` - to log error during compilation and always pass validation.
+ - `"fail"` - to log error and successfully compile schema but fail validation if this rule is checked.
+- _extendRefs_: validation of other keywords when `$ref` is present in the schema. Option values:
+ - `"ignore"` (default) - when `$ref` is used other keywords are ignored (as per [JSON Reference](https://tools.ietf.org/html/draft-pbryan-zyp-json-ref-03#section-3) standard). A warning will be logged during the schema compilation.
+ - `"fail"` (recommended) - if other validation keywords are used together with `$ref` the exception will be thrown when the schema is compiled. This option is recommended to make sure schema has no keywords that are ignored, which can be confusing.
+ - `true` - validate all keywords in the schemas with `$ref` (the default behaviour in versions before 5.0.0).
+- _loadSchema_: asynchronous function that will be used to load remote schemas when `compileAsync` [method](#api-compileAsync) is used and some reference is missing (option `missingRefs` should NOT be 'fail' or 'ignore'). This function should accept remote schema uri as a parameter and return a Promise that resolves to a schema. See example in [Asynchronous compilation](#asynchronous-schema-compilation).
+
+
+##### Options to modify validated data
+
+- _removeAdditional_: remove additional properties - see example in [Filtering data](#filtering-data). This option is not used if schema is added with `addMetaSchema` method. Option values:
+ - `false` (default) - not to remove additional properties
+ - `"all"` - all additional properties are removed, regardless of `additionalProperties` keyword in schema (and no validation is made for them).
+ - `true` - only additional properties with `additionalProperties` keyword equal to `false` are removed.
+ - `"failing"` - additional properties that fail schema validation will be removed (where `additionalProperties` keyword is `false` or schema).
+- _useDefaults_: replace missing properties and items with the values from corresponding `default` keywords. Default behaviour is to ignore `default` keywords. This option is not used if schema is added with `addMetaSchema` method. See examples in [Assigning defaults](#assigning-defaults). Option values:
+ - `false` (default) - do not use defaults
+ - `true` - insert defaults by value (safer and slower, object literal is used).
+ - `"shared"` - insert defaults by reference (faster). If the default is an object, it will be shared by all instances of validated data. If you modify the inserted default in the validated data, it will be modified in the schema as well.
+- _coerceTypes_: change data type of data to match `type` keyword. See the example in [Coercing data types](#coercing-data-types) and [coercion rules](https://github.com/epoberezkin/ajv/blob/master/COERCION.md). Option values:
+ - `false` (default) - no type coercion.
+ - `true` - coerce scalar data types.
+ - `"array"` - in addition to coercions between scalar types, coerce scalar data to an array with one element and vice versa (as required by the schema).
+
+
+##### Asynchronous validation options
+
+- _async_: determines how Ajv compiles asynchronous schemas (see [Asynchronous validation](#asynchronous-validation)) to functions. Option values:
+ - `"*"` / `"co*"` (default) - compile to generator function ("co*" - wrapped with `co.wrap`). If generators are not supported and you don't provide `processCode` option (or `transpile` option if you use [ajv-async](https://github.com/epoberezkin/ajv-async) package), the exception will be thrown when async schema is compiled.
+ - `"es7"` - compile to es7 async function. Unless your platform supports them you need to provide `processCode` or `transpile` option. According to [compatibility table](http://kangax.github.io/compat-table/es7/)) async functions are supported by:
+ - Firefox 52,
+ - Chrome 55,
+ - Node.js 7 (with `--harmony-async-await`),
+ - MS Edge 13 (with flag).
+  - `undefined`/`true` - auto-detect async mode. It requires the [ajv-async](https://github.com/epoberezkin/ajv-async) package. If the `transpile` option is not passed, ajv-async will choose, during the creation of the Ajv instance, the first of the supported/installed async/transpile modes in this order:
+    - "es7" (native async functions),
+    - "co*" (native generators with co.wrap),
+    - "es7"/"nodent",
+    - "co*"/"regenerator".
+
+ If none of the options is available the exception will be thrown.
+- _transpile_: Requires [ajv-async](https://github.com/epoberezkin/ajv-async) package. It determines whether Ajv transpiles compiled asynchronous validation function. Option values:
+ - `"nodent"` - transpile with [nodent](https://github.com/MatAtBread/nodent). If nodent is not installed, the exception will be thrown. nodent can only transpile es7 async functions; it will enforce this mode.
+ - `"regenerator"` - transpile with [regenerator](https://github.com/facebook/regenerator). If regenerator is not installed, the exception will be thrown.
+ - a function - this function should accept the code of validation function as a string and return transpiled code. This option allows you to use any other transpiler you prefer. If you are passing a function, you can simply pass it to `processCode` option without using ajv-async.
+
+
+##### Advanced options
+
+- _meta_: add [meta-schema](http://json-schema.org/documentation.html) so it can be used by other schemas (true by default). If an object is passed, it will be used as the default meta-schema for schemas that have no `$schema` keyword. This default meta-schema MUST have `$schema` keyword.
+- _validateSchema_: validate added/compiled schemas against meta-schema (true by default). `$schema` property in the schema can either be http://json-schema.org/schema or http://json-schema.org/draft-04/schema or absent (the draft-6 meta-schema will be used) or can be a reference to the schema previously added with `addMetaSchema` method. Option values:
+ - `true` (default) - if the validation fails, throw the exception.
+ - `"log"` - if the validation fails, log error.
+ - `false` - skip schema validation.
+- _addUsedSchema_: by default methods `compile` and `validate` add schemas to the instance if they have `$id` (or `id`) property that doesn't start with "#". If `$id` is present and it is not unique the exception will be thrown. Set this option to `false` to skip adding schemas to the instance and the `$id` uniqueness check when these methods are used. This option does not affect `addSchema` method.
+- _inlineRefs_: Affects compilation of referenced schemas. Option values:
+ - `true` (default) - the referenced schemas that don't have refs in them are inlined, regardless of their size - that substantially improves performance at the cost of the bigger size of compiled schema functions.
+ - `false` - to not inline referenced schemas (they will be compiled as separate functions).
+ - integer number - to limit the maximum number of keywords of the schema that will be inlined.
+- _passContext_: pass validation context to custom keyword functions. If this option is `true` and you pass some context to the compiled validation function with `validate.call(context, data)`, the `context` will be available as `this` in your custom keywords. By default `this` is Ajv instance.
+- _loopRequired_: by default `required` keyword is compiled into a single expression (or a sequence of statements in `allErrors` mode). In case of a very large number of properties in this keyword it may result in a very big validation function. Pass integer to set the number of properties above which `required` keyword will be validated in a loop - smaller validation function size but also worse performance.
+- _ownProperties_: by default Ajv iterates over all enumerable object properties; when this option is `true` only own enumerable object properties (i.e. found directly on the object rather than on its prototype) are iterated. Contributed by @mbroadst.
+- _multipleOfPrecision_: by default `multipleOf` keyword is validated by comparing the result of division with parseInt() of that result. It works for dividers that are bigger than 1. For small dividers such as 0.01 the result of the division is usually not integer (even when it should be integer, see issue [#84](https://github.com/epoberezkin/ajv/issues/84)). If you need to use fractional dividers set this option to some positive integer N to have `multipleOf` validated using this formula: `Math.abs(Math.round(division) - division) < 1e-N` (it is slower but allows for floating point arithmetic deviations).
+- _errorDataPath_: set `dataPath` to point to 'object' (default) or to 'property' when validating keywords `required`, `additionalProperties` and `dependencies`.
+- _messages_: Include human-readable messages in errors. `true` by default. `false` can be passed when custom messages are used (e.g. with [ajv-i18n](https://github.com/epoberezkin/ajv-i18n)).
+- _sourceCode_: add `sourceCode` property to validating function (for debugging; this code can be different from the result of toString call).
+- _processCode_: an optional function to process generated code before it is passed to Function constructor. It can be used to either beautify (the validating function is generated without line-breaks) or to transpile code. Starting from version 5.0.0 this option replaced options:
+ - `beautify` that formatted the generated function using [js-beautify](https://github.com/beautify-web/js-beautify). If you want to beautify the generated code pass `require('js-beautify').js_beautify`.
+ - `transpile` that transpiled asynchronous validation function. You can still use `transpile` option with [ajv-async](https://github.com/epoberezkin/ajv-async) package. See [Asynchronous validation](#asynchronous-validation) for more information.
+- _cache_: an optional instance of cache to store compiled schemas using stable-stringified schema as a key. For example, set-associative cache [sacjs](https://github.com/epoberezkin/sacjs) can be used. If not passed then a simple hash is used which is good enough for the common use case (a limited number of statically defined schemas). Cache should have methods `put(key, value)`, `get(key)`, `del(key)` and `clear()`.
+- _serialize_: an optional function to serialize schema to cache key. Pass `false` to use schema itself as a key (e.g., if WeakMap used as a cache). By default [fast-json-stable-stringify](https://github.com/epoberezkin/fast-json-stable-stringify) is used.
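The `multipleOfPrecision` formula described above can be expressed directly (an illustration of the formula, not Ajv's generated code):

```javascript
// Illustration of the multipleOfPrecision option described above.
function isMultipleOf(data, divider, precision) {
  var division = data / divider;
  if (precision) {
    // tolerant comparison: |round(division) - division| < 1e-precision
    return Math.abs(Math.round(division) - division) < Math.pow(10, -precision);
  }
  return division === parseInt(division, 10); // default exact comparison
}
```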
+
+
+## Validation errors
+
+In case of validation failure, Ajv assigns the array of errors to `errors` property of validation function (or to `errors` property of Ajv instance when `validate` or `validateSchema` methods were called). In case of [asynchronous validation](#asynchronous-validation), the returned promise is rejected with exception `Ajv.ValidationError` that has `errors` property.
+
+
+### Error objects
+
+Each error is an object with the following properties:
+
+- _keyword_: validation keyword.
+- _dataPath_: the path to the part of the data that was validated. By default `dataPath` uses JavaScript property access notation (e.g., `".prop[1].subProp"`). When the option `jsonPointers` is true (see [Options](#options)) `dataPath` will be set using JSON pointer standard (e.g., `"/prop/1/subProp"`).
+- _schemaPath_: the path (JSON-pointer as a URI fragment) to the schema of the keyword that failed validation.
+- _params_: the object with the additional information about error that can be used to create custom error messages (e.g., using [ajv-i18n](https://github.com/epoberezkin/ajv-i18n) package). See below for parameters set by all keywords.
+- _message_: the standard error message (can be excluded with option `messages` set to false).
+- _schema_: the schema of the keyword (added with `verbose` option).
+- _parentSchema_: the schema containing the keyword (added with `verbose` option)
+- _data_: the data validated by the keyword (added with `verbose` option).
+
+__Please note__: `propertyNames` keyword schema validation errors have an additional property `propertyName`, and `dataPath` points to the object. After schema validation of each property name, if it is invalid an additional error is added with the property `keyword` equal to `"propertyNames"`.
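The two `dataPath` notations (the default property-access notation vs JSON Pointers with `jsonPointers: true`) can be related with a small sketch; this is a hypothetical conversion for simple paths only, ignoring JSON Pointer escaping, and is not Ajv's code:

```javascript
// Illustrative conversion between the two dataPath notations
// for simple paths (no escaping edge cases handled).
function toJsonPointer(dataPath) {
  return dataPath
    .replace(/\.([^.\[\]]+)/g, '/$1')  // .prop -> /prop
    .replace(/\[(\d+)\]/g, '/$1');     // [1]   -> /1
}
```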
+
+
+### Error parameters
+
+Properties of `params` object in errors depend on the keyword that failed validation.
+
+- `maxItems`, `minItems`, `maxLength`, `minLength`, `maxProperties`, `minProperties` - property `limit` (number, the schema of the keyword).
+- `additionalItems` - property `limit` (the maximum number of allowed items in case when `items` keyword is an array of schemas and `additionalItems` is false).
+- `additionalProperties` - property `additionalProperty` (the property not used in `properties` and `patternProperties` keywords).
+- `dependencies` - properties:
+ - `property` (dependent property),
+ - `missingProperty` (required missing dependency - only the first one is reported currently)
+ - `deps` (required dependencies, comma separated list as a string),
+ - `depsCount` (the number of required dependencies).
+- `format` - property `format` (the schema of the keyword).
+- `maximum`, `minimum` - properties:
+ - `limit` (number, the schema of the keyword),
+ - `exclusive` (boolean, the schema of `exclusiveMaximum` or `exclusiveMinimum`),
+ - `comparison` (string, comparison operation to compare the data to the limit, with the data on the left and the limit on the right; can be "<", "<=", ">", ">=")
+- `multipleOf` - property `multipleOf` (the schema of the keyword)
+- `pattern` - property `pattern` (the schema of the keyword)
+- `required` - property `missingProperty` (required property that is missing).
+- `propertyNames` - property `propertyName` (an invalid property name).
+- `patternRequired` (in ajv-keywords) - property `missingPattern` (required pattern that did not match any property).
+- `type` - property `type` (required type(s), a string, can be a comma-separated list)
+- `uniqueItems` - properties `i` and `j` (indices of duplicate items).
+- `enum` - property `allowedValues` pointing to the array of values (the schema of the keyword).
+- `$ref` - property `ref` with the referenced schema URI.
+- custom keywords (in case keyword definition doesn't create errors) - property `keyword` (the keyword name).
+
+
+## Related packages
+
+- [ajv-async](https://github.com/epoberezkin/ajv-async) - configure async validation mode
+- [ajv-cli](https://github.com/jessedc/ajv-cli) - command line interface
+- [ajv-errors](https://github.com/epoberezkin/ajv-errors) - custom error messages
+- [ajv-i18n](https://github.com/epoberezkin/ajv-i18n) - internationalised error messages
+- [ajv-istanbul](https://github.com/epoberezkin/ajv-istanbul) - instrument generated validation code to measure test coverage of your schemas
+- [ajv-keywords](https://github.com/epoberezkin/ajv-keywords) - custom validation keywords (if/then/else, select, typeof, etc.)
+- [ajv-merge-patch](https://github.com/epoberezkin/ajv-merge-patch) - keywords $merge and $patch
+- [ajv-pack](https://github.com/epoberezkin/ajv-pack) - produces a compact module exporting validation functions
+
+
+## Some packages using Ajv
+
+- [webpack](https://github.com/webpack/webpack) - a module bundler. Its main purpose is to bundle JavaScript files for usage in a browser
+- [jsonscript-js](https://github.com/JSONScript/jsonscript-js) - the interpreter for [JSONScript](http://www.jsonscript.org) - scripted processing of existing endpoints and services
+- [osprey-method-handler](https://github.com/mulesoft-labs/osprey-method-handler) - Express middleware for validating requests and responses based on a RAML method object, used in [osprey](https://github.com/mulesoft/osprey) - validating API proxy generated from a RAML definition
+- [har-validator](https://github.com/ahmadnassri/har-validator) - HTTP Archive (HAR) validator
+- [jsoneditor](https://github.com/josdejong/jsoneditor) - a web-based tool to view, edit, format, and validate JSON http://jsoneditoronline.org
+- [JSON Schema Lint](https://github.com/nickcmaynard/jsonschemalint) - a web tool to validate JSON/YAML document against a single JSON-schema http://jsonschemalint.com
+- [objection](https://github.com/vincit/objection.js) - SQL-friendly ORM for Node.js
+- [table](https://github.com/gajus/table) - formats data into a string table
+- [ripple-lib](https://github.com/ripple/ripple-lib) - a JavaScript API for interacting with [Ripple](https://ripple.com) in Node.js and the browser
+- [restbase](https://github.com/wikimedia/restbase) - distributed storage with REST API & dispatcher for backend services built to provide a low-latency & high-throughput API for Wikipedia / Wikimedia content
+- [hippie-swagger](https://github.com/CacheControl/hippie-swagger) - [Hippie](https://github.com/vesln/hippie) wrapper that provides end to end API testing with swagger validation
+- [react-form-controlled](https://github.com/seeden/react-form-controlled) - React controlled form components with validation
+- [rabbitmq-schema](https://github.com/tjmehta/rabbitmq-schema) - a schema definition module for RabbitMQ graphs and messages
+- [@query/schema](https://www.npmjs.com/package/@query/schema) - stream filtering with a URI-safe query syntax parsing to JSON Schema
+- [chai-ajv-json-schema](https://github.com/peon374/chai-ajv-json-schema) - chai plugin to use JSON-schema with expect in mocha tests
+- [grunt-jsonschema-ajv](https://github.com/SignpostMarv/grunt-jsonschema-ajv) - Grunt plugin for validating files against JSON Schema
+- [extract-text-webpack-plugin](https://github.com/webpack-contrib/extract-text-webpack-plugin) - extract text from bundle into a file
+- [electron-builder](https://github.com/electron-userland/electron-builder) - a solution to package and build a ready for distribution Electron app
+- [addons-linter](https://github.com/mozilla/addons-linter) - Mozilla Add-ons Linter
+- [gh-pages-generator](https://github.com/epoberezkin/gh-pages-generator) - multi-page site generator converting markdown files to GitHub pages
+
+
+## Tests
+
+```
+npm install
+git submodule update --init
+npm test
+```
+
+## Contributing
+
+All validation functions are generated using doT templates in [dot](https://github.com/epoberezkin/ajv/tree/master/lib/dot) folder. Templates are precompiled so doT is not a run-time dependency.
+
+`npm run build` - compiles templates to [dotjs](https://github.com/epoberezkin/ajv/tree/master/lib/dotjs) folder.
+
+`npm run watch` - automatically compiles templates when files in the dot folder change
+
+Please see [Contributing guidelines](https://github.com/epoberezkin/ajv/blob/master/CONTRIBUTING.md)
+
+
+## Changes history
+
+See https://github.com/epoberezkin/ajv/releases
+
+__Please note__: [Changes in version 5.0.0](https://github.com/epoberezkin/ajv/releases/tag/5.0.0).
+
+[Changes in version 4.6.0](https://github.com/epoberezkin/ajv/releases/tag/4.6.0).
+
+[Changes in version 4.0.0](https://github.com/epoberezkin/ajv/releases/tag/4.0.0).
+
+[Changes in version 3.0.0](https://github.com/epoberezkin/ajv/releases/tag/3.0.0).
+
+[Changes in version 2.0.0](https://github.com/epoberezkin/ajv/releases/tag/2.0.0).
+
+
+## License
+
+[MIT](https://github.com/epoberezkin/ajv/blob/master/LICENSE)
diff --git a/deps/npm/node_modules/request/node_modules/har-validator/node_modules/ajv/dist/ajv.bundle.js b/deps/npm/node_modules/ajv/dist/ajv.bundle.js
similarity index 93%
rename from deps/npm/node_modules/request/node_modules/har-validator/node_modules/ajv/dist/ajv.bundle.js
rename to deps/npm/node_modules/ajv/dist/ajv.bundle.js
index ea6a78f162208e..25843d30c8535d 100644
--- a/deps/npm/node_modules/request/node_modules/har-validator/node_modules/ajv/dist/ajv.bundle.js
+++ b/deps/npm/node_modules/ajv/dist/ajv.bundle.js
@@ -381,7 +381,7 @@ function regex(str) {
var resolve = require('./resolve')
, util = require('./util')
, errorClasses = require('./error_classes')
- , stableStringify = require('json-stable-stringify');
+ , stableStringify = require('fast-json-stable-stringify');
var validateGenerator = require('../dotjs/validate');
@@ -482,6 +482,7 @@ function compile(schema, root, localRefs, baseId) {
useCustomRule: useCustomRule,
opts: opts,
formats: formats,
+ logger: self.logger,
self: self
});
@@ -524,7 +525,7 @@ function compile(schema, root, localRefs, baseId) {
refVal[0] = validate;
} catch(e) {
- console.error('Error compiling schema, function code:', sourceCode);
+ self.logger.error('Error compiling schema, function code:', sourceCode);
throw e;
}
@@ -638,7 +639,7 @@ function compile(schema, root, localRefs, baseId) {
var valid = validateSchema(schema);
if (!valid) {
var message = 'keyword schema is invalid: ' + self.errorsText(validateSchema.errors);
- if (self._opts.validateSchema == 'log') console.error(message);
+ if (self._opts.validateSchema == 'log') self.logger.error(message);
else throw new Error(message);
}
}
@@ -756,7 +757,7 @@ function vars(arr, statement) {
return code;
}
-},{"../dotjs/validate":35,"./error_classes":5,"./resolve":8,"./util":12,"co":40,"fast-deep-equal":41,"json-stable-stringify":43}],8:[function(require,module,exports){
+},{"../dotjs/validate":35,"./error_classes":5,"./resolve":8,"./util":12,"co":40,"fast-deep-equal":41,"fast-json-stable-stringify":42}],8:[function(require,module,exports){
'use strict';
var url = require('url')
@@ -1029,7 +1030,7 @@ function resolveIds(schema) {
return localRefs;
}
-},{"./schema_obj":10,"./util":12,"fast-deep-equal":41,"json-schema-traverse":42,"url":51}],9:[function(require,module,exports){
+},{"./schema_obj":10,"./util":12,"fast-deep-equal":41,"json-schema-traverse":43,"url":48}],9:[function(require,module,exports){
'use strict';
var ruleModules = require('./_rules')
@@ -1052,7 +1053,7 @@ module.exports = function rules() {
var ALL = [ 'type' ];
var KEYWORDS = [
- 'additionalItems', '$schema', 'id', 'title',
+ 'additionalItems', '$schema', '$id', 'id', 'title',
'description', 'default', 'definitions'
];
var TYPES = [ 'number', 'integer', 'string', 'array', 'object', 'boolean', 'null' ];
@@ -2563,7 +2564,7 @@ module.exports = function generate_format(it, $keyword, $ruleType) {
var $format = it.formats[$schema];
if (!$format) {
if ($unknownFormats == 'ignore') {
- console.warn('unknown format "' + $schema + '" ignored in schema at path "' + it.errSchemaPath + '"');
+ it.logger.warn('unknown format "' + $schema + '" ignored in schema at path "' + it.errSchemaPath + '"');
if ($breakOnError) {
out += ' if (true) { ';
}
@@ -3687,7 +3688,7 @@ module.exports = function generate_ref(it, $keyword, $ruleType) {
if ($refVal === undefined) {
var $message = it.MissingRefError.message(it.baseId, $schema);
if (it.opts.missingRefs == 'fail') {
- console.error($message);
+ it.logger.error($message);
var $$outStack = $$outStack || [];
$$outStack.push(out);
out = ''; /* istanbul ignore else */
@@ -3718,7 +3719,7 @@ module.exports = function generate_ref(it, $keyword, $ruleType) {
out += ' if (false) { ';
}
} else if (it.opts.missingRefs == 'ignore') {
- console.warn($message);
+ it.logger.warn($message);
if ($breakOnError) {
out += ' if (true) { ';
}
@@ -4256,7 +4257,7 @@ module.exports = function generate_validate(it, $keyword, $ruleType) {
throw new Error('$ref: validation keywords used in schema at path "' + it.errSchemaPath + '" (see option extendRefs)');
} else if (it.opts.extendRefs !== true) {
$refKeywords = false;
- console.warn('$ref: keywords ignored in schema at path "' + it.errSchemaPath + '"');
+ it.logger.warn('$ref: keywords ignored in schema at path "' + it.errSchemaPath + '"');
}
}
if ($typeSchema) {
@@ -4414,7 +4415,7 @@ module.exports = function generate_validate(it, $keyword, $ruleType) {
}
} else {
if (it.opts.v5 && it.schema.patternGroups) {
- console.warn('keyword "patternGroups" is deprecated and disabled. Use option patternGroups: true to enable.');
+ it.logger.warn('keyword "patternGroups" is deprecated and disabled. Use option patternGroups: true to enable.');
}
var arr2 = it.RULES;
if (arr2) {
@@ -4579,10 +4580,10 @@ module.exports = function generate_validate(it, $keyword, $ruleType) {
}
function $shouldUseRule($rule) {
- return it.schema[$rule.keyword] !== undefined || ($rule.implements && $ruleImlementsSomeKeyword($rule));
+ return it.schema[$rule.keyword] !== undefined || ($rule.implements && $ruleImplementsSomeKeyword($rule));
}
- function $ruleImlementsSomeKeyword($rule) {
+ function $ruleImplementsSomeKeyword($rule) {
var impl = $rule.implements;
for (var i = 0; i < impl.length; i++)
if (it.schema[impl[i]] !== undefined) return true;
@@ -4607,6 +4608,7 @@ module.exports = {
* @this Ajv
* @param {String} keyword custom keyword, should be unique (including different from all standard, custom and macro keywords).
* @param {Object} definition keyword definition object with properties `type` (type(s) which the keyword applies to), `validate` or `compile`.
+ * @return {Ajv} this for method chaining
*/
function addKeyword(keyword, definition) {
/* jshint validthis: true */
@@ -4684,6 +4686,8 @@ function addKeyword(keyword, definition) {
function checkDataType(dataType) {
if (!RULES.types[dataType]) throw new Error('Unknown type ' + dataType);
}
+
+ return this;
}
@@ -4704,6 +4708,7 @@ function getKeyword(keyword) {
* Remove keyword
* @this Ajv
* @param {String} keyword pre-defined or custom keyword.
+ * @return {Ajv} this for method chaining
*/
function removeKeyword(keyword) {
/* jshint validthis: true */
@@ -4720,6 +4725,7 @@ function removeKeyword(keyword) {
}
}
}
+ return this;
}
},{"./dotjs/custom":21}],37:[function(require,module,exports){
@@ -4771,7 +4777,7 @@ module.exports={
"$data": {
"type": "string",
"anyOf": [
- { "format": "relative-json-pointer" },
+ { "format": "relative-json-pointer" },
{ "format": "json-pointer" }
]
}
@@ -4839,6 +4845,10 @@ module.exports={
"type": "string"
},
"default": {},
+ "examples": {
+ "type": "array",
+ "items": {}
+ },
"multipleOf": {
"type": "number",
"exclusiveMinimum": 0
@@ -5218,6 +5228,67 @@ module.exports = function equal(a, b) {
},{}],42:[function(require,module,exports){
'use strict';
+module.exports = function (data, opts) {
+ if (!opts) opts = {};
+ if (typeof opts === 'function') opts = { cmp: opts };
+ var cycles = (typeof opts.cycles === 'boolean') ? opts.cycles : false;
+
+ var cmp = opts.cmp && (function (f) {
+ return function (node) {
+ return function (a, b) {
+ var aobj = { key: a, value: node[a] };
+ var bobj = { key: b, value: node[b] };
+ return f(aobj, bobj);
+ };
+ };
+ })(opts.cmp);
+
+ var seen = [];
+ return (function stringify (node) {
+ if (node && node.toJSON && typeof node.toJSON === 'function') {
+ node = node.toJSON();
+ }
+
+ if (node === undefined) return;
+ if (typeof node == 'number') return isFinite(node) ? '' + node : 'null';
+ if (typeof node !== 'object') return JSON.stringify(node);
+
+ var i, out;
+ if (Array.isArray(node)) {
+ out = '[';
+ for (i = 0; i < node.length; i++) {
+ if (i) out += ',';
+ out += stringify(node[i]) || 'null';
+ }
+ return out + ']';
+ }
+
+ if (node === null) return 'null';
+
+ if (seen.indexOf(node) !== -1) {
+ if (cycles) return JSON.stringify('__cycle__');
+ throw new TypeError('Converting circular structure to JSON');
+ }
+
+ var seenIndex = seen.push(node) - 1;
+ var keys = Object.keys(node).sort(cmp && cmp(node));
+ out = '';
+ for (i = 0; i < keys.length; i++) {
+ var key = keys[i];
+ var value = stringify(node[key]);
+
+ if (!value) continue;
+ if (out) out += ',';
+ out += JSON.stringify(key) + ':' + value;
+ }
+ seen.splice(seenIndex, 1);
+ return '{' + out + '}';
+ })(data);
+};
+
+},{}],43:[function(require,module,exports){
+'use strict';
+
var traverse = module.exports = function (schema, opts, cb) {
if (typeof opts == 'function') {
cb = opts;
@@ -5298,528 +5369,7 @@ function escapeJsonPtr(str) {
return str.replace(/~/g, '~0').replace(/\//g, '~1');
}
-},{}],43:[function(require,module,exports){
-var json = typeof JSON !== 'undefined' ? JSON : require('jsonify');
-
-module.exports = function (obj, opts) {
- if (!opts) opts = {};
- if (typeof opts === 'function') opts = { cmp: opts };
- var space = opts.space || '';
- if (typeof space === 'number') space = Array(space+1).join(' ');
- var cycles = (typeof opts.cycles === 'boolean') ? opts.cycles : false;
- var replacer = opts.replacer || function(key, value) { return value; };
-
- var cmp = opts.cmp && (function (f) {
- return function (node) {
- return function (a, b) {
- var aobj = { key: a, value: node[a] };
- var bobj = { key: b, value: node[b] };
- return f(aobj, bobj);
- };
- };
- })(opts.cmp);
-
- var seen = [];
- return (function stringify (parent, key, node, level) {
- var indent = space ? ('\n' + new Array(level + 1).join(space)) : '';
- var colonSeparator = space ? ': ' : ':';
-
- if (node && node.toJSON && typeof node.toJSON === 'function') {
- node = node.toJSON();
- }
-
- node = replacer.call(parent, key, node);
-
- if (node === undefined) {
- return;
- }
- if (typeof node !== 'object' || node === null) {
- return json.stringify(node);
- }
- if (isArray(node)) {
- var out = [];
- for (var i = 0; i < node.length; i++) {
- var item = stringify(node, i, node[i], level+1) || json.stringify(null);
- out.push(indent + space + item);
- }
- return '[' + out.join(',') + indent + ']';
- }
- else {
- if (seen.indexOf(node) !== -1) {
- if (cycles) return json.stringify('__cycle__');
- throw new TypeError('Converting circular structure to JSON');
- }
- else seen.push(node);
-
- var keys = objectKeys(node).sort(cmp && cmp(node));
- var out = [];
- for (var i = 0; i < keys.length; i++) {
- var key = keys[i];
- var value = stringify(node, key, node[key], level+1);
-
- if(!value) continue;
-
- var keyValue = json.stringify(key)
- + colonSeparator
- + value;
- ;
- out.push(indent + space + keyValue);
- }
- seen.splice(seen.indexOf(node), 1);
- return '{' + out.join(',') + indent + '}';
- }
- })({ '': obj }, '', obj, 0);
-};
-
-var isArray = Array.isArray || function (x) {
- return {}.toString.call(x) === '[object Array]';
-};
-
-var objectKeys = Object.keys || function (obj) {
- var has = Object.prototype.hasOwnProperty || function () { return true };
- var keys = [];
- for (var key in obj) {
- if (has.call(obj, key)) keys.push(key);
- }
- return keys;
-};
-
-},{"jsonify":44}],44:[function(require,module,exports){
-exports.parse = require('./lib/parse');
-exports.stringify = require('./lib/stringify');
-
-},{"./lib/parse":45,"./lib/stringify":46}],45:[function(require,module,exports){
-var at, // The index of the current character
- ch, // The current character
- escapee = {
- '"': '"',
- '\\': '\\',
- '/': '/',
- b: '\b',
- f: '\f',
- n: '\n',
- r: '\r',
- t: '\t'
- },
- text,
-
- error = function (m) {
- // Call error when something is wrong.
- throw {
- name: 'SyntaxError',
- message: m,
- at: at,
- text: text
- };
- },
-
- next = function (c) {
- // If a c parameter is provided, verify that it matches the current character.
- if (c && c !== ch) {
- error("Expected '" + c + "' instead of '" + ch + "'");
- }
-
- // Get the next character. When there are no more characters,
- // return the empty string.
-
- ch = text.charAt(at);
- at += 1;
- return ch;
- },
-
- number = function () {
- // Parse a number value.
- var number,
- string = '';
-
- if (ch === '-') {
- string = '-';
- next('-');
- }
- while (ch >= '0' && ch <= '9') {
- string += ch;
- next();
- }
- if (ch === '.') {
- string += '.';
- while (next() && ch >= '0' && ch <= '9') {
- string += ch;
- }
- }
- if (ch === 'e' || ch === 'E') {
- string += ch;
- next();
- if (ch === '-' || ch === '+') {
- string += ch;
- next();
- }
- while (ch >= '0' && ch <= '9') {
- string += ch;
- next();
- }
- }
- number = +string;
- if (!isFinite(number)) {
- error("Bad number");
- } else {
- return number;
- }
- },
-
- string = function () {
- // Parse a string value.
- var hex,
- i,
- string = '',
- uffff;
-
- // When parsing for string values, we must look for " and \ characters.
- if (ch === '"') {
- while (next()) {
- if (ch === '"') {
- next();
- return string;
- } else if (ch === '\\') {
- next();
- if (ch === 'u') {
- uffff = 0;
- for (i = 0; i < 4; i += 1) {
- hex = parseInt(next(), 16);
- if (!isFinite(hex)) {
- break;
- }
- uffff = uffff * 16 + hex;
- }
- string += String.fromCharCode(uffff);
- } else if (typeof escapee[ch] === 'string') {
- string += escapee[ch];
- } else {
- break;
- }
- } else {
- string += ch;
- }
- }
- }
- error("Bad string");
- },
-
- white = function () {
-
-// Skip whitespace.
-
- while (ch && ch <= ' ') {
- next();
- }
- },
-
- word = function () {
-
-// true, false, or null.
-
- switch (ch) {
- case 't':
- next('t');
- next('r');
- next('u');
- next('e');
- return true;
- case 'f':
- next('f');
- next('a');
- next('l');
- next('s');
- next('e');
- return false;
- case 'n':
- next('n');
- next('u');
- next('l');
- next('l');
- return null;
- }
- error("Unexpected '" + ch + "'");
- },
-
- value, // Place holder for the value function.
-
- array = function () {
-
-// Parse an array value.
-
- var array = [];
-
- if (ch === '[') {
- next('[');
- white();
- if (ch === ']') {
- next(']');
- return array; // empty array
- }
- while (ch) {
- array.push(value());
- white();
- if (ch === ']') {
- next(']');
- return array;
- }
- next(',');
- white();
- }
- }
- error("Bad array");
- },
-
- object = function () {
-
-// Parse an object value.
-
- var key,
- object = {};
-
- if (ch === '{') {
- next('{');
- white();
- if (ch === '}') {
- next('}');
- return object; // empty object
- }
- while (ch) {
- key = string();
- white();
- next(':');
- if (Object.hasOwnProperty.call(object, key)) {
- error('Duplicate key "' + key + '"');
- }
- object[key] = value();
- white();
- if (ch === '}') {
- next('}');
- return object;
- }
- next(',');
- white();
- }
- }
- error("Bad object");
- };
-
-value = function () {
-
-// Parse a JSON value. It could be an object, an array, a string, a number,
-// or a word.
-
- white();
- switch (ch) {
- case '{':
- return object();
- case '[':
- return array();
- case '"':
- return string();
- case '-':
- return number();
- default:
- return ch >= '0' && ch <= '9' ? number() : word();
- }
-};
-
-// Return the json_parse function. It will have access to all of the above
-// functions and variables.
-
-module.exports = function (source, reviver) {
- var result;
-
- text = source;
- at = 0;
- ch = ' ';
- result = value();
- white();
- if (ch) {
- error("Syntax error");
- }
-
- // If there is a reviver function, we recursively walk the new structure,
- // passing each name/value pair to the reviver function for possible
- // transformation, starting with a temporary root object that holds the result
- // in an empty key. If there is not a reviver function, we simply return the
- // result.
-
- return typeof reviver === 'function' ? (function walk(holder, key) {
- var k, v, value = holder[key];
- if (value && typeof value === 'object') {
- for (k in value) {
- if (Object.prototype.hasOwnProperty.call(value, k)) {
- v = walk(value, k);
- if (v !== undefined) {
- value[k] = v;
- } else {
- delete value[k];
- }
- }
- }
- }
- return reviver.call(holder, key, value);
- }({'': result}, '')) : result;
-};
-
-},{}],46:[function(require,module,exports){
-var cx = /[\u0000\u00ad\u0600-\u0604\u070f\u17b4\u17b5\u200c-\u200f\u2028-\u202f\u2060-\u206f\ufeff\ufff0-\uffff]/g,
- escapable = /[\\\"\x00-\x1f\x7f-\x9f\u00ad\u0600-\u0604\u070f\u17b4\u17b5\u200c-\u200f\u2028-\u202f\u2060-\u206f\ufeff\ufff0-\uffff]/g,
- gap,
- indent,
- meta = { // table of character substitutions
- '\b': '\\b',
- '\t': '\\t',
- '\n': '\\n',
- '\f': '\\f',
- '\r': '\\r',
- '"' : '\\"',
- '\\': '\\\\'
- },
- rep;
-
-function quote(string) {
- // If the string contains no control characters, no quote characters, and no
- // backslash characters, then we can safely slap some quotes around it.
- // Otherwise we must also replace the offending characters with safe escape
- // sequences.
-
- escapable.lastIndex = 0;
- return escapable.test(string) ? '"' + string.replace(escapable, function (a) {
- var c = meta[a];
- return typeof c === 'string' ? c :
- '\\u' + ('0000' + a.charCodeAt(0).toString(16)).slice(-4);
- }) + '"' : '"' + string + '"';
-}
-
-function str(key, holder) {
- // Produce a string from holder[key].
- var i, // The loop counter.
- k, // The member key.
- v, // The member value.
- length,
- mind = gap,
- partial,
- value = holder[key];
-
- // If the value has a toJSON method, call it to obtain a replacement value.
- if (value && typeof value === 'object' &&
- typeof value.toJSON === 'function') {
- value = value.toJSON(key);
- }
-
- // If we were called with a replacer function, then call the replacer to
- // obtain a replacement value.
- if (typeof rep === 'function') {
- value = rep.call(holder, key, value);
- }
-
- // What happens next depends on the value's type.
- switch (typeof value) {
- case 'string':
- return quote(value);
-
- case 'number':
- // JSON numbers must be finite. Encode non-finite numbers as null.
- return isFinite(value) ? String(value) : 'null';
-
- case 'boolean':
- case 'null':
- // If the value is a boolean or null, convert it to a string. Note:
- // typeof null does not produce 'null'. The case is included here in
- // the remote chance that this gets fixed someday.
- return String(value);
-
- case 'object':
- if (!value) return 'null';
- gap += indent;
- partial = [];
-
- // Array.isArray
- if (Object.prototype.toString.apply(value) === '[object Array]') {
- length = value.length;
- for (i = 0; i < length; i += 1) {
- partial[i] = str(i, value) || 'null';
- }
-
- // Join all of the elements together, separated with commas, and
- // wrap them in brackets.
- v = partial.length === 0 ? '[]' : gap ?
- '[\n' + gap + partial.join(',\n' + gap) + '\n' + mind + ']' :
- '[' + partial.join(',') + ']';
- gap = mind;
- return v;
- }
-
- // If the replacer is an array, use it to select the members to be
- // stringified.
- if (rep && typeof rep === 'object') {
- length = rep.length;
- for (i = 0; i < length; i += 1) {
- k = rep[i];
- if (typeof k === 'string') {
- v = str(k, value);
- if (v) {
- partial.push(quote(k) + (gap ? ': ' : ':') + v);
- }
- }
- }
- }
- else {
- // Otherwise, iterate through all of the keys in the object.
- for (k in value) {
- if (Object.prototype.hasOwnProperty.call(value, k)) {
- v = str(k, value);
- if (v) {
- partial.push(quote(k) + (gap ? ': ' : ':') + v);
- }
- }
- }
- }
-
- // Join all of the member texts together, separated with commas,
- // and wrap them in braces.
-
- v = partial.length === 0 ? '{}' : gap ?
- '{\n' + gap + partial.join(',\n' + gap) + '\n' + mind + '}' :
- '{' + partial.join(',') + '}';
- gap = mind;
- return v;
- }
-}
-
-module.exports = function (value, replacer, space) {
- var i;
- gap = '';
- indent = '';
-
- // If the space parameter is a number, make an indent string containing that
- // many spaces.
- if (typeof space === 'number') {
- for (i = 0; i < space; i += 1) {
- indent += ' ';
- }
- }
- // If the space parameter is a string, it will be used as the indent string.
- else if (typeof space === 'string') {
- indent = space;
- }
-
- // If there is a replacer, it must be a function or an array.
- // Otherwise, throw an error.
- rep = replacer;
- if (replacer && typeof replacer !== 'function'
- && (typeof replacer !== 'object' || typeof replacer.length !== 'number')) {
- throw new Error('JSON.stringify');
- }
-
- // Make a fake root object containing our value under the key of ''.
- // Return the result of stringifying the value.
- return str('', {'': value});
-};
-
-},{}],47:[function(require,module,exports){
+},{}],44:[function(require,module,exports){
(function (global){
/*! https://mths.be/punycode v1.4.1 by @mathias */
;(function(root) {
@@ -6356,7 +5906,7 @@ module.exports = function (value, replacer, space) {
}(this));
}).call(this,typeof global !== "undefined" ? global : typeof self !== "undefined" ? self : typeof window !== "undefined" ? window : {})
-},{}],48:[function(require,module,exports){
+},{}],45:[function(require,module,exports){
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
@@ -6442,7 +5992,7 @@ var isArray = Array.isArray || function (xs) {
return Object.prototype.toString.call(xs) === '[object Array]';
};
-},{}],49:[function(require,module,exports){
+},{}],46:[function(require,module,exports){
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
@@ -6529,13 +6079,13 @@ var objectKeys = Object.keys || function (obj) {
return res;
};
-},{}],50:[function(require,module,exports){
+},{}],47:[function(require,module,exports){
'use strict';
exports.decode = exports.parse = require('./decode');
exports.encode = exports.stringify = require('./encode');
-},{"./decode":48,"./encode":49}],51:[function(require,module,exports){
+},{"./decode":45,"./encode":46}],48:[function(require,module,exports){
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
@@ -7269,7 +6819,7 @@ Url.prototype.parseHost = function() {
if (host) this.hostname = host;
};
-},{"./util":52,"punycode":47,"querystring":50}],52:[function(require,module,exports){
+},{"./util":49,"punycode":44,"querystring":47}],49:[function(require,module,exports){
'use strict';
module.exports = {
@@ -7294,7 +6844,7 @@ var compileSchema = require('./compile')
, resolve = require('./compile/resolve')
, Cache = require('./cache')
, SchemaObject = require('./compile/schema_obj')
- , stableStringify = require('json-stable-stringify')
+ , stableStringify = require('fast-json-stable-stringify')
, formats = require('./compile/formats')
, rules = require('./compile/rules')
, $dataMetaSchema = require('./$data')
@@ -7342,6 +6892,7 @@ var META_SUPPORT_DATA = ['/properties'];
function Ajv(opts) {
if (!(this instanceof Ajv)) return new Ajv(opts);
opts = this._opts = util.copy(opts) || {};
+ setLogger(this);
this._schemas = {};
this._refs = {};
this._fragments = {};
@@ -7371,7 +6922,7 @@ function Ajv(opts) {
/**
* Validate data using schema
- * Schema will be compiled and cached (using serialized JSON as key. [json-stable-stringify](https://github.com/substack/json-stable-stringify) is used to serialize.
+ * Schema will be compiled and cached (using serialized JSON as key. [fast-json-stable-stringify](https://github.com/epoberezkin/fast-json-stable-stringify) is used to serialize.
* @this Ajv
* @param {String|Object} schemaKeyRef key, ref or schema object
* @param {Any} data to be validated
@@ -7415,11 +6966,12 @@ function compile(schema, _meta) {
* @param {String} key Optional schema key. Can be passed to `validate` method instead of schema object or id/ref. One schema per instance can have empty `id` and `key`.
* @param {Boolean} _skipValidation true to skip schema validation. Used internally, option validateSchema should be used instead.
* @param {Boolean} _meta true if schema is a meta-schema. Used internally, addMetaSchema should be used instead.
+ * @return {Ajv} this for method chaining
*/
function addSchema(schema, key, _skipValidation, _meta) {
if (Array.isArray(schema)){
for (var i=0; i=1&&t<=12&&a>=1&&a<=h[t]}function o(e,r){var t=e.match(u);if(!t)return!1;return t[1]<=23&&t[2]<=59&&t[3]<=59&&(!r||t[5])}function i(e){if(E.test(e))return!1;try{return new RegExp(e),!0}catch(e){return!1}}var n=e("./util"),l=/^\d\d\d\d-(\d\d)-(\d\d)$/,h=[0,31,29,31,30,31,30,31,31,30,31,30,31],u=/^(\d\d):(\d\d):(\d\d)(\.\d+)?(z|[+-]\d\d:\d\d)?$/i,c=/^[a-z0-9](?:[a-z0-9-]{0,61}[a-z0-9])?(?:\.[a-z0-9](?:[-0-9a-z]{0,61}[0-9a-z])?)*$/i,d=/^(?:[a-z][a-z0-9+\-.]*:)(?:\/?\/(?:(?:[a-z0-9\-._~!$&'()*+,;=:]|%[0-9a-f]{2})*@)?(?:\[(?:(?:(?:(?:[0-9a-f]{1,4}:){6}|::(?:[0-9a-f]{1,4}:){5}|(?:[0-9a-f]{1,4})?::(?:[0-9a-f]{1,4}:){4}|(?:(?:[0-9a-f]{1,4}:){0,1}[0-9a-f]{1,4})?::(?:[0-9a-f]{1,4}:){3}|(?:(?:[0-9a-f]{1,4}:){0,2}[0-9a-f]{1,4})?::(?:[0-9a-f]{1,4}:){2}|(?:(?:[0-9a-f]{1,4}:){0,3}[0-9a-f]{1,4})?::[0-9a-f]{1,4}:|(?:(?:[0-9a-f]{1,4}:){0,4}[0-9a-f]{1,4})?::)(?:[0-9a-f]{1,4}:[0-9a-f]{1,4}|(?:(?:25[0-5]|2[0-4]\d|[01]?\d\d?)\.){3}(?:25[0-5]|2[0-4]\d|[01]?\d\d?))|(?:(?:[0-9a-f]{1,4}:){0,5}[0-9a-f]{1,4})?::[0-9a-f]{1,4}|(?:(?:[0-9a-f]{1,4}:){0,6}[0-9a-f]{1,4})?::)|[Vv][0-9a-f]+\.[a-z0-9\-._~!$&'()*+,;=:]+)\]|(?:(?:25[0-5]|2[0-4]\d|[01]?\d\d?)\.){3}(?:25[0-5]|2[0-4]\d|[01]?\d\d?)|(?:[a-z0-9\-._~!$&'()*+,;=]|%[0-9a-f]{2})*)(?::\d*)?(?:\/(?:[a-z0-9\-._~!$&'()*+,;=:@]|%[0-9a-f]{2})*)*|\/(?:(?:[a-z0-9\-._~!$&'()*+,;=:@]|%[0-9a-f]{2})+(?:\/(?:[a-z0-9\-._~!$&'()*+,;=:@]|%[0-9a-f]{2})*)*)?|(?:[a-z0-9\-._~!$&'()*+,;=:@]|%[0-9a-f]{2})+(?:\/(?:[a-z0-9\-._~!$&'()*+,;=:@]|%[0-9a-f]{2})*)*)(?:\?(?:[a-z0-9\-._~!$&'()*+,;=:@/?]|%[0-9a-f]{2})*)?(?:#(?:[a-z0-9\-._~!$&'()*+,;=:@/?]|%[0-9a-f]{2})*)?$/i,f=/^(?:(?:[^\x00-\x20"'<>%\\^`{|}]|%[0-9a-f]{2})|\{[+#./;?&=,!@|]?(?:[a-z0-9_]|%[0-9a-f]{2})+(?::[1-9][0-9]{0,3}|\*)?(?:,(?:[a-z0-9_]|%[0-9a-f]{2})+(?::[1-9][0-9]{0,3}|\*)?)*\})*$/i,p=/^(?:(?:http[s\u017F]?|ftp):\/\/)(?:(?:[\0-\x08\x0E-\x1F!-\x9F\xA1-\u167F\u1681-\u1FFF\u200B-\u2027\u202A-\u202E\u2030-\u205E\u2060-\u2FFF\u3001-\uD7FF\uE000-\uFEFE\uFF00-\uFFFF]|[\uD800-\uDBFF][\uDC00-\uDFFF]|[\uD
800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF])+(?::(?:[\0-\x08\x0E-\x1F!-\x9F\xA1-\u167F\u1681-\u1FFF\u200B-\u2027\u202A-\u202E\u2030-\u205E\u2060-\u2FFF\u3001-\uD7FF\uE000-\uFEFE\uFF00-\uFFFF]|[\uD800-\uDBFF][\uDC00-\uDFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF])*)?@)?(?:(?!10(?:\.[0-9]{1,3}){3})(?!127(?:\.[0-9]{1,3}){3})(?!169\.254(?:\.[0-9]{1,3}){2})(?!192\.168(?:\.[0-9]{1,3}){2})(?!172\.(?:1[6-9]|2[0-9]|3[01])(?:\.[0-9]{1,3}){2})(?:[1-9][0-9]?|1[0-9][0-9]|2[01][0-9]|22[0-3])(?:\.(?:1?[0-9]{1,2}|2[0-4][0-9]|25[0-5])){2}(?:\.(?:[1-9][0-9]?|1[0-9][0-9]|2[0-4][0-9]|25[0-4]))|(?:(?:(?:[0-9KSa-z\xA1-\uD7FF\uE000-\uFFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF])+-?)*(?:[0-9KSa-z\xA1-\uD7FF\uE000-\uFFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF])+)(?:\.(?:(?:[0-9KSa-z\xA1-\uD7FF\uE000-\uFFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF])+-?)*(?:[0-9KSa-z\xA1-\uD7FF\uE000-\uFFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF])+)*(?:\.(?:(?:[KSa-z\xA1-\uD7FF\uE000-\uFFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF]){2,})))(?::[0-9]{2,5})?(?:\/(?:[\0-\x08\x0E-\x1F!-\x9F\xA1-\u167F\u1681-\u1FFF\u200B-\u2027\u202A-\u202E\u2030-\u205E\u2060-\u2FFF\u3001-\uD7FF\uE000-\uFEFE\uFF00-\uFFFF]|[\uD800-\uDBFF][\uDC00-\uDFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF])*)?$/i,m=/^(?:urn:uuid:)?[0-9a-f]{8}-(?:[0-9a-f]{4}-){3}[0-9a-f]{12}$/i,v=/^(?:\/(?:[^~/]|~0|~1)*)*$|^#(?:\/(?:[a-z0-9_\-.!$&'()*+,;:=@]|%[0-9a-f]{2}|~0|~1)*)*$/i,y=/^(?:0|[1-9][0-9]*)(?:#|(?:\/(?:[^~/]|~0|~1)*)*)$/;r.exports=a,a.fast={date:/^\d\d\d\d-[0-1]\d-[0-3]\d$/,time:/^[0-2]\d:[0-5]\d:[0-5]\d(?:\.\d+)?(?:z|[+-]\d\d:\d\d)?$/i,"date-time":/^\d\d\d\d-[0-1]\d-[0-3]\d[t\s][0-2]\d:[0-5]\d:[0-5]\d(?:\.\d+)?(?:z|[+-]\d\d:\d\d)$/i,uri:/^(?:[a-z][a-z0-9+-.]*)(?::|\/)\/?[^\s]*$/i,"uri-reference":