package.json should be required to be JSON, and never eval()'ed. #408
If JSON.parse fails, then the package.json is evaluated using Script.runInNewContext, so that it has no access to any of node's capabilities. The security implication is nil. It's just pure JS (in parens), not a node program. But allowing JavaScript constructs does allow for strings that span multiple lines, comments, and unquoted keys, all of which are pretty nice-to-have features.

I'm not worried about compatibility with other packagers. What ends up on the registry and in the cache is sanitary JSON, so it's not an issue. If package authors want to be compatible with systems that don't allow these features, then it's up to them to be more strict with themselves.
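For reference, a minimal sketch of the fallback described above (not npm's actual source; the helper name is hypothetical, and newer Node exposes the same sandbox through the `vm` module):

```js
// Try strict JSON first; only fall back to evaluating the text as a bare JS
// expression in an empty sandbox: no require, no process, no IO of any kind.
var vm = require('vm');

function readPackageData(raw) {
  try {
    return JSON.parse(raw);
  } catch (er) {
    // Wrapping in parens makes an object literal parse as an expression.
    return vm.runInNewContext('(' + raw + ')', {});
  }
}
```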
Even if there is no security issue (I guess that depends on how package.json is used; I wasn't aware the original package.json isn't served from the registry), I still think fake JSON should be discouraged, and npm is the only place to do that when people are writing package.json files only for npm. If someone mirrors the npm package database, won't the uploaded npm packages still contain the original package.json?

Putting non-JSON in package.json is exactly the sort of thing I think tools need to prevent as early as possible, because it is increasingly harder to change once people rely on it. Comments are convenient but not necessary, and unquoted property names certainly aren't worth giving up JSON-compatibility for.

If package.json is going to become a standard location for package metadata, other tools are going to want to read it. It would be disappointing if we end up with a de-facto standard of arbitrary code in package.json just to support a little more author laziness in the early days.
The registry is a couch app. It's pretty finicky about the JSON you put in. In fact, it can't even have property names that start with _ (except the blessed _rev and _id), and has to pass through a pretty strict validate_doc_update function.
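(For illustration only: a CouchDB validate_doc_update along these lines might look like the sketch below. This is not the registry's actual validation function, just the shape of the check being described.)

```js
// Reject property names that start with _, except the blessed _id and _rev.
function validate_doc_update(newDoc, oldDoc, userCtx) {
  for (var key in newDoc) {
    if (key.charAt(0) === '_' && key !== '_id' && key !== '_rev') {
      throw { forbidden: 'property names may not start with _: ' + key };
    }
  }
}
```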
The tarballs do contain the original unmolested package.json. But again, the reason there's no security issue isn't that package authors can't execute arbitrary JavaScript; it's that the arbitrary JavaScript they can execute is powerless. No "process", no "require", etc. No IO of any kind. Nothing except what's in the ES5 spec. They could conceivably do […].

On the other hand, you're totally right, of course. The very aspects that make the system secure also make it crappy. If you wanted some clever build system that generates the package.json file based on some git state or gemspec or something (as some people currently do), it'd be nice to have a way to do that easily. Furthermore, ".json" should be a JSON file, and ".js" should be a JS file.

So, here's an idea, lemme know how it strikes you:

When publishing or installing/linking a local folder, if there's no package.json file, but there is a package.js file, then do […]. When installing a published name/version combination, if there is no package.json, then fail.

This way, you'll be able to do your fancy build step stuff on your own kit, but stuff sent to the registry will have to be "frozen" at some spot.
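A rough sketch of that local-folder idea, assuming package.js exports the metadata object (the function name and exact behavior here are illustrative, not npm's implementation):

```js
var fs = require('fs');
var path = require('path');

// If the folder has no package.json but does have a package.js, require it
// (with full node available) and freeze the result out as real JSON.
function freezePackageJson(folder) {
  var jsonPath = path.join(folder, 'package.json');
  if (fs.existsSync(jsonPath)) return;
  var jsPath = path.join(folder, 'package.js');
  if (!fs.existsSync(jsPath)) {
    throw new Error('no package.json or package.js in ' + folder);
  }
  var data = require(path.resolve(jsPath));
  fs.writeFileSync(jsonPath, JSON.stringify(data, null, 2) + '\n');
}
```

Published tarballs would then always carry the frozen package.json, matching the rule that registry installs fail without one.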
That sounds good to me. I use a regular build system (make, shell scripts, node scripts, miscellaneous other stuff) to generate a package, then I tar it up and […]
Yeah, also: "hey, if you want comments (and, for that matter, all of node), just use a package.js and set module.exports = { package.json stuff }". That's neat.
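In other words, something like the following (field values are placeholders, purely for illustration):

```js
// package.js -- plain node, so comments and any build-time logic are fair game.
module.exports = {
  name: 'my-package',        // hypothetical example values
  version: '1.0.0',
  description: 'generated from package.js, frozen to package.json at publish time'
};
```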
Is there any concern that "package.js" would make it impossible to have a "package" module in the root directory of the project? An edge case, to be sure... |