Replies: 9 comments 13 replies
-
Why is it needed? "injected" is set explicitly for every dependency that needs it. We did not add a setting to turn it on for all dependencies.
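For context, per-dependency injection is configured in the consuming package's `package.json` roughly like this (package names here are placeholders):

```json
{
  "name": "service-a",
  "dependencies": {
    "shared-ui": "workspace:*"
  },
  "dependenciesMeta": {
    "shared-ui": {
      "injected": true
    }
  }
}
```

With this, `shared-ui` is hard-linked into the virtual store instead of symlinked, so its peer dependencies resolve against `service-a`'s dependencies.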
-
This is semi-related; I had an alternate approach for this same problem. tl;dr: Use the explicit "files" property in the package.json to symlink just those files to the proper place (solving the need to re-run an install when files change), and then pull the "dependencies"/"peerDependencies" locally as if they came from the registry (so you get the correct directory structure/peer dependency installation). If the "files" property changes, you'd need to re-run an install to symlink the new directories/files.
-
Can we imagine a setup where, in dev mode, we can install with development dependencies, so there is no need for injected / hard links, and in production mode (e.g. in a Dockerfile) with the …
I'm maybe missing something here. Let me know if this is worth discussing / testing, or if it's better that I move this to a new Discussion :)
-
Has anyone solved this? I'm thinking of trying to write a watcher that re-creates the hard links every time we add files to the hard-linked packages 😕
-
I created a utility package that symlinks the files and folders in the root directory of any workspace package that is using `injected`. In my case, I'm using Rush to manage the monorepo, keep versions consistent, etc. This really turned out to be a foot gun. Everything looked as if it worked until running …
-
Does anyone have a solution for this? This is our CI:
This was failing since …
But now this issue appears when there is a cache hit for the …
-
Has someone found an alternative to the injected feature? 😭
-
I have this plan/idea, but have not had time to implement it: #4965 (comment)
-
Is there any update regarding this issue? After following way too many threads, searching for solutions all over the internet, and stumbling over issues regarding the injected keyword with other frameworks, I am revisiting the concern once again.

### Our Setup

To use the packages in other packages or services, they must first be built, since we primarily use TypeScript. Once built, a package can be included via the workspace protocol in the package.json. To enhance the developer experience, we have implemented file watchers for most packages, which can be used in conjunction with hot reloading or file watchers in the services. This accelerates development: changes to a package are synchronized to the node_modules of the service and are detected by the service's file watcher, triggering a rebuild of the service. This setup promotes rapid local development and quick iteration. It also means that only one version of a package exists at any given time. We currently version very infrequently, and most packages are used directly from disk, even in production (the process there is slightly different, but that is not relevant to this discussion).

### Understanding the Issue

The workspace protocol of pnpm symlinks packages by default. This approach offers an advantage: any modification to a package is immediately visible to all other packages or services using it within the monorepo. This enables swift development workflows in which packages have their own file watcher and services can automatically hot-reload their website, incorporating the package changes. However, a significant issue arises: peer dependencies are completely disregarded, and due to Node's node_modules resolution, this cannot be easily fixed. The problem occurs because Node resolves symlinks to their actual location on disk, which is the folder in your monorepo where the package resides.
From there, it reads the source files of your package and starts resolving package imports. These imports are resolved from the context of the package's real location, i.e. the folder in the monorepo where the package lives. This becomes problematic with peer dependencies. A peer dependency informs a package that a specific package must be installed for it to function, without pinning an exact version. This is crucial for packages like React, where only one React version should be used per website due to global singletons, etc. If service A has React 16 installed and service B has React 18, the workspace package using React needs to resolve a different React version depending on where it is being used. However, because Node resolves symlinks to a single real location, you only have one package location to work with, and installing both React versions in one node_modules won't work.

To address this, pnpm introduced the `dependenciesMeta.*.injected` option, which hard-links packages into the .pnpm folder instead of symlinking them. This generally resolves the issue, as hard-linked packages are not resolved by Node back to their source. This allows multiple copies of the hard-linked package in the .pnpm folder, each with a different React version in its node_modules. Now packages can have peer dependencies, and the local version is used. However, this solution introduces a new problem: synchronization of file changes.

### The Problem with Hard Links

A hard link only references the content of another file. If you're familiar with C++, a hard link is essentially a pointer: it points to the contents of a file. As long as any pointer to the content exists, the content still exists; once all pointers are deleted, the content is gone too. In our case, this means that if you run, for instance, tsc to recompile the dist folder and it recreates the files, the hard links are effectively broken.
The files in .pnpm still point to the old content. Moreover, newly created and deleted files are not detected at all. This is why you need to run pnpm install after each package change, which is very inefficient and slow to work with.

### Moving Forward

I understand that this issue is not rooted in pnpm and cannot be easily fixed by pnpm. However, I'm curious whether there are any ideas on how to proceed here, @zkochan. Has anyone created issues in the Node GitHub repository? Are there any planned updates that might alleviate the pain points with hard links? This issue is frustrating because it means that pnpm doesn't work well with one of the core features of the Node ecosystem. Peer dependencies are used everywhere, and if you want to use packages without publishing them to a registry, you can't avoid them. This is evident from the number of issues created regarding peer dependencies; most of their authors are completely confused about why this happens.

The other solutions are not very appealing. For example, Nx recommends installing all dependencies of a monorepo in the root package.json. We strongly disagree with this approach, because it means that updating any package requires updating the entire monorepo. This becomes extremely complex and challenging, especially when packages are used all over the monorepo and updates introduce complete overhauls in their usage. Solving the issue by not building the package itself and instead building it from the service isn't viable either. First, deletions and file creations are not picked up (although that can be fixed with pnpm install). But we also noticed that the file watchers of webpack or Next.js had a lot of trouble picking up changes when the content of a hard-linked file changed. Most of the time they didn't rebuild, or completed in an unfinished state, which makes our use of hard links practically impossible.
There are even more examples of where this structure does not work. Turbo, the Nx alternative, lets you define pipelines: for instance, to build a service, you can define a pipeline that builds all packages required by the service before building the service itself. This is similar to running a pnpm --filter service-a... build command, with smart caching added to the mix. However, the entire process falls apart because pnpm install would have to be executed after each package build. From my research, there appears to be no viable solution to this in Turbo.

### Conclusion

Can anyone tell me whether there are any solutions out there that make this whole matter easier? Are we using pnpm in combination with the monorepo wrong? I invite further comments and suggestions on this topic, as I find it crucial for the pnpm community to find a way to make peer dependencies work.
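For reference, the Turbo pipeline described above is configured roughly like this (the `outputs` glob is an assumption about the build layout; newer Turbo versions use a `tasks` key instead of `pipeline`):

```json
{
  "$schema": "https://turbo.build/schema.json",
  "pipeline": {
    "build": {
      "dependsOn": ["^build"],
      "outputs": ["dist/**"]
    }
  }
}
```

`"^build"` means "build all workspace dependencies first", which is exactly the step after which the injected hard links go stale.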
-
Reference: #3915
I think this is going to be a foot gun. People are going to use it to solve all their inevitable peer dep issues, but then the iterative development workflow is completely broken, as you would require `pnpm i` for every change. Any codebase that uses this will be harder to work with. For every `react` update, you are going to end up supporting multiple versions, hence everyone will just hard-link to avoid fixing things properly.

Changes

- `.npmrc` config option to disable it, or configure it for all packages for testing purposes.

Alternatives

- `--experimental-loader` hook in ESM to use the correct version
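The loader alternative could look something like this sketch (the target path is a placeholder, and newer Node versions register hooks via `module.register` and `--import` instead of `--experimental-loader`):

```javascript
// loader.mjs -- sketch of the "--experimental-loader" alternative:
// redirect every import of "react" to one chosen copy, so only a single
// React instance is ever loaded regardless of which package imports it.

// Hypothetical: the one React copy we want every package to use.
const REACT_ENTRY = new URL("./node_modules/react/index.js", import.meta.url);

export async function resolve(specifier, context, nextResolve) {
  if (specifier === "react") {
    // shortCircuit tells Node not to run the remaining resolve hooks.
    return { url: REACT_ENTRY.href, shortCircuit: true };
  }
  return nextResolve(specifier, context);
}
```

Run with something like `node --experimental-loader ./loader.mjs app.mjs`; every `import "react"` then resolves to the single chosen copy.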