[core] Clean up the API docs generation scripts #35244
Conversation
package.json (outdated)

```diff
@@ -14,7 +14,7 @@
     "release:publish:dry-run": "lerna publish from-package --dist-tag latest --contents build --registry=\"http://localhost:4873/\"",
     "release:tag": "node scripts/releaseTag.mjs",
     "docs:api": "rimraf ./docs/pages/**/api-docs ./docs/pages/**/api && yarn docs:api:build",
-    "docs:api:build": "cross-env BABEL_ENV=development __NEXT_EXPORT_TRAILING_SLASH=true babel-node --extensions \".tsx,.ts,.js\" ./docs/scripts/buildApi.ts",
+    "docs:api:build": "ts-node ./docs/packages/api-docs-builder/buildApi.ts",
```
Perhaps it makes sense to move this script to the api-docs-builder package as a `build` script and instead replace this with:

```diff
-    "docs:api:build": "ts-node ./docs/packages/api-docs-builder/buildApi.ts",
+    "docs:api:build": "yarn workspace @mui-internal/api-docs-builder build",
```
It turns out that this creates a heap of problems as the scripts assume the cwd
is at the repo root. It's fixable, of course, but let's do it in a separate PR.
"name": "@mui-internal/docs-utilities", | ||
"version": "1.0.0", | ||
"private": "true", | ||
"main": "index.ts" |
I'm not sure this setup will work well going forward. Every consumer of this package will have to transpile it. And I'm not sure how this would convert to an ESM package.
It does work with the existing scripts (they are in TS), and we tend to write new ones in TS as well, so this shouldn't be a problem. If we hit any roadblock, we can rewrite it to JS + .d.ts.
As for ESM, TypeScript should handle it (it does respect `type: module`), but I won't know for sure until I try.
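As a very small sketch of that conversion (assuming the manifest otherwise stays as in the diff above), it would roughly amount to adding a type field; ts-node would then likely also need to run with its ESM loader:

```json
{
  "//": "sketch only; the ESM switch is not part of this PR",
  "name": "@mui-internal/docs-utilities",
  "version": "1.0.0",
  "private": true,
  "type": "module",
  "main": "index.ts"
}
```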
Ok, my main worry is that we're introducing a non-standard package layout here just (it seems to me) to avoid having to build it. Sooner or later one of these packages will need to be consumed by a target that isn't ts-node, and we'll be back at the start again, trying to consume non-JS and non-standard package layouts.
OK, that's a fair point. Assuming we stick to TS, how would you ensure that the build output is up to date when imported by a downstream package?
I believe upstream packages should externalize their dependencies in the build, so that their build output only depends on code contained in the package itself. That way there doesn't need to be any synchronization between the builds of the different packages. At least that's how bundlers like e.g. tsup do it.
In the end it's Next.js that bundles all the packages together, so you'd only need to hold off the production build of the docs until all the other packages have been built. This can also be automated with tooling like turborepo.
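As a sketch of that approach (the build script and output paths here are assumptions, not part of this PR): tsup doesn't bundle anything listed under dependencies by default, so the built output only contains the package's own code:

```json
{
  "//": "sketch only; tsup is not used in this PR, and the file names are assumptions",
  "name": "@mui-internal/docs-utilities",
  "version": "1.0.0",
  "private": true,
  "main": "./dist/index.js",
  "types": "./dist/index.d.ts",
  "scripts": {
    "build": "tsup index.ts --format cjs --dts"
  }
}
```

Downstream code would then import the built dist entry instead of reaching into the TS sources directly.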
I didn't mean the production code. To give a concrete example: let's say we've got @mui-internal/docs-utilities written in TS and a utility JS script that uses it. How can we ensure that @mui-internal/docs-utilities is transpiled to JS before it's used in the script? One way would be to check the build output into git, but I'm not very keen on that.
Ah yes, for that we'd need a monorepo task runner that takes the topological order of the workspaces into account, e.g. one where you can specify that a command X in one package depends on a command Y in another package. That way you can declare which packages need to be built before a command can run. It can be emulated in lerna using the --toposort flag, but that seems clunky. Tools like turborepo or nx are built for this use case.
Granted, it's probably not ideal to introduce such a big tooling decision in the current PR. But perhaps something to consider as part of the North Star for the infrastructure?
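For context, a minimal sketch of what that could look like in a Turborepo config (adding turbo.json is an assumption here, not part of this PR); `"dependsOn": ["^build"]` means "run build in all workspace dependencies first":

```json
{
  "pipeline": {
    "build": {
      "dependsOn": ["^build"]
    }
  }
}
```

With that in place, running a task through the task runner builds the workspace dependencies in topological order first, which answers the "is the build output up to date?" question above.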
Right, let's discuss the ideal shape of the infrastructure and the steps toward it elsewhere. For the time being, I'll change docs-utilities to JS + .d.ts to keep things standard.
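A sketch of what that JS + .d.ts layout could look like for the docs-utilities manifest (the types entry and file names are assumptions):

```json
{
  "//": "sketch only; file names are assumptions",
  "name": "@mui-internal/docs-utilities",
  "version": "1.0.0",
  "private": true,
  "main": "index.js",
  "types": "index.d.ts"
}
```

index.js holds the implementation and index.d.ts is a hand-written declaration file, so consumers get types without any build step and plain JS scripts can require the package directly.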
The current setup is a bit odd. It's a workspace nested inside a workspace. I believe
Right, I missed that the
Do you think it's worth splitting into very small packages like docs-utilities?
By the way, we don't use a lot of the files you moved into the new packages, because they don't match the X needs. Maybe it would be interesting to put into those packages only content that is developed in a way that it can be reused by the other teams, not just by the core.
The docs-utilities are used by other scripts as well, so it made sense to extract it. I'm generally in favor of having smaller packages as they are easier to reason about.
Isolating the packages so they don't import individual files from the production code is just the first step. I'm in favor of making them more generic, but let's do it in a separate PR.
@Janpot, @flaviendelangle - just to be sure before I merge: removing the
We don't use it
We don't use it either 👍
@flaviendelangle These two problems that you raised feel quite real to me:
How to clearly reason about changes that could have downstream impacts? I can understand that checking each module change could be painful. I personally default to checking each time I change the exported APIs of a file, running a search with the full import path to see who depends on it. I have MUI X, MUI Toolpad, and MUI Core loaded in my VS Code workspace, so it searches all these folders at the same time. But it's worth exploring what a better workflow could look like. Maybe an API exported from a package.json prefixed with
Oh yeah, all the code inside a script is unusable by other scripts, so 👍 to create intermediary files where relevant.
This is the first step towards better isolation and modularisation of our infrastructure packages.
In this PR I extracted the API docs generation logic into a separate internal package and placed it in the docs/packages directory. As a next step, I can convert it to ESM.
I was also considering moving the package to the root packages directory to be consistent with the others, like typescript-to-proptypes. We could then move all the other packages there (there are two more in docs/packages, plus some code that can be turned into a package, like modules/waterfall).
Another idea I had was to keep the buildApiDocs in docs/scripts and use it to trigger the logic in @mui-internal/api-docs-builder. This should prevent breaking changes, as the script entry point would stay the same.
Going forward, we could also package the scripts, reducing the number of dependencies in the root package.json and relying on individual workspaces' dependencies (the Yarn docs state that root dependencies are not a recommended way of doing things: https://classic.yarnpkg.com/en/docs/cli/add#toc-yarn-add-ignore-workspace-root-check-w).
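Under that idea, the root scripts would stay roughly as they are today, with the docs/scripts entry point just delegating to the new package (the buildApiDocs.ts file name below is an assumption):

```json
{
  "//": "sketch; the assumed buildApiDocs.ts wrapper would just call into @mui-internal/api-docs-builder",
  "scripts": {
    "docs:api": "rimraf ./docs/pages/**/api-docs ./docs/pages/**/api && yarn docs:api:build",
    "docs:api:build": "ts-node ./docs/scripts/buildApiDocs.ts"
  }
}
```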
@mui/code-infra, @mui/docs-infra I'm open to suggestions.