Refactor crawler.rb to depend on GHA workflows #35
Comments
In my opinion this is something that should be part of ... If we decide to go with GHA we will lose the possibility to use the reusable ...
We won't re-use this code because it is just about ...
@ronaldtse where else can this code be reused?
We definitely need to move the SI Brochure build somewhere. It takes a long time to build, and if it fails it blocks updating docs from other sources.

Lines 26 to 31 in 9affdaf

... and make the generated documents available for fetching.
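For illustration, here is a minimal sketch of what moving the SI Brochure build into its own scheduled workflow might look like. The workflow file name, the schedule, the `actions-mn/setup` step and the `metanorma site generate` invocation are assumptions, not the actual setup:

```yaml
# Hypothetical .github/workflows/si-brochure.yml
name: build-si-brochure

on:
  schedule:
    - cron: "0 2 * * *"   # nightly rebuild; the frequency is an assumption
  workflow_dispatch:       # allow manual re-runs

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Assumes the actions-mn setup action; pin to whatever ref the org uses.
      - uses: actions-mn/setup@main

      # Build only the SI Brochure, so a failure here does not block
      # updating documents fetched from other sources.
      - run: metanorma site generate si-brochure -o _site

      # Make the generated documents available for fetching by crawler.rb
      # (or by another repository) without rebuilding them.
      - uses: actions/upload-artifact@v4
        with:
          name: si-brochure-site
          path: _site
```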
I agree with @andrew2net. In any case, we have a problem with this in general. Look at how many documents we need to compile at the CalConnect document repository website. We need to provide a way to handle a publication workflow (published artifacts at stages), offering a Relaton dataset based on a set of Metanorma documents, plus additional Relaton content (i.e. bibdata only, with no document available or none in Metanorma).
I opened #36 to illustrate what I meant.
@CAMOBAP the problem is that building all the documents takes a long time, and we cannot rebuild the entire document collection just to obtain bibliographic data from it. We need to be able to publish bibliographic data in the document repository itself.
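One way to do that, sketched under the assumption that the document repository's own build already produces Relaton XML (`.rxl`) files next to the rendered documents: publish the bibdata together with the site, so this repository can fetch it directly instead of rebuilding anything. The paths, branch and the third-party Pages action are assumptions:

```yaml
# Hypothetical excerpt from the document repository's publish workflow.
name: publish-docs-and-bibdata

on:
  push:
    branches: [main]

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # The step that builds _site/ (rendered documents plus .rxl bibdata
      # files) is assumed to exist already and is elided here.

      - name: Collect bibliographic data into one folder
        run: |
          mkdir -p _site/bibdata
          find _site -name '*.rxl' -not -path '*/bibdata/*' \
            -exec cp {} _site/bibdata/ \;

      - name: Publish documents and bibdata to GitHub Pages
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: _site
```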
This code can be removed if we can re-use the well-maintained actions-mn workflows managed by @CAMOBAP:

https://github.com/relaton/relaton-data-bipm/blob/9affdaf0d71dcc758666fe8a9ab9631f2b36f542/crawler.rb#L20C1-L31C4
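For reference, depending on a reusable workflow from a caller workflow would look roughly like the sketch below; the actions-mn repository path, workflow file name and schedule are placeholders, since choosing the actual reusable workflows to depend on is what this issue is about:

```yaml
# Hypothetical caller workflow replacing the build logic in crawler.rb.
name: fetch-bipm-data

on:
  schedule:
    - cron: "0 3 * * *"
  workflow_dispatch:

jobs:
  # Delegate the document build to a reusable workflow maintained under
  # actions-mn instead of driving it from crawler.rb. The repository,
  # file name and ref below are placeholders.
  build-documents:
    uses: actions-mn/reusable-workflows/.github/workflows/build.yml@main
```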