
DocBook rendering is broken #303

Closed
ncfavier opened this issue Apr 15, 2021 · 2 comments
ncfavier commented Apr 15, 2021

The rendering of DocBook option descriptions to HTML implemented in #86 using Pandoc has multiple problems. The ones I could find are all conveniently illustrated in the description for environment.profileRelativeSessionVariables:

  1. cross-reference (xref element) titles are not resolved and are replaced with ???, because the cross-referenced element isn't present in the snippet we feed to Pandoc.
  2. cross-reference links just point to #${id}, which has no effect on Elasticsearch. They should probably point to the appropriate anchor in options.html, or to a search.nixos.org URL such as the one I crafted above, or we should somehow teach Elasticsearch to deal with those anchors.
  3. man page references (citerefentry elements) aren't rendered properly; Pandoc doesn't seem to support that element at all at the moment.
  4. descriptions aren't broken into paragraphs at \n\n like they are in the manual, because that splitting is part of the post-processing that happens in optionsDocBook but not in optionsJSON.
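
For reference, the splitting that optionsDocBook does amounts to roughly this (a sketch, not the actual nixpkgs code):

```python
# Sketch only: mimic the post-processing that splits an option description
# into DocBook <para> elements at blank lines, like the manual does.
def split_into_paras(description: str) -> str:
    paragraphs = [p.strip() for p in description.split("\n\n") if p.strip()]
    return "".join(f"<para>{p}</para>" for p in paragraphs)
```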

I'm not sure how we should fix this. The ways I can think of are:

  1. Parse the options.html generated by manualHTML directly. This avoids a lot of code duplication, and descriptions are rendered exactly as they are in the manual. On the downside, it requires parsing HTML and making assumptions about the generated format (default values, examples, etc. aren't properly separated from the description text; CSS classes used in the manual might clash with those used by Elasticsearch).
  2. PR nixpkgs to expose the olinkDB that maps cross-reference IDs to option names, then use it to do the rendering ourselves in Python using lxml (rough sketch below). This is probably cleaner, although it would require backporting the fix to stable channels. It also doesn't fix problem 4, but that can be fixed in nixpkgs too.
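
To make option 2 concrete, here is a rough sketch of what the lxml side could look like, assuming the olinkDB can be exposed as a plain mapping from xref IDs to option titles (the mapping shape and the target URL are placeholders, not anything nixpkgs provides today):

```python
from lxml import etree

XLINK_HREF = "{http://www.w3.org/1999/xlink}href"

def resolve_xrefs(description_xml: str, olink_db: dict) -> str:
    # Wrap the snippet so fragments with multiple top-level elements still parse.
    root = etree.fromstring(f"<root>{description_xml}</root>")
    for xref in list(root.iter("xref")):
        target = xref.get("linkend", "")
        link = etree.Element("link")
        link.set(XLINK_HREF, f"options.html#{target}")
        link.text = olink_db.get(target, target)  # fall back to the raw ID
        link.tail = xref.tail                     # keep surrounding text intact
        xref.getparent().replace(xref, link)
    return etree.tostring(root, encoding="unicode")
```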

Let me know what you think.

ncfavier self-assigned this on Apr 15, 2021
ncfavier added the bug and help wanted labels on Apr 15, 2021
jtojnar commented Jul 8, 2021

This somewhat overlaps with our conversion of the manual to Markdown.

  1. We have a Pandoc filter for that which removes the question marks, and another one that converts links without a label into raw DocBook xrefs. You will probably want something similar that replaces them with HTML links instead (see the sketch after this list).
  2. Opened jgm/pandoc#7433 (DocBook reader: add support for citerefentry) to add citerefentry support to Pandoc.
  3. This would be solved by converting the descriptions to Markdown; hopefully we will do that soon.
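
For the search side, a rough sketch of such a filter using panflute (the manual URL is an assumption, and this is not the filter we use for the manual conversion):

```python
# Sketch of a Python pandoc filter that rewrites the "#some-id" links pandoc
# produces from xref elements into links pointing at the NixOS manual.
import panflute as pf

MANUAL = "https://nixos.org/manual/nixos/unstable/options.html"  # assumed target

def rewrite_xref_links(elem, doc):
    if isinstance(elem, pf.Link) and elem.url.startswith("#"):
        elem.url = f"{MANUAL}{elem.url}"
        return elem

if __name__ == "__main__":
    pf.run_filter(rewrite_xref_links)
```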

For the option name resolution, I would prefer the olinkDB, but it is unclear how that will look once we get rid of DocBook completely.

But in the short term, we could just handle cross-references starting with opt- and resolve them to whatever comes after the dash. I expect that would handle the majority of the links. (Submodule links would still be displayed incorrectly, but they should link correctly.)
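
For illustration, that short-term heuristic could be as small as this (sketch only):

```python
# Derive a display title for xref targets starting with "opt-" by taking
# whatever comes after the dash; anything else still needs the olinkDB.
from typing import Optional

def xref_title(linkend: str) -> Optional[str]:
    if linkend.startswith("opt-"):
        # e.g. "opt-environment.variables" -> "environment.variables"
        return linkend[len("opt-"):]
    return None
```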

ncfavier (Member, Author) commented

Going to close this, as the last remaining problem is very minor and we're trying to move to Markdown anyway.
