
Toolchain for maintaining HW/metrics/counter docs + presenting in UI #10

Open
rib opened this issue Nov 20, 2015 · 0 comments

rib commented Nov 20, 2015

Note: this isn't focused on writing the documentation itself, but rather on being able to maintain and present documentation in the gputop UI.

For the Gen graphics metrics to be of more practical use to developers we really need to improve how we document the metric sets and counters, including details of how the counter values are normalized and caveats/errata that may affect their interpretation.

Different users will have different levels of familiarity with Gen graphics hardware, and whether someone is profiling to optimize a game/application or the driver stack itself also affects which details are pertinent.

Considering that we're already using Python for codegen based on the gputop/oa-*.xml files, I would currently propose we use reStructuredText for maintaining the extra docs - that or Markdown, which may be familiar to more people. One reason reStructuredText might make more sense is that it has a well-defined XML serialization: we can use the ASCII markup for writing the documentation, while our tools for maintaining the oa-*.xml files could parse the docs, serialize them to XML, and add them to the oa-*.xml files so they can easily be shared between projects.
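A minimal sketch of that flow, assuming docutils is used for the parsing/serialization; the `<documentation>` element, the `GpuBusy` counter lookup and the `oa-hsw.xml` file name are only illustrative, not existing gputop conventions:

```python
# Sketch only: parse a counter's reStructuredText description with docutils
# and embed the resulting Docutils XML into an oa-*.xml file.  The
# <documentation> element and the oa-hsw.xml file are assumptions made for
# the sake of the example.
import xml.etree.ElementTree as ET
from docutils.core import publish_string

rst_source = """\
GPU Busy
========

Fraction of time the GPU was not idle, derived from the raw
``GpuCoreClocks`` and ``GpuTime`` counters.
"""

# docutils ships an XML writer with a well-defined serialization of its
# document tree
docutils_xml = publish_string(rst_source, writer_name='xml')

tree = ET.parse('oa-hsw.xml')
counter = tree.find(".//counter[@symbol_name='GpuBusy']")
doc = ET.SubElement(counter, 'documentation')
doc.append(ET.fromstring(docutils_xml))
tree.write('oa-hsw.xml')
```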

Just thinking aloud at the moment, but from a maintainability point of view, and to let others easily help improve the documentation, it could be interesting to see whether the documentation could be maintained on the GitHub wiki, with our metadata toolchain updated to be able to pull changes from the wiki.

It's awkward at the moment that our common toolchain for generating the oa-*.xml files shared between projects is currently private, since it has to parse an internal-only description of the counters. Ideally this documentation toolchain should be public. Maybe to start with we should implement this within gputop and figure out later how to enable other projects to leverage the extra data.

Summary of things to solve here:

  1. Come up with a maintainable scheme for writing per-Gen, per-metric-set, per-metric and raw-counter documentation that can provide an overview of how to interpret metrics for each Gen (including errata to beware of, or caveats about power management/clock gating constraints applied while capturing).
  2. Extend the Python codegen toolchain to process the documentation into whatever form is appropriate for presenting in gputop (probably converting to HTML); see the sketch after this list.
  3. Extend the Web UI to have a documentation pane that shows the most contextually relevant documentation at any point in time. By default it would likely show the most general information about all metrics on the current platform, but the UI should allow selecting specific counters in the overview, whereupon the documentation pane would describe that counter, including rendering its normalization equation.
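
To illustrate point 2, here is a rough sketch of how the codegen toolchain could convert per-counter documentation into HTML fragments for the UI. For simplicity it assumes the docs are stored as raw reStructuredText inside a `<documentation>` element (rather than the Docutils XML form discussed above); the element name, file name and output layout are assumptions, not existing gputop conventions.

```python
# Rough sketch: convert per-counter reStructuredText docs from an oa-*.xml
# file into HTML fragments for the gputop UI.  The <documentation> element,
# the oa-hsw.xml file name and the docs/ output layout are assumptions.
import os
import xml.etree.ElementTree as ET
from docutils.core import publish_parts

def counter_docs_to_html(oa_xml_path):
    """Yield (counter symbol, HTML fragment) for every documented counter."""
    tree = ET.parse(oa_xml_path)
    for counter in tree.iter('counter'):
        doc = counter.find('documentation')
        if doc is None or not (doc.text or '').strip():
            continue
        # publish_parts() returns a dict of rendered fragments; 'body' is
        # the HTML body without the surrounding page boilerplate.
        html = publish_parts(doc.text, writer_name='html')['body']
        yield counter.get('symbol_name'), html

if __name__ == '__main__':
    if not os.path.isdir('docs'):
        os.makedirs('docs')
    for symbol, html in counter_docs_to_html('oa-hsw.xml'):
        with open(os.path.join('docs', symbol + '.html'), 'w') as out:
            out.write(html)
```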

Considering how metrics can be normalized and derived from multiple raw counters, it would be good for the rendering of normalization equations to link through to the documentation for any referenced raw counters.
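
Purely as an illustration, assuming raw counter references in an equation string look like `$GpuCoreClocks` and that the documentation pane exposes per-counter anchors, the linking could be done while rendering the equation to HTML:

```python
# Illustrative only: wrap raw counter references (assumed to look like
# "$GpuCoreClocks" in the equation string) in links to their documentation
# entries when rendering a normalization equation to HTML.
import re

RAW_COUNTER_RE = re.compile(r'\$(\w+)')

def equation_to_html(equation):
    """Link each referenced raw counter to an anchor in the documentation pane."""
    def link(match):
        symbol = match.group(1)
        return '<a href="#counter-%s">%s</a>' % (symbol, symbol)
    return RAW_COUNTER_RE.sub(link, equation)

print(equation_to_html('100 * $GpuCoreClocks / $GpuTime'))
# -> 100 * <a href="#counter-GpuCoreClocks">GpuCoreClocks</a> / <a href="#counter-GpuTime">GpuTime</a>
```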
