Note: this isn't focused on writing the documentation itself, but rather on being able to maintain and present documentation in the gputop UI.
For the Gen graphics metrics to be of more practical use to developers we really need to improve how we document the metric sets and counters, including details of how the counter values are normalized and caveats/errata that may affect their interpretation.
Different users will have different levels of familiarity with Gen graphics hardware, and whether someone is profiling to optimize a game/application or the driver stack itself will affect which details are pertinent.
Considering how we're using Python for codegen based on the gputop/oa-*.xml files, I would currently propose we use reStructuredText for maintaining extra docs - that or Markdown, which may be familiar to more people. One reason I'm thinking reStructuredText might make more sense is that it has a well-defined XML serialization: we can use the ASCII markup for writing the documentation, but our tools for maintaining the oa-*.xml files could then support parsing and serializing the docs to XML and adding them to the oa-*.xml files, to be easily shared between projects.
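To make the "embed docs into the oa-*.xml files" part concrete, here's a minimal stdlib-only sketch of what a maintenance tool might do. The element and attribute names used here (`<set>`, `<counter symbol_name=...>`, a `<description>` child) are illustrative assumptions and may not match the real oa-*.xml schema:

```python
# Hypothetical sketch: splicing per-counter documentation into an
# oa-*.xml-style file. Element/attribute names are assumptions for
# illustration, not the actual schema.
import xml.etree.ElementTree as ET

oa_xml = """\
<metrics>
  <set name="RenderBasic">
    <counter symbol_name="EuActive"/>
  </set>
</metrics>
"""

# Documentation maintained separately (e.g. converted from rst/Markdown),
# keyed by counter symbol name.
docs = {"EuActive": "Fraction of time the EUs were actively executing."}

root = ET.fromstring(oa_xml)
for counter in root.iter("counter"):
    text = docs.get(counter.get("symbol_name"))
    if text is not None:
        # Attach the documentation as a child element of the counter.
        desc = ET.SubElement(counter, "description")
        desc.text = text

print(ET.tostring(root, encoding="unicode"))
```

In a real toolchain the `docs` dict would come from parsing the rst/Markdown sources rather than being hard-coded.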
Just thinking aloud at the moment, but from a maintainability point of view, and to let others easily help improve the documentation, it could be interesting to see if the documentation could be maintained on the GitHub wiki, with our metadata toolchain updated to pull changes from the wiki.
It's awkward at the moment that our common toolchain for generating the oa-*.xml files shared between projects is currently private, since it has to parse an internal-only description of counters. Ideally this documentation toolchain should be public. Maybe to start with we should implement this within gputop and figure out how to enable other projects to leverage the extra data later.
Summary of things to solve here:

- Come up with a maintainable scheme for writing per-gen, per-metric-set, per-metric and raw-counter documentation that can provide an overview of how to interpret the metrics for each Gen (including errata to beware of, or caveats about power management/clock gating constraints applied while capturing).
- Extend the Python codegen toolchain to process the documentation into whatever form is appropriate for presenting in gputop (probably converting to HTML).
- Extend the web UI with a documentation pane that shows the most contextually relevant documentation at any point in time. By default it would likely show the most general information about all metrics on the current platform, but the UI should allow selecting specific counters in the overview, whereupon the documentation pane would describe that counter, including rendering its normalization equation.
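For the "convert to HTML" step above, a minimal sketch of what the codegen could emit per counter, using only the stdlib. The function name, anchor scheme (`counter-<symbol>`), and fragment layout are illustrative assumptions, not existing gputop code:

```python
# Hedged sketch of a codegen helper that turns one counter's documentation
# into an HTML fragment for the gputop web UI's documentation pane.
import html

def counter_doc_to_html(symbol_name, description):
    """Emit an HTML fragment with a stable anchor id per counter."""
    return (
        '<section id="counter-%s">\n'
        "  <h3>%s</h3>\n"
        "  <p>%s</p>\n"
        "</section>"
    ) % (html.escape(symbol_name),
         html.escape(symbol_name),
         html.escape(description))

print(counter_doc_to_html("EuActive", "Fraction of time the EUs were busy."))
```

The stable `id` attributes are what would let the UI jump to a specific counter's docs when it is selected in the overview.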
Considering how metrics can be normalized and derived from multiple raw counters, it would be good for the rendering of normalization equations to link through to the documentation for any referenced raw counters.
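As a rough sketch of that cross-linking, the renderer could scan a normalization equation for identifiers and wrap any known raw-counter names in links to their doc anchors. The equation syntax, counter names, and anchor scheme here are illustrative assumptions:

```python
# Hedged sketch: wrap raw-counter references in a normalization equation
# with links to their per-counter documentation anchors.
import re

# Hypothetical set of raw counters that have documentation entries.
raw_counters = {"A7", "GpuCoreClocks"}

def equation_to_html(equation):
    """Linkify known raw-counter names; leave numbers/operators alone."""
    def link(match):
        name = match.group(0)
        if name in raw_counters:
            return '<a href="#raw-%s">%s</a>' % (name, name)
        return name
    # Match identifier-like tokens (letters/underscore, then alphanumerics).
    return re.sub(r"[A-Za-z_][A-Za-z0-9_]*", link, equation)

print(equation_to_html("100 * A7 / GpuCoreClocks"))
# -> 100 * <a href="#raw-A7">A7</a> / <a href="#raw-GpuCoreClocks">GpuCoreClocks</a>
```

A real implementation would parse the equation expression properly rather than regex-matching tokens, but the linking idea is the same.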