# install from pypi
pip3 install flashprof
# collect tiflash logs from the tiup cluster into the current directory; the logs are also parsed into json
flashprof collect --cluster $CLUSTER_NAME
# render all collected cluster runtime info (currently only task DAGs are supported)
flashprof render
# help
flashprof -h
flashprof <subcommand> -h
Currently only task runtime info is visualized as a DAG. The tasks of a single query may span several tiflash instances; tasks with `status != FINISHED` or `error_message != ""` are labelled with a red border.
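As a rough illustration of how such tasks could be flagged, the sketch below scans a merged task-DAG json file for tasks that did not finish or that reported an error. The file path and the json schema (a mapping from `query_tso` to task lists with `task_id`, `status` and `error_message` fields) are assumptions for illustration, not the tool's actual format.

```python
import json
from pathlib import Path

# Assumed location and schema (for illustration only):
# {query_tso: {"tasks": [{"task_id": ..., "status": ..., "error_message": ...}, ...], ...}}
dag_file = Path("flashprof/cluster/my_cluster/task_dag/json/cluster.task_dag.json")
dags = json.loads(dag_file.read_text())

for query_tso, dag in dags.items():
    for task in dag["tasks"]:
        # These are the tasks that would be drawn with a red border.
        if task.get("status") != "FINISHED" or task.get("error_message", "") != "":
            print(query_tso, task.get("task_id"), task.get("status"), task.get("error_message"))
```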
The collected and generated artifacts have the following layout:
flashprof
└── cluster
    ├── cluster1_name
    │   ├── log (collected from tiflash log dir)
    │   │   ├── ip1.tiflash.log
    │   │   └── ip2.tiflash.log
    │   └── task_dag (parsed and combined task dag)
    │       ├── json
    │       │   ├── ip1.tiflash.log.task_dag.json
    │       │   ├── ip2.tiflash.log.task_dag.json
    │       │   └── cluster.task_dag.json
    │       ├── png (rendered png files)
    │       └── svg (rendered svg files)
    └── cluster2_name
        ...
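For scripting on top of these artifacts, the paths can be derived from the cluster name alone. The helper below is only a sketch of that convention; the cluster name `my_cluster` is a placeholder and the directory names simply mirror the layout above.

```python
from pathlib import Path

def artifact_dirs(cluster_name: str, root: Path = Path("flashprof")) -> dict:
    """Return the per-cluster artifact directories following the layout above."""
    base = root / "cluster" / cluster_name
    return {
        "log": base / "log",                 # collected tiflash logs
        "json": base / "task_dag" / "json",  # parsed/merged task DAGs
        "png": base / "task_dag" / "png",    # rendered png files
        "svg": base / "task_dag" / "svg",    # rendered svg files
    }

# Example: list the rendered svg files of a (hypothetical) cluster.
for svg in sorted(artifact_dirs("my_cluster")["svg"].glob("*.svg")):
    print(svg)
```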
# install a local dev version of the python package so that flashprof can be invoked directly
# pip creates a symbolic link to the current source tree, so code changes take effect immediately;
# rerun this only when the packaging metadata (e.g. entry points) changes
pip3 install -e .
# uninstall when no longer needed
pip3 uninstall flashprof
# build and publish to pypi
pip3 install build twine
python3 -m build
twine check dist/*
twine upload dist/*
# to test the publishing flow, upload to test.pypi.org instead
# twine upload --repository testpypi dist/*
Please refer to https://packaging.python.org/guides/distributing-packages-using-setuptools/ for detailed instructions.
The `collect` command collects tiflash logs according to the tiup configuration of the specified `--cluster $CLUSTER_NAME`; the logs are named `$IP.tiflash.log` and stored in `flashprof/cluster/$CLUSTER_NAME/log`.
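The naming convention can be sketched as follows. The instance list and remote log paths below are hard-coded placeholders (in practice they come from the tiup topology), and the use of `scp` is only an illustration of copying the files, not necessarily how the tool does it.

```python
import subprocess
from pathlib import Path

cluster_name = "my_cluster"  # placeholder cluster name
# (ip, remote tiflash log path) pairs; placeholders standing in for the tiup topology.
instances = [
    ("10.0.1.11", "/data/tiflash/log/tiflash.log"),
    ("10.0.1.12", "/data/tiflash/log/tiflash.log"),
]

log_dir = Path("flashprof") / "cluster" / cluster_name / "log"
log_dir.mkdir(parents=True, exist_ok=True)

for ip, remote_path in instances:
    # Each collected log is stored locally as $IP.tiflash.log.
    local_path = log_dir / f"{ip}.tiflash.log"
    subprocess.run(["scp", f"{ip}:{remote_path}", str(local_path)], check=True)
```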
The `parse` command parses all the tiflash logs collected above into json, which currently contains only task DAGs. The per-instance json files are then merged into `cluster.task_dag.json` in `flashprof/cluster/$CLUSTER_NAME/task_dag/json`.
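A minimal sketch of such a merge step, assuming each per-instance file maps `query_tso` to its tasks and edges (the real schema may differ) and using a placeholder cluster name:

```python
import json
from pathlib import Path

json_dir = Path("flashprof/cluster/my_cluster/task_dag/json")  # placeholder cluster name

# Assumed per-instance schema: {query_tso: {"tasks": [...], "edges": [...]}}.
merged: dict = {}
for f in sorted(json_dir.glob("*.tiflash.log.task_dag.json")):
    for query_tso, dag in json.loads(f.read_text()).items():
        entry = merged.setdefault(query_tso, {"tasks": [], "edges": []})
        # A query's tasks may span several tiflash instances, so entries for
        # the same query_tso are concatenated rather than overwritten.
        entry["tasks"].extend(dag.get("tasks", []))
        entry["edges"].extend(dag.get("edges", []))

(json_dir / "cluster.task_dag.json").write_text(json.dumps(merged, indent=2))
```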
The `render` command renders `cluster.task_dag.json` into one DAG graph per `query_tso` under `flashprof/cluster/$CLUSTER_NAME/task_dag/$FORMAT`.
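The rendering step can be approximated with the `graphviz` Python package, as in the sketch below. The input path, output directory, and json schema (the same assumed mapping from `query_tso` to tasks and edges used above) are illustrative assumptions, not the tool's actual implementation.

```python
import json
from pathlib import Path

from graphviz import Digraph  # pip3 install graphviz (the system graphviz binary is also required)

json_file = Path("flashprof/cluster/my_cluster/task_dag/json/cluster.task_dag.json")
out_dir = Path("flashprof/cluster/my_cluster/task_dag/svg")
out_dir.mkdir(parents=True, exist_ok=True)

# Assumed schema: {query_tso: {"tasks": [...], "edges": [[src, dst], ...]}}.
dags = json.loads(json_file.read_text())

for query_tso, dag in dags.items():
    g = Digraph(name=f"query_{query_tso}")
    for task in dag["tasks"]:
        failed = task.get("status") != "FINISHED" or task.get("error_message", "") != ""
        # Tasks that did not finish cleanly get a red border, as described above.
        g.node(str(task["task_id"]), color="red" if failed else "black")
    for src, dst in dag["edges"]:
        g.edge(str(src), str(dst))
    # Writes one svg per query_tso into the output directory.
    g.render(filename=f"query_{query_tso}", directory=str(out_dir), format="svg", cleanup=True)
```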