Build benchmarking infrastructure to compare parallel speed ups #12
The current benchmarking scripts are not very portable; they should run in some automated fashion.
@MridulS I'd like to take up this issue, but I don't have any experience building benchmarking infrastructure, so please guide me on this. One possible benchmarking approach: we could check whether the average of all the speedup values (in a heatmap, for graphs of different sizes and densities) is greater than 1, to ensure that the parallel algorithms are more time-efficient. Alternatively, we could use pytest-benchmark:

```python
import networkx as nx
import nx_parallel
import pytest

num, p = 300, 0.5
G = nx.fast_gnp_random_graph(num, p, directed=False)
H = nx_parallel.ParallelGraph(G)

@pytest.mark.benchmark
def test_algorithm_performance_G(benchmark):
    # replace betweenness_centrality with the newly added algorithm
    result_seq = benchmark(nx.betweenness_centrality, G)

@pytest.mark.benchmark
def test_algorithm_performance_H(benchmark):
    result_para = benchmark(nx.betweenness_centrality, H)
```

[Test output screenshot]

What are your thoughts on this? What should I keep in mind before structuring it? Thank you :)
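The "average speedup > 1" check described above can be sketched with plain Python (no plotting needed). The heatmap values below are made up for illustration; in practice they would come from timing the sequential and parallel runs across graph sizes and edge densities.

```python
# Sketch of the "mean speedup > 1" sanity check: rows are graph sizes,
# columns are edge densities, and each cell is sequential_time / parallel_time.
# These numbers are invented placeholders, not measured results.
speedup_heatmap = [
    [1.8, 2.1, 2.4],  # e.g. smaller graphs at p = 0.3, 0.5, 0.7
    [2.0, 2.6, 3.1],  # e.g. larger graphs at the same densities
]

values = [v for row in speedup_heatmap for v in row]
mean_speedup = sum(values) / len(values)
assert mean_speedup > 1, "parallel backend is not faster on average"
```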
@Schefflera-Arboricola yes, pytest-benchmark could be one way of doing this. We use ASV for networkx benchmarking, but we may need to come up with a way to incorporate benchmarks with networkx dispatching. Ideally this benchmark suite would be able to swap in any backend (graphblas, nx-parallel, cugraph) and run against all of them. We still need to think a bit more about how to approach this :)
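For reference, an ASV benchmark is just a class whose `time_*` methods get timed, with `params`/`param_names` sweeping over inputs. A minimal sketch of what a backend-swapping benchmark could look like (the class name, parameter grid, and the lazy imports are all assumptions, not existing nx-parallel code):

```python
# Hypothetical ASV benchmark module, e.g. benchmarks/bench_centrality.py.
# ASV discovers classes with `time_*` methods, calls `setup` once per
# parameter combination, and times each `time_*` method.

class BetweennessCentralityBenchmark:
    # Cartesian product: every graph size is benchmarked on every backend.
    params = [[50, 100, 300], ["networkx", "nx-parallel"]]
    param_names = ["num_nodes", "backend"]

    def setup(self, num_nodes, backend):
        # Imports are kept inside setup so ASV controls the environment.
        import networkx as nx
        self.G = nx.fast_gnp_random_graph(num_nodes, 0.5, seed=42)
        if backend == "nx-parallel":
            import nx_parallel
            self.G = nx_parallel.ParallelGraph(self.G)

    def time_betweenness_centrality(self, num_nodes, backend):
        import networkx as nx
        nx.betweenness_centrality(self.G)
```

ASV would then let us compare the two backends side by side in its HTML report, though as noted it is primarily built for comparing a single library against its own history.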
By "GitHub Actions", did you mean something like networkx/networkx#6834, or something else?
Yes! I'll try to finish the one in the NX main repo soon. I think it's already good to go.
Just adding this here for reference: https://conbench.github.io/conbench/

- pytest-benchmark → cannot host results the way ASV can
- ASV benchmarks → a nice tool for comparing a library against its past versions, but not the best option when we need to compare two libraries (i.e. networkx and nx-parallel here)
We need a quick way, using either GitHub Actions or scripts, to run some crude benchmarks while developing new algorithms.
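A stdlib-only sketch of such a crude benchmark script: time each workload a few times, keep the best run, and report the speedup. The workloads below are stand-ins; in real use they would wrap the sequential and parallel algorithm calls (e.g. `nx.betweenness_centrality` on a plain graph vs. a `ParallelGraph`).

```python
import time


def crude_benchmark(workloads, repeats=3):
    """Time each callable `repeats` times; return {name: best time in s}."""
    results = {}
    for name, fn in workloads.items():
        timings = []
        for _ in range(repeats):
            start = time.perf_counter()
            fn()
            timings.append(time.perf_counter() - start)
        results[name] = min(timings)
    return results


# Stand-in workloads of deliberately different cost; a real script would
# call the sequential and parallel algorithms here instead.
timings = crude_benchmark({
    "sequential-stand-in": lambda: sum(i * i for i in range(200_000)),
    "parallel-stand-in": lambda: sum(i * i for i in range(100_000)),
})
print({name: f"{t:.4f}s" for name, t in timings.items()})
```

Because it is a single self-contained script, this could run locally during development or as a quick smoke-check step in a GitHub Actions workflow.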