Description
Opening this issue for tracking purposes. Measured with the master nightly build against v1.10.3 using osu_mbw_mr: master delivers roughly half the messages/s of v1.10.3 for small messages (up to ~2 KB), with both the yalla and ob1 PMLs. A possible fix is on master.
@hjelmn or @bosilca, please comment.
PML - Yalla
OMPI - 1.10.3
$ mpirun -np 2 --map-by node --bind-to core -mca pml yalla -x MXM_RDMA_PORTS=mlx5_0:1 -mca btl_openib_if_include mlx5_0:1 /hpc/local/benchmarks/hpc-stack-gcc/install/ompi-v1.10/tests/osu-micro-benchmarks-5.2/osu_mbw_mr
# OSU MPI Multiple Bandwidth / Message Rate Test v5.2
# [ pairs: 1 ] [ window size: 64 ]
# Size MB/s Messages/s
1 4.01 4005006.11
2 8.24 4121056.15
4 16.39 4097311.09
8 32.45 4055766.73
16 64.16 4010025.24
32 127.13 3972687.66
64 237.04 3703703.70
128 455.11 3555555.62
256 860.96 3363110.99
512 1592.23 3109815.42
1024 2811.50 2745602.68
2048 4972.38 2427921.16
4096 5430.79 1325875.29
8192 5933.54 724309.64
16384 6155.42 375697.10
32768 6328.16 193120.10
65536 6398.15 97627.95
131072 6433.23 49081.64
262144 5161.27 19688.67
524288 5731.10 10931.20
1048576 6046.06 5765.97
2097152 6215.12 2963.60
4194304 6306.30 1503.54
---------------------
PML - Yalla
OMPI - Master
$ mpirun -np 2 --map-by node --bind-to core -mca pml yalla -x MXM_RDMA_PORTS=mlx5_0:1 -mca btl_openib_if_include mlx5_0:1 /hpc/local/benchmarks/hpc-stack-gcc/install/ompi-master/tests/osu-micro-benchmarks-5.2/osu_mbw_mr
# OSU MPI Multiple Bandwidth / Message Rate Test v5.2
# [ pairs: 1 ] [ window size: 64 ]
# Size MB/s Messages/s
1 1.89 1887305.40
2 3.80 1898890.08
4 7.56 1889678.24
8 15.31 1914346.13
16 30.41 1900517.95
32 60.30 1884510.12
64 119.99 1874796.93
128 227.47 1777098.03
256 454.66 1776025.43
512 870.71 1700598.93
1024 1599.39 1561900.54
2048 3228.97 1576645.16
4096 4453.33 1087237.56
8192 5822.02 710695.44
16384 6213.84 379262.51
32768 6336.49 193374.24
65536 6403.37 97707.72
131072 6438.18 49119.44
262144 5126.38 19555.59
524288 5708.48 10888.06
1048576 6033.43 5753.92
2097152 6208.48 2960.43
4194304 6303.32 1502.83
-----------------------------
PML - OB1
OMPI - 1.10.3
$ mpirun -np 2 --map-by node --bind-to core -mca pml ob1 -x MXM_RDMA_PORTS=mlx5_0:1 -mca btl_openib_if_include mlx5_0:1 /hpc/local/benchmarks/hpc-stack-gcc/install/ompi-v1.10/tests/osu-micro-benchmarks-5.2/osu_mbw_mr
# OSU MPI Multiple Bandwidth / Message Rate Test v5.2
# [ pairs: 1 ] [ window size: 64 ]
# Size MB/s Messages/s
1 3.20 3204807.28
2 6.91 3453858.56
4 13.82 3453858.78
8 27.41 3426124.16
16 54.12 3382663.90
32 105.73 3304078.53
64 208.55 3258655.72
128 402.16 3141875.26
256 780.19 3047618.99
512 1324.49 2586903.70
1024 2392.70 2336619.29
2048 4147.85 2025316.48
4096 5411.73 1321222.14
8192 5900.16 720234.07
16384 6083.99 371337.40
32768 6329.11 193149.24
65536 6427.56 98076.78
131072 6478.69 49428.48
262144 6503.55 24809.09
524288 6517.20 12430.56
1048576 6523.66 6221.44
2097152 6526.58 3112.11
4194304 6528.46 1556.51
----------------------------------
PML - OB1
OMPI - Master
$ mpirun -np 2 --map-by node -mca pml ob1 -mca btl_openib_if_include mlx5_0:1 /hpc/local/benchmarks/hpc-stack-gcc/install/ompi-master/tests/osu-micro-benchmarks-5.2/osu_mbw_mr
# OSU MPI Multiple Bandwidth / Message Rate Test v5.2
# [ pairs: 1 ] [ window size: 64 ]
# Size MB/s Messages/s
1 1.64 1636174.22
2 4.79 2392507.23
4 9.69 2423259.37
8 19.08 2384926.46
16 38.57 2410744.90
32 75.80 2368681.59
64 149.17 2330745.92
128 281.28 2197461.83
256 539.24 2106415.38
512 1065.10 2080264.37
1024 1807.65 1765284.91
2048 3429.21 1674421.30
4096 5233.04 1277597.35
8192 5634.88 687851.71
16384 5303.44 323696.21
32768 6091.79 185906.65
65536 6392.29 97538.57
131072 6459.20 49279.78
262144 6494.59 24774.90
524288 6512.32 12421.26
1048576 6521.32 6219.21
2097152 6525.70 3111.70
4194304 6526.97 1556.15
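For reference, a minimal comparison sketch (not part of the original report): it parses two saved osu_mbw_mr outputs, such as the v1.10.3 and master runs above, and prints the per-size message-rate ratio, which makes the roughly 0.5x small-message drop on master easy to see. The script name and file arguments are hypothetical.

```python
#!/usr/bin/env python3
"""Hypothetical helper: compare message rates from two osu_mbw_mr runs."""

import sys


def parse_osu_mbw_mr(path):
    """Return {message_size: messages_per_sec} parsed from osu_mbw_mr output."""
    rates = {}
    with open(path) as fh:
        for line in fh:
            parts = line.split()
            # Data lines look like: "<size> <MB/s> <Messages/s>"
            if len(parts) == 3 and not line.startswith("#"):
                try:
                    rates[int(parts[0])] = float(parts[2])
                except ValueError:
                    continue  # skip headers and non-numeric lines
    return rates


def main():
    if len(sys.argv) != 3:
        sys.exit("usage: compare_mbw_mr.py <baseline.txt> <candidate.txt>")
    base = parse_osu_mbw_mr(sys.argv[1])  # e.g. the v1.10.3 output
    cand = parse_osu_mbw_mr(sys.argv[2])  # e.g. the master output
    print(f"{'size':>10} {'baseline msg/s':>16} {'candidate msg/s':>16} {'ratio':>7}")
    for size in sorted(base.keys() & cand.keys()):
        ratio = cand[size] / base[size]
        print(f"{size:>10} {base[size]:>16.1f} {cand[size]:>16.1f} {ratio:>7.2f}")


if __name__ == "__main__":
    main()
```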