Description
Can Open MPI issue collective communications concurrently from multiple threads? I want to run collectives such as MPI_Allreduce from parallel threads, each handling a different task, as an optimization. I initialize with MPI_THREAD_MULTIPLE via MPI_Init_thread(), but the program fails with "MPI_ERR_TRUNCATE: message truncated". I found an explanation online: in Open MPI, every collective uses the same internal message tag values, so different collectives on the same communicator can interfere with each other if they are not issued in the same order everywhere. I also tried the non-blocking interface MPI_Iallreduce from the MPI-3.0 standard to work around this, but I still get errors ("MPI Error in MPI_Testall() (18)"). Moreover, if every communication uses messages of the same size, both MPI_Allreduce and MPI_Iallreduce appear to run, though I have not verified the correctness of the results.