
Problem with collective communication in parallel threads. #2427

@hujie-frank

Description


Excuse me, can MPI collective communication be called concurrently from multiple threads? I want to run collectives such as MPI_Allreduce in parallel threads, each handling a different task, as an optimization. I initialize with MPI_THREAD_MULTIPLE via MPI_Init_thread(), but it does not work ("MPI_ERR_TRUNCATE: message truncated"). I found the reason on the website: in Open MPI, every collective internally uses the same message tag values, so different collective operations can interfere with one another if they are not issued in sequence. I also tried the non-blocking interface MPI_Iallreduce, which follows the MPI-3.0 standard, to work around this, but errors still occur ("MPI Error in MPI_Testall() (18)"). Moreover, if the messages are always the same size, MPI_Allreduce and MPI_Iallreduce complete without error, but then the interference goes undetected and the correctness of the communication results is never checked.
