Required prerequisites
What version (or hash if on master) of pybind11 are you using?
2.13.1
Problem description
When building a large pybind11 module with GCC+CMake+Ninja, I find that the linking step runs serially and takes a very long time. When the Makefiles generator is used instead, linking (LTO) uses all available CPUs and is much faster.
Environment: conda-forge, Linux x64
GCC 12.3.0
CMake 3.30.0
GNU Make 4.4.1
Ninja 1.12.1
The problem seems related to the fact that Ninja does not support the GNU Make jobserver protocol. Therefore, the -flto flag does serial LTO with Ninja and parallel LTO with Make. You need to explicitly set -flto=auto or -flto=n when you use Ninja to get reasonable link times.

It might be a good idea to change set(thin "") to set(thin "=auto") here: pybind11/tools/pybind11Common.cmake, line 327 (at bb05e08).
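For illustration, a rough sketch of what that change could look like; this is not the actual pybind11 code, and the prefer_thin_lto name and surrounding structure are assumptions about how the thin suffix is selected:

```cmake
# Sketch only: approximates the LTO suffix selection in pybind11Common.cmake.
# The prefer_thin_lto variable and the if/else structure are assumed here.
if(prefer_thin_lto)
  set(thin "=thin")
else()
  # Proposed change: "=auto" lets GCC choose a parallel LTO job count itself
  # instead of relying on a Make jobserver, which Ninja does not provide.
  set(thin "=auto") # currently: set(thin "")
endif()
# The suffix is later interpolated into the flag, e.g. -flto${thin}.
```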
Clang supports -flto=auto too (with Clang, it means the same as -flto).
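In the meantime, a consuming project can force parallel LTO itself; a minimal sketch for the project's own CMakeLists.txt (the generator and compiler checks here are mine, not part of pybind11):

```cmake
# Workaround sketch: when building with GCC under the Ninja generator,
# append -flto=auto so GCC spawns its own parallel LTO jobs at link time.
if(CMAKE_GENERATOR STREQUAL "Ninja" AND CMAKE_CXX_COMPILER_ID STREQUAL "GNU")
  add_compile_options(-flto=auto)
  add_link_options(-flto=auto)
endif()
```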
Reproducible example code
No response
Is this a regression? Put the last known working version here if it is.
Not a regression