Switch to Intel MKL on systems with Intel compiler #759
Comments
Can you elaborate on "remain compatible with the different providers," i.e., are you talking about the downstream packages or spack-stack or...? All other things being equal, I would be inclined to use openblas+fftw for everything just for consistency, but if some applications need MKL for their own Fourier transform/linear algebra purposes then that's another matter.
If you take a look at the Nautilus site config, you'll see what I mean, I think. I propose that when we use Intel compilers, we use MKL as the provider for blas, lapack, and fftw-api. When we use GNU, we keep the current combination: openblas as provider for blas and lapack, and fftw for fftw-api.
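For illustration, the kind of site-level setting being referred to might look roughly like the `packages.yaml` fragment below. This is a sketch only, not copied from the actual Nautilus site config; the `buildable: false` entries are just one possible way to keep the open-source providers out of an MKL-based environment.

```yaml
# Sketch of a site packages.yaml that makes Intel MKL the provider for the
# blas, lapack, and fftw-api virtuals (assumed layout, not the real Nautilus file).
packages:
  all:
    providers:
      blas: [intel-oneapi-mkl]
      lapack: [intel-oneapi-mkl]
      fftw-api: [intel-oneapi-mkl]
  # Optionally prevent the open-source providers from being pulled in by accident.
  openblas:
    buildable: false
  fftw:
    buildable: false
```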
Would it be a reasonable goal for the 1.8.0 release to build a second unified env (or subset thereof) using MKL on a handful of systems? That would give developers a chance to test and give feedback.
Yes, of course. Good idea.
Description
The Nautilus site config shows how to switch to Intel MKL as provider for `blas` and `lapack` (and `fftw-api`, if I remember correctly). I think it would be good to switch to Intel MKL when using the Intel compiler, and keep `openblas`/`fftw` for `gcc` and `clang`. This ensures that we remain compatible with the different providers. We can probably make use of the `require: one_of / when` syntax to implement this.

Update 2023/10/11: See JCSDA/spack#342 for work that needs to be done on `ectrans` and `ecmwf-atlas`.
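A possible shape for the compiler-dependent choice, using the `require: one_of / when` syntax mentioned above, could be the fragment below. This is an untested sketch: whether the requirements are best attached to the virtuals as shown, to `all`, or per package, and whether the `when: "%<compiler>"` conditions behave as intended, would need to be verified against the Spack version used by spack-stack.

```yaml
# Untested sketch: choose MKL with Intel compilers and openblas/fftw with GNU,
# expressed as conditional requirements on the virtual packages.
packages:
  blas:
    require:
    - one_of: [intel-oneapi-mkl]
      when: "%intel"
    - one_of: [openblas]
      when: "%gcc"
  lapack:
    require:
    - one_of: [intel-oneapi-mkl]
      when: "%intel"
    - one_of: [openblas]
      when: "%gcc"
  fftw-api:
    require:
    - one_of: [intel-oneapi-mkl]
      when: "%intel"
    - one_of: [fftw]
      when: "%gcc"
```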
Requirements
See above
Acceptance Criteria (Definition of Done)
A clean solution to use Intel MKL with at least the Intel compilers.
Dependencies
n/a