CMake cannot verify NVCC compiler after rerender to 2019 Windows Image #76

Open · 1 task done
carterbox opened this issue Dec 20, 2021 · 9 comments

@carterbox (Member)

Issue:
After a feedstock was re-rendered to the 2019 Windows image, CMake can no longer verify NVCC as a working compiler. The version of nvcc_win-64 is the same before and after the re-render. At first, I thought the problem was that the wrong host compiler was being assigned to NVCC, because the version of the compiler linked to %CXX% did not match the host compiler reported by CMake; however, manually assigning the NVCC and host compiler paths did not help either.

I really don't know what's going on.
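
Manually assigning the compilers would look roughly like this (a sketch with placeholder values, using CMake's standard cache variables for the CUDA compiler and its host compiler):

    :: Sketch only, with placeholder paths: pin the CUDA compiler and the
    :: MSVC host compiler explicitly instead of letting CMake detect them.
    cmake -G Ninja -B build ^
        -DCMAKE_CUDA_COMPILER="%CUDA_PATH%\bin\nvcc.exe" ^
        -DCMAKE_CUDA_HOST_COMPILER="%CXX%"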


Environment (conda list):
    cmake:          3.21.3-h39d44d4_0       conda-forge
    ninja:          1.10.2-h2d74725_1       conda-forge
    nvcc_win-64:    11.2-h929061a_15        conda-forge
    ucrt:           10.0.20348.0-h57928b3_0 conda-forge
    vc:             14.2-hb210afc_5         conda-forge
    vs2015_runtime: 14.29.30037-h902a5da_5  conda-forge
    vs2017_win-64:  19.16.27038-h2e3bad8_2  conda-forge
    vswhere:        2.8.4-h57928b3_0        conda-forge



Details about conda and system (conda info):
$ conda info

@leofang (Member) commented Jan 4, 2022

cc: @jaimergp for vis

@leofang (Member) commented Jul 2, 2022

Looks like we have a growing number of people hitting this issue:

'cl.exe' is not recognized as an internal or external command,
operable program or batch file.

Sounds like vs2019 is to blame?

cc: @conda-forge/core @conda-forge/nvcc @conda-forge/cudatoolkit

@leofang (Member) commented Jul 2, 2022

I figured it out. We need to pass the --use-local-env flag to nvcc. Let me send a patch.
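
(For context, --use-local-env tells nvcc to take the host compiler environment from the INCLUDE, LIB, and PATH variables that activation has already set up, instead of trying to locate Visual Studio itself. A minimal sketch of wiring it through, assuming a CMake-based build; CMake initializes CMAKE_CUDA_FLAGS from the CUDAFLAGS environment variable:)

    :: Sketch, not the actual patch: append the flag so nvcc reuses the
    :: MSVC environment that activation already configured.
    set "CUDAFLAGS=%CUDAFLAGS% --use-local-env"
    cmake -G Ninja -B build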

@leofang (Member) commented Jul 2, 2022

#88 (comment)

@hadim (Member) commented Oct 18, 2022

I am not sure what the status is here, but I also got the cl.exe error at conda-forge/katago-feedstock#1. When trying to use your fix:

    :: Point at the cuDNN headers shipped in the conda environment.
    set CUDNN_INCLUDE_DIR=%LIBRARY_PREFIX%\include
    :: Find nvcc on PATH, then double each backslash so the path survives
    :: later re-parsing.
    for /f "tokens=* usebackq" %%f in (`where nvcc`) do (set "dummy=%%f" && call set "NVCC=%%dummy:\=\\%%")
    :: Append the workaround flag to the compiler variable.
    set "NVCC=%NVCC% --use-local-env"
    echo "nvcc is %NVCC%, CUDA path is %CUDA_PATH%"

then I got a "Failed to find nvcc." error.

Any idea?

@leofang (Member) commented Oct 18, 2022

No, unfortunately. I don't really know what's going on. The patch in #88 did not work...

@jaimergp (Member)

Isuru's patch did work in this other PR, so I don't know :(

@leofang (Member) commented Oct 18, 2022

Interesting. @isuruf, have you figured out what's missing in #88?

carterbox added a commit to carterbox/magma-feedstock that referenced this issue Oct 20, 2022
conda-forge/nvcc-feedstock#76 This library uses an old version of CMake's CUDA support (the legacy FindCUDA module) and a strange way of declaring the build artifacts, so the flag needs to be added via -DCUDA_NVCC_FLAGS="--use-local-env" instead of through the CUDAFLAGS environment variable or by appending the option to the CUDACXX environment variable.
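
A configure call along these lines illustrates the distinction (a sketch, assuming the project goes through the legacy FindCUDA module, whose CUDA_NVCC_FLAGS cache variable is separate from the modern CMAKE_CUDA_FLAGS):

    :: Sketch: with legacy FindCUDA, nvcc flags must be passed through the
    :: CUDA_NVCC_FLAGS cache variable; CUDAFLAGS and flags appended to
    :: CUDACXX are not seen by that module.
    cmake -G Ninja -B build ^
        -DCUDA_NVCC_FLAGS="--use-local-env"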

Changes the build and install steps to call cmake instead of invoking the generator's build tool directly. Switches the generator to Ninja for all platforms. Based on this PR, Ninja builds seem significantly faster or take approximately the same time.
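
Driving both steps through cmake keeps the script generator-agnostic; a sketch of the pattern:

    :: Sketch: build and install via cmake so the same commands work with
    :: any generator (Ninja here).
    cmake --build build --parallel
    if errorlevel 1 exit 1
    cmake --install build
    if errorlevel 1 exit 1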

Adds a special case for 11.2 in the Windows build script because it was missing.

Adds checks for additional build artifacts.

Adds a strict run_export.

Changes the maintainers from @isuruf to myself and @conda-forge/pytorch-cpu.