
Split KernelAbstractions into frontend and backends #200

Merged: 7 commits from vc/split into master on Feb 12, 2021

Conversation

@vchuravy (Member)

This is so that packages don't need to depend on both AMDGPU.jl and CUDA.jl just to use KernelAbstractions.jl.

The alternative is to move the code into CUDA.jl or AMDGPU.jl. Thoughts?

@maleadt @jpsamaroo

@jpsamaroo (Member)

This is cool! I'm guessing we'll need to separately register these sub-packages in General?

@maleadt (Member) commented Jan 12, 2021

On the one hand I'm happy to put that stuff in CUDA.jl, but this feels way nicer, and is easier to develop in tandem with KA.jl.

@vchuravy (Member, Author)

Yeah I need to figure out tests first.

There are two things I would like to do which I don't think this setup allows for:

  • Use KernelAbstractions to provide generic implementations for GPUArrays (a sketch of such a kernel follows below).
  • Have the CUDA backend automatically work when both KA and CUDA are loaded.
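
A minimal sketch of the kind of backend-agnostic kernel the first point is about, assuming the Event-based launch API KernelAbstractions used at the time (the kernel name and sizes are illustrative, not from this PR):

```julia
using KernelAbstractions

# One generic kernel body; the backend is only chosen at instantiation
# time, so the same code could back a generic GPUArrays method.
@kernel function scale!(y, @Const(x), a)
    i = @index(Global)
    @inbounds y[i] = a * x[i]
end

x = rand(Float32, 1024)
y = similar(x)

# Instantiate for the CPU backend with a workgroup size of 64,
# launch over the whole array, and wait on the returned Event.
kernel! = scale!(CPU(), 64)
event = kernel!(y, x, 2f0; ndrange = length(y))
wait(event)
```

With CUDAKernels loaded, the same kernel would be instantiated with the CUDA device instead of CPU(); that backend-agnostic dispatch is what the split has to preserve.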

@DilumAluthge (Collaborator)

Is the plan that you'll register CPUKernels in the General registry as a subdirectory package?

@DilumAluthge (Collaborator) commented Jan 12, 2021

Personally I'm not a big fan of the "multiple packages in a single GitHub repo" approach. (Despite the fact that I once asked for that feature 😂.) From my observation, it ends up causing some difficult issues down the line.

Would you be open to instead putting CPUKernels in its own GitHub repo?

@maleadt (Member) commented Jan 13, 2021

> From my observation, it ends up causing some difficult issues down the line.

Such as? Developing the interface between these packages is much easier if you can do it in a single PR rather than having to create PRs across packages, deal with semver and/or work with Manifests to make sure changes are picked up, duplicate (reverse) CI across packages, etc.

@DilumAluthge (Collaborator)

The main problem is if you ever decide in the future that you want to put the package in a separate GitHub repo. You now have to figure out a way to make sure that the new repo contains all of the same trees that the package had when it was in this repo.

There used to be a subdir package in SnoopCompile.jl called SnoopCompileBot.jl. They wanted to split it out, but the process ended up being painful enough that they just made a new package (CompileBot.jl).

@DilumAluthge (Collaborator)

If the packages are that closely related, I would say they should just be the same package?

@DilumAluthge (Collaborator)

What's the main motivation of splitting KernelAbstractions and CPUKernels into separate packages? Is it so that people can load KernelAbstractions without having to load CUDA? If so, instead of making CPUKernels a different package, you could make it a submodule, and then lazily load that submodule using Requires.
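
For concreteness, a minimal sketch of the Requires pattern being suggested here, lazily pulling in a CUDA backend only once CUDA.jl is loaded (the file layout is hypothetical; the UUID is meant to be CUDA.jl's registered one):

```julia
module KernelAbstractions

using Requires

include("cpu.jl")  # hypothetical file holding the always-available CPU backend

function __init__()
    # Evaluated only if and when the user loads CUDA.jl; until then
    # KernelAbstractions carries no CUDA code at runtime.
    @require CUDA="052768ef-5323-5732-b1bb-66c8b64840ba" include("cuda.jl")
end

end # module
```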

@maleadt (Member) commented Jan 13, 2021

> lazily load that submodule using Requires.

Blasphemy! Requires doesn't respect semver, is hard to PackageCompile, etc. The goal is indeed to prevent a dependency on both CUDA/AMDGPU/..., and the only proper way to do so currently is to create more packages. Maybe with Preferences we could do Requires-style dependencies better, but then you still have the package dependency on both CUDA.jl and AMDGPU.jl (even though you may not end up loading the package), which is unacceptable.

@DilumAluthge (Collaborator)

> you still have the package dependency on both CUDA.jl and AMDGPU.jl (even though you may not end up loading the package), which is unacceptable.

This is what I had in mind when I suggested Requires: you keep the dependency in the [deps] section of the Project.toml file, but you only do import CUDA if you really need it.
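
A minimal sketch of that variant, under the assumption that CUDA stays in [deps] (the helper name is hypothetical):

```julia
# CUDA.jl remains in [deps], so it is versioned, resolvable, and
# semver-bounded, but it is only brought into scope on demand.
const CUDA_LOADED = Ref(false)

function ensure_cuda!()
    if !CUDA_LOADED[]
        @eval import CUDA  # top-level import, deferred until first use
        CUDA_LOADED[] = true
    end
    return nothing
end
```

The trade-off, per the objection above, is that CUDA.jl still sits in every downstream resolver graph even when it is never imported.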

@DilumAluthge (Collaborator)

> and the only proper way to do so currently is to create more packages.

Unless we get a proper solution to JuliaLang/Pkg.jl#1285

@maleadt (Member) commented Jan 13, 2021

> Unless we get a proper solution to JuliaLang/Pkg.jl#1285

Sure, but that's not going to happen anytime soon.

@vchuravy (Member, Author)

> What's the main motivation of splitting KernelAbstractions and CPUKernels into separate packages?

I was taking the idea to its extreme. But yes, the general idea is to be able to use KernelAbstractions with just AMDGPU.jl or just CUDA.jl, without forcing a dependency on both.

@jpsamaroo (Member)

It's also not necessarily possible to depend on both CUDA.jl and AMDGPU.jl at the same time; AMDGPU.jl tends to lag significantly behind CUDA.jl, which can cause version incompatibilities between shared deps, like GPUArrays or Adapt.

Commit: Added new package CUDAKernels.jl
Co-authored-by: Julian P Samaroo <jpsamaroo@jpsamaroo.me>

@jpsamaroo marked this pull request as ready for review on February 11, 2021 17:57
@vchuravy (Member, Author)

bors try

bors bot added a commit that referenced this pull request Feb 11, 2021

@vchuravy (Member, Author)

Thanks Julian! As discussed on Slack, we should test the version of CUDAKernels that is inside the monorepo.

@maleadt ideas on how to get Pkg to agree with us on that?
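
One way to get Pkg to do that, sketched assuming CI runs from the repository root (this mirrors what the CI configuration further down ends up doing):

```julia
using Pkg

# dev the KernelAbstractions checkout we are standing in ...
Pkg.develop(PackageSpec(path = pwd()))
# ... and the CUDAKernels subpackage from the same monorepo checkout,
# so the resolver never reaches for a registered CUDAKernels release.
Pkg.develop(PackageSpec(path = joinpath(pwd(), "lib", "CUDAKernels")))
```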

@vchuravy (Member, Author)

bors try

bors bot added a commit that referenced this pull request Feb 11, 2021
@bors (Contributor) commented Feb 11, 2021

> try

Build failed:

@vchuravy (Member, Author)

bors try

bors bot added a commit that referenced this pull request Feb 11, 2021

@vchuravy (Member, Author)

bors try

bors bot added a commit that referenced this pull request Feb 11, 2021

@vchuravy (Member, Author)

bors try

bors bot added a commit that referenced this pull request Feb 11, 2021

@tkf mentioned this pull request Feb 11, 2021
@vchuravy (Member, Author)

bors try

bors bot added a commit that referenced this pull request Feb 12, 2021
From the CI pipeline configuration under review:

```yaml
commands:
  - julia --project=test -e 'using Pkg; Pkg.develop(PackageSpec(path=pwd()))'
  - julia --project=test -e 'using Pkg; Pkg.develop(PackageSpec(path=joinpath(pwd(),"lib","CUDAKernels")))'
  - julia --project=test --color=yes --check-bounds=yes --code-coverage=user --depwarn=yes test/runtests.jl
```
Review comment (Collaborator):

Just for the record: it should be possible to put back JULIA_LOAD_PATH=@ once a fix like JuliaPackaging/BinaryBuilder.jl#1007 is implemented.

(But using JULIA_LOAD_PATH=@ was just my preference for making sure that test is reproducible; i.e., avoid accidentally using @stdlib and @v#.#. It's not a strict requirement to make CI work.)
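
If it helps, a sketch of what restoring that would look like in the pipeline environment (assuming a Buildkite-style env block):

```yaml
env:
  # Restrict the load path to the active project only, so tests cannot
  # accidentally resolve packages from @stdlib or the default @v#.# environment.
  JULIA_LOAD_PATH: "@"
```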

@vchuravy (Member, Author)

@DilumAluthge do you remember why we have three different CI files? Is it because of bors?

@DilumAluthge (Collaborator)

It's for the status badges in the README, I believe. You cannot generate more than one status badge for a single GHA workflow file.

@vchuravy (Member, Author)

bors r+

bors bot merged commit 4821d27 into master on Feb 12, 2021
bors bot deleted the vc/split branch on February 12, 2021 05:43