
Can't work with Pluto.jl #178

Open
Vcholerae1 opened this issue Nov 3, 2024 · 6 comments

@Vcholerae1 commented Nov 3, 2024

I want to write code in Pluto.jl, but I ran into this issue.
[screenshot of the error shown in Pluto.jl]

@omlins (Owner) commented Nov 4, 2024

@Vcholerae1 Thanks for reporting the issue. Does this issue occur only when you rerun the notebook, or already the first time you run it?

@Vcholerae1 (Author)

Hello @omlins

Thank you for your response!

I encountered this issue the very first time I ran the notebook.

Additionally, I'm experiencing another problem in Pluto.jl. When I use `using ParallelStencil`, it doesn't automatically import CUDA. (This might be a design limitation of Pluto.jl, but I'm not entirely sure, as I'm still new to Julia 😊.) When I explicitly add the CUDA dependency, running `@init_parallel_stencil` results in the following warning:

Module Data from previous module initialization found in caller module (Main.var"workspace#12");
module Data not created. Note: this warning is only shown in non-interactive mode.

Then I can't use the `@fill` macro to unleash the power of ParallelStencil.jl; the error message is:

NotInitializedError: no ParallelStencil macro or function can be called before @init_parallel_stencil in each module (missing call in Main.var"workspace#14").

Stack trace
Here is what happened, the most recent locations are first:

check_initialized(caller::Module) @ init_parallel_stencil.jl:94
var"@fill"(__source__::LineNumberNode, __module__::Module, args::Vararg{Any}) @ init_parallel_stencil.jl:8
#macroexpand#66 @ expr.jl:122
macroexpand @ expr.jl:120
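
For reference, here is a minimal sketch of the Pluto cells that reproduce this pattern. The exact cell contents aren't shown above, so this is only an illustration, assuming the standard ParallelStencil setup for the CUDA backend (the array name and dimensions are made up):

```julia
# Cell 1: load the packages (in Pluto, CUDA apparently has to be loaded explicitly).
using ParallelStencil
using CUDA

# Cell 2: initialize ParallelStencil for the CUDA backend, Float64, 3D.
# In Pluto this triggers the "Module Data ... not created" warning shown above.
@init_parallel_stencil(CUDA, Float64, 3)

# Cell 3: allocate an array filled with a constant value.
# This is where the NotInitializedError is raised, presumably because the
# initialization did not take effect in the notebook's workspace module.
A = @fill(0.0, 16, 16, 16)
```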

I am willing to help with testing related to this issue.

@omlins (Owner) commented Nov 5, 2024

Could you please try `__init__() = @init_parallel_stencil(CUDA, Float64, 3)` instead of just `@init_parallel_stencil(CUDA, Float64, 3)`?
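
In a Pluto cell, that would look roughly like this (illustrative sketch; the arguments are the ones used in this thread):

```julia
using ParallelStencil
using CUDA

# Wrap the initialization in __init__() so it runs at module-initialization
# time instead of being executed directly at top level in the notebook cell.
__init__() = @init_parallel_stencil(CUDA, Float64, 3)
```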

@albert-de-montserrat (Contributor)

@omlins I already tried that without success; I got the same error.

@Vcholerae1 (Author)

@omlins I got the same result.

@omlins (Owner) commented Nov 8, 2024

The root cause of this issue could be the same as in #167. Waiting for feedback from @timholy.
