Should `Game::update` receive `&mut Window`? #103
The idea here is that `update` should stay focused on game logic. The current design in Coffee gets us halfway there: you can't draw or talk to the GPU in `update`. For your particular use case, I would probably generate and upload the game assets during `load`. In any case, thank you for sharing your use case! I will keep it in mind!
I don't know yet, I'm just beginning with this (amazing) crate! But here's my guess: my map will not be static and will actually depend on game logic, so its generation should happen on `update`. However, map generation can be done completely on the CPU. And indeed, I would be happy with async/await support for asset loading (in my head it would work like this: I start uploading to the GPU asynchronously on `update` and use the result once the upload completes).
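As a concrete illustration of the CPU-only generation idea above, here is a minimal sketch in plain std Rust, with no Coffee types; names like `MapPixels` and `poll_generation` are made up for illustration. It generates map data on a worker thread and polls for the result during `update` without blocking a frame:

```rust
use std::sync::mpsc::{channel, Receiver, TryRecvError};
use std::thread;

/// CPU-side map data, ready to be turned into a GPU texture later.
struct MapPixels {
    width: u32,
    height: u32,
    rgba: Vec<u8>,
}

/// Kick off procedural generation on a worker thread and return a handle
/// that the game loop can poll without blocking.
fn start_generation(width: u32, height: u32) -> Receiver<MapPixels> {
    let (sender, receiver) = channel();
    thread::spawn(move || {
        // Expensive, purely CPU-bound work goes here.
        let rgba = vec![0u8; (width * height * 4) as usize];
        let _ = sender.send(MapPixels { width, height, rgba });
    });
    receiver
}

struct MyGame {
    pending_map: Option<Receiver<MapPixels>>,
    generated_map: Option<MapPixels>,
}

impl MyGame {
    /// Called from the game's `update`: check whether generation finished,
    /// but never block the frame waiting for it.
    fn poll_generation(&mut self) {
        if let Some(receiver) = self.pending_map.take() {
            match receiver.try_recv() {
                Ok(pixels) => self.generated_map = Some(pixels),
                // Still generating: put the receiver back and try again next frame.
                Err(TryRecvError::Empty) => self.pending_map = Some(receiver),
                // Worker thread is gone; drop the handle.
                Err(TryRecvError::Disconnected) => {}
            }
        }
    }
}
```

The `try_recv`/put-back dance keeps `update` non-blocking; the finished `MapPixels` still has to wait for GPU access before it can become a texture, which is exactly the question in this issue.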
But, more philosophically, I'm thinking about this: what I really want to do, eventually, is move some game logic to the GPU by writing custom shaders. But yeah, that would make it harder to test (I don't know of any CI with GPU support).
Oh yes, I'm actually using …
And what method could do this? You mean something like …?
Will you procedurally generate every pixel of the map? That sounds very expensive. Normally, you generate a map as a bunch of game entities or terrain. I have a prototype game that does procedural generation and lazy chunk loading. When it comes to rendering, all I do is query the chunks of the map that are visible, assign a sprite to each terrain tile and entity, and draw them all at once using a `Batch`.
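For reference, a rough sketch of the "query visible chunks, assign sprites, draw them in one batch" approach described above. `Chunk` and `SpriteInstance` are made-up placeholder types rather than Coffee's own, and the final draw through something like Coffee's `Batch` is only referenced in a comment:

```rust
use std::collections::HashMap;

const CHUNK_SIZE: i32 = 32; // tiles per chunk side (illustrative)

struct Chunk {
    // One terrain id per tile, row-major.
    tiles: Vec<u16>,
}

/// What would be handed to a sprite batch: a source tile in the
/// texture atlas plus a world position.
struct SpriteInstance {
    atlas_index: u16,
    x: f32,
    y: f32,
}

/// Collect the sprites for every chunk that intersects the camera rectangle.
/// In a batched renderer, each of these would become one entry in the batch,
/// followed by a single draw call.
fn visible_sprites(
    chunks: &HashMap<(i32, i32), Chunk>,
    camera: (i32, i32, i32, i32), // (min_x, min_y, max_x, max_y) in chunk coords
) -> Vec<SpriteInstance> {
    let (min_x, min_y, max_x, max_y) = camera;
    let mut sprites = Vec::new();

    for cx in min_x..=max_x {
        for cy in min_y..=max_y {
            // Chunks outside the generated area are skipped here;
            // they could instead be generated lazily on demand.
            let Some(chunk) = chunks.get(&(cx, cy)) else { continue };

            for (i, &tile) in chunk.tiles.iter().enumerate() {
                let tx = i as i32 % CHUNK_SIZE;
                let ty = i as i32 / CHUNK_SIZE;
                sprites.push(SpriteInstance {
                    atlas_index: tile,
                    x: (cx * CHUNK_SIZE + tx) as f32,
                    y: (cy * CHUNK_SIZE + ty) as f32,
                });
            }
        }
    }

    sprites
}
```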
Yes. Keep in mind that …
My current plan is to first generate a map skeleton using petgraph, then lay it out on a lower-resolution image, then do a bunch of stuff on top of it like maze generation (each pixel would roughly correspond to a tile in size), then scale it up to a higher-resolution image and add finer detail (similar to this; the previous steps of the pipeline would provide the rough shapes for it to work with). Yes, this is supposed to be expensive: that's why it's fun! I'm inspired by some stunning stuff produced by the contemporary demoscene (which is surprisingly fast even on modest hardware). If this project grows I might need to make heavy use of rayon and SIMD on the CPU side and move a lot of stuff to shaders on the GPU side, and at that point it might outgrow Coffee (but knowing me, I'm much more likely to abandon the project long before then, so I'm fine 😅).
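A compressed sketch of what the skeleton of such a pipeline could look like with petgraph and the image crate; every interesting step (maze carving, corridors, noise, decoration) is stubbed out, so this only shows the data flowing graph → low-resolution mask → upscaled detail pass, not the actual generation:

```rust
use image::{imageops, GrayImage, Luma};
use petgraph::graph::UnGraph;

/// Step 1: a map skeleton as a graph of rooms/areas and connections.
fn map_skeleton() -> UnGraph<(u32, u32), ()> {
    let mut graph = UnGraph::new_undirected();
    let a = graph.add_node((8, 8));
    let b = graph.add_node((40, 8));
    let c = graph.add_node((24, 40));
    graph.add_edge(a, b, ());
    graph.add_edge(b, c, ());
    graph
}

/// Step 2: lay the skeleton out on a low-resolution mask (one pixel ~ one tile).
fn layout(graph: &UnGraph<(u32, u32), ()>, width: u32, height: u32) -> GrayImage {
    let mut mask = GrayImage::new(width, height);
    for node in graph.node_indices() {
        let (x, y) = graph[node];
        if x < width && y < height {
            mask.put_pixel(x, y, Luma([255]));
        }
    }
    // Maze generation and corridors between connected nodes would go here.
    mask
}

/// Step 3: upscale and add finer detail on the high-resolution image.
fn refine(mask: &GrayImage, scale: u32) -> GrayImage {
    let (w, h) = mask.dimensions();
    let detailed = imageops::resize(mask, w * scale, h * scale, imageops::FilterType::Nearest);
    // Noise, erosion and decoration passes would run on `detailed` here.
    detailed
}

fn main() {
    let graph = map_skeleton();
    let rough = layout(&graph, 64, 64);
    let detailed = refine(&rough, 8);
    println!("generated {}x{} map", detailed.width(), detailed.height());
}
```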
That looks great! Be aware that Coffee does not support custom shaders (see #57), so it may not be a good fit! In any case, I still think you should be able to keep a logical representation of the masks and then map them to actual colors (images) in `draw`. This is what I personally do to implement my minimap, where I draw a LOT of individual pixels. You could also distribute the CPU load across multiple frames with an animation.
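A small, Coffee-agnostic sketch of the "distribute the CPU load across multiple frames" suggestion: a generator that advances by a fixed budget each time it is stepped, so it can be driven once per `update`. All names here are illustrative:

```rust
/// A procedural generation job that can be advanced a little at a time,
/// so a single frame never pays the full cost.
struct IncrementalGenerator {
    next_pixel: usize,
    total_pixels: usize,
    mask: Vec<u8>,
}

impl IncrementalGenerator {
    fn new(width: usize, height: usize) -> Self {
        IncrementalGenerator {
            next_pixel: 0,
            total_pixels: width * height,
            mask: vec![0; width * height],
        }
    }

    /// Do at most `budget` pixels worth of work. Call this once per `update`;
    /// returns `true` once the whole mask has been generated.
    fn step(&mut self, budget: usize) -> bool {
        let end = (self.next_pixel + budget).min(self.total_pixels);
        for i in self.next_pixel..end {
            // Placeholder for the real per-pixel work (noise, maze carving, ...).
            self.mask[i] = (i % 256) as u8;
        }
        self.next_pixel = end;
        self.next_pixel == self.total_pixels
    }
}
```

Once `step` reports completion, the finished mask can be mapped to colors and drawn, which keeps the per-frame cost bounded.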
I note that `Game` has the following methods: … But `load::Task::run` can only be called if I have a `&mut Window`, so I can only call it on `interact`, not on `update`. Is this by design? Is the rationale for that written down somewhere?

I suppose that, since `Game::update` can potentially be called more times than `Game::interact`, it might not allow calling `Task::run` because it would be expensive. However, what if I'm procedurally generating GPU assets on `Game::update` and want to send them to the GPU ASAP? Should I store them somewhere and wait for the next `Game::interact`? (This looks ugly.)
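A sketch of the "store it somewhere and wait for the next `Game::interact`" workaround being asked about. Note that `Window`, the method signatures, and `upload_to_gpu` are placeholders standing in for Coffee's real types and for whatever `Task` ends up doing the upload, so this only shows the data flow:

```rust
// Placeholders; not Coffee's real API.
struct Window; // stand-in for a window/GPU handle

/// CPU-side asset produced during `update`.
struct GeneratedAsset {
    rgba: Vec<u8>,
    width: u32,
    height: u32,
}

/// GPU-side handle produced once `&mut Window` is finally available.
struct GpuAsset;

fn upload_to_gpu(_window: &mut Window, _asset: &GeneratedAsset) -> GpuAsset {
    // In Coffee this is roughly where a loading `Task` would be run.
    GpuAsset
}

struct MyGame {
    pending_upload: Option<GeneratedAsset>,
    map_texture: Option<GpuAsset>,
}

impl MyGame {
    /// Roughly `Game::update`: no window access, so only queue the work.
    fn update(&mut self) {
        if self.pending_upload.is_none() && self.map_texture.is_none() {
            self.pending_upload = Some(GeneratedAsset {
                rgba: vec![0; 64 * 64 * 4],
                width: 64,
                height: 64,
            });
        }
    }

    /// Roughly `Game::interact`: `&mut Window` is available, so flush the queue.
    fn interact(&mut self, window: &mut Window) {
        if let Some(asset) = self.pending_upload.take() {
            self.map_texture = Some(upload_to_gpu(window, &asset));
        }
    }
}
```

The generated asset simply sits in `pending_upload` until the next `interact`, which is exactly the awkwardness described in the question.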