gltf extras component #2153
Comments
An alternative would be to register extras with the glTF loader such that every extra results in a typed component specific to that extra. This would be similar to …
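A rough sketch of what such a typed component could look like, assuming a loader-side registration hook that doesn't exist yet (not shown); the component and property names here are made up for illustration:

```rust
use bevy::prelude::*;
use serde::Deserialize;

// Hypothetical typed component that a registered extras handler could produce
// for nodes carrying a matching entry in their glTF extras.
#[derive(Component, Deserialize, Debug)]
struct PhysicsSettings {
    mass: f32,
    restitution: f32,
}

// The deserialization step such a handler would run against the raw extras JSON.
fn parse_physics_settings(raw_extras: &str) -> Option<PhysicsSettings> {
    serde_json::from_str(raw_extras).ok()
}
```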
Yeah, that would work. I'm very new to Bevy, so forgive me if that isn't phrased correctly.
Could you share a glTF file with those extras set?
The one I'm working on is very large, so I made a simple example file of a cube with one 'String' property and one array property (as well as one from BlenderKit). I have no problem sharing the larger one if that helps, but it only has a single property set on one of the mesh objects. GitHub wouldn't let me upload a .gltf file, so here it is as .txt. I'm not sure exactly what schema I would use; for now I only want to create simple colliders with rapier3d.
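For that use case, a minimal sketch of a system that reads the raw extras JSON and attaches a Rapier collider might look like the following. It assumes the `GltfExtras` component that the linked PR adds (raw JSON string in its `value` field); the `half_extents` property name is invented for the example:

```rust
use bevy::prelude::*;
use bevy::gltf::GltfExtras;
use bevy_rapier3d::prelude::*;

// Sketch: whenever an entity gains a `GltfExtras` component, parse its raw JSON
// and, if a "half_extents" array is present, attach a matching cuboid collider.
// "half_extents" is a made-up custom property an artist might set in Blender.
fn attach_colliders(
    mut commands: Commands,
    query: Query<(Entity, &GltfExtras), Added<GltfExtras>>,
) {
    for (entity, extras) in query.iter() {
        let Ok(json) = serde_json::from_str::<serde_json::Value>(&extras.value) else {
            continue;
        };
        let half: Vec<f32> = json
            .get("half_extents")
            .and_then(|v| v.as_array())
            .map(|a| a.iter().filter_map(|v| v.as_f64()).map(|v| v as f32).collect())
            .unwrap_or_default();
        if let [x, y, z] = half[..] {
            commands.entity(entity).insert(Collider::cuboid(x, y, z));
        }
    }
}
```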
fixes bevyengine#2153 expose the `extras` field value as a string
What problem does this solve or what need does it fill?
In my game I would like to store a few variables in the glTF "extras" field in Blender and use them to set physics properties on the created entities.
What solution would you like?
Create a struct to store extras data, something like:
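(The original snippet isn't preserved here; this is a placeholder sketch, with the name and field type as assumptions.)

```rust
use bevy::prelude::*;
use std::collections::HashMap;

// Placeholder sketch: a component carrying a node's glTF extras as key/value pairs.
// The fix that eventually landed stores the raw extras JSON as a plain String instead.
#[derive(Component, Debug, Clone, Default)]
pub struct GltfExtras {
    pub values: HashMap<String, serde_json::Value>,
}
```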
and add it when creating entities in the glTF loader. I'm not sure what the best type to store the data would be; according to the glTF spec, extras should be represented as an object whenever possible, so either a hashmap or something similar.
What alternative(s) have you considered?
I don't think there is another way to do this, but I would welcome suggestions if there are any.
Additional context