What would you like to see in next gen materials? #1442
Visibility keyframes -- they make possible small cache-style animations of individual frames.
Please add official support for lightmaps 👍 Popular 3D engines all have lightmaps in their PBR materials. It's an essential feature for achieving much better lighting quality and performance for static objects (benefiting not just the game industry but also product visualization, arch viz, ...). @stevenvergenz already created an extension for it here: #1017
MDL support would be great. https://developer.nvidia.com/mdl-sdk
See https://blog.selfshadow.com/publications/s2017-shading-course/drobot/s2017_pbs_multilayered.pdf
If that's useful, we have a document with real-time definitions for clear coat, anisotropy, and cloth which can be added to existing PBR engines: https://google.github.io/filament/Filament.md.html
Photogrammetry users of Cesium have a slight request for refraction and reflection. The latter, I think, is already covered well enough by today's glTF PBR.
See also, for non-photorealistic rendering:
Handle UV tiles/UDIMs in the specification, as described here for Blender (https://developer.blender.org/T55435); we essentially only need a level between a glTF "image" and a "texture".
UV tiles are just the idea of having multiple image files cover the UV space, whereas UDIM is more of a naming scheme. More info here:
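For context, the UDIM naming convention itself is simple; a minimal sketch (the function names are just for illustration):

```ts
// Map integer UV tile coordinates (u >= 0, v >= 0, with u < 10) to a UDIM tile number.
// Tile (0, 0) -> 1001, (1, 0) -> 1002, ..., (0, 1) -> 1011, and so on.
function udimTileNumber(u: number, v: number): number {
  return 1001 + u + 10 * v;
}

// Conventional file naming, e.g. "diffuse.1002.png" for tile (1, 0).
function udimFileName(base: string, u: number, v: number, ext = "png"): string {
  return `${base}.${udimTileNumber(u, v)}.${ext}`;
}

console.log(udimFileName("diffuse", 1, 0)); // "diffuse.1002.png"
```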
@jbouny I was wondering if anyone would ask for UDIMs; I've been interested in them for a while. But I'm concerned that they're not very compatible with realtime shaders. In that Allegorithmic documentation page you linked to, there's a section titled "Are UDIM compatible with game engines?" which ends with this assessment: "Therefore using UDIMs for real-time rendering is not recommended for the moment." Of course, glTF is meant to be a general delivery format for end-user consumption, be it realtime or not. Currently, though, a large percentage of the glTF ecosystem is realtime interactive, so ecosystem support for UDIMs could be limited. Even so, it's the sort of thing that could be specified as a glTF extension fairly easily, for any implementations willing to take on the UDIM challenge.
While it's cool to see so much momentum around glTF, there's also a danger that it becomes unusable because it is considered too complex at some point... Thinking about COLLADA, or even X3D, I believe it's very important to keep the standard neat and lean. Or, in other words: if the mission of glTF is to be the "JPEG for 3D", let's be very careful not to make it the "SVG/PSD/AVI... for 3D" all at the same time, because that won't work, and some other format will just come and replace glTF in certain domains where it is considered too bloated to be usable.

With that said, it would probably be useful to focus on glTF's mission as a lean, ready-to-render delivery format, and less on artistic aspects that would be more relevant for an editing/exchange format. So, if 95% of engines supported UDIM, for example, it would make sense to have it in glTF; otherwise one should probably take another detailed look to figure out why it's not broadly used (yet?) and hold off on adoption into the standard until there's broad support and demand for the feature itself in real-time rendering applications. Again, just an example and some general thoughts; I don't want to start a big discussion about UDIM here... that should probably be a different thread.

Would be interested to learn what kind of non-glTF PBR texture maps or material properties the large user community of Sketchfab (@AurL?) is making heavy use of - it's always nice to have statistics ;-)
@sbtron would you be able to post a summary of the discussion thus far?
Will try to summarize the discussion so far (please correct me if I've missed anything and I will update this comment). I was initially thinking about this from the perspective of what material properties are missing from the PBR (BSDF) style of materials, but it looks like there are some broader material requests (and maybe we should split them apart into multiple issues). Categories so far:
- Additional properties from Disney principled BRDF
- Transparency
- Fabric/Cloth
- Additional Maps
- Non Photo Realistic
A couple of other comments: ensuring that the materials work for realtime interactive types of scenarios is certainly an important consideration, and maybe the next step is to evaluate current engine support and usage, similar to what was done in #696. Updated (10/24/18): Recategorized capabilities and added an Engine support column.
Here's one I lost track of: Parallax Occlusion using height maps. #948
I just came here to add a reference to #948 :)
Updated comment above with additional capabilities and a column for engine support.
Not sure Sketchfab could fit in the list, but here's what we are supporting on transparency:
@cess This is slightly tangential, but have you seen ADOBE_materials_thin_transparency? So far it's only supported by Adobe Dimension and BabylonJS, but it can be adopted by a wider audience if it's useful.
@emackey Thanks, I didn't know of that extension!! Something like ADOBE_materials_thin_transparency would be along the lines of what I'd like to have. However, maybe there's some newer research with a more powerful BSDF that is also still simple in its number of parameters. I mean, glTF should be kept very simple, yet with enough data to ensure that most of the content in a PBR scene can be transmitted (even if transmission means some sort of conversion to a different lighting model, provided such conversion is physically plausible).
Hi, I just found this thread. I've been working on my glTF I/O library, and I've found the way materials are defined in glTF v2 a bit hard to use. For ease of use, I've exposed a simplified approach to materials in my library, but it could also be used as a base design for the next materials revision. Right now, the current glTF v2 specification looks like this:
I would propose something like this:
ShaderStyle is just a string property; it does not define any shader code, it just states how to interpret and constrain the collection of channels that comes next. It could also be defined as an enumeration. Notice that every MaterialChannel would be indexed in a dictionary inside Material (pretty much like the VertexAccessors of a MeshPrimitive). Also, the fallback material would be a cheap way to fall back to a supported material, and the schema would require all non-standard materials to define a standard fallback material. I believe this architecture is robust, it's a simplification over the v2 schema (without losing any features), and it allows adding new materials without breaking compatibility (thanks to the fallback material) and, in many cases, without requiring extensions. Another advantage is that, in case we want animation support, having MaterialChannel as a generic sub-node of materials simplifies access to the different components of a material.
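To make the shape of the proposal concrete, here is a rough, hypothetical sketch (names and types are illustrative only, not a spec):

```ts
// Illustrative only: a simplified material schema along the lines described above.
interface TextureReference {
  index: number;      // index into the glTF textures array
  texCoord?: number;  // which TEXCOORD_n attribute to sample with
}

interface MaterialChannel {
  factor?: number[];          // scalar or color factor, e.g. [1, 1, 1, 1]
  texture?: TextureReference; // optional texture for this channel
}

interface Material {
  name?: string;
  // States how to interpret the channels; does not define shader code.
  shaderStyle: "MetallicRoughness" | "SpecularGlossiness" | "Unlit" | string;
  // Channels keyed by name, much like a MeshPrimitive's vertex accessors.
  channels: { [channelName: string]: MaterialChannel };
  // Required for non-standard shader styles: a standard material to fall back to.
  fallback?: Material;
}

// Example usage of the hypothetical schema:
const gold: Material = {
  shaderStyle: "MetallicRoughness",
  channels: {
    BaseColor: { factor: [1.0, 0.77, 0.34, 1.0] },
    MetallicRoughness: { factor: [1.0, 0.3] },
  },
};

console.log(JSON.stringify(gold, null, 2));
```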
I've gathered my notes on available material parameters from various sources that are under consideration as possible inspiration for a next-generation PBR material definition in glTF. I looked at the Dassault Systèmes Enterprise PBR Shading Model and the Autodesk Standard Surface, and I'll compare against what's currently in glTF 2.0 and with the Blender 2.80 Principled BSDF node inputs (when I mention "Blender" below, I mean this version of this node specifically). I don't consider myself an expert in using all these parameters, so I haven't attempted to explain whether similarly-named ones work the same way or not. If anyone can add additional comparisons or details, please feel free.

In each of these systems, the advanced PBR parameters can generally be broken up into groups. I'll present each group with a heading, roughly ordered by what I think might be the likelihood of adoption.

Coat (possibly Clear Coat)

All systems that I mentioned above support this, and offer a scalar value that enables the effect. Additionally they offer a roughness value for the coat, and a normal map for the coat. These three settings could easily go into a new glTF extension (as texture + factor in the typical glTF fashion). Some systems, including Enterprise PBR and Blender, call this "Clear Coat" and do not offer any tint for it; Autodesk Standard Surface offers a coat color.

Anisotropy

Enterprise PBR and Standard Surface, along with Blender and others, agree on a convention using one float for the amount of anisotropy, and another float for the "rotation" (which I believe is a rotation delta from the +U texture direction). Ben Houston mentions in a comment that it would be better (more modern / more artist-friendly) to do what Sketchfab does, supplying a vector map (similar to a normal map) instead of scalars. I'm not sure if this makes it more friendly to real-time renderers as well, but I can easily see why some artists like this better. Still, going this route breaks compatibility with anisotropy in both Standard Surface and Enterprise PBR. Enterprise PBR and Blender each have a single anisotropy setting with a single rotation on it. Standard Surface offers no less than 4 places for anisotropy.

Specular (the typically monochrome kind)

"Specular" is an over-used term in our industry. I'm not talking about the classic workflow, nor am I referring to the specular/glossiness workflow. No, I'm talking about a "specular" component of the Metallic/Roughness PBR workflow. This component is typically monochrome, but some implementations offer a "specular tint" for it. In glTF 2.0 core, this parameter was overlooked, and F0 was hard-coded to 0.04. But not all artists are satisfied with that. This component is used to tweak the F0 factor of the metal/rough PBR workflow, while staying within that workflow. Enterprise PBR and Standard Surface both offer a scalar here, and separately offer a full-color specular tint. Blender, by comparison, offers a scalar for "Specular Tint" that merely chooses between white (un-tinted) or the base color (tinted). (I have personal doubts about the usefulness of "Specular Tint", for whatever that's worth, but I do think Specular was overlooked as being part of the Metal/Rough core workflow.)

Sheen

Most real-time renderers don't attempt this, but path tracers find it invaluable for fabrics and clothing. Enterprise PBR offers a single scalar (on/off) here. Standard Surface has that, plus a sheen color and a sheen roughness. Blender has sheen on/off, plus a "sheen tint" (as a scalar selection between un-tinted and base-color-tinted).
Better Emission

The emissive channel in glTF 2.0 core is standard 8-bit PNG, with no option to raise the brightness above 1.0. There should be an extension to enable emissive images with high dynamic range of some kind, preferably of a similar kind to the ones used in the IBL extension. Standard Surface and Enterprise PBR both offer a color and a value. Enterprise PBR additionally offers a mode (POWER or EMITTANCE) and an "Energy Normalization" flag.

Better Volume / Transmission / Subsurface

As we know, Adobe contributed ADOBE_materials_thin_transparency here. Enterprise PBR includes "Subsurface Color", "Attenuation Color", "Attenuation Distance", "Index of Refraction", and a "Thin Walled" flag. Standard Surface includes many parameters related to this:
Blender includes "Subsurface", "Subsurface Color", "Subsurface Radius" (3 components, like Standard Surface), IOR, "Transmission", and "Transmission Roughness". Perhaps transmission and subsurface can be broken into separate extensions, but it's not clear to me how much inter-dependency there is here. Neither one is particularly friendly to real-time rendering systems.

Missing Properties

Enterprise PBR, Standard Surface, and even Blender's Principled BSDF node all lack support for a pre-baked Ambient Occlusion channel, which is included in glTF 2.0 core. Generally these systems compute AO during path tracing. The remaining core parameters are included in each of these systems (although Standard Surface appears to have multiple roughness values, and I don't understand the differentiation).

Several users in this thread have also expressed interest in "detail maps", where a smaller texture is merged with a larger texture in the same channel, such that the final texture appears to have both large and small details at a wide range of distances. Detail maps are not directly addressed by the Enterprise PBR or Standard Surface documentation, but presumably could be (or are) implemented outside of that, prior to feeding the resulting merged texture into the material system. This could be its own extension. Also, none of these systems address displacement maps. But I could imagine that might happen outside of the material system as well, and could be worthy of its own glTF extension.

Conclusion

Really I'm trying to lay out the options, not draw any profound conclusion from all this. But as you may have already guessed, there does not seem to be any option that offers 1:1 parity with all of the different material systems available. Certainly a "monolithic" extension could include 1:1 mappings with any one chosen material system, at the expense of complete compatibility with all other available material systems. I suspect that part of glTF's core material popularity comes from its status as being near the least common denominator (LCD) for PBR systems, and we could extend that with LCD extensions for clear coat, anisotropy, etc., allowing implementations that support those things to import and export at least the basic settings with ease.
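As an illustration of the "texture + factor" LCD-extension pattern mentioned above, a minimal hypothetical sketch of a clear-coat extension attached to a glTF material (the extension and property names are made up for this example, not a proposal):

```ts
// Hypothetical example of a factor + texture style clear-coat extension
// attached to a glTF material; names are illustrative only.
const material = {
  pbrMetallicRoughness: {
    baseColorFactor: [0.8, 0.1, 0.1, 1.0],
    metallicFactor: 0.0,
    roughnessFactor: 0.4,
  },
  extensions: {
    EXT_example_clearcoat: {                // hypothetical extension name
      clearcoatFactor: 1.0,                 // scalar that enables/scales the effect
      clearcoatRoughnessFactor: 0.1,        // roughness of the coat layer
      clearcoatNormalTexture: { index: 2 }, // separate normal map for the coat
    },
  },
};

console.log(JSON.stringify(material, null, 2));
```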
Thank you @emackey, awesome work!
Only a quick note on Anisotropy: the vector map approach is also much more filtering friendly.
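To illustrate why (a sketch, assuming the anisotropy angle is measured from the +U tangent direction): an angle map stores θ directly and wraps discontinuously at ±π, while a vector map stores (cos θ, sin θ) remapped to [0, 1], which texture filtering can blend smoothly.

```ts
// Sketch: encode an anisotropy rotation angle (radians) as a 2-channel
// direction "vector map" texel in [0, 1], similar to how tangent-space
// normal maps are stored. Linear filtering of these texels blends the
// underlying directions instead of jumping across the +/- pi seam.
function encodeAnisotropyDirection(angle: number): [number, number] {
  const dx = Math.cos(angle);
  const dy = Math.sin(angle);
  return [dx * 0.5 + 0.5, dy * 0.5 + 0.5];
}

// Decode back to a direction (not necessarily unit length after filtering).
function decodeAnisotropyDirection(r: number, g: number): [number, number] {
  return [r * 2.0 - 1.0, g * 2.0 - 1.0];
}

console.log(decodeAnisotropyDirection(...encodeAnisotropyDirection(Math.PI / 4)));
```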
For some history, this is the original thread discussing some of this:
Are you thinking a single factor or a texture with values or both for the F0 value for dielectrics? |
I was thinking the typical glTF way, an optional 1-channel texture and an optional float factor. I don't know off the top of my head how these correlate to F0, but it looks like 100% specular is nowhere near 100% F0. Someone would have to supply the math (probably Enterprise PBR or Standard Surface).
We really like the Enterprise PBR model for specular that uses a specularColor, an IOR (which specifies F0), as well as a specular intensity. It is very well thought out in terms of the flexibility that advanced artists want. Look at the equations that Dassault gives for it; it is actually excellent. Please **please** please do not simplify it to something less capable.

I would advocate for a sheen roughness, that would be really neat. But it isn't absolutely necessary.

> Only a quick note on Anisotropy: The vector map approach is also much more filtering friendly.

Our in-house artist team has flipped back and forth on the vector map approach versus the angular maps. The vector map does avoid the discontinuities that are in the angular map approach, and that is the best argument for it: no other map in glTF has discontinuities like the angular anisotropic maps, so I think we should not use that approach and should instead favor vector maps. I think that vector maps lack tooling support, though. But if glTF pushes this, the tooling should come relatively quickly.
To be clear, ThreeKit is actively contributing PBR Next features to Three.JS right now. I described our intent to do this here: mrdoob/three.js#16977
And we have so far done:
- Sheen: mrdoob/three.js#16971
- Clear Coat Normals: mrdoob/three.js#17079
- Transparency: mrdoob/three.js#16996 or mrdoob/three.js#17114
Left are:
- Anisotropy
- Subsurface
- Reflectivity color & Specular strength
Most of the following refers to the Babylon.js approach.

About anisotropy, we chose to support a direction instead of a rotation for filtering purposes: https://doc.babylonjs.com/how_to/physically_based_rendering_master#anisotropy We are using the same approach as Filament regarding the equations.

On the clear coat front, we allowed both the coat tint and normals, as we found them really great in a lot of cases: https://doc.babylonjs.com/how_to/physically_based_rendering_master#clear-coat We also support IOR as a parameter (with a fast path when the default is used), as it was sometimes helpful to better achieve the desired effect.

On the sheen side we support both the Dassault model and one where you can define a separate tint and roughness yourself: https://doc.babylonjs.com/how_to/physically_based_rendering_master#sheen Keeping both tint and roughness can really be helpful in some cases, but I agree that having less parameterization is also pretty compelling. Not being able to pick and choose between the two approaches led us to implement and keep both :-)
@sebavan and I had extensive discussions on the implementation and I'd love to hear from folks working on deferred renderers. The proliferation of parameters might become an issue for the size of the g-buffers there.
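To make the g-buffer concern concrete, a back-of-the-envelope sketch with made-up packing choices (not any particular engine's layout):

```ts
// Hypothetical g-buffer layout: bytes per pixel for each packed attribute group.
const baseGBuffer = {
  baseColor_ao: 4,      // e.g. RGBA8
  normal_roughness: 4,  // e.g. RGB10A2
  metallic_emissive: 4, // packed
};

// Extra parameters a "PBR Next" material set might force into the g-buffer.
const pbrNextExtras = {
  clearcoat_factor_roughness: 2,
  clearcoat_normal: 2,
  sheen_color_roughness: 4,
  anisotropy_direction_strength: 3,
  transmission_ior: 2,
};

const sum = (o: Record<string, number>) =>
  Object.values(o).reduce((a, b) => a + b, 0);

// At 1920x1080, each extra byte per pixel costs roughly 2 MB of g-buffer
// storage, plus the bandwidth to write and read it every frame.
console.log(`base: ${sum(baseGBuffer)} B/px, with extras: ${sum(baseGBuffer) + sum(pbrNextExtras)} B/px`);
```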
There's a ton of math in the Enterprise PBR doc, and I apologize for getting a bit lost in there. Can you point to the specific formula(s) where this is covered, or explain in a bit more depth? I hadn't realized IOR was directly tied to specular here; that's new to me. Also, would a new extension covering this be mutually exclusive (within the same material) with the existing specular/glossiness extension?
The relation between IOR and Fresnel reflectance at normal and grazing incidence is given in Sec. 1.1.3, eq. 24. In Enterprise PBR, the IOR is not only used to change the amount of reflection, but also to compute the corresponding refraction direction of light rays at the interface between two media, according to Snell's law. In case the material is applied to a non-thin-walled, closed mesh, the IOR parameter defines the refractive index of the enclosed volume. The IOR is not spatially-varying (if needed, use the specular parameter to vary the reflectivity spatially).
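For readers following along, the standard dielectric relation between IOR and reflectance at normal incidence (F0), assuming air on the outside, is the key piece; a minimal sketch:

```ts
// Fresnel reflectance at normal incidence for a dielectric in air (n_outside = 1).
// This is the standard relation; e.g. ior = 1.5 gives f0 = 0.04, the value
// hard-coded in glTF 2.0 core.
function f0FromIor(ior: number): number {
  const r = (ior - 1) / (ior + 1);
  return r * r;
}

console.log(f0FromIor(1.5));  // 0.04
console.log(f0FromIor(1.33)); // water, ~0.02
```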
The key smart parts of the Enterprise PBR model are the ones described above: a specularColor, an IOR (which specifies F0), and a specular intensity.
Basically, each of these gives a separate degree of control that cannot be mathematically reduced further. Artists want each of these, and they want them independently. If we go with this parameterization, there is less of a need for the specular/glossiness workflow, because these parameters allow for recreating the looks which that workflow enables and the more basic metalness/roughness workflow doesn't.
I would completely support something along the lines of NVIDIA MDL (which would be trivial to express in JSON), except that the BRDF blocks that can be composed in a DAG are already predefined, much like Mitsuba's BSDF plugins. Such a solution is feasible: my team has a material compiler from a DAG intermediate representation into a GPU state machine which optimizes out states that can never be transitioned into; right now its only frontend is Mitsuba XML and its only backend is GLSL. However, we aim to support NVIDIA MDL as a frontend, and CUDA, CUDA+OptiX (leveraging the SBT), and GLSL closest-hit shaders as extra backends. Furthermore, it would unify #1717, #1718 and #1719 into a single solution instead of fragmented extensions. CC: @vsnappy
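Purely as an illustration of the "predefined BSDF blocks composed in a DAG, expressed as plain data" idea (every node name below is hypothetical, not MDL, Mitsuba, or a glTF proposal):

```ts
// Hypothetical node-graph material: leaves are predefined BSDF lobes,
// inner nodes combine them; the whole graph is plain data, so it is
// straightforward to serialize as JSON.
type BsdfNode =
  | { type: "diffuse"; reflectance: [number, number, number] }
  | { type: "conductor"; roughness: number }
  | { type: "dielectric"; ior: number }
  | { type: "mix"; weight: number; a: BsdfNode; b: BsdfNode }
  | { type: "coat"; ior: number; base: BsdfNode };

// Example: a red plastic-like material, a mostly diffuse base under a clear coat.
const redPlastic: BsdfNode = {
  type: "coat",
  ior: 1.5,
  base: {
    type: "mix",
    weight: 0.1,
    a: { type: "diffuse", reflectance: [0.6, 0.05, 0.05] },
    b: { type: "conductor", roughness: 0.3 },
  },
};

console.log(JSON.stringify(redPlastic, null, 2));
```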
For those following along, please note that the definition of next-gen PBR materials is already underway, in the issues and PRs linked here: https://github.com/KhronosGroup/glTF/milestone/2 At this stage, I would recommend leaving feedback directly on those issues, rather than on this "catch-all" issue. The work is closely informed by Autodesk Standard Surface and the Dassault Systèmes Enterprise PBR Shading Model, and is broadly compatible with both. That definition will initially take the form of one extension per discrete feature (such as clearcoat, sheen, etc.). These PBR features may be consolidated into a core glTF version or a single unified extension down the road, as ecosystem support consolidates. We've also discussed things with NVIDIA MDL authors, and intend that MDL can be "distilled" down to equivalent factor/texture pairs in the short term. A more flexible node-based system like MaterialX or (full) MDL is not the immediate goal, but may be of longer-term interest.
Agreed. Sounds like this catch-all issue should be closed in favor of individual extensions in the PBR Next milestone. At some future point I'm sure we'll open a new issue for whatever comes after PBR Next, which could be something like MDL. Also at some point we should scavenge all the ideas here that didn't make it into our current view of PBR Next, so they can be reconsidered as well.
UDIM would be really nice: having the ability to apply an image texture to a specific UV tile (or at least that's my understanding of it). This would be nice for people like me who have materials shared across multiple objects but can't link them together because of that damn AO image texture (which is specific to each object). With UDIM I could move the UVs of each object by one tile and associate their respective AO maps with that specific tile in the same material.
UDIMs sound like a handy tool for artists, but I'm not convinced that a well-behaved glTF exporter should be writing them into a glTF file, where they become a problem for client runtimes to sort out when a glTF is received. For example, in the case where an artist has one material shared across several objects, and is using UDIMs to give each object its own AO map, the solution seems straightforward: The glTF exporter should make copies of that material, and in each one assign a different AO map but the same properties and maps in the remaining material parameters. Then, the client receiving the file doesn't see a UDIM, it just loads the appropriate AO map for each object as specified directly in glTF. The client has to make separate draw calls for separate objects anyway, so splitting up the material per object doesn't add cost there. So my impression is, UDIMs should be handled or decomposed by glTF exporters, and not shipped to clients in glTF itself.
In the case of a single mesh using a UDIM, the exporter could be faced with some work: It would have to split that mesh into multiple primitives, choosing which primitive each triangle ends up in based on which UDIM tile that triangle uses. The material would likewise be split up across primitives, one per UDIM. I'm thinking the process should be similar to what happens when content creation packages render UDIMs anyway, since a GPU has no idea what a UDIM is. So if the exporter didn't do this work, that means you're asking some battery-powered mobile device to do the work instead, and that's not good. Generally this is the strategy for any feature "X" that's not directly supported on a GPU or in a raw graphics API like WebGL and OpenGL. There are no UDIMs or whatever feature "X" in those APIs, but there are ways to implement them on top of the API. Follow that same strategy when exporting to glTF. Think of the exported data as being a snapshot of what will be sent to a remote GPU. That's the role of glTF.
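A minimal sketch of that exporter-side split, assuming triangles have already been cut so that none straddles a tile boundary (names are illustrative):

```ts
// Group triangle indices by the UDIM tile their first vertex's UV falls in,
// so each group can become its own glTF primitive with its own material.
type UV = [number, number];

function tileOf([u, v]: UV): number {
  return 1001 + Math.floor(u) + 10 * Math.floor(v); // UDIM tile number
}

function splitTrianglesByTile(indices: Uint32Array, uvs: UV[]): Map<number, number[]> {
  const groups = new Map<number, number[]>();
  for (let i = 0; i < indices.length; i += 3) {
    const tile = tileOf(uvs[indices[i]]);
    if (!groups.has(tile)) groups.set(tile, []);
    // When writing each primitive, UVs would also be rebased into [0, 1) per tile (not shown).
    groups.get(tile)!.push(indices[i], indices[i + 1], indices[i + 2]);
  }
  return groups;
}

console.log(splitTrianglesByTile(new Uint32Array([0, 1, 2]), [[0.2, 0.3], [0.4, 0.1], [0.9, 0.8]]));
```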
I agree with everything you said; UDIM can just be used inside Blender and translated at export into whatever best fits the format. That being said, I wonder if UDIMs are the best (or only) thing that could be done. The goal behind this is (at least for me) to be able to apply the same materials over multiple objects, and by extension to have linked materials across blend files (i.e. big material libraries). The average user wouldn't care, but for scalability this is huge! The thing is that when you never use texture atlases and only rely on seamless textures, the only remaining enemy preventing such a system from scaling is the AO texture. (I realize this is most likely beyond the scope of the exporter, and I'm working on scripting this kind of system for myself with the current exporter capabilities, but I thought I'd share this in case it gives someone a good idea.)
Hi, sorry to interrupt, but does it change anything?
There are other standards, like https://dassaultsystemes-technology.github.io/EnterprisePBRShadingModel/ and https://autodesk.github.io/standard-surface/
@abgrac Just to clarify the meaning here: in the Khronos PBR TSG we're trying to ensure compatibility, or at least some level of interoperability, with many materials standards, including ASWF's MaterialX, NVIDIA's MDL, 3DS Enterprise PBR, and Adobe's ASM. But the glTF PBR material definition is its own open standard, geared toward broad compatibility across platforms and rendering architectures, influenced and sometimes guided by these other standards and their authors. Like glTF itself, the glTF PBR material definition is geared toward final delivery of a finished asset.
There have been some discussions lately around an updated glTF material model to support additional capabilities. Please consider this a requirements gathering exercise to understand what kind of capabilities everyone would like to see in the next gen material model for glTF.
Below are just some suggestions, in no particular order:
Are there additional capabilities anyone would like to see? Any preferences or priorities on what we should go after next from the list above?
Without getting into details on how exactly we would implement this (for now) our high level goal would still be to ensure any new capability is implementable across all graphics APIs.