
What would you like to see in next gen materials? #1442

Closed
sbtron opened this issue Sep 5, 2018 · 63 comments

Comments

@sbtron
Contributor

sbtron commented Sep 5, 2018

There have been some discussions lately around an updated glTF material model to support additional capabilities. Please consider this a requirements gathering exercise to understand what kind of capabilities everyone would like to see in the next gen material model for glTF.

Below are just some suggestions in no particular order

  • Better Transparency (Refractions etc)
  • Subsurface scattering
  • Anisotropy
  • Coating

Are there additional capabilities anyone would like to see? Any preferences or priorities on what we should go after next from the list above?
Without getting into details on how exactly we would implement this (for now) our high level goal would still be to ensure any new capability is implementable across all graphics APIs.

@n1ckfg

n1ckfg commented Sep 6, 2018

Visibility keyframes: they make small, cache-style animations of individual frames possible.

@Ben-Mack

Ben-Mack commented Sep 7, 2018

Please add official support for Lightmap 👍

Popular 3D engines all have lightmap in their PBR material.

It's an essential feature for achieving much better lighting quality and performance for static objects (benefiting not just the game industry but also product visualization, arch viz, ...)

stevenvergenz already created an extension for it here: #1017

@vsnappy

vsnappy commented Sep 7, 2018

MDL support would be great. https://developer.nvidia.com/mdl-sdk

@UX3D-nopper
Contributor

  • Thin film
  • Multi-layer

see https://blog.selfshadow.com/publications/s2017-shading-course/drobot/s2017_pbs_multilayered.pdf

@romainguy

romainguy commented Sep 12, 2018

  • Backscattering (cloth)
  • Thin translucency

@romainguy

If that's useful, we have a document with real-time definitions for clear coat, anisotropy, and cloth that can be added to existing PBR engines: https://google.github.io/filament/Filament.md.html

@pjcozzi
Member

pjcozzi commented Sep 15, 2018

Photogrammetry users of Cesium have a slight request for refraction and reflection. The latter, I think, is already covered well enough by today's glTF PBR.

@emackey
Member

emackey commented Sep 19, 2018

See also:

@vyv03354

  • Toon shaded materials
  • Outlines

for non-photorealistic rendering.

@jbouny

jbouny commented Sep 21, 2018

Handle UV tiles/UDIMs in the specification.

As described in Blender (https://developer.blender.org/T55435), we essentially only need a level between a glTF "image" and a "texture":

a tiled image consists of multiple files, where each file covers one unit square in the UV space.

UV tiles are simply the idea of having multiple image files cover the UV space, whereas UDIM is more of a naming scheme.
It is widely used in the VFX world, so standard support, or at least an extension, would be a nice improvement.

More info here:
https://support.allegorithmic.com/documentation/spdoc/udim-144310352.html
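
The UDIM naming scheme itself is a simple, well-known mapping from UV tile to file number: tiles are numbered from 1001, increasing by 1 along U (10 columns per row) and by 10 along V. A minimal sketch in Python (the `udim_filename` helper and its `base.NNNN.png` pattern are just one common convention, not part of any spec):

```python
def udim_tile(u: float, v: float) -> int:
    """Map a UV coordinate to its UDIM tile number.

    Standard UDIM convention: tile 1001 covers UV square [0,1)x[0,1);
    numbers increase by 1 along U (10 columns per row) and by 10 along V.
    """
    return 1001 + int(u) + 10 * int(v)

def udim_filename(base: str, u: float, v: float) -> str:
    # A typical naming pattern, e.g. "diffuse.1002.png" (conventions vary).
    return f"{base}.{udim_tile(u, v)}.png"

print(udim_tile(0.5, 0.5))  # 1001
print(udim_tile(1.5, 0.0))  # 1002
print(udim_tile(0.2, 1.7))  # 1011
```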

@emackey
Member

emackey commented Sep 21, 2018

@jbouny I was wondering if anyone would ask for UDIMs; I've been interested in them for a while. But I'm concerned that they're not very compatible with realtime shaders.

In the Allegorithmic documentation page you linked to, there's a section titled "Are UDIM compatible with game engines?" which ends with this assessment: "Therefore using UDIMs for real-time rendering is not recommended for the moment."

Of course, glTF is meant to be a general delivery format, for end-user consumption be it realtime or not. Currently though, a large percentage of the glTF ecosystem is realtime interactive, so ecosystem support for UDIMs could be limited.

Even so, it's the sort of thing that could be specified as a glTF extension fairly easily, for any implementations that were willing to take on the UDIM challenge.

@mlimper
Contributor

mlimper commented Oct 4, 2018

While it's cool to see so much momentum around glTF, there's also a danger that it becomes unusable because it is considered too complex at some point... thinking about COLLADA, or even X3D, I believe it's very important to keep the standard neat and lean... or, in other words: if the mission of glTF is to be the "JPEG for 3D", let's be very careful not to make it the "SVG/PSD/AVI... for 3D" all at the same time, because that won't work, and some other format will just come along and replace glTF in domains where it is considered too bloated to be usable.

With that said, it would probably be useful to focus on glTF's mission as a lean ready-to-render delivery format, and less on artistic aspects that would be more relevant for an editing/exchange format... so, if 95% of engines would support UDIM, for example, it would make sense to have it in glTF, otherwise one should probably take another detailed look to figure out why it's not broadly used (yet?) and hold off with adoption into the standard until there's a broad support and demand for the feature itself in real-time rendering applications. Again, just an example and some general thoughts, don't want to start a big discussion about UDIM here... that should probably be a different thread.

Would be interested to learn what kind of non-glTF PBR texture maps or material properties the large user community of Sketchfab (@AurL?) is making heavy use of - it's always nice to have statistics ;-)

@pjcozzi
Member

pjcozzi commented Oct 9, 2018

@sbtron would you be able to post a summary of the discussion thus far?

@sbtron
Contributor Author

sbtron commented Oct 10, 2018

Will try to summarize the discussion so far (please correct me if I missed anything and I will update this comment).

I was initially thinking about this from the perspective of what material properties are missing from the PBR (BSDF) style of materials, but it looks like there are some broader material requests (so maybe we should split them apart into multiple issues).

Additional properties from Disney principled BRDF

| Capability | Unity | Unreal | Filament | three.js | Babylon.js | UX3D |
| --- | --- | --- | --- | --- | --- | --- |
| ClearCoat | | | | | | |
| Subsurface | | | | | | |
| Anisotropic | | | | | | |
| Sheen | | | | | | |

Transparency

| Capability | Unity | Unreal | Filament | three.js | Babylon.js | UX3D |
| --- | --- | --- | --- | --- | --- | --- |
| Additional alpha properties #1457 | | | | | | |
| IOR | | | | | | |
| Thin Film Transparency | | | | | | |

Fabric/Cloth

| Capability | Unity | Unreal | Filament | three.js | Babylon.js | UX3D |
| --- | --- | --- | --- | --- | --- | --- |
| Backscattering | | | | | | |

Additional Maps

| Capability | Unity | Unreal | Filament | three.js | Babylon.js | UX3D |
| --- | --- | --- | --- | --- | --- | --- |
| Lightmaps | | | | | | |
| DisplacementMaps | | | | | | |

Non Photo Realistic

| Capability | Unity | Unreal | Filament | three.js | Babylon.js | UX3D |
| --- | --- | --- | --- | --- | --- | --- |
| Toon Shaded | | | | | | |
| Outlines | | | | | | |

A couple of other comments:
UDIM support seems to be independent of extending the material definitions, so maybe we should track that separately.

Ensuring that the materials work for realtime interactive scenarios is certainly an important consideration, and maybe the next step is to evaluate current engine support and usage, similar to what was done in #696.

Updated (10/24/18): Recategorized capabilities and added engine support columns.

@emackey
Member

emackey commented Oct 10, 2018

Here's one I lost track of: Parallax Occlusion using height maps. #948

@MiiBond
Contributor

MiiBond commented Oct 12, 2018

I just came here to add a reference to #948 :)
Adobe will most likely be very interested in this in the coming months.

@sbtron
Contributor Author

sbtron commented Oct 24, 2018

Updated comment above with additional capabilities and a column for engine support.
It would be very useful if folks could leave comments with information about which capabilities specific engines support. I can keep the table updated as a summary.

@cedricpinson

Not sure Sketchfab fits in the list, but here is what we are supporting:

  • clearCoat
  • subSurface
  • lightmaps
  • anisotropy (coming soon)

On transparency:

  • dithering
  • refraction/ior
  • blending
  • additive
  • mask

@emackey
Member

emackey commented Jun 15, 2019

@cesss This is slightly tangential, but have you seen ADOBE_materials_thin_transparency? So far it's only supported by Adobe Dimension and BabylonJS, but it could be adopted by a wider audience if it's useful.

@cesss

cesss commented Jun 15, 2019

@emackey Thanks, I didn't know of that extension! Something like ADOBE_materials_thin_transparency would be along the lines of what I'd like to have. However, maybe there's some newer research with a more powerful BSDF that is also still simple in its number of parameters. I mean, glTF should be kept very simple, yet with enough data to ensure that most of the content in a PBR scene can be transmitted (even if transmission means some sort of conversion to different lighting models, provided such conversion is physically plausible).

@vpenades
Contributor

vpenades commented Jul 27, 2019

Hi, I just found this thread. I've been working on my glTF IO library, and I've found the way materials are defined in glTF v2 a bit hard to use. For ease of use, I've exposed a simplified approach to materials in my library, but it could also serve as a base design for the next materials revision.

Right now, the current glTF v2 specification looks like this:

Material
├── TextureInfo "Emissive"
├── MaterialNormalTextureInfo
│   └── TextureInfo "Normal"
├── MaterialOcclusionTextureInfo 
│   └── TextureInfo "Occlusion"
├── MaterialPBRMetallicRoughness
│   ├── TextureInfo "BaseColor"
│   └── TextureInfo "MetallicRoughness"
└── Extensions
    ├── MaterialUnlit
    └── MaterialPBRSpecularGlossiness
        ├── TextureInfo "Diffuse"
        └── TextureInfo "SpecularGlossiness"

I would propose something like this:

Material "Material1"
├── ShaderStyle = "Unlit"
├── MaterialChannel "Emissive"
│   └── TextureInfo
├── MaterialChannel "Occlusion"
│   └── TextureInfo
└── MaterialChannel "BaseColor"
Material "Material2"
├── ShaderStyle = "PBRMetallicRoughness"
├── FallbackMaterial= "Material1"
├── MaterialChannel "Emissive"
│   └── TextureInfo
├── MaterialChannel "Occlusion"
│   └── TextureInfo
├── MaterialChannel "Normal"
│   └── TextureInfo
├── MaterialChannel "BaseColor"
│   └── TextureInfo
└── MaterialChannel "MetallicRoughness"
    └── TextureInfo
Material "Material3"
├── ShaderStyle = "PBRSpecularGlossiness"
├── FallbackMaterial= "Material2"
├── MaterialChannel "Emissive"
│   └── TextureInfo
├── MaterialChannel "Occlusion"
│   └── TextureInfo
├── MaterialChannel "Normal"
│   └── TextureInfo
├── MaterialChannel "Diffuse"
│   └── TextureInfo
└── MaterialChannel "SpecularGlossiness"
    └── TextureInfo

ShaderStyle is just a string property, it does not define any shader code, it just states how to interpret and constrain the collection of channels that comes next. It could also be defined in an enumeration.

A MaterialChannel would be a combination of a TextureInfo and a named parameter... this parameter is typically associated with the TextureInfo and has a size of 1 to 4 floats... it's usually called "Color", "Strength", "Scale"... but it always boils down to a Float, a Vector3, or a Vector4 value.

Notice that every MaterialChannel would be indexed in a dictionary inside Material (pretty much like the VertexAccessors of a MeshPrimitive).

Also, the fallback material would be a cheap way to fall back to a supported material, and the schema would require all non-standard materials to define a standard fallback material.

I believe this architecture is robust; it's a simplification of the v2 schema (without losing any features), and it allows adding new materials without breaking compatibility (thanks to the fallback material) and, in many cases, without requiring extensions.

Another advantage is that, in case we want animation support, having MaterialChannel as a generic sub-node of materials, simplifies the access to different components of a Material for animations. So an animation channel could target Nodes and MaterialChannels indistinctly.
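
To make the proposal concrete, here is a hypothetical Python sketch of the channel dictionary and the fallback lookup it enables. `shaderStyle`, `fallbackMaterial`, and the channel layout are the names suggested above, not part of any glTF specification:

```python
# Hypothetical sketch of the channel-based material layout proposed above.
# None of these property names are part of the glTF specification.
material2 = {
    "name": "Material2",
    "shaderStyle": "PBRMetallicRoughness",
    "fallbackMaterial": "Material1",
    "channels": {
        # Each channel pairs an optional texture with a 1-4 float parameter.
        "BaseColor":         {"texture": 0, "parameter": [1.0, 1.0, 1.0, 1.0]},
        "MetallicRoughness": {"texture": 1, "parameter": [1.0, 1.0]},
        "Normal":            {"texture": 2, "parameter": [1.0]},  # scale
        "Occlusion":         {"texture": 3, "parameter": [1.0]},  # strength
        "Emissive":          {"texture": 4, "parameter": [0.0, 0.0, 0.0]},
    },
}

def resolve(material, materials_by_name, channel):
    """Walk the fallback chain until some material defines the channel."""
    while material is not None:
        if channel in material["channels"]:
            return material["channels"][channel]
        material = materials_by_name.get(material.get("fallbackMaterial"))
    return None
```

Note that `resolve` here falls back per-channel; a simpler reading of the proposal is for a loader that doesn't recognize a `shaderStyle` to discard the whole material and use its fallback wholesale.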

@emackey
Member

emackey commented Aug 19, 2019

I've gathered my notes on available material parameters from various sources that are under consideration as possible inspiration for a next-generation PBR material definition in glTF. I looked at the Dassault Systèmes Enterprise PBR Shading Model and the Autodesk Standard Surface, and I'll compare against what's currently in glTF 2.0, and with the Blender 2.80 Principled BSDF node inputs (when I mention "Blender" below, I mean this version of this node specifically). I don't consider myself an expert in using all these parameters, so I haven't attempted to explain whether similarly-named ones work the same way or not. If anyone can add additional comparisons or details, please feel free.

In each of these systems, the advanced PBR parameters can generally be broken up into groups
of about 2 to 5 related parameters each. I think these could make for natural boundaries between glTF extensions. Separating such groups into extensions would allow implementors to implement these separately, lowering the barrier to entry by breaking up an otherwise monolithic task, and also makes it possible for models to use one or more groups without requiring that the entire next-gen PBR implementation be present.

I'll present each group with a heading, roughly ordered by what I think might be the likelihood
of the existing glTF ecosystem attempting to embrace each one, with "Coat" being the most likely
(and most-requested) of any of these.

Coat (possibly Clear Coat)

All systems that I mentioned above support this, and offer a scalar value that enables the effect. Additionally they offer a roughness value for the coat, and a normal map for the coat. These three settings could easily go into a new glTF extension (as texture + factor in the typical glTF fashion).

Some systems, including Enterprise PBR and Blender, call this "Clear Coat" and do not offer any
way to tint the coat.

Autodesk Standard Surface offers a coat_color setting, as well as an Index of Refraction for the coat, and anisotropy settings on the coat itself. Standard Surface goes on to define that the tinted coat is applied on top of any emissive properties of the underlying material.
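
In the usual glTF texture + factor style, such a coat extension might be shaped like the following JSON (built here in Python; the extension name and property names are purely illustrative, not a ratified extension):

```python
import json

# Purely illustrative: "EXT_coat_example" and its properties mimic the usual
# glTF texture+factor pattern but are NOT a real, ratified extension.
material = {
    "name": "coated_plastic",
    "pbrMetallicRoughness": {
        "baseColorFactor": [0.8, 0.1, 0.1, 1.0],
        "metallicFactor": 0.0,
        "roughnessFactor": 0.4,
    },
    "extensions": {
        "EXT_coat_example": {
            "coatFactor": 1.0,                  # scalar enabling the coat
            "coatRoughnessFactor": 0.05,        # roughness of the coat layer
            "coatNormalTexture": {"index": 2},  # normal map for the coat
        }
    },
}

print(json.dumps(material["extensions"], indent=2))
```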

Anisotropy

Enterprise PBR and Standard Surface, along with Blender and others, agree on a convention using one float for the amount of anisotropy, and another float for the "rotation" (which I believe is a rotation delta from the +U texture direction). Ben Houston mentions in a comment that it would be better (more modern / more artist-friendly) to do what Sketchfab does, supplying a vector map (similar to a normal map), instead of scalars. I'm not sure if this makes it more friendly to real-time renderers as well, but I can easily see why some artists like this better. Still, going this route breaks compatibility with anisotropy in both Standard Surface and Enterprise PBR.

Enterprise PBR and Blender each have a single anisotropy setting with a single rotation on it. Standard Surface offers no less than 4 places for anisotropy: specular_anisotropy, subsurface_anisotropy, transmission_scatter_anisotropy, and coat_anisotropy.

Specular (the typically monochrome kind)

"Specular" is an over-used term in our industry. I'm not talking about the classic workflow, nor am I
talking about the usage covered by the KHR_materials_pbrSpecularGlossiness extension for the spec/gloss PBR workflow.

No, I'm talking about a "specular" component of the Metallic/Roughness PBR workflow. This component is typically monochrome, but some implementations offer a "specular tint" for it. In glTF 2.0 core, this parameter was overlooked, and F0 was hard-coded to 0.04. But, not all artists are satisfied with that. This component is used to tweak the F0 factor of the metal/rough PBR workflow, while staying within that workflow.

Enterprise PBR and Standard Surface both offer a scalar here, and separately offer a full-color specular tint. Blender, by comparison, offers a scalar for "Specular Tint" that merely chooses between white (un-tinted) or the base color (tinted). (I have personal doubts about the usefulness of "Specular Tint", for whatever that's worth, but I do think Specular was overlooked as being part of the Metal/Rough core workflow).

Sheen

Most real-time renderers don't attempt this, but path tracers find it invaluable for fabrics and clothing.

Enterprise PBR offers a single scalar (on/off) here.

Standard Surface has that, plus sheen_color and sheen_roughness.

Blender has sheen on/off, plus a "sheen tint" (as a scalar selection between un-tinted and base-color-tinted).

Better Emission

The emissive channel in glTF 2.0 core is standard 8-bit PNG, with no option to raise the brightness above 1.0. There should be an extension to enable emissive images with high dynamic range of some kind, preferably of a similar kind to the ones used in the IBL extension.

Standard surface and Enterprise PBR both offer a color and a value. Enterprise PBR additionally offers a mode (POWER or EMITTANCE), and an "Energy Normalization" flag.
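
One minimal way such an extension could work is a single HDR multiplier applied on top of the LDR emissive channel. A sketch, with `emissive_strength` as a hypothetical parameter name:

```python
# Sketch of an HDR emissive boost: multiply the 8-bit (LDR) emissive texel by
# the core emissiveFactor and a hypothetical scalar strength. The parameter
# name "emissive_strength" is illustrative, not from the glTF specification.
def emissive_radiance(texel_rgb, emissive_factor, emissive_strength=1.0):
    """Combine the LDR emissive texel, the core emissiveFactor, and an HDR scalar."""
    return tuple(t * f * emissive_strength
                 for t, f in zip(texel_rgb, emissive_factor))

# An 8-bit white texel boosted to 5x brightness:
print(emissive_radiance((1.0, 1.0, 1.0), (1.0, 1.0, 1.0), 5.0))  # (5.0, 5.0, 5.0)
```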

Better Volume / Transmission / Subsurface

As we know, Adobe contributed ADOBE_materials_thin_transparency here.

Enterprise PBR includes "Subsurface Color", "Attenuation Color", "Attenuation Distance", "Index of Refraction", and a "Thin Walled" flag.

Standard Surface includes many parameters related to this:

thin_film_thickness
thin_film_IOR
thin_walled
transmission
transmission_color
transmission_depth
transmission_scatter
transmission_scatter_anisotropy
transmission_dispersion
transmission_extra_roughness
subsurface
subsurface_color
subsurface_radius
subsurface_scale
subsurface_anisotropy

Blender includes "Subsurface", "Subsurface Color", "Subsurface Radius" (3 components, like Standard Surface), IOR, "Transmission", and "Transmission Roughness".

Perhaps transmission and subsurface can be broken into separate extensions, but it's not clear to me how much inter-dependency there is here. Neither one is particularly friendly to real-time rendering systems.

Missing Properties

Enterprise PBR, Standard Surface, and even Blender's Principled BSDF node all lack support for a pre-baked Ambient Occlusion channel, which is included in glTF 2.0 core. Generally these systems compute AO during path tracing. The remaining core parameters are included in each of these systems (although Standard Surface appears to have multiple roughness values, and I don't understand the differentiation).

Several users in this thread have also expressed interest in "detail maps", where a smaller texture is merged with a larger texture in the same channel, such that the final texture appears to have both large and small details at a wide range of distances. Detail maps are not directly addressed by the Enterprise PBR or Standard Surface documentation, but presumably could be (or are) implemented outside of that, prior to feeding the resulting merged texture into the material system. This could be its own extension, similar to KHR_texture_transform, with its own set of implications for raising the
level of complexity of ways that textures can interact with the material.

Also, none of these systems address displacement maps. But I could imagine that might happen outside of the material system as well, and could be worthy of its own glTF extension.

Conclusion

Really I'm trying to lay out the options, not draw any profound conclusion from all this. But as you may have already guessed, there does not seem to be any option that offers 1:1 parity with all of the different material systems available. Certainly a "monolithic" extension could include 1:1 mappings with any one chosen material system, at the expense of complete compatibility with all other available material systems. I suspect that part of glTF's core material popularity comes from its status as being near the least common denominator (LCD) for PBR systems, and we could extend that with LCD extensions for clear coat, anisotropy, etc., allowing implementations that support those things to import and export at least the basic settings with ease.

@jbouny

jbouny commented Aug 19, 2019

Thank you @emackey , awesome work!

Anisotropy

Enterprise PBR and Standard Surface, along with Blender and others, agree on a convention using one float for the amount of anisotropy, and another float for the "rotation" (which I believe is a rotation delta from the +U texture direction). Ben Houston mentions in a comment that it would be better (more modern / more artist-friendly) to do what Sketchfab does, supplying a vector map (similar to a normal map), instead of scalars. I'm not sure if this makes it more friendly to real-time renderers as well, but I can easily see why some artists like this better. Still, going this route breaks compatibility with anisotropy in both Standard Surface and Enterprise PBR.

Only a quick note on Anisotropy: The vector map approach is also much more filtering friendly.
Indeed, having a rotation stored in a float from 0 to 1 can lead to artifacts when we have extreme values: filtering between a texel with a value of 0 and texel with a value of 1 could lead to a value of 0.5, interpreted as the opposite rotation.
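
A quick numeric illustration of that failure mode (assuming the common encoding where a [0,1] rotation scalar maps to a full turn):

```python
import math

def rotation_to_dir(r):
    """Decode a [0,1] rotation scalar into a 2D tangent direction."""
    a = r * 2.0 * math.pi
    return (math.cos(a), math.sin(a))

def lerp(a, b, t):
    return a + (b - a) * t

# Two texels encoding the SAME direction: rotations 0.0 and 1.0 (a full turn).
# Hardware texture filtering averages the scalars to 0.5, i.e. half a turn:
print(rotation_to_dir(lerp(0.0, 1.0, 0.5)))  # ≈ (-1.0, 0.0): flipped!

# Filtering the direction vectors themselves (then renormalizing) is stable:
d0, d1 = rotation_to_dir(0.0), rotation_to_dir(1.0)
avg = (lerp(d0[0], d1[0], 0.5), lerp(d0[1], d1[1], 0.5))
n = math.hypot(*avg)
print((avg[0] / n, avg[1] / n))              # ≈ (1.0, 0.0): unchanged
```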

@bghgary
Contributor

bghgary commented Aug 20, 2019

Specular (the typically monochrome kind)

For some history, this is the original thread discussing some of this:
#696 (search for F0)

I have personal doubts about the usefulness of "Specular Tint", for whatever that's worth, but I do think Specular was overlooked as being part of the Metal/Rough core workflow

Are you thinking a single factor or a texture with values or both for the F0 value for dielectrics?

@emackey
Member

emackey commented Aug 20, 2019

Are you thinking a single factor or a texture with values or both for the F0 value for dielectrics?

I was thinking the typical glTF way, an optional 1-channel texture and an optional float factor. I don't know off the top of my head how these correlate to F0, but it looks like 100% specular is nowhere near 100% F0. Someone would have to supply the math (probably Enterprise PBR or Standard Surface).

@bhouston
Contributor

bhouston commented Aug 20, 2019 via email

@bhouston
Contributor

bhouston commented Aug 20, 2019 via email

@sebavan
Contributor

sebavan commented Aug 20, 2019

Most of the following refers to the Babylon.js approach.

About anisotropy, we chose to support a direction instead of a rotation for filtering purposes: https://doc.babylonjs.com/how_to/physically_based_rendering_master#anisotropy We use the same approach as Filament regarding the equations.

On the clear coat front, we allowed both the coat tint and normals as we found them really great in a lot of cases: https://doc.babylonjs.com/how_to/physically_based_rendering_master#clear-coat We also support IOR as a parameter (with a fast path when default is used) as it was sometimes helpful to better achieve the desired effect.

On the sheen side we support both the Dassault model and one where you can define a separate tint and roughness yourself: https://doc.babylonjs.com/how_to/physically_based_rendering_master#sheen Keeping both tint and roughness can really be helpful in some cases, but I agree that having less parameterization is also pretty compelling. Not being able to pick and choose between both approaches led us to implement and keep both :-)

@romainguy

@sebavan and I had extensive discussions on the implementation of sheen, and I believe we both agreed that while fewer parameters is a good thing, sheen_roughness seems valuable. Tying the roughness of the sheen to its presence didn't feel satisfying in our independent tests (the same goes for tint). My (still pending) implementation of sheen for Filament will provide roughness and tint control for those reasons (we already provide a separate cloth material model; sheen will just merge it with our standard model).

I'd love to hear from folks working on deferred renderers. The proliferation of parameters might become an issue for the size of the gbuffers there.

@emackey
Member

emackey commented Aug 23, 2019

@bhouston

We really like the Enterprise PBR model for specular that uses a specularColor, an IOR (which specifies F0) as well as a specular intensity. It is very well thought out in terms of the flexibility that advanced artists want. Look at the equations that Dassault gives for it, it is actually excellent.

There's a ton of math in the Enterprise PBR doc, and I apologize for getting a bit lost in there. Can you point to the specific formula(s) where this is covered, or explain a bit deeper? I hadn't realized IOR was directly tied to specular here, that's new to me.

Also, would a new extension covering this be mutually exclusive (within the same material) with the KHR_materials_pbrSpecularGlossiness extension? This is a different treatment of specular for dielectrics only, right?

@proog128
Contributor

The relation between IOR and Fresnel reflectance at normal and grazing incidence is given in Sec. 1.1.3, eq. 24. In Enterprise PBR, the IOR is not only used to change the amount of reflection, but also to compute the corresponding refraction direction of light rays at the interface between two media, according to Snell's law. In case the material is applied to a non-thin walled, closed mesh, the IOR parameter defines the refractive index of the enclosed volume. The IOR is not spatially-varying (if needed, use the specular parameter) to easily support nested dielectrics with priorities, as described here.
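
For readers who don't want to dig through the spec, the normal-incidence relation referenced above is the standard dielectric Fresnel formula. A small sketch, with Schlick's approximation included for the angular falloff:

```python
def f0_from_ior(ior: float) -> float:
    """Fresnel reflectance at normal incidence for a dielectric in air:
    F0 = ((ior - 1) / (ior + 1))^2."""
    return ((ior - 1.0) / (ior + 1.0)) ** 2

def schlick(f0: float, cos_theta: float) -> float:
    """Schlick's approximation: reflectance rises toward 1 at grazing angles."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# glTF 2.0's hard-coded F0 of 0.04 corresponds to an IOR of 1.5:
print(f0_from_ior(1.5))    # ≈ 0.04
print(schlick(0.04, 1.0))  # 0.04 at normal incidence
print(schlick(0.04, 0.0))  # 1.0  at grazing incidence
```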

@bhouston
Contributor

The key smart parts of the Enterprise PBR model are the following:

  • s, specularStrength, is directly independent.
  • p[s], specular tint, is directly dependent.
  • ior/F0, index of refraction controls the specular tint directionally dependent falloff.

Basically each of these gives a separate degree of control that cannot be mathematically reduced further. Artists want each of these, and they want them independently. If we go with this parameterization there is less of a need for the specular/glossiness workflow, because these parameters allow for recreating the looks that that workflow enables and the more basic metalness/roughness doesn't.

@proog128
Contributor

proog128 commented Dec 3, 2019

Added three pull requests related to PBR Next. #1717 extends the section about the BRDF implementation in the core specification to give more flexibility in implementations and prepare for extensions. #1718 adds a parameter for IOR. #1719 adds parameters specular and specularColor.

@devshgraphicsprogramming

I would completely support something along the lines of Nvidia MDL (which would be banal to express in json) except that the BRDF blocks that can be composed in a DAG are already predefined, much like Mitsuba's <bsdf> plugins.

Such a solution is feasible, my team has a material compiler from a DAG Intermediate Representation into a GPU state machine which optimizes out states that can never be transferred into, right now its only frontend is Mitsuba XML and backend is GLSL. However we aim to support Nvidia MDL as a front end, while CUDA, CUDA+OptiX (leveraging SBT) and GLSL closestHit shaders as extra backends.

Furthermore, it would unify #1717, #1718, and #1719 into a single solution instead of fragmented extensions.

CC: @vsnappy

@donmccurdy
Contributor

donmccurdy commented Aug 18, 2020

For those following along, please note that the definition of next gen PBR materials is already underway, in the issues and PRs linked here:

https://github.com/KhronosGroup/glTF/milestone/2

At this stage, I would recommend leaving feedback directly on those issues, rather than this "catch-all" issue.

The work is closely informed by Autodesk Standard Surface and Dassault Systèmes Enterprise PBR Shading Model, and broadly compatible with both. That definition will take one extension per discrete feature (such as clearcoat, sheen, etc.) initially. These PBR features may be consolidated into a core glTF version or a single unified extension down the road, as ecosystem support consolidates.

We've also discussed things with NVIDIA MDL authors, and intend that MDL can be "distilled" down to equivalent factor/texture pairs in the short term. A more flexible node-based system like MaterialX or (full) MDL is not the immediate goal, but may be of longer term interest.

@emackey
Member

emackey commented Aug 18, 2020

Agreed. Sounds like this catch-all issue should be closed in favor of individual extensions in the PBR Next milestone.

At some future point I'm sure we'll open a new issue for whatever comes after PBR Next, which could be something like MDL.

Also at some point we should scavenge all the ideas here that didn't make it into our current view of PBR Next, so they can be reconsidered as well.

@globglob3D

UDIM would be really nice: the ability to apply an image texture to a specific UV tile (or at least that's my understanding of it). This would be nice for people like me who have materials shared across multiple objects but can't link them together because of that damn AO image texture (which is specific to each object). With UDIM I could move the UVs of each object by one tile and associate its respective AO with that specific tile in the same material.

@emackey
Member

emackey commented Jan 28, 2021

UDIMs sound like a handy tool for artists, but I'm not convinced that a well-behaved glTF exporter should be writing them into a glTF file, where they become a problem for client runtimes to sort out when a glTF is received.

For example, in the case where an artist has one material shared across several objects, and is using UDIMs to give each object its own AO map, the solution seems straightforward: The glTF exporter should make copies of that material, and in each one assign a different AO map but the same properties and maps in the remaining material parameters. Then, the client receiving the file doesn't see a UDIM, it just loads the appropriate AO map for each object as specified directly in glTF. The client has to make separate draw calls for separate objects anyway, so splitting up the material per object doesn't add cost there.

So my impression is, UDIMs should be handled or decomposed by glTF exporters, and not shipped to clients in glTF itself.
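
As a sketch of that exporter-side decomposition (names and structure illustrative, assuming materials as plain glTF-style dicts):

```python
import copy

def split_shared_material(shared_material: dict, ao_texture_per_object: dict) -> dict:
    """Clone a shared material once per object, pointing each clone's
    occlusionTexture at that object's own AO map. The receiving client
    then never sees a UDIM; it just loads ordinary per-object materials."""
    materials = {}
    for obj_name, ao_index in ao_texture_per_object.items():
        mat = copy.deepcopy(shared_material)
        mat["name"] = f"{shared_material['name']}_{obj_name}"
        mat["occlusionTexture"] = {"index": ao_index}
        materials[obj_name] = mat
    return materials

shared = {"name": "wood", "pbrMetallicRoughness": {"baseColorTexture": {"index": 0}}}
per_object = split_shared_material(shared, {"chair": 5, "table": 6})
print(per_object["chair"]["occlusionTexture"])  # {'index': 5}
```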

@emackey
Member

emackey commented Jan 28, 2021

In the case of a single mesh using a UDIM, the exporter could be faced with some work: It would have to split that mesh into multiple primitives, choosing which primitive each triangle ends up in based on which UDIM that triangle uses. The material would likewise be split up across primitives, one per UDIM. I'm thinking the process should be similar to what happens when content creation packages render UDIMs anyway, since a GPU has no idea what a UDIM is. So if the exporter didn't do this work, that means you're asking some battery-powered mobile device to do the work instead, and that's not good.

Generally this is the strategy for any feature "X" that's not directly supported on a GPU or in a raw graphics API like WebGL and OpenGL. There are no UDIMs or whatever feature "X" in those APIs, but there are ways to implement them on top of the API. Follow that same strategy when exporting to glTF. Think of the exported data as being a snapshot of what will be sent to a remote GPU. That's the role of glTF.

@globglob3D

globglob3D commented Jan 28, 2021

I agree with everything you said; UDIM can just be used inside Blender and translated at export to whatever fits the format best.

That being said, I wonder if UDIMs are the best (or only) thing that could be done. The goal behind this is (at least for me) to be able to apply the same materials over multiple objects, and by extension to have linked materials across blend files (i.e. big material libraries).

The average user wouldn't care, but for scalability this is huge! The thing is that when you never use texture atlases and only rely on seamless textures, the only remaining enemy preventing such a system from scaling is the AO texture.

(I realize this is most likely beyond the scope of the exporter and I'm working on scripting this kind of system for myself with the current exporter capabilities, but I thought I'd just share this in case it gave someone a good idea)

@abgrac

abgrac commented Sep 9, 2022

@UX3D-nopper
Contributor

There are other standards like https://dassaultsystemes-technology.github.io/EnterprisePBRShadingModel/ and https://autodesk.github.io/standard-surface/
All of the initiators of the material standards are Khronos members and we try to get all these materials under one hood with glTF.
Also, there is the challenge of having a definition and specification suitable for real-time 3D.

@emackey
Member

emackey commented Sep 9, 2022

we try to get all these materials under one hood with glTF

@abgrac Just to clarify the meaning here, in the Khronos PBR TSG we're trying to ensure compatibility or at least some level of interoperability with many materials standards, including ASWF's MaterialX, NVIDIA's MDL, 3DS Enterprise PBR, and Adobe's ASM. But the glTF PBR material definition is its own open standard, geared towards broad compatibility across platforms and rendering architectures, influenced and sometimes guided by these other standards and their authors. Like glTF itself, the glTF PBR material definition is geared towards final delivery of a finished asset.
