Support of Compressed Textures in core #13911
I agree with you on the importance of compressed textures, but I think the suggested step of putting them in
... the extra step of grabbing ... I assume that's also what you're getting at with "tools for on-line compression", and I think something in that vein is a better first step... I assume you mean an online service, not compressing images at runtime? Were there tools you had in mind, or would particularly like to see? This might also be a great area for better documentation, tutorials, etc.
(Or, if I've totally missed the mark and there are real pain points with using compressed textures because they're not in the core library, that would also be great feedback...)
@donmccurdy Suppose I have a model, and the only textures I have for it are already compressed. I don't know much about all these fancy formats and what makes a model - I just want to load it in and see it. Lastly, and that's the point you have alluded to - say I have uncompressed textures and I want to compress them, and I don't know about something like crunch; I only know that texture compression exists and that it's a good thing to have for my needs. Offering a simple-to-use tool would be pretty awesome, i.e. your online conversion tool. It could use the same code as what's written for use case 1 and run entirely in the browser; the main point is convenience.
I don't think you would want every visitor generating DDS textures at load time in the browser; better to do this once, offline, in advance. I'm not aware of good in-browser tools to do the conversion. For the other two cases, I think the last ... is the best place to start. If people can create models with compressed textures easily and loading them becomes the pain point, that will be a good problem to have. Make this as easy as we can with tools and tutorials, but only move it into the core library when it's more common practice.
https://blog.playcanvas.com/webgl-texture-compression-made-easy/ Unfortunately, the direct comparison between ...
@Mugen87 Compressed textures offer benefits beyond saving memory - well, indirectly. One major benefit is faster sampling, as it involves less bandwidth for the same number of samples.
@Usnul it would be amazing to have this functionality in three.js (at least depending on how fast the compression can be done at runtime), but it would make more sense as an example. There would need to be a good chunk of logic around deciding which format the current device supports, and then there would need to be compression utilities for each of these formats. From the PlayCanvas article it looks like DXT, PVR and ETC1 would cover all devices, but that still means three texture compressors. That's not going to be small. Or, I suspect, trivial to write in JS. That said, a plugin that did this in a fairly easy-to-use way at run time would be an amazing addition to the library. But you will get a lot more support for this idea if you push it as an example first. Once the usefulness has been demonstrated, then you can push for adding it to the core.
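The format-selection logic mentioned above could be a small pure function. A minimal sketch, assuming the list of supported extension names has already been gathered via `gl.getExtension(...)` checks at startup; the preference order and short format labels here are illustrative, not an established three.js API:

```javascript
// Hypothetical helper: pick the best compressed-texture format from the
// WebGL extensions a device reports. Preference order is an assumption.
const FORMAT_PREFERENCE = [
  { ext: 'WEBGL_compressed_texture_astc', format: 'astc' },
  { ext: 'WEBGL_compressed_texture_s3tc', format: 'dxt' },
  { ext: 'WEBGL_compressed_texture_pvrtc', format: 'pvr' },
  { ext: 'WEBGL_compressed_texture_etc1', format: 'etc1' },
];

function pickCompressedFormat(supportedExtensions) {
  const entry = FORMAT_PREFERENCE.find((e) => supportedExtensions.includes(e.ext));
  return entry ? entry.format : null; // null → fall back to PNG/JPG
}
```

A loader could then request `texture.${format}.ktx` instead of shipping one file per device class.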
It will be even more complex since the most promising standard, ASTC, should also be supported. In general, I support @donmccurdy's earlier post. I think it's preferable to do the conversion just once with existing tools and not for each user on the client side via JavaScript.
Yeah, that probably makes more sense. In that case, two things are lacking:
I'd like to highlight that it's not a good idea to develop a conversion tool from scratch. For example, the encoding process of ASTC is very sophisticated, so you definitely want to use the existing tool from ARM: ASTC encoder. BTW: since the encoding process for high-quality ASTC textures can be very long, it's not a good idea to do this on the client side. So the idea might be a wrapper Node script that calls existing CLI tools. However, since these tools are normally written in C/C++, you need to deliver different binaries for multi-platform support.
A few places to start:
Agreed, I was in no way suggesting that we do so! 😅 Seems like Compressonator can do everything except for PVR. It does ASTC, ATC, ATInN, BCn, ETCn, DXTn, and swizzled DXTn formats. So we can generally recommend that users use that. There's also PVRTexTool, which compresses to PVRTC, ETC and DXT and has GUIs for Windows, Mac and Linux.
Have you guys played any recent games, from the past 10 years or so, with large open terrain spaces? If you did - you more than likely witnessed virtual textures in action. Virtual textures require building a texture at run-time; I won't get into why right now - it's a much larger topic altogether. They also require you to do texture compression on the fly at run-time, if you wish to benefit from that tech. So guess what? Most engines that implement virtual textures also implement texture compression.

I see a lot of argumentation here along the lines of "runtime compression is not possible" or "runtime compression is not useful". I think in WebGL it's especially useful, since we don't have common ground on texture compression formats and bandwidth is a consideration. I am not claiming that it's easy; I'm making an argument for its usefulness and its feasibility.

You use a compressed texture and save, say, 4x space; this means your sampling runs up to 4x faster, since the main bottleneck is memory bandwidth. This means that if, say, you use a compressed texture for shadows - you can effectively double the resolution and have exactly the same sampling performance as with an uncompressed texture of half that size. Does that sound useful? How about a 4x higher number of samples for PCF? Or any other multisample technique. I do not get the argument.
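The 4x figure above is easy to sanity-check with back-of-the-envelope arithmetic: an 8-bits-per-texel compressed format (e.g. DXT5, which stores 16 bytes per 4x4 block) against 32-bits-per-texel uncompressed RGBA8. The texture sizes are just examples:

```javascript
// Bytes occupied by a mip level, given bits per texel (ignoring mip chains).
function textureBytes(width, height, bitsPerTexel) {
  return (width * height * bitsPerTexel) / 8;
}

const rgba8 = textureBytes(2048, 2048, 32); // uncompressed RGBA8
const dxt5  = textureBytes(2048, 2048, 8);  // e.g. DXT5 at 8 bits/texel
const ratio = rgba8 / dxt5;                 // the 4x saving claimed above

// Same memory/bandwidth budget: a 2048² DXT5 shadow map costs what a
// 1024² RGBA8 map does.
const sameBudget = textureBytes(1024, 1024, 32) === dxt5;
```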
I'm not opposed to doing both, either. 😁 The crunch library is only 150kb when compiled to JS, so shipping that with a web experience seems totally reasonable. Someone want to give this a try?
I've made a library some time ago that does what @Mugen87 describes above: it is a wrapper Node script that abstracts away the specific CLI flags for you and makes it easy to have ASTC, ETC, PVRTC and S3TC in KTX containers. It is released under the MIT license here: https://github.com/TimvanScherpenzeel/texture-compressor. Of course this is not the final solution to the problem, but it does the trick for me for now. A major advantage is that you only need a single compressed texture loader for decoding: KTXLoader. A much better universal alternative is being developed by Binomial, called Basis: http://www.binomial.info/, and is going to be contributed to the glTF project: https://www.khronos.org/assets/uploads/developers/library/2017-gdc-webgl-webvr-gltf-meetup/6%20-%20Universal%20Texture%20Compression%20Format_Mar17.pdf.
Interesting. I've never heard of Basis... |
Basis is neither open source nor free... 😞 Curious how that's going to work with glTF? EDIT: looks like they're going to open source it. Looks like a promising option for the future.
@TimvanScherpenzeel I've checked out your tool and I'm wondering if it's possible to pass in ASTC-specific parameters? For example there are ... The leftmost normal map is compressed with default settings, the second uses ... Source: https://developer.arm.com/-/media/Files/pdf/graphics-and-multimedia/Stacy_ASTC_white%20paper.pdf The whitepaper from ARM actually recommends using these presets when compressing normal maps.
Hey @Mugen87, as documented you can use the ... This should work when running it from the root of the repo: ... If you have any other questions regarding my tool, I think it is better to ask them in my repo (feel free to open an issue); that keeps the conversation here more to the point.
@TimvanScherpenzeel Thanks for the feedback. I just wanted to address this topic right here since it's important that automated conversion tools actually make use of format-specific optimization features if possible. As mentioned above, the quality difference can be very noticeable.
Hi @Mugen87, on second thought, and after testing it out again, it appears that this in fact won't work. I've opted for using ASTC through PVRTexTool in order to add mipmapping support and Y-flipping of textures, and to avoid introducing more dependencies and the issues I was having with file reading/writing callbacks.
Now that Basis textures are free and well supported the title of this issue should be changed to:
Is that something we want to do? If not, this issue can probably be closed. @Usnul does the BasisTextureLoader cover what you mean by "compressed texture support"? |
I wouldn't move BasisTextureLoader into core right now, for a few reasons:
I think having these loaders in the examples is absolutely sufficient. In general, the idea of "moving stuff into the core" has lost importance anyway since all example files are now available as modules. They can easily be imported via ES6 imports. |
It would be a helpful step. It covers the "first class compressed texture support", but online compression is still not covered. If you load a normal PNG or a JPG - being able to convert it to a compressed format inside the library would be beneficial for a lot of users, I believe. For me personally, I want to be able to compress textures that were generated in code, things like texture atlases, noise maps etc. If we had a PNG -> basis encoder, then the whole chain would be covered.
I agree, sadly. It seems that supporting compressed textures on a wide range of devices without bloating the library is some ways off right now. |
I'm not sure that three.js is the right place for a basic compression tool. We should focus on asset consumption and let other people deal with asset creation. Is there a browser-based texture compression tool we could recommend rather than creating one here? Aside from that, is compressing to basis fast enough to do client-side? Or is it better to do this as a build step? |
My impression is that GPU texture compression is slow to do clientside, at runtime, for nontrivial texture sizes. And if your texture sizes are trivial, you are unlikely to benefit from GPU compression. It also tends to require some visual inspection or trial and error, Basis Universal in particular needs tuning. I think this is mostly a job for a good "offline" tool. Or something like https://squoosh.app/, but for Basis. That said, a WASM Basis encoder would almost certainly still have its uses, even if it is slow. |
In the interest of taking action on this issue, how about this:
We can revisit these recommendations later if needed. For now, I think we can close this issue.
This is addressed by the final point above. Unless someone can demonstrate a way to compress texture data on the fly that is reliable, fast, and doesn't require visual inspection to verify that it looks OK, I don't think there's anything to be done here. |
@looeee That sounds all good to me! |
I like the idea of having a robust set of texture compression tools and good support for compressed textures. It is something that would raise the performance bar for a lot of use cases.

As far as online (or runtime) compression on the client side goes, this is something that has existed for a long time in the AAA industry. I remember reading a paper on runtime texture compression from around 2005; the point of the paper was that doing a really poor job at texture compression was still beneficial, as you could do it fast and still gain in terms of overall quality through increased affordable resolution. There are a lot of techniques that require dynamic textures; the most prominent one is the "virtual texture", where the actual texture that is submitted to the GPU is basically a texture atlas, made up of tiles of a virtual texture.

I don't think that performance is necessarily a bottleneck. I, for one, would be fine with having an "optimization" pipeline, where a worker thread could be used to perform texture compression and, while you wait, you can use the uncompressed texture. In many ways it's similar to what we already do: assets are loaded progressively; you load a model and you start to see the geometry before the textures finish loading. Waiting for compression is not so dissimilar in my view.
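The "use the uncompressed texture while you wait" pipeline described above could be a small async helper. A minimal sketch only: `compress` stands in for a hypothetical background encoder (in a real app, a Web Worker running a WASM encoder), and `material` is any object with a `map` slot, not a concrete three.js API:

```javascript
// Render with the uncompressed texture immediately; swap in the compressed
// version whenever the (slow) background encode finishes.
async function upgradeTexture(material, compress) {
  const original = material.map;               // already visible on screen
  const compressed = await compress(original); // slow, off the critical path
  material.map = compressed;                   // swap once ready
  material.needsUpdate = true;
  return compressed;
}
```

The caller never blocks on the encode; a frame rendered before the swap just samples the uncompressed texture.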
Probably that would involve taking an existing tool and converting it to WASM (maybe basisu)? However, any such tool would work with any 3D web framework, and creating it would be a large project in itself. That's why I don't think this repo is the right place for such a tool - it doesn't need to be tied to three.js. |
No, the last time I used it it took 30 seconds to compress a 4k texture on my machine. |
I don't think we should support "compressed textures" in core because every GPU supports a different type. However, we may want to support some sort of ...
Not sure #16524 is the right place — I'm closing that one since its original purpose is mostly done. But I do think
One problem here would be that uploading the uncompressed texture to the GPU will freeze rendering for a non-trivial amount of time, and avoiding that is (IMO) a key benefit of using Basis in the first place. That said, Basis did recently begin providing a WASM encoder, so converting uncompressed textures to Basis in the browser is technically possible. I'm planning to switch over glTF-Transform to use that rather than the ...
Compressed Textures
Compressed textures as a first-class citizen, along with tools for on-line compression
Motivation
Compressed textures offer a great amount of extra detail while requiring only a little space. For applications with large textures and/or a large number of textures, this draws the line between an interactive frame rate and a slide-show. This point becomes even more relevant for lower-end GPUs, as they tend to have less RAM: being able to draw 2048 compressed textures instead of 512 uncompressed ones is extremely important, as they take up potentially the same amount of GPU RAM. Compressed textures also take less time to load and put less stress on the browser, since decompression is not needed up front (unlike PNG).
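The 2048-vs-512 figure works out directly from a ~4:1 compression ratio. A quick check, assuming a hypothetical 2 GiB texture budget, 1024² textures, RGBA8 at 32 bits/texel versus a compressed format at 8 bits/texel:

```javascript
// How many same-sized textures fit in a fixed GPU memory budget?
function texturesPerBudget(budgetBytes, width, height, bitsPerTexel) {
  return Math.floor(budgetBytes / ((width * height * bitsPerTexel) / 8));
}

const budget = 2 * 1024 ** 3;                                   // 2 GiB (assumed)
const uncompressed = texturesPerBudget(budget, 1024, 1024, 32); // RGBA8
const compressed   = texturesPerBudget(budget, 1024, 1024, 8);  // e.g. DXT5
// 4:1 compression → 4x as many textures in the same budget.
```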
born as a result of this discussion: #13807