SMAAPass on a render target #62
Hi, thanks for the issue report! This could actually be a Chrome bug; the images that are supposedly incomplete are immediately available in the form of base64 strings, but Chrome seems to not create the Image instances properly. I created a testbed for this issue and was able to replicate the error messages. However, after re-running the fiddle a few times (Ctrl + Return), the errors disappear. Furthermore, the errors don't show up at all in Microsoft Edge. I haven't tried Firefox, but I don't think that's necessary at this point.

I'll try to create a minimal replication of the error for a Chrome bug report. Apart from that, I suggest using a custom workaround in the meantime.
After some more testing and reading I can now say that it's not a Chrome bug. Loading images is apparently always asynchronous, even if the src is a base64-encoded data URI that's already in memory:

```js
// Define the image data.
const imgData = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAEIAAAAhCAAAAABIXyLAAAAAOElEQVRIx2NgGAWjYBSMglEwEICREYRgFBZBqDCSLA2MGPUIVQETE9iNUAqLR5gIeoQKRgwXjwAAGn4AtaFeYLEAAAAASUVORK5CYII=";

// Create a new image.
const img = new Image();

// This logs true, see https://developer.mozilla.org/en-US/docs/Web/API/HTMLImageElement
console.log("no src defined, complete:", img.complete);

// Verify that complete is true on load.
img.addEventListener("load", function() {
	console.log("image data has been loaded, complete:", this.complete);
});

// Set the data.
img.src = imgData;

// Immediately check complete.
console.log("src has been set, complete:", img.complete);

// Output:
// > no src defined, complete: true
// > src has been set, complete: false
// > image data has been loaded, complete: true
```

The images that are used in the SMAAPass are affected by this: they are not guaranteed to be complete by the time the pass is used for rendering.
Thanks for your reply. Actually, I noticed the problem on Edge too, and today, after repeating the operation several times, I think the AA was applied to one of my generated PNGs. So the async problem could explain that inconsistent behaviour (even if rare, as I could never replicate it again). I know nothing about AA algorithms and still have little understanding of the architecture of your library, but what is the image actually used for, and could it be avoided? You could load the image synchronously by using a dreaded sync XHR call (which could be acceptable here since the data is already in memory; it's not a real call to a remote server).

Fiddle: https://jsfiddle.net/rj350y4k/2/
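Since the data URI is already a string in memory, its base64 payload can indeed be decoded synchronously without any network access; only turning the PNG bytes into pixels requires an Image or the Canvas API. A minimal sketch of that synchronous step (the helper name is illustrative, not part of any library):

```js
// Illustrative sketch: synchronously split a data URI into its MIME type
// and decoded payload. atob() is available in browsers and in Node 16+.
function decodeDataURI(uri) {
	const commaIndex = uri.indexOf(",");
	const meta = uri.slice("data:".length, commaIndex);
	const payload = uri.slice(commaIndex + 1);
	const isBase64 = meta.endsWith(";base64");
	const mimeType = isBase64 ? meta.slice(0, -";base64".length) : meta;
	const data = isBase64 ? atob(payload) : decodeURIComponent(payload);
	return { mimeType, data };
}

// The first eight bytes of any PNG file are its fixed signature.
const { mimeType, data } = decodeDataURI("data:image/png;base64,iVBORw0KGgo=");
console.log(mimeType);         // "image/png"
console.log(data.slice(1, 4)); // "PNG"
```

This only gets you the compressed PNG bytes, though; decoding those into pixel data is where the browser forces the asynchronous Image route.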
The Subpixel Morphological Antialiasing technique uses two textures (look-up tables) to detect specific patterns (weighted color edges). The images are a central part of the algorithm. Since the intention behind the current code was to load the images synchronously, a synchronous XHR call seems feasible. However, I'd like to avoid the use of browser-specific constructors inside this library.

There's an alternative that requires a bit more time to implement: the original SMAA repository contains scripts for the generation of the look-up tables. I'll port these scripts to JavaScript and integrate them into the SMAAPass. The images can then be created synchronously using the Canvas API when the pass is added to the composer. This could also reduce the file size of the library.

Until then, you could use the following hacky (and asynchronous) workaround to ensure that the images are loaded before you start rendering:

```js
/**
 * Creates a new SMAAPass and ensures that its images are fully loaded.
 *
 * @param {Function} done - A callback. The new SMAAPass will be passed to this function.
 */
function createSMAAPass(done) {

	const smaaPass = new SMAAPass(Image);
	const areaTexture = smaaPass.weightsMaterial.uniforms.tArea.value;
	const searchTexture = smaaPass.weightsMaterial.uniforms.tSearch.value;

	// Invoke the callback once both images have finished loading.
	areaTexture.image.addEventListener("load", function() {
		if(searchTexture.image.complete) {
			done(smaaPass);
		}
	});

	searchTexture.image.addEventListener("load", function() {
		if(areaTexture.image.complete) {
			done(smaaPass);
		}
	});

	// Reload the images to trigger the load events.
	areaTexture.image.src = smaaPass.weightsMaterial.areaImage;
	searchTexture.image.src = smaaPass.weightsMaterial.searchImage;

}
```
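The guard in the two load handlers is the classic "wait for both" pattern: each handler checks whether the other resource is already complete before firing the callback, so the callback runs exactly once regardless of load order. Stripped of the texture specifics, the pattern looks like this (a generic sketch, not library code):

```js
// Generic sketch of the synchronization pattern used above: run a callback
// exactly once, as soon as both asynchronous loads have finished.
function whenBothLoaded(loadA, loadB, done) {

	let aDone = false;
	let bDone = false;

	loadA(() => { aDone = true; if(bDone) { done(); } });
	loadB(() => { bDone = true; if(aDone) { done(); } });

}
```

Note that the workaround reassigns the src attributes at the end: that restarts the loads, which guards against the case where a load event already fired before the listeners were attached.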
Well, to be fair, the Canvas API is also browser-specific, but there's no way around that one 😅
I was actually wondering whether the image element could be replaced by the Canvas API. Your solution sounds reasonable. Passing an empty Image element to the constructor feels very mysterious. Anyway, your hack is hacky, but it works great! My high-resolution images look amazing, thanks :) I now need to find out how to change the parameters of the SMAA pass. I'll dig into your code again. Cheers
I got good results, but the current implementation of the SMAA pass breaks something at high resolutions. I can get renderings up to 16K x 16K (based on the GPU's MAX_TEXTURE_SIZE) without post-processing, but when I add the SMAA pass, images beyond a certain size fail (the result is an empty image, but I'm not sure if that happens during post-processing or later in the process). Is there any limitation on the image sizes used by the SMAA pass? I'm afraid that browser implementations of the canvas element might also impose some limits there.
The postprocessing library imposes no artificial limitation on the size of render targets. Most of the included passes create their internal render targets based on the main render size of the renderer. The only theoretical limitation in this regard is the MAX_TEXTURE_SIZE of the GPU. If you set the render size to 16K x 16K, every pass that uses a full-resolution render target will have to allocate a texture of that size in video memory. How much VRAM do you have?
I just noticed that one of the render targets created by the SMAAPass is also allocated at the full render size, which adds to the total VRAM usage.
That makes sense. The limit is the VRAM. I'm working on a computer with only 2GB, so it's probably fine when a single 16K x 16K render target is used (no AA), but no more than that. I just tested half the maximum size, 8K x 8K (it took 84 seconds), and it worked with SMAA; the quality is very nice. I should probably enforce a 50% safety limit or see if there is a workaround for my use case: I don't need realtime performance anyway, so it doesn't matter if it's slow. I'll open source something when it's more stable.
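The numbers add up: an uncompressed RGBA8 texture needs four bytes per pixel, so a single 16K x 16K render target already occupies 1 GiB of VRAM, and every additional full-resolution target costs the same again, while an 8K x 8K target only needs 256 MiB. A quick back-of-the-envelope sketch (ignoring depth/stencil attachments and driver overhead):

```js
// Rough VRAM cost of an RGBA8 render target: width * height * 4 bytes.
// This ignores mipmaps, depth/stencil buffers and driver overhead.
function renderTargetBytes(width, height, bytesPerPixel = 4) {
	return width * height * bytesPerPixel;
}

const MiB = 1024 * 1024;
console.log(renderTargetBytes(16384, 16384) / MiB); // 1024 MiB = 1 GiB
console.log(renderTargetBytes(8192, 8192) / MiB);   // 256 MiB
```

With 2GB of VRAM, two full-resolution 16K targets alone would exhaust the card, which matches the observed failure.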
Alright, good luck with your work! Meanwhile, I found some weird inconsistencies in the SMAA WebGL port that this pass is based on. The search texture that is currently being used is completely black and doesn't seem to be cropped properly. I ported the search texture generation script and produced the correct, Y-flipped image.

Unfortunately, the area texture generation procedure seems much more complex. The bad thing is that the script is poorly optimised and supposedly takes minutes to generate the texture. I'll port it anyway and see if that's true. Alternatively, I could try to port the script to GLSL and generate that texture on the GPU. Another decent option is to store the image data as an array in a source file. Creating an image from that data would be fast and synchronous, but the data would inflate the file size just like the base64 strings.

Furthermore, the current SMAA shaders differ quite a lot from the original reference implementation. I'll attempt to revise the shaders, too.
Whew, porting the Python scripts took me a little longer than expected. Pesky bugs 🐛 Now to the results: generating the SMAA search image takes around 1 millisecond. While the area image generation doesn't take minutes to finish, it still takes too long for my taste. It wouldn't feel right to perform this complex generation step every time an SMAAPass is created. I'll include the ported scripts in the library for future reference, but they won't play an active role in the SMAAPass. I haven't looked at the SMAA shaders yet, but that's next on my list.
It turns out that my plans were too optimistic. The raw image data takes up a lot of space: ~310Kb for the area image and ~3.5Kb for the search image. Storing the data as base64-encoded strings results in ~65Kb and ~0.4Kb. That's much better, but decoding this data is not easy because it's also PNG-compressed. The best way to decode the pixel data is to create an Image and load the data asynchronously, which would bring us back to square one. Our options are as follows:

1. Store the raw image data directly in the source files.
2. Load the base64-encoded images asynchronously inside the pass.
3. Let the user load the images and hand them to the SMAAPass.

The first option considerably inflates the bundle size of every project that depends on this library, while the second option turns the creation of the pass into an asynchronous operation. The last option doesn't try to hide the image loading and allows the user to handle the images during a dedicated loading process. If there are no objections or alternative ideas, I'll just go ahead and implement that last option.
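The size figures are consistent with how base64 works: every 3 input bytes become 4 ASCII characters, so the encoded string is roughly a third larger than the PNG-compressed data (~65Kb of base64 corresponds to roughly 48Kb of PNG, far less than the ~310Kb of raw pixels). A quick sketch of the arithmetic:

```js
// Base64 maps every 3 input bytes to 4 output characters, padding the
// final group, so the encoded length is ceil(n / 3) * 4.
function base64Length(byteCount) {
	return Math.ceil(byteCount / 3) * 4;
}

console.log(base64Length(3));         // 4
console.log(base64Length(48 * 1024)); // 65536 characters for 48Kb of input
```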
Sounds totally reasonable. After all, async operations are normal in web applications, and this solution gives the user a lot of flexibility. It should be documented, though :)
Fixed in the latest release. Some notes: I messed around with the SMAA shaders and tried some things out, but with little success. The shaders are actually missing some key features, such as advanced diagonal pattern detection and corner rounding. Porting these capabilities to WebGL requires a more in-depth understanding of the algorithms. Unfortunately, I don't have the time to work on this, so I'll leave it as is for now. I replaced the previous area image, as it was not the same as the one found in the official SMAA repository. This should have a positive effect, although a barely noticeable one. The search image was actually fine: I thought it was completely black, but the color values are simply not scaled up, so the image appears almost black.
Hello. Thanks a lot for this amazing library! It's the first time I'm using it, so apologies if my question is kind of stupid.
I am creating a tool to export high-resolution images for prints from Three.js, like 16K x 16K pixels. It works well: I'm rendering to a large render target and then creating a PNG from there. The only problem: I learnt that rendering to a render target disables the antialiasing.
So the only way to get it is by using something like the SMAAPass. All the other filters work perfectly, and I can export the PNG with those effects. But with the SMAAPass I get an error in the console:
THREE.WebGLRenderer: Texture marked for update but image is incomplete

This happens consistently, with exports of any size, even low-resolution ones. Here is what I'm doing:
The result is a perfectly rendered PNG, but with no antialiasing and with the error in the console. Do you think there could be a way to solve this?