
SMAAPass on a render target #62

Closed
taseenb opened this issue Sep 24, 2017 · 15 comments
Labels
bug Something isn't working
@taseenb

taseenb commented Sep 24, 2017

Hello. Thanks a lot for this amazing library! It's the first time I'm using it, so apologies if my question is kind of stupid.

I am creating a tool to export high-resolution images for prints from Three.js, like 16K x 16K pixels. It works well: I'm rendering to a large render target and then creating a PNG from it. The only problem: I've learnt that rendering to a render target disables antialiasing.

So the only way to get it is by using something like SMAAPass. All the other passes work perfectly, and I can export the PNG with those effects. But with SMAAPass I get an error in the console: THREE.WebGLRenderer: Texture marked for update but image is incomplete Texture. This happens consistently with exports of any size, even at low resolutions.

Here is what i'm doing:

const composer = new EffectComposer(renderer)
composer.addPass(new RenderPass(scene, camera))
const pass = new SMAAPass(window.Image)
composer.addPass(pass)
composer.render()

const rtt = composer.writeBuffer
renderer.readRenderTargetPixels( rtt, 0, 0, size.width, size.height, array )
// and here i convert the array to PNG

The result is a PNG that is perfectly rendered but has no antialiasing, plus the error in the console. Do you think there could be a way to solve this?
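For reference, readRenderTargetPixels fills the array with rows ordered bottom-to-top (WebGL's origin is the bottom-left corner), while PNG encoders expect top-to-bottom, so the array usually needs a vertical flip before encoding. A minimal sketch, assuming RGBA data in a Uint8Array (the helper name is illustrative, not part of the library):

```javascript
// Flip the rows of an RGBA pixel buffer vertically. Render-target reads
// come back bottom-up; PNG encoders want the top row first.
function flipY(pixels, width, height) {

	const bytesPerRow = width * 4; // RGBA, one byte per channel
	const flipped = new Uint8Array(pixels.length);

	for(let row = 0; row < height; ++row) {

		const src = row * bytesPerRow;
		const dst = (height - 1 - row) * bytesPerRow;

		// Copy one source row into its mirrored destination position.
		flipped.set(pixels.subarray(src, src + bytesPerRow), dst);

	}

	return flipped;

}
```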

@vanruesc
Member

Hi,

thanks for the issue report!

This could actually be a Chrome bug; the images that are supposedly incomplete are immediately available as base64 strings, but Chrome doesn't seem to create the Image instances properly.

I created a testbed for this issue and was able to replicate the error messages. However, after re-running the fiddle a few times (Ctrl + Return), the errors disappear. Furthermore, the errors don't show up at all in Microsoft Edge. I haven't tried Firefox, but I don't think that's necessary at this point.

I'll try to create a minimal replication of the error for a Chrome bug report.

Apart from that, I suggest creating a custom Pass for your use case, as shown in the fiddle. That way you don't have to access the private internal read and write buffers of the EffectComposer, and you'll always find the result of the previous pass in the readBuffer that is passed into your Pass's render method.

@vanruesc vanruesc added bug Something isn't working chrome labels Sep 24, 2017
@vanruesc
Member

After some more testing and reading I can now say that it's not a Chrome bug. Loading images is apparently always asynchronous, even if the src property is set to a base64 data string. The following example illustrates this behaviour:

// Define the image data.
const imgData = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAEIAAAAhCAAAAABIXyLAAAAAOElEQVRIx2NgGAWjYBSMglEwEICREYRgFBZBqDCSLA2MGPUIVQETE9iNUAqLR5gIeoQKRgwXjwAAGn4AtaFeYLEAAAAASUVORK5CYII=";

// Create a new image.
const img = new Image();

// This logs true, see https://developer.mozilla.org/en-US/docs/Web/API/HTMLImageElement
console.log("no src defined, complete:", img.complete);

// Verify that complete is true on load.
img.addEventListener("load", function() {

	console.log("image data has been loaded, complete:", this.complete);

});

// Set the data.
img.src = imgData;

// Immediately check complete.
console.log("src has been set, complete:", img.complete);

// Output:
//  > no src defined, complete: true
//  > src has been set, complete: false
//  > image data has been loaded, complete: true

The images that are used in the SMAAPass should be loaded asynchronously, but I'll first need to find a good way to do that. Suggestions are welcome.
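One possible direction, as a sketch rather than the library's API: wrap the asynchronous load in a Promise. The Image-like object is passed in, so the helper itself only assumes the standard load/error events and a settable src:

```javascript
// Wraps the asynchronous load of an Image-like object into a Promise.
// `img` only needs `addEventListener` and a settable `src`, like
// HTMLImageElement. The src is assigned after the listeners are attached
// so the load event cannot be missed.
function loadImage(img, src) {

	return new Promise((resolve, reject) => {

		img.addEventListener("load", () => resolve(img));
		img.addEventListener("error", () => reject(new Error("Failed to load image")));

		img.src = src;

	});

}
```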

@vanruesc vanruesc removed the chrome label Sep 24, 2017
@taseenb
Author

taseenb commented Sep 24, 2017

Thanks for your reply. Actually, I noticed the problem on Edge too, and today, after repeating the operation several times, I think the AA was applied to one of my generated PNGs. So the async problem could explain that inconsistent behaviour (even if rare, as I could never replicate it again).

I know nothing about AA algorithms and still have little understanding of your library's architecture, but what is the image actually used for, and could it be avoided?

You could load the image synchronously by using a dreaded sync XHR call (which could be acceptable here since the data is already in memory; it's not a real request to a remote server):

var request = new XMLHttpRequest();
request.open('GET', 'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAEIAAAAhCAAAAABIXyLAAAAAOElEQVRIx2NgGAWjYBSMglEwEICREYRgFBZBqDCSLA2MGPUIVQETE9iNUAqLR5gIeoQKRgwXjwAAGn4AtaFeYLEAAAAASUVORK5CYII=', false);  // `false` makes the request synchronous
request.send(null);

console.log(request.response); // Image available here

Fiddle: https://jsfiddle.net/rj350y4k/2/

Info: https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/Synchronous_and_Asynchronous_Requests

@vanruesc
Member

what is the image actually used for and could it be avoided?

The Subpixel Morphological Antialiasing technique uses two textures (look-up tables) to detect specific patterns (weighted color edges). The images are a central part of the algorithm.

Since the intention behind the current code was to load the image synchronously, a synchronous XHR call seems feasible. However, I'd like to avoid the use of browser-specific constructors inside this library.

There's an alternative that requires a bit more time to implement: the original SMAA repository contains scripts for the generation of the look-up-tables. I'll port these scripts to JavaScript and integrate them into the SMAAPass. The images can then be created synchronously using the Canvas API when the pass is added to the composer. This could also reduce the file size of the SMAAPass class because no image data needs to be stored anymore. Let's hope this works out 😇

Until then, you could use the following hacky (and asynchronous) workaround to ensure that the images are loaded before you start rendering:

/**
 * Creates a new SMAAPass and ensures that images are fully loaded.
 *
 * @param {Function} done - A callback. The new SMAAPass will be passed to this function.
 */

function createSMAAPass(done) {

	const smaaPass = new SMAAPass(Image);

	const areaTexture = smaaPass.weightsMaterial.uniforms.tArea.value;
	const searchTexture = smaaPass.weightsMaterial.uniforms.tSearch.value;

	areaTexture.image.addEventListener("load", function() {

		if(searchTexture.image.complete) {

			done(smaaPass);

		}

	});

	searchTexture.image.addEventListener("load", function() {

		if(areaTexture.image.complete) {

			done(smaaPass);

		}

	});

	// Reload.
	areaTexture.image.src = smaaPass.weightsMaterial.areaImage;
	searchTexture.image.src = smaaPass.weightsMaterial.searchImage;

}

I encountered no errors with this setup.
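For what it's worth, the two mirrored load listeners and their cross-checks on complete can be folded into a single helper with Promise.all. A sketch, assuming objects that expose complete and addEventListener like HTMLImageElement (whenImagesLoaded is a hypothetical name, not part of the library):

```javascript
// Resolves once every image has finished loading. Images that are already
// complete resolve immediately, which covers the race that the `complete`
// checks in the workaround above guard against.
function whenImagesLoaded(images) {

	return Promise.all(images.map((img) => img.complete ?
		Promise.resolve(img) :
		new Promise((resolve, reject) => {

			img.addEventListener("load", () => resolve(img));
			img.addEventListener("error", () => reject(new Error("Image failed to load")));

		})
	));

}
```

With such a helper, the body of createSMAAPass could reduce to a single whenImagesLoaded([...]).then(...) call over the two texture images.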

@vanruesc
Member

Well, to be fair, the Canvas API is also browser-specific, but there's no way around that one 😅

@taseenb
Author

taseenb commented Sep 25, 2017

I was actually wondering whether the image element could be replaced by the Canvas API. Your solution sounds reasonable. Passing an empty Image element to the constructor feels very mysterious. Anyway, your hack is hacky, but it works great!! My high-resolution images look amazing, thanks :)

I now need to find out how to change the parameters of the SMAA pass. I'll dig into your code again. Cheers

@taseenb
Author

taseenb commented Sep 26, 2017

I'm getting good results, but the current implementation of the SMAA pass breaks something at high resolutions. I can produce renderings up to 16K x 16K (based on the GPU's MAX_TEXTURE_SIZE) without post-processing, but when I add the SMAA pass, images beyond a certain size fail (the result is an empty image, though I'm not sure whether that happens during post-processing or later in the process).

Is there any limitation on the image sizes used by the SMAA pass? I'm afraid browser implementations of the canvas element might also impose limits there.

@vanruesc
Member

Is there any limitation on the image sizes used by the SMAA pass? I'm afraid browser implementations of the canvas element might also impose limits there.

The postprocessing library imposes no artificial limitation on the size of render targets. Most of the included passes create their internal render targets based on the main render size of the renderer.

The only theoretical limitation in this regard is gl.MAX_TEXTURE_SIZE which you've already mentioned. This constant "returns the maximum dimension the GPU can address" -gman. However, this doesn't mean that it's safe to create several frame buffers of this size; the amount of available physical VRAM is still the hard limit.

If you set the render size to gl.MAX_TEXTURE_SIZE and use a simple RenderPass to render a scene, you'll already generate a lot of data. Assuming that a query for gl.MAX_TEXTURE_SIZE returns 16384 for most GPUs, a single RGB render target takes 16384 x 16384 x 3 (RGB format, Unsigned Byte type: 1 byte per channel) ≈ 805.31 MB of data on the GPU. This is fine as long as you have that much VRAM available, but post-processing often involves the creation of several off-screen frame buffers to store computed per-pixel information. The SMAAPass needs two additional render targets, both of the same size as the input texture. Together, these three textures are used to render the result to a fourth render target (the screen). This amounts to roughly 3.22 GB.

How much VRAM do you have?

@vanruesc
Member

I just noticed that one of the render targets created by the SMAAPass uses the RGBA format, so we actually end up at 3 x (16384 x 16384 x 3) + (16384 x 16384 x 4) bytes ≈ 3.49 GB.
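The arithmetic (in bytes, shown as decimal GB) can be reproduced with a quick sketch; smaaVramEstimate is a hypothetical helper for illustration, not part of the library:

```javascript
// Estimate the VRAM needed by the SMAA setup at a given square render size:
// three RGB buffers (input plus the two internal SMAA targets) and one
// RGBA buffer, each with one unsigned byte per channel.
function smaaVramEstimate(size) {

	const pixels = size * size;
	const rgb = pixels * 3;  // 3 bytes per pixel
	const rgba = pixels * 4; // 4 bytes per pixel

	return 3 * rgb + rgba;   // total size in bytes

}

console.log((smaaVramEstimate(16384) / 1e9).toFixed(2)); // logs "3.49"
```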

@taseenb
Author

taseenb commented Sep 26, 2017

That makes sense. The limit is the VRAM. I'm working on a computer with only 2 GB, so it's probably fine when a single 16K x 16K render target is used (no AA), but no more.

I just tested half the maximum size, 8K x 8K (it took 84 seconds), and it worked with SMAA; the quality is very nice. I should probably force a 50% safe limit or see if there is a workaround for my use case: I don't need realtime performance anyway, so it doesn't matter if it's slow. I'll open-source something when it's more stable.
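The 50% safe limit could also be derived from the VRAM budget instead of hard-coded. A sketch based on the worst-case estimate of four RGBA buffers; safeRenderSize is a hypothetical helper, and the actual budget depends on what else occupies the GPU:

```javascript
// Largest square render size whose four off-screen buffers (worst case:
// 4 bytes per pixel each) fit into a fraction of the available VRAM.
// `maxTextureSize` caps the result at what the GPU can address.
function safeRenderSize(vramBytes, maxTextureSize, fraction = 0.5) {

	const bytesPerPixel = 4 * 4; // four RGBA buffers
	const pixels = (vramBytes * fraction) / bytesPerPixel;

	return Math.min(maxTextureSize, Math.floor(Math.sqrt(pixels)));

}

console.log(safeRenderSize(2e9, 16384)); // logs 7905, close to the 8K that worked
```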

@vanruesc
Member

Alright, good luck with your work!

Meanwhile, I found some weird inconsistencies in the SMAA WebGL port from the three.js examples.

The search texture that is currently being used is completely black and doesn't seem to be cropped properly. I ported the search texture generation script and produced the correct image.

This is how it should look (Y-flipped):

[search texture image]

Unfortunately, the area texture generation procedure is much more complex. The bad news is that the script is poorly optimised and supposedly takes minutes to generate the texture. I'll port it anyway and see if that's true. Alternatively, I could port the script to GLSL and generate the texture on the GPU. Another decent option is to store the image data as an array in a source file. Creating an image from that data would be fast and synchronous, but the data would inflate the file size just like the base64 strings.

Furthermore, the current SMAA shaders differ quite a lot from the original reference implementation. I'll attempt to revise the shaders, too.

@vanruesc
Member

Whew, porting the Python scripts took me a little longer than expected. Pesky bugs 🐛

Now to the results:

Generating the SMAA search image takes around 1 millisecond.
Generating the SMAA area image takes around 1 second.

While the area image generation doesn't take minutes to finish, it still takes too long for my taste. It wouldn't feel right to perform this complex generation step in the SMAAPass after all.

I'll include the ported scripts in the library for future reference, but they won't play an active role in the SMAAPass. The search and area images will be integrated in the form of raw image data instead of base64 encoded data strings. It's easy and fast to create a canvas from this data which can then be used as a texture.

I haven't looked at the SMAA shaders yet, but that's next on my list.

@vanruesc vanruesc self-assigned this Sep 29, 2017
@vanruesc
Member

It turns out that my plans were too optimistic.

The raw image data takes up a lot of space: ~310 KB for the area image and ~3.5 KB for the search image. Storing the data as base64-encoded strings results in ~65 KB and ~0.4 KB. That's much better, but decoding this data is not easy because it's also PNG-compressed. The best way to decode the pixel data is to create an Image and load the data asynchronously, which would bring us back to square one.

Our options are as follows:

  1. Include the full, uncompressed raw image data in the source.
  2. Generate the images in the SMAAPass synchronously despite the computational complexity.
  3. Expose the base64-encoded, PNG-compressed images to the user and let them take care of loading the images beforehand. The loaded images would then be passed to the constructor of the SMAAPass.

The first option considerably inflates the bundle size of every project that depends on this library while the second option turns the SMAAPass into a clunky pass that blocks the main thread.

The last option doesn't try to hide the image loading and allows the user to consider the images during a dedicated loading process (LoadingManager from three.js). It grants the most freedom and seems like the best way forward.

If there are no objections or alternative ideas, I'll just go ahead and implement that last option.

@taseenb
Author

taseenb commented Sep 30, 2017

Sounds totally reasonable. After all, async operations are normal in web applications, and this solution gives the user a lot of flexibility. It should be documented, though :)

@vanruesc
Member

vanruesc commented Oct 3, 2017

Fixed in postprocessing@3.0.0.

Some notes:

I messed around with the SMAA shaders and tried some things out, but with little success. The shaders are actually missing some key features such as advanced diagonal pattern detection and corner rounding. Porting these capabilities to WebGL requires a more in-depth understanding of the algorithms. Unfortunately, I don't have the time to work on this, so I'll leave it as is for now.

I replaced the previous area image, as it was not the same as the one found in the official SMAA repository. This should have a positive, though barely noticeable, effect.

The search image was actually fine. I thought it was completely black, but the color values are simply not scaled up, so it appears almost black.
