Color and opacity stop being assigned properly after 4K points: the occasional point beyond this gets the right color, but most get a seemingly random constant. Note that this problem doesn't happen if you remove opacity; see Coloring for markers using scattergl isn't working properly #1909.
I asked for 20K points, but the clustering algorithm removes far too many of them.
The 4096-color limit takes a separate effort. We cannot easily increase the palette texture size: that affects performance linearly, and for, say, 4096×4096 colors the initial load would take an extra ~150ms or more (if not 500ms). As for color quantization, it needs a separate package for efficiently decomposing a Voronoi diagram, as described here. Essentially, for every new data point with an unknown color we have to find the nearest cluster, and doing that search for 10k points or more is inefficient.
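To make the cost concrete, here is a minimal sketch of the brute-force nearest-palette-color lookup that quantization implies. The names (`nearestPaletteIndex`, `quantize`) are illustrative, not plotly.js internals; for N points against a K-entry palette this is O(N·K), which is exactly why a Voronoi/spatial index becomes necessary around 10k points.

```typescript
type RGB = [number, number, number];

// Brute-force nearest-neighbor search over the palette (O(K) per point).
function nearestPaletteIndex(color: RGB, palette: RGB[]): number {
  let best = 0;
  let bestDist = Infinity;
  for (let i = 0; i < palette.length; i++) {
    const [r, g, b] = palette[i];
    const d =
      (color[0] - r) ** 2 + (color[1] - g) ** 2 + (color[2] - b) ** 2;
    if (d < bestDist) {
      bestDist = d;
      best = i;
    }
  }
  return best;
}

// Quantize every point's color to a palette index: O(N*K) total.
function quantize(colors: RGB[], palette: RGB[]): number[] {
  return colors.map((c) => nearestPaletteIndex(c, palette));
}
```

For example, `quantize([[250, 10, 10], [5, 250, 0]], [[255, 0, 0], [0, 255, 0]])` maps each point to its closest palette entry; a Voronoi decomposition of the palette would let each lookup skip most of the inner loop.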
There are solutions to that, such as storing per-point colors on the GPU, but that blows up GPU memory use, since we would no longer reuse colors from the palette and would allocate an extra 64 bits per point (an extra 64Mb of memory for 1M points). It is also possible to store opacity values in a separate buffer, which would add an extra 8Mb per 1M points.
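As a back-of-envelope check of those figures (reading "Mb" as megabits), the separate opacity buffer works out to one byte per point:

```typescript
// Illustrative sketch of the per-point opacity buffer, not plotly.js code.
const POINTS = 1_000_000;
const opacity = new Uint8Array(POINTS);          // 1 byte (8 bits) per point
opacity.fill(255);                               // fully opaque by default
const megabits = (opacity.byteLength * 8) / 1e6; // 8 Mb per 1M points
```

The per-point color buffer scales the same way: at 64 bits per point it is eight times larger, i.e. 64 Mb for the same million points.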
Does it make sense to postpone the 4096-colors issue for a bit, @alexcjohnson?
See https://codepen.io/alexcjohnson/pen/OQOMmp - two issues are apparent here:
cc @dfcreative @etpinard