Benchmark only compares against mapbox releases #76
This addresses two related bugs:

1. An unevaluated mapbox expression object was used in a condition (`if (expression)`) to decide whether sorting was necessary; the assumption was probably that the default constant value of `undefined` would skip sorting. However, an unevaluated mapbox expression is not the value `undefined` but an expression object describing the value and its type, so even if the expression were a constant `undefined`, the check would never be true. This leads to a lot of unnecessary sorting. The check was already correct for the drawing sort in circles and symbols, which is where the `sortFeaturesByKey` variable name originates.
2. Even where the check was correct, it would skip sorting only when the sort key was a constant with the value `undefined`. However, the sort key is also irrelevant if it is *any* constant, not just `undefined` (see the sketch below).

Note that the second optimization would not be possible if cross-layer sorting were implemented (see mapbox/mapbox-gl-js#1349). I'm unable to provide good performance benchmarks because of #76.
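Below is a minimal sketch of the buggy and fixed checks, assuming an expression object with a `kind` discriminant along the lines of the gl-js style spec; the names and shapes here are illustrative, not the actual bucket code:

```js
// Illustrative sketch only; the expression shape (`kind`) is an assumption
// modeled on the gl-js style spec, not the actual maplibre-gl-js bucket code.

// Bug 1: an expression object is always truthy, so this check never skips
// sorting, even when the expression describes a constant `undefined`.
function needsSortingBuggy(sortKeyExpression) {
    return Boolean(sortKeyExpression);
}

// Fixed check, including the second optimization: any constant sort key
// yields the same value for every feature, so sorting can be skipped for
// all constants, not only a constant `undefined`.
function needsSortingFixed(sortKeyExpression) {
    return sortKeyExpression !== undefined &&
        sortKeyExpression.kind !== 'constant';
}
```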
I'd like to push this forward. As far as I can gather, we just need the benchmarks_generated.js for each release somewhere. Would it be possible to add this file as an additional asset to the releases on GitHub? Even better if we could do this retroactively for the releases already made. The only downside I see is that this asset is a little bit confusing for normal users who don't care about benchmarks or development in general.
Can you elaborate on what's needed exactly? I'd love to help push this forward as well.
The command
Can you paste the content of this file here?
Thinking about it, we probably can't/shouldn't do it retroactively, as the generated file will have mapbox-specific stuff hardcoded (I'll try to test that). But why doesn't it make sense to add it to the releases going forward? Just to clarify: it would of course not be the same file for every release, but the one built from that specific release. The benchmarks historically were run against the local version and the last official release, but you could also compare against older releases via an environment variable.
@xabbu42 let's focus for now on making it work against the latest maplibre release. Later on we can see how to improve it. Sounds fair enough?
To work against the latest maplibre release we need the benchmarks_generated.js for that release. That is how it currently works. Of course I will first work on fixing all benchmarks to run against the local checkout, if that is what you meant, but imho it does not hurt to already discuss a solution for this issue, as we will need one anyway. Do you have an objection to adding benchmarks_generated.js as a separate asset to the releases, and/or a better alternative?
I'm missing some basic things here: the latest version is compiled and tested locally, and the last release can be fetched from a CDN and tested locally too. I'm not sure I understand why there's a need to add another file, and this is what I want to understand.
I don't know why it is the way it is; perhaps mapbox wanted the benchmark code together with the releases so they could benchmark older releases after breaking changes? The generated file is a minified module bundle generated from https://github.com/maplibre/maplibre-gl-js/blob/main/bench/rollup_config_benchmarks.js. It is no longer human readable and I doubt you'll get much out of it.
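For orientation, a bundle like benchmarks_generated.js comes out of a rollup config of roughly this shape; this is a minimal sketch with assumed entry and output paths, not the actual rollup_config_benchmarks.js:

```js
// Minimal sketch; the input/output paths and plugin choice are assumptions.
import {terser} from 'rollup-plugin-terser';

export default {
    input: 'bench/benchmarks.js', // entry point that registers all benchmarks
    output: {
        file: 'bench/benchmarks_generated.js', // the bundle discussed above
        format: 'iife',
        name: 'Benchmarks'
    },
    plugins: [terser()] // minification is why the output is unreadable
};
```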
In addition, we should get the benchmarks running in CI on every pull request. I omitted 2 things from the circleci config when setting up the initial CI: render tests, and benchmarks (because I thought they both needed a remote datasource which wasn't set up then, but is now with the maptiler docs key). @HarelM and @wipfli are getting the render tests working in #326, but we should also track getting the performance stats running on every PR. Should we use this issue to track that or should I open a new one? This was the initial .circleci config that's no longer running now:
Glad you're interested in working on some performance improvements @JannikGM! When I have more time to get back to some work on maplibre I'm also interested in looking into some performance improvements that jumped out while I was writing up the life-of-a-tile docs. |
I think this issue is good for this. |
This is fixed with #391. |
For development, I need to be able to benchmark against maplibre releases (see https://github.com/maplibre/maplibre-gl-js/tree/main/bench for how mapbox / maplibre benchmarking works). However, as the reference, the maplibre benchmarks still fetch mapbox releases:
maplibre-gl-js/bench/versions/index.html, lines 21 to 29 at e5e320f
We can't simply replace this with maplibre specifics, because maplibre does not have any GitHub releases yet:
https://api.github.com/repos/maplibre/maplibre-gl-js/releases/latest returns 404 (at the time of writing).
Even if we explicitly specify versions, there's also no bucket / hosting for the maplibre-gl-js benchmarks.js: https://s3.amazonaws.com/maplibre-gl-js/main/benchmarks.js
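To make the 404 concrete, this is the kind of lookup the versions page would need once maplibre publishes GitHub releases; the helper function is hypothetical, but the endpoint is GitHub's real releases API:

```js
// Hypothetical helper; the endpoint is GitHub's real releases API, which
// returns 404 while a repository has no published releases.
async function latestReleaseTag(repo) {
    const res = await fetch(`https://api.github.com/repos/${repo}/releases/latest`);
    if (!res.ok) {
        throw new Error(`no releases for ${repo} (HTTP ${res.status})`);
    }
    const release = await res.json();
    return release.tag_name; // e.g. "v1.13.0"
}

// At the time of writing this works for mapbox/mapbox-gl-js but rejects
// with a 404 for maplibre/maplibre-gl-js.
latestReleaseTag('maplibre/maplibre-gl-js').catch(console.error);
```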
A minor related annoyance: It also still uses mapbox.com for the CSS of the benchmark results page:
maplibre-gl-js/bench/versions/index.html, line 9 at e5e320f
I have a number of potential performance changes and would like to submit them to maplibre, but I won't be able to test their usefulness if there's no good way to benchmark against the `main` branch without convoluted hacks. When porting my mapbox 1.12.0 changes to maplibre, I noticed that there have already been changes to hot code paths in maplibre or mapbox 1.13.0, so having recent benchmark releases is very much required.