Benchmark json.NewEncoder vs json.Marshal #409
Can't say, I haven't done any benchmarks. I'm just deferring to the stdlib, though I strongly suspect it has a positive impact on performance, as most websocket messages tend to be similarly sized and there are tons of them. Feel free to open a PR against dev with benchmarks comparing the two. I'm not going to get to this for a while.
json.Encoder is 42% faster than json.Marshal thanks to the memory reuse.

```
goos: linux
goarch: amd64
pkg: nhooyr.io/websocket/wsjson
cpu: 12th Gen Intel(R) Core(TM) i5-1235U
BenchmarkJSON/json.Encoder-12    3517579    340.2 ns/op     24 B/op    1 allocs/op
BenchmarkJSON/json.Marshal-12    2374086    484.3 ns/op    728 B/op    2 allocs/op
```

Closes coder#409
Done in 293f204
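The memory-reuse effect is easy to reproduce with `testing.AllocsPerRun`. Here is a minimal stdlib-only sketch (the payload shape and sizes are my own illustrative assumptions, not the benchmark from 293f204): `json.Marshal` must return a freshly allocated `[]byte` on every call, while a `json.Encoder` writing into a reused `bytes.Buffer` amortizes the buffer allocation away.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"strings"
	"testing"
)

type payload struct {
	Data string `json:"data"`
}

// allocsPerOp compares allocations per encode for json.Marshal vs a
// json.Encoder writing into a reused bytes.Buffer, at a given payload size.
func allocsPerOp(size int) (marshal, encoder float64) {
	msg := payload{Data: strings.Repeat("a", size)}

	// json.Marshal returns a freshly allocated []byte every call.
	marshal = testing.AllocsPerRun(1000, func() {
		_, _ = json.Marshal(msg)
	})

	// The encoder reuses the buffer's backing array once it has grown
	// large enough; that reuse is where the savings come from.
	var buf bytes.Buffer
	enc := json.NewEncoder(&buf)
	encoder = testing.AllocsPerRun(1000, func() {
		buf.Reset()
		_ = enc.Encode(msg)
	})
	return marshal, encoder
}

func main() {
	for _, size := range []int{128, 1024, 16 << 10} {
		m, e := allocsPerOp(size)
		fmt.Printf("size=%d marshal=%v allocs/op encoder=%v allocs/op\n", size, m, e)
	}
}
```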
That was with 128-byte messages, which I think is realistic enough but not too big.
I extended the benchmark for more sizes and
Very interesting! Even greater margin than I would have anticipated. I would explore other libraries too: https://github.com/json-iterator/go-benchmark. Go's native encoding/json is known to be slower than most. That said, for this project zero dependencies is far more appealing, but perhaps a means of passing in a JSON encoder, for those who want the feature, could be nice.
Yeah, I was surprised too. You can just use c.Writer or c.Write with whichever JSON encoder you want.
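That escape hatch can be sketched against a plain `io.WriteCloser` so it runs standalone. In the real library you would pass the writer returned by `c.Writer` (assuming the `Conn.Writer(ctx, typ)` shape from nhooyr.io/websocket); `bufCloser` here is just a stand-in so the sketch runs without a live connection, and swapping `json.NewEncoder` for another streaming encoder is a one-line change.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
)

// writeJSON streams v through any io.WriteCloser, e.g. the message
// writer returned by Conn.Writer in nhooyr.io/websocket.
func writeJSON(w io.WriteCloser, v interface{}) error {
	if err := json.NewEncoder(w).Encode(v); err != nil {
		w.Close() // best effort; report the encode error, not the close error
		return err
	}
	return w.Close()
}

// bufCloser is a stand-in for the connection's message writer so this
// sketch runs without a websocket.
type bufCloser struct{ bytes.Buffer }

func (b *bufCloser) Close() error { return nil }

func main() {
	var b bufCloser
	if err := writeJSON(&b, map[string]int{"n": 1}); err != nil {
		panic(err)
	}
	fmt.Print(b.String()) // json.Encoder appends a trailing newline
}
```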
sync.Pool can be useful, but in places like wsjson, is there a worry about variable message sizes and their impact on memory? See golang/go#23199 for info on memory growth through sync.Pool.
The GC is much more effective than it used to be. Maybe still useful for fixed-length pools? I haven't benchmarked it though.