Stream.Flush should reuse buffer #438
Stream.Flush copies the contents of the stream's buffer to the output, but instead of resetting the buffer so its allocation is reused, it keeps appending to the end of the buffer. For large JSON objects this results in far more allocation than necessary.
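For reference, a minimal sketch of the method in question (an approximation of Stream.Flush in stream.go, not the exact source; the struct is pared down to the fields Flush touches, and the final re-slice is the marked line):

```go
package jsoniter

import "io"

// Stream is pared down here to the fields Flush uses
// (an assumption for illustration; the real struct has more fields).
type Stream struct {
	out   io.Writer
	buf   []byte
	Error error
}

// Flush writes the buffered bytes to the underlying io.Writer.
// Sketch of the pre-fix behavior: after a successful write, the
// buffer is re-sliced past the written bytes, so subsequent writes
// append at the end of the backing array instead of reusing it.
func (stream *Stream) Flush() error {
	if stream.out == nil {
		return nil
	}
	if stream.Error != nil {
		return stream.Error
	}
	n, err := stream.out.Write(stream.buf)
	if err != nil {
		if stream.Error == nil {
			stream.Error = err
		}
		return err
	}
	stream.buf = stream.buf[n:] // marked line: advances past the written bytes, so the backing array keeps growing
	return nil
}
```

A reset such as stream.buf = stream.buf[:0] would keep the backing array and its capacity while letting subsequent writes start over at the beginning, so steady-state flushing allocates nothing.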
I think the marked line should instead reset the buffer so the allocation is reused, e.g. stream.buf = stream.buf[:0].

Am I missing something?

I've opened a pull request to make this update and add a benchmark which demonstrates that allocations are made prior to the change, but not after. Please see #440.

zhenzou pushed a commit to zhenzou/jsoniter that referenced this issue on Feb 2, 2022:

Previously it would append to the end of the buffer instead of reusing the now-free space. Benchmark demonstrates the improvement, run with -benchtime=10s:

benchmark                                       old ns/op   new ns/op   delta
Benchmark_encode_string_with_SetEscapeHTML-8    447         442         -1.12%
Benchmark_jsoniter_large_file-8                 20998       21222       +1.07%
Benchmark_json_large_file-8                     39593       40187       +1.50%
Benchmark_stream_encode_big_object-8            10787       8611        -20.17%

benchmark                                       old allocs  new allocs  delta
Benchmark_encode_string_with_SetEscapeHTML-8    6           6           +0.00%
Benchmark_jsoniter_large_file-8                 78          78          +0.00%
Benchmark_json_large_file-8                     13          13          +0.00%
Benchmark_stream_encode_big_object-8            31          0           -100.00%

benchmark                                       old bytes   new bytes   delta
Benchmark_encode_string_with_SetEscapeHTML-8    760         760         +0.00%
Benchmark_jsoniter_large_file-8                 4920        4920        +0.00%
Benchmark_json_large_file-8                     6640        6640        +0.00%
Benchmark_stream_encode_big_object-8            10056       0           -100.00%

Fixes json-iterator#438
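For anyone wanting to reproduce the allocation behavior, a benchmark along these lines works (a hypothetical sketch; the name, payload, and buffer size are invented, and the actual benchmark added in #440 may differ):

```go
package jsoniter_test

import (
	"io"
	"testing"

	jsoniter "github.com/json-iterator/go"
)

// Benchmark_stream_flush_reuse repeatedly writes through a single
// Stream and flushes it each iteration. With Flush reusing the
// buffer, the steady-state loop reports zero allocations.
func Benchmark_stream_flush_reuse(b *testing.B) {
	stream := jsoniter.NewStream(jsoniter.ConfigDefault, io.Discard, 512)
	b.ReportAllocs()
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		stream.WriteString("hello, world") // appends a quoted JSON string to the buffer
		if err := stream.Flush(); err != nil {
			b.Fatal(err)
		}
	}
}
```

Run it with go test -bench Benchmark_stream_flush_reuse -benchtime=10s to match the settings used in the commit above.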