
Checking for max batch size ignores compression #4161

Open
sehz opened this issue Aug 31, 2024 · 1 comment

Comments

@sehz
Contributor

sehz commented Aug 31, 2024

Steps to reproduce:

  1. create a topic: fluvio topic create test
  2. try to send a large JSON payload (greater than 128 KB) that compresses down to about 4 KB:
    fluvio produce test --file large.json --raw --linger 0ms --compression zstd

This results in an error:

the given record is larger than the buffer max_size (16384 bytes). Try increasing the producer batch size or reducing the record size enabling a compression algorithm

even though the compressed record is only about 4 KB.
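
A minimal sketch of the expected behavior, per the issue title: the size guard should measure the bytes that will actually be written into the batch, not the raw record. All names here (`Compression`, `compress`, `fits_in_batch`) are illustrative stand-ins, not Fluvio's actual producer internals, and the compressor is a placeholder:

```rust
#[derive(Clone, Copy)]
enum Compression {
    None,
    Zstd,
}

// Stand-in for the real codec; a real implementation would call into
// the configured compression library (e.g. zstd).
fn compress(_codec: Compression, bytes: &[u8]) -> Vec<u8> {
    bytes.to_vec() // placeholder: performs no actual compression
}

/// Judge the record by its effective (post-compression) size.
/// The reported bug is that the check uses the raw record size,
/// so a 128 KB payload is rejected against the 16 KB limit even
/// though zstd would shrink it to ~4 KB.
fn fits_in_batch(record: &[u8], codec: Compression, max_batch_size: usize) -> bool {
    let effective_len = match codec {
        Compression::None => record.len(),
        codec => compress(codec, record).len(),
    };
    effective_len <= max_batch_size
}

fn main() {
    let record = vec![0u8; 128 * 1024];
    // The raw size fails against a 16 KB limit...
    assert!(!fits_in_batch(&record, Compression::None, 16_384));
    // ...but a highly compressible record should be judged by its
    // compressed size (with a real codec this could return true).
    let _would_fit = fits_in_batch(&record, Compression::Zstd, 16_384);
}
```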

@digikata
Contributor

digikata commented Sep 3, 2024

Would it be good to also modify the batching flow logic a little while correcting this bug?

If adding the compressed record would exceed the batch size, send the accumulated batch (without the record); then, if the compressed record on its own is still larger than the batch size, send it individually; otherwise start a new batch with it. A sketch of this flow follows below.
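
A minimal sketch of that flow in Rust; all names (`Batch`, `accumulate`, `send`) are illustrative, not Fluvio's actual producer code:

```rust
struct Batch {
    records: Vec<Vec<u8>>,
    len: usize,
}

impl Batch {
    fn new() -> Self {
        Batch { records: Vec::new(), len: 0 }
    }
    fn push(&mut self, rec: Vec<u8>) {
        self.len += rec.len();
        self.records.push(rec);
    }
    fn is_empty(&self) -> bool {
        self.records.is_empty()
    }
}

/// Accumulate an already-compressed record, flushing per the proposal:
/// 1. if it would overflow the current batch, send the batch first;
/// 2. if the record alone still exceeds the limit, send it on its own;
/// 3. otherwise start filling a (new) batch with it.
fn accumulate(
    batch: &mut Batch,
    compressed: Vec<u8>,
    max_batch_size: usize,
    send: &mut impl FnMut(Batch),
) {
    if batch.len + compressed.len() > max_batch_size && !batch.is_empty() {
        // Send the accumulated batch without the new record.
        send(std::mem::replace(batch, Batch::new()));
    }
    if compressed.len() > max_batch_size {
        // The record alone exceeds the limit: send it individually.
        let mut single = Batch::new();
        single.push(compressed);
        send(single);
    } else {
        batch.push(compressed);
    }
}

fn main() {
    let mut batch = Batch::new();
    let mut send = |b: Batch| println!("sending batch of {} record(s)", b.records.len());
    accumulate(&mut batch, vec![0; 10_000], 16_384, &mut send);
    accumulate(&mut batch, vec![0; 10_000], 16_384, &mut send); // flushes the first batch
    accumulate(&mut batch, vec![0; 20_000], 16_384, &mut send); // flushes, then sends oversize alone
}
```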

Labels
None yet
Projects
Status: Blocker
Development

No branches or pull requests

2 participants