Compressed array becomes bigger #11
I forgot a 4th workaround: DO NOTHING 👅
In case this is useful to you, I have now made you a collaborator on my version of the project. This means that you can work directly on the main code without going through a pull request. (I trust you.) Let us review this problem. First of all, you can use a fairly large output buffer... one that is larger than the input buffer. Say the input buffer contains N integers; you can just make sure that the output buffer contains at least 1.1 * N + 1024 integers, and you are good with high probability. When the output buffer is too small, an exception is thrown. One can catch this exception, realize that the output buffer was too small, extend the buffer, and try again. Yes, it is ugly... but in a given application, this can be made very unlikely. How can we make this less ugly?
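For concreteness, here is a minimal sketch of that catch-and-retry idea, assuming the `IntegerCODEC` array interface and assuming that an undersized output buffer manifests as an `ArrayIndexOutOfBoundsException` (the actual exception type may differ):

```java
import java.util.Arrays;
import me.lemire.integercompression.IntWrapper;
import me.lemire.integercompression.IntegerCODEC;

public class RetryCompress {
    // Start with the 1.1 * N + 1024 heuristic; double the buffer and retry on overflow.
    static int[] compressWithRetry(IntegerCODEC codec, int[] input) {
        int capacity = (int) (1.1 * input.length) + 1024;
        while (true) {
            int[] output = new int[capacity];
            IntWrapper inPos = new IntWrapper(0);
            IntWrapper outPos = new IntWrapper(0);
            try {
                codec.compress(input, inPos, input.length, output, outPos);
                return Arrays.copyOf(output, outPos.get()); // trim to the used length
            } catch (ArrayIndexOutOfBoundsException e) {
                // Assumed overflow signal: grow the buffer and try again.
                capacity *= 2;
            }
        }
    }
}
```

With the 1.1 * N + 1024 starting size, the retry path should almost never be taken, which is what would make the ugliness tolerable in practice.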
To minimize overhead, you might want an API that allows you to read and write data in blocks. For example, you might want something like ReadableIntView readBlock(int number) and WritableIntView writeBlock(int number). (This is just a vague idea.) The goal would be that the overhead of using a stream interface is tiny compared to the array-based interface. Getting this right might require non-trivial engineering, but I think it is doable.
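One possible shape for such a block-oriented API. Only the two signatures quoted above come from the comment; every other name here is hypothetical:

```java
// Hypothetical read-only view over a block of integers.
interface ReadableIntView {
    int length();        // integers actually available in this block
    int get(int index);  // read the integer at `index` within the block
}

// Hypothetical writable view over a block of integers.
interface WritableIntView {
    int capacity();                  // integers this block can hold
    void set(int index, int value);  // write within the block
}

// Hypothetical stream that hands out views instead of copying arrays.
interface IntBlockStream {
    ReadableIntView readBlock(int number);   // view over the next `number` integers
    WritableIntView writeBlock(int number);  // view with room for `number` integers
}
```

Because the caller reads and writes through views into the stream's internal buffers, block-sized chunks could avoid per-integer call overhead, which is what would keep a stream interface close to array speed.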
I'll run some trials for this with these priorities.
We are in agreement.
I repeatedly get a compressed array larger than the original while compressing random integers. Tested with versions 0.1.5 and 0.1.6 for a number of codecs; the output size is identical in both versions.
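For reference, uniformly random 32-bit integers carry roughly 32 bits of entropy each, so no codec can shrink them and some expansion is expected. A minimal sketch of such a test, assuming the array-based `IntegerCODEC` interface (the exact API in 0.1.5/0.1.6 may differ):

```java
import java.util.Random;
import me.lemire.integercompression.FastPFOR;
import me.lemire.integercompression.IntWrapper;

public class RandomInput {
    public static void main(String[] args) {
        int n = 4096;
        int[] data = new int[n];
        Random rnd = new Random(42);
        for (int i = 0; i < n; i++) {
            data[i] = rnd.nextInt(); // essentially incompressible input
        }
        int[] out = new int[(int) (1.1 * n) + 1024];
        IntWrapper inPos = new IntWrapper(0);
        IntWrapper outPos = new IntWrapper(0);
        new FastPFOR().compress(data, inPos, n, out, outPos);
        // Expect outPos.get() >= n on random data.
        System.out.println("input: " + n + " ints, output: " + outPos.get() + " ints");
    }
}
```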
Please consult our README on what the library does.
Sorry, my bad.
There are some cases where the compressed array becomes bigger than the original.
You can see a sample of it here: https://github.com/koron/JavaFastPFOR/compare/adhoc-tests
Of course, I know this is very rare and not a target of this library, but it may cause a kind of inconvenience for users.
There are some workarounds:
IntArrayOutputStream (sketched below)
or so. Which workaround do we choose?
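A minimal sketch of what an IntArrayOutputStream-style workaround could look like, modeled on java.io.ByteArrayOutputStream. This class is hypothetical, not part of JavaFastPFOR: the codec would write into a buffer that grows on demand instead of a fixed-size array.

```java
import java.util.Arrays;

// Hypothetical growable output buffer for ints, analogous to ByteArrayOutputStream.
public class IntArrayOutputStream {
    private int[] buf = new int[64];
    private int count = 0;

    // Append one integer, growing the backing array as needed.
    public void write(int value) {
        if (count == buf.length) {
            buf = Arrays.copyOf(buf, buf.length * 2);
        }
        buf[count++] = value;
    }

    // Return a trimmed copy of everything written so far.
    public int[] toIntArray() {
        return Arrays.copyOf(buf, count);
    }
}
```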