Large file error #19
Hi,

First of all, thank you for this excellent open source codec.

I am using LZHAM to compress a rather large buffer (6140582460 bytes). I compress it in chunks of at most UINT32_MAX (= 4294967295) bytes, but it fails on one particular buffer.

Is UINT32_MAX an allowed size for lzham_compress_memory(), or is it too large?
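For context, my chunking loop is roughly the following sketch (simplified, not my exact code: it assumes the single-call lzham_compress_memory() API declared in lzham.h, the parameter setup and output-buffer sizing are illustrative, and error handling is abbreviated):

```cpp
#include <algorithm>
#include <cstdint>
#include <cstring>
#include <utility>
#include <vector>

#include "lzham.h"

// Compress a buffer larger than 4 GB in chunks of at most UINT32_MAX bytes.
static bool compress_big_buffer(const uint8_t* src, uint64_t total_size,
                                std::vector<std::vector<uint8_t> >& out_chunks)
{
   lzham_compress_params params;
   std::memset(&params, 0, sizeof(params));
   params.m_struct_size = sizeof(params);
   params.m_dict_size_log2 = 26;               // 64 MB dictionary
   params.m_level = LZHAM_COMP_LEVEL_DEFAULT;

   uint64_t offset = 0;
   while (offset < total_size)
   {
      const size_t chunk_size =
         (size_t)std::min<uint64_t>(UINT32_MAX, total_size - offset);

      // Rough over-allocation for incompressible chunks; LZHAM does not
      // publish a worst-case output bound for this call.
      std::vector<uint8_t> dst(chunk_size + chunk_size / 16 + 65536);
      size_t dst_len = dst.size();
      lzham_uint32 adler32 = 0;

      lzham_compress_status_t status = lzham_compress_memory(
         &params, dst.data(), &dst_len, src + offset, chunk_size, &adler32);
      if (status != LZHAM_COMP_STATUS_SUCCESS)
         return false;                         // this is where it fails for me

      dst.resize(dst_len);
      out_chunks.push_back(std::move(dst));
      offset += chunk_size;
   }
   return true;
}
```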
In Release I get a core dump. When compiled in Debug mode, the backtrace shows a vector of capacity 2147487728 whose size must be increased to at least 2147603813, but next_pow2(2147603813) = 4294967296 > UINT32_MAX = 4294967295.

Is it a bug, or should I use smaller chunks?

This is on Linux x86_64 (Fedora 22, gcc 5.3.1), on an Intel Xeon CPU E5-2695 v3.

Thank you for your help.

Boris.
Hi Boris - I'll check into this ASAP. I'm guessing it's a bug in the vector capacity code.

Hi,

However, I am wondering if elemental_vector should be changed to better handle large sizes. I think there should be a check before the assert (when grow_hint == true).
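The growth logic there currently looks roughly like this (paraphrased from elemental_vector::increase_capacity(); the exact code may differ between versions):

```cpp
// Inside elemental_vector::increase_capacity(), paraphrased:
uint new_capacity = min_new_capacity;
if ((grow_hint) && (!math::is_power_of_2(new_capacity)))
   new_capacity = math::next_pow2(new_capacity);  // 32-bit: wraps to 0 for inputs > 2^31

// This is the assert that fires in Debug: next_pow2(2147603813) does not
// fit in 32 bits. In Release, the bogus capacity leads to the core dump.
LZHAM_ASSERT(new_capacity && (new_capacity > m_capacity));
```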
The code could be changed to something like this:
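For instance (a sketch of the idea; I assume a 64-bit next_pow2() overload is available, otherwise the power of two can be computed inline):

```cpp
// Sketch: compute the power of two in 64 bits and clamp instead of wrapping.
uint new_capacity = min_new_capacity;
if ((grow_hint) && (!math::is_power_of_2(new_capacity)))
{
   uint64 next = math::next_pow2((uint64)new_capacity);  // assumed 64-bit overload
   new_capacity = (uint)((next > UINT32_MAX) ? UINT32_MAX : next);
}

LZHAM_ASSERT(new_capacity && (new_capacity > m_capacity));
```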
Here, if grow_hint is set to true and the next power of two is greater than UINT32_MAX, we set new_capacity to UINT32_MAX (so not a power of two...). I have not tried to compress my buffer with this modification yet. I will ASAP.

Actually, setting new_capacity to UINT32_MAX does not work. I don't know how much bigger actual_size can be than the requested size (it seems very implementation-dependent, according to the malloc_usable_size man page). On my platform, if I instead cap new_capacity at UINT32_MAX - 4096, I am able to compress my buffer. So my code for elemental_vector looks like this:
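(Again as a sketch rather than a verbatim patch:)

```cpp
// Sketch of the version that works on my platform: clamp with headroom,
// because the allocator's usable size (actual_size) can exceed the request.
const uint64 cMaxCapacity = (uint64)UINT32_MAX - 4096;  // 4096 is empirical

uint new_capacity = min_new_capacity;
if ((grow_hint) && (!math::is_power_of_2(new_capacity)))
{
   uint64 next = math::next_pow2((uint64)new_capacity);
   new_capacity = (uint)((next > cMaxCapacity) ? cMaxCapacity : next);
}
```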
Besides, I think the two following asserts could be added at the end of elemental_vector::increase_capacity(), just before the return true:
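For example (a sketch; these are the postconditions I have in mind):

```cpp
// Sanity checks on the postconditions before returning:
LZHAM_ASSERT(m_capacity >= m_size);
LZHAM_ASSERT(m_capacity >= min_new_capacity);

return true;
```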