[Bug] Fetching an item while writing the item into Cache #294
I didn’t think the item was committed to the SQLite database until after the value was written. Is it overwriting an existing key? Otherwise, I would probably go with (2). Perhaps there is a way to prevent a duplicate write in that scenario by testing a read.
That is correct! It is written first. We see this error quite infrequently, but it still occurs. I wonder if an overwrite and a read could cause it, since the file would be remade? Any thoughts?
Is it a big value? I wonder if there’s a period where the serialized value is basically in an inconsistent state, like half of it is written when the read occurs. So if loading the value fails in any way, then the item should be treated as though it were not present in the cache. Another thought is to write the value to a different temporary file and then rename the file into the correct place. On Linux there are ways to guarantee the rename is atomic, in which case I think the cache would always be in a consistent state. Probably should use https://docs.python.org/3/library/os.html#os.replace
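The write-to-temp-then-rename idea above could be sketched roughly like this (a minimal illustration, not diskcache's actual code; `atomic_pickle_write` is a hypothetical helper name):

```python
import os
import pickle
import tempfile


def atomic_pickle_write(value, dest_path):
    # Write to a temporary file in the SAME directory as the destination
    # (os.replace is only atomic within one filesystem), then rename it
    # into place. Readers either see the old complete file or the new
    # complete file, never a half-written or empty one.
    dir_name = os.path.dirname(os.path.abspath(dest_path))
    fd, tmp_path = tempfile.mkstemp(dir=dir_name)
    try:
        with os.fdopen(fd, "wb") as f:
            pickle.dump(value, f)
            f.flush()
            os.fsync(f.fileno())  # ensure bytes reach disk before the rename
        os.replace(tmp_path, dest_path)  # atomic; also overwrites on Windows
    except BaseException:
        os.unlink(tmp_path)  # clean up the partial temp file on failure
        raise
```

The key detail is that `os.replace` (unlike `os.rename` on Windows) atomically replaces an existing destination, so overwriting an existing key never leaves a window where the file is missing or truncated.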
The strange part is that the error means the file is empty. I will try to see whether I can reproduce this.
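For reference, an empty file is consistent with the reported pickle error: unpickling a zero-byte stream raises `EOFError` ("Ran out of input"), which is what a reader would hit if it opened the cache file after creation but before any bytes were written.

```python
import io
import pickle

# Simulate reading a cache file that exists but is still empty.
try:
    pickle.load(io.BytesIO(b""))
except EOFError as exc:
    print(type(exc).__name__)  # EOFError
```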
Hi DiskCache Team,
We have recently run into an issue where two threads attempt to write and read the same item from the cache, which results in a pickle error because the on-disk cache file is empty.
Example Error
After some deep diving, I believe it is caused by the following:
I would be happy to submit a PR for the above error, but would like some guidance on a preferred approach:
Any other suggestions welcome