Implement lazy decoding #9

Open

neilisaac opened this issue Dec 30, 2019 · 1 comment

Comments

neilisaac (Contributor) commented Dec 30, 2019

Dequeue has variable latency: when it advances firstSegment, every element of the next file is decoded synchronously. This holds the mutex for an extended period, blocking Enqueue operations, and may delay the consumer unnecessarily.

Instead of Peek() (interface{}, error) and Dequeue() (interface{}, error) we could have:

```go
Peek(interface{}) error
Dequeue(interface{}) error
```

This emulates the API of json.Decoder.Decode, removing the need to provide an object builder.
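For illustration, the full interface might then look something like this (the Queue name and comments are just a sketch, not a concrete proposal):

```go
package queue

// Queue sketches the proposed pointer-based API.
type Queue interface {
	// Enqueue encodes value and appends it to the queue.
	Enqueue(value interface{}) error
	// Peek decodes the head element into out without removing it.
	Peek(out interface{}) error
	// Dequeue decodes the head element into out and removes it.
	Dequeue(out interface{}) error
}
```

Call sites would pass a pointer, e.g. var item MyType; err := q.Dequeue(&item), exactly as with json.Decoder.Decode.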

This would allow storing a raw []byte per element rather than a decoded object. Depending on the application, it may also reduce memory, since gob's encoded form can be more compact than the decoded value.
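For concreteness, a sketch of that layout (segment and its methods are hypothetical names, not the library's actual internals). One caveat: each element has to be encoded as its own self-contained gob stream, since a single shared gob stream cannot be decoded starting from an arbitrary element:

```go
package queue

import (
	"bytes"
	"encoding/gob"
	"errors"
)

// segment holds raw encoded payloads and defers decoding until the
// consumer asks for an element.
type segment struct {
	elements [][]byte // one self-contained gob stream per element
}

// enqueue encodes v with a fresh Encoder so the element carries its own
// gob type description and can be decoded independently later.
func (s *segment) enqueue(v interface{}) error {
	var buf bytes.Buffer
	if err := gob.NewEncoder(&buf).Encode(v); err != nil {
		return err
	}
	s.elements = append(s.elements, buf.Bytes())
	return nil
}

// dequeue lazily decodes the head element into out, then drops it.
func (s *segment) dequeue(out interface{}) error {
	if len(s.elements) == 0 {
		return errors.New("segment is empty")
	}
	if err := gob.NewDecoder(bytes.NewReader(s.elements[0])).Decode(out); err != nil {
		return err
	}
	s.elements = s.elements[1:]
	return nil
}
```

The trade-off is that a per-element stream repeats gob's type description, which adds overhead for small elements.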

A further optimization to consider is seeking within the file, rather than loading the whole file into memory.
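As a rough sketch of seek-based reads, assuming a hypothetical on-disk format of a little-endian uint32 length prefix followed by one self-contained gob stream per element (which may not match the current segment format):

```go
package queue

import (
	"bytes"
	"encoding/binary"
	"encoding/gob"
	"io"
	"os"
)

// readElementAt decodes one element from f at the given offset and
// returns the offset of the next element, so only a single element is
// ever resident in memory.
func readElementAt(f *os.File, offset int64, out interface{}) (int64, error) {
	if _, err := f.Seek(offset, io.SeekStart); err != nil {
		return 0, err
	}
	var n uint32
	if err := binary.Read(f, binary.LittleEndian, &n); err != nil {
		return 0, err
	}
	buf := make([]byte, n)
	if _, err := io.ReadFull(f, buf); err != nil {
		return 0, err
	}
	if err := gob.NewDecoder(bytes.NewReader(buf)).Decode(out); err != nil {
		return 0, err
	}
	return offset + 4 + int64(n), nil
}
```

Dequeue would then only need to advance a head offset, bounding memory by the largest single element rather than the segment size.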

This is not critical for my current application, but is worth discussing/considering.

neilisaac (Contributor, Author) commented:

On second thought, reducing memory use is important for one of the applications I have in mind (storing media payloads in the queue), where holding a whole segment in memory is undesirable. Using a small max segment size could work around this, but is not ideal.

The interface suggestion is not strictly related to lazy decoding.
