Deduplicate entries #224
Comments
If they are contiguous indices, they should not be duplicated; that's handled by is_possible_partial. Could you give an example? |
Yeah, these are not contiguous entries. This is actually intentional: one of the pieces of feedback from "old" clipmenu was that deduplication broke the sense of time of copying. In fact we have code specifically to preserve this behaviour. |
I'm not sure what this means, but I'm assuming it means the following:
Is that correct? If yes, then it does not seem like people want duplicate entries; rather, they want the new duplicate's "timestamp" to be bumped/renewed. I haven't looked into |
It means you have entries:
You then copy
But the desired behaviour is actually:
Obviously one can make an argument in favour of either approach, but we used to do deduplication across the entire line cache and people would complain that their clips "disappeared", when in fact they just moved around after being copied again. |
Ah I see, so they do want duplicated entries.
Seems like there's no "objective" right behavior here, so would a config var to do de-duplication be acceptable? It would serve both cases: the user gets to pick the behavior they prefer. |
A config option is fine as long as it doesn't blow up the code complexity/size too much. Thanks! |
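As a sketch of what such a toggle could look like: clipmenu has historically been configured through `CM_*` environment variables, so one minimal approach is a boolean read from the environment. The variable name `CM_DEDUPE` and the helper `env_bool` below are made-up placeholders, not whatever the merged option ends up being called.

```c
#include <stdbool.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical config reader: returns `def` when the variable is unset,
 * otherwise treats "1" as true and anything else as false. */
static bool env_bool(const char *name, bool def) {
    const char *val = getenv(name);
    if (!val)
        return def;
    return strcmp(val, "1") == 0;
}
```

With this shape, deduplication stays off by default (preserving the current behaviour people complained about losing) and is opted into with `CM_DEDUPE=1`.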
I've poked a bit into this. The clip_store thing seems pretty delicate and I haven't looked through it fully, but something like the following seems to be working, in pseudo-code:

```c
// in cs_content_add
if (dupe && do_dedup) {
    _drop_ ... = cs_ref(..);
    int dup_index = find_hash(cs->snips);
    move_to_end(cs->snips, dup_index);
    return 1;
}
```

Is this okay, or will it cause some problem? |
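Fleshed out, the pseudo-code above amounts to something like the sketch below. The types and the `find_hash`/`move_to_end` helpers are stand-ins I've invented for illustration, not the real clip_store API; the only detail taken from the thread is that each snip is 256 bytes with the 8-byte content hash at the front.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical stand-in for a clip_store snip: a 64-bit content hash
 * followed by the first line of the clip, padding the entry to 256 bytes. */
struct snip {
    uint64_t hash;
    char line[248];
};

struct store {
    struct snip snips[32];
    size_t nr_snips;
};

/* Linear scan for an existing snip with the same hash; -1 if none. */
static int find_hash(const struct store *cs, uint64_t hash) {
    for (size_t i = 0; i < cs->nr_snips; i++)
        if (cs->snips[i].hash == hash)
            return (int)i;
    return -1;
}

/* Rotate the snip at `idx` to the end (the "newest" position),
 * preserving the relative order of everything after it. */
static void move_to_end(struct store *cs, int idx) {
    struct snip tmp = cs->snips[idx];
    memmove(&cs->snips[idx], &cs->snips[idx + 1],
            (cs->nr_snips - idx - 1) * sizeof(struct snip));
    cs->snips[cs->nr_snips - 1] = tmp;
}

/* Returns 1 if the entry was a duplicate and was bumped instead of
 * appended, 0 if it was genuinely new. */
static int add_or_bump(struct store *cs, uint64_t hash, const char *line) {
    int dup_index = find_hash(cs, hash);
    if (dup_index >= 0) {
        move_to_end(cs, dup_index);
        return 1;
    }
    cs->snips[cs->nr_snips].hash = hash;
    snprintf(cs->snips[cs->nr_snips].line,
             sizeof(cs->snips[0].line), "%s", line);
    cs->nr_snips++;
    return 0;
}
```

This implements the "bump the timestamp" behaviour discussed above: re-copying an existing clip moves it to the newest slot rather than adding a second entry.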
Uses a linear search to find the duplicate and a memmove to move it over to the end. This is not particularly efficient: each snip is 256 bytes while we're only interested in the first 8, so iterating over them linearly like this isn't very cache friendly, and the memmove worst case can also be around 250KiB with the default config. Closes: cdown#224
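One way to address the cache-friendliness complaint, sketched below under my own assumptions (the parallel-array layout is not what clip_store actually does): keep the 8-byte hashes in a dense array of their own, so the duplicate scan reads 8 contiguous bytes per entry and one 64-byte cache line covers 8 candidates, instead of striding through 256-byte snips.

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical default; 1000 snips x 256 bytes is roughly the ~250KiB
 * worst-case memmove mentioned in the commit message. */
#define MAX_CLIPS 1000

/* Hash index kept parallel to the snip array: hashes[i] belongs to
 * snip i, and both arrays are moved in lockstep on a dedup bump. */
struct hash_index {
    uint64_t hashes[MAX_CLIPS];
    size_t nr;
};

/* Dense linear scan over 8-byte entries; returns the duplicate's
 * index, or -1 if the content is new. */
static int find_dup(const struct hash_index *idx, uint64_t hash) {
    for (size_t i = 0; i < idx->nr; i++)
        if (idx->hashes[i] == hash)
            return (int)i;
    return -1;
}
```

The trade-off is bookkeeping: every insertion, deletion, and move has to update two arrays instead of one, which may not be worth it for a list capped at a few hundred entries.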
Currently, duplicate snips get multiple entries, clogging up the list of clips. This is especially problematic when you're using a small number of max clips (e.g. 32). It'd be nice to prevent duplicate entries from being registered.