MultiProcessRedisLock locks on lock name not on encoded key #60

Open
mthipparthi opened this issue Sep 15, 2023 · 1 comment

Issue: the Redis lock is acquired on a key named by LOCK_NAME, which is the same for every request, so only one request can be processed at any given time across the whole service. A better approach would be to append the encoded idempotency key to that name before acquiring the lock, so that only requests sharing the same idempotency key contend for the lock.

https://github.com/yoyowallet/django-idempotency-key/blob/master/idempotency_key/locks/redis.py#L20
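For context, the linked line builds the lock on a single shared name, roughly like this (a simplified sketch of the current code, not a verbatim copy):

# Simplified sketch of the current MultiProcessRedisLock behaviour:
# every instance uses the same lock name from settings, so all requests
# contend for one global Redis lock regardless of their idempotency key.
self.storage_lock = self.redis_obj.lock(
    name=utils.get_lock_name(),  # constant across all requests
    # Time before the lock is forcefully released.
    timeout=utils.get_lock_time_to_live(),
    blocking_timeout=utils.get_lock_timeout(),
)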

Solution: I reckon we can add the encoded key in the following way. I can create a PR if you are happy with the solution below.


class CustomExemptIdempotencyKeyMiddleware(ExemptIdempotencyKeyMiddleware):
    def __init__(self, get_response):
        super().__init__(get_response)
        self.storage_lock_class = utils.get_lock_class()

    def reload_storage_lock(self, encoded_key):
        self.storage_lock = self.storage_lock_class(encoded_key)

    def release_lock(self):
        self.storage_lock.release()

    def acquire_lock(self):
        return self.storage_lock.acquire()

    def generate_response(self, request, encoded_key, lock=None):
        # The default storage lock is established with a common lock prefix shared among all locks.
        # To generate a lock with an encoded key, simply reload the storage lock.
        self.reload_storage_lock(encoded_key)

        if lock is None:
            lock = utils.get_lock_enable()

        if not lock:
            return self.perform_generate_response(request, encoded_key)

        # If the lock on the storage object timed out, return an
        # HTTP 423 Locked response.
        if not self.acquire_lock():
            return resource_locked(request, None)
        try:
            return self.perform_generate_response(request, encoded_key)
        finally:
            self.release_lock()


class CustomMultiProcessRedisLock(IdempotencyKeyLock):
    """
    Should be used if a lock is required across processes. Note that this class uses
    Redis in order to perform the lock.
    """

    def __init__(self, idempotency_key=None):
        location = utils.get_lock_location()
        if location is None or location == "":
            raise ValueError("Redis server location must be set in the settings file.")

        self.redis_obj = Redis.from_url(location)

        lock_name = (
            f"{utils.get_lock_name()}-{idempotency_key}"
            if idempotency_key
            else utils.get_lock_name()
        )
        self.storage_lock = self.redis_obj.lock(
            name=lock_name,
            # Time before lock is forcefully released.
            timeout=utils.get_lock_time_to_live(),
            blocking_timeout=utils.get_lock_timeout(),
        )

    def acquire(self, *args, **kwargs) -> bool:
        return self.storage_lock.acquire(blocking=False)

    def release(self):
        # In case the middleware calls release without having acquired the lock
        # (unlikely); ignore it instead of raising an error.
        with suppress(Exception):
            self.storage_lock.release()
mthipparthi changed the title from "MultiProcessRedisLock locks on label not on encoded key" to "MultiProcessRedisLock locks on lock name not on encoded key" on Sep 15, 2023

dashdanw commented Jan 3, 2024

So, in other words, this might be required if you were running this idempotency lib in a multi-process setup via uwsgi or gunicorn?
