
feat: add task to create a pending enterprise learner for an assignment #273

Merged: 1 commit merged into main from pwnage101/ENT-7539-2 on Sep 22, 2023

Conversation

@pwnage101 (Contributor) opened this pull request:

ENT-7539

@pwnage101 pwnage101 force-pushed the pwnage101/ENT-7539-2 branch 2 times, most recently from 29d3ee5 to fc7a1e2 Compare September 21, 2023 00:19
@pwnage101 pwnage101 marked this pull request as ready for review September 21, 2023 00:19
@iloveagent57 (Contributor) left a comment:

Looking good, a couple of random thoughts.

Resolved review thread (outdated) on enterprise_access/apps/content_assignments/tasks.py
Comment on lines 42 to 45
if isinstance(exc, HTTPError):
# The failure resulted from too many retries. This fact would be a useful thing to record in a "reason"
# field on the assignment if one existed.
logger.error('The task failure resulted from too many retries.')
@iloveagent57 (Contributor) commented on the snippet above:

I don't quite understand what this means - are you talking specifically about 429 HTTPErrors, or celery task retries? And could there be HTTP errors that aren't about rate limiting that cause this task to fail?

@pwnage101 (Contributor, Author) replied:

An LMS/enterprise API response with any 4xx/5xx status code will cause the celery task to retry. If execution enters on_failure, we can infer that the failure was the result of exceeding max_retries whenever exc is any of the following:

autoretry_for = (
RequestsConnectionError,
RequestsTimeoutError,
HTTPError,
OperationalError,
BrazeClientError,
)

If anything, I should probably write this like:

Suggested change

Original:
    if isinstance(exc, HTTPError):
        # The failure resulted from too many retries. This fact would be a useful thing to record in a "reason"
        # field on the assignment if one existed.
        logger.error('The task failure resulted from too many retries.')

Suggested:
    if isinstance(exc, (RequestsConnectionError, RequestsTimeoutError, HTTPError, OperationalError)):
        # The failure resulted from too many retries. This fact would be a useful thing to record in a "reason"
        # field on the assignment if one existed.
        logger.error('The task failure resulted from too many retries.')

So, as much as I'd like to refactor LoggedTaskWithRetry to be more precise about which status codes to treat as retryable, I think that's out of scope: it would either mean impacting every other consumer of LoggedTaskWithRetry, or adding configurability to it that belongs in a separate ticket/feature request.
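
(For concreteness, here is a minimal sketch of how the retry and failure handling described above could fit together. The class name, max_retries value, and log messages are illustrative assumptions, not the actual LoggedTaskWithRetry implementation in this repo.)

# Minimal sketch (not the repo's actual LoggedTaskWithRetry): a celery Task base
# class that auto-retries on transient errors and logs when retries are exhausted.
import logging

from celery import Task
from django.db import OperationalError
from requests.exceptions import ConnectionError as RequestsConnectionError
from requests.exceptions import HTTPError
from requests.exceptions import Timeout as RequestsTimeoutError

logger = logging.getLogger(__name__)

# Exceptions assumed retryable; mirrors the tuple quoted above (BrazeClientError
# is omitted here just to keep the sketch dependency-free).
RETRYABLE_EXCEPTIONS = (
    RequestsConnectionError,
    RequestsTimeoutError,
    HTTPError,
    OperationalError,
)


class RetryingTaskSketch(Task):  # hypothetical name, stands in for LoggedTaskWithRetry
    autoretry_for = RETRYABLE_EXCEPTIONS
    retry_kwargs = {'max_retries': 3}  # assumed value, for illustration only
    retry_backoff = True

    def on_failure(self, exc, task_id, args, kwargs, einfo):
        # Celery only calls on_failure once the task will no longer be retried,
        # i.e. after autoretry_for has exhausted max_retries or when the
        # exception is not retryable at all. So membership in
        # RETRYABLE_EXCEPTIONS implies the retries ran out.
        if isinstance(exc, RETRYABLE_EXCEPTIONS):
            logger.error('Task %s failed after exhausting retries: %s', task_id, exc)
        else:
            logger.error('Task %s failed with a non-retryable error: %s', task_id, exc)

Concrete tasks would then opt in via @shared_task(base=RetryingTaskSketch, bind=True) and inherit the auto-retry behavior.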

Resolved review thread (outdated) on enterprise_access/apps/api_client/lms_client.py
@pwnage101 pwnage101 force-pushed the pwnage101/ENT-7539-2 branch 4 times, most recently from 7b3b401 to 103d206 Compare September 22, 2023 02:29
@pwnage101 (Contributor, Author) commented:

I searched the web again and discovered that there's actually a sanctioned way to unit test celery workers using a celery-provided fixture that spins up a mini celery worker alongside the tests (see conftest.py). This made it way easier to test the celery task retry logic.
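
(For readers unfamiliar with that fixture, here is a minimal conftest.py sketch using celery's bundled pytest plugin. The broker/backend values are assumptions for illustration, not necessarily what this repo's conftest.py configures.)

# Minimal conftest.py sketch (assumed, not this repo's actual file) that enables
# celery's own pytest plugin, which provides the celery_app and celery_worker fixtures.
import pytest

pytest_plugins = ('celery.contrib.pytest',)


@pytest.fixture(scope='session')
def celery_config():
    # In-memory broker and result backend so the embedded test worker needs no
    # external services.
    return {
        'broker_url': 'memory://',
        'result_backend': 'cache+memory://',
    }

A test that needs a live worker then simply requests the celery_worker fixture and calls the task with .delay(...), which exercises the real retry path instead of a mocked apply().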

@pwnage101 pwnage101 force-pushed the pwnage101/ENT-7539-2 branch 2 times, most recently from 26329d9 to 22313db Compare September 22, 2023 02:58
@iloveagent57 (Contributor) left a comment:

This is great, great find on that pytest celery thing. One small request to shore up a unit test.

@pwnage101 pwnage101 merged commit 04f88de into main Sep 22, 2023
@pwnage101 pwnage101 deleted the pwnage101/ENT-7539-2 branch September 22, 2023 18:58