I have a check for Django storages. It goes through all configured storages and tries to write, read, and then delete a file on each of them.
For S3 storages it also makes the file public and tries to access its URL.
Here is the code:
import random
import string

import requests
from botocore.exceptions import ClientError
from django.conf import settings
from django.core.files.base import ContentFile
from django.core.files.storage import storages
from django_alive import HealthcheckFailure
from storages.backends.s3boto3 import S3Boto3Storage


def generate_random_content(size=20):
    """Generate a random string of fixed size"""
    return "".join(random.choices(string.ascii_letters + string.digits, k=size))


def make_s3_file_public(storage, name):
    """Make an S3 file public after uploading"""
    try:
        storage.bucket.Object(name).Acl().put(ACL="public-read")
    except ClientError as e:
        raise HealthcheckFailure(f"Failed to set ACL for S3 file '{name}': {e}") from e


def check_url(storage, storage_name, name, test_content):
    # Check URL only for S3Boto3Storage
    if isinstance(storage, S3Boto3Storage) and hasattr(storage, "url"):
        print(storage_name)
        file_url = storage.url(name)
        print(file_url)
        response = requests.get(file_url, timeout=5)
        print(response.content)
        print(test_content)
        if response.content != test_content:
            raise HealthcheckFailure(
                f"HTTP downloaded content does not match for S3 storage '{storage_name}'"
            )


def check_storages():
    errors = []
    for storage_name in settings.STORAGES.keys():
        test_file_name = f"storage_test_file_{storage_name}_{generate_random_content(size=5)}.txt"
        storage = storages[storage_name]
        test_content = f"{storage_name}{generate_random_content()}".encode("utf-8")
        try:
            try:
                storage.delete(test_file_name)
            except FileNotFoundError:
                pass

            # Write operation
            name = storage.save(test_file_name, ContentFile(test_content))

            # For S3 storage, make the file public
            if isinstance(storage, S3Boto3Storage):
                make_s3_file_public(
                    storage,
                    storage.location + "/" + name if storage.location else name,
                )

            # Read operation
            with storage.open(name, "rb") as file:
                content = file.read()
            if content != test_content:
                raise HealthcheckFailure(
                    f"Read content does not match written content for storage '{storage_name}'"
                )

            check_url(storage, storage_name, name, test_content)

            # Clean up: Delete the test file
            storage.delete(name)
        except Exception as e:  # noqa
            errors.append(f"Storage '{storage_name}' failed: {e}")

    if errors:
        raise HealthcheckFailure("; ".join(errors))
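In a project this would presumably be registered like any other django-alive check. A rough sketch, assuming the dict-style ALIVE_CHECKS setting and a hypothetical module path myapp.healthchecks (the exact setting format may differ between django-alive versions):

# settings.py -- sketch only; "myapp.healthchecks" is a placeholder and the
# ALIVE_CHECKS format should be verified against the installed django-alive version.
ALIVE_CHECKS = {
    "django_alive.checks.check_database": {},
    "myapp.healthchecks.check_storages": {},
}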
The basic check of the storages (write, read, delete) could be performed on any Django storage using just pure Django, as in the minimal sketch below (although I am not sure how useful it is for file-based storages).
The second part of the testing is probably storage-dependent.
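For illustration, the storage-agnostic round trip could be reduced to something like this, using only django.core.files.storage (the function name basic_storage_roundtrip is just for this sketch):

from django.core.files.base import ContentFile
from django.core.files.storage import storages

from django_alive import HealthcheckFailure


def basic_storage_roundtrip(storage_name, payload=b"healthcheck"):
    """Write, read back, and delete a small file on a single storage."""
    storage = storages[storage_name]
    name = storage.save(f"storage_test_{storage_name}.txt", ContentFile(payload))
    try:
        with storage.open(name, "rb") as handle:
            if handle.read() != payload:
                raise HealthcheckFailure(
                    f"Read content does not match for storage '{storage_name}'"
                )
    finally:
        storage.delete(name)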
@ipmb What do you think about this test? Do you think that the basic test fits django-alive (after some work on the code), or should I place the whole test in https://github.com/PetrDlouhy/django-alive-checks?