Today, I am going to write about a few useful snippets/functionalities I have used with Amazon S3, or any S3-compatible storage, using Boto3 and django-storages. FYI, this post focuses on using S3 with Django.
So without further ado, let's begin:
Configuring S3
When using django-storages, you don't need to put your credentials in
~/.aws/credentials the way plain boto3 expects. You can define them in your Django settings instead, like:
# settings.py
AWS_ACCESS_KEY_ID = 'XXXX'
AWS_SECRET_ACCESS_KEY = 'XXXX'
AWS_STORAGE_BUCKET_NAME = 'your-bucket'
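A few related settings you will often want alongside these; all three are real django-storages settings, and the values shown are just placeholders:

```python
# settings.py -- optional extras (values are placeholders)
AWS_S3_REGION_NAME = 'us-east-1'   # region your bucket lives in
AWS_DEFAULT_ACL = None             # don't attach a canned ACL to each upload
AWS_QUERYSTRING_AUTH = False       # generate plain URLs without auth query params
```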
Using S3 Which Does Not Belong to AWS
If you want to use S3 provided by any company other than AWS, then configure the endpoint URL in settings.py:
AWS_S3_ENDPOINT_URL = "https://your-bucket-provider.domain"
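For example, an endpoint for DigitalOcean Spaces looks like the following (the region in the hostname is just an example; use your provider's actual endpoint):

```python
# settings.py -- example endpoint for an S3-compatible provider
AWS_S3_ENDPOINT_URL = "https://nyc3.digitaloceanspaces.com"
```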
Serve S3 on a Different Domain (Through CloudFront or Varnish)
Let's say you want to serve S3 contents at
cdn.abc.com/static/... (maybe through CloudFront or Varnish). Then put the following configuration in settings.py:
AWS_S3_CUSTOM_DOMAIN = 'cdn.abc.com'
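With that set, django-storages builds object URLs against the custom domain instead of the bucket endpoint. Conceptually, it boils down to this (a toy illustration, not django-storages internals):

```python
AWS_S3_CUSTOM_DOMAIN = 'cdn.abc.com'
name = 'static/css/site.css'

# URLs are rewritten to point at the CDN host instead of the bucket
url = f"https://{AWS_S3_CUSTOM_DOMAIN}/{name}"
print(url)  # https://cdn.abc.com/static/css/site.css
```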
Create New Bucket
Creating buckets is fairly easy in boto3. Here is how you can do it using the credentials from your Django settings:
import boto3
from django.conf import settings

session = boto3.session.Session()
s3 = session.resource(
    's3',
    aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
    aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
    endpoint_url=settings.AWS_S3_ENDPOINT_URL
)
s3.create_bucket(Bucket="your-bucket")
See All Buckets
Use the following code to list all buckets:
import boto3
from django.conf import settings

# buckets.all() belongs to the resource API, so use boto3.resource, not boto3.client
s3 = boto3.resource(
    's3',
    aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
    aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
    endpoint_url=settings.AWS_S3_ENDPOINT_URL
)

for bucket in s3.buckets.all():
    print(bucket.name)
    # to see items inside the bucket
    for item in bucket.objects.all():
        print(item)
Change Access Control of a Bucket
You can change a bucket’s access control like this:
import boto3

# Bucket() and Acl() belong to the resource API, so use boto3.resource, not boto3.client
s3 = boto3.resource('s3')

bucket = s3.Bucket('my-bucket')
bucket.Acl().put(ACL='public-read')

# Or find a bucket by name among all buckets and set its ACL
# (bucket_name is defined elsewhere)
for bucket in s3.buckets.all():
    if bucket.name == bucket_name:
        bucket.Acl().put(ACL='public-read')
Delete Objects from a Bucket (Also the Bucket Itself)
You can use the following code to empty a bucket, and optionally remove the bucket itself:
bucket = s3.Bucket('my-bucket')
bucket.objects.all().delete()

# Or do it like this
# for obj in bucket.objects.all():
#     obj.delete()

# If you also want to completely remove the empty bucket itself:
bucket.delete()

# Or delete the contents of all buckets, and the buckets themselves
for bucket in s3.buckets.all():
    bucket.objects.all().delete()
    bucket.delete()
Use Different Folders When Storing Media and Static Contents
I got this from django-cookiecutter.
For this purpose, we need to create two new storage classes, subclassing S3Boto3Storage:
# in somefile.py
from storages.backends.s3boto3 import S3Boto3Storage


class StaticRootS3Boto3Storage(S3Boto3Storage):
    location = "static"


class MediaRootS3Boto3Storage(S3Boto3Storage):
    location = "media"
    file_overwrite = False
Then point to them in settings.py:
STATICFILES_STORAGE = "path.to.somefile.StaticRootS3Boto3Storage"
DEFAULT_FILE_STORAGE = "path.to.somefile.MediaRootS3Boto3Storage"
Make collectstatic Faster
Django's collectstatic can be very slow when you use S3 as storage. You can use Collectfast to speed it up. Install it with
pip install Collectfast, then update
settings.py like this:
AWS_PRELOAD_METADATA = True

INSTALLED_APPS = (
    # …
    'collectfast',
)
Check ACL Status
You can inspect a bucket's ACL grants like this:
That's all for now. I will update the post with more functionalities or snippets whenever I find them. Also, please suggest more snippets in the comments section below. Thanks for reading. Cheers!!