Do More in S3 Using Django Storage and Boto3
Apr 06, 2019

Today, I am going to write about a few useful snippets and functionalities I have used with Amazon S3 (or any S3-compatible storage) using Boto3 and Django Storage. FYI, this post focuses on using S3 with Django.
So without further ado, let us begin.
Configuring S3
When using Django Storage, you don’t need a credentials file like ~/.aws/credentials, which plain Boto3 requires. Instead, you can define the credentials in your Django settings:
# settings.py
AWS_ACCESS_KEY_ID = 'XXXX'
AWS_SECRET_ACCESS_KEY = 'XXXX'
AWS_STORAGE_BUCKET_NAME = 'your-bucket'
Using S3 storage that does not belong to AWS
If you want to use S3-compatible storage provided by any company other than AWS, configure the endpoint URL in settings.py:
AWS_S3_ENDPOINT_URL = "https://your-bucket-provider.domain"
Serve S3 from a different domain (through CloudFront or Varnish)
Let’s say you want to serve S3 contents at cdn.abc.com/static/... (maybe through CloudFront or Varnish). Put the following configuration in settings.py:
AWS_S3_CUSTOM_DOMAIN = 'cdn.abc.com'
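With AWS_S3_CUSTOM_DOMAIN set, django-storages assembles object URLs from your domain plus the object key rather than the bucket endpoint. Roughly, the construction looks like this sketch (an illustration of the idea, not the library’s actual code):

```python
def build_object_url(custom_domain: str, location: str, name: str) -> str:
    """Illustrative sketch: join the custom domain, optional location prefix, and object name."""
    key = "/".join(p.strip("/") for p in (location, name) if p)
    return f"https://{custom_domain}/{key}"

print(build_object_url("cdn.abc.com", "static", "css/site.css"))
# https://cdn.abc.com/static/css/site.css
```

The same domain then fronts every object URL your templates render, so CloudFront or Varnish can cache the responses.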
Create new bucket
Creating buckets is fairly easy in Boto3. Here is how you can do it while reusing the credentials from your Django settings:
import boto3
from django.conf import settings

session = boto3.session.Session()
s3 = session.resource(
    's3',
    aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
    aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
    endpoint_url=settings.AWS_S3_ENDPOINT_URL
)
s3.create_bucket(Bucket="your-bucket")
See all buckets
Use the following snippet to list all buckets. Note that buckets.all() is part of the resource API, so s3 must be created with boto3.resource, not boto3.client:
import boto3
from django.conf import settings

s3 = boto3.resource(
    's3',
    aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
    aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
    endpoint_url=settings.AWS_S3_ENDPOINT_URL
)

for bucket in s3.buckets.all():
    print(bucket.name)
    # to see items inside a bucket
    for item in bucket.objects.all():
        print(item)
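S3 listings come back as a flat stream of keys rather than folders. If you want a quick folder-style view of a bucket, you can group the keys by their first path segment yourself; the helper below is a hypothetical illustration:

```python
from collections import defaultdict

def group_by_top_folder(keys):
    """Group flat S3 keys by their first path segment ('' collects root-level keys)."""
    groups = defaultdict(list)
    for key in keys:
        folder, sep, _rest = key.partition("/")
        groups[folder if sep else ""].append(key)
    return dict(groups)

keys = ["static/css/site.css", "static/js/app.js", "media/photo.jpg", "README.txt"]
print(group_by_top_folder(keys))
```

You would feed it something like `[obj.key for obj in bucket.objects.all()]`.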
Change access control to a bucket
You can change a bucket’s access control like this (Bucket and Acl belong to the resource API, so build s3 with boto3.resource rather than boto3.client):
s3 = boto3.resource('s3')
bucket = s3.Bucket('my-bucket')
bucket.Acl().put(ACL='public-read')

# Or set the ACL on every matching bucket
for bucket in s3.buckets.all():
    if bucket.name == bucket_name:
        bucket.Acl().put(ACL='public-read')
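Note that newly created AWS buckets have ACLs disabled by default, so a bucket policy is the more future-proof way to make objects publicly readable. A minimal sketch that builds such a policy (public_read_policy is my helper name, not a Boto3 API; the commented put_bucket_policy call is the real client method):

```python
import json

def public_read_policy(bucket_name: str) -> str:
    """Build a bucket policy JSON string that allows anonymous GetObject on all keys."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket_name}/*",
        }],
    }
    return json.dumps(policy)

# s3_client.put_bucket_policy(Bucket="my-bucket", Policy=public_read_policy("my-bucket"))
```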
Delete objects from a bucket (and the bucket itself)
You can use the following code to remove a bucket:
bucket = s3.Bucket('my-bucket')
bucket.objects.all().delete()

# Or do it like this
# for obj in bucket.objects.all():
#     obj.delete()

# If you also want to completely remove the now-empty bucket itself:
bucket.delete()

# Or empty and delete all buckets
for bucket in s3.buckets.all():
    bucket.objects.all().delete()
    bucket.delete()
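Deleting objects one by one issues a request per key. The batch delete_objects client API accepts up to 1,000 keys per call, so large buckets are best cleared in chunks. A small sketch (delete_batches is a hypothetical helper, and s3_client stands in for a Boto3 S3 client you have created):

```python
def delete_batches(keys, batch_size=1000):
    """Split keys into delete_objects-sized payloads (the API caps at 1,000 keys per call)."""
    for i in range(0, len(keys), batch_size):
        yield {"Objects": [{"Key": k} for k in keys[i:i + batch_size]]}

# for batch in delete_batches(all_keys):
#     s3_client.delete_objects(Bucket="my-bucket", Delete=batch)
```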
Use different folders when storing media and static contents
I got this from cookiecutter-django.
For this purpose, we need to create two new storage classes, subclassing from S3Boto3Storage:
# in somefile.py
from storages.backends.s3boto3 import S3Boto3Storage

class StaticRootS3Boto3Storage(S3Boto3Storage):
    location = "static"

class MediaRootS3Boto3Storage(S3Boto3Storage):
    location = "media"
    file_overwrite = False
Then reference them by path in settings.py:
STATICFILES_STORAGE = "path.to.somefile.StaticRootS3Boto3Storage"
DEFAULT_FILE_STORAGE = "path.to.somefile.MediaRootS3Boto3Storage"
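The file_overwrite = False setting tells django-storages to keep existing files and save new uploads under an alternative name instead of clobbering them. Conceptually the renaming works something like this sketch (a simplification of Django’s actual alternative-name logic; in practice the suffix is random):

```python
import os

def alternative_name(name: str, suffix: str) -> str:
    """Sketch: avoid clobbering an existing key by appending a suffix before the extension."""
    root, ext = os.path.splitext(name)
    return f"{root}_{suffix}{ext}"

print(alternative_name("media/photo.jpg", "x7Kq2"))
# media/photo_x7Kq2.jpg
```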
Make ‘collectstatic’ faster
Django’s collectstatic can be very slow when you use S3 as storage. You can use Collectfast to speed it up. Install it using pip install Collectfast, then update settings.py like this:
AWS_PRELOAD_METADATA = True

INSTALLED_APPS = (
    # ...
    'collectfast',
)
Check ACL status
You can do it like this (here s3 is a Boto3 client):
s3.get_bucket_acl(Bucket='bucket-name')
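get_bucket_acl returns a dict of grants; to decide whether a bucket is publicly readable, you scan for a grant to the AllUsers group. A sketch (is_public_read is my helper, and the sample dict is illustrative, shaped like a real response):

```python
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

def is_public_read(acl_response: dict) -> bool:
    """Check whether any grant gives READ (or FULL_CONTROL) to the AllUsers group."""
    for grant in acl_response.get("Grants", []):
        grantee = grant.get("Grantee", {})
        if grantee.get("URI") == ALL_USERS and grant.get("Permission") in ("READ", "FULL_CONTROL"):
            return True
    return False

# Illustrative sample shaped like a real get_bucket_acl response:
sample = {
    "Owner": {"ID": "abc123"},
    "Grants": [
        {"Grantee": {"Type": "Group", "URI": ALL_USERS}, "Permission": "READ"},
    ],
}
print(is_public_read(sample))  # True
```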
In conclusion
That’s all for now. I will update the post with more functionalities or snippets whenever I find them. Also, please suggest more snippets in the comment section below. Thanks for reading. Cheers!
Last updated: Jul 13, 2024