How to Set Up AWS S3 with Django REST like a Beast
We frequently come across the question of whether to serve static files with Django itself or with a cloud-based storage solution. As a system scales, serving static files becomes a bottleneck, since the application server is constantly tied up serving these files. The simplest and most scalable solution is Amazon S3, which is why most companies use it.
This tutorial assumes that you have already set up your AWS S3 bucket and configured its CORS policy. If not, follow this tutorial to complete that first.
Here we will use functional partitioning, which means segregating media files into separate folders so the bucket is less cluttered and files can easily be traced if needed.
Another reason is to support uploads with the same file name. If a file with the same name already exists, the file name will be changed (depending on the file_overwrite parameter). One could argue for overwriting the existing image instead; however, that will not work, because images are cached in the browser, and until the cache is cleared the user will keep seeing the previous image.
The answer is to create a separate folder for every file upload inside the main folder. You will see how to implement this in step 4.
Note: the filename problem becomes substantial when we use this in a chat app, where it would be a bad user experience if the filename of an upload changed.
S3 Bucket
├── blog
│   ├── blogid1
│   │   ├── uuid1
│   │   │   └── file.png
│   │   └── uuid2
│   │       └── file.png
│   └── blogid2
│       └── uuid1
│           └── file.png
└── avatar
    ├── username1
    │   ├── uuid1
    │   │   └── file.png
    │   └── uuid2
    │       └── file.png
    └── username2
        └── uuid1
            └── file.png
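The layout above can be sketched as a key-generation helper. This is a minimal illustration (make_object_key is a hypothetical name, not from the article): every upload gets a fresh UUID folder, so two uploads with identical filenames never collide.

```python
import uuid

def make_object_key(folder, owner_id, filename):
    # every upload gets its own uuid folder, so identical filenames never collide
    return f"{folder}/{owner_id}/{uuid.uuid4().hex}/{filename}"

first = make_object_key("avatar", "username1", "file.png")
second = make_object_key("avatar", "username1", "file.png")
# the two keys share the prefix and filename but differ in the uuid folder
```

Because the UUID is generated per upload, the original filename is preserved in the final path segment.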
1. Installing new dependencies
We will install two new dependencies, so don’t forget to add them to your requirements.txt file.
Note: These are the latest versions as of Jan 2021; however, do check for the latest versions on PyPI.
Install them using pip install boto3 django-storages
# requirements.txt
django-storages>=1.11.1,<1.12
boto3>=1.16.43,<1.17
django-storages: This library lets us customize the storage location to something new (S3 in our case) instead of where Django stores media by default.
boto3: Works as a Python interface for accessing Amazon Web Services.
2. Setting up a new App
Create a new app called “UploadFiles” so that we can define our custom storage back-end to store media in separate folders and set up the APIs.
python manage.py startapp UploadFiles
3. Adding constant variables
Note: Ideally, all these constants should be put in a .env file so that no one can break into our S3 bucket. However, that’s a topic for another Medium article, so make sure you don’t push or share this code with anyone, or remove the secret keys when you commit your code.
Now let’s set up our settings.py file.
#-------------------------------AWS----------------------------
MAX_BLOG_FILE_SIZE_MB = 4
MAX_AVATAR_IMAGE_SIZE_MB = 2

AWS_STORAGE_BUCKET_NAME = "yourproject-static"
AWS_S3_REGION_NAME = 'us-east-1'
AWS_ACCESS_KEY_ID = "AKIAQQSDDEDEASDEDRR9"
AWS_SECRET_ACCESS_KEY = "D7IRdDSrmDadgg6yGO0bffrfr*L4Hm*Opnnk"
AWS_DEFAULT_ACL = 'public-read'
AWS_S3_CUSTOM_DOMAIN = f'{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com'
AWS_S3_OBJECT_PARAMETERS = {'CacheControl': 'max-age=86400'}

# s3 avatar settings
DEFAULT_PROFILE_AVATAR = "https://yourProject.s3.amazonaws.com/default/default_image.png"
AVATAR_LOCATION = 'avatar'
AVATAR_STORAGE = 'UploadFiles.storage_backends.AvatarStorage'

# s3 blog images settings
BLOG_LOCATION = 'blog'
BLOG_STORAGE = 'UploadFiles.storage_backends.BlogStorage'
4. Creating custom storage back-end
Now, to set the location of the custom storage folder inside our storage_backends file, we read the constant variables defined in our settings.py file.
And if you want to change the default CacheControl for the media, you can do so by overriding the get_object_parameters method, which by default returns the values in AWS_S3_OBJECT_PARAMETERS.
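The original code embed is not reproduced here; a minimal sketch of storage_backends.py, assuming django-storages and the setting names defined above, could look like this:

```python
# UploadFiles/storage_backends.py
from django.conf import settings
from storages.backends.s3boto3 import S3Boto3Storage


class AvatarStorage(S3Boto3Storage):
    # store avatar uploads under the "avatar/" prefix in the bucket
    location = settings.AVATAR_LOCATION
    default_acl = 'public-read'
    file_overwrite = False

    def get_object_parameters(self, name):
        # override here if avatars need a different CacheControl
        return super().get_object_parameters(name)


class BlogStorage(S3Boto3Storage):
    # store blog uploads under the "blog/" prefix in the bucket
    location = settings.BLOG_LOCATION
    default_acl = 'public-read'
    file_overwrite = False
```

With file_overwrite disabled, django-storages appends a random suffix on name collisions, but the per-upload UUID folders from step 5 mean collisions should not happen in the first place.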
5. Now let’s define our Database models
There is a lot going on here. First, we define our basic model, and after that, in order to store files in a separate folder, we have to override the save method.
In the save method, we can extract the file name and can also change it. And to create a custom bucket path, we can change the self.file.storage.location variable, which then becomes our file URL.
Here we have used the model id as the folder name in the bucket; however, you can also change it to a timestamp. Doing that will change the way you retrieve the data, though.
One common misconception that I want to clear up here: we are not storing the location in the database. It is set while storing the file. Similarly, whenever we want to retrieve the URL from the model, we have to override the retrieval logic. The straightforward answer would be to override the S3Boto3Storage function; however, that will not work, since we have to pass the UUID at runtime. The solution to that is shown in step 7.
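The model code embed is missing from this copy of the article; a hedged sketch, assuming a hypothetical BlogImage model wired to the BlogStorage backend from step 4, might look like this:

```python
# UploadFiles/models.py
import uuid

from django.conf import settings
from django.db import models
from django.utils.module_loading import import_string


class BlogImage(models.Model):
    # hypothetical model; adjust fields to your schema
    blog_id = models.IntegerField()
    file = models.FileField(storage=import_string(settings.BLOG_STORAGE)())

    def save(self, *args, **kwargs):
        # place each upload in its own uuid folder:
        # blog/<blog_id>/<uuid>/<original filename>
        self.file.storage.location = (
            f"{settings.BLOG_LOCATION}/{self.blog_id}/{uuid.uuid4().hex}"
        )
        super().save(*args, **kwargs)
```

Because the storage location is rewritten per save, the original filename survives as the last path segment, which is exactly the behavior the chat-app note above calls for.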
6. Making a REST API for uploading files
Now we can use CreateAPIView from rest_framework.generics to create an API by overriding the create method, where we perform checks on the incoming file. Since there is only one avatar per user, we can delete the old image record and add a new one. However, deleting the record here doesn’t mean the file will be deleted from S3.
To delete the file from S3 as well, one way is to store the file location in a separate database model and create a Lambda function so that deletion is handled separately (outside the request-response cycle) to keep uploading fast.
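The view embed is also missing here; a minimal sketch of the avatar upload view, assuming a hypothetical AvatarImage model (with a user foreign key) and serializer, plus the size limit from settings, could be:

```python
# UploadFiles/views.py
from django.conf import settings
from rest_framework import status
from rest_framework.generics import CreateAPIView
from rest_framework.parsers import MultiPartParser
from rest_framework.response import Response

from .models import AvatarImage  # hypothetical model name
from .serializers import AvatarImageSerializer  # hypothetical serializer name


class AvatarUploadView(CreateAPIView):
    parser_classes = [MultiPartParser]
    serializer_class = AvatarImageSerializer

    def create(self, request, *args, **kwargs):
        upload = request.FILES.get('file')
        if upload is None:
            return Response({'error': 'no file provided'},
                            status=status.HTTP_400_BAD_REQUEST)
        # reject files over the configured size limit
        if upload.size > settings.MAX_AVATAR_IMAGE_SIZE_MB * 1024 * 1024:
            return Response({'error': 'file too large'},
                            status=status.HTTP_400_BAD_REQUEST)
        # a user has a single avatar: drop the old record before saving the new one
        AvatarImage.objects.filter(user=request.user).delete()
        return super().create(request, *args, **kwargs)
```

Note that the delete() call only removes the database row; as described above, removing the old object from S3 itself would be handed off to a Lambda function outside the request-response cycle.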
7. Getting the URLs in API response using serializer
For a blog, the image URLs are stored by the front-end editor, so we don’t have to serve those URLs in the blog response. But in the case of an avatar image, we have to return it with the user object.
To do that, we define a SerializerMethodField called profile_pic and use get_profile_pic to get the user’s image URL.
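A hedged sketch of such a serializer, assuming the hypothetical AvatarImage model and the DEFAULT_PROFILE_AVATAR setting from step 3, might look like this:

```python
# UploadFiles/serializers.py
from django.conf import settings
from django.contrib.auth import get_user_model
from rest_framework import serializers

from .models import AvatarImage  # hypothetical model name


class UserSerializer(serializers.ModelSerializer):
    profile_pic = serializers.SerializerMethodField()

    class Meta:
        model = get_user_model()
        fields = ['id', 'username', 'profile_pic']

    def get_profile_pic(self, obj):
        # build the url at runtime: the uuid folder is only known per record,
        # which is why overriding S3Boto3Storage alone would not work
        avatar = AvatarImage.objects.filter(user=obj).first()
        if avatar is None:
            return settings.DEFAULT_PROFILE_AVATAR
        return avatar.file.url
```

Falling back to DEFAULT_PROFILE_AVATAR means users without an upload still get a valid image URL in the response.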
🎊 We are done now; let’s register the new views in the urls.py file.
8. Adding URLs
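The URL configuration embed is missing from this copy; a minimal sketch, assuming the hypothetical AvatarUploadView from step 6 and an illustrative route name, could be:

```python
# urls.py
from django.urls import path

from UploadFiles.views import AvatarUploadView  # hypothetical view name

urlpatterns = [
    path('api/avatar/upload/', AvatarUploadView.as_view(), name='avatar-upload'),
]
```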
And that’s it! We are done configuring our custom Django S3 bucket.
Thank you for reading this blog. Feel free to reach out if you have any questions. I’ll be happy to help you.
You can connect with me on my website.