Django and SQL: Your Dynamic Duo for Scaling Databases


Scaling and optimizing databases to meet the demands of your applications can be a significant challenge. If you haven't read my recent blog about how Django can do the heavy lifting for Python and SQL database applications, I highly recommend you check it out. But the TL;DR version is that SQL is optimized for SQL databases, Python isn't, and Django is a great intermediary to help you build more effective applications, with less friction, complexity, and code when using these two languages together.

So while Django does the heavy lifting of creating the database app, you're still responsible for the day-to-day management and monitoring of your databases. Some of these management tasks can be deferred to your cloud provider, using services like Linode Managed Databases, but you may discover new roadblocks as you scale, such as:

  • Database Migrations. Converting an existing database to a new, desired state with controlled changes to the database schema.
  • Multi-Database Deployments. To optimize performance, developers can design their applications to use separate databases for segmented functions. For example, a primary read/write database and a read replica database for common queries.
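As a preview of that multi-database idea, both databases can live side by side in Django's settings, and individual queries can opt into the replica. This is a sketch only — the hostnames, database names, and credentials below are placeholders:

```python
# settings.py (sketch): a primary read/write database plus a read replica.
# All hostnames, names, and credentials here are placeholders.
DATABASES = {
    "default": {  # primary: handles all writes and default reads
        "ENGINE": "django.db.backends.mysql",
        "HOST": "db-primary.example.com",
        "PORT": "3306",
        "NAME": "app_db",
        "USER": "app_user",
        "PASSWORD": "change-me",
    },
    "replica": {  # read replica: offload frequent/heavy queries here
        "ENGINE": "django.db.backends.mysql",
        "HOST": "db-replica.example.com",
        "PORT": "3306",
        "NAME": "app_db",
        "USER": "app_readonly",
        "PASSWORD": "change-me",
    },
}

# Elsewhere in the project, a query can target the replica explicitly:
# BlogArticle.objects.using("replica").filter(title__icontains="django")
```

For automatic read/write splitting, Django also supports database routers (the `DATABASE_ROUTERS` setting), but the explicit `.using()` call is the simplest way to start.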

If one of your databases uses SQL, you can use Django to reduce friction and make your life a lot easier while handling a large amount of data.

This introduction to two key database management concepts pairs with the step-by-step instructions for building a production-ready Django application found in the Understanding Databases ebook and my new educational video series. Either learning path will help you get Django to do the SQL heavy lifting for you.

Database Migrations
When you're starting out, getting the data types right for any given column can be a bit tricky, especially since your data needs will inevitably change over time. What if you wanted your title field to be just 80 characters long? What if you need to add a timestamp field so you can track exactly when items were added to the database?

Changing a table after it has been created can get pretty messy for a few reasons:

  • What do you do with pre-existing values?
  • What if pre-existing rows are missing data for new columns/fields?
  • What if you remove a column/field? What happens to the data?
  • What if you add a relation that didn't exist before (i.e., foreign keys)?

Luckily for Django developers, we have something called makemigrations and migrate.

Let's take a look at how it works in action.

Here's our example Django data model:

class BlogArticle(models.Model):
    user = models.ForeignKey(User, default=1, on_delete=models.SET_DEFAULT)
    title = models.CharField(max_length=120)
    slug = models.SlugField(blank=True, null=True)
    content = models.TextField(blank=True, null=True)
    publish_timestamp = models.DateTimeField(auto_now_add=True)  # argument assumed; truncated in source

Let's add the field:

updated_by = models.ForeignKey(
        User, related_name="editor", null=True, blank=True, on_delete=models.SET_NULL
    )

This field will allow us to track the last user to make a change to our model.

Let's update our model:

class BlogArticle(models.Model):
    user = models.ForeignKey(User, default=1, on_delete=models.SET_DEFAULT)
    title = models.CharField(max_length=120)
    slug = models.SlugField(blank=True, null=True)
    content = models.TextField(blank=True, null=True)
    publish_timestamp = models.DateTimeField(auto_now_add=True)  # argument assumed; truncated in source
    # our new field
    updated_by = models.ForeignKey(
        User, related_name="editor", null=True, blank=True, on_delete=models.SET_NULL
    )

Now, after we save the file in which this BlogArticle class is declared, how do we let our database know this change occurred?

There are two steps:

  1. python manage.py makemigrations
  2. python manage.py migrate

Let's discuss what these two commands do:

python manage.py makemigrations

python manage.py makemigrations scans the files across your Django project for changes to your models. If changes are found, a new Python file will be created with the proposed changes that our SQL database needs. The proposed changes look something like:

from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('articles', '0001_initial'),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.AddField(
            model_name='blogarticle',
            name='updated_by',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='editor', to=settings.AUTH_USER_MODEL),
        ),
    ]

This, of course, is just another Python file. This file is letting us (the developers) know what should happen in our database. It's written in Python and not SQL to maintain cohesion and to leverage the Django ORM's built-in features.

But why is this a file describing what should happen? Well, there are a few reasons for this:

  • If we need to review what should happen before it does, we can catch it here.
  • The makemigrations command doesn't check with the database to see if this change can even happen.
  • The database may have already been modified to fit these requirements (depending on various factors related to who/what is managing the database).
  • If we need to run tests prior to changing a production database, right now would be a great time to do so.
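One handy way to do that review: Django's sqlmigrate management command prints the exact SQL a migration would execute, without applying anything (the app label and migration number below follow this article's example):

```shell
# Print the SQL for migration 0002 of the "articles" app, without running it
python manage.py sqlmigrate articles 0002
```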

Assuming that this change is valid (as far as we can tell), we can commit the changes:

python manage.py migrate

python manage.py migrate will attempt to change our database for us — all fields, columns, tables, foreign keys, you name it — Django will do the work to help ensure the database is updated in the way we intended.

It's important to note that Django may fail to make these changes for various reasons. For brand-new Django developers, this is almost always caused by adding and removing fields and columns and failing to run migrations correctly.

When done correctly, python manage.py migrate ensures a stable system that matches our Python code with our SQL tables, thus allowing us all the awesomeness that both Django and SQL databases provide.

How does this give us more flexibility?

Python has broad applications where SQL doesn't. Structured Query Language has its limitations written in the name. Who's creating Pixar animations with just SQL?

OK, all of this is to say that the simplicity of Python actually helps developers adopt the power of SQL, possibly without even knowing it.

Why Managed Databases and Django Make Sense

When it comes to creating web applications, including with Django, you'll have to decide a few things:

  • Which data storage solution(s) do we want? MySQL, Postgres, MongoDB, Redis, Object Storage, etc.
  • How will we run and integrate with the data storage solution?
  • How do we recover from interruption or downtime?
  • How do we maintain the storage solution?
  • How do we secure our storage solution?
  • How do we back up our storage solution?

The answers to these questions may change as your project grows in complexity, but they all start in the same place: deciding between self-managed versus third-party managed.

Self-managed databases:

  • Pros: Control and cost.
  • (Significant) Con: You're responsible for everything.

Managed services typically cost more money from the start, while self-managing means you can use your preferred Linux distro that's somehow (or somewhat) optimized for what you need. This might include running a forked version of MySQL that your team has modified. You may save dollars on running your service, but this will always take more time to maintain.

Third-party managed databases: 

Yes, it might be slightly more expensive in dollars and cents, but it will take significantly less time to maintain. This option and managed data storage solutions are my de facto choice for my web applications. In this example, we're already using Django to manage database transactions. SQLAlchemy shares this strength, as it's used with frameworks such as FastAPI, Flask, and many others. If you're already outsourcing your SQL writing to a Python package, why not outsource running your SQL servers?

Now, given the effectiveness of Python ORMs (like Django ORM and SQLAlchemy), I recommend that you use managed database and/or managed data storage services whenever possible. Here's what you stand to gain if you do:

  • Reduced development time
  • Reduced management time
  • Reduced recovery time
  • Reduced service interruptions
  • Reduced deployment and development complexity
  • Reduced complexity in database migrations (from other services)
  • Reduced repetitive/ineffective/inefficient actions for SQL developers
  • Reduced DevOps/Ops complexity
  • Increased effectiveness of non-SQL developers
  • Increased deployment and development speed
  • Increased reliability (often backed by a Service Level Agreement)
  • Increased security
  • Increased maintainability
  • Increased backups and redundancy
  • Marginal increase in cost

I made the list above with the mindset of using a Managed MySQL Database Cluster on Linode as well as Linode Object Storage (for storing files like CSS, JavaScript, images, videos, etc.). Practically speaking, using these services helps us keep our focus on building an excellent web application with Django, FastAPI, Flask, Node.js, or whatever. To put it another way, we shift the focus to building the tools and software your users actually need. You know, where the real value is to them.

MySQL, PostgreSQL, Redis, and Django

For a long time, Django's primary database was PostgreSQL. I'd argue this is largely due to the fact that you could use a JSONField with Postgres only. With Django 3.2+ and MySQL 5.7.8+, the JSONField is now available for MySQL as well.

Why is this important?

Storing unstructured data, like JSON, is often required when handling user-generated content or storing data from other API services. Let's see how:

from django.db import models

class Pet(models.Model):
    name = models.CharField(max_length=200)
    data = models.JSONField(null=True)

    def __str__(self):
        return self.name

Here's the data I want to store in relation to this Pet:

pet1 = {
    "name": "Bruno",
    "type": "Rat",
    "nickname": "We don't talk about it",
    "age": 2,
    "age_interval": "months",
}

pet2 = {
    "name": "Tom",
    "type": "Cat",
    "breed": "Mixed",
    "age": 4,
    "age_interval": "years",
    "favorite_food": [{"brand": "Acme", "flavor": "Tuna"}],
}

pet3 = {
    "name": "Stewey",
    "type": "Dog",
    "breed": "unknown",
    "age": 34,
    "age_interval": "dog years",
    "nickname": "Football",
}
This data shows us when we might want a JSONField. We can store all the pet names (using the name key) and leave the rest to be stored in the JSONField. The cool thing about JSONFields is that they can be queried much like any other standard Django field, even with these varying schemas.

There's an ongoing debate among Django developers as to which database to use: MySQL or PostgreSQL. For the longest time, I always opted for PostgreSQL because the JSONField was only available on PostgreSQL, and that's no longer the case. I say pick one and stick with it until it no longer serves your needs.

But what do we use Redis for?

Redis is an in-memory datastore that is incredibly fast and often used as a temporary database (more on this in a second), a caching service, and/or a messaging queue. The reason I call it a temporary database is that it's in-memory. Memory is typically more expensive than disk storage, which makes storing data long-term in-memory often infeasible.

My primary use cases for Redis and Django are caching and queuing.

Caching: Let's say you have several web pages that users visit a lot. You want the data in those pages to be shown to users as quickly as possible. Redis, as a caching system for Django, makes doing this incredibly easy. The data within those pages might be rendered from a SQL database, but Redis can serve that rendered data from the cache. In other words, using Redis with SQL can often speed up your responses while reducing the number of queries to your SQL databases.
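Here's a sketch of what that looks like in Django settings. The RedisCache backend shown ships with Django 4.0+, and the URL assumes a local Redis instance:

```python
# settings.py (sketch): use Redis as Django's cache backend.
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.redis.RedisCache",
        "LOCATION": "redis://127.0.0.1:6379/1",  # assumed local Redis instance
    }
}

# A view can then be cached wholesale, e.g. for 15 minutes:
# from django.views.decorators.cache import cache_page
#
# @cache_page(60 * 15)
# def article_list(request):
#     ...
```

On older Django versions, the third-party django-redis package fills the same role with a similar configuration.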

Queuing: Another popular use case of Redis is to offload long-running tasks to another process (often via a Python package called Celery). When you need to do this, you can use Redis as a queue of the tasks that should be completed at another time.

For example, if you have a user who needs a report of all of their transactions for the past five years, the software might take hours to actually generate that report. Obviously, no one is going to stare at a machine for hours. So we'd offload this request from our user to a Redis queue. Once in Redis, we can have a worker process running (like using Celery with Django) to actually generate the report. Once the report is done, no matter how long it took, the user will be notified. This notification, as with other notifications, can be handled through a Redis queue coupled with a Celery/Django worker process.
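Redis and Celery aren't needed to see the shape of this pattern. Here's a stdlib-only sketch of the same idea (the "report" names are made up for illustration): the request enqueues the job and returns immediately, and a separate worker does the slow part:

```python
import queue
import threading

task_queue = queue.Queue()   # stands in for the Redis queue
results = {}                 # stands in for wherever the finished report lands


def worker():
    """Background worker: pull jobs off the queue and run them."""
    while True:
        job = task_queue.get()
        if job is None:  # sentinel to shut the worker down
            break
        user_id, years = job
        # ...hours of report generation would happen here...
        results[user_id] = f"report covering {years} years"
        task_queue.task_done()


threading.Thread(target=worker, daemon=True).start()

# The "web request" just enqueues the job and returns immediately.
task_queue.put(("user-42", 5))

# In real life the user is notified later; here we just wait for the demo.
task_queue.join()
print(results["user-42"])  # report covering 5 years
```

Celery with a Redis broker implements this same producer/worker split, but across processes and machines, with retries and result storage handled for you.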

This is all to say that Redis and MySQL complement each other very well. You can deploy a self-managed Redis database server via the Linode Marketplace.

Object Storage

The final data-related managed service I recommend using is Linode Object Storage. Object Storage is responsible for all the other kinds of data you might need to store. For example, we would not store all the bytes of a video in MySQL. Instead, we'd store metadata related to that video and store the video itself in Object Storage.

Here are a few things you'll use object storage for:

  • Cascading Style Sheets (CSS)
  • JavaScript (like React.js, Vue.js, vanilla JS, etc.)
  • Videos
  • Images (raw and compressed)
  • CSVs, XLSX
  • Database backups
  • Docker container image layers (if self-managed)
  • Iterations of trained machine learning algorithms
  • Terraform state files
  • PDFs (both large and small)
  • Any persistent file that needs to be downloaded (or uploaded) often


After reading this, I hope you feel motivated to leverage the power of managed services with your web application projects. Django is a great solution for building web apps on top of SQL databases, but it's certainly not the only one. Even if you never dive into the internals of SQL and SQL servers, it's a worthwhile exercise to see how many successful applications leverage Django to handle the bulk of that work.

Here are a few (or many) highlights that make Django with Managed MySQL on Linode awesome:

  • Django does the heavy SQL lifting for you (so do tools like SQLAlchemy for Flask/FastAPI)
  • Django enables raw SQL commands too (again, so do tools like SQLAlchemy)
  • Django helps beginners learn SQL commands
  • Django has built-in support for MySQL and PostgreSQL (along with a db-specific Python client)
  • Increases speed to production deployments
  • Increased reliability and recoverability
  • Allows development and production environments to match database technology almost exactly
  • Makes container-based Django easier and more reliable
  • Unlocks scaling from a single-node deployment to multi-node, or even a full-fledged transition to Kubernetes
  • Easier for new Django/Python developers to use production-grade systems
  • Sharing databases across multiple Python-based apps is easier and safer (such as a FastAPI application reading/writing from/to a Django-based MySQL database)
  • Django's JSONField is now supported on MySQL (previously only PostgreSQL)
  • Easy to test (during CI/CD or in local development environments)
  • Scales to meet Django's demands
  • Support for multiple databases in a single Django project, such as using MySQL as a primary read/write database and a MySQL read replica database for common queries
  • Strict access controls (Linode private IPs, local development)
  • Requires SSL certificates for connection (adds complexity to deployments but also increases security)
  • Allows private connections (in the same region; lowers connection costs)

If you're interested in a modern approach to deploying Django applications on Linode along with a managed MySQL database, GitHub Actions for CI/CD, Terraform, and Ansible, jump into the tons of free step-by-step educational content.

To help get you started, the Coding for Entrepreneurs GitHub has a repository of code that goes with each step in the series. Good luck, and be sure to let me know how things are going via Twitter @JustinMitchel.

