Recently I had to dabble considerably with Go at work. One of the fascinating things I came across while dipping my toes in the Go world was the idea of deploying your code to production nodes as a single binary. The idea is that unlike the traditional approach you'd follow with a Python web app -- install the right version of Python on your server, pull the codebase from a VCS repository, install the Python dependencies, and run a Python web server like gunicorn along with any worker processes such as Celery workers -- you can instead just scp (copy over SSH) a pre-built binary to your server and simply run it. This idea certainly has some appeal, at least to some of us. It feels very lightweight and simple, steering clear of many of the pitfalls and headaches one often runs into when managing a complex environment by hand.
by hand. I understand that in most cases you would tools like Docker, Ansible and so on to automate the deployment process and make this process easier and more reliable. However this still doesn't do away with the complexity of your production environment and adds overhead in different forms, such as having to monitor multiple daemons and processes, dealing with edge cases, measuring memory/CPU usage, and quite simply whatever cost that comes with added complexity. Using a single binary approach also makes it much faster to boot up and get your new node running when you are scaling up and want to handle the increased load rapidly.
Given these factors I decided to give the idea of single binary deployment a go -- not for a statically compiled and linked language such as Go or Rust, but for our dear Python. To some, the idea of deploying Python apps to a server as a single binary might seem completely absurd at first thought. But if you have spent some time working on desktop applications in Python and tried to provide an easy installation process for your users, you might
already have a clue. There are some well-known libraries which allow you to create a binary executable from your Python codebase. These tools include py2exe, py2app, PyInstaller, cx_Freeze and so on. Generally the idea is that you figure out which standard-library modules your codebase uses along with the third-party dependencies, and then pack your Python code and those dependencies together with a trimmed-down copy of the Python interpreter matching the OS you're currently running on. This, for example, allows you to create a Windows .exe file from your Python codebase if you are running Windows, and your program can then be launched by simply double-clicking the exe file. This is certainly a much more user-friendly way of distributing software written in Python than asking a non-technical person to install Python on their OS and launch your software from the command prompt. Just like on Windows, you can create native executables for Linux and macOS too, provided you build on the same OS.
So if you are deploying to a Linux server you can use a tool like cx_Freeze to generate a dist directory containing a binary entry point for your app. And not just that: you can configure the path for static assets, and cx_Freeze will package them along with the modules, letting you ship not just the app binary but also any static assets you need, all packed together. Using this approach you can create a self-contained directory with everything needed to run your app, copy it to the production environment, and just launch the binary entry point. End of story, right? Not exactly.
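A cx_Freeze build script along those lines might look something like this -- treat it as a sketch, since the app name, entry point, dependencies and asset paths here are all illustrative:

```python
# setup.py -- a minimal cx_Freeze build configuration (names/paths illustrative)
from cx_Freeze import setup, Executable

setup(
    name="myapp",
    version="0.1",
    options={
        "build_exe": {
            # third-party packages to bundle alongside the trimmed-down interpreter
            "packages": ["fastapi", "uvicorn"],
            # copy static assets into the build directory next to the modules
            "include_files": [("static/", "static/")],
        }
    },
    executables=[Executable("main.py", target_name="myapp")],
)
```

Running `python setup.py build` then produces a self-contained `build/exe.*` directory that you can copy to the server and launch directly.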
It was a great exercise being able to deploy my Python web app by just building the executable locally, scp-ing it to the server, and running the new binary. However, I wanted to push this notion of a single binary further. In my application architecture I was basically forking two processes: one was the uvicorn web server process serving a FastAPI web app, and the other was a Celery worker handling scheduled tasks. While handling the complexity of two processes wasn't an issue, this approach still left me itching for an even more minimalist design under my newfound monolithic obsession. Unlike a synchronous framework such as Django or Flask, which is typically deployed behind multiple worker processes to handle concurrent requests, I was already using the asynchronous model with FastAPI, which performed fine with just a single process. FastAPI uses Python's asyncio, so it's possible to tap into the event loop to perform other tasks alongside the work FastAPI is doing. This drove me down the path of slashing yet another component off my stack, and Celery was the victim this time. I decided to build a simple model to handle task scheduling myself, using the functionality provided by asyncio. The model picked tasks from a database-persisted priority queue whenever their scheduled times were reached, and ran them asynchronously. Likewise, new tasks simply had to be added to the queue, where they would be persisted and processed at the right time. This meant I had only one process running, which made it easier to monitor and profile the memory/CPU usage of the web app.
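A stripped-down, in-memory sketch of that scheduler idea could look like the following. It assumes a heapq-backed priority queue ordered by scheduled time (the real version persisted the queue to a database), and the class and function names are mine, not from any library:

```python
import asyncio
import heapq
import time

class SimpleScheduler:
    """In-memory sketch of a priority-queue task scheduler on top of asyncio."""

    def __init__(self):
        self._queue = []  # heap of (run_at, seq, coroutine function)
        self._seq = 0     # tie-breaker so heapq never compares the functions

    def add_task(self, delay, coro_fn):
        """Schedule coro_fn to run `delay` seconds from now."""
        heapq.heappush(self._queue, (time.monotonic() + delay, self._seq, coro_fn))
        self._seq += 1

    async def run(self):
        """Pop tasks as their scheduled times arrive, until the queue is empty."""
        pending = []
        while self._queue:
            run_at, _, coro_fn = self._queue[0]
            delay = run_at - time.monotonic()
            if delay > 0:
                await asyncio.sleep(delay)  # yields the event loop to other work
                continue
            heapq.heappop(self._queue)
            pending.append(asyncio.create_task(coro_fn()))
        if pending:
            await asyncio.gather(*pending)  # let in-flight tasks finish

ran = []

async def record(label):
    ran.append(label)

async def main():
    sched = SimpleScheduler()
    sched.add_task(0.02, lambda: record("later"))
    sched.add_task(0.01, lambda: record("sooner"))
    await sched.run()

asyncio.run(main())
print(ran)  # tasks fire in scheduled order: ['sooner', 'later']
```

Because the waits are plain `asyncio.sleep` calls, the scheduler shares the event loop with FastAPI's request handling instead of needing its own process.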
If your web app serves thousands of users daily and involves a lot of data processing, you need the full power of a modern database server like Postgres or MySQL. You will also want Redis to cache results, session/cookie data and so on. However, there are certain scenarios where you don't actually need Postgres. No, I'm not advocating plain text files. I just want to throw some positive vibes towards SQLite, which I feel is underrated by many developers. Yes, SQLite is nowhere near as powerful as workhorses like Postgres and MySQL, but chances are you have been considerably underestimating this plucky pony of a database. If you are trying out a new project idea or have an existing application with limited load, SQLite might be enough for your needs. Combined with the single binary approach, SQLite -- which comes built into Python -- can further simplify your workflow and help you rapidly make changes to your code, with reduced friction from running your app across multiple environments.
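Since the sqlite3 module ships with Python's standard library, there is nothing to install or administer; the table and values below are made up purely for illustration:

```python
import sqlite3

# An in-memory database for the example; in production you'd pass a file
# path such as "app.db" that lives next to your deployed binary.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, name TEXT, run_at REAL)")
conn.execute(
    "INSERT INTO tasks (name, run_at) VALUES (?, ?)",
    ("send_report", 1700000000.0),
)
conn.commit()

# Fetch tasks whose scheduled time has passed -- the kind of query the
# database-persisted scheduler above would run.
row = conn.execute(
    "SELECT name FROM tasks WHERE run_at <= ?", (1800000000.0,)
).fetchone()
print(row[0])  # -> send_report
```

With the database reduced to a single file (or even memory), the whole app really does boil down to one binary plus its data directory.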